
The pocket-sized AI and humanoid robots are the ultimate compilation.



The human-looking robots are the next-generation GP (General Purpose) tools.




Here, AI means a language model that can translate spoken commands into computer programs. It is possible that the AI can generate an entire morphing program for a robot. Such a morphing program-module environment means the robot can carry out its missions under multiple conditions, making robots more flexible than ever before.

In AI-based systems, the language model is the center of the system. The language model is the tool that turns spoken words into commands that computers understand: it transforms spoken words into algorithms that computers use to control robots and other systems. The language model makes that possible.
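The pipeline above can be sketched in a few lines. This is a minimal illustration, not a real robot API: the language model is stubbed with a keyword lookup, and all names (`CommandParser`, `RobotCommand`, the verb table) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class RobotCommand:
    action: str
    target: str

class CommandParser:
    """Stands in for the language model: maps spoken words to robot commands."""
    VERBS = {"pick": "grasp", "move": "navigate", "clean": "sweep"}

    def parse(self, utterance: str) -> RobotCommand:
        words = utterance.lower().split()
        for word in words:
            if word in self.VERBS:
                # Everything after the verb is treated as the target.
                target = " ".join(words[words.index(word) + 1:]) or "none"
                return RobotCommand(action=self.VERBS[word], target=target)
        return RobotCommand(action="idle", target="none")

parser = CommandParser()
cmd = parser.parse("Please pick the red box")
print(cmd)  # RobotCommand(action='grasp', target='the red box')
```

In a real system the keyword table would be replaced by a speech-to-text engine feeding an actual language model, but the shape of the data flow (utterance in, structured command out) stays the same.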

The system can create customized computer programs in real time. The AI follows the orders the user gives, and then it creates new programs or modules for computers or robots. This ability also means the AI can delete those programs when they are no longer needed.
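That create-and-delete lifecycle can be sketched as a simple module registry. The `SkillRegistry` class and the skill functions below are hypothetical illustrations of the idea, assuming each generated program is installable under a name and removable when obsolete.

```python
class SkillRegistry:
    """Holds the program modules the AI has generated for the robot."""

    def __init__(self):
        self._skills = {}

    def install(self, name, func):
        # Add a freshly generated program module.
        self._skills[name] = func

    def remove(self, name):
        # Delete a module the AI no longer needs.
        self._skills.pop(name, None)

    def run(self, name, *args):
        return self._skills[name](*args)

registry = SkillRegistry()
registry.install("greet", lambda who: f"Hello, {who}!")
print(registry.run("greet", "operator"))  # Hello, operator!
registry.remove("greet")
```

The point of the sketch is the symmetry: whatever mechanism writes a module in must also be able to take it back out, so the robot's program set can shrink as well as grow.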





The user can connect this kind of AI device to a computer using USB, or wirelessly using a Bluetooth connection.

BMW is starting to test human-looking robots on its assembly lines. Those human-looking robots can use the same tools as humans, and they can form morphing networks. Those systems can use central computers, or the robots can form morphing networks with each other. In such a network, robots can share their data and computing capacity, which allows them to operate as a single unit. That means large groups of robots can operate as one entity.
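The morphing-network idea above can be sketched as nodes that share data with the whole group and hand tasks to whichever member has spare computing capacity. The `Node` and `Swarm` classes and the scheduling rule are hypothetical illustrations, not a description of BMW's actual system.

```python
class Node:
    """One robot in the network, with free compute capacity and shared data."""

    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity   # free compute units
        self.shared_data = {}

class Swarm:
    def __init__(self, nodes):
        self.nodes = nodes

    def broadcast(self, key, value):
        # Share a piece of data with every robot in the network.
        for node in self.nodes:
            node.shared_data[key] = value

    def assign(self, task_cost):
        # Give the task to the node with the most free capacity.
        best = max(self.nodes, key=lambda n: n.capacity)
        if best.capacity < task_cost:
            return None
        best.capacity -= task_cost
        return best.name

swarm = Swarm([Node("r1", 4), Node("r2", 7)])
swarm.broadcast("map_version", 3)
print(swarm.assign(5))             # r2
print(swarm.nodes[0].shared_data)  # {'map_version': 3}
```

Even this toy version shows why the group behaves as one unit: data and spare capacity belong to the network, not to any single robot.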

Portable AI, or systems involving language models that can connect to those robots, allows fast changes to the robots' programming. AI or language models that can control robots through spoken commands could be game-changers in this kind of technology. Robots can do many things that are not possible for humans. Human-looking robots are also tools that interest researchers, space and deep-sea explorers, and military personnel.

Human-looking robots can perform surgical operations. In those cases, human operators oversee the operations. Those remotely controlled robots can bring doctors to very remote places. The robot is only a body; its programs determine how the same robot is used.

A robot that operates as a gardener can operate as a surgeon if its control programs are changed. In the same way, robots that clean floors can be reprogrammed into combat robots. And by using man-shaped robots, every single aircraft in the world could be turned into a robot plane. This also means that large, old aircraft could be used for kamikaze missions.


https://www.freethink.com/robots-ai/general-purpose-robots


https://www.indiatoday.in/technology/news/story/rabbit-r1-the-cute-little-pocket-size-viral-ai-device-that-can-do-everything-for-you-2487278-2024-01-11


https://learningmachines9.wordpress.com/2024/01/23/the-pocket-sized-ai-and-humanoid-robots-are-the-ultimate-compilation/
