AI-controlled robots are the next-generation organisms.

AI can become as intelligent as humans. Researchers and many others keep repeating that warning. But if we think of an AI-controlled robot as a dangerous tool, we must remember this: natural organisms and AI-controlled robots can be dangerous to their environment even if they are not intelligent.

Every robot is dangerous when it malfunctions. If a welding robot mistakes a worker for the workpiece, pulls the worker onto the assembly line, and starts welding, the situation is very dangerous.

In the case of AI, intelligence means that the system recognizes the situation it faces and then selects the action matrix that suits the mission.
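
To make that concrete, here is a minimal sketch in Python of what an action-matrix lookup could look like. The situations, missions, and actions are invented for illustration; a real system would map sensor classifications to controller routines:

```python
# A minimal sketch of an "action matrix": for each recognized situation,
# a table maps the current mission to a suitable action. All names here
# are invented for illustration.
ACTION_MATRIX = {
    "obstacle_ahead": {"deliver_parts": "reroute", "patrol": "inspect_obstacle"},
    "human_detected": {"deliver_parts": "stop_and_wait", "patrol": "greet"},
}

def select_action(situation: str, mission: str) -> str:
    """Recognize the situation, then pick the action that suits the mission."""
    actions = ACTION_MATRIX.get(situation)
    if actions is None:
        return "halt"  # unrecognized situation: fail safe
    return actions.get(mission, "halt")

print(select_action("human_detected", "deliver_parts"))  # -> stop_and_wait
```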

Bacteria are not very intelligent organisms, but they can still be dangerous, because they produce toxins that damage the internal organs of humans and animals.

The fact is that we can think of AI-controlled robots as organisms, just as we think of "natural organisms." And another thing: we can make many things more dangerous than they are if we want to. Genetic engineering raises the possibility of creating a xenomorph bacterium with two genomes: one form would be a regular bacterium, and the other would be a neuron. That kind of bacterium could transform itself into a neuron. And maybe such organisms could decode their own DNA and become more intelligent than humans.

We must not think of AI as a god. AI makes mistakes because it depends on databases, and humans collected the data that those databases contain.

A robot swarm can emulate the behavior of wolf packs. If one member of a drone swarm comes under attack, the other drones come to help it, the same way lions, wolves, and many other social animals protect each other. That mutual protection is the reason for animals' social behavior.
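
A minimal sketch of that mutual-aid rule, with an invented Drone class and a deliberately simplistic "help the first victim" policy, could look like this:

```python
# A toy sketch of wolf-pack behavior in a drone swarm: when one member
# reports being under attack, the others head toward its position.
# The Drone class and its fields are invented for illustration.
from dataclasses import dataclass

@dataclass
class Drone:
    drone_id: str
    position: tuple[float, float]
    under_attack: bool = False

def support_targets(swarm: list[Drone]) -> dict[str, tuple[float, float]]:
    """For each healthy drone, return the position of a member to assist."""
    victims = [d for d in swarm if d.under_attack]
    if not victims:
        return {}
    target = victims[0].position  # simplest rule: everyone helps the first victim
    return {d.drone_id: target for d in swarm if not d.under_attack}

swarm = [Drone("d1", (0, 0)), Drone("d2", (5, 5), under_attack=True), Drone("d3", (9, 1))]
print(support_targets(swarm))  # d1 and d3 are directed toward d2's position
```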


House of Stairs: M.C. Escher


Sometimes it is suggested that robots can have feelings. In reality, robots can only emulate feelings. That emulation works like this: when the robot sees that a person cries, laughs, or seems happy or depressed, it reacts by using algorithms. There are known descriptions of how people look and talk when they are in certain moods, and when the robot sees or hears those cues, it searches its database for the entry that matches the image. When a robot arrives at the workplace, it might say "hello" or "good morning" to people it sees for the first time.

Then it might remember the face and not greet that person again during the day. Robots might say "Nice to meet you" just like humans, but those reactions come from databases that control social behavior. If somebody yells at the robot, it could raise its middle finger. And if the robot works as undercover law enforcement, it might speak like a gang member.

If a person says that their pet has died, the robot can say "I'm sorry," or whatever the programmer has stored in those databases. But the fact is that those social actions are database lookups, and the robot doesn't have feelings.
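
As a rough illustration, that whole mechanism reduces to a lookup table plus a memory of already-greeted faces. The cues, phrases, and face IDs below are invented; a real robot would classify them from camera and microphone data:

```python
from typing import Optional

# A toy sketch of emulated feelings: perceived cues are keys into a
# response database, and a set of remembered faces prevents repeated
# greetings. All cues, phrases, and face IDs are invented.
RESPONSES = {
    "crying": "I'm sorry.",
    "laughing": "Nice to see you in a good mood!",
    "new_face": "Hello, nice to meet you.",
}

greeted_today = set()  # faces the robot has already greeted today

def react(face_id: str, cue: str) -> Optional[str]:
    """Return a canned phrase for the cue; greet each face only once a day."""
    if cue == "new_face":
        if face_id in greeted_today:
            return None  # face remembered: no more greetings today
        greeted_today.add(face_id)
    return RESPONSES.get(cue)

print(react("worker_42", "new_face"))  # -> Hello, nice to meet you.
print(react("worker_42", "new_face"))  # -> None (the face is remembered)
print(react("worker_42", "crying"))    # -> I'm sorry.
```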

But then we must understand one thing. AI can do many things, but is it intelligent anyway? Suppose an AI controls a factory: it has sub-robots that search for minerals using laser spectroscopes, equipment like centrifugal machines that melt material and separate elements with a centrifugal sling, and a 3D-printing system that makes equipment from CAD images. Such an AI can seem to have multiple skills. But does the AI know what it does?

When the AI makes something, it follows an algorithm: when its sensor sees something, it searches the database for the action connected with that observation. So when a mineral-searching robot sees iron in its spectroscope, that reading opens a connection to the database that stores the orders for what the robot must do when it sees iron ore.
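
A sketch of that sensor-to-database connection, with invented minerals and order lists, might look like this:

```python
# A sketch of the sensor-to-database connection: a spectroscope reading
# triggers the stored orders for that mineral. Minerals and orders are
# invented for illustration.
ORDERS = {
    "iron": ["mark_deposit", "report_coordinates", "begin_excavation"],
    "silicon": ["mark_deposit", "report_coordinates"],
}

def on_spectroscope_reading(mineral: str) -> list[str]:
    """Look up what the robot must do when it sees this mineral."""
    return ORDERS.get(mineral, ["continue_search"])

for step in on_spectroscope_reading("iron"):
    print(step)  # mark_deposit, report_coordinates, begin_excavation
```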

The robot factory that commands those robots acts like an ant society. The worker robots that search for raw materials in nature are a model of an artificial cell. The point is that these kinds of systems can be used in interplanetary missions, where the robot workers make bases ready for human colonists.

Researchers might copy the robots' behavioral models from animals, but robots are not animals. They can perform many complicated series of movements, and every action a robot makes is a movement. That means the robot reacts to something, but it doesn't know what it does.

But the origin of those spectacular plans is 3D-printing systems that can produce things like quadcopters and machine parts from trash. Those systems can be used in everyday work; in some visions, when people buy cars, 3D-printing tools build them in the dealer's garage. The military can also produce its equipment on the battlefield the same way.

And when we think of robots that protect and help each other, we can imagine combat robots. Those robots see other robots' IFF (Identification Friend or Foe) signals, and if some of them are in trouble, they call the other robots for support. This is one version of social behavior copied from nature to robot swarms.
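
One hedged way to sketch the IFF part is a shared-secret signature check: a friend can sign its distress call with the swarm key, a foe cannot, so only friends' calls trigger support. The key and message format here are invented:

```python
# A toy IFF (Identification Friend or Foe) check using a shared-secret
# HMAC: friends sign their signals with the swarm key, so a distress
# call from a foe is ignored. Key and message format are invented.
import hashlib
import hmac

SWARM_KEY = b"invented-shared-secret"

def sign(message: bytes) -> bytes:
    return hmac.new(SWARM_KEY, message, hashlib.sha256).digest()

def is_friend(message: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(message), signature)

def handle_signal(message: bytes, signature: bytes) -> str:
    if not is_friend(message, signature):
        return "ignore"           # foe or corrupted signal
    if message.startswith(b"DISTRESS"):
        return "move_to_support"  # a friend asked for help
    return "acknowledge"

msg = b"DISTRESS at sector 7"
print(handle_signal(msg, sign(msg)))     # -> move_to_support
print(handle_signal(msg, b"\x00" * 32))  # -> ignore
```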
