
New AI-driven drones learn like animals.

"Photo of the “neuromorphic drone” flying over a flower pattern. It illustrates the visual inputs the drone receives from the neuromorphic camera in the corners. Red indicates pixels getting darker, green indicates pixels getting brighter. Credit: Guido de Croon" (ScitechDaily, The Future of Flight: Researchers Develop Neuromorphic Drones That Learn Like Animals)


Animal training is inspiring new drones and other robots. The idea is similar to teaching a dog: when the dog does something as it should, the trainer rewards it with a treat. The treat motivates the dog, and in the case of a drone or robot, the operator presses Enter to accept the operation, so that acceptance acts as the reward signal.
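
A minimal sketch of that reward loop, assuming a hypothetical robot API with propose_action and execute methods: the operator's Enter keypress plays the role of the treat, and accepted behaviors are kept in memory.

```python
import random

class DemoRobot:
    """Stand-in for a real robot API (hypothetical)."""
    def propose_action(self):
        return random.choice(["turn_left", "turn_right", "forward"])
    def execute(self, action):
        print(f"executing: {action}")

accepted = []  # the machine's "memory" of rewarded behaviors

def training_loop(robot, trials=5):
    for _ in range(trials):
        action = robot.propose_action()
        robot.execute(action)
        # The operator's Enter keypress is the "treat": it accepts the operation.
        if input("Accept? [Enter = yes, anything else = no] ") == "":
            accepted.append(action)  # reinforce by storing the behavior

if __name__ == "__main__":
    training_loop(DemoRobot())
    print("accepted behaviors:", accepted)
```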

In the traditional learning model, the operator remote-controls the robot the first time it performs a task, and those operations are stored in the machine's memory. More advanced systems can instead learn something new by imitating other machines or humans.
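
A sketch of that traditional recording step, assuming the remote controller delivers commands as (command, value) pairs (a hypothetical interface): each command is timestamped and stored in the machine's memory for later replay.

```python
import time

def record_demonstration(command_stream):
    """Store the operator's remote-control commands in the machine's memory.

    `command_stream` is any iterable of (command, value) pairs coming from
    the remote controller -- a hypothetical interface for this sketch.
    """
    memory = []
    start = time.monotonic()
    for command, value in command_stream:
        memory.append((time.monotonic() - start, command, value))
    return memory

# Example: a canned "first flight" steered by the operator.
demo = record_demonstration([("throttle", 0.6), ("yaw", -0.2), ("throttle", 0.0)])
print(demo)  # [(t0, 'throttle', 0.6), (t1, 'yaw', -0.2), ...]
```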

In those systems, the robot uses cameras to observe other agents, such as similar robots, and then copies those actions into its mass memory. A humanoid robot that imitates humans during learning can watch how a person moves their hands and legs during a task, such as moving boxes.

The robot stores those movement series and reuses them when it is ordered to move similar boxes: it sees the box and connects that image to the database entry holding the matching movement series. A robot could even learn by watching a screen; in that model, it could sit in an armchair and watch TV to learn things such as how to cook.
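
A sketch of that image-to-database connection, assuming the camera image has already been reduced to a feature vector (the embedding step is omitted here): the robot picks the stored movement series whose feature vector is nearest to what it sees.

```python
import numpy as np

# Hypothetical database: object feature vector -> recorded movement series.
database = {
    "small_box": (np.array([0.9, 0.1, 0.3]),
                  ["reach", "grasp", "lift", "walk", "release"]),
    "large_box": (np.array([0.2, 0.8, 0.7]),
                  ["reach_wide", "grasp_both", "lift_slow"]),
}

def match_movement_series(observed_features):
    """Connect the seen image (as a feature vector) to the closest stored series."""
    best = min(database,
               key=lambda name: np.linalg.norm(database[name][0] - observed_features))
    return database[best][1]

seen = np.array([0.85, 0.15, 0.25])   # features of the box the robot sees
print(match_movement_series(seen))    # -> the 'small_box' movement series
```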

When a computer learns something, it attaches a new skill module to itself. If we want a drone helicopter to fly through a labyrinth, we can record a film of a remote-controlled drone flying the route and then load that recording into the other drones' memories. Sensors like GPS, inertial navigation, and laser rangefinders can also make drones operate independently.
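
A sketch of a skill module built from such a recording, reusing the timestamped command format from the recording sketch above; the send callable that forwards commands to the drone is a placeholder.

```python
import time

class SkillModule:
    """A named, replayable command sequence -- the 'film' of the route."""
    def __init__(self, name, timed_commands):
        self.name = name
        self.timed_commands = timed_commands  # [(t, command, value), ...]

    def replay(self, send):
        """Fly another drone by replaying the recorded route.
        `send` is a hypothetical callable that forwards a command to the drone."""
        start = time.monotonic()
        for t, command, value in self.timed_commands:
            # Wait until the recorded timestamp before issuing each command.
            time.sleep(max(0.0, t - (time.monotonic() - start)))
            send(command, value)

# Load the recorded labyrinth flight into another drone's memory and fly it.
labyrinth = SkillModule("labyrinth_route", [(0.0, "throttle", 0.6), (1.5, "yaw", -0.3)])
labyrinth.replay(lambda c, v: print(f"drone <- {c}={v}"))
```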

However, drone pairs make it possible to get similar results with cheap drones that need not carry complicated sensors. Cameras alone are enough to build impressive, independently operating drones.

In the simplest and most effective model, the drones in a pair operate as a team: each observes the other and warns it if it drifts too close to a wall. The drones only need to know the speed of the remote-controlled drone.
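
A sketch of that watchdog role, with the caveat that the partner's wall distance would in practice be estimated from camera images; here it arrives as a ready number.

```python
# Team model: each drone watches its partner and warns it when it drifts
# too close to a wall. The wall-distance estimate would come from the
# observer's camera; passing it in directly is an assumption of this sketch.

SAFE_DISTANCE_M = 0.5  # assumed safety margin

def watch_partner(partner_wall_distance_m, send_warning):
    """Called each frame with the partner's estimated distance to the nearest wall."""
    if partner_wall_distance_m < SAFE_DISTANCE_M:
        send_warning("too close to wall -- correct course")

# Example frame: the observed partner is 0.3 m from a wall.
watch_partner(0.3, lambda msg: print("to partner:", msg))
```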

The drones must then copy the turns into their own memories. If the drones are identical, they can reuse the information about how many rotations per minute the remote-controlled drone's motors ran, along with the time each part of the labyrinth took. And if the labyrinth is open from above, one drone can hover over it and tell the low-flying drone where to go.
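
A sketch of that copy-the-turns idea, with illustrative numbers: if the drones are identical, replaying the recorded motor RPM for the recorded segment duration should reproduce the route.

```python
import time

# Recorded from the remote-controlled drone: motor RPM and how long each
# part of the labyrinth took (illustrative numbers, not real flight data).
route_segments = [
    {"rpm": 5200, "seconds": 2.0},   # straight run
    {"rpm": 4800, "seconds": 0.8},   # left turn
    {"rpm": 5200, "seconds": 3.1},   # straight run
]

def replay_route(set_motor_rpm, segments):
    """On an identical drone, the same RPM for the same time repeats the route."""
    for seg in segments:
        set_motor_rpm(seg["rpm"])
        time.sleep(seg["seconds"])
    set_motor_rpm(0)  # stop at the end of the route

replay_route(lambda rpm: print(f"motors -> {rpm} rpm"), route_segments)
```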


https://scitechdaily.com/the-future-of-flight-researchers-develop-neuromorphic-drones-that-learn-like-animals/
