The new mind-reading cap could make Neuralink redundant.
Mind-reading AI can turn thoughts into text. The next step is to take images from the mind and project them onto a computer screen. For the first time in history, an AI has decoded images from the mind to a screen. That is one of the biggest advances in the development of BCI (Brain-Computer Interface) systems. Those systems can control anything from quadcopters and other robots to implanted animals through a computer.
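To make the thoughts-to-text idea concrete, here is a minimal, purely hypothetical sketch of the core decoding step: mapping a brain-signal feature vector to the nearest known "word signature." Real decoders described in the linked articles use fMRI or dense EEG recordings plus large AI models; every signature and word below is invented for illustration.

```python
import math

# Imagined per-word "brain signatures": feature vectors a trained system
# might have learned from recorded brain activity (values are made up).
SIGNATURES = {
    "hello": [0.9, 0.1, 0.3],
    "robot": [0.2, 0.8, 0.5],
    "fly":   [0.4, 0.4, 0.9],
}

def decode_word(features):
    """Return the word whose stored signature is closest (Euclidean) to the input."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(SIGNATURES, key=lambda w: dist(SIGNATURES[w], features))

print(decode_word([0.85, 0.15, 0.25]))  # nearest signature is "hello"
```

A real system would replace the toy nearest-neighbour lookup with a trained model, but the overall shape, signal in, text out, is the same.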
The difference between this new invention and implants like Neuralink is that this system needs no surgical operation. The system is easy to wear because it's like a hat. And because it can translate thoughts into text, it can be connected to well-known software like word processors. The system can pass that text on to tools like an AI that creates code for robots. Maybe next-generation jet fighters will have this kind of user interface. BCIs (Brain-Computer Interfaces) are fascinating tools that could revolutionize police work and make new and powerful computing possible.
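The idea of dumping decoded text into other software can be sketched as a simple dispatcher that routes the same thought-text to several downstream applications. Every handler name here is a hypothetical stand-in, not a real integration.

```python
# Hypothetical routing of decoded thought-text to downstream tools:
# a word-processor buffer and a prompt for a code-generating AI.
def route(decoded_text, handlers):
    """Send the decoded text to every registered application handler."""
    return {name: handler(decoded_text) for name, handler in handlers.items()}

document = []  # stand-in for a word processor's text buffer
handlers = {
    "word_processor": lambda text: document.append(text) or len(document),
    "code_ai_prompt": lambda text: f"# TODO: generate robot code for: {text}",
}
results = route("move the quadcopter forward", handlers)
# document now holds the dictated sentence; results holds each handler's output
```

The point of the dispatcher shape is that the mind-reading cap only needs one text output; everything after that is ordinary software plumbing.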
The next step is for the AI to read images from the human mind. That makes these systems more interesting than ever before. The intelligent hat could also be used to read images from an animal's memory. But that requires the system to know the electrical activity in the animal's brain so well that it can decode that data.
https://interestingengineering.com/innovation/worlds-first-mental-images-extracted-from-human-brain-activity-using-ai
But an interactive data cap can do many more things than just give orders to ChatGPT or control a robot. Users can combine VR (Virtual Reality) glasses with the data hat, which gives the ability to use a robot as an external body. In the most powerful version of the technology, the system delivers feedback to the cerebral cortex in the form of electrical stimulation. That makes it possible to connect the robot to part of the human nervous system.
An interactive BCI system makes it possible to create technical telepathy or a technical OBE (Out-of-Body Experience). The simplest way to make a technical OBE is to connect a person to mind-reading electrodes. Interactive mode with VR glasses or cortex stimulation then makes it possible for that person to fly around by thinking.
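The "fly around by thinking" loop above can be sketched as a tiny simulation: decoded movement thoughts steer a drone, and the drone's state is what the VR glasses would render back to the user. The class and the command vocabulary are invented for illustration; no real drone or BCI API is implied.

```python
# Hypothetical closed-loop "technical OBE": thought commands move a
# simulated drone, whose state feeds the VR view on each step.
class SimulatedDrone:
    def __init__(self):
        self.x = self.y = self.altitude = 0

    def apply(self, command):
        """Apply one decoded thought command; return the new pose for the VR feed."""
        moves = {"up": (0, 0, 1), "down": (0, 0, -1),
                 "forward": (0, 1, 0), "back": (0, -1, 0)}
        dx, dy, dz = moves.get(command, (0, 0, 0))  # unknown thoughts do nothing
        self.x += dx
        self.y += dy
        self.altitude += dz
        return (self.x, self.y, self.altitude)

drone = SimulatedDrone()
for thought in ["up", "up", "forward"]:
    view = drone.apply(thought)
# view == (0, 1, 2): two climbs and one move forward
```

The feedback half of the loop, stimulation or a VR image returning to the user, is what would close this into a genuine out-of-body sensation.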
The intelligent cap that interacts with computers is a flexible tool. The system could be used for technical remote viewing. There is also the possibility that two BCI caps could transport information between each other, and that kind of system would make technical telepathy possible. A thoughts-to-text application can turn the decoded text into speech using a text-to-speech application, which completes the telepathy chain.
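That two-cap telepathy chain is just three stages in sequence: decode, transmit, speak. Here is a hypothetical end-to-end sketch where each stage is a stand-in function; a real system would use an EEG-to-text model, a network link, and a text-to-speech engine in those slots.

```python
# Hypothetical "technical telepathy" pipeline between two BCI caps.
def decode_thoughts(raw_signal):
    # Stand-in for cap A's EEG-to-text model.
    return raw_signal.strip().lower()

def transmit(text):
    # Stand-in for the network hop between the two caps.
    return text.encode("utf-8").decode("utf-8")

def speak(text):
    # Stand-in for a text-to-speech engine on cap B's side.
    return f"[spoken] {text}"

message = speak(transmit(decode_thoughts("  Hello From Cap A  ")))
# message == "[spoken] hello from cap a"
```

Each stage already exists in some form on its own; the speculative part of the blog's claim is only the decoding quality at the first step.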
It's possible to connect this kind of system to a microchip implanted in insects. Those cyborg insects can get their commands from remote-control systems. Maybe in the future, AI can read the information that travels in an insect's nervous system, and that would make the technical OBE (Out-of-Body) experience possible. In that system, the data cap interacts with a microchip implanted in some other species, like birds or insects. But drones like quadcopters can also give interesting experiences.
https://interestingengineering.com/innovation/mind-reading-ai-thoughts-text
https://interestingengineering.com/innovation/uk-fighter-mind-reading-helmet
https://interestingengineering.com/innovation/worlds-first-mental-images-extracted-from-human-brain-activity-using-ai
https://www.theguardian.com/technology/2023/may/01/ai-makes-non-invasive-mind-reading-possible-by-turning-thoughts-into-text
https://en.wikipedia.org/wiki/Out-of-body_experience