
The new brain activity scanner transforms thoughts into text.


The new brain activity scanner makes it possible to operate computers with thoughts. If the thoughts-to-text application is connected to a radio, the user can send text messages just by thinking. The biggest problem in BCI (Brain-Computer Interface) work is how to feed the EEG signal into the computer. 

The "thoughts-to-text" application uses a model similar to voice-command applications: spoken words are transformed into text, and the system then sends that text to the computer's control interface. Researchers can combine this kind of EEG-controlled system with a voice-command application. 

The system uses the layer that transforms thoughts into text as the layer that turns thoughts into commands for the computer. In the simplest model, the user activates the BCI system with a button. The system then transforms thoughts into text, and an AI recognizes any words that look like commands. The user can confirm or reject those commands with a small joystick. 
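The activation cycle described above can be sketched as a short pipeline. This is a minimal illustration only: the function names, the command list, and the stubbed decoding and confirmation steps are all hypothetical stand-ins for a real EEG decoder and joystick input.

```python
# Sketch of the simplest BCI command cycle: activate -> decode thoughts
# to text -> match known command words -> confirm via joystick -> execute.
# All names and stubs here are hypothetical, not a real BCI API.

KNOWN_COMMANDS = {"open mail", "send message", "shut down"}

def extract_commands(decoded_text: str) -> list[str]:
    """Return any known command phrases found in the decoded text."""
    lower = decoded_text.lower()
    return [cmd for cmd in KNOWN_COMMANDS if cmd in lower]

def confirm_with_joystick(command: str) -> bool:
    """Stub: in a real system, the user confirms or rejects
    the proposed command with a small joystick."""
    return True  # assume the user confirms, for this sketch

def run_bci_cycle(decoded_text: str) -> list[str]:
    """One button-activated cycle: decode, match, confirm, execute."""
    executed = []
    for command in extract_commands(decoded_text):
        if confirm_with_joystick(command):
            executed.append(command)  # a real system would act here
    return executed
```

For example, if the decoder emits "please open mail now", the cycle would propose `open mail` and run it once the joystick confirms; text with no command words produces no action.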


"Researchers at The University of Texas at Austin have developed a semantic decoder that converts brain activity into a continuous text stream, according to a study published in Nature Neuroscience. This non-invasive AI system relies on a transformer model and could potentially aid individuals unable to physically communicate due to conditions like strokes. Participants undergo training with the decoder by listening to hours of podcasts while in an fMRI scanner. The decoder then generates text from brain activity while the participant listens to or imagines a story." (ScitechDaily.com/Not Science Fiction: Brain Activity Decoder Transforms Thoughts Into Text)


By connecting this type of system with augmented reality, it is possible to build an operating interface where a person does not have to move even a finger. The idea is that the system tracks a certain point in the eye and aims a crosshair at the corresponding point on the screen. The user can then blink: a left-eye blink might act as the left mouse button, and a right-eye blink as the right mouse button. With such a system, a person can fill in forms on a computer screen using the BCI alone. The gaze point tells the system which field the user wants to fill, and then the person just thinks the content. 

Thoughts-to-text applications are pathfinders for systems that can also project the visual things a person imagines. If those kinds of advancements come into use, we could make films with our imagination and show them to other people. Systems that can transform EEG into film and sound could also uncover the secrets of dreams. The ability to see what another person thinks and dreams would be revolutionary. 

That kind of system could also revolutionize things like criminal investigations. Investigators could see everything a suspect or witness saw during a crime, and the system could serve as a next-generation lie detector. On the other hand, if somebody carries such a system, other people can see and hear everything that person sees and hears. 


https://scitechdaily.com/not-science-fiction-brain-activity-decoder-transforms-thoughts-into-text/
