
Researchers can use AI-driven video games to test machine learning.




Statistics-based machine learning is comparatively simple to implement.


AI playing video games is a big deal because those systems can be used to train machine learning. The virtual environment is an excellent place to develop AI and its ability to learn things. The screenshot above this text is from the sniper game on the Y8 homepage. In that type of game, the AI can observe the direction in which the gamer moves.

The base of this machine learning is statistics. The system builds a database of the player's movement directions, and the AI opponent then uses those statistics to aim at the player. Those statistics are the tool that can turn the AI into an almost unbeatable opponent.
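Below is a minimal sketch of how such a statistics-based opponent could work: it simply counts how often the player has dodged in each direction and aims at the most frequent one. The class and method names are illustrative, not taken from any real game engine.

```python
from collections import Counter
import random

class StatisticalAimer:
    """Toy model of a statistics-based AI opponent.

    It records every dodge direction the player makes and aims at the
    direction the player has chosen most often in the past.
    """

    def __init__(self):
        self.history = Counter()

    def observe(self, direction):
        """Record one observed player movement ('left' or 'right')."""
        self.history[direction] += 1

    def predict_dodge(self):
        """Return the direction the player is most likely to dodge next."""
        if not self.history:
            return random.choice(["left", "right"])
        return self.history.most_common(1)[0][0]


# Example: a player who has dodged to the right most of the time so far.
ai = StatisticalAimer()
for move in ["right", "right", "left", "right", "right"]:
    ai.observe(move)

print(ai.predict_dodge())  # -> 'right'
```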

But that is also the biggest weakness of this kind of statistics-based algorithm. The base model describes the direction in which a gamer typically wants to move. Most gamers turn in a certain direction because they are right-handed. Left-handed people often do better in martial arts like boxing for a related reason: against a right-handed opponent, the punch comes from the unexpected side.

If the gamer is left-handed, the system makes the wrong estimate about the direction in which the gamer moves the character. Most people are right-handed, and most right-handed people follow certain patterns when choosing where to move their character. A left-handed person tends to choose the opposite direction from a right-handed one, which means the AI aims the wrong way.
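That weakness can be shown with a short, self-contained simulation: a direction model trained on right-leaning movement data keeps predicting "right", so it hits a typical right-handed player often but misses a mirrored, left-handed player most of the time. The 70/30 split is an assumption used only for illustration.

```python
import random

random.seed(0)

def train_direction_model(moves):
    """Estimate P(dodge == 'right') from observed moves."""
    return sum(m == "right" for m in moves) / len(moves)

def predict(p_right):
    """Aim at the more probable direction."""
    return "right" if p_right >= 0.5 else "left"

# Assumed, illustrative distributions: right-handed players dodge right
# about 70% of the time, left-handed players mirror that behaviour.
right_handed = ["right" if random.random() < 0.7 else "left" for _ in range(1000)]
left_handed  = ["left"  if random.random() < 0.7 else "right" for _ in range(1000)]

# Model built only from right-handed data.
p_right = train_direction_model(right_handed)

hits_vs_right = sum(predict(p_right) == m for m in right_handed) / len(right_handed)
hits_vs_left  = sum(predict(p_right) == m for m in left_handed) / len(left_handed)

print(f"hit rate vs right-handed players: {hits_vs_right:.0%}")  # roughly 70%
print(f"hit rate vs left-handed players:  {hits_vs_left:.0%}")   # roughly 30%
```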


Next-generation autopilot parking systems, and the use of virtual reality (VR) to test AI-based autopilots.


A drone connected to a next-generation autopilot lets the AI drive the car into a parking space very effectively. The drone can sit in a box on the roof of the car. When the autopilot drives the car into the parking space, the drone rises above the vehicle, and the autopilot can use the drone's camera to see where the car's corners are heading.
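A rough sketch of how the autopilot might use the drone's top-down view: read the car's corner positions from the overhead image and check how much clearance each corner still has to the edges of the parking space. The coordinate frame, helper names, and measurements are assumptions, not a real autopilot API.

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float  # metres, in the drone's top-down image frame
    y: float

def corner_clearances(corners, space_min, space_max):
    """Distance from each car corner to the nearest edge of the parking space.

    corners   -- the four car corners seen in the drone's top-down view
    space_min -- lower-left corner of the parking space
    space_max -- upper-right corner of the parking space
    """
    clearances = []
    for c in corners:
        dx = min(c.x - space_min.x, space_max.x - c.x)
        dy = min(c.y - space_min.y, space_max.y - c.y)
        clearances.append(min(dx, dy))
    return clearances

# Illustrative numbers: a 4.5 m x 1.8 m car inside a 5.5 m x 2.5 m space.
car = [Point(0.5, 0.35), Point(5.0, 0.35), Point(5.0, 2.15), Point(0.5, 2.15)]
space_min, space_max = Point(0.0, 0.0), Point(5.5, 2.5)

for corner, clearance in zip(car, corner_clearances(car, space_min, space_max)):
    print(f"corner ({corner.x:.1f}, {corner.y:.1f}) -> {clearance:.2f} m of clearance")
```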

Computer games can also be used to test how fast people learn. The game is a virtual environment, and it can be used for many purposes. The virtual space allows testing of a self-driving car's visual observation systems. The car can stand on rollers while the system uses interactive movies to present different situations to the car. The system can use media projectors to create the VR environment.

The system follows the car's and its autopilot's reactions when the camera system sees something, like a moose that suddenly appears in front of the car. In those cases, the vehicle's camera system feeds information into the AI-based autopilot. During the R&D process, the developers can control the vehicle's digital twin in the real world, and that process collects data for the autopilot's program.
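A simplified sketch of what that test loop could look like: each projected scenario is shown to the car on the rollers, the autopilot's reaction and reaction time are read back, and the result is logged for later analysis. The scenario names, the simulated autopilot interface, and the timing numbers are illustrative assumptions, not a real vendor system.

```python
import random

random.seed(1)

class SimulatedAutopilot:
    """Stand-in for the real autopilot under test (illustrative only)."""

    def react_to(self, scenario):
        """Return (reaction, reaction_time_in_seconds) for a projected scenario."""
        if scenario == "moose_crosses_road":
            return "emergency_brake", random.uniform(0.2, 0.6)
        if scenario == "pedestrian_at_crosswalk":
            return "slow_down", random.uniform(0.3, 0.8)
        return "keep_lane", random.uniform(0.1, 0.3)

def run_test_bench(autopilot, scenarios):
    """Project each scenario to the car on the rollers and log the reaction."""
    log = []
    for scenario in scenarios:
        reaction, reaction_time = autopilot.react_to(scenario)
        log.append({"scenario": scenario,
                    "reaction": reaction,
                    "reaction_time_s": round(reaction_time, 2)})
    return log

scenarios = ["clear_road", "moose_crosses_road", "pedestrian_at_crosswalk"]
for record in run_test_bench(SimulatedAutopilot(), scenarios):
    print(record)
```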

The collected data is first stored in the autopilot system in the VR room. That safe environment lets the developers and the AI adjust the system so that it becomes as safe as possible.
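Once that reaction data is stored, the developers (or an optimization routine) can use it to tune safety parameters. A minimal, assumed example: pick a brake-trigger distance that still leaves enough stopping room even for the slowest reaction time recorded on the bench. All numbers and field names here are invented for illustration.

```python
# Assumed reaction-time log collected from the VR test bench (seconds).
reaction_log = [
    {"scenario": "moose_crosses_road", "reaction_time_s": 0.42},
    {"scenario": "moose_crosses_road", "reaction_time_s": 0.55},
    {"scenario": "pedestrian_at_crosswalk", "reaction_time_s": 0.61},
]

SPEED_M_S = 13.9           # ~50 km/h test speed (assumption)
BRAKING_DISTANCE_M = 14.0  # distance needed to stop from that speed (assumption)

# Worst-case reaction time observed on the bench.
worst_reaction = max(r["reaction_time_s"] for r in reaction_log)

# The autopilot should start braking at least this far from the obstacle:
# distance travelled during the worst reaction plus the physical braking distance.
trigger_distance = worst_reaction * SPEED_M_S + BRAKING_DISTANCE_M

print(f"worst reaction time: {worst_reaction:.2f} s")
print(f"brake-trigger distance: {trigger_distance:.1f} m")
```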


https://bigthink.com/the-future/why-ai-playing-video-games-is-a-big-deal/
