
How to teach social skills to robots and AI?



One approach is to build a large-scale database structure that stores the different forms words take in social situations. The programmer then makes the first connections between the databases, and after that the system starts to talk with its creators.

In theory, teaching social skills to an AI is quite easy: it's just a matter of connecting databases. But social skills are more than saying the right words. They include noticing facial expressions, taking your hat off when entering a church, and many other things.

But if we want an AI that converses like a real person, that requires many databases and many connections between them. If the system is to conduct job interviews, it needs a great deal of data. And if somebody says or asks something that is not in the database, the system might ask the programmer for the answer, spoken aloud or typed into the computer.

The real goal is an AI that can talk with us about everyday situations, and that is genuinely hard to build. We rarely notice that people do not use written standard language in everyday speech, so the databases must also understand dialects. And an AI with a very broad set of skills requires very large databases.

The problem with conventional computer programs is that they do not use fuzzy logic. In conversational programs, an approximation of fuzzy logic is created by connecting multiple databases to a given social dialogue. Those databases include dialect words.

The dialect databases are connected to databases of written standard language, and those in turn are connected to the database containing the social dialogue. The programmer can then write social dialogue using dialect words, and that makes the robot more like a human.
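The database linking described above can be sketched in a few lines. This is a toy in-memory version, and all the words, phrases, and names in it are invented for illustration:

```python
# Dialect words mapped to their written-standard forms (toy data).
DIALECT_TO_STANDARD = {
    "gonna": "going to",
    "wanna": "want to",
    "y'all": "you all",
}

# Standard-language phrases linked to canned social-dialogue responses.
DIALOGUE = {
    "how are you": "I'm fine, thank you. How are you?",
    "what is your name": "My name is Robo.",
}

def normalize(utterance: str) -> str:
    """Replace dialect words with their written-standard forms."""
    words = utterance.lower().strip("?!. ").split()
    return " ".join(DIALECT_TO_STANDARD.get(w, w) for w in words)

def respond(utterance: str):
    """Look up a response via the standard-language key; None if no match."""
    return DIALOGUE.get(normalize(utterance))

print(respond("How are you?"))  # matched through the standard-language layer
```

Dialect input is first translated toward standard language, and only the standard form is used as the key into the dialogue database, so one response can serve many dialects.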

If the social behavior, such as the words the AI uses, is right, the programmer or teacher accepts it; otherwise it is rejected. If there is no matching answer at all, the programmer writes the correct answer for the machine.
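That teach-by-correction loop can be sketched as follows. This is a hypothetical minimal version: unknown questions are routed to a human teacher, and the supplied answer is stored for next time:

```python
# Known question -> answer pairs (toy data).
answers = {"hello": "Hello! Nice to meet you."}

def ask(question: str, teacher) -> str:
    """Answer from the database, or learn the answer from the teacher."""
    key = question.lower().strip("?!. ")
    if key not in answers:
        # No match: the programmer/teacher supplies the right answer,
        # and the system remembers it for future conversations.
        answers[key] = teacher(question)
    return answers[key]

# Simulated teacher filling in a missing answer.
reply = ask("What is fuzzy logic?", teacher=lambda q: "Logic with degrees of truth.")
```

After the first correction, the same question is answered from the database without bothering the teacher again.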

The idea is that when a person converses with the AI, it records what the other party says and then answers. This kind of AI might use other parameters as well, such as images of facial expressions in certain situations. The AI could be the ultimate tool in settings like job interviews, and especially video interviews.

The system can track things like the length of the pauses between words, and how the voice changes while the person talks. It can also watch for cues such as touching the nose during the interview, which are sometimes read as signs of lying. That kind of system could also detect whether the interviewee actually has the skills the interviewer wants to find.
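Measuring pauses between words is the simplest of these signals. A toy illustration, assuming per-word (start, end) timings in seconds are already available from a speech recognizer:

```python
def pause_lengths(word_timings):
    """Return the silent gaps between consecutive words."""
    return [nxt_start - prev_end
            for (_, prev_end), (nxt_start, _) in zip(word_timings, word_timings[1:])]

# Example timings: a long 1.5 s hesitation before the third word.
timings = [(0.0, 0.4), (0.5, 0.9), (2.4, 2.8)]
pauses = pause_lengths(timings)
long_pauses = [p for p in pauses if p > 1.0]  # flag hesitations over 1 second
```

The 1-second threshold here is arbitrary; a real system would calibrate it against the speaker's normal pace.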

When the interviewer asks about things like computer skills, or what to do in certain situations, a job seeker may cover up missing skills with long answers. That way the interviewer does not notice the gaps in the skills the role requires.

Take, for example, how to connect systems to the net. The system could record the answer and then compare it with the actions the worker should actually take in that case. If there is no match, the person does not know what to do.
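A hedged sketch of that comparison, using simple keyword overlap between the transcribed answer and the required actions. The keyword set and the scoring rule are invented for illustration; a real system would need far richer matching:

```python
# Action keywords the task requires (hypothetical example for
# "connect systems to the net").
REQUIRED_ACTIONS = {"cable", "router", "ip", "dhcp"}

def coverage(answer: str) -> float:
    """Fraction of required action keywords mentioned in the answer."""
    words = {w.strip(".,!?") for w in answer.lower().split()}
    return len(REQUIRED_ACTIONS & words) / len(REQUIRED_ACTIONS)

score = coverage("First plug in the cable, then let the router assign an ip address")
# A low score suggests the answer talks around the task instead of describing it.
```

A long but empty answer scores low, which is exactly the mismatch the paragraph above describes.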

In job interviews, the AI can also search for the same names recurring in the referee lists of different job applications. If the same referees always appear in the CVs of people whose work is not up to standard, there is something wrong with those referees.
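Counting referees across applications is a straightforward aggregation. A minimal sketch with invented names, flagging any referee who vouches for three or more different applicants:

```python
from collections import Counter

# Toy application records; names are invented.
applications = [
    {"applicant": "A", "referees": ["J. Smith", "M. Jones"]},
    {"applicant": "B", "referees": ["J. Smith", "K. Lee"]},
    {"applicant": "C", "referees": ["J. Smith"]},
]

# Count how often each referee name appears across all applications.
referee_counts = Counter(name for app in applications
                         for name in app["referees"])

# The same name recurring in unrelated CVs is a warning sign.
suspicious = [name for name, n in referee_counts.items() if n >= 3]
```

The threshold of three is arbitrary; the useful signal is the same referee turning up across otherwise unrelated applicants.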


https://likeinterstellartravelingandfuturism.blogspot.com/
