June 16, 2021

Coach-based artificial intelligence


The reason why Artificial Intelligence is difficult to realize is not located in the machine itself, because software engineers are experts at writing software. The source code compiles into binary code, and if the compiler doesn't find a mistake, everything seems to work fine. The more demanding problem arises when a program looks fine at first glance but at the same time does not do what is expected. That means the program compiles cleanly into binary code, yet the robot isn't able to solve its task.
Fixing such problems is much harder, and if it is not done, the result is another failed robot project. Before it is possible to realize more advanced software, the bottleneck in existing systems has to be described. It has to do with man-machine interaction. The human doesn't know how to command the machine, and the machine isn't able to obey the human. It may sound unusual, but Artificial Intelligence can only be realized through better man-machine interaction.
In a teleoperated situation, there is a human operator and a remote machine, for example a robot. Realizing such systems is possible: the joystick commands of the human are converted into machine actions. Such systems work for a robot crane, but they have failed for more demanding tasks like controlling a train or a car. The reason is that these systems need very complicated input signals. For example, the cockpit of a car provides not a single joystick; the human operator has to steer the car left and right, has to decide which speed is needed, and in some older cars has to control the gears as well.
It is complicated or even impossible to control such a system by remote control. So the question is how to design a human-machine interface that fits more complex domains. A possible attempt is a coach-based interaction paradigm, which amounts to a hierarchical system: there is a coach in the right seat of the car, a driver in the left seat, and a cockpit which receives the commands of the driver.
Such a system consists of an advanced command hierarchy. The coach provides the high-level goal, for example "move ahead"; the driver converts this command into low-level steering commands; and the cockpit translates the actions of the driver into motor commands. Let us imagine how to automate this pipeline a bit. The idea is to replace only the human driver with software but leave the coach in the car. That means the human operator plays a certain role: he provides high-level goals for the AI software, and the AI software converts these goals into low-level steering actions.
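The three-layer hierarchy can be sketched in a few lines of code. This is a minimal illustration, not a real robotics API; all class and method names here are assumptions chosen for readability.

```python
# Sketch of the coach -> driver -> cockpit command hierarchy.
# All names (Cockpit, SoftwareDriver, apply, execute) are illustrative.

class Cockpit:
    """Lowest layer: translates driver actions into motor commands."""
    def __init__(self):
        self.last_command = None

    def apply(self, steering_angle, speed):
        self.last_command = (steering_angle, speed)
        print(f"motor command: angle={steering_angle}, speed={speed}")

class SoftwareDriver:
    """Middle layer: converts a high-level goal into steering actions.
    This is the part that replaces the human driver."""
    def __init__(self, cockpit):
        self.cockpit = cockpit

    def execute(self, goal):
        if goal == "move ahead":
            self.cockpit.apply(steering_angle=0.0, speed=20.0)
        else:
            raise ValueError(f"unknown goal: {goal}")

# The human coach stays in the loop and only issues high-level goals.
cockpit = Cockpit()
driver = SoftwareDriver(cockpit)
driver.execute("move ahead")
```

The point of the sketch is the division of labor: the coach's vocabulary stays small and human-friendly, while everything low-level is hidden inside the driver layer.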
Such a system can be realized, but it remains an advanced software project. Parsing the command itself is the easiest part of the system. There is plenty of voice recognition software available that can convert a spoken command into text. The human coach produces a .wav file, and this gets converted into the string "move forward". The more demanding task is the follow-up step: how to convert the command string into low-level steering commands?
One option would be a lookup table realized with a case statement. That means each command is converted into a predefined action sequence, similar to a macro. In the example, "move forward" would be converted into "car.speed(20)". The problem is that such a direct mapping can't handle different situations. The command "move forward" can be used in many situations, but it will require different low-level actions each time. The more elaborate translation attempt is to interpret a command as a goal; it is then up to a solver to fulfill the goal.
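The lookup-table idea can be made concrete with a dictionary that maps each command string to a fixed macro. Again, the Car class and its methods are hypothetical names used only for this sketch.

```python
# Sketch of the lookup-table (case statement) approach.
# Car and set_speed are illustrative assumptions, not a real API.

class Car:
    def __init__(self):
        self.speed = 0

    def set_speed(self, mph):
        self.speed = mph

def execute_command(car, command):
    # Each command triggers one predefined action sequence, like a macro.
    table = {
        "move forward": lambda: car.set_speed(20),
        "stop":         lambda: car.set_speed(0),
    }
    action = table.get(command)
    if action is None:
        raise KeyError(f"no macro for command: {command}")
    action()

car = Car()
execute_command(car, "move forward")
# car.speed is now 20 regardless of the driving situation --
# this rigidity is exactly the weakness described above.
```

The table always produces the same actions for the same string, which is why it breaks down as soon as the situation matters.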
For example, the command "move forward" can be translated into the following goals:
1. remain in the current lane
2. hold a speed of 20 mph
3. if the current steering angle or the current speed differs from these targets, adjust it
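The goal list above can be sketched as a small solver that compares the current car state against the target conditions and emits only the actions that are actually needed. All names and the numeric thresholds are assumptions made for this illustration.

```python
# Sketch of the goal-based alternative: "move forward" becomes a set of
# target conditions, and a solver adjusts the state until they hold.
# CarState and solve_move_forward are hypothetical names.

class CarState:
    def __init__(self, lane, speed, steering_angle):
        self.lane = lane
        self.speed = speed
        self.steering_angle = steering_angle

def solve_move_forward(state, target_lane, target_speed=20.0):
    """Return the low-level actions needed to satisfy the goal."""
    actions = []
    if state.lane != target_lane:            # goal 1: stay in the lane
        actions.append(f"change to lane {target_lane}")
        state.lane = target_lane
    if state.steering_angle != 0.0:          # goal 3: adjust steering
        actions.append("straighten steering")
        state.steering_angle = 0.0
    if state.speed != target_speed:          # goal 2: hold 20 mph
        actions.append(f"adjust speed to {target_speed} mph")
        state.speed = target_speed
    return actions

state = CarState(lane=1, speed=15.0, steering_angle=2.0)
actions = solve_move_forward(state, target_lane=1)
print(actions)
```

Unlike the lookup table, the same command now yields different low-level actions depending on the current situation: a car that is already at 20 mph with straight steering gets an empty action list.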