Layered communication based on a DIKW pyramid solves a simple problem: how a human can submit commands to a robot. For example, the human operator says "bring me the red box" and the robot fetches the box.
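The command-to-action flow can be sketched in code. This is a minimal, hypothetical illustration of a DIKW-style pipeline — the function names, the slot-filling parser, and the `Goal` dataclass are all assumptions for illustration, not part of any real robot framework:

```python
# Hypothetical sketch of a DIKW-style command pipeline: raw text (data)
# becomes a structured goal (information), which a planner expands into
# primitive robot actions (knowledge applied to the task).
from dataclasses import dataclass

@dataclass
class Goal:
    action: str   # e.g. "fetch"
    color: str    # e.g. "red"
    obj: str      # e.g. "box"

def parse_command(utterance: str) -> Goal:
    """Data -> information: turn raw text into a structured goal."""
    words = utterance.lower().split()
    # Naive slot filling: verb keyword, color adjective, last noun.
    action = "fetch" if "bring" in words else "unknown"
    color = next((w for w in words if w in {"red", "green", "blue"}), "any")
    return Goal(action, color, words[-1])

def plan(goal: Goal) -> list[str]:
    """Knowledge layer: expand the goal into primitive actions."""
    return [f"locate {goal.color} {goal.obj}",
            f"grasp {goal.color} {goal.obj}",
            "return to operator",
            "release object"]

steps = plan(parse_command("bring me the red box"))
```

The point of the layering is that each level only has to solve a small translation step; the robot never needs open-ended language understanding, only a mapping from a constrained utterance to a fixed action vocabulary.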
The unsolved question is why such human-to-robot interaction is needed at all. In the classical understanding of AI up to the 2000s, this kind of pattern was interpreted as a dead end. Artificial intelligence was described as an autonomous system which doesn't need to communicate with humans. A robot that doesn't communicate with humans is a closed system: it can run a computer program written in C or execute a neural network algorithm, but it doesn't use human language because there is no need to do so.
Grounded language, realized in AI systems like SHRDLU (1968), Vitra (1987), and the M.I.T. Ripley robot (2003), is only needed if human-to-machine interaction is intended. A possible explanation for this paradigm shift lies in the weakness of closed AI systems. Existing attempts to build autonomous robots have failed because closed systems are overwhelmed by complex environments. Even if the robot's software consists of 100k lines of C/C++ code, that code won't match a warehouse robot's task, because the goals are ambiguous and vague. Classical programming languages work only in a predictable environment, like sorting an array or drawing pixels on a monitor. Computer programs are the internal language of machines, but the code can't store the knowledge needed for robot tasks.
Before the advent of human-to-machine interaction there was another attempt to build more powerful software for robots, based on ontologies. Instead of storing world knowledge in computer code, the goal was to capture it in a Cyc-like mental map. Unfortunately, this concept failed too. Even OWL ontologies are not powerful enough to store domain knowledge. Only teleoperation copes with high complexity: a teleoperated robot arm can do any demanding task, including dexterous grasping and novel trajectories never seen before.
During teleoperation the AI problem is outsourced from the robot itself to an external source, namely the human operator. The teleoperation interface provides a man-to-machine channel which translates this external knowledge into robot movements.
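Such a translation layer is deliberately dumb. A minimal sketch, assuming a two-axis joystick and velocity-controlled joints (the function name, gain, and limit are illustrative, not taken from any real teleoperation API):

```python
# Minimal teleoperation mapping (illustrative only): the operator's
# joystick deflection is scaled and clamped into joint velocities.
# All intelligence stays on the human side; the code only translates.

def joystick_to_joint_velocities(dx: float, dy: float,
                                 gain: float = 0.5,
                                 limit: float = 1.0) -> tuple[float, float]:
    """Scale a 2-axis joystick input and clamp it to safe joint speeds."""
    clamp = lambda v: max(-limit, min(limit, v))
    return clamp(gain * dx), clamp(gain * dy)

# Full forward deflection on axis 1, slight pull on axis 2.
v1, v2 = joystick_to_joint_velocities(1.0, -0.4)
```

No world model, no planner: the operator perceives the scene and decides the trajectory, and the interface only has to be a faithful, bounded transducer.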
April 08, 2026
The need for human to robot communication
Labels: Teleoperation