Public lecture: Talking to robots
Robots may soon be delivering pizza and parcels, driving cars and helping the kids with their homework. How will they communicate with us?
They could download a dictionary and start talking, but a dictionary does not understand its own words, nor would the robots. Language is not that simple.
So, could a robot ever say-what-it-means and mean-what-it-says? This talk is about exactly that: robots, called Lingodroids, that really do understand the language they use. We know these robots understand because they invent their own words to describe their own experiences.
This public lecture described how such robots roam the world, asking each other questions like “Where are we?” and “What time is it?” Through these conversations, they invent words for places and times, linking the meanings to their own experiences. Linking a place in the world to its name is called “grounding”, and it is the first step in having robots truly understand what they say. The talk also described how the Lingodroids invent robot languages, and finished with a glimpse into how conversations with people can help robots learn the meanings of words in human languages.
The public lecture was held at the Powerhouse Museum on Tuesday 9 February and the speaker was:
- Professor Janet Wiles, Complex and Intelligent Systems, University of Queensland