How Artificial Intelligence Sees, Hears, and Relates Concepts

November 22, 2017, at 21:22 UTC

While most AI companies focus on teaching computers to beat humans in win/lose scenarios (e.g. IBM with Deep Blue and Watson on Jeopardy!, DeepMind with AlphaGo, ...), at muse.ai we have been focused on making machines see, hear, and relate concepts in a way similar to humans.

Ensuring that humans and machines perceive the world consistently is not only foundational for a Video Search company like ours; it is also key to evolving machines so that they can empathize and collaborate with humans to devise win/win outcomes.

In this context, I gave the following presentation at SecondHome, describing:

  • how the concept of Artificial Intelligence is as old as human imagination;
  • the sequence of key developments that led to today's "AI hype";
  • how many jobs have already (thankfully) been automated away; and
  • how machines see color, objects, hear sounds, recognize speech, and relate concepts.

The running theme throughout the presentation was that context and inherent biases have a large impact on how humans perceive color and sound and how we relate concepts; if machines are to understand the world as we humans do, they too need to be loaded with the same preconceptions.



Antonio Roldao © 1995-2017