Which human metacognitive structures AI is lacking

Learning is not only about feeding on bottom-up data

Jean-marc Buchert · Published in UX Collective · 5 min read · May 5, 2021


A strangely colored structure. Photo by JJ Ying on Unsplash.

While computer models need large amounts of data to learn a word, a human child needs only two or three occurrences to understand it. Where does this remarkable ability to generalize knowledge come from?

From birth, the human brain relies on metacognitive rules that foster the learning of language, physical laws, and human relationships. Whether moving intuitively through its environment or guessing the meaning of a word, human intelligence does not start from raw data but structures its knowledge on strong cognitive foundations.

Here is what these foundations tell us about machine learning models' capabilities and limitations.

The power of human innate meta-knowledge

There are many ways to compare human and machine learning capabilities. But the most telling is to look at how artificial neural network models learn.

While an algorithm like DeepMind's takes thousands of hours of play to learn an Atari game, a human player needs only about ten. While AI translation models ingest millions of examples to capture semantic correspondences between words, a human baby only needs to hear a few occurrences to understand the meaning of…
