Our human habit of anthropomorphizing everything
Should we be anthropomorphizing AI?

Humans anthropomorphize everything. We assign human traits and emotions to animals, inanimate objects, and even software.
“Gmail is acting finicky today.”
“I swear my cat threw up on the rug just to spite me.”
“Siri can be so dumb sometimes.”
The reality is that animal behavior doesn’t always map onto human behavior. Software and AI don’t “behave” at all; they function according to their code. But humans, as social animals, readily interpret certain outputs as “behaviors.” Humanizing the tech we use makes it feel a little more understandable.
But anthropomorphizing can backfire. Rather than making complex systems like AI more understandable, it can contribute to further mystification and misunderstanding.

What does anthropomorphizing look like?
During a qualitative usability study of ChatGPT, the Nielsen Norman Group observed four patterns of user behavior that assigned human…