UX Collective

We believe designers are thinkers as much as they are makers. https://linktr.ee/uxc


Tesla’s new ‘mind of car’ UI signals a future we’re not prepared for

Jono · Published in UX Collective · 6 min read · Jul 12, 2021

A look into the new ‘mind of car’ UI showing a car taking a right turn (Credit: NotaTeslaApp.com)

Shortly before the much-anticipated release of the FSD 9.0 beta to Tesla owners the other day, Elon Musk tweeted an intriguing detail about one of its key features:

What a way to spark some interest… But what exactly is the ‘mind of a car’?

‘Mind of the car’?… What’s this thing for?

The ‘mind of a car’ is an enhanced visualization of what the AI sees and thinks, built from the vision-only cameras on a Tesla with FSD installed. In an interview with Lex Fridman back in 2019, Musk shared his early vision for it:

“The whole point of the display is to provide a health check on the vehicle’s perception of reality… That information is rendered into vector space (bunch of objects, lane lines, etc.). You can confirm whether the car is knowing what’s going on or not by looking out the window.” — Elon Musk

It’s a sort of metaphor used to describe a display that helps drivers understand how the system thinks and distinguishes objects or obstructions within its field of view.

An older version of the display, which rendered cars and nearby objects as primitive shapes. (Source: Electrek)

The purpose of this feature is to give the driver an experience where they retain some oversight and control of the vehicle. While one can assume the driver’s hands-on-the-wheel control will slowly be forked over to FSD in the coming years, the display provides drivers a meaningful way to interact with the AI system in the interim.

How does the latest version work?… And what does it do for us?

The best way to understand the latest version is to see it for yourself. After doing some digging, I found a recorded test drive demonstrating how this feature works today. You can have a look at the full video here.

