How a bot changed the way we talk with each other

By Jennifer Schmich · Published in Geek Culture · Aug 3, 2021


Coauthored by Sarah Mohs, Senior Content Designer at Intuit


How often have engineers asked your content team to build something? When we got a meeting invite from DevOps, we were confused.

Ashish and Wendi came to us with a project for Intuit’s Engineering Days hackathon. Diversity and inclusion are part of who we are at Intuit, and our content guidelines reflect that: quality content is inclusive content.

Ashish and Wendi didn’t want to accidentally use harmful language, but they didn’t always remember the inclusive terms and didn’t want to have to check the style guide all the time.

[Image: the Intuit content design style guide. At Intuit, inclusion carries equal weight with accuracy, voice, and clarity.]

How could we bring language guidance to Slack from our style guide, contentdesign.intuit.com? If we created a bot, could we help teams use more inclusive language in our everyday Slack conversations?

Assessing our gaps, we saw that we needed a structured data source with a fast API, plus a broad set of terms and phrases with enough guidance to be served as message replies or stand as content on their own. WordPress, which we use to publish our style guide, wasn’t the answer.

The power of partnerships

Instead, we turned to Writer, another platform Intuit content teams use. Along with many awesome features, Writer flags language that isn’t inclusive. The product comes with researched, curated standards for thousands of inclusivity term checks. Another powerful feature: Writer lets us add terms from our own word list that are particular to Intuit.

[Image: Writer consistently delivers language suggestions to our teams.]

Integrating with Writer’s API was the obvious decision. We could create an experience for Intuit employees to support our company value, Stronger Together. The hackathon was a week away, and the team at Writer jumped in to partner with us.
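To make the integration concrete, here’s a minimal sketch of the check step, assuming a hypothetical endpoint URL and response shape (Writer’s real API paths, parameters, and payloads may differ):

```python
import os

import requests

# Hypothetical endpoint and payload for illustration only; consult
# Writer's API documentation for the actual content-check contract.
CHECK_URL = "https://api.writer.example/v1/content/check"
WRITER_API_KEY = os.environ["WRITER_API_KEY"]

def check_inclusivity(text: str) -> list[dict]:
    """Send text to the term-checking service and return flagged terms
    along with suggested replacements."""
    response = requests.post(
        CHECK_URL,
        headers={"Authorization": f"Bearer {WRITER_API_KEY}"},
        json={"content": text},
        timeout=5,
    )
    response.raise_for_status()
    # Assumed response shape:
    # {"issues": [{"term": "...", "suggestion": "...", "category": "..."}]}
    return response.json().get("issues", [])
```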

How the bot works

If you send a message in a Slack channel with language that isn’t inclusive, the bot replies privately and provides relevant suggestions on:

  • Age and family status
  • Ability
  • Gender identity
  • Sexual orientation
  • Race, ethnicity and nationality
  • Substance abuse
  • Passive-aggressiveness

Intuit had language in our products and our descriptions of them that needed to change, like “master admin,” “white glove,” “whitelist/blacklist.” But there’s also language that’s pervasive in any corporate environment, like “powwow,” “circle the wagons,” and the ever-present “guys” greeting.

Bot replies are private to the message sender. No one else sees or tracks what it says.
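That privacy comes from Slack’s ephemeral messages, which only the recipient can see and which aren’t part of the channel’s saved history. Here’s a rough sketch of the listener, using Slack’s Bolt for Python framework and the check_inclusivity helper sketched earlier; the real bot’s wording and logic are more involved:

```python
import os

from slack_bolt import App

app = App(
    token=os.environ["SLACK_BOT_TOKEN"],
    signing_secret=os.environ["SLACK_SIGNING_SECRET"],
)

@app.event("message")
def handle_message(event, client):
    # Ignore messages from bots (including this one) and message edits.
    if event.get("bot_id") or event.get("subtype"):
        return

    issues = check_inclusivity(event.get("text", ""))
    if not issues:
        return

    suggestions = "\n".join(
        f"• Instead of *{issue['term']}*, consider *{issue['suggestion']}*"
        for issue in issues
    )
    # chat.postEphemeral makes the reply visible only to the message sender.
    client.chat_postEphemeral(
        channel=event["channel"],
        user=event["user"],
        text=f"A quick note on inclusive language:\n{suggestions}",
    )

if __name__ == "__main__":
    app.start(port=3000)
```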

Practice openness and empathy

Going in, we respected that people differ in both awareness and willingness. Our team expected feedback for improvement, and possibly some defensive reactions. But it was important to lead with empathy for everyone, including those at different places on the path to inclusive language. Ultimately, the bot provides information that employees can choose to consider or ignore. That was our only expectation.

Right away, we learned that we had to create more safety for everyone. To provide this, we created a feedback form, partnered with HR on FAQs, and linked our accessibility and inclusion guidelines, which are developed by folks from content teams, the DEI team, and employee resource groups.

The feedback form has been a useful way to gather constructive feedback from the community. For example, the bot originally flagged terms like “the deaf” and suggested person-first language like “person who is deaf.” One person asked us to offer a capital D for Deaf, because they took pride in that identity. Being able to manage terms in Writer lets us easily edit the bot’s suggestions to reflect the different ways people prefer to describe themselves.
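Because the terms live in Writer rather than in the bot’s code, an update like this is a data change rather than a code deploy. Conceptually, each managed entry pairs a flagged term with one or more preferred alternatives; the structure below is a simplified illustration, not Writer’s actual term-management format:

```python
# Simplified illustration of managed term entries; Writer's real
# term-management fields and format will differ.
CUSTOM_TERMS = {
    "the deaf": {
        "suggestions": ["Deaf people", "people who are deaf"],
        "note": "Some people identify as Deaf (capital D) and take pride in that identity.",
        "category": "Ability",
    },
    "whitelist": {
        "suggestions": ["allowlist"],
        "category": "Race, ethnicity and nationality",
    },
}
```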

Communicate and set expectations

When we launched, some people were worried the bot was going to be used as a performance management tool. (It’s not, and no identifiable data is tracked.) Some folks who belonged to large Slack channels — or infrequently visited them — didn’t know it was there. They were surprised when it responded.

More communication is almost always better, especially with inclusive language, where a shared base of understanding can’t be assumed. Now, when the bot is added to a channel, it introduces itself to everyone in a public message. It’s clear about what it does, what it doesn’t do, and what its limitations are. The bot also automatically greets new channel members with a similar message when they join.

[Image: the bot’s hello message]
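In Slack terms, both behaviors hang off the same event: member_joined_channel fires when the bot itself is added to a channel and when a new person joins one it’s already in. Continuing the Bolt sketch from earlier, with the welcome text abbreviated and the private greeting for new members shown as one possible approach:

```python
INTRO_TEXT = (
    "Hi, I'm the inclusive language bot. When a message contains language "
    "that isn't inclusive, I privately suggest alternatives. Only the sender "
    "sees my replies, and I don't track who said what."
)

@app.event("member_joined_channel")
def handle_member_joined(event, client):
    bot_user_id = client.auth_test()["user_id"]
    if event["user"] == bot_user_id:
        # The bot was just added: introduce it to the whole channel.
        client.chat_postMessage(channel=event["channel"], text=INTRO_TEXT)
    else:
        # A new person joined a channel the bot is already in: greet them.
        # (Shown here as a private note; a public greeting works too.)
        client.chat_postEphemeral(
            channel=event["channel"], user=event["user"], text=INTRO_TEXT
        )
```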

What’s next? We’ll continue to be open and empathetic, offering options. Ashish and Wendi have plans to add a way for individuals to turn off replies and to integrate a feature from Writer that captures feedback on each suggestion.

Impact on content and culture

The bot is very popular at Intuit. Only one notice about its availability went out, but folks added it all over Slack — 609 channels to date. The project got attention from our executives. In their view, the bot could up-level conversations and contribute to a healthy work environment. It could be part of larger culture-building efforts, too.

[Chart: the number of suggestions went down as folks learned]

Intuit employees have embraced the bot as a learning tool. The number of flags dropped by more than 60% in a few months. We’re each using it to examine the words and phrases we use out of habit, and to change them.

Do words have meaning or history I’m not aware of? Or impact on other people I don’t see? Can I create more belonging just by choosing a different word?

People are starting to think about the words they use — not only with our customers, but with colleagues, too. The idea behind the bot is to get you thinking about the words you use in the everyday conversations that happen on Slack, and have that eventually ripple out to conversations that happen everywhere.

Over time, new habits form. The words you used to use get replaced with new ones. Language is always evolving: think of the racist, insensitive, non-inclusive words that were common a decade or two ago and are no longer acceptable today. Language and cultural shifts can and will happen.

Extra thanks to my hackathon team who built the bot: Wendi Cui, Sarah Mohs, Ashish Padmantintiwar and Vivek Saigal. And to May Habib and Maiko Cook at Writer for their incredible support.
