The universal ecosystem
How AI will open up walled gardens by making file formats, protocols, and even standards less relevant.

Ever stop to think about what makes a workflow a workflow? Or an ecosystem an ecosystem? Why we must tiptoe among artificially grown archipelagos of apps, services, and devices without ever stepping off in our own individual, creative directions?
There are three primary reasons:
- File formats. Modern file types are incredibly complex. Even when they are open standards, the technology to reliably read and write them — and access their full range of capabilities — is often proprietary. Therefore, moving files through workflows (with robust and predictable results) usually means sticking to apps and services built and maintained by a single entity.
- Protocols. For devices to be able to communicate with one another, they need to be able to speak one or more common languages. Technically, that’s not a problem, but to gain a competitive advantage, sometimes platform vendors introduce incompatible dialects which give rise to exclusionary digital cliques.
- Walled gardens. There are very few universal truths anymore, but at least one remains: public companies must grow. One of the most reliable ways to build recurring revenue quarter over quarter is to ensure customers can’t easily switch to competing ecosystems.
But the world may be about to change. I’m starting to see opportunities for artificial intelligence to haul us out of some of the ruts we’ve spent decades cutting into the technology landscape. To test my theory, I put together a series of examples and prototypes designed to demonstrate a possible future where AI knocks down tall garden walls by making file formats, protocols, and even standards less relevant.

Let’s start with a simple photography workflow to lay some groundwork.
Until they are shared, photos rarely leave the ecosystems in which they are either captured (in the consumer space) or edited (in the case of enthusiasts and professionals). But let’s look at a completely different kind of photography workflow — one powered almost entirely by AI — starting with acquiring the image itself. Instead of taking a selfie or snapping a pic of a friend, let’s synthesize a portrait using NVIDIA’s StyleGAN2.

Now that we have a subject, let’s use a few completely disparate AI-powered tools to make some edits.

What I find fascinating about this workflow is that I arranged it myself. It wasn’t explicitly defined, enabled, or foreseen by anyone. These applications were never intended to be used together — and for all I know, have never been used together in precisely this way — yet I was able to jump from one app to another seamlessly.
Traditional workflows like this are made possible by a common understanding of file formats and metadata — some sort of shared binary knowledge. But the only shared understanding in this workflow is that of the human face. Any process that can generate a plausible portrait as output, or interpret a human face as input, can be inserted into this workflow at any point — regardless of who wrote or trained it.
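That composability can be sketched in a few lines of code. The stages below are hypothetical, toy stand-ins for the disparate AI tools (none of these functions correspond to a real API); the only contract each one honors is "portrait in, portrait out," which is exactly what makes the pipeline rearrangeable and extensible:

```python
from typing import Callable, Dict, List

# A stand-in for an image; in the real workflow this was a JPEG on disk.
Portrait = Dict[str, object]

def synthesize(seed: int) -> Portrait:
    """Toy stand-in for a generator like StyleGAN2: produces a portrait."""
    return {"seed": seed, "edits": []}

def age(p: Portrait) -> Portrait:
    """Toy stand-in for an AI aging tool: portrait in, portrait out."""
    return {**p, "edits": list(p["edits"]) + ["aged"]}

def stylize(p: Portrait) -> Portrait:
    """Toy stand-in for an AI style-transfer tool: same contract."""
    return {**p, "edits": list(p["edits"]) + ["stylized"]}

def run_pipeline(seed: int,
                 stages: List[Callable[[Portrait], Portrait]]) -> Portrait:
    """Chain any stages that honor the portrait-in/portrait-out contract."""
    portrait = synthesize(seed)
    for stage in stages:
        portrait = stage(portrait)
    return portrait

# Any stage can be inserted, removed, or reordered at any point.
result = run_pipeline(42, [age, stylize])
print(result["edits"])  # the order of edits mirrors the order of stages
```

No stage knows anything about the others; swapping in a new tool is just appending another function to the list, which is the property the paragraph above describes.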
It’s worth acknowledging that I used the JPEG file format and compression standard to connect the dots of this constellation. But the fact that all the apps in this workflow can understand the color and arrangement of each individual pixel is not what’s important. What’s important is that they can also understand the meaning of the sum total.
Admittedly, a workflow that can only operate within a single narrow domain — in this case, portraits — is extremely constrained. But we’re just getting started with AI. Imagine a not-so-distant future where every tool you use has a deep semantic understanding of people, objects, environments, language, physics, and even context. And where the output of any tool in your workflow can become input for any other tool — not through common file formats or protocols or standards, but through a shared understanding of reality.
Let’s switch mediums and explore four simple prototypes I built to see if I could use AI to further challenge traditional notions of workflows.
Using AI to Unlock Data Trapped in a PDF
A New AI-Powered Visual Search Workflow
Using an Intelligent Agent to Replace AirDrop and Bluetooth
Using AI to Send Text Messages From the Browser
Epilogue
To be clear, I’m not here to warn of seismic market disruption or paradigm shifts of enormous proportions. There are enough futurists and pundits out there already covering the doom and devastation angles of AI. I’m just here to have a little fun with it. To run a little creative plumbing between apps and devices and services, and then to see what happens when data flows through the system. But more than anything else, I want to leave you with a view of a possible future in which humans and computers collaborate more than we compete. After all, the first step toward any kind of respectful and productive relationship is a shared understanding of reality.