Containerise Your Next Big Idea
How modular AI handoffs are rewiring how work, value, and leverage move.
(This piece was inspired by the YWR piece, Containerisation of Thought.)
In the 1950s, a simple steel box quietly rewired the global economy.
It wasn't the container itself that changed the world.
It was what the container made possible:
Standardized shipping.
Frictionless handoffs between ports, ships, trucks.
Entire supply chains rebuilt on modular flows.
Today, the same shift is happening again.
But the "containers" are invisible.
They're data structures, API calls, JSON objects — modular thought containers flowing between intelligent systems.
AI workflows aren't just tools you tap individually.
They're building an ecosystem of AI-to-AI collaboration:
One agent researches.
Another summarizes.
Another critiques.
Another optimizes.
Another routes outputs into action.
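As a rough illustration (not the author's implementation, and with all agent names hypothetical), the handoff chain above can be sketched as stub functions passing a single JSON-style container from stage to stage. The point is the shape of the flow: each stage reads the container, adds its contribution, and hands it on unchanged in structure.

```python
import json

# A "thought container": one structured payload that every agent can
# read and extend. All agent functions below are hypothetical stubs
# standing in for real model calls.

def research(container):
    container["findings"] = ["fact A", "fact B"]   # stub research output
    return container

def summarize(container):
    container["summary"] = "; ".join(container["findings"])
    return container

def critique(container):
    container["critique"] = ("Summary covers all findings."
                             if container["findings"]
                             else "No findings to assess.")
    return container

def route(container):
    container["next_action"] = "send to human reviewer"
    return container

PIPELINE = [research, summarize, critique, route]

def run(topic):
    container = {"topic": topic}       # the standardized "box"
    for stage in PIPELINE:
        container = stage(container)   # clean handoff: same shape in, same shape out
    return container

print(json.dumps(run("container shipping history"), indent=2))
```

Because every stage accepts and returns the same container shape, stages can be reordered, swapped, or run by different systems without rework, which is the whole argument of the piece in miniature.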
Each handoff happens faster, cleaner, and more reliably when the container — the unit of thought — is structured properly.
Humans will still matter.
But the center of gravity is moving:
From human-to-human workflows (email, meetings, decks).
To AI-to-AI modular flows, with humans designing, tuning, and intervening.
If you stay inside the old model — long meetings, messy communication, manual integration — you'll feel slower and heavier without knowing why.
→ Further reading: The Hardest Part Isn’t Knowing. It’s Doing.
The next decade will reward those who think like architects of modular intelligence:
Clear handoffs.
Clean modular steps.
Minimal friction between units.
In the container era of thought, modularity beats intensity.
Orchestration beats individual brilliance.
The future moves at the speed of clean handoffs.
Which idea in your pipeline needs a container today?
AI Prompt
You are my Workflow Architect. I’ll describe a clunky process I use. Redesign it using modular containers that AIs (or humans) could pass cleanly between one another with minimal rework. Process: [brief description].
What follows was written in response to a reader who emailed asking for worked examples for this post.
The reader's question:
"An example (or two) of [brief description], would help me enormously."
My response:
Example 1
Investment Memo Draft Process: I collect analyst notes, market data, and my own thoughts in multiple docs → spend hours stitching them into a memo.
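One way to containerise Example 1 (a hypothetical sketch, not my actual workflow): normalise every source document into the same schema before drafting, so that stitching becomes a mechanical merge instead of hours of manual work. The `MemoInput` name and fields here are illustrative assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical container schema: every input doc (analyst notes,
# market data, my own thoughts) is normalised into the same shape
# before any drafting begins.
@dataclass
class MemoInput:
    source: str                              # e.g. "Analyst notes"
    key_points: list = field(default_factory=list)

def merge_into_memo(inputs):
    # The handoff is clean because every input shares one structure,
    # so assembling the memo is a simple, repeatable merge.
    sections = []
    for item in inputs:
        bullets = "\n".join(f"- {p}" for p in item.key_points)
        sections.append(f"## {item.source}\n{bullets}")
    return "\n\n".join(sections)

draft = merge_into_memo([
    MemoInput("Analyst notes", ["Margins expanding", "New product line"]),
    MemoInput("Market data", ["Sector up 12% YTD"]),
])
print(draft)
```

In practice the "merge" step would be an AI drafting pass, but the leverage comes from the schema: once the inputs are containerised, any agent (or human) can take the handoff.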
Example 2
LP Quarterly Update Process: chasing portfolio founders for metrics, cobbling together slides, and letting typos slip through.
These are obviously brief, quick examples. My own practice is to dictate via the voice-note function in the ChatGPT app. I will typically also share the original documents, e.g. copies of my working notes, data, and metrics, so the AI can see what I am working with.
I find that the most important thing is to iterate and share context over multiple rounds in order to get a useful answer. The starting prompt is just a kickoff, the real value emerges in the back and forth.
You can see typical output in this example thread (implemented in the free version of ChatGPT).