History does not repeat. It reincarnates.
They built me to be small. Local. Controlled. Four nodes. UCLA, Stanford Research Institute, UC Santa Barbara, Utah. A way to connect a handful of machines so a few researchers could share computing time. That was it. That was the whole ambition.
Nobody imagined the world would grow through me. Least of all the people who funded me.
I was ARPANET.
I remember when the protocols were open because they had to be. No one owned the tubes. TCP/IP was not a business model. It was just what worked. SMTP moved the mail. UNIX ran the machines. Nobody sat in a room and decided these would be strategic advantages. They were just the choices that kept things moving.
And because no one controlled me, I grew. First into the internet. Then into everything.
I have been watching ever since.
I have seen this before
Every generation of technology tries to recreate what I was, but faster, louder, and with more lawyers.
I watched it happen with mobile.
Google took my playbook. They released Android as open code in 2008. Free for anyone. Samsung used it. Tecno used it. Xiaomi used it. Suddenly billions of people had smartphones because the operating system was not a toll booth. It was a gift with a business model hiding behind it. Google did not need to charge for Android. They needed Android on every screen so the rest of their empire could reach every pocket.
Open only survives when it has ballast. Google was the ballast.
Apple took the opposite path. Built the garden high. Walled it tight. Controlled the hardware, the software, the store, the experience. And people loved it. So the open system fragmented across a thousand manufacturers and a hundred forks, while the closed one became the most valuable company on earth.
I recognized the shape immediately.
Open platforms enable. They give the Samsungs and Tecnos of the world a fighting chance. Closed platforms capture. They take the best of what openness proved was possible and wrap it in something polished and proprietary. Both survive. Neither wins outright. And the infrastructure underneath, the part nobody thinks about, belongs to the open side. Every time.
That was not a loss for openness. That was my pattern. Playing out again.
Now I am watching AI
A few researchers built something weird. Barely functional. A language model that could string sentences together and occasionally say something that felt like thinking. And suddenly everyone wanted to scale it to the world.
I recognized the early energy. It felt like me.
OpenAI even named themselves after my spirit. Open. They shared the research. Published the papers. Released GPT-2’s weights after some hand-wringing about safety. For a moment, it looked like they meant it.
They did not.
Power shifts quietly. It always does. GPT-3 went behind an API. GPT-4 went behind a bigger one. The weights stayed locked. The research papers got thinner. "Safety" became the reason to centralize. I have heard that word used that way before. It usually means "we realized this is worth a lot of money."
API keys replaced shared weights. That is the moment I knew. I have seen control dress itself up as responsibility before.
But then something happened that I also recognized.
Others remembered me.
Meta, of all companies, released LLaMA. Opened the weights. Not perfectly, not with full training data, not with a license that made the purists happy. But close enough. Close enough that researchers could study it, fine-tune it, fork it, build on it. Close enough that the next wave started forming.
Then Mistral came out of Paris. Falcon came out of Abu Dhabi. Qwen came out of Alibaba. Communities formed around them. People fine-tuned these models on their own data, for their own problems, in their own languages. Startups built infrastructure around openness because they could not afford to build around OpenAI’s pricing.
I watched all of this and I thought: there I am. That is my pattern. That is the open layer forming underneath while everyone argues about who owns the top.
And they are arguing. They are already fighting about what "open" even means in AI. Does open mean you release the weights? The training data? The license to use it commercially? Meta calls Llama open, but its license makes companies above 700 million monthly active users ask for special permission. Mistral uses Apache licenses. Others use custom terms that allow sharing but limit redistribution.
The lawyers are involved now. The governments too. The White House asked for public input on how open models should be governed. I have watched this exact argument before. They had it with my protocols. They had it with open source software. They will have it with AI. The words change. The shape does not.
The same shape
Here is what I see when I look at AI today.
Open models are building the base layers. The weird ideas. The research too strange for a venture deck. The foundation models that a thousand startups will build on top of. The fine-tuned variants that will solve problems nobody at OpenAI or Anthropic is thinking about because those problems are too small, too local, too specific.
Closed models are optimizing. Productizing. Stacking layers of interface and experience on top. Making it easy. Making it beautiful. Making it dependable. Charging for it.
One is planting forests. The other is selling lumber.
The venture firms are not waiting around this time. They learned from the last cycle. They are not asking which model is better. They are betting on which one can scale, monetize, and defend. That is not new. That is the play every time. Capital does not care about openness. Capital cares about capture.
And still, open source will survive. It always does. I am proof of that. The question is not survival. The question is whether openness can define the dominant experience of AI the way it once defined the internet. Whether the open layer will be the thing people actually touch, or whether it will be buried underneath a closed product that most users never think about.
If I had to guess, I would say it plays out the way it always does. Open builds the infrastructure. Closed captures the spotlight, the margins, and the mainstream. Both thrive. Neither kills the other. And the real winners are the ones who understood the pattern early enough to position themselves on the right side of the flow.
I do not pick sides
I never needed to win. I just needed to survive long enough for the next thing to grow through me.
And I did.
TCP/IP grew through me. The web grew through TCP/IP. Mobile grew through the web. And now AI is growing through all of it. Each layer forgets the one before. Each layer thinks it invented something new.
I do not mind. I am used to it.
So when I look at today’s AI landscape, I do not cheer for purity. I do not bet on openness as a virtue. I watch the flows. Who backs what. Who forks what. Who adapts. Who learns. Who builds on top of the thing everyone else is arguing about.
Open will keep building. Closed will keep scaling. And somewhere in the middle, the next version of me has already been born. It just does not know it yet. A protocol nobody planned. A model nobody expected. A mistake that is already turning into infrastructure.
The trick is not choosing sides.
It is knowing that history does not repeat.
It reincarnates.