Kodak didn’t “miss” the future. It helped invent it. That’s what makes the company such a useful mirror for the AI moment we’re living through: the danger usually isn’t blindness. It’s incentives. It’s fear of cannibalizing what already works. It’s the quiet confidence that the machine you’ve built is too big, too profitable, too *inevitable* to be unseated.
The lazy Kodak story is a corporate bedtime tale: dinosaurs ignore the meteor, get what they deserve. The real story is sharper, and more uncomfortable. Kodak built a closed-loop empire around film: not just rolls in yellow boxes, but paper, chemicals, minilabs, retail partnerships, and habits so widespread they felt like nature. That ecosystem printed money. And in a business that prints money, anything that threatens the press looks less like “innovation” and more like sabotage.
So when digital photography arrived, Kodak faced a problem that today’s AI-haunted industries should recognize immediately. The new technology wasn’t simply a better product. It was a different value chain. Digital didn’t just change how images were captured; it changed where value accrued, away from consumables and toward devices, software, networks, and eventually platforms. Kodak could build cameras. But its real advantages (chemistry, manufacturing, physical distribution, a razor-and-blades profit engine) were wired for atoms, not bits.
That is the Kodak lesson we need right now, because AI isn’t arriving as a feature. It’s arriving as a business model upgrade. It manufactures abundance: abundant drafts, abundant code, abundant images, abundant analysis. If your company sells scarcity (scarcity of labor, scarcity of expertise, scarcity of time), AI is not a tool you “adopt.” It is a force that rewrites what your customer is paying for.
You can hear the Kodak conversation happening everywhere, in different accents.
In a law firm, partners ask: if AI drafts the contract in minutes, what happens to billable hours? In a consulting shop, managers worry: if the deck writes itself, what becomes of the analyst pipeline? In a university, professors ask: if students can generate essays instantly, what does learning mean? In a newsroom, editors weigh speed against credibility: if we publish faster with AI, do we also publish wrong faster? In a design studio, creatives feel the ground shift under their feet: if images are infinite, how do you price taste?
Everyone sees the wave. The question is whether we’ll surf it, or build beautiful sandcastles to defend against it.
Here are four lessons Kodak offers, if we’re brave enough to take them without smugness.
First: treat AI like a redesign of value, not a bolt-on productivity hack. Kodak tried to attach digital to a film-shaped business. Many organizations today are doing the same thing: plugging AI into workflows designed for scarcity and calling it transformation. They automate the email, accelerate the report, speed up the first draft. Useful, sure. But if your strategy ends at “faster,” you’re reenacting Kodak’s mistake, improving the old machine while the market migrates to a new one.
Second: be willing to cannibalize yourself before someone else does it for you. This is the gut-check. It requires leaders to shrink the thing that made them powerful. It requires telling teams, sometimes entire departments, that the core product is no longer sacred. Kodak’s film profits were not just revenue; they were identity. The same is true for today’s knowledge industries. Billable hours are identity. Manual analysis is identity. A “human-only” creative process is identity. The tragedy is that refusing to cannibalize doesn’t preserve the old world. It just delays your ability to compete in the new one.
Third: build the capabilities you don’t have, not the products you wish you could sell. Kodak could build digital cameras, but the real disruption moved to networks: camera phones, social platforms, cloud sharing, data-driven advertising. The winners weren’t the companies that captured images; they were the companies that owned distribution and attention.
AI has a similar twist. The model is impressive, but the moat often lives elsewhere: proprietary data, workflow integration, distribution, trust, and accountability. You can buy access to a model. You can’t buy organizational fluency overnight. The organizations that win won’t be the ones who “use AI.” They’ll be the ones who redesign how work moves, from intake to judgment to delivery, with humans and machines in a deliberate loop.
Fourth: decide what you will be trusted for. In an AI-saturated world, trust becomes the premium feature. A hospital isn’t trusted because it has software; it’s trusted because it is accountable. A bank isn’t trusted because it’s fast; it’s trusted because it is regulated and auditable. A law firm isn’t trusted because it can draft quickly; it’s trusted because it stands behind the result.
AI increases the risk of plausible nonsense. That means provenance, review, and responsibility stop being compliance chores and become product features. “We used AI” won’t reassure anyone. “Here’s who approved this, how it was checked, and what we’ll do if it’s wrong” will.
Notice what’s hiding inside those lessons: AI demands business model innovation, not just product innovation. That was Kodak’s real misread. It saw digital as a camera problem. It was a value problem.
And yet Kodak’s story isn’t only a cautionary tale. There’s also a sliver of hope in its second life. Kodak today isn’t the ruler of photography. But it has found a way to exist by narrowing, focusing on defensible strengths, and leaning into what its brand still means. That matters, because the AI era won’t produce a single set of winners. It will produce niches, reinventions, and new forms of craft.
But you don’t get that second life by waiting until the transition is obvious. You get it by acting while the old world is still paying your bills.
AI will not ask whether your incentives are ready. It will only ask whether your customers are.
If you want to avoid becoming a parable, stop treating AI as an add-on. Let it threaten you early. Let it embarrass your process. Let it force the uncomfortable questions now, while you still have the resources to answer them.
The Kodak moment isn’t when you see the future. It’s when you decide to betray your past before it betrays you.
About Me:
Dominic “Doc” Ligot is one of the leading voices in AI in the Philippines. Doc has been extensively cited in local and global media outlets including The Economist, South China Morning Post, Washington Post, and Agence France-Presse. His award-winning work has been recognized and published by prestigious organizations such as NASA, Data.org, Digital Public Goods Alliance, the Group on Earth Observations (GEO), the United Nations Development Programme (UNDP), the World Health Organization (WHO), and UNICEF.
If you need guidance or training in maximizing AI for your career or business, reach out to Doc via https://docligot.com.
Follow Doc Ligot on Facebook: https://facebook.com/docligotAI