In 1995, I called my Uncle Eddie from my dorm room phone in Currier Hall at the University of Iowa. It was a corded wall phone, the kind that lets you pace a few feet before the cord snaps taut. I had just started building my first website for a freshman class project: a digital family tree. I had to call all of my relatives long-distance to ask them to mail me photos, then I would scan them and stitch them into my very first website. Absurd, right?
When I asked Eddie to mail me photos, he laughed. Then he scolded. He told me, with all the authority of a seasoned attorney, that he’d never need the internet. He had his yellow legal pad and a number two pencil, and that was all he’d ever require. His dismissive tone left me in tears on my bunk bed.
For the record, I followed Uncle Eddie on Instagram last week and we’ve shared family photos on Facebook for many years now.
That’s the nature of disruption. At first, it feels irrelevant, maybe even threatening. Then, almost without notice, it becomes essential.
AI Feels Like 1995 Again
Before anyone tells me to “get off my lawn” about the inevitability of AI, hear me out.
AI is the internet moment all over again. The technology is powerful, messy, imperfect, not universally understood, and fast-moving. Employees are scared, unsure whether using these tools will get them in trouble or replace them entirely. Others are building businesses, creating efficiencies, reimagining how their work gets done, experimenting and stress-testing. After exploring this topic over the last several months, I’ve found that most people are caught somewhere between curiosity and compliance, with no playbook to guide them.
The roots of this fear run deep. Schools and organizational leaders have labeled AI as “cheating,” and that stigma has carried into the workplace. But if today’s students, tomorrow’s workforce, aren’t taught to use AI tools safely, ethically, creatively and strategically, how will they be prepared for what’s next?
The AI Clarity Curve
At Corkboard, we’ve seen organizations across the spectrum of AI readiness. Some are still at “WTF is this?” Others are tentatively experimenting, or struggling to create guardrails while employees use AI tools on their personal devices, creating the potential for serious reputational and competitive risks. Most are trying to make sense of it all. Introducing the AI Clarity Curve™.
1. WTF is this? You’ve heard about AI, but it’s foggy. Feels big. Feels urgent. Feels like someone else’s job.
2. Is this allowed? You’ve tried a tool. Maybe felt guilty. No rules, no policy, no playbook. You’re stuck between curiosity and compliance.
3. Let me try that. You’re testing use cases. Your team is too. But the guidance is loose, and the risk feels unspoken.
4. This needs a plan. You’ve seen the potential and the chaos. Now you’re building governance, messaging and strategy into your workflows.
5. Make it make sense. You’re aligning cross-functional use, training execs, adapting tone, handling backlash. You’re the translator.
Where an organization sits on this curve determines how confident its employees feel, and how effectively it can operationalize AI across the enterprise.
Why It Matters
Leaders owe it to their employees to provide clear processes, procedures and governance around AI, because without guidance, fear thrives. And fear shuts down innovation.
Just as companies in the late ’90s had to help employees adapt to email, websites and digital workflows, today’s organizations need to demystify AI. It’s not about hype; it’s about trust, transparency, strategy and equipping teams with tools they can use without fear of obsolescence.
This blog series will explore how to do just that:
- How to move your organization along the AI Clarity Curve
- What guardrails and governance make adoption safe and sustainable
- Why AI fluency is as fundamental as computer literacy was in 1995
- And how to balance innovation with responsibility
AI is inevitable. Confusion doesn’t have to be.
