What started as a LinkedIn post about using AI as an autopilot sparked a bigger discussion: how do we stay sharp in an age where technology makes offloading the norm?
Steven and I have run soft skills workshops for high-performance teams for years. This work can be messy and complex. The methodology and tooling were built on knowledge, experience, and tweaking. We rarely changed them once they worked. This made business sense: our energy could be spent with the client or on business development.
Then came AI. Suddenly, we could operate at a much more sophisticated level: more impact at speed. Interviews, psychological safety assessments, personality profiling, and culture scans could be analyzed quickly and deeply, resulting in more innovative and tailored workshops.
We would pressure-test new ideas, simulate exercises, and use feedback to upgrade agendas in minutes. We no longer relied on safe routines; we could now turbo-boost experiences with more impact at scale.
For a while, it felt like flying. The results came quickly, and the clients were happy. But increasingly, we were not. We noticed we were becoming uncomfortably comfortable, overconfident, and increasingly dependent. We could do more, train more, earn more, but the time that productivity freed up was being spent offsetting the long-term effects of AI. We were "AI-ing": a mental decline from excessive cognitive outsourcing. We had fluency, but not the mental resilience and workflows to stay sharp and motivated over time.
That’s when we knew we needed to reset and be more intentional, not just faster. We realized that the time we gained from AI-driven productivity couldn’t be spent doing more. Some of it had to be reinvested into uploading, training, reflecting, learning, and maintaining our mental edge. That’s how you sustain a higher level of performance, not just output.
Futurebraining is a way to offset offloading with intentional brainwork and operate at peak co-intelligence without becoming dependent.
Flying on Autopilot
In an earlier LinkedIn post with Steven, a captain, we reflected on the dangers of AI overreliance through the lens of aviation. We imagined a pilot who flies every route on autopilot—no manual takeoffs, landings, or crosswind approaches. Steven pointed out that at first, the pilot loses edge under pressure. Then, stick-and-rudder instincts dull. Spatial awareness shrinks. Eventually, they still look like pilots—until something unexpected happens.
That’s the risk we face with AI. We may still look like strategists, facilitators, coaches, or creatives. But without the practice of real thinking, we’re just monitoring, not mastering.
Steven also mentions a term pilots often use: the Efficiency-Thoroughness Trade-Off (ETTO). The principle is simple: we deliberately or unconsciously trim the depth and thoroughness of our actions to prioritize speed. In aviation, this can compromise safety margins, so crews treat ETTO with care.
Even if ETTO works in the short term, sudden, unexpected disruptions or failures expose the shallowness of preparation and execution. In the long run, the ETTO way of working becomes the new norm, quietly eroding the very standards crews depend on.
Like the autopilot, AI can take off, land, and assist—but we have to stay sharp enough to take back control when things get uncertain or break. That’s the real risk: not that AI fails, but that we’re no longer ready when it does.
To stay ready, we need to understand something quieter but more dangerous: cognitive offloading, the subtle, daily trade we make between ease and engagement, speed and depth.
The Invisible Cost of Convenience
Cognitive offloading is a deeply human adaptive strategy. It means using external tools or resources to reduce the mental effort required for a task. Instead of relying solely on your brain, you delegate some thinking, remembering, or problem-solving to something outside yourself. Think of it as a crutch for your brain: writing a to-do list instead of memorizing everything, using a calculator for complex math, navigating with GPS instead of recalling a map, or searching online instead of remembering.
But each offload charges our brain a toll, even if we don't immediately notice. The problem isn’t offloading itself; it’s doing it passively and excessively, without understanding what we’re giving up and how often we do it.
Consider how today’s tech tools erode specific cognitive functions: calculators dull math fluency, GPS impairs spatial memory, internet search engines fragment our recall, and social media splinters our attention. This isn’t theoretical; it’s been widely documented across decades of cognitive research.
One glance at the table of cognitive costs from everyday tools—calculators, GPS, social media, spell checkers—makes the pattern unmistakable: every aid helps us go faster, and every aid lets some capacity inside us fade.
Unchecked, these losses compound. Eventually, the cost is not just what we forget, but who we become: less creative, analytical, and adaptive.
Offloading, Multiplied by AI
AI marks a big shift in cognitive offloading, from isolated tasks to complete cognitive cycles. It doesn’t just support memory or navigation; it writes, summarizes, analyzes, ideates, and plans. The offloading is no longer tactical; it’s systemic.
But here’s the paradox: the more we rely on AI, the more we risk cognitive misuse. Our skills erode when machines draft, decide, and remember for us. The three Cs of sustainable AI performance (critical thinking, creativity, and curiosity) all suffer. We outsource not just execution, but understanding.
Ask yourself: how much of your recent work could you defend—if the AI vanished tomorrow?
When Speed Isn’t Strategy
AI is often described as delivering a 10X, 20X, or even 50X boost in perceived productivity—but these are directional, not measured, claims. And yes, it speeds up surface tasks. But faster output doesn’t guarantee long-term outcomes. We felt that ourselves: workshops became easier to design, and client satisfaction stayed high, but underneath, our sharpness, focus, and energy were slipping.
Recent trials show:
P&G + Harvard hackathon: AI-assisted teams completed design tasks 12% faster. However, researchers noted no measurable increase in the participants’ ability to explain or replicate the solution unaided, raising questions about the depth and transferability of learning.
Atlassian AI Collaboration Report: While AI saves users 1–2 hours/week, their research shows that only those who treat AI as a collaborative partner—rather than a task tool—see real gains. Strategic collaborators save double the time, report higher-quality output, and are 1.8x more likely to be seen as innovative by peers. In contrast, simple users saw fewer benefits, reinforcing that mindset, not just usage, drives value.
Upwork Research Institute (2024): 77% of employees using AI said it made them less productive while increasing workload. Only 17% felt confident using AI tools, despite 96% of executives expecting productivity gains. The gap between executive optimism and readiness drives burnout and cognitive overload across roles.
Speed is not a competitive advantage if everyone accelerates equally. Depth is.
A System For Load Balancing Our Brains
We developed a framework for staying sharp in the age of AI. It’s not about resisting automation but shaping it.
To climb rather than coast, you must actively upload across five dimensions:
AI Fluency: Understand how AI works and where it breaks. Climb the AI fluency pyramid, from basic use to real co-intelligence.
Expertise: Bring your own knowledge, maintain it, and build it. Expertise means being able to quality-check anything AI comes up with and making sure your thinking still works when the power goes out.
Focus: Know what matters and stay with it. Use tools to move faster and deepen your attention.
Responsibility: Make the final call. Don’t outsource it. Stay legally and ethically awake. Remain able to act without the tool.
Emotional Intelligence: Build trust. Read the room. Connect. When something feels off, trust that instinct.
Futurebraining Our Own Practice
Take our recent work with a hospitality client. We wanted to introduce content around emotional labour—a term we knew but hadn’t fully grasped—so we futurebrained the process.
Emotional labour is the effort required to manage one's emotions—or display emotions one may not genuinely feel—to meet professional expectations. Consider hotel staff who must stay cheerful despite exhaustion and frustration with guest behaviour.
We used fluency to map how AI understood the topic, reviewing its summaries and testing their depth. We checked and enriched the content with expertise and field research. We focused on a few key questions, not chasing every insight. We took full responsibility for the session's ethical and legal components and used emotional intelligence to land it all with the humans in the room.
This isn’t digital literacy. It’s digital wisdom.
Conclusion: No Upload, No Multiplier
If we don’t actively invest in our thinking, AI will make us faster, dumber, and dependent. It’s not enough to use AI to save time; we must reinvest some time into maintaining and growing our mental capabilities.
Uploading isn’t extra work; it’s the cost of staying sharp. Without it, we guarantee decline. As a “digital cognitive divide” widens, those who develop reflective, intentional AI habits will thrive, while those who passively adopt tools will erode.
For us, this isn’t about squeezing more from the day. It’s about staying sharp enough to matter in the long run, keeping evolving, being trusted, and doing work that challenges us, not just work we can deliver on autopilot.
The age of AI doesn’t mean the end of deep thought, unless we let it.
Yes, the reliance on tools often leads to deskilling. But it also enables upskilling: gaining skill and experience in higher-level tasks that could matter more to us.
Does it matter that we lose the skill to do basic calculations in our heads when pocket calculators allow us to spend more time solving actual mathematical problems? The reliance on cars, trains, and planes allowed us to travel more widely and experience more cultures. The reliance on computers allowed us to create and share more ideas.
Some runners blame Nike for introducing a type of running shoe that all runners now depend on. Most people can no longer run barefoot because the soles of their feet have adapted to soft shoes. Is that bad? It depends on what else we gained in return.