We tend to look at AI through the lens of speed. Who can move fastest, who understands the tools, who seems most ‘native’ to the environment. On that measure, it’s easy to assume the advantage sits with younger generations. Digital natives move through platforms with a kind of fluency that feels instinctive. They know what exists, they know where to click, and they are rarely intimidated by a new interface. There is something to admire in that.
At the same time, there is a quieter advantage emerging elsewhere. Many Gen X leaders, and a good portion of Gen Y, have now spent close to three decades working in and around technology. We were there at the start of email. We remember when Google felt like a revelation. We have lived through enough system upgrades, platform shifts, and “this will change everything” moments to develop a certain perspective. Not resistance, not cynicism, but something closer to discernment. That begins to matter when working with AI.
The early narrative around AI has focused heavily on capability – what it can do, how fast it can do it, and how quickly it is improving. A quieter truth sits underneath that. The quality of the output is directly proportional to the quality of the prompt. AI, for all its sophistication, still depends on the human to frame the question. That framing is not neutral. It is shaped by experience, context, judgement, and intent. In other words, it is shaped by how well someone understands the problem they are trying to solve.
This is where Gen X may have an edge. After years of navigating organisations, leading teams, and working through ambiguity, many leaders in their 40s and 50s have developed a more nuanced sense of what sits beneath a surface issue. They are often slower to jump to a solution, not from hesitation but from a deeper instinct to define the problem properly. AI rewards that instinct. A vague prompt produces a vague answer. A precise prompt, grounded in context and purpose, produces something far more useful.
There is a useful model here that has been around for some time: the DIKW hierarchy – Data, Information, Knowledge, Wisdom. Data is raw. Information organises that data. Knowledge interprets it. Wisdom applies judgement to decide what matters and what to do next. Knowledge is knowing that a tomato is a piece of fruit – wisdom is knowing not to put it in a fruit salad!
AI is exceptionally strong at the first three layers. It can gather, process, and synthesise information at a scale and speed that is genuinely transformative. What it cannot fully replace is wisdom. Wisdom comes from lived experience. It is built through pattern recognition, mistakes, conversations, consequences, and time. It shows up in the quality of questions we ask and the decisions we make with the answers we receive.
This is not about age as much as it is about accumulated perspective. Many Gen X leaders have been around the block enough times to have a more developed sense of what good looks like, what risk feels like, and what trade-offs are acceptable. That perspective shapes better prompts, better decisions, and in turn – better outcomes.
There is another factor worth noticing. The conversation around AI often assumes that technological capability will dictate the pace of change. The logic is simple: if the tools are improving rapidly, organisations will adopt them at a similar speed. That has not been my experience. Across most organisations I work with, transformation tends to move at a more human pace. The common pattern is familiar: projects take twice as long and cost twice as much as expected. The limiting factor is rarely the technology itself. It is people.
Relationship tension, resistance to change, competing priorities, and uncertainty about value are all part of the picture. These are not new challenges. They have been present in every major shift, long before AI entered the conversation. In that sense, the real barrier to AI adoption is not the technology curve. It is the human curve, and the human curve moves more slowly. That perspective matters. It tempers the more dramatic predictions about overnight transformation and places the focus back on leadership, culture, and behaviour – areas where experienced leaders tend to be more comfortable operating.
None of this suggests that one generation ‘wins’. There is a more interesting possibility. Younger generations bring fluency – an ease with tools, a willingness to experiment, and a lack of intimidation when something is new. Older generations bring framing – context, judgement, and a stronger instinct for what questions need to be asked. Put those together and something useful happens. Gen Z and Gen A can show what is possible, and Gen X and Gen Y can help shape how it is used. The combination becomes more powerful than either on its own. Wasn’t it always thus?
There is a slightly amusing twist here. Many younger professionals would benefit from asking more experienced colleagues what questions to ask of AI. At the same time, the early part of most careers tends to be spent projecting certainty rather than seeking guidance. It is a familiar phase, one most of us have moved through with a straight face and a quiet hope no one asks a follow-up question.
Gen X carries a small, hard-earned advantage here. We have already had our turn at thinking we needed to have all the answers, followed by the more useful realisation that better questions tend to get you further. My observation is that it takes most people the best part of four decades to move from avoiding feedback to actively seeking it, which may explain a few things. The irony is that those who could benefit most from that perspective – Gen Z and Gen A – are often the least likely to ask for it, still busy presenting (and occasionally believing) that they already know.
At the same time, there is a balance to strike. While Gen X may have a clearer sense of what to ask, Gen Z and Gen A are often better placed to explain how the tools actually work and what is possible. They bring a fluency that can accelerate understanding, if we are open to it.
A final thought sits underneath all of this. While the tools are new, much of what will determine their impact is not. The ability to have honest conversations, to navigate feedback, to work through tension and conflict, and to notice when ego gets in the way remains as relevant as ever. In many ways, AI simply amplifies what is already there. It will reward clarity and curiosity, and it will expose avoidance and defensiveness just as quickly.
For leaders, this brings the focus back to capacity. Not just technical capability, but the capacity to stay open when challenged, to ask better questions under pressure, and to engage others in conversations that move things forward. The opportunity is not only to get better outputs from AI, it is also to facilitate better ways of working with people.
That work has always mattered – I suspect it always will.
Leadership development begins with understanding your organisation, your leaders, and the leadership challenges you are navigating. A conversation allows us to explore:
Your current leadership environment
Where leadership is working, and where it is not
What stronger leadership capability would change
Whether this work is a good fit for your organisation