These three dysfunctions will stall your AI transformation

Summary

AI transformation isn’t only about building or buying new systems. New technology often runs into systemic dysfunctions in legacy organisations. In this article, I list three blockers to AI transformation.

  1. A TLDR/TLDW way of working

  2. Poor delegation practices

  3. Bullshit jobs and managerial feudalism

It’s hard to have a 30-minute business conversation these days without AI coming up. Rightly so! Companies can leapfrog an entire generation of systems and rethink how they run their businesses on a new technology paradigm. Many systems claim to be AI-first in recruiting, knowledge management, sales, and other company functions. The challenge, however, isn’t so much in shopping for the right tools as in shaping how the company works and how people behave. Working with my employers on a few AI transformation projects, I’ve noticed three dysfunctions likely to stall most AI initiatives.

TLDR/TLDW

AI loves data. Generative AI loves content. Consider chatbots, still the most common interface for working with AI. Recent improvements notwithstanding, most chatbots resort to intelligent guesswork when they generate a response. Retrieval augmented generation (RAG) improves the quality of that guesswork by grounding responses in existing knowledge bases. Still, there is only so much AI can do with a poor-quality knowledge base. Indeed, that’s the reason RAG evals are a thing.

This is where poor corporate habits stall AI transformation. Async-first companies that write a lot and practise low-context communication will have no problem building or buying RAG applications that feed off their knowledge bases.

In contrast, “Too long, didn’t read,” a.k.a. TLDR, can be counterproductive to AI transformation. A TLDR workplace also leads to a TLDW workplace: if no one reads, why write? You may notice dysfunctions such as a meeting-heavy way of working at such workplaces. The best people at these firms usually don’t have much time for deep work. Communication often happens through death by PowerPoint or slideuments. As it turns out, slideuments are neither good documents nor good slides.

Imagine a RAG application that indexes a knowledge base full of slideuments. The half-sentences of such decks hide untold knowledge and context between bullet points. When you leave a RAG application to guess what’s between the lines, you end up with hallucinations. 
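
To make that concrete, here is a minimal, illustrative sketch of a RAG pipeline. The knowledge-base entries, the keyword-overlap retrieval, and the generate() stub are simplifications I’ve assumed for the example; real systems use embeddings, a vector store, and an actual LLM call.

```python
# A toy RAG pipeline. Keyword overlap stands in for embeddings and a vector
# store so the sketch runs with no dependencies; generate() is a stub for
# whichever LLM you'd actually call.

KNOWLEDGE_BASE = [
    "Refund requests above 500 EUR need written sign-off from the finance lead.",
    "Customer onboarding takes five working days once the KYC documents arrive.",
    "Q3 priorities - alignment - synergies - next steps",  # a slideument fragment
]

def retrieve(question: str, docs: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the question, dropping non-matches."""
    terms = set(question.lower().split())
    scored = [(len(terms & set(d.lower().split())), d) for d in docs]
    return [d for score, d in sorted(scored, reverse=True)[:top_k] if score > 0]

def generate(prompt: str) -> str:
    # Stand-in for a real LLM call; it echoes the prompt so the sketch runs offline.
    return f"[An LLM would answer from this context]\n{prompt}"

def answer(question: str) -> str:
    context = "\n".join(retrieve(question, KNOWLEDGE_BASE))
    # If the best match is a half-sentence from a slide deck, the model has to
    # guess what sat between the bullet points -- a recipe for hallucination.
    return generate(f"Answer using only this context:\n{context}\n\nQuestion: {question}")

print(answer("What are our Q3 priorities?"))
```

With a well-written knowledge base, the same pipeline retrieves full sentences it can quote. With slideuments, it retrieves fragments, and the model fills the gaps on its own.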

Poor delegation

A few months back, a friend took up a role that sounded interesting but turned out to be well below their skills. Here’s the back story. The friend’s boss was already wilting under the meeting overload of a TLDR/TLDW workplace, so they needed someone to do the actual work. But no one paid attention to what that actual work was. One thing led to another, and a fancy job description came to life that had little to do with reality. My friend applied for the job and got it.

As they settled into the job, they realised the role was a glorified copy-paste gig. No one meant badly, but the boss hadn’t spent time on a proper job analysis, and now they had a team member who was grossly overqualified for the work. My friend’s been hankering for a role change ever since, but imagine the waste of talent and the skill erosion the company and the individual face in such a situation! Thoughtful delegation could have helped avoid this problem.

Thoughtful delegation is essential for AI transformation as well. When everyone in a company, particularly the middle managers and leaders, knows their jobs inside out, they can decide which tasks to delegate to AI. For example, my friend is arguing their case for a role change by showing how some simple documentation and two citizen AI apps can do the job they’re doing. Of course, they risk losing their job, but for someone confident in their skills, it’s an opportunity to create value in a more challenging space. Now imagine the flip side. If people can’t delegate effectively, you’ll struggle to build even reusable chat prompts, let alone drive large-scale AI transformations.
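
One way to see the connection: a reusable chat prompt is essentially a delegation brief. Below is a small, hypothetical sketch of such a template; the task, inputs, and acceptance criteria are invented for illustration, but you can only fill those fields in if you genuinely understand the job you’re handing over.

```python
# A reusable chat prompt treated as a delegation brief. The fields and the
# example task are hypothetical; the structure is the point.
from string import Template

DELEGATION_PROMPT = Template(
    "You are taking over this recurring task: $task\n"
    "Inputs you will receive: $inputs\n"
    "Steps to follow: $steps\n"
    "Definition of done: $done\n"
    "Escalate to a human when: $escalate\n"
)

weekly_summary = DELEGATION_PROMPT.substitute(
    task="Summarise the weekly sales report for the leadership channel",
    inputs="A CSV export from the CRM and last week's summary for tone",
    steps="Total deals by region, flag anything above 100k EUR, note week-on-week change",
    done="Five plain-language bullet points with numbers rounded to one decimal",
    escalate="The totals differ from the CRM dashboard by more than 2%",
)

print(weekly_summary)  # paste into any chat assistant, or wire into an API call
```

If a boss can’t write this kind of brief for a role, no AI tool will rescue the delegation.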

Bullshit jobs and managerial feudalism

Poor delegation brings me to the most insidious problem that plagues the Dilbertesque corporation: bullshit jobs. We’ve discussed the concept earlier on this website, but here are two David Graeber quotes to recap.

“A bullshit job is a form of paid employment that is so completely pointless, unnecessary, or pernicious that even the employee cannot justify its existence, even though, as part of the conditions of employment, the employee feels obliged to pretend that this is not the case.”

“Shit jobs tend to be blue-collar and pay by the hour, whereas bullshit jobs tend to be white-collar and salaried… Those who work bullshit jobs are often surrounded by honour and prestige; they are respected as professionals, well paid, and treated as high achievers - as the sort of people who can be justly proud of what they do.” 

In his irreverent book and his New Yorker essay, Graeber lists a few categories of bullshit jobs that I’m sure you can identify examples for.

  • Flunkies: Their purpose is to make others feel important.
  • Goons: The only reason for their existence is that other companies also hire them.
  • Duct-tapers: They exist because the system is broken, and their bosses would rather have a human being bridge those flaws than make a systemic change.
  • Box-tickers: They allow organisations to claim that a lot is happening, often through paperwork, reports, surveys, newsletters, and other such activities, when in truth, they’re not creating much value.
  • Taskmasters: Their entire responsibility is delegating work to others. They manage people who don’t need management and create more unnecessary work for others.

Readers of this blog will also remember three categories I added to Graeber’s list.

  • Empire builders: Like taskmasters, they also delegate work to others. Unlike taskmasters, the importance of people in these jobs depends on how many layers of people work under them and how big each layer is.
  • Hand wavers: These jobs may not create anything of value themselves, but the people who play them are articulate, smart, and can make convincing points in a boardroom or, for that matter, a video conference.
  • Gatekeepers: Their existence is all about approving things that others need to do. If it weren’t for the fear of an improbable adverse event, these jobs wouldn’t exist. Yet they do, and they bottleneck people who might create something valuable.

Earlier this week, my friend Nag alerted me to Graeber’s idea of “managerial feudalism”, which is often a root cause of bullshit jobs. Managerial feudalism is the tendency of bureaucratic corporations to adopt hierarchies that resemble medieval, feudal systems. You’ll recognise a few characteristics of this tendency.

  • There are many management layers, with roles that serve no meaningful purpose. Some layers exist only to bolster individual managers’ status and prestige.

    Example: A middle manager hires a “report consolidation specialist” to create summaries of automated reports, solely to grow their empire of direct reports.


  • Control and hierarchy trump productivity. Bullshit jobs emerge only to justify the system.

    Example: A department uses a manual approval process for minor decisions, creating roles like “workflow coordinators” to manage easy-to-automate steps.


  • Managers create roles to expand their influence, not to meet an organisational need, trapping workers in meaningless tasks that sustain the hierarchy.

    Example: A manager hires a “data liaison officer” whose sole job is to reformat spreadsheets between two teams that use different templates, even though an automated tool could handle the task (a few lines of code, as the sketch after this list shows). The position exists only to retain the manager’s influence, not because the company needs it.
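
To see how little stands behind the “data liaison officer” example, here is a hedged sketch of that entire role as a script. The file names, headers, and column mappings are hypothetical placeholders, not anything from a real system.

```python
# The whole "data liaison officer" role as a script: map one team's CSV
# template onto another's. File names and column mappings are hypothetical.
import csv

COLUMN_MAP = {                      # team A's headers -> team B's headers
    "Cust. Name": "customer_name",
    "Deal Size (EUR)": "deal_value_eur",
    "Close Dt": "expected_close_date",
}

def reformat(source_path: str, target_path: str) -> None:
    with open(source_path, newline="") as src, open(target_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=list(COLUMN_MAP.values()))
        writer.writeheader()
        for row in reader:
            # Keep only the mapped columns and write them under team B's names.
            writer.writerow({COLUMN_MAP[k]: v for k, v in row.items() if k in COLUMN_MAP})

# reformat("team_a_export.csv", "team_b_upload.csv")  # run on the weekly export
```

The other two examples are no different in kind: consolidating automated reports is a scheduled job, and routing minor approvals is a rules check.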


Managerial feudalism and bullshit jobs are particularly destructive in a lousy job market. When people find themselves in a bullshit job, they have little incentive to speak up and seek another role. After playing a bullshit job for a few years, people get so disconnected from the skills they once had that they become redundant in the job market. What happens next is no surprise. People playing bullshit jobs fiercely guard their turf and, worse, become middle managers who propagate managerial feudalism. It’s a vicious cycle.

Look at the examples above and you’ll see that every one of them can benefit from automation, and often from AI interventions. But in a stiff job market, why would a feudal system abounding with bullshit jobs render itself irrelevant? How many people are confident enough to admit that AI can do their current jobs? How many managers are willing to give up their empires to AI? Executive fiat will only go so far if people are insecure about their employment opportunities in an AI-first world.


All this said, AI transformation is coming to your workplace soon, whether you like it or not. If you’re an average Joe worker like me, I suggest we acquire and strengthen valuable skills that AI can’t yet disrupt. Soon there’ll be no room to hide behind a prestigious bullshit job. I also reckon those most conversant with the consumer AI tools relevant to our jobs will be most resilient to disruption.

Conversely, leaders will probably do best if they recognise their moral obligations to their employees. If your workforce fears job losses, you can’t gee up your organisation to transform itself with AI. Workers must feel psychologically safe to delegate their jobs to AI and to create a corpus of knowledge that informs AI-driven actions. That psychological safety can only come from an assurance that the company will redeploy people into new, more fulfilling, more challenging jobs in an AI-first world. Easier said than done? For sure. Necessary? You bet!
