When Everyone’s A Thought Leader And No One Has An Original Thought


There’s a particular kind of fatigue that’s starting to set in among healthcare leaders who scroll through LinkedIn these days.

It usually starts with some variation of “In today’s rapidly evolving healthcare landscape,” followed by perfectly structured paragraphs that, 1,200 words later, leave you wondering how you just managed to read absolutely nothing about almost everything.

Welcome to the new era of healthcare thought leadership, where the content is polished, professional, and almost entirely forgettable.

Generative AI has created a world where every Tom, Dick and H.. I mean Jane, can now produce a 1,200-word article on value-based healthcare, leadership in health management, or digital transformation in approximately ninety seconds. The outputs will of course be grammatically flawless, strategically vague, and more than likely utterly devoid of original insight.

Let me declare well ahead of time that this reflection piece isn’t just hypothetical, and it isn’t just sarcasm shamelessly disguised as wit (not entirely, anyway).

But the fact is, open LinkedIn on any given Tuesday and you’ll find health leaders sharing posts that hit every expected buzzword the sector has known since clinical governance became a thing: “innovation,” “collaboration,” “patient outcomes,” and “systems leadership,” while offering nothing substantial that couldn’t have been found in the top three results of a Google search from 2017.

What we’re talking about is content that looks professional and sounds intelligent, reads well and feels legit, yet contributes almost nothing of value to the issues that matter. Content where AI-powered human efficiency has apparently become so super-charged that handfuls of articles hit us in the face every few minutes, from every corner of the world.

What’s probably even worse, and what adds some seriously annoying insult to injury, is that these posts are now also being accompanied by tell-tale AI-generated images. Invariably they feature well-lit groups of impossibly good-looking clinicians, often depicted deliberating around a polished conference table or gathered around a glowing tablet. Sometimes they’re shown idling in operating theatres that look like luxury apartments, smiling like they’ve just solved the health system sustainability problem before morning tea.
Yes, we all know none of this actually exists in the real world; these are just exaggerated reflections of a kind of dystopian clinical AI-fantasy. Which, in fairness, probably does explain the optimism.

Jokes aside, healthcare LinkedIn has well and truly entered the “AI-generated image” era, ushering in a contagion now driving the devastating Synthetic Image Epidemic, whose key symptoms have been known to include frequent bouts of violent nausea.

Why on earth is all this happening, you ask?
Simple. Because the content marketing industry is complex. And metaphorically ugly. But mostly just superficial.

Social platforms reward volume and popularity, not depth. Even the best engagement algorithms can’t really distinguish between original insight and competent mimicry. A post is a post is a post, and the platform just amplifies the content that generates more likes and more reactions, regardless of any intellectual merit. What once required creative writing now requires only clever prompting.

Another reason is that LinkedIn presence has become increasingly intertwined with career advancement. Demonstrating “some thought leadership” is believed to boost reputations, maintain the illusion of expertise, and factor into the head-hunting of executives. So when the appearance of expertise is unquestionably valued over its substance, AI-generated content becomes the clear and rational shortcut.

The unintended tragedy of this fragile social construct, though, is desensitisation to high-value content: people are now just mindlessly scrolling past everything. Even the good stuff. Even the genuine reflections. Even the content that took time, was creatively designed, and contributed something original.

Because when expertise is available in bulk and everything looks synthetic, authenticity doesn’t stand out; it gets ignored. Readers scroll past genuinely insightful analysis because they’ve been burned too many times by content that promised depth and delivered nothing. And when the authentic voices are drowned out in a sea of AI-generated content optimised for engagement rather than insight, the signal-to-noise ratio collapses.

So finding something genuinely insightful on LinkedIn these days is starting to feel increasingly like an archaeological dig. And every third or fourth post in our news feeds continues to hit us with the requisite accompanying image: clinicians with flawless skin and suspiciously symmetrical faces, and patients who appear mildly delighted to have been hospitalised.

The real damage though, extends far beyond just aesthetic annoyance. Even extreme aesthetic annoyance.

This is because healthcare is a field where ideas have consequences, where the difference between genuine insight and recycled platitudes can influence policy decisions, investment priorities, and ultimately patient outcomes.

Still, no need to despair; perhaps all is not lost just yet. For those few healthcare leaders committed to contributing authentic value and insight over just “building their brands,” all that’s required is an intentional pivot. We must learn to create better content, to produce a higher standard of writing, and to develop new, more creative ways of thinking.
Because what AI excels at is producing generic content on common topics. When it comes to genuine analysis, novel concepts, and ideas that don’t already exist in its training data, it struggles. So if that article you just posted could have been written by a person of limited imagination and unlimited access to ChatGPT, you should probably reconsider whether it even needs to exist in the world.

At the end of the day, this never-ending flood of AI-generated content forces some of us responsible healthcare leaders (ok, some sarcasm detected) to confront deeper questions about what we actually value now in professional development.
Do we believe that authentic insight matters, or are we comfortable with a performance of beautifully polished yet fundamentally empty expertise?
Do we want a reputable space where healthcare leaders can share ideas that actually elevate professional development and improve systems and outcomes, or are we content to keep being thrilled to attend this forum and that?

And if we do want something more authentic, are we willing to invest the time and effort to identify those authentic voices, bother to differentiate them from the stock-standard AI noise, and dare to actually engage with them?

These aren’t questions with straightforward answers, because human nature is human nature (hang on, I see something new and shiny over there..), and the platforms that host these conversations have limited incentive to prioritise quality over the number of likes or comments (they call it engagement).

If this is the case, and it clearly is, the burden falls on leaders themselves: leaders as readers who can choose what to engage with, and leaders as writers who can choose what to create.

The irony of this moment is that healthcare has never needed genuine thought leadership more desperately than it does now. The challenges facing the industry, such as workforce sustainability, value transformation, equitable access, and technological integration, are genuinely complex issues that demand the best thinking available.

Instead, what we’re getting are a thousand variations of “Healthcare is at an inflection point,” accompanied by what looks like an oil painting of clinicians with stethoscopes looped around necks that connect to faces assembled from statistical averages. And nearly always against a background of advanced-looking computer screens, for some reason.

The point is that this noise isn’t going away anytime soon. Neither, I suspect, is the aesthetic annoyance. But if we’re still interested in contributing something meaningful to healthcare leadership, we can always choose to write things that AI can’t. We can choose to be creative and design the images ourselves. We can choose to produce genuine insights. And we can choose what we pay attention to and reward.

Because honestly, and for the love of credibility, enough already with the AI-generated clones in the images and the migraine-inducing, over-populated infographics.

Disclaimer: while this article includes the requisite ninety-second AI-generated initial outline, it took another six and a half hours of writing in my own words to arrive at this level of sarcasm. Without inadvertently damaging the fragile AI ego by daring it to self-reflect in an AI article about its own annoying AI articles, that is.