The Day Robert Trusted AI a Little Too Much: An AI Workplace Fail


It started like any other Friday in the office, the kind where everyone is trying to wrap up the week, clear their tasks, and hopefully leave a little earlier than usual.

Robert was in his element—headphones in, laptop open, fingers moving quickly across the keyboard.

“Robert, have you done the presentation?” someone asked from across the room.

He didn’t even look up.
“Almost done,” he replied, although “almost” almost always meant done already.

The guy who made work look easy

Over time, Robert had built a quiet reputation in the office. He was efficient, fast, and reliable, so whenever something urgent came up, it naturally found its way to his desk.

People admired it, and honestly, they relied on it.

What they didn’t quite see, however, was how he managed to move so quickly.

The silent partner

At first, it was small things—he used AI to draft emails, refine reports, and summarise long documents, which made his work easier and faster.

But gradually, that support became dependence.

He began using it for presentations, insights, and even responses to internal questions, and before long, AI had quietly become his constant partner.

And because the output was always clean, structured, and confident, Robert started to trust it more and question it less.

So he stopped double-checking, stopped interrogating the information, and slowly, without realising it, stopped adding his own layer of thinking.

Friday’s presentation

That Friday, the room was not tense, but attentive.

Robert stood in front, moving smoothly through his slides, and everything looked exactly as expected: structured, clear, and professional. Until it wasn’t. 

“Robert,” a voice interrupted from the back.

He paused. “Yes?”

“Can you confirm the source of this data?”

It was a simple question, and under normal circumstances, an easy one to answer.

“This was part of my research,” he replied confidently.

But then came the follow-up: “Which report exactly?”

And just like that, the room shifted. Not dramatically, but enough to be felt.

The moment everything paused

After the meeting, the checks began, quietly at first, then more carefully. And as the details were reviewed, the truth revealed itself. The data wasn’t just slightly off or outdated; it was completely incorrect.

And the most uncomfortable part was not just the error itself, but the realisation behind it. Robert hadn’t verified it, because he had trusted it.

Later that day, Robert sat at his desk, staring at his screen, with the same tool he had relied on all week still open in front of him. Out of habit, he typed again. He paused, read the response, and this time it didn’t feel helpful or impressive; it felt incomplete.

The truth is, in many offices today, Robert is not alone. More people are working faster than ever, producing more and delivering sooner. In the process, however, something important is being quietly traded.

Speed is replacing scrutiny, efficiency is replacing accuracy, and convenience is replacing judgement.

Because here’s the truth

AI can help you start, and it can guide you, structure your thoughts, and even refine your work.

But it cannot verify your facts, it cannot fully understand your context, and, more importantly, it cannot interpret nuance the way you can.

And because of that, it also cannot take responsibility for the outcome.

That part still belongs to you.

Gene’s Office Survival Tip 

Don’t just accept what AI gives you—pause, question it, and make it make sense.

Robert didn’t stop using AI after that day; instead, he started using it differently, more intentionally, more carefully, and with a stronger sense of responsibility.

Because he came to understand that AI is not the problem, but how we use it can be.

And more importantly, he realised that the real skill is not just generating content quickly, but having the judgement to review it, the awareness to interpret it, and the discipline to refine it so that it fits the context.

So perhaps the most important rule to remember is this:

Let AI do 30%, but ensure you do the 70% that requires judgement, interpretation, and verification. Because in the end, it is not about how fast the work is done, but how accurate, relevant, and reliable it is. And that will always depend, not on the tool, but on you.

WRITTEN BY
Genevieve Amponsah
Jobberman Ghana