[Image: A magnifying glass resting on a printed data report — representing the STEM skill of scrutinising evidence before accepting a conclusion.]

Why STEM Students Are Better at Spotting Nonsense

April 15, 2026 · 4 min read

Someone sent me an article last year. Confident headline, lots of statistics, a conclusion delivered like it had been carved into stone. They wanted to know what I thought.

I told them the study had a sample size of forty-three people, the control group was poorly defined, and the conclusion didn't actually follow from the data even if you accepted the methodology at face value.
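Why does a sample of forty-three matter so much? A quick sketch makes it concrete. The 60% effect rate below is invented purely for illustration; the point is how wide the uncertainty is at that sample size, using the standard normal-approximation margin of error for a proportion:

```python
import math

def margin_of_error(p_hat: float, n: int, z: float = 1.96) -> float:
    """95% normal-approximation margin of error for a sample proportion."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# Hypothetical result: 60% of the sample showed the claimed effect.
moe_small = margin_of_error(0.60, 43)     # the study's actual n
moe_large = margin_of_error(0.60, 1000)   # a more convincing n

print(f"n=43:   ±{moe_small:.1%}")   # roughly ±15 percentage points
print(f"n=1000: ±{moe_large:.1%}")   # roughly ±3 percentage points
```

With forty-three people, a "60% effect" is statistically compatible with anything from a weak minority effect to an overwhelming one, which is exactly why the headline's certainty was unearned.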

They asked how I spotted that so quickly.

The honest answer is: I wasn't doing anything special. I was just reading the way I was trained to read.


We are living through a period of genuinely impressive nonsense production. Not because people are less intelligent than they used to be — they aren't. But the tools for packaging a bad argument in the clothing of a good one have never been more accessible, more polished, or more widely distributed.

A chart with no axis labels. A percentage without a baseline. A study cited without its sample size. A correlation dressed up as a cause. These things are everywhere — in news coverage, in corporate presentations, in government policy documents, in the content people share with absolute confidence every day.

Most people cannot see them. You are learning to.


The specific skill that STEM training builds here is not scepticism. Scepticism is cheap — anyone can distrust anything, and plenty of people have turned reflexive distrust into a full-time personality. That is not what this is.

What STEM training builds is something much more precise: the ability to evaluate a claim against the evidence that is being offered for it, and to notice the gap between the two when it exists.

Those are not the same thing. The first is a posture. The second is a method.


In a laboratory setting, you develop a relationship with evidence that most people never have to develop. You learn that data doesn't speak for itself — it has to be interpreted, and the interpretation can be wrong even when the data is clean. You learn that the methodology shapes the result before the result ever arrives. You learn that the difference between what was measured and what was concluded is a gap that requires justification, not assumption.

That kind of training changes how you read a claim. You stop asking "does this feel right?" and start asking "what would have to be true for this to be right, and is that what the evidence actually shows?"

That question is devastating to nonsense. Nonsense rarely survives it.


I want to be careful here, because this is a point that is easy to overstate in a way that becomes its own problem.

Knowing how to evaluate evidence does not make you correct more often than everyone else. It does not make you immune to motivated reasoning or confirmation bias — nobody is. What it does is give you a method for checking yourself that goes beyond whether a conclusion feels satisfying.

The person who has been trained to ask "what is the quality of the evidence here" will sometimes find that the evidence supports the conclusion they were hoping to reach. What they will not do — if the training has done its job — is skip the question entirely.

That habit is rarer than you would hope. And it is becoming more valuable as the volume of information, and misinformation, keeps increasing.


There is a version of this skill that pays off immediately — in assessments, in professional settings, in the ability to quickly identify when someone in a meeting is extrapolating well beyond what their data supports. That version is useful and worth recognising.

But the deeper version pays off over a lifetime. It is the accumulated advantage of having applied a consistent standard to the claims you encounter, rather than filtering them through whether you already agree with the speaker. The person who does that across decades ends up with a genuinely different picture of the world than the person who doesn't.

Not a perfect picture. But a more accurate one.


You are building that skill right now — not in a class on critical thinking, not through a module about media literacy, but in the practical, daily discipline of working with actual evidence and being held accountable for what you conclude from it.

The world is full of confident, articulate people presenting conclusions that don't survive scrutiny. You are learning to apply the scrutiny.

That is not a minor thing. In the current environment, it might be one of the most important things.

Adam

Adam Burgess studied theoretical physics, became a police intelligence officer, and then spent 30 years in ICT — none of which are as unrelated as they sound. He has never worked in a lab but uses his science degree every single day. He writes about critical thinking, technology, and why the skills nobody thinks are practical turn out to be the most useful ones.



Copyright 2025 ABC Training and Consulting – All rights reserved. Alan Bartlett Consulting T/A ABC Training RTO #5800