
20 November 2023

Often wrong, rarely in doubt.


We’re living in the Information Age. I don’t know what they’ll call the next age, since we don’t yet have enough information to make the call, probably because it’s too hard to imagine anything more wonderful than our modern technology (in the archaic sense of the word – filling a person with wonder).

The computing ability of that little device in my pocket is powerful enough to deliver most of the world’s information in a matter of seconds, any time, day or night. I marvel at this, in the same way I marvel at giant aircraft that fly to Japan and the little handful of pills I take every night that I’m told are keeping me alive. The pleasant female voice on my iPhone telling me how to get from Dublin to Killarney. I feel if you aren’t dazzled by these technical miracles, you aren’t paying attention. But still.

Hadron Collider

Is information the same as learning, and is learning the same as knowledge?

I’m what they call an Infomaniac, which is a common condition with writers, who want to know everything all the time. I obsessively absorb all the information I can grab, which is a lot, because I never know when it will come in handy. Though I’m beginning to think it’s too much.

One of the conclusions emerging from this gush of information is that much of it is inaccurate. While disinformation is rampant, most inaccuracies are unintentional, because the individual chronicler can only know so much, as is true with those who advise her, so she has to get some things wrong. Consequently, you have to take the things you learn with a grain of salt. A big, honking, room-sized boulder of salt.

A recent article in the New York Times by a learned scientist tells us we really shouldn’t expect science to have the right answers. Actually, quite the contrary. Scientists are often wrong, and the more conviction they display, the less reliable their assertions. I’ve known this for some time, having studied the history of science. Nearly every groundbreaking study and elegant theory is full of caveats, and is usually put forth more as a proposition than as an iron-clad, done deal. Scientists will only know how close they got to a definitive answer over time, as additional research adds to the understanding and the worthy process of challenges and counterarguments takes its course.

And the most wonderful thing to me is that while science can often predict with 100% certainty what will happen from a set of organized interactions, scientists often don’t know why. Much of modern electronic wizardry is based on theories of quantum mechanics, which not a single physicist in history has fully understood. They can only guess and approximate, and hope that their children and grandchildren will get us closer to the truth.

(Quantum mechanics is so hard to understand that at least one theoretical physicist thinks his science has given up trying. I agree with him that this is foolish. What if Lewis and Clark had stopped in Kansas, telling each other, this is just too hard?)

So that’s the other leg of the stool. Information leads to learning, which may or may not yield reliable knowledge, which rarely serves up truth, in the absolute way we all understand the word.

Consequently, truth is likely the most revered and slipperiest word in the language. An advertising colleague of mine once said, in the midst of a very confusing and stressful period at work, “I know my name is Joan and I live in a house.” Like her, I know certain things to be true. I love my wife, my dog and my family. I love the places I live, and my friends. I was born in Philadelphia and if I root for the Phillies, they’ll likely lose in the playoffs. Everything else is up for grabs.

glass of red wine

Every day I read something that totally contradicts what we’ve always considered to be established fact. Coffee is bad for you? Nope. It’s great. Drink all you want. Red wine is great for your health? Nope. Even a little bit will shorten your life. Neanderthals were lumbering, inferior oafs? Nope. Their brains were bigger than ours and they could kick our asses with one foot tied behind their backs. Honey bees are disappearing? Nope. We’re lousy with them.

My goal, and intended default setting, is to be a skeptic without becoming a cynic. To be open to everything, without believing anything prior to further examination. Trust but verify, as much as you can, and then still keep some skepticism in reserve.

As a young person, I was usually flush with passionate conviction. At this stage, when someone asks my opinion on something, anything, I usually say, “I’m not sure.”