Somehow, around two weeks ago, I strained? pulled? twisted? some thew or something in my left shoulder. Driving a tractor with all the controls on the right became a real hassle, and putting on my pants a challenge. Needless to say, posting one-handed wasn't fun either.
So I didn't.
But it's getting better at that casually glacial pace I now realize is how geezers heal.
So let's talk irrationality. It seems fitting just now when a new farm program is taking shape.
As many of you know, I have been fascinated with efforts by cognitive psychologists to understand why we make irrational choices, and indeed how we choose in the first place. A considerable body of research has piled up on the odd ways our minds work.
All of it is pretty damning when it comes to imagining our species as having anything at all in common with, say, Vulcans. We are persuaded by the trivial, the attention-grabbing, or the emotional in the face of hard facts to the contrary.
In fact, it all seems pretty futile.
The last several years have seen many popular psychology books that touch on this line of research. There's Ori and Rom Brafman's Sway, Dan Ariely's Predictably Irrational and, naturally, Daniel Kahneman's Thinking, Fast and Slow. If you could sum up the popular literature on cognitive biases and our so-called irrationalities, it would go something like this: we require only a small amount of information, oftentimes a single factoid, to confidently form conclusions and generate new narratives, adopting worldviews that seem objective but are almost entirely subjective and inaccurate.

Currently, I'm reading Jonathan Haidt's The Righteous Mind, which could be the best of the bunch. And despite the warnings voiced above, I think it has been remarkably helpful, especially in this election season.
The shortcomings of our rationality have been thoroughly exposed to the lay audience. But there's a peculiar inconsistency about this trend. People seem to absorb these books uncritically, ironically falling prey to some of the very biases they should be on the lookout for: incomplete information and seductive stories. That is, when people learn about how we irrationally jump to conclusions, they form new opinions about how the brain works from the little information they recently acquired. They jump to conclusions about how the brain jumps to conclusions, and they fit their newfound knowledge into a larger story that romantically and naively describes personal enlightenment.
Tyler Cowen made a similar point in a TED lecture a few months ago. He explained it this way:
There’s the Nudge book, the Sway book, the Blink book… [they are] all about the ways in which we screw up. And there are so many ways, but what I find interesting is that none of these books identify what, to me, is the single, central, most important way we screw up, and that is, we tell ourselves too many stories, or we are too easily seduced by stories. And why don’t these books tell us that? It’s because the books themselves are all about stories. The more of these books you read, you’re learning about some of your biases, but you’re making some of your other biases essentially worse. So the books themselves are part of your cognitive bias.

The crux of the problem, as Cowen points out, is that it’s nearly impossible to understand irrationalities without taking advantage of them. And, paradoxically, we rely on stories to understand why they can be harmful. [More]
But I agree about the "narrative bias". The reason farmers watch USFR is to hear a story about the commodity markets that has all the elements of any good yarn: good guys and bad guys, a range of motives, and nearly miraculous abilities to shape events.
Most of all, it transforms a raft of transaction details (sales) into a living entity: The Market. This entity goes shopping for acres, gets confused, and tries to persuade farmers to do various things like hold crops or shift acres. It is this enduring fairy tale that takes a mess of dry data and turns it into something we think we understand. In reality, it is little different from seeing a pony in the sky when looking at an accumulation of water vapor.
The bigger question is whether we can, or even should, override this predilection to choose a narrative over, say, a statistical summary. Currently, I still hold out hope we can, but it's so much hard work, often with so little reward, that we usually don't.
Regardless, just knowing our thoughts are riddled with inconsistencies should enable us to proceed with more caution, especially when insisting we know The Truth.