
Critical Thinking

My Brain Is Ripening

My brain is aging like a fine wine.


What cognitive advantages do young people have over me? Not as many as we once thought.

We once assumed that our brains grew until, oh say, our mid-twenties and then gradually declined until death. We had what we had and would never get any more. As cells died, they weren’t replaced. After reaching its peak, the brain was essentially static – it couldn’t grow or enrich itself. It could only decay.

A new paradigm holds that the brain is plastic – it can grow and change and build new connections well into our mature years. It may even be possible to do brain exercises to improve our mental performance.

This model generally divides intelligence into two types: fluid and crystallized. Fluid intelligence is the ability to think critically and manage effectively in novel situations. People with fluid intelligence can reason their way through unfamiliar territory by recognizing patterns and relationships. They figure out stuff on the fly.

People with crystallized intelligence have valuable facts and data stored in their brains and know what to do with it. (I have some data and I know how to use it!) It’s all about what we’ve experienced, learned, and remembered over a lifetime.

Young people tend to excel at fluid intelligence. Why? They get more practice. Since they don’t have many experiences, a greater proportion of their experiences will be novel. When our 14-year-old niece spent a summer with us in Stockholm, everything was new to her. She experienced her first taxicab, her first subway, and her first molten chocolate cake. She had plenty of chances to hone her fluid intelligence.

Older people, on the other hand, tend to have more crystallized intelligence. I experienced my first subway long ago. I learned from the experience and stored what I learned somewhere in memory (where it crystallized). I can deal with subways because I know about them. I don’t need to spot new patterns; I already recognize them. As I deal with fewer novel situations, my fluid intelligence gets rusty.

Now, there’s a new, new paradigm of brain function. It’s not just fluid versus crystallized. Rather, there are multiple cognitive skills and they peak at different times in our lives.

The new view is exemplified in the work of Laura Germine and Joshua Hartshorne. Germine and Hartshorne have recruited thousands of people of all ages to play mental games at testmybrain.org and gameswithwords.org. The resulting data allow the researchers to identify different cognitive skills and relate them to different age ranges. (The original paper is here. A less technical overview is here).

Here’s a summary of what they found:

Peak mental processing speed occurs, as expected, in the late teens and early 20s and declines relatively rapidly afterward. But other skills peak at different times. Working memory peaks in the late 20s to early 30s, and then declines only slowly over time. Social cognition, the ability to detect others’ emotions, peaks even later – in the 40s to age 50 – and doesn’t start to decline significantly until after 60. … Crystallized intelligence, measured as vocabulary skills, didn’t have a peak. Instead, it continued to improve as respondents aged, until 65 to 70.
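The reported peaks can be recapped in a small lookup table. The age ranges below are rough midpoints of the ranges quoted above, and the helper function is purely illustrative – a sketch, not anything from Germine and Hartshorne’s paper.

```python
# Approximate peak-age ranges for the skills summarized above.
# Ranges are rough readings of the quoted summary, for illustration only.
PEAK_AGE = {
    "processing speed": (18, 22),
    "working memory": (25, 35),
    "social cognition": (40, 50),
    "vocabulary (crystallized)": (65, 70),
}

def skills_near_peak(age):
    """Return the skills whose reported peak range includes this age."""
    return [skill for skill, (lo, hi) in PEAK_AGE.items() if lo <= age <= hi]

print(skills_near_peak(48))  # → ['social cognition']
```

The point of the sketch: at almost any adult age, something is at or near its peak – which is the paper’s central finding.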

The finding that crystallized intelligence doesn’t peak until 65 to 70 seemed to contradict earlier studies. When Germine and Hartshorne analyzed those earlier studies, however, they found that the peak itself had risen over time. Studies conducted in the seventies, for instance, suggested that crystallized intelligence peaked in the early 40s. Studies conducted in the eighties and nineties found a later peak: around 50. Studies conducted since 1998 showed an even later peak: around 65. So, perhaps, our entire society is getting smarter.

So don’t assume that my brain is declining as I age. Rather, in Germine’s phrase, it’s ripening. Maybe yours is, too.

The Hedonic Treadmill and Brain Health

The treadmill? Again?


A couple of years ago, I wrote an article that explains why your dog is happier than you are.

The general idea is rather simple. Everything that happens in a dog’s life is new and stimulating. Each car ride brings a new adventure. For us humans, new things or new experiences soon become the new normal. Rather than being stimulating and refreshing, new things quickly become part of a new routine. We’re soon back in the same old rut. It’s known as hedonic adaptation or the hedonic treadmill.

I’ve remarked on this to many dog-owning friends and they all agree that it’s real. Their general explanation is that dogs live in the moment and we don’t.

But why would that be? Why would dogs live in the eternal present while we humans continue to flit back and forth between past, present, and future? To live in the present, we humans need special training in mindfulness and meditation. Why isn’t it just our natural state of being? It seems to work pretty well for dogs.

Then I considered what it takes to keep the human brain healthy. Most of the sources I’ve consulted suggest that seeking novelty is a key ingredient of brain health. Seeking out novel experiences, learning new skills, and visiting new countries all stimulate us and contribute to brain health. Even reading a political columnist whom you disagree with can apparently contribute to a healthy brain. So can doing more things with your non-dominant hand.

Why would these activities contribute to brain health? Novelty stimulates new connections in the brain. We all have bazillions of brain cells. That’s all well and good but it’s the richness and density of the network that connects those cells that seems to influence brain health. Doing new things stimulates growth. Doing the same old things can reinforce existing connections but is less likely to create new ones.

So how do we encourage humans to seek novelty? Simple: make old things boring. Perhaps we experience the hedonic treadmill because we need novelty to promote brain health. With simpler brains, dogs don’t need a hedonic treadmill. They can live in the moment and stay perfectly healthy. We can’t.

Think about that the next time you’re lusting after a new car or a new house or a new toy. The acquisition might just bust your budget. But it might also make your brain healthier. At least for a while.

Do Smartphones Make Us Smarter, Dumber, Or Happier?

So which is it?

Smartphones:

  1. Make you lazy and dumb.
  2. Make the world more intelligent by adding massive amounts of new processing power.
  3. Both of the above.
  4. None of the above.

Smartphones have an incredible impact on how we live and communicate. They also illustrate a popular technology maxim: If it can be done, it will be done. In other words, they’re not going away. They’ll grow smaller and stronger and will burrow into our lives in surprising ways. The basic question: are they making humans better or worse?

Smarter or dumber?


Researchers at the University of Waterloo in Canada recently published a paper suggesting that smartphones “supplant thinking”. The researchers suggest that humans are cognitive misers — we conserve our cognitive resources whenever possible. We let other people – or devices – do our thinking for us. We make maximum use of our extended mind. Why use up your brainpower when your extended mind – beyond your brain – can do it for you? (The original Waterloo paper is here. Less technical summaries are here and here).

Though the researchers don’t use Daniel Kahneman’s terminology, there is an interesting correlation to System 1 and System 2. They write that, “…those who think more intuitively and less analytically [i.e. System 1] when given reasoning problems were more likely to rely on their Smartphones (i.e., extended mind) ….” In other words, System 1 thinkers are more likely to offload.

So, we use our phones to offload some of our processing. Is that so bad? We’ve always offloaded work to machines. Thinking is a form of work. Why not offload it and (potentially) reduce our cognitive load and increase our cognitive reserve? We could produce more interesting thoughts if we weren’t tied down with the scut work, couldn’t we?

Clay Shirky was writing in a different context but that’s the essence of his concept of cognitive surplus. Shirky argues that people are increasingly using their free time to produce ideas rather than simply to consume ideas. We’re watching TV less and simultaneously producing more content on the web. Indeed, this website is an example of Shirky’s concept. I produce the website in my spare time. I have more spare time because I’ve offloaded some of my thinking to my extended mind. (Shirky’s book is here).

Shirky assumes that creating is better than consuming. That’s certainly a culturally nuanced assumption, but it’s one that I happen to agree with. If it’s true, we should work to increase the intelligence of the devices that surround us. We can offload more menial tasks and think more creatively and collaboratively. That will help us invent more intelligent devices and expand our extended mind. It’s a virtuous circle.

But will we really think more effectively by offloading work to our extended mind? Or, will we forevermore watch reruns of The Simpsons?

I’m not sure which way we’ll go, but here’s how I’m using my smartphone to improve my life. Like many people, I consult my phone almost compulsively. I’ve taught myself to smile for at least ten seconds each time I do. My phone reminds me to smile. I’m not sure if that’s leading me to higher thinking or not. But it certainly brightens my mood.

Socrates, Cyberpunk, and Extended Minds

How far does it go?

Socrates argued against writing (the argument comes down to us in Plato’s Phaedrus). If we can store our memories externally, he warned, we won’t need to store them internally, and that would be a tragic loss. We’ll stop training our brains. We’ll forget how to remember.

Socrates was right, of course. Except for a few “memory athletes”, we no longer train our brains to remember. And our plastic brains may well have changed because of it. The brain of a Greek orator, trained in advanced memory techniques, was probably structurally different from our modern brains. What we learn (or don’t learn) shapes our physical brains.

Becoming literate was one step in a long journey to externalize our minds. Today, we call it the “extended mind” based on a 1998 paper by the philosophers Andy Clark and David Chalmers. Clark and Chalmers ask the simple question: “Where does our mind stop and the rest of the world begin?” The answer, they suggest, is “… active externalism, based on the active role of the environment in driving cognitive processes.”

If our minds extend beyond our skulls, where do they stop? I see at least three answers.

First, the mind extends throughout the rest of the body. As we’ve seen with embodied cognition, we think with our bodies as much as our brains. The physical brain is within our skulls but the mind seems to encompass our entire body.

Second, our minds extend to other people. We know that the people around us affect our behavior. (My mother warned me against running with a fast crowd). It turns out that other people affect our thoughts as well, in direct and physical ways.

The physical mechanism for “thought transfer” is the mirror neuron – “…a neuron that fires both when an animal acts and when the animal observes the same action performed by another.” When we see another person do something, our mirror neurons imitate the same behavior. Other people’s actions and moods affect our thoughts. We can – and do – read minds.

The impact of our mirror neurons varies from person to person. The radio show Invisibilia recently profiled a woman who could barely leave her own home, so affected was she by other people’s thoughts. (You can find the podcast, called Entanglement, here). The woman was so entangled with others that it’s nearly impossible to draw a line between one mind and another. Perhaps we’re all entangled – each brain is like a synapse in a much larger brain.

Third, we can extend our minds through our external devices. We now have many ways to externalize our memories and, perhaps, even our entire personas. In Neuromancer, the novel that launched the cyberpunk wave, people save their entire personalities and memories on cassette tapes. (How quaint). They extend their minds not only spatially but also into the future.

Neuromancer is about the future, of course. What about today’s devices … and, especially, the world’s most popular device, the smartphone? As we extend our minds through smartphones, do we reduce the “amount of mind” that remains within us? Do smartphones make us dumb? Or, conversely, do they increase the total intelligence available to humanity – some of it in our brains and bodies and some of it in our external devices?

Good questions. Let’s talk about them tomorrow.

Prospero’s Precepts – Thinking About Thinking

Really?


Did Shakespeare really write Shakespeare? It’s a question that’s been analyzed many times – mainly by historians and literary critics. But Peter Sturrock, a professor of physics at Stanford, recently took “A Scientific Approach to the Authorship Question.”

In his book, AKA Shakespeare, Sturrock uses probability, logic, Bayesian statistics, and good old-fashioned critical thinking to revisit the question. Sturrock argues that the real author of the Shakespearean plays could have been one of three different people. He uses a scientific, rationalist method and fashions a conversation between multiple observers, each with their own perspective.
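The Bayesian machinery behind Sturrock’s approach can be sketched in a few lines: start with prior probabilities for each candidate author, multiply by how likely each piece of evidence would be under each hypothesis, and renormalize. The candidates and every number below are hypothetical placeholders, not Sturrock’s own figures.

```python
# Minimal sketch of Bayesian updating over competing authorship hypotheses.
# Candidates and all probabilities are illustrative, NOT Sturrock's numbers.

def bayes_update(priors, likelihoods):
    """Return posterior P(hypothesis | evidence) given priors and
    per-hypothesis likelihoods P(evidence | hypothesis)."""
    unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Start with equal belief in three hypothetical candidates.
posterior = {"Stratford man": 1/3, "Oxford": 1/3, "Bacon": 1/3}

# Each piece of evidence updates the posterior in turn.
evidence_likelihoods = [
    {"Stratford man": 0.1, "Oxford": 0.6, "Bacon": 0.3},  # e.g. education
    {"Stratford man": 0.5, "Oxford": 0.3, "Bacon": 0.2},  # e.g. chronology
]
for lik in evidence_likelihoods:
    posterior = bayes_update(posterior, lik)
```

The appeal of the method is exactly what Sturrock exploits: each observer can plug in their own priors and likelihoods, and the arithmetic shows where the disagreement really lives.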

For Shakespeare buffs, this is catnip. But even if you’re not caught up in the intrigues of the Elizabethan era, Sturrock provides a fascinating look at how to think about complex and fractured issues.

Sturrock also collects and organizes 11 key insights into critical thinking that he calls Prospero’s Precepts. These form the intellectual foundation for his inquiry into the authorship question. For me, the list itself is catnip, and worth the entire cost of the book. Here are the Precepts. I hope you enjoy them.

All beliefs in whatever realm are theories at some level. (Stephen Schneider)

Do not condemn the judgment of another because it differs from your own. You may both be wrong. (Dandemis)

Read not to contradict and confute; nor to believe and take for granted; nor to find talk and discourse; but to weigh and consider. (Francis Bacon)

Never fall in love with your hypothesis. (Peter Medawar)

It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories instead of theories to suit facts. (Arthur Conan Doyle)

A theory should not attempt to explain all the facts, because some of the facts are wrong. (Francis Crick)

The thing that doesn’t fit is the thing that is most interesting. (Richard Feynman)

To kill an error is as good a service as, and sometimes even better than, the establishing of a new truth or fact. (Charles Darwin)

It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so. (Mark Twain)

Ignorance is preferable to error; and he is less remote from the truth who believes nothing, than he who believes what is wrong. (Thomas Jefferson)

All truth passes through three stages. First, it is ridiculed, second, it is violently opposed, and third, it is accepted as self-evident. (Arthur Schopenhauer)

By the way, I first discovered Sturrock’s book and the Precepts on Maria Popova’s excellent website, Brain Pickings.
