
Pardon me while I unitask.
I’m reading a delightful book by Maria Konnikova, titled Mastermind: How To Think Like Sherlock Holmes. It covers much of the same territory as other books I’ve read on thinking, deducing, and questioning, but it reads more like … well, like a detective novel. In other words, it’s fun.
In the past, I’ve covered Daniel Kahneman’s book, Thinking, Fast and Slow. Kahneman argues that we have two thinking systems. System 1 is fast, automatic, and always on. We make countless decisions each day but don’t think about the vast majority of them; System 1 handles them. System 1 is right most of the time but not always. It uses rules of thumb and makes common errors (which I’ve cataloged here, here, here, and here).
System 1 can also invoke System 2 – the system we think of when we think of thinking. System 2 is where we logically process data, make deductions, and reach conclusions. It’s very energy-intensive. Thinking is tiring, which is why we often try to avoid it. Better to let System 1 handle things without much conscious thought.
Kahneman illustrates the differences between System 1 and System 2. Konnikova covers some of the same territory but with slightly different terminology: she renames System 1 as System Watson and System 2 as System Holmes. She then analyzes System Holmes to reveal what makes it so effective.
Though I’m only a quarter of the way through the book, I’ve already gleaned a few interesting tidbits, such as these:
Motivation counts – motivated thinkers are more likely to invoke System Holmes. Less motivated thinkers are willing to let System Watson carry the day. Konnikova points out that thinking is hard work. (Kahneman makes the same point repeatedly). Motivation helps you tackle the work.
Unitasking trumps multitasking – thinking is hard work. Thinking about multiple things simultaneously is extremely hard work. Indeed, it’s virtually impossible. Konnikova notes that Holmes is very good at one essential skill: sitting still. (Pascal once remarked that “All of man’s problems stem from his inability to sit still in a room.” Holmes seems to have solved that problem.)
Your brain attic needs a spring cleaning – we all have lots of stuff in our brain attics and – like the attics in our houses – a lot of it is not worth keeping. Holmes keeps only what he needs to do the job that motivates him.
Observing is different from seeing – Watson sees. Holmes observes. Exactly how he observes is a complex process that I’ll report on in future posts.
Don’t worry. I’m on the case.

Meet your genome.
We all may well agree that 2013 was just plain weird. So, what’s next? Well, 2014 is the 100th anniversary of the beginning of World War I. It seems that most of our recent wars result, directly or indirectly, from World War I. Perhaps we should just rename the era the Second Hundred Years War.
Are there brighter things ahead? Do we have something to look forward to? Here are some suggestions from some of my favorite sources.
Meet your genome – Science magazine suggests that the era of personal medicine is just beginning. We’ll sequence your genome to develop personalized treatments for diseases like cancer or multiple sclerosis. In fact, it won’t be long before we sequence the genome of every newborn baby, just as a matter of course.
Meet your advertiser – as medicine gets personal, so does advertising. We’re changing from broadcast adverts to narrowcast – targeting demographic slivers wherever we can find them. Soon, it will be personalcast – advertising aimed at you and only you. Brick-and-mortar stores are even developing tools to track your movements in the store and make real-time special offers based on where you are.
Meet the robots – Technology Review notes that robots are ready to take their place in the workforce. They’ll start in dangerous places like battlefield rescues, but they’ll soon be able to “integrate seamlessly and safely in human spaces.” How will they learn? By studying us.
Meet your drone rescuer – the World Bank says that drones will be a “game changer” in disaster relief. They’ll help pinpoint where the problems are and drop supplies to isolated survivors. They might even “drone-lift” survivors to safety.
Meet an extinct species – 2014 is also the 100th anniversary of the extinction of the passenger pigeon. There are plans to bring it back. What next? I wonder if a T. rex would make a good pet.
Meet Consumption 2.0 – why bother to own things? Why not just pay for each use? We see it with music streaming … why not other things? We could conceivably stream books and magazines and pay for each page we read. Similarly, I just bought a new mobile phone. But I didn’t really buy it. I bought a service that provides me with a phone and the right to upgrade it once a year. With technology changing so fast, why would anyone buy one outright?
Insert your computer here – biological transistors should allow us to insert computers into any living cell. That may help us repair or replace diseased bits of soft tissue, just as we can replace bones and joints today. Indeed, bio-computers might help us understand our own brains better. We didn’t really understand what our hearts did until we invented pumps. We may not really understand what our brains do until we build biological computers.
Meet the tech-lash – robots, bio-brains, big data, technology-driven job destruction, loss of privacy, drones, etc., etc. Where will it all lead? According to The Economist, it will almost certainly lead to a tech-lash – as the technology elite “join bankers and oilmen in public demonology … in a peasants’ revolt against the sovereigns of cyberspace.”
Meet the world champion – of course, 2014 also brings us the World Cup of football. My country is in the “group of death” and I fear that we won’t make it to the knockout round. My money’s on Germany.

Go UT!
When I’ve written about best-of-breed education in the past (here, here, and here), I’ve mainly stressed the benefits to students. Students can acquire competencies in many different ways, have them tested and verified, and get on with life. They will have more choices, more flexibility, and a better education at lower cost.
That’s all well and good … but what about teachers? Would best-of-breed (BOB) education be better for us?
More specifically, could I start my own university?
I enjoy teaching and I especially enjoy teaching at the University of Denver (DU). But teaching at any university imposes some constraints. For instance, DU is on a quarter system so I can only start new classes four times a year. If I had my own university, I might start a new class every Monday on the Internet. Students would have a lot more choices of when and how to get an education.
I could probably lower the cost of education as well. While DU is competitive on cost, it’s not cheap. I could offer my courses for $100 per student – very competitive – and make it up on volume. I would probably make more money than I’m making now and, at the same time, students could reduce their costs dramatically. They’d also get some darn good classes (if I do say so myself). Quality and flexibility go up while costs go down.
Setting up my own university is not as far-fetched as it might sound. It all has to do with how Learning Management Systems (LMSs) have evolved. LMSs typically run on the Internet and help teachers manage all aspects of the education process – from lectures to grading to student communication. They can even help you identify plagiarism.
Traditionally, choosing an LMS was an institution-wide decision. A school (or a department within a school) would choose an LMS and all the teachers and students in the school would use it. For instance, I teach in University College (UCOL), the professional and continuing education unit within the University of Denver. Some years ago, UCOL standardized on an LMS called Pearson eCollege. Every course we teach now uses eCollege. (By the way, it’s quite good and I recommend it).
As LMSs have evolved over the past decade, they’ve become very rich platforms for serving up myriad educational experiences. The decision-making has also changed. It used to be an institutional decision. Now it’s a personal decision. Many LMS platforms are now free (or close to it) and they’re fairly easy to maintain. So I could acquire an LMS, set it up on this website, and start teaching. I could probably have it up and running in less than a month.
While students could learn a lot at the University of Travis (UT), they might find it difficult to prove that they had learned a lot. I would, of course, provide verifiable test scores and certificates. But that’s probably not enough. We would still need some type of universal testing system and “student passport” to verify what the students have learned and retained. As I’ve argued in the past, it’s not conceptually different from evaluating fine wines.
Andy Warhol once said that, in the future, everyone would be famous for 15 minutes. I have a slightly different take. In the future, everyone will be a teacher for 15 minutes. Barriers to entry are collapsing. While some people will teach full time, most people will teach from time to time. Opportunities to learn will expand dramatically and costs will drop.
Just one remaining question: what mascot should we choose for the University of Travis?

More efficient. Not more competitive.
Why is milk always at the back of the grocery store? Because of the precursor of Big Data. Let’s call it Little Data.
Retailers have always studied their customers’ behavior. An astute observer can be just as valuable as mountains of data. In the era of Little Data, grocers noticed that shoppers usually waited until they needed several items before going to the store. Milk was different, however. If a household were out of milk, a family member would go to the store for the express purpose of buying milk – and only milk.
Once grocers noticed this, they moved the milk to the back of the store. Shoppers who came in only for milk might notice several other things they needed (or wanted) on the trip through the store. Rather than buying one item, they might buy half a dozen. By relocating the milk, the grocer could sell more.
What happened next is instructive. Once one grocer figured out the pattern and moved the milk to the back, all other grocers followed suit. I’ve verified this in at least a dozen countries. The milk is always at the back. No grocer can establish a competitive advantage by putting the milk at the back of the store.
What does this have to do with strategy? I’ve always subscribed to Michael Porter’s insights on the difference between operational effectiveness and strategy. In his classic article, What Is Strategy?, Porter defines operational effectiveness as doing the same things as competitors but doing them better. Strategy, on the other hand, means, “… preserving what is distinctive about a company. It means performing different activities from rivals or performing similar activities in different ways.”
In the era of Little Data, we could figure out simple things like how consumers buy milk. Now, in the era of Big Data, we can identify much more subtle patterns in much greater detail. However, the underlying dynamic doesn’t change. Once one company figures out a new pattern, every one of its competitors can also implement it. As Porter points out, “…the problem with operational effectiveness is that best practices are easily emulated. … competition produces absolute improvement in operational effectiveness, but relative improvement for no one.”
Big Data, then, is about operational effectiveness, not strategy. Yet when I read about Big Data in management journals, I sense that it’s being treated as a strategic weapon. It’s not. Companies may have to invest in Big Data to keep up with the Joneses, but it’s never going to be a fundamental differentiator or a strategic advantage. It’s time for Big Companies to wise up about Big Data.

Time to make it cool again.
When my clients talk to me about innovation, it’s almost always a conversation about products rather than processes. They want to know how to generate new ideas that lead to new products. I often remind them that they should also be talking about new processes. New processes can lead to greater efficiency, reduced costs, and – sometimes – new products.
When my clients do talk to me about new processes, it’s almost always about customer service and satisfaction. I can’t remember the last time I had an engaging conversation about innovation in manufacturing.
It’s a shame, really, because we may be entering a new paradigm in manufacturing. New processes and methods may just allow the USA to re-establish itself as a leader in manufacturing. That’s the point made by William Bonvillian in a recent article in Science. I think it’s an important trend that we need to think more about – so I’d like to summarize Bonvillian’s article here.
Bonvillian identifies the differences between front-end and back-end innovation. Front-end innovation is mainly about R&D and new products. Back-end innovation focuses on manufacturing: what’s the best way to produce those products? Bonvillian argues that our national innovation investment used to be fairly balanced between the two. Today, with the possible exception of the defense industry, we focus most of our innovation investment on the front-end.
The background to this shift is a change in the way we see manufacturing – a change in the paradigm. The original paradigm was innovate-here-and-produce-here. The current paradigm is innovate-here-and-produce-there. This shift has largely been driven by the rise of low-cost offshore manufacturing. Bonvillian argues, however, that the current paradigm is not just driven by low wages. Offshore centers like China have also invested heavily in back-end innovation.
The next paradigm could be innovate-there-and-produce-there, which would leave the USA essentially as a services economy. Bonvillian argues, however, that we could reverse this trend and return to the original paradigm – innovate-here-and-produce-here – through initiatives in Advanced Manufacturing (AM). He points out that Germany, like the USA, has a high-cost manufacturing infrastructure yet runs “…major trade surpluses in manufactured goods, whereas the United States has run large deficits”.
So, what does Advanced Manufacturing consist of? Bonvillian outlines six different initiatives:
Network-centric production – embed IT into every stage of the manufacturing chain and use big data to raise the IQ of the entire production process.
Advanced materials – “Create a ‘materials genome’ using supercomputing to design all possible materials….” Designers could then select the most appropriate materials for any given product.
Nanomanufacturing – “Embed nano-features into products to raise efficiency and performance.”
Mass customization – use advances in 3D printing (also known as additive manufacturing) to create one-off products “at the cost of mass production”.
Distribution efficiency – the goal might be to reduce distribution costs by 10%. According to Bonvillian, that’s enough to shift decisions about onshore versus offshore manufacturing. (Could delivery by drones be part of this?)
Energy efficiency – Bonvillian argues that “U.S. manufacturing has long been overly energy-intensive.” Using energy efficient technologies “could significantly drive down production costs.”
Bonvillian develops a good list but I think one more thing is needed. Somehow we need to make it cool to participate in back-end innovation. Today, it’s cool to do financial innovation and product innovation. But I don’t see the best and brightest minds drawn to manufacturing innovation. Time to launch a branding campaign to make manufacturing cool again.