Strategy. Innovation. Brand.


The Future of 3D Printing (and Elliot)

I don’t know if 3D printing will change the world, but I do know that it’s helping Elliot start up a new business in Berlin.

Elliot’s a good designer. He’s designed everything from 3D videos to websites to room dividers to packaging. But he has a special knack for furniture – especially furniture designed on a computer.

Using 3D software, Elliot designed the table in the photos. He then built one, using a numerically controlled (NC) laser to cut the wood and a 3D printer to create the red resin joints. (You can get them in any color you want.)

The design incorporates both additive and subtractive manufacturing. Elliot creates the tabletop by cutting wood away – that’s subtractive. He creates the joints on a 3D printer by adding one layer of resin on top of another – that’s additive.

Elliot has now rented a studio in Berlin and is starting up a furniture design and manufacturing business, under the brand name Studio Elliot White. None of this would have been possible without 3D design software and printers.

So, is Elliot a harbinger of things to come? Well, maybe. A few days ago, the New York Times had a lively discussion on the topic in its Room For Debate section. Here are some highlights along with my ruminations.

Big picture – the grand vision has been that we will all have 3D printers in our homes. We could order Elliot’s table (or bench or chair) and he would simply send us the design files for us to print at home. That seems unlikely, at least in the near term. Home printers just don’t produce the necessary quality.

Future schlock – Amazon recently opened its 3D printing store. You can create your own products. Unfortunately, most of them are schlock – like a 3D printed plastic dog bone. Even with such simple products, you don’t get to print them yourself. Amazon prints them and ships them to you.

Jet engine parts – some analysts suggest that 3D printing will always be low quality because of inherent weaknesses in additive technology. But General Electric has figured out how to print fuel nozzles for jet engines, building them up layer by layer in metal. By doing so, GE reduces costs and lead times while improving quality.

Clothes – I wouldn’t have thought of printing clothes but start-up companies are pushing the trend. Indeed, the US Army seems to think that it can clothe soldiers in high-tech printed uniforms for greater comfort and safety. And, yes, there is even a range of 3D printed bikinis.

Food – yes, it’s possible to print food. I was heartened to learn that one of the first applications is to print chocolate onto other foods.

Guns – yep, printable guns are here. They’re cheap and undetectable. But don’t worry. They can only fire a few bullets before they break.

Manufacturing – could 3D printing return manufacturing to advanced countries? Maybe. As one Room For Debate writer noted, manufacturing productivity has risen much more quickly than overall business productivity. Meanwhile, the cost of labor in China is rising at 10 to 15% per year. If these trends continue, on-shoring makes a lot more sense.
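
To put rough numbers on that, here’s a quick compounding calculation in Python. The 10 and 15 percent growth rates come from the paragraph above; the starting cost index of 100 is just a placeholder, not real wage data:

# Rough sketch: how fast a 10-15% annual labor-cost increase compounds.
# The starting index of 100 is a made-up placeholder, not real wage data.
base = 100.0

for rate in (0.10, 0.15):
    cost, years = base, 0
    while cost < 2 * base:
        cost *= 1 + rate
        years += 1
    print(f"At {rate:.0%} per year, labor costs roughly double in {years} years")

If those growth rates hold, the labor-cost advantage roughly halves every five to eight years – the trend that makes on-shoring look increasingly sensible.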

So, what’s the future? Well, once again, I think we’re caught in the hype cycle. The initial hype was intense. Now there’s a bit of disappointment as reality seeps in. But that’s usually followed by a rising productivity curve as entrepreneurs sort out which technologies fit which markets. I’m generally optimistic. Now … does anyone want to buy a table?


My iPod Is Conscious

Speak, wise one.

Apparently my iPod is a sentient being. It senses its surroundings, understands context, and makes intelligent decisions.

Here’s the latest example. Yesterday, we received this week’s edition of The New Yorker. The cover features a couple kissing on the 59th Street Bridge. This morning, at the gym, my iPod randomly selected (from more than 4,000 choices) the 59th Street Bridge Song, the goopy old standard by Simon & Garfunkel. Even more eerily, the lyrics told me to “Slow down, you’re moving too fast…” which was exactly what I needed to do on the exercise machine I was using.

Clearly, my iPod knew about the magazine (the print edition!) and also knew that I was over-exerting myself. It selected the perfectly appropriate song from thousands of possibilities. Thank you, Steve Jobs.

But wait … really? Clearly the magazine’s cover art primed me to think about the 59th Street Bridge. When I heard the song, I made the connection. That’s the effect of priming. As for the advice on slowing down … well, I wouldn’t have noticed it if I weren’t overdoing it. In other words, I was being primed (or conditioned) in two different ways. I noticed things that I wouldn’t otherwise have noticed. I assumed there was a connection, but it was really just a coincidence.
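
There’s also a simple numbers argument for why such “meaningful” coincidences keep happening. Here’s a back-of-the-envelope sketch in Python; the library size comes from the story above, but the other parameters (how many songs could plausibly connect to something I’d just seen, how many tracks play per workout, how often I go to the gym) are purely illustrative guesses:

# Back-of-the-envelope: how likely is at least one "eerie" shuffle match?
# All parameters except the library size are illustrative assumptions.
library_size = 4000      # songs on the iPod (from the story above)
primed_songs = 20        # guess: songs that could connect to something recently seen
songs_per_workout = 15   # guess: tracks played during one gym session
workouts_per_year = 150  # guess: roughly three visits a week

p_per_song = primed_songs / library_size
p_per_workout = 1 - (1 - p_per_song) ** songs_per_workout
p_per_year = 1 - (1 - p_per_workout) ** workouts_per_year

print(f"Chance of a 'meaningful' match in one workout: {p_per_workout:.1%}")
print(f"Chance of at least one such match in a year:   {p_per_year:.1%}")

Under those assumptions, a match like the 59th Street Bridge Song is close to inevitable over a year of workouts. What’s rare isn’t the coincidence – it’s noticing it.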

Coincidences can screw up our thinking in myriad ways. Let’s look at the four ways that coincidence and cause can combine:

A) It’s a coincidence and we recognize it as such – most people would conclude that my iPod is not conscious … Apple’s not that good. We correctly conclude that it’s not a cause-and-effect situation.

B) It’s a coincidence but we think it’s a cause – this is where we can get into big trouble and deep debates. This is a problem in any discipline – like economics, sociology or climate science – where it’s difficult to run experiments. It’s hard to pin down if X causes Y or if Y causes X or … well, maybe it’s just a coincidence. (Maybe it was the rats).

C) It’s a cause and we recognize it as such – we know that certain germs cause certain diseases. So we take appropriate precautions.

D) It’s a cause but we think it’s a coincidence – before the 19th century, we didn’t recognize that germs caused diseases. We thought it was just a coincidence that people died in filthy places.

I suspect that many conspiracy theories stem from Category B. We note a coincidence and mistakenly assume that it’s a cause. The Dust Bowl in the United States coincided with over-farming and also with the rise of communism in Europe. A small but noisy group of people concluded that the Dust Bowl was not caused by farming techniques and drought but was actually a communist conspiracy.

We can also suffer from Category D problems. I read recently of a man who had a chronic infection in his right ear. Doctors couldn’t figure it out. Finally, the man took some earwax from his left (healthy) ear and stuck it in his right ear. The infection went away. It seemed coincidental that his left ear was healthy while his right ear was not but it actually pointed to a cause. His left ear had healthy bacteria (a healthy microbiome) while his right ear did not. The man suspected that the difference between his left and right ears was not coincidental. He was right and solved a Category D problem.

In a weird way, this all ties back to innovation. If we want to stimulate innovation, we can usefully ask questions like this: “I note that A and B vary together. Is that really a coincidence, or does it point to some deeper cause that we can capitalize on?” While Category B can generate endless debates, Category D could generate novel solutions.


Rats, Lice, and Innovation

A fundamental driver of history

I took a lot of history courses in college. Most of them were focused on a particular geography at a particular time like, say, Latin America in the 19th century.

A few, however, sought to describe and explain the entire arc of history – the grand narrative of the really big picture. I especially remember reading Arnold Toynbee’s A Study of History, a 12-volume set that chronicled 26 different civilizations and offered a cohesive explanation of why they rose and fell. (Truth be told, I read the abridged version).

As I read Toynbee, I finally came to understand the entire ebb and flow of history – why things happened and how one thing led to another. Then I read another book that shook my confidence and taught me that I probably didn’t understand much at all.

The book was Rats, Lice and History by Hans Zinsser. It’s a much more modest book than Toynbee’s but perhaps more enlightening. Its basic thesis is simple: a lot of stuff happens by accident and stupidity. One army defeats another not because of the grand arc of history but because rats have eaten the losing army’s grain. A civilization falls not because of religion (or lack of it) but because lice have spread disease among the population.

I’ve always taken Zinsser’s book as a cautionary tale. Whenever I read a grand narrative that claims to explain it all, I wonder if the author didn’t miss something random and elemental. Was Karl Marx right about the rise of the working class or did he just miss the fact that plague destroyed prevailing social structures? Did America become a great power because of Manifest Destiny or because two great oceans protected us from pathogens?

Though I understand something about pathogens, I never connected them to innovation – until last week. That’s when I stumbled across an article by Damian Murray in The Journal of Cross-Cultural Psychology. Murray connects pathogens to a culture’s ability to generate scientific and technical innovations.

The path from pathogen to innovation (or lack of it) is a bit circuitous. The basic argument is that the presence of pathogens causes cultures to adopt certain practices and behaviors that suppress disease. These practices include “xenophobia and prejudiced responses to foreign individuals”, the adoption of “conformist attitudes and behaviors”, traditionalism, collectivism, and authoritarianism.

Murray (and many others before him) argues that xenophobia, conformism, traditionalism, collectivism, and authoritarianism “serve to buffer against disease transmission.” In other words, they’re good for you and your culture. While these behaviors help ward off diseases, Murray argues that they also have a hidden cost: reduced ability to innovate.

These variables are linked in multiple ways. For instance, xenophobia, conformism, and traditionalism tend to produce collectivist as opposed to individualist cultures. Murray notes that a number of researchers (including Hofstede) have connected individualism to innovation. Similarly, conformism may lead to authoritarianism (or is it the other way round?) which may lead to a reduced rate of innovation.

Murray’s argument is ingenious and intriguing. But, for me, it’s not just about innovation. It illustrates a much bigger problem in understanding reality: we often don’t know what causes what. We look at the past and build arguments that A leads to B and B leads to C. Our stories are logical and comforting but probably wrong.

The French philosopher Henri Bergson warned us to beware of the “retrospective illusion”: the sense that mechanistic forces predetermined every event in history and that history could not have happened any other way. (This feels similar to the illusion of explanatory depth). For me, Murray and Bergson are saying the same thing: don’t assume that A causes B just because it seems logical and intuitive. Maybe it was just the rats.

Jill Disrupts Clayton (Sort Of)

I’ve been disrupted!

My career has been a steady diet of disruption.

Three times, disruptive innovations rocked the companies I worked for. First, the PC destroyed the word processing industry (which had destroyed the typewriter industry). Second, client/server applications disrupted host-centric applications. Third, cloud-based applications disrupted client/server applications.

Twice, my companies disrupted other companies. First, RISC processors disrupted CISC processors. Second, voice applications disrupted traditional call centers.

In 1997, a Harvard professor named Clayton Christensen took examples like these and fashioned a theory of disruptive innovation. In The Innovator’s Dilemma, he explained how it works: Your company is doing just fine and understands exactly what customers need. You focus on offering customers more of what they want. Then an alternative comes along that offers less of what customers want but is easier to use, more convenient, or less costly. You dismiss it as a toy. It eats your lunch.

The disruptive innovation typically offers less functionality than the product it disrupts. Early mobile phones offered worse voice quality than landlines. Early digital cameras produced worse pictures than film. But, they all offered something else that appealed to consumers: mobility, simplicity, immediate gratification, multi-functionality, lower cost, and so on. They were good enough on the traditional metrics and offered something new and appealing that tipped the balance.

My early experiences with disruption – before Christensen wrote his book — were especially painful. We didn’t understand what was happening to us. Why would customers choose an inferior product? We read books like Extraordinary Popular Delusions and The Madness of Crowds to try to understand. Was modern technology really nothing more than an updated version of tulip mania?

After Christensen’s book came out, we wised up a bit and learned how to defend against disruptions. It’s not easy but, at the very least, we have a theory. Still, disruptions show no sign of abating. Lyft and Uber are disrupting traditional taxi services. AirBnB is disrupting hotels. And MOOCs may be disrupting higher education (or maybe not).

Such disruption happens often enough that it seems almost intuitive to me. So, I was surprised when another Harvard professor, Jill Lepore, published a “take-down” article on disruptive innovation in a recent edition of The New Yorker.

Lepore’s article, “The Disruption Machine: What The Gospel of Innovation Gets Wrong”, appears to pick apart the foundation of Christensen’s work. Some of the examples from 1997 seem less prescient now. Some companies that were disrupted in the 90s have recovered since. (Perhaps we did get smarter). Disruptive companies, on the other hand, have not necessarily thrived. (Perhaps they, too, were disrupted).

Lepore points out that Christensen started a stock fund based on his theories in March 2000. Less than a year later, it was “quietly liquidated.” Unfortunately, she doesn’t mention that March 2000 was the very moment that the Internet bubble burst. Christensen may have had a good theory but he had terrible timing.

But what really irks Lepore is given away in her subtitle. It’s the idea that Christensen’s work has become “gospel”. People accept it on faith and try to explain everything with it. Consultants have taken Christensen’s ideas to the far corners of the world. (Full disclosure: I do a bit of this myself). In all the fuss, Lepore worries that disruptive innovation has not been properly criticized. It hasn’t been picked at in the same way as, say, Darwinism.

Lepore may be right, but that doesn’t mean that Christensen is wrong. In the business world, we sometimes take ideas too literally and extend them too far. As I began my career, Peters and Waterman’s In Search of Excellence was almost gospel. We readers probably fell in love a little too fast. Yet Peters and Waterman had – and still have – some real wisdom to offer. (See The Hype Cycle for how this works).

I’ve read all but one of Christensen’s books and I don’t see any evidence that he promotes his work as a be-all, end-all grand Theory of Everything. He’s made careful observations and identified patterns that occur regularly. Is it a religion? No. But Christensen offers a good explanation of how an important part of the world works. I know. I’ve been there.


Let’s Get Digical

I’m becoming a digical life form. Here’s the evidence:

  • I wear an electronic bracelet that keeps track of all the calories I burn. It even beeps to remind me when I sit still for too long.
  • An app on my smartphone keeps track of the calories I consume. In theory, I should be able to keep my calories-in lower than my calories-out.
  • We were in Berlin recently and were very impressed – but somewhat confused – by the extensive public transportation system. The solution? A digital mapping app on my smartphone. We could get from anywhere to anywhere quickly and easily (and drink beer along the way).
  • My smartphone also controls my new digital hearing aids. Among other things, I can program them to a given location, like a conference room. Whenever I return to that conference room, my smartphone senses where I am and sets the parameters automatically (a sketch of how such location-triggered presets might work follows this list). In some cases, I can hear better than my colleagues with “normal” hearing.
  • All of the devices I use today are external. If I live for another 20 years or so, I’m sure that some of the devices will be implanted in my body.
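
Here’s a minimal sketch, in Python, of how that kind of location-triggered preset could work: save a coordinate per place, and whenever the phone’s current position falls within some radius of a saved place, apply that place’s settings. The preset names, coordinates, and 30-meter radius below are all hypothetical; this is not how any particular hearing-aid app is actually implemented:

import math

# Minimal sketch of location-triggered presets. All names, coordinates,
# and the trigger radius are illustrative placeholders.
PRESETS = {
    "conference room": {"lat": 52.5200, "lon": 13.4050, "settings": "meeting profile"},
    "gym":             {"lat": 52.5210, "lon": 13.4100, "settings": "noisy profile"},
}
RADIUS_M = 30  # assumed trigger radius in meters

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in meters."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def preset_for(lat, lon):
    """Return the first saved place within the trigger radius, else a default."""
    for name, place in PRESETS.items():
        if distance_m(lat, lon, place["lat"], place["lon"]) <= RADIUS_M:
            return name, place["settings"]
    return None, "default profile"

print(preset_for(52.52001, 13.40502))  # near the saved conference room

The phone only needs to poll its position occasionally and re-run the lookup; the hearing aids themselves never have to know where they are.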

Digical is a blend of the physical and the digital. I think of it as adding digital extensions to humans (or other animals). But Bain & Company actually coined the term (in a recent white paper) and they think of it as business, not biology.

Let’s get digical.

In Bain’s usage, digical refers to the merger of a company’s physical and online operations. When e-commerce took off back in the 90s, some wild-eyed analysts predicted that it would spell the end of brick-and-mortar stores. As Bain (and many others) have pointed out, nothing could be farther from the truth.

As we all know (but sometimes forget) humans are social animals. We like to be around other people. We generally thrive in society and wither in isolation. (It’s why tall buildings make you crazy). For this very reason, Bain suggests that the future of retailing will be the digical world. Retailers will increasingly merge physical stores and online operations into “omnichannel” solutions.

Other industries – especially entertainment and technology – will go digical quickly. Even industries that might not seem like digical leaders are changing: construction firms are getting digital tools to dig better holes and build smarter buildings, and smart tractors use GPS and a databank of seed information to help farmers plant smarter, conserve resources, and increase yields.

In reading Bain’s white paper, three things stood out for me:

  • The biggest change is yet to come – Yikes! We’ve seen a lot in the past two decades. But Bain says the near future “…will bring far more innovation to most industries than they have seen in the past.”
  • Silos are major impediments – siloed organizations will be followers at best, never leaders. Perhaps the first step to becoming digical is to break down silos and…
  • …build a cohesive culture – The Bain authors never actually use Peter Drucker’s famous quote – Culture eats strategy for breakfast – but they certainly imply it. To become digical leaders, focus on culture first.

I like the term digical; I hope it becomes the word of the year in 2014. Bain has a very clear definition and useful advice for businesses. Personally, I’d like to see the definition expanded to include biology as well as business. After all, we’re all going digical.

(Digical is a service mark of Bain and Company).
