To iPad or not to iPad.

January 31, 2010 in Notes from the Field
Mickey McManus
BCG Senior Advisor

Did you watch the keynote or video highlights of the iPad release?

A master class

Hate him or love him, Steve is teaching a master class in big-picture thinking and business balance. It was classic and masterful (or, to put it in his and his team's own words... "amazing, incredible, simple, amazing, incredible, incredible, amazing").

Seeded with employees and press, the event was structured to get applause, even if everyone kinda expects the impossible daily from Jobs. The press and the applause were certainly more muted than at the launch of the iPhone. I suspect that's mostly because people had been talking about and expecting this sorta "inflated iPhone" for some time, so it was a bit harder to see the reason to get too excited. In fact, right after the launch the web was flooded with lovers and haters of the new device.

But I think Steve managed a clever balance of showing just enough without giving away everything, confounding expectations along certain dimensions while following the iPhone playbook on others (well, the iPhone was the most cited success in the recent history of business and computing, so maybe we should build on it... or, more likely, this was part of the plan all along and the iPhone and recent laptop models were just triangulations to learn the boundaries).

Not a one-off stunt, an architectural play

This is such a great example of a strategic play and maybe even more importantly, an architectural play:

  1. Seed the market with a new kind of product that pretty much ushers in a new interaction paradigm (the iPhone)
  2. Get tons of things built and make a true economy for all involved so large groups of people are motivated to innovate (instead of small groups of fanatic complexity hobbyists building ever more faithful open source copies of the past)
  3. Make everything scalable so that as you extend things you don’t force anyone to have to rewrite their applications too much, but give people more hooks so they are encouraged to invest time in making newer better ones for the next big thing
  4. Leave the PC connectivity so you still really have to have a computer to update and manage it (at least while they still sell those old clunky things)
  5. Step by step, take on just about every kind of competition when they pretty much least suspect it (and of course be frenemies: wanna use Kindle on the iPhone or iPad? Sure, we all play nice)
  6. And transform dying industries when they are most desperate for help (hello music industry, hello publishing industry, etc.)
  7. Oh yeah, don’t make it look like a PC, so people don’t fall prey to the baggage or expect all the “good” things like a mouse from the 70s, a constantly spinning beach ball, or all the other trappings of a dead or dying style of computing

A look at past futures may give us a clue

Although a tablet has been the subject of just about every other sci-fi movie and book about the near future for the last 100 years (actually more), we never quite knew how it would come about. I can’t help remembering watching Star Trek: The Next Generation and seeing Picard holding two slim tablets with different things on them and a few more scattered on his desk. I thought, “Oh, that’s so wrong and sad!” Why would people have multiple tablets when a single one could show multiple pages?

How silly and anachronistic of the set designers to do such a thing.

I was wrong

Fundamentally, two forces will make that scene inevitably come true.

The first is a phenomenon called “Where It Is Is What It Is,” something we stumbled on over 20 years ago in our research into how people pile and file documents on their desks. Humans use the physical environment to “remember” for them. We (and yes, actually squirrels and many other animals) evolved the ability to remember a large number of things by their location in the real world. That’s why the aforementioned squirrels can find their well-buried winter bounty of acorns months after hiding them. It’s why, no matter how cluttered someone’s desk is, they can find exactly the right report from the right pile instantly. So although we can only hold about seven plus or minus two things in short-term memory (a bit of pop psychology that is approximately right), we can “index” tens or hundreds of things by where they are in the world.

How does this play into the iPad vision?

Even though you could use one display, you’d have a harder time “indexing” the things you need to know because they’d all be hidden from view (like the fifth bullet point on a 50-slide PowerPoint deck; you would be a savant if you could call it to the forefront of your memory at the drop of a hat). Now, the iPad’s increased real estate over the iPhone will stand it in good stead for a while. And Apple has a number of nice visual organizational interaction patterns that simulate “Where It Is Is What It Is.” A nice example is how they help you remember the photos in your collection by creating a large 2D surface of snapshots that you can pan around. But studies have shown that business productivity increases significantly when workers have more than one display, or simply more physical display real estate.

You won’t own an iPad

You’ll have a bunch of them. Ahh, you say, those things are too expensive: $600, and I’d have a bunch?

Well, that brings us to the second force that will drive the future of interaction. The semiconductor industry recently noted that we make more transistors than grains of rice, and more cheaply.

Today the iPad is considered expensive (it’s good to remember that Steve’s first foray into visual desktop computing was the Lisa, which cost around $20,000-$30,000 apiece in today’s dollars).

I’d bet that you already have a spare iPhone sitting around, and if you don’t, someone at your office has one in a drawer or parts bin, or has recently gifted a slightly older iPod to a friend or family member. Soon iPad-like devices will be literally dirt cheap. They’ll give them away in cereal boxes (heck, within five years the boxes of some cereal brands will actually just be hobbled iPad screens running advertising apps). And just like USB thumb drives, we’ll find them in the crevices of our briefcases, scattered on the desks of our friends, and lost in the creases of taxicabs around the world.

Nano Crispies

So don’t be surprised when app developers start to come out with apps that run across multiple iPads, iPhones, cereal boxes, walls, and floors.

Our research has focused on just this sort of ubiquitous, location-aware computing for the last two decades (in fact, Bruce Horn, the designer of Apple’s foundational “Finder” in the original Mac, was an early researcher at MAYA). We can assure you pervasive computing is coming, and it is going to transform the world. One of our projects based on this sort of “Where It Is Is What It Is” thinking has driven improvements in collaborative decision making of over 400% (not our numbers, by the way, but the results of a published study by the government agency that built the original seeds of the internet, back when it was called the ARPAnet).

The secret that Steve has been keeping from everyone is not only a very agile and persistent business mind and a long-term vision, but also the notion that, in some sense, we should forget technology and design for people.

Initial reaction

It seems everyone will complain about no camera, no multitasking, and no handwriting recognition, but that all just seems like a master plan from Steve. I can’t help but think he could have done all those things (um, he already has in just about every other device he’s made, so it isn’t for lack of tech) but decided there was way too much money to capture (and far more human-centered culture change to drive) by phasing this stuff in as needed and leaving the vocal minority of geekboys, complexity hobbyists, and news-hungry pundits to lovingly snipe.

I think it is just about perfect for the first release of a new kind of computer. Others will say tablets have been around forever and never sold well, but they will miss the details.

We have to remember rule number one of innovation: big ideas are easy (a dime a dozen); details are hard (that’s why they take sweat and perseverance and timing and luck, and all the other things that successful innovators live and breathe).

Even though there have been tablets in the past, they didn’t have a successful economic model; they didn’t have multi-touch (an interface that at least gets us all a little closer to using our natural abilities than a strange puck we use to indirectly manipulate things) or all the other sensors we take for granted on a current-generation handheld device (GPS, orientation, compass, etc., which again help us treat information as a tangible thing); and the culture wasn’t at the right moment of readiness for them.

I think it will be wildly successful, will transform industries like health care, education, and publishing, and will get us one step closer to something sane in the computing industry.


Sadly, entire industries of business consultants will now spring up (as they did after the iPod, iPhone, and iTunes successes), preaching how you too can make money “JUST” by copying good ol’ Mr. Jobs. Gee, how did he do it, Mr. Wise Pro from Dover?

Shucks, I’ve decoded his secret formula and it’s pretty damn easy, “JUST” make a shiny object of desire, be a little different, act like David in front of Goliath, and build your own metaphor of an app store. Easy as 1-2-3.

