If I learned anything during my years at IBM it was the importance of managing your business. If I learned anything about Apple during my years at IBM it was the importance of delivering delight.
Near-death experiences have a way of sharpening your focus on survival, and IBM’s NDE in the early ’90s did exactly that. With the surprisingly fortuitous arrival of the Web in their darkest days, IBM focused on infrastructure, thinking it might be a long-term secular “big play.” They were right. In Wall Street parlance, it turned out to be a Kondratieff Wave. The bespectacled vision of a few wise IBMers saved the company. To gain presence quickly, IBM began acquiring companies that played in the software infrastructure and security space for Web-enablement. Acquisitions forced us to evolve business management capabilities that facilitated taking on new organizations and customers. The principles of financial and resource management in the Theory of the Firm were well understood by the growing number of MBAs pouring into the company. Today, over 100 acquisitions later, IBM is as much a business management platform as it is a tech company. That aspect of IBM is not well understood on the street.
Less than five years after IBM’s near-death, Apple suffered their own NDE. Not because they didn’t know how to manage their business but because they stopped delivering delight. Steve Jobs returned in 1996 with a starship full of delight and made Apple the most successful technology company on the planet by selling delightful hardware during the exact same years IBM abandoned its hardware business for software. But the irony didn’t begin there. Looking back, you can see that for ten years (’83–’93) a soft-drink salesman from PepsiCo managed Apple to near-death, and over a like time frame (’93–’02) a cookie salesman from RJR Nabisco managed IBM back to life—pure poetry given the crucial role cookies would come to play on the Web.
IBM’s mojo was its business management culture. Sure, we relied on various sales pipeline management tools, CRM, complex spreadsheet apps, SAP, you name it, but culture was the key. Peer-standing and upward mobility were largely a function of how well you managed your business—department heads and individuals alike. Apple’s mojo was its delight culture. There’s no doubt they knew how to design, build, and sell hardware devices, but their mojo was creating delight. Peer-standing and upward mobility were a function of your ability to deliver delight to customers. Just ask Jony Ive. Sir Jonathan Ive. He was 9 years old when the Apple I launched. Today he’s arguably the most powerful person at Apple. And all he ever did was deliver delight.
Yesterday, Ginni Rometty and Tim Cook announced a new strategic alliance between IBM and Apple. The details of the alliance are innocuous at first glance—IBM will create industry-specific iOS apps and connect them to their cloud service for business. Apple gets into the enterprise, IBM gets their cloud infrastructure into the iOS ecosystem. IBM still thinks infrastructure is the big play. Apple still thinks delight will sell. Only a fool would sell them short.
Few in the tech-press or on the street know the history of these two companies: their near-death experiences, how they transformed themselves, or the vast intellectual capital they developed around business management and delight. The people writing about this alliance today are too young to remember. They don’t even know that Tim Cook is an old IBMer.
Nothing is ever as simple as it sounds and most stories write themselves. You couldn’t make this story up if you tried.
Like the diabolical twists and turns on the descent into the dark recesses of the Grand Canyon, Machiavelli’s The Prince is a tour de force of pretzel logic, exposing such strange insights into human nature that it completely upends the common understanding of the way things are. But the book is also filled with so many contradictions that disagreement still reigns over its true meaning. The basic idea is that to do good, it’s often necessary to do evil.
People who work in information technology regard their work as a force for good and believe they’re part of something that makes life better. Information technology cuts across every industry, every business, and every person on the planet. It’s special. Few of us think doing evil can produce good, but ethical hacking is just one example among many that challenge that notion. When tech companies, newspapers, and government agencies use I/T to get at private information to dispense a societal benefit, we are all returned to 16th-century Italy to study the Florentine Master of insoluble problems surrounding virtue. One tech company, for example, has the anti-Machiavellian motto “Don’t be evil,” to which it has found it impossible to adhere. And none of us are innocent—we’re all irreparably conflicted by the fact that this company owns technology that is the sine qua non of the Web, which we all use and which makes life better for 7 billion people.
The Web, social media, and the cloud present their own set of strange insights into human nature that often upend the common sense of what’s good and what’s evil. We’re still working through the protocols that protect privacy and build trust but we’re not there yet. It’s complicated. The dark thoughts of Machiavelli’s century led to The Enlightenment in the next, so there’s hope but…
In the new e-book Learning With Big Data, the authors describe how an online professor uses real-time data to correct mistakes in course design. Pop-up quizzes are inserted into the course and the results are analyzed to show which material students return to when they don’t know the answer. It turns out that the order of presentation has a huge impact on learning. In one example, a short algebra review at the beginning was useless in helping students solve a problem presented later. Moving the review toward the relevant section increased problem-solving ability dramatically—the professor learned from his students. Today, he continuously improves the course and increases the value of education.
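The kind of analysis described above can be sketched in a few lines. Everything here is invented for illustration—the course version names, the quiz log, and the numbers are assumptions, not the professor’s actual data:

```python
# Hypothetical sketch: given per-student quiz logs, compare problem-solving
# success under two course orderings. All records below are invented.

from collections import defaultdict

# Each record: (course_version, student_id, solved). "review_first" puts the
# algebra review at the start of the course; "review_adjacent" moves it next
# to the relevant problem section.
quiz_log = [
    ("review_first", 1, False), ("review_first", 2, False), ("review_first", 3, True),
    ("review_adjacent", 4, True), ("review_adjacent", 5, True), ("review_adjacent", 6, False),
]

totals = defaultdict(lambda: [0, 0])  # version -> [solved, attempts]
for version, _student, solved in quiz_log:
    totals[version][0] += int(solved)
    totals[version][1] += 1

for version, (solved, attempts) in totals.items():
    print(f"{version}: {solved}/{attempts} solved ({solved / attempts:.0%})")
```

With real logs, the same aggregation over thousands of students is what lets the course designer see which ordering actually helps.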
Most of us think of big data as vast amounts of information processed by big computing revealing big insights. The professor used the Web, laptop computers, a little imagination, and a paucity of data to impact learning in a huge way.
What if the biggest insight is actually in Small Data?
In a metro of 6.3 million, what are the odds of recognizing a face on a crowded dance floor at a free concert…in a city park…at night?
After processing this photo we did actually recognize a couple of faces. This kind of thing happens a lot. We’ve recognized faces in crowded airports all over the world. A world with 7 billion faces. Some we’ve seen several times in vastly different places. How is it possible?
Life is not random—it consists of patterns. Travel patterns, work habits, and a universal love of coffee expose certain people to the same scenes over and over. Sooner or later we begin to recognize faces against what appear to be impossible odds. Humanity swims in a stream of patterns—the odds favor facial recognition.
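The intuition can be made concrete with a back-of-envelope calculation. All the numbers below—crowd size, number of outings, and the pattern-driven overlap rate—are invented assumptions, chosen only to show how repeated, patterned exposure shifts the odds:

```python
# Back-of-envelope sketch: chance of at least one encounter with a particular
# known face over repeated outings, under two models. Numbers are invented.

# Purely random model: a given person lands in your random crowd of 500
# drawn from a metro of 6.3 million, checked over 200 outings.
random_p = 500 / 6_300_000
random_chance = 1 - (1 - random_p) ** 200

# Pattern-driven model: shared habits (same park concerts, same coffee spots)
# raise the assumed per-outing overlap chance to 1 in 500.
pattern_p = 1 / 500
pattern_chance = 1 - (1 - pattern_p) ** 200

print(f"random encounters:  {random_chance:.1%}")
print(f"pattern encounters: {pattern_chance:.1%}")
```

Under these made-up assumptions, patterned exposure turns a long shot into an even-money sort of bet—which is why the “impossible” recognition keeps happening.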
Until recently, facial recognition was a function of eyesight combined with brainpower. But today, facial recognition is a function of surveillance cameras attached to the Internet and integrated with Big Data computers to identify commercial opportunities and security threats. These systems do not yet rival human brains, but they are infinitely quicker at processing raw data. What they lack in nuance they more than make up for in brute force.
Facial recognition is now a real-time planetary-scale app. Life will never be the same. We are like the native peoples of the 1800s who lived in the Western Territories right before a wave of settlers, fortune-hunters, and railroads came and transformed reality…forever. No one then knew what the future would bring but transformation took place over decades so people had time to adjust. Today transformations occur over the course of a few years…or months. With technology advancing at an increasing rate, the future is more now than later.
But when transformation occurs at the speed of electrons…will we be able to recognize its face?
The promise of ‘Big Data’ to clean up misperception remains elusive, but the shortcomings of information technology can no longer be blamed. Data and the tools to work it are in abundance, but truth is really hard work. In Star Trek, Data’s evil older brother Lore finds truth easy—he has a positronic brain possessing a total linear computational speed of 60 trillion operations per second. Lore instantly sees the ugly truth behind the beautiful myths humans concoct, and he strives to subject the human race to the Borg collective, where no one can make up stories about anything. Lore’s name is brilliantly ironic.
Lore’s younger brother Data also has a total linear computational speed of 60 trillion operations per second, and he also knows the truth instantly. But Data has something Lore does not—an intense curiosity regarding ethos: why do humans invent beliefs, ideals, and morals, and what is the real purpose of “lore” in human evolution? We cannot forget that in the development of Data, Lore comes first. And that eventually Data deactivates Lore…an act rich with symbolism.
With each nanosecond it becomes clearer that some version of transhumanism is our destiny. Even now, corrective DNA sequences are being printed on storage media, and soon engineers will be able to insert the code into our bodies and correct mistakes in our “programming.” Technology like that is already so mundane you can watch it on CNN. The merging of information technology with human evolution is happening right now, and sometime this century we too will experience life at 60 trillion operations per second. The only question is…