Prior to the 1970s, computer and electronics-based technology was hardly a pervasive part of our everyday lives. Automobiles were not yet computerized, fax machines were just taking off, and, significantly, the personal computer had yet to be popularized. Individuals who dreamed of a career in an information technology department were likely those who happened to come into contact with technology. Microsoft founders Bill Gates and Paul Allen famously attended the Lakeside School in Seattle, which gave them access to a time-shared PDP-10 computer. There they learned computer programming and, equally important, dreamed the big dreams that would have an impact on us all.
Today technology is everywhere. Many of us wake to clocks that double as MP3 players, brew coffee with makers that run on electronic timers, navigate our daily routes with global positioning systems (GPS), check in to flights and select our seats before we arrive at the airport, and use BlackBerrys to send and receive email with colleagues wherever in the world we may be.
Today we are all technologists, insofar as we interact with technology every day. The younger one is, the more innate all of this is. I did not have a computer in my home until I was in high school. My children have been using them since they first learned to speak. The younger generations are becoming what Richard Nolan, professor emeritus at Harvard Business School, refers to as “Digital Natives.”1 Technology is in their DNA, and the consequences are profound.
A similar evolutionary path can be traced within corporations. Well into the 1980s, technology leaders tended to follow a typical career track: they studied engineering at a university, got their first jobs in some technology-centric business, and worked their way up the corporate ladder. Even if they reached the apex of the technology department, they were nowhere close to being viewed as peers of the other division leaders within the organization. To make matters worse, the average tenure of a chief information officer (CIO) was roughly two years. No wonder the acronym “CIO” was cynically said to stand for “Career Is Over.”
During this time, IT executives likely were overseeing the development of systems that supported “back-office” functions. They might have built a homegrown accounting system or a system to track employees’ hours.
These were important undertakings, to be sure, but CIOs tended not to garner invitations to the strategy-setting table at which the company’s long-term vision was discussed. They tended not to develop innovations that would help the sales staff understand their customers better, to say nothing of systems that customers themselves would use. Frankly, customers did not expect to interact with technology. There was no Internet; there were no self-serve gas stations or grocery checkout lanes. ATMs were barely known.
The late 1970s and early 1980s were a turning point. Personal computers gained in prominence. Answering machines became household necessities, and VCRs were used to record television programming or to watch movies. CIOs began to add recognizable value to the businesses in which they operated, though typically that value came in the form of cost-cutting. This was a time when automation was quickly replacing manual labor.
In the 1990s, a key innovation sparked the leap forward in the prominence of technology in everyday life: the Internet. Many people point to Netscape’s initial public offering in August 1995 as the watershed moment in this trend. The resources that the Internet offered increased the efficiency of research, putting a world of information at one’s fingertips. Those who were connected, both individuals and companies, advanced by leaps and bounds beyond those who were not.