In this retrospective article, Morgan Pare charts the inventions and partnerships of the computing industry over the last 25 years. What did it take to get computers into our homes, then into our pockets, and then to the “cloud”? Where will developments in artificial intelligence and machine learning take us next?


By Morgan Pare

In the past 25 years, computing has done its best to deliver on the promise highlighted by Bill Gates when he remarked that “Never before in history has innovation offered promise of so much to so many in so short a time.” Nearly everyone would have to agree with him when it comes to computing. In this article we explore the main advancements that brought us to where we are today and, to help us along the way, I have scoured Spotify for the best songs of 1991 to remind us just how far we’ve come.

Cooperation (Everything I Do, I Do It For You)

If we were to hop into a time machine and set course for 1991, we would land right in the middle of a blossoming romance that has shaped the last 25 years of personal computing history: the software-hardware marriage of Windows and Intel. The two firms had been brought together 10 years earlier by the launch of the IBM PC, and the 1990s proved to be a great decade for both of them.

Microsoft’s commercial success with the Windows OS began with Windows 3.0 in 1990. Windows 95 debuted the hallmark Start Menu and the now-infamous Internet Explorer, and was followed over the 2000s and 2010s by the excellent XP, the better-forgotten Vista, the redeeming 7, the start-button-less 8, and Windows-as-a-service 10.

For Intel, 1991 was the year in which it launched the “Intel Inside” marketing campaign. Intel achieved further brand differentiation in 1993 with the introduction of the Pentium microprocessor. Whilst the Pentium brand proved to be a great marketing success, some techies were initially upset by the absence of numbers (think HAL 9000 or R2D2!). That clearly didn’t hold sales back, and by 2015, Intel’s products featured in over 80% of PCs shipped worldwide[i].

Both firms benefitted greatly from the Internet-fueled demand for PCs in the late 1990s: the Wintel partnership provided both consumers and businesses with a fast, stable and intuitive interface with which to harness the major productivity gains that the Internet offered.

However, neither firm did as well in the later wave of mobile computing devices, such as smartphones and tablets. Here, a hot new couple has emerged: ARM and Android. Google’s Android OS powers 87% of the world’s smartphones[ii], and ARM’s chip designs feature in a whopping 99% of the world’s smartphones and tablets combined[iii].

Methods (Things that make you go Hmmm…)

While less visible than the advances in personal computing, the evolution of back-office servers has been no less remarkable. The rise of the PC and Internet over the last 25 years has seen firms move from centralized mainframes, to distributed client-server architectures, and on to the current paradigm of virtual servers and cloud computing.

The availability of general-purpose PCs in the 1990s, combined with the ability to interconnect them over local area networks, opened the door to flexible client-server architectures. Far more affordable than monolithic mainframes, the client-server approach was widely adopted by smaller firms that didn’t require the high-volume transaction capabilities of mainframes and couldn’t afford their high price tag. File servers replaced ‘Sneakernet’ and commercial off-the-shelf software sales boomed.

The next wave of change came in the 2000s with the emergence of ‘cloud computing’. Rather than dedicate servers to particular applications and have them stand idle when not needed, cloud computing uses virtual machines that can share server hardware. These virtual machines are created in software and can thus be set up and torn down very quickly, without the need to order, rack and configure physical machines. This ability to dynamically assign computing power is far more efficient for applications with demand that changes by time of day, or day of month (think: “billing cycle”). It also enables multi-tenanted platforms, which have given rise to new business models that charge for computing power (Infrastructure as a Service) on a usage basis.
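To make the on-demand model concrete, here is a minimal sketch in Python using the AWS boto3 SDK: a virtual machine is launched for a job and terminated as soon as the job is done, so capacity (and cost) tracks demand. The region, machine image ID and instance type shown are placeholders for illustration, not recommendations.

```python
# Minimal sketch of on-demand provisioning with boto3 (the AWS SDK for Python).
# The region, AMI ID and instance type are placeholder values for illustration.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Spin up a virtual machine only when the workload (e.g. a billing run) needs it.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image
    InstanceType="t3.medium",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id} for the billing run")

# ... run the job, then tear the machine down and stop paying for it.
ec2.terminate_instances(InstanceIds=[instance_id])
```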

The economics, flexibility and ease of consumption of virtualization have led to rapid adoption throughout the IT industry. Many firms have chosen to use multi-tenanted, public cloud services such as Amazon AWS and Microsoft Azure. Others have built their own private clouds, and some have a hybrid of both. It’s interesting to note that Amazon’s cloud efforts were initially a response to scaling its own computing needs; however, by sharing the platform with tenants it has reaped enormous scale (and unit cost) benefits for its retail business.

Virtualization has been key to scaling some of the largest and best-known online services. Another important advance has been in the field of distributed computing. Distributed computing divides large tasks into subtasks and distributes these for execution across many servers, which massively improves speed and efficiency. Importantly, these approaches are designed to work with large clusters of inexpensive, commodity servers, the exact opposite of the mainframe world. This parallel processing approach has enabled firms to cost-effectively process huge volumes of data, allowing web giants such as Google and Facebook to scale and fuel their businesses.
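The divide-and-conquer pattern is easy to illustrate, even shrunk down to a single machine: the Python sketch below splits a word count into chunks, processes them in parallel worker processes, and then merges the partial results, much as MapReduce-style systems do across whole clusters of servers. The input data here is made up for the example.

```python
# Toy illustration of the divide-and-conquer idea behind distributed computing,
# scaled down to one machine: map chunks across workers, then reduce the results.
from collections import Counter
from multiprocessing import Pool

def count_words(chunk_of_lines):
    """Map step: count the words in one chunk of the input."""
    counts = Counter()
    for line in chunk_of_lines:
        counts.update(line.split())
    return counts

if __name__ == "__main__":
    lines = ["the quick brown fox", "jumps over the lazy dog"] * 10_000  # made-up input
    chunks = [lines[i::4] for i in range(4)]            # divide the task into 4 subtasks

    with Pool(processes=4) as pool:
        partial_counts = pool.map(count_words, chunks)  # run the subtasks in parallel

    total = sum(partial_counts, Counter())              # reduce: merge the partial results
    print(total.most_common(3))
```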

In the software market, cloud computing has also spawned the concept of Software as a Service (SaaS). In the SaaS model, applications are hosted online and accessed using a web browser as a thin client. Examples include Google Docs (on which this article is being typed) and Microsoft’s O365. Hosting software in this way enables new features and bug fixes to be applied centrally, much like mainframe software. It promises lower support costs by removing the need to distribute, install and maintain local copies on PCs. It also provides an opportunity for the vendor to charge on a recurring basis and is a good defense against piracy.

Finally, with greater computational firepower, there has been increasing focus on artificial intelligence (AI) and machine learning. Both awe-inspiring and slightly terrifying, these technologies have stirred controversial debate through the late 2000s and into the 2010s, mainly around safeguarding against future machine mishaps and the ever-greater role computing plays in our lives. IBM’s Watson is most famous for its triumph on Jeopardy! in 2011. Today, Watson is helping revolutionize the healthcare industry; it can process vast repositories of medical data and help doctors to reach a correct diagnosis and prescribe for even the rarest symptoms. Last year, Google’s DeepMind beat Go champion Lee Sedol with AlphaGo, a machine that learned in part by playing against itself, bringing into stark focus the growing resemblance of computers to humans.

Whilst the means by which computers have been employed over the past 25 years have varied massively, it is clear that each innovation brings a world of opportunity, functionality, and fascination.

Moore’s Law (Should I Stay or Should I Go?)

Moore’s Law captures an observation by Intel co-founder Gordon Moore that the number of transistors that can be put on a microchip doubles roughly every two years. Since 1965[iv], this exponential increase in computational power has been a driving force behind advances in the electronics and high-tech industries. The ability to process more information, more quickly, and at lower cost than was possible the year before, has fueled competition and new entrants in many sectors. It also accounts for the fact that your new smartphone is likely to have vastly more power than a PC you bought in the previous decade.
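A quick back-of-the-envelope sketch in Python shows how dramatically that compounding adds up over the 25 years this article looks back on, assuming the commonly cited two-year doubling period:

```python
# Back-of-the-envelope arithmetic: if transistor counts double roughly every
# two years, how much denser are chips after 25 years of Moore's Law?
years = 25
doubling_period = 2                      # years per doubling (commonly cited figure)
growth = 2 ** (years / doubling_period)  # compound doubling
print(f"~{growth:,.0f}x more transistors after {years} years")  # ~5,793x
```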

Given the importance of Moore’s Law to the industry, advances in microprocessors are closely tracked and much effort is spent on finding new paths to higher performance. Chipmakers have been nearing atomic limits for some time; it will soon be physically impossible to shrink silicon transistors any further, and funding the R&D required to keep trying is increasingly difficult.

However, industry observers will tell you that the law has always had its sceptics, as Peter Lee (VP at Microsoft Research) jibes, “The number of people predicting the death of Moore’s law doubles every two years.”

The possibility of an end to Moore’s Law presents an inflexion point and incentivizes researchers to experiment with more exotic methods of extending the rise in computing power. Some are exploring materials beyond silicon; others take an even higher-level approach, questioning the very nature of current computing methods by harnessing quantum mechanics or emulating biological brain functions.

The Future (Get Ready For This)

Questions still surround the future of computing. As the cloud continues to drive down costs and revolutionize the way that enterprises and consumers alike approach computing, will expensive hardware ownership become extinct? Will other relationships emerge to define this era, perhaps even the very relationship between Man and Machine? And finally, will a new champion step up to honor Moore’s Law, be it a similar or an entirely different beast?

Despite the lingering questions, when looking back at the pace of innovation in computing over the last 25 years and the promising trends that are emerging, it is clear that there is reason to be optimistic and excited about computing’s future.

For your Playlist:

  • “Everything I Do, I Do It For You” Bryan Adams, Albums: Robin Hood: Prince of Thieves (soundtrack) and Waking Up the Neighbours (June 1991).
  • “Things That Make You Go Hmmm…” C+C Music Factory, Album: Gonna Make You Sweat (June 1991).
  • “Should I Stay or Should I Go” The Clash, Album: Combat Rock (June 1982). Re-released in 1991 as a double A-side with “Rush”.
  • “Get Ready for This” 2 Unlimited, Album: Get Ready! (September 1991).

To mark Cartesian’s 25 years in the telecoms, media, and technology sector, we asked our consultants to reflect on industry topics and write about how they have changed over the last few decades. Click here to receive your copy of our anniversary eBook: 25 Years – A retrospective on innovation in the telecoms, media, and technology sector


Notes:

[i] King, Ian. Bloomberg: “Intel Forecast Shows Rising Server Demand, PC Share Gains”. July 2015.

[ii] IDC: “Smartphone OS Market Share, 2016 Q3”. Q3 2016.

[iii] Vance, Ashlee. Bloomberg: “ARM Designs One of the World’s Most-Used Products. So Where’s the Money?”  February 2014.

[iv] Simonite, Tom. MIT Technology Review: “Moore’s Law Is Dead. Now What?” May 2016.