Computing Evolves. Part VII: All Around You (2009-2019)

Last Updated: Sunday, May 28, 2023

215 years of computing (1804-2019)

It’s been a long march from the Jacquard loom to now. Over the course of the last 200+ years, computers have gone from being mechanical to electro-mechanical, from big and programmable to tiny and everywhere (yet also secretly still big).

In this series, we’ve covered the evolution of computing from A to Z, starting with the First Industrial Revolution, which produced the ancestors of today’s computers, and segueing into World War II and its existentially motivated spike in innovation.

After the war, peace gave computers a chance to grow into civilian applications. In the late 40s and 50s, former WWII researchers became free-wheelin’ entrepreneurs, and Silicon Valley was born with the development of the first silicon-based semiconductors.

Over the 60s and early 70s, giant mainframe computers operated by trained personnel dominated the scene. This status quo finally gave way in the late 70s, with the advent of affordable personal computers. 

In the 80s, computers infiltrated polite society. Video games were one Trojan horse, getting a generation of kids hooked. Graphical user interfaces (GUIs) and mass marketing helped too.

In the 90s, with the help of the Internet, computing went full-on mainstream. This decade saw intense competition in the computer hardware market, the pop culture phenomenon that was Windows 95, and the launch of the first iMac.

The desktop computing boom of the 90s gave way to the rise of laptops in the mid-2000s. As laptops became more powerful and less brick-like, miniaturization gave rise to the first smartphones. The 2000s also saw the advent of cloud computing, social media, and YouTube. The stage was set for the computer ecosystem of the 2010s.

Over the past decade, computing has been defined by distributed networks, cloud infrastructure, massive server farms, and deep learning supercomputers.

Which brings us to today. We hope you like computers, because in the 2010s you can’t do anything without one!

Getting personal

As our personal computers have shrunk and become a ubiquitous feature of our lives, pulled from our pocket in the form of a smartphone or plonked down at the local third wave cafe in the form of a laptop, the massive computers that back up our everyday usage have proliferated.

In the final part of our seven-part series on the history of computers, we look at the computing ecosystem since the 2008 global recession and take stock of where tech is today.

(If you missed Part VI, you can check it out here.)

The long tech boom

The 2008 Great Recession dramatically changed the world economy, and so too the world of computing. 

The crisis created an aversion to risk in the business world. New emphasis was placed on using tech to make companies more adaptable to fast-changing market conditions.

One consequence has been the meteoric rise of co-working spaces like WeWork, where membership dues, leases, and office spaces are flexible. The idea: When your company grows, just rent more desks. 

In a similar vein, the mainstreaming of Agile, Scrum, and Lean project management practices reflects an emphasis on doing what needs to be done without waste. It’s all about making a company sustainable yet fast-moving.

With that in mind, it’s interesting to note the historic financial meltdown didn’t actually hit tech and computing too hard.

After the dot-com bubble popped in 2000, tech put itself back together slowly but surely, experiencing modest growth in the mid-2000s. The NASDAQ (and by extension the tech and computing economy) dipped in the 2008 recession, but it recovered quickly, unlike the rest of the global markets.

Unsustainable lending practices in American real estate were the catalyst for the global recession in 2008, which didn’t directly impact tech, apart from turning more consumers into penny-pinchers for a while. This was the exact opposite of what happened during the dot-com/NASDAQ meltdown of 2000, when tech crashed into a deep recession while other markets quickly recovered and went back to being their perky selves.

Indeed, what happened this time around was unique: the rest of the global economy turned to tech as the new engine of growth. The ‘digital economy’ now makes up 15.5% of global GDP, and it has grown 2.5x faster than the global economy as a whole over the past 15 years.

The rise of ecommerce has been abetted by big data and by sophisticated analytics, business intelligence, marketing, and SEO tools. The maturation of the Internet economy has delivered the value that many of the 90s dot-com companies tried to deliver too fast, too soon.

The 2010s have been all about the rise of SaaS apps, including CRMs and a dizzying range of creativity- and business-oriented digital tools. CRM became the fastest-growing business software category, and demand for tools that facilitate remote and mobile work soared over the decade.

The tech hub transmogrification of San Francisco, Seattle, New York, and a host of other cities globally speaks to this new condition. It’s all down to the ever-deepening penetration of computing into daily work and life.

According to Hootsuite’s 2019 report on the global state of digital, Internet use worldwide has reached 4.3 billion people, or 57% of the earth’s total population (and a whopping 95% in the USA). Some 67% of the world’s population uses a mobile phone, many of them smartphones (which are now dirt cheap).

As the number of people going online has increased, so too has the need for education that provides digital literacy. The centrality of the Internet in contemporary life has also raised thorny issues about privacy, personal data, and corporate and government surveillance.

Many parts of the developing world have skipped a whole phase of infrastructure-building and gone straight to accessing the Internet on phones. In sub-Saharan Africa, historically the least ‘Interneted’ neck of the woods, cheap Android devices are enabling the poorest and most rural segments of society to join the global information society for just under $50.

The increasing speed and falling cost of mobile broadband, along with the ubiquity of Wi-Fi, have led to a new computing ecosystem: cloud computing.

Cloud computing and giant hidden machines

Cloud computing and distributed networking have allowed us to do our work across laptops, smartphones, tablets, and desktops, sync everything we’ve done so we can access it on any device we choose, and offshore data storage to server farms far, far away.   

These days, if you really wanted to, you could probably get by with an iPad and an external keyboard alone. 

Working in Google Docs or the web version of Microsoft Office, it’s easy to take the ease of collaborative work and file-sharing for granted, ditto for on-demand apps and gigabytes of cloud storage. Laptops are now light, thin, and run unplugged for hours. The brawny desktop towers and CRT monitors of the near-past seem a distant memory.

Yet this UX breeziness masks the reality of what’s running all of our #nofriction lifestyles, namely a dizzyingly vast computing infrastructure.

The computerization of society requires…more computers

The story of computing in the 2010s is all about giant, hidden-away machines that run cloud services or do extremely complex calculations on everything from artificial intelligence, machine learning, and automation to healthcare, weather, physics, and energy.

Companies like Amazon and Google run massive server farms, creating not-insignificant carbon emissions in the process. In fact, it’s estimated that data centers consume 200 terawatt-hours of electricity per year (roughly the annual electricity consumption of the entire country of South Africa). As the ‘computerization’ of society inevitably continues to increase, so too will this figure.

Meanwhile, demand for supercomputing has soared over the last decade, as the machines find applications far beyond basic research, in fields as diverse as data analysis and measuring climate change.

Supercomputers are mainly busy doing two things. The first is machine learning, analyzing giant data-sets at extremely fast speeds to come up with valuable, practical insights. The second is deep learning, solving hyper-complex equations that were previously unsolvable (or solvable only at a great cost in time and resources). 

A lot of this is done with AI technology, which has so far proven most adept at tackling hyper-specific tasks. This emphasis on narrow specialization means it probably will not produce a hyper-sentient, apocalypse-inducing machine like Terminator’s Skynet or the Allied Mastercomputer from I Have No Mouth, and I Must Scream. At least not in the foreseeable future (sorry).

These days it’s China and the USA who have the fastest AI-enabled supercomputers. The two countries are locked in a tech arms race, which has become part of the ongoing US-China trade war as the US withholds technology from Chinese firms it deems a national security risk. That restriction has upped the ante, as both sides try to deliver homegrown tech that outperforms the other.

China’s side of the race has included some of the most advanced supercomputers in the world. The country has 219 machines on the TOP500 list of the world’s fastest supercomputers, and held the top spot for five years (2013-2018).

The Sunway TaihuLight was the last Chinese machine to claim the title. 

Sunway TaihuLight

Installed at the National Supercomputing Centre in Wuxi, Jiangsu province (about a two-hour drive from Shanghai), the Sunway TaihuLight computer, which came online in June 2016, was the fastest computer in the world for two years. 

With a speed of 93 petaflops, or 93,000 trillion calculations per second, the Sunway TaihuLight is roughly 125,000 times faster than the highest-end personal computer you could work on at home.
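If you want to see the arithmetic behind that comparison, here’s a back-of-the-envelope sketch in Python. (The ~0.75 teraflop figure for a high-end home PC is an assumption on our part, picked to illustrate the article’s ratio rather than describe any specific machine.)

```python
# Back-of-the-envelope speed comparison (illustrative assumptions only)
taihulight_flops = 93e15   # 93 petaflops = 93,000 trillion calculations per second
home_pc_flops = 0.75e12    # assumed ~0.75 teraflops for a high-end home PC

print(f"TaihuLight is ~{taihulight_flops / home_pc_flops:,.0f}x faster")
# -> TaihuLight is ~124,000x faster
```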

The machine uses a dizzying number of Chinese-designed and -built processors (40,960 in total) known as ShenWei 26010 chips, also known by the anglicized name Sunway.

Each 64-bit ShenWei 26010 processor is made up of four core groups, each pairing a single management processing element (MPE) with 64 computing processing elements (CPEs) arranged in an 8x8 grid and backed by 8GB of DDR3 memory, for a total of 260 cores per chip. That gives it performance roughly equivalent to Intel’s Xeon Phi series of super high-end chips, which is important because it demonstrates China’s newfound ability to close the processor-design gap with American companies.
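Multiply those figures out and the scale becomes clearer. A quick sketch in Python, using only the chip and core counts quoted above:

```python
# TaihuLight's total core count, from the figures quoted above
chips = 40_960            # ShenWei 26010 processors in the machine
cores_per_chip = 260      # 4 core groups x (1 MPE + 64 CPEs)

total_cores = chips * cores_per_chip
print(f"{total_cores:,} cores")   # -> 10,649,600 cores
```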

The TaihuLight’s deep learning capabilities have been used for pharmaceutical research, weather forecasting, and industrial design. It has also been used to simulate the universe (no big deal).

IBM Summit  

As of 2019, the fastest supercomputer in the world is the IBM Summit supercomputer at Oak Ridge National Laboratory, Tennessee (the Oak Ridge facility dates back to 1942 and played a key role in the Manhattan Project). The IBM Summit gave the USA back the ‘fastest supercomputer’ crown when it got up and running in June 2018. 

The IBM Summit can do 200 quadrillion (i.e. 200,000,000,000,000,000) calculations per second, or 200 petaflops. That’s more than double the speed of the TaihuLight.

It’s powered by 9,216 super-fast, super-advanced IBM POWER9 CPUs, with performance further accelerated by 27,648 Nvidia Volta GPUs. 

Each Nvidia Volta GPU delivers 125 teraflops of deep learning performance. A teraflop, by the way, is a measure of mathematical capability: one trillion floating-point operations (i.e. calculations on numbers with decimal points) per second.
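Stack those GPU figures up and you can see why Summit is billed as a deep learning machine. Here’s a rough sketch in Python using only the numbers quoted above; note these are peak, best-case throughput figures, not sustained real-world performance:

```python
# Rough aggregate numbers for Summit, from the figures quoted above (peak, not sustained)
gpus = 27_648
dl_teraflops_per_gpu = 125   # deep learning performance per Volta GPU, as quoted above

total_dl_flops = gpus * dl_teraflops_per_gpu * 1e12
print(f"~{total_dl_flops / 1e18:.2f} exaflops of peak deep learning throughput")  # -> ~3.46

# Headline speeds quoted earlier, in petaflops
print(f"Summit vs TaihuLight: {200 / 93:.1f}x")   # -> 2.2x, i.e. "more than double"
```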

The whole system weighs over 340 tons, about as much as 50 fully-fattened male African elephants (i.e. the largest land animal on earth). Its footprint covers 5,600 square feet (the size of two regulation tennis courts). The machine is strung together by 185 miles of fiber-optic cable and kept cool by 4,000 gallons of water pumped through it every minute.

Summit also draws a very large amount of power: 13 megawatts’ worth (just in case your metrics are a bit rusty: a kilowatt is 1,000 watts, so a megawatt is 1,000,000). That’s enough to power around 8,500 residential homes. Quite a lot of power indeed.
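If you want to sanity-check that homes figure, the implied draw per household works out to roughly 1.5 kilowatts of continuous power, a plausible average for a residential home. A quick sketch in Python:

```python
# What the "8,500 homes" figure implies per household
summit_power_watts = 13e6    # 13 megawatts
homes = 8_500

per_home_watts = summit_power_watts / homes
print(f"~{per_home_watts:,.0f} W per home")   # -> ~1,529 W, i.e. roughly 1.5 kW continuous
```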

Still, it draws slightly less power than the TaihuLight, which consumes 15 megawatts, while delivering more than twice the performance, making it considerably more energy-efficient.

Summit, like the TaihuLight, is geared towards accelerating innovation and human discovery with deep learning algorithms. Some of the weighty tasks it has been put to so far include climate modeling, health research, and high-energy physics.

So what’s next?

Super-fast exascale computing is set to be the next ‘big thing.’ To put that speed into perspective: an exaflop is a million trillion (that’s 1 followed by 18 zeros) calculations per second, or a thousand petaflops.

If that still sounds like gibberish, consider this: If every human baby, tween, adult, and oldie-goldie on the planet could somehow do 100 million simultaneous calculations per second, an exascale supercomputer would still out-perform them all.
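Here’s that back-of-the-envelope comparison as a short Python sketch. (The ~7.7 billion world population figure is our assumption for 2019.)

```python
# Humanity vs. one exascale machine (illustrative back-of-the-envelope)
world_population = 7.7e9            # assumed ~7.7 billion people in 2019
calcs_per_person_per_sec = 100e6    # 100 million calculations per second each

humanity_flops = world_population * calcs_per_person_per_sec
exascale_flops = 1e18               # one exaflop

print(f"Humanity: {humanity_flops:.2e} calc/s vs exascale: {exascale_flops:.2e} calc/s")
# -> 7.70e+17 vs 1.00e+18: the machine still wins
```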

The Exascale Computing Project in the USA is planning to launch its first exascale computer by 2021. China, meanwhile, aims to beat Uncle Sam to the punch, with the Chinese National University of Defense Technology (NUDT) projecting its roll-out of an exascale computer by 2020 as part of the country’s 13th Five-Year Plan (yes, they still do those).

This speed boost will have many profound consequences for the pace of human innovation. Allowing for more complete simulations of the complex forces in the universe, exascale computing will alter everything from nuclear physics to healthcare.

On a less cosmically weighty note, supercomputer-assisted deep learning and deep neural networks also promise to deliver truly autonomous vehicles in the near future.

Hello computer, my old friend

“Consider a future device …  in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory.”

— Vannevar Bush, “As We May Think,” The Atlantic, 1945 

From pretty much the beginning, computers have been primarily concerned with science and commerce. This continues to be the case, although somewhere along the way (i.e. the 2000s), they also became an indispensable aid to our personal lives: We share very private details in messages with friends and family, pay our credit card bills, look up stuff on Wikipedia at 3AM, buy matcha, and do hundreds of other everyday things on our devices.

Computers have become a part of ‘being human.’ It’s pretty much impossible to imagine a non-computerized existence in any Western or developing world society. To quote computer pioneer Ted Nelson, now “everything is deeply intertwingled,” so much so that in 2016 the UN declared Internet access to be a human right.

Today, computers are everywhere, and absolutely necessary to participate in the world in any meaningful way. Which makes it downright quaint to think about how few people ever got to see, let alone operate, early, purpose-built computers like Colossus or ENIAC.

Of course, even if your Luddite heart wants that world back, it’s just not possible.

But that, my friend, is why we have history articles like this: to remember the paths that brought us to our current predicament, so we can know where we fit in and how things might be different in the future.

We hope you’ve enjoyed our seven-part series on the evolution of computing.
