Technology Prediction Hits and Misses of the Decade

As the decade comes to a close, we can look back on a number of rapid transformations in the IT landscape. But how much of what has transpired was predicted? We thought it would be fun to take a look back at the predictions made for this past decade to see which predictions did and didn’t pan out.


What they got right

Prediction: “Experts agree that more computing services will be available in the cloud. A recent study from Telecom Trends International estimates that cloud computing will generate more than $45.5 billion in revenue by 2015.”

The 2010s saw the rise of cloud computing. In fact, the industry has grown from $24.5B in 2010 to an estimated $156B for 2020. A good example is Amazon Web Services (AWS). At the start of the decade, Amazon was still better known for eBooks than anything else, but in the background AWS was getting off the ground. While most view Amazon as a hallmark of the decade’s drastic shift in retail sales, it is AWS that has been the real business success for Amazon: AWS now accounts for fully 71% of Amazon’s overall income.
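As a quick sanity check on those figures (our own back-of-the-envelope arithmetic, not from the study quoted above), the growth from $24.5B to $156B over ten years implies roughly 20% compounded per year:

```python
# Implied compound annual growth rate (CAGR) of cloud revenue,
# using the figures quoted above: $24.5B in 2010 to $156B in 2020.
start, end, years = 24.5, 156.0, 10

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # prints roughly 20.3%
```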

Prediction: “The Internet will be a network of things, not computers.” “By 2020, it's expected that the number of Internet-connected sensors will be orders of magnitude larger than the number of users.”

Well, not quite orders of magnitude, but certainly the number of devices connected to the internet has grown substantially. Most current estimates put the number somewhere in the range of 20 billion (vs. about 4 billion Internet users). Still, from watches and doorbells to smart speakers and refrigerators, the number of devices going online is expanding at an exponential rate. It’s all the buzz as we step into the new decade, with 5G cellular technology being touted as a catalyst for a huge jump in connected devices in the coming years. Paired with the rise of cloud computing, there is a rapidly expanding ocean of digitized data about our lives waiting to be used… either to our benefit, or at our peril.
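A quick back-of-the-envelope check (our own arithmetic, not part of the original prediction) shows why “not quite orders of magnitude” is the right verdict:

```python
import math

devices = 20e9  # rough current estimate of Internet-connected devices
users = 4e9     # rough current estimate of Internet users

ratio = devices / users
print(f"Devices per user: {ratio:.0f}x")                # 5x
print(f"Orders of magnitude: {math.log10(ratio):.1f}")  # ~0.7, i.e. less than one
```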

Prediction: The Internet will attract more hackers.

With ever increasing amounts of valuable data to exploit, hackers have indeed arrived in growing numbers. While the number of records exposed each year is volatile, the trend is unmistakable: 16 million records were exposed in the first year of the decade, compared to 447 million in 2018. Throughout this decade and heading into the next, Extract Systems is proud to play a role in preventing the exposure of sensitive data for our customers through our automated document redaction software.

 

What was missed

 

Machine learning

Scouring predictions from the beginning of the decade, we didn’t find any direct predictions of the resurgence of AI and machine learning. There were references to technologies that would ultimately be built on machine learning, such as this apparent prediction of the now ubiquitous digital assistants Siri, Alexa, and Google Assistant: “Most searches will be spoken rather than typed.” At the start of the decade, Google had secretly begun researching the use of machine learning to drive cars, but the world knew only of niche uses such as DARPA’s challenges. IBM had announced its intent to enter Watson into a Jeopardy! challenge, but the supercomputer required to power it did not hint at the sudden relevance machine learning would have in wide-ranging software applications as this decade closes.

Why did we not see this coming? In large part because, at the start of the decade, machine learning was considered old news, not terribly useful outside of a few very targeted applications. Research in machine learning had been going on since the middle of the 20th century, yet nobody had been terribly successful at employing it to solve significant problems.

So, what happened? First, the Graphics Processing Unit (GPU). A GPU is a chip that handles the specific types of calculations needed to render 3D graphics, chiefly large batches of matrix and vector arithmetic. These chips were developed primarily to drive video games, but it turns out the same calculations are also the ones needed to train machine learning algorithms. GPUs were orders of magnitude faster at these operations than CPUs, so types of training that had previously been impractical suddenly became viable. Another significant factor was the growth of Big Data. As a Stanford AI professor put it in 2009, “Our vision was that Big Data would change the way machine learning works. Data drives learning.”
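To make the GPU connection concrete, here is a minimal, illustrative sketch (plain NumPy on a CPU; the sizes and names are our own) of the kind of matrix arithmetic at the heart of training a neural network. GPU frameworks speed up training by running exactly these multiply-and-add operations across thousands of cores in parallel:

```python
import numpy as np

rng = np.random.default_rng(0)
batch, n_in, n_out = 64, 512, 256

X = rng.standard_normal((batch, n_in))   # input activations
W = rng.standard_normal((n_in, n_out))   # learnable weights
y = rng.standard_normal((batch, n_out))  # target outputs

# Forward pass: one big matrix multiply.
pred = X @ W

# Gradient of a mean squared error loss with respect to W:
# another big matrix multiply.
grad_W = X.T @ (pred - y) / batch

# One step of gradient descent.
W -= 0.01 * grad_W
```

Training a real network repeats this loop millions of times over ever larger matrices, which is why hardware built for exactly this workload made such a difference.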

As a result, ML applications that had previously foundered, such as image recognition, suddenly began to succeed, and the rush was on to find all the new ways ML could be exploited. Google reports that it has been able to increase the accuracy of its voice recognition from about 75% to on par with human recognition; this allowed for the rise of smart speakers. While it took a bit longer than some predicted at the start of ML’s resurgence, there are indeed now driverless cars shuttling riders in Arizona. ML is also now a core component of Extract’s data capture engine, one that can at once speed the extraction of data that is useful to your organization and protect against data exposure detrimental to your customers.


What else was predicted

Of course, not all predictions for the past decade involved IT. Here are some other predictions that we found interesting to look back on:

Chicago Tribune, January 2010

  • DVDs will be as old school as videocassettes were in 2010. Blockbuster Video, Netflix and other stand-alone DVD rental outfits will be out of business, replaced by online, on-demand movies and TV programs. 

  • Division I college football will have a playoff system.

  • Major League Baseball will be using instant replay to adjudicate disputed fair/foul, catch/no-catch, and swing/check calls.

  • Sarah Palin will be featured in end-of-decade "whatever happened to ...?" stories. 

  • The abortion and immigration debates will remain as poisonously polarized as ever.

Popular Science, January 2010

"The challenge for the next decade will be to integrate molecular engineering and computing to make complex systems," says George Church, a professor of genetics at Harvard Med Synthetically engineering parasite-resistant crops or photosynthetic organisms that churn out biomass, we can alter the economic landscape.”

NY Daily News

  • By 2020, memory devices will be integrated into our clothing. And the very idea of a "smart phone" will begin to change. Rather than looking at a tiny screen, our glasses will beam images directly to our retinas, creating a high-resolution virtual display that hovers in the air.

  • By 2020, we will be testing drugs that will turn off the fat insulin receptor gene that tells our fat cells to hold on to every calorie. Holding on to every calorie was a good idea thousands of years ago when our genes evolved in the first place. Today it underlies an epidemic of obesity.

If you'd like to know more about how Extract incorporates some of the technologies mentioned above in our data capture and redaction software, please reach out.


Written By:

Steve Kurth, Software Development Manager