Bizety: Research & Consulting

IoT and the Rise of Edge Computing

With AWS poised to become a $13 billion business, cloud computing has overtaken the enterprise computing industry. However, it’s no secret to insiders that the next disruption is already on its way. Dell and Intel are building IoT gateways and routers that can support edge computing, while software companies are beginning to adapt products like Apache Spark for use at the edge. The rise of edge computing is here, driven by the increasingly ubiquitous Internet of Things and the way it is changing data processing.

IoT Problems with Cloud Computing

The rise of edge computing has resulted from a shift in the way users interact with edge devices. Whereas we used to merely consume information at the edge, increasingly dynamic apps and websites have shifted interaction toward data consumption and production at the edge. This trend has only exploded with the boom of IoT devices. Cisco predicts there will be 50 billion things connected to the Internet by 2020, and many of these devices rely on data-hungry machine learning techniques to make decisions without user intervention. With a growing number of things, each creating and consuming huge amounts of data, cloud computing is beginning to face some significant challenges:
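A quick back-of-envelope calculation shows the scale behind these challenges. Both figures below are illustrative assumptions, not measurements; real per-device rates vary by orders of magnitude:

```python
# Back-of-envelope estimate of how much raw data 50 billion connected
# devices could generate. The per-device rate is an illustrative
# assumption, not a measured figure.
DEVICES = 50_000_000_000              # Cisco's 2020 projection cited above
BYTES_PER_DEVICE_PER_DAY = 1_000_000  # assume ~1 MB/day per device

total_bytes_per_day = DEVICES * BYTES_PER_DEVICE_PER_DAY
total_petabytes_per_day = total_bytes_per_day / 1e15

print(f"~{total_petabytes_per_day:,.0f} PB of raw data per day")
```

Even at a modest 1 MB per device per day, that is roughly 50 petabytes of raw data daily, which helps explain why hauling everything to centralized data centers strains bandwidth, latency, and cost.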

 

Applications of Edge Computing

Just as the cloud allowed for the rise of new markets like hyperconverged storage and software-defined networking, the rise of edge computing will allow for new applications of technology:

 

Challenges for Edge Computing

Although many of the problems that IoT devices and dynamic applications are beginning to face with the cloud will be solved by the rise of edge computing, a number of issues still need to be addressed in this emerging industry:

 

Changes to the Industry

To some, it might seem that the cloud computing industry is poised to implode almost as soon as it has come into its own. Due to AWS’s success with cloud computing, many companies are still investing massive resources in it, with Oracle, Microsoft, and Google among them. Will the rise of edge computing make these investments moot?

Not necessarily. Although the rise of edge computing will shift many computing tasks away from the cloud, it’s unlikely to replace it altogether. The edge computing paradigm isn’t so much built around eliminating the need for the cloud as streamlining the amount of information sent to and from it to maximize efficiency. For example, while using video analytics to search for a missing child would be impossible through the cloud due to privacy concerns, edge computing could perform the task by searching local data on devices, then deliver the results back to the cloud. In this case, the cloud would aggregate results that have already been processed at various local sources, rather than doing the time-consuming work of analyzing each individual video itself. In addition, historical data or large data sets within the cloud can be used to tweak the functionality of edge computing processes, or leveraged to improve the reliability of data sensing and communication.
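The pattern described above can be sketched in a few lines: the edge device reduces its raw data to a compact summary, and only that summary travels to the cloud. The function and field names here are illustrative, not from any particular framework:

```python
# Sketch of the edge pattern described above: process raw data locally,
# send only a compact summary to the cloud. Names are illustrative.

def summarize_locally(readings):
    """Aggregate raw sensor readings on the edge device itself."""
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
    }

def upload_to_cloud(summary):
    """Stand-in for a network call; the cloud only ever sees the summary."""
    return summary

raw = [21.3, 22.1, 24.8, 23.0]   # e.g. one minute of local sensor samples
sent = upload_to_cloud(summarize_locally(raw))
print(sent)  # a few fields instead of the full raw stream
```

Even this toy version captures the trade-off: the cloud loses direct access to the raw samples but gains a far smaller, more privacy-friendly payload to aggregate across many edge sources.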

A study by IDC suggests that by 2020, ten percent of the world’s data will be produced by edge devices. While the cloud may not be going away, it’s certain that computing will need to adapt to this shift in where data is being produced in order to save energy, decrease latency, and ensure privacy.

 

Copyright secured by Digiprove © 2016