You may have heard last year that General Electric was completely committing itself to the public cloud. This was a major announcement, given the sheer size, prominence, and historical success of GE. General Electric tech COO Chris Drumgoole remarked that more than nine out of every ten applications deployed by the corporation throughout 2014 were built and released via cloud computing.

“If you look at our new apps, north of 90 percent of what we deployed this year has gone into a public cloud environment,” Drumgoole explained. “We still have a lot of old stuff that hasn’t moved yet, but if you look at our new stuff, we’re there.”

Many people assume that the shift by GE is about the speed and efficiency of cloud computing itself, but that’s not actually the case. The real value of cloud is how it integrates seamlessly with the Internet of Things (IoT).

GE: What a Difference 100 Years Makes

General Electric is not exactly a startup looking to create volatility in its market. Founded in Schenectady, New York, in 1892, the company made a name for itself with a product and service model centered on equipment – profiting immensely both from the machines themselves and from subscriptions to maintain them.

In the last decade, though, chinks in the armor of the industrial giant have appeared. IBM, SAP, and Internet-era big data outfits have threatened GE’s dominance by asking its customers to re-think the integration of machines within their business. Rather than buying equipment and simply using it to make stuff and support their operations, companies started to realize that the real power of machines only became apparent through sophisticated analytics run on the data they produce.

Four years ago, General Electric was ready to take action so it would not be shown up by the also-rans. The company announced it would be pumping billions of dollars into the industrial segment of the Internet of Things, including:

  • installing digital sensors on all of its equipment;
  • making equipment interoperable by integrating it through a cloud platform-as-a-service (PaaS) system;
  • building a robust environment with innovative tools for app development;
  • strengthening its algorithms and abilities to analyze its big data;
  • opening up its business to new ideas through crowdsourcing.

Digital Connectivity & the IoT

Computing enabled us to digitize. The Internet enabled us to connect. Cloud computing vastly improved connectivity, particularly for mobile. The dawn of the Internet of Things, though, represents an incredible new level of connectivity.

Via the IoT, “the pervasive deployment of digital sensors is extending digitization and connectivity to previously analog tasks, processes, and machine and service operations,” notes the Harvard Business Review. “Moreover, virtually limitless computing power is available with low-cost cloud computing.”

As technology evolves at such a revolutionary clip, everyone in business – not just tiny startups but time-tested global giants such as General Electric – is having to transform with it or become irrelevant.

Fog Computing: Cousin to Cloud

When a cloud comes down to earth, it becomes fog. When cloud computing comes down to earth, it becomes fog computing.

You see, the enormous potential of the Internet of Things cannot be realized by just using the power of distributed servers that are potentially at a great distance from the relevant devices. Fog computing takes full advantage of the power of in-house devices – hesitating to throw data and tasks out to the Internet unless it absolutely has to.

Fog computing is not an alternative to cloud but a corollary to it, says Todd Baker of Cisco.

There is no limit to the business world’s desire for speed. There is, however, a limit to getting information back and forth from the cloud within budget. Technologists have been grappling with how to solve the issue of bandwidth.

“We talk about 50 billion sensors by 2020,” Baker comments. “If you look today at all the sensors that are out there, they’re generating 2 exabytes of data. It’s too much data to send to the cloud.”
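A rough back-of-envelope calculation puts Baker’s figures in perspective. (This is purely illustrative: the 2-exabyte figure describes today’s sensors, while 50 billion is a 2020 projection, so the per-sensor number is only a ballpark.)

```python
# Illustrative scale check on the figures Baker quotes.
exabyte = 10**18            # bytes in one exabyte (decimal)
total_data = 2 * exabyte    # "2 exabytes of data"
sensors = 50 * 10**9        # "50 billion sensors" (2020 projection)

per_sensor = total_data / sensors
print(per_sensor / 10**6)   # roughly 40 MB per sensor
```

Even at tens of megabytes per sensor, shipping everything to distant data centers quickly becomes a bandwidth problem.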

Right now, data is typically sent to the cloud for processing. “The fog” will allow companies to process information and refine their strategies at the level of the IoT machines themselves. Data will only go to the cloud when really necessary.

Baker says that the fog will make big data actionable. You can process data initially through fog computing and then send it to the cloud once it’s legitimately useful.
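That pattern can be sketched in a few lines of Python. This is a hypothetical illustration, not Cisco’s implementation: the `summarize` function, the threshold, and the sample readings are all assumptions chosen to show the idea of reducing raw sensor data at the edge before anything crosses the network.

```python
# Hypothetical fog-computing sketch: process sensor readings locally,
# keep anomalies, and reduce the rest to a compact summary record.
def summarize(readings, threshold=90.0):
    """Return a small dict that stands in for thousands of raw readings."""
    anomalies = [r for r in readings if r > threshold]
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "anomalies": anomalies,   # only the "legitimately useful" data
    }

# Simulated temperature readings from one machine on the plant floor.
readings = [71.2, 70.8, 95.5, 69.9, 71.0]
payload = summarize(readings)
# Only `payload` (a few fields) would be sent to the cloud,
# not the full stream of raw readings.
```

The design choice is the essence of fog computing: the edge node does the cheap, latency-sensitive filtering, and the cloud receives data only once it is worth acting on.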

The Value of Data Exhaust

The issue of how to make sense of all the big data is not a new problem, of course. In 2013, TechCrunch profiled a startup created by Scott Brown and Chris Dancy to run data analytics on bounced email and turn it into a stream of advertising revenue. Beyond the ads, the company was also sending information to security firms to get a more granular sense of spam.

“It may be time for us to capitalize on the power of data exhaust so we can use it ourselves and perhaps not be so willing to let it trail from us with such disregard,” suggests TechCrunch.

By processing data exhaust through fog computing, the information will be more meaningful when it arrives in the cloud.
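As a hedged sketch of what that local processing might look like, the snippet below aggregates raw bounced-email records – one kind of data exhaust – into a compact summary. The record fields and domains are invented for illustration; the point is that the aggregate, not the raw exhaust, is what reaches the cloud.

```python
# Hypothetical sketch: turn raw "data exhaust" (bounced-email records)
# into a compact summary before it is shipped upstream.
from collections import Counter

bounces = [
    {"domain": "example.com", "reason": "mailbox_full"},
    {"domain": "example.org", "reason": "no_such_user"},
    {"domain": "example.com", "reason": "no_such_user"},
]

# Count bounce reasons locally; only this small aggregate leaves the edge.
by_reason = Counter(b["reason"] for b in bounces)
print(dict(by_reason))
```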

What this Means Today

As you can see from the case of an old-school firm like General Electric and a startup like the one TechCrunch profiled, the cloud isn’t about the cloud. It’s about moving your company to the third platform. It’s about taking advantage of data exhaust. It’s about using predictive analytics to improve the way you do business.

GE is going all-in. What about you? Deploy fast cloud hosting in 30 seconds and take advantage of our many options like Windows VPS Hosting and more.

By Moazzam Adnan