General Electric’s head of IT told InfoWorld editor Eric Knorr in October that the company was moving aggressively toward the public cloud, with 90% of new systems built in that environment and a goal of 100% public cloud over the next 15 years:

  • Why General Electric’s Announcement is Important
  • General Electric’s Vision – Beyond Hybrid
  • Tomorrow vs. 15 Years
  • Big Data and the Massive Scale of GE
  • Total Cost of Ownership

Why General Electric’s Announcement is Important

Although massive investments in cloud infrastructure – most prominently by Amazon, Microsoft, Google, and IBM – have brought increasing attention to the public cloud, the technology often gets little respect. Many firms still don’t trust the multi-tenant environments of cloud service providers, at least not with their most sensitive applications.

“That’s why it’s a fairly big deal when the CIO of General Electric, staple of the Fortune 10, says it’s ‘all-in’ for public cloud,” explains Barb Darrow of Gigaom. Not everyone believes in reference accounts (accounts that potential customers can contact to verify the quality of a service) – Inc. contributing editor Geoffrey James has called them “a bad idea” – but General Electric is almost unbeatable in that role. Darrow notes that General Electric has a market capitalization (the complete value of the outstanding shares of a publicly traded firm) of nearly a quarter of a trillion dollars, with products and services ranging from home appliances to jet engines to financial products.

Let’s look at the original InfoWorld interview, in which General Electric’s Chris Drumgoole – whose title is actually COO of information technology – stated his organization’s firmly held commitment to the public cloud model.

General Electric’s Vision – Beyond Hybrid

Drumgoole explained why General Electric previously said that it would be doing away with nine out of every ten data centers and transitioning to the public IaaS (infrastructure as a service) model. Essentially, the IT leadership at one of the world’s largest and most prestigious corporations has determined that cloud is a better way to provision resources: “We have a model where we’re operating outside of our four walls in someone else’s environment, but we’ve been able to ensure that GE data — compute, memory, and storage — remain single-tenant” within a data center that is used by many different businesses.

That comment raises an important issue: terminology. Private cloud computing is commonly understood by cloud service providers to refer to single-tenant environments, whether they occur within hosting environments or within a client’s own data centers. Since General Electric is talking about the company’s data being partitioned off from everyone else’s, it’s referring to what many CSPs list as a private cloud. In essence, GE is describing its position that cloud service providers are the ideal source of computing resources.

Drumgoole noted that although the enterprise does still have internal systems designed to operate in a similar manner to the cloud, it doesn’t see itself using both third-party-hosted and internal servers for long. In fact, GE believes hybrid cloud is not the ultimate goal. It’s unclear exactly how many years it will be before the full transition is made to outside providers, but Drumgoole said he sees “no reason why everything except the rarest of apps” would not be moved to a cloud service provider eventually.

Tomorrow vs. 15 Years

When Knorr expressed his shock at the announcement given the well-documented misgivings of many organizations toward the cloud, the IT chief confirmed that General Electric was completely onboard with migrating everything to cloud data centers. However, not everyone at the mega-corporation agrees on when that transfer should be complete. Some advocate making the comprehensive switch immediately, while others say the transition should be piecemeal, gradually migrating through 2030.

Big Data and the Massive Scale of GE

Part of what makes the case particularly interesting is General Electric’s sheer scale, especially in the age of big data. Recognizing that GE is not the first organization people think of when they consider information-age overload, Drumgoole provided an example: each of its jet engines produces about 2 TB of data every time its plane is flown. “A flight with a GE engine takes off or lands every three seconds,” the COO explained. “All of a sudden, the data gets very, very large very, very fast.”
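To get a feel for what those two figures imply, here is a rough back-of-the-envelope calculation. It assumes the numbers quoted above (~2 TB per flight, a GE-powered takeoff or landing every three seconds) and treats each pair of events – one takeoff plus one landing – as a single flight; that pairing is our simplifying assumption, not GE’s.

```python
# Back-of-the-envelope estimate of GE engine telemetry volume,
# using the figures quoted in the interview.

SECONDS_PER_DAY = 24 * 60 * 60
EVENT_INTERVAL_S = 3      # one takeoff or landing every 3 seconds
TB_PER_FLIGHT = 2         # ~2 TB of engine data per flight

events_per_day = SECONDS_PER_DAY // EVENT_INTERVAL_S  # takeoffs + landings
flights_per_day = events_per_day // 2                 # assume 2 events per flight
tb_per_day = flights_per_day * TB_PER_FLIGHT
pb_per_day = tb_per_day / 1024

print(f"{flights_per_day:,} flights/day -> {tb_per_day:,} TB (~{pb_per_day:.0f} PB) per day")
```

Even under this conservative pairing assumption, the quoted figures work out to tens of petabytes of engine data per day – which is exactly why Drumgoole says the data “gets very, very large very, very fast.”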

Total Cost of Ownership

A common argument against the public cloud is that it represents an operating expenditure (opex) rather than a capital expenditure (capex). When Knorr asked the IT head about the internal discussions related to opex, Drumgoole explained that GE frames its IT needs instead in terms of total cost of ownership. Furthermore, General Electric doesn’t just concern itself with cost but adaptively reconceptualizes the role of computing in its business.
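The TCO framing can be illustrated with a simple comparison. All the numbers below are hypothetical – they are not GE’s figures – but they show the point: a capex purchase and a cloud subscription should be compared on total cost over the same time horizon, not on the capex/opex label alone.

```python
# Illustrative TCO comparison with made-up numbers (not GE's actual costs).

YEARS = 5

# Hypothetical on-premises server: upfront purchase plus running costs.
capex_purchase = 20_000        # hardware bought up front
annual_ops = 6_000             # power, space, and admin per year
on_prem_tco = capex_purchase + annual_ops * YEARS

# Hypothetical cloud equivalent: no upfront cost, higher recurring fee.
monthly_fee = 700
cloud_tco = monthly_fee * 12 * YEARS

print(f"On-prem TCO over {YEARS} years: ${on_prem_tco:,}")
print(f"Cloud TCO over {YEARS} years:  ${cloud_tco:,}")
```

Depending on the inputs, either side can come out ahead, which is the argument for comparing on TCO rather than dismissing the cloud simply because its costs show up as opex.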


Does General Electric’s perspective resonate with your own business needs? Get a quote for public cloud hosting within our SSAE 16-certified data center today.

By Moazzam Adnan


Atlantic.Net can also assist with dedicated, managed, and HIPAA-compliant cloud hosting solutions – contact us today for a consultation.