HIPAA Compliant Cloud Hosting

The History of Cloud Computing

Believe it or not, the concept of cloud computing technologies has actually been around since the mid-1990s; back then, it had a more complex name: “on-demand infrastructure.” How has this technology evolved over the past two decades to become the immensely powerful phenomenon it is today?

From basic web server architecture to simple database management, and from less-than-technical email applications to minimal disk space, the original web hosting services were born in the 1990s and exploded in popularity during the dot-com era (1995-2001). Some of the earliest shared hosting companies included ValueWeb, Interland and HostGator.

The first shared hosting solutions offered multi-tenancy capability, automated provisioning, a monthly billing cycle and an easy-to-use interface for maintaining resources. However, these solutions did not inherently provide infrastructure on demand, resource-size flexibility or scalability. It was a simplistic offering but helped to create the foundation for the cloud hosting industry.

Around 1998, virtual private servers (VPS) arrived on the scene. With more flexibility and administrative root access, VPS solutions offered a significant step up from the shared hosting capabilities of the past.

Early VPS hosting companies provided servers that offered occasional infrastructure on demand, slight resource-size flexibility, multi-tenancy, automated provisioning and the convenience of monthly, quarterly or annual billing cycles.

For businesses that needed stricter security measures and more stable resources, dedicated hosting solutions, developed soon after the release of VPS, did the trick. These servers offered more power along with complete administrative access and control of server resources.

These dedicated servers did not provide multi-tenancy, network flexibility or scalability. However, providers supplied both managed and unmanaged dedicated hosting options, giving customers the ability to choose between relying on professionals to maintain the architecture, or employing an IT department to handle it.

The launch of Amazon Web Services in 2006 really began to change the industry. Between 2007 and 2010, several managed hosting companies developed and released a more scalable and more virtualized Infrastructure as a Service (IaaS) offering. Today, this is referred to as grid/utility computing.

IaaS providers offer computers—whether physical or virtual—and other resources to customers. The earliest providers of utility computing included Layered Technologies, NaviSite and Savvis. These hosts offered infrastructure on demand, partial resource-size flexibility, multi-tenancy, occasional automated provisioning, partial scalability, a monthly, quarterly or annual billing rate and a slightly easy-to-use interface.

As discussed before, the launch of Amazon Web Services really kicked off the move toward cloud computing. In fact, AWS transitioned away from the grid/utility computing model and moved toward what we can only call “Public Cloud Computing 1.0.”

Between 2008 and 2009, developers and startup hosting companies alike could compute and store data like never before and, over time, scale that data and those infrastructure resources on a whim. Along with Amazon, Rackspace Hosting was a driving force behind this transition.

These cloud servers offered infrastructure on demand, partial resource-size flexibility, multi-tenancy, automated provisioning, slight scalability, hourly billing (the first of its kind!) and a fairly easy-to-use interface.

The introduction of hourly billing in Cloud Computing 1.0 was a big deal for both providers and customers. This model let customers pay only for what they actually used, rather than some previously agreed-upon subscription price. By narrowing the billing down to the hour, customers saved money, and that made them happy.
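To make the difference concrete, here is a minimal Python sketch comparing a flat monthly subscription against hourly billing for a server that only runs part of the month. The prices and hours below are hypothetical illustration values, not real provider rates.

```python
# Hypothetical comparison of flat monthly billing vs. hourly billing.
# All prices here are made-up illustration values, not real provider rates.

FLAT_MONTHLY_PRICE = 100.00   # fixed subscription, owed whether the server runs or not
HOURLY_RATE = 0.15            # pay-as-you-go price per hour the server actually runs


def monthly_cost(hours_used: float) -> tuple[float, float]:
    """Return (flat-rate cost, hourly-billed cost) for one month of usage."""
    return FLAT_MONTHLY_PRICE, hours_used * HOURLY_RATE


# A test server that runs roughly 9 hours a day on weekdays (about 200 hours a month)
flat, metered = monthly_cost(200)
print(f"Flat subscription: ${flat:.2f}")     # $100.00 regardless of usage
print(f"Hourly billing:    ${metered:.2f}")  # $30.00 -- pay only for the hours used
```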

Today, we are witnessing a progression into the Cloud Computing 2.0 era. The next generation of cloud computing will need to be easier, more flexible and billed based upon a true utility model (like that of electricity and water) in order to provide customers with the services and products they need.

Current cloud 2.0 companies, such as Atlantic.Net, offer infrastructure on demand, fully customizable resource-size flexibility, multi-tenancy, applications on demand, network flexibility, automated provisioning, complete scalability, billing down to the minute or second and a simple drag-and-drop interface control for ease of use. Sure, cloud computing 2.0 companies are offering services that speak truer to the definition of the cloud than ever before, but we’re still not quite there.

In the future, the Cloud and the technologies serving as the backbone of the Cloud will need to cross the metaphorical river of development to attract a wider audience beyond that of organizations and enterprises. Small successful startups are driving innovation today, and cloud computing will need to become more scalable, more flexible and more closely based upon a true utility model in order to drive this innovation even further.

Since 1994, Atlantic.Net has stayed ahead of the competition, fueling innovation and moving technologies into environments never previously imagined. As the technology continues to evolve from 2.0 into 3.0 and beyond, you can rest assured knowing that you are relying on the most up-to-date architecture and standards in the business.

To learn more about the Atlantic.Net business model, to see our full line of one-click cloud applications or to find out how transitioning to a cloud hosting service can help transform your business, contact us today.

We also offer HIPAA cloud hosting solutions – contact us for a consultation.



Check Out These Amazing Cloud Statistics!

Data overload is no small thing, and it shouldn't be taken lightly. In this article, I decided to quantify just how much storing data on a cloud server affects us. Here at Atlantic.Net, we are always raising the upper limits of our capacity, with redundant internet service and backbone connections, to make sure we can provide optimum service during peak loads. Here are three articles that help you grasp the unseen effort behind data management:

Read More


Do You Know the Difference between a Gigabit and a Gigabyte?

Many people confuse the terms “gigabit” and “gigabyte”, either using them interchangeably or mixing up their meanings. While both are units of measurement describing digital data, how much they measure and how they are used are different. Below, we explain what each term means and how the two differ.

A Bit

A bit is the most basic unit used in computing and telecommunications. A bit is a binary unit, meaning it can have one of two values: a 1 or a 0. In computers, this value can indicate expressions such as “true” or “false”, “yes” or “no”, “come hither” or “ain’t gonna happen”. (Just kidding with that last one!)

A Byte

A byte is 8 bits. Werner Buchholz, an American computer scientist, coined the term “byte” in 1956 during the construction of the IBM Stretch computer. He deliberately spelled the term differently to avoid confusion with the term “bit.”
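Since a byte is 8 bits, converting between gigabits and gigabytes is just a matter of multiplying or dividing by 8. The short Python sketch below illustrates the arithmetic, using the decimal (SI) definitions of “giga”; the file size and link speed in the example are hypothetical.

```python
# Illustrative gigabit/gigabyte conversions using decimal (SI) prefixes:
# 1 gigabit = 10**9 bits, 1 gigabyte = 10**9 bytes, and 1 byte = 8 bits.

BITS_PER_BYTE = 8


def gigabits_to_gigabytes(gigabits: float) -> float:
    """A gigabyte is eight times larger than a gigabit, so divide by 8."""
    return gigabits / BITS_PER_BYTE


def transfer_time_seconds(file_size_gb: float, link_speed_gbps: float) -> float:
    """Seconds needed to move a file (in gigabytes) over a link rated in gigabits per second."""
    return (file_size_gb * BITS_PER_BYTE) / link_speed_gbps


# A 1 Gbps connection moves 0.125 gigabytes (125 megabytes) per second...
print(gigabits_to_gigabytes(1))      # 0.125
# ...so a hypothetical 4 GB download takes 32 seconds at a sustained 1 Gbps.
print(transfer_time_seconds(4, 1))   # 32.0
```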

Read More


Windows VPS vs. Linux VPS – Comparing Cloud Based Server OS

November 27, 2012 by Eddie under HIPAA Compliant Cloud Hosting

One of the most important decisions to make when choosing a virtual private server (VPS) is whether a Linux-based or Windows-based option is right for you. While IT professionals often swear by one or the other, these two systems are aimed at meeting the requirements of two different types of users. Windows and Linux are very different operating systems: Linux is open source and free to use, while Windows is a commercial operating system. Although many basic functions are shared between the two, the prices and uses of a Windows cloud hosting VPS and a Linux VPS are very different.

Read More


Cloud Hosting Should Expect Growth in the Health Care Industry According to a Recent Report

August 24, 2012 by Joel under HIPAA Compliant Cloud Hosting

As a whole, the healthcare industry is very conscientious about data privacy, and problems with making IT systems interoperable have played a major part in delaying the adoption and growth of cloud computing in the industry. Health care data has specific requirements regarding security, confidentiality, availability to authorized users, reversibility of data, and long-term preservation. Regulations such as the electronic health record guidelines on meaningful use and the Health Insurance Portability and Accountability Act (HIPAA) have stunted public cloud adoption in this industry.

Read More


Five Cloud Computing Myths

Separating the myths from the facts of cloud computing is a good place to start when you are trying to figure out the best solution for your business.  Let’s take a look at five popular cloud computing myths and the realities behind them so that you can make intelligent and informed decisions about the cloud.

Read More


New York, NY

100 Delawanna Ave, Suite 1

Clifton, NJ 07014

United States

San Francisco, CA

2820 Northwestern Pkwy,

Santa Clara, CA 95051

United States

Dallas, TX

2323 Bryan Street,

Dallas, Texas 75201

United States

Ashburn, VA

1807 Michael Faraday Ct,

Reston, VA 20190

United States

Orlando, FL

440 W Kennedy Blvd, Suite 3

Orlando, FL 32810

United States

Toronto, Canada

20 Pullman Ct, Scarborough,

Ontario M1X 1E4

Canada

London, UK

14 Liverpool Road, Slough,

Berkshire SL1 4QZ

United Kingdom

