
Organizations Must Consider Cost and Security Before Moving to the Public Cloud

By Jon Toor.

About three to five years ago, organizations tended to view the public cloud as something of a panacea — an easy solution to the challenges of managing their data on-premises themselves. Recently, the COVID-19 pandemic has bolstered this attitude for many enterprises, with some CIOs convinced they must go all-in on the public cloud to support their new remote workforce. But these CIOs should take note of the many organizations before them, such as Seagate, that migrated large volumes of their data to the public cloud, only to later repatriate it on-premises. While the public cloud provides clear benefits for certain use cases, companies that move all or most of their data there often end up paying higher costs than expected while jeopardizing the security of their data and sacrificing performance. Anyone planning a major move to the public cloud should first think about the following cost, security, performance, and vendor lock-in considerations.

Public Cloud Costs Exceed Expectations

Public cloud providers employ complex pricing structures, inhibiting
visibility into how much an organization will actually have to pay. These
pricing structures can differ significantly from cloud to cloud, making it
harder for organizations evaluating or deploying across multiple platforms to
determine their costs. Even within a single public cloud platform, there are varying types of
fees and several thresholds that pile on costs when resource usage hits a
certain point. Because of this complexity, many CIOs are stunned when they
begin receiving monthly invoices that dramatically exceed what they had
budgeted.

At a high level, three factors determine public cloud costs. First is
the amount of data an organization stores in the cloud. This is the easiest of
the three components to measure and estimate, but in large organizations, the
volume of data being kept in the public cloud can often grow more quickly than
anticipated. The next factor is data access frequency: organizations pay a
fixed rate each time they access their data. This cost can be
difficult to predict for large enterprises, with many users spread across
several business units that need to access data for various purposes. Finally,
there’s the cost of WAN bandwidth usage. In addition to simple access fees,
organizations have to pay network bandwidth charges when they use that data.
These fees vary greatly depending on how much data is being used, how it’s
being used, and the network bandwidth required to support these instances.
Generally speaking, retrieving a single TB of data from the public cloud over
a 1G link takes roughly three hours. To achieve the access performance they
need, organizations might have to pay for an expensive WAN upgrade.
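The three cost factors and the bandwidth arithmetic above can be sketched in a few lines of Python. The rates below are hypothetical placeholders, not any provider's actual pricing; note that the idealized math gives about 2.2 hours for 1 TB over a saturated 1G link, with real-world protocol overhead pushing it toward the three-hour figure cited above.

```python
def transfer_hours(terabytes: float, link_gbps: float = 1.0) -> float:
    """Hours to move `terabytes` of data over a `link_gbps` Gbit/s link,
    assuming the link is fully saturated (real throughput is usually lower)."""
    bits = terabytes * 8e12                # 1 TB ~ 8e12 bits (decimal units)
    seconds = bits / (link_gbps * 1e9)
    return seconds / 3600

def monthly_cloud_cost(stored_tb: float, requests: int, egress_tb: float,
                       storage_per_tb: float = 23.0,
                       per_1k_requests: float = 0.0004,
                       egress_per_tb: float = 90.0) -> float:
    """Rough monthly bill from the three factors: volume stored, access
    frequency, and WAN egress. All rates are illustrative assumptions."""
    return (stored_tb * storage_per_tb
            + (requests / 1000) * per_1k_requests
            + egress_tb * egress_per_tb)

print(round(transfer_hours(1.0), 2))       # ~2.22 h on an ideal 1G link
print(monthly_cloud_cost(10, 1_000_000, 2))
```

Even this toy model shows why bills surprise people: egress charges can rival or exceed the cost of storage itself once data is used heavily.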

A False Sense of Security

One major misconception about the public cloud is that it’s more secure than on-prem deployments. However, this simply isn’t the case. For example, ransomware is possibly the biggest cybersecurity threat facing organizations today, and according to a recent Sophos survey, 59 percent of ransomware attacks took place in the public cloud.

Many enterprises falsely believe that cloud providers will take care of
their security needs. In reality, it’s incumbent upon the organizations
themselves to protect their data. And security best practices in the public
cloud differ significantly from those on-prem. This creates confusion about
what customers must do to secure their deployments — for example, properly
configuring their public cloud storage buckets so they are safeguarded from
unauthorized access.
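The bucket-misconfiguration risk can be made concrete with a small audit check. The sketch below is plain Python, not an AWS API call; the flag names merely mirror the shape of S3's PublicAccessBlockConfiguration, and the every-flag-required rule is an illustrative policy, not an official recommendation.

```python
# Flag names mirror S3's PublicAccessBlockConfiguration; the audit logic
# itself is a generic illustration, not a call to any cloud provider's API.
REQUIRED_FLAGS = ("BlockPublicAcls", "IgnorePublicAcls",
                  "BlockPublicPolicy", "RestrictPublicBuckets")

def bucket_is_locked_down(config: dict) -> bool:
    """True only if every public-access flag is explicitly enabled.
    A missing key counts as misconfigured — failing closed is the
    safe default when auditing storage settings."""
    return all(config.get(flag) is True for flag in REQUIRED_FLAGS)

risky = {"BlockPublicAcls": True}              # three flags missing
safe = {flag: True for flag in REQUIRED_FLAGS}
print(bucket_is_locked_down(risky), bucket_is_locked_down(safe))
```

The point of the fail-closed default is exactly the shared-responsibility gap described above: the provider supplies the controls, but the customer must verify they are actually switched on.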

Unpredictable Performance

Performance in a public cloud setting can be broadly defined as the
amount of time it takes to transfer data to and from the cloud. In these
environments, performance depends on available WAN bandwidth and the cloud
provider’s overall workload burden at that moment, making it highly unpredictable.
When an organization needs to run an app that moves a significant amount of data,
it can encounter considerable latency. This variability in performance can be
unacceptable for mission-critical applications.

A Hybrid Future

Despite these drawbacks, the public cloud offers a number of clear
benefits. For one, it’s highly scalable, making it a good fit for apps that
have elastic compute requirements. Think about a retail application that must support
huge increases in traffic during the holidays but then only needs to support
one-quarter of that traffic during the rest of the year. The public cloud is
also cost-efficient and convenient for disaster recovery use cases.

The best option for most organizations is a hybrid cloud approach, keeping the majority of data on-prem (in a traditional data center or private cloud setting) while putting select data in the public cloud for certain use cases. By managing most data on-prem, enterprises can keep costs lower, better secure their data, and enjoy consistent performance. Meanwhile, they have the freedom to expand their deployments into the public cloud when circumstances require it.
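The hybrid placement logic described above — hot data on-prem, cold data in the public cloud — can be expressed as a simple decision rule. The threshold below is an illustrative assumption, not a vendor guideline; a real policy would also weigh data sensitivity, latency requirements, and contractual constraints.

```python
def placement(accesses_per_month: int, hot_threshold: int = 100) -> str:
    """Toy tiering rule: frequently accessed ("hot") data stays on-prem,
    avoiding repeated per-access and egress fees; rarely touched ("cold")
    data, such as DR copies or archives, can go to the public cloud.
    The threshold value is a hypothetical placeholder."""
    return "on-prem" if accesses_per_month >= hot_threshold else "public-cloud"

print(placement(5000))   # hot production dataset
print(placement(2))      # cold archive or DR copy
```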

With remote work here to stay for the foreseeable future, there will be
continuing buzz about the necessity of migrating completely to the public
cloud. To best serve their organizations, CIOs must resist the hype and adopt a
flexible hybrid strategy that allows them to leverage public cloud services
when it fits business needs while storing most of their data on-prem.

Source: https://www.dataversity.net/organizations-must-consider-cost-and-security-before-moving-to-the-public-cloud/
