10 ways to allocate workloads in the cloud
As companies move applications to the cloud, they face decisions about how to prioritize software workloads. Requirements vary by industry: health care organizations give preference to applications that must adhere to the Health Insurance Portability and Accountability Act (HIPAA), while an online retailer often prioritizes its customer-facing applications.
Here are 10 strategies enterprises should consider when allocating workload priorities in the cloud.
When a retailer experiences seasonal traffic spikes, it may choose to have applications run in a public cloud to accommodate the increase in computing capacity requirements. Called bursting, this strategy provides some flexibility for companies when managing their cloud workloads.
“The flexibility and elasticity of cloud environments makes them an attractive option for workloads that may experience temporary or seasonal traffic spikes, or where traffic is unpredictable,” said Karyn Price, an analyst with Frost & Sullivan’s Stratecast practice.
“Bursting allows an application to temporarily occupy cloud-based computing or storage resources, based on a temporary need. In a hybrid configuration, businesses can achieve the same elasticity without moving the workload from its current, dedicated environment,” she added.
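In code, a bursting decision can be as simple as routing overflow demand to public cloud capacity once the dedicated environment is full. A minimal sketch, with invented capacity units and names:

```python
# Hypothetical sketch of a bursting decision: demand beyond dedicated capacity
# temporarily occupies public cloud resources. Thresholds are illustrative,
# not taken from any real cloud platform.

def plan_capacity(demand: int, onprem_capacity: int) -> dict:
    """Split demand between the dedicated environment and burst capacity."""
    onprem = min(demand, onprem_capacity)
    burst = max(0, demand - onprem_capacity)  # overflow goes to the public cloud
    return {"onprem": onprem, "burst": burst}

# A seasonal spike: demand exceeds the 100-unit dedicated environment.
print(plan_capacity(140, 100))  # {'onprem': 100, 'burst': 40}
# Off-peak: everything stays in the dedicated environment.
print(plan_capacity(60, 100))   # {'onprem': 60, 'burst': 0}
```

When the spike passes, the burst share drops back to zero and the workload runs entirely in its current, dedicated environment, which is the elasticity Price describes.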
When a power failure occurs, such as during a hurricane, businesses need to ensure that cloud applications will continue to run. Failover allows companies to switch applications to a secondary, dedicated location or a public cloud environment.
“The latter choice will often offer quicker overall recovery at a lower price point than when a secondary facility is ‘hot’ at all times, even when not in use,” Price said.
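The failover choice itself reduces to a small routing rule: traffic goes to the secondary site only when the primary stops responding. A hedged sketch, with invented endpoint names:

```python
# Illustrative failover logic: route users to a secondary, lower-cost public
# cloud environment when the primary site is down. Hostnames are invented.

PRIMARY = "https://app.primary.example.com"
SECONDARY = "https://app.dr-cloud.example.com"  # standby in a public cloud

def select_endpoint(primary_healthy: bool) -> str:
    """Fail over to the secondary environment when the primary is unhealthy."""
    return PRIMARY if primary_healthy else SECONDARY

print(select_endpoint(True))   # normal operation: primary site
print(select_endpoint(False))  # outage: traffic shifts to the cloud standby
```

In a real deployment the health signal would come from monitoring probes, and the switch would typically happen at the DNS or load-balancer layer rather than in application code.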
Companies can implement workloads dynamically based on scalability and disaster recovery requirements, according to a Symantec-VMware white paper called “Securing the Cloud for the Enterprise.”
With application workloads drawing code and data from multiple sources, businesses should avoid placing all of an application’s components in the cloud, according to Price.
“Enterprises with a complex storage-area network (SAN) may keep the complex configuration in-house, while moving other elements of the workload into the cloud,” Price said.
“A cloud management platform can help greatly in adding automation and orchestration capabilities and delivers service quality, security and availability for workloads running in cloud environments,” Intel stated in a planning guide on “Private Infrastructure as a Service.”
Platforms such as Dell Cloud Manager can help. An enterprise console for the cloud, Dell Cloud Manager (formerly Enstratius) provides automation tools to help companies manage cloud applications. It offers auto-provisioning, auto-scaling, automated backups, recovery and cross-cloud bursting.
“Dell Cloud Manager is a modern, service-oriented piece of software that allows IT personnel to manage through a simple interface enterprise applications in hybrid environments,” said Roger Kay, founder and president of market-intelligence firm Endpoint Technology Associates.
“The tool allows them to dial cloud resources up and down and mix them with on-premise easily and quickly and in accordance with legal and privacy obligations,” Kay wrote in an email. “The extensions and cache management allow this type of tool to run more efficiently.”
When deciding which cloud computing resources to use, companies should give customer-facing tools such as online-shopping interfaces priority over internal applications, according to Kay.
“Customer-, student- and patient-oriented applications should come before back-office housekeeping,” Kay said.
For privacy and security purposes, databases of proprietary customer information could remain onsite as companies save money by moving the application itself to the cloud, Price advised.
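The resulting split, with the application tier in the cloud and the proprietary database remaining on-premises, might look like this illustrative configuration (all hostnames invented):

```python
# Hypothetical configuration for a split deployment: stateless application
# servers run in the cloud while the customer database stays behind the
# company firewall. Every name here is an illustrative assumption.

app_config = {
    "app_tier": "cloud",                          # web/app servers in the cloud
    "db_host": "customers.db.internal.example",   # database stays on-premises
    "db_transport": "tls",                        # encrypt traffic on the link
}

print(app_config["app_tier"])  # cloud
```

The design choice: the cloud-hosted tier reaches back over an encrypted link for data, so proprietary customer records never leave the company's own site.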
Just like data centers, cloud computing is incorporating more automation as data processing becomes event-driven rather than running at fixed times, according to Robert Stinnett, a data center automation architect with Carfax.
“You cannot have workload management and you can’t even have cloud, I believe, unless you have some sort of automation in place,” Stinnett told TechTarget. “The whole idea behind workload automation is that keyword: automation.”
A stable internal workload automation process is required before companies can offload applications to the cloud, he added.
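Stinnett's point about event-driven processing can be pictured as a simple event-to-job registry: jobs fire when an event arrives, not at a scheduled time. A sketch with hypothetical event and job names:

```python
# Illustrative sketch of event-driven workload automation: jobs are registered
# against events and run when the event fires, rather than on a fixed schedule.
# Event names and jobs are invented for the example.

from collections import defaultdict

class EventAutomation:
    def __init__(self):
        self.handlers = defaultdict(list)

    def on(self, event: str, job):
        """Register a job to run whenever `event` occurs."""
        self.handlers[event].append(job)

    def fire(self, event: str, payload: str) -> list:
        """Run every job registered for `event` and collect the results."""
        return [job(payload) for job in self.handlers[event]]

auto = EventAutomation()
auto.on("file_arrived", lambda p: f"ingest {p}")
auto.on("file_arrived", lambda p: f"virus-scan {p}")

# A new file triggers both jobs immediately, with no cron schedule involved.
print(auto.fire("file_arrived", "report.csv"))
```

Proving a pattern like this works reliably in-house is the kind of stable automation foundation Stinnett says companies need before offloading applications to the cloud.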
A virtualized environment is often the foundation of a cloud platform and the applications running on it.
A Gartner survey of 505 data center managers worldwide revealed that planned or in-process virtualization of infrastructure workloads would increase from about 60 percent in 2013 to nearly 90 percent in 2014.
“Pervasive virtualization is a strategic approach that provides a method for judiciously bringing legacy applications into your cloud to meet your strategic goals or as time and budget allow,” Intel stated in its cloud planning guide. “Its benefits include better quality of service, improved availability and business continuity, faster resource deployment and lower energy consumption.”
Trust zones group workloads that share common security and compliance policies. When moving data to a public cloud, an adaptive trust zone can keep multiple tenants separate, according to Intel.
“Workloads and the appropriate security policies can then be associated throughout the workload’s life cycle,” the chip maker wrote.
When creating trust zones, companies can set up virtualized security controls to isolate applications.
“Attaching policies to workloads is a stepping stone to addressing security and compliance in the public cloud,” Symantec and VMware stated in its white paper. “Workloads can be migrated to the public infrastructure cloud as long as the same set of policies can be enforced in a consistent and visible manner.”
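One way to picture policy-attached workloads: a migration is allowed only when the target trust zone enforces every policy the workload carries. A sketch, with illustrative policy names:

```python
# Hypothetical model of trust-zone placement: each workload carries a set of
# required security policies, and it may only migrate into a zone that
# enforces all of them. Policy names are invented for illustration.

def can_migrate(workload_policies: set, zone_policies: set) -> bool:
    """Allow migration only if the zone enforces every attached policy."""
    return workload_policies <= zone_policies  # subset test

patient_db = {"encryption-at-rest", "hipaa-audit-log"}
basic_public_zone = {"encryption-at-rest"}
compliant_zone = {"encryption-at-rest", "hipaa-audit-log", "geo-fencing"}

print(can_migrate(patient_db, basic_public_zone))  # False: audit logging missing
print(can_migrate(patient_db, compliant_zone))     # True: all policies enforced
```

Because the policies travel with the workload, the same check applies throughout the workload's life cycle, wherever it runs.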
In addition, companies should prioritize according to privacy concerns and legal obligations, Kay added.
“For decisions between premise and cloud, data sovereignty, privacy and legal obligations should be taken into consideration when allocating workloads,” Kay said.
New instructions in Intel’s Xeon processors, called Advanced Vector Extensions 2 (AVX2), help compute-intensive cloud workloads by expanding the width of vector integer instructions to 256 bits, roughly doubling the amount of data processed per instruction. Intel says these advances will enable applications in fields such as engineering, life sciences and physics to increase their performance by as much as 1.9 times.
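Applications usually reach such vector instructions through libraries rather than hand-written assembly. For instance, NumPy dispatches array arithmetic to SIMD-optimized kernels, so on an AVX2-capable Xeon a single call can process many integers per instruction (the actual speedup is workload-dependent):

```python
# Sketch of vectorized integer arithmetic. NumPy's compiled kernels can use
# wide SIMD instructions (such as 256-bit AVX2 integer ops) where the CPU
# supports them; this code itself makes no assumption about the hardware.

import numpy as np

a = np.arange(1_000_000, dtype=np.int32)
b = np.arange(1_000_000, dtype=np.int32)

# One vectorized call replaces a million-iteration Python loop.
c = a + b

print(int(c[2]))  # 4
```

The workload-allocation angle: integer-heavy jobs like these benefit most from being placed on instances backed by AVX2-capable processors.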
Products such as Intel’s Cache Monitoring Technology track how threads, applications or virtual machines use cache space.
“Cache monitoring allows real-time, high-confidence measurements of LLC [last level cache] usage, providing control of workload placement and load balancing across shared infrastructure in the virtualized environment,” according to an Intel Xeon product brief.
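On Linux, CMT readings are typically exposed through the resctrl filesystem as raw byte counts, for example in files like `mon_data/mon_L3_00/llc_occupancy`. A sketch of the conversion a monitoring agent might apply to a value it has already read (the sample value is invented):

```python
# Hedged sketch: convert a raw LLC occupancy reading (bytes, as text) into
# megabytes for a placement or load-balancing report. The input string stands
# in for a value read from the resctrl filesystem on a supported system.

def llc_occupancy_mb(raw: str) -> float:
    """Convert an llc_occupancy reading from bytes to megabytes."""
    return int(raw.strip()) / (1024 * 1024)

# Example value a monitoring agent might have read for one workload group.
print(llc_occupancy_mb("3145728"))  # 3.0
```

A scheduler could compare such per-workload figures to spot cache-hungry tenants and rebalance them across the shared infrastructure.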