One of the unexpected outcomes of the COVID-19 pandemic in 2020 was the rush by businesses in Asia-Pacific to adopt cloud computing solutions. Part of the rush was attributed to ensuring that remote workers could access systems from wherever they were.
However, during a PodChat dialogue, Rajnish Arora, vice president for enterprise computing research at IDC Asia/Pacific, said a large number of legacy applications and systems will likely remain on-prem in the near term, held back by concerns around security and performance.
The result will be a preference for a hybrid approach to computing – one in which enterprises continue to own and operate their own data centres while also running some applications and services in public clouds, private clouds, community clouds, virtual private clouds, edge clouds and, as Gartner predicts, distributed clouds.
This will raise new concerns for CIOs – some more pressing than others, some already present in their current setup but magnified by the introduction of new environments. These include compliance, compatibility, visibility and control, SLAs, governance, data security, proper redundancy, over- and under-provisioning, data migration, and the right skills and experience to operate a complex environment. CISOs will weigh in with escalating cybersecurity concerns.
Definitions
Arora believes that terms and terminology need to be stated correctly. IDC says a customer is in a multi-cloud environment if they are using a mix of private and public cloud services.
Private Cloud can be on-premise or in an off-premise environment such as a hosted private cloud. If the customer is using a mix of private and public cloud services with a low degree of interoperability and connectivity for seamless mobility of workloads/applications and data, IDC defines the environment as simply multi-cloud.
However, a customer environment with a high degree of connectivity and interoperability between private and public cloud resources is defined as a true Hybrid Cloud.
Concerns about shifting from on-prem to a hybrid model
Keith Budge, executive vice president, Asia Pacific and Japan at Teradata, says there are different kinds of concerns associated with shifting from on-prem to a hybrid model. The first is security.
He opined that, while this is a real concern, the cloud is still more secure than the equivalent on-premise data centre.
“Cybersecurity remains a specialist area but is a broad risk in the public cloud; again, you get some measure of protection supplied by the cloud providers themselves,” he continued.
The next concern he called out is vendor or cloud lock-in. “Many cloud services are not offered across multiple clouds or on-premise. This is where your architecture can save you. You need a good service-oriented, API-first architecture design that is modular and provides good abstraction and isolation,” he elaborated.
“The function of the service and, most importantly, the set of APIs that connect the services are the key here. Because they are not dependent on technology, you can have different software deployed in different clouds, and as long as they provide the same service and connect using the same APIs, your application architecture is portable.”
Keith Budge
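Budge's point about abstraction can be made concrete with a minimal sketch – not Teradata's design, purely a hypothetical illustration in Python. The ObjectStore contract and the two backing implementations below are invented names; the idea is simply that application code depends only on the service API, so the software behind that API can live in different clouds, or on-premise, without the application changing.

```python
from abc import ABC, abstractmethod
from pathlib import Path


class ObjectStore(ABC):
    """Service contract: application code depends only on this API."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryStore(ObjectStore):
    """Stand-in for an on-premise store."""

    def __init__(self):
        self._data = {}

    def put(self, key: str, data: bytes) -> None:
        self._data[key] = data

    def get(self, key: str) -> bytes:
        return self._data[key]


class LocalFileStore(ObjectStore):
    """Stand-in for a cloud object store; a real deployment would wrap a provider SDK here."""

    def __init__(self, root: str = "/tmp/objstore"):
        self._root = Path(root)

    def put(self, key: str, data: bytes) -> None:
        path = self._root / key
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_bytes(data)

    def get(self, key: str) -> bytes:
        return (self._root / key).read_bytes()


def archive_invoice(store: ObjectStore, invoice_id: str, payload: bytes) -> None:
    # Written against the ObjectStore API only, so swapping the backing
    # environment does not change this function.
    store.put(f"invoices/{invoice_id}", payload)


if __name__ == "__main__":
    for store in (InMemoryStore(), LocalFileStore()):
        archive_invoice(store, "INV-001", b"example payload")
        print(type(store).__name__, store.get("invoices/INV-001"))
```

In this framing, moving to another cloud means writing a new implementation of the same contract, which is the portability Budge describes.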
The third concern has to do with physics. According to Budge, it takes time to move big things around. “The cloud doesn’t magically make this better unless you can make something massively parallel. When you are using shared infrastructure, you need to share!
“You need to adjust your expectations to the reality of consuming shared resources and make sure you build your applications to be resilient to the more common outages, failures, software upgrades and varying performance levels,” he explained.
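One common way to build in that resilience is to treat transient failure as normal and retry with backoff. The sketch below is a generic illustration in Python, not anything Budge or Teradata prescribes; flaky_cloud_call is a hypothetical stand-in for any shared-infrastructure dependency.

```python
import random
import time


def call_with_retries(operation, attempts=5, base_delay=0.5):
    """Retry a flaky operation with exponential backoff and jitter."""
    for attempt in range(1, attempts + 1):
        try:
            return operation()
        except (ConnectionError, TimeoutError) as exc:
            if attempt == attempts:
                raise  # give up after the final attempt
            delay = base_delay * (2 ** (attempt - 1)) * random.uniform(0.5, 1.5)
            print(f"attempt {attempt} failed ({exc}); retrying in {delay:.1f}s")
            time.sleep(delay)


def flaky_cloud_call():
    # Simulates a shared-infrastructure service that fails transiently.
    if random.random() < 0.6:
        raise ConnectionError("transient failure")
    return "ok"


if __name__ == "__main__":
    print(call_with_retries(flaky_cloud_call))
```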
For Arora, concerns around which type of cloud to use depend on the characteristics of the application or use case:
- Application performance, especially for OLTP-type workloads with a very large number of transactions processed within a given time frame, requiring the highest levels of reliability and security capabilities
- Highly latency-sensitive workloads such as credit card, debit card or other forms of electronic payment processing, pre-paid telecom billing systems or airline reservation systems. It could also include shopfloor automation or MES systems in a high-tech or highly resilient manufacturing environment
- Strong resistance to moving workloads off-premise where there are stringent regulations around data governance, security and compliance
- Workloads and applications that require large pools of infrastructure resources but are extremely predictable, which means customers and CIOs can work with their technology partners to use a mix of CAPEX and flexible-consumption OPEX models to streamline and optimise their infrastructure costs. Public cloud shines, and is extremely attractive, when workloads are highly elastic with unpredictable peak resource requirements (see the sketch after this list)
- A large number of geographically dispersed users accessing applications and data, an area where public cloud offers much better connectivity, performance and cost economics for the organisation
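The predictable-versus-elastic trade-off in the list above can be illustrated with a back-of-the-envelope calculation. The per-server-hour figures below are invented for illustration only – real cloud and hardware pricing varies widely – but they show why owned capacity tends to win when average demand sits close to peak, and public cloud wins when peaks are rare.

```python
# Hypothetical, illustrative rates only – not real cloud or hardware pricing.
OWNED_COST_PER_SERVER_HOUR = 0.60   # amortised CAPEX plus operations for dedicated kit
CLOUD_COST_PER_SERVER_HOUR = 1.00   # on-demand public cloud equivalent

HOURS_PER_YEAR = 24 * 365


def yearly_cost(peak_servers: int, avg_servers: int) -> tuple[float, float]:
    owned = peak_servers * OWNED_COST_PER_SERVER_HOUR * HOURS_PER_YEAR  # must provision for peak
    cloud = avg_servers * CLOUD_COST_PER_SERVER_HOUR * HOURS_PER_YEAR   # pay only for what runs
    return owned, cloud


# Predictable workload: average demand close to peak – ownership is cheaper.
print(yearly_cost(peak_servers=100, avg_servers=90))   # (525600.0, 788400.0)
# Elastic workload: rare spikes far above the average – public cloud is cheaper.
print(yearly_cost(peak_servers=100, avg_servers=20))   # (525600.0, 175200.0)
```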
Barriers to cloud adoption
Teradata’s Budge cites nervousness over security and compliance as a barrier to cloud migration.
He concedes that an organisation generally feels more in control of infrastructure sitting on its own premises: the company can set its own policies relating to security and compliance. The caveat is that the internal cost, effort, time and risk involved in trying to detect all personal data manually is substantial and inefficient, so organisations may choose to consult experts to help establish regulatory compliance.
“In moving to the cloud, faith is put in a third party to carry out tasks that matter to the organisation, like managing confidential data. However, this does not mean that organisations should expect cloud providers to bear all responsibility,” he opined.
He suggests that to draw a clear line and determine where responsibility lies, organisations must have a discussion with cloud providers and implement processes to ensure their environment meets the organisation’s security and compliance requirements.
“This may not necessarily mean building an entirely new set of security and compliance processes, but rather reviewing existing policies and ensuring they are relevant for cloud-based environments,” he added.
According to Budge, organisations that have invested in securing their data, addressing personally identifiable information issues and providing lineage and traceability of data from source to insight have a distinct advantage in the compliance space, as they can clearly show regulatory bodies their risk management and mitigation approaches.
He cautioned that if this investment has not been made, developers and IT teams may deploy capabilities into the cloud unaware of the risk, exposing the brand to damage if a breach occurs; alternatively, the risk proves difficult to manage and innovation is stifled.
For his part, IDC’s Arora observes that organisations are setting up teams responsible for data governance and data management KPIs.
"These teams are responsible for ensuring the right security protocols are in place for data protection at different levels – all the way from the edge access to the core data centre, data encryption and key management to secure data in motion and have policies in place to ensure data loss or data can be quickly recovered in the event of an outage or a security breach."
Rajnish Arora
“Data governance is all about providing the right access to information assets to support business workflows and ensure agile, quick decision-making that addresses changing external and internal business paradigms,” he pointed out.
Hidden traps and gotchas in securing the hybrid infrastructure
Budge cautioned that, like all technology, cloud computing comes with risks, with security among the most significant.
“When moving to a hybrid infrastructure, the involvement of different environments and the increasing complexity of operations and security management require organisations to have a solid cybersecurity strategy in place for both on-premise and cloud,” said Budge.
“Another consideration involves centralised management and reporting across the hybrid cloud environments – a ‘single pane of glass’ that can collect data from the technologies and give a view of costs and activities,” he concluded.
Jeff Yong Xun Xie, senior market analyst for security research, IDC Asia/Pacific, sees poor data and process classification leading to improper segmentation of what stays on-prem and what goes to the cloud.
He believes this often leads to a poorly applied security strategy, as organisations are not clear about which data they need to secure. It also leads to issues like redundant data and overlapping backups, which increase the number of attack vectors available to cybercriminals.
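Xie's classification point can be sketched in a few lines of Python. The sensitivity tiers and placement rules below are hypothetical – every organisation defines its own – but the sketch shows how an explicit scheme drives segmentation, and how unclassified assets are exactly where the gaps he describes appear.

```python
from dataclasses import dataclass

# Hypothetical sensitivity tiers and placements; real schemes are organisation-specific.
PLACEMENT_RULES = {
    "restricted": "on-prem",        # e.g. regulated personal or payment data
    "confidential": "private-cloud",
    "internal": "public-cloud",
    "public": "public-cloud",
}


@dataclass
class DataAsset:
    name: str
    sensitivity: str  # ideally one of the PLACEMENT_RULES keys


def plan_placement(assets):
    """Map each classified asset to a target environment; flag unclassified ones."""
    plan, unclassified = {}, []
    for asset in assets:
        target = PLACEMENT_RULES.get(asset.sensitivity)
        if target is None:
            unclassified.append(asset.name)  # cannot segment what has not been classified
        else:
            plan[asset.name] = target
    return plan, unclassified


if __name__ == "__main__":
    assets = [
        DataAsset("customer_pii", "restricted"),
        DataAsset("sales_dashboard", "internal"),
        DataAsset("legacy_export", "unknown"),
    ]
    plan, unclassified = plan_placement(assets)
    print(plan)          # {'customer_pii': 'on-prem', 'sales_dashboard': 'public-cloud'}
    print(unclassified)  # ['legacy_export'] – the gap Xie warns about
```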
The other common issue is not having a clear data maintenance and exit strategy for a hybrid model. Hybrid infrastructures are dynamic and versatile, which means sensitive information is transmitted through various locations in daily operations.
“Not having a clear indication of where these data reside makes it complicated when an organisation decides to purge certain information, be it due to a regulatory requirement or as part of a maintenance process. An exit strategy should also be considered so that proper data handover/destruction processes can be executed when changing vendors or setup.”
Jeff Xie