Statista estimates that hyperscalers dominate 65% of the cloud market: AWS (32%), Azure (23%), and Google (10%). Organisations use these vendors' cloud services for machine learning, data analytics, cloud-native development, application migration, and more.
But as enterprises expand their offerings, hyperscalers and their customers face growing challenges around security, data privacy and protection, scalability, resilience, and performance. On top of that, cloud customers must grapple with vendor lock-in, ease of management, technical debt, skills shortages, and regulatory compliance.
Hyperscalers in the cloud market
Though hyperscalers account for 65% of the cloud market, Jenkins said he does not see a problem with that concentration as long as those providers deliver the right services to meet organisational goals.
“Organisations want to be portable across clouds, so we think about how we can help our customers avoid vendor lock-in so they can use the best-of-breed services across all clouds. Hyperscalers provide different types and quality of services at different prices. We want to help our customers take advantage of that and move workloads across those different clouds,” the Akamai executive noted.
Considering alternative cloud providers
He said that enterprises should always consider alternative cloud providers.
“With each new application they are building, they need to consider the requirements for that individual application from a latency, performance, and cost perspective. They should also look at the changing technologies, as organisations often get locked into a certain kind of architecture or thinking, and they’ll realise that those constraints no longer apply.”
He added that organisations must constantly re-evaluate how these providers can optimise costs, and when to consider an alternative cloud.
Key selling points of the cloud are scalability and resilience. When asked whether scalability is equal across clouds, Jenkins' short answer was no.
“The longer answer is it shouldn’t be because scalability in the cloud is not a one-size-fits-all kind of model. It depends on what your cloud service model is as an organisation, and what your specific application architecture is.”
He added that Akamai Connected Cloud is focused on distributed and multi-cloud architectures that bring services closer to where customers are.
“We have extremely low egress costs to encourage organisations to scale and take advantage of multiple clouds,” he added.
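The egress-cost point above can be made concrete with a back-of-envelope calculation. The sketch below is purely illustrative: the per-GB rates, free-tier allowances, and transfer volume are hypothetical placeholders, not actual pricing from Akamai or any other provider.

```python
# Hypothetical illustration of how egress pricing shapes multi-cloud decisions.
# All rates and free-tier figures below are made-up placeholders.

def monthly_egress_cost(gb_transferred: float, rate_per_gb: float,
                        free_tier_gb: float = 0.0) -> float:
    """Cost of moving data out of a cloud, after any free allowance."""
    billable = max(0.0, gb_transferred - free_tier_gb)
    return billable * rate_per_gb

# Suppose an application moves 50 TB (51,200 GB) between clouds each month.
gb = 50 * 1024
high = monthly_egress_cost(gb, rate_per_gb=0.09, free_tier_gb=100)    # placeholder rate
low = monthly_egress_cost(gb, rate_per_gb=0.005, free_tier_gb=1024)   # placeholder rate

print(f"High-egress provider: ${high:,.2f}/month")
print(f"Low-egress provider:  ${low:,.2f}/month")
```

Even with invented numbers, the order-of-magnitude gap shows why low egress fees make it cheaper to spread workloads across multiple clouds.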
From his conversations with customers around the globe, Jenkins found that application teams tend not to think about egress costs, focusing instead on building applications for performance, redundancy, and reliability.
“Our customer in India, Zolvit, was concerned about the common cloud challenges of cost, price performance, and scalability. With Akamai Connected Cloud, they thought about how they can put these applications exactly where and when their customers need them instead,” he said.
Determining an appropriate cloud provider
He said that most businesses understand their average use, and what their steady state looks like in terms of scalability and resilience.
“For example, e-commerce sales produce massive spikes on infrastructure. Businesses need to factor costs and customer experiences, among various factors, in optimizing performance to serve customers. Then, business leaders can make decisions for a cloud provider (or cloud providers) that can best align with their business objectives, while considering the changes in technology and outdated constraints, which their architectures might be locked into.”
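The steady-state-versus-spike trade-off Jenkins describes can be sketched as a simple blended-cost model. Everything here is a hypothetical illustration: the instance counts, hourly rates, and the reserved/on-demand split are assumptions, not any provider's real pricing model.

```python
# Illustrative capacity-cost sketch: reserved capacity covers the steady state,
# on-demand instances absorb spikes. All figures are placeholder assumptions.

def blended_hourly_cost(steady_instances: int, peak_instances: int,
                        peak_hours_per_month: float,
                        on_demand_rate: float, reserved_rate: float,
                        hours_per_month: float = 730) -> float:
    """Monthly cost of a baseline reserved fleet plus on-demand burst capacity."""
    base = steady_instances * reserved_rate * hours_per_month
    burst = (peak_instances - steady_instances) * on_demand_rate * peak_hours_per_month
    return base + burst

# An e-commerce sale: 10 instances at steady state, 100 during a 20-hour spike.
monthly = blended_hourly_cost(steady_instances=10, peak_instances=100,
                              peak_hours_per_month=20,
                              on_demand_rate=0.10, reserved_rate=0.05)
print(f"Estimated monthly cost: ${monthly:,.2f}")
```

Knowing the steady state, as Jenkins notes, is what lets a business choose a provider (or mix of providers) whose pricing fits that traffic shape.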
Impact of AI
He is positive that the adoption of AI and machine learning is growing and will continue to mature.
In the end-to-end chain of machine learning services, organisations focused on the training side over the past year. But in 2024, Jenkins said, organisations will shift their focus to machine learning operations.
“Customers are focused on cutting latency by shifting functionality close to the edge, especially in use cases like AI for autonomous vehicles.”
"Processing data and inferencing at the edge to deliver results is going to change customer experiences fundamentally and push AI forward." – Jay Jenkins
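A rough sense of why edge inference cuts latency can be had from speed-of-light arithmetic. The distances, the fiber propagation figure, and the 30 ms inference time below are coarse illustrative assumptions, not measurements of any real deployment.

```python
# Back-of-envelope latency sketch for edge vs. centralized inference.
# Distances and inference time are rough illustrative assumptions.

FIBER_KM_PER_MS = 200  # light travels roughly 200 km/ms in optical fiber (~2/3 c)

def network_rtt_ms(distance_km: float) -> float:
    """Idealized round-trip time over fiber, ignoring routing and queuing."""
    return 2 * distance_km / FIBER_KM_PER_MS

INFERENCE_MS = 30  # assumed model inference time, same in both cases

central = network_rtt_ms(3000) + INFERENCE_MS  # distant central region: 60 ms total
edge = network_rtt_ms(50) + INFERENCE_MS       # nearby edge PoP: 30.5 ms total

print(f"Central region: {central:.1f} ms, edge PoP: {edge:.1f} ms")
```

For latency-critical use cases like the autonomous-vehicle example, the network round trip can dominate end-to-end response time, which is what moving inference to the edge removes.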
He said there are plenty of open-source models and materials on automation and AI available, making it easier for teams to learn about and leverage the technology. When it comes to machine learning operations, however, he said some enterprises are not yet capable of operationalising machine learning.
“That is going to be the tougher part for organisations moving forward because there isn’t a lot of automation in that space now. That’s going to change over time, but I think that’s probably the toughest thing, and that is why organisations are relying more on vendors for machine learning operations.”