Cloud Computing's Destiny: Operating As A Single Global Computer, Enabled By Serverless

"We are well down the road of executing our vision of making the world's cloud resources as easy to use as a single computer. When we do, we will finally realize the full revolutionary potential of the cloud. The ability to get what we need when we need it down to the millisecond with the click of a button."

Photo: Joe McKendrick

That's the word from Priya Nagpurkar, director of hybrid cloud platform for IBM Research. In a recent interview with Dario Gil, SVP and director of IBM Research, Nagpurkar explained how IBM Research is pioneering a serverless computing architecture meant to transform the cloud into the world's largest computer. Serverless computing will make this possible by opening access without the complications of backend provisioning and security management.

The world's top public clouds run data centers in hundreds of locations spanning nearly every continent. "However, this only paints a portion of the picture," says Gil. There is also a "massive number of private computing environments that exist in silos across the globe. The cloud has dramatically evolved over many years to what it is today, a massively distributed network of public and private data centers comprising zettabytes of computing power and data storage."

For all the progress happening in the cloud, we have to "get to the point where we get the cloud to work as if it was a single infinitely powerful computer," says Nagpurkar. Right now, there are too many obstacles in the way, she adds. "Think about the simplicity of just working on your laptop. You have a common operating system and tools you're familiar with. And, most importantly, you're spending most of your time working on code. Developing on the cloud is far from that. You have to understand the nuances of all the cloud providers -- there's AWS, Azure, GCP, IBM, and private clouds. You have to provision cloud resources that might take a while to get online. And you have to worry about things like security, compliance, resiliency, scalability, and cost efficiency. It's just a lot of complexity."

Proprietary software stacks from different vendors "not only add to all this complexity but they stifle innovation," she says. "Key software abstractions start with the operating system. Linux, as the operating system for the data center era, unleashed this proliferation of software, including virtualization technologies like containers. That ushered in the cloud era."

Serverless technologies are paving the way to accessing and leveraging this emerging global computer, she continues. "Serverless technologies are the key to realizing this. There are three key attributes to serverless: ease of use, on-demand elasticity, and pay for what you use," says Nagpurkar. "For example, take a simple data prep task on the cloud, which is fairly common. But the data in this case could be coming from anywhere -- edge environments, for example. To make this as simple as a command you could issue on your laptop, a lot of things have to happen under the covers, and today it's the developers and the data scientists doing these things manually. I have to worry about: do I have access? Am I allowed to move the data? Where are the API keys? How many containers should I spin up? This is what I spend most of my time on. But with serverless you can literally boil this down to one single command, as simple as moving files around on your laptop -- the serverless platform does the rest underneath. That's the beauty of serverless."
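To make the "one single command" idea concrete, here is a minimal sketch of such a data-prep step written as an Apache OpenWhisk-style Python action (the open source project behind IBM Cloud Functions). This is an illustration rather than anything described in the interview, and the parameter names and storage paths are hypothetical placeholders.

```python
# Minimal sketch of a serverless data-prep step as an Apache OpenWhisk-style
# Python action. The developer describes only the transformation; the platform
# decides how many containers to spin up, scales them back to zero when idle,
# and bills only for the time the action actually runs.
# The bucket/object paths below are hypothetical placeholders.

def main(params):
    source = params.get("source", "cos://raw-bucket/sensor-readings.csv")
    target = params.get("target", "cos://prepared-bucket/sensor-readings.parquet")

    # ... fetch `source`, clean and normalize the records, write `target` ...
    # (the real I/O would use the object-storage SDK of whichever cloud holds
    #  the data; it is omitted here to keep the sketch self-contained)

    return {"status": "done", "source": source, "target": target}
```

Once deployed, running it really is a single command from the laptop -- something like `wsk action invoke data-prep --param source cos://raw-bucket/sensor-readings.csv --result` with OpenWhisk's CLI -- while provisioning, scaling, and teardown happen underneath.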

IBM Research is "pushing this vision forward today in the Knative open source community," she continues, with IBM supporting this capability through Red Hat OpenShift Serverless. "We continue to push this evolution of serverless, and it's getting us closer and closer to that vision of the cloud as a computer," says Nagpurkar.

Realizing the vision of a single global computer? "It's one of the greatest challenges that we should solve right now in computer science, to harness this tremendously heterogeneous and distributed system," says Nagpurkar. It's time for a distributed operating system that provides "that common layer of abstraction across these heterogeneous and distributed cloud resources," she says. Kubernetes, she adds, is the open technology emerging as the winner in this evolutionary battle: "So you have Linux containers and Kubernetes, both open technologies."
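As a rough illustration of what that common layer already looks like in practice (an assumption-laden sketch, not anything IBM specifies in the interview), the same Kubernetes Deployment object can be submitted to any conformant cluster, on any public or private cloud, using the official Kubernetes Python client; the container image name below is a placeholder.

```python
# Rough illustration: one Deployment definition, applicable unchanged to any
# conformant Kubernetes cluster -- AWS, Azure, GCP, IBM Cloud, or on-prem --
# which is what makes Kubernetes a candidate "common layer of abstraction"
# across heterogeneous cloud resources. The image name is a placeholder.
from kubernetes import client, config

config.load_kube_config()  # targets whichever cluster the active kubeconfig context points at

deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="data-prep"),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "data-prep"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "data-prep"}),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="data-prep",
                        image="registry.example.com/data-prep:latest",
                    )
                ]
            ),
        ),
    ),
)

# The call is identical regardless of which cloud hosts the cluster.
client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```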
