Edge-forward: Akamai eyes sweet spot between centralized & decentralized AI inference

The New Stack · By Adrian Bridgwater · April 1, 2026


For this edition of The New Stack Makers, we sit down with two leaders at Akamai: Lena Hall, senior director, developers & AI engineering, and Thorsten Hans, senior developer advocate.

Keen to understand what’s going on in the cloud-native AI universe and where Akamai fits into that story, we held this discussion at KubeCon + CloudNativeCon Europe 2026 in Amsterdam.

We know Akamai as a Content Delivery Network (CDN) business with a focus on cybersecurity and software application development technologies. The Akamai of today is also a modern, developer-friendly cloud infrastructure business ready to deliver for the age of AI in every location.

But what shape does that business model take in real-world terms?

“There are so many use cases that benefit from really low latency distributed processing, and Akamai has always been known for our services around distributed computing. So this is why we have developed managed container services for Kubernetes; this technology works fluidly with our low-latency serverless functions and our distributed AI inference platform,” says Hall.

Bringing compute closer

In our discussion, Hall and Hans explain how the company achieves its proximity play. With 41 core data centers in 36 countries, Akamai extends its reach through around 4,400 smaller “distributed reach” data centers worldwide.

“The intention is to bring compute closer to wherever the user is around the planet in order to reduce latency,” Hall says. “But it’s important to remember that there are so many different types of workloads that users like to run. There are those that require really deep thinking and a lot of computing, so this is where centralized data centers do a great job. But when you combine those stacks with distributed edge capabilities, you can deliver faster feedback loops when required.”

Those faster feedback loops are critical in areas such as robotics, fraud detection and conversational agents, where a delay can quickly lead to customer loss. Bringing centralized and decentralized edge resources together is clearly the sweet spot Akamai is aiming for.
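The workload split Hall describes, with deep-thinking jobs in core data centers and latency-sensitive ones at the edge, can be sketched as a simple routing heuristic. The function name, thresholds and tier labels below are illustrative assumptions, not Akamai's actual scheduling logic:

```python
# Illustrative sketch: route an inference request to a core data center
# or an edge site based on its latency budget and model footprint.
# All thresholds and tier names here are hypothetical.

def choose_tier(latency_budget_ms: float, model_size_gb: float) -> str:
    """Pick an execution tier for a single inference request."""
    # Large models need the accelerator density of a core data center,
    # however tight the latency budget is.
    if model_size_gb > 20:
        return "core-datacenter"
    # Small models with tight budgets (fraud checks, conversational
    # turns, robotics control loops) go to the nearest edge site.
    if latency_budget_ms < 100:
        return "edge"
    # Everything else can tolerate a round trip to a core region.
    return "core-datacenter"

# Example: a fraud-detection check with a 50 ms budget and a 2 GB model.
print(choose_tier(50, 2))    # edge
print(choose_tier(50, 40))   # core-datacenter
```

A real scheduler would also weigh accelerator availability, data locality and cost, but the two-tier decision captures the trade-off Hall outlines.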

Brittle integration points?

But let’s question this theory for a second. In taking this approach, is Akamai not building its own integration and configuration nightmare? With a whole string of dependencies and integration points, doesn’t that create an inherently more brittle compute and data stack for AI to rest upon?

“Doing this correctly is precisely the infrastructure service layer that Akamai is capable of providing,” says Hall. “We’re used to delivering this for really large corporations in a managed way with a simplified setup. Users can then move forward to develop new services without having to manage the infrastructure element of the equation and leverage the tools we have.”

Enthusiastic advocates of self-service systems, Hall and Hans describe a computing landscape where users have all the toolkits and services they need to spin up an ecosystem and deploy at will, often with a single command.

Equally positive about the need to underpin open-source support, the Akamai pair again points to the company’s managed Kubernetes service. Akamai also has its own application platform project that runs on top of Linode Kubernetes Engine (LKE) to package a selection of frequently used open source projects. This means users can access these tools through Akamai’s interface without manually installing each piece of software.


Serverless suitability

Regarding who uses Akamai’s platform within any given software engineering team, developer advocate Hans says that his firm’s serverless technologies, such as Akamai Functions, are especially accessible to developers at all levels. This part of the company’s platform is designed to help developers build, deploy and scale applications and AI workloads using WebAssembly functions across Akamai’s distributed cloud without the burden of managing infrastructure.

“We put developers at the center, always. Akamai worked with the Cloud Native Computing Foundation (CNCF) to create the sandbox project known as Spin. This is a framework for building and deploying serverless applications in WebAssembly. This leads us towards NoOps, so a developer can go from a blinking cursor to a live production-deployed application that is globally distributed on top of the Akamai cloud in around two minutes, all built on different layers of popular open source projects,” says Hans.

Akamai’s work with Spin stems from its December 2025 acquisition of cloud-native Wasm Function-as-a-Service company Fermyon. Spin is so named to embody its mission of bringing cold start times down to under 1 millisecond; its team also created SpinKube in 2024 to serve as a Kubernetes runtime. Hans has been responsible for the increased use of Wasm within Akamai, as the team has pledged to make it easier for developers to execute lightweight code at the edge.
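As a rough sketch of the developer experience Hans describes, a minimal Spin application is little more than a manifest pointing at a compiled Wasm component. This follows the Spin v2 manifest format; the application name, route and build target are invented for the example:

```toml
spin_manifest_version = 2

[application]
name = "hello-edge"            # hypothetical application name
version = "0.1.0"

[[trigger.http]]
route = "/..."                 # catch-all HTTP route
component = "hello-edge"

[component.hello-edge]
source = "target/wasm32-wasi/release/hello_edge.wasm"

[component.hello-edge.build]
command = "cargo build --target wasm32-wasi --release"
```

From there, commands like `spin build` and `spin deploy` take the component to a live endpoint, the “blinking cursor to production” path Hans describes; on Kubernetes, SpinKube plays the equivalent runtime role.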

Focus on the logic, not the logistics

Promising to always “meet developers where they are” in terms of individual skills, Hans says that Akamai has provided guidance via tutorials, hands-on labs, and ready-to-use applications. In practice, it’s all about telling developers that they shouldn’t spend time worrying about server provisioning and management; they should be able to see what’s inside the box (of any given Akamai service) and think about how they can apply that to their environment’s requirements.

Hall and Hans say they appreciate that there will always be software engineering teams that need to work with the internal infrastructure they run on. But for the majority of its customer base today, Akamai provides a way to work with a higher level of abstraction. This means engineers don’t have to worry about the server underneath; all they need to focus on is the business processes they aim to encapsulate in application logic and the functionality their users need.
