AI at the Edge: Akamai’s India Inference Cloud & the Shifting Power from Central Compute

Akamai’s India Move: What’s Changing

Inference at the edge, rather than training in a central hub
Akamai's plan centers on serving inference from edge locations instead of routing every request to a central cloud. The goal is to cut response times, save bandwidth, and offload heavy request traffic from the core cloud.
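To make the latency argument concrete, here is a minimal sketch of how a client might route an inference request to the lowest-latency endpoint rather than a distant central region. The endpoint names and latency figures are purely illustrative assumptions, not real Akamai services or measurements.

```python
# Illustrative edge-routing sketch: pick the inference endpoint with the
# lowest measured round-trip latency. All names and numbers are hypothetical.

EDGE_ENDPOINTS = {
    "mumbai-edge": 18,      # round-trip latency in ms (illustrative)
    "chennai-edge": 24,     # illustrative
    "us-central-core": 210, # a distant central region, for comparison
}

def pick_endpoint(latencies_ms: dict[str, int]) -> str:
    """Return the endpoint name with the lowest measured latency."""
    return min(latencies_ms, key=latencies_ms.get)

if __name__ == "__main__":
    # With the sample figures above, the nearby edge node wins easily.
    print(pick_endpoint(EDGE_ENDPOINTS))
```

In practice the latency map would come from live probes or a traffic-steering service, but the principle is the same: inference requests land at the nearest capable node instead of crossing the globe.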

Hardware integration
Akamai intends to power the inference cloud with NVIDIA's newer Blackwell GPUs, with deployment targeted for the end of December 2025.

Strategic growth in a high-demand market
India is widely seen as a major AI growth region, and local inference infrastructure promises better access, lower cost, and room for new locally built AI applications.
