
F5 accelerates AI at the edge for service providers with NVIDIA BlueField-3 DPUs

Ahmed Guetari, VP and GM, Service Provider at F5.

F5 has announced BIG-IP Next Cloud-Native Network Functions (CNFs) deployed on NVIDIA BlueField-3 DPUs, deepening the companies’ technology collaboration.

This solution offers F5’s proven network infrastructure capabilities, such as edge firewall, DNS, and DDoS protection, as lightweight cloud-native functions accelerated with NVIDIA BlueField-3 DPUs to deliver optimised performance in Kubernetes environments and support emerging edge AI use cases.

The F5 Application Delivery and Security Platform powers a majority of the world’s Tier-1 5G, mobile, and fixed-line telco networks. Service providers recognise the challenges of scaling AI applications across distributed environments, particularly as legacy infrastructures in the network core often lack the processing power required to make AI inferencing practical.

F5 CNFs running on NVIDIA DPUs can now be embedded in edge and far-edge infrastructures to optimise computing resources, dramatically reduce power consumption per Gbps, and limit overall operating expenses. Utilising edge environments to add functionality and AI capabilities to subscriber services also brings added security requirements, which F5 and NVIDIA BlueField technologies address alongside advanced traffic management while minimising latency.

Deploying CNFs at the edge puts applications closer to users and their data, promoting data sovereignty, improving user experience, and reducing costs related to power, space, and cooling. Enabling low latency remains essential for AI applications and capabilities such as:

  • Immediate decision-making, supporting autonomous vehicles and fraud detection.
  • Real-time user interaction, including NLP tools and AR/VR experiences.
  • Continuous monitoring and response, required for healthcare devices and manufacturing robotics.

Including CNFs on BlueField-3 DPUs expands on F5’s previously introduced BIG-IP Next for Kubernetes deployed on NVIDIA DPUs. F5 continues to leverage the NVIDIA DOCA software framework to seamlessly integrate its solutions with NVIDIA BlueField DPUs. This comprehensive development framework provides F5 with a robust set of APIs, libraries, and tools to harness the hardware acceleration capabilities of NVIDIA BlueField DPUs. By utilising DOCA, F5 achieves rapid integration and high performance across various networking and security offloads while maintaining forward and backward compatibility across generations of BlueField DPUs. Further, accelerating F5 CNFs with NVIDIA BlueField-3 frees up CPU resources, which can then be used to run other applications.

Edge deployments open up key opportunities for service providers, including distributed N6-LAN capabilities for UPFs, and edge security services to support Distributed Access Architecture (DAA) and Private 5G. In addition, AI-RAN is gaining momentum, with SoftBank recently showcasing their production environment with NVIDIA.

Unlocking the potential of AI-RAN with NVIDIA and F5

AI-RAN seeks to transform mobile networks into multi-purpose infrastructures that maximise resource utilisation, create new revenue streams through hosted AI services, and improve cost efficiency. By enabling mobile providers to support distributed AI computing with reliable, secure, and optimised connectivity, AI-RAN strengthens edge infrastructure capabilities by taking advantage of otherwise dormant processing power. Together, BIG-IP Next CNFs on NVIDIA BlueField-3 DPUs will accelerate AI-RAN deployments with streamlined traffic management for both AI and RAN workloads, as well as provide enhanced firewall and DDoS protections. Multi-tenancy and tenant isolation for workloads tied to essential capabilities will be natively integrated into the solution. With F5 and NVIDIA, mobile providers can intelligently leverage the same RAN compute infrastructure to power AI offerings alongside existing RAN services, driving significant cost savings and revenue potential through enhanced user offerings.

“Customers are seeking cost-effective ways to bring the benefits of unified application delivery and security to emerging AI infrastructures, driving continued collaboration between F5 and NVIDIA,” said Ahmed Guetari, VP and GM, Service Provider at F5. “In particular, service providers see the edge as an area of rising interest, as data ingest and inferencing no longer must take place in a centralised location or cloud environment, opening up myriad options to add intelligence and automation capabilities to networks while enhancing performance for users.”

Availability

General availability for F5 BIG-IP Next Cloud-Native Network Functions deployed on NVIDIA BlueField-3 DPUs is anticipated for June 2025. For additional information, please visit F5 at the NVIDIA GTC event taking place March 17–21 in San Jose, California, read the companion blog, and contact F5.

Image Credit: F5
