The News: Hewlett Packard Enterprise (HPE) announced the expansion of its AIOps network management capabilities by integrating multiple generative AI (GenAI) large language models (LLMs) within HPE Aruba Networking Central, HPE’s cloud-native network management solution hosted on the HPE GreenLake Cloud Platform. Read the full press release on the HPE website.
HPE Infuses GenAI LLMs to Uplift HPE Aruba Networking Central AIOps
Analyst Take: HPE unveiled the integration of multiple GenAI LLMs within HPE Aruba Networking Central, augmenting the overall AIOps network management proposition hosted on the HPE GreenLake Cloud Platform. HPE Aruba Networking Central’s new self-contained set of LLMs was designed with pre-processing and guardrails targeted at improving user experience and operational efficiency, with a focus on search response times, accuracy, and data privacy.
With what HPE characterizes as one of the largest data lakes in the industry, HPE Aruba Networking has collected telemetry from nearly four million managed network devices and more than one billion unique customer endpoints, which power HPE Aruba Networking Central’s machine learning (ML) models for predictive analytics and recommendations. The new GenAI LLM functionality will be incorporated into HPE Aruba Networking Central’s AI Search feature, complementing existing ML-based AI throughout HPE Aruba Networking Central with the goal of providing deeper insights, improved analytics, and more proactive capabilities.
Impact on App Connectivity and Container-to-Pod Experience
HPE Aruba Networking provides a solution for managing the complexities of both North/South and East/West traffic by leveraging service mesh and API gateways. Using these technologies, it optimizes the flow of data across networks, helping to enable seamless communication between users and resources. With a focus on efficiency and reliability, HPE Aruba Networking facilitates the smooth transmission of North/South traffic, the data flowing in and out of the network, as well as East/West traffic, the communication between devices within the network. Through intelligent routing and traffic-shaping mechanisms, HPE Aruba Networking can optimize network performance, minimizing latency and maximizing throughput to support diverse workloads and applications.
A service mesh is a dedicated infrastructure layer that handles communication among the microservices of a distributed application, ensuring seamless interaction between them; service-to-service traffic is routed through proxies for enhanced security. Istio, an open-source service mesh platform, offers comprehensive support for distributed applications with API integrations for logging, telemetry, and policy systems, streamlining the process of securing, connecting, and monitoring services. It automatically manages load balancing for various types of traffic, including HTTP, gRPC, WebSocket, and TCP. HPE Ezmeral Runtime Enterprise provides support for Istio, enabling efficient management of the Istio service mesh. Additionally, HPE GreenLake Central supports Istio Authorization Policy, allowing for access control on mesh workloads. Kiali, a management dashboard, provides a graphical interface for monitoring and managing the enabled Istio service mesh, validating configurations and monitoring traffic flow to ensure network health. Launching the Kiali dashboard gives users detailed insights into their service mesh configuration and performance.
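To make the access-control piece concrete, the sketch below applies a simple Istio AuthorizationPolicy to a mesh workload using the Python Kubernetes client. It is a minimal illustration of the general technique rather than anything specific to HPE Ezmeral or HPE GreenLake Central, and the namespace, labels, service account, and policy name are hypothetical examples.

```python
# Minimal sketch: create an Istio AuthorizationPolicy with the Python Kubernetes client.
# The namespace, workload labels, service account, and policy name are hypothetical.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running inside a pod
api = client.CustomObjectsApi()

# Allow only the "frontend" service account to call the "payments" workload over GET/POST.
authz_policy = {
    "apiVersion": "security.istio.io/v1beta1",
    "kind": "AuthorizationPolicy",
    "metadata": {"name": "payments-allow-frontend", "namespace": "shop"},
    "spec": {
        "selector": {"matchLabels": {"app": "payments"}},
        "action": "ALLOW",
        "rules": [
            {
                "from": [{"source": {"principals": ["cluster.local/ns/shop/sa/frontend"]}}],
                "to": [{"operation": {"methods": ["GET", "POST"]}}],
            }
        ],
    },
}

api.create_namespaced_custom_object(
    group="security.istio.io",
    version="v1beta1",
    namespace="shop",
    plural="authorizationpolicies",
    body=authz_policy,
)
```

Once a policy like this is in place, a dashboard such as Kiali can be used to confirm that traffic from other workloads to the protected service is denied while the permitted path continues to flow.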
The integration of GenAI LLM functionality into HPE Aruba Networking Central’s AI Search feature marks a significant milestone in network management capabilities. By incorporating advanced machine learning algorithms, HPE Aruba Networking Central enhances its ability to analyze data, identify patterns, and predict network behavior. This integration extends beyond traditional analytics, offering deeper insights and enabling more proactive measures to address potential issues before they impact network performance. Moreover, with the inclusion of service mesh and API gateway communication, HPE Aruba Networking Central further enhances its agility and flexibility, enabling seamless integration and communication between distributed services and applications. This comprehensive approach empowers organizations to build robust, adaptive networks capable of meeting the evolving demands of modern digital environments.
Observability and LLMs: Integral to Boosting App Usefulness Across Distributed Environments
A key aspect of HPE Aruba’s observability strategy is its centralized management platform, HPE Aruba Networking Central. The platform provides a single-pane-of-glass view into the entire network infrastructure, allowing administrators to monitor and manage devices, applications, and users from a unified interface. This centralized approach streamlines network operations and troubleshooting, enhancing overall observability.
HPE Aruba integrates advanced analytics and machine learning capabilities into its networking solutions to provide proactive insights and predictive analytics. By analyzing data patterns and trends, HPE Aruba helps identify potential issues before they impact network performance, enabling administrators to take preemptive action and ensure optimal network operation. LLMs play a significant role in enhancing observability within HPE Aruba’s networking solutions. By incorporating LLM functionality into platforms such as HPE Aruba Networking Central, HPE greatly enhances the network’s ability to analyze and understand vast amounts of data. LLMs excel at processing and interpreting natural language, which allows them to ingest, analyze, and interpret network telemetry, logs, and other data sources more effectively.
With the integration of LLMs, HPE Aruba’s networking solutions gain the ability to understand complex queries and commands in natural language, improving the user experience for network administrators. This capability enables administrators to interact with the network more intuitively, facilitating faster troubleshooting, configuration, and optimization tasks.
LLMs enhance the analytics capabilities of HPE Aruba’s networking solutions by enabling deeper insights into network performance and behavior. By analyzing unstructured data such as network logs, LLMs can uncover hidden patterns, anomalies, and trends that may indicate potential issues or optimization opportunities within the network. This proactive approach to analytics empowers administrators to identify and address issues before they impact network performance or user experience.
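The snippet below is a minimal, illustrative sketch of this pattern, not HPE’s implementation: it hands a small batch of raw, unstructured log lines to a hosted LLM and asks for anomalies and suggested next steps in plain language. The OpenAI Python client is used only as a stand-in for whichever model endpoint is available, and the model name and sample log lines are assumptions for illustration.

```python
# Illustrative sketch only: LLM-assisted triage of unstructured network logs.
# This is NOT HPE Aruba Networking Central's implementation; the OpenAI client,
# model name, and sample log lines are stand-ins chosen for the example.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

raw_logs = [
    "2024-03-28T10:02:11Z ap-205 sta-auth: client aa:bb:cc:dd:ee:ff failed 802.1X auth (x12)",
    "2024-03-28T10:02:15Z switch-07 port-flap: gi1/0/24 link down/up 9 times in 60s",
    "2024-03-28T10:02:20Z ap-205 radio: channel utilization 94% on 5GHz",
]

prompt = (
    "You are a network operations assistant. Review the log lines below, "
    "flag anomalies, and suggest one next troubleshooting step per anomaly. "
    "Answer as a numbered list.\n\n" + "\n".join(raw_logs)
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
    temperature=0,
)

print(response.choices[0].message.content)
```

In a production AIOps pipeline this kind of call would sit behind pre-processing and guardrails of the sort HPE describes, with telemetry retrieved from the data lake rather than pasted inline.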
LLMs are essential to enhancing observability within HPE Aruba’s networking solutions by enabling natural language interaction, improving analytics capabilities, and facilitating proactive network management. By leveraging LLM technology, HPE Aruba enhances its ability to provide administrators with the insights and visibility needed to effectively monitor and manage modern network environments.
The Verizon Business Connection
HPE also announced Verizon Business is expanding its managed services portfolio to include HPE Aruba Networking Central. Through the HPE relationship, Verizon Business offers Managed Network Services (MNS) that include network infrastructure, software-defined (SD) branch, and managed local area network (LAN) as well as its network as a service (NaaS) framework for secure and reliable network connectivity.
The NaaS offering enables Verizon Business to offer more flexible network capabilities aligned specifically with customer needs. As such, customers can transition from fixed-cost, capital expenditure (CapEx)-intensive, and complex do-it-yourself (DIY) approaches to a managed service that can evolve according to business requirements and objectives. NaaS solutions are well suited for LLM-driven AIOps innovation because they emphasize Software as a Service (SaaS) performance optimization specifically for cloud applications.
As such, we find that the inclusion of HPE Aruba Networking Central can further strengthen the Verizon Business proposition, encompassing fixed, Wi-Fi, and WAN Edge implementations across environments ranging from large campus to small branch. This includes AI-driven solutions built to deliver better user experiences and quality of service. Together, Verizon and HPE are now better positioned to bolster workforce productivity and support evolving cloud-hosted business applications that attract greater developer prioritization, especially across the HPE GreenLake Cloud Platform.
HPE AIOps: A Winning Approach to Attaining Zero Trust Security
Organizations are becoming more distributed due to the swift expansion of hybrid work models, with workforces and devices operating in multiple contexts, accelerating the dissolution of the traditional network perimeter. This puts a premium on security, with IT teams increasingly tasked with protecting assets across the distributed continuum without compromising performance. In addition, local and regional data sovereignty mandates require that data privacy be upheld and that data management comply with workload distribution requirements.
As Eve-Marie Lanza, Senior Security Solutions Marketing Manager, Aruba, championed in the “Better together: Opportunities for network and security team collaboration” blog, HPE Aruba Networking solutions can now provide comprehensive visibility, insights, centralized policy management, data protection, threat defense, and access control in a single platform.
We find that LLM-enhanced HPE AIOps are critical to improving collaboration between security and network teams as they focus on common objectives for data protection and advancing digital transformation. This includes implementing zero trust principles throughout the entire networking fabric, encompassing edge, campus, data center, and cloud environments. Through enhanced HPE AIOps capabilities, unified, infrastructure-level visibility into all users, devices, applications, and data becomes more readily available; streamlining networking and security operations around a common source of data unleashes AI/ML-powered, security-first networking.
Key Takeaways: HPE Aruba Networking Central Ready for GenAI Prime Time
We believe that HPE Aruba Networking Central’s new approach of deploying multiple LLMs to incorporate GenAI capabilities provides the portfolio foundation key to ensuring security-first, AI-powered insights into customers’ critical infrastructure, propelling AI innovation on an organization-wide basis. By using LLM technology, HPE Aruba elevates its portfolio to give administrators the insights and visibility needed to effectively monitor and manage modern network environments.
Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.
Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.
Other Insights from The Futurum Group:
HPE’s Game-Changing $14 Billion Acquisition of Juniper
5G Factor: HPE Juniper Deal – The 5G Ecosystem Impact
HPE GreenLake Lights Up Hybrid Cloud Scoreboard with New Deals
Author Information
Ron is a customer-focused research expert and analyst with over 20 years of experience in the digital and IT transformation markets, working with businesses to drive consistent revenue and sales growth.
He is a recognized authority at tracking the evolution of and identifying the key disruptive trends within the service enablement ecosystem, including a wide range of topics across software and services, infrastructure, 5G communications, Internet of Things (IoT), Artificial Intelligence (AI), analytics, security, cloud computing, revenue management, and regulatory issues.
Prior to his work with The Futurum Group, Ron worked with GlobalData Technology creating syndicated and custom research across a wide variety of technical fields. His work with Current Analysis focused on the broadband and service provider infrastructure markets.
Ron holds a Master of Arts in Public Policy from the University of Nevada, Las Vegas and a Bachelor of Arts in political science/government from William and Mary.
At The Futurum Group, Paul Nashawaty, Practice Leader and Lead Principal Analyst, specializes in application modernization across build, release and operations. With a wealth of expertise in digital transformation initiatives spanning front-end and back-end systems, he also possesses comprehensive knowledge of the underlying infrastructure ecosystem crucial for supporting modernization endeavors. With over 25 years of experience, Paul has a proven track record in implementing effective go-to-market strategies, including the identification of new market channels, the growth and cultivation of partner ecosystems, and the successful execution of strategic plans resulting in positive business outcomes for his clients.