Networking gear in a Facebook data center in Sweden.
Data center managers, including operations teams, are just now starting to see the potential of Software Defined Networking, or SDN. Through the convergence of applications and networking, new technology is enabling faster development of applications, smoother administration of networks, and a better understanding of the whole application and infrastructure ecosystem.
These new developments and their possible implementation are exactly what data center pros want to keep learning about and applying in their organizations in order to keep pace with the accelerating environment of the data center.
This spring, Data Center World Global Conference and Expo will include many topical sessions covering issues and new technologies that data center managers and operators face.
Data Center Knowledge had the opportunity to speak with Arup Chakravarty, formerly of Cisco and now director of network engineering at insurance company MetLife, about his presentation on creating operational efficiency in the data center.
Chakravarty says his presentation will address a very common issue among enterprise IT systems and infrastructure.
Using Data To Optimize Efficiency
“In the operations environment, there are many particular components, such as infrastructure, routers, switches, compute, storage,” he said. And all of these elements create data, but it is often separate and siloed. “We need to be putting things together.”
He suggests that greater operational efficiency can be achieved by breaking down these barriers and correlating information held in different locations, such as server logs, network logs, firewall logs, and load-balancer logs.
“Who is stitching these together?” he asked. “In today’s modern environment, workloads cannot be siloed.”
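In practice, stitching these sources together usually means joining records on shared keys, such as an IP address within a time window. A minimal sketch of the idea follows; the log formats, field names, and values are hypothetical, not taken from any tool mentioned in the article:

```python
from datetime import datetime, timedelta

# Hypothetical, simplified records from two separate log silos.
server_logs = [
    {"ts": datetime(2015, 3, 1, 10, 0, 5), "ip": "10.0.0.7", "event": "app error 500"},
    {"ts": datetime(2015, 3, 1, 11, 30, 0), "ip": "10.0.0.9", "event": "slow query"},
]
firewall_logs = [
    {"ts": datetime(2015, 3, 1, 10, 0, 3), "ip": "10.0.0.7", "event": "connection reset"},
]

def correlate(logs_a, logs_b, window=timedelta(seconds=10)):
    """Pair records that share an IP address and fall within a time window."""
    matches = []
    for a in logs_a:
        for b in logs_b:
            if a["ip"] == b["ip"] and abs(a["ts"] - b["ts"]) <= window:
                matches.append((a, b))
    return matches

for srv, fw in correlate(server_logs, firewall_logs):
    print(f"{srv['ip']}: '{srv['event']}' near '{fw['event']}'")
```

Real deployments would index by IP and timestamp rather than scanning every pair, but the join itself is the essence of "stitching" siloed data together.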
For example, a virtual machine can involve about 20 different components that have a cascading effect on one another. There’s the NIC, IP address, VLAN association, and MAC address, and these have a cabling impact. “Today, we can correlate the data and get understanding of what’s going on,” Chakravarty said.
By bringing data together, the planning for infrastructure needs becomes easier and you can understand the application landscape, he explained.
Chakravarty added that enterprises have thousands of applications, and with a better understanding of them, organizations can budget more accurately on the infrastructure side. Conversely, a lack of understanding can lead to over-provisioning based on educated estimates or guesses.
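The budgeting point can be made concrete with utilization data: compare what each application has been allocated against its observed peak demand plus some headroom. The sketch below is illustrative only; the application names, numbers, and the 25% headroom figure are hypothetical assumptions, not from the article:

```python
# Hypothetical per-application data: provisioned vCPUs vs. observed peak usage.
apps = {
    "billing":   {"provisioned": 64, "observed_peak": 18},
    "claims":    {"provisioned": 32, "observed_peak": 29},
    "reporting": {"provisioned": 48, "observed_peak": 12},
}

def overprovisioned(apps, headroom=1.25):
    """Flag apps whose allocation exceeds peak demand plus a headroom factor,
    returning the number of excess vCPUs for each flagged app."""
    flagged = {}
    for name, a in apps.items():
        needed = a["observed_peak"] * headroom
        if a["provisioned"] > needed:
            flagged[name] = a["provisioned"] - round(needed)
    return flagged

print(overprovisioned(apps))
```

Even a crude comparison like this replaces guesswork with measurement: applications running close to their allocation (like "claims" here) are left alone, while heavily over-allocated ones surface as candidates for reclaiming capacity.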
When asked what tools are used to bring this data together for analysis, Chakravarty said there are multiple tools, but there is so much customization, and so many legacy systems, that a customized approach is often needed.
Across the Enterprise IT Sector
In enterprises, it is very common that the left hand doesn’t know what the right hand is doing. As a consultant at Cisco for about 15 years, Chakravarty saw this in his clients from the vendor side, and now he’s seeing the same issues from the enterprise side.
“You have to find the operations touch points that the enterprise is depending on,” he noted. “It is important to understand the key revenue-generating apps. Once they are understood, they can be mapped to the environment. If, for example, there are the resources, you can have DevOps write their own analytics tool. There is not one single tool in the industry that can solve these problems.”
His session will examine what challenges most enterprise environments are facing. “The bigger the enterprise, the more legacy equipment, the bigger the challenges,” he added.
To find out more about how examining data can lead to better efficiency, you can register and attend the session by Chakravarty at the spring Data Center World Global Conference in Las Vegas. Learn more and register at the Data Center World website.