I read an interesting article this week about Sonoco, a large multinational packaging company, using containers to save on licensing costs. While the main subject was interesting in itself, this quote from Nancy Lawson, the primary SQL Server DBA, caught my eye: “My main concern is that we have enough issues with network performance within our own data centers.” While nearly everyone is looking for ways to create a fatter pipe between data centers using tools like WAN accelerators, these techniques don’t reduce overall traffic within the network. So organizations layer on complex QoS and application-prioritization schemes to try to ensure reasonable performance for critical applications. That in turn leads to discovery tools, monitoring systems, and ever more complexity, not just in the application infrastructure but in the management infrastructure needed to ensure performance and availability. This complexity turns every infrastructure management decision into a strategic one, not only because of the complexity itself but because these become six- or seven-figure spending decisions. If you can reduce overall network traffic, you get several benefits, including:
- better application performance,
- reduced complexity of the application and management infrastructure,
- lower costs of bandwidth and infrastructure.
The only way to address this is at the endpoints (the sum of all the infrastructure that supports the applications: servers, desktops, laptops, and so on). Endpoints are where network traffic begins and ends. If you don’t address network performance at its root, the endpoint, you are forced to add enormous complexity elsewhere in your environment. Implementing acceleration at the endpoint improves application network performance while also reducing complexity and overall congestion across the entire network (LAN, WAN, Cloud).
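The article doesn’t spell out a specific mechanism, but one simple illustration of endpoint-side traffic reduction is compressing payloads before they ever leave the host, so every downstream link carries fewer bytes. A minimal sketch in Python (the payload here is hypothetical; repetitive data such as logs and serialized records is where the savings are largest):

```python
import gzip

# Illustrative payload: repetitive text, as application logs and
# serialized records often are, compresses well.
payload = ("timestamp=2024-01-01 level=INFO msg=heartbeat\n" * 200).encode()

# Compress at the endpoint, before the data touches the network.
compressed = gzip.compress(payload)

# Fewer bytes leave the endpoint, so every downstream link
# (LAN, WAN, cloud) carries less traffic.
print(f"original:   {len(payload)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"reduction:  {1 - len(compressed) / len(payload):.0%}")

# Round-trip check: the receiving endpoint recovers the exact payload.
assert gzip.decompress(compressed) == payload
```

Real endpoint acceleration products combine techniques like this with deduplication and protocol optimization, but the principle is the same: shrink the traffic at its source rather than managing congestion downstream.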
Fred Johannessen, CEO