Take Five: Why Partial Network Visibility Isn't Good Enough
2021-07-09 | 13 min read
An Exec's View on Preventing Blind Spots for Better Security and Performance
With massive volumes of data, an expanding network edge, and rapidly evolving technologies, networks are growing more complex. Cybercriminals are also constantly evolving and finding new ways to exploit network vulnerabilities. Seeing all network traffic is a must for ensuring high security and performance. Partial visibility is simply not good enough. If data packets are missing, corrupted or dropped entirely before they even reach your security and monitoring tools, blind spots develop, making it impossible to see the full picture of everything happening on your network. Network blind spots open the door to performance issues and security risks. Recently I had the opportunity to sit down with Mark Pierpoint, Keysight’s president of network applications and security, to discuss how implementing a network visibility architecture can uncover and prevent dangerous blind spots that too often wreak havoc on even well-established enterprises.
How do blind spots and network vulnerabilities impact a business beyond their network?
Mark: With digital transformation, it is very difficult to separate the network from the business itself because, almost by definition, everybody is using data in some form to improve their processes, performance, and customer satisfaction. Just four years ago, McKinsey published a study pointing to only 40% of businesses having embraced digital transformation. But today it is obviously much higher, especially as a result of the pandemic. In more tech-based companies, you’ve got everything from mobile networks, which form a significant business for providers like Verizon and others, to operational technology (OT) networks that extend far beyond traditional IT. These might control a factory, the HVAC in an office building, or something as critical as the water or fuel supply. The world has really changed in the past 12 months across all businesses, with more employees working from home and many more electronic engagements across partners, suppliers, and customers. So, I think it is clear that the network is very tightly linked with any business, and how this network operates directly impacts the business.
When blind spots and vulnerabilities cause a breach, it has a wide-ranging impact on customers, employees, and so on. Just last week it was a meatpacking plant that was breached; a few weeks before that, the Colonial Pipeline; and before that, SolarWinds. We still don’t know the full impact of the SolarWinds breach. Cognizant issued a press release in May 2020 about the breach they had experienced the previous month, saying they expected a $50 million to $70 million impact to their business in just that quarter. But obviously, it’s about more than dollars. On average, it takes about 200 days to discover a breach and determine exactly what happened, and roughly another 80 days to remediate its impacts. Beyond that, there is brand reputation to consider and loss of customers. Lost customer data could potentially have an even bigger impact and take longer to resolve, so it’s important to get a clear view into the network and understand your risks and vulnerabilities. It’s why this is such a big issue for most company boards these days.
What does it mean to send the right data to the right tools at the right time, and how does this impact the bottom line and operational efficiency of a business?
Mark: This is no different than going to your doctor. It would be a terrible day if they took blood work and sent it to your radiologist, or maybe they took an X-ray or CT scan and sent it to your dermatologist. Sending the wrong information to the wrong place is obviously not a sensible thing to do. That may sound simplistic, but too often, network monitoring and security tools receive too much, too little, or not the right data, and it’s costly not to address this. There are many effective tools these days when used properly, but they can be expensive. It’s important to not just send the right data but to send the right rate of data so that tools are not overwhelmed. Say you’ve got a voice-over-IP tool. It would know nothing about video and would be absolutely useless if you were trying to get any video analytics out of it. At the heart of any effective visibility architecture is optimizing the number of tools you have, especially the more expensive ones. The right visibility solution has been shown to save more than 3x in terms of deployment costs. But you also need to make those tools more effective, and getting the right data to the right places at the right time really helps accomplish this.
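Conceptually, this is what a network packet broker does: it applies per-tool filters so each tool receives only the traffic it can actually analyze. The toy Python sketch below illustrates the idea; the tool names, port sets, and packet representation are hypothetical and not any vendor's API.

```python
# Illustrative sketch of "right data to the right tool": a minimal packet
# broker that classifies traffic by destination port and forwards each
# packet only to the tools whose filter it matches. All names and ports
# here are hypothetical examples, not a real product interface.

from collections import defaultdict

# Which L4 ports each monitoring tool cares about; None means "all traffic".
TOOL_FILTERS = {
    "voip_analyzer": {5060, 5061},   # SIP signaling for voice-over-IP
    "video_monitor": {554, 1935},    # RTSP / RTMP streams
    "ids": None,                     # an IDS typically wants everything
}

def broker(packets):
    """Route each packet to every tool whose filter it matches."""
    feeds = defaultdict(list)
    for pkt in packets:
        for tool, ports in TOOL_FILTERS.items():
            if ports is None or pkt["dport"] in ports:
                feeds[tool].append(pkt)
    return feeds

packets = [
    {"src": "10.0.0.5", "dport": 5060},  # VoIP call setup
    {"src": "10.0.0.9", "dport": 554},   # video stream
    {"src": "10.0.0.7", "dport": 443},   # generic HTTPS
]
feeds = broker(packets)
# The VoIP tool never sees video or HTTPS traffic; the IDS sees everything.
```

In a real deployment the same principle extends to rate control (sampling or deduplicating before forwarding) so an expensive tool is never fed more traffic than it can process.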
How do organizations benefit when IT has the insight to proactively resolve problems faster?
Mark: We typically talk about IT, but I’ve also mentioned OT, whether in smart buildings or something like manufacturing, utilities, or transportation. We’ve always considered IT as being separate from the business. But again, I don’t think IT is simply a support activity these days; it’s integral to driving the business forward in many cases.
If I consider our own business, not understanding who we’ve sold to, or where and when, would make a huge difference in how we put products together and go to market or solve problems in different ways. If a company can troubleshoot faster because they have a visibility architecture that provides insight into all network traffic, they can more quickly identify these issues. We’re talking about a shift in mean time to repair (MTTR) from what is typically hours of response time down to only minutes. Ultimately this means a much better outcome for our customers and our customers’ customers — the end users. It’s also important to realize that in a modern IT network, perhaps as much as 30% or more of the traffic is “management” related — in other words, handling backups, dealing with configuration changes, and copies of traffic being used to gain visibility. Implementing these systems optimally can help with overall network performance as well.
Let’s expand on that. How do end users benefit when enterprises have a network visibility strategy?
Mark: End users are ultimately interested in continuously accessing the services or types of capabilities they’re used to having, whether that’s something as simple as walking into a store and paying with a credit card or streaming a video and not seeing the picture break up. Fundamentally, a well-architected visibility solution gives those services and capabilities increased uptime. So, yes, sure, we can reduce vulnerabilities. But ultimately, it’s a bit like a burglar alarm in that it won’t stop the most determined thief from getting in, but it will give you an early warning. It will allow you to deploy the right resources at the right time so you can respond rapidly and minimize the damage. I think of it like a thermal camera pointed at your house — you can see the hot spots, the cold spots, and so on. It makes sense to spend some money to address the hot spots, the lossy areas, knowing full well you can never stop heat loss 100%. It is about enabling businesses to make proactive decisions, deploy capital more effectively, and do all of that based on real data.
What do you think the biggest continuing network security threats will be, and how can a visibility architecture help companies stay ahead of new threats?
Mark: I would start out by saying that inevitably the weakest link in any of the systems we have is always human. No matter how a breach occurs, most breaches require some information and an entry point to be enabled. We see that manifested through phishing and other scams that attempt to use social engineering to collect vital information. Ultimately, I think this will continue to be one of the biggest challenges in this cybersecurity world. Education and continual awareness are critical to addressing this and making progress.
Beyond that, I think we’ll continue to see hackers targeting nontraditional areas. With SolarWinds, we saw the first major supply chain hack. We forecasted that type of vulnerability in one of our forward-thinking security threat reports published in 2019. I don’t know whether that’s a good thing or not. It’s never good to be right about something bad. And I would anticipate that’s going to continue impacting some nontraditional areas because, ultimately, ransomware and other attacks are designed to cripple businesses with a view to extracting money. Today cybercrime exceeds $6 trillion a year, and its rate of growth is not slowing yet. The percentage of hackers who are caught and brought to justice remains very small. As crimes go, it’s probably viewed as one of the lowest-risk ventures, since nobody actually has to be physically present and attackers may even be supported locally, so they tend to get away with it. If a company is not prepared, the remediation is very drastic and costly in many ways.
One interesting program to note that I think points the way to the future is Cyber Catalyst by Marsh, a program that Keysight joined last year. It helps insurance providers who provide coverage against things like ransomware and other security breaches to evaluate network and security products that help to reduce risk for their clients. The program offers training materials and access to best practices. When companies follow these best practices or use specific certified products, they receive reduced insurance rates. This helps everyone become more aware of best practices that reduce the risk of breaches and how to promptly take remedial action if a breach should occur.
Thanks, Mark, for the valuable insight! You’ve certainly given us much to consider. It’s clear to see just how critical it is for companies to prioritize their network visibility and monitoring strategy.
Download our new eBook to see why partial network visibility is not enough
Explore Keysight Network Visibility solutions at: https://www.keysight.com/us/en/products/network-visibility.html