

12.13.19 – SIW – New survey recommends increased automation to negate complexity issues and staff shortages

While it is perhaps the most basic security defense in the IT network toolbox, the firewall presents some of the greatest challenges to today’s CISOs because of its ubiquitous nature. At least that is the finding of the sixth annual State of the Firewall Report recently released by FireMon, a Dallas-based software company that provides security policy management solutions for hybrid cloud enterprises. Its annual benchmark of current issues in firewall management says that organizations that are already short-staffed and drowning in systems complexity, yet still cling to manual processes instead of automation, are at increased risk of breach and attack.

According to Tim Woods, VP of Technology Alliances for FireMon, firewalls are not being managed effectively, and inherent technology bloat threatens security policy and procedure with unnecessary complexities that come in different shapes, sizes and forms.

“What happens to a firewall over time as it evolves is that it takes on new rules. But unfortunately, the rules are rarely purged. You get a system clogged with unused rules, redundant rules and shadowed rules. And probably the worst offender is the overly permissive rule, which usually gestates because the poor IT professional is tasked by the business to add access but isn’t given the kind of details they need. Where did the order come from, where are they going to, what services, what ports, what’s the application? It can be a mess since they are usually up against a timeline,” says Woods. “Staff will try to create the best rule possible, or in order to honor a business deadline, they’ll go ahead and create an overly permissive rule with the spirit and the good intention of going back and cleaning that rule up later. But then they get waylaid with 15 other priority rule requests and the initial request gets moved to the backburner. Now you have this overly permissive rule embedded in the policy that at some point could exhibit risk, or that some nefarious individual or bad actor could take advantage of, because its access shouldn’t be there.”

Survey Sez

The 2019 State of the Firewall report features feedback from nearly 600 respondents, including nearly 20% from the executive ranks, detailing their enterprise firewall operations in the spectrum of digital transformation initiatives. The report highlights the fact that C-level executives and their security teams need more control and visibility over an organization’s network security processes in order to drive digital transformation and maintain compliance within the enterprise.

While data breaches continue to grab daily headlines, the culprits aren’t the firewalls themselves but are more attributable to misconfigurations, especially in cloud applications. A recent Gartner survey estimates that through 2023, “99% of firewall breaches will be caused by misconfigurations, not firewalls.” Gartner also states that “50% of enterprises will unknowingly and mistakenly have exposed some IaaS storage services, network segments, applications or APIs directly to the public internet,” up from 25% last year.

“Five or six years ago you’d see a firewall that had 10,000 or 15,000 rules on it. That’s a big firewall. Now it’s not uncommon for us to come across firewalls with 40,000, 50,000, even 60,000 rules. That’s more rules than it is humanly possible to evaluate, especially when it comes down to policy behavior. Unless you have some type of automation that can perform behavioral analytics to tell you which rules are being used the most, which rules are being used the least, which rules are not being used at all, and which rules are the most complex, you can become overwhelmed,” Woods explains. “In other words, how do we accurately assess the rules that are overly permissive or providing the most access? How do I surface those rules so that I can tighten them up and apply better discrete control?”
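The behavioral analytics Woods describes can be illustrated with a small sketch. The rule schema, field names and sample data below are illustrative assumptions, not FireMon’s actual data model; the idea is simply to bucket rules by observed hit counts and flag wildcard (“any”) fields as overly permissive.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    name: str
    src: str         # source label or "any" (assumed schema)
    dst: str         # destination label or "any"
    port: str        # service port or "any"
    hit_count: int   # matches observed over the audit window

def triage(rules):
    """Bucket rules by observed behavior: unused, most-used, and
    overly permissive (any wildcard field)."""
    return {
        "unused": [r.name for r in rules if r.hit_count == 0],
        "most_used": [r.name for r in
                      sorted(rules, key=lambda r: r.hit_count,
                             reverse=True)[:3]],
        "overly_permissive": [r.name for r in rules
                              if "any" in (r.src, r.dst, r.port)],
    }

# Hypothetical audit data for illustration only
rules = [
    Rule("web-in", "any", "10.0.0.5", "443", 90210),
    Rule("legacy-ftp", "10.1.0.0/24", "10.0.0.9", "21", 0),
    Rule("temp-allow-all", "any", "any", "any", 1204),
]
report = triage(rules)
```

With tens of thousands of rules, this kind of automated triage is what surfaces the unused and overly permissive entries that no human reviewer could find by hand.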

One of the most shocking revelations of the FireMon survey centered on the lack of automation in managing the enterprise firewall change process. Sixty-five percent of all respondents said they are not using any automation to manage their environment, and among C-level executives the figure was 68%. Thirty-six percent of the executives surveyed said that inaccuracies, misconfigurations or other network issues account for 10% to 24% of the changes that require rework.

“That particular stat shocked me, and I don’t get surprised by much, having been doing this for so long. It was the first time in the six years we’ve done the report that automation popped up as a major issue. It not only made the list, but it was a top concern. That is pretty illuminating,” Woods says.

Overworked and Understaffed

When you combine the lack of automation in an enterprise firewall operation with a high volume of change requests, rising complexity and a shortage of qualified IT security personnel, the recipe for disaster is obvious. The report’s findings underscore how widely this solution is missing across the industry. Finding the right approach to security automation for each enterprise helps improve real-time visibility and control over network security processes and keeps the organization compliant with regulations. The best approach will enable an organization to minimize human error, increase efficiency and close the gap between driving transformation initiatives and maximizing security resources and agility.

“I think everybody would agree that the cybersecurity shortage is real. I recently read an article that put it at about 350,000 open jobs in North America alone. I talk to people every day, even security directors, who tell me they have some of their best people doing some of the most mundane, recurring tasks that are offloaded onto them. I need them to work on the things that I hired them to do. Unfortunately, they’re eaten up with a lot of these recurring tasks that still must be met for the business to run. I can’t hire more people, and these people just have too many things on their plate,” Woods adds. “The processes that we’re using to honor business access requests are still somewhat outdated. Security has to somehow gain parity with the speed of the business.”

Woods contends that if organizations aren’t adding security personnel and they are failing to remedy the processes that are broken or not working as fast as they should, then staff will only wait so long before they look for a way around the problem.

“That’s exactly where we find ourselves today. We see individuals, we see business owners, stakeholders, we see DevOps, we see the different stakeholders taking responsibility for defining their own security controls for the assets and the resources and the services that they’re deploying in the cloud. And again, it’s not that they’re not smart people, they’re just not well-grounded with a security background and they’re not getting it right. So, you see these misconfigurations taking place,” laments Woods.

Is There A Solution?

The million-dollar question is how an organization attains that smooth security track where all departments are in sync and the C-suite and its DevOps teams share the same vision. It helps if the advancement of technology and the growing speed of business cut across all sectors of the organization beyond IT. The survey concludes that “the gap between the urgency to innovate and ensuring a secure network creates additional stress for security teams that can lead to errors, accidental exposure or service disruptions.”

“Technology lends itself to helping the business. A successful business is accelerating for the right reasons. They’re trying to be innovative; they’re trying to gain a competitive advantage in the marketplace and that innovating is not going to stop. The business is not going to raise their hand and say, ‘Okay, we’ll wait on you.’ If we’re going to put the security team in the middle of DevOps, then those teams must work cohesively as well. It’s as much cultural as it is having the right technology platform. You can have the best technology on the planet but if it’s not managed correctly, or if it’s not leveraged correctly, then you’re not going to realize the return out of that security investment. A cohesive risk strategy has to be driven from the top,” says Woods.

How to Tackle the Major Issues

Woods figures there are at least three top issues related to security and network protection that an organization should embrace to stay current with today’s threats. First and foremost is tackling the growing technology complexity confronting an organization and developing a strategy to deal with it.

“So many organizations talk about complexity within the organizations but don’t call it out for what it is. Whether that’s the firewalls or whether it’s the proper management of the rule bases that are in place or whether it’s getting my percentage of change correct, that’s an area that requires focus. That’s an area where I need to be able to sandbox changes in advance of those being implemented into the network which could result in an impact on my business continuity. That is why this whole subject of growing complexity needs to be called out and needs to be identified with some initiatives built around it,” adds Woods.

The second major issue according to Woods is visibility.

“You can’t manage what you can’t see, and you can’t secure what you don’t know about. Do you feel confident that you have good insight and good visibility into your network infrastructure? I’m talking about when a change happens: do I have the ability to analyze that change, and do I know when a change takes place? Any good regulatory compliance initiative is going to ask that right off the bat,” Woods continues. “Are you monitoring for change? And if you come back and say no, then you get a big red X right off the bat. When a change takes place on a network, you must ask the question: was it a good change or a bad change? Did it impact my security posture? Did it impact my compliance posture? Did it introduce unacceptable risk within our environment? Because at the end of the day, we’re trying to manage risk to a level that’s acceptable to the business.”
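The “are you monitoring for change?” question has a simple mechanical core, sketched below under the assumption that plain-text configuration snapshots can be pulled from each device; a real deployment would poll via vendor APIs and normalize volatile fields like timestamps before comparing.

```python
import difflib
import hashlib

def fingerprint(config_text: str) -> str:
    """Stable fingerprint of a device configuration snapshot."""
    return hashlib.sha256(config_text.encode("utf-8")).hexdigest()

def detect_change(baseline: str, current: str):
    """Return None when nothing changed, otherwise a unified diff
    showing exactly what did: the raw material for deciding whether
    a change was good or bad."""
    if fingerprint(baseline) == fingerprint(current):
        return None
    return list(difflib.unified_diff(
        baseline.splitlines(), current.splitlines(),
        fromfile="baseline", tofile="current", lineterm=""))

# Hypothetical ACL snapshots for illustration only
old = "permit tcp any host 10.0.0.5 eq 443\ndeny ip any any\n"
new = "permit tcp any host 10.0.0.5 eq 443\npermit ip any any\n"
delta = detect_change(old, new)
```

Knowing that a change happened is only the first step; surfacing the exact diff is what lets the team answer Woods’s follow-up questions about security and compliance posture.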

His third premise is all about staff.

“How are you empowering your people to use the technology that you have? I think this is where automation comes into play. If I’m not adding people, then I must make my people more effective. Automation comes in as the leverage at this point,” says Woods.

The survey adds that implementing automation into the change process will help enterprises:

  • Reduce human error by eliminating misconfigurations that can increase the attack surface
  • Eliminate friction between DevOps and SecOps so they can deliver security at the speed of business
  • Increase security agility while shortening SLA timeframes
  • Maximize efficiency while reducing operational and security costs
  • Prevent compliance violations through continuous monitoring of global security policies across the enterprise’s hybrid environment

About the Author:

Steve Lasky is a 33-year veteran of the security publishing industry and multiple-award-winning journalist. He is currently the Editorial and Conference Director for the Endeavor Business Security Media Group, the world’s largest security media entity, serving more than 190,000 security professionals in print, interactive and events. It includes Security Technology Executive, Security Business and Locksmith Ledger International magazines, and SecurityInfoWatch.com, the most visited security web portal in the world. He can be reached at steveo@securityinfowatch.com.