
How Greater Visibility Can Make Your Organization More Resilient

Developing a clear picture of your digital ecosystem is the first step in minimizing the damage that could result from a breach.

Cyber resilience refers to an organization’s ability to prevent, withstand and recover from cyberattacks. Achieving this requires taking a holistic view of the enterprise rather than focusing on specific cyber risks.

Organizations should look at cyber risk as business risk, and promoting resilience is a key part of ensuring business continuity. Operational interruptions and downtime can result in significant financial losses, even without data exfiltration. New attack vectors and hard-to-detect infiltrations have also expanded the range of tactics, techniques and procedures that defenders must account for.

Identity and access management and data security are the two pillars of cyber resilience. Who (humans, devices or application programming interfaces) is accessing what (applications or data) is the primary question, and most cybersecurity tools seek to assess the validity of these requests.

Another element of cyber resilience, and certainly of cyber recovery, is having a thorough understanding of an organization’s minimum viable state. What must be recovered to achieve minimum viability as an organization, and on what timeline? What about the second phase? The third? Cybersecurity professionals must be aligned to business needs and outcomes, and it’s important to ensure that recovery data is not only available but also can be trusted.
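One simple way to capture that alignment is to record, for each recovery phase, the business capabilities that must be restored and the timeline they must meet. The sketch below, in Python, is a hypothetical illustration of such a structure; the phases, capabilities and targets are invented for the example.

    # Hypothetical recovery tiers: each phase lists the capabilities that must be
    # restored and the target timeline, so recovery work maps to business outcomes.
    recovery_plan = [
        {"phase": 1, "target": "24 hours", "capabilities": ["identity provider", "core payments"]},
        {"phase": 2, "target": "72 hours", "capabilities": ["customer portal", "order management"]},
        {"phase": 3, "target": "2 weeks", "capabilities": ["analytics", "internal reporting"]},
    ]

    for tier in recovery_plan:
        print(f"Phase {tier['phase']} ({tier['target']}): " + ", ".join(tier["capabilities"]))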

A key tenet of a zero-trust architecture (ZTA) is maintaining a continuous understanding of precisely which data identities and devices are accessing across an environment. That continuous understanding demands visibility into who or what is entering the network from which device, which application workloads are affected by that identity, and which infrastructure is being leveraged to access that data. As a result, ZTA improves observability for organizations, typically through policies, rules, controls and analytics driven by artificial intelligence.
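As a rough illustration of that continuous evaluation, the Python sketch below checks identity verification, device posture and data sensitivity on every request rather than trusting a session once at login. The field names, risk threshold and policy rules are hypothetical assumptions for the example, not any particular product’s model.

    # Minimal zero-trust access decision sketch: each request is evaluated against
    # identity, device posture and data sensitivity. All fields and thresholds are
    # hypothetical illustrations.
    from dataclasses import dataclass

    @dataclass
    class AccessRequest:
        identity: str          # human user, service account or API client
        mfa_verified: bool     # strong authentication completed for this session
        device_managed: bool   # endpoint is enrolled and reporting posture
        device_risk: float     # 0.0 (clean) to 1.0 (compromised), from analytics
        resource: str          # application or data set being requested
        sensitivity: str       # "public", "internal" or "restricted"

    def evaluate(request: AccessRequest) -> tuple[bool, str]:
        """Return (allowed, reason) for a single access attempt."""
        if not request.mfa_verified:
            return False, "deny: identity not strongly verified"
        if request.sensitivity == "restricted" and not request.device_managed:
            return False, "deny: restricted data requires a managed device"
        if request.device_risk > 0.7:
            return False, "deny: device posture exceeds risk threshold"
        return True, f"allow: {request.identity} -> {request.resource}"

    print(evaluate(AccessRequest("jdoe", True, False, 0.2, "hr-payroll", "restricted")))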

Recent research conducted by CDW indicates that 41 percent of survey respondents have reached an advanced level of maturity in their zero-trust journey. But more than half are still finding their way to greater maturity.

This disparity didn’t surprise me, because many factors affect where organizations are on their ZTA journeys. Everyone has a different starting point based on their needs, which are predicated on the nature of their business. A publicly held company will need more visibility into its data than a private one. A large global company with a complex infrastructure will require more work in areas such as network segmentation and identity. A company with a fully virtual workforce will need very different tools and processes than a company with a hybrid workforce. And an institution that’s been around for more than 50 years, though probably further down the zero-trust road than most startups, is likely to have legacy tools, business logic and infrastructure that aren’t well suited for a ZTA model, requiring evaluation and eventual replacement.

How Valuable Is Visibility?

According to recent cybersecurity research by CDW, 90 percent of survey respondents say that they are somewhat or very confident that they have sufficient visibility into their cybersecurity landscape. On the whole, this is a good sign.

These IT decision-makers are confirming that they are able to interpret the outputs associated with their cybersecurity tools.

Over the past 10 years, there has been an increased focus on establishing a dedicated chief information security officer (CISO) role in organizations, and that role is often aligned with senior decision-making and leadership. There was a time when cybersecurity tools were acquired by many different teams (network security, development, infrastructure, application, mainframe/host, cloud, etc.), often with project-specific scopes. This resulted in functional overlaps and a lack of visibility.

While increased visibility into the cybersecurity tool landscape is a welcome development, it is a separate question whether those tools map to the mitigation of actual threats, whether an organization’s risk tolerance and security spending are rationalized across that landscape of tools, and whether the tools succeed at identifying anomalous events.

A Lack of Visibility Undermines Detection and Response

Visibility is the primary factor driving early detection and quicker response to and mitigation of threats. In a more distributed cybersecurity landscape, where users need to access systems and data from anywhere on any device, effective monitoring must account for multiple attack vectors. Policy established at the single-system level may not be enough to identify anomalous behavior.

Endpoint detection and response (EDR), managed detection and response (MDR) and extended detection and response (XDR) are important, but on their own they are insufficient to prevent lateral movement when, for example, a successful social engineering effort leads to compromised credentials.
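To make that concrete, the hypothetical Python sketch below correlates authentication events across systems and flags credentials that reach an unusual number of distinct hosts in a short window, a common signature of lateral movement after credential compromise. The event format, window and threshold are illustrative assumptions, not a production detection rule.

    # Hypothetical cross-system correlation: flag credentials that authenticate to
    # many distinct hosts in a short window, a pattern an endpoint-only view can miss.
    from collections import defaultdict
    from datetime import timedelta

    def flag_lateral_movement(events, window=timedelta(minutes=15), max_hosts=5):
        """events: iterable of (timestamp, username, host) login records."""
        by_user = defaultdict(list)
        for ts, user, host in sorted(events):
            by_user[user].append((ts, host))
        flagged = {}
        for user, logins in by_user.items():
            for ts, _ in logins:
                hosts = {h for t, h in logins if ts <= t <= ts + window}
                if len(hosts) > max_hosts:
                    flagged[user] = hosts
                    break
        return flagged  # e.g. {"svc-backup": {"db01", "db02", "fs01", ...}}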

Similarly, a lack of visibility can create significant barriers to compliance. In the case of public companies that must now report on their cybersecurity posture via disclosures, a lack of visibility could unintentionally lead to misrepresentation. Should a material event occur, regulated companies must report it to the Securities and Exchange Commission, and any prior misrepresentation could exacerbate the reputational damage caused by a breach.

Let Business Needs Dictate Your Cybersecurity Tools

Instead of talking about traditional cybersecurity tools, I like to look at how strategic business needs relate to technology. If we start at the business level, evaluation and quantification of cyber risk is key, but it has historically been challenging to deliver specific value and metrics in this area. Recent developments in solutions and the incorporation of cyber insurance industry data have made this a low-cost, realistic exercise for most organizations. This is a key element of informing senior leaders and boards of directors about risks and the impact of accepting those risks.

In terms of tool visibility and observability of the environment as a whole, security orchestration, automation and response (SOAR) tools are being adopted at an increasing rate. This is driving greater automation and orchestration, which increases (and depends on) visibility. Many organizations have also adopted security information and event management (SIEM) tools to promote visibility. Understanding the use cases for visibility and how they support mapping to a controls framework can enable better decisions about tooling investments, and ultimately better decisions about business, risk and operational intelligence.
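One lightweight way to reason about those decisions is to map each visibility use case to the control objectives it supports and the tools that currently cover it; anything with no covering tool becomes a candidate for the next investment. The Python sketch below uses invented use cases, control identifiers and tool names purely for illustration.

    # Hypothetical mapping of visibility use cases to control objectives and tools.
    # A use case with no covering tool is a visibility gap to prioritize.
    coverage = {
        "endpoint telemetry": {"controls": ["detect-1", "respond-2"], "tools": ["EDR"]},
        "identity activity": {"controls": ["identify-3", "detect-4"], "tools": ["SIEM"]},
        "cloud workload activity": {"controls": ["detect-5"], "tools": []},
        "data access auditing": {"controls": ["protect-6", "detect-7"], "tools": ["SIEM", "DLP"]},
    }

    gaps = [use_case for use_case, entry in coverage.items() if not entry["tools"]]
    print("Visibility gaps to prioritize:", gaps)  # ['cloud workload activity']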

Finally, incident response planning is a key factor in tying visibility to cyber resilience. Establishing a computer security incident response team with key stakeholders can drive cybersecurity awareness for an organization, and it helps to align security initiatives and uncover projects and needs that relate back to the cyber environment. As part of this, end-user education should also be prioritized. Creating aware and effective security partners within an organization is a key win.


Buck Bell

CDW Expert
Buck Bell leads CDW’s Global Security Strategy Office, bringing over 20 years of cybersecurity and risk management experience.