
The thinking problem

Contributor: James Maude
Date published: 9/30/2016

Over the past 30 years we’ve seen business technologies come and go in a rapidly evolving landscape of innovation and ingenuity. You could be forgiven for thinking that in the time it took to shrink a computer down to the size of a wristwatch and establish high-speed connectivity to every corner of the globe, we would also have revolutionized information security.

Yet when I look around trade show floors today I see vast swathes of vendors, all claiming to have the answer. But what is the question, and why are we still seeing breaches? Often the question they are answering is “how do I detect more of the unknown bad things?” rather than “how do I protect the data and the known good?” The problem with the former is that the unknown is infinite, and we are setting ourselves up for a fall at some point.

As an industry, and possibly now a society, we have a highly skewed perception of what makes us secure. If you ask someone on the street how they protect their computer, they will inevitably say “I have anti-virus”. Yet updating software, installing from trusted sources, not logging in with admin rights and taking backups go unmentioned. This phenomenon leads to detection fever, where we constantly chase the next new solution promising higher detection rates than the last.

Like a gambling addict, many IT departments rely heavily on AV, chasing the big win they had 10 years ago when AV saved them from a major incident. Over the years their expenditure has increased despite ever-decreasing odds, but they can’t seem to move on, constantly pushing money into newer next-gen solutions that all promise better odds.

This arms race occurs across entire industries as companies increase budgets and match competitors’ spending on InfoSec for fear of being left behind. If you are breached, you want to be able to blame the state-of-the-art, multi-million-dollar kit that failed to detect the attack rather than the fact that software was unpatched and endpoints were vulnerable. Despite all this spending we still see breaches every day; in 2016, an attack doesn’t have to be advanced, it just has to be unique.

"In 2016, an attack doesn’t have to be advanced, it just has to be unique."

Whilst detection and next-gen solutions absolutely have a place, it’s time we realigned our security mindset to reality. Securing endpoints and controlling access to data and privilege might not seem sexy and next-gen, but it is the fundamental layer of security that underpins everything else you do. Ultimately, attackers are targeting endpoints, and if you haven’t done everything you can to reduce that attack surface, an attacker will be successful sooner rather than later.

There has been plenty of research showing that measures such as application whitelisting and least privilege are the top defenses against cyber threats. Yet when you survey IT departments, they talk of network filters and detection as their top priorities. If we look back at the problem we are trying to solve, is it better to try to detect whether an infinite number of unknown applications launching on a system are malicious, or simply to allow only the ones we know are good to launch? Should we make sure the user has access only to the resources they need, or try to monitor and screen their over-privileged access? This is just good cyber hygiene and best practice, and all too often the missing piece in security programs. We need to think like an attacker and focus on blocking and disrupting the attack chain, not just detecting a specific attack.