Strategy planning is an essential process for determining when to propose and implement new technology so that it aligns with business requirements and user needs. New technology appears on a daily basis, with applications designed to tame the most mundane tasks and simplify them. As a result, part of any strategy proposition typically includes a means of streamlining and improving current business processes to make them more efficient. After all, you can never really sell something on technology alone; it needs some justification as to how its implementation will yield a positive and desired result.
Strategy and the risk of weakening your security model
With any new strategy comes an element of unforeseen risk. It is this risk that, if overlooked or simply ignored, could open the floodgates and enable a tidal wave of unexpected attacks on your network, possibly resulting in the loss of intellectual property or personally identifiable information. Worse still, you could be notified by law enforcement or a third party that information under your control has been leaked onto the dark web and is available for sale.
The concept of installing bleeding-edge software or hardware to resolve immediate business issues and requirements is not a new one, but it can also introduce a wide range of hidden vulnerabilities that a cyber criminal can use to their advantage. This could take the form of a zero-day exploit, or a known vulnerability in firmware, for example; either could be leveraged as the platform that allows infiltration and, if the target environment permits, the extraction of data on a potentially massive scale. It's not uncommon these days for businesses to make use of gigabit internet connections, as bandwidth is a cheap commodity. However, these larger connections can also be used to exfiltrate data from your network at an alarming rate in the event of a breach. In some cases, smaller businesses may leave their firewalls with unfiltered access to the internet, or deploy a poorly designed security template that could grant a cyber criminal access given the right circumstances.
Your choice of strategy and deployment may have started with good intentions, but without an adequate risk assessment and relevant due diligence performed beforehand, the result could cause more problems than it set out to resolve. Because performance is the issue that most visibly impacts the user experience, security is often pushed down the list in its favour, or fixes are implemented that do not factor in the potential security risk.
Using a bulldozer to break an egg isn’t always the best approach to resolving performance issues – particularly when it can have a negative impact on your existing security model
Strategy decisions are also heavily influenced by current and legacy infrastructure. Organisations should place particular emphasis on legacy equipment, with a view to phasing it out as soon as is feasible from a business, technical (and, if applicable, regulatory) standpoint. The central question I always encourage here in relation to legacy infrastructure is how often a particular system is actually used. If it sits there dormant and unloved for long periods of time, should it really still be on your network? There are various use cases and scenarios where it could be argued that a legacy system needs to remain, but would it still need to be powered on 24/7?
Strategy decisions can impose security risk on legacy infrastructure
Legacy infrastructure will always contain vulnerabilities, and given that the product in question is already classed as end of life, it is no longer supported. This lack of support is exactly what a cyber criminal will leverage when looking for an easy way into your network. Legacy systems will remain vulnerable and, as such, should always be in scope when making business decisions around new hardware (and, in most cases, software). The new hardware may tick all the boxes from an audit perspective in terms of performance, stability, and security, but that very security you set out to create could well be sidestepped via the legacy equipment under the right circumstances. You can align the thought process with a game of chess. Chess is all about strategy, and one poorly executed move could cost you dearly if you haven't considered the outcome of that action. However, be mindful of one thing that is certain:
Losing your queen in a game of chess doesn’t carry the same consequence or responsibility as losing your data in a breach.
Software has an additional impact on your strategy model, as access requests for new packages can be (and often are) driven by users at both standard and board level, and may form part of a larger project that is outside of your control. Implementing such solutions can change your vulnerability landscape dramatically as soon as the installation is completed. An example is where the software application in question bundles another package that is an essential component of the main system in terms of functionality, yet ships an older version considered stable for production but not necessarily the most secure iteration. Without adequate analysis beforehand, you cannot realistically know what the application adds as part of its installation routine (think registry changes, file permissions, and so on).
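As a simple illustration of the bundled-dependency problem, the Python sketch below checks an installer's component manifest against a list of versions already known to be vulnerable. The component names, version numbers, and the `audit_components` helper are all hypothetical; in practice you would feed it data from a real software composition analysis tool or vulnerability feed.

```python
# Hypothetical sketch: flag bundled components whose versions appear in a
# known-vulnerable list. Names and versions below are illustrative only.

KNOWN_VULNERABLE = {
    "openssl": {"1.0.1", "1.0.2"},   # illustrative versions, not real advisories
    "log-helper": {"2.14.0"},
}

def audit_components(manifest):
    """Return the (name, version) pairs that match the vulnerable list.

    `manifest` maps a bundled component name to its version string,
    e.g. extracted from an installer's file listing.
    """
    findings = []
    for name, version in manifest.items():
        if version in KNOWN_VULNERABLE.get(name.lower(), set()):
            findings.append((name, version))
    return findings

bundle = {"OpenSSL": "1.0.2", "zlib": "1.3.1"}
print(audit_components(bundle))   # -> [('OpenSSL', '1.0.2')]
```

Even a crude check like this makes the hidden cost of a "stable" bundled component visible before it reaches production.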
This in itself is dangerous if overlooked. Another area of risk is the software installer that is designed to be lightweight: the actual components are downloaded from an external source (several software houses do this already, with more following suit all the time), and if your endpoint protection isn't quite up to scratch (or, even worse, not present at all), you could be downloading a little more than you expected if that vendor has been compromised. With drive-by downloads, malware can find its way into your network unannounced (and uninvited).
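One practical mitigation, sketched below in Python, is to verify a downloaded installer against the vendor-published SHA-256 digest before executing it. The file name and digest are placeholders; this only helps if the digest is obtained over a separate, trusted channel, and it will not catch a vendor whose publishing pipeline itself has been compromised.

```python
import hashlib

def sha256_of(path):
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_download(path, expected_digest):
    """True only if the file on disk matches the vendor-published digest."""
    return sha256_of(path) == expected_digest.lower()

# Usage (the digest here is a placeholder -- use the value the vendor publishes):
# if not verify_download("setup.exe", "ab34..."):
#     raise SystemExit("Checksum mismatch: do not run this installer.")
```

It is a small amount of friction for users, but it turns "trust whatever arrived over the wire" into an explicit, checkable decision.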
A drive-by download refers to the unauthorised installation of malware onto your computer or mobile device. Such incidents leverage exploits in a browser, a mobile application, or an operating system that is out of date and has a security flaw.
Once malware is inside your network, it can lie dormant and undetected for a surprisingly long time before unleashing its payload, or being remotely activated by command and control. All software proposed for introduction into your network should be thoroughly reviewed and inspected for malicious content or activity. Installing first in a sandbox environment is a generally well-accepted mechanism before deploying into test. This way, the software can be monitored for requesting connections to external networks that are known to be malicious, or for making bulk changes to files and the registry. Another possibility is ransomware being installed in your environment; in a ring-fenced sandbox, this threat is contained and unable to spread.
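A minimal sketch of the "bulk file changes" check: hash every file in the sandbox before and after installation, then diff the two snapshots. This is an illustrative baseline only (real sandboxes also monitor the registry, processes, and network traffic), and the `snapshot` and `diff` helpers are hypothetical names.

```python
import hashlib
from pathlib import Path

def snapshot(root):
    """Map each file under `root` to its SHA-256 digest."""
    state = {}
    for path in Path(root).rglob("*"):
        if path.is_file():
            state[str(path)] = hashlib.sha256(path.read_bytes()).hexdigest()
    return state

def diff(before, after):
    """Return files added or modified between two snapshots."""
    added = [p for p in after if p not in before]
    modified = [p for p in after if p in before and after[p] != before[p]]
    return added, modified
```

Take one snapshot before running the installer in the sandbox and another afterwards; an unexpectedly long `modified` list, particularly across user documents, is a classic ransomware indicator.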
How do you balance strategy and security?
There is no simple answer to this. If your users request a solution to a problem, they will probably not be aware of, or concerned about, the security risk that an unsuitable solution imposes; they are more interested in resolving an issue that prevents them from doing their job or performing their business function. It is the responsibility of IT departments and CISOs to ensure that all classes of information (regardless of sensitivity) and the associated infrastructure are not placed at risk of loss or breach. This is becoming increasingly difficult to police given the rise of BYOD. A poorly configured mobile device management (MDM) policy and a rogue application installed by an end user (knowingly or not) could expose your business, intellectual property, and associated information to unprecedented threat.
What’s your view?