Accelerating the software delivery pipeline while improving the overall security posture sounds like a tall order, but it is possible with modern Privileged Access Management.
The rise of Continuous Integration/Continuous Delivery (CI/CD) and DevOps as the preferred way to deliver software has moved well beyond niche adoption and is now the de facto standard for any software builder. It’s how Amazon or Netflix can make dozens of updates to live systems every day, but it’s also a winning strategy for even the smallest application development organisation. And that’s no surprise: the ability to deliver value to users as quickly as it can be coded, and to change plans in response to user feedback, is a win-win for software creators and users.
But it’s not easy to do well or, importantly, securely. Building software this fast, or bringing together components from various development teams or even different companies, can lead to shortcuts and poor security practices.
There have been numerous examples of the risks. During coding, for example, it can be tempting to embed passwords or security tokens in plaintext files and keep them in the same repository as the application source code. That not only exposes those credentials to other developers who may not need them, but is even more dangerous if the repository is hosted in the cloud, for instance on GitHub. Indeed, the risk is real enough that it has spawned tools dedicated to searching for such secrets. To be fair, GitHub continues to improve its tooling to reduce the risk of exposing secrets, but that is never a complete solution.
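To illustrate just how easily embedded secrets can be found, a few lines of Python are enough to sweep a source tree for common credential patterns. This is a simplified sketch, not a production scanner; the patterns and file handling are illustrative assumptions:

```python
import re
from pathlib import Path

# Simplified patterns for common credential formats (illustrative only)
SECRET_PATTERNS = {
    "aws_access_key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "private_key": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
    "password_assignment": re.compile(r"password\s*=\s*['\"][^'\"]+['\"]", re.IGNORECASE),
}

def scan_file(path):
    """Return a list of (pattern_name, line_number) hits in one file."""
    hits = []
    try:
        text = path.read_text(errors="ignore")
    except OSError:
        return hits
    for lineno, line in enumerate(text.splitlines(), start=1):
        for name, pattern in SECRET_PATTERNS.items():
            if pattern.search(line):
                hits.append((name, lineno))
    return hits

def scan_repo(root):
    """Walk a source tree and report every suspected secret."""
    findings = {}
    for path in Path(root).rglob("*"):
        if path.is_file():
            hits = scan_file(path)
            if hits:
                findings[str(path)] = hits
    return findings
```

If a handful of regular expressions can surface credentials this easily, an attacker scanning public repositories can too.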
The solution must be to separate those highly valuable “secrets” (passwords, certificates, tokens, etc.) from the source code and the developers, and to do so in a way that doesn’t introduce new risks or slow down the delivery pipeline. As we’ll see below, Privileged Access Management separates those valuable secrets from the application code without impacting productivity.
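The simplest form of that separation is for the application to resolve secrets at runtime rather than carry them in source. The sketch below assumes an environment variable injected by the pipeline; in practice the same lookup would go to a PAM or secrets vault, but the separation principle is identical:

```python
import os

def get_secret(name, default=None):
    """Resolve a secret at runtime instead of embedding it in source.

    In production this lookup would typically go to a PAM or secrets
    vault; reading an environment variable injected by the delivery
    pipeline is the minimal version of the same separation.
    """
    value = os.environ.get(name, default)
    if value is None:
        raise RuntimeError(f"Secret {name!r} not provided to this environment")
    return value

# The code base never contains the credential itself:
# db_password = get_secret("DB_PASSWORD")
```

The variable name `DB_PASSWORD` is only an example; the point is that the repository holds a reference to a secret, never its value.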
But there’s another issue that’s sometimes overlooked. Increasingly, multiple virtual environments are used for development, test and production. These virtual machines may be entire systems or dedicated containers at each stage; for example, one may house just the database service while another handles load balancing. It’s critical that these systems are well managed: it should be impossible, for example, for developers to access Production systems, a separation required by security standards such as PCI DSS, the standard for payment handling systems.
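That separation can be thought of as a simple access policy: each role is explicitly granted a set of environments, and anything not granted is denied. A minimal sketch follows; the role and environment names are assumptions for illustration, not a real product’s policy model:

```python
# Illustrative role-to-environment policy enforcing dev/prod separation.
# Anything not explicitly granted is denied by default.
POLICY = {
    "developer": {"development", "test"},
    "release_engineer": {"test", "staging"},
    "operations": {"staging", "production"},
}

def can_access(role, environment):
    """Return True only if the role is explicitly allowed the environment."""
    return environment in POLICY.get(role, set())
```

With a default-deny model like this, a developer simply has no path to Production, which is exactly the property auditors look for.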
Validating the security of the application as part of the DevOps pipeline is critical. That validation should cover not only the behaviour of the application but also the system architecture, coding practices, and the CI/CD pipeline itself. This extended workflow is known as “DevSecOps.”
Let’s look at the different phases of a DevSecOps pipeline and the relevant security aspects of each.
Well-written source code uses separation of concerns to make the code more logical and readable. These techniques and algorithms embody real intellectual property: they describe the business process and, to an attacker, potential points of attack. Even setting aside the issue of embedded “secrets”, it’s important to consider the value in the design itself.
At the development stage it’s common for many virtual machines to be stood up and subsequently destroyed as the code base progresses. Managing credentials to access these machines is tricky and can lead to bad practice. Modern Privileged Access Management (PAM) makes it easy to access these systems even when each has unique credentials. By securing the access credentials for each device and separating users from those credentials when they connect, the risk of exposing valuable secrets is removed.
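The broker pattern behind this can be sketched in a few lines. This is a toy model to show the shape of the idea, not a real PAM implementation: each short-lived machine gets a unique credential, and the user receives only an opaque session, never the credential itself:

```python
import secrets

class CredentialVault:
    """Toy credential vault: holds per-device credentials users never see."""

    def __init__(self):
        self._store = {}

    def register_device(self, device):
        # Each short-lived VM gets its own unique, generated credential
        self._store[device] = secrets.token_urlsafe(16)

    def open_session(self, user, device):
        """Broker a connection: the credential is used but never revealed."""
        credential = self._store[device]  # raises KeyError for unknown devices
        # A real PAM system would establish an SSH/RDP session here using
        # `credential`; this sketch just returns an opaque session handle.
        _ = credential
        return f"session:{user}@{device}"
```

Because the credential never leaves the vault, destroying the VM and its credential leaves nothing for a developer’s notes, scripts or clipboard to leak.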
User Acceptance Testing (UAT) and Staging environments are often run against copies of live data. This raises all sorts of GDPR and security questions, as the test data may include personally identifiable information.
In principle, test data should be as representative as possible without identifying real people. Where live data is absolutely necessary, it has to be managed in a highly controlled environment. In a fast-moving, continuous delivery environment, it’s too easy to accidentally connect to Production rather than Test environments, and in a particularly bad case, that test data can even end up in public demonstrations. The Osirium PxM Platform has a feature called ‘Device Group Separation’ which ensures that a user, for example a developer, cannot be connected to Production at the same time as UAT/Staging/Test systems.
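One common way to make test data representative without identifying real people is to pseudonymise the identifying fields while leaving the rest of each record intact. The sketch below uses a deterministic hash so that references stay consistent across tables; it is a simplified illustration of the idea, not a complete GDPR anonymisation strategy (a real one would also consider salting, rare-value suppression and so on):

```python
import hashlib

def pseudonymise(record, fields=("name", "email")):
    """Replace identifying fields with stable, opaque tokens.

    A deterministic hash keeps referential integrity across tables
    while removing the real values. Simplified illustration only,
    not a full anonymisation strategy.
    """
    out = dict(record)  # leave the caller's record untouched
    for field in fields:
        if out.get(field) is not None:
            digest = hashlib.sha256(out[field].encode()).hexdigest()[:12]
            out[field] = f"{field}_{digest}"
    return out
```

Run over a copy of live data before it reaches UAT or Staging, this keeps the shape of the data (and the joins) while stripping out who the data is about.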
Once code has been committed, automated processes take over to build, validate and push the updates towards Production servers. These stages, too, generate valuable information in logs, configurations and test results.
Traditional CI/CD tools help automate this process but have little or no understanding of how to do it securely. Osirium’s Opus (now known as PPA) is a new generation of Robotic Process Automation (RPA) built to secure automation systems, an approach known as Privileged RPA (PRPA). With Opus, virtual machines and devices can be provisioned, and software deployed, automatically and securely.
Production systems are clearly the most critical part of the DevOps environment. This is where users have access, and parts of the system may sit outside firewalls or be hosted in third-party environments. Live user data and payment handling may be involved. Protecting access to these environments, and carefully controlling how they’re updated, is critical.
Again, this is where PAM is the solution. PRPA ensures Production systems can be secured while still allowing automated access. And when system admins and others must gain access, PAM ensures they can do so without connection credentials ever being revealed.
It’s also important to consider the ongoing servicing of those Production systems, for example updating configurations or certificates without re-releasing the system. This is another opportunity for PPA to automate tasks across multiple systems, separate operational staff from credentials, and keep an auditable record of changes.
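The shape of that kind of audited, credential-free automation can be sketched as follows. This is a toy model of the idea, not Osirium’s implementation: the operator names a task and its targets, the automation layer would fetch credentials internally, and every action is logged with who asked for it and when:

```python
import datetime

AUDIT_LOG = []

def run_privileged_task(operator, task, targets):
    """Run an automated task on behalf of an operator.

    The operator never handles credentials: the automation layer would
    fetch them internally, and every action is recorded with who asked
    for it and when. (A toy sketch, not a real product's API.)
    """
    results = []
    for target in targets:
        # credential = vault.fetch(target)  # fetched here, never shown to the operator
        result = task(target)  # e.g. push a renewed certificate to `target`
        AUDIT_LOG.append({
            "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "operator": operator,
            "target": target,
            "task": task.__name__,
            "result": result,
        })
        results.append(result)
    return results
```

The audit trail is produced as a side effect of doing the work, so the record of who changed what is complete by construction rather than relying on staff to fill in a log afterwards.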
In summary, CI/CD and DevOps have been, and will continue to be, the way successful organisations deliver value to their users. Building those systems within a Privileged Access Management framework ensures DevOps happens securely and removes painful manual processes that are error-prone and slow. Overall, another win-win for application creators and end users.