Strategies for Managing Widely Deployed Code with Kiuwan

Sep 2, 2020

As applications become increasingly cloud-based – or even cloud-native – more and more application code is sending data to and from cloud-based stores, both public and private. This makes the methods and controls that such applications use to access the cloud of particular interest. It also keeps the onus on application owners to protect and preserve application data, particularly when it involves information subject to compliance and regulatory requirements. That brings a host of other concerns into play, ranging from preserving privacy and confidentiality to the “right to be forgotten” (a GDPR requirement that obliges organizations to dispose of data about any registered individual within 30 days of such a request, or face fines and penalties).

Pass the Data, But Not the Buck

Indeed, organizations must realize and own up to their responsibility for data, even when it leaves their hands and goes into the cloud. At best, the cloud service provider will assume a “shared responsibility” for an organization’s data once it hits their servers or data stores. But in every case, the organization that acquires (and presumably controls and protects) such data remains legally responsible for its privacy and confidentiality, and for disclosing any breach, theft, or unwanted access. Thus, organizations that use cloud platforms should thoroughly understand the provider’s security capabilities, the data protections (such as encryption, access control and audit, and so forth) that the provider offers, and the responsibility and liability it assumes for data and applications that run within its systems.

Best Security Practices for Cloud Access

For cloud-consuming organizations, that’s just the beginning. Best security practices also insist that organizations implement the following principles where access to cloud applications, data, configurations, and resource consumption is concerned:

  1. Apply the Principle of Least Privilege (PLP): all access should be set to “deny” by default, with authorized parties granted only as much access as they need to use an application (ordinary users) or to administer the organization’s cloud environments and settings. All admin-level access should be logged and routinely audited, especially use of privilege, account management, and configuration and set-up of applications and data stores. (A minimal deny-by-default sketch appears after this list.)
  2. Use strong authentication (2FA or better): Ideally, all access to cloud-based applications and data should require clearing demanding hurdles before access requests are granted. At a minimum, ordinary users should be required to use two-factor authentication (2FA: cellphone or email confirmation of one-time passcodes). Higher-level access should use multi-factor authentication that includes something beyond 2FA, such as a certificate, smart token device, biometric data (fingerprint, facial scan, and so on), or a tie to a specific admin workstation’s MAC address. (A TOTP verification sketch appears after this list.)
  3. Encryption for data in motion and at rest: By default, organizations should turn on and use the strongest encryption they can employ without unduly affecting data access and/or application performance. Data should also be encrypted wherever it’s stored: at endpoints when used on the client side, and in data stores when in use by an application or truly at rest (active or multi-tiered storage repositories). Even if such data is leaked, it shouldn’t be accessible to anyone who lacks its decryption keys. (An encryption-at-rest sketch appears after this list.)
  4. Proactive use of threat intelligence and attack surface management: Even though the provider should take security responsibility for patching and updating its platforms and services, those who consume such things must remain aware when exposures or vulnerabilities become known. Because the application-providing organization remains responsible for data and application security, even when running in the cloud, it must know when exposure is possible and make sure to manage associated risks properly. This should include constant security and vulnerability monitoring, plus code monitoring for open source and proprietary APIs that the cloud provider offers for application use (a dependency-check sketch appears after this list). In fact, organizations are well-advised to plan regular white hat or red team exercises, and to make sure that penetration testing includes checks on cloud security as well as code under the organization’s direct and more immediate control. Remediation and workaround responses to vulnerabilities and exposures should therefore also include cloud-focused fixes (code updates, workarounds, security policy changes, privilege management, and so forth).
  5. Proactive use of monitoring and anomaly detection: Here again, the provider should take responsibility for ensuring that its authentication, authorization, and access control mechanisms are safe and secure. And again, the application-providing organization is also responsible for safeguarding data and applications from unauthorized access, use, disclosure, and loss. Thus, organizations should not just periodically audit logs (and track administrative access). Ideally, they should use AI/ML-based tools to establish a security baseline for typical usage and access behaviors, then use sophisticated anomaly detection tools to respond quickly and decisively to evidence of attack or unauthorized access (a simple baseline-and-detect sketch appears after this list). For example, wholesale encryption of files not usually encrypted in batches becomes a big red flag that a ransomware attack may be underway. Other red flags that anomaly detection can raise include creation of new and unexpected administrative accounts and unwanted delegation or extension of administrative privileges, which often indicate slow infiltration and takeover of systems by outside attackers. The same goes for sudden deletions or alterations of security logs and audit files themselves.
  6. Use of strong backup and recovery tools: The ultimate defense against compromise is a safe, secure, write-protected backup image, snapshot, or file-by-file copy that is inaccessible to outside access and attack. Should worst come to worst and data be lost or corrupted, this provides the final line of defense against complete and total loss. It also usually provides a starting point, along with transaction logs or other replay techniques, for bringing a disaster recovery or business continuity scenario into active use. Thus, it also provides an important, tamper-free baseline for security recovery (an integrity-check sketch appears after this list).
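To make the deny-by-default idea in item 1 concrete, here is a minimal sketch in Python. The policy structure, role names, and the is_allowed helper are hypothetical illustrations, not any particular cloud provider’s IAM API; a real deployment would express the same logic in the provider’s own policy language.

```python
# Minimal deny-by-default access check (hypothetical policy model).
# Every request is denied unless an explicit allow rule matches.

ALLOW_RULES = [
    # (role, action, resource) tuples that are explicitly permitted.
    ("app_user", "read", "orders"),
    ("app_user", "write", "orders"),
    ("cloud_admin", "configure", "datastore"),
]

def is_allowed(role: str, action: str, resource: str) -> bool:
    """Return True only if an explicit allow rule matches; deny otherwise."""
    return (role, action, resource) in ALLOW_RULES

def handle_request(role: str, action: str, resource: str) -> None:
    if not is_allowed(role, action, resource):
        # Default posture: deny, and log the attempt for later audit.
        print(f"DENY  {role} -> {action}:{resource}")
        return
    print(f"ALLOW {role} -> {action}:{resource}")

handle_request("app_user", "read", "orders")     # ALLOW
handle_request("app_user", "delete", "orders")   # DENY (no explicit rule)
```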
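For item 2, the sketch below shows server-side verification of a time-based one-time passcode (TOTP), the mechanism behind most authenticator apps. It assumes the third-party pyotp package (pip install pyotp); the account names and enrollment flow shown are hypothetical.

```python
# TOTP (2FA) verification sketch; assumes the pyotp package is installed.
import pyotp

# In practice the secret is generated once at enrollment and stored
# server-side per user; here it is generated fresh for the demo.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# At enrollment, the user scans a QR code built from this URI.
print(totp.provisioning_uri(name="alice@example.com", issuer_name="ExampleApp"))

# At login, verify the 6-digit code the user submits. valid_window=1
# tolerates one 30-second step of clock drift between client and server.
user_code = totp.now()  # stand-in for the code the user types in
if totp.verify(user_code, valid_window=1):
    print("2FA passed")
else:
    print("2FA failed")
```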
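Item 3’s encryption-at-rest advice can be as simple as encrypting data before it ever reaches a cloud store. The sketch below uses the widely used cryptography package’s Fernet construction (AES-based authenticated encryption); in a real system the key belongs in a key management service, never alongside the data.

```python
# Symmetric encryption-at-rest sketch; assumes the cryptography package.
from cryptography.fernet import Fernet

# In production the key lives in a KMS or HSM, never next to the data.
key = Fernet.generate_key()
fernet = Fernet(key)

plaintext = b"customer record: subject to GDPR"
ciphertext = fernet.encrypt(plaintext)   # safe to write to cloud storage

# Without the key, the stored blob is useless to an attacker.
assert fernet.decrypt(ciphertext) == plaintext
print("round trip OK; stored form begins:", ciphertext[:20])
```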
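One lightweight way to act on item 4’s “remain aware of known vulnerabilities” advice is to query a public vulnerability database for each dependency. This sketch queries the OSV.dev API using the requests package; the package name and version are placeholders, and a real pipeline would iterate over the full dependency manifest.

```python
# Dependency vulnerability check against the public OSV.dev database.
# Assumes the requests package; package/version shown are placeholders.
import requests

def known_vulns(name: str, version: str, ecosystem: str = "PyPI") -> list:
    resp = requests.post(
        "https://api.osv.dev/v1/query",
        json={"package": {"name": name, "ecosystem": ecosystem},
              "version": version},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("vulns", [])

for vuln in known_vulns("requests", "2.19.0"):
    print(vuln["id"], "-", vuln.get("summary", "no summary"))
```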
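Item 5’s baseline-then-detect approach reduces, at its simplest, to learning what “normal” looks like and flagging large deviations. The sketch below is a deliberately simple z-score detector over a per-hour file-encryption count, standing in for the AI/ML tooling the item describes; the sample numbers are invented.

```python
# Toy anomaly detector: flag hours whose activity deviates sharply
# from the learned baseline. Sample counts are invented for the demo.
import statistics

# Baseline: files encrypted per hour during normal operation.
baseline = [3, 5, 4, 2, 6, 3, 4, 5, 3, 4]
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def is_anomalous(count: int, threshold: float = 3.0) -> bool:
    """Flag counts more than `threshold` standard deviations above normal."""
    return (count - mean) / stdev > threshold

for hour, count in [("09:00", 4), ("10:00", 6), ("11:00", 950)]:
    status = "RED FLAG: possible ransomware" if is_anomalous(count) else "ok"
    print(f"{hour}  files encrypted: {count:4d}  -> {status}")
```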
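Finally, for item 6, a tamper-free backup is only trustworthy if its integrity can be proven at restore time. A minimal approach: record a SHA-256 digest when the backup is written, store the digest separately from the backup itself, then recompute and compare before any restore. File paths and contents here are illustrative placeholders.

```python
# Backup integrity check: hash at backup time, verify before restore.
# File paths and contents are illustrative placeholders.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

backup = Path("backup_snapshot.img")
backup.write_bytes(b"example snapshot contents")

# At backup time: record the digest somewhere the backup itself can't reach.
recorded = sha256_of(backup)

# At restore time: refuse to use a snapshot whose digest no longer matches.
if sha256_of(backup) == recorded:
    print("backup verified; safe to restore")
else:
    print("backup tampered with; do NOT restore")
```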

In fact, cloud security is becoming an important enough field that numerous training and certification programs and credentials make it their primary focus. In 2018, I wrote a story for HPE about this area called 5 surefire cloud security certifications that remains accurate and relevant in 2021. Beyond that, organizations such as SANS GIAC (which offers a dedicated cloud security track), the cloud platform providers themselves (including AWS, Microsoft Azure, and Google Cloud Platform), (ISC)² (home to the Certified Cloud Security Professional or CCSP, as well as the well-known Certified Information Systems Security Professional or CISSP), and others – even Cisco, with its CCNP Cloud – offer a wide variety of cloud security credentials. Companies that wish to play the cloud game seriously, and that develop cloud-based applications, would be well-advised to have one or more certified cloud professionals on staff on both the development and the operations sides of IT.

