Best Practice for Ultra-Secure Deployment on the Amazon Cloud



In this article I will introduce our in-house best practice for ultra-secure application deployment on the AWS cloud. It is based on our experience delivering dozens of infrastructure projects on the Amazon Web Services platform.

The AWS infrastructure features and APIs, the granularity of its services, and its on-demand resources enable us, as a system integrator, to deploy quickly, maintain control and improve fast. For example, we can run a quick POC for our customers in a matter of days without large up-front investments, and with AWS’s multiple regions we can create a robust, highly available service, as you can see in the scheme below.

AllCloud is an AWS solutions & consulting vendor serving ~100 AWS customers. We build on Amazon Web Services (AWS) in order to take advantage of the leading public cloud platform, which provides the highest levels of security.

In order to provide end-to-end security and end-to-end privacy, AWS builds services in accordance with security best practices, provides appropriate security features in those services, and documents how to use those features. In addition, AWS customers must use those features and best practices to architect an appropriately secure application environment. Enabling customers to ensure the confidentiality, integrity, and availability of their data is of the utmost importance to AWS, as is maintaining trust and confidence.

Check out the AWS Security and Compliance Center to learn more.

What Is a Secure Cloud?

We have been in the market for the last 15 years. For the first 10 years, we helped our customers develop and deploy applications in their on-premises data centers. During the last 5 years, we have focused on building secure cloud environments and migrating applications and data from traditional data centers to the public cloud. Based on this continuous experience, we have identified the following characteristics of a secure and resilient data center:

  • Isolated and controlled
  • Firewalled
  • Secure access including VPN and SSL
  • Audited
  • Intrusion detection & prevention
  • Ongoing configuration analysis
  • Data encryption
  • Antivirus
  • Frequent updates
  • User management
  • One-time passwords
  • One spot for monitoring
  • Centralized alerts and notifications
  • Regulatory compliance

Access Management & Traffic Control

Control the Data Flow (AWS VPC)

With Amazon VPC you can create an isolated and private environment where you can launch your own protected instances. Within the VPC, we can control which sources and destinations are allowed for both inbound and outbound traffic.
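
As a rough illustration, the sketch below shows how such an isolated environment can be created with the AWS SDK for Python (boto3); the CIDR range and the tag value are placeholders, not our actual design.

    import boto3

    ec2 = boto3.client("ec2")  # assumes AWS credentials are already configured

    # Create an isolated, private network; 10.0.0.0/16 is a placeholder range.
    vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
    vpc_id = vpc["Vpc"]["VpcId"]

    # Tag the VPC so it is easy to identify during audits and monitoring.
    ec2.create_tags(Resources=[vpc_id], Tags=[{"Key": "Name", "Value": "secure-app-vpc"}])
    print("Created VPC:", vpc_id)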

a. Network Access Control List (ACL)

In contrast to AWS security groups, VPC network ACLs also allow you to do “blacklisting” (rather than only “whitelisting”; see Security Groups below), meaning that you can explicitly configure which traffic sources are blocked, not only which ones are approved.
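
For example, a blocking rule can be added to a network ACL with a single API call. The sketch below uses boto3; the ACL ID and the blocked CIDR range are placeholders for illustration only.

    import boto3

    ec2 = boto3.client("ec2")

    # Deny all inbound traffic from a known-bad network (placeholder values).
    ec2.create_network_acl_entry(
        NetworkAclId="acl-0123456789abcdef0",  # placeholder network ACL ID
        RuleNumber=90,                         # evaluated before higher-numbered rules
        Protocol="-1",                         # all protocols
        RuleAction="deny",
        Egress=False,                          # inbound rule
        CidrBlock="198.51.100.0/24",           # placeholder blacklisted range
    )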

b. Routing

VPC enables you to manage routing: you decide how data is routed inside your AWS environment. This important feature, together with appliances that we install, provides a precise log of the traffic and controls network routing between isolated “VPC islands”.
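
A minimal boto3 sketch of this kind of routing control (the route table, gateway and subnet IDs are placeholders): only a subnet explicitly associated with a route table that points at the Internet gateway can reach the outside world.

    import boto3

    ec2 = boto3.client("ec2")

    ROUTE_TABLE_ID = "rtb-0123456789abcdef0"       # placeholder
    IGW_ID = "igw-0123456789abcdef0"               # placeholder
    PUBLIC_SUBNET_ID = "subnet-0123456789abcdef0"  # placeholder

    # Route Internet-bound traffic through the Internet gateway...
    ec2.create_route(
        RouteTableId=ROUTE_TABLE_ID,
        DestinationCidrBlock="0.0.0.0/0",
        GatewayId=IGW_ID,
    )

    # ...and associate that route table only with the subnet that should be public.
    ec2.associate_route_table(RouteTableId=ROUTE_TABLE_ID, SubnetId=PUBLIC_SUBNET_ID)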

c. Handle all in/out traffic

Users that connect to the environment have their own IDs, and their sessions’ start and end times can be tracked.

Access Control

  • Security groups – “AWS firewall” elements. As mentioned above, a security group enables whitelisting, i.e. allowing only the approved inbound and outbound traffic sources (see the sketch after this list).
  • Identity and Access Management (IAM)
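
As an illustration of the whitelisting approach, the boto3 sketch below opens only HTTPS to the world and SSH to an internal range; the security group ID and CIDR blocks are placeholders, not our production rules. Everything that is not explicitly allowed stays blocked.

    import boto3

    ec2 = boto3.client("ec2")

    ec2.authorize_security_group_ingress(
        GroupId="sg-0123456789abcdef0",  # placeholder security group ID
        IpPermissions=[
            # HTTPS from anywhere (the ELB terminates SSL in front of the instances).
            {"IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
             "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},
            # SSH only from the internal management network (placeholder range).
            {"IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
             "IpRanges": [{"CidrIp": "10.0.0.0/16"}]},
        ],
    )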

Another best practice provided by Amazon is the use of one-time passwords together with the IAM system to protect AWS console access. AWS users should also make sure they are familiar with the security guidelines and best practices that AWS provides. We recommend implementing a one-time-password mechanism (a password is enabled per session and changed once the user session ends) for your AWS cloud account’s users.

Amazon supports token-based Multi-Factor Authentication (AWS MFA) – requiring you to have both something you know (your username and password) and something you have (the physical token) in order to log in to your account. This practice can be implemented easily, and without any additional cost, using AWS MFA and IAM.
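
One way to back this up in IAM is a policy that denies actions whenever no MFA is present in the session. The sketch below is only one possible shape of such a policy, created with boto3; the policy name is hypothetical.

    import json
    import boto3

    iam = boto3.client("iam")

    # Deny everything except MFA self-management when the session has no MFA.
    require_mfa_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyAllIfNoMFA",
            "Effect": "Deny",
            "NotAction": [
                "iam:CreateVirtualMFADevice",
                "iam:EnableMFADevice",
                "iam:ListMFADevices",
                "sts:GetSessionToken",
            ],
            "Resource": "*",
            "Condition": {"BoolIfExists": {"aws:MultiFactorAuthPresent": "false"}},
        }],
    }

    iam.create_policy(
        PolicyName="RequireMFA",  # hypothetical policy name
        PolicyDocument=json.dumps(require_mfa_policy),
    )

The policy can then be attached to the group that holds the console users, so any session opened without MFA is denied until the user authenticates with the token.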

Data Protection and Anomalies Detection

Our architecture includes multiple subnets enabled by AWS VPC. The main purpose of these subnets is to enable control of the routing, making sure it is aligned with the planned network design, including the data flow between sources and targets.
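
A hedged boto3 sketch of carving the VPC into such subnets (the VPC ID, tier names and CIDR blocks are placeholders):

    import boto3

    ec2 = boto3.client("ec2")

    VPC_ID = "vpc-0123456789abcdef0"  # placeholder

    # Each tier gets its own subnet so routing and monitoring can be scoped per group.
    subnet_plan = {
        "public-elb": "10.0.1.0/24",
        "app-servers": "10.0.2.0/24",
        "databases": "10.0.3.0/24",
    }

    for name, cidr in subnet_plan.items():
        subnet = ec2.create_subnet(VpcId=VPC_ID, CidrBlock=cidr)
        subnet_id = subnet["Subnet"]["SubnetId"]
        ec2.create_tags(Resources=[subnet_id], Tags=[{"Key": "Name", "Value": name}])
        print(name, "->", subnet_id)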

With this architecture, we can ease monitoring and isolate a specific group of resources quickly and in real time. As a result, I can terminate only a minimal number of connections while actively rejecting/accepting connections or applying rate limiting (using nginx) to avoid overloading instances in real time. In the hybrid cloud model, this is crucial due to the constant inbound traffic volume.

On top of controlling the data flow, you must be able to know what’s inside the packets. If there is malware or some Trojan horse, I want to understand what it is sending out as well as what is incoming (SQL injection, for example). For this we deploy Snort – an open source network intrusion detection and prevention system (IDS/IPS). Snort supports signature-, protocol- and anomaly-based inspection. All data transmissions are logged, and history is kept in case you need to perform an analysis of stolen data.

Snort: Host-based IDS supports:

  • Detection of configuration changes
  • Tracking of running processes
  • Tracking of file access
  • Resource access
  • Detection of abnormal behavior

In-Flight Data Encryption

Incoming traffic goes through the AWS ELB (Elastic Load Balancer), which is used as an HTTPS terminator; it proxies the traffic to the backend IDS instances as HTTP. The Snort servers analyze the packets and then decide whether to accept or reject them. Outgoing data must be encrypted using SSL as well: the system uses Snort to examine the outbound packet and log the results; only then does SSL encryption take place and the data is sent.
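
A rough boto3 sketch of this topology, with a classic ELB terminating HTTPS on port 443 and forwarding plain HTTP to the backend instances; the load balancer name, subnet, security group and certificate ARN are placeholders.

    import boto3

    elb = boto3.client("elb")  # classic Elastic Load Balancing API

    elb.create_load_balancer(
        LoadBalancerName="secure-app-elb",  # hypothetical name
        Listeners=[{
            "Protocol": "HTTPS",         # SSL is terminated on the ELB...
            "LoadBalancerPort": 443,
            "InstanceProtocol": "HTTP",  # ...and proxied as HTTP to the IDS tier
            "InstancePort": 80,
            "SSLCertificateId": "arn:aws:iam::123456789012:server-certificate/example",  # placeholder
        }],
        Subnets=["subnet-0123456789abcdef0"],     # placeholder public subnet
        SecurityGroups=["sg-0123456789abcdef0"],  # placeholder security group
    )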

Encrypting Data at Rest

Once stored on the instances, whether as static files, databases or anything else, each piece of data must be protected and encrypted. We detect abnormal patterns by logging file changes. For example, configuration files are static and should not change; if a suspected illegal change is detected, our security monitoring tools alert in real time. The routine file integrity checks, including the real-time alerting, are provided by OSSEC.
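
OSSEC does this work for us; purely to illustrate the idea of file integrity checking (this is not OSSEC itself), a naive sketch could hash the watched configuration files and flag any change. The watched paths below are placeholders.

    import hashlib
    from pathlib import Path

    # Hypothetical list of static configuration files that should never change.
    WATCHED_FILES = ["/etc/nginx/nginx.conf", "/etc/snort/snort.conf"]

    def fingerprint(path):
        """Return the SHA-256 digest of a file's contents."""
        return hashlib.sha256(Path(path).read_bytes()).hexdigest()

    def detect_changes(baseline):
        """Compare current hashes against a previously recorded baseline."""
        for path, known_hash in baseline.items():
            if fingerprint(path) != known_hash:
                # In a real deployment this would raise a real-time alert.
                print("ALERT: unexpected change detected in", path)

    baseline = {path: fingerprint(path) for path in WATCHED_FILES}
    detect_changes(baseline)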

Dynamic antivirus protection is a must, including aggregation and routine analysis of its logs. All S3 storage objects and data on EBS disks must be encrypted. Learn more about Amazon S3 Server-Side Encryption for Data at Rest.
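
A hedged boto3 sketch of both measures: uploading an object with S3 server-side encryption and creating an encrypted EBS volume. The bucket name, object key, local file, Availability Zone and volume size are placeholders.

    import boto3

    s3 = boto3.client("s3")
    ec2 = boto3.client("ec2")

    # Ask S3 to encrypt the object at rest with server-side encryption (AES-256).
    s3.put_object(
        Bucket="my-secure-bucket",          # placeholder bucket name
        Key="backups/db-dump.sql.gz",       # placeholder object key
        Body=open("db-dump.sql.gz", "rb"),  # placeholder local file
        ServerSideEncryption="AES256",
    )

    # Create an encrypted EBS volume for data that lives on instances.
    ec2.create_volume(
        AvailabilityZone="us-east-1a",  # placeholder Availability Zone
        Size=100,                       # GiB, placeholder
        VolumeType="gp2",
        Encrypted=True,
    )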

Logging

  • VPN access logs
  • Traffic audit logs
  • Network IDS logs
  • Host IDS logs
  • Anti-virus logs

Security Lifecycle Management

Deployment is only the initial step. Once deployed, you must have a defined routine to support cycles of refinement. Utilizing tools such as Splunk and Newvem helps you stay on top of changes in your dynamic cloud environment, making sure that you maintain an ongoing discovery log and analysis.

Recently, we partnered with Newvem in order to add its usage analytics capabilities to the set of tools we use to serve our customers. Newvem watches our customers’ accounts and triggers alerts on changes and risks.

The analysis triggers actions, such as configuration setting enhancements. No less important is keeping up with AWS security publications and new feature releases, which can constantly complement your solution’s design.

Final Words

I’m always amazed to see how open source tools can help complement a great architecture like the one demonstrated above with the use of Snort. Note that encryption, detection tools and other security software packages can affect the application’s performance. The deployment of such tools must be tested and monitored on an ongoing basis as well.

It is very important to make it clear that the level of security I present here is the IT wrapper – it does not replace the need for a robust, scalable and secure application stack.

The essence (or better, the “beauty”) of our best-practice architecture is the ability to integrate the right tools into one design, using AWS cloud building blocks to achieve a robust, compliant and secure cloud deployment.

Lahav Savir

Founder and CTO, Cloud Platforms
