
What are the Greater Risks of Cloud Computing?

 

Countless articles, blogs and whitepapers have been written on the subject of security in the cloud, and an even greater number of opinions offered as to how many risks come with a move there.  Five, seven, ten, twenty-seven?  How many risks are associated with your or your company’s move to the cloud?  Well, in the best consultant-speak, it depends.

One could say that it depends on how far “up the stack” you’re moving.  If, for instance, you are moving from an essentially stand-alone, self-administered environment to a cloud-based presence, you are most likely in for the security-based shock of your life.  On the other hand, if you are, in the corporate sense, moving a large, multi-national corporation to the cloud, chances are you’ve already encountered many of the challenges, such as regional compliance and legal issues, that will also be present in your move to the cloud.

The differentiators?  There are three: scale, complexity and speed.  In the hundreds of clients we have helped migrate to the cloud, not once have we come across a security issue that was unique to the cloud.  This is why the title of this article is “What are the Greater Risks of Cloud Computing?” and not “What are the Unique Risks of Cloud Computing?”  There simply aren’t any.  Let’s be clear – this isn’t to say these risks aren’t real. They simply aren’t unique, nor are they new.  It is just a case of a new bandwagon (the cloud) with a new crew of sensationalists ready to jump on it.

Let’s take a few of the most popularly-stated “risks of cloud computing” and see how this plays out.

Shared Technology

This often makes the list as though it were a problem unique to the cloud.  What about companies utilizing colocation facilities?  And before that, what about companies using time-shared systems – can you say payroll systems?  Didn’t they pre-date the cloud by some decades?  While there might not have been hypervisors or shared applications back in the day, there just as surely could have been shared components at some level, possibly network components or monitoring.

Loss of Data/Data Breaches

In looking at some of the most widely touted data breaches – Target, Ashley Madison, the Office of Personnel Management and Anthem, to name just a few – the compromises were listed as “result of access to its network via an HVAC contractor monitoring store climate systems,” “unknown,” “contractor’s stolen credentials to plant a malware backdoor in the network,” and “possible watering hole attack that yielded a compromised administrator password.”  Your first thought might be, “Do these hacks even involve the cloud?”  It’s not clear where the data was stored in these instances, but that doesn’t stop articles about the dangers of the cloud from citing them.  Conversely, there is an excellent article in Business Insurance arguing the very opposite viewpoint.  Perhaps the cloud can be a bit safer than traditional environments for one very good reason – reputation. We have seen customers move to the cloud in order to modernize their security paradigm.  The end result is a more secure environment in the cloud than they ever had on premises.

Account or Service Traffic Hijacking

Now we have a security issue that really makes use of the cloud’s scale and speed.  Let’s clarify what we’re talking about here: the hacking of a cloud provider and the takeover of instances so they can be used for command and control as part of a botnet.  The hijacking of compute resources, whether they be personal computers, corporate systems or cloud resources, continues to this day.

Hacking a cloud provider follows the simple logic of robbing a bank vs. a taco stand in more ways than one.  Where there’s increased reward, there’s increased risk, to turn an old saying around a bit.  If you’re going to hit a lot of resources and make it worth your while, the cloud is the place to go.  However, know that it’s going to be a lot harder and that a lot more eyes are going to be on you and looking for you.  Interestingly, the most recent sightings of this type of activity seem to be from the 2009-’10 timeframe, as Amazon, Microsoft, Google and the other providers learned quickly from their mistakes.

If you were to continue down the list of other cloud security issues – malicious insiders, inadequate security controls, DDoS attacks, compromised credentials, and so on – it becomes pretty evident that there simply aren’t any that are unique to the cloud.  We’ve seen them all before in one context or another; they just haven’t appeared at the same scale or speed in the environments most of us have run.

The next time you see an article on the dangers of the cloud, stop for a moment and think, “Is this truly a problem that has never been seen before or just one that I’ve never encountered or had to deal with before?”

-Scott Turvey, Solutions Architect


SCOR Velogica Moves to AWS for Better Security, SOC2

While some large enterprises avoid moving to the cloud because of rigid security and compliance requirements, SCOR opted for the cloud for a key block of its business precisely because of the cloud’s rigid security and compliance offerings.

SCOR is a leader in the life reinsurance market in the Americas, offering broad capabilities in risk management, capital management and value-added services and solutions. A number of primary insurers use SCOR’s automated life underwriting system, Velogica, to market life insurance policies that can be delivered at the point of sale. Other companies use Velogica as a triage tool for their fully underwritten business.

“Through the Velogica system, we get thousands of life insurance applications a day from multiple clients,” explains Dave Dorans, Senior Vice President.  “Velogica is a significant part of our value proposition and is important to the future of our business.”

Data security has always been a priority for SCOR, but the issue became even more critical as data breaches at some of the largest and most respected companies made headline news. SCOR decided to invest in a state-of-the-art data security framework for Velogica.  “We wanted clients to have full confidence in the way Velogica stores and handles the sensitive personal data of individuals,” Dorans said.

SCOR’s goal was to have Velogica accredited as a Service Organization Control (SOC) 2 organization – a competitive advantage in the marketplace – by aligning with one of the more respected information security standards in the industry.  Determining what it would take to achieve that goal became the responsibility of Clarke Rodgers, Chief Information Security Officer with SCOR Velogica. “We quickly determined that SOC2 accreditation for SCOR’s traditional, on-premise data center environment would be a monumental task, could cost millions of dollars and perhaps take years to complete.  Moreover, while SOC2 made sense for Velogica, it wasn’t necessary for other SCOR businesses.”

Once it was determined that SOC2 was business critical for the company, Rodgers analyzed the different ways of obtaining the security and compliance measure and determined that moving to the cloud was the most efficient path. SCOR Velogica turned to 2nd Watch to help it achieve SOC2 accreditation on AWS, figuring it would be easier than making the journey on its own.

On working with 2nd Watch, Rodgers commented, “They came in and quickly understood our technical infrastructure and how to replicate it in AWS, which is a huge feat.” SCOR realized significant benefits from the migration, including:

Adherence to specific security needs: In addition to its SOC2 accreditation, 2nd Watch also implemented several security elements in the new AWS environment, including: encryption at rest in Amazon Elastic Block Store (EBS) volumes leveraging the AWS Key Management Service (KMS), Amazon Virtual Private Cloud (VPC) to establish a private network within AWS, security groups tuned for least-privilege access, Security-Enhanced Linux, and AWS Identity and Access Management (IAM) Multi-Factor Authentication (MFA).  (A minimal sketch of the EBS/KMS piece follows these items.)

AWS optimization: 2nd Watch has helped SCOR identify opportunities for optimization and efficiencies on AWS, which will help down the road if the company wishes to expand the AWS-hosted application to regions outside of North America.  “With our SOC2 Type 1 behind us, we are now focused on optimizing our resources in the AWS Cloud so we can fully exploit AWS’s capabilities to our security and business benefit,” Rodgers explains. “We will rely on 2nd Watch for guidance and assistance during this optimization phase.”

Cost savings on AWS: Rodgers hasn’t done a full analysis yet of cost savings from running the infrastructure on AWS, but he’s confident the migration will eventually cut up to 30% off the price of hosting and supporting Velogica internally.
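
To make the encryption-at-rest element concrete, below is a minimal boto3 sketch of creating an EBS volume encrypted with a customer-managed KMS key. It is an illustration under assumed values only – the region, availability zone, key ARN, size and tags are placeholders, not details of SCOR’s actual environment.

    # Illustrative sketch: encrypt an EBS volume at rest with a customer-managed KMS key.
    # Assumes boto3 is installed and AWS credentials are configured; all IDs are placeholders.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    volume = ec2.create_volume(
        AvailabilityZone="us-east-1a",
        Size=100,                      # GiB
        VolumeType="gp3",
        Encrypted=True,
        KmsKeyId="arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID",
        TagSpecifications=[{
            "ResourceType": "volume",
            "Tags": [{"Key": "app", "Value": "underwriting-example"}],
        }],
    )
    print("Created encrypted volume:", volume["VolumeId"])

Pairing per-volume encryption with a customer-managed key keeps key rotation and key access policy under the customer’s control, which is the kind of evidence an auditor typically asks for.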

Hear from SCOR how it achieved better security with AWS on our live webinar April 7. Register Now


Ahead of the Hackers: Free Tools and Tips for Testing the Security of Your Environment Against Attacks

There are several open source (aka free) tools that you can use to test the security of your applications and servers like a hacker would. One of the best is Kali Linux, a free toolkit that tests almost every layer of your environment (Application, Network, Host, Foundation).

About Kali Linux

Kali Linux was created by Offensive Security in an effort to achieve effective defensive security through an offensive mindset. Kali is supported not only by Offensive Security, but also by a very impressive community of people who contribute content and software to the project. Kali comes preinstalled with over 600 penetration testing scripts and programs (http://tools.kali.org/tools-listing). Formerly known as BackTrack, it’s been used by security professionals and hackers alike for years. This is one of the best tools you can use to test your security.

Kali has just recently released version 2.0 of its open source penetration testing kit. It can be downloaded here.

Steps for testing your security with Kali Linux

Step 1: First, do some information gathering on your servers (a minimal sketch of this step follows the list):

  • Run a Python script called theHarvester to query Google, Bing, LinkedIn, and PGP key servers to find information related to your domain. It will include email addresses, IP addresses, and server configurations.
  • OS fingerprinting will give you the versions of operating systems you may be running, which will allow you to look up any outstanding vulnerabilities.
  • Run fragroute, which has a simple rule set language to delay, duplicate, and fragment traffic so you can test any intrusion detection that you might have in place.
  • Finally, run Nmap, which will simply scan your IP address to find what TCP/UDP ports are open. You want to make sure that the only ports open are what you need to conduct business—nothing more and nothing less.
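
Below is a minimal sketch of automating that first information-gathering pass from a Kali box, assuming the theharvester and nmap binaries are on the PATH (exact command names and flags vary slightly between Kali releases, so treat the options as indicative). Only run it against domains and hosts you are authorized to test.

    # Sketch: wrap two of the Step 1 tools from Python. Binary names/flags are
    # typical but version-dependent; the domain and IP below are placeholders.
    import subprocess

    DOMAIN = "example.com"      # your own domain
    TARGET = "203.0.113.10"     # your own public IP

    def run(cmd):
        print("+", " ".join(cmd))
        return subprocess.run(cmd, capture_output=True, text=True).stdout

    # theHarvester: e-mail addresses, hosts and IPs tied to the domain
    print(run(["theharvester", "-d", DOMAIN, "-b", "google", "-l", "100"]))

    # Nmap: which TCP ports answer, and what service versions they expose
    print(run(["nmap", "-sV", "--top-ports", "100", TARGET]))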

Step 2: Nessus is a tool used by auditors and analysts to assess vulnerabilities in systems, networks, and applications. While this doesn’t replace the auditors who certify you for compliance, it does make you more secure by giving you a better understanding of the risks within your environment. It has configuration and vulnerability scanning capabilities, as well as malware detection and sensitive data searches. You can also utilize particular cloud services that will conduct the same scans and auditing in a way that is built for the cloud.

Step 3: WPScan is a great tool if you are utilizing WordPress in your infrastructure. WPScan looks for vulnerabilities that might have been introduced into your environment through vulnerable plugins and themes. The capabilities of this tool include brute-forcing your passwords, finding vulnerable themes and plugins, and enumerating user lists to focus a password dictionary brute force. This is a very efficient tool and is maintained by the community and the WPScan team.
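
As a quick illustration, here is a small wrapper for running WPScan against your own site and reviewing the enumeration output. The flags shown (--enumerate vp,vt,u for vulnerable plugins, vulnerable themes and users) reflect common WPScan usage but should be checked against the version shipped with your Kali install; the URL is a placeholder.

    # Sketch: run WPScan against a site you own and dump the findings for review.
    import subprocess

    SITE = "https://blog.example.com"   # placeholder: scan only sites you own

    proc = subprocess.run(
        ["wpscan", "--url", SITE, "--enumerate", "vp,vt,u"],
        capture_output=True, text=True,
    )
    print(proc.stdout)   # review: vulnerable plugins/themes, enumerable users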

Step 4: Automater is a script that will scan various blacklists to verify whether your IP addresses have ever been involved in any botnet activity—if the previous or current users of that IP address were compromised and used to attack others, they would appear on one of those lists. This will ensure your public IP address won’t be blocked when you launch your live site.  Automater checks IPvoid.com, Robtex.com, Fortiguard.com, unshorten.me, Urlvoid.com, Labs.alienvault.com, ThreatExpert, VxVault, and VirusTotal.

These are just a few of the tools that are offered in Kali Linux, but they will get you started down the right path by exploring the Kali distribution and testing your environment to see how secure you really are.

Learn more about 2W Managed Cloud Security and how our partnership with Alert Logic can ensure your environment’s security.

Article contributed by Alert Logic


If you missed the last article in our four-part blog series with our strategic partner, Alert Logic, check out the guide to help digital businesses prepare for—and respond to—cyber incidents here.


Bridging the gap between DevOps and Security

Security should be baked into the DevOps process, from tools to skills to collaboration. DevOps and security are not mutually exclusive.

The problem with digital innovation is that considerations for compliance come later, after the product or service is on the market. From public cloud infrastructure to Internet of Things to mobile apps and even to DevOps, tough requirements like security aren’t built into innovators’ plans. Entrepreneurs are thinking primarily about shiny, new, fast and disruptive. Yet for the CIO and other chief executives accountable to customers, laws and financial markets, managing risk around sensitive data is top priority.

DevOps processes are at the heart of business innovation: think Netflix, Facebook, Etsy and Nordstrom, all leaders in their sectors. Yet many of the popular DevOps tools and methodologies, whether commercial or open source, haven’t been optimized for the needs of enterprise security. An application running in a container, for instance, will still require attention around configuration to ensure application security.

As well, many security professionals haven’t yet made the leap to understanding the changing best practices for security in this new world of cloud/agile/mobile IT. Some security experts have imposed barriers to DevOps, by resisting the switch to faster, more iterative development along with the public cloud.

On the surface, the speed at which DevOps teams are approving and releasing code would suggest an increase in security risks to end users by eliminating rigorous security review phases. Yet managing security, as with testing, is in fact optimal when performed side-by-side with developers as code is being written. By integrating security, people, and processes tightly within the continuous delivery cycle, DevOps can do a better job of eliminating loopholes and gaps in the code before production. DevOps tools emphasize the use of frequent and automated processes to improve software quality: also an ideal model for handling security testing and fixes. Determining the best way to merge security with DevOps is a work in progress. The following concepts can provide a framework for getting started:

  1. Use the best of DevOps for security: DevOps, with its focus on automation and continuous integration, provides a more holistic framework for security management. Start by considering security through every step of the development and production cycle. Security professionals can help developers root out design problems in the beginning – such as ensuring all data transport is encrypted. Integrate automated security checks into the development, testing and deployment phases, and educate all team members about the importance of incorporating security thinking in their specific job roles. Security should no longer be the last process before committing the code to production.
  2. Investigate new DevOps and Cloud security tools: Fortunately, the security technology industry is ramping up quickly to the needs of DevOps security. Static Application Security Testing (SAST) tools check security as code is being written, while Dynamic Application Security Testing (DAST) tools probe a running application’s interfaces for risks. A few of the reputable systems include Checkmarx, Veracode and Parasoft. The third area of security automation tools covers penetration and vulnerability testing, with tools such as Nessus, developed by Tenable. Other contenders in this area include Qualys and OpenVAS. These tools can integrate smoothly into the software development lifecycle, such as by plugging into Jenkins (a sketch of such a pipeline gate follows this list). By adding automation, security is not only built in, but doesn’t slow down the DevOps process.
  3. Getting buy-in from security teams: This might just be the hardest part. While developers are incentivized to go faster and do more, security professionals are incentivized to control, monitor and reduce risk. Meeting in the middle is definitely possible – but it will require some opinion shifting on both sides. Developers and product managers will need to understand the importance of working collaboratively with the security team, and in an accountable way. Security people can benefit from a more comprehensive understanding of security in the cloud. This should include continuous education on the new tools and services available today to manage risk and to deliver even higher levels of security than in the past – from better reporting, to API-based security and easier encryption at rest.
  4. Manage tool sprawl: The concept of self-organization is an important one in DevOps, because it fosters a spirit of flexibility and rapid collaboration. Yet this same principle can also lead to environments with dozens or even hundreds of different tools in use to manage deployment, configuration, QA and orchestration. That creates risks for visibility and monitoring as well as for standardizing security controls and access. Engineering leads should help strike a balance between too much and too little governance when it comes to tools and workflows by providing guidelines for tool selection. The DevOps automation infrastructure itself can introduce risks. If a hacker gains access to a tool like Puppet or Chef, he can modify any number of configurations and add new user accounts. Configuration and change management tools must be adequately secured and governed, lest they become a new attack surface.
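
As a sketch of what such a pipeline gate might look like, the snippet below runs Bandit – an open source Python SAST tool, used here as a stand-in for whichever SAST/DAST product you adopt – and fails the build if it reports any high-severity finding. It assumes Bandit is installed (pip install bandit); the JSON field names reflect Bandit’s current output format but should be verified against your version.

    # Sketch: CI security gate. Runs a SAST scan and fails the build on
    # high-severity findings so insecure code never reaches deployment.
    # Field names assume Bandit's JSON output; verify for your version.
    import json
    import subprocess
    import sys

    proc = subprocess.run(
        ["bandit", "-r", "src/", "-f", "json"],
        capture_output=True, text=True,
    )
    report = json.loads(proc.stdout or "{}")

    high = [r for r in report.get("results", [])
            if r.get("issue_severity") == "HIGH"]

    for issue in high:
        print(f"{issue['filename']}:{issue['line_number']}  {issue['issue_text']}")

    sys.exit(1 if high else 0)   # non-zero exit fails the pipeline stage

Jenkins (or any CI server) treats the non-zero exit code as a failed stage, which is what turns a scan into an enforced control rather than an optional report.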

With the advent of DevOps, there’s an opportunity at last for security to become an integral and seamless aspect of innovation. We think it’s not only possible but critical to give security the attention it demands in the world of fast IT.

-Kris Bliesner, CTO

This article was first published on DevOps.com on 12/3/15.


5 Tips for Getting Started with Cloud Security

Implementing security in a cloud environment may seem like a difficult task, and that perception slows down, or even prevents, some organizations from migrating to the cloud.  Some cloud security models have similarities to traditional data center or on-premises security; however, there are also opportunities to implement new security measures as well as tweak your existing security plan. Here are five tips for getting started with cloud security.

  1. Secure your application code
    Knowing and understanding account usage and the types of coding languages, inputs, outputs, and resource requests is essential.
  2. Implement a solid patch management and configuration management strategy
    These strategies are usually more people- and process-driven, but they are important components of the care and feeding of the technology solution.  Organizations should take inventory of all the data they are maintaining and understand what type of data it is, where it is being stored, what accounts have access to it, and how it is being secured.
  3. Dedicate time and resources to the design and maintenance of identity and access management solutions
    Attackers continue to use brute force attacks against accounts to crack passwords and gain authenticated privileges in your environment.  Accounts should follow the least-privilege concept, and account activity should be logged.  A robust logging and log review system should be a standard implementation for all systems, accounts, and configuration modifications to ensure accountability of legitimate activity.  (A minimal sketch related to this tip follows the list.)
  4. Understand the shared responsibility of security
    Generally, cloud providers will have security implemented throughout their core infrastructure, which is primarily designed to safeguard their systems and the basic foundational services for each of their customers.  Cloud providers will maintain and secure their infrastructure; however, they won’t necessarily provide customers reports or notifications from this layer unless it impacts a significant number of customers.  Therefore, it is highly recommended that you implement a customized security plan within your own cloud environment.

    From the moment a cloud provider drops a network packet onto your systems, you should employ security monitoring and network threat detection.  The customer’s responsibility for security increases when moving from the network level to the host level and further to the application level.  Once you have access to your operating system, you are given root/administrator access and, therefore, that system is yours to secure and manage.

    At this point, the customer is responsible for the security of the applications and the application code that is used on the host systems. Cloud customers need to pay particular attention to the application code that is used in their environment since web application attacks are the most prevalent type of attacks used by adversaries.

  5. Stay informed about the latest threats and vulnerabilities
    Organizations should also stay informed about the latest threats and vulnerabilities to their cloud systems.  Adversaries, hacking groups and security researchers are constantly working to discover new vulnerabilities within systems, and keeping up with these threats is imperative.  Organizations that have dedicated resources to monitoring and responding to the latest threat activity are able to anticipate cyber activity and minimize the impact of an attack.
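
The sketch below illustrates tip 3 with boto3, assuming read-only IAM permissions: it lists IAM users that have no MFA device attached, since those accounts are the easiest targets for the brute force credential attacks described above. It is a starting point, not a complete access review.

    # Sketch: flag IAM users without an MFA device (assumes boto3 and
    # credentials with iam:ListUsers / iam:ListMFADevices permissions).
    import boto3

    iam = boto3.client("iam")

    for page in iam.get_paginator("list_users").paginate():
        for user in page["Users"]:
            mfa = iam.list_mfa_devices(UserName=user["UserName"])["MFADevices"]
            if not mfa:
                print(f"{user['UserName']}: no MFA device attached -- review")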

Implementing effective security within a cloud environment may seem to be a challenging task; however, a strategic plan and the proper integration of people, process, and technology enable organizations to overcome this challenge.

Learn more about 2W Managed Cloud Security and how our partnership with Alert Logic can ensure your environment’s security.

 

Blog contributed by Alert Logic


Understanding the AWS Security Model and Services

Protecting and monitoring networks, applications and data is simple if you know and use the right tools

Security is a stifling fear for organizations considering public clouds, one frequently stoked by IT vendors with vested interests in selling enterprise IT hardware and software, who use security as a catalyst for overall FUD about cloud services. The fears and misconceptions about cloud security are rooted in unfamiliarity and conjecture. A survey of IT pros with actual cloud experience found the level of security incidents in the cloud quite similar to that seen on-premise. When asked to compare public cloud versus on-premise security, the difference between those saying the risks are significantly lower and those saying they are significantly higher is a mere one percent. Cloud infrastructure is probably more secure than typical enterprise data centers, but cloud users can easily create application vulnerabilities if they don’t understand the available security services and adapt existing processes to the cloud environment.

[Figure: survey results comparing security incidents in internal environments vs. the public cloud]

Whatever the cause, the data shows that cloud security remains an issue with IT executives. For example, a survey of security professionals found that almost half are very concerned about public cloud security, while a 2014 KPMG survey of global business executives found that security and data privacy are the most important capabilities when evaluating a cloud service and that the most significant cloud implementation challenges center on the risks of data loss, privacy intrusions and intellectual property theft.

[Figure: KPMG 2014 cloud survey – capabilities evaluated when selecting a cloud service]

 

[Figure: KPMG 2014 cloud survey – cloud adoption challenges]

Unfortunately, such surveys are fraught with problems, since they ask for a subjective, comparative evaluation of two very different security models: one (on-premise) that IT pros have years of experience implementing, managing and refining, and the other (public cloud) that is relatively new to enterprise IT, particularly as a production platform, and thus often not well implemented. The ‘problem’ with public cloud security isn’t that it’s worse; it’s arguably better. Rather, the problem is that cloud security is different. Public cloud services necessarily use an unfamiliar and more granular security design that accommodates multi-tenant services with many users, from various organizations, mixing and matching services tailored to each one’s specific needs.

AWS Security Model

AWS designs cloud security using a shared security model that divides security responsibilities, processes and technical implementation between the service provider, i.e. AWS, and the customer, namely enterprise IT. In the cloud, IT relinquishes control over low-level infrastructure – data center networks, compute, storage and database implementation, and infrastructure management – to the cloud provider. The customer, i.e. enterprise IT, retains control over the abstracted services provided by AWS: the operating systems, virtual networks, storage containers (object buckets, block stores), applications, data and transactions built upon those services, along with user and administrator access to those services.

[Figure: AWS shared security model]

The first step to cloud security is mentally relinquishing control: internalizing the fact that AWS (or your IaaS of choice) owns the low-level infrastructure and is responsible for securing it – and, given its scale and resources, is most likely doing a better job of that than most enterprise IT organizations. Next, AWS users must understand the various security control points they do have. AWS breaks these down into five categories:

  • Network security: virtual firewalls, network link encryption and VPNs used to build a virtual private cloud (VPC) (a minimal sketch follows this list).
  • Inventory and configuration: comprehensive view of AWS resources under use, a catalog of standard configuration templates and machine images (AMIs) and tools for workload deployment and decommissioning.
  • Data encryption: security for stored objects and databases and associated encryption key management.
  • Access control: user identity management (IAM), groups and policies for service access, and authentication options including multi-factor authentication using one-time passwords.
  • Monitoring and logging: tools like CloudWatch and CloudTrail for tracking service access and use, with ability to aggregate data from all available services into a single pool that feeds comprehensive usage reports, facilitates post-incident forensic analysis and provides real-time application performance alerts (SNS).
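
As a small, hypothetical example of the network security control point, the boto3 sketch below creates a security group that only admits HTTPS from a known corporate address range rather than from 0.0.0.0/0. The VPC ID and CIDR are placeholders.

    # Sketch: least-privilege security group (virtual firewall) in a VPC.
    # All identifiers and address ranges below are placeholders.
    import boto3

    ec2 = boto3.client("ec2")

    sg = ec2.create_security_group(
        GroupName="web-tier-restricted",
        Description="HTTPS only, from the corporate network range",
        VpcId="vpc-0123456789abcdef0",
    )
    ec2.authorize_security_group_ingress(
        GroupId=sg["GroupId"],
        IpPermissions=[{
            "IpProtocol": "tcp",
            "FromPort": 443,
            "ToPort": 443,
            "IpRanges": [{"CidrIp": "198.51.100.0/24",
                          "Description": "corporate egress range"}],
        }],
    )
    print("Created security group:", sg["GroupId"])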

Using CloudTrail Activity Logs

Organizations should apply existing IT security policies in each area by focusing first on the objectives – the policy goals and requirements – then mapping these to the available AWS services to create control points in the cloud. For example, comprehensive records of user access and service usage are critical to ensuring policy adherence, identifying security gaps and performing post hoc incident analysis. CloudTrail fills this need, acting as something of a stenographer that records all AWS API calls, for every major service, whether accessed programmatically or via the CLI, along with use of the management console. CloudTrail records are written in JSON format to facilitate extraction, filtering and post-processing, including with third-party log analysis tools like Alert Logic, Loggly and Splunk.

CloudTrail so thoroughly monitors AWS usage that it not only logs changes to other services, but to itself. It records access to logs themselves and can trigger alerts when logs are created or don’t follow established configuration guidelines. For security pros, CloudTrail data is invaluable when used to build reports about abnormal user or application behavior and to detail activity around the time of a particular suspicious event.
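
For a sense of how that forensic review looks in practice, here is a minimal sketch that reads one CloudTrail log file already downloaded from its S3 delivery bucket and pulls out console logins and root-account activity. The file path is a placeholder; CloudTrail delivers gzipped JSON files with a top-level “Records” array.

    # Sketch: scan a delivered CloudTrail log file for console logins and
    # root-account activity. The path is a placeholder.
    import gzip
    import json

    LOG_FILE = "cloudtrail-example.json.gz"

    with gzip.open(LOG_FILE, "rt") as fh:
        records = json.load(fh)["Records"]

    for event in records:
        name = event.get("eventName")
        identity = event.get("userIdentity", {})
        if name == "ConsoleLogin" or identity.get("type") == "Root":
            print(event.get("eventTime"), name,
                  identity.get("arn", "unknown"), event.get("sourceIPAddress"))

The same filtering can be pointed at an aggregated log bucket, or the raw records can be fed into the third-party analysis tools mentioned above.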

The key to AWS security is understanding the division of responsibilities, the cloud control points and available tools. Mastering these can allow cloud-savvy organizations to build security processes that exceed those in many on-site data centers.

-2nd Watch Blog by Kurt Marko
