
Crowdstrike EDR Testing in the ATC

Crowdstrike has been one of the hottest security companies of the last 18 months. We can validate that statement with the testing activity we have done in the Advanced Technology Center (ATC) around Crowdstrike as an Endpoint Detection and Response (EDR) solution our customers have asked us to evaluate with them.


Summary

Several of our Global Financial customers have asked us to evaluate Crowdstrike against several other EDR solutions on the market today. This insight outlines how WWT's ATC Lab Services team tests EDR solutions with customers based on the use cases that matter to each customer. Crowdstrike was tested for three of our Global Financial customers through Proof-of-Concept (POC) engagements in the ATC.

ATC Insight

Several of our customers at WWT have asked us to test Endpoint Detection and Response (EDR) solutions with them in the Advanced Technology Center (ATC). EDR is one of the hottest technology spaces in security today, and there are many players on the market for customers to evaluate. Why do customers rely on WWT, the ATC, and the ATC Lab Services team for their security EDR testing efforts? One of the biggest reasons this type of testing happens in the ATC is that World Wide Technology invested in a Malware Lab all the way back in 2018. Since then, the Malware Lab environment has matured and offers a very appealing set of testing tools to help customers evaluate the security solutions they need to implement into production.

ATC Lab Services Malware Lab

The Malware Lab at WWT is a service that we provide to our customers within the Advanced Technology Center (ATC). The lab is a controlled environment with secure access for our customers, partners, and WWT engineering teams. We are able to evaluate EDR solutions from any vendor using an agent-based tooling approach that is agnostic to any security vendor on the market today. We can evaluate how well endpoint devices detect and block malicious attacks as well as variants of scripted attacks. The technology behind this type of testing is called Breach and Attack Simulation (BAS).

A methodical approach based on phases

In our Malware Lab in the ATC, we take a very methodical, phased approach to security testing. We normally focus first on Scripted and CLI Variant testing, which utilizes pre-made scripts to execute against the endpoints. This type of testing allows us to understand how these scripts attempt to make various changes to the endpoints when malware tries to infect machines or alter machine parameters to enable exploits.

Mandiant Protected Theater

We also utilize a product called Protected Theater from our partner Mandiant. This tool allows us to safely perform potentially dangerous and destructive tests with real, live malware against our customers' endpoint defenses to truly determine which threats their endpoint controls will and will not block.

The Score Card

Once we start to get measurements and results back on how an EDR solution responds to our Scripted, CLI Variant, and Destructive Live Malware tests, we record this information in a scorecard system. This allows us to measure the action taken against bad-acting events, and the scorecard process depicts this information. We even have an algorithm that applies a "weighted system" to the scoring.
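The weighted scoring idea can be sketched in a few lines. The categories, weights, and counts below are purely illustrative sample data, and `weighted_score` is a hypothetical helper, not the actual ATC algorithm.

```python
# Illustrative sketch of a weighted scorecard, not the actual ATC algorithm.
# Categories, weights, and (blocked, attempted) counts are hypothetical.

def weighted_score(results, weights):
    """Score 0-100: each category's block rate, weighted by assigned severity."""
    total_weight = sum(weights[cat] for cat in results)
    earned = sum(
        weights[cat] * (blocked / attempted)
        for cat, (blocked, attempted) in results.items()
    )
    return round(100 * earned / total_weight, 1)

# Hypothetical sample data: (blocked, attempted) per test category.
sample_results = {
    "ransomware": (540, 560),
    "trojans": (190, 200),
    "rootkits": (75, 75),
}
# Heavier weights make misses in critical categories cost more.
sample_weights = {"ransomware": 3, "trojans": 2, "rootkits": 1}

score = weighted_score(sample_results, sample_weights)
```

A weighting like this lets a near-perfect rootkit result count for less than a slightly weaker ransomware result when the customer considers ransomware the bigger risk.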

Crowdstrike Falcon EDR Focus Features

Crowdstrike has a number of features that this ATC Insight focuses on. Standout features of Crowdstrike were:

  • the ease of deployment
  • the support and recommendations for testing
  • the ability to meet security requirements
  • the endpoint policy configuration
  • the usability of Falcon cloud tenant threat detection
  • the data exportation from the solution
  • the overall options available for Crowdstrike's Endpoint Protection Platform

Each Financial Institution had different needs and wants for their EDR Solution and Crowdstrike was able to deliver with the features discussed. 

Similarities in Customer Testing

There were a number of similarities between each Financial Institution's needs and wants. WWT's ATC Lab Services performed an Endpoint Detection and Response (EDR) Proof-of-Concept (POC) evaluation for each customer to assess the capabilities and performance of the Crowdstrike Falcon EDR solution.  Again, the testing occurred in WWT's Malware Lab, located in St. Louis, MO, USA.  

WWT utilized a seven-step system for testing the Crowdstrike EDR solution for each customer: the Lockheed Martin Cyber Kill Chain. The Advanced Persistent Threat (APT) is the focus of this technique. WWT only used part of this chain for some of the customers' use cases. The Cyber Kill Chain steps are as follows:

  • Reconnaissance: harvesting email addresses, conference information, etc.
  • Weaponization: coupling an exploit with a backdoor into the deliverable payload.
  • Delivery: delivering the weaponized bundle to the victim.
  • Exploitation: using a vulnerability to execute code on a victim's system.
  • Installation: the malware is installed on the victim's machine or asset.
  • Command and Control (C2): the attacker establishes a command channel to remotely manipulate the victim.
  • Actions on Objectives: the attackers accomplish their original goal and have access to the network, assets, etc.
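As a minimal sketch, the kill chain can be encoded as an ordered taxonomy for tagging and sorting test cases, which is useful when only part of the chain applies to a given use case. The test names below are hypothetical.

```python
# Minimal sketch: tagging test cases with Cyber Kill Chain phases.
# Test names are hypothetical; the phase list follows Lockheed Martin's model.

KILL_CHAIN = [
    "Reconnaissance",
    "Weaponization",
    "Delivery",
    "Exploitation",
    "Installation",
    "Command and Control",
    "Actions on Objectives",
]

def phase_index(phase):
    """Return the phase's 0-based position in the chain, for ordering results."""
    return KILL_CHAIN.index(phase)

# Hypothetical test cases tagged by phase (note: not every phase is used).
test_cases = [
    {"name": "email-harvest-sim", "phase": "Reconnaissance"},
    {"name": "c2-beacon-sim", "phase": "Command and Control"},
    {"name": "payload-drop", "phase": "Delivery"},
]

# Report results in kill-chain order rather than execution order.
ordered = sorted(test_cases, key=lambda tc: phase_index(tc["phase"]))
```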

WWT utilized Mandiant's Protected Theater and Host Command Line Interface (Host CLI) for testing across all three Customers. Protected Theater was used for the Windows systems under test; Host CLI was used for the Linux and macOS systems under test.

Differences in Customer Testing 

There were a number of differences between each Financial Institution's desires in their testing efforts. These are defined by the work detailed in the phases below. As you can see, customer needs for lab testing can be very different. The ATC Lab Services team is able to customize testing in the Malware Testing Lab to fit each customer's needs. Advanced Persistent Threat (APT) groups, as classified by FireEye, are threat actors that receive direction and support from an established nation-state. FireEye also notes that APT groups try to steal data, disrupt operations, or destroy infrastructure; they adapt to cyber defenses and frequently retarget the same victim. Both Customers 1 and 3 utilized APT malware signatures in their respective testing.

Below is a breakout of how each customer test lab was adapted to specific use cases important to each customer.

Customer 1: Three Phases of Testing 

Customer 1 was looking to replace their current EDR solution.

Customer 1 Phase 1: WWT subjected the Crowdstrike security solution to a battery of seven tests to assess blocking efficacy across multiple malware categories.

Customer 1 Phase 2: WWT subjected the Crowdstrike security solution to a battery of three tests, each based on a scripted attack developed and documented by Customer 1's personnel. 

Customer 1 Phase 3: Separate testing in the Malware Lab performed by Customer 1

Customer 2: Two Phases of Testing

Customer 2 was unique compared to the other Customers because they performed AntiVirus (AV) Testing on top of the EDR testing.

Customer 2 Phase 1: Block and detection efficacy using endpoint security validation tool (Mandiant Protected Theater)

Customer 2 Phase 2: Customer 2 chose to utilize a self-evaluation with their own tailored tools and techniques

Customer 3: Three Phases of Testing

Customer 3 Phase 1 Testing: Automated Efficacy (using Cyber Kill Chain taxonomy)

Customer 3 Phase 2 Testing: Automated Efficacy Testing (using APT-Focused taxonomy)

Customer 3 Phase 3 Testing: Customer-led Manual Testing within the Malware Lab in the ATC 

 

*Expanded detail on each Phase of Testing can be found within the Test Plan/Test Case section.

Final Impressions and Summary

Crowdstrike performed very well in each of the tests administered. 

Support for Platforms

Testing found that Crowdstrike Falcon EDR supported a wide variety of system platforms required for testing. These included Windows, macOS, and Linux with the added benefit of supporting older Linux Distributions. 

Ease of Use

Testing of Crowdstrike Falcon EDR encountered no compatibility issues across all WWT lab test implementations, which made deployment and configuration straightforward.

Endpoint Policy Configuration

A common Falcon cloud tenant for EDR capabilities as well as endpoint policy configuration contributed to the observed ease of operationalizing the tool in the lab. Falcon EDR's performance was due to multiple features.

Features
  • Ease of deployment, which is important for Customers implementing Falcon EDR in a production environment.
  • Crowdstrike Falcon EDR made it simple for each Customer to meet their security policy requirements.
  • The EDR cloud tenant UI is easy to navigate and understand.
  • Support: throughout testing, Crowdstrike provided help when required, though WWT encountered no technical errors.
  • Policy configuration was set up based on Falcon EDR best practices, and little to no adjustment was needed. However, in certain cases, WWT testers believed the security policy was overly permissive.
  • The Endpoint Protection Platform prevented most of the attacks it faced.
  • Falcon EDR provided a method for exporting and collecting cloud tenant threat detection and alert events for assessment; this data was collected within the Falcon cloud tenant.
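As an illustration of that export-and-assess workflow, the sketch below post-processes a small, hypothetical JSON export of detection events. The field names are assumptions for illustration, not CrowdStrike's actual export schema.

```python
import json
from collections import Counter

# Sketch of post-processing exported detection events for assessment.
# The event schema here is hypothetical, not CrowdStrike's actual export format.
raw_export = json.dumps([
    {"host": "win10-01", "tactic": "Execution", "action_taken": "blocked"},
    {"host": "win10-01", "tactic": "Persistence", "action_taken": "detected"},
    {"host": "rhel7-02", "tactic": "Execution", "action_taken": "blocked"},
])

def summarize(export_text):
    """Count exported detection events by the action the EDR took."""
    events = json.loads(export_text)
    return Counter(e["action_taken"] for e in events)

summary = summarize(raw_export)
```

Tallies like this feed directly into the scorecard: blocked, detected-only, and missed events are scored differently.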

Overall, Crowdstrike Falcon is an excellent EDR solution. WWT looks forward to building and developing the relationship in the future. 


Test Plan

Each of the three Financial Institutions testing Crowdstrike EDR tools had different methods of testing. 

Customer 1 

Two Main Phases of Testing by WWT with a third performed by Customer 1 

Customer 1 Phase 1

WWT subjected each security solution to a battery of seven (7) tests to assess efficacy in blocking:

  • Signature-based malware: Daily list of identified malware samples dropped on disk, without execution (Win/Lin/Mac, Approx. 3000 total samples).
  • Behavior-based ransomware: Collection of pre-compiled ransomware behavior-based simulations executed using different methods. (Win only, Approx. 560 scenarios).
  • Behavior-based trojans: Collection of pre-compiled trojan behavior-based simulations executed using different methods (Win only, Approx. 200 scenarios).
  • Rootkits: Collection of pre-compiled rootkit execution behavior-based simulation (Win only, Approx. 75 scenarios)
  • Behavior-based worms: Collection of pre-compiled worm behavior-based simulations executed using different methods (Win only, Approx. 40 scenarios).
  • DLL side loading: Collection of pre-compiled DLL side loading execution behavior-based simulations (Win only, Approx. 20 scenarios).
  • Customer-selected APT attacks: Collection of up to thirty (30) APT groups and malware templates, as selected by Customer 1 from a list of available options (Win only, 30 scenarios).

Each security solution will receive the battery of seven (7) tests in two different configurations:

  • “Offline” Mode (Scenario 1): Endpoints will have no outbound Internet access, except for a set of whitelisted destinations that allow for connectivity to WWT cloud-based resources used for conducting the assessment.
  • Online Mode (Scenario 2): Endpoints will have full Internet connectivity.
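One way to use the two scenarios is to compare blocking efficacy with and without Internet connectivity, since cloud-assisted detection can behave differently offline. The counts in this sketch are hypothetical sample data, not actual test results.

```python
# Sketch: comparing blocking efficacy between the two connectivity scenarios.
# The blocked/attempted counts are hypothetical sample data.

def block_rate(blocked, attempted):
    """Fraction of attempted attacks that were blocked."""
    return blocked / attempted

scenarios = {
    "offline": {"blocked": 2700, "attempted": 3000},  # Scenario 1: no Internet
    "online": {"blocked": 2910, "attempted": 3000},   # Scenario 2: full Internet
}

# The delta quantifies how much cloud connectivity changes blocking efficacy.
delta = block_rate(**scenarios["online"]) - block_rate(**scenarios["offline"])
```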

Customer 1 Phase 2:

WWT will subject each security solution to a battery of three (3) tests, each based on a scripted attack developed and documented by Customer 1's personnel. The tests will be limited by the following constraints:

  • Customer 1 will provide full step-by-step documentation on the steps to stage and launch an attack.
  • Customer 1 will define scoring criteria for each test and stipulate artifacts to collect.
  • Tests will leverage existing malware lab infrastructure.
  • Tests will be conducted on one (1) OS platform per test.

Customer 1 Phase 3: 

WWT provided Customer 1 remote access to the lab environment for a period of four (4) weeks following completion of Phase 1 and Phase 2. Customer 1 will be provided with access to the lab environment, along with the vCenter infrastructure to allow for the taking/restoring of snapshots. This phase is set aside to allow Customer 1 to perform their own battery of tests and review the functionality of each security solution. WWT ATC personnel will be available to provide basic lab support.

Customer 2

Two Testing Phases with Phase 1 being performed by WWT and Phase 2 performed by Customer 2

Customer 2 Phase 1: Block and detection efficacy using endpoint security validation tool using Protected Theater by Mandiant Security Solutions

Customer 2 Phase 2: The Customer chose to utilize a self-evaluation with tailored tools and techniques

All virtual machines in the OEM enclaves traversed a Bluecoat explicit proxy appliance for HTTP/HTTPS connectivity. The WWT team leveraged a combination of VM templates and snapshots to keep all evaluated images consistent and resilient through simulated attacks. WWT leveraged a security validation tool to take attacker actions on endpoints under test and record the observed results. A portion of Phase 1 test cases also includes an element labeled Protected Theater by Mandiant Security Solutions. These test cases are a product of the security validation tool leveraged in the evaluation and signify potentially destructive attacker actions taken on endpoints under test.

14 OS platforms included in Customer 2's evaluation

  • Windows 7  (Build 7601)
  • Windows 10 (Build 1703)
  • Windows 10 (Build 1709)
  • Windows 10 (Build 1809)
  • Windows Server 2008 R2 SP1 (Build 7601)
  • Windows Server 2012 R2 (Build 9600)
  • Windows Server 2016 (Build 1607)
  • Windows Server 2019 (Build 1809)
  • macOS Big Sur
  • macOS Catalina
  • Red Hat Enterprise Linux Server release 6.10 (Santiago)
  • Red Hat Enterprise Linux Server release 7.9 (Maipo)
  • Oracle Linux Server release 6.10
  • CentOS Linux release 7.5.1804 (Core)

Customer 2 Phase 1: Block and detection efficacy using endpoint security validation tool using Protected Theater by Mandiant Security Solutions

Testing Categories from Cyber Kill Chain:

T1: Reconnaissance

T2: Delivery

T3: Exploitation

T4: Execution

Customer 2 Phase 2: The Customer chose to utilize a self-evaluation with tailored tools and techniques

Customer 3

Two Phases of Testing Performed by WWT and a Third Performed by Customer 3

Phase 1: Automated Efficacy (using Cyber Kill Chain taxonomy) (Behavioral Actions)

Phase 2: Automated Efficacy (using APT-Focused taxonomy) (Protected Theater by Mandiant)

Phase 3: Customer-led Manual Testing (conducted after completion of Phase 1 and 2)

Operating Systems Assessed for Customer 3

The following two (2) operating system platforms were included in Customer 3's evaluation:

  • Windows 10 (Build 1809)
  • Red Hat Enterprise Linux (RHEL 7.9)

Testing Taxonomy Categories for Customer 3

T1: Reconnaissance

T1-A: WWT ATC Reconnaissance (Windows) 20 tests

T1-B: WWT ATC Reconnaissance (Linux) 2 tests

T2: Delivery

T2-A: WWT ATC Delivery (Windows) 4 tests

T3: Exploitation

T3-A: WWT ATC: Exploitation (Windows)  8 tests

T4: Execution

T4-A: WWT ATC: Execution (Windows) 316 Tests

T4-B: WWT ATC: Execution (Linux) 7 tests

T5: Command and Control

T5-A: WWT ATC: Command and Control (Windows) 6 tests

T5-B: WWT ATC: Command and Control (Linux) 1 test

T6: Action on Target

T6-A: WWT ATC: Action on Target (Windows) 235 tests

T6-B: WWT ATC: Action on Target (Linux) 36 tests

T7: APT Specific

T7-41 APT41 (China) 

FireEye describes APT41 as a prolific cyber threat group that carries out Chinese state-sponsored espionage activity in addition to financially motivated activity potentially outside of state control.

Deliverables 

The logs and results of all phases of testing were condensed into Excel spreadsheets, documentation, Executive Summaries, direct raw results from the solution, and Statements of Work. These deliverables were produced for WWT internally as well as for each Customer. The documentation and results are used for interpretation and scoring. This process was conducted for each Customer.
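The condensing step can be sketched with the standard library's csv module; the column names and rows here are illustrative, not the actual deliverable format.

```python
import csv
import io

# Sketch: condensing raw results into a spreadsheet-style summary.
# Column names and rows are illustrative, not the actual deliverable format.
rows = [
    {"phase": "1", "category": "ransomware", "blocked": 540, "attempted": 560},
    {"phase": "1", "category": "trojans", "blocked": 190, "attempted": 200},
]

def to_csv(rows):
    """Render result rows as CSV text suitable for import into a spreadsheet."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["phase", "category", "blocked", "attempted"]
    )
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

csv_text = to_csv(rows)
```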

Below is an example Results Scoreboard with sample data.

[Scoreboard example with sample data]


Technologies

Crowdstrike Falcon is considered to be one of the top EDR solutions on the market today. 

All 3 Financial Institutions examined Crowdstrike Falcon.

Customer 1: EDR Solution

Customer 2: EDR Solution and AV Endpoint

Customer 3: EDR Solution