Reduce Time Or Do Not, There Is No Shift
Dave Krasik
In previous blog posts we’ve discussed the value of a data-driven approach to security operations. In this post, we’d like to take a closer look at what that approach means for the automation of SOC (Security Operations Center) workflows and how it has influenced the product and design decisions behind ThreatQ and ThreatQ TDR Orchestrator. With the user experience in mind, and considering how to drive adoption of automation among our users, we first derived a few core characteristics that served as our product guideposts:
- Facilitate learning and improved processes over time
One of the core objectives of users, when they look to automate a process, is a reduction in the overall time they spend manually performing tasks. One of the great ironies of automation tools is that they often shift time and workloads rather than actually reduce the time users spend. This shift generally takes two basic forms, which we will explore in greater detail below.
In one case the shift moves further down the process funnel to the next layer of manual processing. The result may take the form of a slew of low-fidelity alerts that now have additional context. This is great, but if only 10% of those alerts were worthy of additional investigation, we’ve now added bloat to the 90% that are of little to no use. If we are transacting this data through integrations between our SIEM, SOAR, and ticketing systems, we’ve now burdened those APIs and tools with an order of magnitude more data to share and sort through. The key to addressing this problem is to find those 10% of important data points further up the process funnel. To do that, a user needs to be able to specify a much more refined set of criteria that shaves off that 90% of data points AND, consequently, all of the automated runs that were adding context to them. For this reason, ThreatQuotient designed ThreatQ to empower users to craft highly specific queries that incorporate any and all characteristics of the data they work with.
This is why ThreatQ TDR Orchestrator was built with Threat Library Smart Collections™ as a foundation. Any user can quickly and easily build, from the UI, even very complex criteria without the need for expertise in SQL or SQL-like query languages and the syntax and debugging that come along with them. This means any user could, within a matter of seconds, build a collection that identifies events created within the last week that are related to indicators of compromise (IOCs) associated with malware types that target their sectors and geographies of interest and that were sourced from a specific set of intel sources considered high quality. That collection, while a mouthful to describe, is the outcome of only a few seconds of work and can now be deployed to drive an automation that zeroes in on the 10% of events the user cares about most.
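To make the idea concrete, the kind of ANDed criteria a collection like this expresses can be approximated in plain code. The sketch below is a hypothetical illustration, not the ThreatQ API or schema: field names like `created`, `sectors`, and `source` are assumptions chosen to mirror the example above, showing how narrowing criteria up front shrinks the downstream workload.

```python
from datetime import datetime, timedelta

# Hypothetical event records; field names are illustrative, not the ThreatQ schema.
events = [
    {"created": datetime(2022, 6, 1), "sectors": {"finance"}, "source": "vendor-a"},
    {"created": datetime(2022, 5, 1), "sectors": {"finance"}, "source": "vendor-a"},
    {"created": datetime(2022, 6, 2), "sectors": {"retail"},  "source": "vendor-b"},
]

def matches(event, now, sectors_of_interest, trusted_sources):
    """AND together recency, sector targeting, and source-quality criteria."""
    return (
        now - event["created"] <= timedelta(days=7)      # created within the last week
        and bool(event["sectors"] & sectors_of_interest)  # targets a sector we care about
        and event["source"] in trusted_sources            # from a high-quality intel source
    )

now = datetime(2022, 6, 3)
selected = [e for e in events if matches(e, now, {"finance"}, {"vendor-a"})]
# Only the recent finance event from a trusted source survives the filter,
# so only that event drives downstream automation runs.
```

Each additional criterion removes events before any enrichment runs, which is the point of moving the filter up the funnel rather than sorting the bloat afterward.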
The second case of burden shifting, also ironically, falls onto developers. Of the many automation tools available to security teams, most are marketed as “low code” or “no code”; that is, they claim to require little to no coding expertise to use. In practice, this is often untrue. The complex process diagrams behind all but the most simplistic playbooks often replicate the complexity of code in visual form, and still require the skill set of a software developer to configure, interpret, and maintain. In fact, many of these tools include some rudimentary form of integrated development environment (IDE) so that someone can further code the tool to actually accomplish specific user objectives.
For ThreatQ TDR Orchestrator, we’ve made the decision to abstract much of that complexity away. Changes in configuration (for example, which data points to request, send, store, and evaluate) are made with simple UI components such as checkboxes, drop-downs, and dedicated text entry fields. This again means that any user simply needs to understand the security and business context behind their automation processes. No coding skills are required. If specific configurations are not available, we’ve built a framework that makes it very easy to add those UI components independent of any updates to the platform.
The notion that a user can pull a playbook wholesale from a vendor catalog and roll it into production is one that, in practice, does not generally hold water. Some, or a lot of, additional configuration will be required to conform to the desired workflows of a given set of users. As noted before, the ThreatQ TDR Orchestrator framework allows for easily expanding UI-based configuration options. This allows ThreatQuotient to provide the best of both worlds. Let’s say there is an enrichment that returns a count of what are considered malicious sightings for a given indicator, and, as part of a playbook, a user wants to store all available enrichment data only if the malicious count exceeds some threshold. If this control were not already available, it would be a simple update, using the ThreatQ TDR Orchestrator framework, to make a UI-based field available for any non-coding user to configure. Zooming out, this means we now have two layers of flexibility built into the automation creation process: one layer where non-developer users can configure the many available user-friendly knobs and controls, AND another layer where ThreatQ or user developers can easily add to the UI toolkit available at the first layer.
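The threshold control in the enrichment example can be sketched as a simple gate. This is a hypothetical illustration of the logic a UI field would expose, not ThreatQ code: the `malicious_count` field name and the threshold value are assumptions for the sake of the example.

```python
def store_enrichment(enrichment, threshold, library):
    """Store the full enrichment record only when its malicious sighting
    count exceeds a user-configured threshold; otherwise discard it."""
    if enrichment.get("malicious_count", 0) > threshold:
        library.append(enrichment)
        return True
    return False

library = []
store_enrichment({"indicator": "1.2.3.4", "malicious_count": 12}, threshold=5, library=library)
store_enrichment({"indicator": "5.6.7.8", "malicious_count": 2},  threshold=5, library=library)
# Only the first enrichment clears the threshold and is stored.
```

In the UI-driven model described above, the `threshold` argument is the part a non-coding user would set in a dedicated field, while the gating logic itself stays out of sight.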
Facilitate learning and improved processes over time
One insight that has become clear as we continue to speak with customers and learn about their processes is that those processes are constantly evolving, and customers are always searching for improvements. Because we’re big fans of the OODA loop and iterative software development concepts, this is something that resonates with us. ThreatQ TDR Orchestrator incorporates this on at least two levels. First, new data points generated as part of an automation are always stored as normalized intelligence within the Threat Library. This means the data is not simply stored as part of what is primarily a one-time-use case or ticket. Rather, the data points are incorporated into the collective threat knowledge base that is the Threat Library, where they become available to Smart Collections that can be used for analysis, sharing, and, of course, further automations.
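The difference between ticket-scoped data and a shared, normalized knowledge base can be sketched as follows. The structure below is a minimal stand-in for the idea, not ThreatQ internals; the indicator and attribute names are invented for illustration.

```python
# A minimal stand-in for a threat library: enrichment results are
# normalized and keyed by indicator, so any later automation or analyst
# query can reuse them, instead of being buried in a single closed ticket.
threat_library = {}

def record_enrichment(indicator, attributes):
    """Merge newly learned attributes into the shared record for this indicator."""
    record = threat_library.setdefault(indicator, {})
    record.update(attributes)
    return record

# Two separate automation runs, at different times, contribute to the
# same indicator record rather than to two disconnected tickets.
record_enrichment("evil.example.com", {"malware_family": "Emotet"})
record_enrichment("evil.example.com", {"first_seen": "2022-05-30"})
```

Because every run writes into the same keyed store, a later query (the code analogue of a Smart Collection) sees the combined knowledge from all prior automations.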
The second way ThreatQ TDR Orchestrator facilitates learning returns to the concepts of configurability and ease of use. As customers learn more about their data, and as their objectives and threats evolve, so too will their workflows, both manual and automated. Given this, a system that requires software developers writing code as part of the iteration cycle will more than likely be slower to iterate, adapt, and learn. Conversely, one like ThreatQ TDR Orchestrator, where any user, regardless of technical skill, is armed to adapt an automation to evolving workflow needs, will have more stakeholders available to engage in and accelerate that learning cycle.
The high-level value of automating workflows is pretty straightforward. It should significantly reduce the time and effort required to execute a lot of the repetitive tasks associated with SOC workflows. To actually achieve these desired benefits in practice requires more than throwing the brute force of an automated process into the mix. It also means empowering existing users, regardless of their technical skills, to continue to engage with those automation processes so that they are efficient, well understood, and can continue to improve over time.
To learn more, check out the ThreatQ Platform and ThreatQ TDR Orchestrator. You can also join us on June 16 at 10:00 am ET for an interactive discussion on Automation Adoption with Nabil Adouani from StrangeBee and TheHive Project. Register here.