Future Proof your SOC with the Power of the Azure Ecosystem and Defender Threat Intelligence

In today's world of ever-evolving, sophisticated threats, time is of the essence when it comes to an efficient SOC's continuous feedback loop for reducing attacker dwell time. The days of humans hand-writing every rule to detect malicious activity are gone. Reducing attacker dwell time requires a host of people, processes, and modern technologies, including Artificial Intelligence (AI), Machine Learning (ML), and Threat Intelligence (TI), to find and eliminate threats before they become a nightmare.

Microsoft Sentinel, along with the Defender Extended Detection and Response (XDR) platform, provides an enhanced toolset of top-grade technologies for your SOC to achieve its goals in today's difficult threat landscape. With Microsoft's XDR + SIEM/SOAR capabilities, you are future-proofing your company's security and modernizing your capabilities.

Operationalizing an integrated security platform requires a shift in process, and change can be frightening when seconds matter. Many customers are concerned about their budget when considering a “lift and shift” to Sentinel because ingesting massive amounts of raw and contextual signal to a top-tier solution that allows for up to 2 years of hot, analytical retention can be price-prohibitive and, for some data, unnecessary.

Because Microsoft Sentinel sits in the Azure cloud, it offers unmatched flexibility and simplicity to meet the "do-more-with-less" standards of a budget-restricted security organization. The Azure integration gives companies the ability to leverage Data and AI tooling to further enhance capabilities at a cost that is manageable relative to the value of the data.

A simple rule we advise companies to follow when designing lower-cost data collection and retention for a Microsoft Sentinel architecture is to ingest into Sentinel only the data with contextual security value. Given Sentinel's modern capabilities geared toward detecting sophisticated threats across the kill chain, it is best utilized when data contains some security context.

However, given attacker sophistication, continuous improvement sometimes requires quality forensics, where a security analyst or researcher may need to look backwards across huge amounts of signal, including firewall logs, diagnostic logs, and network flow logs, to discover the "needle in the stack of needles."

Although it may be convenient to centralize all data collection and retention in one highly advanced technology with detection and forensic capabilities at your fingertips, that luxury is increasingly a thing of the past. As organizations feel the squeeze of economic conditions, the extensibility of Microsoft Sentinel and its easily integrated tooling within Azure allows for significant cost savings without sacrificing the power of the platform.

Now, we'll walk through a tutorial that illustrates the "art of the possible" with Azure Data Explorer (ADX) and Sentinel used together to reduce costs while retaining access to the data stored in ADX.


1. Select from high volume, lower security value data sources.

A. Consider security use cases from your traditional SIEM environment.

Although Microsoft Sentinel provides data connectors for Firewall logs, Netflow, and Defender Advanced Hunting tables, among other lower-context data sources, you may find that your immediate detection value does not justify the cost of such high-volume ingestion directly into Sentinel. (For example, security threat firewall logging vs. ping and traceroute firewall data.)

B. Common IOC hunting is a great place to start.

Many organizations under-utilize the power of a Threat Intelligence feed or integration within the SOC. However, IOC hunting is crucial for filling in the gaps in coverage of an attacker's methodology, and many detections will be unique to your business, where you know your environment best.

2. Map fields and ingest the data into Azure Data Explorer (ADX).

For a sample data set, we've followed the instructions provided by @Jeff_Chin in his blog post Limitless Microsoft Defender for Endpoint Advanced Hunting with Azure Data Explorer (ADX), where he illustrates how to enable streaming of Microsoft 365 Defender data into ADX. You may want to experiment with some alternative data ingestion scenarios, where you'll notice a degree of flexibility (as well as a cost-to-benefit trade-off of increased effort) that is not available with Log Analytics.

For example, with ADX queries, time filters are not required, but they can be utilized. Without a time filter, you can query an entire database, which speaks to the potential need for optimization and adjustments in processing power. Because ADX is a Platform as a Service (PaaS) solution in Azure, you have a great deal of flexibility over the architecture and cluster sizing, allowing you to choose how clusters scale to meet changing demand: Manage cluster horizontal scaling (scale out) to match demand in Azure Data Explorer | Microsoft Lea…
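To illustrate (a sketch using the DeviceLogonEvents table that we ingest later in this post), the same aggregation can run with or without a time bound in ADX; the unbounded form scans the whole table, so keep cluster sizing in mind:

```kusto
// Unbounded: scans every record in the table — convenient for deep
// forensics, but consider the processing cost on a small cluster.
DeviceLogonEvents
| summarize count() by ActionType

// Bounded: the same aggregation restricted to the last 90 days.
DeviceLogonEvents
| where TimeGenerated >= ago(90d)
| summarize count() by ActionType
```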

For the following sample, we'll use a query from Microsoft Defender for Endpoint's (MDE's) DeviceLogonEvents table. To follow along with the rest of this tutorial, you'll want to reference and follow the instructions in Limitless Microsoft Defender for Endpoint Advanced Hunting with Azure Data Explorer (ADX). 

Follow steps 1-3 in their entirety, then you can focus on ingesting the DeviceLogonEvents table under Step 4.

(As an added bonus, the process in the above blog post has been automated with @sreedharande's script here: Azure-Sentinel/Tools/AzureDataExplorer at master · Azure/Azure-Sentinel · GitHub)

3. Form a filtering query and do a test run in Azure Data Explorer

Security Advantages of using Azure Data Explorer (ADX) with Sentinel

ADX is a documented and supported method for log retention and integration with Microsoft Sentinel. As with all Azure Platform Services, Security can be configured for Role Based Access Control (RBAC), Identity management, Encryption, etc. This is described in depth here: Integrate Azure Data Explorer for long-term log retention | Microsoft Learn.

Row Level Security

In a prior blog post, Simple Row-Based Access Workbook: Lab Walk-Through with Azure Sentinel and Azure Data Explorer (ADX)…, we described an example of utilizing Row Level Security in Azure Data Explorer to present individual views for cross-solution workbooks presented in Sentinel. Here, you'll work with the familiar Kusto Query Language (KQL) for query building, filtering, grouping, etc., just as you have with Log Analytics and Advanced Hunting in the Microsoft 365 Defender portal.
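As a sketch (the function name and group are hypothetical placeholders), Row Level Security in ADX works by attaching a policy that points at a filtering function evaluated per caller:

```kusto
// Hypothetical filter: members of the eu-soc group only see rows for the
// "EU" machine group; callers outside that group see all rows.
.create-or-alter function RLS_DeviceLogonEvents() {
    DeviceLogonEvents
    | where MachineGroup == "EU"
        or not(current_principal_is_member_of('aadgroup=eu-soc@contoso.com'))
}

// Attach the function as the table's row_level_security policy.
.alter table DeviceLogonEvents policy row_level_security enable "RLS_DeviceLogonEvents()"
```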

Built-in anomaly detection with Python and R

Outside the scope of this blog post, but also very interesting and useful for security scenarios, is the ability to use the Anomaly Detector AI service and to bring your own ML models. A fantastic sample walkthrough is described by @Louise_Han here: Announcing Univariate Anomaly Detector in Azure Data Explorer – Microsoft Community Hub
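Beyond the Anomaly Detector service, KQL itself includes native time-series functions that run directly on the cluster. Here's a sketch (the 2.5 threshold is an illustrative choice, not a tuned detection) using series_decompose_anomalies() over the logon data from this post:

```kusto
// Build an hourly series of failed logons and flag points that deviate
// from the decomposed baseline.
DeviceLogonEvents
| where ActionType == "LogonFailed"
| make-series Failures = count() on TimeGenerated from ago(14d) to now() step 1h
| extend Anomalies = series_decompose_anomalies(Failures, 2.5)
```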

Sample Query for Brute Force (use for illustration purposes only):

Back to the intent of this post, here's a (potentially noisy, no proven detection accuracy) query to detect potential brute force based on successful login to a device after a designated number of failed logins. The successful login is joined with the remote IP used to authenticate to project the results needed for the sample analytic rule used later. You can test and adjust this query on your Azure Data Explorer cluster:

DeviceLogonEvents
| where TimeGenerated >= ago(180d)
| where RemoteIPType == "Public"
| summarize
    LOGONFAILED = countif(ActionType == "LogonFailed"),
    LOGONSUCCESS = countif(ActionType == "LogonSuccess")
    by RemoteIP, DeviceName
| where LOGONSUCCESS > 0
| where LOGONFAILED >= 300
| join kind=innerunique (
    DeviceLogonEvents
    | where ActionType == "LogonSuccess"
    ) on RemoteIP
| project TimeGenerated, RemoteIP, AccountName, LOGONFAILED, LOGONSUCCESS, Protocol, MachineGroup, LogonType, DeviceName

Feel free to adjust the number of days queried in the TimeGenerated line, as well as tuning the LOGONSUCCESS and LOGONFAILED quantities to deliver desired results.

The arg_max() KQL operator can be a lifesaver for situations when you plan to append data to an existing table or watchlist from ADX into Sentinel. This operator aggregates rows to find the row that has the maximum value for a certain column, then returns the values of other columns in that row. For example, to find the maximum latitude of a storm event in each state, you can use the following query:

StormEvents | summarize arg_max(BeginLat, BeginLocation) by State

This returns a table with columns State, BeginLat, and BeginLocation, where each row represents a state and the location of its maximum-latitude event.

For your DeviceLogonEvents query, you can add a line to the end to see how this works.

| summarize arg_max(LOGONFAILED, LOGONSUCCESS, AccountName, DeviceName, LogonType) by RemoteIP

Query and results with arg_max operator


4. Create a Logic App to run the query on a recurrence and send the data to Sentinel.

Logic Apps are the workhorse of Security Orchestration and Automated Response (SOAR) in Sentinel. They provide a low-code method for data ingestion, enrichment, and automated response.

A. To create a new Logic App, you can select "Automation" in the navigation pane on the left in the Sentinel portal.

From here, you'll create a blank playbook: select "Blank playbook" from the playbook creation options.

B. Once your Logic App Deployment is complete, you can create a playbook to start with the common “Recurrence” Trigger.

For the sample scenario described here, we can start with a recurrence Interval of “1” and frequency of “Day.”

Option to decide on a recurrence interval.

C. You'll add a "New step" to query your ADX cluster with the built-in connector for Azure Data Explorer.

Easily run a KQL query in Azure Data Explorer from within Logic Apps.


Once you select the Action for “Run KQL query,” the box will request the details for your Cluster URL, Database Name, and the query.

Fill in cluster URL, Database name and desired KQL Query.

To find these details, you can search for “ADX” in the central search at the top of your Azure Portal and select the Azure Data Explorer Clusters Service.

Easily find Azure Data Explorer with search key "ADX".

 The Cluster URL can be found in the cluster overview URI Field:

Select the URI as the cluster URL.

And the database name can be referenced under the "Databases" option in the navigation menu:



D. Now that you have your ADX Database referenced, paste in the KQL query that you wrote or the sample from above.

Populated details in the ADX Query Connector in Logic Apps

For the next step, we'll break into two options for sending filtered data from ADX into Sentinel for analytics.

5. Option A: Send data into a Sentinel Watchlist

Watchlists are easy to create, update, and query in detection rules. However, they are limited to 10 million rows across all watchlists in a single Sentinel workspace. They are also best suited to straightforward queries, so if you plan to use complex joins and functions, you may prefer to skip to Step 6: Option B (Custom Table).

1. First, create a CSV file with the aligned fields based on query results and then add that to Sentinel's Watchlists.


Hint: Although you can create a watchlist with the Logic App, that adds complexity outside the scope of this blog, because we don't want a new watchlist each time the recurrence executes. So, one way to get the appropriate CSV file to upload to Sentinel is to test run your query in ADX while adding

| take 1

to the end of the query. This will give you a single result to download as a CSV file and then upload to Sentinel for continuous bulk updates from the Logic App that we're building.

Use the option to export a "take 1" query result to CSV to upload into Sentinel for your Watchlist Template.


You'll need to select a search key, and for this sample, we've used the RemoteIP.

Validated Watchlist fields

Once you name and save your watchlist in Sentinel, you'll return to the Logic App to send the query results to it.

2. From here, search for the control operator, and then select “For Each.” This means that each time the query above delivers results, they'll be appended to the watchlist. For this reason, you'll want to work out the query time frame with the recurrence to deliver the desired results to append to the watchlist.

Select a "For each" control in the Logic App

From Dynamic Content, select the value of the ADX query run previously in the Logic App.

Select the Dynamic value of the ADX query output for the Control Action

3. And then add a final action selecting the operation for Microsoft Sentinel

Select "Microsoft Sentinel" for the next operation.

Scroll down and select "Watchlists – Add a new Watchlist Item (preview)."

You can Add a new Watchlist Item as a preview action in Logic Apps

You'll validate the workspace ID in which the Watchlist lives by cross-referencing Sentinel Settings –>Workspace Settings

Where you can get the Subscription, Resource Group and Workspace ID from the workspace Overview.

Paste in your watchlist alias that you named when you created the watchlist (or check under Watchlists in Sentinel where the Alias is easily referenced as well).

Then, specify the results that you'd like as the JSON body, pulling from the Dynamic content exposed through the KQL query in the prior steps.

Watchlist Item Fields are available as dynamic content from the ADX KQL Query above.


Hint:

Here's JSON to copy/paste, just ensure you validate your specific dynamic content. If fields are missing, you may need to remove and re-add each dynamic field to fully validate the query results that will be sent to the watchlist:

{
  "TimeGenerated": "@{items('For_each')?['TimeGenerated']}",
  "AccountName": "@{items('For_each')?['AccountName']}",
  "DeviceName": "@{items('For_each')?['DeviceName']}",
  "LOGONFAILED": "@{items('For_each')?['LOGONFAILED']}",
  "LOGONSUCCESS": "@{items('For_each')?['LOGONSUCCESS']}",
  "LogonType": "@{items('For_each')?['LogonType']}",
  "MachineGroup": "@{items('For_each')?['MachineGroup']}",
  "RemoteIP": "@{items('For_each')?['RemoteIP']}"
}


4. Save the Logic app, then Run the Trigger, where you should see each step succeed:

The Logic App ran successfully.

Test your query in Sentinel by selecting the Watchlist from the Watchlist blade and clicking the “View in logs” button where you should see the results:

Here, we can see the Watchlist results in Sentinel Logs.

Now you can follow the guidance to Build queries or rules with watchlists – Microsoft Sentinel | Microsoft Learn.
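For instance (a sketch using a hypothetical watchlist alias of "ADXSusIP" — substitute the alias you chose when saving your watchlist), the appended items are available to any log query or analytic rule via _GetWatchlist():

```kusto
// Pull the watchlist items and keep the fields sent by the Logic App.
_GetWatchlist('ADXSusIP')
| project TimeGenerated, RemoteIP, DeviceName, LOGONFAILED, LOGONSUCCESS
```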

You can also skip to Step 7 below and reference the instructions for using the custom table, but instead use the watchlist function (_GetWatchlist(‘‘)) to form a detection against Microsoft Defender Threat Intelligence.

6. Option B: Send data to a Custom Table in Sentinel

The benefit of this option is that the data filtered from the ADX query now exists in a table that can be referenced and joined with other tables rather than calling the watchlist operators. There's no row restriction for results, and the schema is available directly for building more complex queries. This also benefits consumers, who can now write detections against a custom table schema rather than managing rows in a watchlist, reducing operational overhead.

1. Start with the initial logic app creation based on a recurrence (Step 4, above).

This time, however, we can make the query a bit simpler to drive toward our end goal of matching (low fidelity) potential brute force with Defender Threat Intelligence IP IOCs.

DeviceLogonEvents
| where TimeGenerated >= ago(180d)
| where RemoteIPType == "Public"
| summarize
    LOGONFAILED = countif(ActionType == "LogonFailed"),
    LOGONSUCCESS = countif(ActionType == "LogonSuccess")
    by RemoteIP, DeviceName
| where LOGONSUCCESS > 0
| where LOGONFAILED >= 300

Input the desired query in the Logic App ADX KQL Query Operator after testing and tuning it.


Hint: For more complex query schemas, you might, at this stage, want to save the Logic App, run it, and then capture the results of your query output. You can get to this by clicking on the "Run KQL query" step and then selecting "Show raw outputs," where you can copy and paste into a text editor. If you then need to understand your JSON schema, you can add a "Parse JSON" action and upload a sample payload to generate and inspect the schema of the query output. This is not necessary for this tutorial but can be helpful for future reference.

Query Output JSON from Logic App run
Pasting the raw output as a sample payload from the KQL Query run will allow you to generate the output schema.


Parsing the data in the Logic App depends on the simplicity of the query and the resulting key value pair formatting of the request sent. If the query and results are straightforward enough, we may be able to follow the prior instructions with Log Analytics collecting the data directly. However, in this case, if we try to send each result to Log Analytics, we get a JSON request body input that includes an array that can't neatly fit into the Schema of the new table.

Example of how the logic app may fail if you simply replace the action "send data to Sentinel Watchlist" with "Send Data to Log Analytics Workspace."

So, because the results are delivered as a JSON body, where the LOGONFAILED and LOGONSUCCESS fields contain an array of values, we need to deliver those as separate records into Log Analytics.

2. After the Logic app step where you run your ADX KQL Query, the next step will be a control to send the results to a Log Analytics workspace.

Select a "For each" control

3. Once you select the Dynamic “value” from the ADX query output, you'll select “Add an action” and add a second “for each” control to accommodate the array values and break them into individual records in Microsoft Sentinel's Log Analytics Workspace.

4. In the second “For each” control output box, select “Add dynamic content” and then select the “Expression” tab, where you can enter the expression shared below, and then click “OK”:

array(triggerOutputs()?['value'])

Add an embedded "For each" control.

Input the expression to break the array into individual records


Hint: Once we begin nesting controls, it can be helpful to name them to keep track of what each control is intended to accomplish. This makes things much easier if you're trying to troubleshoot a complex query for filtering data to Log Analytics.


5. The final step in the Logic App is to send the data to Log Analytics.

To do so, select “Add an action” and then search for Azure Log Analytics, where you can select the option to “Send Data.”

Select the operation to Send Data to an Azure Log Analytics Workspace.

You'll also need to establish your connection to the Workspace, where you'll declare a name for the connection, then find the Workspace ID and Key details by navigating to your Sentinel portal–>Settings–>Workspace Settings–>Agents (use the drop down for Log Analytics agent instructions to find the workspace Key and ID.)

Input your Workspace details

6. Once populated, enter the desired JSON request body and name your custom log table, which you'll be able to query in Sentinel.

For the JSON request body, here's a copy/paste to start with:

{
  "RemoteIP": "@{items('For_each')?['RemoteIP']}",
  "DeviceName": "@{items('For_each')?['DeviceName']}",
  "LOGONFAILED": "@{items('For_each')?['LOGONFAILED']}",
  "LOGONSUCCESS": "@{items('For_each')?['LOGONSUCCESS']}"
}

Pasting in the JSON request body from above should result in the automatic selection of the dynamic results from the ADX KQL query


Hint: Note how Dynamic content hides an expression under the selected shortcut item. This can be helpful for troubleshooting as well: items(‘For_Each_ADX_value,_send_to_Sentinel''s_Log_Analytic_Workspace')?[‘RemoteIP']


7. Save the Logic app and select “Run Trigger” to test. Once it runs, you should see green checks at each stage with the expected number of results.

Successful Logic App Run

8. Finally, you can see the results of your filtered import into Log Analytics from ADX!

Navigate to the “Logs” blade of Sentinel, where you can query the custom table, which will automatically append your “Custom Log Name” with an “_CL”. (Note: It can take several minutes to send the initial set of logs. If your results are unexpectedly empty after a successful Logic App trigger run, you can go get a beverage or snack and then try again.)

Use the Custom Log Name in the "Send Data" field of the Logic App to run the Log Analytics Query

Query your new custom table from Logs in Microsoft Sentinel.
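For example, using the DetailedADXSusIP_CL sample table name referenced in the detection query later in this post, a first look at the ingested rows might be:

```kusto
// Custom log tables get the "_CL" suffix automatically, and string
// columns arrive with an "_s" suffix from the Data Collector API.
DetailedADXSusIP_CL
| project TimeGenerated, RemoteIP_s, DeviceName_s, LOGONFAILED_s, LOGONSUCCESS_s
| take 10
```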

Although in this tutorial we're using well-formed data from Microsoft 365 Defender, it is possible to utilize this model with any data source, including Syslog, Netflow, etc. In these cases, you can use ASIM parsers to view data in a normalized format and to include all data relevant to the schema in your queries.

7. Build an Analytic Rule Matching Your New Custom Table to Threat Intelligence IOC's from the Defender Threat Intelligence Connector. 

Here's where the fun really begins. While you cannot run Sentinel analytics rules directly against ADX, once you send filtered data back into Microsoft Sentinel, you can certainly write custom detection rules against it.

Of course, these will not be Near Real Time (NRT) detections, but this is a great scenario for running forensics or hunting older records against Threat Intelligence (TI) Indicators of Compromise (IOC's).

It's always helpful, whenever possible, to start from an existing, proven rule and modify a copy of it into your custom detection.

The Preview Integration with Defender Threat Intelligence and its associated Solution, once installed, not only provides out of box TI matching on native tables as soon as it's enabled, but it also builds a results table of its own and the native analytic rules provide great samples to draw from for the custom rule you can now write. (Learn more about Threat Intelligence in Sentinel here: Threat intelligence integration in Microsoft Sentinel | Microsoft Learn)

The beauty of the Defender Threat Intelligence solution is being able to take advantage of Microsoft's massive scale of visibility on Threat Intelligence, providing operationalization and immediate SOC maturity gains with little effort.

1. First, enable the Threat Intelligence Solution if you haven't yet done so.

Enable the Threat Intelligence Solution in Microsoft Sentinel.

Then, when you click the “Manage” button in the bottom right, you'll be taken to the associated content. Since, in this case, we're trying to get a match against our potential brute force IP address, a good rule to use for a model to start with is the “TI map IP entity to SignInLogs.”

The Threat Intelligence Solution includes a number of useful templates for detection against Threat Intelligence IOC's such as this one.

Once you select the rule, you'll see the associated data sources and will notice that Custom Tables, like the one we just created, are not naturally included. To provide that native TI matching, you can enable the rule as-is (tuning as appropriate.)

2. You can also use the same template (“TI map IP entity to SignInLogs”) to create a new rule for our purposes here.

3. Rename the rule accordingly.

4. In order to add context for Sentinel's built-in MITRE ATT&CK mapping, it's important to select the proper Tactics & Techniques, so for the sample we're building, you can select Tactic "Credential Access" and Technique "T1110 – Brute Force."

Select Tactic "credential access" and Technique "Brute Force"

5. Select the blue button “Next : Set rule logic >”

6. Modify the KQL rule to address your detection goals.

If you're modifying or creating a rule for a new detection, it can be helpful to have a separate log screen available to test the query for desired detection results.

Here's a sample that's sufficiently amended from the built-in query that delivers the intended results.

Note: For your query, you'll need to specifically replace the custom table under the “join” operator with the name that you created in your Logic App.

ThreatIntelligenceIndicator
| where Active == true
// Picking up only IOC's that contain the entities we want
| where isnotempty(NetworkIP) or isnotempty(EmailSourceIpAddress) or isnotempty(NetworkDestinationIP) or isnotempty(NetworkSourceIP)
// As there is potentially more than 1 indicator type for matching IP, taking NetworkIP first, then others if that is empty.
// Taking the first non-empty value based on potential IOC match availability
| extend TI_ipEntity = iff(isnotempty(NetworkIP), NetworkIP, NetworkDestinationIP)
| extend TI_ipEntity = iff(isempty(TI_ipEntity) and isnotempty(NetworkSourceIP), NetworkSourceIP, TI_ipEntity)
| extend TI_ipEntity = iff(isempty(TI_ipEntity) and isnotempty(EmailSourceIpAddress), EmailSourceIpAddress, TI_ipEntity)
// using innerunique to keep perf fast and result set low, we only need one match to indicate potential malicious activity that needs to be investigated
| join kind=innerunique (
    // join custom table
    DetailedADXSusIP_CL
    // | where TimeGenerated >= ago(dt_lookBack)
    | project TimeGenerated, IPEntity = RemoteIP_s, DeviceName_s, LOGONFAILED_s
    // renaming time column so it is clear the log this came from
    // | extend SigninLogs_TimeGenerated = TimeGenerated, Type = Type
)
// use the appropriate custom table column for the IOC match
on $left.TI_ipEntity == $right.IPEntity

The Results Simulation will be a good check for necessary tuning as well:

The Results Simulation on the right can be helpful for tuning your detection query.

7. Once the query is pasted and validated, you can select the appropriate entities, which are important to align for Sentinel's native capabilities, such as the built-in investigation experience and User and Entity Behavior Analytics.

The Entities that are being mapped for this example are Host and IP:

Map Host and IP Entities

8. You can then set your preference for the query scheduling and Alert Threshold. Just remember that even if there is a match in your data set, you won't see incident results until the rule runs, per the time frame designated by the Query Schedule. (Don't expect to see these results immediately if you've selected to run every 5 hours over the last 5 hours of data.)

On the next screen you can group alerts if desired (not required).

Then, you can align a desired Automated Response (not required).

9. On the Summary screen, you can review all of your selections, and once you're satisfied, Create the rule.

Review selections before creating the rule.

…Several Hours Later:

We have an Incident match!

In a lab environment with a honeypot, this is what we were after.

The Threat Intelligence Rule Template found a match against the ADX Dataset!

If we investigate and “view full details” to check the Entity information for the IP address, we can see the details associated with the IP entity:


In Summary:  

We hope you enjoyed this walkthrough of an example for using ADX as a security data lake from which you can draw filtered data back into Sentinel for Analytic rule matching.

The flexibility of Microsoft Sentinel within the rich ecosystem of the Azure cloud provides many such creative and cost-saving capabilities, allowing companies, who know their attack surface best, to benefit from the native cloud in which Microsoft Sentinel resides.

Special Thanks:

Thanks especially to co-author @mlopinto. This blog post was inspired by @Jeff_Chin and @Mary Lieb to help customers save costs while using Microsoft Sentinel. Thanks to @Matt_Witman and @Yaniv Shasha for assisting with the envisioning of this multi-resource solution.

 

This article was originally published by Microsoft's Sentinel Blog. You can find the original article here.