Using Cribl Stream to ingest logs into Microsoft Sentinel

I would like to thank Javier Soriano, Eric Burkholder, and Maria de Sousa-Valadas for helping out on this blog post. On 06 May 2024, Microsoft (here) and Cribl (here) announced that they are working together to drive accelerated SIEM migrations for customers looking to modernize their security operations (SecOps) with Microsoft Sentinel.

As quoted:

“By combining Cribl's leading data management technology with Microsoft Sentinel's next generation SecOps SIEM solution, we are collectively helping customers transform and secure their businesses,” said Vlad Melnik, vice president of business development, alliances at Cribl.  “We are excited to deepen our collaboration with Microsoft and unlock more value for our joint customers.”

Cribl Stream architecture

As mentioned in this Cribl document, Cribl Stream helps you process machine data – logs, instrumentation data, application data, metrics, etc. – in real time and deliver it to your analysis platform of choice.

mahmoudmsft_0-1719823727244.png

Specifically in the context of Microsoft Sentinel migration projects, Cribl brings several advantages, as seen from the field:

  • Fast and easy deployment of Cribl

Cribl offers a cloud-based SaaS deployment as well as a self-hosted option when needed. Either way, the whole Cribl pipeline can be spun up quickly, allowing for a faster migration to Microsoft Sentinel.

  • Rich GUI features

An easy GUI that lets you design pipelines, ingest data, process it, and send it to destinations helps teams quickly design and test a new data ingestion pipeline.

For example, Cribl lets you add data sources with simple drag and drop, configure listener details such as IP address and port number, and add new fields to the ingested data stream, all within a few clicks.

  • Applying data processing and/or transformations easily using pipelines

Within the same GUI, Cribl offers built-in data processing functions that make it easy to manipulate, alter, and transform data before ingesting it into Microsoft Sentinel. In addition to the built-in functions, Cribl also lets you add new ones from scratch, giving you full control over the pipeline design.

  • Capture and test data at each stage

A very important feature is the ability to capture live data at each stage of the pipeline to inspect how it has been processed, or to run sample log data through every stage. This gives you great visibility into how data is processed and what it looks like at each step of the pipeline.

  • Ability to work in push and pull mechanisms

The following is a basic architecture concept of a Cribl Stream pipeline, as described in this Cribl document:

mahmoudmsft_0-1719823920070.png

Now let's walk through a simple scenario of ingesting syslog data in a migration project using Cribl. These are the high-level steps I will go over in the following sections:

  1. Add Microsoft Sentinel as a destination
  2. Add a syslog data source
  3. Add new fields to incoming events
  4. Create a new pipeline to transform data
  5. Use Cribl's built-in packs

1. Add Microsoft Sentinel as a destination

Adding Microsoft Sentinel as a destination is covered in this Cribl document. It's worth noting that Cribl Stream uses Microsoft's standard Logs Ingestion API. The steps involve creating a new data collection rule (DCR) and data collection endpoint (DCE) to receive the ingestion stream. In addition, Cribl needs a new app registration in Microsoft Entra ID to be able to use the ingestion API. All of these steps are covered in the Cribl document above.
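Under the hood, the destination posts each batch of events to the DCR's stream on the data collection endpoint. As a rough sketch of how that upload URL fits together (the endpoint, DCR immutable ID, and stream name below are placeholders, not values from this demo):

```python
# Hypothetical values -- substitute your own DCE endpoint, DCR immutable ID,
# and stream name as defined in your data collection rule.
DCE_ENDPOINT = "https://my-dce-abc1.eastus-1.ingest.monitor.azure.com"
DCR_IMMUTABLE_ID = "dcr-00000000000000000000000000000000"
STREAM_NAME = "Custom-Syslog"

def ingestion_url(dce: str, dcr_id: str, stream: str,
                  api_version: str = "2023-01-01") -> str:
    """Build the Logs Ingestion API upload URL that the destination posts to."""
    return f"{dce}/dataCollectionRules/{dcr_id}/streams/{stream}?api-version={api_version}"

print(ingestion_url(DCE_ENDPOINT, DCR_IMMUTABLE_ID, STREAM_NAME))
```

Cribl authenticates to this URL with a bearer token obtained using the App ID and secret from the app registration, which is why those values are needed in the destination configuration.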

From the Quick Connect screen, we click “Add Destination” and then select Sentinel:

mahmoudmsft_1-1719824058751.png

mahmoudmsft_0-1719824115889.png

Here we fill in the ingestion API details, such as the DCE endpoint and the DCR immutable ID:

mahmoudmsft_1-1719824143474.png

Under the next tab, we fill in the App ID and App secret obtained from the Microsoft app registration:

mahmoudmsft_2-1719824173308.png

2. Add a syslog data source

Going back to Quick Connect, we add a new syslog source:

mahmoudmsft_3-1719824221962.png

Add a new syslog source:

mahmoudmsft_4-1719824242638.png

Here we configure the syslog port number to listen on. I have chosen port 9514:

mahmoudmsft_5-1719824270786.png

 

Once the syslog data source is added, we can go ahead and capture live data to see what it looks like.

For the demo purposes of this blog post, I have used the following logger command to send a mock syslog message (note that -n takes the address of the Cribl listener; the host below is a placeholder):

logger -n <cribl-worker-host> -P 9514 --rfc3164 "0|Cribl-test|MOCK|common=event-format-test|end|TRAFFIC|1|rt=$common=event-formatted-receive_time"
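If logger is not available, the same kind of mock event can be pushed with a few lines of Python. This is a simplified sketch: the PRI and timestamp follow RFC 3164 loosely, and the listener address is an assumption matching the port chosen above.

```python
import socket
from datetime import datetime

def rfc3164_message(facility: int, severity: int, host: str, tag: str, msg: str) -> str:
    """Build a (simplified) RFC 3164 syslog line; PRI = facility * 8 + severity."""
    pri = facility * 8 + severity
    timestamp = datetime.now().strftime("%b %d %H:%M:%S")  # simplified timestamp
    return f"<{pri}>{timestamp} {host} {tag}: {msg}"

# facility 1 (user) and severity 5 (notice) mirror logger's defaults.
line = rfc3164_message(1, 5, "demo-host", "logger",
                       "0|Cribl-test|MOCK|common=event-format-test|end|TRAFFIC|1")

# Send over UDP to the listener configured above (host/port are assumptions).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(line.encode(), ("127.0.0.1", 9514))
sock.close()
```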

After running the above logger command, the data fields look as shown in the following screenshot when using the live data capture feature at the source:

mahmoudmsft_6-1719824335848.png

Now I'm going to add hard-coded fields to the incoming stream, which is useful in scenarios where a dedicated syslog pipeline is required for each syslog source (a 1:1 mapping).

3. Add new fields to incoming events

mahmoudmsft_7-1719824377217.png

And we can capture again to see the result of the newly added fields:

mahmoudmsft_8-1719824395767.png
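Conceptually, this step merges a set of static key/value pairs into every incoming event. A minimal sketch of the idea (the field names and values below are made up for illustration, not taken from the demo):

```python
# Hypothetical hard-coded fields attached to every event from this source.
STATIC_FIELDS = {"vendor": "Cribl-test", "environment": "demo"}

def add_fields(event: dict, fields: dict) -> dict:
    """Return a copy of the event with the static fields merged in."""
    enriched = dict(event)      # copy so the original event is untouched
    enriched.update(fields)
    return enriched

event = {"_raw": "<13>Jul  2 10:00:00 demo-host logger: mock message"}
print(add_fields(event, STATIC_FIELDS))
```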

Now that we have data coming in, we can do some light data mapping to map incoming fields to the columns of the standard Sentinel Syslog table. For this, we have two options:

A) Create your own pipeline transformation

B) Use an existing Cribl Pack

 

  

I have created a new pipeline with two functions: the first renames some fields, and the second drops some fields entirely. As shown on the right-hand side, all changes are displayed in the standard pink/green colors against sample data:

mahmoudmsft_9-1719824427014.png

And now the whole pipeline is ready:

mahmoudmsft_10-1719824454056.png
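The behavior of those two functions can be sketched as follows. The rename map targets standard Sentinel Syslog table columns (Computer, SeverityLevel, SyslogMessage), while the incoming field names and the dropped fields are assumptions for illustration:

```python
# Hypothetical rename map: incoming field name -> Sentinel Syslog table column.
RENAMES = {"host": "Computer", "severity": "SeverityLevel", "message": "SyslogMessage"}
# Hypothetical fields to drop before ingestion.
DROPS = {"_time_raw", "internal_id"}

def run_pipeline(event: dict) -> dict:
    """Apply the Rename function, then the Drop function, to one event."""
    renamed = {RENAMES.get(k, k): v for k, v in event.items()}
    return {k: v for k, v in renamed.items() if k not in DROPS}

event = {"host": "demo-host", "message": "mock message", "internal_id": "abc"}
print(run_pipeline(event))
```

In Cribl itself, both steps are configured in the GUI rather than coded, but the effect on each event is the same.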

Now, using the same logger command, we can see how data lands in Sentinel:

mahmoudmsft_11-1719824471581.png

Cribl Stream Packs Dispensary

To reduce the complexity of creating processing pipelines with transformation capabilities, especially in large organizations, Cribl offers many built-in processing packs that make it easy and quick to onboard several data sources.

As mentioned in this Cribl document, packs include:

  • Routes (Pack-level)
  • Pipelines (Pack-level)
  • Functions (built-in and custom)
  • Sample data files

mahmoudmsft_2-1719840549863.png

Specifically for Microsoft Sentinel, there are several packs available. The following are some of them:

mahmoudmsft_3-1719840753073.png

If we go ahead and import the Microsoft Sentinel pack, we see that it consists of functions covering data from sources such as Palo Alto, Cisco ASA, Fortinet, and Windows Event Forwarding. All of that is built in and, more importantly, fully customizable within a few clicks. It's also worth noting that within the same imported pack, data is automatically detected and forwarded to different Sentinel tables, such as the Syslog, CommonSecurityLog, and WindowsEvent tables.

mahmoudmsft_4-1719840948280.png
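The routing inside such a pack can be thought of as a per-event table decision. This is a deliberately simplified sketch of the idea, not the pack's actual detection rules:

```python
def route_event(raw: str) -> str:
    """Pick a Sentinel destination table for one raw event (simplified heuristic)."""
    if "CEF:" in raw:
        return "CommonSecurityLog"   # CEF events, e.g. Palo Alto, Cisco ASA, Fortinet
    if "EventID=" in raw:
        return "WindowsEvent"        # forwarded Windows events
    return "Syslog"                  # everything else stays plain syslog

print(route_event("CEF:0|Palo Alto Networks|PAN-OS|...|TRAFFIC|1|"))
```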

Cribl Stream packs can be found here.

By now it should be clear how Cribl can help in Sentinel migration scenarios: its fast configuration, its easy interface, and the choice between running Cribl as a cloud instance or self-hosted (on-premises or in cloud VMs) make it a good option.

Thanks

 

This article was originally published by Microsoft's Sentinel Blog. You can find the original article here.