Using Automation Runbook Webhooks To Alert on Databricks Status Updates


This guide walks you through the process of setting up and utilizing webhooks to receive Databricks status alerts, process them using an Azure Automation Runbook, and trigger notifications to administrators about the status event.


Before diving into the process, ensure you have the following prerequisites in place: an Azure subscription, an Azure Automation account, and a Log Analytics workspace.

Process Involved

To subscribe to Databricks status alerts via webhooks, begin by creating a webhook within an Azure Automation Runbook. This webhook generates an HTTP endpoint to receive alerts. Next, navigate to the Databricks status page and subscribe by providing the webhook URL; when Databricks events occur, they will be sent to this URL. Lastly, use a PowerShell script within the Runbook to process incoming alerts and log relevant information into Azure Log Analytics. Each log entry can then trigger alerts via Azure Monitor, notifying administrators about Databricks events in near real time.


Steps to Subscribe to Databricks Status Alerts via Webhooks

1. Create a Webhook in Azure Automation Runbook

The first step involves creating a webhook in Azure Automation Runbook. A webhook provides an HTTP endpoint that allows external services, in this case, Databricks, to send data to Azure Automation.

  1. Go to the Azure portal.
  2. Navigate to your Azure Automation account.
  3. Create a new Runbook or select an existing one.
  4. Inside the Runbook, create a webhook.
  5. As seen in the figure, note down the webhook URL; it is displayed only once, at creation time, and will be required when subscribing to Databricks alerts.
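If you prefer scripting over the portal, the webhook can also be created with the Az.Automation PowerShell module. The sketch below is illustrative only; the resource group, Automation account, Runbook, and webhook names are placeholders you would replace with your own:

```powershell
# Create a webhook on an existing Runbook (placeholder names throughout).
# The WebhookURI in the returned object is shown only once - save it.
$webhook = New-AzAutomationWebhook `
    -Name "DatabricksStatusWebhook" `
    -RunbookName "Process-DatabricksAlert" `
    -ResourceGroupName "my-resource-group" `
    -AutomationAccountName "my-automation-account" `
    -IsEnabled $true `
    -ExpiryTime (Get-Date).AddYears(1) `
    -Force

$webhook.WebhookURI
```

Note that webhooks have a mandatory expiry time; the one-year expiry above is an arbitrary choice for the example.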


2. Subscribe to Databricks Status Alerts

With the webhook URL in hand, you can now subscribe to Databricks alerts to start receiving status updates.

  1. Visit the Azure Databricks Status Page.
  2. In the “Subscribe to Updates” section, pick the Webhook option, as seen below.
  3. Paste the Azure Automation Runbook webhook URL you generated earlier.
  4. Choose the events you want to receive alerts for (e.g., incidents, maintenance).
  5. Confirm your subscription and receipt via email.


3. PowerShell Script for Processing Alerts

Now that you are subscribed to Databricks alerts and they are being sent to the Automation Runbook webhook, you need to process these alerts and store them in Azure Log Analytics. This can be achieved with PowerShell scripting in an Azure Automation Runbook.

The following script parses a JSON payload matching the structure of the Databricks status events outlined in the provided link. Fill in the $customerId and $SharedKey variables with the Workspace ID and Primary key found in the Agents section of the Log Analytics workspace.

param (
    [Parameter(Mandatory = $false)]
    [object] $WebHookData
)

# Replace with your Workspace ID
$customerId = ""

# Replace with your Primary Key
$SharedKey = ""

# Specify the name of the record type that you'll be creating
$LogType = "DatabricksStatusAlerts"

# Optional name of a field that includes the timestamp for the data.
# If the time field is not specified, the ingestion time is assumed.
$TimeStampField = ""

# Create the function to create the authorization signature
Function Build-Signature ($customerId, $sharedKey, $date, $contentLength, $method, $contentType, $resource)
{
    $xHeaders = "x-ms-date:" + $date
    $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource

    $bytesToHash = [Text.Encoding]::UTF8.GetBytes($stringToHash)
    $keyBytes = [Convert]::FromBase64String($sharedKey)

    $sha256 = New-Object System.Security.Cryptography.HMACSHA256
    $sha256.Key = $keyBytes
    $calculatedHash = $sha256.ComputeHash($bytesToHash)
    $encodedHash = [Convert]::ToBase64String($calculatedHash)
    $authorization = 'SharedKey {0}:{1}' -f $customerId, $encodedHash
    return $authorization
}

# Create the function to create and post the request
Function Post-LogAnalyticsData($customerId, $sharedKey, $body, $logType)
{
    $method = "POST"
    $contentType = "application/json"
    $resource = "/api/logs"
    $rfc1123date = [DateTime]::UtcNow.ToString("r")
    $contentLength = $body.Length
    $signature = Build-Signature `
        -customerId $customerId `
        -sharedKey $sharedKey `
        -date $rfc1123date `
        -contentLength $contentLength `
        -method $method `
        -contentType $contentType `
        -resource $resource
    $uri = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"

    $headers = @{
        "Authorization"        = $signature;
        "Log-Type"             = $logType;
        "x-ms-date"            = $rfc1123date;
        "time-generated-field" = $TimeStampField;
    }

    $response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing
    return $response.StatusCode
}

if ($WebHookData) {
    # Body of the message.
    Write-Output 'The Request Body'
    Write-Output $WebHookData.RequestBody

    # Convert the body data from JSON
    $JsonPayload = ConvertFrom-Json -InputObject $WebHookData.RequestBody

    # View the full body data
    Write-Output 'The Full Body Data'
    Write-Output $JsonPayload

    $JsonString = $JsonPayload | ConvertTo-Json -Compress
    $json = @"
{
    "JSONPayload": $JsonString,
    "DateValue": "2019-09-12T20:00:00.625Z",
    "GUIDValue": "9909ED01-A74C-4874-8ABF-D2678E3AE23D"
}
"@

    # Submit the data to the API endpoint
    Post-LogAnalyticsData -customerId $customerId -sharedKey $SharedKey -body ([System.Text.Encoding]::UTF8.GetBytes($json)) -logType $LogType
}
else {
    Write-Output 'No data received'
}
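For reference, the authorization header produced by Build-Signature follows the Log Analytics HTTP Data Collector API scheme: an HMAC-SHA256 over a canonical string of method, content length, content type, x-ms-date header, and resource path. The same computation can be sketched in Python; the workspace ID and key below are hypothetical placeholders, not real credentials:

```python
import base64
import hashlib
import hmac

def build_signature(customer_id, shared_key, date, content_length,
                    method, content_type, resource):
    """Recreate the Data Collector API signature (same scheme as Build-Signature)."""
    x_headers = "x-ms-date:" + date
    string_to_hash = f"{method}\n{content_length}\n{content_type}\n{x_headers}\n{resource}"
    bytes_to_hash = string_to_hash.encode("utf-8")
    decoded_key = base64.b64decode(shared_key)  # Primary key is base64-encoded
    digest = hmac.new(decoded_key, bytes_to_hash, digestmod=hashlib.sha256).digest()
    encoded_hash = base64.b64encode(digest).decode()
    return f"SharedKey {customer_id}:{encoded_hash}"

# Hypothetical workspace ID and key, for illustration only
auth = build_signature(
    customer_id="00000000-0000-0000-0000-000000000000",
    shared_key=base64.b64encode(b"not-a-real-key").decode(),
    date="Mon, 01 Jan 2024 00:00:00 GMT",
    content_length=123,
    method="POST",
    content_type="application/json",
    resource="/api/logs",
)
print(auth)
```

This can be handy when debugging 403 responses from the API: if the PowerShell and Python signatures disagree for the same inputs, the canonical string is being built incorrectly.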

4. Set Up Alerts and Notifications

The last step involves setting up alerts triggered by the processed Databricks status updates. You can achieve this by creating an alert that activates when new log entries are detected in the Azure Log Analytics workspace.

  1. In the Azure portal, navigate to the Log Analytics workspace that the custom logs are written to.
  2. Create a new Alert Rule, as shown in the figure below.
  3. Define the signal logic to trigger the alert (e.g., when a new log entry with a specific status event is detected).
  4. Configure the alert to use an Action Group that notifies administrators via email, SMS, or any other preferred method.
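For step 3, a simple log query can serve as the signal logic. Assuming the $LogType used in the processing script above, a query along these lines would fire whenever new records arrive (Log Analytics surfaces custom log types as tables with a _CL suffix):

```kusto
DatabricksStatusAlerts_CL
| where TimeGenerated > ago(5m)
```

Configure the alert rule to trigger when the number of results is greater than 0 over your chosen evaluation window.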


Verifying Webhook Functionality: Testing Webhook with Sample JSON Payload

To ensure the functionality of your webhook, you can use the PowerShell script below to send a JSON payload. Insert the actual URL of your webhook endpoint where shown. This script sends the JSON payload, and a “200 OK” status indicates that the webhook is correctly configured and operational.

# Replace with the webhook URL generated in step 1
$uri = ''

$headers = @{
    "Content-Type" = "application/json"
}

$jsonPayload = @"
{
    "title": "Server Upgrades",
    "current_status": "Planned Maintenance",
    "components_affected": [
        { "component": "551ed627b556f14210000005", "container": "551ed5ac590f5a3b10000006" },
        { "component": "551ed627b556f14210000005", "container": "551ed5b1c9f9404110000005" }
    ],
    "components": [
        { "name": "Chat Service" },
        { "name": "East Server" },
        { "name": "West Server" }
    ],
    "details": "We've completed upgrades for all East Servers. No issues so far. Moving on to West Servers next. Updates to follow."
}
"@

$response = Invoke-WebRequest -Method Post -Uri $uri -Headers $headers -Body $jsonPayload -UseBasicParsing
$response.StatusCode

In the Jobs section of the Runbook, a new entry should move from Queued to Completed. Open the entry and confirm there are no errors. The output in the Logs section should show a 200 status code, indicating the log entry was written to the Log Analytics workspace.



From the Log Analytics workspace, confirm entries appear in Custom Logs under the record type “DatabricksStatusAlerts”.
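A quick KQL query retrieves the most recent entries. The table name assumes the $LogType set in the processing script; custom log tables carry a _CL suffix:

```kusto
DatabricksStatusAlerts_CL
| sort by TimeGenerated desc
| take 10
```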




Subscribing to Databricks status alerts via webhooks is an effective way to stay informed about the operational health of your critical services. By following the steps outlined in this guide, you can seamlessly set up the entire process, from creating webhooks to processing alerts and triggering notifications. This proactive approach empowers administrators to respond promptly to any potential issues, ensuring the smooth functioning of your Azure Databricks environment.


The samples are not supported under any Microsoft standard support program or service. The samples are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the samples and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.


This article was originally published by Microsoft's Azure Blog. You can find the original article here.