A couple of months ago, I wrote a blog about Azure Data Explorer dashboards. Since then, the dashboards came out of public preview and are now generally available. It is even possible to create an Azure Data Explorer cluster using ARM or Bicep templates. The only thing lacking is the ability to integrate dashboards into Azure DevOps to enable Continuous Integration / Continuous Delivery. Or is it?

**UPDATE 04-07-2023**
I got in contact with Microsoft, and the ADX dashboard API is officially not (yet) supported.

Research

After a question on Microsoft Q&A about this topic, I did a little research, because the Azure Data Explorer (ADX) portal offers the possibility to export and import dashboards:

Using the developer tools of my browser, I found the requests sent by the portal. They were just GET and POST requests with a bearer token for authentication.

GET

To check whether it would work from an Azure DevOps pipeline, I first created a Postman GET request. I started with the basics: a URL and the bearer token. The dashboard ID can be found in the URL when you browse to the dashboard in the ADX portal, or in the developer tools of your browser. The bearer token can be found in the request headers in the developer tools. It would be better to retrieve the bearer token from the Azure API, but for this test, this is enough.
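The request itself is minimal. As an illustration (in Python rather than Postman; the endpoint shape matches what the portal uses, but the dashboard ID and token below are placeholders), this is all that is involved in building it:

```python
from urllib.request import Request

# Placeholder values; in practice these come from the ADX portal URL
# and the request headers shown in the browser developer tools.
dashboard_id = "00000000-0000-0000-0000-000000000000"
bearer_token = "<token copied from the browser request headers>"

# Build the GET request against the dashboards endpoint
url = f"https://dashboards.kusto.windows.net/dashboards/{dashboard_id}"
request = Request(url, method="GET")
request.add_header("Authorization", f"Bearer {bearer_token}")

print(request.full_url)
```

Sending this request (with a real ID and token) returns the dashboard definition as JSON.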

And right on the first try it worked: the response contained the dashboard as JSON.

POST

The next step was to send the JSON-formatted dashboard back to the API. So I went back to the developer tools of the browser and found a similar request, but in this case it was a PUT request. I duplicated the Postman GET request and changed it to a PUT request. To send the dashboard to the API, I added the JSON response of the GET request as the body of this PUT request. In Postman, it is important to change the format of the body to JSON (indicated with the arrow in the screenshot).

And again, on the first try it worked. But on the second try, it failed with the message:

After some investigation, I saw the key “eTag” in the JSON response.

From my experience with other APIs, I know that such eTags are used for optimistic concurrency. If you do a GET request and the eTag is “abc123”, then the PUT request should contain that same eTag value “abc123”. When the PUT request has been processed, the server assigns a new eTag. This way, it is not possible to accidentally overwrite a newer version.

After I changed the eTag in the body of the second request, the third attempt was a success. So it is important to retrieve the current eTag before we upload a dashboard to the portal.
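The behaviour I observed can be summarised in a small simulation. This is a toy stand-in for the dashboards API (written in Python purely for illustration; the eTag semantics are my interpretation of the responses above, not documented behaviour):

```python
import uuid

class DashboardStore:
    """Toy stand-in for the dashboards API, illustrating the
    eTag-based optimistic concurrency observed in the responses."""

    def __init__(self):
        self._doc = {"title": "My dashboard"}
        self._etag = str(uuid.uuid4())

    def get(self):
        # GET returns the document together with its current eTag
        return {**self._doc, "eTag": self._etag}

    def put(self, body):
        # PUT is rejected unless the caller sends back the current eTag
        if body.get("eTag") != self._etag:
            raise ValueError("eTag mismatch: fetch the latest version first")
        self._doc = {k: v for k, v in body.items() if k != "eTag"}
        self._etag = str(uuid.uuid4())  # the server assigns a fresh eTag
        return self.get()

store = DashboardStore()
doc = store.get()
doc["title"] = "Renamed dashboard"
store.put(doc)      # succeeds: the eTag still matches
try:
    store.put(doc)  # fails: the eTag is stale after the first PUT
except ValueError as e:
    print(e)
```

This is why the second Postman PUT failed: it still carried the eTag from before the first PUT.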

Interim conclusion

It is possible to export and import dashboards using Postman, so it should also be possible to integrate this into Azure DevOps. In the next part of this article, I will integrate the requests into an Azure DevOps pipeline.

Integration into Azure DevOps

To integrate the requests into Azure DevOps, I will be using PowerShell.

So first, I created the PowerShell script below. In this example, I use the body of the retrieved dashboard and upload it again. In a real-life situation, you would use a JSON-formatted dashboard from your repository. To do so, you will need to edit the PowerShell script, but don't remove the retrieval of the eTag, because we need it to send the PUT request to the portal.

# ------------------------------------------------
# Parameters
# ------------------------------------------------
param(
    [Parameter(Mandatory=$true)]
    [String]$dashboardid,
    [Parameter(Mandatory=$true)]
    [String]$token
)

$eTag = ''
$body = ''
$uri = "https://dashboards.kusto.windows.net/dashboards/" + $dashboardid
$headers = @{
    'Authorization' = $token   # expects the full header value, e.g. "Bearer <token>"
    'Content-Type'  = 'application/json'
}

# ------------------------------------------------
# GET request to retrieve the eTag
# ------------------------------------------------
    
try
{
    $response = Invoke-RestMethod -Method Get -Uri "$($uri)" -Headers $headers
    if ($response)
    {    
        $body = $response
        $eTag = $response.eTag
        Write-Host('eTag: ' + $eTag)
    }
}
catch
{
    Write-Host "Exception details GET: "
    $e = $_.Exception
    Write-Host ("`tMessage: " + $e.Message)
    Write-Host ("`tStatus code: " + $e.Response.StatusCode)
    Write-Host ("`tStatus description: " + $e.Response.StatusDescription)

    Write-Host "`tResponse: " -NoNewline
    $memStream = $e.Response.GetResponseStream()
    $readStream = New-Object System.IO.StreamReader($memStream)
    while ($readStream.Peek() -ne -1) {
        Write-Host $readStream.ReadLine()
    }
    $readStream.Dispose();
}


# ------------------------------------------------
# Change eTag in body
# ------------------------------------------------
$body.eTag = $eTag


# ------------------------------------------------
# PUT request to export the dashboard
# ------------------------------------------------
$jsonBody = $body | ConvertTo-Json -Depth 100 
try
{
    $response = Invoke-RestMethod -Method Put -Uri "$($uri)" -Headers $headers -Body $jsonBody
    if ($response.error)
    {    
        throw ('Error: ' + $response.error)
    } else {
         Write-Host "Success"
    }
}
catch
{
    Write-Host "Exception details PUT: "
    $e = $_.Exception
    Write-Host ("`tMessage: " + $e.Message)
    Write-Host ("`tStatus code: " + $e.Response.StatusCode)
    Write-Host ("`tStatus description: " + $e.Response.StatusDescription)

    Write-Host "`tResponse: " -NoNewline
    $memStream = $e.Response.GetResponseStream()
    $readStream = New-Object System.IO.StreamReader($memStream)
    while ($readStream.Peek() -ne -1) {
        Write-Host $readStream.ReadLine()
    }
    $readStream.Dispose();
}  

The best way to integrate this script into Azure DevOps is to save it to your repository and use it from a pipeline. So I created a repository, added the PowerShell script and created a pipeline script:

stages:
  - stage: general
    displayName: General
    jobs:
      - job: deploy_dashboard 
        displayName: Deploy dashboard 
        steps:
          - task: PowerShell@2
            displayName: 'PowerShell task'
            inputs:
              targetType: 'FilePath'
              filePath: 'Powershell/UploadDashboard.ps1'
              arguments: '-dashboardid "<Dashboardid>" -token "<Bearer token>"'

In this example, I used a hard-coded bearer token in the pipeline. It is better to retrieve the token through the Azure API; on this Microsoft page you will find more information about that.
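For a service principal, the standard Azure AD client-credentials flow is one way to obtain such a token. The sketch below (in Python, for illustration) only builds the token request; all values are placeholders, and the exact scope for the dashboards API is an assumption on my side, not something Microsoft documents:

```python
from urllib.parse import urlencode

# Placeholder values for a service principal; replace with your own.
tenant_id = "<tenant id>"
client_id = "<service principal application id>"
client_secret = "<service principal secret>"

# Standard Azure AD v2.0 token endpoint for the client-credentials flow
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
body = urlencode({
    "grant_type": "client_credentials",
    "client_id": client_id,
    "client_secret": client_secret,
    # Assumed scope for the dashboards endpoint; verify before relying on it
    "scope": "https://dashboards.kusto.windows.net/.default",
})

# POSTing `body` (form-encoded) to `token_url` returns a JSON payload whose
# access_token field can be passed to the script as -token "Bearer <...>"
print(token_url)
```

In a pipeline, the resulting token should come from a secret variable or service connection rather than being hard-coded.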

After I committed the files into the repository, I started the pipeline and it worked:

Conclusion

After a little bit of research, it turned out to be possible to deploy an Azure Data Explorer dashboard using Azure DevOps. By keeping the dashboard in a repository and deploying it through Azure DevOps, we can take advantage of all the benefits of version control and Continuous Integration / Continuous Delivery.