How to configure Power BI to regularly update data from an upgraded Log Analytics workspace

A couple of excellent questions were asked regarding the first blog post in this series, which discussed using Power BI to gather information from an upgraded Log Analytics workspace.

“How do I authenticate to schedule a refresh of data in Power BI for my particular datasets from queries? I have tried all of the options, and all failed. Not sure what I’m doing wrong. Any pointers?” – My attempt to address this is in the “Scheduling the refresh of the Log Analytics data in Power BI” section.

“I see the limit of records is set at 2,000. When I enter the number of records I require (25,000), I get a “502 (Bad Gateway)” error in Power BI as soon as I click “Done” in the advanced query editor. Is there a timeout I need to adjust here to allow a little longer for the data to load? Are there limitations on how many records I can query?” – My attempts to address this question are included in the “Updating the query with an increased # of records” and “Updating the query to return only one day of data” sections of this blog post.

Scheduling the refresh of the Log Analytics data in Power BI:

Once you have your query in place, you need to provide credentials as we discussed in the previous blog post.
To schedule this data to update, we need to publish it next.

In my example I will publish it to my workspace in Power BI.

From here we switch to Power BI on the web, open “My Workspace,” and find the name of the dataset which was published in the Datasets section.

Use the “Schedule Refresh” option to set when you would like the data to update.

From here I needed to edit my credentials and set the authentication method to OAuth2 to authenticate properly.

I then logged in with my account credentials for the OMS workspace.

Now we can determine when we want this to update (either daily, or weekly):

You can also add more times to have it refresh more than once on a daily basis.

Right-clicking on the dataset shows when the most recent refresh was and when the next one is scheduled to occur. You can also click on the ellipses and choose to “Refresh Now”.

Updating the query with an increased # of records:

The default query returns 2,000 records. This value can be increased, depending on how large the records you return are. For my query I was able to raise it to 18,000 before receiving this error message.

This appears to be due to a hard limit of 8 MB on your data size: https://stackoverflow.com/questions/41869170/query-from-powerbi-to-ai-suddenly-fails-with-502-bad-gateway/41878025#41878025
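As a rough back-of-the-envelope check (my own assumption, not an official formula), you can estimate whether a result set will stay under that 8 MB cap by multiplying the record count by an approximate per-record size:

```python
# Rough estimate of whether a Log Analytics result set stays under the
# ~8 MB response cap described above. The average bytes-per-record figure
# is hypothetical -- measure your own data to get a realistic value.

RESPONSE_CAP_BYTES = 8 * 1024 * 1024  # ~8 MB hard limit on response size

def fits_under_cap(record_count, avg_record_bytes):
    """Return True if the estimated payload stays under the response cap."""
    return record_count * avg_record_bytes < RESPONSE_CAP_BYTES

# With an assumed average of ~450 bytes per record, 18,000 records squeeze
# under the cap but 25,000 records do not -- consistent with the behavior above.
print(fits_under_cap(18_000, 450))  # True
print(fits_under_cap(25_000, 450))  # False
```

This is only a sanity check; the real limit applies to the serialized response, so the safest fix is still to reduce the time range as shown below.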

Updating the query to return only one day of data:

To minimize the amount of data returned, I changed from the original query to a time-restricted query which is refreshed on a schedule as discussed above.

  • Original query: Perf | where CounterName == "% Processor Time" | summarize AggregatedValue = avg(CounterValue) by Computer
  • Time-restricted query: Perf | where CounterName == "% Processor Time" and TimeGenerated >= ago(1d) | summarize AggregatedValue = avg(CounterValue) by Computer
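If even one day of data is too large, the same idea extends to splitting a longer range into fixed daily windows, each fetched with its own time-restricted query. A minimal sketch (the query text mirrors the bullets above; the windowing helper and explicit datetime() bounds are my own illustration, not something OMS provides):

```python
from datetime import datetime, timedelta

# Template mirroring the time-restricted query above, but with an explicit
# window instead of ago(1d). datetime(...) is the KQL literal syntax.
QUERY_TEMPLATE = (
    'Perf | where CounterName == "% Processor Time" '
    'and TimeGenerated >= datetime({start}) and TimeGenerated < datetime({end}) '
    '| summarize AggregatedValue = avg(CounterValue) by Computer'
)

def daily_queries(start, days):
    """Yield one query per day so each result set stays well under the size cap."""
    for i in range(days):
        lo = start + timedelta(days=i)
        yield QUERY_TEMPLATE.format(start=lo.isoformat(),
                                    end=(lo + timedelta(days=1)).isoformat())

queries = list(daily_queries(datetime(2017, 9, 1), 3))
print(len(queries))  # 3
```

Each of the three generated queries covers a single day, so no single refresh has to pull the whole range at once.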

To validate that the scheduling was working, I looked at the “TimeGenerated” field in the results. When the data was originally gathered, TimeGenerated went up to 9/5/2017 2:00:00 PM.

After a successful refresh of the data the TimeGenerated field shows more recent data.

(*Note: you may have to close and re-open Power BI web to see whether the data has refreshed*)
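The manual check above boils down to a simple rule: the refresh succeeded if the newest TimeGenerated value is recent. A small illustrative sketch (the timestamps echo the example above; the one-day freshness threshold is my assumption):

```python
from datetime import datetime, timedelta

def refresh_looks_healthy(time_generated, now, max_age=timedelta(days=1)):
    """Treat the refresh as successful if the newest record is recent enough."""
    return max(time_generated) >= now - max_age

now = datetime(2017, 9, 6, 15, 0)

# Before the refresh the newest record was 9/5/2017 2:00:00 PM -- stale.
stale = [datetime(2017, 9, 4, 14, 0), datetime(2017, 9, 5, 14, 0)]
print(refresh_looks_healthy(stale, now))  # False

# After a successful refresh, newer records appear.
fresh = stale + [datetime(2017, 9, 6, 14, 0)]
print(refresh_looks_healthy(fresh, now))  # True
```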

Below is the query which is provided by OMS for Power BI:


The exported Power Query Formula Language (M Language) can be used with Power Query in Excel and Power BI Desktop.

For Power BI Desktop follow the instructions below:

1) Download Power BI Desktop from

2) In Power BI Desktop select: ‘Get Data’ -> ‘Blank Query’->’Advanced Query Editor’

3) Paste the M Language script into the Advanced Query Editor and select ‘Done’


let AnalyticsQuery =
let Source = Json.Document(Web.Contents("",
[Query=[#"query"="Perf | where CounterName == ""% Processor Time"" and TimeGenerated >= ago(1d) | summarize AggregatedValue = avg(CounterValue) by Computer, TimeGenerated | limit 2000",#"x-ms-app"="OmsAnalyticsPBI",#"prefer"="ai.response-thinning=true"],Timeout=#duration(0,0,4,0)])),
TypeMap = #table(
{ "AnalyticsTypes", "Type" },
{
{ "string", Text.Type },
{ "int", Int32.Type },
{ "long", Int64.Type },
{ "real", Double.Type },
{ "timespan", Duration.Type },
{ "datetime", DateTimeZone.Type },
{ "bool", Logical.Type },
{ "guid", Text.Type }
}),
DataTable = Source[tables]{0},
Columns = Table.FromRecords(DataTable[columns]),
ColumnsWithType = Table.Join(Columns, {"type"}, TypeMap, {"AnalyticsTypes"}),
Rows = Table.FromRows(DataTable[rows], Columns[name]),
Table = Table.TransformColumnTypes(Rows, Table.ToList(ColumnsWithType, (c) => { c{0}, c{3}}))
in
Table
in AnalyticsQuery
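Under the covers, the M script above just sends the KQL query to a Log Analytics web endpoint and decodes the JSON that comes back. As a rough sketch of assembling the same kind of request by hand against the public Log Analytics query API (the endpoint path is my assumption based on the public v1 API, and the workspace ID and token are placeholders; verify the details against current Microsoft documentation):

```python
import json

WORKSPACE_ID = "<your-workspace-id>"   # placeholder -- substitute your own
ACCESS_TOKEN = "<access-token>"        # placeholder -- acquired via OAuth2

# Assumed public Log Analytics query endpoint; confirm against Microsoft docs.
URL = f"https://api.loganalytics.io/v1/workspaces/{WORKSPACE_ID}/query"

def build_query_request(kql):
    """Assemble the URL, headers, and JSON body for a query request."""
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "x-ms-app": "OmsAnalyticsPBI",  # same app tag the M script sends
    }
    return URL, headers, json.dumps({"query": kql})

url, headers, body = build_query_request(
    'Perf | where CounterName == "% Processor Time" and TimeGenerated >= ago(1d) '
    '| summarize AggregatedValue = avg(CounterValue) by Computer | limit 2000'
)
print(url.startswith("https://api.loganalytics.io/v1/"))  # True
```

This only builds the request; actually issuing it (and handling paging, throttling, and token refresh) is left out to keep the sketch minimal.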

Summary: To update your Log Analytics data on a scheduled basis, publish the report and then schedule the dataset to refresh on the schedule you would like. To avoid “502 Bad Gateway” errors, limit the time range of your query so that it returns less data, and run it more frequently. Thank you to Erik Skov, who was my “phone a friend” for this issue!

