Get audit logs, filter by workload and then write the logs to a SharePoint list


This post is a continuation of the blog article Office 365 Management Activity API with Power Automate – Part Two and is part of a three-part series. This series shows you how to use the Office 365 Management Activity API (aka the Microsoft 365 Audit Logs) with Microsoft Power Automate (Flow) and then write the logs to SharePoint.

This blog post will show you how to retrieve Office 365 Management Activity API audit logs from the API subscription created earlier, then how to parse the audit logs from JSON and filter them by a workload, i.e. MicrosoftFlow. Finally, I will show you how to write them to a SharePoint list.

The blog series

NEW BLOG/UPDATED SOLUTION (deals with pagination and the refresh token, and provides templates to deploy the solution in Power Automate & Logic Apps)

Get Audit Logs


Modifying existing Flow

We now need to modify the existing Flow we have created in Power Automate: add an array variable to store all of the retrieved logs, plus two Compose actions which each hold a time/date calculation so only logs from the last 24 hours are retrieved.

First we are going to add an Initialize variable action near the top of the Flow.

Then we are going to give the variable the name varLogsArray (or whatever you’d like to call it), change the type to Array. No value is required.

Next add a Compose action just under the array action. Place your cursor in the Inputs section, click the Expression header on the pop-out menu, then place your cursor in the Expression field and enter the following:

formatDateTime(subtractFromTime(utcNow(),24,'Hour'), 'yyyy-MM-ddTHH:mm')
This code deducts 24 hours from today's time (in UTC) and then provides the date and time in the format yyyy-MM-ddTHH:mm, i.e. 2020-01-05T11:38. This will be used as part of the URI to retrieve logs from the last 24 hours.

NOTE: Depending on which time zone you are in, you may need to use the convertTimeZone function instead, using the expression below and replacing New Zealand Standard Time with your time zone (see the Microsoft Time Zone link here for the string required):

convertTimeZone(subtractFromTime(utcNow(),24,'Hour'),'UTC','New Zealand Standard Time', 'yyyy-MM-ddTHH:mm')

Once complete, press OK/Update.

Now add another Compose action just below the startTime Compose action you've just added. This one is very similar except it is for endTime: it is simply the date/time of now, again formatted as yyyy-MM-ddTHH:mm.

formatDateTime(utcNow(), 'yyyy-MM-ddTHH:mm')

Again, depending on which time zone you are in, you may need to use the convertTimeZone function instead, using the expression below and replacing New Zealand Standard Time with your time zone (see the Microsoft Time Zone link here for the string required):

convertTimeZone(utcNow(),'UTC','New Zealand Standard Time', 'yyyy-MM-ddTHH:mm')

Add HTTP action – Get Content Blob

We are now going to add another HTTP action to check which content blobs were created during a certain period (in our case 24 hours). Add an HTTP action at the location shown in the image below and change the name of the action to HTTP – Get Content Blobs.

Note: Since I drafted this blog article in early 2020, Microsoft changed the previous action HTTP – Start Subscription so it no longer succeeds gracefully when a subscription already exists; instead the HTTP action fails with "AF20024 The subscription is already enabled. No property change."

In a production environment we would put some error handling below HTTP – Start Subscription, but instead we will go to the new HTTP – Get Content Blobs action, click the three dots in the right-hand corner of the action and select Configure run after from the dropdown.

Then select the has failed checkbox along with the is successful checkbox, so HTTP – Get Content Blobs will still run after HTTP – Start Subscription has failed.

The HTTP – Start Subscription action can now fail without stopping the actions below it from continuing. Next go back to HTTP – Get Content Blobs.

Set the Method dropdown to GET.

Set the URI field to:

https://manage.office.com/api/v1.0/

then add the Compose Tenant ID value to the URI field, then add to the URI field:

/activity/feed/subscriptions/content?contentType=Audit.General&startTime=

NOTE: Audit.General should be replaced with the contentType (i.e. Audit.SharePoint) you wish to obtain content blobs from AND for which you have previously registered a subscription.

Then add to the URI field the Compose startTime value we set up earlier in this article. Then add to the URI field:

&endTime=

Finally, add to the URI field the Compose endTime value we set up earlier in this article. The URI field should look like the below:

https://manage.office.com/api/v1.0/<Compose – Tenant ID>/activity/feed/subscriptions/content?contentType=Audit.General&startTime=<Compose – startTime>&endTime=<Compose – endTime>
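Assembled in one go, the URI follows the documented Management Activity API shape. As a Python sketch (the helper name is mine and the tenant ID/times are placeholders):

```python
def content_blobs_uri(tenant_id: str, content_type: str,
                      start_time: str, end_time: str) -> str:
    """Build the URI used by HTTP - Get Content Blobs: the documented
    /activity/feed/subscriptions/content endpoint with a contentType
    filter and a start/end time window."""
    return (
        f"https://manage.office.com/api/v1.0/{tenant_id}"
        f"/activity/feed/subscriptions/content"
        f"?contentType={content_type}&startTime={start_time}&endTime={end_time}"
    )

# Placeholder tenant ID and times for illustration only:
uri = content_blobs_uri("00000000-0000-0000-0000-000000000000",
                        "Audit.General", "2020-01-04T11:38", "2020-01-05T11:38")
print(uri)
```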

Next add Headers for the HTTP request; see the table below for the values. Ensure for the Authorization header that the word Bearer is entered, then a SPACE (this is important), and then the access_token from Parse JSON. Next add the Content-Type header and its value application/json.

Authorization	Bearer <Parse JSON – access_token>
Content-Type	application/json
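In code terms the two headers look like this (a Python sketch; the helper name is mine). The single space between Bearer and the token is the detail stressed above:

```python
def auth_headers(access_token: str) -> dict:
    """Headers for the Management Activity API calls: the word 'Bearer',
    one space, then the access_token, plus a JSON Content-Type."""
    return {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    }

print(auth_headers("<access_token from Parse JSON>")["Authorization"])
```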

The completed HTTP action should look like this (note I have renamed the action to HTTP – Get Content Blobs):

Now we are going to Test the Flow – click the Test button in the top right hand corner. Then from the right hand Test Flow menu that appears click I’ll perform the trigger action and then click Save & Test.

Now click on Run Flow

We can now see the results of the Flow, which ran successfully – go to the HTTP – Get Content Blobs action. Check the OUTPUTS to see that the Status code is 200 and the Body displays one or more content blobs in the format shown below. The contentUri property is the URI from which you can retrieve the content blob; there may be one or more contentUri URLs.

    "contentUri": "<Your Tenant ID>/activity/feed/audit/20200104053711428067061$20200105032332329011346$audit_general$Audit_General$apac0017",
    "contentId": "20200104053711428067061$20200105032332329011346$audit_general$Audit_General$apac0017",
    "contentType": "Audit.General",
    "contentCreated": "2020-01-05T03:23:32.329Z",
    "contentExpiration": "2020-01-11T05:37:11.428Z"

We now need to extract the contentUris from the Body section for use in the Flow. Place your cursor in the OUTPUTS Body section, press Ctrl+A to select all of the JSON in the Body section and then copy it to your clipboard with Ctrl+C.
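For reference, pulling the contentUri values out of that Body is a one-liner once the JSON is loaded – a Python sketch with a shortened, hypothetical response body (same shape as the sample above, values invented):

```python
import json

# Shortened, hypothetical response body from HTTP - Get Content Blobs:
body = json.loads("""
[
  {
    "contentUri": "https://manage.office.com/api/v1.0/tenant-id/activity/feed/audit/123$456$audit_general$Audit_General$apac0017",
    "contentId": "123$456$audit_general$Audit_General$apac0017",
    "contentType": "Audit.General",
    "contentCreated": "2020-01-05T03:23:32.329Z",
    "contentExpiration": "2020-01-11T05:37:11.428Z"
  }
]
""")

content_uris = [blob["contentUri"] for blob in body]
print(len(content_uris))  # one entry per content blob returned
```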

Now we are going to click on Edit to edit the Flow again

Now we are going to add a Parse JSON action to the Flow, which will split out the JSON fields (i.e. contentUri) so they are available to be used in later actions. Do this by clicking the + New step button, typing parse into the Choose an action menu and selecting the Parse JSON action.

In the Parse JSON action click Generate from sample

Paste in the Sample JSON Payload which is the OUTPUTS Body we just copied to the clipboard from the Flow Results for HTTP – Get Content Blobs. Then click Done.

The schema will now be generated in the Schema box. Next add the Body field from the previous HTTP action to the Content column.

Get each Content Blob individually

Next we are going to add another HTTP action to send a web request to each contentUri. Set the Method dropdown to GET. The URI field is the contentUri from the previous Parse JSON – Get Content Blobs action. When this is added, an Apply to each loop will automatically be created and the HTTP action will be moved into it in case there are multiple contentUris.

Next add Headers for the HTTP request; see the table below for the values. Ensure for the Authorization header that the word Bearer is entered, then a SPACE (this is important), and then the access_token from Parse JSON. Next add the Content-Type header and its value application/json.

Authorization	Bearer <Parse JSON – access_token>
Content-Type	application/json

The HTTP action has moved into an Apply to each loop.

Now we are going to Test the Flow – click the Test button and then click I’ll perform the trigger action and then click Save & Test on the Test Flow menu

Now click on Run Flow

We can now see the results of the Flow, which ran successfully – go to the HTTP – Get Content Blob action and expand it. Check the OUTPUTS to see that the Status code is 200 and the Body displays audit logs in one long line of JSON, like the image below.

The HTTP – Get Content Blob action runs separately for each contentUri that is available, so all of the available content blobs are downloaded in JSON format.

Filter Content Blobs for an M365 Workload

We are now going to filter the JSON body for a specific Microsoft 365 workload: MicrosoftFlow. See this link for all the different workloads available to filter on, i.e. SharePoint, Yammer, ExchangeItem etc.

To do this add a Filter array action by adding an action and then typing Filter then selecting the Filter array action

Add to the From section of Filter array the expression json(body('HTTP_-_Get_Content_Blob')) – the HTTP Body is returned as a string, so the json() function is needed to convert it to an array.

Next, in the left-hand box (circled blue above) of the Filter array condition, we are going to enter an Expression to filter on. Move your cursor into the box, select Expression from the in-window pop-out, enter item()?['Workload'] and then select OK.

Finally, add the workload you wish to filter the array for, i.e. MicrosoftFlow, by typing MicrosoftFlow in the right-hand box of the filter condition.

Each content blob returned will now be filtered for the specific workload, and the matching events are returned as JSON.
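The Filter array step boils down to a simple predicate – in Python terms (the two sample events below are invented for illustration):

```python
# Two invented sample audit events; only the first matches the filter
# item()?['Workload'] is equal to 'MicrosoftFlow'.
events = [
    {"Workload": "MicrosoftFlow", "Operation": "CreateFlow"},
    {"Workload": "SharePoint", "Operation": "FileDeleted"},
]

flow_events = [e for e in events if e.get("Workload") == "MicrosoftFlow"]
print([e["Operation"] for e in flow_events])  # ['CreateFlow']
```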

We are now going to add all of these JSON events to the array variable varLogsArray so there will be one big array of all the audit logs.

We will do this with an Apply to each action, as we need to loop through the filtered array and add its elements one by one to the array variable varLogsArray. Under the HTTP action, add an Apply to each action.

Add the Filtered Array body as the output for the new Apply to each action

Now add an Append to array variable action

Select the variable varLogsArray and then select Current item from the Apply to each 2 action

Next add a Compose action on its own at the bottom of the Flow and then select the variable varLogsArray.

We are now going to test the Flow to check the filtering for the workload MicrosoftFlow has been applied. Select the Test button in the top right hand corner.

Now select I’ll perform the trigger action and click Test

Now check that the bottom Compose action we just added has completed and the OUTPUTS show a JSON array. Select all of the JSON array from the Outputs text box and copy it to your clipboard.

This JSON array can then be used with a Parse JSON action. Select New step, type "parse" and then select the Parse JSON action.

Now in the Parse JSON action add in the Content section the variable varLogsArray. To generate the Schema we are going to use the output from the previous compose action we copied to the clipboard. Click the Generate from sample button.

Now paste the JSON array we copied a couple of steps earlier into the Insert a sample JSON Payload box.

The completed Parse JSON action will now look like the following – notice the Schema has now been generated.

Now that varLogsArray has been parsed, all the individual elements of the items in the JSON array are available to be used in subsequent Power Automate actions, i.e. adding the values to SharePoint lists, sending emails using the values, etc.

Test the Flow and then check in the results that Parse JSON is now retrieving some results in the OUTPUTS section

The filtered audit logs are now available for use in your Flow and you could use them with any number of Power Automate actions, i.e. send an email, create a SharePoint list item, request an approval etc. – the possibilities are endless!

Write Results to a SharePoint List

For the purpose of this blog I am going to quickly show that you can write these logs to a SharePoint List. The filtered audit log events will then be stored in a SharePoint list and you could then invite non-admin users to view the logs, use SharePoint to create views, write logs to Azure, create a new Flow with Power Automate on this list etc. Endless possibilities!

Create a list called Flow Audit Logs in a SharePoint site with the suggested fields and types in the image below. Make a note of the Site Url and List Name as these will be needed shortly in Power Automate.

Now the objective is to add the Audit logs we have just retrieved to a SharePoint list. The Flow runs on a regular schedule and we don’t want to have duplicate entries of the same events that have already been added so we will add some logic to avoid duplicates.

Add a SharePoint Get items action. In this action, add the Site Url to the Site Address field, then in List Name select the SharePoint list Flow Audit Logs you have just created. Next, in the Filter Query field, type AuditLogID eq '' where '' is two separate apostrophes (') together, not a single double quotation mark ("). Finally, place the cursor between the two apostrophes and click on the Parse JSON Id field.

Because the Parse JSON output is an array, the Get items action will be added to an Apply to each loop. This means a separate Get items will run for each element in the array (this could be done more efficiently and run only once, but for this POC it will be done like this, especially as there is not much audit log data in my demo tenant). Rename Get items to Get items – Flow Audit Logs.

Immediately below Get items – Flow Audit Logs, click Add an action and then add a Condition. On the left-hand side of the condition, add an expression that checks whether the Get items results are empty, such as:

empty(body('Get_items_-_Flow_Audit_Logs')?['value'])
Ensure the middle dropdown is set to "is equal to" and then set the final value to:

true
The final condition should look like the below. This condition checks whether the Get items – Flow Audit Logs action returned anything: that action only retrieves the entry equal to the current O365 Management API id, so if it does not find a matching ID the result will be empty.
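Putting the Get items / condition / Create item pattern together, the dedupe logic reads like this (a Python sketch; get_items and create_item are hypothetical stand-ins for the SharePoint actions):

```python
def sync_events(events, get_items, create_item):
    """For each parsed audit event, look up an existing list item by
    AuditLogID and create one only when the lookup is empty (the No branch)."""
    created = 0
    for event in events:
        existing = get_items(f"AuditLogID eq '{event['Id']}'")
        if not existing:  # condition: Get items returned nothing
            create_item(event)
            created += 1
    return created

# In-memory stand-in for the SharePoint list, to show the behaviour:
sp_list = [{"AuditLogID": "abc"}]
get_items = lambda q: [i for i in sp_list
                       if f"AuditLogID eq '{i['AuditLogID']}'" == q]
create_item = lambda e: sp_list.append({"AuditLogID": e["Id"]})

print(sync_events([{"Id": "abc"}, {"Id": "def"}], get_items, create_item))  # 1
```

Only the event with a new Id is written, which is exactly what stops the scheduled Flow creating duplicate list items.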

In the No branch, add a SharePoint Create item action – set the Site Address and List Name and then set the fields as per the image below, using the values from Parse JSON.

Now run the Flow and the Audit Log entries for Flow will be written to SharePoint on a regular basis like the image below.

I have further customised the list by adding the following column formatting to the FlowDetailsUrl field to show a clickable Flow icon for the FlowDetailsUrl hyperlink. Note this relies on a Power Automate icon being in the Site Assets library, so you will need to download one and add it there.

  "$schema": "",
  "elmType": "a",
  "attributes": {
    "target": "_blank",
    "href": "@currentField"
  "children": [
      "elmType": "img",
      "attributes": {
        "src": ""
      "style": {
        "width": "35px",
        "height": "35px",
        "display": "=if(@currentField!='', 'block','none !important')"


If you’ve made it this far well done! You will have completed the first two parts: part one where you setup your M365 tenant and part two where you created a Flow in Power Automate and subscribed to the Office 365 Management API.

In this final blog post of the series we have retrieved the M365 audit logs using Power Automate, filtered the logs to show only Power Automate (MicrosoftFlow) events and then written the log events to a SharePoint list.

Writing the logs to a SharePoint list is just one of the possibilities now that the filtered logs are available in Power Automate. I intend to blog further about using the M365 audit logs in Power Automate, presenting further scenarios for different workloads, for example Teams created, sites deleted etc.

Feel free to suggest a workload of audit log events you would like me to blog on in the future, i.e. user logins, files deleted etc.

I hope you enjoyed this blog series. Please leave comments and feedback below.

This Post Has 11 Comments

  1. Matti

    Great introduction.
    I followed it step by step and receive an error message in the “Filter array workload MicrosoftFlow”:
    BadRequest. The 'from' property value in the 'query' action inputs is of type 'String'. The value must be an array.
    Any ideas?

    1. Leon Armston

      Hi Matti

      Thanks for following my blog. I reviewed it and it looks like I may have got two versions of my blog intertwined in the pictures – apologies!

      I have updated the blog, and the key step I believe you need is in the From field: you will need the expression json(body('HTTP_-_Get_Content_Blob')) – but look at the updated text and pictures before and after.

      Hope that helps and let me know how you get on?



  2. Sam

    Hi, great content!

    Appreciate you taking your time to showcase this solution.
    Was wondering if this solution takes pagination into consideration, most tenants would have quite a few activities?

    I know there is the paging option for the connector but I don't think that would work out of the box in this situation.

    1. Leon Armston

      Hi Sam

      Thanks for looking at my blog – unfortunately this is only a POC and doesn’t take into account using pagination.

      I have only deployed this to production for monitoring all the files deleted in SharePoint in a small tenant so didn’t meet issues that would require pagination. I was running it every 15 mins – retrieving only the last 45 mins records so it wasn’t hitting any issues with records returned being higher than the limit.

      Looks like one way to fix it would be to check the headers for a NextPageUri and if that is listed there are additional records available so do another loop to retrieve these records until NextPageUri is no longer returned?

      Do let me know how you get on with retrieving all activities – I’m very interested.


  3. Sean

    Hi Leon, thank you very much for this. I am stuck on the first test run of the Get Content Blobs action… instead of getting URIs I am getting this error code. I should note, the Start Subscription action was successful; once I added the content blob action and the run after setting, the Start Subscription action always fails, then the Get Content Blobs action gives me this error… any idea what I am missing?
    "error": {
    "code": "AF20023",
    "message": "The subscription was disabled."
    }

  4. Leon Armston

    Hi Sean
    Apologies for the late reply – I hope you managed to solve the issue? It may have been a timing issue, as you sometimes have to wait a while for the subscription to become active.

    If not, double check your Start Subscription action to see if it returns:
    "error": {
    "code": "AF20024",
    "message": "The subscription is already enabled. No property change."
    }
    This means the subscription has already been activated.

  5. Reddy

    Hi Leon, This is a great 3 part blog explaining step by step. Appreciate your work helping others to follow easily.
    I am able to build the complete flow, now trying to refresh the token since the flow runs for more than 60 mins on enterprise subscription, at least for the first time. It is failing due to authorization error as the token expires after 3599 secs

  6. Lamiaa Mamdouh

    Thank you so much for the great post, but I do the subscription for DLP.All and unfortunately I can only get data for Exchange, not for SharePoint, OneDrive or Teams. My filter array (operation is equal to DLPRuleMatch) is working perfectly, but I need to get all workloads – any help please?

    1. Leon Armston

      Hi Lamiaa

      I think you need to put the Content Types DLP.All, Audit.SharePoint and Audit.Exchange in an array and then use a loop to go through each Content Type separately to register the subscription and then retrieve the blobs for each content type.

      Alternatively, clone the Power Automate Flow and then update it for the other content types – so have one flow for DLP.All, one for Audit.SharePoint and one for Audit.Exchange.

      Hope that helps

