In this blog I will show you how to retrieve M365 audit logs with Azure Logic Apps and Power Automate, including the logic to handle pagination for very large tenants (where the number of results returned is limited to prevent response timeouts). I provide instructions and templates for both Azure Logic Apps and Power Automate so you can install them in your tenant today. Both workflows provide you with a JSON array of the Audit Logs from your tenant, which you could then filter in Logic Apps/Power Automate by workload, i.e. SharePoint, Exchange, Flow, Teams, Yammer etc.

This blog extends my previous blog series from last year on using the Office 365 Management API with Power Automate, which provided a step-by-step guide to retrieving M365 audit logs with Power Automate and then writing the logs to a SharePoint list. Now I extend it to use Azure Logic Apps and to handle pagination in both Logic Apps and Power Automate.

Below I compare Azure Logic Apps and Power Automate for this scenario. Power Automate and Logic Apps are very similar and can both retrieve and process M365 Audit logs, but there are some differences in licensing, performance and limits.

Azure Logic Apps

Advantages

– The array variable limit in Logic Apps is much larger, 100,000 as standard compared to Power Automate's 5,000, so you can store more items in an array in Logic Apps.
– Logic Apps as standard is faster than the Power Automate standard tier – higher throughput and other limits.
– No premium licensing is needed to use the HTTP action (used to make the Office 365 Management API request) in Logic Apps, which is a premium action in Power Automate.

Disadvantages

– Needs an Azure Subscription, which normal M365 users often don't have access to.
– No free access to Logic Apps – you pay for all the actions executed.

Power Automate

Advantages

– Makers can create flows using Power Automate by default (no Azure Subscription needed).
– Power Automate standard is included with most M365 plans, with a usage allowance – so there are no extra costs, i.e. per action executed.

Disadvantages

– The HTTP action is a Premium connector, which requires a Power Automate premium licence per user/month or per flow.
– Higher performance/limits for a Flow require premium licensing.

The Basics – Office 365 Management Activity API aka Audit Logs

The Office 365 Management Activity API provides information about various user, admin, system, and policy actions and events from Office 365 and Azure Active Directory activity logs.

These events are available in your Microsoft 365 tenancy as Audit Logs via the web at https://protection.office.com/unifiedauditlog and are retained for 90 days in most tenants before being overwritten.

Here is a link to the Microsoft page with all the Audited Activities available in the Microsoft 365 Audit Logs

The Office 365 Management Activity API is a REST web service that allows the Audit logs to be queried and downloaded in JSON format.

Audit Logging may need to be turned on in your Microsoft 365 Tenant. For instructions, see Turn Office 365 audit log search on or off.
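To make the API shape concrete, here is a minimal Python sketch of the two requests involved: acquiring an app-only access token from Azure AD and building the "list available content" URL for a content type. The endpoint shapes follow the Management Activity API documentation; the tenant/client values are placeholders for the ones from your app registration, and the `requests` usage at the bottom is illustrative only.

```python
# Hypothetical sketch of the Office 365 Management Activity API calls.
# Tenant ID, client ID and client secret are placeholders for the values
# from your Azure AD app registration.

RESOURCE = "https://manage.office.com"

def token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the Azure AD client-credentials token request (v1 endpoint)."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": RESOURCE,
    }
    return url, body

def content_list_url(tenant_id: str, content_type: str = "Audit.SharePoint"):
    """Build the 'list available content' URL for a subscribed content type."""
    return (f"{RESOURCE}/api/v1.0/{tenant_id}/activity/feed/"
            f"subscriptions/content?contentType={content_type}")

# Usage (requires the 'requests' package):
#   import requests
#   url, body = token_request("my-tenant-id", "my-client-id", "my-secret")
#   token = requests.post(url, data=body).json()["access_token"]
#   resp = requests.get(content_list_url("my-tenant-id"),
#                       headers={"Authorization": f"Bearer {token}"})
```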

Install the solution in Azure Logic Apps

The solution uses Azure Logic Apps (Power Automate solution also supplied at the bottom) and requires an app registration to be created in Azure AD to authenticate to the Office 365 Management API to retrieve your audit logs in your tenant.

  1. Create an App Registration in Azure AD with the Office 365 Management API permission ActivityFeed.Read as an application.
    • Instructions to set this up in your tenant are in my previous blog here. Make a note of your ClientID, ClientSecret & TenantID
Created Azure AD App Registration with the Office 365 Management API permission ActivityFeed.Read as an application
  2. Create a new Logic App in Azure with any trigger, i.e. Recurrence, in the Logic Apps Designer, then edit the Logic App and go to Code View.
  3. In a new tab, go to the template in my GitHub repo (direct link here) – select all of the JSON text and copy it to your clipboard.
  4. Go back to the Code View tab in Logic Apps, overwrite all of the existing JSON code with the template JSON you just copied, and then click the Designer button.
  5. The solution will then appear in your Logic Apps window – go to the Parse JSON – Environment Variables action and modify it to reflect your ClientID, ClientSecret & TenantID. The workload is currently set to report on SharePoint workload events only (Audit.SharePoint), but this could be changed to Audit.AzureActiveDirectory, Audit.Exchange or Audit.General.
  6. Save your changes and the Azure Logic App is ready to be tested (run). Audit logs from the previous 24 hours will then be retrieved and written to a JSON array at the end called array_logs. Note that initially you may need to wait a little while (around 12 hours) for the subscription, once registered, to return content blobs.

You could then use this array to filter for events for specific sites, users, applications etc. and perhaps send a daily summary of activity for a SharePoint site, for example.
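As an illustration of that filtering step, here is a small Python sketch that filters an `array_logs`-style list of audit records for one SharePoint site. The field names (`Workload`, `Operation`, `SiteUrl`) come from the audit record schema; the sample records and site URL are made up for the example.

```python
def filter_logs(logs, workload=None, site_url=None):
    """Return only the audit records matching the given workload and/or site."""
    result = []
    for record in logs:
        if workload and record.get("Workload") != workload:
            continue  # skip records from other workloads, e.g. Exchange
        if site_url and record.get("SiteUrl") != site_url:
            continue  # skip SharePoint records for other sites
        result.append(record)
    return result

# Made-up sample records shaped like Management Activity API audit events.
sample_logs = [
    {"Workload": "SharePoint", "Operation": "FileAccessed",
     "SiteUrl": "https://contoso.sharepoint.com/sites/hr/"},
    {"Workload": "Exchange", "Operation": "MailItemsAccessed"},
]

hr_events = filter_logs(sample_logs, workload="SharePoint",
                        site_url="https://contoso.sharepoint.com/sites/hr/")
# hr_events now holds only the SharePoint event for the HR site
```

The same condition can of course be expressed with a Filter array action in Logic Apps or Power Automate instead of code.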


Pagination – Azure Logic Apps & Power Automate

One of the key parts I wanted to show in this blog was how to deal with pagination in the Office 365 Management API. I will briefly explain it below, but it is best to download and set up the Azure Logic Apps template or Power Automate template to see it for yourself.

Below is the normal process to retrieve M365 Audit Logs; note where NextPageUri occurs, as this is what is used for pagination in busy M365 tenants.

  • Authenticate to the Office 365 Management API using an application with ActivityFeed.Read permissions and generate an access_token
  • Start a subscription to a specified content-type i.e. Audit.SharePoint
  • List available content for the specified content-type, which provides a list of URLs (ContentUris) pointing to the JSON blobs of audit logs to retrieve.
    • The content is an aggregation of actions and events harvested from multiple servers across multiple datacenters.
    • If there are more than 600 results (ContentUris) returned, then the NextPageUri header will be included in the response. Use this URL to retrieve a further page of ContentUris results.
      • There can be many pages in busy tenants.
NextPageUri header returned with the URL to request further ContentUri(s)
Logic App with logic to request content and, if the NextPageUri header is returned, to keep requesting further content until the NextPageUri header is no longer returned.
  • Use all the ContentUris obtained to retrieve, via GET requests, the content blobs, each containing one or more actions or events in JSON format (see example below).
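In code, the NextPageUri loop built into the Logic App/Flow is roughly equivalent to this Python sketch. The `fetch` parameter stands in for an authenticated HTTP GET (e.g. a wrapper around `requests.get` with the Bearer token attached) and is an assumption for illustration, which also keeps the loop itself easy to test.

```python
def collect_content_uris(first_page_url, fetch):
    """Follow NextPageUri headers until no further page is returned.

    `fetch` is any callable that takes a URL and returns a
    (list_of_content_entries, headers_dict) tuple.
    """
    uris = []
    url = first_page_url
    while url:
        entries, headers = fetch(url)
        # Each entry describes one content blob of audit records.
        uris.extend(entry["contentUri"] for entry in entries)
        # If present, NextPageUri points at the next page of results;
        # when it is absent, we have collected every page.
        url = headers.get("NextPageUri")
    return uris
```

This mirrors the Until loop in the templates: request a page, append its ContentUris, and repeat while the response still carries a NextPageUri header.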

Install the Solution in Power Automate

The main focus of this work was Logic Apps, which I evaluated at the beginning of this post against Power Automate, weighing up the advantages and disadvantages of each. My brief conclusion is that Logic Apps is better suited to this requirement due to costs, performance and considerably higher limits in comparison to the seeded M365 Flow licence that most people have (non-premium) with Microsoft 365.

The solution requires an app registration to be created in Azure AD to authenticate to the Office 365 Management API to retrieve your audit logs in your tenant.

  1. Create an App Registration in Azure AD with the Office 365 Management API permission ActivityFeed.Read as an application.
  • Instructions to set this up in your tenant are in my previous blog here. Make a note of your ClientID, ClientSecret & TenantID
Created Azure AD App Registration with the Office 365 Management API permission ActivityFeed.Read as an application
  2. Install the GetOffice365ManagementActivityAPILogs template in Power Automate
  • In Power Automate, import the GetOffice365ManagementActivityAPILogs.zip template.
  • The solution will then appear in your Power Automate screen – go to the Parse JSON – Environment Variables action and modify this to reflect your ClientID, ClientSecret & TenantID. The workload is currently set to report on SharePoint workload events only (Audit.SharePoint) but this could be changed to Audit.AzureActiveDirectory, Audit.Exchange or Audit.General.
  • Save your changes and the Power Automate Flow is ready to be tested (run). Audit logs from the previous 24 hours will then be retrieved and written to a JSON array at the end called array_logs. You could then use this array to filter for events for specific sites, users, applications etc. and perhaps send a daily summary of activity for a SharePoint site, for example.
  • See my previous blog Office 365 Management Activity API with Power Automate – Part Three where I filtered the JSON blob for Flow events and then wrote them to a SharePoint list.

Summary

In this blog we learned how to use Azure Logic Apps and Power Automate to retrieve M365 Audit Logs, and how to use pagination to retrieve further pages of results when not all results are returned in the first response, which happens in larger tenants to avoid timeout issues.

This continued my previous three-part blog series from last year on the Office 365 Management API with Power Automate, which provided a step-by-step guide to retrieving the M365 audit logs with Power Automate and then writing the logs to a SharePoint list.

I hope you enjoyed this research and that it proves helpful for your organisation, especially the pagination. Please leave comments or feedback below; I'd love to hear if and how you are using this in your organisation.

This Post Has 2 Comments

  1. Jose Vadisan

    Hi Leon, great article and guide. I am trying to use this Logic App to get logs from Microsoft 365, but store on a Blob Storage, is this possible?

    1. Leon Armston

      Hi Jose

      It depends what you want to do with the JSON logs (variable(‘array_logs’)) when you write them to Blob Storage…. Do you want to just store them as files in Azure Blob Storage for storage purposes or do you want to query them?

      To write the JSON array to Azure Blob storage in Logic Apps you can use the Create blob action.

      If you want to query them then you will need something to pick them up out of Azure blob storage and send them to a DB i.e. Cosmos DB/SQL etc via a SQL stored procedure for example or have some other code/process to read the DB or blobs.

      Hope that helps

      Leon
