Step-by-Step Guide to Auditing with Fabric Activity Logs
You can audit your environment with Fabric Activity Logs. First, set up the right permissions and tools. Then use the Microsoft Purview portal to view the logs, export the data so you can analyze it, and study the activities to find patterns or problems. You can automate these steps to save time. Keeping Fabric Activity Logs in one place helps you stay compliant and keeps your organization safe.
Key Takeaways
Give the right permissions and turn on auditing to start collecting Fabric Activity Logs in a safe way.
Use Microsoft Purview portal to look for logs, sort them, and export them for easy checking and reports.
Use built-in and outside tools to study logs, find problems, and make performance and security better.
Set up log exports to happen automatically so you keep a full record and do not spend time checking by hand.
Check logs often to find strange activity, follow rules, and keep your data safe.
Setup
Permissions
You need the right permissions before you start. The Audit permission in Microsoft Fabric Data Warehouse lets you set up and view audit logs, so you can see who did what and when. To grant it, go to Azure Active Directory, find your app registration, and add the 'Read activity log' permission under the Power BI Service API. An admin has to approve this permission. If you use Exchange Online, check that you have the Audit Logs role; it is included in the Compliance Management and Organization Management role groups. You will use the Microsoft Purview portal to search and review audit logs.
Tip: Always check your permissions first. If you do not have them, you cannot see the logs you need.
Enabling Auditing
You must turn on auditing to collect logs. First, make sure you have the Audit queries permission; Workspace Admins usually have it already. Go to your Fabric Data Warehouse workspace and open the audit log settings. Turn on SQL audit logs, which are off by default. Pick the audit actions you want to track, such as changes or access events, and save your settings to start logging. You can then use T-SQL queries to read the logs in OneLake. Remember that logging more events uses more storage, so only track what you need.
Steps to enable auditing:
Make sure you have Audit queries permission.
Open your workspace.
Go to audit log settings.
Turn on SQL audit logs.
Pick actions to audit.
Save your settings.
Use T-SQL to check logs.
Tools Needed
You need some tools to manage and study your logs. Here are some important ones:
Data pipeline: This tool helps you move and change data. It works with many data sources.
OneLake/Lakehouse: This is where your logs are kept. It can hold different types of data.
Eventhouse with KQL database: This database is good for storing and searching lots of log data.
Log Shippers: Tools like Fluentd or Filebeat send logs from your network to Azure.
Azcopy CLI: This tool helps you move data fast to Azure Storage.
PySpark Notebooks: Use these to clean and get your log data ready.
Update Policies in KQL: These help you change and organize your data in the database.
Real-Time Dashboards: Use these to see your log data right away.
Note: The right tools help you collect, store, and study your logs more easily.
Accessing Fabric Activity Logs
Microsoft Purview Portal
You can find Fabric Activity Logs in the Microsoft Purview compliance portal. This portal gives you a central place to search and manage your audit logs. Here is how you can access the logs:
1. Open the Microsoft Purview compliance portal.
2. Go to the Audit section.
3. Select Search to look for audit logs related to Fabric activities.
4. You can also check the Fabric Admin Portal under Tenant settings for export and sharing options. These settings help you control how logs are managed.
5. If you want more details, use the Microsoft Entra Admin Center. Here, you can filter guest users and sign-in logs to add more information to your audit data.
Tip: The Microsoft Purview portal is the main place to view and manage Fabric Activity Logs. You can also use Power BI to create reports from exported logs.
Filtering Options
When you search for logs, you can use filters to find the exact data you need. Filtering helps you focus on important events and makes your review faster.
User: Filter by user to see actions done by a specific person. This helps you track who made changes or accessed data.
Date: Set a date range to look at logs from a certain time period. This is useful if you want to check recent activity or review a past event.
Activity Type: Choose the type of activity, such as data access, changes, or sharing actions. This lets you focus on certain events that matter most to your audit.
You can combine these filters to narrow down your search. For example, you can look for all data access events by one user in the last week.
Note: Using filters saves time and helps you find patterns or problems quickly.
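Combining filters works like chaining predicates. This is a minimal Python sketch of the same User, Date, and Activity Type filters applied to exported records; the record shape and field names here are assumptions for illustration, not the portal's actual schema.

```python
from datetime import datetime

# Hypothetical record shape for exported activity log entries.
logs = [
    {"user": "alice@contoso.com", "time": datetime(2024, 5, 6, 9, 30), "activity": "ViewReport"},
    {"user": "bob@contoso.com",   "time": datetime(2024, 5, 6, 10, 0), "activity": "DeleteDataset"},
    {"user": "alice@contoso.com", "time": datetime(2024, 4, 1, 8, 0),  "activity": "ViewReport"},
]

def filter_logs(records, user=None, since=None, activity=None):
    """Apply the same User / Date / Activity Type filters the portal offers."""
    result = []
    for r in records:
        if user and r["user"] != user:
            continue
        if since and r["time"] < since:
            continue
        if activity and r["activity"] != activity:
            continue
        result.append(r)
    return result

# All ViewReport events by one user since May 1st.
recent = filter_logs(logs, user="alice@contoso.com",
                     since=datetime(2024, 5, 1), activity="ViewReport")
print(len(recent))  # 1
```

Leaving a filter set to None skips it, which mirrors how leaving a portal filter blank matches everything.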
Export to CSV
After you find the logs you need, you can export them for further analysis. Exporting to CSV lets you use other tools, like Excel or Power BI, to study the data.
Follow these steps to export your logs:
In the Microsoft Purview portal, finish your search with the filters you want.
Click the Export button.
Choose Export to CSV.
Save the file to your computer.
You can now open the CSV file in Excel or import it into Power BI. This helps you create charts, spot trends, or share reports with your team.
Callout: Exporting Fabric Activity Logs makes it easier to review large amounts of data and build custom reports.
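Once the CSV is on disk, even the standard library is enough for a first pass. This sketch counts events per user from an export; the column names (UserId, Operation, CreationTime) are assumptions standing in for whatever headers your export actually contains.

```python
import csv
import io
from collections import Counter

# Hypothetical CSV shaped like an audit export (column names are assumptions).
csv_text = """UserId,Operation,CreationTime
alice@contoso.com,ViewReport,2024-05-06T09:30:00Z
bob@contoso.com,DeleteDataset,2024-05-06T10:00:00Z
alice@contoso.com,ExportReport,2024-05-06T11:15:00Z
"""

reader = csv.DictReader(io.StringIO(csv_text))
per_user = Counter(row["UserId"] for row in reader)

for user, count in per_user.most_common():
    print(user, count)
# alice@contoso.com 2
# bob@contoso.com 1
```

For a real file, replace `io.StringIO(csv_text)` with `open("export.csv", newline="")`.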
Log Structure
Key Fields
When you check Fabric Activity Logs, you will see many fields. These fields tell you what happened and who did it. Some fields are very important to know:
Timestamp: Shows when the event happened.
User Name: Tells you who did the action.
Action Type: Says what was done, like read or delete.
Resource Name: Shows which file or table was used.
Status: Tells if the action worked or not.
Source IP: Shows where the request came from.
Volume Name: Gives more details about storage.
Tip: Look at the timestamp and user name first. These help you find strange activity fast.
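A quick triage check can be built directly on the key fields above. This Python sketch flags entries worth a closer look; the entry values and the trusted IP prefix are made-up assumptions for illustration.

```python
# Hypothetical log entry using the key fields described above.
entry = {
    "Timestamp": "2024-05-06T02:13:00Z",
    "UserName": "bob@contoso.com",
    "ActionType": "Delete",
    "ResourceName": "sales_table",
    "Status": "Failed",
    "SourceIp": "203.0.113.7",
}

TRUSTED_PREFIX = "10."  # assumption: internal addresses start with 10.

def needs_review(e):
    """Flag failed actions and requests from outside the trusted range."""
    return e["Status"] == "Failed" or not e["SourceIp"].startswith(TRUSTED_PREFIX)

print(needs_review(entry))  # True
```

Running a check like this over every exported row gives you a short list of entries to investigate first.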
User and Admin Actions
You can see both user and admin actions in these logs. Users may read, write, or delete data. Admins often change settings or grant permissions. Fabric Activity Logs show each action with clear details: who made the change, what they changed, and when.
Role | Typical actions
User | Read, write, or delete data
Admin | Change settings, grant permissions
Comparing user and admin actions this way helps you check activity and spot anything odd.
Data Access Tracking
Fabric Activity Logs let you see all data access events. When you turn on auditing, the system tracks files, streams, S3 objects, and tables. The logs cover many services, like authentication and cluster management. You can use insight services to collect and sort audit data into tables. These tables make it easy to search and study events with tools like Apache Spark.
Logs use easy names, like usernames and volume names, so you do not need to look up raw IDs.
You can find strange user actions because the insight service helps spot odd behavior.
You can turn insight gathering on or off for different clusters or nodes.
Note: Tracking data access helps you know who saw or changed important information. This is important for security and following rules.
Analyzing Fabric Activity Logs
If you want to use your audit data well, you need to know how to look at Fabric Activity Logs. This part will show you how to use built-in tools, work with other analysis choices, and set up ways to check logs often.
Query Tools
You can start with the built-in query tools in Microsoft Fabric. These tools help you see what is happening without writing complex code.
Query Activity lets you see all running and past T-SQL queries. You can check things like the query text, how long it took, who ran it, and its status. You do not have to write T-SQL code to use this tool. You can filter queries, look for jobs that take a long time, and even stop a query if you need to.
Query Insights helps you see trends over time. It keeps up to 30 days of query data. You can use ready-made views to find slow queries, see which ones run most, and learn who uses the system most. This helps you save money and keep things working well.
Workspace Monitoring gathers logs and metrics from all your Fabric items in one spot. You can use KQL or SQL to search this data. This tool helps you find problems, check security, and see how your workspace is used.
Tip: Use Query Insights to find patterns and make things faster. You can spot slow spots and fix them before they get worse.
External Analysis
Sometimes you need more power or want to use your favorite tools. You can move your logs to other systems for deeper checks.
Log shippers like Fluentd, Filebeat, and OpenTelemetry Collector help you send logs from your network to Azure Blob Storage. This makes it easy to bring data from many places into your Fabric setup.
Azcopy CLI is a free tool that lets you move lots of data to Azure Storage quickly and safely. You can use it to send logs without paying for extra software.
Kusto Query Language (KQL) works inside Fabric’s real-time intelligence tools. You can use KQL to search, make dashboards, set up alerts, and even use AI to find patterns.
You can connect to other cloud storage, like AWS S3 or Google Cloud Storage, and use open formats like parquet or delta. This lets you use Spark, Python, or machine learning tools for advanced checks without extra storage costs.
Advantages of external tools:
You get more choices and can grow as needed.
You can connect to many data sources.
You save money because you only pay for what you use.
Note: Using outside tools lets you build your own dashboards, run machine learning, and mix data from different places for a full view of your setup.
Automation
You can save time and make sure you do not miss important events by setting up ways to export and check logs automatically. Here is how you can set up automation for Fabric Activity Logs:
Use PowerShell scripts to connect to the Power BI service. You need a Fabric administrator or a service principal for this step.
Pull activity log data for each day. You can go back up to 28 days by looping through each day.
Export the data as JSON files. Name each file with a date and a clear prefix so you can find it later.
Schedule your script to run every day. This keeps your log archive up to date.
Store the raw data in a folder. Do not filter or change it during export. You can clean and check it later.
Watch out for API limits. Only pull each day’s data once to avoid errors.
Use UTC time for all your files. This helps you avoid confusion with time zones.
# Example PowerShell snippet: export one full UTC day of activity events as JSON
Connect-PowerBIServiceAccount
$day = (Get-Date).ToUniversalTime().AddDays(-1).ToString("yyyy-MM-dd")
# Get-PowerBIActivityEvent only accepts a range within a single UTC day
Get-PowerBIActivityEvent -StartDateTime "${day}T00:00:00" -EndDateTime "${day}T23:59:59" |
    Out-File -FilePath "C:\Logs\FabricActivityLogs_$day.json"
Callout: Automating your log exports helps you keep a full history for audits, rules, and fixing problems.
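To follow the "pull each day's data only once" rule, it helps to compute which days are missing before calling the API. This is a Python sketch under the assumption that exports are named with the prefix and UTC date pattern used above; it only plans the work and makes no API calls itself.

```python
from datetime import date, timedelta

PREFIX = "FabricActivityLogs_"  # assumed file-name prefix

def missing_days(existing_files, days_back=28, today=None):
    """Return the UTC dates (as ISO strings) in the last `days_back` days
    that have no export file yet, so each day is pulled exactly once."""
    today = today or date.today()
    wanted = {(today - timedelta(days=n)).isoformat() for n in range(1, days_back + 1)}
    have = {f[len(PREFIX):len(PREFIX) + 10] for f in existing_files if f.startswith(PREFIX)}
    return sorted(wanted - have)

files = ["FabricActivityLogs_2024-05-05.json", "FabricActivityLogs_2024-05-04.json"]
gaps = missing_days(files, days_back=3, today=date(2024, 5, 6))
print(gaps)  # ['2024-05-03']
```

Your daily script can then loop over just the returned dates, which also makes a failed run self-healing the next day.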
You can use these log checks for many important jobs, such as compliance audits, security investigations, and troubleshooting.
You can also track things like data pipelines, data flows, datamarts, warehouses, notebooks, semantic models, and Spark jobs. This helps you keep your whole data setup safe and working well.
Best Practices
Compliance
You can follow rules by using good governance. Start by adding governance policies when you bring in data. This helps stop private data from leaking out. Use automation to make this step easier. Make sure only the right people can get in. Give users just the access they need. Always encrypt your data when it is stored or moved. Keep your data in one place to stop copies and track where it goes. Watch how people use data to find strange or wrong actions.
Good data governance helps you find risks like copies of data. It also sets up rules to keep your group safe. Auditors check for these rules when they look at your logs. Following these steps makes it easier to meet rules like HIPAA and ISO 27001.
Security Monitoring
You should check logs often to find security problems. Watch what users do, like making, reading, or deleting data. Use role-based access so only the right people see important logs. Get logs with PowerShell, the Power BI Admin API, or the Microsoft Purview portal. The Monitoring Hub and Admin Monitoring workspace show you what is happening now and in the past. Connect your logs to Microsoft Defender for Cloud or Microsoft Sentinel. These tools warn you about bad actions and help you act fast.
Check logs often to find strange things.
Set alerts for odd actions.
Export logs to look deeper and solve problems.
Checking logs often keeps your data safe and helps you fix problems fast.
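A simple volume check already catches some odd behavior before a full SIEM is in place. This Python sketch flags users with an unusually high event count for one day; the users, events, and threshold are made-up assumptions, and a real deployment would rely on alerting in Microsoft Sentinel or Defender for Cloud instead.

```python
from collections import Counter

# Hypothetical one-day event stream: (user, action) pairs.
events = [
    ("alice", "Read"), ("alice", "Read"), ("bob", "Read"),
    ("mallory", "Delete"), ("mallory", "Delete"), ("mallory", "Delete"),
    ("mallory", "Delete"), ("mallory", "Delete"), ("mallory", "Delete"),
]

def flag_heavy_users(evts, threshold=3):
    """Flag any user whose event count exceeds `threshold` for the day."""
    counts = Counter(user for user, _ in evts)
    return sorted(u for u, c in counts.items() if c > threshold)

print(flag_heavy_users(events))  # ['mallory']
```

Tune the threshold to your own baseline; a per-user rolling average is the natural next step.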
Troubleshooting
If you have log problems, use built-in tools to write messages in log files. Change the log level in your settings to get more details when checking issues. Turn on connection leak checks to find problems early. Control how long you keep logs by setting rules for file size and history. Plan to delete old logs to keep things tidy. For long-term storage, send logs to big systems like ELK Stack or Splunk. Always restart nodes after you change log settings.
Tip: Know where your log files are and how to read them. This helps you fix problems faster.
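The "file size and history" retention rules above can be sketched with Python's standard logging library. This is a minimal, self-contained example, not the mechanism Fabric itself uses; the tiny `maxBytes` value is only there to force rotation quickly.

```python
import logging
import logging.handlers
import os
import tempfile

# Size- and history-based retention: rotate at ~200 bytes, keep 3 old files.
log_dir = tempfile.mkdtemp()
log_path = os.path.join(log_dir, "audit.log")

handler = logging.handlers.RotatingFileHandler(
    log_path, maxBytes=200, backupCount=3)
logger = logging.getLogger("audit_demo")
logger.addHandler(handler)
logger.setLevel(logging.DEBUG)  # raise the level for more detail when debugging

for i in range(50):
    logger.info("event %d processed", i)

print(sorted(os.listdir(log_dir)))
# ['audit.log', 'audit.log.1', 'audit.log.2', 'audit.log.3']
```

Older rotated files beyond `backupCount` are deleted automatically, which is the same "plan to delete old logs" rule applied by the library for you.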
You have learned how to set up and use Fabric Activity Logs. You can also export and study these logs for your group. Checking logs often helps you:
See who looked at data and when, which helps with rules and safety.
Find risky or wrong actions fast.
Keep good records for checks and reports.
You can also set up automatic log exports or link your logs to a SIEM system. This gives you real-time alerts and better ways to study your logs. Use these steps to keep your data safe and follow the rules.
FAQ
How often should you review Fabric Activity Logs?
You should check your logs at least once a week. Regular reviews help you spot problems early. If your data changes often, look at the logs every day.
Can you automate the export of Fabric Activity Logs?
Yes, you can use PowerShell scripts or other tools to export logs automatically. Set up a daily or weekly schedule. This keeps your records up to date without extra work.
What should you do if you see suspicious activity in the logs?
First, check the user and action details. Alert your security team right away. Block the user if needed. Save the log file for your records.
Where can you store exported logs for long-term use?
You can keep exported logs in Azure Blob Storage, OneLake, or another secure cloud storage. Make sure you set access controls. This keeps your logs safe and easy to find.
Do you need special permissions to access Fabric Activity Logs?
Yes, you need audit or admin permissions. Ask your system admin if you cannot see the logs. Only users with the right roles can view or export audit data.