
Dataverse Auditing: How to Enable It, Query It, and Stop It From Eating Your Storage

A practical guide to Dataverse auditing: enabling it at every level, querying audit logs, managing retention policies, and keeping storage under control.


Auditing in Dataverse is one of those features that gets turned on and then forgotten about. Somebody enables it during the initial project setup, checks a compliance box, and walks away. Two years later, you’re staring at a storage overage notification and wondering why your log capacity jumped by 8 GB.

The feature itself is solid. The problem is the default behavior: audit everything, keep it forever, never tell anyone how much space it’s consuming. That’s not a reasonable default for any production system.

Here’s how to set it up properly, query the data when you need it, and keep it from becoming a storage problem.

The Three Levels of Auditing

Dataverse auditing is controlled at three levels, and all three need to be configured for auditing to work.

1. Organization Level

This is the master switch. If organization-level auditing is off, nothing else matters.

To enable it:

  1. Open the Power Platform admin center
  2. Select your environment, then click Settings
  3. Go to Audit and logs > Audit settings
  4. Check Start Auditing and Audit user access (if you want login tracking)
  5. Save

You can also do this programmatically through the Organization entity:

// The org-wide audit flags live on the organization table
var org = new Entity("organization", orgId);
org["isauditenabled"] = true;            // master switch
org["isuseraccessauditenabled"] = true;  // login tracking
service.Update(org);

2. Table Level

Each table has its own auditing toggle. Even with org-level auditing on, a table won’t be audited unless you flip its switch too.

To enable it:

  1. Open the solution containing your table
  2. Open the table settings
  3. Under Advanced options, find Audit changes to its data and enable it
  4. Save and publish

Many system tables ship with auditing enabled out of the box — Account, Contact, Lead, Opportunity, and most other “core” Dynamics 365 tables. Custom tables do not. You have to turn it on yourself.
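If you'd rather script this — say, in a deployment pipeline — table-level auditing lives on the table's metadata. Here's a sketch, assuming a hypothetical custom table called new_project and an existing IOrganizationService:

```csharp
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;

// Retrieve the table's metadata (entity-level properties only)
var retrieve = new RetrieveEntityRequest
{
    LogicalName = "new_project",          // hypothetical custom table
    EntityFilters = EntityFilters.Entity
};
var metadata = ((RetrieveEntityResponse)service.Execute(retrieve)).EntityMetadata;

// Flip the audit flag and push the change back
metadata.IsAuditEnabled = new BooleanManagedProperty(true);
service.Execute(new UpdateEntityRequest { Entity = metadata });

// Metadata changes don't take effect until published
service.Execute(new PublishXmlRequest
{
    ParameterXml = "<importexportxml><entities><entity>new_project</entity></entities></importexportxml>"
});
```

Note the publish step at the end — it's the programmatic equivalent of the "Save and publish" click in the maker portal.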

3. Column Level

Once a table is audited, all columns are audited by default. But you can turn off auditing on individual columns if you don’t need change tracking for every field.

This is where you should be opinionated. There’s no reason to audit changes to a “Last Modified On” column or a calculated field that updates every time a record is touched. Every audited column change creates a log entry. More entries means more storage consumed.

To disable auditing on a specific column, open the column properties in the solution editor and uncheck Enable auditing.
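The same is possible through the metadata API. A sketch, assuming a hypothetical column new_score on the account table:

```csharp
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;

// Retrieve the column's metadata
var retrieve = new RetrieveAttributeRequest
{
    EntityLogicalName = "account",
    LogicalName = "new_score"             // hypothetical column
};
var attr = ((RetrieveAttributeResponse)service.Execute(retrieve)).AttributeMetadata;

// Turn off auditing for just this column; the table stays audited
attr.IsAuditEnabled = new BooleanManagedProperty(false);
service.Execute(new UpdateAttributeRequest
{
    EntityName = "account",
    Attribute = attr
});

// Publish the change so it takes effect
service.Execute(new PublishXmlRequest
{
    ParameterXml = "<importexportxml><entities><entity>account</entity></entities></importexportxml>"
});
```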

What Gets Captured

When auditing is active, Dataverse records:

Event             What’s logged
Create            All field values at the time of creation
Update            Old value and new value for each changed field
Delete            All field values at the time of deletion
User access       Login events (if enabled at org level)
Sharing changes   When records are shared or unshared

Each audit entry includes the user who made the change, the timestamp, and the operation type. For updates, you get a before-and-after snapshot of every changed field — not the entire record, just the fields that changed.

Querying Audit Logs

In the App

The quickest way to look at audit data is through the model-driven app.

  1. Open a record
  2. Go to Related > Audit History

This shows you the change history for that specific record. You can see who changed what and when.

For a broader view, the Audit Summary View is available under Settings > Auditing > Audit Summary View in the classic interface. It lets you filter by date range, user, entity, and operation type. It’s not glamorous, but it works for spot checks.

Through the API

For programmatic access, use RetrieveRecordChangeHistory:

var request = new RetrieveRecordChangeHistoryRequest
{
    Target = new EntityReference("account", accountId),
    PagingInfo = new PagingInfo
    {
        PageNumber = 1,
        Count = 50
    }
};

var response = (RetrieveRecordChangeHistoryResponse)service.Execute(request);

foreach (var detail in response.AuditDetailCollection.AuditDetails)
{
    // Field-level changes come back as AttributeAuditDetail entries
    if (detail is AttributeAuditDetail attrDetail)
    {
        var user = (EntityReference)attrDetail.AuditRecord["userid"];
        Console.WriteLine($"Changed by: {user.Name ?? user.Id.ToString()}");
        Console.WriteLine($"Date: {attrDetail.AuditRecord["createdon"]}");

        // NewValue is null for delete operations, so guard before iterating
        if (attrDetail.NewValue == null)
            continue;

        foreach (var attr in attrDetail.NewValue.Attributes)
        {
            var oldVal = attrDetail.OldValue != null && attrDetail.OldValue.Contains(attr.Key)
                ? attrDetail.OldValue[attr.Key]
                : "(empty)";
            Console.WriteLine($"  {attr.Key}: {oldVal} → {attr.Value}");
        }
    }
}

There’s also RetrieveAuditDetails for fetching a single audit record by ID, and you can query the audit table directly via FetchXML or Web API if you need bulk access.

Web API equivalent:

GET /api/data/v9.2/audits?$filter=_objectid_value eq {accountId}&$orderby=createdon desc
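For bulk pulls, a FetchXML sketch against the audit table might look like this — the attribute names are standard audit table columns; adjust the filter to your scenario:

```xml
<fetch top="50">
  <entity name="audit">
    <attribute name="createdon" />
    <attribute name="userid" />
    <attribute name="action" />
    <attribute name="objectid" />
    <filter>
      <condition attribute="createdon" operator="last-x-days" value="7" />
    </filter>
    <order attribute="createdon" descending="true" />
  </entity>
</fetch>
```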

Storage: Where It Gets Expensive

Audit logs are stored in Log capacity, which is separate from Database and File capacity in the Power Platform admin center. This is a detail that surprises people — you can have plenty of database storage left and still hit capacity issues because your log storage is full.

To check your current usage:

  1. Go to the Power Platform admin center
  2. Click Resources > Capacity
  3. Look at the Log section

The capacity page breaks down usage by environment, so you can see exactly which environment is consuming the most log storage.

Here’s the part that matters: log capacity is often the first thing to fill up in tenants that have been running for a while with default settings. I’ve seen tenants where audit logs account for 80%+ of total storage consumption across all three categories. When that happens, the fix isn’t buying more capacity — it’s fixing your retention policy.

Retention Policies

By default, Dataverse keeps audit logs forever. That’s the root cause of most audit storage problems. A table with moderate activity — say, 500 updates per day across all records — generates roughly 180,000 audit entries per year. Multiply that by 20 audited tables and you’re looking at millions of log entries that accumulate year after year.

Setting a Retention Policy

You can configure audit log retention in the Power Platform admin center:

  1. Go to your environment’s Settings
  2. Navigate to Audit and logs > Audit settings
  3. Set Retain these logs for to your desired period (30 days, 60 days, 90 days, etc.)
  4. Save

There’s also a Dataverse-level long-term retention option using Microsoft Purview. For most organizations, the built-in retention setting is sufficient.

My recommendation: 90 days is a reasonable default for most environments. If compliance requires longer retention, keep the Dataverse logs at 90 days and archive to an external store (Azure Blob Storage, Dataverse long-term retention, or a dedicated audit database). Keeping 3+ years of audit logs in live Dataverse storage is almost never the right answer.

Deleting Old Audit Logs

If you already have a storage problem, setting a retention policy going forward won’t immediately free up space. You need to delete the existing old logs.

From the admin center:

  1. Go to Settings > Audit and logs > Audit Log Management
  2. Select a date range and delete logs older than that date
  3. This runs as a background job — large deletions can take hours

Programmatically:

var request = new DeleteAuditDataRequest
{
    EndDate = DateTime.UtcNow.AddDays(-90)
};
service.Execute(request);

Be careful with bulk deletion. It locks audit tables during execution. Run it during off-hours if you’re deleting millions of records.

Performance Impact

Auditing has a runtime cost. Every Create, Update, and Delete operation on an audited table has additional overhead because Dataverse needs to capture the before/after state and write the audit record.

For most workloads, the impact is negligible. You won’t notice it on a table that gets a few hundred updates per day. But for high-volume scenarios — bulk imports, batch processing, integration syncs that touch thousands of records per minute — the overhead adds up.

What to watch for:

  • Bulk data loads running noticeably slower after auditing is enabled
  • Plugin execution times increasing on audited tables
  • Dataverse API throttling kicking in sooner than expected during batch operations

If you’re running a bulk import of 100,000 records, consider temporarily disabling auditing on that table for the duration of the import. Re-enable it afterward. The compliance gap for that window is usually acceptable compared to doubling your import time.
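That off-then-on dance can be scripted with the same metadata calls used to enable auditing in the first place. A sketch — SetTableAuditing and RunBulkImport are illustrative names, not SDK methods:

```csharp
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;

// Illustrative helper: flips a table's audit flag and publishes the change
void SetTableAuditing(IOrganizationService service, string table, bool enabled)
{
    var metadata = ((RetrieveEntityResponse)service.Execute(new RetrieveEntityRequest
    {
        LogicalName = table,
        EntityFilters = EntityFilters.Entity
    })).EntityMetadata;

    metadata.IsAuditEnabled = new BooleanManagedProperty(enabled);
    service.Execute(new UpdateEntityRequest { Entity = metadata });
    service.Execute(new PublishXmlRequest
    {
        ParameterXml = $"<importexportxml><entities><entity>{table}</entity></entities></importexportxml>"
    });
}

// Wrap the load so auditing comes back even if the import throws
SetTableAuditing(service, "account", enabled: false);
try
{
    RunBulkImport(service);   // your import logic
}
finally
{
    SetTableAuditing(service, "account", enabled: true);
}
```

The try/finally matters: a failed import that leaves auditing off is a much worse compliance gap than the import window itself.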

What NOT to Audit

This is where most organizations go wrong. The instinct is to audit everything because “we might need it.” That instinct will cost you real money in storage.

Tables you should think twice about auditing:

  • Note (annotation): High-volume, large payloads, changes are already tracked through the parent record
  • Email, Task, Phone Call: Activity tables generate enormous audit volumes in busy CRM environments
  • Audit itself: Don’t audit the audit table (yes, this is possible to configure, and yes, people have done it)
  • Any table used as a queue or staging area: If records are created and deleted within hours, auditing them just creates churn
  • Tables with frequent automated updates: If a flow or plugin updates a status field every 5 minutes, every one of those updates creates an audit record

The rule is straightforward: audit tables where you need to answer “who changed this and when” for compliance or troubleshooting. Don’t audit tables where the answer to that question doesn’t matter.

A Reasonable Setup

For a typical Dynamics 365 environment, here’s what I’d configure:

  1. Enable org-level auditing with user access logging
  2. Enable table-level auditing on core business tables: Account, Contact, Opportunity, Case, and your key custom tables
  3. Disable column-level auditing on calculated fields, timestamps, and fields that change automatically
  4. Skip auditing on high-volume activity tables unless compliance specifically requires it
  5. Set retention to 90 days in the admin center
  6. Archive to external storage if you need long-term compliance records
  7. Check storage monthly — add it to your admin routine

Auditing is a necessary part of any production Dataverse deployment. But treating it as a “turn it on and forget it” feature is how you end up paying for storage you don’t need, keeping data you’ll never look at, and slowing down operations that should be fast. Configure it with intent, review it periodically, and clean up what you don’t need.

