This article first appeared in the SQLServerGeeks Magazine.
Author: Josephine Bush.
Subscribe to get your copy.
In this article, you will learn about centralizing and reporting on auditing data in SQL Server and Azure SQL. In previous issues of SQL Server Geeks Magazine, I described how to set up SQL Server Audit and Azure SQL auditing. An important part of auditing is being able to easily query the data and report on it.
SQL Server Audit Centralization
Each of my production servers has a SQL Server Audit set up. These audits collect permission and schema changes for that server. I want all production server audit data in a central location for easy querying and reporting.
For centralizing, I have a few things set up:
1. Auditing database – There is an Auditing database on my centralized SQL Server. It stores the auditing data from all production servers. Use this script to set up the Auditing database: https://github.com/sqlkitty/auditing-sqlserver-centralization-scripts/blob/main/1-create-AuditChanges-table-on-cms.sql
2. Auditing user – This is created on my centralized SQL Server. This user has permission to modify the Auditing database. Use this script to set up the auditing user: https://github.com/sqlkitty/auditing-sqlserver-centralization-scripts/blob/main/1a-create-sqlauditing-user.sql
3. Linked server – Each production server has a linked server pointing to the centralized auditing SQL Server. This linked server uses the auditing user. Use this script to set up the linked server: https://github.com/sqlkitty/auditing-sqlserver-centralization-scripts/blob/main/2-setup-linked-server-on-audited-servers.sql
4. Auditing Collection SQL Agent job – This collects audit data from the .sqlaudit files. The audit data is inserted into the centralized auditing database. Each SQL Server version has different columns available in the audit.
Use this script to set up the agent job on SQL Server 2017 or later: https://github.com/sqlkitty/auditing-sqlserver-centralization-scripts/blob/main/3-audit-changes-collection-job-version-2017.sql
Use this script to set up the agent job on versions older than SQL Server 2017: https://github.com/sqlkitty/auditing-sqlserver-centralization-scripts/blob/main/3-audit-changes-collection-job-version-b4-2017.sql
5. Audit Data Retention SQL Agent job – I only keep 30 days of auditing data. How much you keep depends on your own requirements; 30 days was a nice, round number for my environment, so I chose that. Use this script to clean up your auditing data: https://github.com/sqlkitty/auditing-sqlserver-centralization-scripts/blob/main/6-cleanup-audit-data.sql
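The core of the collection and retention steps can be sketched as follows. This is a minimal illustration rather than the linked scripts themselves: the linked server name (CentralAuditServer), the table Auditing.dbo.AuditChanges, the audit file path, and the column list are all assumptions you would adapt to your environment.

```sql
-- Collection sketch (runs on each audited server; names are assumptions).
-- sys.fn_get_audit_file reads events directly from the .sqlaudit files.
INSERT INTO [CentralAuditServer].Auditing.dbo.AuditChanges
    (event_time, action_id, statement, server_principal_name,
     server_instance_name, database_name, schema_name, object_name, succeeded)
SELECT event_time, action_id, statement, server_principal_name,
       server_instance_name, database_name, schema_name, object_name, succeeded
FROM sys.fn_get_audit_file(N'D:\SQLAudit\*.sqlaudit', DEFAULT, DEFAULT)
WHERE event_time > (SELECT ISNULL(MAX(event_time), '19000101')
                    FROM [CentralAuditServer].Auditing.dbo.AuditChanges
                    WHERE server_instance_name = @@SERVERNAME);

-- Retention sketch (runs on the central server): keep 30 days of data.
DELETE FROM Auditing.dbo.AuditChanges
WHERE event_time < DATEADD(DAY, -30, GETUTCDATE());
```

Filtering on the last collected event_time per server keeps the job incremental, so each run only ships new audit records across the linked server.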
SQL Server Audit Reporting
Once the audit data is centralized, I use an agent job to send a report daily. To make the reporting agent job work, you need to set up and configure database mail. Use this script to set up database mail: https://github.com/sqlkitty/auditing-sqlserver-centralization-scripts/blob/main/4-setup-dbmail.sql
Use this script to set up your daily reporting agent job: https://github.com/sqlkitty/auditing-sqlserver-centralization-scripts/blob/main/5-email-audit-findings.sql
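At its core, the reporting job is a call to sp_send_dbmail. The sketch below is illustrative only: the mail profile, recipients, and table/column names are assumptions, and it attaches the last day's findings as a CSV file rather than reproducing the HTML formatting in the linked script.

```sql
-- Daily audit report sketch (profile, recipients, and table names assumed).
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = N'AuditMailProfile',
    @recipients   = N'dba-team@example.com',
    @subject      = N'Daily SQL Server Audit Report',
    @body         = N'Audit findings for the last 24 hours are attached.',
    @query        = N'SELECT event_time, action_id,
                             LEFT(statement, 100) AS partial_statement,
                             server_instance_name, database_name, schema_name,
                             server_principal_name, succeeded
                      FROM Auditing.dbo.AuditChanges
                      WHERE event_time > DATEADD(DAY, -1, GETUTCDATE())
                      ORDER BY event_time DESC;',
    @attach_query_result_as_file = 1,
    @query_attachment_filename   = N'audit-report.csv',
    @query_result_separator      = N',';
```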
The following screenshot shows a sample report:
Here is a description of each of the columns in the report:
• Event Time – This gives you the ability to link the action to a time. It’s very important for troubleshooting and filtering the report to specific timeframes.
• Audit Action – I include this because sometimes it’s not clear why you have a statement in the audit. For example, VIEW SERVER STATE might appear with some other statements. This audit action column helps you sort out why you may be seeing something you think shouldn’t be on the report.
• Partial Statement – I only include the partial statement because the full statement tends to make for a very ugly HTML table. The full statements are stored in the database; you can always look there if you need them.
• Server – You will most likely have many servers in your audit report. This helps to differentiate them.
• Database – You will most likely have many databases in your audit report. This helps to differentiate them.
• Schema – If the audit record is for an object inside the database, you will have a schema. It's useful to know the schema, since a database can contain more than one.
• Username – This tells you who made this audited change. Another very important column because you might have to go back to them if there are any issues.
• Successful – This is 1 for Yes and 0 for No. I added this column because we were seeing users execute scripts that required elevated permissions, and the audit shows whether each attempt succeeded. In the sample report, appuser did not have permission to truncate the table, so Successful shows 0 to let you know the statement failed.
Azure SQL Database Audit Centralization
My favorite way to store Azure SQL Database audit logs is with log analytics. I put all my Azure SQL Databases in the same log analytics workspace. This makes it easy to query and report on the audit data.
You can access this data by either navigating to your log analytics workspace or into Auditing under each of the audited databases. The following screenshot shows the steps to access the auditing data from the database:
Even if you enable auditing at the server level, you still have to access the audit data via the database. It’s easier to go into your log analytics workspace to see all the audit data in one place instead of clicking on different databases to see it. The following screenshot shows a workspace summary of auditing records in the log analytics workspace:
You can click on actions, database names, or IP addresses to drill down for more detailed information. This is good for a high-level overview or a quick look at what's happening with your audit data.
Better yet, query your audit data with Kusto Query Language (KQL). It's easy to pick up, especially if you already know SQL. Here's a link to a SQL to Kusto cheat sheet: https://docs.microsoft.com/en-us/azure/data-explorer/kusto/query/sqlcheatsheet
If you want to query the audit data in the log analytics workspace, you can click on Logs and use a Kusto query, as shown in the following screenshot:
The Kusto query I recommend using is:
AzureDiagnostics
| where Category == "SQLSecurityAuditEvents"
and TimeGenerated > ago(1d)
| project
event_time_t,
database_name_s,
statement_s,
server_principal_name_s,
succeeded_s,
client_ip_s,
application_name_s,
additional_information_s,
data_sensitivity_information_s
| order by event_time_t desc
This query gets all SQLSecurityAuditEvents generated in the last day and orders them by event time, descending. There are more columns you can add, and you can filter on any of them. Kusto gives you a lot of options, much like SQL does.
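Building on that query, here is a hypothetical variation that summarizes only failed actions by user, database, and client IP. It assumes succeeded_s holds the string "false" for failed events, which is how AzureDiagnostics typically records it:

```kusto
AzureDiagnostics
| where Category == "SQLSecurityAuditEvents"
    and TimeGenerated > ago(1d)
    and succeeded_s == "false"
| summarize FailedAttempts = count()
    by server_principal_name_s, database_name_s, client_ip_s
| order by FailedAttempts desc
```

A summary like this surfaces repeated permission failures at a glance, which is often more useful for a daily review than scanning the raw event list.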
Azure SQL Database Audit Reporting
To report on auditing data in Azure, I use Logic Apps. You can set up an easy workflow to email your team the audit data for the last day (or the frequency of your choosing).
There is something you need to keep in mind when setting up this reporting. If you leave the default auditing configuration in place for Azure SQL Auditing, you will be getting a TON of auditing data. With that much data, your Logic App will fail because the CSV attachment will be too large to send via email. There is a way to change the default auditing configuration to capture less auditing data, using Azure PowerShell. I recommend modifying the default auditing configuration to suit your auditing needs. You can read more about this process by visiting: https://docs.microsoft.com/en-us/azure/azure-sql/database/auditing-overview#using-azure-powershell
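As a sketch of that approach, the Az.Sql cmdlet Set-AzSqlServerAudit lets you replace the default audit action groups with a narrower set. The resource names, workspace ID, and chosen action groups below are placeholders for illustration; pick the action groups that match your own auditing needs.

```powershell
# Narrow the server-level audit to fewer action groups so the daily
# report stays a manageable size. All names below are placeholders.
Set-AzSqlServerAudit `
    -ResourceGroupName "rg-production" `
    -ServerName "sql-prod-01" `
    -LogAnalyticsTargetState Enabled `
    -WorkspaceResourceId "/subscriptions/<sub-id>/resourceGroups/rg-production/providers/Microsoft.OperationalInsights/workspaces/law-auditing" `
    -AuditActionGroup "FAILED_DATABASE_AUTHENTICATION_GROUP", `
                      "DATABASE_PERMISSION_CHANGE_GROUP", `
                      "SCHEMA_OBJECT_CHANGE_GROUP"
```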
The following screenshot shows you the workflow that I set up for my Logic App:
Let’s break this down step-by-step:
• Recurrence – How often this Logic App will run. Note that “At these hours” is in UTC. I have mine run daily at 10 am UTC.
• Run query and list results – In this step, I use the Kusto query I recommend for querying your audit data.
• Create CSV table – This creates a CSV file to store the audit results. This is much easier than trying to format the results into HTML in the body of the email.
• Send an email (V2) – This will allow you to configure an email to be sent with the CSV results attached. Make sure to fill out the Attachments Content AND the Attachments Name. If you don’t include the Attachments Name, the attachment won’t be included with the email.
Additional Information
Kusto Query Language (KQL)
https://docs.microsoft.com/en-us/azure/data-explorer/kusto/query/
Create Azure Logic Apps in the Azure portal
https://docs.microsoft.com/en-us/azure/logic-apps/quickstart-create-first-logic-app-workflow
Azure SQL Create or Update Database Auditing Policy
https://docs.microsoft.com/en-us/azure/azure-sql/database/auditing-overview#using-azure-powershell