"The query returned too many events. Specify additional filters for your query, and try again." All audit data is stored in one single table, and after some time this table can contain millions of rows. When pulling logs from that table, the system might crash because of the sheer amount of data. This post will give you some help on how to solve this.
1. Create a backup SQL query
First of all, I created a new table with the same columns as the audit table (AuditData). To do this, I generated a CREATE script of the SharePoint audit table and used it to create my own table in a different database. Just change the USE statement so it points to your own database and, if you want, the name of the table.
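As a rough sketch, such an archive table could look like the following. The column types here are approximations, not the exact SharePoint schema; generate the actual CREATE script from your own AuditData table to be sure:

```sql
-- Approximate sketch of the archive table. The column names come from
-- the AuditData table; the types are guesses, so script your own
-- AuditData table to get the exact definitions.
CREATE TABLE MyBackupDb.dbo.AuditDataArchive
(
    SiteId       uniqueidentifier NOT NULL,
    ItemId       uniqueidentifier,
    ItemType     smallint,
    UserId       int,
    MachineName  nvarchar(128),
    MachineIp    nvarchar(40),
    DocLocation  nvarchar(260),
    LocationType tinyint,
    Occurred     datetime NOT NULL,
    [Event]      int NOT NULL,
    EventName    nvarchar(128),
    EventSource  tinyint,
    SourceName   nvarchar(256),
    EventData    nvarchar(max)
)
```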
I used this SQL query to backup the audit rows to a table of my own:
"insert into MyBackupDb.dbo.AuditDataArchive(SiteId,ItemId,ItemType,UserId,MachineName,MachineIp,DocLocation,LocationType,Occurred,Event,EventName,EventSource,SourceName,EventData)select * from WSS_Content.dbo.AuditData where Occurred < CAST('2010-07-08 12:32:49' AS datetime)"
The date in CAST('2010-07-08 12:32:49' AS datetime) is the cutoff that determines which rows are copied: every row that occurred before this timestamp ends up in the archive table. Use the same timestamp in step 2 when you delete the entries, so the set of copied rows matches the set of deleted rows.
You can execute this query manually, through code, or use LINQ against both tables.
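If you want to run the copy from code, a minimal ADO.NET sketch could look like this. The connection string is a placeholder for your own environment, and the cutoff date is passed as a parameter rather than concatenated into the SQL:

```csharp
using System;
using System.Data.SqlClient;

class AuditBackup
{
    static void Main()
    {
        // Placeholder connection string; point it at your own SQL Server.
        string connectionString =
            "Data Source=.;Initial Catalog=WSS_Content;Integrated Security=True";

        // Same copy query as above, with the cutoff as a parameter.
        string sql =
            "insert into MyBackupDb.dbo.AuditDataArchive " +
            "(SiteId,ItemId,ItemType,UserId,MachineName,MachineIp,DocLocation," +
            "LocationType,Occurred,Event,EventName,EventSource,SourceName,EventData) " +
            "select * from WSS_Content.dbo.AuditData where Occurred < @cutoff";

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(sql, connection))
        {
            command.Parameters.AddWithValue("@cutoff",
                new DateTime(2010, 7, 8, 12, 32, 49));
            command.CommandTimeout = 600; // the copy can take several minutes

            connection.Open();
            int copied = command.ExecuteNonQuery();
            Console.WriteLine("Copied {0} audit rows.", copied);
        }
    }
}
```

Note the generous CommandTimeout: as mentioned below, the copy can run for several minutes on a large table, so the default timeout of 30 seconds is likely too short.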
2. Delete the correct rows via the API
You can use the SPAudit class to delete audit entries. The SPAudit.DeleteEntries method takes a single DateTime parameter; all audit entries that were stored before that datetime will be deleted.
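A minimal sketch of the delete step might look like this, assuming a console application running on a farm server with a reference to Microsoft.SharePoint.dll. The site URL is a placeholder, and the cutoff is the same timestamp used in the backup query:

```csharp
using System;
using Microsoft.SharePoint;

class AuditCleanup
{
    static void Main()
    {
        // Use the same cutoff as in the backup query, so only rows
        // that were already copied to the archive table get deleted.
        DateTime cutoff = new DateTime(2010, 7, 8, 12, 32, 49);

        // Placeholder URL; point it at your own site collection.
        using (SPSite site = new SPSite("http://yourserver/sites/yoursite"))
        {
            // DeleteEntries removes every audit entry stored before 'cutoff'.
            int deleted = site.Audit.DeleteEntries(cutoff);
            Console.WriteLine("Deleted {0} audit entries.", deleted);
        }
    }
}
```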
Try to do these operations when there is no activity on your site. The SQL query that copies the rows can take several minutes to run and puts a heavy load on the CPU of your SQL Server.
Secondly, because the query takes so long to complete, rows inserted after the query started will not be copied. If you then delete everything up to the current time, there is a chance you delete rows that were never archived. Using the same fixed cutoff timestamp for both the copy query and DeleteEntries avoids this, because only rows that fall before the cutoff, and were therefore copied, get deleted.