What are the implications of exceeding 4 GB in a Windows Event Log?

Other than the awful performance and ridiculous wait times when you have to load a 4 GB log, and the hell it will be if you ever have to search through such a monstrous thing, not much. The largest one I've seen in my environments was 10 GB, and although I gave up waiting for it to load, it didn't seem to harm anything.

The 4 GB caution for Server 2008 stems from the 4 GB limit that 32-bit processes often run into. On a 64-bit system you should be fine letting it grow to 16 TB (or 64 TB, depending on the version), though I don't know that anyone's gotten anywhere close to testing that limit.
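If you want to see where a log currently stands relative to its configured maximum, Get-WinEvent can list that directly on Vista/2008 and later (a quick sketch; 'Security' here is just an example log name):

```powershell
# Show the current size, configured maximum, and record count for a log.
# Works on Vista/Server 2008 and later; 'Security' is just an example.
Get-WinEvent -ListLog Security |
    Select-Object LogName, FileSize, MaximumSizeInBytes, RecordCount
```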

Of course, if you haven't already, you'll discover that very large log files are simply impractical to use - the last time I tried to load a simple 100 GB (text) log file, it crashed every application I tried to open it with, and I suspect you'll hit that problem well before 100 GB.

The far better approach is to limit the file size to something reasonable and use a script to clear it out from time to time. I use the script below in my environment, combined with a 1 GB size limit on our security log. Some (well, most) of our servers generate over 3 GB of security events per day, and we don't want to waste all that space on huge log files I'd quit before combing through. So the script copies the log contents to another folder and then clears the event log so it can be written to again. And since the folder I copy them to is backed up, we can always go back to the logs in the horrible event that we need to.

#Adapted from: http://blogs.technet.com/b/heyscriptingguy/archive/2009/04/08/how-can-i-check-the-size-of-my-event-log-and-then-backup-and-archive-it-if-it-is-more-than-half-full.aspx

Param($logName = "security",$backupFolder = "C:\backupLogs")

Function Get-EventLog([string]$logName)
{
 # Note: this function shadows the built-in Get-EventLog cmdlet for the duration of the script
 $log = Get-WmiObject -Class Win32_NTEventLogFile -Filter "LogFileName = '$logName'"
 If($log.FileSize / $log.MaxFileSize -ge .9)
  {
   "Log is at least 90% full. Backing up now."
   Backup-EventLog($log)
  } #end if
 Else
 {
   "Not backed up: $logName is only " + (($log.FileSize / $log.MaxFileSize) * 100).ToString("N2") + " percent full"
 } #end else
} #end Get-EventLog

Function Backup-EventLog($log)
{
 $folder = Join-Path -Path $BackUpFolder -ChildPath (Get-Date).ToString("MMddyy_hhmm")
 If(-not(Test-Path $folder)) 
   { 
     New-Item -path $folder -itemtype Directory -force | out-Null
   }
  $rtn = $log.BackupEventLog("$folder\$logName.evt").ReturnValue
  If($rtn -eq 0)
    {
     $log.ClearEventLog() | Out-Null
    } #end if
  Else
    {
     "$logName was not cleared: BackupEventLog returned $rtn"
    } #end else
} #end Backup-EventLog

# *** ENTRY POINT ***
Get-EventLog -logname $logname
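For reference, the 1 GB cap this script is paired with can also be set from PowerShell rather than through the Event Viewer GUI (a sketch using the built-in Limit-EventLog cmdlet; the log name and size mirror the setup described above):

```powershell
# Cap the Security log at 1 GB and overwrite the oldest events as needed.
# Requires an elevated Windows PowerShell session.
Limit-EventLog -LogName Security -MaximumSize 1GB -OverflowAction OverwriteAsNeeded
```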

The other answer covers the reasoning behind this - for modern systems, it's mostly about keeping load times in the Event Viewer GUI somewhat bearable. Copying the current log to a location that gets backed up, then clearing it, is also a good approach.

For parsing the large log files that end up being generated anyway, two good options come to mind:

1) Parse the log faster than the current GUI can manage, or 2) Split the log into separate files.

I'm sure there are some easily-available utilities out there for 2), so I'll focus on 1).
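That said, the built-in wevtutil can do a rough version of 2) by exporting a time-bounded slice of a log to its own file (a sketch; the output path and the 24-hour XPath window are just example values):

```powershell
# Export only the last 24 hours of the Security log to a separate .evtx file.
# 86400000 is 24 hours in milliseconds; adjust the window to taste.
wevtutil epl Security C:\Temp\security-slice.evtx /q:"*[System[TimeCreated[timediff(@SystemTime) <= 86400000]]]"
```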

Firstly, PowerShell has an excellent cmdlet for this called Get-WinEvent. The fastest performance I've seen comes from using a filter hash table. Here's an example that gets all events in the security log pertaining to a specific user from the last day:

$timeframe = (Get-Date) - (New-TimeSpan -Days 1)
$userevt = Get-WinEvent -ComputerName <specify> -FilterHashtable @{LogName='Security'; Data='<enter username here>'; StartTime=$timeframe}

$userevt is now a collection of events. Depending on the number of matches, you can pipe it through Format-List to easily read a small number of events. For a medium number, do the same but redirect the output to a file:

$userevt | Format-List > <outputfile>.txt

For a large number, start filtering (say you want the caller computer for a lockout event on the user we acquired above):

$userevt | ForEach-Object { if ($_.Message -match "Caller Computer .*") { $Matches[0] } }

This will show a single-line result for each lockout event. The above processes generally take 1-4 minutes for a 4 GB log on 2008 R2.

Secondly, especially for any 2003 machines you might end up having to manage, you can right-click a particular log file in the left pane of Event Viewer and select 'Save Log File As'.

If you are running event viewer on the local machine, you can save a .evt file that can be parsed by get-winevent.

Alternatively, you can save a text or CSV file (I find CSV easier) that can be parsed by appropriate command-line utilities such as grep or findstr, or certain programs like notepad++.
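A saved .evt/.evtx file can also be fed straight back to Get-WinEvent rather than re-importing it into Event Viewer (a sketch; the archive path and filter are example values):

```powershell
# Read an archived log file directly. -Oldest is required when reading
# .evt files saved from older systems, and harmless for .evtx.
Get-WinEvent -Path "C:\backupLogs\security.evt" -Oldest |
    Where-Object { $_.Message -match "Caller Computer" }
```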


Real world example: We hit this when our Security logs were increased to 12 GB to allow 6 months of retention per a compliance requirement.

By month 3 we were unable to log on to our 2008 R2 and 2012 R2 servers; the logon would get stuck at the "Welcome" screen. We tried increasing server memory to 20 GB to accommodate the large files being opened, and the servers were still angry. We ended up following ManageEngine's recommendation of a 1 GB limit, and adjusted the log to archive when full rather than overwrite old events.

We have the script below to clean up archived files older than 180 days if we need it, but we can likely just keep the files in place.

Get-ChildItem -Path "C:\Windows\System32\winevt\Logs" |
  Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-180) } |
  Remove-Item -WhatIf  # remove -WhatIf to actually delete

https://www.manageengine.com/products/active-directory-audit/help/getting-started/event-log-size-retention-settings.html