I need to examine an 82.7 GB (!) text file. What can open it?

We recently had a Tomcat server meltdown that produced an 82.7 GB catalina.out log file, which I saved for forensic analysis.

What macOS editors can open monster text files without consuming 80 GB of RAM or freezing for 15 minutes at a time?


Solution 1:

Try glogg. There is a macOS build on the download page:

https://glogg.bonnefon.org/download.html

I don't know about 80 GB files, but I regularly used it (on Windows) to open log files up to 5 GB, and it works great on those: the memory footprint after indexing is about 100-150 MB, and searching is very fast.

One note, though: it's a read-only log analyzer, not an editor.

Solution 2:

less filename

From the command line, less lets you view the file straight away without loading it all into memory.

Solution 3:

I would not try to open it... I'd rather do:

  1. grep - search for some text
  2. split - chop the file into, say, 10 MB chunks.

Something like:

grep "crash" My80GbFile.txt | more 
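A couple of grep flags are often handy on a log this size; here is a quick sketch (the file name and pattern are placeholders standing in for the real log):

```shell
# Small stand-in for the 80 GB log
printf 'INFO ok\nERROR crash here\nINFO fine\n' > My80GbFile.txt

# -n prints line numbers, -C 1 shows one line of context around each hit
grep -n -C 1 "crash" My80GbFile.txt
```

Line numbers from `-n` are especially useful later, when you want to know which split chunk a hit landed in.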

If the big file is not line-delimited, split by size:

split -b 10M My80GbFile.txt

But if the big file is just a load of lines, then (as was already posted) split by line count, 100,000 lines per sub-file in this case:

split -l 100000 My80GbFile.txt
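An end-to-end sketch of the split step on a small stand-in file (file and prefix names are illustrative); the same commands scale to the real log:

```shell
# Build a small line-oriented stand-in for the big log: 250,000 lines
seq 250000 > MyBigFile.txt

# Chop it into 100,000-line pieces named part_aa, part_ab, part_ac
split -l 100000 MyBigFile.txt part_

# Verify nothing was lost: the pieces reassemble to the original
cat part_* | cmp - MyBigFile.txt && echo "pieces match"
```

Giving split a prefix like `part_` keeps the chunks from colliding with other files named `xaa`, `xab`, etc. in the same directory.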