Poor performance with large .txt files

Is there an inherent flaw in the design of .txt files that causes significant performance drops when working with relatively large files? I am working with a file of over a million characters on Windows 8, and not only do editors (specifically Notepad++ and WordPad) run slowly, they also frequently crash during large find-and-replace operations (for instance, when the operation makes over 60,000 replacements).
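
One workaround, if the goal is simply to get the replacements done, is to script the operation outside the editor. A minimal sketch in Python, assuming a literal (non-regex) search string and a UTF-8 encoded file; the file names and strings below are placeholders:

    SRC = "big.txt"           # placeholder input path
    DST = "big_replaced.txt"  # placeholder output path
    OLD = "search-text"       # literal string to find (placeholder)
    NEW = "replacement"       # substitute text (placeholder)

    # A file of a million characters is only a few MB, so reading it
    # whole is safe; str.replace performs tens of thousands of
    # substitutions in one pass, with none of an editor's UI overhead.
    with open(SRC, "r", encoding="utf-8") as f:
        text = f.read()

    with open(DST, "w", encoding="utf-8") as f:
        f.write(text.replace(OLD, NEW))

Writing to a separate output file also means the original stays untouched if anything goes wrong mid-run.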

I have verified that this has nothing to do with the particular machine on which the text file is being edited (the same poor performance occurs on a similar, high-spec machine).

I initially thought this was because the file in question was being edited on a network share, but copying it to a local folder still resulted in the same poor performance.

The performance drop is particularly severe when newlines are removed (working with a single word a million characters in length): there are noticeable frame drops in the UI despite the high-end machine.
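
The newline-removal step in particular does not need an editor at all; it can be streamed so that memory use stays flat regardless of file size. A sketch, again with placeholder file names:

    SRC = "big.txt"            # placeholder input path
    DST = "big_one_line.txt"   # placeholder output path

    with open(SRC, "rb") as src, open(DST, "wb") as dst:
        for chunk in iter(lambda: src.read(1 << 20), b""):
            # Strip CR and LF separately so a CRLF pair split across
            # a chunk boundary is still removed correctly.
            dst.write(chunk.replace(b"\r", b"").replace(b"\n", b""))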


Solution 1:

Try UltraEdit. It's the best choice in this case.

Features at a glance:

  • Column / block editing

  • Multi-caret editing

  • Multi-select

  • Syntax highlighting

  • Integrated FTP client

  • Integrated SSH/telnet

  • Editor themes

  • Large file editing (4+ GB)

  • File / data sorting

  • Powerful search

  • Supports regex

  • CSV data reformatting

  • Macros and scripts

  • File compare

  • Smart templates

Solution 2:

Notepad++ was not designed for large text files. Instead, I recommend a program called glogg, a viewer built specifically for browsing and searching very large files. It can be found here: http://glogg.bonnefon.org/
