Linux Text editors able to work with very, very long lines & fixed length records [closed]

Solution 1:

I think a file like the one you are describing is, for practical purposes, a binary file, and you should treat it as such.

You could attack it with a hex editor, but that doesn't help you with the EBCDIC part.

If you have to do a lot of editing on this file and are intimately familiar with its fixed-length record format, it may be worth your time to whip up something in Perl or another language (I suggest Perl because it's mature and very likely has modules that convert between UTF-8 and EBCDIC) that works with this file's specific format.
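
As a rough illustration of that approach, here's a minimal sketch using Perl's core Encode module, which ships with EBCDIC code pages such as cp1047 and cp37. The 80-byte record length, the cp1047 code page, and the file names are assumptions -- substitute whatever your format actually uses:

#!/usr/bin/perl
use strict;
use warnings;
use Encode qw(decode encode);

my $reclen = 80;                                   # assumed fixed record length
my ($in, $out) = ('data.ebcdic', 'data.edited');   # placeholder file names

open my $ifh, '<:raw', $in  or die "open $in: $!";
open my $ofh, '>:raw', $out or die "open $out: $!";

# Read exactly one record at a time; no line endings are assumed.
while (read($ifh, my $rec, $reclen)) {
    my $text = decode('cp1047', $rec);   # cp1047 is one common EBCDIC code page

    # ... inspect or edit $text here ...

    # Re-encode and write the record back; if your edit changes the
    # length, pad or truncate to $reclen yourself before printing.
    print {$ofh} encode('cp1047', $text);
}

close $ifh;
close $ofh;

Working record-by-record like this keeps memory use flat no matter how long the file is, which is exactly what the line-oriented editors are struggling with.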

Solution 2:

I actually just tried vim on a file with a single line of 150000 characters and it ran smooth as silk.

So I guess you should really give it a go, if you've been avoiding it only because you heard vim doesn't like that kind of file.

Here's how I got my file:

seq 150000 | while read num; do echo -n "b" ; done > megaline.txt
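
For what it's worth, a Perl one-liner builds the same test file in a single write, which is much faster than looping in the shell (the 150000 count and megaline.txt name just mirror the example above):

perl -e 'print "b" x 150000' > megaline.txt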

Solution 3:

I tried the same method as Dakatine with my emacs install and it worked fine. No hangups, nothing. If you're editing plain text in text-mode (i.e. not doing the kind of analysis that a programming-language mode requires), it's going to be really, really hard to lock up emacs. The behavior you're seeing with Eclipse is probably a reflection of the fact that Eclipse is trying to do some sort of analysis on the text as it renders it -- ditto for Gedit. I also can't reproduce your issue with less -- it comes out fine for me.

Solution 4:

Give it a try: jEdit - a programmer's text editor

It's got some advanced buffering features and highly optimized I/O, but it takes a while to start up (due to the JVM). I've been using it to view & edit files over 1 GiB without any trouble, at least on Debian. I can't guarantee it will behave as well on Windows though... :)