MemoryError due to the huge input file size

Solution 1:

Obviously the file is too large to be read into memory all at once: read() and readlines() both build the entire contents as a single object, which is what triggers the MemoryError.

Why not just use:

with open("data.txt") as myfile:
    for line in myfile:
        do_something(line.rstrip("\n"))

or, if you're not on Python 2.6 or higher:

myfile = open("data.txt")
for line in myfile:
    do_something(line.rstrip("\n"))
myfile.close()

In both cases, you get an iterator that yields the file one line at a time, so only a single line is ever held in memory, yet it can be treated much like a list of strings.
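
Because the file object is a lazy iterator, the standard iterator tools work on it directly. A minimal sketch (assuming the same data.txt as above; the count of 10 is just a placeholder):

from itertools import islice

with open("data.txt") as myfile:
    # islice pulls only the first 10 lines from the iterator;
    # the rest of the file is never read into memory
    first_ten = [line.rstrip("\n") for line in islice(myfile, 10)]
print(first_ten)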

EDIT: Since your way of reading the entire file into one large string and then splitting it on newlines removes the newlines in the process, I have added a .rstrip("\n") to my examples to better match that result.
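
For comparison, here is a sketch of the whole-file pattern the question describes (again assuming data.txt), which is what exhausts memory on large inputs:

with open("data.txt") as myfile:
    # read() materializes the entire file as one string, and split("\n")
    # then builds a full list of lines on top of it, so both copies
    # grow with the size of the file
    lines = myfile.read().split("\n")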

Solution 2:

Use this code to read the file line by line:

for line in open('data.txt'):
    do_something(line.rstrip("\n"))  # work with one line at a time

Note that this relies on the file object being closed when it is garbage-collected; the with form in Solution 1 closes the file deterministically.