What causes the error "_pickle.UnpicklingError: invalid load key, ' '."?

I'm trying to load 5000 data elements into an array. These 5000 elements are stored in an existing file (so it is not empty).

But I'm getting an error and I don't know what is causing it.

IN:

import io
import os
import pickle

def array():

    name = 'puntos.df4'

    m = open(name, 'rb')
    v = []

    # start reading 5000 bytes before the end of the file
    m.seek(-5000, io.SEEK_END)
    fp = m.tell()
    sz = os.path.getsize(name)

    while fp < sz:
        pt = pickle.load(m)
        v.append(pt)

    m.close()
    return v

OUT:

line 23, in array
pt = pickle.load(m)
_pickle.UnpicklingError: invalid load key, ''.

Solution 1:

Pickling is recursive, not sequential. Thus, to pickle a list, pickle starts pickling the enclosing list, then pickles the first element… diving into that first element and pickling its dependencies and sub-elements until the element is fully serialized. Then it moves on to the next element of the list, and so on, until it finally finishes the list and finishes serializing the enclosing list. In short, it's hard to treat a recursive pickle as sequential, except in some special cases. It's better to use a smarter pattern for your dump if you want to load in a special way.

The most common pattern is to pickle everything with a single dump to a file -- but then you have to load everything at once with a single load. However, if you open a file handle and do multiple dump calls (e.g. one for each element of the list, or a tuple of selected elements), then your load will mirror that… you open the file handle and do multiple load calls until you have all the list elements and can reconstruct the list (see the sketch below). It's still not easy to selectively load only certain list elements, however. To do that, you'd probably have to store your list elements as a dict (with the index of the element or chunk as the key) using a package like klepto, which can break up a pickled dict into several files transparently and enables easy loading of specific elements.
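Here is a minimal sketch of that "multiple dumps, multiple loads" pattern (the file name points.pkl and the sample data are placeholders, not taken from the question):

import pickle

data = list(range(5000))

# dump each element with its own pickle.dump call
with open('points.pkl', 'wb') as fout:
    for item in data:
        pickle.dump(item, fout)

# load by mirroring the dumps: keep calling pickle.load until EOF
loaded = []
with open('points.pkl', 'rb') as fin:
    while True:
        try:
            loaded.append(pickle.load(fin))
        except EOFError:
            break

assert loaded == data

Note that the loading loop stops on EOFError rather than tracking byte offsets, which avoids the kind of offset arithmetic used in the question.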

Saving and loading multiple objects in pickle file?

Solution 2:

This may not be relevant to your specific issue, but I had a similar problem when the pickle archive had been created using gzip.

For example, if a compressed pickle archive is made like this:

import gzip, pickle
with gzip.open('test.pklz', 'wb') as ofp:
    pickle.dump([1,2,3], ofp)

then trying to open it without gzip throws this error:

with open('test.pklz', 'rb') as ifp:
    print(pickle.load(ifp))

Traceback (most recent call last):
  File "<stdin>", line 2, in <module>
_pickle.UnpicklingError: invalid load key, ''.

But if the pickle file is opened using gzip, all is harmonious:

with gzip.open('test.pklz', 'rb') as ifp:
    print(pickle.load(ifp))

[1, 2, 3]
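If you are not sure whether a given file was gzip-compressed, one rough check (a sketch; load_maybe_gzipped is just an illustrative helper name) is to peek at the first two bytes, since gzip streams always start with the magic bytes 0x1f 0x8b:

import gzip, pickle

def load_maybe_gzipped(path):
    # gzip streams start with the two magic bytes 0x1f 0x8b
    with open(path, 'rb') as f:
        magic = f.read(2)
    opener = gzip.open if magic == b'\x1f\x8b' else open
    with opener(path, 'rb') as f:
        return pickle.load(f)

print(load_maybe_gzipped('test.pklz'))  # [1, 2, 3] for the file created above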

Solution 3:

If you transferred these files via disk or other means, it is likely they were not saved or copied properly; a truncated or text-mode transfer corrupts the pickle stream and produces exactly this kind of error.
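If you still have access to both the original file and the transferred copy, a simple sanity check (a sketch; the paths are placeholders) is to compare checksums, since any truncation or corruption during transfer will change the hash:

import hashlib

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(8192), b''):
            h.update(chunk)
    return h.hexdigest()

print(sha256_of('original/puntos.df4') == sha256_of('copy/puntos.df4'))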

Solution 4:

I solved my issue by:

  • Remove the cloned project
  • Install git lfs: sudo apt-get install git-lfs
  • Set up git lfs for your user account: git lfs install
  • Clone the project again.

I hope you find it useful too.
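Some background on why this works: when Git LFS is not installed, LFS-tracked files are checked out as small text pointer files (beginning with the line "version https://git-lfs.github.com/spec/v1") instead of the real binary data, so pickle fails with an invalid load key. A quick sketch to spot this situation (the file name model.pkl is a placeholder):

with open('model.pkl', 'rb') as f:
    head = f.read(7)

# a Git LFS pointer file begins with the ASCII word "version"
if head == b'version':
    print('This is a Git LFS pointer, not the real data; '
          'install git-lfs and run "git lfs pull" or re-clone.')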

Solution 5:

I am not completely sure what you're trying to achieve by seeking to a specific offset and attempting to load individual values manually, but the typical usage of the pickle module is:

import pickle

# save data to a file
with open('myfile.pickle', 'wb') as fout:
    pickle.dump([1, 2, 3], fout)

# read data from a file (the file must be opened in binary mode)
with open('myfile.pickle', 'rb') as fin:
    print(pickle.load(fin))

# output
>> [1, 2, 3]

If you dumped a list, you'll load a list; there's no need to load each item individually.

You're saying that you got an error even before you added the seek to the -5000 offset; maybe the file you're trying to read is corrupted.

If you have access to the original data, I suggest you try saving it to a new file and reading it as in the example.