_pickle in Python 3 doesn't work for saving large data

Not anymore as of Python 3.4, which implements PEP 3154 and pickle protocol 4:
https://www.python.org/dev/peps/pep-3154/

But you need to say explicitly that you want to use protocol version 4:
https://docs.python.org/3/library/pickle.html

with open("file", "wb") as f:   # pickle requires a binary-mode file
    pickle.dump(d, f, protocol=4)
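
When you read the data back you don't need to pass the protocol again, because pickle records it in the stream. A minimal sketch, reusing the "file" name from above:

import pickle

# The protocol is stored in the pickle stream itself, so load() detects it automatically.
with open("file", "rb") as f:
    d = pickle.load(f)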

Yes, this is a hard-coded limit; from the save_bytes function in CPython's _pickle.c:

else if (size <= 0xffffffffL) {
    /* ... */
}
else {
    PyErr_SetString(PyExc_OverflowError,
                    "cannot serialize a bytes object larger than 4 GiB");
    return -1;          /* string too large */
}

Protocols before version 4 use 4 bytes to write the size of the object to disk, which means they can only track sizes of up to 2**32 == 4 GiB.
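
A rough illustration of where the limit bites (this assumes a machine with enough free memory to hold a 4 GiB bytes object; the "big.pickle" filename is arbitrary):

import io
import pickle

big = b"\x00" * 2**32        # exactly 2**32 bytes, one past what a 4-byte length can hold

try:
    pickle.dump(big, io.BytesIO(), protocol=3)   # old 4-byte length field overflows
except OverflowError as e:
    print(e)                 # cannot serialize a bytes object larger than 4 GiB

# Protocol 4 writes an 8-byte length, so the same object pickles without complaint:
with open("big.pickle", "wb") as f:
    pickle.dump(big, f, protocol=4)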

If you can break up the bytes object into multiple objects, each smaller than 4 GiB, you can of course still save the data to a pickle; see the sketch below.
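
A rough sketch of that workaround; dump_big_bytes, load_big_bytes and the 2 GiB chunk size are names and values picked purely for illustration:

import pickle

CHUNK = 2**31   # 2 GiB per slice, comfortably below the 4 GiB limit

def dump_big_bytes(data, path):
    # Split the bytes object into < 4 GiB slices and pickle the list of slices.
    chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    with open(path, "wb") as f:
        pickle.dump(chunks, f)

def load_big_bytes(path):
    # Read the list of slices back and glue them together again.
    with open(path, "rb") as f:
        return b"".join(pickle.load(f))

Note that both the slicing and the final join temporarily double the memory footprint, so switching to protocol 4 is the nicer fix whenever Python 3.4+ is available.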