I recently asked a question about how to save large Python objects to file. I had previously run into problems converting massive Python dictionaries into strings and writing them to file via write(). Now I am using pickle. Although it works, the files are incredibly large (> 5 GB). I have little experience with files this large. I wanted to know if it would be faster, or even possible, to zip this pickle file before writing it to disk.
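For reference, the kind of compression I have in mind is roughly the following sketch, which pipes the pickle through the standard gzip module (the file name is made up for illustration):

```python
import gzip
import pickle

# Stand-in for the real multi-gigabyte dictionary
data = {i: str(i) * 10 for i in range(1000)}

# Write the pickle through a gzip stream instead of a plain file
with gzip.open("data.pkl.gz", "wb") as f:
    pickle.dump(data, f, protocol=pickle.HIGHEST_PROTOCOL)

# Reading works the same way in reverse
with gzip.open("data.pkl.gz", "rb") as f:
    restored = pickle.load(f)
```

Is this approach reasonable for data of this size, or will the compression overhead dominate?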
Solution
Pure Python code is extremely slow at data serialization: if you tried to write an equivalent of pickle in pure Python, you would find it far slower than the built-in version. Fortunately, the built-in modules that handle serialization are implemented in C and are quite fast.
Apart from cPickle, there is the marshal module, which is a lot faster, but it needs a real file handle (not a file-like object). You can do import marshal as pickle and see the difference.
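A minimal round trip with marshal might look like this (note that marshal only handles core built-in types, and its format is not guaranteed to be stable across Python versions):

```python
import marshal

# marshal supports only core built-in types (dict, list, tuple, str, int, ...)
data = {"a": [1, 2, 3], "b": (4.0, None)}

# Open a real file in binary mode and dump directly into it
with open("data.marshal", "wb") as f:
    marshal.dump(data, f)

with open("data.marshal", "rb") as f:
    restored = marshal.load(f)
```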
I don't think you can write a custom serializer that is much faster than this...
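If the goal is both speed and smaller files, the two ideas can be combined: serialize with marshal and compress the resulting bytes with gzip. A sketch (since a GzipFile is not a real file handle, this goes through marshal.dumps/loads rather than dump/load):

```python
import gzip
import marshal

data = {i: list(range(5)) for i in range(100)}

# Serialize to bytes with marshal, then compress those bytes with gzip
with gzip.open("data.marshal.gz", "wb") as f:
    f.write(marshal.dumps(data))

with gzip.open("data.marshal.gz", "rb") as f:
    restored = marshal.loads(f.read())
```

Whether the smaller file is worth the extra CPU time spent compressing depends on how compressible the data is and how fast the disk is.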