I'm working on a search engine project.
We are using Python + MongoDB.
I'm having the following problem:
I have a PyMongo cursor after executing a find() command against MongoDB.
The cursor has around 20k results.
I have noticed that iterating over the PyMongo cursor is really slow compared with a normal iteration over, for example, a list of the same size.
I did a little benchmark:
-iteration over a list of 20k strings: 0.001492 seconds
-iteration over a pymongo cursor with 20k results: 1.445343 seconds
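For reference, the timings above can be reproduced with a small helper like the one below. The MongoDB part is only sketched in comments, since the database host and collection names are assumptions, not from the question:

```python
import time

def time_iteration(iterable):
    """Return the wall-clock seconds taken to exhaust an iterable."""
    start = time.time()
    for _ in iterable:
        pass
    return time.time() - start

# Plain in-memory list of 20k strings.
list_time = time_iteration(["some document text"] * 20000)

# Hypothetical MongoDB comparison (requires a running mongod and a
# populated collection; "searchdb.documents" is illustrative only):
# from pymongo import MongoClient
# cursor = MongoClient().searchdb.documents.find()
# cursor_time = time_iteration(cursor)
```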
The difference is really significant. It may not be a problem with this amount of results, but if I have millions of results the time would be unacceptable.
Does anyone have an idea why PyMongo cursors are so slow to iterate?
Any idea how I can iterate over the cursor in less time?
Some extra info:
Python v2.6
PyMongo v1.9
MongoDB v1.6 32 bits
Solution
Remember that the PyMongo driver is not giving you back all 20k results at once. It makes network calls to the MongoDB backend for more items as you iterate, so of course it won't be as fast as a list of strings. However, I'd suggest trying to adjust the cursor's batch_size, as outlined in the API docs: