For improved performance, take advantage of the Bulk() API to update the collection efficiently, sending the operations to the server in batches (for example, a batch size of 500). This gives you much better performance because you round-trip to the server only once per 500 operations instead of with every single request, making your updates quicker and more efficient.
The following demonstrates this approach. The first example uses the Bulk() API, available in MongoDB versions >= 2.6 and < 3.2, to update all the matched documents in the collection from a given array by incrementing the shown field by 1. It assumes the array of images has the structure:
var images = [
    { "_id": 1, "name": "img_1.png" },
    { "_id": 2, "name": "img_2.png" },
    { "_id": 3, "name": "img_3.png" },
    ...
    { "_id": n, "name": "img_n.png" }
]
MongoDB versions >= 2.6 and < 3.2:
var bulk = db.images.initializeUnorderedBulkOp(),
    counter = 0;

images.forEach(function (doc) {
    bulk.find({ "_id": doc._id }).updateOne({
        "$inc": { "shown": 1 }
    });
    counter++;

    if (counter % 500 === 0) {
        // Execute per 500 operations
        bulk.execute();
        // Re-initialize every 500 update statements
        bulk = db.images.initializeUnorderedBulkOp();
    }
});

// Clean up remaining operations in the queue
if (counter % 500 !== 0) { bulk.execute(); }
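Note that initializeUnorderedBulkOp() lets the server apply the operations in any order and continue past individual errors, which suits independent per-document updates like these. If you want to verify what a batch actually did, execute() returns a BulkWriteResult whose counters you can inspect; a minimal sketch of the final flush rewritten to capture it (nMatched and nModified are standard BulkWriteResult fields in the shell):

// Capture the result of the final batch to verify the updates
if (counter % 500 !== 0) {
    var result = bulk.execute();
    print("matched: "  + result.nMatched);   // documents matched by the filters
    print("modified: " + result.nModified);  // documents actually changed
}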
The next example applies to MongoDB version 3.2, which has since deprecated the Bulk() API and provided a newer set of APIs using bulkWrite().
MongoDB version 3.2 and greater:
var ops = [];

images.forEach(function (doc) {
    ops.push({
        "updateOne": {
            "filter": { "_id": doc._id },
            "update": {
                "$inc": { "shown": 1 }
            }
        }
    });

    if (ops.length === 500) {
        db.images.bulkWrite(ops);
        ops = [];
    }
});

if (ops.length > 0) db.images.bulkWrite(ops);
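As with the Bulk() API, bulkWrite() returns a result document whose counters (matchedCount, modifiedCount) you can check. A minimal sketch of the final flush capturing it, assuming the loop above has populated ops:

// Capture the result of the final bulkWrite() call
if (ops.length > 0) {
    var res = db.images.bulkWrite(ops);
    print("matched: "  + res.matchedCount);   // documents matched by the filters
    print("modified: " + res.modifiedCount);  // documents actually updated
}

By default bulkWrite() executes the operations in order; passing { "ordered": false } as a second argument mirrors the unordered behavior of the first example.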