Bulk update in PyMongo using multiple ObjectIds

bulk = db.testdata.initialize_unordered_bulk_op()

for id in ids:
    bulk.find({ '_id': id }).update({ '$set': { 'isBad': 'N' } })

bulk.execute()

Iterate through the id list using a for loop and send the bulk updates in batches of 500:

bulk = db.testdata.initialize_unordered_bulk_op()
counter = 0

for id in ids:
    # process in bulk
    bulk.find({ '_id': id }).update({ '$set': { 'isBad': 'N' } })
    counter += 1

    if counter % 500 == 0:
        bulk.execute()
        # start a fresh unordered bulk for the next batch of 500
        bulk = db.testdata.initialize_unordered_bulk_op()

# flush any remaining operations that did not fill a full batch
if counter % 500 != 0:
    bulk.execute()

Because write commands can accept no more than 1000 operations (see the docs), you have to split the bulk operations into multiple batches; within that constraint you can pick an arbitrary batch size of up to 1000.

The reason for choosing 500 is to ensure that the sum of the associated document from Bulk.find() and the update document stays at or below the maximum BSON document size, since there is no guarantee that using the default of 1000 operations per request will fit under the 16 MB BSON limit. The Bulk() operations in the mongo shell and comparable methods in the drivers do not have this limit.
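On recent PyMongo versions, where the initialize_unordered_bulk_op() helper is no longer available, the same batched update can be written with bulk_write() and UpdateOne. This is a minimal sketch, reusing the db, testdata, and ids names from the example above and assuming ids already holds ObjectId values:

from pymongo import UpdateOne

# Build one UpdateOne request per _id.
requests = [UpdateOne({ '_id': id }, { '$set': { 'isBad': 'N' } }) for id in ids]

# Send the requests in batches of 500, unordered like the example above.
for i in range(0, len(requests), 500):
    db.testdata.bulk_write(requests[i:i + 500], ordered=False)

Note that bulk_write() also splits its request list internally to respect the server's batch limits, so the explicit 500-operation chunks here mainly mirror the batching logic of the original example.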