Mongoose massive inserts / updates

My application is powered by CSV files, which are parsed to JSON without problems.

Once I have the JSON array, I start importing:

let calls = [];

for (let i = 0; i < json.length; i++) {
    calls.push(import(json[i]));
}

Promise.all(calls)
    .then(result => {
        ...
    })
    .catch(error => {
        ...
    });

My import function is:

import(jsonObj) {
    return new Promise(function(resolve, reject) {
        Model.findOneAndUpdate(
            query,
            jsonObj,
            {upsert: true},
            function(error, doc) {
                if (error) return reject(error);
                resolve(doc);
            }
        );
    });
}
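
As an aside, Mongoose's findOneAndUpdate already returns a thenable when called without a callback, so the manual Promise wrapper can likely be dropped. A minimal sketch, assuming the same Model and query as above:

import(jsonObj) {
    // Without a callback, Mongoose returns a Query; .exec() yields a real Promise.
    return Model.findOneAndUpdate(query, jsonObj, {upsert: true}).exec();
}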

When the file has only a few records it works perfectly, but with files of 200,000 records the application runs into memory problems, since all the promises are created and held pending at once.

Is there any way to process the promises in smaller batches to avoid the memory problems?
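
A minimal sketch of the kind of batching I have in mind: await each chunk before building the next one, so only a bounded number of upserts is in flight at a time (the helper name importInBatches, the importFn parameter, and the batch size of 1000 are illustrative, not part of my real code):

async function importInBatches(json, importFn, batchSize = 1000) {
    const results = [];
    for (let i = 0; i < json.length; i += batchSize) {
        // Slice off the next chunk and wait for it to finish before
        // creating the promises for the following chunk.
        const chunk = json.slice(i, i + batchSize);
        const settled = await Promise.all(chunk.map(obj => importFn(obj)));
        results.push(...settled);
    }
    return results;
}

Since every operation is an upsert against the same Model, each chunk could also be sent as a single command with Mongoose's bulkWrite, saving one round trip per record (buildQuery is a hypothetical placeholder for deriving the filter from each record):

await Model.bulkWrite(
    chunk.map(obj => ({
        updateOne: {filter: buildQuery(obj), update: {$set: obj}, upsert: true}
    }))
);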

    
asked by Quidi90 on 16.04.2018 at 14:14
