Massive insertions in MongoDB with Python


I'm using Python + MongoDB via pymongo, and so far everything works fine.

But now I'd like to optimize the inserts. What I do is read a CSV with 50,000 rows and loop row by row, converting each row to JSON and inserting it one at a time (insert). It takes a long time, but it gets there.

The problem now is that I have about 200 CSV files, each with between 50,000 and 150,000 rows, and this approach is no longer tenable.

I wanted to know if there is any way to do bulk insertions in MongoDB.

For example, in PostgreSQL there is the COPY statement, which loads an entire file into the database.
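A minimal sketch of the batched approach described above, using pymongo's `insert_many` instead of one `insert` per row. The file name `datos.csv`, the database/collection names, and the `batch_size` value are placeholders, not from the original post:

```python
import csv
from itertools import islice


def chunks(iterable, size):
    """Yield lists of at most `size` items from any iterable."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch


def load_csv(path, collection, batch_size=1000):
    """Read the CSV row by row, but insert in batches of `batch_size`."""
    total = 0
    with open(path, newline="", encoding="utf-8") as f:
        # csv.DictReader already produces one dict per row,
        # which is exactly the document shape pymongo expects.
        for batch in chunks(csv.DictReader(f), batch_size):
            # ordered=False lets MongoDB keep going past individual
            # failed documents instead of aborting the whole batch.
            collection.insert_many(batch, ordered=False)
            total += len(batch)
    return total


# Usage (assumes a local mongod; names are hypothetical):
# from pymongo import MongoClient
# col = MongoClient()["midb"]["micoleccion"]
# load_csv("datos.csv", col)
```

As a closer analogue of PostgreSQL's COPY, MongoDB also ships a command-line tool, `mongoimport`, which can load a CSV directly (e.g. `mongoimport --type=csv --headerline`) without writing any Python at all.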

Thanks in advance, greetings.
asked by jonellrz 16.04.2017 в 23:24

0 answers