I have to run a series of CSV imports, and I do them through Doctrine in Symfony 3.
The CSV has about 100,000 records. From each line I extract fields for two entities, a client and an invoice: while reading the CSV, I take the customer data from the line and insert it, grab the generated identifier, then create the invoice with the rest of the line's data plus the client's id.
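Roughly, my import loop looks like this (entity and field names are simplified placeholders, not my real schema):

```php
$handle = fopen($csvPath, 'r');
while (($row = fgetcsv($handle)) !== false) {
    // Insert the client first so Doctrine assigns it an id
    $client = new Client();
    $client->setName($row[0]);
    $client->setEmail($row[1]);
    $em->persist($client);
    $em->flush(); // after this, $client->getId() is available

    // Then create the invoice pointing at that client
    $invoice = new Invoice();
    $invoice->setClient($client);
    $invoice->setAmount($row[2]);
    $em->persist($invoice);
    $em->flush();
}
fclose($handle);
```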
To do this I have tried both DQL and the ORM, but with either option it takes about 9 hours to process the complete CSV file.
The database itself isn't the problem, because I've done the same load with Node.js and it was much faster.
I've already tried calling flush() and clear() after every 20 lines of the file, but it's still slow.
Any ideas on how to improve this performance?
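This is the batching variant I tried, a sketch of the standard Doctrine batch pattern applied to my loop (the persist logic is the same as above, only the flushing changes):

```php
$batchSize = 20; // I used 20; larger values (e.g. 500) may behave differently
$i = 0;
while (($row = fgetcsv($handle)) !== false) {
    // ... persist the client and invoice for this row ...
    if ((++$i % $batchSize) === 0) {
        $em->flush();  // push the pending inserts to the database
        $em->clear();  // detach managed entities so the identity map doesn't grow
    }
}
$em->flush(); // flush whatever is left over from the last partial batch
$em->clear();
```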
Thanks.