I need to read a CSV of more than 100,000 records and insert it into the database. With a small number of rows it does what I want, but with a file of the size I mention the page freezes.
My idea was to process it in batches, for example 500 records at a time. How could I do that? What should I modify?
What I have so far:
jQuery.ajax({
    url: 'procesarcsv.php',
    type: 'POST',
    dataType: 'html',
    data: formdata,
    async: true,
    success: function(response){
        alert(response);
    },
    error: function(jqXHR, textStatus, errorThrown){
        alert("error " + errorThrown);
    }
});
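My idea is to repeat this call once per batch instead of sending everything in one request. Something like the following minimal sketch is what I have in mind; the offset and batchsize fields, the batch size of 500, and the processed/done fields in the JSON response are names I am making up here, the current procesarcsv.php does not understand or return them:

// Minimal sketch: one request per batch, advancing an offset each time.
// The offset/batchsize fields and the {processed, done} response shape are
// assumptions; procesarcsv.php would have to be adapted to them.
function procesarLote(offset) {
    jQuery.ajax({
        url: 'procesarcsv.php',
        type: 'POST',
        dataType: 'json',
        data: { offset: offset, batchsize: 500 },
        success: function(response) {
            if (!response.done) {
                // Request the next batch only after this one has finished.
                procesarLote(offset + response.processed);
            } else {
                alert('Import finished');
            }
        },
        error: function(jqXHR, textStatus, errorThrown) {
            alert("error " + errorThrown);
        }
    });
}

procesarLote(0); // start with the first batch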
The AJAX call invokes the .php function below. Right now it always starts reading from the first line of the file, so I imagine the server side is what I really have to change:
public function procesarcsv(){
    $batchsize = 10;
    $handle = fopen("/carpeta/micsv.csv", "r");
    for ($i = 0; $i < $batchsize; $i++) {
        $line = fgetcsv($handle);
        if ($line === false) { // stop at end of file
            break;
        }
        $col1 = $line[0];
        $col2 = $line[1];
        $col3 = $line[2];
        $col4 = $line[3];
        $col5 = $line[4];
        $col6 = $line[5];
        $col7 = $line[6];
        $json = "'$col1', '$col2', '$col3', '$col4', '$col5', '$col6', '$col7'";
        echo "JSON = " . $json . "<br>";
        $this->insertIntoDDBB($json); // inserts each row into the database
    }
    fclose($handle);
}
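On the PHP side, what I imagine the batched version would look like is something along these lines: a minimal sketch assuming the offset and batchsize POST fields from the JavaScript sketch above (again, those names and the JSON response shape are my own invention, not part of my current code):

public function procesarcsv(){
    // Hypothetical POST fields: where to resume and how many rows per batch.
    $offset    = isset($_POST['offset'])    ? (int) $_POST['offset']    : 0;
    $batchsize = isset($_POST['batchsize']) ? (int) $_POST['batchsize'] : 500;

    $handle = fopen("/carpeta/micsv.csv", "r");

    // Skip the rows that previous batches already inserted.
    for ($i = 0; $i < $offset; $i++) {
        if (fgetcsv($handle) === false) {
            break;
        }
    }

    // Read and insert at most $batchsize rows.
    $processed = 0;
    while ($processed < $batchsize && ($line = fgetcsv($handle)) !== false) {
        // Same value string that insertIntoDDBB already expects.
        $values = "'" . implode("', '", array_slice($line, 0, 7)) . "'";
        $this->insertIntoDDBB($values);
        $processed++;
    }

    $done = ($processed < $batchsize) || feof($handle);
    fclose($handle);

    // Tell the client how far it got so it can request the next batch.
    echo json_encode(array('processed' => $processed, 'done' => $done));
}

Skipping with fgetcsv re-reads the file from the beginning on every request; if that turns out to be too slow for 100,000 rows, I suppose the byte position from ftell() could be returned instead of a row count and passed to fseek() on the next call.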
How can I read a CSV with hundreds of thousands of records and insert it in batches?