Fetching a large number of records from my MySQL database with PHP


Greetings. I have a table, "movimientos", with approximately 300 thousand records that I want to fetch with PHP, but when I run my query I get no results unless I add a LIMIT.

This is my code:

$query = "SELECT * FROM movimientos";
$sql = $this->conex->prepare($query);
$sql->execute();
$movimientos = $sql->fetchAll(PDO::FETCH_ASSOC);
if($sql->rowCount()>0) var_dump($movimientos);
else echo "No hay movimientos";

The problem is that, as I said, it gives me no results; it does not even reach the "else". The script just produces a blank page.

BUT, if I modify my query to:

$query = "SELECT * FROM movimientos LIMIT 50000";

Then I do get my results.

Why could this be happening? I researched a bit and found something about the post_max_size and memory_limit settings in php.ini. I have already increased those values, but I still get no results.
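For what it's worth, a common cause of this symptom is that fetchAll() materializes every row into one giant PHP array, which can exhaust memory_limit silently. A minimal sketch of streaming rows one at a time instead is below; it uses an in-memory SQLite database and made-up column names purely so the example is self-contained (against MySQL the fetch pattern is identical, and you could additionally set PDO::MYSQL_ATTR_USE_BUFFERED_QUERY to false so the driver does not buffer the result set either):

```php
<?php
// Demo: process rows one by one instead of fetchAll().
// SQLite in memory is used only so this runs anywhere; the
// table/column names here are illustrative, not from the question.
$conex = new PDO('sqlite::memory:');
$conex->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$conex->exec('CREATE TABLE movimientos (id INTEGER PRIMARY KEY, monto REAL)');
for ($i = 1; $i <= 100; $i++) {
    $conex->exec("INSERT INTO movimientos (monto) VALUES ($i)");
}

// Instead of one huge array from fetchAll(), stream row by row:
$sql = $conex->query('SELECT * FROM movimientos');
$total = 0.0;
$filas = 0;
while ($row = $sql->fetch(PDO::FETCH_ASSOC)) {
    $total += $row['monto']; // process each row, keep nothing around
    $filas++;
}
echo "filas=$filas total=$total\n";
```

With this pattern peak memory stays at roughly one row, regardless of how many rows the table holds.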

I hope you can help me. Thanks in advance, and greetings!

------------------- Edited (Solution) -----------------

I posted my solution as an answer below.

    
asked by DevPHP 12.12.2016 at 18:00

5 answers


Sorry, I had not had time to share my solution. I warn that I do not know whether it is the most advisable one, but it is what worked for me.

First, I had to raise the memory_limit setting in my php.ini to 3MB (I know, too much! Not recommended, but believe me, without this my script does not work). The set_time_limit value (set to 1 minute) was not a problem: the script was not dying from the time limit but from memory / buffer size (or so I understood it), so I did not have to increase it.

Afterwards, I had to adapt the way I obtain the results; thanks to the help given on thewebprogrammer, these changes were suggested to me.

My script used to look like this:

public function prueba(){
    $query = "SELECT * FROM tabla"; // table with 300k+ rows
    $sql = $this->conex->prepare($query); // first mistake: prepare() although there are no parameters
    $sql->execute();
    $movimientos = $sql->fetchAll(PDO::FETCH_ASSOC); // my script died right here
    $res = ($sql->rowCount() > 0) ? $movimientos : false;
    return $res;
}

It remained as follows:

function prueba(){
    $query = "SELECT * FROM tarjetas";
    $sql = $this->conex->query($query); // run the query directly, since there are no parameters
    $sql->setFetchMode(PDO::FETCH_ASSOC);
    $movs = array(); // auxiliary array to store the rows
    while ($row = $sql->fetch()) { // walk the result set row by row
        array_push($movs, $row); // store the element in the auxiliary array
        unset($row); // drop the row variable to avoid overloading memory
    }
    return $movs;
}

This is how my script ended up; as I say, it is what worked for me. I hope it serves someone, and feel free to comment on it. Greetings and thanks to everyone!
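One caveat about this solution: since every row is still pushed into $movs, the function ultimately holds the full result set in memory anyway, and unset($row) frees almost nothing. A sketch of a variant that avoids that, using a generator so the caller only ever sees one row at a time (the SQLite in-memory database and the sample data are only there to make the example self-contained):

```php
<?php
// Sketch: a generator yields one row at a time, so the caller never
// holds all the rows in memory at once. SQLite in memory is used only
// so this example runs anywhere; with MySQL only the DSN changes.
function obtenerMovimientos(PDO $conex) {
    $sql = $conex->query('SELECT * FROM tarjetas');
    $sql->setFetchMode(PDO::FETCH_ASSOC);
    while ($row = $sql->fetch()) {
        yield $row; // hand one row to the caller, then move on
    }
}

$conex = new PDO('sqlite::memory:');
$conex->exec('CREATE TABLE tarjetas (id INTEGER PRIMARY KEY, numero TEXT)');
$conex->exec("INSERT INTO tarjetas (numero) VALUES ('A'), ('B'), ('C')");

$vistos = 0;
foreach (obtenerMovimientos($conex) as $mov) {
    $vistos++; // process $mov here instead of storing it
}
echo $vistos;
```

The calling code keeps its familiar foreach shape, but peak memory is one row instead of the whole table.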

    
answered by 19.01.2017 / 01:12

I propose one solution, though in reality this problem can be approached from several angles: you could optimize the database table by adding indexes, you could build an automatic pagination system, or you could run the processing as an unattended scheduled task. My solution covers the last case.

1.- Add a column to your table: executed TINYINT(1) UNSIGNED

2.- Create the script programa_programada.php

<?php

// Fetch up to 50,000 not-yet-processed rows per run
$query = "SELECT * FROM movimientos WHERE executed = 0 LIMIT 50000";
$sql = $this->conex->prepare($query);
$sql->execute();
$movimientos = $sql->fetchAll(PDO::FETCH_ASSOC);
$ids_procesados = "";

for ($i = 0; $i < count($movimientos); $i++) {
    // Run whatever custom processing you need on each row
    $movimientos[$i]["test"] = $movimientos[$i]["test"] * 1;

    // Collect the primary key of each processed row
    $ids_procesados .= $movimientos[$i]["id"] . ",";
}

if (!empty($ids_procesados)) {
    $ids_procesados = rtrim($ids_procesados, ","); // drop the trailing comma
    $query_update = "UPDATE movimientos SET executed = 1 WHERE id IN (" . $ids_procesados . ")";
    $sql_update = $this->conex->prepare($query_update);
    $sql_update->execute();
}

?>

Basically, this script processes 50,000 records at a time, selecting records that have not yet been processed and marking them as processed at the end. You should test your script to find the batch size your server can handle, and then configure a cron job / scheduled task on your server to run the script every X minutes.

One way to make sure a previous run of your script is not still working in the background (which could cause the same data to be processed twice) is to create a file on the server when a batch starts and delete it when it finishes. Then, at the beginning of your script, you first check that this file does not exist before running the process.
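A rough sketch of that guard, using flock() rather than creating and deleting a marker file (an flock() lock has the convenient property of being released automatically if the script crashes mid-batch; the lock-file path here is just an example):

```php
<?php
// Sketch of the "only one batch at a time" guard using flock().
// The path is illustrative; any writable location works.
$lock = fopen(sys_get_temp_dir() . '/procesa_movimientos.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    // A previous run is still processing its batch: exit quietly.
    exit("otra instancia en ejecución\n");
}

// ... process the batch of 50,000 rows here ...
$procesado = true;

// Release the lock explicitly when the batch is done.
flock($lock, LOCK_UN);
fclose($lock);
echo $procesado ? "lote procesado\n" : "sin trabajo\n";
```

With LOCK_NB the second instance does not wait; it simply exits and lets the next cron run try again.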

    
answered by 13.01.2017 at 16:56

Put set_time_limit(0) at the beginning of the script so that the script is not stopped by the execution time limit.

The other thing I can recommend is not to fetch the 300,000 records at once, but to paginate so that each query is smaller.

I leave the link to the official PHP documentation for set_time_limit.
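A minimal sketch of the suggestion (note that on the CLI the limit already defaults to 0, and that set_time_limit() restarts the timeout counter rather than adding to it):

```php
<?php
// Remove the execution time limit for a long-running script.
// set_time_limit(0) means "no limit"; it returns true on success.
$ok = set_time_limit(0);

// The ini setting now reflects the new limit.
echo ini_get('max_execution_time'), "\n";
```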

    
answered by 13.12.2016 at 00:10

If you cannot change the server configuration, you can use pagination, fetching 50 thousand records per page (or however many you want).

You get the number of records:

function obtener_cantidad_registros($mysql)
{
    // (SELECT COUNT(*) would be cheaper than fetching every row)
    $query = "SELECT * FROM movimientos";
    $result = $mysql->query($query);
    return $result->num_rows;
}

Obtain the number of pages according to the number of records and the size of the page (number of records you will show for each page):

function obtener_numero_paginas($cantidad_registros, $tamaño_pagina)
{
  return ceil($cantidad_registros / $tamaño_pagina);
}

Get the number of the page that is being consulted:

function obtener($numero_pagina)
{
  if(! $numero_pagina or ! is_numeric($numero_pagina))
  {
    return 1;
  }

  return $numero_pagina;
}

This function simply validates that the page number is a numeric value; if it is not, it returns 1, which is equivalent to saying we are on page number 1.

Then you calculate the starting record for the query from the page number and the page size (in this case each page shows 50,000 records):

function calcular_inicio($numero_pagina, $tamaño_pagina)
{
  if($numero_pagina == 1)
  {
    $inicio = 0;
  }else{
    $inicio = ($numero_pagina- 1) * $tamaño_pagina;
  }

  return $inicio;
}

For example, if you are on page 1, the query should make a limit of 0 to 50000, if you are on page 2, the limit should be 50000 to 100000.
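That arithmetic can be checked with a self-contained version of the same logic as the calcular_inicio function above:

```php
<?php
// Self-contained check of the offset arithmetic described above
// (same logic as calcular_inicio from this answer).
function calcular_inicio($numero_pagina, $tamaño_pagina)
{
    return ($numero_pagina - 1) * $tamaño_pagina;
}

echo calcular_inicio(1, 50000), "\n"; // page 1 -> offset 0
echo calcular_inicio(2, 50000), "\n"; // page 2 -> offset 50000
echo calcular_inicio(7, 50000), "\n"; // page 7 -> offset 300000
```

The offset then goes into LIMIT $inicio, $tamaño_pagina, so page 2 reads rows 50000 through 99999.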

Now get a query:

function obtener_consulta($mysql, $inicio, $tamaño_pagina){
  $query = "SELECT * FROM movimientos LIMIT $inicio, $tamaño_pagina";
  return $mysql->query($query);
} 

And finally get the links for the pages:

function imprimir_paginacion($pagina, $total_paginas, $url = '')
{
    // iterate over the pages (1..$total_paginas), not over the page size
    for ($i = 1; $i <= $total_paginas; $i++) {
        $i == $pagina ? print($pagina) : print('<a href="' . $url . '?pagina=' . $i . '">' . $i . '</a>  ');
    }
}

To run it, it's like this:

$total_registros = obtener_cantidad_registros($mysql);

$tamaño_pagina = 50000;
$total_paginas = obtener_numero_paginas($total_registros, $tamaño_pagina); // pass the page size, not the record count again

$pagina = isset($_GET['pagina']) ? $_GET['pagina'] : 1;
$pagina_actual = obtener($pagina);

$inicio = calcular_inicio($pagina_actual, $tamaño_pagina);

$result = obtener_consulta($mysql, $inicio, $tamaño_pagina);

while($row = $result->fetch_array(MYSQLI_NUM))
{
   // print the records obtained from the query
}

$url = strtok($_SERVER['REQUEST_URI'], '?'); // base URL of the current page
imprimir_paginacion($pagina_actual, $total_paginas, $url);

Note: I based this answer on an external article.

    
answered by 13.12.2016 at 00:19

I imagine that doing a var_dump of 300,000 records is not the same as doing one of 50,000. Better to loop over $movimientos with a foreach and echo each row: PHP flushes the echoes as it goes, so even though the script takes a long time, you can see that the query is in fact returning data.

Finally, the solution, as others have told you, is pagination; for cases like this there is no other way.

    
answered by 13.12.2016 at 00:26