Problem with loading speed


I'm trying out R and learning to use it. I'm building an application with Shiny that loads its data from an Oracle database.

To load the data I have created R scripts for each part of the application, and to launch it I load the data beforehand with this script:

library(shiny)
library(RODBC)
source("ConexionBBDDOracle.R")
source("sectorAgrupado.R")
source("Sector.R")
source("agrupacionProductoServicio.R")
source("productoservicio.R")
source("consulta.R")
nombresSectorAgrupado <- setNames(object = as.character(sectorAgrupado$L0CODIGO), nm = as.character(sectorAgrupado$L0NOMBRE))
nombresSector <- setNames(object = as.character(sector$SECODIGO), nm = as.character(sector$SENOMBRE))
nombresProductoAgrupado <- setNames(object = as.character(agrupacionproductoservio$E8CODIGO),nm = as.character(agrupacionproductoservio$E8NOMBRE))
nombreProducto <- setNames(object = as.character(productoServicio$PSCODIGO),nm = as.character(productoServicio$PSNOMBRE))

runApp(getwd(), port=8100)

The data scripts contain code of this style:

con <- conexion()    # open the Oracle connection
# run the query into a data frame
consulta <- ejecutaQuery(con, "select COCODIGO, COTEXTO, COCOSE, COCOPS, COFECHA from L2113T00")

cerrarConexion(con)  # close the connection

This way the data frames are already created by the time I run the app.

The problem is this: the queried table has more than 500,000 records and I want to load it in its entirety. Because there is so much data, the Shiny application takes a long time to load; not only that, but it also fails to show the data and freezes after a while.

I understand that I probably should not load the whole table in one go because it is not useful. But I would like to know the correct way to load the data.

    
asked by Lombarda Arda 21.02.2017 at 14:44

2 answers


The first problem, which was that the table took too long to load, I managed to solve with the answer from link

The second problem was that the Shiny application did not display the data correctly; the solution was to use the DT library.

I implemented it with a couple of lines in ui.R:

  mainPanel( 
    tabsetPanel(          
      tabPanel('Consulta',DT::dataTableOutput("consulta"))
    )
  )

And another couple in server.R:

  ## Consultas table
  output$consulta <- DT::renderDataTable(
    # consulta[which(consulta$COCOSE == input$selectSector | consulta$COCOPS == input$selectPro), ]
    DT::datatable(consulta, options = list(pageLength = 25))
  )

This way the application no longer freezes. It still takes a while to load, but the improvement is more than obvious.
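
For a table this size it also helps that DT does its paging on the server side, which is the default (server = TRUE) in DT::renderDataTable, so only the rows of the current page are sent to the browser. A minimal sketch of the same output with that argument written out explicitly:

  output$consulta <- DT::renderDataTable(
    DT::datatable(consulta, options = list(pageLength = 25)),
    # server = TRUE is the DT default: paging and filtering happen in R,
    # so only the 25 visible rows travel to the browser at a time
    server = TRUE
  )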

    
answered on 22.02.2017 at 12:37

A trick I use in situations like this, when the download (possibly from a remote source) takes a long time, is:

nombre_data <- "consulta.rds"

if (file.exists(nombre_data)) {
  # if it already exists, load it from disk
  consulta <- readRDS(nombre_data)
} else {
  # if it does not exist, we have to fetch it:
  # create the connection, etc.
  consulta <- ejecutaQuery(con, "select COCODIGO, COTEXTO, COCOSE, COCOPS, COFECHA from L2113T00")

  # and save it for the next time
  saveRDS(consulta, nombre_data)
}

With that we store the result locally the first time, and on subsequent runs we load it from the local file.

Obviously this is only useful when the query result does not change from one run to the next; it would not work as-is, for example, for a table that changes daily.
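
If the table does change every day, one variation of the same idea (just a sketch, reusing the nombre_data variable and the helpers from the question) is to put the date in the file name, so the cache is rebuilt at most once per day:

nombre_data <- paste0("consulta_", Sys.Date(), ".rds")  # hypothetical: one cache file per day

if (file.exists(nombre_data)) {
  # today's cache already exists: reuse it
  consulta <- readRDS(nombre_data)
} else {
  # first run of the day: query Oracle and cache the result
  con <- conexion()
  consulta <- ejecutaQuery(con, "select COCODIGO, COTEXTO, COCOSE, COCOPS, COFECHA from L2113T00")
  cerrarConexion(con)
  saveRDS(consulta, nombre_data)
}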

It is also recommended:

  • Download only the columns you really need, although I assume you are already doing this.
  • If you need the data summarized, it is preferable to hand that work to the database and so minimize the amount of information to transfer / download (see the sketch after this list).
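
As an illustration of the second point, here is a sketch that lets Oracle do the counting and only transfers the summary; it reuses the conexion / ejecutaQuery / cerrarConexion helpers from the question, and the grouping columns are only an example:

con <- conexion()
# aggregate in the database so only the summarised rows are transferred
resumen <- ejecutaQuery(con,
  "select COCOSE, COCOPS, count(*) as TOTAL
     from L2113T00
    group by COCOSE, COCOPS")
cerrarConexion(con)
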
answered on 21.02.2017 at 20:08