I have a loading process in a Django project that reads an Excel file and stores the information in a `Negocio` (Business) model. The process works; the problem is that the execution time is far too high for the volume of data in the Excel input.
Model.py
class Negocio(models.Model):
    fecha_carga = models.DateField(auto_now_add=True)
    fecha_actividad = models.DateField()
    plataforma = models.CharField(max_length=50)
    tecnologia = models.CharField(max_length=50)
    terminal = models.CharField(max_length=50)
    cantidad = models.FloatField()
    plan_key = models.ForeignKey(Produccion, on_delete=models.CASCADE, null=True)
    local_key = models.ForeignKey(Local, on_delete=models.CASCADE, null=True)

    def __str__(self):
        return self.plataforma
Views.py
def Carga(dataframe):
    for row, col in dataframe.iterrows():  # col holds the row's data as a Series
        fecha = col["Fecha"]
        Negocio(
            fecha_actividad=fecha,
            plataforma=col['Plataforma'],
            tecnologia=col['Tecnologia'],
            terminal=col['Tipo Equipo'],
            cantidad=col['Actividad'],
            plan_key=ProduccionPlan.objects.get(
                fecha_produccion__year=fecha.year,
                fecha_produccion__month=fecha.month,
                plan__codigo_plan__iexact=col['Codigo Plan'],
            ),
            local_key=Local.objects.get(codigo__iexact=col['Codigo Vendedor']),
        ).save()
This is what I came up with at the time to store the information. It works, but it is far from optimal: the loading time ranges between 40 and 45 minutes for Excel files of approximately 80,000 rows.
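As far as I can tell, every row issues two SELECTs (the ProduccionPlan lookup and the Local lookup) plus one INSERT, so ~80,000 rows means roughly 240,000 round trips to the database. Would caching the lookups in dictionaries and inserting with bulk_create, along the lines of the untested sketch below, be the right direction? (The function name and batch size are arbitrary, and it assumes the codes in the Excel file match the database casing exactly, since the per-row version relied on __iexact.)

def CargaBulk(dataframe):
    # Untested sketch: build the ForeignKey lookups once instead of querying per row.
    locales = {l.codigo: l for l in Local.objects.all()}
    planes = {
        (p.fecha_produccion.year, p.fecha_produccion.month, p.plan.codigo_plan): p
        for p in ProduccionPlan.objects.select_related('plan')
    }

    negocios = []
    for row, col in dataframe.iterrows():
        fecha = col["Fecha"]
        negocios.append(Negocio(
            fecha_actividad=fecha,
            plataforma=col['Plataforma'],
            tecnologia=col['Tecnologia'],
            terminal=col['Tipo Equipo'],
            cantidad=col['Actividad'],
            plan_key=planes[(fecha.year, fecha.month, col['Codigo Plan'])],
            local_key=locales[col['Codigo Vendedor']],
        ))

    # One INSERT per batch instead of one per row.
    Negocio.objects.bulk_create(negocios, batch_size=1000)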
I have also been trying to use F() expressions, but so far without success.
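As I understand it, F() is mainly for pushing an update into the database instead of reading values into Python first, so I am not even sure it applies to an insert-only load like this one. For reference, this is the kind of usage I mean (the filter value is just a placeholder):

from django.db.models import F

# Database-side update, no Python round trip per row (illustrative only).
Negocio.objects.filter(plataforma='Ejemplo').update(cantidad=F('cantidad') + 1)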
To read the Excel feed I'm using pandas, and my database is SQL Server.
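The dataframe is produced more or less like this before being passed to Carga (file path and sheet name are placeholders):

import pandas as pd

# Illustrative only; the real path and sheet differ.
dataframe = pd.read_excel('carga_negocio.xlsx', sheet_name=0)
Carga(dataframe)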
Any help or ideas would be appreciated.