Advice on starting a Big Data project [closed]

Good evening,

I have been assigned a project and I do not really know where to start. I receive a JSON via curl, and the client wants the received data saved every second.

Right now they have a cron task that runs every minute and calls a PHP file. That file connects to the database and, inside a loop, performs one insert per second into a MySQL DB for each parameter received in the JSON, so a single run executes 60 inserts. On top of that, around 200 thousand records can accumulate per day.
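For context, the current cron job is roughly equivalent to this Python sketch (the real code is PHP; the table name, credentials, and JSON URL here are placeholders):

    # Rough Python equivalent of the current setup: cron runs this once
    # per minute, and the script performs one insert per second for 60 s.
    import json
    import time
    import urllib.request

    import mysql.connector  # pip install mysql-connector-python

    conn = mysql.connector.connect(
        host="localhost", user="app", password="secret", database="metrics"
    )
    cursor = conn.cursor()

    for _ in range(60):
        # Fetch the JSON payload (the PHP version receives it via curl)
        with urllib.request.urlopen("http://example.com/data.json") as resp:
            payload = json.loads(resp.read())
        # One row per parameter received in the JSON
        for name, value in payload.items():
            cursor.execute(
                "INSERT INTO readings (name, value, ts) VALUES (%s, %s, NOW())",
                (name, value),
            )
        conn.commit()
        time.sleep(1)  # roughly one batch of inserts per second

    cursor.close()
    conn.close()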

This does not look very efficient to me, so I have proposed migrating to MongoDB. Does that seem like a good idea?

My other question is about the engine of this system. I had thought of a Python script that fetches the JSON directly and stores it, without further processing, in the MongoDB collection. What would be the most efficient way to do that? Would you still use a cron job? A rough sketch of what I mean is below.
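For illustration, a minimal sketch of the engine I have in mind, assuming the requests and pymongo libraries; the URL, database, and collection names are placeholders:

    # Sketch of a long-running Python process (e.g. under systemd or
    # supervisor) instead of cron: fetch the JSON once per second and
    # store it unchanged in MongoDB.
    import time
    from datetime import datetime, timezone

    import requests                  # pip install requests
    from pymongo import MongoClient  # pip install pymongo

    client = MongoClient("mongodb://localhost:27017")
    collection = client["metrics"]["readings"]

    while True:
        started = time.monotonic()
        try:
            doc = requests.get("http://example.com/data.json", timeout=1).json()
            doc["ts"] = datetime.now(timezone.utc)  # timestamp each sample
            collection.insert_one(doc)  # store the document as received
        except requests.RequestException:
            pass  # a real service would log the failure and carry on
        # Sleep the remainder of the second so samples stay ~1 s apart
        time.sleep(max(0.0, 1.0 - (time.monotonic() - started)))

My thinking is that a persistent process would also avoid reconnecting to the database on every run, unlike the current cron approach.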

I do not need code so much as advice on how to start, how you would approach the project, whether there is something I am not taking into account, etc.

Thank you very much!

    
asked by Jonj 30.03.2017 at 21:30

0 answers