Test web page / Server performance


I have a web page built with Laravel and MySQL that makes heavy use of the database, and I would like to know how to measure how many simultaneous users running queries my page can support.

I have a basic server on DigitalOcean with the following specs:


512 MB (1 CPU)   20 GB (SSD DISK)   1000 GB (Transfer)

DigitalOcean offers many different plans; the largest has these specs:


224 GB (1 CPU)   500 GB (SSD DISK)   10 TB (Transfer)

That is a huge difference, but beyond that I do not know when to use one plan and when to move up to another, which would be scaling vertically. Likewise, I do not know when to scale horizontally, that is, to replicate the system, or to put the database on its own dedicated server.

I would really like to understand these questions before running into problems later, however few users I have.

asked by Juan Pablo B 23.10.2016 в 15:55

1 answer


Correct me if I'm wrong: when a user requests a page from your system, a connection to the database is opened, the required queries run, the page is rendered, and the connection is closed. Say this operation takes 2 seconds. The user may then spend, say, 10 minutes reading your page.

If your database accepts 3 simultaneous connections, you can serve 3 connections every 2 seconds, or 90 connections per minute.

But even if 90 users are reading your page at the same time, they are not simultaneous connections, because the page in question has already been delivered to the client.

If 5 clients requested the page at the same moment and your database server accepts 3 simultaneous connections, two of those clients would wait 2 seconds to see your page. If you raise the MySQL limit to 5, you could serve 150 pages per minute.
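The arithmetic above can be sketched as a quick back-of-the-envelope calculation (the numbers are the illustrative ones from this answer, not measurements):

```shell
# Back-of-the-envelope throughput:
# pages per minute = simultaneous connections * 60 / response time in seconds
resp=2                       # seconds per request (illustrative)
for conns in 3 5; do
  echo "$conns connections -> $(( conns * 60 / resp )) pages/min"
done
# prints:
# 3 connections -> 90 pages/min
# 5 connections -> 150 pages/min
```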

Obviously, the more open connections you allow, the more resources your system consumes.

The solution is to use temporary storage, a cache, for both your database queries and the pages generated by PHP. Let's say this cache cuts the response time to 1/2 second.
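In Laravel, part of this is available out of the box: the following artisan commands (standard Laravel commands, run from the project root of an existing app) cache the merged configuration and compiled routes so they are not rebuilt on every request. Caching query results and rendered pages themselves is configured separately in code and in `config/cache.php`.

```shell
# Cache Laravel's configuration and route definitions
# (run inside a Laravel project; clear with config:clear / route:clear)
php artisan config:cache
php artisan route:cache
```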

Now on to your HTTP server. If it accepts 50 simultaneous clients and serves cached pages in 1/2 second, it can serve 100 pages per second.

But it is the same pattern: open a connection, serve the page, close... no simultaneous users.


This changes if you use websockets, but you do not mention them in your question.


Apache ships with a program called ab for stress testing or benchmarking. It reports the number of requests per second your Apache server can handle, although I believe it works against any server, since it runs its tests over the HTTP protocol.

It is used like this:

ab -n 200 -c 20 http://example.com/

This generates 200 requests with a concurrency level of 20, to test concurrent load. You can see an example on this genbeta:dev page.
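If you only care about the headline number, you can filter ab's summary output. The "Requests per second" line below is the format ab prints in its summary; the URL is a placeholder, so point it at your own server:

```shell
# Run the benchmark and keep only the throughput line.
# ab requires a trailing path on the URL. Example summary line:
#   Requests per second:    312.45 [#/sec] (mean)
ab -n 200 -c 20 http://example.com/ | grep "Requests per second"
```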




Unless you know for certain that your system has very high memory requirements, you should optimize your code, use caching, and tune your servers before thinking about scaling them. A badly programmed and/or poorly optimized system will perform poorly on a server with 1 GB or with 224 GB.

answered by 23.10.2016 / 16:47