Parse Server performance loss over time

When I start Parse Server, a single server can handle 100 req/sec. But when I run the same test again, it drops to 90, 80, 70 … and down to 20 or 15; sometimes it goes as low as 5. When I restart Parse Server with pm2 reload index, performance resets and the server handles 100 req/sec again, but then it drops again. I have no idea how to fix this. How can I fix it? Or at least, how can I trace down the issue and find the cause?

  • Over what time span does the reduction occur and what kind of requests are these, roughly?
  • Did you look at server instance metrics to see if there is any correlation?
  • Does Parse Server run on 1 instance or are there multiple instances behind a load balancer?

The performance loss is visible immediately. I'm using the wrk tool to test it: 300 concurrent connections, 1 thread, and a duration of 30 seconds.
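For reference, a wrk invocation matching those parameters would look roughly like this; the URL is a placeholder, and post.lua is a small wrk script that turns the requests into POSTs with the Parse headers (placeholder credentials):

wrk -t1 -c300 -d30s -s post.lua https://loadbalancer.example.com/parse/functions/helloCode

-- post.lua
wrk.method = "POST"
wrk.headers["X-Parse-Application-Id"] = "APP_ID"
wrk.headers["X-Parse-Master-Key"] = "MASTER_KEY"
wrk.headers["Content-Type"] = "application/json"
wrk.body = "{}"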

I have 8 servers in total; some of them have 1 core, some have 4. If a server has multiple cores, I use the PM2 library and run multiple instances in cluster mode.
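For reference, starting Parse Server under PM2 in cluster mode looks like this (index.js is the entry point; -i max spawns one worker per core):

pm2 start index.js -i max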

Then I load-balance between the different servers with nginx.

Here is my load balancer config:

upstream backend {
    least_conn;
    server mini3.example.com:1337;
    server mini2.example.com:1337 weight=2;
    server mini1.example.com:1337;
    server mini.example.com:1337;
    server api4.example.com:1337 weight=3;
    server api3.example.com:1337 weight=4;
    server api2.example.com:1337 weight=4;
    server api1.example.com:1337 weight=4;
}

# This server accepts all traffic on port 443 and passes it to the upstream.
# Notice that the upstream name and the proxy_pass need to match.

server {
    server_name loadbalancer.example.com;

    location / {
        proxy_pass http://backend;
    }

    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/loadbalancer.example.com/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/loadbalancer.example.com/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}

I benchmark against Cloud Code, since I disabled every CLP on every class and the master key is the only way to access them, and I moved my app logic to Cloud Code.
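For reference, a single request against a Cloud Code function has roughly this shape (placeholder function name and credentials, assuming the default /parse mount path):

curl -X POST \
  -H "X-Parse-Application-Id: APP_ID" \
  -H "X-Parse-Master-Key: MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{}' \
  https://loadbalancer.example.com/parse/functions/myCloudFunction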

The performance loss happens immediately. I run a test, and when it finishes I immediately run the same test again; the results are always lower than the previous run.

For example, yesterday I ran one. My servers handled 480 requests per second. But on the second test it dropped to 450, then 420, then 360 … down to 75 requests per second.

I checked the database server. CPU usage never goes above 40%. (Edit: I monitor Parse Server with the pm2 monit command; no matter how many requests Parse serves, it always uses 100% CPU.) Parse Server shouldn't deliver 480 req/sec on the first test if this were a database issue, so I don't think it's related to the database.

And as I said, it drops to 70 req/sec. But when I restart Parse Server on all servers, I see 480 (±) requests per second again.

Edit: I should say that I run the same benchmark every time. If the first test was done with 300 concurrent connections, the second test is also done with 300 connections. While the first test gives 480 requests per second, later tests drop lower and some of the connections hit timeout errors (requests taking longer than 30 seconds; the tool's timeout is set to 30 seconds).

If you wait a minute between these tests, is there still a decrease in performance?

I will run another test and share the results with you here.

Hey Manuel, I ran the test. Performance still decreases. I waited one minute between tests.

Here is the test history. It started at 318 requests/second and decreased to 103. It was still decreasing, but I stopped the test.

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   921.29ms  188.11ms   1.87s    85.78%
    Req/Sec   334.15    121.93   696.00     67.48%
  9536 requests in 30.05s, 179.25MB read
Requests/sec:    317.34
Transfer/sec:      5.97MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   908.23ms  154.12ms   1.46s    77.69%
    Req/Sec   336.41    134.10     0.87k    72.73%
  9606 requests in 30.06s, 180.56MB read
Requests/sec:    319.53
Transfer/sec:      6.01MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   918.89ms  155.10ms   1.75s    75.65%
    Req/Sec   331.93    134.94   848.00     70.83%
  9532 requests in 30.06s, 179.17MB read
Requests/sec:    317.11
Transfer/sec:      5.96MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   928.40ms  151.50ms   1.58s    75.08%
    Req/Sec   328.55    115.63   800.00     69.90%
  9472 requests in 30.05s, 178.04MB read
Requests/sec:    315.17
Transfer/sec:      5.92MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   941.49ms  146.27ms   1.51s    74.10%
    Req/Sec   323.12    111.22   828.00     73.61%
  9292 requests in 30.04s, 174.66MB read
Requests/sec:    309.27
Transfer/sec:      5.81MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   947.33ms  143.14ms   1.52s    72.83%
    Req/Sec   321.92     93.10   630.00     72.22%
  9247 requests in 30.06s, 173.82MB read
Requests/sec:    307.66
Transfer/sec:      5.78MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   968.04ms  160.94ms   1.96s    74.12%
    Req/Sec   315.72    100.05   770.00     71.33%
  9002 requests in 30.05s, 169.21MB read
  Socket errors: connect 0, read 0, write 0, timeout 3
Requests/sec:    299.56
Transfer/sec:      5.63MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   983.28ms  149.79ms   1.62s    69.66%
    Req/Sec   308.81    105.17   787.00     71.63%
  8906 requests in 30.04s, 167.41MB read
Requests/sec:    296.46
Transfer/sec:      5.57MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.01s   156.19ms   1.87s    65.25%
    Req/Sec   299.80    106.03   707.00     71.97%
  8639 requests in 30.05s, 162.39MB read
Requests/sec:    287.46
Transfer/sec:      5.40MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.02s   167.85ms   1.76s    66.68%
    Req/Sec   297.29    113.59   750.00     70.73%
  8520 requests in 30.05s, 160.15MB read
Requests/sec:    283.51
Transfer/sec:      5.33MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.06s   173.42ms   1.99s    65.74%
    Req/Sec   287.77    103.26     0.88k    72.38%
  8230 requests in 30.05s, 154.70MB read
  Socket errors: connect 0, read 0, write 0, timeout 5
Requests/sec:    273.86
Transfer/sec:      5.15MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.06s   175.03ms   1.94s    65.03%
    Req/Sec   287.26    102.01   700.00     68.64%
  8221 requests in 30.07s, 154.53MB read
Requests/sec:    273.43
Transfer/sec:      5.14MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.12s   190.55ms   1.99s    68.82%
    Req/Sec   271.80    102.13   828.00     76.57%
  7764 requests in 30.06s, 145.95MB read
  Socket errors: connect 0, read 0, write 0, timeout 10
Requests/sec:    258.33
Transfer/sec:      4.86MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.14s   230.84ms   2.00s    66.12%
    Req/Sec   266.72     86.49   700.00     76.66%
  7637 requests in 30.06s, 143.55MB read
  Socket errors: connect 0, read 0, write 0, timeout 7
Requests/sec:    254.06
Transfer/sec:      4.78MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.18s   245.31ms   2.00s    67.80%
    Req/Sec   255.79     86.33   545.00     70.77%
  7250 requests in 30.07s, 136.28MB read
  Socket errors: connect 0, read 0, write 0, timeout 98
Requests/sec:    241.14
Transfer/sec:      4.53MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.20s   251.28ms   2.00s    68.92%
    Req/Sec   250.43     87.33   700.00     69.79%
  7206 requests in 30.06s, 135.45MB read
  Socket errors: connect 0, read 0, write 0, timeout 114
Requests/sec:    239.72
Transfer/sec:      4.51MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.22s   234.71ms   2.00s    70.41%
    Req/Sec   247.76     98.26   818.00     77.70%
  7095 requests in 30.06s, 133.36MB read
  Socket errors: connect 0, read 0, write 0, timeout 78
Requests/sec:    236.07
Transfer/sec:      4.44MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.30s   264.83ms   2.00s    71.78%
    Req/Sec   219.64     77.18   480.00     65.62%
  6308 requests in 30.05s, 118.57MB read
  Socket errors: connect 0, read 0, write 0, timeout 433
Requests/sec:    209.92
Transfer/sec:      3.95MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.28s   273.28ms   2.00s    68.97%
    Req/Sec   224.68     86.50   610.00     72.47%
  6430 requests in 30.05s, 120.86MB read
  Socket errors: connect 0, read 0, write 0, timeout 423
Requests/sec:    213.98
Transfer/sec:      4.02MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.28s   297.44ms   2.00s    71.19%
    Req/Sec   216.58     76.46   505.00     70.49%
  6223 requests in 30.07s, 116.97MB read
  Socket errors: connect 0, read 0, write 0, timeout 725
Requests/sec:    206.95
Transfer/sec:      3.89MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.28s   268.14ms   2.00s    71.59%
    Req/Sec   212.54     70.55   690.00     76.12%
  6133 requests in 30.04s, 115.28MB read
  Socket errors: connect 0, read 0, write 0, timeout 769
Requests/sec:    204.13
Transfer/sec:      3.84MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.28s   248.52ms   2.00s    75.28%
    Req/Sec   205.85     73.77   770.00     77.08%
  5924 requests in 30.06s, 111.35MB read
  Socket errors: connect 0, read 0, write 0, timeout 875
Requests/sec:    197.08
Transfer/sec:      3.70MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.31s   264.71ms   2.00s    72.88%
    Req/Sec   201.56     70.79   550.00     74.56%
  5790 requests in 30.04s, 108.83MB read
  Socket errors: connect 0, read 0, write 0, timeout 872
Requests/sec:    192.73
Transfer/sec:      3.62MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.35s   281.51ms   2.00s    73.23%
    Req/Sec   175.89     61.97   646.00     79.51%
  5059 requests in 30.05s, 95.09MB read
  Socket errors: connect 0, read 0, write 0, timeout 1380
Requests/sec:    168.35
Transfer/sec:      3.16MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.36s   261.39ms   2.00s    70.81%
    Req/Sec   184.69     64.38   440.00     69.93%
  5271 requests in 30.05s, 99.08MB read
  Socket errors: connect 0, read 0, write 0, timeout 1245
Requests/sec:    175.38
Transfer/sec:      3.30MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.37s   296.79ms   2.00s    73.31%
    Req/Sec   169.93     64.26   626.00     78.32%
  4853 requests in 30.04s, 91.22MB read
  Socket errors: connect 0, read 0, write 0, timeout 1661
Requests/sec:    161.55
Transfer/sec:      3.04MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.34s   247.45ms   2.00s    74.87%
    Req/Sec   163.31     54.27   353.00     70.07%
  4626 requests in 30.04s, 86.95MB read
  Socket errors: connect 0, read 0, write 0, timeout 1514
Requests/sec:    154.01
Transfer/sec:      2.89MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.38s   290.60ms   2.00s    71.98%
    Req/Sec   158.62     68.69   520.00     79.72%
  4529 requests in 30.04s, 85.13MB read
  Socket errors: connect 0, read 0, write 0, timeout 1595
Requests/sec:    150.75
Transfer/sec:      2.83MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.37s   275.42ms   2.00s    75.76%
    Req/Sec   155.28     51.62   363.00     70.24%
  4484 requests in 30.05s, 84.29MB read
  Socket errors: connect 0, read 0, write 0, timeout 1893
Requests/sec:    149.20
Transfer/sec:      2.80MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.40s   278.83ms   2.00s    71.16%
    Req/Sec   154.91     51.04   360.00     73.43%
  4423 requests in 30.05s, 83.14MB read
  Socket errors: connect 0, read 0, write 0, timeout 1500
Requests/sec:    147.17
Transfer/sec:      2.77MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.38s   276.97ms   2.00s    76.20%
    Req/Sec   151.41     53.37   454.00     79.24%
  4378 requests in 30.05s, 82.29MB read
  Socket errors: connect 0, read 0, write 0, timeout 1954
Requests/sec:    145.68
Transfer/sec:      2.74MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.36s   246.76ms   2.00s    81.20%
    Req/Sec   147.93     58.99   480.00     82.29%
  4251 requests in 30.05s, 79.91MB read
  Socket errors: connect 0, read 0, write 0, timeout 2044
Requests/sec:    141.45
Transfer/sec:      2.66MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.35s   220.06ms   2.00s    80.37%
    Req/Sec   142.69     52.70   393.00     78.05%
  4101 requests in 30.03s, 77.10MB read
  Socket errors: connect 0, read 0, write 0, timeout 1854
Requests/sec:    136.56
Transfer/sec:      2.57MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.41s   244.83ms   2.00s    75.15%
    Req/Sec   139.88     60.57   545.00     81.34%
  3996 requests in 30.06s, 75.11MB read
  Socket errors: connect 0, read 0, write 0, timeout 1763
Requests/sec:    132.93
Transfer/sec:      2.50MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.36s   224.82ms   2.00s    81.04%
    Req/Sec   140.15     54.55   434.00     76.84%
  4011 requests in 30.03s, 75.39MB read
  Socket errors: connect 0, read 0, write 0, timeout 1912
Requests/sec:    133.55
Transfer/sec:      2.51MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.39s   197.84ms   2.00s    74.59%
    Req/Sec   141.83     71.15   545.00     78.87%
  4031 requests in 30.06s, 75.79MB read
  Socket errors: connect 0, read 0, write 0, timeout 1831
Requests/sec:    134.11
Transfer/sec:      2.52MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.38s   200.00ms   2.00s    77.80%
    Req/Sec   131.08     46.56   360.00     76.47%
  3787 requests in 30.05s, 71.20MB read
  Socket errors: connect 0, read 0, write 0, timeout 1895
Requests/sec:    126.03
Transfer/sec:      2.37MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.41s   185.00ms   2.00s    73.20%
    Req/Sec   133.85     62.00   474.00     83.69%
  3768 requests in 30.06s, 70.83MB read
  Socket errors: connect 0, read 0, write 0, timeout 1820
Requests/sec:    125.37
Transfer/sec:      2.36MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.43s   217.95ms   2.00s    72.34%
    Req/Sec   132.23     61.63   520.00     76.76%
  3751 requests in 30.05s, 70.51MB read
  Socket errors: connect 0, read 0, write 0, timeout 1806
Requests/sec:    124.84
Transfer/sec:      2.35MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.44s   208.47ms   1.99s    70.79%
    Req/Sec   127.17     53.19   363.00     73.78%
  3639 requests in 30.06s, 68.40MB read
  Socket errors: connect 0, read 0, write 0, timeout 1934
Requests/sec:    121.06
Transfer/sec:      2.28MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.46s   216.67ms   2.00s    68.67%
    Req/Sec   131.14     58.20   380.00     74.47%
  3701 requests in 30.05s, 69.57MB read
  Socket errors: connect 0, read 0, write 0, timeout 1818
Requests/sec:    123.16
Transfer/sec:      2.32MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.46s   191.29ms   2.00s    70.83%
    Req/Sec   122.80     47.96   313.00     75.61%
  3518 requests in 30.05s, 66.13MB read
  Socket errors: connect 0, read 0, write 0, timeout 1958
Requests/sec:    117.07
Transfer/sec:      2.20MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.47s   181.54ms   2.00s    73.70%
    Req/Sec   123.67     63.33   410.00     73.31%
  3473 requests in 30.07s, 65.28MB read
  Socket errors: connect 0, read 0, write 0, timeout 1701
Requests/sec:    115.48
Transfer/sec:      2.17MB

-----------------------------------------------------------------------

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.52s   180.95ms   2.00s    67.52%
    Req/Sec   111.96     51.35   330.00     75.99%
  3120 requests in 30.05s, 58.65MB read
  Socket errors: connect 0, read 0, write 0, timeout 1950
Requests/sec:    103.81
Transfer/sec:      1.95MB

Edit: I restarted all Parse Servers, and the result increased to 317 again:

  1 threads and 300 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   918.65ms  154.74ms   1.66s    84.86%
    Req/Sec   331.27    100.33   580.00     67.71%
  9531 requests in 30.05s, 179.15MB read
Requests/sec:    317.17
Transfer/sec:      5.96MB

If there was a pause of about a minute between each test, then this could be a memory leak. It is also possible that connections to the DB are not closed properly after each request, and so the DB latency is actually increasing.

My suggestion would be to do a clean install of Parse Server with just the basic configuration and redo the tests with a simple Cloud Code function. In a basic Parse Server configuration you should not see the latency increasing.

Then redo the tests and gradually add the code and configuration until you end up with the code and configuration that was used for the tests that showed increased latency. That should point you to where the issue occurs.
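One way to test the memory-leak hypothesis is to log heap usage from inside each Parse Server process and watch whether it keeps growing across benchmark runs. A minimal sketch using Node's built-in process.memoryUsage(), with an arbitrary 60-second interval, placed in the server's entry file:

// Log this worker's memory once a minute; a heapUsed value that climbs
// steadily across benchmark runs would support the leak hypothesis.
setInterval(() => {
  const { rss, heapUsed, heapTotal } = process.memoryUsage();
  const mb = (n) => (n / 1048576).toFixed(1);
  console.log(`pid=${process.pid} rss=${mb(rss)}MB heap=${mb(heapUsed)}/${mb(heapTotal)}MB`);
}, 60 * 1000);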

Hey Manuel. I was doing some tests today.

I created a simple Cloud Code function:

Parse.Cloud.define("helloCode", async (request) => {
  return "hello";
});

When I ran the benchmark against this function, performance DIDN'T decrease. The first test gave 2300 requests per second and continued to give the same results; sometimes 10 fewer, sometimes 10 more, but performance was stable. It didn't decrease.

I also ran a test with a session token header, to see if database connections cause any problem. Performance DIDN'T decrease there either; Parse Server kept its performance.

I monitored my active connections on MongoDB using the db.serverStatus().connections command in the mongo shell. This is the first output:

{
	"current" : 919,
	"available" : 50281,
	"totalCreated" : 919,
	"active" : 2,
	"exhaustIsMaster" : 1,
	"awaitingTopologyChanges" : 1
}

I had over 900 connections to MongoDB, and I have 25 Parse Server instances running, so that is 900/25 = 36 connections per instance.
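For reference, the pool size per instance can be capped through Parse Server's databaseOptions, which are passed on to the MongoDB driver; a sketch (the option is maxPoolSize on recent driver versions, poolSize on older ones):

const api = new ParseServer({
  databaseURI: process.env.DATABASE_URI,
  appId: process.env.APP_ID,
  masterKey: process.env.MASTER_KEY,
  // Cap the number of pooled MongoDB connections held by this instance
  databaseOptions: { maxPoolSize: 36 },
});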

I wanted to check your advice.

When the benchmark was done, I checked the connections again and saw the same numbers. So MongoDB doesn't close connections immediately. But I found out that this is good, because opening a connection is slow.

I wanted to reset the connections. What I mean is: when I restart the Parse Servers, connections drop to 25 (1 connection per server) and performance increases back to its normal value. So resetting the connections should work, right?

But this time I wanted to try something different. Instead of restarting Parse Server, I restarted MongoDB, to see which one restores performance: restarting Parse Server, or resetting the connections?

I restarted MongoDB and checked the connections again. They dropped to 25 (1 per Parse Server instance). But performance DIDN'T increase this time.

So I'm pretty sure the problem is in Parse Server.

And running Cloud Code without a query didn't decrease performance.

I will try some basic Cloud Code again, but this time I will put a Parse query in it. I will let you know.

Good investigation. I suggest gradually adding the code and configuration until you end up with the code and configuration that was used for the tests that showed increased latency. That should point you to where the issue occurs.

Hey Manuel. I ran some tests with Cloud Code functions that have a query inside. Here is an example:

Parse.Cloud.define("codeWithQuery", async (request) => {
  const Follow = Parse.Object.extend("Follow");
  const getFollow = new Parse.Query(Follow);
  return await getFollow.find({useMasterKey:true});
});

This is a simple query; my Follow class has only 35 objects. I ran the test, but performance decreases with the query.

It started at 800 req/sec but decreased to 300, and then I stopped the test. When I run the test against the other Cloud Code function (no query), Parse Server still keeps its performance; it doesn't decrease.

Then I changed the query method. I was using the find method; I changed it to count and ran a test.

With the count query, Parse Server keeps its performance; it doesn't decrease. After seeing that, I put a limit on the find query, set to 1. This way I wanted to find out what causes the performance loss: Parse Objects (a count query returns a number, and I thought maybe Parse Server has difficulties with Parse Objects), or the size of the returned payload (since the find method with no limit returns all 35 objects, a bigger payload).
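For reference, the two variants look roughly like this (hypothetical function names):

Parse.Cloud.define("countFollow", async (request) => {
  const Follow = Parse.Object.extend("Follow");
  const getFollow = new Parse.Query(Follow);
  // count returns a plain number; no Parse Objects are built
  return await getFollow.count({ useMasterKey: true });
});

Parse.Cloud.define("findOneFollow", async (request) => {
  const Follow = Parse.Object.extend("Follow");
  const getFollow = new Parse.Query(Follow);
  getFollow.limit(1); // return at most one Parse Object
  return await getFollow.find({ useMasterKey: true });
});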

After putting the limit of 1 on find, Parse Server didn't decrease performance. Returning a smaller payload works fine, but when the returned payload is bigger, performance decreases.

Edit: This is not correct. It turns out I forgot to change the count method back to find; that's why performance didn't decrease. When I changed the query method back to find, performance does decrease.

To confirm this, I ran a test against the classes endpoint. Instead of testing /functions/myCloudCode, I tested /classes/Follow, with a request like the one below.
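That request has roughly this shape (placeholder credentials, assuming the default /parse mount path):

curl -H "X-Parse-Application-Id: APP_ID" \
  -H "X-Parse-Master-Key: MASTER_KEY" \
  https://loadbalancer.example.com/parse/classes/Follow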

Performance didn't decrease. Parse Server sent all 35 objects without a problem and kept stable performance. So it's not about sending big objects. After this test, I decided to run the same query in Cloud Code without limits, but return a simple text afterwards, to see whether the problem happens during the query or while sending the objects. Here is the code:

Parse.Cloud.define("helloCodeGetFollow", async (request) => {

  const Follow = Parse.Object.extend("Follow");

  const getFollow = new Parse.Query(Follow);
  var list = await getFollow.find({useMasterKey:true});

  return "hello";

});

And performance decreases. If you ask me, I would say the problem is that Cloud Code can't handle queries that return multiple Parse Objects.
Edit: Cloud Code can't handle Parse Objects, period; it doesn't matter whether there are multiple objects or a single one.

There is something wrong with Cloud Code when we run queries. And since the count query has no problem, I think the problem happens when Parse Server converts the Mongo object to a Parse Object. But this is just my guess.

Edit 2: After rerunning the test with the correct method and seeing that a query returning Parse Objects decreases performance, I think the problem happens with Parse Objects only: if a query returns a Parse Object, performance decreases. If the problem were converting the Mongo object to a Parse Object, as I guessed previously, then the /classes/Follow endpoint should also decrease performance, but it doesn't. So I think the problem happens when Parse Server hands the query result to Cloud Code. My guess is that the link between the query and Cloud Code is broken (only when returning Parse Objects).

And I don't know how to find the real problem from here. What do you think?
And if you want, I can share the test logs with you.

You may want to look at this issue regarding the performance of the MongoDB-response-to-Parse-Object conversion, specifically these comments:

The tests conducted in that issue confirm that the conversion has a performance impact, but they are inconclusive on whether that impact is abnormally high or should be expected.

However, even if the conversion takes time, it does not explain why the performance in your tests keeps decreasing, even when there is a minute's pause between each test. There have been numerous discussions over the years about Parse Server deployments with decreasing performance, but I cannot remember seeing any discussion that proved it was a Parse Server issue. Maybe you are onto something here.

The existing issue points to the BSON deserialization of the raw MongoDB response. If you read the whole thread, you can see that there may be alternative ways to do this deserialization. You could experiment and repeat your tests with these alternatives.

I read the thread. I must say I don't think the issue here is the serialization of Parse Objects. Yes, converting a Mongo object to a Parse Object is CPU intensive, but that doesn't explain decreasing performance. If that were the issue, we would get 300 req/sec at the beginning and it would keep going like that. But we start at 500 and decrease to even under 100.

And if that were the problem, the /classes/Follow endpoint should also decrease performance, but it doesn't. When we get Parse Objects directly from the classes endpoint, performance is stable. So our problem is specific to Cloud Code.

But I can do more tests. I'm not sure what the real problem is. Like you said, this could be a big thing.

It's very late here, so I will test tomorrow. By the way, you can also replicate the issue I'm having and investigate it yourself if you want.

And maybe I should open an issue? What do you think?

Edit: I found a GitHub issue. The user says the saveAll operation gets slower after each call. The performance decrease is similar, but I'm not sure if our problems are related.
https://github.com/parse-community/parse-server/issues/6300

Edit 2: I ran another test. This time I used the find method, but I ran the query over an empty collection, so it returns an empty array. Here is the code:

Parse.Cloud.define("helloCodeGetFollow", async (request) => {

  const Block = Parse.Object.extend("Block");
  //There is no data in Block Class
  const getBlock = new Parse.Query(Block);
  return await getBlock.find({useMasterKey:true});
});

Performance was stable and didn't decrease. So our problem is related to Parse Objects. But to confirm this, we need to bypass the Mongo-object-to-Parse-Object transformation. I created a thread for this: Can we get raw JSON instead of ParseObject as result of query?

@dplewis said he implemented a query option in the PHP SDK to retrieve the original Mongo result instead of a Parse Object. You can use it like this: query.find({ json: true }). But this hasn't been implemented in the JS SDK. So in order to confirm that our problem lies with Parse Objects, we need to get raw JSON as the result, but I have no idea how to do that.

Edit 3: I had missed this advice.

I created a new Cloud Code function with this code:

// Reach into Parse Server's internals to get the underlying MongoDB adapter
const Config = require('./node_modules/parse-server/lib/Config');

const config = Config.get(Parse.applicationId);
const mongoAdapter = config.database.adapter;

Parse.Cloud.define("cloudCode", async (request) => {
  await mongoAdapter.connect();
  // Note: this returns the driver's Collection object itself, not documents
  return await mongoAdapter.database.collection("Product");
});

I will share the results.

Edit 4: I can't run the above code. It gives me this error:
RangeError: Maximum call stack size exceeded

Edit 5: I couldn't make it work. The code gives the above error. I also tried the mongodb npm package, but the same error happens. I don't know how to run this test.
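For what it's worth, one way to attempt this test is to query the collection with the mongodb driver directly and return plain documents, bypassing Parse Object construction entirely. A minimal sketch, assuming a MONGO_URI environment variable and a hypothetical rawFollow function; the driver's Collection object holds circular references (which may be what blew the call stack above), so the sketch materializes the result with toArray() before returning it:

const { MongoClient } = require('mongodb');

// Reuse a single client across invocations instead of connecting per request
const client = new MongoClient(process.env.MONGO_URI);
const clientReady = client.connect();

Parse.Cloud.define("rawFollow", async (request) => {
  await clientReady;
  // find().toArray() yields plain JSON-serializable documents,
  // so no Mongo-to-Parse-Object conversion takes place.
  return await client.db().collection("Follow").find({}).toArray();
});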

OK, I created a GitHub issue for this: https://github.com/parse-community/parse-server/issues/7036

Have a look at:


@uzaysan, can you please try your benchmarking with JS SDK 4.0.1 / Parse Server v6?