Problem:
When I run the code below for one user (one batch of 800 records that should take around 15 seconds to complete), the processing time instead grows steadily to several minutes per batch until the server has to be restarted:
```
// see if sender tracking already exists in db for this user (query in one call)
let senderTrackingRecords = await new Parse.Query("EmailSenderTracking")
  .equalTo("userId", userId)
  .containedIn("senderId", senderIdArray)
  .grapeFind();

// create array of sender IDs from "senderTrackingRecords" (to compare later with "senderIdArray")
let senderTrackRecordSenderIds = [];
for (let index = 0; index < senderTrackingRecords.length; index++) {
  const senderId = senderTrackingRecords[index].get("senderId");
  const existingCount = senderTrackingRecords[index].get("count");
  senderTrackRecordSenderIds.push(senderId);
  let currentObj = senderTrackingObjBySenderId[senderId];
  // update existing records
  senderTrackingRecords[index].set("lastMessage", currentObj.messageId);
  senderTrackingRecords[index].set("count", existingCount + currentObj.count);
}

if (senderTrackingRecords && senderTrackingRecords.length > 0) {
  await Parse.Object.grapeSaveAll(senderTrackingRecords);
}
```
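In plain terms, the merge step above does: look up the existing rows by senderId, add the incoming count onto the stored count, and overwrite lastMessage (grapeFind/grapeSaveAll are my thin wrappers around find()/saveAll()). Here is a dependency-free sketch of just that merge logic, with plain objects standing in for Parse objects (mergeSenderTracking and the parameter names are mine, for illustration only):

```javascript
// Merge incoming per-sender tallies into previously stored tracking rows.
// `existingRecords` stands in for the rows returned by the query above;
// `incomingBySenderId` mirrors senderTrackingObjBySenderId.
function mergeSenderTracking(existingRecords, incomingBySenderId) {
  const seenSenderIds = [];
  for (const record of existingRecords) {
    const incoming = incomingBySenderId[record.senderId];
    if (!incoming) continue; // nothing new arrived for this sender
    seenSenderIds.push(record.senderId);
    record.lastMessage = incoming.messageId; // overwrite with latest message
    record.count += incoming.count;          // accumulate the running count
  }
  return seenSenderIds; // used later to decide which senders need brand-new rows
}
```

The point is that the in-memory merge is trivial; all of the time is spent in the query and the saveAll round-trip.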
I have the same symptoms as most of the other posts I’ve seen with similar issues:
Parse server performance loss after time
(opened 02 Dec 2020, closed 16 Mar 2021)
### New Issue Checklist
- [x] I am not disclosing a [vulnerability](https://github.com/parse-community/parse-server/blob/master/SECURITY.md).
- [x] I have searched through [existing issues](https://github.com/parse-community/parse-server/issues?q=is%3Aissue).
- [x] I am not just asking a [question](https://github.com/parse-community/.github/blob/master/SUPPORT.md).
- [x] I can reproduce the issue with the [latest version of Parse Server](https://github.com/parse-community/parse-server/releases).
### Issue Description
When I run performance tests against Parse Server, performance decreases. And when I say decreases, I don't mean it drops a little and then stops: it keeps decreasing and never stops decreasing. This only happens when we use a Parse query in cloud code and the query returns Parse object(s). Queries that don't return Parse objects (e.g. count queries), or queries that should return an array of Parse objects but return an empty array (for example, the collection is empty), are fine; their performance doesn't decrease. The problem only happens when the query returns Parse objects. You can see more information on the [forum thread that I have created](https://community.parseplatform.org/t/parse-server-performance-loss-after-time/1070).
### Steps to reproduce
Create a cloud code function and put a simple query in it (the query must return Parse object(s)). Then run a continuous benchmark test against this cloud code function.
I used [wrk library](https://github.com/wg/wrk) to do benchmark tests. This is the code I used when I do benchmarking:
`wrk -t1 -c400 -d30s -H "Content-Type: application/json" -H "X-Parse-Application-Id: YOUR_APP_ID" -s post.lua http://parse_server_ip:1337/parse/functions/codeWithQuery`
Here is an example cloud code:
```
Parse.Cloud.define("codeWithQuery", async (request) => {
  const Follow = Parse.Object.extend("Follow");
  // Follow class has 35 objects, so this query returns 35 Parse objects.
  const getFollow = new Parse.Query(Follow);
  return await getFollow.find({ useMasterKey: true });
});
```
### Actual Outcome
Requests per second decrease. You can see my benchmark history in [this message](https://community.parseplatform.org/t/parse-server-performance-loss-after-time/1070/6?u=uzaysan), and more information on that thread.
### Expected Outcome
Performance shouldn't decrease. Even if it does, it should stabilize after some time.
### Environment
Server
- Parse Server version: `4.3.0`
- Operating system: `Ubuntu 18.04`
- Local or remote host (AWS, Azure, Google Cloud, Heroku, Digital Ocean, etc): `Remote host`
Database
- System (MongoDB or Postgres): `MongoDB`
- Database version: `4.4.1`
- Local or remote host (MongoDB Atlas, mLab, AWS, Azure, Google Cloud, etc): `Remote`
Client
- SDK (iOS, Android, JavaScript, PHP, Unity, etc): `Cloud Code`
- SDK version: `Cloud Code`
### Logs
(opened 27 Mar 2020, type:feature)
I'm trying to fetch more than 100,000 rows for report generation. I tried these approaches:
1. Recursive calls with a 1,000-record limit to fetch all rows
2. Using query.each() and pushing each row into an array

Both approaches are slow. For a cloud code query like

query = // new query
query.equalTo("SOMEKEY", "SOMEVALUE")

**over a total of 1,500,000 rows:**
- the recursive-call approach waits approximately **1 second per 1,000 rows**;
- the query.each() approach takes approximately **60 seconds** of HTTP waiting time.

> I assume this is the fastest Parse can work.
> Can I see some improvement by indexing the MongoDB objectId column?

Is there a better approach for processing such huge data? One alternative I can think of is Parse aggregate, but that approach doesn't respect Parse authentication, security, ACLs, etc., and doesn't work with existing query logic. Would it show any performance improvement and be worth a try?
(opened 02 Apr 2019, closed 16 Jun 2019)
### Issue Description
For demo purposes, we need to clone 3000 objects and save them each time a demo session is started. We use a cloud function to do this among other things, and the saveAll operation on the newly cloned 3000 objects takes more than 10 seconds, regardless of the batchSize parameter.
### Steps to reproduce
1- query for 3000 objects
2- clone them
3- saveAll them
### Expected Results
It should send a single (or a few batch) operation to the DB
### Actual Outcome
It seems instead that parse-server receives the requests in batches (from itself) but sends them one by one to the DB, which is then very slow.
### Environment Setup
- **Server**
- parse-server version: 3.1.3
- Localhost or remote server? AWS ELB
- **Database**
- MongoDB version: 3.6
- Localhost or remote server? MongoAtlas
But unfortunately none of those seem to have a clear solution, and I've tried enabling directAccess and databaseOptions.enableSchemaHooks, along with lots of other testing.
This is what the growth looks like on the server when processing one batch of 800 records – it keeps growing.
As a test, I created one function that repeatedly saves 1,000 hard-coded records without any other logic using saveAll():
```
const stopTime = Date.now() + 3600000; // 1 hour = 3600000; 30 mins = 1800000
while (Date.now() < stopTime) {
  let newSenderTrackingRecords = [];
  for (let index = 0; index < 1000; index++) {
    const Record = Parse.Object.extend("EmailSenderTracking");
    const record = new Record();
    record.set("userId", "XXXXXXXXXX");
    record.set("lastMessage", "XXXXXXXXXX");
    record.set("lastMessageTime", Date.now());
    record.set("senderId", "XXXXXXXXXX");
    record.set("count", 45);
    record.set("mailboxId", "XXXXXXXXXX");
    newSenderTrackingRecords.push(record);
  }
  await Parse.Object.grapeSaveAll(newSenderTrackingRecords);
  console.log("SAVED VIA PARSE");
}
```
And it looks like this (same problem with growing time to complete task & crashing server):
Then I saved directly to MongoDB via mongoose, and voila: it saves 1,000 records in less than 1 second, repeatedly, with no problem.
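For reference, a minimal sketch of that direct-write path (saveBatchDirect and toMongoDoc are placeholder names of mine; in practice `collection` would be something like mongoose.connection.collection("EmailSenderTracking"), and note this bypasses Parse's own internal bookkeeping fields, which is part of why mixing the two gets messy):

```javascript
// Direct-to-Mongo sketch: one bulk insert per batch instead of per-object saves.
// `collection` is any object exposing insertMany(docs), e.g. a mongoose or
// native-driver collection handle.

// Placeholder mapper: shape a plain record for storage. This is my own field
// handling, not Parse's internal metadata (e.g. _created_at / _updated_at).
function toMongoDoc(record) {
  return { ...record, lastMessageTime: new Date(record.lastMessageTime) };
}

async function saveBatchDirect(collection, records) {
  // insertMany sends the whole batch to the DB in one bulk operation.
  return collection.insertMany(records.map(toMongoDoc));
}
```

This is consistently fast because the 1,000 documents travel to MongoDB as a single bulk insert rather than going through the Parse REST/save pipeline.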
Here’s my Parse config:
```
const config = {
  databaseURI: databaseUri || 'mongodb://localhost:27017/dev',
  cloud: process.env.CLOUD_CODE_MAIN || __dirname + '/cloud/main.js',
  appId: process.env.APP_ID || 'myAppId',
  masterKey: process.env.MASTER_KEY || '', // Add your master key here. Keep it secret!
  serverURL: process.env.SERVER_URL || 'http://localhost:1337/parse', // Don't forget to change to https if needed
  serverSelectionTimeoutMS: 60000,
  connectTimeoutMS: 60000,
  enableAnonymousUsers: false,
  allowClientClassCreation: false,
  directAccess: true,
  databaseOptions: {
    enableSchemaHooks: true,
  },
};
```
Running Parse Server 5.4 and MongoDB 5
Can you all think of anything else to test, or is there another obvious configuration option I'm missing?
A workaround I can see is to stop using Parse for all db calls, or to save directly to Mongo for 800+ record batches, but then it's a mess to use both Parse and Mongo against the same database if I can't use them interchangeably. I'm aware of the { json: true } trick to return plain JSON objects from Parse.Query() queries, but I haven't figured out how to resolve the errors that appear due to incompatibilities between Parse objects and the JSON documents in MongoDB. Today I saw that there might be an adapter for Parse-to-Mongo object storage, but I'm not sure that's even the best solution.
Appreciate any insights!