Question About Concurrency in Parse

I have another noob question about how queries and mutations are handled by Parse Server, especially ones triggered by Cloud Code, on a Parse Server run via PM2 (not sure if this info is relevant or not).

My question is: are queries handled in parallel or one by one? And if they run one by one, is it guaranteed that one query will run only after the afterFind trigger of the previous query has finished running?

How would one go about avoiding concurrency problems (e.g. a query returning an object that has just been deleted)?

@BobyIlea
Actually, Parse Server doesn’t run your queries in parallel, because Node.js executes JavaScript on a single thread. By default it runs as a single instance, without PM2; running multiple instances (e.g. via PM2’s cluster mode) is your decision.

Regarding the afterFind question: there is no query queue. Requests are interleaved on the event loop, much like ordinary JavaScript job processing, so there is no guarantee that one query runs only after the previous query’s afterFind has finished.

You can prevent some of these issues with atomic operations, which are handled at the database level; check which of your database’s operations are atomic. Even then, you cannot be entirely sure about race conditions across requests, so you may need other database-level solutions or a mutex library.
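To make the mutex idea concrete, here is a minimal sketch in plain Node.js (not Parse-specific; all names are hypothetical). A promise-chained lock serializes critical sections within a single process, turning a lossy read-modify-write into a safe one:

```javascript
// Minimal promise-based mutex: each runExclusive() call waits for the
// previous one to finish before running its critical section.
class Mutex {
  constructor() {
    this._last = Promise.resolve();
  }
  runExclusive(fn) {
    const result = this._last.then(() => fn());
    // Keep the chain alive even if fn rejects.
    this._last = result.catch(() => {});
    return result;
  }
}

// Demo: a racy read-modify-write vs. the same logic under the mutex.
async function demo() {
  let counter = 0;
  const bump = async () => {
    const seen = counter;                     // read
    await new Promise(r => setImmediate(r));  // simulate async DB latency
    counter = seen + 1;                       // write (may clobber a concurrent write)
  };

  await Promise.all([bump(), bump(), bump()]);
  const racy = counter; // 1 — all three read 0, so two updates were lost

  counter = 0;
  const mutex = new Mutex();
  await Promise.all([1, 2, 3].map(() => mutex.runExclusive(bump)));
  const safe = counter; // 3 — updates ran one at a time

  return { racy, safe };
}
```

Note the caveat: a mutex like this only guards a single Node process. With PM2 cluster mode or multiple server instances you would still need database-level atomicity or a distributed lock.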

Thanks @rgunindi for the insight. I was thinking that might be the case, but was hoping I was wrong. In that case, how would one approach the following (I assume quite common) scenario:

I have a list of predefined unique products (I’ll call them packs). I want a cloud function that, when called by a user, fetches a random pack that has not yet been assigned to any user (I can get the randomness by ordering by objectId; that part isn’t important for this discussion) and then assigns that pack to the calling user.

Let’s say the table is quite simple, we only have:

class User_Packs {
   objectId: string,
   userId: string,
   packAttributes: ...
}

My current logic looks something like this (pseudo-JS):

Parse.Cloud.define('getPack', async request => {
    // Find the first pack with no user assigned yet.
    const packQuery = new Parse.Query("User_Packs");
    packQuery.ascending("objectId");
    packQuery.doesNotExist("userId"); // equalTo("userId", undefined) won't match unset fields

    const pack = await packQuery.first();
    if (!pack) {
        throw new Parse.Error(Parse.Error.OBJECT_NOT_FOUND, "No unassigned packs left");
    }

    // Claim the pack for the caller.
    pack.set("userId", request.params.userId);
    await pack.save();

    return pack.id;
});

Basically the whole logic consists of three main steps: query → set a value → save

But my assumption is that with this approach there is a chance two users end up getting the same pack, which I want to avoid. Am I right?
If so, how would I refactor this to make sure that can’t happen?
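The assumption can be checked with a plain-Node simulation of the query → set → save flow against an in-memory stand-in for the table (hypothetical names, no Parse involved). When two callers run concurrently, both read the same unassigned pack before either write lands:

```javascript
// In-memory stand-in for the User_Packs table.
const packs = [
  { id: "p1", userId: undefined },
  { id: "p2", userId: undefined },
];

// Mimics the Cloud function: query first unassigned pack -> set userId -> save.
async function getPack(userId) {
  const pack = packs.find(p => p.userId === undefined); // query
  await new Promise(r => setImmediate(r));              // simulated network/DB latency
  pack.userId = userId;                                 // set + save
  return pack.id;
}

async function race() {
  // Two users call the function at (almost) the same time.
  const [a, b] = await Promise.all([getPack("alice"), getPack("bob")]);
  return { a, b, same: a === b };
}
```

Here both callers get `"p1"`: each runs the query part before either `await` resumes, so the second write silently overwrites the first. Any fix has to make the “find an unassigned pack and claim it” step a single atomic operation.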

I’ve read about batching operations in transactions, but I have no real idea how that’s done or even where to start looking for info on the subject. I may also have misunderstood the concept entirely, but if there were a way to send a batch of commands from a Cloud function to the database with a guarantee that they execute “all at once”, that would be very helpful.

Another approach would be for the Cloud function to push requests onto a queue instead of handling the operations directly, with a Cloud Job then resolving the requests one by one. But this requires more work and gets complicated, especially when it comes to returning results to the caller, so I would ideally avoid it.
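For what it’s worth, the “returning results to the caller” part of the queue idea falls out naturally if each queued request carries its own promise. Here is a minimal in-process sketch (hypothetical names; this only serializes within one Node instance, so it would not survive PM2 cluster mode):

```javascript
// FIFO claim queue: each caller gets a promise; a single drain loop
// resolves requests one at a time, so no two callers claim the same pack.
const packs = [{ id: "p1" }, { id: "p2" }]; // unassigned packs (stand-in for the DB)
const assigned = new Map();                 // packId -> userId
const queue = [];
let draining = false;

function requestPack(userId) {
  return new Promise((resolve, reject) => {
    queue.push({ userId, resolve, reject });
    drain();
  });
}

async function drain() {
  if (draining) return; // only one drain loop runs at a time
  draining = true;
  while (queue.length > 0) {
    const { userId, resolve, reject } = queue.shift();
    await new Promise(r => setImmediate(r)); // simulated async DB work
    const pack = packs.shift();              // take the next free pack
    if (!pack) {
      reject(new Error("No packs left"));
      continue;
    }
    assigned.set(pack.id, userId);
    resolve(pack.id);
  }
  draining = false;
}
```

Usage: `await Promise.all([requestPack("alice"), requestPack("bob")])` yields two different pack ids, because claims are processed strictly one after another even though the callers arrived concurrently.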

@BobyIlea Hi,
You can use transactions in Cloud Code:

https://www.mongodb.com/docs/manual/core/transactions/
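For future readers, a rough, untested sketch of what the linked docs describe, using the official MongoDB Node.js driver. Treat it as pseudocode: it assumes MongoDB runs as a replica set (required for transactions) and that you open a separate driver connection (`client`, `db`) from your Cloud Code, since Parse Server does not expose its driver to you; `requestingUserId` is a placeholder.

```js
// Inside an async function, with `client`/`db` from a separate driver connection.
const session = client.startSession();
try {
  await session.withTransaction(async () => {
    // Read and write inside one transaction so the claim is atomic:
    const pack = await db.collection("User_Packs").findOne(
      { userId: { $exists: false } },
      { session, sort: { _id: 1 } }
    );
    await db.collection("User_Packs").updateOne(
      { _id: pack._id },
      { $set: { userId: requestingUserId } },
      { session }
    );
  });
} finally {
  await session.endSession();
}
```

If two transactions claim the same document, one aborts with a write conflict and `withTransaction` retries it, re-reading the next unassigned pack. Also worth noting: for this particular claim pattern, a single `findOneAndUpdate` is already atomic on its own, with no transaction needed.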