Hi everyone, I have a concurrency issue in the code below:
Parse.Cloud.define("subscribe", async (request) => {
  .....
  const listId = .....
  const subscriberId = ...
  const SubscribersList = Parse.Object.extend("SubscribersList");
  const query = new Parse.Query(SubscribersList);
  query.equalTo("listId", listId);
  const numberSubscriptions = await query.count({ useMasterKey: true });
  // maxSubscribers retrieved from another collection linked to the SubscribersList with the same listId
  const maxSubscribers = ..... get("maxSubscribers");
  if (numberSubscriptions < maxSubscribers) {
    const subscriber = new SubscribersList();
    subscriber.set("listId", listId); // needed so the count query above sees this row
    subscriber.set("subscriberId", subscriberId);
    subscriber.set("subscriptionDate", new Date());
    await subscriber.save(null, { useMasterKey: true });
  }
}, {
  requireUser: true
});
When I add a new subscriber, concurrent requests can push the list past the “maxSubscribers” limit for a specific subscription list, because the count and the save are two separate operations.
Any suggestion on how to manage the concurrency in this case?
Is there any kind of mutex / lock that can be implemented safely to handle this, including the write operation on the same collection field?
It may be possible to use an aggregation query for this. Aggregation pipelines can also perform modifications, so in a single request you could match the object with a condition such as subscriberCount < maxSubscribers and then update it in the next pipeline stage.
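For illustration, here is a rough sketch of that idea written directly against the MongoDB Node.js driver (so it bypasses the Parse SDK and any Cloud Code triggers), assuming a hypothetical SubscriptionList collection whose documents carry both a subscriberCount and a maxSubscribers field; the collection and field names are placeholders, not part of the original code:

const { MongoClient } = require("mongodb");

// Attempts to reserve one subscription slot for the given list.
// The filter matches only while subscriberCount is below maxSubscribers, and the
// pipeline-style update increments the counter, so the check and the write are
// applied as a single atomic operation on the server (MongoDB 4.2+).
async function tryReserveSlot(db, listId) {
  const result = await db.collection("SubscriptionList").updateOne(
    { listId: listId, $expr: { $lt: ["$subscriberCount", "$maxSubscribers"] } },
    [{ $set: { subscriberCount: { $add: ["$subscriberCount", 1] } } }]
  );
  // modifiedCount is 1 only if a slot was actually reserved.
  return result.modifiedCount === 1;
}

// Example usage (connection string and names are placeholders):
async function main() {
  const client = await MongoClient.connect("mongodb://localhost:27017");
  const ok = await tryReserveSlot(client.db("parse"), "myListId");
  console.log(ok ? "slot reserved" : "list is full");
  await client.close();
}

If tryReserveSlot returns true, the cloud function can then save the new SubscribersList row with the Parse SDK as before; if it returns false, the list is already full.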
However, I think the proper solution may be to implement a findAndUpdate command in Parse Server.
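Until such a command exists, the underlying operation it would wrap can be sketched with the driver's findOneAndUpdate (MongoDB Node.js driver v6, where the call resolves to the matched document or null; the counter field is the same hypothetical subscriberCount as above):

// "db" is a connected mongodb.Db instance for the Parse database.
async function claimSlot(db, listId) {
  // Atomically claim a slot: the list document is matched and incremented only
  // while subscriberCount is still below maxSubscribers.
  const updated = await db.collection("SubscriptionList").findOneAndUpdate(
    { listId: listId, $expr: { $lt: ["$subscriberCount", "$maxSubscribers"] } },
    { $inc: { subscriberCount: 1 } },
    { returnDocument: "after" }
  );
  // null means no document satisfied the filter, i.e. the list is already full.
  return updated !== null;
}

The advantage over the count-then-save approach is that the capacity check and the increment happen in one server-side operation, so two concurrent requests cannot both see a free slot.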