Query.save gets error: Uncaught internal server error. {"stack":"Error"}

Dear Parse team,
I have a user import script:

const userObj = Parse.Object.extend("User");
const addingList = [];
for (let i = 0; i < downloadedCustomers.length; i++) {
  const user = new userObj();
  user.set('username', downloadedCustomers[i].cardcode);
  user.set('password', downloadedCustomers[i].password);
  user.set('name', downloadedCustomers[i].cardname);
  addingList.push(user);
  // save(attrs, options): the options object must be the second argument,
  // otherwise useMasterKey is set as a field on the object instead of an option.
  user.save(null, { useMasterKey: true }).then((success) => {
    console.log(success);
  }).catch((error) => {
    console.log(error);
  });
}
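
As written, this loop fires every save() in parallel without awaiting any of them, so all the requests hit the server at once. A minimal sketch of a serialized variant, assuming the surrounding function is async:

for (let i = 0; i < downloadedCustomers.length; i++) {
  const user = new userObj();
  user.set('username', downloadedCustomers[i].cardcode);
  user.set('password', downloadedCustomers[i].password);
  user.set('name', downloadedCustomers[i].cardname);
  try {
    // Wait for each save to finish before starting the next one.
    await user.save(null, { useMasterKey: true });
  } catch (error) {
    console.log('Failed to save ' + downloadedCustomers[i].cardcode + ': ' + error);
  }
}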

When I use this code on an array, it sometimes gives

error: Uncaught internal server error. {"stack":"Error"}

on some of the objects, but sometimes it successfully saves all of them.

If I use:

Parse.Object.saveAll(addingList, { useMasterKey: true })

some objects get saved, some do not.

Please clarify how I can import all objects successfully every time.

PS: the same thing happens with

user.signUp()
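
For what it's worth, signUp has the same (attrs, options) argument order as save, so a call passing options would be a sketch like:

await user.signUp(null, { useMasterKey: true }); // options go in the second argument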

I suggest you clone the Parse Server repo locally and set a breakpoint on the error. That way you can look at the stack trace and see what caused it.

I fixed it by dividing the object requests into chunks of 10.
It seems saving Parse user objects needs time to process and resolve.

const chunk2 = 10; // chunk size (10 objects per saveAll call)
for (let h = 0, k = addingList.length; h < k; h += chunk2) {
  const temparray = addingList.slice(h, h + chunk2);
  // Await each chunk so the next one starts only after this one resolves.
  await Parse.Object.saveAll(temparray, { useMasterKey: true }).then(() => {
    console.log((h + chunk2) + ' customers were added from total ' + k);
    return (h + chunk2) + ' customers were added from total ' + k;
  }).catch((error) => {
    console.log('Some customers failed to import in chunk ' + h + '. Error: ' + error);
    return 'Some customers failed to import in chunk ' + h + '. Error: ' + error;
  });
}

Parse Server already supports the batchSize parameter that divides the objects passed to saveAll into batches. There is a default batchSize if I’m not mistaken, so even if you don’t set the parameter explicitly, it is saving the objects in batches.
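
For example, in the JavaScript SDK the batch size can be passed in the save options; a minimal sketch:

// Split the saveAll into batches of 10 instead of the SDK default.
await Parse.Object.saveAll(addingList, { useMasterKey: true, batchSize: 10 });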

Yes, the default saveAll() batchSize is 20, and it gives the error. If I set the chunk size below 20, the error disappears.

So in my opinion the default batch's resolve times out before the next batch begins processing.

That behavior can be expected, depending on the context. If you are saving large objects and the database, the server, or the network between them is near its resource limits, a smaller batch size may be required.

However, if you are saving small objects and the metrics don’t indicate a resource constraint, that would look suspicious.