You can change it now or wait until you get closer to the limit. In the thread, people suggest that the default 10-character size is suitable for up to 1M records. It shouldn’t have any impact on your current data, but you might want to test in a development database first.
It has an 11-character key. YouTube has billions of videos. If they can identify every video with just an 11-character key, you should be able to identify at least millions of objects with a 10-character key.
Let’s do the math. The lowercase alphabet has 26 letters, plus 10 digits:
26 + 10 = 36
Since characters can repeat, a 10-character key gives 36^10 possible values:
36^10 = 3,656,158,440,062,976
That’s over three quadrillion.
And that’s with lowercase letters only. If we include uppercase letters too, the alphabet grows to 62 characters, and 62^10 ≈ 8.4 × 10^17 is far larger still.
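The arithmetic above can be checked with a couple of lines of Python (a standalone sketch, not Parse Server code):

```python
# Key-space size for a fixed-length random ID: each of the `length`
# positions is chosen independently, so it's alphabet_size ** length.
def keyspace(alphabet_size: int, length: int) -> int:
    return alphabet_size ** length

lowercase_plus_digits = keyspace(26 + 10, 10)   # 36^10
alphanumeric = keyspace(26 + 26 + 10, 10)       # 62^10, adding uppercase

print(f"{lowercase_plus_digits:,}")  # 3,656,158,440,062,976
print(f"{alphanumeric:,}")           # 839,299,365,868,340,224
```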
I assumed objectIds are generated randomly by Parse Server. If Parse Server uses a different algorithm, these numbers would change.
That said, I actually just did the calculation: with 1M objects drawn from a key space of 36^10, the probability of a clash is only about 0.00014 (by the standard birthday approximation).
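The clash probability can be checked with the standard birthday approximation (a standalone Python sketch, not Parse Server code):

```python
import math

# Birthday-problem approximation: P(collision) ≈ 1 - exp(-n(n-1) / (2N))
# for n random draws from a space of N equally likely keys.
def collision_probability(n: int, keyspace: int) -> float:
    return 1.0 - math.exp(-n * (n - 1) / (2.0 * keyspace))

p = collision_probability(1_000_000, 36 ** 10)
print(f"{p:.5f}")  # ≈ 0.00014
```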
So perhaps increasing the key size isn’t that vital. I’d assume Parse wouldn’t crash if it attempted to insert an object with a key that already exists.