Cloud Code as Serverless Functions

I see that this question was broached when flovilmart was on board, but there wasn’t any real conclusion drawn.
I tend to rely heavily on Cloud Code and use REST / GraphQL without a client-side Parse lib.

I have visions of carving up my API into single functions that can be deployed individually to Azure / AWS / Netlify.

My main concern is with cold boot times; would they be too slow? Anything more than a 500ms cold boot I would consider too long.

Has anyone tried this? Here is a very simple guide to Azure. My preference would actually be 2 or Netlify.

How are you planning to build the serverless functions? If you try to embed the whole Parse Server and run the cloud code function as we normally do in the monolithic approach, you will probably have a large cold boot time. I’d try to build each serverless function as an isolated module that only has the Parse JS SDK installed, for a low cold boot time (if that is your main concern).

Hi @davimacedo, yes I was proposing carving up the app into single functions. In my cloud code there are approx 50 separate functions that I simply call via REST.

I think a minimal example would be a simple login function, which would only need parse-server as a dependency. I wonder whether starting up Parse Server takes a long time, since you would be spinning up the function and then running Parse.

For example you would have to run this on every single function call.

// 1. Spin up the function dependencies, then run the following:

var express = require('express');
var ParseServer = require('parse-server').ParseServer;

var api = new ParseServer({
  databaseURI: 'mongodb://your.mongo.uri',
  cloud: './cloud/main.js',
  appId: 'myAppId',
  fileKey: 'myFileKey',
  masterKey: 'mySecretMasterKey',
  push: { ... }, // See the Push wiki page
  filesAdapter: ...,
});

var app = express();

// Serve the Parse API at the /parse URL prefix
app.use('/parse', api);

var port = 1337;
app.listen(port, function() {
  console.log('parse-server-example running on port ' + port + '.');
});

This seems really wasteful compared to a monolithic server that is already running and listening. Am I thinking about this completely wrong? I don’t fully understand serverless functions.

I’ve never tested it before, but I think it is worth trying. AWS Lambda actually reuses the process between different calls. It cold boots on the first call, and it only needs to cold boot again when two calls hit the same function at the same time. It then keeps reusing those two processes and only cold boots again when a third concurrent request arrives. And so on…

The other approach I’d test is a little bit different: I’d try to keep the Parse Server isolated from the functions and, in each function, just initialize the Parse SDK connecting to that isolated Parse Server. In this case you would probably get a much better cold boot time.