Handling larger files

I’ll generate some large files locally, which I’d like to put on S3 through Parse, with pure Node.js code. The documentation seems to only accept base64 strings or byte arrays, which will consume vast amounts of memory. Is there a way to use a Stream or File object so the entire file is not loaded into memory first?


I think that’s not currently possible using the JS SDK, but in theory it should be possible by using the FilesController (https://github.com/parse-community/parse-server/blob/4c29d4d23b67e4abaf25803fe71cae47ce1b5957/src/Controllers/FilesController.js#L18) or the S3Adapter (https://github.com/parse-community/parse-server-s3-adapter/blob/master/index.js#L80) directly from your Cloud Code. Worst case, you can use the AWS SDK directly to upload.