Timelapse video generation with FFmpeg
FFmpeg is a framework that helps developers process multimedia files. Because it is distributed as source code, it runs on pretty much any platform. From simple file-type conversions, to plugging in machine learning models, to automatically removing bad frames from videos, it can do almost anything you could dream of in the world of multimedia.
As mentioned above, you can run FFmpeg on pretty much any platform (see the installation page on the FFmpeg website to find the right option for you). I personally used Homebrew:
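On macOS the install is a one-liner (the formula is named `ffmpeg` on Homebrew):

```shell
brew install ffmpeg
```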
In production it's unlikely you will be using Homebrew, so later on we will cover how to deploy your code with FFmpeg installed using Docker.
Before writing any code, we wrote down each step we would need to complete to get to the end result of a time-lapse video. It looked something like this:
1. Get image paths from the database
2. Download the images from AWS S3
3. Create and save the video using FFmpeg
1. Get images from the database
In our database we have a table that stores the S3 location of every image. As this will differ from project to project, let's just assume we are left with an array of image paths within an S3 bucket:
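Something like this, for example (the bucket layout and key names here are invented for illustration):

```javascript
// Example shape of the data after the database query: an ordered
// array of S3 object keys, one per frame. The names are made up.
const imageKeys = [
  "timelapse/site-42/img_0001.jpg",
  "timelapse/site-42/img_0002.jpg",
  "timelapse/site-42/img_0003.jpg",
];
```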
2. Download the images from S3
This step consists of a loop over the array of file paths and a function that downloads each image from AWS S3 and saves it to a local folder. After running this snippet we should have a local folder, /tmp, containing all the downloaded images.
3. Create video using FFmpeg
fluent-ffmpeg is an npm package that wraps FFmpeg so it becomes really easy to use with Node.js. The code below simply reads from the folder of images we created in the last step and uses FFmpeg to compile them into a video file. There are loads of settings for both the image input and the video output, such as cropping, resizing, scaling and resolution; check the documentation on the FFmpeg website for the full details of what you can do. In our case we scale and crop the video to a specific size that we know works for our images.
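A sketch using fluent-ffmpeg. The frame pattern, frame rate and filter string are assumptions: the `%04d` pattern expects frames saved with zero-padded numbers, and 1,500 frames at 25 fps gives a one-minute video. Substitute whatever scaling and cropping works for your own images:

```javascript
// Duration sanity check: frames / fps = seconds of video.
function durationSeconds(frameCount, fps) {
  return frameCount / fps;
}

// Compile a numbered image sequence into a video file.
function createTimelapse(inputPattern, outputFile, fps = 25) {
  // Required lazily so durationSeconds() works without the package installed.
  const ffmpeg = require("fluent-ffmpeg");
  return new Promise((resolve, reject) => {
    ffmpeg()
      .input(inputPattern) // e.g. "/tmp/img_%04d.jpg"
      .inputFPS(fps)
      // Assumed filter chain: scale to 1920 wide, then crop to 1080p.
      .videoFilters("scale=1920:-2,crop=1920:1080")
      .output(outputFile) // e.g. "/tmp/timelapse.mp4"
      .on("end", () => resolve(outputFile))
      .on("error", reject)
      .run();
  });
}
```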
Deploying with Docker
As mentioned at the beginning of the article, one of the problems we ran into when deploying was that you need the FFmpeg binary installed on whatever stack you are using. To make this easier we simply wrapped our app in a Docker image that installs everything we need:
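A minimal sketch of such a Dockerfile; the base image, file layout and entry point are assumptions for illustration:

```dockerfile
# Node base image with a package manager that can install FFmpeg.
FROM node:18-alpine

# Install the FFmpeg binary that fluent-ffmpeg shells out to.
RUN apk add --no-cache ffmpeg

WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .

# Assumed entry point for the app.
CMD ["node", "index.js"]
```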
FFmpeg is pretty good at using the resources available and not breaking no matter what spec of server you are using. However, we found that on a small server things got pretty slow. An instance with 8 GB of memory and 4 vCPUs gave FFmpeg all the resources it needed to create the videos as quickly as possible (one-minute videos consisting of 1,500 frames). Anything more than this would result in us paying more for the instance with no extra speed gains.
A server with these specs on AWS could cost you up to $75 a month, so we came up with a solution using ECS which cost under $5 a month to create hundreds of time-lapse videos.
Check out my article Using ECS tasks on AWS Fargate to replace Lambda functions to see how you can trigger running the Docker container in an ECS task and shut it back down when the task is complete, saving costs compared with Lambda/EC2.