Analog to Digital: Rethinking Arcade Video Exports

Learn about our Engineering team's journey to deliver higher-quality video exports.

In the summer of 2022, our team got together for an offsite in NYC. On the last day, we were spitballing ideas for unique new Arcade features that our customers would find helpful. We knew that GIFs are an essential tool in a product marketer’s toolbox: they give life to newsletters and social media posts, and offer a small glimpse into the new features product teams ship to their customers. We’d always known we needed to figure out how to create GIF-based Arcade demos, but we didn’t know how.

Like every good product engineering team, we set out to find a solution. We’re a scrappy team. We move fast and love MVPs of MVPs. We didn’t want to build an elaborate video solution, nor did we want to spend months on end building something that might not become a viable part of the product. So, we thought about the problem pragmatically:

What if we autoplay an Arcade while pointing a camera at it?

Not a physical camera, of course, but a “camera” that can record a tab in the browser like a screen recording tool. There are many tools capable of doing this, but the most popular one available to developers is Puppeteer. Puppeteer is mainly used to take screenshots or videos during end-to-end tests so developers can visually see what went wrong in the app when a test fails. We thought we could use the same technology to record a video of an auto-playing Arcade. Generating a GIF is then a simple video-to-GIF conversion once the original video is recorded.
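
A minimal sketch of that approach (the URL, timings, and helper names here are illustrative, not our production code): recent versions of Puppeteer can record a tab directly via `page.screencast()`, and ffmpeg can handle the video-to-GIF step.

```javascript
// Sketch: record an auto-playing demo in a headless browser, then convert
// the capture to a GIF. Names and timings are illustrative assumptions.

// Records the tab for `ms` milliseconds using Puppeteer's built-in
// screencast (available in recent Puppeteer versions). Imported lazily so
// the pure helper below works without puppeteer installed.
async function recordArcade(url, outPath, ms) {
  const puppeteer = (await import('puppeteer')).default;
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  const recorder = await page.screencast({ path: outPath }); // .webm output
  await new Promise((resolve) => setTimeout(resolve, ms));
  await recorder.stop();
  await browser.close();
}

// Builds ffmpeg arguments for a palette-based video-to-GIF conversion,
// which produces far better colors than a naive one-pass conversion.
function toGifArgs(input, output, fps = 15, width = 640) {
  const filter =
    `fps=${fps},scale=${width}:-1:flags=lanczos,` +
    `split[a][b];[a]palettegen[p];[b][p]paletteuse`;
  return ['-i', input, '-filter_complex', filter, '-y', output];
}
```

The array from `toGifArgs` would be handed to ffmpeg via something like `child_process.spawn('ffmpeg', args)`.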

October 2022: Launched GIF/Video exports

Is it analog, or is it digital?

If you’ve been around long enough to experience the difference between an analog recording (cassette tapes, VHS or Beta, or even vinyl records) and a digital recording, you’ll know that the most significant difference is that digital recordings tend to capture a lot more detail than their analog counterparts.

The first version of the Arcade video exporter used an analog method of recording: we automated the playback of an Arcade in real time and recorded the screen it played on. The biggest drawback of this method is that we were bound to lose information during the recording. Frames are dropped and detail is lost, leading to a generally degraded output.

Along the way, we introduced audio in videos, Synthetic Voiceovers, and background music. These features all enhanced interactive Arcades, but adding the same audio features to our video exports was a significant lift for our team. But we were determined to figure out a solution, even if it required an overhaul of how we managed exports.

Our video exporter runs on a serverless “Cloud Function” in Google Cloud. While serverless infrastructure keeps things simple, we’re limited to running on machines without access to a GPU. This means that the GPU-optimized animations we’ve added to Arcades (Hotspot, Callout, and Pan and Zoom) will never look as buttery-smooth in an export as they do on our viewers’ screens.

All of this led to sub-optimal video quality. As a company, we want to enable teams to create beautiful demos, and that needed to translate to exports as well.

Moving to frame-by-frame video rendering

We knew that to get higher-quality video exports, we needed to export without losing information. This meant that every frame in the video export should be an exact rendition of the Arcade at that point in its timeline. To do this, we had to render the video frame by frame, which required rewriting a substantial part of our codebase so that Arcades could be rendered frame by frame and then exported as a video.
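
Conceptually, frame-by-frame export looks like this (a simplified sketch; `renderFrame`, the fps value, and the function names are illustrative, not our actual pipeline):

```javascript
// Sketch of deterministic frame-by-frame rendering: instead of capturing a
// screen in real time, we compute every frame directly from the timeline.
const FPS = 30;

// Map a timeline position (in seconds) to an exact frame index.
const frameForTime = (seconds, fps = FPS) => Math.round(seconds * fps);

// Render every frame of a fixed duration. renderFrame(t, i) would draw the
// Arcade exactly as it looks at t seconds -- nothing is ever dropped.
function renderAll(durationSeconds, renderFrame, fps = FPS) {
  const total = frameForTime(durationSeconds, fps);
  const frames = [];
  for (let i = 0; i < total; i++) {
    frames.push(renderFrame(i / fps, i));
  }
  return frames;
}
```

Because each frame is a pure function of its timestamp, the output no longer depends on how fast the recording machine happens to run.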

Arcade is built with React — a popular web app framework. We wanted to reuse as much of our existing code as possible, so we needed to use React to render videos frame-by-frame. Luckily, there’s a popular source-available project that does just that: Remotion.

Remotion is a framework for generating videos programmatically using React, so we rewrote our existing video export pipeline with it. Aside from allowing us to get high-quality exports, this project also forced us to split our viewer components into interactive and video counterparts. This makes it possible for us to experiment more freely with either interactive or video features without fear of breaking the other. For example, if we want to add more dynamic animations to video exports, we can do so without affecting the interactive side.
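
The core idea in Remotion is that a component derives everything it shows from the current frame number (read via `useCurrentFrame()`), so every rendered frame is deterministic. A simplified stand-in for Remotion’s `interpolate` helper illustrates the model (this is a sketch, not Remotion’s actual implementation):

```javascript
// Simplified stand-in for Remotion's interpolate(): maps the current frame
// number to an animated value, clamped to the output range. In a real
// Remotion component, `frame` would come from useCurrentFrame().
function interpolate(frame, [inStart, inEnd], [outStart, outEnd]) {
  const t = Math.min(1, Math.max(0, (frame - inStart) / (inEnd - inStart)));
  return outStart + t * (outEnd - outStart);
}

// Example: fade a callout in over the first 15 frames (half a second at 30fps).
const opacityAt = (frame) => interpolate(frame, [0, 15], [0, 1]);
```

Because opacity (or position, or scale) is a pure function of the frame index, the renderer can produce frame 0, frame 100, or frame 500 in any order and always get identical pixels.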

With this rewrite, you’ll notice a much higher-quality export that closely resembles the original Arcade. We’ve also been able to add audio capabilities quite easily.

Original Exporter: 
  • No sound (wasn't supported)
  • Choppy animations due to dropped frames and non-optimized CSS animations
  • 28s length / 16 MB file size

New Exporter: 
  • Background music
  • Synthetic Voice audio
  • Higher quality and smoother animations
  • 28s length / 37 MB file size

What’s next?

Now that our video (and GIF) exports are on par with interactive Arcades and we have a proper video generation and rendering framework, we can take on more ambitious projects to further improve video output. This includes exploring more animations, better chapters and video lead-ins and exits, and providing real-time previews before you export. That’s just a bit of what we’re thinking about, but we’d love to hear what you want from Arcade videos.

Please let us know :) 
