
Render SwiftUI animations to mp4 in Xcode playgrounds

I’ve found it convenient to prototype SwiftUI animations in Xcode playgrounds: custom views can be rendered quickly without creating an entire project, and the result is a single file whose code is easy to share. Sharing the visual result, however, is not as easy. There is no built-in way of capturing the rendered output, and resorting to one of the various screen-recording apps out there not only falls short on quality, but also feels like a roundabout approach when a rendering engine is already at hand.

In this post we explore rendering videos directly from the playground. For this purpose, we will reuse an animation I was working on recently. The basic idea is to render a gradient along a circular path in order to implement a knob-like input control. The actual utility of the control is not important here. The result looks like this:

Initially, I expected to use a linear gradient as a lookup map, like a texture. But then I realised that since the path is always circular, it’s possible to take advantage of an angular gradient and use it as a mask over the path. The code for the animation, step 1, is available on GitHub.
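
To illustrate the idea in isolation, here is a minimal sketch of the masking pattern; it is not the code from the linked repository, just a placeholder showing an angular gradient masked by a trimmed, stroked circle:

AngularGradient(
    gradient: Gradient(colors: [.clear, .blue]),
    center: .center,
    startAngle: .degrees(0),
    endAngle: .degrees(360)
)
.mask {
    Circle()
        .trim(from: 0.0, to: 0.75)                                  // how far along the circular path the gradient is revealed
        .stroke(style: StrokeStyle(lineWidth: 24, lineCap: .round)) // the stroked path acts as the mask
        .rotationEffect(.degrees(-90))                              // start at the top instead of the right
}
.frame(width: 320, height: 320)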

What the animation looks like is not really relevant to this post, as it focuses on how to capture a video of any animation. As such, the animation’s implementation is not discussed any further.

Render each frame individually

Apple offers a class called ImageRenderer, which exports images from SwiftUI views. The first goal is to render a single frame. Once one frame can be rendered, it should be straightforward to loop through all frames of an animation:

let renderer = ImageRenderer(content: Text("Stay Awhile and Listen!"))
let filepath = ...

// renderer.scale = displayScale // alternatively, match the host screen's scale
renderer.scale = 3.0              // fixed 3x scale for crisp output on any monitor

do {
    try renderer.uiImage?.pngData()?.write(to: filepath)
    print("> Error: \(filepath.path)")
} catch {
    print("> Error: \(error.localizedDescription)")
    return
}

The excerpt above can be summarised in three steps: initialise a renderer with the SwiftUI view of interest, define a scale, and specify where to write the output. The render scale is especially interesting. A screen-recording tool captures the animation at whatever resolution the developer’s monitor and its display scaling happen to provide. This approach offers full control, rendering frames at multiples that will look crisp on any monitor.
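
If matching the host screen is preferable to a fixed multiple, the commented-out displayScale can be read from the SwiftUI environment. A minimal sketch, where the wrapping FrameRenderingView is hypothetical and only exists to host the environment value:

struct FrameRenderingView: View {
    @Environment(\.displayScale) private var displayScale

    var body: some View {
        Text("Stay Awhile and Listen!")
            .onAppear { render() }
    }

    @MainActor func render() {
        let renderer = ImageRenderer(content: Text("Stay Awhile and Listen!"))
        renderer.scale = displayScale // render at the scale of the screen the playground runs on
        // ... write renderer.uiImage to disk as in the excerpt above
    }
}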

Integrate into playground

Ideally, it’d be possible to just render the body of the prototyping view:

@MainActor func render() {
    let renderer = ImageRenderer(content: self.body)
}

Unfortunately, this traps the playground in an infinite loop. The solution is to move the prototype view into a view builder function and reuse it in both the view body and the rendering:

// ...

var body: some View {
    prototypeView()
        .onAppear {
            render()
        }
}

@ViewBuilder func prototypeView() -> some View {
    ZStack {
        // ...
    }.frame(width: 320, height: 320)
}

@MainActor func render() {
    let renderer = ImageRenderer(content: prototypeView())
    // ...
}

Success. This renders the first frame of our animation in the playground canvas, writes it to disk and concludes the execution. Note that the render function is marked with the @MainActor attribute, since ImageRenderer must be used from the main actor. See the code for step 2 here.
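
One detail the excerpts leave open is where filepath points. Any writable location works; a minimal sketch, assuming a "frames" folder inside the temporary directory (the folder name is arbitrary):

let outputDirectory = FileManager.default.temporaryDirectory
    .appendingPathComponent("frames", isDirectory: true)
try? FileManager.default.createDirectory(at: outputDirectory, withIntermediateDirectories: true)

let filepath = outputDirectory.appendingPathComponent("00000.png")
print(outputDirectory.path) // copy this path from the console to locate the rendered frames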

Full animation loop

The inability to render the view body directly, as well as the inaccessibility of SwiftUI’s intermediate animation values from state variables, means that the render function needs to drive the value progression itself. The following excerpt shows how the progress animation value was extracted into an input parameter of the view builder function. After that, it’s a matter of rendering each frame in a loop:

@MainActor func render() {
    /// Render configuration for animation
    let fps = 144.0
    let duration = 0.72
    let from = 10.0
    let to = 270.0

    for i in 0...Int(ceil(duration * fps)) {
        /// Calculate current animation step values
        let fraction: Double = (Double(i) / (duration * fps))
        let progress = from + (to - from) * fraction

        let renderer = ImageRenderer(content: prototypeView(progress))
        let filename = String(format: "%05d", i)
        /// ... render frame
    }

    print("Done.")
}
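
The corresponding change to step 2 is that the view builder now takes the progress value as a parameter instead of reading it from a @State variable. A minimal sketch, with the actual drawing elided as before:

@ViewBuilder func prototypeView(_ progress: Double) -> some View {
    ZStack {
        // ... draw the control for exactly this progress value, without implicit animation
    }.frame(width: 320, height: 320)
}

The elided render step inside the loop mirrors the first excerpt (outputDirectory is a placeholder for whatever folder the frames should land in):

renderer.scale = 3.0
try? renderer.uiImage?.pngData()?.write(to: outputDirectory.appendingPathComponent("\(filename).png"))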

With the configuration above, 105 frames (00000.png to 00104.png) are written to disk. Being able to step through each animation frame is a great help for debugging. It also enables fine-grained control by tweaking the parameters.

The keen-eyed might have noticed that the animation at the very top involves some easing, while the loop above is purely linear. A different easing function can be applied by mapping the fraction variable through something non-linear, for example:

struct Easing {
    /// Sinusoidal ease-in-out: maps a linear 0...1 fraction to an eased 0...1 value
    static func easeInOutSine(x: Double) -> Double {
        return -(cos(Double.pi * x) - 1.0) / 2.0
    }
}
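
Applied to the loop, the eased fraction simply replaces the linear one before the progress value is computed:

let fraction = Easing.easeInOutSine(x: Double(i) / (duration * fps))
let progress = from + (to - from) * fraction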

With everything in place, the resulting playground shows the animation in the playground preview and also renders each individual frame to disk according to the input parameters. With some care it’s possible for the preview and the rendered animation to behave the same. After the playground from the Gist above is executed, all 105 numbered frames on disk are ready to be consolidated into a video in the final step.

Sequential frames to video with FFmpeg

Strictly speaking, the goal of this post was to stay within the Xcode playground. That would be possible with AVFoundation (writing the frames with an AVAssetWriter), but it would bloat the playground further. An easier approach is to use FFmpeg, which offers a very simple syntax for controlling the assembly and the output video:

ffmpeg -framerate 144 -i %05d.png -c:v libx264 -crf 14 -r 144 -pix_fmt yuv420p out.mp4

It is important that the input framerate -framerate 144 matches the configuration in the render method, so that the final video duration is correct. The remaining parameters affect the output video quality. In most cases, the output framerate -r 144 should match the input. The Constant Rate Factor, a scale from 0 to 51 (here -crf 14), determines the video quality and therefore the file size. More information can be found in the documentation: Slideshow (FFmpeg Wiki).
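
For example, to trade some quality for a smaller file, the same frames can be encoded with a higher CRF and a lower output framerate, while the input -framerate stays matched to the render configuration:

ffmpeg -framerate 144 -i %05d.png -c:v libx264 -crf 23 -r 60 -pix_fmt yuv420p out_small.mp4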

Conclusion

With only a few steps, it’s now possible to render SwiftUI animations as mp4 videos. At the same time it’s easy to debug individual animation steps and to control the output quality. It would be nice to find a way to render the view body directly, and to use the intermediate values of SwiftUI animations directly, for that matter. But this approach feels like a fair compromise.