Kaiber partners with Sieve to launch Superstudio
We discuss Kaiber's launch of Superstudio and how they use Sieve's infrastructure to power their AI video workloads.
/blog-assets/authors/mokshith.jpg
by Mokshith Voodarla
Cover Image for Kaiber partners with Sieve to launch Superstudio

Kaiber recently announced the launch of Superstudio, an AI-native platform redefining how creatives interact with generative AI. Kaiber’s tools have amassed millions of users, including legendary musicians like Grimes and Linkin Park. Addressing the challenge of fragmented workflows, models, and tools, Superstudio provides a unified, intuitive interface where human imagination and machine intelligence collaborate seamlessly.

Kaiber partnered with Sieve as their AI infrastructure provider in April 2023 amid rapid growth. Sieve enabled Kaiber to run AI video workloads at scale while minimizing R&D time and cutting infrastructure spend by over 50%. Today, with the launch of Superstudio, we're excited to officially announce that partnership.

What is Superstudio?

Superstudio offers a highly curated selection of foundational models for image and video creation, including Luma Labs' Dream Machine, Black Forest Labs' Flux, and Kaiber's own image and video models. Its intuitive Canvas interface allows creators to easily combine their ideas with AI-generated content, sparking new creative possibilities. By integrating diverse tools and models into a single platform, Superstudio empowers artists and designers to push the boundaries of their craft while maintaining full control over their creative vision.

Superstudio Announcement

"Creatives are stuck in a loop of slow, ugly AI slop and disjointed workflows, paying 5-10 subscriptions to make one asset. With Superstudio, we've created a home base for the new forms of creativity emerging as humans collaborate with machines. Our focus has always been putting human creativity first, and Superstudio empowers artists to seamlessly integrate AI into their process, amplifying their taste without sacrificing originality."
Victor Wang, CEO and Co-founder, Kaiber

Partnering with Sieve

Kaiber’s team used to spend significant time setting up and managing GPU workers to support their growing video generation needs. Along the way, they hit a few key pain points that they wanted to solve with Sieve.

  • They wanted to ensure high GPU utilization. Unlike text generation, video workloads are notorious for low GPU utilization due to highly intertwined CPU operations.
  • They did not want to manage their own compute resources. They wanted a system that seamlessly autoscaled and handled their bursty GPU workloads.
  • They wanted to save R&D time, so they could continue iterating quickly in a fast-growing space.

Building custom video AI pipelines

Sieve’s platform equips developers with a combination of infrastructure primitives and pre-built pipelines that make it easy to design and run all sorts of AI video workloads at scale.

Kaiber originally started using Sieve to run a heavily modified version of Deforum Stable Diffusion through private function deployments with GPU acceleration. Every function on Sieve comes with an auto-generated REST API, so they could call it directly from their web application.

Kaiber Generate

As Kaiber’s workloads started leveraging many models at once, they needed to be split across multiple workers. Because Sieve makes it easy to reference functions from one another, Kaiber was able to deploy full-fledged pipelines with model components that autoscaled individually.
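A toy sketch of that composition, under the same hypothetical in-process framing as above: each stage is its own function, and a pipeline function calls the others. On a platform like Sieve, each call below would go to a separately deployed, separately autoscaled component; the stage names and logic here are invented for illustration.

```python
# Illustrative only: splitting a video pipeline into separate functions
# so each stage can scale independently. Stage names are hypothetical.

def upscale(frame: str) -> str:
    # Real version: a super-resolution model on its own GPU workers.
    return frame + "@4k"

def interpolate(frames: list[str]) -> list[str]:
    # Real version: a frame-interpolation model on its own workers.
    out = []
    for a, b in zip(frames, frames[1:]):
        out += [a, f"mid({a},{b})"]
    return out + frames[-1:]

def pipeline(frames: list[str]) -> list[str]:
    # On Sieve, these would be calls to other deployed functions, so a
    # burst of interpolation work scales those workers without also
    # scaling the upscaling fleet.
    return [upscale(f) for f in interpolate(frames)]

print(pipeline(["f0", "f1"]))
```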

"Kaiber uses Sieve for our production AI video gen workloads. Our end-to-end shipping speed would be notably slower without Sieve, as they handle so many common, yet complex requirements out of the box. In addition, their team is exceptionally communicative and have always worked quickly to address our feedback."
Eric Gao, CTO and Co-founder, Kaiber

Maximizing efficiency with GPU sharing

As noted above, video workloads are notorious for low GPU utilization because GPU inference is tightly interleaved with CPU work. Kaiber felt this acutely given how libraries like ComfyUI tend to operate: lots of file I/O and ffmpeg processing, during which the GPU sits idle.

Sieve's GPU sharing feature made these workloads far more efficient. It lets multiple containers share a single GPU, so Kaiber could serve more custom ComfyUI workflows with fewer GPUs.
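The utilization argument can be made concrete with a toy, platform-agnostic simulation (this is not Sieve code): each job has CPU-bound phases (file I/O, ffmpeg) around a GPU phase, modeled here as sleeps with the GPU as a lock. With one container per GPU the GPU idles during CPU phases; with several containers sharing it, one job's CPU work overlaps another's GPU work.

```python
# Toy simulation of GPU sharing. CPU phases (I/O, ffmpeg) and the GPU
# phase are modeled as sleeps; the lock models one physical GPU.
import threading
import time
from concurrent.futures import ThreadPoolExecutor

GPU = threading.Lock()  # one physical GPU

def job():
    time.sleep(0.02)      # CPU-bound prep: download, decode with ffmpeg
    with GPU:
        time.sleep(0.02)  # GPU inference (exclusive)
    time.sleep(0.02)      # CPU-bound post: encode, upload

def run(workers: int, n_jobs: int = 8) -> float:
    """Run n_jobs with `workers` concurrent containers; return wall time."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for _ in range(n_jobs):
            pool.submit(job)
    return time.perf_counter() - start

serial = run(1)  # one container per GPU: GPU idles during CPU phases
shared = run(3)  # three containers share the GPU: phases overlap
print(f"serial={serial:.2f}s shared={shared:.2f}s")
```

With one worker, each job's three phases run back to back; with three workers sharing the lock, throughput approaches the GPU phase alone, which is the effect GPU sharing exploits for I/O-heavy video pipelines.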

Reducing R&D time through pre-built pipelines

As Kaiber evolves into an all-in-one creative studio, they’re constantly on the lookout for new models and pipelines to bring into their product experience.

With Sieve’s focus on video, Kaiber is starting to leverage Sieve’s optimized pipelines like SAM 2 (running 2x faster), SieveSync (high-quality, zero-shot video lipsync), and content moderation, all without having to engineer and manage every component themselves.

What's ahead

What started as a search for better GPU infrastructure became a partnership that allows Kaiber to consolidate spend, reduce R&D time, and avoid infrastructure headaches. Sieve looks forward to continuing to support Kaiber as their AI infrastructure needs grow!