Seedance 2.0 API Live on fal (April 2026) | Video Generation API


Seedance 2.0 API: Unlock the Next Generation of AI Video

ByteDance's most advanced video generation model, launched April 9, 2026. Cinematic output with native audio, real-world physics, and director-level camera control. Accepts text, image, audio, and video inputs.


Available Endpoints

Start building with the Seedance 2.0 API

Multiple endpoints for text-to-video, image-to-video, and reference-to-video generation, including optimized fast variants.


API Documentation

How to get access to Seedance 2.0 API

The fal client handles the queue submit protocol for you: it submits the request, tracks status updates, and returns the result once generation completes.

import { fal } from "@fal-ai/client";

const result = await fal.subscribe("bytedance/seedance-2.0/text-to-video", {
  input: {
    prompt: "An octopus throws a football in the ocean",
    duration: "5",
    resolution: "720p",
  },
  logs: true,
  onQueueUpdate: (update) => {
    if (update.status === "IN_PROGRESS") {
      update.logs.map((log) => log.message).forEach(console.log);
    }
  },
});

console.log(result.data);
console.log(result.requestId);
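If you prefer not to hold a connection open while waiting, the same flow can be driven with an explicit status-polling loop. The helper below is an illustrative sketch of that pattern, not part of the fal SDK; `fal.subscribe()` already implements this loop internally, and the `COMPLETED` status string follows fal's queue convention.

```javascript
// Illustrative polling helper: repeatedly calls an async status check
// until it reports COMPLETED, then resolves with the final status.
// fal.subscribe() already does this for you; this only shows the pattern.
async function pollUntilComplete(getStatus, { intervalMs = 1000, maxAttempts = 120 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const status = await getStatus();
    if (status.status === "COMPLETED") return status;
    // Wait before the next status check.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("Timed out waiting for generation to complete");
}
```

With the real client, `getStatus` would wrap a queue status call for the request ID returned by an earlier submit.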

What Makes The Seedance 2.0 API Different

Advanced Cinematography

Director-Level Camera Control API

The model handles complex camera work that other models struggle with. Dolly zooms, rack focuses, tracking shots, POV switches, and smooth handheld movement all work as expected. You describe the shot, and the camera executes it.

Real-World Physics

Realistic Motion & Action Generation

Fight scenes, vehicle chases, explosions, falling debris. Seedance 2.0 understands how objects interact under force. Collisions have weight, fabric tears realistically, and characters move with physical believability even in high-action sequences.

Audio-Video Joint Generation

Built-in Cinematic Audio Generation

Seedance 2.0 generates audio natively alongside video. Music carries deep bass and cinematic warmth. Dialogue is clear with precise lip-sync. Sound effects land exactly on cue. No post-production audio layering needed.


Unified API Platform

Two API Tiers for Different Use Cases

fal is the official Seedance 2.0 API, supporting two tiers so you can pick the right balance of quality, speed, and cost for your workflow.

Feature | Seedance 2.0 (Standard) | Seedance 2.0 Fast
Primary Use Case | Final production renders, cinematic output | Rapid prototyping, high-volume workloads
Best For | Enterprises, big-budget production teams, major film studios | Solopreneurs, budget-conscious developers, indie film studios, students
Default Resolution | Up to 720p (HD) | Up to 720p (fast upscaling from 480p)
Wins On | Best output, sharp quality | Rapid iteration with low latency
Director Control | High director control: shots, camera, lighting, and mood in a single pass | Less responsive to director controls; slow motion, multi-shot sequences, and dolly shots may take multiple attempts
Cost | Higher cost | Cost-optimized
Audio Generation | Synced audio generation at no extra price | Synced audio generation at no extra price
API Endpoints | bytedance/seedance-2.0/text-to-video, bytedance/seedance-2.0/image-to-video, bytedance/seedance-2.0/reference-to-video | bytedance/seedance-2.0/fast/text-to-video, bytedance/seedance-2.0/fast/image-to-video, bytedance/seedance-2.0/fast/reference-to-video

Use Cases

Industries Using the Seedance 2.0 API

From film studios to fashion brands, teams across industries are building with Seedance 2.0 on fal.

Realistic Textures & Feel

High Quality Renders For Architecture & Design

A world-class luxury look and feel, rendered within minutes. Seedance 2.0 brings photorealistic textures, lighting, and spatial depth to architectural visualization and interior design workflows.

Pre-Vis and Concept at Production Speed

Create Studio Worthy Output For Film & Entertainment

Studios and independent filmmakers can generate storyboard-quality pre-visualization content directly from a script or shot list. Camera moves, lighting moods, and action sequences can be previewed before a single frame is shot, cutting pre-production timelines significantly.

Enterprise Advertising & Brand Creative

Expand On AI Assisted Ecommerce Campaign Ads

Brands can generate polished, on-brief video assets from a single prompt. Product showcases, lifestyle sequences, and cinematic brand spots are produced at the speed of a prompt, not a shoot.

Character Consistency & Real Game Engine Physics

Scale Video Game Development & Virtual Production

Game studios can use Seedance 2.0 to generate high-fidelity cinematic sequences, environmental previews, and in-engine concept footage, all without a dedicated animation pipeline.

Runway-Ready Visuals, Without the Runway

Fashion Campaign Assets & Virtual Try-On Videos On Demand

Generate editorial-quality video content without booking a studio, a crew, or a location. Seedance 2.0 handles fabric movement, lighting, and texture with cinematic precision, ideal for lookbooks, campaign content, seasonal drops, and digital runway presentations.

Authentic-Feeling Content at Professional Quality

Handheld UGC & AI Creator Content

Seedance 2.0 can replicate the handheld, lo-fi aesthetic of user-generated content while maintaining full creative control. Generate UGC-style video that feels organic and platform-native, ideal for TikTok, Instagram Reels, and YouTube Shorts.


Examples

Seedance 2.0 API Examples

Turn on audio to hear the native sound generation. Every example below was generated in a single pass with no post-production.

High-action chase with dynamic tracking

"Camera follows a man in black sprinting through a crowded street, a group chasing close behind. The shot cuts to a side tracking angle as he panics and crashes into a roadside fruit stall, scrambles to his feet, and keeps running. Sounds of a frantic crowd"

Martial arts choreography in nature

"A spear-wielding warrior clashes with a dual-blade fighter in a maple leaf forest. Autumn leaves scatter on each impact. Wide shot pulls into tight close-ups of parrying blades, then cuts to a slow-motion overhead as both leap into the air"

Long-take spy thriller with continuous camera

"Spy thriller style. Front-tracking shot of a female agent in a red trench coat walking forward through a busy street, pedestrians constantly crossing in front of her. She rounds a corner and disappears. A masked girl lurks at the corner, glaring after her. Camera pans forward as the agent walks into a mansion and vanishes. Single continuous take, no cuts"

Multi-shot creative commercial

"15s commercial. Shot 1: side angle, a donkey rides a motorcycle bursting through a barn fence, chickens scatter. Shot 2: close-up of spinning tires on sand, then aerial shot of the donkey doing donuts, dust clouds rising. Shot 3: snow mountain backdrop, the donkey launches off a hillside, text 'Inspire Creativity, Enrich Life' revealed behind it as dust settles"

FAQ

Seedance 2.0 API FAQ

How does the Seedance 2.0 API work?

Send a POST request to any Seedance 2.0 endpoint with your prompt and parameters. The fal serverless infrastructure handles GPU allocation, inference, and scaling automatically. You get back a URL to the generated video. Use the Python or JavaScript SDK for the simplest integration, or call the REST API directly.
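As a sketch of what that raw POST looks like, the helper below assembles the request without sending it. The `queue.fal.run` URL shape and `Key` authorization header follow fal's general queue API convention; treat the exact pairing with Seedance 2.0 endpoints as an assumption and verify against the endpoint documentation.

```javascript
// Hypothetical helper that assembles the raw HTTP request for fal's
// queue REST API. URL and header shapes follow fal's general queue
// convention; verify against the endpoint's own documentation.
function buildQueueRequest(endpoint, input, apiKey) {
  return {
    url: `https://queue.fal.run/${endpoint}`,
    options: {
      method: "POST",
      headers: {
        Authorization: `Key ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify(input),
    },
  };
}
```

Usage would be something like `const { url, options } = buildQueueRequest("bytedance/seedance-2.0/text-to-video", { prompt: "...", duration: "5" }, process.env.FAL_KEY)` followed by `await fetch(url, options)`.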

What input types & formats does Seedance 2.0 API support?

Input: text prompts, images (JPEG, PNG, WebP), video files (MP4, MOV), and audio files (WAV, MP3). Output: MP4 video with synchronized audio. Resolutions: 480p and 720p. Durations: 4 to 15 seconds. Aspect ratios: 21:9, 16:9, 4:3, 1:1, 3:4, and 9:16.
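The documented constraints above can be checked client-side before submitting, which saves a round trip on obviously invalid requests. This is a hypothetical pre-flight validator, not part of the fal SDK; the API enforces its own limits server-side.

```javascript
// Hypothetical pre-flight validator for the documented Seedance 2.0
// constraints (resolutions, durations, aspect ratios). Illustrative
// only; the API performs authoritative validation server-side.
const RESOLUTIONS = ["480p", "720p"];
const ASPECT_RATIOS = ["21:9", "16:9", "4:3", "1:1", "3:4", "9:16"];

function validateInput({ resolution, duration, aspectRatio }) {
  const errors = [];
  if (!RESOLUTIONS.includes(resolution)) {
    errors.push(`unsupported resolution: ${resolution}`);
  }
  const seconds = Number(duration);
  if (!(seconds >= 4 && seconds <= 15)) {
    errors.push(`duration must be 4-15 seconds, got ${duration}`);
  }
  if (aspectRatio && !ASPECT_RATIOS.includes(aspectRatio)) {
    errors.push(`unsupported aspect ratio: ${aspectRatio}`);
  }
  return errors; // empty array means the input passed all checks
}
```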

How fast is the Seedance 2.0 API?

Generations complete in under 2 minutes. Fast-tier endpoints offer lower latency and cost for production workloads. Standard-tier endpoints prioritize maximum quality. Both tiers run on fal's serverless infrastructure with automatic scaling and no cold starts for sustained traffic.

How do I get access to the Seedance 2.0 API?

Seedance 2.0 is available now on fal as of April 9th, 2026. You can start generating videos immediately in the playground with no setup required. If you are a developer looking to integrate via API, sign up and grab an API key from your dashboard, then use the Python or JavaScript SDK, or call the REST API directly.

What countries can access the Seedance 2.0 API?

The Seedance 2.0 API is available globally through fal's infrastructure. Developers and enterprises in any country can access and integrate the API into their applications.

What is Seedance 2.0?

Seedance 2.0 is ByteDance's latest video generation model. It uses a unified multimodal audio-video architecture that accepts text, image, audio, and video inputs. It generates cinematic video with native audio, multi-shot cuts, and realistic physics in a single generation.

What input types does Seedance 2.0 support?

Seedance 2.0 accepts text prompts, reference images, audio clips, and video inputs. You can combine these to control the output. For example, provide a reference image for visual style, an audio clip for the soundtrack, and a text prompt for the scene description.

How long can generated videos be?

Seedance 2.0 generates videos up to 15 seconds in a single generation. Within that duration, the model can produce multiple shots with natural cuts and transitions, so a single output can feel like an edited sequence rather than a single continuous clip.

How good is the audio quality?

The audio quality is a standout feature. Music has deep bass and cinematic presence. Dialogue is clear with accurate lip-sync. Sound effects are contextually appropriate and well-timed. The model generates audio natively alongside the video, so everything stays in sync without post-production.


Getting Started

Seedance 2.0 API Integration Steps

Get up and running in minutes. No GPUs to manage, no infrastructure to set up.

  1. Install the client

    Pick your package manager. The JavaScript client installs via npm; for Python, use pip.

    npm install --save @fal-ai/client
  2. Create an account on fal

    Sign up to get access to the dashboard and your API keys.

  3. Get your API key

    Locate your API credentials in the developer dashboard. Set FAL_KEY as an environment variable in your runtime.

  4. Submit a request

    Use fal.subscribe() to submit your request with a prompt and parameters. The client handles the async queue automatically, providing progress updates via onQueueUpdate, and returns the final video URL when generation is complete.
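Production integrations typically add retry handling around the submit call to absorb transient network failures. A minimal sketch with exponential backoff follows; this helper is illustrative and not part of the fal client.

```javascript
// Illustrative retry wrapper with exponential backoff. Wrap any async
// call (e.g. () => fal.subscribe(...)) to retry transient failures.
async function withRetries(fn, { retries = 3, backoffMs = 500 } = {}) {
  let lastError;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < retries) {
        // Exponential backoff: 500ms, 1s, 2s, ...
        await new Promise((resolve) => setTimeout(resolve, backoffMs * 2 ** attempt));
      }
    }
  }
  throw lastError; // all attempts failed
}
```

In practice you may want to retry only on specific error codes (timeouts, rate limits) rather than every failure.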

Try it now

No setup required

Start generating Seedance 2.0 videos instantly in the playground. No API key needed, just describe your scene and hit generate.

Open Playground →

For developers

Integrate via API

Grab an API key from your dashboard and integrate Seedance 2.0 into your app with a few lines of code. Python and JavaScript SDKs available, plus a REST API for any language.

Get API Key →

Learn more

Read the full guide

Prompting techniques, audio-video generation, image-to-video, reference-to-video, multi-shot workflows, and pricing explained.

Read the Guide →

Get in Touch About Seedance 2.0

Want to learn more about integrating Seedance 2.0 into your workflow? Leave your details and our team will reach out.

Contact Sales