[{"data":1,"prerenderedAt":521},["ShallowReactive",2],{"docs-/docs/models-overview":3},{"id":4,"title":5,"body":6,"description":513,"extension":514,"meta":515,"navigation":516,"path":517,"seo":518,"stem":519,"__hash__":520},"content/en/docs/models-overview.md","Models Overview - Seedance 2.0 API",{"type":7,"value":8,"toc":501},"minimark",[9,14,40,45,113,116,126,143,147,150,155,161,165,225,237,241,244,320,334,342,353,431,435,442,456,459,463,497],[10,11,13],"h1",{"id":12},"models-overview","Models Overview",[15,16,17,18,22,23,27,28,31,32,35,36,39],"p",{},"The Seedance 2.0 API is not a single model — it's a matrix of ",[19,20,21],"strong",{},"6 models",". You must specify exactly one of them in the ",[24,25,26],"code",{},"model"," field of every request. There is ",[19,29,30],{},"no \"automatic mode detection\"","; sending the wrong model ID will return ",[24,33,34],{},"model_access_denied"," or ",[24,37,38],{},"invalid_request",".",[41,42,44],"h2",{"id":43},"the-model-matrix","The Model Matrix",[46,47,48,64],"table",{},[49,50,51],"thead",{},[52,53,54,58,61],"tr",{},[55,56,57],"th",{},"Input type",[55,59,60],{},"Standard (best quality)",[55,62,63],{},"Fast (quicker / cheaper)",[65,66,67,83,98],"tbody",{},[52,68,69,73,78],{},[70,71,72],"td",{},"Text only",[70,74,75],{},[24,76,77],{},"seedance-2.0-text-to-video",[70,79,80],{},[24,81,82],{},"seedance-2.0-fast-text-to-video",[52,84,85,88,93],{},[70,86,87],{},"1–2 images",[70,89,90],{},[24,91,92],{},"seedance-2.0-image-to-video",[70,94,95],{},[24,96,97],{},"seedance-2.0-fast-image-to-video",[52,99,100,103,108],{},[70,101,102],{},"Images + videos + audio",[70,104,105],{},[24,106,107],{},"seedance-2.0-reference-to-video",[70,109,110],{},[24,111,112],{},"seedance-2.0-fast-reference-to-video",[15,114,115],{},"All six models share the same endpoint:",[117,118,123],"pre",{"className":119,"code":121,"language":122},[120],"language-text","POST 
https://api.evolink.ai/v1/videos/generations\n","text",[24,124,121],{"__ignoreMap":125},"",[15,127,128,129,131,132,135,136,135,139,142],{},"The only differences are the ",[24,130,26],{}," field value and which input arrays (",[24,133,134],{},"image_urls"," / ",[24,137,138],{},"video_urls",[24,140,141],{},"audio_urls",") each model accepts.",[41,144,146],{"id":145},"how-to-choose-a-model","How to Choose a Model",[15,148,149],{},"Two steps: pick the mode based on your inputs, then pick Standard vs Fast.",[151,152,154],"h3",{"id":153},"step-1-pick-the-mode-by-input","Step 1: Pick the mode by input",[117,156,159],{"className":157,"code":158,"language":122},[120],"Text prompt only\n    → text-to-video  (optional web_search parameter for up-to-date content)\n\n1 reference image  (used as first frame)\n2 reference images (used as first + last frames)\n    → image-to-video\n\nMultiple images, reference videos, or audio tracks  (up to 9 + 3 + 3)\n    → reference-to-video\n",[24,160,158],{"__ignoreMap":125},[151,162,164],{"id":163},"step-2-standard-or-fast","Step 2: Standard or Fast",[46,166,167,180],{},[49,168,169],{},[52,170,171,174,177],{},[55,172,173],{},"Dimension",[55,175,176],{},"Standard",[55,178,179],{},"Fast",[65,181,182,193,204,214],{},[52,183,184,187,190],{},[70,185,186],{},"Visual quality stability",[70,188,189],{},"Higher",[70,191,192],{},"Acceptable",[52,194,195,198,201],{},[70,196,197],{},"Generation speed",[70,199,200],{},"Baseline",[70,202,203],{},"Faster",[52,205,206,209,211],{},[70,207,208],{},"Per-second cost",[70,210,200],{},[70,212,213],{},"Lower",[52,215,216,219,222],{},[70,217,218],{},"Recommended for",[70,220,221],{},"Final deliverables, ads, hero product videos",[70,223,224],{},"Previews, A/B testing, bulk short-form production",[226,227,228],"blockquote",{},[15,229,230,233,234,236],{},[19,231,232],{},"Tip:"," The same prompt can usually be switched between Standard and Fast with zero code changes — the parameter structure is identical, only the 
",[24,235,26],{}," field differs. Use Fast during prompt iteration, then swap to Standard for final delivery.",[41,238,240],{"id":239},"capabilities-shared-by-all-models","Capabilities Shared by All Models",[15,242,243],{},"Regardless of which model you pick, Seedance 2.0 supports:",[245,246,247,261,288,304,310],"ul",{},[248,249,250,253,254,257,258],"li",{},[19,251,252],{},"Synchronized audio generation"," — ",[24,255,256],{},"generate_audio: true"," (default). Put dialogue inside straight double quotes to optimize speech synthesis, e.g. ",[24,259,260],{},"The man said: \"Remember, never point at the moon.\"",[248,262,263,253,266,269,270,269,273,269,276,269,279,269,282,269,285],{},[19,264,265],{},"Multiple aspect ratios",[24,267,268],{},"16:9",", ",[24,271,272],{},"9:16",[24,274,275],{},"1:1",[24,277,278],{},"4:3",[24,280,281],{},"3:4",[24,283,284],{},"21:9",[24,286,287],{},"adaptive",[248,289,290,253,293,35,296,299,300,303],{},[19,291,292],{},"Two quality tiers",[24,294,295],{},"480p",[24,297,298],{},"720p"," (",[19,301,302],{},"1080p is not supported",")",[248,305,306,309],{},[19,307,308],{},"4–15 second durations"," — any integer, default 5 seconds",[248,311,312,315,316,319],{},[19,313,314],{},"Async task lifecycle"," — every request returns a task ID immediately; retrieve the result by polling or via ",[24,317,318],{},"callback_url"," webhook",[15,321,322,323,299,326,329,330,333],{},"Billing uses ",[19,324,325],{},"per-second pricing",[24,327,328],{},"billing_rule: \"per_second\"",") — longer ",[24,331,332],{},"duration"," values cost more for the same clip.",[41,335,337,338,341],{"id":336},"exclusive-capability-model_paramsweb_search-text-to-video-only","Exclusive Capability: ",[24,339,340],{},"model_params.web_search"," (text-to-video only)",[15,343,344,346,347,349,350,352],{},[24,345,77],{}," and ",[24,348,82],{}," support an additional ",[24,351,340],{}," parameter. 
When enabled, the model autonomously decides whether to search the internet for up-to-date information (and is only billed when a search is actually triggered):",[117,354,358],{"className":355,"code":356,"language":357,"meta":125,"style":125},"language-json shiki shiki-themes github-dark","{\n  \"model\": \"seedance-2.0-text-to-video\",\n  \"prompt\": \"An ad for the latest 2026 spring-edition electric sports car\",\n  \"model_params\": {\n    \"web_search\": true\n  }\n}\n","json",[24,359,360,369,386,399,408,419,425],{"__ignoreMap":125},[361,362,365],"span",{"class":363,"line":364},"line",1,[361,366,368],{"class":367},"s95oV","{\n",[361,370,372,376,379,383],{"class":363,"line":371},2,[361,373,375],{"class":374},"sDLfK","  \"model\"",[361,377,378],{"class":367},": ",[361,380,382],{"class":381},"sU2Wk","\"seedance-2.0-text-to-video\"",[361,384,385],{"class":367},",\n",[361,387,389,392,394,397],{"class":363,"line":388},3,[361,390,391],{"class":374},"  \"prompt\"",[361,393,378],{"class":367},[361,395,396],{"class":381},"\"An ad for the latest 2026 spring-edition electric sports car\"",[361,398,385],{"class":367},[361,400,402,405],{"class":363,"line":401},4,[361,403,404],{"class":374},"  \"model_params\"",[361,406,407],{"class":367},": {\n",[361,409,411,414,416],{"class":363,"line":410},5,[361,412,413],{"class":374},"    \"web_search\"",[361,415,378],{"class":367},[361,417,418],{"class":374},"true\n",[361,420,422],{"class":363,"line":421},6,[361,423,424],{"class":367},"  }\n",[361,426,428],{"class":363,"line":427},7,[361,429,430],{"class":367},"}\n",[41,432,434],{"id":433},"prompt-length-limit","Prompt Length Limit",[15,436,437,438,441],{},"All models enforce the same ",[24,439,440],{},"prompt"," limit:",[245,443,444,450],{},[248,445,446,447],{},"Chinese: ",[19,448,449],{},"≤ 500 characters",[248,451,452,453],{},"English: ",[19,454,455],{},"≤ 1000 words",[15,457,458],{},"Going over is rejected outright. 
Longer prompts do not produce better results — focus on the subject, action, and cinematography.",[41,460,462],{"id":461},"next-steps","Next Steps",[245,464,465,475,483,490],{},[248,466,467,472,473],{},[468,469,471],"a",{"href":470},"/docs/text-to-video","Text-to-Video API"," — Full reference for ",[24,474,77],{},[248,476,477,472,481],{},[468,478,480],{"href":479},"/docs/image-to-video","Image-to-Video API",[24,482,92],{},[248,484,485,489],{},[468,486,488],{"href":487},"/docs/reference-to-video","Reference-to-Video API"," — Multimodal composition, the most powerful mode",[248,491,492,496],{},[468,493,495],{"href":494},"/docs/fast-models","Fast Models"," — Speed/cost profile of the three Fast variants",[498,499,500],"style",{},"html pre.shiki code .s95oV, html code.shiki .s95oV{--shiki-default:#E1E4E8}html pre.shiki code .sDLfK, html code.shiki .sDLfK{--shiki-default:#79B8FF}html pre.shiki code .sU2Wk, html code.shiki .sU2Wk{--shiki-default:#9ECBFF}html .default .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}html .shiki span {color: var(--shiki-default);background: var(--shiki-default-bg);font-style: var(--shiki-default-font-style);font-weight: var(--shiki-default-font-weight);text-decoration: var(--shiki-default-text-decoration);}",{"title":125,"searchDepth":371,"depth":371,"links":502},[503,504,508,509,511,512],{"id":43,"depth":371,"text":44},{"id":145,"depth":371,"text":146,"children":505},[506,507],{"id":153,"depth":388,"text":154},{"id":163,"depth":388,"text":164},{"id":239,"depth":371,"text":240},{"id":336,"depth":371,"text":510},"Exclusive Capability: model_params.web_search (text-to-video only)",{"id":433,"depth":371,"text":434},{"id":461,"depth":371,"text":462},"Complete comparison of all 6 Seedance 2.0 models: text-to-video, image-to-video, reference-to-video — each in Standard and 
Fast variants, with a decision guide.","md",{},true,"/en/docs/models-overview",{"title":5,"description":513},"en/docs/models-overview","XLr8TNBoQIOh-yljXqufgpO7OEWjoEWLErjJ8HdgGpg",1776086322150]