
Templates define the visual and structural blueprint for a video — the number of scenes, their types, transitions, and styling. When creating a project with customTemplate, you have precise control over how strictly the AI follows that blueprint, what media goes into each scene, and how fonts, subtitles, and text behave.

Getting a template ID

Browse available templates with:

GET /auth/templates
GET /auth/organization/{organizationId}/workspace/{workspaceId}/templates

Use GET /auth/templates/{templateId} to inspect a template's scene structure before referencing it in a project.

Minimal usage

At a minimum, provide the template id:

"customTemplate": {
  "id": "6a3f1b2c9e8d7f4a0b5c2e1d"
}

All other customTemplate fields are optional overrides.


Flexibility: how closely to follow the template

The flexibilityOverride field controls how much freedom the AI has to deviate from the template's scene structure when generating the video.

  • rigid: strict; scene count, types, and order are fixed. Best for repeatable, predictable outputs such as product demos and compliance videos.
  • minor: small adjustments allowed; the AI may reorder or lightly adapt scenes. Best for general use, maintaining brand structure while allowing natural narrative flow.
  • flexible: the AI can freely add, remove, or reorder scenes to best fit the content. Best for creative, story-driven videos where content quality matters more than structure.

"customTemplate": {
  "id": "6a3f1b2c9e8d7f4a0b5c2e1d",
  "flexibilityOverride": "rigid"
}

Priority: flexibilityOverride in the request → template's own flexibility setting → system default (minor).


Custom AI instructions (description)

The description field is injected into the AI's system prompt during script generation. Use it to provide context, tone guidance, brand voice rules, or anything else that should shape the script.

"customTemplate": {
  "id": "6a3f1b2c9e8d7f4a0b5c2e1d",
  "description": "Write in a confident, concise tone. Always mention the 30-day free trial. Avoid technical jargon."
}

To fall back to the template's own built-in description instead of providing one, set toUseTemplateDescription: true. If you provide description, it always wins regardless of toUseTemplateDescription.

Priority chain:

  1. description from the request (if non-empty) — always used
  2. toUseTemplateDescription: true in the request → use the template's description
  3. toUseTemplateDescription: false in the request → use no description
  4. Not provided → inherits the template's own useDescriptionAsSystemPrompt setting

Providing your own media (assets)

The assets array lets you supply external media URLs that Shuffll will analyse and intelligently map to the most appropriate scenes. This is useful when you have brand-specific footage or images that should appear in the video but you don't want to manually assign them to individual scenes.

"customTemplate": {
  "id": "6a3f1b2c9e8d7f4a0b5c2e1d",
  "assets": [
    {
      "url": "https://example.com/product-demo.mp4",
      "type": "video",
      "description": "Screen recording of the product dashboard"
    },
    {
      "url": "https://example.com/team-photo.jpg",
      "type": "image",
      "description": "Company team at the annual offsite"
    }
  ]
}

What happens internally:

  1. Each URL is validated (must be an image or video).
  2. Shuffll downloads each file and runs AI analysis — generating a description, detecting direction/composition, and identifying category.
  3. Files are uploaded to your workspace under an "Api Assets" folder.
  4. During scene assembly, the AI maps each asset to the most contextually relevant scene based on the asset's content and the scene's script.
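Because step 1 rejects anything that is not an image or video, it can save a round trip to pre-check asset entries client-side. A minimal sketch, assuming the server accepts absolute http(s) URLs and the two documented type values; the real validation is deeper (the file is actually downloaded and analysed).

```python
from urllib.parse import urlparse

ALLOWED_TYPES = {"image", "video"}  # assumption: mirrors the documented type values


def validate_asset(asset: dict) -> bool:
    """Client-side pre-check for one entry of the assets array:
    url must be an absolute http(s) URL, and type, if given, must be
    image or video. The server performs its own deeper validation."""
    parsed = urlparse(asset.get("url", ""))
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        return False
    asset_type = asset.get("type")
    return asset_type is None or asset_type in ALLOWED_TYPES
```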

Fields per asset:

  • url (required): publicly accessible URL to an image or video file.
  • type (optional): image or video; helps with categorisation.
  • description (optional): human-readable description used as a hint during scene mapping.
  • direction (optional): subject direction hint (e.g. left, right, center).
  • textHeaderPosition (optional): preferred text overlay position if the scene includes a text header.

Overriding the media fallback pipeline

By default, each template has its own configured media source order. You can override this per-request using videoSourcesByOrder and imageSourcesByOrder inside customTemplate.

"customTemplate": {
  "id": "6a3f1b2c9e8d7f4a0b5c2e1d",
  "videoSourcesByOrder": ["workspace_assets", "stock_footage"],
  "imageSourcesByOrder": ["workspace_assets", "generate_from_text"]
}

This overrides the template's configured pipeline for this request only. Omitting a source removes it entirely from the fallback chain.
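The selection logic reduces to "first non-empty list wins", with the per-request order fully replacing (not merging with) the template's pipeline. A minimal sketch; the function name and the notion of a system default order are assumptions for illustration.

```python
def resolve_pipeline(request_order, template_order, default_order):
    """Pick the media source fallback chain for one request.
    A per-request order (videoSourcesByOrder / imageSourcesByOrder) fully
    replaces the template's pipeline; sources omitted from it are dropped,
    not appended from the template."""
    if request_order is not None:
        return list(request_order)
    if template_order is not None:
        return list(template_order)
    return list(default_order)
```

For example, passing `["workspace_assets", "stock_footage"]` when the template is configured with a three-source chain means the third source is never consulted for this request.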

Difference from top-level pipeline fields:

  • videoSourcesPipeline (top-level): applied when no template is used.
  • customTemplate.videoSourcesByOrder: overrides the template's configured pipeline.

Font overrides

Control the typography used across all scenes. If fontsSettings is provided, both primary and secondary must include family and weight.

"customTemplate": {
  "id": "6a3f1b2c9e8d7f4a0b5c2e1d",
  "fontsSettings": {
    "primary": { "family": "Inter", "weight": "700" },
    "secondary": { "family": "Inter", "weight": "400" },
    "remaining": { "family": "Inter", "weight": "400" }
  }
}

To use the template's own configured font instead of defining one:

"toUseTemplateFont": true

Priority chain:

  1. fontsSettings in the request (if valid) — always used
  2. toUseTemplateFont: true → use the template's font
  3. toUseTemplateFont: false → use the system default font
  4. Not provided → inherits the template's own toUseTemplateFont setting
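The "must include family and weight" constraint from above can be checked before sending the request. A hypothetical pre-flight helper, not part of the API:

```python
def valid_fonts_settings(fonts_settings: dict) -> bool:
    """Check the documented constraint on fontsSettings: primary and
    secondary must each be present with non-empty family and weight.
    (remaining is not listed as required, so it is not checked here.)"""
    required = ("primary", "secondary")
    return all(
        isinstance(fonts_settings.get(key), dict)
        and fonts_settings[key].get("family")
        and fonts_settings[key].get("weight")
        for key in required
    )
```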

Per-scene overrides

Individual scenes can be overridden using the scenes array. Each entry is matched to a template scene by id.

"customTemplate": {
  "id": "6a3f1b2c9e8d7f4a0b5c2e1d",
  "scenes": [
    {
      "id": "scene_001",
      "staticText": ["Opening line set in stone regardless of AI output"],
      "toUseStaticText": true,
      "wordBoost": ["Shuffll", "GPT-4o"],
      "customSpelling": [
        { "from": ["shufle", "shuffle"], "to": "Shuffll" }
      ],
      "videosToAdd": [
        {
          "url": "https://cdn.shuffll.com/.../clip.mp4",
          "isMaster": true,
          "stagePositionNumber": 1
        }
      ]
    }
  ]
}

staticText and toUseStaticText

  • staticText — an array of text strings to inject into the scene's text fields.
  • toUseStaticText: true — forces these values to be used as-is, bypassing AI text generation for this scene.

Use this when specific copy must appear verbatim regardless of what the AI writes — e.g. legal disclaimers, fixed taglines, or CTA text.

wordBoost and customSpelling

Applied during subtitle generation (the transcription of voiceover audio):

  • wordBoost — signals to the speech-to-text engine that these words are likely to appear, improving recognition accuracy.
  • customSpelling — corrects known mistranscriptions. Each entry maps one or more misspelled variants (from) to the correct form (to).
"customSpelling": [
  { "from": ["shufle", "shuffel", "shuffle l"], "to": "Shuffll" },
  { "from": ["jay pee tee"], "to": "GPT" }
]
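Conceptually, each customSpelling entry behaves like a find-and-replace over the transcript. A minimal sketch of that behaviour; whole-word, case-insensitive matching is an assumption of this sketch, and the actual speech-to-text engine's matching rules may differ.

```python
import re


def apply_custom_spelling(text: str, rules: list[dict]) -> str:
    """Apply customSpelling corrections to a transcript string.
    Each rule maps misspelled variants ("from") to the correct form ("to").
    Matching here is whole-word and case-insensitive (an assumption)."""
    for rule in rules:
        for variant in rule["from"]:
            pattern = r"\b" + re.escape(variant) + r"\b"
            text = re.sub(pattern, rule["to"], text, flags=re.IGNORECASE)
    return text
```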

toGenerateSubtitles

Set toGenerateSubtitles: true on a scene whose videosToAdd clip has spoken audio you want transcribed into subtitles. This is useful when you provide a video clip that contains its own dialogue or narration.

videosToAdd

Pins specific video clips to this scene. See Video Generation for full details.


Background music

Set a specific background music track using the relativePath from GET /auth/config/music-library:

"customTemplate": {
  "id": "6a3f1b2c9e8d7f4a0b5c2e1d",
  "preConfigs": {
    "bgMusic": {
      "path": "music/upbeat-corporate.mp3",
      "volume": 0.3
    },
    "toCleanAudio": true
  }
}
  • toCleanAudio: true — runs audio noise reduction on the voiceover during enhancement.
  • aiGenerationStability — a 0–1 value controlling how deterministic Stability AI image generation is. Higher values produce more consistent outputs; lower values produce more variation.

Subtitle style customisation

Customise the appearance of generated subtitles through preConfigs.subtitles.styles:

"preConfigs": {
  "subtitles": {
    "styles": {
      "normal": {
        "fontSize": 36,
        "fontFamily": "Inter",
        "color": "#FFFFFF",
        "fontWeight": "600"
      },
      "highlight": {
        "color": "#FFD700"
      }
    }
  }
}
  • normal — the base style applied to all subtitle text.
  • highlight — the style applied to the word currently being spoken (karaoke-style active-word highlighting).
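Since highlight only needs to specify the properties it changes, the effective style of the active word can be modelled as normal with highlight layered on top. A sketch of that assumed merge behaviour (the renderer's actual merge rules are not documented here):

```python
def active_word_style(styles: dict) -> dict:
    """Effective style for the currently spoken word: the highlight style
    is assumed to override only the keys it sets, inheriting the rest
    from the normal (base) style."""
    merged = dict(styles.get("normal", {}))
    merged.update(styles.get("highlight", {}))
    return merged
```

With the example above, the active word keeps fontSize 36 from normal but takes its gold colour from highlight.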

Full example

A request combining the most common template customisation options:

curl -X POST "https://api.shuffll.com/api/v1/auth/project/create" \
  -H "x-api-key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "How our platform reduces onboarding time by 40%",
    "language": "en",
    "videoLength": "medium",
    "toAutoEnhance": true,
    "toAutoExport": true,
    "webhook": "https://yourserver.com/webhooks/shuffll",
    "customTemplate": {
      "id": "6a3f1b2c9e8d7f4a0b5c2e1d",
      "flexibilityOverride": "minor",
      "description": "Confident, data-driven tone. Always mention the 40% reduction stat.",
      "assets": [
        {
          "url": "https://example.com/product-walkthrough.mp4",
          "type": "video",
          "description": "Product dashboard walkthrough"
        }
      ],
      "videoSourcesByOrder": ["workspace_assets", "stock_footage"],
      "toUseTemplateFont": true,
      "preConfigs": {
        "bgMusic": { "path": "music/upbeat-corporate.mp3", "volume": 0.25 },
        "toCleanAudio": true,
        "subtitles": {
          "styles": {
            "normal": { "fontSize": 34, "color": "#FFFFFF" },
            "highlight": { "color": "#FFD700" }
          }
        }
      },
      "scenes": [
        {
          "id": "scene_001",
          "staticText": ["Reduce onboarding time by 40%"],
          "toUseStaticText": true
        },
        {
          "id": "scene_003",
          "wordBoost": ["Shuffll", "onboarding"],
          "customSpelling": [
            { "from": ["shufle", "shuffle"], "to": "Shuffll" }
          ]
        }
      ]
    }
  }'