What We’ll Build

This guide shows how to build a fal.ai plugin that lets your agent generate 6-second, 768p videos from text prompts using the MiniMax Hailuo-02 model. For architectural concepts, see Plugin Architecture. You’ll learn:
  • Actions (what the agent can DO)
  • Progressive development (start simple, organize as you grow)
  • Local plugin testing (character.plugins array method)
  • Plugin testing (component and E2E tests)
For component details and patterns, see Plugin Components and Plugin Patterns.

Step 1: Quick Start

Create Project and Plugin

Create a project with a plugin inside using CLI commands:
1. Create project

Terminal
elizaos create --type project my-eliza-project
Configure when prompted:
  • Database: PgLite (perfect for local development)
  • Model: OpenAI
Terminal
cd my-eliza-project
2. Create plugin inside project

Terminal
elizaos create --type plugin plugin-fal-ai
When prompted, choose Quick Plugin (we don’t need a frontend UI). Your structure now looks like:
my-eliza-project/
├── src/character.ts       # Default Eliza character
└── plugin-fal-ai/         # 👈 Plugin lives alongside project
    ├── src/
    │   ├── index.ts       # Plugin exports
    │   ├── plugin.ts      # Main plugin (start here)
    │   └── __tests__/     # Plugin tests
    └── package.json
3. Add plugin to character

In my-eliza-project/src/character.ts, add the local path to Eliza’s plugins array:
src/character.ts
import { Character } from '@elizaos/core';

export const character: Character = {
  name: 'Eliza',
  plugins: [
    '@elizaos/plugin-sql',
    '@elizaos/plugin-openai',
    '@elizaos/plugin-bootstrap',
    './plugin-fal-ai'  // 👈 relative path to the local plugin
  ],
};

Connect and Test

1. Build plugin and test connection

The plugin needs to be built first to create the dist/ folder that ElizaOS loads from:
Terminal
# Build the plugin first
cd plugin-fal-ai
bun run build

# Go back to project and start
cd ..
elizaos start
Verify it’s loaded:
  • Check the console logs for Successfully loaded plugin 'plugin-fal-ai'
  • Visit http://localhost:3000 → click your agent → Plugins tab

Step 2: Development

Research the API

Let’s research what we want to build by exploring fal.ai for a good text-to-video model. MiniMax Hailuo-02 Text to Video fits the bill: 6-second, 768p clips behind a simple API.
  1. Navigate to the JavaScript/TypeScript section of the docs to see how to call the API:
    • Install: bun add @fal-ai/client
    • Import: import { fal } from "@fal-ai/client"
    • Use: fal.subscribe("model-endpoint", { input: {...} })
    • Returns: { data, requestId }
Now we know exactly what to build and how to call it, so let’s start developing our plugin.
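
Before wiring anything into ElizaOS, you can sanity-check the endpoint from a throwaway script. This is a hypothetical scratch file (not part of the plugin); the endpoint ID and input shape come straight from the fal.ai docs above:
scratch/test-fal.ts
import { fal } from '@fal-ai/client';

// Read the key from the environment, same as the plugin will later
fal.config({ credentials: process.env.FAL_KEY! });

const { data, requestId } = await fal.subscribe(
  'fal-ai/minimax/hailuo-02/standard/text-to-video',
  { input: { prompt: 'dolphins jumping in the ocean', duration: '6' }, logs: true }
);

console.log(requestId, data.video.url); // URL of the finished 6-second clip
Run it with bun run scratch/test-fal.ts after exporting FAL_KEY.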

Edit Default Plugin Template

1. Add fal.ai dependency

Terminal
cd plugin-fal-ai
bun add @fal-ai/client
This adds the fal.ai client package to your plugin dependencies.
2. Study the template structure

Open plugin-fal-ai/src/plugin.ts to see the sample code patterns for plugins:
  • quickAction - example Action (what agent can DO)
  • quickProvider - example Provider (gives agent CONTEXT)
  • StarterService - example Service (manages state/connections)
  • Plugin events, routes, and models - additional patterns you can adopt as the plugin grows
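
Before editing, it helps to see the shape you’re working toward. Every Action boils down to the same handful of fields (a minimal sketch; the field names mirror both the template and the code below):
import { Action } from '@elizaos/core';

const sketchAction: Action = {
  name: 'MY_ACTION',                     // unique ID the agent selects by
  similes: ['MY_ALIAS'],                 // alternate names that map here
  description: 'What this action does',  // helps the model pick the right action
  validate: async (runtime, message) => true,  // gate: may this action run right now?
  handler: async (runtime, message, state, options, callback) => {
    // do the work, optionally stream a reply via callback, then report the result
    return { success: true, text: 'result summary' };
  },
  examples: [],                          // sample dialogues that teach usage
};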
3. Create your text-to-video action using plugin patterns

Copy the plugin file and rename it to create your action:
Terminal
mkdir src/actions
cp src/plugin.ts src/actions/generateVideo.ts
Now let’s edit the example plugin into our generateVideo action. In the snippets below, lines marked - come from the template and are removed, while lines marked + are the replacements you add. First, bring in the fal.ai import (from the fal.ai docs):
src/actions/generateVideo.ts
  import {
    Action, ActionResult, IAgentRuntime, Memory, HandlerCallback, State, logger
  } from '@elizaos/core';
+ import { fal } from '@fal-ai/client';
Update the action identity for video generation:
- const quickAction: Action = {
+ export const generateVideoAction: Action = {
-   name: 'QUICK_ACTION',
+   name: 'TEXT_TO_VIDEO',
-   similes: ['GREET', 'SAY_HELLO', 'HELLO_WORLD'],
+   similes: ['CREATE_VIDEO', 'MAKE_VIDEO', 'GENERATE_VIDEO', 'VIDEO_FROM_TEXT'],
-   description: 'Responds with a simple hello world message',
+   description: 'Generate a video from text using MiniMax Hailuo-02',
Replace validation with API key check:
-   validate: async (_runtime, _message, _state) => {
-     return true; // Always valid
-   },
+   validate: async (runtime: IAgentRuntime, message: Memory) => {
+     const falKey = runtime.getSetting('FAL_KEY');
+     if (!falKey) {
+       logger.error('FAL_KEY not found in environment variables');
+       return false;
+     }
+     return true;
+   },
Replace hello world logic with video generation:
-   handler: async (_runtime, message, _state, _options, callback) => {
-     const response = 'Hello world!';
-
-     if (callback) {
-       await callback({
-         text: response,
-         actions: ['QUICK_ACTION'],
-         source: message.content.source,
-       });
-     }
-
-     return {
-       text: response,
-       success: true,
-       data: { actions: ['QUICK_ACTION'], source: message.content.source }
-     };
-   },
+   handler: async (
+     runtime: IAgentRuntime,
+     message: Memory,
+     state: State | undefined,
+     options: any,
+     callback?: HandlerCallback
+   ): Promise<ActionResult> => {
+     try {
+       fal.config({ credentials: runtime.getSetting('FAL_KEY') });
+
+       // Strip the trigger phrase so only the scene description reaches the model
+       const prompt = (message.content.text ?? '')
+         .replace(/^(create video:|make video:)/i, '')
+         .trim();
+       if (!prompt) return { success: false, text: 'I need a description of the video' };
+
+       const result = await fal.subscribe('fal-ai/minimax/hailuo-02/standard/text-to-video', {
+         input: { prompt, duration: '6' },
+         logs: true,
+       });
+
+       const videoUrl = result.data.video.url;
+       if (callback) await callback({ text: `✅ Video ready! ${videoUrl}` });
+       return { success: true, text: 'Video generated', data: { videoUrl, prompt } };
+     } catch (error) {
+       // `error` is typed `unknown`, so narrow it before reading .message
+       const msg = error instanceof Error ? error.message : String(error);
+       return { success: false, text: `Failed: ${msg}` };
+     }
+   },
Update examples for video conversations:
-   examples: [
-     [{
-       name: '{{name1}}',
-       content: { text: 'Can you say hello?' }
-     }, {
-       name: '{{name2}}',
-       content: { text: 'hello world!', actions: ['QUICK_ACTION'] }
-     }]
-   ],
+   examples: [
+     [{ name: '{{user}}', content: { text: 'Create video: dolphins jumping' } },
+      { name: '{{agent}}', content: { text: 'Creating video!', actions: ['TEXT_TO_VIDEO'] } }]
+   ],
};
4. Update index.ts to use your action

Finally, update src/index.ts so the plugin exports our new action:
src/index.ts
import { Plugin } from '@elizaos/core';
import { generateVideoAction } from './actions/generateVideo'; 

export const falaiPlugin: Plugin = { 
  name: 'fal-ai', 
  description: 'Generate videos using fal.ai MiniMax Hailuo-02', 
  actions: [generateVideoAction], 
  providers: [], 
  services: [] 
}; 

export default falaiPlugin; 
export { generateVideoAction }; 
As you expand your plugin, you can reference plugin.ts, as well as other plugins from the Plugin Registry, for examples of the other component types (providers, services, etc.).
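
For instance, a provider that tells the agent about the most recent video might look roughly like this (a sketch, assuming the Provider shape from @elizaos/core; lastVideo is a hypothetical cache your action handler would populate):
import { Provider } from '@elizaos/core';

// Hypothetical module-level cache the action handler could write to
let lastVideo: { prompt: string; url: string } | null = null;

export const recentVideoProvider: Provider = {
  name: 'RECENT_VIDEO',
  description: 'Surfaces the most recently generated video',
  get: async (runtime, message, state) => {
    if (!lastVideo) return { text: '' };  // nothing generated yet: add no context
    return { text: `Most recent video: "${lastVideo.prompt}" at ${lastVideo.url}` };
  },
};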

Add Configuration

1. Get your fal.ai API key

Get an API key from fal.ai and copy/paste it into your .env:
.env
PGLITE_DATA_DIR=./.eliza/.elizadb
OPENAI_API_KEY=your_openai_key_here

FAL_KEY=your_fal_key_here

Step 3: Testing

Test Plugin Functionality

Verify your plugin works as expected:
1. Test the updated plugin

First rebuild the plugin so your changes take effect, then start from the project root:
Terminal
# Build the plugin first
cd plugin-fal-ai
bun run build

# Start from project root  
cd ..
elizaos start
2. Test video generation

Try your new action by chatting with Eliza in the GUI (http://localhost:3000):
  • "Create video: dolphins jumping in ocean"
  • "Make video: cat playing piano"
  • "Generate video: sunset over mountains"
You should see the video generation process and get a URL to view the result!

Plugin Component Tests

Plugins ship with default component and E2E tests. Let’s add a custom component test:
1. Add a component test

Update plugin-fal-ai/src/__tests__/plugin.test.ts:
src/__tests__/plugin.test.ts
import { describe, it, expect } from 'bun:test';
import { falaiPlugin, generateVideoAction } from '../index'; 

describe('FAL AI Plugin', () => {
  it('action validates with FAL_KEY', async () => { 
    const mockRuntime = { 
      // Minimal stub: validate only ever calls getSetting('FAL_KEY')
      getSetting: (key: string) => key === 'FAL_KEY' ? 'test-key' : null
    }; 
    
    const isValid = await generateVideoAction.validate(mockRuntime as any, {} as any); 
    expect(isValid).toBe(true); 
  }); 
});
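
It’s also worth covering the negative path. Add this alongside the first test, inside the describe block:
  it('action fails validation without FAL_KEY', async () => {
    const mockRuntime = {
      getSetting: (_key: string) => null, // simulate a missing key
    };

    const isValid = await generateVideoAction.validate(mockRuntime as any, {} as any);
    expect(isValid).toBe(false);
  });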
2. Run component tests

Terminal
cd plugin-fal-ai
elizaos test --type component

Plugin E2E Tests

Let’s also add a custom E2E test:
1. Add an E2E test

Update src/__tests__/e2e/plugin-fal-ai.e2e.ts:
src/__tests__/e2e/plugin-fal-ai.e2e.ts
import type { IAgentRuntime } from '@elizaos/core';

export const FalAiTestSuite = {
  name: 'fal-ai-video-generation',
  tests: [{
    name: 'should find video action in runtime',
    fn: async (runtime: IAgentRuntime) => {
      // The runtime exposes all registered actions; ours should be among them
      const action = runtime.actions.find((a) => a.name === 'TEXT_TO_VIDEO');
      if (!action) throw new Error('TEXT_TO_VIDEO action not found');
    }
  }]
};
2. Run E2E tests

Terminal
cd plugin-fal-ai  
elizaos test --type e2e

Step 4: Possible Next Steps

Congratulations! You now have a working video generation plugin. Here are some ways you can improve it:

Enhance Your Action

  • Add more similes - Handle requests like “animate this”, “video of”, “show me a clip of”
  • Better examples - Add more conversation examples so Eliza learns different chat patterns
  • Error handling - Handle rate limits, invalid prompts, or API timeouts
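
On error handling: a small retry wrapper with exponential backoff around the fal.subscribe call goes a long way (a sketch; withRetry is a hypothetical helper, and the delays and attempt count are arbitrary):
// Hypothetical helper: retry an async call, doubling the wait between attempts
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (error) {
      if (i === attempts - 1) throw error;                      // out of retries
      await new Promise((r) => setTimeout(r, 1000 * 2 ** i));   // wait 1s, 2s, 4s...
    }
  }
  throw new Error('unreachable');
}

// Usage inside the handler:
// const result = await withRetry(() =>
//   fal.subscribe('fal-ai/minimax/hailuo-02/standard/text-to-video', { input: { prompt, duration: '6' } })
// );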

Add Plugin Components

  • Providers - Give your agent context about recent videos or video history
  • Evaluators - Track analytics, log successful generations, or rate video quality
  • Services - Add queueing for multiple video requests or caching for common prompts
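
On caching: even a plain in-memory map keyed by the normalized prompt avoids paying twice for identical requests (a sketch; generateCached and its generate parameter are hypothetical names):
// Hypothetical prompt → URL cache so repeated prompts reuse the first result
const videoCache = new Map<string, string>();

async function generateCached(
  prompt: string,
  generate: (p: string) => Promise<string>  // e.g. wraps the fal.subscribe call
): Promise<string> {
  const key = prompt.toLowerCase().trim();  // normalize so trivial variants hit the cache
  const cached = videoCache.get(key);
  if (cached) return cached;
  const url = await generate(key);
  videoCache.set(key, url);
  return url;
}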
The possibilities are endless!

What’s Next?