# ComfyUI Launcher (BETA)
Run any ComfyUI workflow w/ **ZERO setup**.
Need help? Join our Discord!
[![](https://dcbadge.vercel.app/api/server/kXS43yTRNA)](https://discord.gg/kXS43yTRNA)
Runs anywhere:
- [Locally](#quick-start)
- [RunPod](/cloud/RUNPOD.md)
- [Huggingface Spaces](https://huggingface.co/spaces/multimodalart/comfyUI-laucher-v2)
## Features
- Automatically installs custom nodes, missing model files, etc.
- Workflows exported by this tool can be run by anyone with **ZERO setup**
- Work on multiple ComfyUI workflows at the same time
- Each workflow runs in its own isolated environment
- Prevents your workflows from suddenly breaking when updating custom nodes, ComfyUI, etc.
<p float="middle">
<img src="./assets/launcher_projects.png" width="45%" />
<img src="./assets/launcher_new_workflow.png" width="45%" />
<img src="./assets/launcher_import_workflow.png" width="45%" />
<img src="./assets/launcher_comfyui.png" width="45%" />
</p>
## Demo
Running a workflow JSON file w/ zero setup:
https://github.com/ComfyWorkflows/ComfyUI-Launcher/assets/33400216/aa17680d-eee5-4e6d-abc4-9f7551f9a4ad
## Requirements
#### Windows (Windows Subsystem for Linux - WSL) & Linux:
- Docker (w/ GPU support) or Python 3
#### macOS:
- Python 3
## Quick start
### Option 1: Docker (Linux & Windows)
#### Linux
```
# Remove the "--gpus all" line below if you don't have a GPU
docker run \
--gpus all \
--rm \
--name comfyui_launcher \
-p 4000-4100:4000-4100 \
-v $(pwd)/comfyui_launcher_models:/app/server/models \
-v $(pwd)/comfyui_launcher_projects:/app/server/projects \
-it thecooltechguy/comfyui_launcher
```
#### Windows
```
REM Remove the "--gpus all" line below if you don't have a GPU
docker run ^
--gpus all ^
--rm ^
--name comfyui_launcher ^
-p 4000-4100:4000-4100 ^
-v %cd%/comfyui_launcher_models:/app/server/models ^
-v %cd%/comfyui_launcher_projects:/app/server/projects ^
-it thecooltechguy/comfyui_launcher
```
Open http://localhost:4000 in your browser
### Option 2: Manual setup (macOS, Linux, and Windows)
Works on **Windows (via WSL - Windows Subsystem for Linux)**, **Linux**, and **macOS**.
#### Installation (one-time setup)
```
git clone https://github.com/ComfyWorkflows/comfyui-launcher
cd comfyui-launcher/
```
#### Start ComfyUI Launcher
```
./run.sh
```
Open http://localhost:4000 in your browser
If you're facing issues w/ the installation, please make a post in the *bugs* forum on our [Discord](https://discord.gg/QvGC8CFGDU).
## Updating
### Option 1: Docker
```
docker pull thecooltechguy/comfyui_launcher
```
### Option 2: Manual setup
```
git pull
```
## Usage
### Using a reverse proxy (advanced)
If you're running ComfyUI Launcher behind a reverse proxy or in an environment where you can only expose a single port to access the Launcher and its workflow projects, you can run the Launcher with `PROXY_MODE=true` (only available for Docker).
```
# Remove the "--gpus all" line below if you don't have a GPU
docker run \
--gpus all \
--rm \
--name comfyui_launcher \
-p 4000:80 \
-v $(pwd)/comfyui_launcher_models:/app/server/models \
-v $(pwd)/comfyui_launcher_projects:/app/server/projects \
-e PROXY_MODE=true \
-it thecooltechguy/comfyui_launcher
```
Once the container is running, all you need to do is expose a single port to the outside world: the container's port 80 (published as host port 4000 in the example above). This lets you access the Launcher and all of its workflow projects through that one port.
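For example, on a Linux host that uses `ufw` as its firewall (an assumption; adjust for your environment), opening the published port and checking that the Launcher responds might look like this:
```
# Assumption: the container was started with "-p 4000:80" as in the example above
sudo ufw allow 4000/tcp

# Quick sanity check that the Launcher answers through the single published port
curl -I http://localhost:4000
```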
Currently, `PROXY_MODE=true` only works with Docker, since NGINX is used within the container.
If you're running the Launcher manually, you'll need to set up a reverse proxy yourself (see the `nginx.conf` file for an example).
### Using an existing ComfyUI models folder
When starting the ComfyUI Launcher, you can set the `MODELS_DIR` environment variable to the path of your existing ComfyUI models folder. This lets you reuse the models you've already downloaded. By default, models are stored in `./server/models`.
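For example, with the manual setup, you could point the Launcher at an existing models folder like this (a minimal sketch; `/path/to/ComfyUI/models` is a placeholder for your own path):
```
# Assumption: /path/to/ComfyUI/models is your existing ComfyUI models folder
MODELS_DIR=/path/to/ComfyUI/models ./run.sh
```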
### Using a different folder to store your Launcher projects
When starting the ComfyUI Launcher, you can set the `PROJECTS_DIR` environment variable to the path of the folder you'd like to use to store your projects. By default, they're stored in `./server/projects`.
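Both variables can also be set together when starting the Launcher (a sketch; both paths below are placeholders):
```
# Assumption: both folders already exist on your machine
MODELS_DIR=/path/to/ComfyUI/models PROJECTS_DIR=/data/comfyui_launcher_projects ./run.sh
```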
## Donations
If you find our work useful, we'd appreciate any donations! Thank you!
<a href="https://www.buymeacoffee.com/comfy.workflows" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/default-orange.png" alt="Buy Me A Coffee" height="41" width="174"></a>
## Coming soon
- Native Windows support (w/o requiring WSL)
- Better way to manage your workflows locally
- Run workflows w/ Cloud GPUs
- Backup your projects to the cloud
- Run ComfyUI Launcher in the cloud
## Credits
- ComfyUI Manager (https://github.com/ltdrdata/ComfyUI-Manager/)
  - Used to auto-detect & install custom nodes
", Assign "at most 3 tags" to the expected json: {"id":"8385","tags":[]} "only from the tags list I provide: [{"id":77,"name":"3d"},{"id":89,"name":"agent"},{"id":17,"name":"ai"},{"id":54,"name":"algorithm"},{"id":24,"name":"api"},{"id":44,"name":"authentication"},{"id":3,"name":"aws"},{"id":27,"name":"backend"},{"id":60,"name":"benchmark"},{"id":72,"name":"best-practices"},{"id":39,"name":"bitcoin"},{"id":37,"name":"blockchain"},{"id":1,"name":"blog"},{"id":45,"name":"bundler"},{"id":58,"name":"cache"},{"id":21,"name":"chat"},{"id":49,"name":"cicd"},{"id":4,"name":"cli"},{"id":64,"name":"cloud-native"},{"id":48,"name":"cms"},{"id":61,"name":"compiler"},{"id":68,"name":"containerization"},{"id":92,"name":"crm"},{"id":34,"name":"data"},{"id":47,"name":"database"},{"id":8,"name":"declarative-gui "},{"id":9,"name":"deploy-tool"},{"id":53,"name":"desktop-app"},{"id":6,"name":"dev-exp-lib"},{"id":59,"name":"dev-tool"},{"id":13,"name":"ecommerce"},{"id":26,"name":"editor"},{"id":66,"name":"emulator"},{"id":62,"name":"filesystem"},{"id":80,"name":"finance"},{"id":15,"name":"firmware"},{"id":73,"name":"for-fun"},{"id":2,"name":"framework"},{"id":11,"name":"frontend"},{"id":22,"name":"game"},{"id":81,"name":"game-engine "},{"id":23,"name":"graphql"},{"id":84,"name":"gui"},{"id":91,"name":"http"},{"id":5,"name":"http-client"},{"id":51,"name":"iac"},{"id":30,"name":"ide"},{"id":78,"name":"iot"},{"id":40,"name":"json"},{"id":83,"name":"julian"},{"id":38,"name":"k8s"},{"id":31,"name":"language"},{"id":10,"name":"learning-resource"},{"id":33,"name":"lib"},{"id":41,"name":"linter"},{"id":28,"name":"lms"},{"id":16,"name":"logging"},{"id":76,"name":"low-code"},{"id":90,"name":"message-queue"},{"id":42,"name":"mobile-app"},{"id":18,"name":"monitoring"},{"id":36,"name":"networking"},{"id":7,"name":"node-version"},{"id":55,"name":"nosql"},{"id":57,"name":"observability"},{"id":46,"name":"orm"},{"id":52,"name":"os"},{"id":14,"name":"parser"},{"id":74,"name":"react"},{"id":82,"name":"real-time"},{"id":56,"name":"robot"},{"id":65,"name":"runtime"},{"id":32,"name":"sdk"},{"id":71,"name":"search"},{"id":63,"name":"secrets"},{"id":25,"name":"security"},{"id":85,"name":"server"},{"id":86,"name":"serverless"},{"id":70,"name":"storage"},{"id":75,"name":"system-design"},{"id":79,"name":"terminal"},{"id":29,"name":"testing"},{"id":12,"name":"ui"},{"id":50,"name":"ux"},{"id":88,"name":"video"},{"id":20,"name":"web-app"},{"id":35,"name":"web-server"},{"id":43,"name":"webassembly"},{"id":69,"name":"workflow"},{"id":87,"name":"yaml"}]" returns me the "expected json"