Deploy OpenLLM Model
Starting OpenLLM
Each OpenLLM server serves a single model. You can deploy one as follows:
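A minimal sketch using OpenLLM's Docker image; the exact image tag and flags may vary with the OpenLLM version, and mapping host port 3333 is an assumption chosen to match the Server URL used later in this guide:

```bash
# Start an OpenLLM server for facebook/opt-1.3b (the demo model used below).
# Host port 3333 is mapped to the server's default port 3000 inside the container.
docker run --rm -it -p 3333:3000 ghcr.io/bentoml/openllm start facebook/opt-1.3b --backend pt
```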
Note: The facebook/opt-1.3b model is used here for demonstration only, and its output quality may be limited. Choose a model that fits your actual use case. For more models, see the Supported Model List.
After the model is deployed, connect to it in Dify.
Fill in under Settings > Model Providers > OpenLLM:
- Model Name: facebook/opt-1.3b
- Server URL: http://<Machine_IP>:3333 (replace <Machine_IP> with your machine's IP address)
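Before saving the settings, you can check that the server is reachable from the machine running Dify. A quick sketch, assuming the OpenLLM server is built on BentoML and therefore exposes the standard /healthz endpoint:

```bash
# Replace <Machine_IP> with the address of the host running OpenLLM.
# An HTTP 200 response indicates the server is up and reachable.
curl http://<Machine_IP>:3333/healthz
```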