Hey @jay-jjwu! 👋 I'm here to assist you with any bugs, questions, or contributions you have. Let's squash those bugs together!

You can integrate the ChatGLM model into Dify. Dify supports ZHIPU (ChatGLM) as a provider for both System Inference Models and Embedding Models. Additionally, Dify can connect to locally deployed Xinference models, which include support for ChatGLM. To deploy Xinference locally and integrate it with Dify, follow these steps:
Click "Save" to use the model in the Dify application. For more detailed information, refer to the Xinference documentation [1][2][3].
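The local deployment part can be sketched roughly as follows. This is a minimal sketch, not Dify's official procedure: the exact flags and defaults may differ across Xinference versions, so check `xinference --help` before relying on them.

```shell
# Install Xorbits Inference with all optional backends
pip install "xinference[all]"

# Start a local Xinference server (9997 is the commonly used default port)
xinference-local --host 0.0.0.0 --port 9997

# In another terminal, launch a ChatGLM model on that server;
# the model UID it returns is what Dify's Xinference provider form asks for
xinference launch --model-name chatglm3 --model-format pytorch --size-in-billions 6
```

In Dify's model-provider settings you would then select Xinference, point the server URL at `http://<host>:9997`, and paste the model UID from the launch step.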
Did anyone try to add third-party models to Dify?
I tried adding a YAML config file to integrate ChatGLM from Aliyun into Dify, but the log shows that only Qwen models are currently supported, as below:
```
ERROR [Dummy-2] [base_app_generate_response_converter.py:128] - Not support model: chatglm3-6b, currently only support qwen models.
```
So are third-party models not supported?
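For context, the kind of model YAML being attempted might look roughly like the sketch below. The field names here are assumptions based on the general shape of Dify's model-provider definition files, not a verified schema; and per the error above, the Aliyun (Tongyi) provider's converter currently accepts only Qwen model names, so a file like this would be rejected regardless of its contents.

```yaml
# Hypothetical model definition, e.g. chatglm3-6b.yaml
# (field names are assumptions, not a verified Dify schema)
model: chatglm3-6b
label:
  en_US: chatglm3-6b
model_type: llm
model_properties:
  mode: chat
  context_size: 8192
```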