wzj/dify-1.72
dify-1.72/api/core/third_party/langchain/llms
Latest commit: bd3a9b2f8d by takatost, "fix: xinference-chat-stream-response (#991)", 2023-08-24 14:39:34 +08:00
File                         Last commit                                          Date
__init__.py                  feat: server multi models support (#799)             2023-08-12 00:57:00 +08:00
azure_chat_open_ai.py        feat: server multi models support (#799)             2023-08-12 00:57:00 +08:00
azure_open_ai.py             feat: server multi models support (#799)             2023-08-12 00:57:00 +08:00
chat_open_ai.py              feat: server multi models support (#799)             2023-08-12 00:57:00 +08:00
fake.py                      feat: server multi models support (#799)             2023-08-12 00:57:00 +08:00
huggingface_endpoint_llm.py  feat: optimize hf inference endpoint (#975)          2023-08-23 19:47:50 +08:00
open_ai.py                   feat: server multi models support (#799)             2023-08-12 00:57:00 +08:00
openllm.py                   fix: openllm generate cutoff (#945)                  2023-08-22 13:43:36 +08:00
replicate_llm.py             feat: server multi models support (#799)             2023-08-12 00:57:00 +08:00
spark.py                     feat: add spark v2 support (#885)                    2023-08-17 15:08:57 +08:00
tongyi_llm.py                feat: server multi models support (#799)             2023-08-12 00:57:00 +08:00
wenxin.py                    fix: wenxin error not raise when stream mode (#884)  2023-08-17 13:40:00 +08:00
xinference_llm.py            fix: xinference-chat-stream-response (#991)          2023-08-24 14:39:34 +08:00