Model Deployment Modes and Description
| Model Name | Cloud-based Deployment | Edge Deployment |
|---|---|---|
| DeepSeek-R1-distill-Qwen-32B | Supported. At least two inference units are required to deploy the model. | Supported. At least two inference units are required to deploy the model. |
| DeepSeek-R1-distill-LLama-70B | Supported. At least four inference units are required to deploy the model. | Supported. At least four inference units are required to deploy the model. |
| DeepSeek-R1-distill-LLama-8B | Supported. At least one inference unit is required to deploy the model. | Supported. At least one inference unit is required to deploy the model. |
| Qwen3-235B-A22B | Supported. At least 16 inference units are required to deploy the model. | Supported. At least 16 inference units are required to deploy the model. |
| Qwen3-32B | Supported. At least two inference units are required to deploy the model. | Supported. At least two inference units are required to deploy the model. |
| Qwen3-30B-A3B | Supported. At least two inference units are required to deploy the model. | Supported. At least two inference units are required to deploy the model. |
| Qwen3-14B | Supported. At least one inference unit is required to deploy the model. | Supported. At least one inference unit is required to deploy the model. |
| Qwen3-8B | Supported. At least one inference unit is required to deploy the model. | Supported. At least one inference unit is required to deploy the model. |
| Qwen2.5-72B | Supported. At least four inference units are required to deploy the model. | Supported. At least four inference units are required to deploy the model. |
| QWQ-32B | Supported. At least four inference units are required to deploy the model. | Supported. At least four inference units are required to deploy the model. |