Deploying a DeepSeek Model
After the model training is complete, you can deploy the model.
- Log in to ModelArts Studio and access the required workspace.
- In the navigation pane, choose Model Development > Model Deployment. Click Create Deployment Task in the upper right corner.
- In the Select Model dialog box, set Sources to Model Square and Type to NLP, select a model, and click OK.
- On the deployment creation page, set the deployment parameters by referring to Table 1 and start the model deployment.
Table 1 Parameters for deploying a third-party large model

| Category | Parameter | Description |
| --- | --- | --- |
| Deployment Configuration | Select Model | You can modify the following information: Sources: select Model Square. Type: select NLP, then select the model and version to be deployed. |
| Deployment Configuration | Deployment Mode | Online deployment: algorithms are deployed in the resource pool provided by the platform. |
| Deployment Configuration | Deployed_Model | Unique identifier of the inference service when the service is called through the V2 inference API; see the example after this procedure. |
| Safety Guardrail | Enable and agree to authorization | Ensures the security of model calls. |
| Safety Guardrail | Version Selection | Currently, only the basic edition of the safety guardrail is supported, which comes with built-in default content moderation rules. |
| Resource Configuration | Billing Mode | Free for a limited time. |
| Resource Configuration | Instances | Set the number of instances required by the model to be deployed. It is recommended that no more than 10 instances be deployed at a time; otherwise, traffic limiting may be triggered and the deployment may fail. |
| Subscription Reminder | Subscription Reminder | After this function is enabled, the system sends SMS or email notifications to users when the task status changes. |
| Basic Information | Service Name | Set the name of the deployment task. |
| Basic Information | Description (Optional) | Set the description of the deployment task. |
- After setting the deployment parameters, click Deploy Now.
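Once the service is running, it can be called through the V2 inference API using the Deployed_Model identifier. The snippet below is a minimal sketch, assuming the deployed service accepts an OpenAI-style chat-completions request over HTTPS; the endpoint URL, token, model identifier, and request fields are placeholders, so check the V2 inference API reference and the details page of your deployed service for the actual values and schema.

```python
# Minimal sketch of calling a deployed DeepSeek inference service.
# Assumptions (replace with values from your deployment details page):
#   ENDPOINT  - the V2 inference API URL shown for the deployed service
#   API_TOKEN - a token or API key authorized to call the service
#   MODEL_ID  - the Deployed_Model identifier described in Table 1
import requests

ENDPOINT = "https://<inference-endpoint>/v2/chat/completions"  # placeholder
API_TOKEN = "<your-api-token>"                                 # placeholder
MODEL_ID = "<deployed-model-identifier>"                       # placeholder

payload = {
    "model": MODEL_ID,  # identifies the deployed inference service
    "messages": [
        {"role": "user", "content": "Hello, DeepSeek!"}
    ],
    "max_tokens": 128,
}

response = requests.post(
    ENDPOINT,
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json())
```

If the call returns an authentication or schema error, compare the request body and headers with the sample request shown on the API description page of the deployed service.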