An SQL Error Is Reported When the Number of MetaStore Dynamic Partitions Exceeds the Threshold
Symptom
When the SparkSQL or HiveSQL command is executed, the following error message is displayed:
Number of dynamic partitions created is 2001, which is more than 2000. To solve this try to set hive.exec.max.dynamic.partitions to at least 2001
Cause Analysis
By default, Hive limits the number of dynamic partitions that a single statement can create. The limit is controlled by the hive.exec.max.dynamic.partitions parameter, whose default value is 1000. If a statement attempts to create more dynamic partitions than this threshold, Hive aborts the statement and reports the error shown above instead of creating the partitions.
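For example, an INSERT with dynamic partitioning creates one partition per distinct value of the partition column. The following minimal sketch (the table names dst_table and src_table and the partition column dt are hypothetical) would hit the default limit if src_table contains more than 1000 distinct dt values:
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;
-- One dynamic partition is created per distinct dt value in src_table.
INSERT OVERWRITE TABLE dst_table PARTITION (dt)
SELECT id, name, dt FROM src_table;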
Procedure
- Adjust the upper-layer service so that the number of dynamic partitions created by a single statement stays within the limit set by hive.exec.max.dynamic.partitions.
- Alternatively, run the set hive.exec.max.dynamic.partitions=XXX; command to increase the value of hive.exec.max.dynamic.partitions (see the sketch after this list).
In SparkSQL, the corresponding parameter is spark.hadoop.hive.exec.max.dynamic.partitions.
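For example, a minimal sketch of raising the limit for the current session (the value 2001 matches the threshold reported in the sample error; choose a value large enough to cover your actual number of partitions):
-- In Beeline or the Hive CLI:
SET hive.exec.max.dynamic.partitions=2001;
-- For SparkSQL, the spark.hadoop. prefix is required; one common way is to pass it when launching the client, for example:
spark-sql --conf spark.hadoop.hive.exec.max.dynamic.partitions=2001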