What Should I Do If the "had a not serializable result" Error Is Reported When a Spark Task Reads HBase Data?
Question
What should I do if the error "Task 0.0 in stage 0.0 (TID 0) had a not serializable result: org.apache.hadoop.hbase.io.ImmutableBytesWritable" is reported when a Spark task reads HBase data?
Answer
This error occurs because org.apache.hadoop.hbase.io.ImmutableBytesWritable does not implement java.io.Serializable, so Spark's default Java serializer cannot serialize the task result. Switch to the Kryo serializer to resolve the exception, using either of the following methods:
- Run the following lines of code before initializing SparkConf:
```java
System.setProperty("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
System.setProperty("spark.kryo.registrator", "com.huawei.bigdata.spark.examples.MyRegistrator");
```
- Use the set method to configure the SparkConf object. The code is as follows:

```scala
val conf = new SparkConf().setAppName("HbaseTest")
conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
conf.set("spark.kryo.registrator", "com.huawei.bigdata.spark.examples.MyRegistrator")
```
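Both methods point spark.kryo.registrator at a registrator class that tells Kryo which classes to expect. A minimal sketch of such a class is shown below, assuming the job reads the usual HBase key/value types ImmutableBytesWritable and Result; the actual MyRegistrator shipped with the Huawei examples may register additional classes:

```scala
// Sketch of a Kryo registrator (assumed, not the shipped example source).
// It registers the HBase classes that appear in the RDD so Kryo can
// serialize them efficiently instead of failing with the default serializer.
import com.esotericsoftware.kryo.Kryo
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.spark.serializer.KryoRegistrator

class MyRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    // Key/value types produced when reading an HBase table via TableInputFormat.
    kryo.register(classOf[ImmutableBytesWritable])
    kryo.register(classOf[Result])
  }
}
```

The fully qualified name of this class must match the value passed to spark.kryo.registrator, and the class must be on the executor classpath.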