Updated on 2022-11-09 GMT+08:00
How Do I Use Spark to Write Data into a DLI Table?
To use Spark to write data into a DLI table, configure the following OBS access parameters in the job:
- fs.obs.access.key
- fs.obs.secret.key
- fs.obs.impl
- fs.obs.endpoint
The following is an example:
import logging
from operator import add

from pyspark import SparkContext

logging.basicConfig(format='%(message)s', level=logging.INFO)

# Local input and output paths
test_file_name = "D://test-data_1.txt"
out_file_name = "D://test-data_result_1"

sc = SparkContext("local", "wordcount app")

# Configure the OBS access parameters
sc._jsc.hadoopConfiguration().set("fs.obs.access.key", "myak")
sc._jsc.hadoopConfiguration().set("fs.obs.secret.key", "mysk")
sc._jsc.hadoopConfiguration().set("fs.obs.impl", "org.apache.hadoop.fs.obs.OBSFileSystem")
sc._jsc.hadoopConfiguration().set("fs.obs.endpoint", "myendpoint")

# Read the input file into an RDD
text_file = sc.textFile(test_file_name)

# Count the occurrences of each word
counts = text_file.flatMap(lambda line: line.split(" ")).map(lambda word: (word, 1)).reduceByKey(lambda a, b: a + b)

# Write the result
counts.saveAsTextFile(out_file_name)
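If the job uses SparkSession (Spark 2.x or later), the same four parameters can be set on the Hadoop configuration obtained from the session, and the input and output paths can point to OBS directly. The following is a minimal sketch, not an official example: the bucket path obs://my-bucket/... and the access key, secret key, and endpoint values are placeholders that you replace with your own.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount app").getOrCreate()

# Set the same OBS access parameters on the Hadoop configuration of the session.
hadoop_conf = spark.sparkContext._jsc.hadoopConfiguration()
hadoop_conf.set("fs.obs.access.key", "myak")
hadoop_conf.set("fs.obs.secret.key", "mysk")
hadoop_conf.set("fs.obs.impl", "org.apache.hadoop.fs.obs.OBSFileSystem")
hadoop_conf.set("fs.obs.endpoint", "myendpoint")

# Read from and write to OBS paths (placeholder bucket) instead of local files.
text_file = spark.sparkContext.textFile("obs://my-bucket/test-data_1.txt")
counts = (text_file.flatMap(lambda line: line.split(" "))
          .map(lambda word: (word, 1))
          .reduceByKey(lambda a, b: a + b))
counts.saveAsTextFile("obs://my-bucket/test-data_result_1")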
Parent topic: Job Development
Job Development FAQs
- How Do I Use Spark to Write Data into a DLI Table?
- How Do I Set the AK/SK for a Queue to Operate an OBS Table?
- How Do I View the Resource Usage of DLI Spark Jobs?
- How Do I Use Python Scripts to Access the MySQL Database If the pymysql Module Is Missing from the Spark Job Results Stored in MySQL?
- How Do I Run a Complex PySpark Program in DLI?
- How Does a Spark Job Access a MySQL Database?
- How Do I Use JDBC to Set the spark.sql.shuffle.partitions Parameter to Improve the Task Concurrency?
- How Do I Read Uploaded Files for a Spark Jar Job?