Why Does a Spark Job Fail to Execute with an Abnormal Access Directory Error When Accessing Files in SFTP?
Updated on 2024-11-15 GMT+08:00
Spark jobs cannot directly access files stored on an SFTP server. Upload the files you want to access to OBS first, and then analyze the data using Spark jobs.
- Upload data to an OBS bucket: Upload the data stored on the SFTP server to an OBS bucket using the OBS management console or command-line tools.
For details about how to use a Spark job to read OBS data, see Using Spark Jar Jobs to Read and Query OBS Data.
- Configure the Spark job: Configure the Spark job to read the data stored in OBS, as shown in the sketch after this list.
- Submit the Spark job: After writing the job code, submit and execute the job.
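The following is a minimal sketch of a Spark Jar job that reads a CSV file from OBS after the file has been uploaded from SFTP. The bucket name and object path (obs://your-bucket/input/data.csv) are placeholders, and it is assumed that the obs:// filesystem and OBS credentials are provided by the DLI runtime environment.
```scala
import org.apache.spark.sql.SparkSession

object ReadObsData {
  def main(args: Array[String]): Unit = {
    // Create a Spark session. On DLI, OBS access is configured by the service.
    val spark = SparkSession.builder()
      .appName("ReadObsData")
      .getOrCreate()

    // Read a CSV file that was uploaded from the SFTP server to OBS.
    // The bucket name and object path below are placeholders.
    val df = spark.read
      .option("header", "true")
      .csv("obs://your-bucket/input/data.csv")

    // Verify that the data can be read before adding further analysis logic.
    df.show()

    spark.stop()
  }
}
```
Package the class into a JAR file, upload it, and submit it as a Spark Jar job, specifying the main class above.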
Parent topic: Spark Job O&M