Commissioning a Spark Application in a Local Windows Environment
After application code development is complete, you can run the application in a local Windows environment. In IDEA, the procedure is the same for applications developed in Scala and in Java.
- In the Windows environment, only the sample code for accessing Spark SQL using JDBC is provided.
- Ensure that the SDK's Maven repository on the Huawei mirror site has been configured in Maven. For details, see Configuring Huawei Open Source Image Repository.
Compiling and Running Applications
- Obtain the sample code.
Download the Maven project source code and configuration files of the sample project. For details, see Obtaining the MRS Application Development Sample Project.
Import the sample code to IDEA.
- Obtain configuration files.
- Obtain the configuration files from the cluster client: download the hive-site.xml and spark-defaults.conf files from the $SPARK_HOME/conf directory to a local directory.
- On FusionInsight Manager of the cluster, download the user authentication credential (the keytab and krb5.conf files used in later steps) to a local directory.
- Upload data to HDFS.
- Create a data text file on Linux and save the following data to the data file:
Miranda,32
Karlie,23
Candice,27
- On the HDFS client, run the following commands for authentication:
cd {Client installation directory}
kinit <Service user for authentication>
- On the HDFS client running the Linux OS, run the hadoop fs -mkdir /data command (or the hdfs dfs command) to create a directory.
- On the HDFS client running the Linux OS, run the hadoop fs -put data /data command to upload the data file.
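If you prefer to perform this step programmatically, the Hadoop FileSystem API can create the directory and upload the file. A minimal sketch, assuming the cluster client configuration files are on the classpath and Kerberos login has already been performed (see the authentication step); the class name is illustrative:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUploadSketch {
    public static void main(String[] args) throws Exception {
        // Picks up the cluster client configuration from the classpath.
        Configuration conf = new Configuration();
        try (FileSystem fs = FileSystem.get(conf)) {
            fs.mkdirs(new Path("/data"));                              // hadoop fs -mkdir /data
            fs.copyFromLocalFile(new Path("data"), new Path("/data")); // hadoop fs -put data /data
        }
    }
}
```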
- Configure related parameters in the sample code.
- Configure the authentication information.
Set userPrincipal to the username.
Set userKeytabPath to the path of the downloaded keytab file.
Set Krb5ConfPath to the path of the downloaded krb5.conf file.
Set the domain name: in the KerberosUtil class, change the value of DEFAULT_REALM to the domain name of the cluster.
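For orientation, a minimal sketch of what these settings amount to. All values are placeholders, and the sample project wraps the login in its own utility classes (such as KerberosUtil); under the hood, the keytab login is performed by Hadoop's UserGroupInformation:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class AuthSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder values; replace with your service user and downloaded files.
        String userPrincipal = "sparkuser";
        String userKeytabPath = "D:/conf/user.keytab";
        String krb5ConfPath = "D:/conf/krb5.conf";

        // Point the JVM at the cluster's Kerberos configuration.
        System.setProperty("java.security.krb5.conf", krb5ConfPath);

        // Log in from the keytab; the sample project performs the equivalent internally.
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(userPrincipal, userKeytabPath);
    }
}
```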
- In the string concatenated into securityConfig, change user.principal and user.keytab to the actual username and keytab path. Note that the keytab path must use forward slashes (/), even on Windows.
- Change the SQL statement for loading data to LOAD DATA INPATH 'hdfs:/data/data' INTO TABLE CHILD. (A combined sketch of these changes follows this list.)
- Add the required runtime parameters to the hive-site.xml and spark-defaults.conf files so that they take effect when the application runs.
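To show how the securityConfig and SQL changes above fit together, here is a minimal JDBC sketch. The connection URL, principal, username, and paths are placeholder assumptions; take the exact connection string and security parameters from the sample project:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SparkJdbcSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder values; user.principal/user.keytab must match your own user,
        // and the keytab path must use forward slashes even on Windows.
        String securityConfig = ";saslQop=auth-conf;auth=KERBEROS"
                + ";user.principal=sparkuser"
                + ";user.keytab=D:/conf/user.keytab;";

        // Illustrative URL only; the real one comes from the sample project and your cluster.
        String url = "jdbc:hive2://<server>:<port>/default" + securityConfig;

        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection connection = DriverManager.getConnection(url);
             Statement statement = connection.createStatement()) {
            statement.execute("CREATE TABLE IF NOT EXISTS CHILD (NAME STRING, AGE INT)"
                    + " ROW FORMAT DELIMITED FIELDS TERMINATED BY ','");
            // The modified load statement from the step above.
            statement.execute("LOAD DATA INPATH 'hdfs:/data/data' INTO TABLE CHILD");
            try (ResultSet rs = statement.executeQuery("SELECT * FROM CHILD")) {
                while (rs.next()) {
                    System.out.println(rs.getString("NAME") + " " + rs.getInt("AGE"));
                }
            }
        }
    }
}
```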
- Run the application.
View Debugging Results
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/D:/mavenlocal/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/mavenlocal/org/slf4j/slf4j-log4j12/1.7.30/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
---- Begin executing sql: CREATE TABLE IF NOT EXISTS CHILD (NAME STRING, AGE INT) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' ----
Result
---- Done executing sql: CREATE TABLE IF NOT EXISTS CHILD (NAME STRING, AGE INT) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' ----
---- Begin executing sql: LOAD DATA INPATH 'hdfs:/data/data' INTO TABLE CHILD ----
Result
---- Done executing sql: LOAD DATA INPATH 'hdfs:/data/data' INTO TABLE CHILD ----
---- Begin executing sql: SELECT * FROM child ----
NAME AGE
Miranda 32
Karlie 23
Candice 27
---- Done executing sql: SELECT * FROM child ----
---- Begin executing sql: DROP TABLE child ----
Result
---- Done executing sql: DROP TABLE child ----
Process finished with exit code 0