Updated on 2024-08-10 GMT+08:00

How Do I Submit the Spark Application Using Java Commands?

Question

How do I use Java commands to submit Spark applications in addition to the spark-submit command?

Answer

Use the org.apache.spark.launcher.SparkLauncher class and run the java command to submit the Spark application. The procedure is as follows:

  1. Define the org.apache.spark.launcher.SparkLauncher class. The SparkLauncherJavaExample and SparkLauncherScalaExample classes are provided as sample code by default. You can modify the input parameters of the sample code as required.

    • If you use Java as the development language, you can edit the SparkLauncher class by referring to the following code:
          public static void main(String[] args) throws Exception {
              System.out.println("com.huawei.bigdata.spark.examples.SparkLauncherExample <mode> <jarPath> <app_main_class> <appArgs>");
              SparkLauncher launcher = new SparkLauncher();
              launcher.setMaster(args[0])
                  .setAppResource(args[1]) // Specify user app jar path
                  .setMainClass(args[2]);
              if (args.length > 3) {
                  String[] list = new String[args.length - 3];
                  for (int i = 3; i < args.length; i++) {
                      list[i-3] = args[i];
                  }
                  // Set app args
                  launcher.addAppArgs(list);
              }
      
              // Launch the app
              Process process = launcher.launch();
              // Get Spark driver log
              new Thread(new ISRRunnable(process.getErrorStream())).start();
              int exitCode = process.waitFor();
              System.out.println("Finished! Exit code is " + exitCode);
          }
    • If you use Scala as the development language, you can edit the SparkLauncher class by referring to the following code:
        def main(args: Array[String]) {
          println(s"com.huawei.bigdata.spark.examples.SparkLauncherExample <mode> <jarPath> <app_main_class> <appArgs>")
          val launcher = new SparkLauncher()
          launcher.setMaster(args(0))
            .setAppResource(args(1)) // Specify user app jar path
            .setMainClass(args(2))
          if (args.drop(3).nonEmpty) {
            // Set app args
            launcher.addAppArgs(args.drop(3): _*)
          }

          // Launch the app
          val process = launcher.launch()
          // Get Spark driver log
          new Thread(new ISRRunnable(process.getErrorStream)).start()
          val exitCode = process.waitFor()
          println(s"Finished! Exit code is $exitCode")
        }
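Both samples pass the launched driver's error stream to an ISRRunnable helper, whose implementation is not shown here. The following is a minimal sketch of such a helper (this implementation is an assumption based on how it is used above): it drains the stream line by line so that the child process never blocks on a full pipe buffer.

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

// Hypothetical sketch of the ISRRunnable helper referenced in the samples:
// it reads an InputStream (the launched driver's stderr) line by line and
// forwards each line to stdout.
public class ISRRunnable implements Runnable {
    private final InputStream stream;

    public ISRRunnable(InputStream stream) {
        this.stream = stream;
    }

    @Override
    public void run() {
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(stream, StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line); // forward each driver log line
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    // Small demo: drain an in-memory stream instead of a real process stream.
    public static void main(String[] args) throws Exception {
        InputStream demo = new ByteArrayInputStream(
                "driver log line\n".getBytes(StandardCharsets.UTF_8));
        Thread t = new Thread(new ISRRunnable(demo));
        t.start();
        t.join();
    }
}
```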

  2. Develop the Spark application based on the service logic, and configure constants such as the main class of the user-compiled Spark application. Prepare the application code and related configurations. For details about different scenarios, see Developing Spark Applications.
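As an illustration of this step, a minimal user application might look as follows. The class name matches the FemaleInfoCollection example used in the submission command below, but the body here is an illustrative line count, not the actual sample's service logic; the real application implements its own processing.

```java
package com.huawei.bigdata.spark.examples;

import org.apache.spark.sql.SparkSession;

// Illustrative stand-in for the user application's main class. The real
// FemaleInfoCollection sample implements its own service logic.
public class FemaleInfoCollection {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("FemaleInfoCollection")
                .getOrCreate();
        // Example service logic: count the lines in the input path that
        // SparkLauncher passes through as <inputPath>.
        long count = spark.read().textFile(args[0]).count();
        System.out.println("Input line count: " + count);
        spark.stop();
    }
}
```

Running this class requires a Spark runtime on the classpath, so it is compiled into the user application JAR file and submitted through SparkLauncher rather than run directly.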
  3. Call the org.apache.spark.launcher.SparkLauncher.launch() function to submit user applications.

    1. Generate JAR files from the SparkLauncher program and the user application, and upload the JAR files to the Spark node where the application runs. For details about how to generate the JAR files, see Commissioning a Spark Application in a Linux Environment.
      • The compilation dependency package of the SparkLauncher program is spark-launcher_2.12-3.1.1-hw-ei-311001-SNAPSHOT.jar. You can find it in the jars directory of the FusionInsight_Spark2x_8.1.0.1.tar.gz package, which is located in the Software folder of the software release package.
      • The compilation dependency packages of user applications vary with the code. You need to load the dependency package based on the compiled code.
    2. Upload the dependency JAR files of the application to a directory, for example, $SPARK_HOME/jars, on the node where the application will run.

      Upload the dependency packages of the SparkLauncher class and the application to the jars directory on the client. The dependency packages of the sample code already exist in the jars directory on the client.

      To use the SparkLauncher class, the Spark client must be installed and running properly on the node where the application runs. In addition, the environment variables, runtime dependency packages, and configuration files required by the application must be prepared.

    3. On the node where the Spark application runs, run the following command to submit the application using SparkLauncher. You can then check the running status on the Spark web UI and check the result by obtaining the specified output files.

      java -cp $SPARK_HOME/conf:$SPARK_HOME/jars/*:SparkLauncherExample.jar com.huawei.bigdata.spark.examples.SparkLauncherExample yarn-client /opt/female/FemaleInfoCollection.jar com.huawei.bigdata.spark.examples.FemaleInfoCollection <inputPath>
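Besides launch(), the SparkLauncher class also provides startApplication(), which returns a SparkAppHandle for monitoring the application state in process instead of reading the child process's streams. The following is a sketch of this alternative; the master, JAR path, and main class below mirror the command above and are placeholders to adapt to your environment.

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

// Sketch: submit via startApplication() and watch state transitions
// through a SparkAppHandle.Listener until the application finishes.
public class SparkLauncherHandleExample {
    public static void main(String[] args) throws Exception {
        SparkAppHandle handle = new SparkLauncher()
                .setMaster("yarn-client")
                .setAppResource("/opt/female/FemaleInfoCollection.jar")
                .setMainClass("com.huawei.bigdata.spark.examples.FemaleInfoCollection")
                .startApplication(new SparkAppHandle.Listener() {
                    @Override
                    public void stateChanged(SparkAppHandle h) {
                        System.out.println("State: " + h.getState());
                    }

                    @Override
                    public void infoChanged(SparkAppHandle h) {
                        // Called when, for example, the application ID is known.
                    }
                });
        // Block until the application reaches a final state.
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000);
        }
    }
}
```

This variant avoids the separate log-draining thread, because no child process streams need to be consumed by the caller; it requires the Spark client and its jars on the classpath, just like the launch() approach.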