Updated on 2024-08-10 GMT+08:00

Spark Access Configuration on Windows Using EIPs

Scenario

This section describes how to bind Elastic IP addresses (EIPs) to cluster nodes and configure the Spark configuration files so that the sample project can be compiled and run locally.

This section uses SparkScalaExample as an example.

Procedure

  1. Apply for an EIP for each node in the cluster, and add the public IP addresses and corresponding host names of all nodes to the local Windows hosts file. (If a host name contains uppercase letters, change them to lowercase.)

    1. On the VPC console, apply for EIPs (one for each node in the cluster). Then click the name of each node in the MRS cluster and bind an EIP to it on the EIPs page.

      For details, see Virtual Private Cloud > User Guide > EIP > Assigning an EIP and Binding It to an ECS.

    2. Record the mapping between the public IP addresses and private IP addresses. Change the private IP addresses in the hosts file to the corresponding public IP addresses.
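      For example, after binding the EIPs, the Windows hosts file (typically C:\Windows\System32\drivers\etc\hosts) would contain one public-IP-to-host-name entry per node. The IP addresses and host names below are hypothetical placeholders; use the values recorded from your own cluster.

      ```text
      # Hypothetical mappings: <EIP of node>  <lowercase host name of node>
      100.85.123.10  node-master1-example
      100.85.123.11  node-core-example01
      100.85.123.12  node-core-example02
      ```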

  2. Change the IP addresses in the krb5.conf file to the corresponding host names.
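    For instance, where the downloaded krb5.conf references a node by its private IP address, replace the IP address with the host name you added to the hosts file. The realm name, host name, and port below are hypothetical placeholders; keep the values from your own krb5.conf.

    ```text
    [realms]
      HADOOP.COM = {
        # Before: kdc = 192.168.0.10:21732
        kdc = node-master1-example:21732
      }
    ```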
  3. Configure security group rules for the cluster.

    1. On the Dashboard page, choose Add Security Group Rule > Manage Security Group Rule.

    2. On the Inbound Rules tab page, click Add Rule. In the Add Inbound Rule dialog box, configure the Windows IP address and ports 21730 (TCP), 21731 (TCP/UDP), and 21732 (TCP/UDP).

  4. On Manager, choose Cluster > Services > HDFS > More > Download Client, and copy the core-site.xml and hdfs-site.xml files on the client to the conf directory of the sample project.

    Add the following content to the hdfs-site.xml file:
    <property>
            <name>dfs.client.use.datanode.hostname</name>
            <value>true</value>
    </property>

    Add the following content to the pom.xml file:

    <dependency>
         <groupId>com.huawei.mrs</groupId>
         <artifactId>hadoop-plugins</artifactId>
         <version>Component package version-302002</version>
    </dependency>

  5. Before running the sample code, add .master("local").config("spark.driver.host", "localhost") to the SparkSession builder to set the local running mode for Spark. Change PRNCIPAL_NAME in the sample code to the username used for security authentication.
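
    A minimal sketch of the SparkSession change in step 5 is shown below. The object and application names are placeholders, not taken from the actual sample project; only the .master and .config calls are the required additions.

    ```scala
    import org.apache.spark.sql.SparkSession

    object SparkScalaExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession
          .builder()
          .appName("SparkScalaExample")
          .master("local")                          // run Spark in local mode on the Windows host
          .config("spark.driver.host", "localhost") // bind the driver to localhost
          .getOrCreate()

        // ... sample project logic ...

        spark.stop()
      }
    }
    ```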