How Do I Access Spark in a Cluster in Normal Mode on Windows Using EIPs?
Scenario
This section describes how to bind Elastic IP addresses (EIPs) to cluster nodes and configure Spark files so that the sample project can be compiled and run locally on Windows. SparkScalaExample is used as an example.
Procedure
- Apply for an EIP for each node in the cluster and add the public IP addresses and corresponding host names of all nodes to the local Windows hosts file, as shown in the example below. (If a host name contains uppercase letters, change them to lowercase.)
  - On the VPC console, apply for EIPs (the number of EIPs should equal the number of nodes in the cluster). Then click the name of each node in the MRS cluster and bind one EIP to it on the EIPs page.
  - Record the mapping between the public IP addresses and the private IP addresses, and replace the private IP addresses in the hosts file with the corresponding public IP addresses.
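  For reference, the finished hosts file might contain entries like the following. The EIPs and host names here are placeholders for illustration only; use the values recorded for your own cluster, with host names in lowercase.
    # Example entries in C:\Windows\System32\drivers\etc\hosts (hypothetical values)
    100.85.123.11  node-master1abcd
    100.85.123.12  node-ana-coreabcd0001
    100.85.123.13  node-ana-coreabcd0002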
- Configure security group rules for the cluster.
  - On the Dashboard page, choose Add Security Group Rule > Manage Security Group Rule.
  - On the Inbound Rules tab page, click Add Rule. In the Add Inbound Rule dialog box, configure the Windows IP addresses and ports 8020 and 9866.
- On Manager, choose Cluster > Services > HDFS > More > Download Client, and copy the core-site.xml and hdfs-site.xml files on the client to the conf directory of the sample project.
  Add the following content to the hdfs-site.xml file:
    <property>
      <name>dfs.client.use.datanode.hostname</name>
      <value>true</value>
    </property>
  Add the following content to the pom.xml file:
    <dependency>
      <groupId>com.huawei.mrs</groupId>
      <artifactId>hadoop-plugins</artifactId>
      <version>Component package version-302002</version>
    </dependency>
- Before running the sample code, add .master("local").config("spark.driver.host", "localhost") to the SparkSession builder to set the local running mode for Spark, as shown in the sketch below.
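  The following is a minimal sketch of how this change might look in SparkScalaExample. The application name and the row-count check are assumptions added for illustration; the rest of your sample logic stays unchanged.
    import org.apache.spark.sql.SparkSession

    object SparkScalaExample {
      def main(args: Array[String]): Unit = {
        // Run the driver locally on the Windows host. Local mode plus a localhost
        // driver host lets the sample reach the cluster through the EIPs and
        // hosts entries configured in the previous steps.
        val spark = SparkSession
          .builder()
          .appName("SparkScalaExample")              // hypothetical application name
          .master("local")                           // local running mode
          .config("spark.driver.host", "localhost")  // bind the driver to localhost
          .getOrCreate()

        // Hypothetical sanity check: create a small range and count it.
        println("Row count: " + spark.range(0, 10).count())

        spark.stop()
      }
    }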