
Fail to connect to master with Spark on Google Compute Engine

Problem description:

I am trying out a Hadoop/Spark cluster on Google Compute Engine through the "Launch click-to-deploy software" feature.

I have created 1 master and 2 slave nodes, and I can launch spark-shell on the cluster, but when I try to launch spark-shell from my own computer, it fails.

I run:

./bin/spark-shell --master spark://IP or Hostname:7077

And I get this stack trace:

15/04/09 10:58:06 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@IP or Hostname:7077/user/Master...
15/04/09 10:58:06 WARN AppClient$ClientActor: Could not connect to akka.tcp://sparkMaster@IP or Hostname:7077: akka.remote.InvalidAssociation: Invalid address: akka.tcp://sparkMaster@IP or Hostname:7077
15/04/09 10:58:06 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkMaster@IP or Hostname:7077]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: IP or Hostname: unknown error

Please let me know how to overcome this problem.

Answer:

See the comment from Daniel Darabos. By default, all incoming connections are blocked except for SSH, RDP and ICMP. To be able to connect from the Internet to the Hadoop master instance, you must first open port 7077 for the 'hadoop-master' tag in your project:

gcloud compute --project PROJECT firewall-rules create allow-spark \
    --allow TCP:7077 \
    --target-tags hadoop-master

See Firewalls, Adding a firewall, and gcloud compute firewall-rules create in the GCE public documentation for further details and all the options.
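If you prefer not to expose port 7077 to the whole Internet, the same rule can be restricted to your own machine with the `--source-ranges` flag. This is a sketch, not a definitive setup: the rule name `allow-spark` and tag `hadoop-master` match the command above, while `203.0.113.5` is a placeholder documentation address you would replace with your machine's public IP.

```shell
# Allow Spark master traffic (TCP 7077) only from one client IP.
# 203.0.113.5 is a placeholder (documentation range); substitute your own public IP.
gcloud compute --project PROJECT firewall-rules create allow-spark \
    --allow TCP:7077 \
    --target-tags hadoop-master \
    --source-ranges 203.0.113.5/32

# Verify the rule was created:
gcloud compute --project PROJECT firewall-rules describe allow-spark
```

Once the rule is in place, running `./bin/spark-shell --master spark://IP or Hostname:7077` from that machine should be able to reach the master.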
