failed to launch: nice -n 0 $SPARK_HOME/bin/spark-class org.apache.spark.deploy.worker.Worker
The error output is as follows:

$ startspark
org.apache.spark.deploy.master.Master running as process 14542. Stop it first.
Desktop: org.apache.spark.deploy.worker.Worker running as process 14710. Stop it first.
Laptop: starting org.apache.spark.deploy.worker.Worker, logging to /home/appleyuchi/bigdata/spark-2.3.1-bin-hadoop2.7/logs/spark-appleyuchi-org.apache.spark.deploy.worker.Worker-1-Laptop.out
Laptop: failed to launch: nice -n 0 /home/appleyuchi/bigdata/spark-2.3.1-bin-hadoop2.7/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://Desktop:7077
Laptop:   at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
Laptop:   at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
Laptop:   at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
Laptop:   at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
Laptop:   at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
Laptop:   at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
Laptop:   at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
Laptop:   at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
Laptop:   at java.lang.Thread.run(Thread.java:748)
Laptop: 20/05/05 22:57:51 INFO util.ShutdownHookManager: Shutdown hook called
Laptop: full log in /home/appleyuchi/bigdata/spark-2.3.1-bin-hadoop2.7/logs/spark-appleyuchi-org.apache.spark.deploy.worker.Worker-1-Laptop.out
org.apache.spark.deploy.history.HistoryServer running as process 14800. Stop it first.

Cluster environment:
A desktop machine (hostname: Desktop)
A laptop (hostname: Laptop)
$SPARK_HOME/conf/spark-env.sh
The settings in /etc/hosts are:

192.168.0.103 Desktop
192.168.0.102 Laptop

There are two solutions:
① Set SPARK_LOCAL_IP in $SPARK_HOME/conf/spark-env.sh to 0.0.0.0 on every node, master and slaves alike.
②
On the desktop, set SPARK_LOCAL_IP=Desktop in $SPARK_HOME/conf/spark-env.sh
On the laptop, set SPARK_LOCAL_IP=Laptop in $SPARK_HOME/conf/spark-env.sh
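As a sketch, the relevant lines in each node's spark-env.sh would look like this (hostnames and paths are from this cluster; adjust them to your own):

```shell
# $SPARK_HOME/conf/spark-env.sh on the Desktop (master):
SPARK_LOCAL_IP=Desktop

# $SPARK_HOME/conf/spark-env.sh on the Laptop (slave):
SPARK_LOCAL_IP=Laptop

# Solution ① would instead use the same value on every node:
# SPARK_LOCAL_IP=0.0.0.0
```

With either solution in place, the Worker's netty bind on the slave succeeds because it no longer tries to bind an address that belongs to another machine.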
Cause of the failure:
It is common to copy the master's configuration to the slaves in bulk and then forget to update SPARK_LOCAL_IP in each slave's $SPARK_HOME/conf/spark-env.sh.
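This kind of copied-config mistake can be caught with a quick sanity check. The helper below is a hypothetical sketch (not part of Spark) that compares the configured SPARK_LOCAL_IP against the node's actual hostname:

```shell
# Hypothetical helper: flag a SPARK_LOCAL_IP value copied from another node.
# Usage: check_local_ip CONFIGURED_VALUE ACTUAL_HOSTNAME
check_local_ip() {
  # 0.0.0.0 (solution ①) is valid everywhere; otherwise the value
  # must match this node's own hostname (solution ②).
  if [ "$1" = "0.0.0.0" ] || [ "$1" = "$2" ]; then
    echo "OK"
  else
    echo "MISMATCH: SPARK_LOCAL_IP=$1 but this host is $2"
  fi
}

# On the Laptop, with a spark-env.sh copied verbatim from the Desktop:
check_local_ip "Desktop" "Laptop"   # prints a MISMATCH warning
check_local_ip "Laptop" "Laptop"    # prints OK
```

On a real node you could feed it the live values, e.g. `check_local_ip "$(grep -oP '^SPARK_LOCAL_IP=\K.*' "$SPARK_HOME/conf/spark-env.sh")" "$(hostname)"`.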
Summary
The above is the full content on the error "failed to launch: nice -n 0 $SPARK_HOME/bin/spark-class org.apache.spark.deploy.worker.Worker"; hopefully it helps you solve the problem you ran into.