

failed to launch: nice -n 0 $SPARK_HOME/bin/spark-class org.apache.spark.deploy.worker.Worker

Published: 2023/12/31 · by 豆豆

The error output:

$ startspark
org.apache.spark.deploy.master.Master running as process 14542. Stop it first.
Desktop: org.apache.spark.deploy.worker.Worker running as process 14710. Stop it first.
Laptop: starting org.apache.spark.deploy.worker.Worker, logging to /home/appleyuchi/bigdata/spark-2.3.1-bin-hadoop2.7/logs/spark-appleyuchi-org.apache.spark.deploy.worker.Worker-1-Laptop.out
Laptop: failed to launch: nice -n 0 /home/appleyuchi/bigdata/spark-2.3.1-bin-hadoop2.7/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://Desktop:7077
Laptop:   at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
Laptop:   at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
Laptop:   at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
Laptop:   at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
Laptop:   at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
Laptop:   at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
Laptop:   at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
Laptop:   at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
Laptop:   at java.lang.Thread.run(Thread.java:748)
Laptop: 20/05/05 22:57:51 INFO util.ShutdownHookManager: Shutdown hook called
Laptop: full log in /home/appleyuchi/bigdata/spark-2.3.1-bin-hadoop2.7/logs/spark-appleyuchi-org.apache.spark.deploy.worker.Worker-1-Laptop.out
org.apache.spark.deploy.history.HistoryServer running as process 14800. Stop it first.
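Two things stand out in this output: the "running as process NNN. Stop it first." lines mean daemons from an earlier start are still up, and the launcher only echoes the tail of the stack trace, so the full bind error lives in the worker log it names. A cleanup sketch before retrying, assuming a standard Spark 2.x layout (stop-all.sh is the stock sbin script; the guard is only there in case SPARK_HOME is unset in the current shell):

```shell
# Stop any stale Spark daemons before retrying.
if [ -n "${SPARK_HOME:-}" ] && [ -x "$SPARK_HOME/sbin/stop-all.sh" ]; then
  "$SPARK_HOME/sbin/stop-all.sh"
fi

# Check whether any Worker survived; pgrep -f matches the full JVM command line.
pgrep -f 'org.apache.spark.deploy.worker.Worker' || echo "no Worker running"
```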

Cluster environment:

Desktop machine: Desktop (192.168.0.103)
Laptop machine: Laptop (192.168.0.102)

Each node has its own $SPARK_HOME/conf/spark-env.sh. The /etc/hosts entries on both machines are:

192.168.0.103 Desktop
192.168.0.102 Laptop
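Before editing spark-env.sh it is worth confirming what each name actually resolves to on each machine; a stray 127.0.1.1 entry for the local hostname (common on Debian/Ubuntu) can produce the same bind failure. A small check, where `check_host` is a hypothetical helper name:

```shell
# Print the first address a name resolves to, via the same NSS lookup
# path (/etc/hosts first) that the JVM will use.
check_host() {
  getent hosts "$1" | awk '{print $1; exit}'
}

check_host Desktop   # should print 192.168.0.103 on both machines
check_host Laptop    # should print 192.168.0.102 on both machines
```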

Two ways to fix it:

1. In $SPARK_HOME/conf/spark-env.sh on every node, master and slaves alike, set SPARK_LOCAL_IP to 0.0.0.0.
2. Set a per-node value: SPARK_LOCAL_IP=Desktop in the desktop's $SPARK_HOME/conf/spark-env.sh, and SPARK_LOCAL_IP=Laptop in the laptop's.
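The two fixes correspond to these spark-env.sh lines (a sketch; the hostnames are the ones from this cluster):

```shell
# $SPARK_HOME/conf/spark-env.sh

# Fix 1: identical line on every node -- bind all interfaces.
SPARK_LOCAL_IP=0.0.0.0

# Fix 2: a per-node value instead. On the desktop:
#   SPARK_LOCAL_IP=Desktop
# and on the laptop:
#   SPARK_LOCAL_IP=Laptop
```

Fix 1 avoids per-host edits at the cost of listening on all interfaces; Fix 2 keeps the daemon bound to one address but requires editing each node's copy.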


Root cause:

The master's configuration gets copied wholesale to the slaves, and SPARK_LOCAL_IP in the slave's $SPARK_HOME/conf/spark-env.sh is never updated, so the worker tries to bind an address that is not local to it.
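If the habit is to sync one spark-env.sh to every node, one way to keep the file identical and still get a per-host bind address is to derive it at startup. A sketch, assuming each node's hostname resolves to its own LAN IP (as in the /etc/hosts entries above):

```shell
# In spark-env.sh: bind to whatever this node's hostname resolves to,
# so the same file works unchanged on Desktop and Laptop alike.
SPARK_LOCAL_IP="$(hostname)"
export SPARK_LOCAL_IP
echo "SPARK_LOCAL_IP=$SPARK_LOCAL_IP"
```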

