
failed to launch: nice -n 0 $SPARK_HOME/bin/spark-class org.apache.spark.deploy.worker.Worker


The error is as follows:

$ startspark
org.apache.spark.deploy.master.Master running as process 14542. Stop it first.
Desktop: org.apache.spark.deploy.worker.Worker running as process 14710. Stop it first.
Laptop: starting org.apache.spark.deploy.worker.Worker, logging to /home/appleyuchi/bigdata/spark-2.3.1-bin-hadoop2.7/logs/spark-appleyuchi-org.apache.spark.deploy.worker.Worker-1-Laptop.out
Laptop: failed to launch: nice -n 0 /home/appleyuchi/bigdata/spark-2.3.1-bin-hadoop2.7/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://Desktop:7077
Laptop:   at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
Laptop:   at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
Laptop:   at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
Laptop:   at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
Laptop:   at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
Laptop:   at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
Laptop:   at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
Laptop:   at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
Laptop:   at java.lang.Thread.run(Thread.java:748)
Laptop: 20/05/05 22:57:51 INFO util.ShutdownHookManager: Shutdown hook called
Laptop: full log in /home/appleyuchi/bigdata/spark-2.3.1-bin-hadoop2.7/logs/spark-appleyuchi-org.apache.spark.deploy.worker.Worker-1-Laptop.out
org.apache.spark.deploy.history.HistoryServer running as process 14800. Stop it first.
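Note the "running as process ... Stop it first." lines: the master, one worker, and the history server were still running from an earlier attempt. Before retesting any fix, stop everything cleanly first. A minimal sketch using the standard Spark sbin scripts (assuming the history server was started with its companion start script):

$SPARK_HOME/sbin/stop-all.sh
$SPARK_HOME/sbin/stop-history-server.sh
# apply the configuration fix described below, then:
$SPARK_HOME/sbin/start-all.sh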

Cluster environment:

Desktop machine (hostname Desktop)

Laptop machine (hostname Laptop)

The relevant configuration file is $SPARK_HOME/conf/spark-env.sh.

The settings in /etc/hosts are:

192.168.0.103  Desktop
192.168.0.102  Laptop

There are two ways to fix it (sketched in the snippet after this list):

Set SPARK_LOCAL_IP in $SPARK_HOME/conf/spark-env.sh to 0.0.0.0 on every node, master and slaves alike.

Or set SPARK_LOCAL_IP=Desktop in the desktop's $SPARK_HOME/conf/spark-env.sh, and SPARK_LOCAL_IP=Laptop in the laptop's.
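Here is what the two fixes look like in spark-env.sh (a sketch; the export form is the usual spark-env.sh convention, and the hostnames are the ones mapped in /etc/hosts above):

# Option 1 -- the same line on every node:
export SPARK_LOCAL_IP=0.0.0.0     # bind on all interfaces

# Option 2 -- one line per host:
export SPARK_LOCAL_IP=Desktop     # only in the desktop's spark-env.sh
export SPARK_LOCAL_IP=Laptop      # only in the laptop's spark-env.sh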


Cause of the failure:

The master's configuration gets copied wholesale to the slaves, and then people forget to change SPARK_LOCAL_IP in the slave's $SPARK_HOME/conf/spark-env.sh (a quick check is sketched below).
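To confirm this is the problem, compare the configured value on each machine (run on both hosts):

grep SPARK_LOCAL_IP $SPARK_HOME/conf/spark-env.sh
# On the laptop, a stale copy of the master's config would still show
# SPARK_LOCAL_IP=Desktop -- an address the laptop cannot bind, which is
# exactly the Netty bind failure in the log above.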

