PySpark Python Version: Changing the Python Version Used by PySpark
Ubuntu's built-in Python is version 2.7; we want to switch PySpark's default to Anaconda Python 3.6.
You can specify the version of Python for the driver by setting the appropriate environment variables in the ./conf/spark-env.sh file. If it doesn't already exist, you can use the spark-env.sh.template file provided, which also includes lots of other variables.
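A minimal sketch of creating the file from the template, assuming $SPARK_HOME points at your Spark installation directory:

# Copy the provided template to create a live spark-env.sh
cd $SPARK_HOME/conf
cp spark-env.sh.template spark-env.sh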
Here is a simple example of a spark-env.sh file that sets the relevant Python environment variables:
#!/usr/bin/env bash
# This file is sourced when running various Spark programs.
export PYSPARK_PYTHON=/usr/bin/python3
export PYSPARK_DRIVER_PYTHON=/usr/bin/ipython
In this case it sets the version of Python used by the workers/executors to Python 3, and the driver's Python to IPython for a nicer shell to work in.
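Note that these two variables can also be set per-session on the command line, overriding whatever spark-env.sh says; a sketch for a one-off run, assuming $SPARK_HOME is set and the same paths as above:

# One-off override: executors use python3, the driver shell uses ipython
PYSPARK_PYTHON=/usr/bin/python3 PYSPARK_DRIVER_PYTHON=/usr/bin/ipython \
    $SPARK_HOME/bin/pyspark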
In other words: in the Spark folder, rename ./conf/spark-env.sh.template to spark-env.sh, then add the following:
# This file is sourced when running various Spark programs.
export PYSPARK_PYTHON=/usr/bin/python3
export PYSPARK_DRIVER_PYTHON=/usr/bin/ipython
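Since the goal here is Anaconda Python 3.6 rather than the system python3, point both variables at the Anaconda interpreter instead; a sketch assuming a default single-user Anaconda install under $HOME/anaconda3 (adjust the path to match your actual install):

# Assumed Anaconda install location; change to match your machine
export PYSPARK_PYTHON=$HOME/anaconda3/bin/python
export PYSPARK_DRIVER_PYTHON=$HOME/anaconda3/bin/ipython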
Then restart Spark for the change to take effect.
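For a standalone cluster, the stop/start scripts shipped in Spark's sbin directory handle the restart; for YARN or other cluster managers the procedure differs. After restarting, you can confirm the versions from the PySpark shell:

# Restart a standalone cluster (assumes $SPARK_HOME is set)
$SPARK_HOME/sbin/stop-all.sh
$SPARK_HOME/sbin/start-all.sh

# Then open the shell and check interactively, e.g.:
#   >>> import sys; sys.version   # driver Python
#   >>> sc.pythonVer              # Python version PySpark is using
$SPARK_HOME/bin/pyspark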