Tears of joy! A whole day to finally fix this bug: Pyspark: Exception: Java gateway process exited before sending the driver its port number
I was following 使用 Docker 快速部署 Spark + Hadoop 大数据集群 - 知乎 (a Zhihu article on quickly deploying a Spark + Hadoop big data cluster with Docker).
Bug:
I used the compose file from that article to bring up the cluster, but hit a fatal problem: as the default non-root user I could not pip install any packages, including jupyter. Entering the container as root with docker exec -u 0 -it let me install them, but then PySpark would not start against the cluster and threw:
Pyspark: Exception: Java gateway process exited before sending the driver its port number
The fix: add user: root to each service in the compose file, so the containers run as root (instead of the bitnami image's default non-root user) and both the pip installs and PySpark run under the same user.
The complete docker-compose.yml:
version: '2'
services:
  spark:
    image: docker.io/bitnami/spark:3
    hostname: master
    user: root
    environment:
      - SPARK_MODE=master
      - SPARK_RPC_AUTHENTICATION_ENABLED=no
      - SPARK_RPC_ENCRYPTION_ENABLED=no
      - SPARK_LOCAL_STORAGE_ENCRYPTION_ENABLED=no
      - SPARK_SSL_ENABLED=no
    volumes:
      - ./:/opt/share
    ports:
      - '8080:8080'
      - '4040:4040'
  spark-worker-1:
    image: docker.io/bitnami/spark:3
    hostname: worker1
    user: root
    environment:
      - SPARK_MODE=worker
      - SPARK_MASTER_URL=spark://master:7077
      - SPARK_WORKER_MEMORY=1G
      - SPARK_WORKER_CORES=1
      - SPARK_RPC_AUTHENTICATION_ENABLED=no
      - SPARK_RPC_ENCRYPTION_ENABLED=no
      - SPARK_LOCAL_STORAGE_ENCRYPTION_ENABLED=no
      - SPARK_SSL_ENABLED=no
    volumes:
      - ./:/opt/share
    ports:
      - '8081:8081'
  spark-worker-2:
    image: docker.io/bitnami/spark:3
    hostname: worker2
    user: root
    environment:
      - SPARK_MODE=worker
      - SPARK_MASTER_URL=spark://master:7077
      - SPARK_WORKER_MEMORY=1G
      - SPARK_WORKER_CORES=1
      - SPARK_RPC_AUTHENTICATION_ENABLED=no
      - SPARK_RPC_ENCRYPTION_ENABLED=no
      - SPARK_LOCAL_STORAGE_ENCRYPTION_ENABLED=no
      - SPARK_SSL_ENABLED=no
    volumes:
      - ./:/opt/share
    ports:
      - '8082:8081'
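After docker compose up -d, a quick way to confirm that the Java gateway error is gone is to open a PySpark session against the master from inside the master container. Below is a minimal smoke-test sketch: the master URL spark://master:7077 comes from the compose file above, while the script name, container name placeholder, and the test itself are just illustrative assumptions, not from the original post.

# smoke_test.py -- run inside the master container, e.g.
#   docker exec -it <spark-master-container> python3 /opt/share/smoke_test.py
# (./ is mounted at /opt/share by the compose file above)
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("spark://master:7077")   # standalone master defined in docker-compose
    .appName("gateway-smoke-test")
    .getOrCreate()
)

# If the Java gateway had failed to start, getOrCreate() would already have
# raised "Java gateway process exited before sending the driver its port
# number"; reaching this point means the fix worked.
df = spark.range(100)
print("count from the cluster:", df.count())  # expect 100

spark.stop()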
This finally gave me the setup I wanted.
Reference: 使用 Docker 快速部署 Spark + Hadoop 大数据集群 - 知乎