SPARK_WORKER_INSTANCES for Spark 2.2.0
In Spark 2.2.0, I do not see the SPARK_WORKER_INSTANCES option for launching multiple workers per node. How do I do this?
If you look at the spark-env.sh file inside the conf directory of your Spark installation, you will find the SPARK_WORKER_INSTANCES option (it defaults to 1). You can change it to the number of worker instances you want.
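For example (a minimal sketch; the exact values are assumptions and should be sized to your machine), you could copy conf/spark-env.sh.template to conf/spark-env.sh and set:

    # conf/spark-env.sh
    # Number of standalone worker processes to launch on this node
    export SPARK_WORKER_INSTANCES=2
    # Optional: split the machine's resources so the workers do not oversubscribe
    export SPARK_WORKER_CORES=4       # cores per worker (assumes an 8-core machine)
    export SPARK_WORKER_MEMORY=8g     # memory per worker (assumes ~16 GB of RAM)

When running more than one worker per node, it is usually a good idea to also set SPARK_WORKER_CORES and SPARK_WORKER_MEMORY as shown above, so each worker only claims its share of the machine.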
Then, when Spark is started with sbin/start-all.sh, the defined number of worker instances should be started on the machine.
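For instance (assuming a standard standalone setup run from the Spark installation directory), you would restart the cluster and then verify the workers in the master's web UI:

    sbin/stop-all.sh
    sbin/start-all.sh
    # The master web UI at http://<master-host>:8080 should now list the extra workers.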