SPARK_WORKER_INSTANCES for Spark 2.2.0


In Spark 2.2.0, I do not see the SPARK_WORKER_INSTANCES option for launching multiple workers per node. How do I do this?

If you look at the spark-env.sh file inside the conf directory of your Spark folder, you will see the option SPARK_WORKER_INSTANCES=1. You can change it to the number of workers you want.
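
A minimal sketch of what that entry might look like, together with the companion settings that cap each worker; the values below (4 instances, 2 cores, 4g of memory per worker) are example numbers, not recommendations:

    # conf/spark-env.sh
    export SPARK_WORKER_INSTANCES=4   # start 4 worker processes on this node
    export SPARK_WORKER_CORES=2       # cores each worker is allowed to use
    export SPARK_WORKER_MEMORY=4g     # memory each worker is allowed to use

Setting SPARK_WORKER_CORES and SPARK_WORKER_MEMORY alongside SPARK_WORKER_INSTANCES keeps the workers from each trying to claim all of the machine's cores and memory.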

So when Spark is started with sbin/start-all.sh, the defined number of worker instances should be started on the machine.
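
For example, assuming the settings sketched above (spark-env.sh.template is the sample file shipped with Spark, and jps is the JDK's Java process lister):

    cp conf/spark-env.sh.template conf/spark-env.sh   # only if spark-env.sh does not exist yet
    sbin/start-all.sh
    jps   # should list one Master and four Worker processes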

