SPARK_WORKER_INSTANCES for Spark 2.2.0


In Spark 2.2.0 I don't see the SPARK_WORKER_INSTANCES option for launching multiple workers per node. How do I do this?

If you look at the spark-env.sh file inside the conf directory of your Spark folder, you will see the option SPARK_WORKER_INSTANCES=1. You can change it to whatever number you want.

So when Spark is started with sbin/start-all.sh, the defined number of worker instances will be started on that machine.
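As a rough sketch of what that configuration might look like (the exact core and memory values here are placeholders you should adjust for your own machine), conf/spark-env.sh could contain:

    # conf/spark-env.sh -- example sketch, values are illustrative
    # Number of worker processes to launch on each node (default: 1)
    export SPARK_WORKER_INSTANCES=2
    # When running more than one worker per node, it is usually a good idea
    # to also cap the cores and memory each worker may claim so the workers
    # do not oversubscribe the machine.
    export SPARK_WORKER_CORES=4
    export SPARK_WORKER_MEMORY=4g

After saving the file, restart the standalone cluster (for example sbin/stop-all.sh followed by sbin/start-all.sh) and the configured number of workers should appear per node.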

