SPARK_WORKER_INSTANCES for Spark 2.2.0


In Spark 2.2.0, I don't see the SPARK_WORKER_INSTANCES option for launching multiple workers per node. How do I do this?

If you look at the spark-env.sh file inside the conf directory of your Spark installation, you will see the option SPARK_WORKER_INSTANCES=1. You can change it to whatever number of workers you want per node.
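
As a minimal sketch, the relevant lines in conf/spark-env.sh might look like the following. The values 3, 4, and 8g are illustrative placeholders, not recommendations; SPARK_WORKER_CORES and SPARK_WORKER_MEMORY are standard standalone-mode settings worth setting alongside the instance count:

    # conf/spark-env.sh
    # Launch three worker processes on this machine instead of the default one
    export SPARK_WORKER_INSTANCES=3
    # When running several workers per node, cap the cores and memory each
    # worker may use so they do not oversubscribe the machine together
    export SPARK_WORKER_CORES=4
    export SPARK_WORKER_MEMORY=8g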

Then, when Spark is started with sbin/start-all.sh, the number of worker instances you defined should be started on the machine.
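
For completeness, a possible sequence to apply the change and confirm it worked (assuming a fresh install where spark-env.sh has not yet been created from its shipped template):

    # Create spark-env.sh from the template if it does not exist yet
    cp conf/spark-env.sh.template conf/spark-env.sh

    # Edit SPARK_WORKER_INSTANCES as shown above, then (re)start the cluster
    sbin/stop-all.sh
    sbin/start-all.sh

    # Each worker instance runs in its own JVM, so you should see one
    # "Worker" process per instance
    jps | grep Worker

You can also open the standalone master's web UI (port 8080 by default) to see every worker that has registered.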

