SPARK_WORKER_INSTANCES for Spark 2.2.0


In Spark 2.2.0, I don't see the SPARK_WORKER_INSTANCES option for launching multiple workers per node. How do I do this?

If you look at the spark-env.sh file inside the conf directory of your Spark installation, you will see the option SPARK_WORKER_INSTANCES=1. You can change it to whatever number of workers you want.
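As a minimal sketch (the variable names below come from the comments in conf/spark-env.sh.template; the specific values are only assumptions for illustration), the relevant lines in conf/spark-env.sh could look like this:

    # conf/spark-env.sh
    # Run 4 standalone worker processes on this node.
    export SPARK_WORKER_INSTANCES=4

    # When running multiple workers per node, it is usually wise to also
    # cap each worker's resources so they don't oversubscribe the machine:
    export SPARK_WORKER_CORES=2      # cores each worker may use
    export SPARK_WORKER_MEMORY=4g    # memory each worker may use

Note that the totals multiply: with the example values above, the node would offer 4 workers x 2 cores = 8 cores and 4 x 4g = 16g of memory to the cluster.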

Then, when Spark is started with sbin/start-all.sh, the defined number of worker instances should be started on the machine.
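For example (assuming you are starting from the shipped template; the copy step is only needed if conf/spark-env.sh does not exist yet):

    cp conf/spark-env.sh.template conf/spark-env.sh
    # edit conf/spark-env.sh as shown above, then:
    sbin/start-all.sh
    # each worker registers with the master and appears
    # in the master web UI (http://localhost:8080 by default)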

