[SYSTEMDS-3476] Spark with default settings
There was a bug in the /bin/systemds script that did not properly
set up the Spark execution variables with default values:
1. the executor memory was set to 16 instead of 16g,
2. the log4j variable was overwritten and used incorrectly.

Both bugs are fixed, but our /bin/systemds script could still use
some more cleanup.

Closes apache#1748
Baunsgaard committed Dec 9, 2022
1 parent be29ce8 commit 49e03ea
Showing 1 changed file with 3 additions and 1 deletion.
bin/systemds
@@ -107,6 +107,8 @@ else
   LOG4JPROP2=$(find "$LOG4JPROP")
   if [ -z "${LOG4JPROP2}" ]; then
     LOG4JPROP=""
+  elif [ -z "${SYSTEMDS_DISTRIBUTED_OPTS}" ]; then
+    LOG4JPROP=$LOG4JPROP
   else
     LOG4JPROP="-Dlog4j.configuration=file:$LOG4JPROP2"
   fi
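The log4j branch above can be sketched in isolation as follows. This is a minimal runnable sketch, not the actual bin/systemds logic: the `mktemp` file stands in for the real log4j.properties path, and local mode (empty SYSTEMDS_DISTRIBUTED_OPTS) is assumed for illustration.

```shell
#!/usr/bin/env bash
# Hedged sketch of the fixed log4j resolution branch. The temp file
# stands in for the real log4j.properties; bin/systemds derives it.
LOG4JPROP=$(mktemp)                  # pretend this is the log4j file
SYSTEMDS_DISTRIBUTED_OPTS=""         # pretend no distributed opts set
LOG4JPROP2=$(find "$LOG4JPROP" 2>/dev/null)
if [ -z "${LOG4JPROP2}" ]; then
  LOG4JPROP=""                                        # file missing: disable
elif [ -z "${SYSTEMDS_DISTRIBUTED_OPTS}" ]; then
  LOG4JPROP=$LOG4JPROP                                # keep the value as-is
else
  LOG4JPROP="-Dlog4j.configuration=file:$LOG4JPROP2"  # pass as a JVM flag
fi
echo "LOG4JPROP=$LOG4JPROP"
```

The added elif is what stops the variable from being unconditionally rewrapped into a `-Dlog4j.configuration=` flag when no distributed options are set.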
@@ -126,7 +128,7 @@ else
     --files $LOG4JPROP \
     --conf spark.network.timeout=512s \
     --num-executors 4 \
-    --executor-memory 64 \
+    --executor-memory 64g \
     --executor-cores 16 "
   fi


