Mvn install udf-compiler instead of pulling from remote when building all #4453
Conversation
… all Signed-off-by: Chong Gao <res_life@163.com>
The command below first looks for aggregator jars in the target directory; if none are found, it pulls them from the remote repository.
Note that for Databricks it always pulls from remote. Use this command to build after this fix:
// After Databricks jars are uploaded to the remote repository, please use this
For this fix, please see the comments inline.
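The local-first lookup described above can be sketched as a small shell function. This is a minimal illustration, not the actual build script: the jar name pattern and the `remote-pull` fallback marker are assumptions.

```shell
#!/usr/bin/env bash
# Sketch of the local-first resolution described above. The jar name
# pattern and the fallback marker are illustrative assumptions.
find_aggregator_jar() {
  local dir="$1"
  local jar
  jar=$(find "$dir" -name 'rapids-4-spark-aggregator*.jar' 2>/dev/null | head -n 1)
  if [[ -n "$jar" ]]; then
    # A locally built jar exists; use it instead of pulling from remote.
    echo "$jar"
  else
    # No local jar: a real build would fall back to the remote repository
    # (e.g. via mvn dependency:get); here we only report the fallback.
    echo "remote-pull"
  fi
}

dir=$(mktemp -d)
mkdir -p "$dir/target"
touch "$dir/target/rapids-4-spark-aggregator-1.0.jar"
find_aggregator_jar "$dir"   # prints the path of the locally built jar
rm -rf "$dir"
```

With no jar under the given directory the function prints `remote-pull` instead, which is where the real script would fall back to the remote repository.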
build
I don't know enough about how the parallel build is intended to work to be able to say if this is really a fix or not.
```shell
[[ "$LOG_FILE" != "/dev/tty" ]] && tail -20 "$LOG_FILE" || true
exit 255
}

echo "#### REDIRECTING mvn output to $LOG_FILE ####"
mvn -U "$MVN_PHASE" \
```
Should this command append to the log file instead of over-write it?
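The reviewer's point can be illustrated with plain shell redirection (a generic sketch, unrelated to the actual build script): `>` truncates the log on every phase, so only the last phase's output survives, while `>>` appends and keeps the whole history.

```shell
LOG_FILE=$(mktemp)

# Overwrite: each '>' truncates the file, so only the last write survives.
echo "phase-1 output" >  "$LOG_FILE"
echo "phase-2 output" >  "$LOG_FILE"
wc -l < "$LOG_FILE"     # 1

# Append: '>>' keeps the earlier output, so the log covers every phase.
echo "phase-1 output" >> "$LOG_FILE"
wc -l < "$LOG_FILE"     # 2

rm -f "$LOG_FILE"
```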
```shell
# If not, the first build-all will fail because the remote repository does not have the udf-compiler jar.
# It also should not pull from remote, to avoid inconsistent code across sub-modules.
# This will also compile and install the corresponding shim dependency.
mvn -U "install" \
```
I think there is a larger underlying issue here. It will probably be fixed by a separate build, but isn't the UDF compiler also being built against different versions of Spark? There is no distinction between those versions in the JAR name, so when we build in parallel like this, isn't it just a race between the builds to decide which version ends up in the jar?
Yes, the UDF compiler is built against different versions of Spark.
The jars are generated in different paths, such as target/spark3xx/, so there is no race.
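A minimal sketch of why separate output directories avoid the race. The directory names follow the `target/spark3xx/` pattern mentioned above, and the jar files here are just placeholders, not real build outputs:

```shell
out=$(mktemp -d)
# Each shim build writes into its own target/sparkXXX/ directory,
# so parallel builds never overwrite each other's jar.
for shim in spark311 spark320 spark330; do
  (
    mkdir -p "$out/target/$shim"
    touch "$out/target/$shim/udf-compiler.jar"
  ) &
done
wait
ls "$out/target" | wc -l    # 3 directories, one per shim
rm -rf "$out"
```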
This PR is invalid; see the comments below. I will close it.
Local build PASS: ./build/buildall -p=noSnapshots -P=8
The corresponding issue is closed.
This fixes #4442
Mvn install udf-compiler instead of pulling from remote when building all
Signed-off-by: Chong Gao res_life@163.com