[BUILD] cleanup unused jenkins files and scripts #1568
Comments
@tgravescs
2. Other Dockerfiles/Jenkinsfiles/scripts are necessary for the spark-rapids pre-merge and integration jobs. So after cleaning up the unused Dockerfiles in item 1, should we move the whole jenkins dir, or just the subdirs (jenkins/databricks) or individual files, out of the spark-rapids GitHub repo and into the internal GitLab repo?
Hi, I totally agree that putting all that stuff in two different places could confuse people, but the blossom and security teams require moving at least the Jenkins files to internal. And yes, we may keep the Jenkins files and scripts in the same place if possible.
The problem comes in the definition of "we put all CSP related scripts...". What does that include? Right now it includes the scripts for running tests, which I don't want split. If we can split and commonize them such that the scripts for running tests stay in spark-rapids while the other Jenkinsfiles and CSP setup scripts go private, I'm good with that.
Also, it seems we aren't running all of the tests here. I'm fixing the integration test jar to include the resources directory, but every test environment should be using run_pyspark_from_build.sh where it's appropriate.
Note that this should go along with #1640; ideally we commonize so the same script runs all the tests. See the comments in that issue.
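The "one common script" idea above can be sketched as a thin dispatcher: each CSP pipeline sets an environment variable and calls the same entry point, while CSP-specific setup lives elsewhere. This is a minimal illustration only; the variable name `CSP_ENV` and the environment list are assumptions, not the project's actual API.

```shell
#!/bin/bash
# Minimal sketch of a common test entry point shared by all CSP pipelines.
# CSP_ENV and the environment names below are illustrative assumptions.
run_tests() {
  case "${CSP_ENV:?set CSP_ENV (e.g. databricks, dataproc, emr)}" in
    databricks|dataproc|emr|onprem)
      # In the real repo this would invoke the shared runner, e.g.
      # integration_tests/run_pyspark_from_build.sh
      echo "running common suite for $CSP_ENV"
      ;;
    *)
      echo "unknown environment: $CSP_ENV" >&2
      return 1
      ;;
  esac
}
```

A pipeline then only differs in how it exports `CSP_ENV` and whatever setup it needs before calling `run_tests`; the test invocation itself stays identical everywhere.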
Fix issue : NVIDIA#1568 Move Databricks scripts to GitLab so we can use the common scripts for the nightly build and integration test jobs. Remove unused Dockerfiles. Signed-off-by: Tim Liu <timl@nvidia.com>
MR on GitLab: 123
NVIDIA#1568 Move Databricks scripts to GitLab so we can use the common scripts for the nightly build and integration test jobs. Remove unused Dockerfiles. Signed-off-by: Tim Liu <timl@nvidia.com>
* Cleanup unused Jenkins files and scripts #1568: move Databricks scripts to GitLab so we can use the common scripts for the nightly build and integration test jobs; remove unused Dockerfiles. Signed-off-by: Tim Liu <timl@nvidia.com>
* rm Dockerfile.integration.ubuntu16
* Restore Databricks nightly scripts. Signed-off-by: Tim Liu <timl@nvidia.com>
Todo
@NvTimLiu are you still working on commonizing some of the scripts for blossom?
@tgravescs I've done all the common scripts for all the cloud environments. @jlowe Do we need to add a tag similar to the
Yes, all of the CI pipelines should be specifying something for |
Closed since PR 2059 was merged.
…p ci] [bot] (NVIDIA#1568) * Update submodule cudf (a series of automated bumps) Signed-off-by: spark-rapids automation <70000568+nvauto@users.noreply.github.com>
Is your feature request related to a problem? Please describe.
Directory:
https://github.com/NVIDIA/spark-rapids/tree/branch-0.4/jenkins
has Jenkinsfiles, Dockerfiles, and scripts. I believe many of those are no longer used (like the Databricks ones). We should remove any files from here that aren't used.
I'm not sure if one of the Dockerfiles was an example for users. We already have one for k8s here: https://github.com/NVIDIA/spark-rapids/tree/branch-0.4/docs/get-started. If we want it visible to users, perhaps we should document it and put it in a similar location, or create a separate directory for it.
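One hedged way to triage which files under jenkins/ are removal candidates is to check whether anything else in the repo mentions them by name. The helper below is only a rough sketch under that assumption: a file can still be in use without being referenced by name (e.g. invoked directly by a Jenkins job), so the output is a starting list for manual review, not a deletion list. The function name and matching rule are illustrative, not from the repo.

```shell
#!/bin/bash
# Rough sketch: print files under <repo>/jenkins whose basenames are not
# mentioned anywhere else in the repo. Name-based matching is a heuristic;
# files invoked directly by CI jobs will show up as false positives.
find_unreferenced() {
  local repo="$1" f name
  for f in "$repo"/jenkins/*; do
    [ -f "$f" ] || continue
    name=$(basename "$f")
    # List files whose contents mention this name, excluding the jenkins
    # file itself; if nothing is left, report it as a removal candidate.
    if ! grep -rl --exclude-dir=.git -e "$name" "$repo" 2>/dev/null \
        | grep -qv "/jenkins/$name\$"; then
      echo "$name"
    fi
  done
}
```

Running it against a checkout and manually vetting each reported file would have been one way to build the cleanup list this issue asks for.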