[BUG] compile failed in databricks runtimes due to new added TestReport #9149

Closed
pxLi opened this issue Aug 31, 2023 · 2 comments · Fixed by #9158
Labels
bug Something isn't working

Comments

pxLi (Collaborator) commented Aug 31, 2023

Describe the bug
Related to the newly merged #9089.

[2023-08-31T10:28:17.143Z] [ERROR] /home/ubuntu/spark-rapids/integration_tests/src/main/scala/com/nvidia/spark/rapids/tests/scaletest/TestReport.scala:22: object json4s is not a member of package org
[2023-08-31T10:28:17.143Z] [ERROR] import org.json4s._
[2023-08-31T10:28:17.143Z] [ERROR]            ^
[2023-08-31T10:28:17.143Z] [ERROR] /home/ubuntu/spark-rapids/integration_tests/src/main/scala/com/nvidia/spark/rapids/tests/scaletest/TestReport.scala:23: object json4s is not a member of package org
[2023-08-31T10:28:17.143Z] [ERROR] import org.json4s.jackson.Serialization.writePretty
[2023-08-31T10:28:17.143Z] [ERROR]            ^
[2023-08-31T10:28:17.143Z] [ERROR] /home/ubuntu/spark-rapids/integration_tests/src/main/scala/com/nvidia/spark/rapids/tests/scaletest/TestReport.scala:41: not found: value DefaultFormats
[2023-08-31T10:28:17.143Z] [ERROR]     implicit val formats = DefaultFormats
[2023-08-31T10:28:17.143Z] [ERROR]                            ^
[2023-08-31T10:28:17.143Z] [ERROR] /home/ubuntu/spark-rapids/integration_tests/src/main/scala/com/nvidia/spark/rapids/tests/scaletest/TestReport.scala:43: not found: value writePretty
[2023-08-31T10:28:17.143Z] [ERROR]     os.write(writePretty(queryMetas).getBytes)
[2023-08-31T10:28:17.143Z] [ERROR]              ^
[2023-08-31T10:28:17.143Z] [ERROR] /home/ubuntu/spark-rapids/integration_tests/src/main/scala/com/nvidia/spark/rapids/tests/scaletest/TestReport.scala:22: Unused import
[2023-08-31T10:28:17.143Z] [ERROR] import org.json4s._
[2023-08-31T10:28:17.143Z] [ERROR]                   ^
[2023-08-31T10:28:17.143Z] [ERROR] /home/ubuntu/spark-rapids/integration_tests/src/main/scala/com/nvidia/spark/rapids/tests/scaletest/TestReport.scala:23: Unused import
[2023-08-31T10:28:17.143Z] [ERROR] import org.json4s.jackson.Serialization.writePretty
[2023-08-31T10:28:17.143Z] [ERROR]                                         ^
[2023-08-31T10:28:17.143Z] [ERROR] /home/ubuntu/spark-rapids/integration_tests/src/main/scala/com/nvidia/spark/rapids/tests/scaletest/TestReport.scala:41: local val formats in method save is never used
[2023-08-31T10:28:17.143Z] [ERROR]     implicit val formats = DefaultFormats
[2023-08-31T10:28:17.143Z] [ERROR]                  ^
[2023-08-31T10:28:17.143Z] [ERROR] 7 errors found
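
For context, a minimal sketch of the json4s pattern the log points at, assuming TestReport.scala's save method serializes its query metadata with writePretty; QueryMeta, ReportSketch, and the file handling below are hypothetical stand-ins, not the actual file contents:

```scala
import java.io.FileOutputStream

import org.json4s._
import org.json4s.jackson.Serialization.writePretty

// Hypothetical stand-in for the metadata being reported.
case class QueryMeta(name: String, elapsedMs: Long)

object ReportSketch {
  // Mirrors the failing pattern: DefaultFormats and writePretty both come from
  // json4s, so when the json4s jars are missing from the Databricks compile
  // classpath the imports, DefaultFormats, and writePretty all fail to resolve.
  def save(queryMetas: Seq[QueryMeta], path: String): Unit = {
    implicit val formats: Formats = DefaultFormats
    val os = new FileOutputStream(path)
    try os.write(writePretty(queryMetas).getBytes)
    finally os.close()
  }
}
```

All seven errors trace back to the same root cause: json4s is not on the compile classpath in the Databricks build.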

Steps/Code to reproduce bug
Please provide a list of steps or a code sample to reproduce the issue.
Avoid posting private or sensitive data.

Expected behavior
A clear and concise description of what you expected to happen.

Environment details (please complete the following information)

  • Environment location: [Standalone, YARN, Kubernetes, Cloud(specify cloud provider)]
  • Spark configuration settings related to the issue

Additional context
Add any other context about the problem here.

pxLi added the "bug" and "? - Needs Triage" labels on Aug 31, 2023
wjxiz1992 (Collaborator) commented

Checking. I plan to add what Spark includes in its pom (https://github.com/apache/spark/blob/branch-3.3/pom.xml#L1049-L1059) to our pom. Will this resolve the issue?
(BTW, how do I trigger the build to include a Databricks run?)

jlowe (Member) commented Aug 31, 2023

I plan to add what is included in Spark's pom

That's sort of what we need, but specifically for Databricks. In non-Databricks builds, the Maven artifacts and dependencies are explicit and we pick up json4s "for free," but in Databricks builds the artifacts and dependencies are not published and thus we need to manually add the jars to the classpath that we need. json4s is already referenced in other poms (e.g.: sql-plugin), and there's already a databricks profile in integration_tests that we need to update similarly to fix this.

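A rough sketch of the kind of pom change described above, assuming the existing Databricks profile in integration_tests just needs json4s added as a compile-time dependency. The profile id, version property, and scope shown here are assumptions (the actual fix is in #9158), and since Databricks artifacts are not published to Maven, the real change may instead need to point at the jars installed on the cluster:

```xml
<!-- Sketch only: profile id, version property, and scope are assumptions,
     not the actual contents of integration_tests/pom.xml. -->
<profile>
  <id>databricks</id>
  <dependencies>
    <!-- json4s already ships with the Databricks runtime, so it is only
         needed at compile time; 'provided' keeps it off the runtime classpath. -->
    <dependency>
      <groupId>org.json4s</groupId>
      <artifactId>json4s-jackson_${scala.binary.version}</artifactId>
      <version>${json4s.version}</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>
</profile>
```
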
how to trigger the build to include Databricks run?

Add [databricks] to the headline of the PR before triggering a build.

I'll post a PR to unblock CI for databricks builds.
