merging master (#11)

* [SPARK-33641][SQL][DOC][FOLLOW-UP] Add migration guide for CHAR VARCHAR types

### What changes were proposed in this pull request?

Add migration guide for CHAR VARCHAR types

### Why are the changes needed?

To help users migrate their workloads to the new CHAR/VARCHAR behavior.

### Does this PR introduce _any_ user-facing change?

Yes, a documentation-only change.

### How was this patch tested?

Passing CI.

Closes apache#30654 from yaooqinn/SPARK-33641-F.

Authored-by: Kent Yao <yaooqinn@hotmail.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>

* [SPARK-33669] Wrong error message from YARN application state monitor when sc.stop in yarn client mode

### What changes were proposed in this pull request?
This change makes `InterruptedIOException` be treated the same as `InterruptedException` when closing `YarnClientSchedulerBackend`, so the backend no longer logs an error like "YARN application has exited unexpectedly xxx".

### Why are the changes needed?
In YARN client mode, when stopping `YarnClientSchedulerBackend`, Spark first tries to interrupt the YARN application monitor thread. `MonitorThread.run()` catches `InterruptedException` so it can respond gracefully to the stop request.

However, `client.monitorApplication` can also throw `InterruptedIOException` while a Hadoop RPC call is in flight. In that case `MonitorThread` does not realize it was interrupted: it reports a failed YARN application, and "Failed to contact YARN for application xxxxx; YARN application has exited unexpectedly with state xxxxx" is logged at error level, which is very confusing to users. The gist of the fix is sketched below.
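
A minimal, self-contained sketch of the new handling (the `monitorApplication` and `logInfo` stand-ins below are illustrative, not Spark's real internals):

```scala
import java.io.InterruptedIOException

object MonitorThreadSketch {
  // Stand-in for the Hadoop RPC call that can be interrupted mid-flight.
  private def monitorApplication(): Unit =
    throw new InterruptedIOException("RPC interrupted")

  private def logInfo(msg: String): Unit = println(msg)

  def main(args: Array[String]): Unit = {
    try {
      monitorApplication()
    } catch {
      // Both flavors of interruption are now treated as a graceful stop
      // request, so no "YARN application has exited unexpectedly" error is
      // logged when sc.stop() interrupts the monitor thread.
      case _: InterruptedException | _: InterruptedIOException =>
        logInfo("Interrupting monitor thread")
    }
  }
}
```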

### Does this PR introduce _any_ user-facing change?
Yes

### How was this patch tested?
The patch is very simple; it does not seem to need a dedicated test.

Closes apache#30617 from sqlwindspeaker/yarn-client-interrupt-monitor.

Authored-by: suqilong <suqilong@qiyi.com>
Signed-off-by: Mridul Muralidharan <mridul<at>gmail.com>

* [SPARK-33655][SQL] Improve performance of processing FETCH_PRIOR

### What changes were proposed in this pull request?
Currently, when a client requests FETCH_PRIOR from Thriftserver, Thriftserver re-iterates from the start position. Because Thriftserver caches a query result in an array when the THRIFTSERVER_INCREMENTAL_COLLECT feature is off, FETCH_PRIOR can be implemented without re-iterating the result. A trait `FetchIterator` is added to separate the implementations for an iterator and for an array. `FetchIterator` also supports moving the cursor to an absolute position, which will be useful for implementing FETCH_RELATIVE and FETCH_ABSOLUTE.
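
A rough, self-contained sketch of the cursor idea, using illustrative names rather than the exact API added by this PR:

```scala
// Sketch: an array-backed result can move its cursor backwards (FETCH_PRIOR)
// or to an absolute offset without re-executing the query.
trait FetchCursor[A] extends Iterator[A] {
  def position: Long
  def fetchAbsolute(pos: Long): Unit  // reposition the cursor directly
}

class ArrayFetchCursor[A](rows: Array[A]) extends FetchCursor[A] {
  private var cursor = 0L
  override def hasNext: Boolean = cursor < rows.length
  override def next(): A = { val row = rows(cursor.toInt); cursor += 1; row }
  override def position: Long = cursor
  // O(1) on a cached array; an incremental (iterator-backed) result would
  // still need to re-iterate, but only when moving backwards.
  override def fetchAbsolute(pos: Long): Unit =
    cursor = math.max(0L, math.min(pos, rows.length.toLong))
}
```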

### Why are the changes needed?
For better performance of Thriftserver.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
FetchIteratorSuite

Closes apache#30600 from Dooyoung-Hwang/refactor_with_fetch_iterator.

Authored-by: Dooyoung Hwang <dooyoung.hwang@sk.com>
Signed-off-by: HyukjinKwon <gurwls223@apache.org>

* [SPARK-33719][DOC] Add make_date/make_timestamp/make_interval into the doc of ANSI Compliance

### What changes were proposed in this pull request?

Add make_date/make_timestamp/make_interval into the doc of ANSI Compliance

### Why are the changes needed?

This documents that, under ANSI mode, these functions throw runtime exceptions when the result is not valid.
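
For illustration, the documented behavior can be observed in a `spark-shell` session like this (a sketch; the exact exception type may vary across versions):

```scala
// spark-shell style snippet; assumes the built-in SparkSession named `spark`.
spark.conf.set("spark.sql.ansi.enabled", "false")
spark.sql("SELECT make_date(2020, 13, 1)").show()  // month 13 is invalid -> NULL

spark.conf.set("spark.sql.ansi.enabled", "true")
// Under ANSI mode the same call throws a runtime exception instead of
// silently returning NULL.
spark.sql("SELECT make_date(2020, 13, 1)").show()
```
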
### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Built the doc and checked it in a browser:
![image](https://user-images.githubusercontent.com/1097932/101608930-34a79e80-39bb-11eb-9294-9d9b8c3f6faa.png)

Closes apache#30683 from gengliangwang/improveDoc.

Authored-by: Gengliang Wang <gengliang.wang@databricks.com>
Signed-off-by: HyukjinKwon <gurwls223@apache.org>

* [SPARK-33071][SPARK-33536][SQL][FOLLOW-UP] Rename deniedMetadataKeys to nonInheritableMetadataKeys in Alias

### What changes were proposed in this pull request?

This PR is a follow-up of apache#30488. It proposes to rename `Alias.deniedMetadataKeys` to `Alias.nonInheritableMetadataKeys` to make it less confusing.

### Why are the changes needed?

To make it easier to maintain and read.

### Does this PR introduce _any_ user-facing change?

No. This is rather a code cleanup.

### How was this patch tested?

Manually ran the unit tests written in the previous PR. Jenkins and GitHub Actions on this PR should also cover them.

Closes apache#30682 from HyukjinKwon/SPARK-33071-SPARK-33536.

Authored-by: HyukjinKwon <gurwls223@apache.org>
Signed-off-by: HyukjinKwon <gurwls223@apache.org>

* [SPARK-33722][SQL] Handle DELETE in ReplaceNullWithFalseInPredicate

### What changes were proposed in this pull request?

This PR adds `DeleteFromTable` to supported plans in `ReplaceNullWithFalseInPredicate`.

### Why are the changes needed?

This change allows Spark to optimize delete conditions in the same way filters are optimized; a toy sketch of the rewrite follows.
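
A toy model of the null-to-false rewrite, purely illustrative (the real rule operates on Catalyst expression trees):

```scala
// Minimal stand-in for a boolean predicate tree (illustrative only).
object ReplaceNullWithFalseSketch {
  sealed trait Pred
  case object TrueLit extends Pred
  case object FalseLit extends Pred
  case object NullLit extends Pred
  case class And(left: Pred, right: Pred) extends Pred
  case class Or(left: Pred, right: Pred) extends Pred

  // In a predicate position (a filter, a join condition, and now a delete
  // condition), a NULL result excludes a row just like false does, so it is
  // safe to replace NULL with false and unlock further simplification.
  def replaceNullWithFalse(p: Pred): Pred = p match {
    case NullLit   => FalseLit
    case And(l, r) => And(replaceNullWithFalse(l), replaceNullWithFalse(r))
    case Or(l, r)  => Or(replaceNullWithFalse(l), replaceNullWithFalse(r))
    case other     => other
  }

  def main(args: Array[String]): Unit = {
    // e.g. DELETE FROM t WHERE key = null AND value > 0:
    // the condition And(NullLit, ...) simplifies toward a no-op delete.
    println(replaceNullWithFalse(And(NullLit, TrueLit)))
  }
}
```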

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

This PR extends the existing test cases to also cover `DeleteFromTable`.

Closes apache#30688 from aokolnychyi/spark-33722.

Authored-by: Anton Okolnychyi <aokolnychyi@apple.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>

Co-authored-by: Kent Yao <yaooqinn@hotmail.com>
Co-authored-by: suqilong <suqilong@qiyi.com>
Co-authored-by: Dooyoung Hwang <dooyoung.hwang@sk.com>
Co-authored-by: Gengliang Wang <gengliang.wang@databricks.com>
Co-authored-by: HyukjinKwon <gurwls223@apache.org>
Co-authored-by: Anton Okolnychyi <aokolnychyi@apple.com>
7 people authored Dec 9, 2020
1 parent e77424b commit c74663e
Showing 13 changed files with 317 additions and 79 deletions.
2 changes: 2 additions & 0 deletions docs/sql-migration-guide.md
@@ -58,6 +58,8 @@ license: |

- In Spark 3.1, creating or altering a view will capture runtime SQL configs and store them as view properties. These configs will be applied during the parsing and analysis phases of the view resolution. To restore the behavior before Spark 3.1, you can set `spark.sql.legacy.useCurrentConfigsForView` to `true`.

- Since Spark 3.1, CHAR/CHARACTER and VARCHAR types are supported in the table schema. Table scan/insertion will respect the char/varchar semantic. If char/varchar is used in places other than table schema, an exception will be thrown (CAST is an exception that simply treats char/varchar as string like before). To restore the behavior before Spark 3.1, which treats them as STRING types and ignores a length parameter, e.g. `CHAR(4)`, you can set `spark.sql.legacy.charVarcharAsString` to `true`.

## Upgrading from Spark SQL 3.0 to 3.0.1

- In Spark 3.0, JSON datasource and JSON function `schema_of_json` infer TimestampType from string values if they match to the pattern defined by the JSON option `timestampFormat`. Since version 3.0.1, the timestamp type inference is disabled by default. Set the JSON option `inferTimestamp` to `true` to enable such type inference.
16 changes: 10 additions & 6 deletions docs/sql-ref-ansi-compliance.md
@@ -144,14 +144,18 @@ SELECT * FROM t;

The behavior of some SQL functions can be different under ANSI mode (`spark.sql.ansi.enabled=true`).
- `size`: This function returns null for null input.
- `element_at`: This function throws `ArrayIndexOutOfBoundsException` if using invalid indices.
- `element_at`: This function throws `NoSuchElementException` if key does not exist in map.
- `element_at`:
- This function throws `ArrayIndexOutOfBoundsException` if using invalid indices.
- This function throws `NoSuchElementException` if key does not exist in map.
- `elt`: This function throws `ArrayIndexOutOfBoundsException` if using invalid indices.
- `parse_url`: This function throws `IllegalArgumentException` if an input string is not a valid url.
- `to_date` This function should fail with an exception if the input string can't be parsed, or the pattern string is invalid.
- `to_timestamp` This function should fail with an exception if the input string can't be parsed, or the pattern string is invalid.
- `unix_timestamp` This function should fail with an exception if the input string can't be parsed, or the pattern string is invalid.
- `to_unix_timestamp` This function should fail with an exception if the input string can't be parsed, or the pattern string is invalid.
- `to_date`: This function should fail with an exception if the input string can't be parsed, or the pattern string is invalid.
- `to_timestamp`: This function should fail with an exception if the input string can't be parsed, or the pattern string is invalid.
- `unix_timestamp`: This function should fail with an exception if the input string can't be parsed, or the pattern string is invalid.
- `to_unix_timestamp`: This function should fail with an exception if the input string can't be parsed, or the pattern string is invalid.
- `make_date`: This function should fail with an exception if the result date is invalid.
- `make_timestamp`: This function should fail with an exception if the result timestamp is invalid.
- `make_interval`: This function should fail with an exception if the result interval is invalid.

### SQL Operators

@@ -1069,7 +1069,7 @@ private[spark] class Client(
logError(s"Application $appId not found.")
cleanupStagingDir()
return YarnAppReport(YarnApplicationState.KILLED, FinalApplicationStatus.KILLED, None)
case NonFatal(e) =>
case NonFatal(e) if !e.isInstanceOf[InterruptedIOException] =>
val msg = s"Failed to contact YARN for application $appId."
logError(msg, e)
// Don't necessarily clean up staging dir because status is unknown
@@ -17,6 +17,8 @@

package org.apache.spark.scheduler.cluster

import java.io.InterruptedIOException

import scala.collection.mutable.ArrayBuffer

import org.apache.hadoop.yarn.api.records.YarnApplicationState
@@ -121,7 +123,8 @@ private[spark] class YarnClientSchedulerBackend(
allowInterrupt = false
sc.stop()
} catch {
case e: InterruptedException => logInfo("Interrupting monitor thread")
case _: InterruptedException | _: InterruptedIOException =>
logInfo("Interrupting monitor thread")
}
}

@@ -90,7 +90,7 @@ trait AliasHelper {
exprId = a.exprId,
qualifier = a.qualifier,
explicitMetadata = Some(a.metadata),
deniedMetadataKeys = a.deniedMetadataKeys)
nonInheritableMetadataKeys = a.nonInheritableMetadataKeys)
case a: MultiAlias =>
a.copy(child = trimAliases(a.child))
case other => trimAliases(other)
@@ -143,14 +143,14 @@ abstract class Attribute extends LeafExpression with NamedExpression with NullIn
* fully qualified way. Consider the examples tableName.name, subQueryAlias.name.
* tableName and subQueryAlias are possible qualifiers.
* @param explicitMetadata Explicit metadata associated with this alias that overwrites child's.
* @param deniedMetadataKeys Keys of metadata entries that are supposed to be removed when
* inheriting the metadata from the child.
* @param nonInheritableMetadataKeys Keys of metadata entries that are supposed to be removed when
* inheriting the metadata from the child.
*/
case class Alias(child: Expression, name: String)(
val exprId: ExprId = NamedExpression.newExprId,
val qualifier: Seq[String] = Seq.empty,
val explicitMetadata: Option[Metadata] = None,
val deniedMetadataKeys: Seq[String] = Seq.empty)
val nonInheritableMetadataKeys: Seq[String] = Seq.empty)
extends UnaryExpression with NamedExpression {

// Alias(Generator, xx) need to be transformed into Generate(generator, ...)
@@ -172,7 +172,7 @@ case class Alias(child: Expression, name: String)(
child match {
case named: NamedExpression =>
val builder = new MetadataBuilder().withMetadata(named.metadata)
deniedMetadataKeys.foreach(builder.remove)
nonInheritableMetadataKeys.foreach(builder.remove)
builder.build()

case _ => Metadata.empty
@@ -181,7 +181,10 @@ case class Alias(child: Expression, name: String)(
}

def newInstance(): NamedExpression =
Alias(child, name)(qualifier = qualifier, explicitMetadata = explicitMetadata)
Alias(child, name)(
qualifier = qualifier,
explicitMetadata = explicitMetadata,
nonInheritableMetadataKeys = nonInheritableMetadataKeys)

override def toAttribute: Attribute = {
if (resolved) {
@@ -201,7 +204,7 @@ case class Alias(child: Expression, name: String)(
override def toString: String = s"$child AS $name#${exprId.id}$typeSuffix$delaySuffix"

override protected final def otherCopyArgs: Seq[AnyRef] = {
exprId :: qualifier :: explicitMetadata :: deniedMetadataKeys :: Nil
exprId :: qualifier :: explicitMetadata :: nonInheritableMetadataKeys :: Nil
}

override def hashCode(): Int = {
@@ -212,7 +215,8 @@ case class Alias(child: Expression, name: String)(
override def equals(other: Any): Boolean = other match {
case a: Alias =>
name == a.name && exprId == a.exprId && child == a.child && qualifier == a.qualifier &&
explicitMetadata == a.explicitMetadata && deniedMetadataKeys == a.deniedMetadataKeys
explicitMetadata == a.explicitMetadata &&
nonInheritableMetadataKeys == a.nonInheritableMetadataKeys
case _ => false
}

@@ -20,7 +20,7 @@ package org.apache.spark.sql.catalyst.optimizer
import org.apache.spark.sql.catalyst.expressions.{And, ArrayExists, ArrayFilter, CaseWhen, Expression, If}
import org.apache.spark.sql.catalyst.expressions.{LambdaFunction, Literal, MapFilter, Or}
import org.apache.spark.sql.catalyst.expressions.Literal.FalseLiteral
import org.apache.spark.sql.catalyst.plans.logical.{Filter, Join, LogicalPlan}
import org.apache.spark.sql.catalyst.plans.logical.{DeleteFromTable, Filter, Join, LogicalPlan}
import org.apache.spark.sql.catalyst.rules.Rule
import org.apache.spark.sql.types.BooleanType
import org.apache.spark.util.Utils
@@ -53,6 +53,7 @@ object ReplaceNullWithFalseInPredicate extends Rule[LogicalPlan] {
def apply(plan: LogicalPlan): LogicalPlan = plan transform {
case f @ Filter(cond, _) => f.copy(condition = replaceNullWithFalse(cond))
case j @ Join(_, _, _, Some(cond), _) => j.copy(condition = Some(replaceNullWithFalse(cond)))
case d @ DeleteFromTable(_, Some(cond)) => d.copy(condition = Some(replaceNullWithFalse(cond)))
case p: LogicalPlan => p transformExpressions {
case i @ If(pred, _, _) => i.copy(predicate = replaceNullWithFalse(pred))
case cw @ CaseWhen(branches, _) =>
@@ -24,7 +24,7 @@ import org.apache.spark.sql.catalyst.dsl.plans._
import org.apache.spark.sql.catalyst.expressions.{And, ArrayExists, ArrayFilter, ArrayTransform, CaseWhen, Expression, GreaterThan, If, LambdaFunction, Literal, MapFilter, NamedExpression, Or, UnresolvedNamedLambdaVariable}
import org.apache.spark.sql.catalyst.expressions.Literal.{FalseLiteral, TrueLiteral}
import org.apache.spark.sql.catalyst.plans.{Inner, PlanTest}
import org.apache.spark.sql.catalyst.plans.logical.{LocalRelation, LogicalPlan}
import org.apache.spark.sql.catalyst.plans.logical.{DeleteFromTable, LocalRelation, LogicalPlan}
import org.apache.spark.sql.catalyst.rules.RuleExecutor
import org.apache.spark.sql.internal.SQLConf
import org.apache.spark.sql.types.{BooleanType, IntegerType}
@@ -48,6 +48,7 @@ class ReplaceNullWithFalseInPredicateSuite extends PlanTest {
test("replace null inside filter and join conditions") {
testFilter(originalCond = Literal(null, BooleanType), expectedCond = FalseLiteral)
testJoin(originalCond = Literal(null, BooleanType), expectedCond = FalseLiteral)
testDelete(originalCond = Literal(null, BooleanType), expectedCond = FalseLiteral)
}

test("Not expected type - replaceNullWithFalse") {
@@ -64,6 +65,7 @@
Literal(null, BooleanType))
testFilter(originalCond, expectedCond = FalseLiteral)
testJoin(originalCond, expectedCond = FalseLiteral)
testDelete(originalCond, expectedCond = FalseLiteral)
}

test("replace nulls in nested expressions in branches of If") {
@@ -73,6 +75,7 @@
UnresolvedAttribute("b") && Literal(null, BooleanType))
testFilter(originalCond, expectedCond = FalseLiteral)
testJoin(originalCond, expectedCond = FalseLiteral)
testDelete(originalCond, expectedCond = FalseLiteral)
}

test("replace null in elseValue of CaseWhen") {
@@ -83,6 +86,7 @@
val expectedCond = CaseWhen(branches, FalseLiteral)
testFilter(originalCond, expectedCond)
testJoin(originalCond, expectedCond)
testDelete(originalCond, expectedCond)
}

test("replace null in branch values of CaseWhen") {
@@ -92,6 +96,7 @@
val originalCond = CaseWhen(branches, Literal(null))
testFilter(originalCond, expectedCond = FalseLiteral)
testJoin(originalCond, expectedCond = FalseLiteral)
testDelete(originalCond, expectedCond = FalseLiteral)
}

test("replace null in branches of If inside CaseWhen") {
@@ -108,6 +113,7 @@

testFilter(originalCond, expectedCond)
testJoin(originalCond, expectedCond)
testDelete(originalCond, expectedCond)
}

test("replace null in complex CaseWhen expressions") {
@@ -127,19 +133,22 @@

testFilter(originalCond, expectedCond)
testJoin(originalCond, expectedCond)
testDelete(originalCond, expectedCond)
}

test("replace null in Or") {
val originalCond = Or(UnresolvedAttribute("b"), Literal(null))
val expectedCond = UnresolvedAttribute("b")
testFilter(originalCond, expectedCond)
testJoin(originalCond, expectedCond)
testDelete(originalCond, expectedCond)
}

test("replace null in And") {
val originalCond = And(UnresolvedAttribute("b"), Literal(null))
testFilter(originalCond, expectedCond = FalseLiteral)
testJoin(originalCond, expectedCond = FalseLiteral)
testDelete(originalCond, expectedCond = FalseLiteral)
}

test("replace nulls in nested And/Or expressions") {
@@ -148,6 +157,7 @@
Or(Literal(null), And(Literal(null), And(UnresolvedAttribute("b"), Literal(null)))))
testFilter(originalCond, expectedCond = FalseLiteral)
testJoin(originalCond, expectedCond = FalseLiteral)
testDelete(originalCond, expectedCond = FalseLiteral)
}

test("replace null in And inside branches of If") {
@@ -157,6 +167,7 @@
And(UnresolvedAttribute("b"), Literal(null, BooleanType)))
testFilter(originalCond, expectedCond = FalseLiteral)
testJoin(originalCond, expectedCond = FalseLiteral)
testDelete(originalCond, expectedCond = FalseLiteral)
}

test("replace null in branches of If inside And") {
@@ -168,6 +179,7 @@
And(FalseLiteral, UnresolvedAttribute("b"))))
testFilter(originalCond, expectedCond = FalseLiteral)
testJoin(originalCond, expectedCond = FalseLiteral)
testDelete(originalCond, expectedCond = FalseLiteral)
}

test("replace null in branches of If inside another If") {
@@ -177,13 +189,15 @@
Literal(null))
testFilter(originalCond, expectedCond = FalseLiteral)
testJoin(originalCond, expectedCond = FalseLiteral)
testDelete(originalCond, expectedCond = FalseLiteral)
}

test("replace null in CaseWhen inside another CaseWhen") {
val nestedCaseWhen = CaseWhen(Seq(UnresolvedAttribute("b") -> FalseLiteral), Literal(null))
val originalCond = CaseWhen(Seq(nestedCaseWhen -> TrueLiteral), Literal(null))
testFilter(originalCond, expectedCond = FalseLiteral)
testJoin(originalCond, expectedCond = FalseLiteral)
testDelete(originalCond, expectedCond = FalseLiteral)
}

test("inability to replace null in non-boolean branches of If") {
@@ -196,6 +210,7 @@
FalseLiteral)
testFilter(originalCond = condition, expectedCond = condition)
testJoin(originalCond = condition, expectedCond = condition)
testDelete(originalCond = condition, expectedCond = condition)
}

test("inability to replace null in non-boolean values of CaseWhen") {
@@ -210,6 +225,7 @@
val condition = CaseWhen(branches)
testFilter(originalCond = condition, expectedCond = condition)
testJoin(originalCond = condition, expectedCond = condition)
testDelete(originalCond = condition, expectedCond = condition)
}

test("inability to replace null in non-boolean branches of If inside another If") {
@@ -222,6 +238,7 @@
FalseLiteral)
testFilter(originalCond = condition, expectedCond = condition)
testJoin(originalCond = condition, expectedCond = condition)
testDelete(originalCond = condition, expectedCond = condition)
}

test("replace null in If used as a join condition") {
@@ -353,6 +370,10 @@ class ReplaceNullWithFalseInPredicateSuite extends PlanTest {
test((rel, exp) => rel.select(exp), originalExpr, expectedExpr)
}

private def testDelete(originalCond: Expression, expectedCond: Expression): Unit = {
test((rel, expr) => DeleteFromTable(rel, Some(expr)), originalCond, expectedCond)
}

private def testHigherOrderFunc(
argument: Expression,
createExpr: (Expression, Expression) => Expression,
9 changes: 5 additions & 4 deletions sql/core/src/main/scala/org/apache/spark/sql/Column.scala
@@ -1164,10 +1164,11 @@ class Column(val expr: Expression) extends Logging {
* @since 2.0.0
*/
def name(alias: String): Column = withExpr {
// SPARK-33536: The Alias is no longer a column reference after converting to an attribute.
// These denied metadata keys are used to strip the column reference related metadata for
// the Alias. So it won't be caught as a column reference in DetectAmbiguousSelfJoin.
Alias(expr, alias)(deniedMetadataKeys = Seq(Dataset.DATASET_ID_KEY, Dataset.COL_POS_KEY))
// SPARK-33536: an alias is no longer a column reference. Therefore,
// we should not inherit the column reference related metadata in an alias
// so that it is not caught as a column reference in DetectAmbiguousSelfJoin.
Alias(expr, alias)(
nonInheritableMetadataKeys = Seq(Dataset.DATASET_ID_KEY, Dataset.COL_POS_KEY))
}

/**
@@ -577,8 +577,8 @@ class ColumnarAlias(child: ColumnarExpression, name: String)(
override val exprId: ExprId = NamedExpression.newExprId,
override val qualifier: Seq[String] = Seq.empty,
override val explicitMetadata: Option[Metadata] = None,
override val deniedMetadataKeys: Seq[String] = Seq.empty)
extends Alias(child, name)(exprId, qualifier, explicitMetadata, deniedMetadataKeys)
override val nonInheritableMetadataKeys: Seq[String] = Seq.empty)
extends Alias(child, name)(exprId, qualifier, explicitMetadata, nonInheritableMetadataKeys)
with ColumnarExpression {

override def columnarEval(batch: ColumnarBatch): Any = child.columnarEval(batch)
@@ -715,7 +715,7 @@ case class PreRuleReplaceAddWithBrokenVersion() extends Rule[SparkPlan] {
def replaceWithColumnarExpression(exp: Expression): ColumnarExpression = exp match {
case a: Alias =>
new ColumnarAlias(replaceWithColumnarExpression(a.child),
a.name)(a.exprId, a.qualifier, a.explicitMetadata, a.deniedMetadataKeys)
a.name)(a.exprId, a.qualifier, a.explicitMetadata, a.nonInheritableMetadataKeys)
case att: AttributeReference =>
new ColumnarAttributeReference(att.name, att.dataType, att.nullable,
att.metadata)(att.exprId, att.qualifier)