For compatibility with older versions of Arrow used by Spark, we set this environment variable when writing Arrow data: https://github.com/sparklyr/sparklyr/blob/master/R/core_arrow.R#L3-L4
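For reference, a minimal sketch of what that flag-setting amounts to, assuming the variable in question is Arrow's `ARROW_PRE_0_15_IPC_FORMAT`:

```r
# Sketch: force the pre-0.15 Arrow IPC stream format so that the Arrow bundled
# with Spark 2.x (0.12.0) can read what the newer R arrow package writes.
# Today this is set unconditionally, without looking at the Spark version.
Sys.setenv(ARROW_PRE_0_15_IPC_FORMAT = 1)
```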
Spark 3.0 will include a bump of the Arrow version it uses (in Java) from 0.12.0 to 0.15.1 (apache/spark#26133). So with that Arrow version, we won't need this legacy format flag. (It shouldn't crash if we use the flag, but there was a reason for the format change in Arrow, and we should avoid legacy codepaths if we can.)
What this means is that we should move where that env var is set (temporarily or otherwise) to an appropriate place that knows which Spark version it's communicating with. As it stands currently in core_arrow.R, the Spark context isn't known.
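One possible shape for that, sketched below under assumptions (the function name `arrow_maybe_set_legacy_format` and the `sc` argument are illustrative, not existing sparklyr API): gate the flag on the connection's Spark version via `spark_version()`.

```r
library(sparklyr)

# Sketch: only enable the legacy IPC format when the connected Spark ships an
# Arrow version older than 0.15 (i.e. Spark < 3.0); otherwise make sure the
# flag is unset so we stay on the current Arrow stream format.
arrow_maybe_set_legacy_format <- function(sc) {
  if (spark_version(sc) < "3.0.0") {
    Sys.setenv(ARROW_PRE_0_15_IPC_FORMAT = 1)
  } else {
    Sys.unsetenv("ARROW_PRE_0_15_IPC_FORMAT")
  }
}
```

The point of the design is simply that the decision moves to a call site that has the Spark connection in hand, rather than being made at package load or in core_arrow.R where no connection is available.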