Describe the bug
libcudf crashes with a segmentation fault when loading a Parquet file containing a column of Map type. The file is somewhat unusual in that the Map key is a nested type rather than a primitive type. I verified that the same file loads successfully in Apache Spark.

The crash occurs only when the column to load is specified by name; if no column names are specified, the crash does not occur.
Steps/Code to reproduce bug
A sample Parquet file that reproduces the crash is attached as maptest.parquet.zip. Unzip the archive and use the following test code to reproduce the crash.
```cpp
#include <cudf/io/parquet.hpp>
#include <cudf/table/table.hpp>

int main() {
  auto path = std::string("maptest.parquet");
  auto opts = cudf::io::parquet_reader_options::builder(cudf::io::source_info(path))
                  // NOTE: without specifying the column name it does not crash
                  .columns({"value"})
                  .build();
  auto result = cudf::io::read_parquet(opts);
  return 0;
}
```
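For contrast, here is the variant that does not crash, as noted in the comment above: the same reader options built without a `.columns()` selection, so all columns (including the Map column) are read. This is a sketch using only the API calls already shown in the repro, not a tested workaround.

```cpp
#include <cudf/io/parquet.hpp>
#include <cudf/table/table.hpp>

int main() {
  auto path = std::string("maptest.parquet");
  // No .columns({"value"}) call here: reading the file without naming
  // a column to load completes successfully.
  auto opts = cudf::io::parquet_reader_options::builder(cudf::io::source_info(path))
                  .build();
  auto result = cudf::io::read_parquet(opts);
  return 0;
}
```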
Expected behavior
libcudf methods should not segfault with valid inputs.