
Iter_batches does not work with a Snowflake connection in pl.read_database() #17404

Closed
2 tasks done
tmespe opened this issue Jul 3, 2024 · 0 comments · Fixed by #17688
Assignees
Labels
A-io-database Area: reading/writing to databases bug Something isn't working needs triage Awaiting prioritization by a maintainer python Related to Python Polars

Comments


tmespe commented Jul 3, 2024

Checks

  • I have checked that this issue has not already been reported.
  • I have confirmed this bug exists on the latest version of Polars.

Reproducible example

for df in pl.read_database(
    connection=snowflake_connector_connection,
    query="SELECT * FROM SNOWFLAKE.TABLE",
    iter_batches=True,
    batch_size=100_000,
):
    df.head()

Log output

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[8], line 2
      1 pl.Config().set_verbose(True)
----> 2 for n, df in enumerate(
      3     pl.read_database(
      4         connection=session.connection,
      5         query=f"SELECT * FROM {legacy_table}",
      6         iter_batches=True,
      7         batch_size=100,
      8     )
      9 ):
     10     df.write_parquet(f"data/legacy_{n}.parquet")
     11     print(f"Batch {n} written")

File ~/miniforge3/envs/dataops/lib/python3.10/site-packages/polars/io/database/_executor.py:529, in <genexpr>(.0)
    527     if frame is not None:
    528         if defer_cursor_close:
--> 529             frame = (
    530                 df
    531                 for df in CloseAfterFrameIter(  # type: ignore[attr-defined]
    532                     frame,
    533                     cursor=self.result,
    534                 )
    535             )
    536         return frame
    538 msg = (
    539     f"Currently no support for {self.driver_name!r} connection {self.cursor!r}"
    540 )

File ~/miniforge3/envs/dataops/lib/python3.10/site-packages/polars/io/database/_executor.py:72, in CloseAfterFrameIter.__iter__(self)
     71 def __iter__(self) -> Iterable[DataFrame]:
---> 72     yield from self._iter_frames
     74     if hasattr(self._cursor, "close"):
     75         self._cursor.close()

File ~/miniforge3/envs/dataops/lib/python3.10/site-packages/polars/io/database/_executor.py:216, in <genexpr>(.0)
    214             if ver := driver_properties["minimum_version"]:
    215                 self._check_module_version(self.driver_name, ver)
--> 216             frames = (
    217                 self._apply_overrides(batch, (schema_overrides or {}))
    218                 if isinstance(batch, DataFrame)
    219                 else from_arrow(batch, schema_overrides=schema_overrides)
    220                 for batch in self._fetch_arrow(
    221                     driver_properties,
    222                     iter_batches=iter_batches,
    223                     batch_size=batch_size,
    224                 )
    225             )
    226             return frames if iter_batches else next(frames)  # type: ignore[arg-type,return-value]
    227 except Exception as err:
    228     # eg: valid turbodbc/snowflake connection, but no arrow support
    229     # compiled in to the underlying driver (or on this connection)

File ~/miniforge3/envs/dataops/lib/python3.10/site-packages/polars/io/database/_executor.py:169, in ConnectionExecutor._fetch_arrow(self, driver_properties, batch_size, iter_batches)
    167 fetchmany_arrow = getattr(self.result, fetch_batches)
    168 if not repeat_batch_calls:
--> 169     yield from fetchmany_arrow(size)
    170 else:
    171     while True:

TypeError: SnowflakeCursor.fetch_arrow_batches() takes 1 positional argument but 2 were given

Issue description

iter_batches does not work with a Snowflake connection in pl.read_database. It throws TypeError: SnowflakeCursor.fetch_arrow_batches() takes 1 positional argument but 2 were given. Looking at the Snowflake connector documentation, this is probably because fetch_arrow_batches() does not accept a batch_size parameter. However, batch_size is required by pl.read_database when iter_batches is set to True. The connection works fine without iter_batches, so it seems to be a matter of not passing batch_size when the connection is a Snowflake type.
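For illustration, the mismatch described above could be avoided by dispatching on the fetch method's signature: only pass a size argument if the driver's Arrow-batch method actually accepts one. This is a minimal hypothetical sketch, not Polars' actual fix; the names fetch_arrow_batches_compat, SnowflakeLikeCursor, and SizedCursor are invented here, with dummy cursors standing in for real driver connections:

```python
import inspect


def fetch_arrow_batches_compat(cursor, batch_size):
    """Call the cursor's Arrow-batch fetch method, passing batch_size
    only when the method's signature accepts a positional argument.

    Snowflake's fetch_arrow_batches() takes no size (the server picks
    the chunking), while other drivers expose fetchmany_arrow(size).
    """
    fetch = getattr(cursor, "fetch_arrow_batches", None) or cursor.fetchmany_arrow
    params = inspect.signature(fetch).parameters
    if len(params) == 0:
        # Snowflake-style: batch size is decided by the driver/server.
        return fetch()
    # Sized-style: the caller controls the batch size.
    return fetch(batch_size)


class SnowflakeLikeCursor:
    """Dummy cursor mimicking Snowflake's no-argument batch fetch."""

    def fetch_arrow_batches(self):
        return iter(["batch-a", "batch-b"])


class SizedCursor:
    """Dummy cursor mimicking a driver that takes an explicit size."""

    def fetchmany_arrow(self, size):
        return iter([f"batch of {size}"])
```

Calling fetch_arrow_batches_compat(SnowflakeLikeCursor(), 100_000) silently ignores the requested size instead of raising the TypeError shown in the log above.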

Expected behavior

The expected behaviour would be for Polars to yield the batches from the Snowflake connector as DataFrames.

Installed versions

--------Version info---------
Polars:               1.0.0
Index type:           UInt32
Platform:             Linux-5.15.153.1-microsoft-standard-WSL2-x86_64-with-glibc2.35
Python:               3.10.14 | packaged by conda-forge | (main, Mar 20 2024, 12:45:18) [GCC 12.3.0]

----Optional dependencies----
adbc_driver_manager:  <not installed>
cloudpickle:          2.2.1
connectorx:           <not installed>
deltalake:            <not installed>
fastexcel:            <not installed>
fsspec:               <not installed>
gevent:               <not installed>
great_tables:         <not installed>
hvplot:               <not installed>
matplotlib:           <not installed>
nest_asyncio:         1.6.0
numpy:                2.0.0
openpyxl:             <not installed>
pandas:               2.2.2
pyarrow:              16.1.0
pydantic:             2.7.4
pyiceberg:            <not installed>
sqlalchemy:           <not installed>
torch:                <not installed>
xlsx2csv:             <not installed>
xlsxwriter:           &lt;not installed&gt;

@tmespe tmespe added bug Something isn't working needs triage Awaiting prioritization by a maintainer python Related to Python Polars labels Jul 3, 2024
@alexander-beedie alexander-beedie self-assigned this Jul 3, 2024
@alexander-beedie alexander-beedie added the A-io-database Area: reading/writing to databases label Jul 3, 2024

2 participants