Describe the bug
When large/complex structures are stored using recursive normalisers, the write emits a confusing warning indicating that user-defined metadata is too large, even though the user has not specified any metadata.
Steps/Code to Reproduce
import numpy as np
import pandas as pd

# Assumes `lib` is an existing arcticdb Library instance, e.g. obtained via
# Arctic(<uri>).get_library(...) against the on-prem S3 backend noted below.
N = 100_000
d = {}
for i in range(N):
    k = f"key_{i}"
    v = pd.DataFrame({'col': np.arange(float(i), i + 100.)})
    d[k] = v
lib.write_pickle('dict', d)
The write_pickle call takes a long time but eventually completes with the following warning:
[2024-07-02 14:12:27.493] [arcticdb] [warning] User defined metadata is above warning size (8388608B), metadata cannot exceed 16777216B. Current size: 15489024B.
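A rough back-of-envelope from the numbers in the warning (an illustrative sketch only, not a claim about ArcticDB internals): the reported metadata size appears to scale with the number of dict entries, at roughly 155 bytes per key in this repro.

```python
# Sizes taken from the warning message above.
warn_threshold = 8 * 2**20   # 8388608 B, warning size
hard_limit = 16 * 2**20      # 16777216 B, maximum allowed metadata size
observed = 15489024          # B, metadata size reported for this write
n_keys = 100_000             # dict entries in the repro

per_key = observed / n_keys  # roughly 155 B of normalisation metadata per key
# Estimated key count at which the hard limit would be hit.
keys_at_limit = int(hard_limit / per_key)
print(f"{per_key:.1f} B/key, limit reached near {keys_at_limit:,} keys")
```

If that scaling holds, a dict only slightly larger than the repro (around 108k keys) would exceed the hard limit outright, which is why a clearer error about the normalised structure's size would help.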
Expected Results
The error message should state that the structure being normalised is too large/complex.
OS, Python Version and ArcticDB Version
ArcticDB 4.0.3
Backend storage used
On-prem S3
Additional Context
No response