Read GOES-R data from HTTPS, AWS, GCP or Azure (it uses h5netcdf) #1424
Conversation
Codecov Report
@@ Coverage Diff @@
## master #1424 +/- ##
==========================================
- Coverage 90.58% 90.58% -0.01%
==========================================
Files 236 236
Lines 33797 33804 +7
==========================================
+ Hits 30615 30620 +5
- Misses 3182 3184 +2
Continue to review full report at Codecov.
@djhoese any opinion on this?
@mraspaud Does the base reader support your FSFile stuff yet? I'd say this PR is an alternative that is specific to this reader, so I'd prefer a documented example of how to use the FSFile stuff rather than this.
Yes, it does.
This second method, @djhoese, uses h5netcdf. I think it is the better option, but it has an error that I have tried to track down without success.
It is adapted from pydata/xarray#1075.
When I load the data directly over HTTPS using xarray it works fine, but in Satpy's "abi_base.py" it gets stuck.
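The file-like-object mechanism that method relies on can be sketched offline; this is a minimal sketch, not satpy code, and the dataset, the variable name "Rad", and the file name are invented stand-ins for a real ABI granule:

```python
import io
import os
import tempfile

import numpy as np
import xarray as xr

# Build a tiny stand-in NetCDF4 file; its bytes play the role of an
# object fetched over HTTPS or from S3/GCS (all names are illustrative).
ds = xr.Dataset({"Rad": (("y", "x"), np.arange(6.0).reshape(2, 3))})
path = os.path.join(tempfile.mkdtemp(), "fake_goes.nc")
ds.to_netcdf(path, engine="h5netcdf")

# The key point from pydata/xarray#1075: h5py, and therefore xarray's
# h5netcdf engine, accepts any seekable file-like object rather than
# only a path, so a buffer streamed from a remote store opens the same
# way a local file does.
with open(path, "rb") as f:
    buf = io.BytesIO(f.read())
reopened = xr.open_dataset(buf, engine="h5netcdf")
print(reopened["Rad"].shape)  # (2, 3)
```

The same `open_dataset(file_like, engine="h5netcdf")` call is what makes reading from a remote object store possible without downloading the file first.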
The advantage of this method is that you can read directly from S3 using boto3 (see pydata/xarray#1075 (comment)) and from Google Cloud Storage.
The latter I tried in Google Colab and it worked.
I did the test with this data on my local machine, and this is the error I get when I read the data:
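As an offline sketch of that remote-read pattern: fsspec exposes the same `open()` interface for the "s3", "gcs", and "https" backends as for the in-memory backend used below, so swapping the protocol string is essentially all that changes (the dataset and path names here are invented for illustration):

```python
import os
import tempfile

import fsspec
import numpy as np
import xarray as xr

# Serialize a tiny NetCDF4 file to stand in for a real GOES-R granule.
ds = xr.Dataset({"Rad": (("y", "x"), np.arange(4.0).reshape(2, 2))})
local = os.path.join(tempfile.mkdtemp(), "granule.nc")
ds.to_netcdf(local, engine="h5netcdf")

# "memory" stands in for "s3" (with anon=True for public buckets),
# "gcs", or "https": fs.open() returns a file-like object either way.
fs = fsspec.filesystem("memory")
with open(local, "rb") as src, fs.open("/granule.nc", "wb") as dst:
    dst.write(src.read())

# Read it back through the filesystem abstraction; .load() pulls the
# values into memory before the file handle closes.
with fs.open("/granule.nc", "rb") as remote_file:
    granule = xr.open_dataset(remote_file, engine="h5netcdf").load()
print(float(granule["Rad"].sum()))  # 6.0
```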
- [ ] Passes flake8 satpy
- [ ] Add your name to AUTHORS.md if not there already