
Python 3.9 Support #382

Closed
johnthagen opened this issue Aug 29, 2020 · 18 comments
@johnthagen (Contributor) commented Aug 29, 2020

Tracking issue for Python 3.9 support, set to release on Oct 5, 2020.

Python 3.9rc1 has been released, so it is possible to test in a near-final state. Travis CI has a python3.9-dev environment that could be used to test builds.

@skvark (Member) commented Aug 31, 2020

The different Python versions are provided by the manylinux images and multibuild, and 3.9 is probably available in them as well. The Travis build host's Python is not used in the builds or tests.

@johnthagen (Contributor, Author) commented

Python 3.9 was released today.

@skvark (Member) commented Oct 5, 2020

Yeah, the manylinux2014 Docker images in this repo need to be rebuilt once the base images have been updated from rc2 to the final version: https://github.com/pypa/manylinux/blob/master/docker/build_scripts/build_env.sh

@skvark (Member) commented Oct 6, 2020

I'll make the needed changes and rebuilds today; manylinux and multibuild were just updated to the official 3.9 release version.

We also need to wait for the numpy 1.19.3 release, which should be available in a week or so.
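Projects shipping binary wheels typically express this kind of dependency as a version-conditional numpy pin. A minimal sketch of the idea; the floor versions for older Pythons below are illustrative assumptions, except 1.19.3, which this thread names as the first numpy release with Python 3.9 wheels:

```python
# Sketch: choosing the minimum numpy pin per Python version, the way
# binary-wheel projects typically do it. Only the 1.19.3 floor for
# Python 3.9 comes from this thread; the other floors are assumptions.
import sys


def numpy_requirement(py=sys.version_info[:2]):
    """Return the oldest usable numpy requirement for the given Python."""
    if py >= (3, 9):
        return "numpy>=1.19.3"  # first numpy with Python 3.9 wheels
    if py >= (3, 8):
        return "numpy>=1.17.3"  # assumed floor for 3.8
    return "numpy>=1.13.3"      # assumed floor for older Pythons


print(numpy_requirement((3, 9)))  # prints numpy>=1.19.3
```

In a real setup.py this same table would usually be written as environment markers in `install_requires`, e.g. `"numpy>=1.19.3; python_version>='3.9'"`.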

@skvark (Member) commented Oct 6, 2020

Also waiting for this: appveyor/ci#3541

Branch with the changes: https://github.com/skvark/opencv-python/tree/feat/python39

@skvark (Member) commented Oct 7, 2020

numpy/numpy#17482

@cclauss (Contributor) commented Oct 29, 2020

numpy 1.19.3 delivers Python 3.9 binary wheels on all supported platforms: https://github.com/numpy/numpy/releases/tag/v1.19.3

@skvark (Member) commented Oct 31, 2020

Travis builds look OK, but I have to think about what to do with Appveyor. Python 3.9 was installed only on the Visual Studio 2019 images (appveyor/ci#3541 (comment)), and that's not going to work for this project. Currently, Visual Studio 2015 is used for building the wheels. scikit-build does not recognize VS 2019 as a valid build tool, so I'll have to install Python 3.9 manually in the Appveyor environment before builds. Upgrading to VS 2019 would also affect the VC++ redistributable version needed to run these packages.

@skvark (Member) commented Nov 2, 2020

Release builds are now in progress, but it will take over 24 hours before all wheels have been built and uploaded to PyPI.

@skvark (Member) commented Nov 3, 2020

Some of the wheels are not yet uploaded: opencv-python-headless hit the PyPI project size limit, which prevents some builds from uploading new wheels. I'll delete some of the oldest releases from PyPI today to make room for new ones.

@skvark (Member) commented Nov 3, 2020

I don't understand why some projects on PyPI have a 20 GB limit, some a 30 GB limit, and some a 40 GB limit... maybe it has something to do with the time of creation ¯\_(ツ)_/¯. I uploaded the missing opencv-python-headless wheels manually from my Azure artifact storage to PyPI. The Windows builds are still running and will hopefully finish today.

skvark closed this as completed Nov 3, 2020
@johnthagen (Contributor, Author) commented Nov 3, 2020

pypa/packaging-problems#86 (comment):

> We can increase the limit on a package per package basis.

It looks like they will happily increase the limit for legitimate packages if you ask. To me, opencv-python seems like a very good candidate, given how many precompiled wheels are needed to host it properly.

Looks like you can ask here if you want: https://github.com/pypa/warehouse/issues

@skvark (Member) commented Nov 3, 2020

Yeah, I already have individual package size limit requests in place (the source distributions for the contrib variants are missing because of this): #412

However, I will probably also need to request limit increases for the projects on PyPI. Current status for the different projects:

opencv-python:                  28.4 GiB used / 40.0 GiB limit
opencv-contrib-python:          30.6 GiB used / 40.0 GiB limit
opencv-python-headless:         19.5 GiB used / 20.0 GiB limit
opencv-contrib-python-headless: 24.0 GiB used / 30.0 GiB limit
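Figures like these can be tallied from PyPI's public JSON API (`https://pypi.org/pypi/<name>/json`), whose `releases` mapping lists every uploaded file with a `size` field in bytes. A small sketch; the summing helper is pure so it can be checked offline, while the fetch function needs network access:

```python
# Sketch: tallying a PyPI project's total size via the JSON API.
# Assumes the documented /pypi/<name>/json endpoint with a "releases"
# mapping of version -> list of file dicts carrying a "size" in bytes.
import json
import urllib.request


def total_size_bytes(releases):
    """Sum the byte sizes of every file across all releases."""
    return sum(f["size"] for files in releases.values() for f in files)


def fetch_project_size_gib(name):
    """Fetch a project's metadata from PyPI and return its size in GiB."""
    url = "https://pypi.org/pypi/{}/json".format(name)
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return total_size_bytes(data["releases"]) / 2**30


if __name__ == "__main__":
    print("{:.1f} GiB".format(fetch_project_size_gib("opencv-python")))
```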

@cclauss (Contributor) commented Nov 3, 2020

Staggering! They are massive.

@skvark (Member) commented Nov 3, 2020

In addition to the PyPI artifacts, I'm currently storing about 150 GB of macOS and Linux build artifacts in Azure. Luckily, Appveyor provides free native artifact storage.

I think the wheels have a reasonable size (around 20-50 MB each, depending on platform); there are just a lot of releases, with maybe 20 wheels in each of them.

Have a look at tensorflow, for example, where a single wheel is several hundred MB: https://pypi.org/project/tensorflow/#files

@skvark (Member) commented Nov 3, 2020

pypi/support#712

@mshabunin commented
There is a PYTHON3_LIMITED_API option which enables building a single binary for all Python 3.x versions (>=3.4) (opencv/opencv#14736), but numpy must be >=1.17 or <1.15. I'm also not sure how it aligns with all the toolchains, manylinux, etc.
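For context, the limited API is CPython's stable ABI: an extension compiled with `Py_LIMITED_API` defined can be tagged `abi3` and run unchanged on every later Python 3.x. A minimal setuptools sketch of that mechanism, assuming a recent setuptools that understands `py_limited_api`; the module and source names are illustrative, and whether OpenCV's build chain can honor these flags end-to-end is exactly the open question here:

```python
# Sketch: declaring a C extension against CPython's stable ABI so one
# wheel covers many Python 3 versions. Names are hypothetical; this is
# not opencv-python's actual setup.py.
from setuptools import Extension, setup

ext = Extension(
    "example_abi3",                  # hypothetical module name
    sources=["example_abi3.c"],      # hypothetical source file
    define_macros=[("Py_LIMITED_API", "0x03040000")],  # stable ABI >= 3.4
    py_limited_api=True,             # emit an abi3-suffixed binary
)

if __name__ == "__main__":
    # Building with `python setup.py bdist_wheel --py-limited-api cp34`
    # tags the wheel cp34-abi3, so a single binary runs on Python >= 3.4.
    setup(name="example-abi3", version="0.1", ext_modules=[ext])
```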

@skvark (Member) commented Nov 6, 2020

That limited API sounds great if it works reliably. Numpy can be forced to >=1.17, but this also needs further investigation and testing. I added an issue so that I don't forget it: #414
