
Merge constraint contents for different envs #826

Open
webknjaz opened this issue May 27, 2019 · 21 comments
Labels
feature (Request for a new feature), needs discussion (Need some more discussion), question (User question), support (User support)

Comments

@webknjaz
Member

webknjaz commented May 27, 2019

UPD (Nov 30, 2023): I changed my mind, having understood the consequences and having gained a better understanding of the challenges this would present:


Known closed and/or duplicate issues that may have some unique context/conversations:


I want to run pip-compile against the same set of requirements files under different envs (Python interpreters) and get a combined output.
Currently, it doesn't seem possible. Am I overlooking something?

Environment Versions

N/A

Steps to replicate

N/A

Expected result

The resulting file is extended with extra entries.

Actual result

The resulting file is overwritten.

@atugushev
Member

Hello @webknjaz,

Thanks for the report. See #651 for detailed info on how to deal with different envs.

@webknjaz
Member Author

@atugushev thanks. I usually have -c constraints.txt at the top of my requirements file, which means the constraints file is also shared across envs, leading to missing pins in some of them. Currently I have to edit things manually.
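
For illustration, the shared layout looks roughly like this (file and package names below are placeholders, not my actual project):

requirements.in:

-c constraints.txt
some-runtime-dep
some-backport ; python_version < '3'

Since the same constraints.txt is referenced no matter which interpreter runs pip-compile, pins produced under one env end up missing for the others.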

atugushev added the question label on May 27, 2019
@atugushev
Member

Currently, it doesn't seem possible. Am I overlooking something?

@webknjaz does #826 (comment) resolve the issue?

atugushev added the awaiting response label on Apr 12, 2020
@webknjaz
Member Author

No, it's not what I was asking about.

webknjaz removed the awaiting response label on Apr 13, 2020
@atugushev
Member

atugushev commented Apr 13, 2020

I see. Unfortunately, it's not possible (at the moment). Currently, pip-tools compiles requirements, evaluates markers, and outputs the results without them. The official stance of pip-tools regarding cross-environment usage is that pip-compile "must be executed for each environment" (as I pointed out before). I believe this decision was made deliberately to keep things simple.

Anyway, "merging markers" doesn't seem impossible. At least pipenv and poetry have solved this issue somehow (IIRC). This might be challenging, but the feature is worth it. Let's mark it as a feature request then.

atugushev added the feature label and removed the question label on Apr 13, 2020
@AndydeCleyre
Contributor

AndydeCleyre commented Apr 16, 2020

@webknjaz With the new 5.0.0 release come some new possibilities for working with multi-environment markers, as pip-sync will now happily ignore non-env-matching reqs.

I'm not sure what the best solutions and workflows might be, but here's something I played around with. Maybe it can give you some ideas, and then you can give back some ideas!

platform-compile.sh:

#!/bin/dash -e
# platform-compile.sh [--base <basetxt>] [<reqsin>] [<pip-compile-arg>...]

python_version="$(python -c 'from __future__ import print_function; import platform; print(*platform.python_version_tuple()[:2], sep=".")')"
sys_platform="$(python -c 'from __future__ import print_function; import sys; print(sys.platform)')"
machine="$(python -c 'from __future__ import print_function; import platform; print(platform.machine())')"

if [ "$1" = '--base' ]; then
    base="$2"
    shift 2
else
    unset base
fi

if [ -r "$1" ]; then
    reqsin="$1"
    shift
else
    reqsin="requirements.in"
fi

txt="py${python_version}-${sys_platform}-${machine}-$(printf '%s' "${reqsin}" | sed 's/\.in$//').txt"
markers="; python_version ~= '${python_version}' and sys_platform == '${sys_platform}' and platform_machine == '${machine}'"

if [ "$base" ] && [ "$base" != "$txt" ]; then
    cp "$base" "$txt"
fi

pip-compile --no-header "$reqsin" -o "$txt" "$@"
sed -i -E "s/(^[^;]+==[^;#]+)(#|$)/\1${markers}  \2/g" "$txt"

Then let's say we've got this requirements.in:

pathlib ; python_version < '3'
httpx ; python_version >= '3'

Running

$ ./platform-compile.sh

on my system in a py3 env will generate py3.8-linux-x86_64-requirements.txt:

certifi==2020.4.5.1       ; python_version ~= '3.8' and sys_platform == 'linux' and platform_machine == 'x86_64'  # via httpx
chardet==3.0.4            ; python_version ~= '3.8' and sys_platform == 'linux' and platform_machine == 'x86_64'  # via httpx
h11==0.9.0                ; python_version ~= '3.8' and sys_platform == 'linux' and platform_machine == 'x86_64'  # via httpx
h2==3.2.0                 ; python_version ~= '3.8' and sys_platform == 'linux' and platform_machine == 'x86_64'  # via httpx
hpack==3.0.0              ; python_version ~= '3.8' and sys_platform == 'linux' and platform_machine == 'x86_64'  # via h2
hstspreload==2020.4.14    ; python_version ~= '3.8' and sys_platform == 'linux' and platform_machine == 'x86_64'  # via httpx
httpx==0.12.1 ; python_version >= "3"  # via -r requirements.in
hyperframe==5.2.0         ; python_version ~= '3.8' and sys_platform == 'linux' and platform_machine == 'x86_64'  # via h2
idna==2.9                 ; python_version ~= '3.8' and sys_platform == 'linux' and platform_machine == 'x86_64'  # via httpx
rfc3986==1.4.0            ; python_version ~= '3.8' and sys_platform == 'linux' and platform_machine == 'x86_64'  # via httpx
sniffio==1.1.0            ; python_version ~= '3.8' and sys_platform == 'linux' and platform_machine == 'x86_64'  # via httpx
urllib3==1.25.9           ; python_version ~= '3.8' and sys_platform == 'linux' and platform_machine == 'x86_64'  # via httpx

Doing so in a py2 env will generate py2.7-linux2-x86_64-requirements.txt:

pathlib==1.0.1 ; python_version < "3"  # via -r requirements.in

Then I can have a manually created (though potentially scripted) requirements.txt:

-r py2.7-linux2-x86_64-requirements.txt
-r py3.8-linux-x86_64-requirements.txt

The result is that whether I'm in a py2 or py3 env, I can use either

$ pip install -r requirements.txt

or

$ pip-sync requirements.txt

to get up and running.

So with something like this, you could also have a constraints.in, and a manually written constraints.txt which just includes all the platform-specific constraints files (e.g. -r py2.7-linux2-x86_64-constraints.txt...).

@altendky

I imagine the 'real' solution to multi-environment locking to be a two-step process. First, run in each environment to collect a bunch of info required for locking and store it to a file or files. Second, have a single run (that doesn't care what env it is in) that does the resolving using the sum of all collected info. The point here is to create a common set of requirements for all envs if at all possible. Processing each environment separately could easily result in twisted==20.3 in one environment and twisted==19.10 in another, even if 19.10 would have satisfied both.

This would apply equally, I think, to environment variance across Python version, interpreter, OS, and even 'groups'/'layers'. The last is used in my case for 'normal installation' vs. 'testing' vs. 'development', for example (kind of like 'extras', but not quite).

So, I'm confident this would be a lot of work regardless but... does that sort of two-stage situation already exist in pip-tools even if not available presently as completely separate steps?
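
For the sake of discussion, here is a minimal sketch of what the second, env-independent stage could look like, assuming each per-env run had already dumped its acceptable version ranges to a JSON file. Everything here is hypothetical (the file layout and the merge_collected helper are not an existing pip-tools API):

import json

from packaging.specifiers import SpecifierSet


def merge_collected(paths):
    """Intersect per-env version ranges into one cross-env range per package."""
    merged = {}
    for path in paths:
        with open(path) as f:
            # hypothetical stage-1 output, e.g. {"twisted": ">=19.10, <21.0"}
            collected = json.load(f)
        for name, spec in collected.items():
            merged[name] = merged.get(name, SpecifierSet()) & SpecifierSet(spec)
    return merged


# merged = merge_collected(["py3.8-linux.json", "py2.7-linux2.json"])
# A real second stage would then run a resolver against the merged ranges and
# pick a single version per package (or per-env versions plus markers) that
# satisfies every environment at once.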

@altendky

Oh, for completeness, the entirely undocumented http://github.com/altendky/boots is what I use as my front end to achieve 1) multiple 'groups' and 2) multiple operating systems (via http://github.com/altendky/romp). https://github.com/altendky/pm/tree/master/requirements is an example using those.

@AndydeCleyre
Contributor

The point here is to create a common set of requirements for all envs if at all possible. Processing each environment separately could easily result in twisted==20.3 in one environment and twisted==19.10 in another, even if 19.10 would have satisfied both.

I think that could be achieved by modifying my hack above to treat one of the platforms as the default/standard/boss, and having the others -c constrained by that one's output.

@AndydeCleyre
Contributor

Actually, no, it would probably ignore those "constraints" since the markers wouldn't match. Whoops.

@altendky

The intermediate data needs to describe acceptable ranges of versions, not pinned versions.

@AndydeCleyre
Contributor

AndydeCleyre commented Apr 16, 2020

Can you speak more about the intermediate data here?

Another hack that actually does work for that would be to copy the "default" env's txt over the current env's txt before compilation. pip-compile still picks up the versions there as preferred/good-enough and uses them for the current env.

EDIT: I've updated the above platform-compile.sh with --base, to demonstrate.
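
Roughly, the two-pass usage looks like this (the file name is the one the script generates on my machine, and each command has to run under the matching interpreter). First, in the py3.8 env:

$ ./platform-compile.sh

Then, in the py2.7 env, seeding from the py3.8 output so that pins shared between both envs carry over:

$ ./platform-compile.sh --base py3.8-linux-x86_64-requirements.txt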

@AndydeCleyre
Contributor

AndydeCleyre commented May 2, 2020

@altendky --base in the hack above should enable the workflow for 2 layers. For more, maybe it could be done one --base at a time, looping until either stabilization is achieved or a max-tries limit is reached.

EDIT: I think I understand what you mean about the "intermediate data" now -- you want full smart-resolver behavior to account for all the .in requirements across environments. That's tough, especially considering we just lean on pip's resolver here, and I guess we'd need a sort of meta-resolver on top, or to get more superpowers into pip's upcoming new resolver directly.

@altendky

altendky commented May 3, 2020

I... thought I responded to your request for more details, but apparently I never submitted it. :[ Sorry.

Yeah, by all means not a trivial task to actually implement all this. The key though seems to be in separating the collecting of requirements (x needs y between v1 and v3, y needs z after v4, etc) and the resolution of them. That way you can collect from multiple places (Python versions, platforms, etc) and then merge all that together in one Python process to resolve.

While I understand the difficulty both in implementing this and the extra resources needed by the user (access to all relevant platforms), it does strike me that, for all the cross-platform nature of Python and its community, locking is one thing that doesn't seem to be even remotely multi-platform ready.

anthrotype added a commit to googlefonts/picosvg that referenced this issue Nov 30, 2020
... containing all top-level (*.in) and concrete requirements.txt, the
latter parametrized by python version.

We have a bunch of dependencies that are only needed on specific python
versions (e.g. backports like dataclasses, importlib_resources, etc).
We use conditional environment markers for those in the top-level
requirements.in files.
pip-compile does not yet support generating a combined requirements.txt
file where differences across python versions/platforms/archs are
reconciled using environment markers:
jazzband/pip-tools#826

They currently recommend running pip-compile on each targeted python
environment generating as many concrete requirements.txt files
jazzband/pip-tools#651

So this is what I have done here.
@NicolasT

FWIW, we had a similar situation (one dependency, doit, pulling in either pyinotify or macfsevents depending on the platform), which required us to render our requirements.in into two different files, requirements-Linux.txt and requirements-Darwin.txt, and keep both. That's cumbersome if one has no access to a system running one platform or the other.

Until a real solution to this problem is found, I built a work-around using the following approach, though it only works if the platform-specific dependencies are fairly self-contained (little to no transitive deps), don't change often (as you'll see, we effectively pin their version instead of updating them using pip-compile) and top-level dependencies don't eagerly add more platform-specific ones.

The trick is as follows:

  • I took the diff between our existing requirements-Linux.txt and requirements-Darwin.txt, resulting in an entry for pyinotify and macfsevents (both with hashes etc.)
  • Put these entries in platform-requirements.txt, each with a 'platform selector' added manually
  • Add -c platform-requirements.txt at the top of requirements.in
  • Adjust the script running pip-compile as follows:
    • Run pip-compile from requirements.in on the system, as always, using a temporary file as output
    • Remove the pyinotify and macfsevents entries from the temporary file (one of them will exist, depending on the platform on which this is executed). I use a bit of (g)awk to get this done, I guess there are 'better' ways but this does the job
    • Concatenate the temporary file (now missing one dependency) with platform-requirements.txt, effectively restoring the dep removed before by the 'right' platform-specific one (thanks to the 'platform selector'), though version-pinned

The resulting requirements.txt can be used on both Linux and Darwin systems. If other platforms ought to be supported, the trick can be extended, I guess.

To see what this looks like code-wise, see scality/metalk8s@f1f5f9c
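
A rough command-level sketch of that flow, using the package names from this example (the temporary file names are made up) and assuming --generate-hashes is off; hashed entries span multiple lines, so the real script in the linked commit has to filter them more carefully than a one-line awk:

$ pip-compile requirements.in --output-file requirements.tmp
$ awk '$1 !~ /^(pyinotify|macfsevents)==/' requirements.tmp > requirements.filtered
$ cat requirements.filtered platform-requirements.txt > requirements.txt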

@webknjaz
Member Author

@AndydeCleyre I've been experimenting with an idea similar to yours lately and got a PoC with tox + per-env constraints files. But only the installation part, not matrix generation, for which I'll look into how I can use GHA.

As a result, I no longer think that merging constraints is the best idea. Linters constraints should not influence how the docs are built or how the tests are run. Also, limitations of the wheels available for macOS shouldn't influence the deps on Windows. Python 2 deps shouldn't necessarily define how Python 3-only bits are tested. And so on.

@altendky

There is value in some cases to have cross-talk between the envs. There's something nice about having the same version of library X everywhere. For some use cases...

@webknjaz
Member Author

Agreed, it's usecase-dependent. I've figured out that my use case is probably different. Or at least, some of my envs need to be separate.

@AndydeCleyre
Contributor

Maybe we should close either #1326 or this one? I think they're getting at the same questions.

@webknjaz
Member Author

matrix generation for which I'll look into how I can use GHA.

So I now have all the pieces in my PoC. It's not generally reusable yet, but feel free to look into these bits of automation:

I hope to make these more generic and composable one day, but for now it's a nice, working demo that I've been using since Jan 4, 2022.

@ssbarnea
Member

@webknjaz I would really like to see a combined constraints file, especially because I find the splitting into different files for each python/os/architecture a colossal maintenance PITA. Still, I don't know when pip-tools will be able to deal with this.

I wonder if we could at least add support for the most basic type of conditions, when we have something like:

foo<1.0; python_version < '3.9'
foo>=2.0; python_version >= '3.9'

IMHO, over 19 out of 20 conditions are just like this, looking only at python_version and nothing else.
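
As a very rough illustration of that narrow case: a merge of two compiled files produced under different interpreters could just diff the pins and attach a python_version marker to the ones that differ. The sketch below is hypothetical (it assumes plain name==version lines with no hashes and no pre-existing markers, and a single version boundary):

def parse_pins(path):
    """Collect {package: version} from simple 'name==version' lines."""
    pins = {}
    with open(path) as f:
        for line in f:
            line = line.split("#", 1)[0].strip()
            if "==" in line and ";" not in line:
                name, version = line.split("==", 1)
                pins[name.strip().lower()] = version.strip()
    return pins


def merge(old_txt, new_txt, boundary="3.9"):
    """Combine two pin sets, marking divergent pins with python_version markers."""
    old, new = parse_pins(old_txt), parse_pins(new_txt)
    lines = []
    for name in sorted(set(old) | set(new)):
        a, b = old.get(name), new.get(name)
        if a == b:
            lines.append(f"{name}=={a}")
            continue
        if a is not None:
            lines.append(f"{name}=={a} ; python_version < '{boundary}'")
        if b is not None:
            lines.append(f"{name}=={b} ; python_version >= '{boundary}'")
    return "\n".join(lines)


# print(merge("constraints-py38.txt", "constraints-py39.txt", boundary="3.9"))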

webknjaz added the question, support, and needs discussion labels on Dec 1, 2023