`pip.parse` crashes with StopIteration if a platform restriction prevents a package from installing #1868

Comments
As a workaround, one can exclude that specific requirement from the
Given that colorama should only be included on Windows and you are hitting this error on non-Windows, I suspect that you are either trying to cross-compile, or trying to use colorama without including it in your lock file in a way that changes the conditional inclusion of the wheel from Windows-only to all platforms. At work we use a similar lockfile without any issues on rules_python 0.31, so I am perplexed as to why this is not working in your case.
We switched to a platform-agnostic lockfile (meaning that the same lockfile can be used on many platforms; it has all the needed info and all the markers). Consider that we have such lines in it.
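The exact entries from that lockfile are not reproduced above; purely as an illustration, a platform-agnostic requirements entry with an environment marker generally looks something like the following (the package, version, and hash are placeholders):

```
# Illustrative only: a conditional entry as emitted by lock tools.
colorama==0.4.6 ; platform_system == "Windows" \
    --hash=sha256:...
```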
we are running this on linux x86_64 and run into
I think one way to fix it would be to add code along the lines of

```python
import packaging.requirements

result_lines = []
for line in lines:
    r = packaging.requirements.Requirement(line)
    if r.marker and not r.marker.evaluate():
        # If the line has a marker and it does not match our platform of
        # interest, skip it.
        continue
    result_lines.append(line + "\n")
```
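One caveat with the sketch above, assuming it runs on top of the `packaging` library: `Marker.evaluate()` defaults to the environment of the interpreter executing it, so evaluating for a different target platform would require passing an explicit environment, for example:

```python
from packaging.markers import Marker

# Hypothetical target environment; the keys are PEP 508 marker variables.
target_env = {
    "platform_system": "Linux",
    "sys_platform": "linux",
    "python_version": "3.11",
}

marker = Marker('platform_system == "Windows"')
# The passed dict overrides the corresponding values taken from the
# interpreter that is running this code.
print(marker.evaluate(environment=target_env))  # -> False for this target
```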
After a little bit of thinking, I think the problem is that supporting cross-platform requirements would be something that detracts us from supporting other up-and-coming or existing cross-platform formats (e.g.

I proposed a change in #1885 to unblock people needing different versions per

Technically there is no parser of
This PR implements a better way of specifying the requirements files for different (os, cpu) tuples. It allows for more granular specification than what is available today and allows for future extension to have all of the sources in the select statements in the hub repository. This replaces the previous selection of the requirements, with a few differences in behaviour that should not be visible to the external user. Instead of selecting the right file and then using it to create `whl_library` instances, we parse all of the provided requirements files and merge them based on the contents. The merging is done based on the blocks within the requirements file, which allows the starlark code to understand whether we are working with different versions of the same package on different target platforms.

Fixes #1868. Work towards #1643, #735.
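For context on how the per-platform selection surfaces to users, a minimal sketch of the `pip.parse` extension with `requirements_by_platform` might look like the following in `MODULE.bazel` (the file names and platform strings here are illustrative; check the rules_python docs for the exact attribute semantics in your version):

```starlark
pip = use_extension("@rules_python//python/extensions:pip.bzl", "pip")
pip.parse(
    hub_name = "pip",
    python_version = "3.11",
    # Map each requirements lock file to the target platforms it covers.
    requirements_by_platform = {
        "//:requirements_linux.txt": "linux_x86_64,linux_aarch64",
        "//:requirements_windows.txt": "windows_x86_64",
    },
)
use_repo(pip, "pip")
```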
@aignas I just tested commit a6cb620 against my project, but it doesn't fix the issue. The problem I have is that I can't really modify the
If I understand correctly, poetry's lock file has a dependency that is only installed on a particular platform. In order to have a single version of the dependency, you would have to modify the poetry lock file to only contain a single version. Let me know if that works for you.
Sorry, I'm a bit late to the issue. @aignas I'm not sure I quite understand what you meant by

Best I can figure, you're saying that supporting cross-platform requirements would make it more difficult to support the non-

I'll also add that

is not a thing anyone should be doing, since it's a generated file and not a manually maintained one. With the correct edits, you're right that it should work. However, manually editing a
What I meant was that poetry, pdm and other tools usually lock the dependencies, and each platform has certain constraints on the packages they include. I do not mean modifying the lock files manually, but rather modifying the input files to the lock files. The problem with requirements.txt is that, up until now, rules_python has implicitly supported only a single version of each package in a given requirements file, because the starlark tooling does not support marker evaluation.

In your pyproject file you could modify your project constraints to ensure that the resulting pdm or poetry lock file contains a single version of each package instead of conditionally including different versions on different platforms. That would mean that when those tools export a requirements file, it would not have conditional dependencies and rules_python would be able to read it without a problem.

uv outputting requirements files that are specific to a platform imposes the same constraint that pip-compile was imposing on users: in general, the requirements file is only compatible with the platform that the file was generated on/for.

I am wondering if supporting poetry, pdm and hatch lock files is something that we should do, as they have more information in them, or if we should just support the environment markers in the requirements files.
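As a rough illustration of what "modifying the input files" could mean for a Poetry project (the package and versions below are placeholders, and the exact syntax may vary by Poetry version), a conditional dependency versus an unconditional one in `pyproject.toml` could look like:

```toml
[tool.poetry.dependencies]
# Conditional: locked only for Windows, which produces an env-marker line
# in the exported requirements file.
colorama = { version = "^0.4.6", markers = "platform_system == 'Windows'" }

# Unconditional alternative: a single entry on every platform, so the
# exported requirements file has no conditional dependencies.
# colorama = "^0.4.6"
```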
Thank you for elaborating and providing context! Sorry for the misunderstanding. I would think both supporting environment markers and the lockfiles would be very useful. Though if I had to choose, it would be environment markers, because there's a PEP for it, which makes it highly standardized and widely used/supported in third-party tools (and a single effort would provide better support for almost all third-party package managers).
We do support PEP 508 in the context of

I am happy to read PRs from anyone who may be willing to implement support for requirement markers in the parsing of the

Regarding the "standardized and widely used" part of your answer, the real world of

I am writing all of this down not necessarily to disagree with you, but to add more context to the discussion of supporting any
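For readers less familiar with the format under discussion, a PEP 508 requirement line with an environment marker can be inspected with the `packaging` library; the requirement string below is just an example:

```python
from packaging.requirements import Requirement

req = Requirement('colorama==0.4.6 ; platform_system == "Windows"')
print(req.name)       # colorama
print(req.specifier)  # ==0.4.6
print(req.marker)     # platform_system == "Windows"
```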
This is just a small PR to reduce the scope of bazelbuild#2059. Work towards bazelbuild#260, bazelbuild#1105, bazelbuild#1868.
…around (#2069)

This is extra preparation needed for #2059. Summary:

- Create `pypi_repo_utils` for more ergonomic handling of Python in the repo context.
- Split the resolution of requirements files to platforms into a separate function to make testing easier. This also allows additional validation that was found to be needed in the WIP feature PR.
- Make the code more robust about the assumption of the target platform label.

Work towards #260, #1105, #1868.
This also changes `local_runtime_repo` to explicitly check for supported platforms instead of relying on a `None` value returned by the helper method. This keeps the behaviour exactly the same as before this PR, and we can potentially drop the need for the validation in the future if our local_runtime detection becomes more robust.

This also makes the platform detection in `pypi_repo_utils` not depend on `uname` and only use the `repository_ctx`. Apparently `module_ctx.watch` throws an error if one attempts to watch files on the system (that seems to be left to `repository_rule`, and one can only do `module_ctx.watch` on files within the current workspace). This was surprising, but could be worked around by just unifying the code.

This splits out things from #2059 and makes the code more succinct. Work towards #260, #1105, #1868.
Before this change the `all_requirements` and related constants would include packages that need to be installed only on specific platforms, which means that users relying on those constants (e.g. `gazelle`) would need to do extra work to exclude platform-specific packages. The package managers that support outputting such files now include `uv` and `pdm`. This might also be useful in cases where we attempt to handle non-requirements lock files.

Note that the best way to handle this would be to move all of the requirements parsing code to Python, but that could cause regressions as it is a much bigger change. This change only adds extra processing for the requirement lines that have env markers. The lines that have no markers will not see any change in the code execution paths and the Python interpreter will not be downloaded. We also use the `*_ctx.watch` API where available to correctly re-evaluate the markers if the `packaging` Python sources change.

Extra changes that are included in this PR:
- Extend `repo_utils` with a method for getting the `arch` from the `ctx`.
- Change `local_runtime_repo` to perform the validation without relying on the implementation detail of `get_platforms_os_name`.
- Add a `$(UV)` make variable for the `uv:current_toolchain` so that we can generate the requirements for `sphinx` using `uv`.
- Swap the requirement generation for `sphinx` and co to a `genrule` using `uv` so that we can test the `requirement` marker code. Note, the `requirement` markers are not working well with `requirement_cycles`.

Fixes #1105. Fixes #1868. Work towards #260, #1975. Related #1663.

---------

Co-authored-by: Richard Levasseur <rlevasseur@google.com>
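For reference, and assuming a reasonably recent `uv` (flag names may differ between versions), a single platform-agnostic requirements file of the kind this PR handles can be generated with something like:

```shell
# Resolve once for all platforms, keeping environment markers in the output.
uv pip compile --universal requirements.in -o requirements_universal.txt
```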
…thon (#2135)

Before this PR the lockfile would become platform dependent when the `requirements` file had env markers. This was not caught because we do not have MODULE.bazel.lock checked into the `rules_python` repository: the CI runs against many versions and the lock file differs between them, therefore we could not run with `bazel build --lockfile_mode=error`. With this change we use the label of the `BUILD.bazel` file that lives next to the `python` symlink, and since the `BUILD.bazel` is the same on all platforms, the lockfile remains the same.

Summary
* refactor(uv): create a reusable macro for using uv for locking reqs.
* test(bzlmod): enable testing the MODULE.bazel.lock breakage across platforms.
* test(bzlmod): use a universal requirements file for 3.9. This breaks the CI, because the python interpreter file hash is added to the lock file.
* fix(bzlmod): keep the lockfile platform independent when resolving python

Fixes #1105 and #1868 for real this time. Implements an additional helper for #1975.
🐞 bug report
Affected Rule
The issue is caused by the `pip.parse` extension.

Is this a regression?
No.
Description
If the given requirement has a platform restriction like `platform_system == "Windows"` and `pip` refuses to download the requirements, then Bazel will crash.

🔬 Minimal Reproduction
https://github.com/sin-ack/rules_python-repro
🔥 Exception or Error
🌍 Your Environment
Operating System:
Output of `bazel version`:

Rules_python version:
Anything else relevant?
I'm using rules_python_poetry, which generates that verbose requirement line.