Proper (cross) compilation of pip packages #453
If you've not seen them already, the dbx_build_tools rules from Dropbox are currently the most sophisticated in terms of building third-party Python code from source using toolchains registered with Bazel. Python also often requires Fortran to be set up for many of the numerical/scientific libraries, if I recall correctly, but the Dropbox rules should make this clearer.

When you talk about "cross-compilation", do you mean cross-compilation across CPU architectures (x86 vs. ARM, etc.) or also across operating systems (Windows, macOS, Linux)? I ask because it is relatively common to come across "platform-specific" dependencies: PEP 508 defines so-called "environment markers" that change the dependency graph based on the system where the installation is happening. One example from Jupyter is https://github.com/ipython/ipython/blob/master/setup.py#L212.

Python does not currently have strictly separate resolve, build, and install steps. It is the "resolve + build" step that is currently coupled, but Python is taking some steps towards making as much of this "static" as possible. However, in the vast majority of cases, …
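To make the environment-marker point concrete: a minimal stdlib-only sketch of how a PEP 508 marker such as `appnope; sys_platform == "darwin"` (the IPython example linked above is of this kind) gates whether a dependency is installed at all. Real resolvers use the third-party `packaging` library; this hand-rolled evaluator handles only a single `==` comparison and is purely illustrative.

```python
# Illustrative sketch: evaluate one simple PEP 508 environment marker.
# Real tools (pip, packaging.markers) handle the full marker grammar.
import sys

# A requirements line like:  appnope; sys_platform == "darwin"
requirement = ("appnope", 'sys_platform == "darwin"')

def marker_applies(marker: str) -> bool:
    # Handle only the single `name == "value"` comparison form used above.
    lhs, _, rhs = marker.partition("==")
    env = {"sys_platform": sys.platform}
    return env[lhs.strip()] == rhs.strip().strip('"')

name, marker = requirement
if marker_applies(marker):
    print(f"install {name}")
else:
    print(f"skip {name} on {sys.platform}")
```

The consequence for cross-compilation is that the dependency graph itself differs per target platform, so a single resolve on the host machine is not enough.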
Thanks, I've recently become aware of them but have not yet explored what they're capable of. Thanks for the pointer re: Fortran.
I mean cross OS and cross architecture. For now I'm installing pre-compiled wheels by specifying a number of additional pip arguments:
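For illustration only (these are not necessarily the exact flags referred to above), a pip invocation that fetches prebuilt wheels for one specific target platform typically looks something like:

```shell
# Fetch prebuilt wheels only (never build from source) for a fixed
# target platform / interpreter. All values here are illustrative.
pip download \
  --only-binary=:all: \
  --platform manylinux2014_x86_64 \
  --implementation cp \
  --python-version 39 \
  --abi cp39 \
  --dest wheels/linux_x86_64 \
  numpy
```

Because `--platform` is pinned, a separate invocation (and output directory) is needed per target platform.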
Right now I work around this by specifying different pip repositories for each platform and then a sort of "meta" …

My approach seems to work fine for precompiled wheels, but I'm now seeing the need to extend rules_python so that it works for packages that require compilation.
The Python packaging ecosystem is always evolving, so in some sense it might be wise to take a "wait and see" approach until things have stabilised. Of course, things may never settle. :)

For example, there is an evolving standard in PEP 517. It clarifies the responsibilities of the so-called "build front-end" and "build back-end". Any package author is able to define and configure how their package builds from the package's source tree through their build back-end, but package end-users are responsible for selecting and configuring the build front-end. It might be possible someday for Bazel's rules_python to become a compliant PEP 517 build front-end if this standard is accepted and the ecosystem adopts it. However, what does that then mean if a package author wishes to use …
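For context, the front-end/back-end split described above is declared in the package's `pyproject.toml`. A minimal `[build-system]` table naming setuptools as the build back-end looks like this (the values follow PEP 517/518 generally; nothing here is specific to this issue):

```toml
[build-system]
requires = ["setuptools>=40.8.0", "wheel"]
build-backend = "setuptools.build_meta"
```

A PEP 517 front-end (pip today, conceivably Bazel someday) reads this table, installs the listed requirements into an isolated environment, and asks the named back-end to produce the wheel.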
Not sure if you've considered it, but another option for your use case, if they are always third-party dependencies you are looking to compile, would be to build (precompile) the wheels for your target platforms and host the artifacts in your own mirror (Artifactory, etc.). Then Bazel can always work against prebuilt wheels, which it simply needs to download and unpack.
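Sketching that mirror-based workflow (the repository URL and upload tooling are placeholders, not a recommendation of specific infrastructure):

```shell
# Precompile wheels for the current platform and publish them to an
# internal index; repeat on (or for) each target platform.
pip wheel -r requirements.txt --wheel-dir dist/
twine upload --repository-url https://pypi.example.internal/simple dist/*.whl
```

Bazel then only ever downloads and unpacks known-good binary artifacts, sidestepping the compile-at-fetch-time problem entirely.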
Related issue: #260 ("Cross compilation" of py_binary/py_image/py_library targets)
This issue has been automatically marked as stale because it has not had any activity for 180 days. It will be closed if no further activity occurs in 30 days.

This issue was automatically closed because it went 30 days without a reply since it was labeled "Can Close?"
Hi @ajbouh, I'm trying to replicate your setup in my org. I am getting the error below:

As I understand that …
🚀 feature request
I want to use pip packages that require compilation to install. I'd also like to use Bazel's platform transitions and a hermetic cross-compilation toolchain to easily build pip packages for any (Linux) target.[1]
Relevant Rules

`pip_parse`
Describe the solution you'd like

I expect we'd need to adjust `pip_parse` logic to do wheel extraction (or compilation) during Bazel's execution phase. If we can switch `pip_parse` to generate `BUILD` files that do the wheel extraction / pip install in a `genrule` or other non-repository rule, then we can use the toolchain Bazel has selected for the target platform. Note that in the non-cross-compilation case this would just be the host platform cc toolchain.

There are two ways to get access to the current cc toolchain: a `genrule` and a command that uses the make variables `CC` and `CC_FLAGS` to run the pip command (see https://docs.bazel.build/versions/master/be/make-variables.html#custom_variables).

Describe alternatives you've considered

Leave package installation / compilation in Bazel's loading phase and use repository rule workarounds to find and set `CC` and `CFLAGS`. This is feasible but slower (since repository rules do not benefit from fine-grained parallelism), harder to maintain, and kind of a mess.

We could also continue with the status quo of manually porting non-wheel-based pip packages over to build with Bazel. This is very challenging, time-consuming, and error-prone. Doubly so in the context of cross-compilation.
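A rough sketch of what the `genrule`-plus-make-variables approach described above might look like in a generated `BUILD` file. Everything here is an illustrative assumption (target names, the tar packaging, the pip invocation), not working rules_python output:

```starlark
# Run pip inside a genrule so that any C extensions are compiled with the
# cc toolchain Bazel resolved for the *target* platform, not the host.
genrule(
    name = "numpy_install",
    srcs = ["requirements.txt"],
    outs = ["numpy_site_packages.tar"],
    cmd = """
        CC="$(CC)" CFLAGS="$(CC_FLAGS)" \
        pip install --no-binary :all: --target site-packages numpy && \
        tar -cf $@ site-packages
    """,
    # Makes $(CC) and $(CC_FLAGS) available as make variables in cmd.
    toolchains = ["@bazel_tools//tools/cpp:current_cc_toolchain"],
)
```

Because this runs in the execution phase, it benefits from Bazel's action-level parallelism and caching, unlike a repository rule running during loading.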
[1]: I have been building a hermetic cross-compilation toolchain for Bazel that's relatively small and straightforward to use. It patches rules_docker to use platform transitions for targets that are used inside the `cc_image`. Here's a preview: https://github.com/ajbouh/bazel-zig-cc