
release pants as a pex #4896

Closed
kwlzn opened this issue Sep 26, 2017 · 29 comments

@kwlzn (Member) commented Sep 26, 2017

now that we're adequately entangled in platform-specific deps that are often failing to resolve (or build) against pypi, it would probably make sense for us to start building and releasing a binary pex of pants alongside (or in place of) sdists. this should eliminate most customer issues related to resolves and/or resolve-time builds of native deps at pants install/bootstrap time, and ensure a consistent, tested set of backing wheels for running pants in most cases.

at Twitter, we deploy pants only via pex using a custom build and release process. having a formally released OSS pex that we can directly consume will help us stay closer to shared releases vs direct sha consumption.

kwlzn added the UX label Sep 26, 2017
@kwlzn (Member, Author) commented Sep 26, 2017

core problems to solve here:

  • can we get a travis instance of OSX on a platform old enough to ensure forward compatibility for the last 3-4 OSX versions? native deps like psutil built on OSX 10.12 can fail to run on 10.11/10.10 due to linking to newer libc versions.
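A rough illustration of the concern (hypothetical helper names; real macOS wheel selection also involves the build's deployment target, not just the tag):

```python
# A macOS wheel encodes the OS version it was built against in its
# platform tag (e.g. "macosx_10_12_x86_64"). A wheel targeting a newer
# OS than the host should be rejected -- hence building on the oldest
# supported version. Simplified tag parsing; illustrative only.

def macos_tag_version(platform_tag):
    # "macosx_10_12_x86_64" -> (10, 12)
    parts = platform_tag.split("_")
    return (int(parts[1]), int(parts[2]))

def compatible(wheel_tag, host_version):
    # Usable only if the wheel targets the host's OS version or older.
    return macos_tag_version(wheel_tag) <= host_version

print(compatible("macosx_10_12_x86_64", (10, 10)))  # built on 10.12, host 10.10
print(compatible("macosx_10_10_x86_64", (10, 12)))  # built on 10.10, host 10.12
```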

@jsirois (Contributor) commented Sep 26, 2017

can we get a travis instance of OSX on a platform old enough ...

I think likely not, but at worst it seems worth paying again for a https://macstadium.com instance. They offer OS choices as old as 10.7, and even VMware, in which case we could run OSX VMs (unsure of licensing costs for OSX versions) and generate binaries per OSX version if we're super paranoid about compat.

@kwlzn (Member, Author) commented Sep 26, 2017

sounds great.

@jsirois (Contributor) commented Oct 2, 2017

can we get a travis instance of OSX on a platform old enough ...

I think likely not ...

We can, using `osx_image: xcode6.4`, which nets us osx 10.10. I'm trying to flip to this in #4914

@kwlzn (Member, Author) commented Oct 2, 2017

nice

@wisechengyi (Contributor) commented:

Could you clarify:

  1. Is the plan to release just pantsbuild.pants as a pex, or all the other packages together in a single pex?
  2. Where are the pexes going to be stored?
  3. Will this change the existing publishing story to pypi?

@kwlzn (Member, Author) commented Oct 2, 2017

  1. up for discussion, but I'd assume just the core pants package as a pex, with the others bootstrapped as plugins via the existing plugin resolver.
  2. github releases let you directly attach files, so probably there (as we do with pex: https://github.com/pantsbuild/pex/releases/tag/v1.2.12)
  3. shouldn't, no.

@jsirois (Contributor) commented Oct 7, 2017

Now that #4906 is in - we release pants via wheels, including linux/mac specific ones for the main pantsbuild.pants distribution - the prerequisites for releasing a pex are all met afaict.

stuhood self-assigned this Nov 10, 2017
@stuhood (Member) commented Nov 10, 2017

I'm going to sketch out how to do this today, and hopefully get started on it.

@stuhood (Member) commented Nov 17, 2017

My proposal:

  1. The linux and osx binary-builder shards will begin pushing SHA-keyed, SHA-versioned, platform-specific whls to a "well known" (but private, i.e. not guaranteed to be stable) location in our (existing?) S3 bucket.
  2. A script named pex_from_whls.sh will be added that, given a SHA (defaulting to HEAD) and a list of platforms (defaulting to both linux and osx), consumes the wheels to build a pex from src/python/pants/bin:pants by setting:
    • --python-setup-platforms=${platforms}
    • --python-repos-repos=${s3_whls_path}

For now, pex_from_whls.sh will only be run manually by users who need pexes. In future PR(s), we could begin running it either as a new travis Build Stage, or as part of release.sh.

The "list of platforms" option for pex_from_whls.sh is an important component for turnaround time, as it will not always be necessary to wait for the osx binary builder shard to complete if a pex does not need to be cross-platform (say, to test it in a company's internal linux-only CI).

I initially thought about building two single-platform pexes (which is significantly easier), but the downside of that approach is that whatever bootstrap script fetches the pex would need to do platform detection, which pex already implements for multi-platform pexes.
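As a point of comparison, the platform detection that such a bootstrap script would have to reimplement for single-platform pexes is roughly the following (the file-naming scheme here is hypothetical):

```python
# Sketch of the platform detection a bootstrap script would need if we
# shipped single-platform pexes. The pex file-naming scheme below is an
# assumption for illustration, not the project's actual convention.
import platform

def pex_name_for_host(version):
    system = platform.system()  # "Linux" or "Darwin"
    suffix = {"Linux": "linux-x86_64", "Darwin": "macosx-x86_64"}[system]
    return "pants-{}-{}.pex".format(version, suffix)

print(pex_name_for_host("1.4.0.dev21"))
```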


Will start implementation tomorrow... would appreciate any feedback.
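For concreteness, a dry-run sketch of what the proposed pex_from_whls.sh could look like (the script, the S3 bucket layout, and the defaults are assumptions taken from the proposal above, not a shipped implementation; this version only prints the pants invocation it would run):

```shell
#!/usr/bin/env bash
# Dry-run sketch of the proposed pex_from_whls.sh. The bucket path and
# platform names are illustrative assumptions.
set -euo pipefail

SHA="${1:-HEAD}"                                    # defaulting to HEAD
PLATFORMS="${2:-linux-x86_64,macosx-10.10-x86_64}"  # default: both platforms
S3_WHLS_PATH="s3://<bucket>/wheels/${SHA}"          # assumed SHA-keyed layout

# Print (rather than run) the pants invocation described in the proposal:
echo ./pants binary src/python/pants/bin:pants \
  "--python-setup-platforms=${PLATFORMS}" \
  "--python-repos-repos=${S3_WHLS_PATH}"
```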

@kwlzn (Member, Author) commented Nov 17, 2017

#1 sounds good, but we'll likely have to resort to setup.py mod hacks (as we do today internally) to achieve SHA-versioning (I think it would be worthwhile to have this annotation on the .whl inside the pex). we might be able to improve this with a setuptools plugin, or it may not be worth it.

for #2, a shell script seems fine for now. eventually, I wonder if we couldn't get away with doing this purely in pants. --python-setup-platforms could be decomposed as 3 separate BUILD targets (e.g. src/python/pants/bin:{pants_osx,pants_linux,pants_multi}). and --python-repos-repos could potentially be done as a simple custom target type (pants_binary?) that wrapped PythonBinary, parameterized the current git sha as python_binary(repositories=['http://.../{}/'.format(sha)]) and injected likewise SHA-versioned deps, etc.

it would probably also make sense for the same binary-builder shards that generate pants' own bdists to do a pip wheel resolve against pants' transitive deps and stash those wheels in the S3 bucket as well (i.e. automating the cheeseshop model). this would 1) ensure that the same machine builds the native_engine binary as well as any backing 3rdparty deps, for alignment, and 2) enable painless multi-platform resolves (from any runtime OS) for deps like psutil that don't provide bdists on pypi. otherwise, afaict pex_from_whls.sh will never be able to create a single multi-platform pex, since it can only ever run on one platform at a time.
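The "cheeseshop model" amounts to serving a flat directory of pre-built wheels behind a simple link index that a resolver can point at. A minimal sketch (illustrative only, not the actual S3 layout):

```python
# Generate a minimal "find-links"-style index page for a directory of
# pre-built wheels, so a resolver option like --python-repos-repos (or
# pip's --find-links) can consume it. Names and layout are illustrative.
import os

def make_wheel_index(wheel_dir):
    links = []
    for name in sorted(os.listdir(wheel_dir)):
        if name.endswith(".whl"):
            links.append('<a href="{0}">{0}</a>'.format(name))
    return "<html><body>\n" + "\n".join(links) + "\n</body></html>"
```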

@stuhood (Member) commented Nov 17, 2017

to also do a pip wheel resolve against pants' transitive deps and stash those in the S3 bucket also (i.e. automating the cheeseshop model)

Yea, this will be necessary. There are source-only deps on PyPI which you wouldn't be able to translate otherwise, afaik.

@illicitonion (Contributor) commented:

It's worth noting that a release build of the native engine is large. The Linux .so is 65M, and the Mac .so is 15M. They zip down ok, so that a pex is only about 35M, but 35M is still very large. (I hadn't realised until now how different they were; I'm curious, but not quite curious enough to find out...)

It's likely that when https://github.com/rust-lang/rust/issues/36342 is resolved and we can build a cdylib rather than a hacky dylib, the linker will be able to do a lot more stripping and these will get smaller, but platform-specific pexes would still have a non-trivial size improvement (which maps to a non-trivial speed improvement on first run).

@stuhood (Member) commented Nov 17, 2017

A few notes after additional reading:

  • The binary builder shards already push whls for pants itself (.travis.yml, lines 423 to 425 at 042a817):

        deploy:
          # See: https://docs.travis-ci.com/user/deployment/s3/
          provider: s3

    ...but they do it in a relatively risky way:
    • the whls are always pushed for the current src/python/pants/version.py value, meaning that many pushes are overwrites
    • a release consuming the whls will just consume whatever the most recently pushed binaries for that version happen to be: if you're running a dev release from master, this might be a sha a few commits beyond where the release notes landed

So, adjustments to the plan:

  1. Reuse the travis whl building path, but adjust it to:
    • namespace the whls under the current SHA, to avoid collisions [0]
    • push SHA-suffixed releases for branches
    • push un-suffixed releases for tags
  2. To avoid the release race condition above, I'll adjust the release workflow to move the tag push much earlier in the process. Because it will be a tag, CI will build un-suffixed whls for the version, which the release script can consume.

[0] Both a suffixed and un-suffixed version will be built for a particular SHA if it both exists in a branch and is tagged, or is tagged multiple times, but these collisions should be harmless because the packages are identical.
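The suffixing scheme above can be sketched as a tiny helper (hypothetical function name; the `+sha` form is a PEP 440 local version, like the `1.4.0.dev21+b9121c0c4` example elsewhere in the thread):

```python
# Sketch of the SHA-suffixed versioning scheme: branch builds get a
# PEP 440 local-version suffix; tag builds are published un-suffixed.
# The 9-character SHA abbreviation is an assumption for illustration.
def whl_version(base_version, sha, is_tag):
    return base_version if is_tag else "{}+{}".format(base_version, sha[:9])

print(whl_version("1.4.0.dev21", "b9121c0c4deadbeef", False))  # suffixed (branch)
print(whl_version("1.4.0.dev21", "b9121c0c4deadbeef", True))   # un-suffixed (tag)
```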

@stuhood (Member) commented Nov 30, 2017

As mumbled to myself on #5118, I've been exploring @illicitonion's suggestion to always build suffixed releases in travis, and to "re-version" them only when we want to release them. While a bit crazy, it was perhaps not too crazy: I've implemented it in #5145.

So, the plan of record is now to:

  1. Always build suffixed pants wheels
  2. Make no changes to how releases are tagged
  3. Re-version the suffixed wheels to stable wheels after fetching them from S3 and before testing them and uploading them to pypi.

stuhood pushed a commit that referenced this issue Nov 30, 2017
### Problem

After feedback in #5118, #4896 will now propose to build suffixed releases of pants in travis on the relevant platforms, fetch them to a releaser's machine, and then "stabilize"/"re-version" them with an unsuffixed version before sending them to pypi.

AFAICT, there are no utilities around for re-versioning wheels (perhaps because it's not a great idea? *shrug*).

### Solution

Implement a tool for re-versioning whl files by find-replacing the version str in all files in the whl and then updating the `RECORD` file with new fingerprints. Because of the blind find-replace, this is useful for stabilizing from a version like `1.4.0.dev21+b9121c0c4` to `1.4.0.dev21`, but probably a bit risky for heading in the opposite direction.

### Result

#5118 will be able to use a command like:
```
./pants -q run src/python/pants/releases:reversion -- requests-2.18.4-py2.py3-none-any.whl ${dist} 1.4.0.dev21
```
...to re-version whls fetched from travis.

I did some local testing to confirm that pip is able to install a re-versioned whl.
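The blind find-replace plus RECORD rewrite described in that commit can be sketched as follows (a simplification that operates on an already-unpacked whl tree and, unlike the real tool at src/python/pants/releases:reversion, does not rename files or the .dist-info directory):

```python
# Simplified sketch of whl re-versioning: find-replace the version
# string in every file's contents, then rewrite RECORD with fresh
# sha256 fingerprints and sizes.
import base64
import hashlib
import os

def fingerprint(path):
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).digest()
    # RECORD uses urlsafe base64 without padding.
    return "sha256=" + base64.urlsafe_b64encode(digest).rstrip(b"=").decode()

def reversion_tree(root, old_version, new_version, record_relpath="RECORD"):
    # 1. Blind find-replace of the version string in file contents.
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                data = f.read()
            with open(path, "wb") as f:
                f.write(data.replace(old_version.encode(), new_version.encode()))
    # 2. Rewrite RECORD with updated fingerprints and sizes.
    record_path = os.path.join(root, record_relpath)
    lines = []
    with open(record_path) as f:
        for line in f:
            relpath = line.strip().split(",")[0]
            if not relpath:
                continue
            if relpath.endswith("RECORD"):
                lines.append("{},,".format(relpath))  # RECORD never hashes itself
            else:
                full = os.path.join(root, relpath)
                lines.append("{},{},{}".format(
                    relpath, fingerprint(full), os.path.getsize(full)))
    with open(record_path, "w") as f:
        f.write("\n".join(lines) + "\n")
```

As the commit notes, the blind find-replace is reasonable for stabilizing `1.4.0.dev21+b9121c0c4` to `1.4.0.dev21`, but risky in the other direction.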
stuhood pushed a commit that referenced this issue Dec 1, 2017
### Problem

As described on #4896, the "binary builder" shards in travis currently overwrite the previous artifacts for a SHA, meaning that it is possible for `release.sh` to race with additional commits landing in master.

Relatedly, because we build artifacts for all shas and not just tags (and should continue to do so in order to solve #4896), it's not possible to differentiate the `whl`s that were built for an actual release from those built for any old dev sha. 

### Solution

1. Build version-suffixed `whl`s for all travis builds.
    * This will allow consumers to build a pex for any SHA that has been built in travis unambiguously, while avoiding cache collisions due to re-used version numbers.
2. SHA-prefix the artifacts built by the "binary builder" shards to avoid problematic collisions between published artifacts.
3. "Re-version" whls that we release to PyPI (via #5145).

### Result

This change makes releases less racy, and begins to unblock building pexes from any SHA with #4896 in future PRs.
@stuhood (Member) commented Dec 1, 2017

Alright... the first 80% of this has landed via #5145 and #5118. I'll run the release tomorrow in order to figure out how I broke everything.

The last 20% (possibly also accomplishable tomorrow?) will be to take all of the frozen input wheels and build pexes. There is an open question of where to publish them (github? s3?), and where to link to them from (the docsite? pypi?).

@stuhood (Member) commented Dec 2, 2017

Got distracted by various things and didn't get to actually building pexes. But got the release out, and a few fixes to the release script are here: #5152.

@stuhood (Member) commented Dec 5, 2017

Next step is out in #5159

@stuhood (Member) commented Dec 6, 2017

Pretty deep in a rabbit hole on this one, but getting close! Now updating g++ in the docker image to get gRPC compiling.

Note to self: http://linux.web.cern.ch/linux/devtoolset/#install

@stuhood (Member) commented Dec 13, 2017

Landed #5167. Next up: tagging that to docker hub, and swapping the travis_ci image over to start FROM it. cc @illicitonion, @jsirois

stuhood pushed a commit that referenced this issue Dec 13, 2017
### Problem

See #5167

### Solution

Begin using the newly pushed image.

### Result

Wheels built in travis CI will be more compatible, as well as the releases and pexes that will eventually be built by #4896.
@stuhood (Member) commented Dec 15, 2017

Validated that on SHAs after 96264d3, running:

build-support/bin/release.sh -p

produces a cross-platform pex in dist that works on centos6 and on OSX. Huzzah.

@stuhood (Member) commented Jan 9, 2018

Alright: time to finalize this ticket I think (and open another one for followup).

I'd like to propose that for now we defer publishing pexes as part of the release process (and move that into a separate ticket), and instead make it very easy for consumers to manually request a pex for the prebuilt wheels. The reality of our (recent) usage is that we need some number of cherry-picks atop any internal release, and stable releases happen infrequently enough that we can't rely on release binaries.

So I'd propose to close this ticket by:

upstream (here):

  1. adding documentation advertising the ability for anyone to request contributor access
  2. giving contributors the ability to push branches/tags to origin to trigger whl builds (and thus treat travis as a community resource)
  3. documenting the process of building a pex by manually invoking release.sh -p (as mentioned above: manual for now; will open a followup for publishing pexes as part of the release process)

internally:

  1. Refer team members to the upstream docs for kicking off builds as contributors
  2. Build a pex for the relevant sha and execute any internal release steps

Thoughts? The "contributors can push branches" bit in particular seems potentially controversial.

@jsirois (Contributor) commented Jan 9, 2018

I think the odd thing here is an OSS project where committers can build special public binaries, whereas plain old users cannot. It's true committers are by definition privileged in any OSS project, but this particular privilege seems a bit confusing. We'd then have public binaries for anyone to use that are a not-easily-identifiable mix of commits and components. All this said, Twitter contributes a huge amount to the project, so - constructive ideas:

  • Why not a Twitter fork for releases that wraps / env-var-modifies .travis.yml / ci.sh / release.sh, and builds custom binaries on Twitter's Travis quota, pushing to Twitter servers / Twitter s3 buckets?
  • ... still thinking ...

@jsirois (Contributor) commented Jan 10, 2018

And - I just got the contributors vs committers distinction - but the confusion objection still stands: we'd allow anyone, via a contributor access request, to publish, but we'd still have a mass of unidentifiable public releases as a result.

@kwlzn (Member, Author) commented Jan 10, 2018

So I'd propose to close this ticket by:

upstream (here):

  1. adding documentation advertising the ability for anyone to request contributor access
  2. giving contributors the ability to push branches/tags to origin to trigger whl builds (and thus treat travis as a community resource)
  3. document the process of building a pex by manually invoking release.sh -p (as mentioned above: manual for now. will open a followup for publishing pexes as part of the release process)

internally:

  1. Refer team members to the upstream docs for kicking off builds as contributors
  2. build a pex for the relevant sha and execute any internal release steps

this sgtm - and at the very least seems like a perfectly reasonable experiment that we can always tweak later as needed.

..we still have a mass of unidentifiable public releases as a result.

could we solve for this via better annotation of the custom vs mainline releases in either tag naming or posted descriptions on the github release landing page? or maybe we could use this flag for all custom releases:

[screenshot of a GitHub release flag]

Why not a twitter fork ...

we've long had a custom solution to this - the goal here is to unfork as much as possible, so that all consumers can benefit from and share the same release mechanics, build platforms, etc.

@jsirois
Copy link
Contributor

jsirois commented Jan 10, 2018

My proposal would share all this (shared release mechanics, build platforms, etc), just not the public branches/tags, travis time and storage.

The other downside to enabling this is that we'd then be effectively promoting fragmentation. The easier it is to create a custom binary using the main project, the easier it is to stay mis-aligned - i.e. cherry-picking vs just fixing the mainline or working off of weekly releases.

I'm not hugely opposed, this just seems decidedly odd and despite all pretensions, a feature being enabled for Twitter only.

@stuhood (Member) commented Jan 10, 2018

Got great feedback here, and in the slack #releases room. Will mail about a completely different proposal when I get some more time tomorrow.

stuhood removed their assignment Jan 17, 2018
@stuhood (Member) commented Jan 17, 2018

I've moved the ball a lot on this one, and we now have everything in place to begin publishing a cross-platform pex as part of releases. But I need to context-switch away to continue cleaning up our internal release process, so unassigning this.

jsirois self-assigned this Jun 8, 2018
jsirois added a commit to jsirois/pants that referenced this issue Jun 8, 2018
This needs to match the Travis CI osx platform we pre-build wheels on.

Work towards pantsbuild#4896.
jsirois added a commit that referenced this issue Jun 8, 2018
This needs to match the Travis CI osx platform we pre-build wheels on.

Work towards #4896.
CMLivingston pushed a commit to CMLivingston/pants that referenced this issue Jun 22, 2018
Closes pantsbuild#5831
Prep for release 1.8.0dev3 (pantsbuild#5937)

Ban bad `readonly` shell pattern (pantsbuild#5924)

This subverts `set -e` and means that failed commands don't fail the
script, which is responsible for late failures on CI such as
https://travis-ci.org/pantsbuild/pants/jobs/389174049 which failed to
download protoc, but only failed when something tried to use it.
Fixup macosx platform version. (pantsbuild#5938)

This needs to match the Travis CI osx platform we pre-build wheels on.

Work towards pantsbuild#4896.
Allow pants to select targets by file(s) (pantsbuild#5930)

Fixes pantsbuild#5912

Problem
There should be an option to accept literal files, and then select the targets that own those files. Similar to how the --changed-parent triggers a diff and the targets are selected based on the result.
The proposed syntax is something like:

$ ./pants \
  --owner-of=this/is/a/file/name.java \    # flag triggers owner lookup
  --owner-of=this/is/a/file/name/too.py \  # flag triggers owner lookup
  compile                                  # goal
Solution
I've created a global option --owner-of= that takes a list of files as a parameter, and created a OwnerCalculator class to handle the logic similar to how ChangeCalculator works with the --changed-* subsystem.

Result
Now users will be able to run goals on files without needing to know which target owns those files.
Also, the list-owners goal can be deprecated in favor of --owner-of=some/file.py list

It is important to note that multiple target selection methods are not allowed, so it fails when more than one of --changed-*, --owner-of, or target specs are supplied.
e.g. this fails:

$ ./pants \
  --owner-of=this/is/a/file/name.java \    # flag triggers owner lookup
  --owner-of=this/is/a/file/name/too.py \  # flag triggers owner lookup
  compile
  <another target>
Integration test for daemon environment scrubbing (pantsbuild#5893)

This shows that pantsbuild#5898 works, which itself fixed pantsbuild#5854
Add the --owner-of= usage on Target Address documentation (pantsbuild#5931)

Problem
The documentation of the feature proposed in PR pantsbuild#5930

Solution
I decided to put it inside Target Addresses because that is where a user would look if they needed a feature like this, I think.

Result
More docs, and that's always good.
Add script to get a list of failing pants from travis (pantsbuild#5946)

[jvm-compile] template-methodify JvmCompile further; add compiler choices (pantsbuild#5923)

Introduce `JvmPlatform.add_compiler_choice(name)`, which allows plugins to register compilers that can be configured.
This patch pulls out some template methods in JvmCompile to make it easier to extend. It also pushes some of the implementations of those methods down into ZincCompile, where appropriate.

These changes should be covered by existing tests, but it could make sense to add tests around the interfaces of the new template methods. I don't anticipate there being a large number of implementations at this time though, so I didn't think it'd be worth it.

Add the following template methods

* `create_empty_extra_products` Allows subclasses to create extra products that other subclasses might not need, that ought to be constructed even if no compile targets are necessary.

* `register_extra_products_from_contexts` rename of `_register_vts`. This allows subclasses to register their extra products for particular targets.
* `select_runtime_context` Not 100% happy with this, but I'm working on something that needs to have different types of compile contexts. It allows subclasses to specify a context that provides paths for the runtime classpath if the default context isn't quite right for the usages in the base class.
* `create_compile_jobs` Pulled this out into a separate method so that subclasses can create multiple graph jobs per target.

* Pushed down behavior from JvmCompile that should live in zinc via the template methods extracted above. There's probably more that could be done here, but this was the first cut of it.
* Moved the execute definition from BaseZincCompile to ZincCompile so that it's possible to subclass BaseZincCompile with a different compiler name.
release notes for 1.7.0.rc1 (pantsbuild#5942)

Use target not make_target in some tests (pantsbuild#5939)

This pushes parsing of the targets through the engine, rather than
bypassing it.

This is important because I'm about to make these targets require an
EagerFilesetWithSpec as their source/sources arg, rather than being
happy with a list of strings.
Add new remote execution options (pantsbuild#5932)

As described in pantsbuild#5904, a few configuration values that are important to testing of remote execution are currently hardcoded.

Extract existing options to a `ExecutionOptions` collection (which should become a `Subsystem` whenever we add support for consuming `Subsystems` during bootstrap), and add the new options.

Fixes pantsbuild#5904.
Separate the resolution cache and repository cache in Ivy (pantsbuild#5844)

move glob matching into its own file (pantsbuild#5945)

See pantsbuild#5871, where we describe an encapsulation leak created by implementing all of the glob expansion logic in the body of `VFS`.

- Create `glob_matching.rs`, exporting the `GlobMatching` trait, which exports the two methods `canonicalize` and `expand`, which call into methods in a private trait `GlobMatchingImplementation`.

**Note:** `canonicalize` calls `expand`, and vice versa, which is why both methods were moved to `glob_matching.rs`.

Orthogonal glob matching logic is made into a trait that is implemented for all types implementing `VFS`, removing the encapsulation leak. The `VFS` trait is now just four method signature declarations, making the trait much easier to read and understand.
Enable fromfile support for --owner-of and increase test coverage (pantsbuild#5948)

The new `--owner-of` option was broken in the context of `pantsd`, but didn't have test coverage due to the `--changed` and `--owner-of` tests not running under `pantsd`. Additionally, `fromfile` support is useful for this option, but was not enabled.

Mark some integration tests as needing to run under the daemon, and enable `fromfile` support for `--owner-of`.
[pantsd] Robustify client connection logic. (pantsbuild#5952)

Fixes pantsbuild#5812

under full-on-assault stress testing via:

$ watch -n.1 'pkill -f "pantsd \[" pantsd-runner'
this will mostly behave like:

WARN] pantsd was unresponsive on port 55620, retrying (1/3)
WARN] pantsd was unresponsive on port 55620, retrying (2/3)
WARN] pantsd was unresponsive on port 55626, retrying (3/3)
WARN] caught client exception: Fallback(NailgunExecutionError(u'Problem executing command on nailgun server (address: 127.0.0.1:55630): TruncatedHeaderError(u"Failed to read nailgun chunk header (TruncatedRead(u\'Expected 5 bytes before socket shutdown, instead received 0\',)).",)',),), falling back to non-daemon mode

23:30:24 00:00 [main]
               (To run a reporting server: ./pants server)
23:30:38 00:14   [setup]
23:30:39 00:15     [parse]
...
mid-flight terminations (simulated via single-shot pkill calls) also result in a more descriptive error with traceback proxying:

23:40:51 00:04     [zinc]
23:40:51 00:04     [javac]
23:40:51 00:04     [cpp]
23:40:51 00:04     [errorprone]
23:40:51 00:04     [findbugs]CRITICAL]
CRITICAL] lost active connection to pantsd!
Exception caught: (<class 'pants.bin.remote_pants_runner.Terminated'>)
  File "/Users/kwilson/dev/pants/src/python/pants/bin/pants_loader.py", line 73, in <module>
    main()
  File "/Users/kwilson/dev/pants/src/python/pants/bin/pants_loader.py", line 69, in main
    PantsLoader.run()
  File "/Users/kwilson/dev/pants/src/python/pants/bin/pants_loader.py", line 65, in run
    cls.load_and_execute(entrypoint)
  File "/Users/kwilson/dev/pants/src/python/pants/bin/pants_loader.py", line 58, in load_and_execute
    entrypoint_main()
  File "/Users/kwilson/dev/pants/src/python/pants/bin/pants_exe.py", line 39, in main
    PantsRunner(exiter, start_time=start_time).run()
  File "/Users/kwilson/dev/pants/src/python/pants/bin/pants_runner.py", line 39, in run
    return RemotePantsRunner(self._exiter, self._args, self._env, bootstrap_options).run()
  File "/Users/kwilson/dev/pants/src/python/pants/bin/remote_pants_runner.py", line 162, in run
    self._run_pants_with_retry(port)
  File "/Users/kwilson/dev/pants/src/python/pants/java/nailgun_client.py", line 221, in execute
    return self._session.execute(cwd, main_class, *args, **environment)
  File "/Users/kwilson/dev/pants/src/python/pants/java/nailgun_client.py", line 94, in execute
    return self._process_session()
  File "/Users/kwilson/dev/pants/src/python/pants/java/nailgun_client.py", line 69, in _process_session
    for chunk_type, payload in self.iter_chunks(self._sock, return_bytes=True):
  File "/Users/kwilson/dev/pants/src/python/pants/java/nailgun_protocol.py", line 206, in iter_chunks
    chunk_type, payload = cls.read_chunk(sock, return_bytes)
  File "/Users/kwilson/dev/pants/src/python/pants/java/nailgun_protocol.py", line 182, in read_chunk
    raise cls.TruncatedHeaderError('Failed to read nailgun chunk header ({!r}).'.format(e))

Exception message: abruptly lost active connection to pantsd runner: NailgunError(u'Problem talking to nailgun server (address: 127.0.0.1:55707, remote_pid: -28972): TruncatedHeaderError(u"Failed to read nailgun chunk header (TruncatedRead(u\'Expected 5 bytes before socket shutdown, instead received 0\',)).",)',)
Re-shade zinc to avoid classpath collisions with annotation processors. (pantsbuild#5953)

Zinc used to be shaded before the `1.x.y` upgrade (pantsbuild#4729), but shading was removed due to an overabundance of optimism. When testing the zinc upgrade internally, we experienced a classpath collision between an annotation processor and zinc (in guava, although zinc has many other dependencies that could cause issues).

Shade zinc, and ensure that our annotation processor uses a very old guava in order to attempt to force collisions in future.
Improve PythonInterpreterCache logging (pantsbuild#5954)

When users have issues building their Python interpreter cache, they are often very confused because does not currently log much about the process to help users debug. Here we add log lines describing what/where Pants looks to build the interpreter cache, and the results of what it found. This should help users better understand/debug the process.
use liblzma.dylib for xz on osx and add platform-specific testing to the rust osx shard (pantsbuild#5936)

See pantsbuild#5928. The `xz` archiver wasn't tested on osx at all, and failed to find `liblzma.so` on osx (it should have been `liblzma.dylib`). There were additional errors with library search paths reported in that PR which I was not immediately able to repro. This PR hopefully fixes all of those errors, and ensures they won't happen again with the addition of platform-specific testing (see previous issue at pantsbuild#5920).

- Switch to a statically linked `xz`.
- Fix the incorrect key `'darwin'` in the platform dictionary in the `LLVM` subsystem (to `'mac'`).
- Add the tag `platform_specific_behavior` to the new python target `tests/python/pants_test/backend/python/tasks:python_native_code_testing`, which covers the production of `python_dist()`s with native code.
- Add the `-z` argument to `build-support/bin/ci.sh` to run all tests with the `platform_specific_behavior` tag. Also clean up old unused options in the getopts call, and convert echo statements to a simpler heredoc.
- Change the name of the "Rust Tests OSX" shard to "Rust + Platform-specific Tests OSX", and add the `-z` switch to the `ci.sh` invocation.

**Note:** the tests in `tests/python/pants_test/backend/native/subsystems` are going to be removed in pantsbuild#5815, otherwise they would be tagged similarly.

`./pants test tests/python/pants_test/backend/python/tasks:python_native_code_testing` now passes on osx, and this fact is now being tested in an osx shard in travis.
Support output directory saving for local process execution. (pantsbuild#5944)

Closes pantsbuild#5860
reimplement a previous PR -- ignore this

This commit is a reimplementation of registering @rules for backends, because this PR began before
that one was split off.

add some simple examples to demonstrate how to use backend rules

...actually make the changes to consume backend rules in register.py

revert accidental change to a test target

remove extraneous log statement

fix lint errors

add native backend to release.sh

isolate native toolchain path and hope for the best

add config subdir to native backend

really just some more attempts

start trying to dispatch based on platform

extend EngineInitializer to add more rules from a backend

refactor Platform to use new methods in osutil.py

refactor the native backend to be a real backend and expose rules

register a rule in the python backend to get a setup.py environment

make python_dist() tests pass

make lint pass

create tasks and targets for c/c++ sources

- refactors the "native toolchain" and introduces the "binaries" subdirectory of subsystems

start by adding a new ctypes testproject

add c/c++ sources

add example BUILD file

add some native targets

add tasks dir

remove gcc

try to start adding tasks

clean some leftover notes in BuildLocalPythonDistributions

update NativeLibrary with headers

move DependencyContext to target.py

add native compile tasks

houston we have compilation

run:

./pants -ldebug --print-exception-stacktrace compile testprojects/src/python/python_distribution/ctypes:

for an idea

use the forbidden product request

change target names to avoid conflict with cpp contrib and flesh out cpp_compile

now we are compiling code

we can link things now

now we know how to infer headers vs sources

fix the test case and fix include dir collection

(un)surprisingly, everything works, but bdist_wheel doesn't read MANIFEST.in

houston we have c++

bring back gcc so we can compile

halfway done with osx support

now things work on osx????????

ok, now it works on linux again too

first round of review

- NB: refactors the organization of the `results_dir` for python_dist()!
- move ctypes integration testing into python backend tests

revert some unnecessary changes

refactor native_artifact to be a datatype

fix some copyright years

add ctypes integration test

add assert_single_element method in collections.py

decouple the native tools for setup.py from the execution environment

streamline the selection of the native tools for setup.py invocation

make gcc depend on binutils on linux for the 'as' assembler

fix logging visibility by moving it back into the task

make the check for the external llvm url more clear

refactor local dist building a bit

- use SetupPyRunner.DIST_DIR as the source of truth
- add a separate subdir of the python_dist target's results_dir for the
  python_dist sources
- move shared libs into the new subdir

fix imports

second round of review

- fixed bugs
- expanded error messages and docstrings

make a couple docstring changes

fix dist platform selection ('current' is not a real platform)

lint fixes

fix broken regex which modifies the `.dylib` extension for python_dist()

fix the ctypes integration test

respond to some review comments

clear the error message if we can't find xcode cli tools

refactor the compilation and linking pipeline to use subsystems

- also add `PythonNativeCode` subsystem to bridge the native and python backends

refactor the compilation and linking pipeline to use subsystems

add some notes

fix rebase issues

add link to pantsbuild#5788 -- maybe use variants for args for static libs

move `native_source_extensions` to a new `PythonNativeCode` subsystem

update native toolchain docs and remove bad old tests

move tgt_closure_platforms into the new `PythonNativeCode` subsystem

remove unnecessary logging

remove compile_settings_class in favor of another abstractmethod

refactor `NativeCompile` and add documentation

improve debug logging in NativeCompile

document NativeCompileSettings

refactor and add docstrings

convert provides= to ctypes_dylib= and add many more docstrings

remove or improve TODOs

improve or remove FIXMEs

improve some docstrings, demote a FIXME, and add a TODO

link FIXMEs to a ticket

add notes to the ctypes testproject

update mock object for strict deps -- test passes

fix failing integration test on osx

add hack to let travis pass

fix the system_id key in llvm and add a shameful hack to pass travis

swap the order of alias_types

remove unused EmptyDepContext class

remove -settings suffix from compile subsystem options scopes

add AbstractClass to NativeLibrary

bump implementation_version for python_dist() build

- we have changed the layout of the results_dir in this PR

add ticket link and fix bug

revert indentation changes to execute() method

refactor `assert_single_element()`

revert addition of `narrow_relative_paths()`

add link to pantsbuild#5950

move common process invocation logic into NativeCompile

revert an unnecessary change

turn an assert into a full exception

revert unnecessary change

use get_local_platform() wherever possible

delete duplicate llvm subsystem

fix xcode cli tools resolution

change `ctypes_dylib=` to `ctypes_native_library=`

add a newline

move UnsupportedPlatformError to be a class field

remove unused codegen_types field

fix zinc-compiler options to be valid ones

Construct rule_graph recursively (pantsbuild#5955)

The `RuleGraph` is currently constructed iteratively, but can be more-easily constructed recursively.

Switch to constructing the `RuleGraph` recursively, and unify a few disparate diagnostic messages.

Helps to prepare for further refactoring in pantsbuild#5788.
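The recursive shape can be sketched as follows; this is an illustrative sketch with hypothetical names, not the actual `RuleGraph` code, showing how memoizing visited nodes lets each rule be constructed exactly once while still reaching the full transitive closure:

```python
# Illustrative sketch (hypothetical names, not the real RuleGraph code):
# build a dependency graph recursively, memoizing visited rules so each
# node is constructed once and diagnostics can point at the failing node.
def construct_graph(rule, dependencies_of, memo=None):
    """Return a {rule: [deps]} mapping for the transitive closure of `rule`."""
    if memo is None:
        memo = {}
    if rule in memo:
        return memo
    memo[rule] = list(dependencies_of(rule))
    for dep in memo[rule]:
        construct_graph(dep, dependencies_of, memo)
    return memo
```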
Allow manylinux wheels when resolving plugins. (pantsbuild#5959)

Also plumb manylinux resolution support for the python backend, on by
default, but configurable via `python_setup.resolver_use_manylinux`.

Fixes pantsbuild#5958
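The new knob can be flipped in `pants.ini`; a minimal sketch, assuming the standard scoped-section syntax (option scopes use dashes in ini section names):

```ini
# pants.ini -- manylinux wheel resolution is on by default; set this to
# False to restrict resolves to platform-agnostic or exact-platform wheels.
[python-setup]
resolver_use_manylinux: False
```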
`exclude-patterns` and `tag` should apply only to roots (pantsbuild#5786)

The `--exclude-patterns` flag currently applies to inner nodes, which causes odd errors. Moreover, tags should also apply only to roots. See pantsbuild#5189.

- added `tag` & `exclude_patterns` as params to `Specs`
- add tests for both
- modify changed tests to pass for inner node filtering

Fixes pantsbuild#5189.
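The roots-only filtering can be sketched as below; names here are hypothetical (this is not the real `Specs` API), the point being that exclude patterns and tags are applied to the requested root targets only, never to inner dependency nodes:

```python
import re

# Illustrative sketch (hypothetical names, not the real Specs API):
# exclude patterns and tags filter only the root targets of a request;
# inner (dependency) nodes of the graph are left untouched.
def filter_roots(roots, exclude_patterns=(), tags=()):
    """`roots` is an iterable of (address, target_tags) pairs."""
    compiled = [re.compile(p) for p in exclude_patterns]

    def keep(address, target_tags):
        if any(p.search(address) for p in compiled):
            return False
        return not tags or bool(set(tags) & set(target_tags))

    return [(a, t) for a, t in roots if keep(a, t)]
```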
Remove value wrapper on the python side of ffi. (pantsbuild#5961)

As explained in the comment in this change, the overhead of wrapping our CFFI "handle"/`void*` instances in a type that is shaped like the `Value` struct was significant enough to care about.

Since the struct has zero overhead on the rust side, whether we represent it as typedef or a struct on the python side doesn't make a difference (except in terms of syntax).

6% faster `./pants list ::` in Twitter's repo.
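The trade-off can be illustrated with a toy sketch (hypothetical names, not the engine's FFI code): wrapping every handle in a struct-shaped Python object costs one allocation per handle, while treating the handle as a bare typedef passes it through untouched:

```python
# Illustrative sketch (hypothetical names, not the real engine FFI code).
# Wrapping each CFFI handle in a struct-shaped object allocates one
# Python object per handle:
class Value(object):
    __slots__ = ('handle',)

    def __init__(self, handle):
        self.handle = handle

def wrap_all(handles):
    return [Value(h) for h in handles]  # O(n) wrapper allocations

# Treating the handle as a bare typedef passes it through untouched:
def pass_through(handles):
    return handles  # zero extra allocations
```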
return an actual field

use temporary native-compile goal

Cobertura coverage: Include the full target closure's classpath entries for instrumentation (pantsbuild#5879)

Sometimes Cobertura needs access to the dependencies of class files being instrumented in order to rewrite them (pantsbuild#5878).

This patch adds an option that creates a manifest jar and adds an argument to the Cobertura call so that it can take advantage of it.

class files that need to determine a least upper bound in order to be rewritten can now be instrumented.

Fixes  pantsbuild#5878
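A manifest jar is an otherwise-empty jar whose `MANIFEST.MF` carries the classpath, which sidesteps argument-length limits when handing the full closure's classpath to Cobertura. A minimal sketch (hypothetical helper, not the actual task code):

```python
import zipfile

# Illustrative sketch (not the actual task code): write a "manifest jar"
# whose Class-Path attribute lists the instrumentation classpath entries.
# NB: the jar spec also requires wrapping long manifest lines (each
# continuation line starts with a space), which this sketch omits.
def write_manifest_jar(path, classpath_entries):
    manifest = 'Manifest-Version: 1.0\nClass-Path: {}\n'.format(
        ' '.join(classpath_entries))
    with zipfile.ZipFile(path, 'w') as jar:
        jar.writestr('META-INF/MANIFEST.MF', manifest)
```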
add ticket link

fix xcode install locations and reduce it to a single dir_option

return the correct amount of path entries

Record start times per graph node and expose a method to summarize them. (pantsbuild#5964)

In order to display "significant" work while it is running on the engine, we need to compute interesting, long-running leaves.

Record start times per entry in the `Graph`, and add a method to compute a graph-aware top-k longest running leaves. We traverse the graph "longest running first", and emit the first `k` leaves we encounter.

While this will almost certainly need further edits to maximize its usefulness, visualization can begin to be built atop it.
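The "longest running first" traversal can be sketched with a max-heap; structures here are hypothetical (not the engine's `Graph`), but the shape is the same: pop the node with the largest elapsed time, expand non-leaves, and stop after emitting the first k leaves:

```python
import heapq

# Illustrative sketch (hypothetical structures, not the engine's Graph):
# traverse "longest running first" from the roots and emit the first k
# leaves encountered, so long-running leaf work surfaces in a UI.
def top_k_running_leaves(roots, children_of, started_at, now, k):
    # heapq is a min-heap, so negate elapsed time to pop the longest first.
    heap = [(-(now - started_at[n]), n) for n in roots]
    heapq.heapify(heap)
    leaves = []
    while heap and len(leaves) < k:
        neg_elapsed, node = heapq.heappop(heap)
        kids = children_of(node)
        if not kids:
            leaves.append((node, -neg_elapsed))
        else:
            for child in kids:
                heapq.heappush(heap, (-(now - started_at[child]), child))
    return leaves
```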
Prepare the 1.8.0.dev4 release (pantsbuild#5969)

Mark a few options that should not show up in `./pants help`. (pantsbuild#5968)

`./pants help` contains core options that are useful to every pants command, and the vast majority of global options are hidden in order to keep it concise. A few non-essential options ended up there recently.

Hide them.
adding more documentation for python_app (pantsbuild#5965)

The python_app target didn't have documentation specific to it; its existing documentation was specific to jvm_app.

Added a few lines of documentation.

There is no system-wide change, only a documentation change.
Remove DeprecatedPythonTaskTestBase (pantsbuild#5973)

Use PythonTaskTestBase instead.

Fixes pantsbuild#5870
Chris first commit on fresh rebase

Merge branch 'ctypes-test-project' of github.com:cosmicexplorer/pants into clivingston/ctypes-test-project-third-party

unrevert reverted fix (NEEDS FOLLOWUP ISSUE!)

put in a better fix for the strict_deps error until the followup issue is made

add ticket link

Merge branch 'ctypes-test-project' of github.com:cosmicexplorer/pants into clivingston/ctypes-test-project-third-party

Shorten safe filenames further, and combine codepaths to make them readable. (pantsbuild#5971)

Lower the `safe_filename` path component length limit to 100 characters, since the previous 255 value did not account for the fact that many filesystems also have a limit on total path length. This "fixes" the issue described in pantsbuild#5587, which was caused by using this method via `build_invalidator.py`.

Additionally, merge the codepath from `Target.compute_target_id` which computes a readable shortened filename into `safe_filename`, and expand tests. This removes some duplication, and ensure that we don't run into a similar issue with target ids.

The specific error from pantsbuild#5587 should be prevented, and consumers of `safe_filename` should have safe and readable filenames.

Fixes pantsbuild#5587.
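The combined scheme can be sketched as below (an illustrative sketch, not the real `safe_filename`): keep a readable prefix under a conservative per-component length limit, and append a hash of the full name so shortened names remain unique:

```python
import hashlib

# Illustrative sketch (not the real safe_filename): a 100-character
# per-component cap leaves headroom for filesystems that also limit
# total path length; the hash suffix keeps shortened names unique.
_MAX_COMPONENT_LEN = 100

def safe_filename(name, extension=''):
    full = name + extension
    if len(full) <= _MAX_COMPONENT_LEN:
        return full
    digest = hashlib.sha1(name.encode('utf-8')).hexdigest()
    prefix_len = _MAX_COMPONENT_LEN - len(digest) - len(extension) - 1
    return '{}.{}{}'.format(name[:prefix_len], digest, extension)
```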
Whitelist the --owner-of option to not restart the daemon. (pantsbuild#5979)

Because the `--owner-of` option was not whitelisted as `daemon=False`, changing its value triggered unnecessary `pantsd` restarts.

Whitelist it.
Prepare the 1.8.0rc0 release. (pantsbuild#5980)

Robustify test_namespace_effective PYTHONPATH. (pantsbuild#5976)

The real problem is noted, but this quick fix should bolster against
interpreter cache interpreters pointing off to python binaries that
have no setuptools in their associated site-packages.

Fixes pantsbuild#5972
make_target upgrades sources to EagerFilesetWithSpec (pantsbuild#5974)

This better simulates how the engine parses BUILD files, giving a more
faithful experience in tests.

I'm about to make it a warning/error to pass a list of strings as the
sources arg, so this will make tests which use make_target continue to
work after that.

Also, make cloc use base class scheduler instead of configuring its own.
Lib and include as a dep-specific location

source attribute is automatically promoted to sources (pantsbuild#5908)

This means that either the `source` or `sources` attribute can be used
for any rule which expects sources. Places that `source` was expected
still verify that the correct number of sources are actually present.
Places that `sources` was expected will automatically promote `source`
to `sources`.

This is a step towards all `sources` attributes being
`EagerFilesetWithSpec`s, which will make them cached in the daemon, and
make them easier to work with with both v2 remote execution and in the
v2 engine in general. It also provides common hooks for input file
validation, rather than relying on them being done ad-hoc in each
`Target` constructor.

For backwards compatibility, both attributes will be populated on
`Target`s, but in the future only the sources value will be provided.

`sources` is guaranteed to be an `EagerFilesetWithSpec` whichever of
these mechanisms is used.

A hook is provided for rules to perform validation on `sources` at build
file parsing time. Hooks are put in place for non-contrib rule types
which currently take a `source` attribute to verify that the correct
number of sources are provided. I imagine at some point we may want to
add a "file type" hook too, so that rules can error if files of the
wrong type were added as sources.

This is a breaking change for rules which use both the `source` and `sources` attributes (and where the latter is not equivalent to the former), or where the `source` attribute is used to refer to something other than a file. `source` is now becoming a
reserved attribute name, as `sources` and `dependencies` already are.

This is also a breaking change for rules which use the `source`
attribute, but never set `sources` in a Payload. These will now fail to
parse.

This is also a slightly breaking change for the `page` rule - before,
omitting the `source` attribute would parse, but fail at runtime. Now,
it will fail to parse.

This is also a breaking change in that in means that the source
attribute is now treated like a glob, and so if a file is specified
which isn't present, it will be ignored instead of error. This feels a
little sketchy, but it turns out we did the exact same thing by making
all sources lists be treated like globs...
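The promotion itself is simple; a minimal sketch with a hypothetical helper (not the real BUILD-file parsing code), which lifts a singular `source` into `sources` and rejects targets that set both inconsistently:

```python
# Illustrative sketch (hypothetical helper, not the real BUILD-file
# parser): promote a singular `source` attribute to `sources`, and
# reject targets that set both to inconsistent values.
def promote_source(kwargs):
    if 'source' in kwargs:
        source = kwargs.pop('source')
        sources = kwargs.setdefault('sources', [source])
        if sources != [source]:
            raise ValueError(
                'Cannot specify both source={!r} and sources={!r}'.format(
                    source, sources))
    return kwargs
```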
Override get_sources for pants plugins (pantsbuild#5984)

1.7.0 release notes (pantsbuild#5983)

No additional changes, so it's a very short release note.
Fixups for native third party work

hardcode in c/c++ language levels for now

remove all the unnecessary code relating to file extensions

Merge branch 'ctypes-test-project' of github.com:cosmicexplorer/pants into clivingston/ctypes-test-project-third-party

Caching tests are parsed through the engine (pantsbuild#5985)

fix osx failures and leave a ticket link

Add rang header-only lib for integration testing

Merge branch 'ctypes-test-project' of github.com:cosmicexplorer/pants into clivingston/ctypes-test-project-third-party

Fix TestSetupPyInterpreter.test_setuptools_version (pantsbuild#5988)

Previously the test failed to populate the `PythonInterpreter` data
product leading to a fallback to the current non-bare interpreter which
allowed `setuptools` from `site-packages` to leak in.

Fixes pantsbuild#5467
Refactor conan grab into subsystem

Engine looks up default sources when parsing (pantsbuild#5989)

Rather than re-implementing default source look-up.

This pushes sources parsing of default sources through the engine, in parallel,
rather than being later synchronous python calls.

This also works for plugin types, and doesn't change any existing APIs.

It updates the Go patterns to match those that the engine currently performs,
rather than ones which aren't actually used by any code.
Add unit tests, refactor

C/C++ targets which can be compiled/linked and used in python_dist() with ctypes (pantsbuild#5815)

It is currently possible to expose native code to Python by compiling it in a `python_dist()` target, specifying C or C++ source files as a `distutils.core.Extension` in the `setup.py` file, as well as in the target's sources. `python_dist()` was introduced in pantsbuild#5141. We introduced a "native toolchain" to compile native sources for this use case in pantsbuild#5490.

Exposing Python code this way requires using the Python native API and `#include <Python.h>` in your source files. However, python can also interact with native code that does not use the Python native API, using the provided `ctypes` library. For this to work, the `python_dist()` module using `ctypes` needs to have a platform-specific shared library provided within the package. This PR introduces the targets, tasks, and subsystems to compile and link a shared library from native code, then inserts it into the `python_dist()` where it is easily accessible.

- Introduce the `ctypes_compatible_c_library()` target which covers C sources (`ctypes_compatible_cpp_library()` for C++), and can specify what to name the shared library created from linking the object files compiled from its sources and dependencies.
- Introduce `CCompile`, `CppCompile`, and `LinkSharedLibraries` to produce the shared libraries from the native sources. The compile tasks use options `CCompileSettings` or `CppCompileSettings` to define file extensions for "header" and "source" files.
- Introduce the `CCompileSettings` and `CppCompileSettings` subsystems to control compile settings for those languages.
- Convert `BuildLocalPythonDistributions` to proxy to the native backend through the new `PythonNativeCode` subsystem.
- Move all the `BinaryTool` subsystems to a `subsystems/binaries` subdirectory, and expose them to the v2 engine through `@rule`s defined in the subsystem's file.
- Move some of the logic in `pex_build_util.py` to `setup_py.py`, and expose datatypes composing the setup.py environment through `@rule`s in `setup_py.py`. `SetupPyRunner.for_bdist_wheel()` was created to set the wheel's platform, if the `python_dist()` target contains any native sources of its own, or depends on any `ctypes_compatible_*_library`s.

**Note:** the new targets are specifically prefixed with `ctypes_compatible_` because we don't yet eclipse the functionality of `contrib/cpp`. When the targets become usable for more than this one use case, the name should be changed.

To see how to link up native and Python code with `ctypes`, here's (most of) the contents of `testprojects/src/python/python_distribution/ctypes`:
*BUILD*:
```python
ctypes_compatible_c_library(
  name='c_library',
  sources=['some_math.h', 'some_math.c', 'src-subdir/add_three.h', 'src-subdir/add_three.c'],
  ctypes_dylib=native_artifact(lib_name='asdf-c'),
)

ctypes_compatible_cpp_library(
  name='cpp_library',
  sources=['some_more_math.hpp', 'some_more_math.cpp'],
  ctypes_dylib=native_artifact(lib_name='asdf-cpp'),
)

python_dist(
  sources=[
    'setup.py',
    'ctypes_python_pkg/__init__.py',
    'ctypes_python_pkg/ctypes_wrapper.py',
  ],
  dependencies=[
    ':c_library',
    ':cpp_library',
  ],
)
```
*setup.py*:
```python
setup(
  name='ctypes_test',
  version='0.0.1',
  packages=find_packages(),
  # Declare two files at the top-level directory (denoted by '').
  data_files=[('', ['libasdf-c.so', 'libasdf-cpp.so'])],
)
```
*ctypes_python_pkg/ctypes_wrapper.py*:
```python
import ctypes
import os

def get_generated_shared_lib(lib_name):
  # These are the same filenames as in setup.py.
  filename = 'lib{}.so'.format(lib_name)
  # The data files are in the root directory, but we are in ctypes_python_pkg/.
  rel_path = os.path.join(os.path.dirname(__file__), '..', filename)
  return os.path.normpath(rel_path)

asdf_c_lib_path = get_generated_shared_lib('asdf-c')
asdf_cpp_lib_path = get_generated_shared_lib('asdf-cpp')

asdf_c_lib = ctypes.CDLL(asdf_c_lib_path)
asdf_cpp_lib = ctypes.CDLL(asdf_cpp_lib_path)

def f(x):
  added = asdf_c_lib.add_three(x)
  multiplied = asdf_cpp_lib.multiply_by_three(added)
  return multiplied
```

Now, the target `testprojects/src/python/python_distribution/ctypes` can be depended on in a BUILD file, and other python code can freely use `from ctypes_python_pkg.ctypes_wrapper import f` to start jumping into native code.

1. pantsbuild#5933
2. pantsbuild#5934
3. pantsbuild#5949
4. pantsbuild#5950
5. pantsbuild#5951
6. pantsbuild#5962
7. pantsbuild#5967
8. pantsbuild#5977
Add simple integration test

Pull in master

Minor cleanup

Fix CI errors

Debug log stdout

Fix integration test

Fix integration test

Fix lint error on third party
jsirois added a commit to jsirois/pants that referenced this issue Jun 24, 2018
This adds release automation of a Pants pex to our Travis CI job when
run against a release_* tag. There is more to be done to consume the pex
in the setup script we point users to, but the actual release should now
be covered.

Fixes pantsbuild#4896
jsirois added a commit that referenced this issue Jun 25, 2018
This adds release automation of a Pants pex to our Travis CI job when
run against a release_* tag. There is more to be done to consume the pex
in the setup script we point users to, but the actual release should now
be covered.

Fixes #4896
jsirois commented Jun 26, 2018

The first release worked: https://github.com/pantsbuild/pants/releases/tag/release_1.9.0.dev0
But there are warts. Arguably, the file name should just be pants.pex since the github release contains the version info. Also the pex reports an unexpected version for the end user:

$ curl -LO https://github.com/pantsbuild/pants/releases/download/release_1.9.0.dev0/pants.1.9.0.dev0.pex && chmod +x pants.1.9.0.dev0.pex && ./pants.1.9.0.dev0.pex -V
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   609    0   609    0     0    904      0 --:--:-- --:--:-- --:--:--   904
100 28.4M  100 28.4M    0     0  4050k      0  0:00:07  0:00:07 --:--:-- 6529k
1.9.0.dev0+aad4b7be

Filed #6026 to address these items.
