How to update a GitHub release?

(this is a followup to my previous post)

I would like my untagged-commit builds to deploy to a GitHub release named after the Maven version. The idea is that the latest snapshot build is always available as a GitHub release, say 1.0-SNAPSHOT. This means the release might already exist.
I followed the indications there to set TRAVIS_TAG to the Maven version. It worked once, but failed on subsequent builds.
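For reference, that setup boils down to something like this in .travis.yml (a sketch only; the deploy section itself is unchanged, and the mvn invocation is the same one used below):

```yaml
before_deploy:
  # Use the Maven project version as the release tag when the commit isn't tagged
  - export TRAVIS_TAG=${TRAVIS_TAG:-$(mvn help:evaluate -Dexpression=project.version -q -DforceStdout)}
```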

If I add a unique suffix, like this:

   - export VERSION=$(mvn help:evaluate -Dexpression=project.version -q -DforceStdout)
   - export TRAVIS_TAG=${TRAVIS_TAG:-$VERSION-$(git log --format=%h -1)}

it works, but a new release is created on each build. I would like the existing release to be updated instead, if it exists.

Is there a way to UPDATE an existing GitHub release? I suppose I might need to move the tag, if it exists, to the current commit?


Thanks for pointing me to this discussion about “nightly” releases, very helpful.

So my idea to avoid infinite loop is:

  • set the deployment to on: tags
  • when building an untagged commit, move the “latest” tag with git tag -f and push, but don’t set TRAVIS_TAG; this will prevent the current build from being deployed (because of on: tags)
  • the tag push should trigger a new tagged-commit build, which will have TRAVIS_TAG set and hence deploy normally
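A minimal sketch of that flow in .travis.yml (assuming the moving tag is named “latest” and a GITHUB_TOKEN with push rights is available — both names are mine, adjust to your setup):

```yaml
deploy:
  provider: releases
  # api_key, file, etc. as usual
  on:
    tags: true

after_success:
  - |
    # On untagged-commit builds only, move the tag; the push then triggers
    # the tagged build that will actually deploy.
    if [[ -z $TRAVIS_TAG ]]; then
      git tag -f latest
      git push -f "https://$GITHUB_TOKEN@github.com/$TRAVIS_REPO_SLUG.git" latest
    fi
```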

Does that sound viable?

PS: it’s not exactly true that a GitHub release must be associated with a Git tag: you can actually create a GitHub release without a tag in draft mode, but you can’t publish it until it is associated with a tag.


Yes, this is the easiest way, and would allow you to build both automatic and manually created tags in a uniform manner.

I have implemented this and it almost works… My build now has 3 jobs (for different platforms), and when building the tagged commit, only one of the 3 (the first one to deploy, actually) succeeds in deploying to GitHub Releases; the other 2 fail with Faraday::ConnectionFailed.
If I delete the GitHub release before triggering a build, it works and all 3 jobs upload their respective files.
Is that a bug?
As a workaround I could maybe delete the release just before moving the tag and triggering the tagged build. This would also have the additional benefit of making sure that there are no leftover files attached, so all the files are guaranteed to come from the same commit.
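That cleanup could be sketched like this in .travis.yml, run just before moving the tag (GitHub v3 REST API; assumes GITHUB_TOKEN is set, jq is installed, and “latest” is the moving tag — the names are illustrative):

```yaml
after_success:
  - |
    if [[ -z $TRAVIS_TAG ]]; then
      # Delete the existing "latest" release (if any) before moving the tag,
      # so every asset on the recreated release comes from the current commit.
      release_id=$(curl -s -H "Authorization: token $GITHUB_TOKEN" \
        "https://api.github.com/repos/$TRAVIS_REPO_SLUG/releases/tags/latest" \
        | jq -r '.id // empty')
      if [[ -n $release_id ]]; then
        curl -s -X DELETE -H "Authorization: token $GITHUB_TOKEN" \
          "https://api.github.com/repos/$TRAVIS_REPO_SLUG/releases/$release_id"
      fi
    fi
```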

Need to see the build to say anything.

Here’s the latest one:
I tried dpl v2; interestingly, only 1 of the 3 jobs failed.

These look like unrelated network problems. On Windows, some invoked programs are also not found, but this doesn’t seem to affect the result.

I suggest restarting the failed job(s) up to a few times – hopefully that will be enough for them to succeed at least once.

I have restarted the failing build several times and it never succeeded. It seems to time out now…

Now the Mac build is consistently timing out on deploy.

Deploying is very fast when it works, so giving it more time wouldn’t change anything…
The log isn’t very helpful: there’s a bunch of “warning: already initialized constant” messages (all apparently related to OpenSSL), which I assume are not the reason it fails, and then “No output has been received in the last 10m0s”.

How can I debug this?

Okay, I confirmed the following workaround to work:

  - |
    if [[ $TRAVIS_OS_NAME == "osx" ]]; then
      sudo chown -R "$USER:$(id -g)" /Users/travis/.rvm/rubies/
      rvm $(travis_internal_ruby) do gem uninstall openssl --all
    fi

The deployment logic is in the dpl project. You can fork that project, create a topic branch with debug logic, and use it in your build by specifying the following in the deploy: entry:

  source: your_github_name/dpl
  branch: your_topic_branch

In this case, however, the warnings suggest that Travis’ openssl gem is incompatible with Homebrew’s OpenSSL – it looks like the same general problem as in “Ruby Openssl: Python deployment fails on osx image”. The incompatible gem happened not to be the default one, so uninstalling it was enough; rebuilding Ruby wasn’t needed.