@jeroen @jimhester
Here’s the failing build. The originating repo is on GitHub.
The build renders an Rmd and clearly installs the Python package htmldate via the R package reticulate; you can see this in the log. But later, when the script tries to use the package (in FOSS4Spectroscopy.Rmd line 177), it claims it can’t find it. All of this works just fine locally.
I’m grateful for any suggestions!
Check PATH and the contents of the created virtualenv’s bin. I suspect that reticulate doesn’t adjust PATH for the R process to point to the virtualenv.
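A quick way to check this from inside the build is something like the following (a sketch; the env name and paths depend on what the Rmd actually creates, and "r-reticulate" is only reticulate’s default):

library(reticulate)
Sys.getenv("PATH")                  # is the virtualenv’s bin directory on the PATH?
py_config()                         # which Python binary did reticulate bind to?
py_module_available("htmldate")     # can that Python see the module?
# list the env’s bin contents; adjust the name if the Rmd creates a different env
list.files(file.path(virtualenv_root(), "r-reticulate", "bin"))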
Thank you. I made some changes to the .Rmd so that virtual environments are explicit now. It works locally. When I build on Travis CI, I get the message “Error: Tools for managing Python virtual environments are not installed.”
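For reference, the explicit setup chunk now looks roughly like this (a sketch; "htmldate-env" is a placeholder name, the real chunk may differ):

library(reticulate)
# create the env and install htmldate on the first run only
if (!"htmldate-env" %in% virtualenv_list()) {
  virtualenv_create("htmldate-env")
  virtualenv_install("htmldate-env", "htmldate")
}
use_virtualenv("htmldate-env", required = TRUE)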
So I edited my .travis.yml to include:
matrix:
  include:
    - language: python
      python: 3.6
      before_install:
        - sudo apt-get install python3-pip
        - sudo apt-get install python3-venv
    - language: r
      r: 3.6
But the before_install stuff seems to be ignored. I hope I am getting closer, but I’m stuck for now.
You can’t use language: python and language: r together; you have to use one or the other.
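If you stay on language: r, you can still pull the Python tooling in through the apt addon, something along these lines (an untested sketch, reusing the packages from your matrix attempt):

language: r
r: 3.6
addons:
  apt:
    packages:
      - python3-pip
      - python3-venv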
Thank you, Jim (@jimhester). I will work from the R angle only, then. I was trying both as a way around my original question. Can you suggest how to search for projects using R that also use reticulate? I need to see a working example. Otherwise I’ll make an interim change to eliminate the Python use and possibly make an R equivalent.
Thanks for your work here!
Hi @bryanhanson,
I am not sure if this helps you at all. I use Travis to test my R package, which uses reticulate to connect to Python’s boto3 and, through it, to AWS Athena (https://github.com/DyfanJones/RAthena). My Travis yaml is set up as follows:
language: R
sudo: false
cache:
  - packages
  - pip
addons:
  apt:
    packages:
      - python3-pip
before_install:
  - pip3 install --user boto3
after_success:
  - Rscript -e 'covr::codecov()'
Thanks, @dyfanjones. This approach would be overkill for my little project, but I may be able to use it in the future. Much appreciated, Bryan