Hello community,
I’m brand new to Travis as of today, and I’m attempting my first deployment over FTP, triggered at the end of a super simple GitHub pipeline.
Everything seems to work except for the file transfer stage.
I adapted what this user did in his project and got it working:
https://kendrickcoleman.com/index.php/Tech-Blog/use-travis-ci-to-update-your-website-using-ftp-and-git.html
Except for the final step, where the logs show that curl takes forever:
```
curl --ftp-create-dirs -T "myfile" -u "my_user:myPassword" "ftp://my_host/myfile"
```
The build.sh script works when I run it locally, but not on Travis.
It looks like the FTP starts but never comes to an end.
What did I do wrong?
I read that I might need SFTP, but what if my server doesn’t support it?
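For context, here is roughly the kind of config involved. This is only a sketch: `my_host` and `myfile` are placeholders, and in a real setup the credentials would be stored as encrypted Travis environment variables (here assumed to be named `$FTP_USER` and `$FTP_PASSWORD`) rather than written into the file:

```yaml
# .travis.yml (sketch; $FTP_USER / $FTP_PASSWORD would be set as
# encrypted environment variables in the Travis repository settings)
after_success:
  - curl --ftp-create-dirs -T "myfile" -u "$FTP_USER:$FTP_PASSWORD" "ftp://my_host/myfile"
```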
Hi,
thanks for your answer.
Doing some more tests, I can see that this happens when an existing file has been changed: the transfer takes a long time, then times out and moves on to the next file. If a file doesn’t exist on the server side, it is added immediately.
The image I attached below shows that the FTP transfer of index.php times out after 8 minutes. At the end, index.php “becomes” an empty file.
However, the file testphpfile.php is uploaded without any problem; that’s because it doesn’t yet exist on the server.
[Screenshot: ftp takes forever]
In this light, it looks like it might be an overwrite permission issue for existing files. Do you have any clue about that? Perhaps there is something I can set in the .travis.yml file?
You need to get the FTP session log (curl -v), and preferably the FTP server logs too (to see whether the server accepts data connections), to understand what is happening.
This looks like a server issue.
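For example, something along these lines captures the whole session log and aborts instead of hanging until Travis kills the job. This is only a sketch: the 60-second limit and the log path are arbitrary choices, and `my_user`/`myPassword`/`my_host` are the placeholders from your command:

```shell
#!/bin/sh
# Sketch: upload one file with a full verbose session log, and abort
# instead of hanging forever. my_user / myPassword / my_host are the
# placeholders from the thread; 60s and the log path are arbitrary.
upload_with_log() {
  file="$1"; url="$2"; log="$3"
  curl -v --max-time 60 --ftp-create-dirs \
       -T "$file" -u "my_user:myPassword" \
       "$url" 2> "$log"
  status=$?
  # Exit code 28 means curl hit the timeout; the "* ..." lines in the
  # log show how far the FTP conversation got before it stalled
  # (e.g. whether the data connection was ever established).
  echo "curl exit code: $status (session log in $log)"
  return $status
}
```

The interesting part is the tail of the log: if the last lines show the `STOR` command being sent and then silence, the data connection to the server is the problem, which again points at the server (or a firewall in between).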