View Full Version : Transfer entire directory in FTP command-line?
03-07-2006, 10:09 PM
Squandering time at work, I was trying to move some files from an FTP server to my computer via Terminal. I began to realize that it is either impossible, or more difficult than I'd imagined, to get or put a whole folder on the site (entering many individual names for multiple files in large folders with mget seems too labor intensive). "Get" commands used with directory names as arguments return a "no such file or directory" message. Very confusing. Am I missing something, or is this a real limitation of command-line FTP?
03-07-2006, 10:13 PM
Have you checked the "man" for "FTP"?
03-07-2006, 11:57 PM
Try running ftp with the -i flag and then using mget on the root folder. It won't prompt for each file in a folder.
03-08-2006, 08:56 AM
...entering many individual names for multiple files in large folders -- mget-- seems too labor intensive
interactive mode off
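A sketch of what that looks like in a session (host and directory names here are placeholders):

```
$ ftp -i ftp.example.com     # -i starts the session with interactive prompting off
ftp> prompt                  # or toggle it from inside an existing session
Interactive mode off.
ftp> cd remote_dir
ftp> mget *                  # now fetches every file without asking per-file
```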
03-08-2006, 11:11 AM
Turning off interactive mode and then running mget * works well. But if the directory contains other directories, mget does not copy those subdirectories or their contents. Is there some sort of recursive command for ftp (-r does something different)?
03-08-2006, 11:56 AM
Is there some sort of recursive command for ftp?
% man ftp | grep recurs
And the man page says:
Note: mget, mput and mreget are not meant to transfer entire directory subtrees of files. That can be done by transferring a tar(1) archive of the sub-tree (in binary mode).
That is the usual way - log onto the remote system, create a tar archive of what you want to transfer, then use 'ftp' to transfer the tar file (or preferably tar.gz)
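The pack/transfer/unpack steps can be rehearsed locally; the middle FTP step is just a binary-mode put or get of the archive, and all paths below are made up for the demonstration:

```shell
# Build a small sample tree standing in for the remote subtree
cd "$(mktemp -d)"
mkdir -p site/sub
echo "hello" > site/sub/file.txt

# Step 1 (on the remote host): pack the subtree into a compressed archive
tar -czf site.tar.gz site

# Step 2 would be the FTP transfer of site.tar.gz, in binary mode

# Step 3 (on the local host): unpack, recreating the full directory tree
mkdir -p unpacked
tar -xzf site.tar.gz -C unpacked
```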
03-08-2006, 03:05 PM
If you have secure shell (ssh) access to both hosts, there is a pretty easy way to move an entire directory structure from host A to host B in one line:
from host A:
tar -cf - directory | ssh user@hostB "cd target_dir; tar -xf -"
Where directory is the directory you want to move and target_dir is the destination directory on the remote host.
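The same pipe can be tried locally by replacing the ssh hop with a subshell, which makes the mechanics easy to see (directory names below are invented for the demo):

```shell
# Sample source tree and an empty destination
cd "$(mktemp -d)"
mkdir -p srcdir/nested target_dir
echo "data" > srcdir/nested/f.txt

# Stream a tar archive of srcdir into a subshell that unpacks it inside
# target_dir; the ssh version simply runs the right-hand side of the
# pipe on the remote host instead of in a local subshell
tar -cf - srcdir | (cd target_dir && tar -xf -)
```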
03-08-2006, 04:05 PM
Another way, if you have SSH:
rsync -aE -e ssh directory user@hostB:target_dir
or from hostB
rsync -aE -e ssh user@hostA:directory target_dir
You can also use the z (--compress) switch to rsync if network throughput is an issue.
03-08-2006, 04:19 PM
Unfortunately, none of these will work if you don't have ssh access to the box. For a student org that I'm part of, we have FTP access for our web page but nothing else. I eventually gave up on the ftp command-line tool for nested folders in favor of using Cyberduck to transfer the files. As much as I like working in the command line (and being able to script things to my heart's content), for FTP a GUI is probably the way to go.
/me wonders if Cyberduck is scriptable via Applescript.
/me goes to find out...
03-08-2006, 06:31 PM
You could probably do it with wget (http://directory.fsf.org/wget.html).
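A sketch of a recursive FTP fetch with wget, using a placeholder host and credentials (check `man wget` for the exact flags your build supports):

```
$ wget -r --user=myuser --password=mypass ftp://ftp.example.com/site_dir/
```

The -r switch makes wget walk the remote directory tree and mirror it locally, which sidesteps mget's flat, one-directory limitation.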
03-09-2006, 12:58 PM
I liked the tar option, but with no ssh access there is no tar. I found that I was able to mount the FTP site then use cp to grab the folders. The free GUI tools like Cyberduck and Fugu are always an option for ease of use. Command-line tools like the aforementioned wget and NcFTP can also transfer recursively, but one needs to compile and install them from source code.
07-12-2011, 12:46 AM
This one definitely works
1. Use the zip command as follows to archive all directories under /tmp/abcd/xyz/:
zip -r <target zip file name> <source folder/*>
For example:
zip -r /tmp/yyy /tmp/abcd/xyz/*
This will create a file in /tmp area with the name yyy.zip
2. Transfer the zip file with your normal FTP commands (in binary mode).
3. On the target server, use the unzip command to restore the entire folder structure.
This will recreate the folder with its original hierarchy, but with new permissions, timestamps, and ownership belonging to the current user.
vBulletin® v3.8.7, Copyright ©2000-2014, vBulletin Solutions, Inc.