I like to use rsync or scp for transferring files from another server whenever possible. But sometimes we are limited to using FTP for the same task, for example when SSH runs on a port other than the standard 22, since scp and rsync rely on SSH.
Downloading each file separately, one after another, is a tedious task compared to downloading them recursively. We can use wget with the FTP protocol to download files recursively from another server. Let's find out how to do this below.
We can use the syntax below to download files recursively with wget.
wget -r ftp://server.com/ --user username --password 'password'
Here -r makes sure that files are downloaded recursively. If you want to download a specific folder and its contents recursively, change the command accordingly.
For example, to download only the blog folder of the main site from the remote server, which is located at /home/user/public_html/blog:
wget -r ftp://server.com/public_html/blog --user username --password 'password'
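By default, -r recreates the remote path locally under a directory named after the host (here server.com/public_html/blog/...). If you prefer a flatter layout, wget's -nH (--no-host-directories) and --cut-dirs options can trim those leading directories; a minimal example, assuming you want to keep only the blog folder itself:
wget -r -nH --cut-dirs=1 ftp://server.com/public_html/blog --user username --password 'password'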
If we just need to download a backup file from the user's home directory, there is no need for the -r (recursive) switch.
wget ftp://192.168.0.1/mybackup.tar.gz --user username --password 'password'
This will download the file to the current working directory.
Sometimes servers won't allow FTP in passive mode. In that case we need to use the command below with the --no-passive-ftp switch.
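If you want the file saved somewhere else, wget's -P (--directory-prefix) switch sets the destination directory; for example, using a hypothetical /home/user/backups directory:
wget -P /home/user/backups ftp://192.168.0.1/mybackup.tar.gz --user username --password 'password'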
wget -r ftp://server.com/mybackup.tar.gz --user username --password 'password' --no-passive-ftp
As you can see above, there is no need to use server.com; you can use the server's IP address instead, which saves some DNS lookups and makes the process a bit faster.
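For instance, assuming purely for illustration that server.com resolves to 192.168.0.1, the recursive download above could be written as:
wget -r ftp://192.168.0.1/public_html/blog --user username --password 'password'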