Rclone: Rsync for Cloud Storage
Today, we’re going to talk about a unique, easy-to-use tool that makes working with our Cloud Storage even easier. Meet rclone. The developers describe it as “rsync for cloud storage”, and that says a lot.
Rclone has applications for a wide variety of cloud users. Its main function is to synchronize data in cloud storage with local machines, making it an ideal tool for creating backups, working with static sites, and more.
Rclone also has options that are not available in similar programs. We’ll discuss these in detail below.
Installation and Initial Configuration
Rclone’s obvious advantage over similar products is its multi-platform support: Linux, Windows (a welcome addition, since Cyberduck has recently been practically the only client available there), macOS, Solaris, FreeBSD, OpenBSD, NetBSD, and Plan 9.
Versions for all of these operating systems can be found on the download page.
In this article, we’ll be looking at the particulars of rclone for Linux. To install, we first download the archive for our platform and then run:
$ unzip rclone-current-linux-amd64.zip
$ cd rclone-*-linux-amd64
$ sudo cp rclone /usr/sbin/
$ sudo chown root:root /usr/sbin/rclone
$ sudo chmod 755 /usr/sbin/rclone
$ sudo mkdir -p /usr/local/share/man/man1
$ sudo cp rclone.1 /usr/local/share/man/man1/
$ sudo mandb
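To make sure the binary was installed correctly, you can ask it for its version:

$ rclone version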
Once the installation is complete, we configure rclone for Selectel Cloud Storage:
$ rclone config
The following dialog will appear in the console:
No remotes found - make a new one
n) New remote
q) Quit config
n/q>
We choose n and press Enter. Next, we’ll need to enter a name for our remote storage connection:
name >
We enter a name (like Selectel) and move on to the next step:
 1 / Amazon Cloud Drive
   \ "amazon cloud drive"
 2 / Amazon S3 (also Dreamhost, Ceph)
   \ "s3"
 3 / Backblaze B2
   \ "b2"
 4 / Dropbox
   \ "dropbox"
 5 / Google Cloud Storage (this is not Google Drive)
   \ "google cloud storage"
 6 / Google Drive
   \ "drive"
 7 / Hubic
   \ "hubic"
 8 / Local Disk
   \ "local"
 9 / Microsoft OneDrive
   \ "onedrive"
10 / Openstack Swift (Rackspace Cloud Files, Memset Memstore, OVH)
   \ "swift"
11 / Yandex Disk
   \ "yandex"
We select number 10 (swift) and press Enter. We’ll then be asked for our username and password, followed by the authentication URL for the server:
Authentication URL for server.
Choose a number from below, or type in your own value
 1 / Rackspace US
   \ "https://auth.api.rackspacecloud.com/v1.0"
 2 / Rackspace UK
   \ "https://lon.auth.api.rackspacecloud.com/v1.0"
 3 / Rackspace v2
   \ "https://identity.api.rackspacecloud.com/v2.0"
 4 / Memset Memstore UK
   \ "https://auth.storage.memset.com/v1.0"
 5 / Memset Memstore UK v2
   \ "https://auth.storage.memset.com/v2.0"
 6 / OVH
   \ "https://auth.cloud.ovh.net/v2.0"
Selectel Cloud Storage isn’t on the current list, so we’ll have to enter the address manually:
auth > https://auth.selcdn.ru/v1.0
The next two points (tenant and region) are optional and can be skipped.
The last prompt in the dialog will ask us to verify our configuration:
Remote config
--------------------
[selectel]
user = your_username
key = your_password
auth = https://auth.selcdn.ru/v1.0
tenant =
region =
--------------------
y) Yes this is OK
e) Edit this remote
d) Delete this remote
y/e/d>
If all of the information is correct, we select y and press Enter.
Command Examples
The command syntax for cloud storage is simple:
# View a list of containers in storage
$ rclone lsd selectel:

# Create a new container
$ rclone mkdir selectel:[container name]

# View a list of files in a container
$ rclone ls selectel:[container name]

# Copy files from the local machine to storage
$ rclone copy /home/local/directory selectel:[container name]

# Synchronize files on the local machine with storage
$ rclone sync /home/local/directory selectel:[container name]

# Synchronize files in storage with the local machine
$ rclone sync selectel:[container name] /home/local/directory
When copying or synchronizing, rclone compares each file’s size and modification time (or, optionally, its MD5 checksum) between the source and the destination. Only new or modified files are transferred from the source directory to the destination.
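Two standard flags are useful here: --dry-run shows what would be transferred without changing anything, and --checksum switches the comparison to MD5 checksums. A quick sketch using the same placeholder paths as above:

# Preview what sync would transfer, without copying anything
$ rclone sync --dry-run /home/local/directory selectel:[container name]

# Compare files by MD5 checksum instead of size and modification time
$ rclone sync --checksum /home/local/directory selectel:[container name]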
We won’t look at all of the commands in this article, but anyone interested should look at the official documentation. You can also get information using the command:
$ rclone --help
The bulk of rclone’s functions can also be found in other tools for working with cloud storage. There is, however, one feature that is missing from every other tool we know of: migrating data directly from one cloud to another.
We’ll look at the following practical use case: we have a folder with photos in Google Drive, and we have to migrate the contents to our Cloud Storage. This is something that rclone can handle easily. First we create a new connection. In the list of available clouds, we choose Google Drive. Afterwards, we have to enter two parameters: client_id and client_secret. We’ll leave these blank and press Enter.
Next, we’re asked the following question:
Remote config
Use auto config?
 * Say Y if not sure
 * Say N if you are working on a remote or headless machine or Y didn't work
y) Yes
n) No
y/n>
We choose “no” (n). Rclone will generate a link where we can obtain the code:
If your browser doesn't open automatically go to the following link:
https://accounts.google.com/o/oauth2/auth?client_id=202264815644.apps.googleusercontent.com&redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&response_type=code&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdrive&state=ac901aefe97aff8ce65fe593060d0b0c
Log in and authorize rclone for access
We open this link in our browser and grant rclone access to our files.
Afterwards, the Google Drive API will return a code that we’ll have to enter in response to the prompt:
Enter verification code>
And that’s it! The connection to Google Drive has been set up, and we can start our migration:
$ rclone copy [connection name]:[directory name] selectel:[container name]
Rclone can perform this task fairly quickly; we copied a folder with 1.8 GB worth of photos in 1 minute and 55 seconds.
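For larger migrations, a couple of standard flags are worth knowing: -v reports each file as it is copied, and --transfers sets how many files are transferred in parallel (four by default). In the sketch below, gdrive stands for whatever name you gave the Google Drive connection and Photos is a hypothetical folder:

# Copy with verbose output and 8 parallel transfers
$ rclone copy -v --transfers 8 gdrive:Photos selectel:[container name]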
While experimenting with rclone, we also discovered that it can easily copy Google Docs documents, converting them to docx format in the process.
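The export format can also be set explicitly when copying out of Google Drive. Depending on your rclone version, the option is called --drive-formats (older releases) or --drive-export-formats (newer ones); gdrive and Documents below are, again, hypothetical names:

# Export Google Docs files as docx while copying (flag name varies by rclone version)
$ rclone copy --drive-formats docx gdrive:Documents selectel:[container name]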
Uploading Large Objects
Let’s see how quickly rclone handles one more test: uploading a large object (over 20 GB) to storage. Files of up to 20 GB can be uploaded with the standard commands. The process for larger files is a bit different: they are first divided into segments, which are uploaded to a separate container.
Rclone divides these files into 5 GB segments by default. If necessary, the segment size can be changed using the --swift-chunk-size option. We decided to upload a 25 GB file to storage, and rclone completed the task in 11 minutes and 14 seconds. As you can see, it handled the task pretty well.
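As a rough sketch, the segment size can be passed directly on the command line. Older rclone releases expect the value in bytes, while newer ones also accept size suffixes; big-file.img here is just a hypothetical large file:

# Upload a large file in 1 GB segments instead of the default 5 GB
# (on older rclone versions, specify the size in bytes: --swift-chunk-size 1073741824)
$ rclone copy --swift-chunk-size 1G /home/local/big-file.img selectel:[container name]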
Conclusion
Rclone is a very interesting tool that we fully recommend. If you feel we’ve forgotten something, please write us and we’ll be sure to add it.
Also, if you’ve already used rclone, please share your experience in the comments below.