Ok, so after the recent deal offering free 50 GB Box.com accounts, I decided it would be nice to back up my sites there.
My sites are already backed up by third-party software to my other LEBs, which stores them in a domain/backup-date.tar.gz structure. Now I want to automate uploading these files to Box.com.
I wrote a simple script that automates this process for me. It's very personal, but I still want to share it, along with the basic idea of how you can do the same yourself and adapt it to your own needs.
Basically, it will work with any online storage that supports WebDAV. It also supports file splitting, to get around file size limits on the remote storage (200 MB on Box.com). It doesn't require FUSE (which is almost always disabled on OpenVZ VPSes), because it works via curl uploads. So all you need is curl (usually installed by default) on your VPS.
Here is the script on pastebin.
The script was written quickly for personal use. It may look ugly, since it was thrown together in a few minutes with no time to polish it (this post actually took longer to write than the script). So it's provided AS-IS, with no warranties. Consider it just an example; I'm not going to polish it, improve it, or make it more generic.
The most important parts:
curl -u $LOGIN:$PASS -X MKCOL https://www.box.net/dav/my/path - creates the remote directory. This is required because if you try to upload to a non-existent directory, the file effectively goes to /dev/null.
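One gotcha with WebDAV: MKCOL does not create missing parent collections, so a deep path has to be created one component at a time. Here is a minimal sketch of that; mkcol_recursive is a hypothetical helper name (not from the script), and it assumes LOGIN and PASS are set as above.

```shell
# Hypothetical helper: create each component of a remote WebDAV path in
# turn, since MKCOL fails (409 Conflict) if the parent does not exist.
mkcol_recursive() {
  local base="$1" path="$2" cur="" part
  local IFS='/'
  for part in $path; do            # split the path on "/"
    [ -z "$part" ] && continue     # skip empty components
    cur="$cur/$part"
    curl -s -u "$LOGIN:$PASS" -X MKCOL "$base$cur"
  done
}
# e.g. mkcol_recursive "https://www.box.net/dav" "my/path"
```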
curl --user $LOGIN:$PASS -T /home/backups/x.tar.gz https://www.box.net/dav/my/path/myfile.tar.gz - uploads /home/backups/x.tar.gz as myfile.tar.gz to the /my/path directory on Box.com.
split -b $MAXFS $files /tmp/$$. - splits the file into chunks of at most $MAXFS bytes in /tmp. $$ is the current PID, so the pieces will be named like 12345.aa, 12345.ab, etc.
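Putting the split and upload steps together, the flow looks roughly like this. upload_split is a hypothetical wrapper I'm using for illustration; the curl call, credentials, and the /tmp/$$. chunk naming are taken from the snippets above, and 200000000 stands in for the Box.com 200 MB limit.

```shell
# Sketch: split a file into chunks under the remote size limit and
# upload each chunk to a WebDAV directory. Assumes LOGIN/PASS are set.
upload_split() {
  local file="$1" url="$2" maxfs="${3:-200000000}"
  local prefix="/tmp/$$."
  # produces /tmp/<PID>.aa, /tmp/<PID>.ab, ... as in the script
  split -b "$maxfs" "$file" "$prefix"
  local chunk
  for chunk in "$prefix"*; do
    curl --user "$LOGIN:$PASS" -T "$chunk" "$url/$(basename "$chunk")"
  done
  rm -f "$prefix"*   # clean up the local chunks
}
# e.g. upload_split /home/backups/x.tar.gz https://www.box.net/dav/my/path
```

To restore, you would download the pieces and rejoin them with cat 12345.* > backup.tar.gz.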
Remember: you can add "-v" to the curl options to see debug info about the transfers.
Sorry if the explanation and the script aren't good enough. It just works for me, and you can use it as a starting point for your own needs.