Copy 20GB to a remote server: no luck.

On my laptop, the hard disk has started to fail. =(

So I wanted to back up the crypt file holding all my valuable files. This crypt file is 20GB. I tried to copy it to our server in one go. No luck.

Tried within KDE (4.9.0), no luck.
Tried in bash with

$ cp crypt-file smb/server/home

(I usually mount Samba shares under ~/smb/SERVER/SHARE.)
No luck.
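
For completeness, a minimal sketch of how such a mount can be set up with cifs-utils (SERVER, SHARE, and USER are placeholders, not my actual setup):

$ sudo mount -t cifs //SERVER/SHARE ~/smb/SERVER/SHARE -o username=USER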

Tried with scp: no luck.

The effect: at some point the transfer either aborts or, in the case of scp, stalls. That happened when around 11GB had arrived on the server. Since scp stalled, I tried it with the -l option, which limits the bandwidth in Kbit/s:

$ scp -l 8192 crypt-file user@server:

But at 8192 Kbit/s (roughly 1 MB/s) this takes an eternity for 20GB: almost six hours, if nothing breaks. And there is no guarantee of success either.
No luck.
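
A resumable alternative for a single large file, assuming rsync is installed on both ends: with --partial, a half-transferred file is kept on the server, so a broken run can pick up from what is already there instead of resending everything:

$ rsync --partial --progress crypt-file user@server:backup/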

Next, I disabled TCP selective acknowledgements (SACK), which occasionally cause scp transfers to stall:

# echo 0 > /proc/sys/net/ipv4/tcp_sack
$ scp crypt-file user@server:

No luck.
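
The same switch can also be flipped via sysctl; to survive a reboot, net.ipv4.tcp_sack = 0 would go into /etc/sysctl.conf:

# sysctl -w net.ipv4.tcp_sack=0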

*sigh*

Finally, I mounted the crypt and went for rsync:

$ rsync -avuz crypt/ user@server:backup

and, on the server, used gpg to encrypt the tarball of credentials holding my passwords.
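
That step, roughly (the directory name and key ID are placeholders):

$ tar czf credentials.tar.gz credentials/
$ gpg -e -r KEYID credentials.tar.gz
$ rm credentials.tar.gz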

*sigh*

So rsync is the salvation. But the data then sits on the server in the plain. I admit: this *has* some advantages, but the drawback is that you have to go and find all your precious files which ought to be protected and then find yet another way to encrypt them. With the help of $ gpg -e FILE this is OK, but cumbersome.
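
That hunt can at least be scripted; a sketch, with the path and key ID as placeholders:

$ find backup/private -type f -exec gpg -e -r KEYID {} \;

gpg then drops an encrypted FILE.gpg next to each original, which still leaves the plain copies to clean up.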