Compress them first; you'll probably find you get close to 3:1 compression on database dumps. For example, I have a 1GB database that compresses to a 340MB file.
Why do you want to download a local copy anyway? For backup purposes? Try setting up an automated routine to dump your database to disk, compress it, then archive it to Amazon S3.
It's fully automated, so it takes none of your time, is independent of any local internet download speeds or quotas you might have, and, since the transfer is server-to-server, it is likely much faster than downloading anyway. You can then pull down any copy you need from S3 whenever you want it.
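The dump-compress-archive routine can be sketched in Python. This is only an illustration, not a definitive script: it assumes a MySQL database (`mysqldump`), a configured AWS CLI (`aws`), and placeholder names like `mydb` and `my-backup-bucket` that you would replace with your own.

```python
# Sketch of an automated backup routine: dump -> gzip -> archive to S3.
# Assumes mysqldump and the aws CLI are installed and configured;
# database and bucket names below are hypothetical placeholders.
import datetime
import gzip
import os
import shutil
import subprocess

def dump_database(db_name: str, out_path: str) -> None:
    """Dump a MySQL database to a plain SQL file."""
    with open(out_path, "wb") as f:
        subprocess.run(["mysqldump", db_name], stdout=f, check=True)

def compress(path: str) -> str:
    """gzip the dump; text-heavy SQL dumps often compress around 3:1."""
    gz_path = path + ".gz"
    with open(path, "rb") as src, gzip.open(gz_path, "wb") as dst:
        shutil.copyfileobj(src, dst)
    return gz_path

def upload_to_s3(path: str, bucket: str) -> None:
    """Server-to-server copy to S3 via the AWS CLI."""
    key = f"backups/{datetime.date.today()}/{os.path.basename(path)}"
    subprocess.run(["aws", "s3", "cp", path, f"s3://{bucket}/{key}"],
                   check=True)

# A cron entry would then call something like:
#   dump_database("mydb", "/tmp/mydb.sql")
#   upload_to_s3(compress("/tmp/mydb.sql"), "my-backup-bucket")
```

Run it from cron (or a systemd timer) nightly and the whole thing happens without you touching it.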
I have several hundred gigabytes of database backups on S3 costing less than $10 per month to store ... this could be cheaper still if I started using Glacier for archiving.