I doubt you can do recursive retrieval directly with most FTP clients, although you could probably get there in the end with a bunch of file listing and scripting. If you don't want to use a backup program (I use SyncBack, which has a free version that can do FTP backups), you can use Macro Scheduler to automate wget, a free command-line network retrieval program.
wget is a Windows port of a Unix utility, so it's very powerful but not user-friendly. You can get it
here: just download and expand the .zip that's the "current recommended download" at the top of the page.
Here's a quick attempt at a macro that does close to what you want:
Change Directory>c:\mydir
Run Program>c:\getw\wget ftp://username:[email protected]/web/subdir/ -r
It needs some fine-tuning, but what this will do is download the whole directory structure and files from /subdir onwards. As written, it recreates the WHOLE remote directory structure under c:\mydir: the first level is c:\mydir\mydomain.com, the next is c:\mydir\mydomain.com\web, and finally c:\mydir\mydomain.com\web\subdir, which is where the files are. You can play with the options to fix that.
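If you want to avoid that deep nesting, wget has options for it: -nH drops the hostname directory and --cut-dirs drops a given number of leading path components, while -P sets where downloads are saved. A sketch, using the same made-up paths and credentials as the macro above:

```shell
rem Download /web/subdir recursively (-r), skip creating the
rem mydomain.com host directory (-nH), strip the first two remote
rem path components "web/subdir" (--cut-dirs=2), and save everything
rem directly under c:\mydir (-P c:\mydir).
c:\getw\wget -r -nH --cut-dirs=2 -P c:\mydir ftp://username:[email protected]/web/subdir/
```

With those options the files from /web/subdir land straight in c:\mydir instead of c:\mydir\mydomain.com\web\subdir.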
You can see all the options by running wget --help at the command prompt, and Google will turn up examples and help. Depending on what you are doing, you might want the "--mirror" option rather than "-r".
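For reference, --mirror (-m) isn't a separate mode but a shorthand bundle of options suited to keeping a local copy in sync, so a repeat run only fetches what changed:

```shell
rem --mirror is shorthand for: -r (recursive), -N (only re-download
rem files newer than the local copy), -l inf (no depth limit), and
rem --no-remove-listing (keep FTP listing files for change detection).
c:\getw\wget --mirror ftp://username:[email protected]/web/subdir/

rem Equivalent spelled out:
c:\getw\wget -r -N -l inf --no-remove-listing ftp://username:[email protected]/web/subdir/
```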