A: 

If you want an exact copy of the data, use binary mode - ASCII mode assumes the data is 7-bit text (chars 0-127) and will truncate any data outside that range. This dates back to the arcane 7-bit networking days, when ASCII mode could save you time.

In the globalized environment we live in - where non-ASCII characters such as foreign-language text and currency symbols are common - you should always use BINARY mode.
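
As a rough illustration, here is how the two modes look with Python's standard ftplib (the host, credentials and file names below are placeholders, not anything from this question):

    from ftplib import FTP

    with FTP("ftp.example.com") as ftp:
        ftp.login("user", "password")

        # Binary (IMAGE) transfer: the file arrives byte-for-byte identical.
        with open("photo.jpg", "rb") as f:
            ftp.storbinary("STOR photo.jpg", f)   # ftplib issues TYPE I for you

        # ASCII transfer: line endings may be rewritten in flight, so only use
        # it for plain text you actually want translated.
        with open("notes.txt", "rb") as f:
            ftp.storlines("STOR notes.txt", f)    # ftplib issues TYPE A for you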

stephbu
A: 

ASCII mode also makes sharing text files across different platforms more straightforward for end users. They won't have to worry about the default line ending (CR/LF versus just LF, for example), since ASCII mode does that translation for them on the fly.

For most file types you will ALWAYS want to use BINARY mode though.

GodEater
+1  A: 

ASCII mode translates newline characters between Unix and DOS formats: \n to \r\n and vice versa.

Victor
A: 

ASCII mode converts text files between UNIX and Windows formats (LF vs. CR/LF) based on the server and client platforms; binary mode doesn't. Of course, if you transfer nearly anything that isn't text in ASCII mode, it will probably be corrupted for that reason.
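
To make that concrete, here is a small sketch - not server code, just a reproduction of the rewrite an ASCII-mode store performs - showing why a non-text file gets mangled. The PNG signature is used purely as a convenient example of binary data that happens to contain a CR/LF pair:

    # The 8-byte PNG signature contains the bytes 0x0D 0x0A (CR LF).
    binary_payload = bytes([0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A])

    # A Unix-side ASCII-mode store strips the CR from every CR/LF pair:
    stored = binary_payload.replace(b"\r\n", b"\n")

    print(stored == binary_payload)           # False - the copy no longer matches
    print(len(binary_payload), len(stored))   # 8 7  - one byte silently lost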

Dark Shikari
+1  A: 

Actually, ASCII/BINARY has nothing to do with the 8th bit. It's a convention for translating line endings.

When you are on a Windows machine talking to a Unix FTP server (FTPS or FTP - it doesn't matter, the protocol is the same), the server will replace any <CR><LF> combination with <LF> before storing the file, and will do the translation in reverse when you get the file from the Unix server.

The idea behind ASCII mode is to convert the line endings to the respective endings of the target platform.

As today's world seems to be converging on the Unix convention (<LF>), and as nearly all of today's editors (aside from Notepad) can easily handle Unix line endings, the days of ASCII mode are indeed numbered, and I would by all means recommend always using BINARY transfer mode.

The prospect of having data altered in mid-transfer is somewhat frightening anyways.
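
If it helps, the download direction looks like this with Python's ftplib (host and file names are placeholders): retrbinary leaves the bytes untouched, while retrlines performs the line-ending translation described above and hands you each line with the trailing CRLF already stripped.

    import os
    from ftplib import FTP

    with FTP("ftp.example.com") as ftp:
        ftp.login("user", "password")

        # Binary retrieval: what was stored is exactly what you get back.
        with open("backup.tar.gz", "wb") as f:
            ftp.retrbinary("RETR backup.tar.gz", f.write)

        # ASCII retrieval: each line arrives without its CRLF, so we append the
        # local platform's line ending (newline="" stops Python re-translating).
        with open("readme.txt", "w", newline="") as f:
            ftp.retrlines("RETR readme.txt", lambda line: f.write(line + os.linesep))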

pilif
A: 

For the FTP protocol, ASCII transfer mode will consider the 8th bit of each of your characters as insignificant and will use it for error checking. As for binary transfer mode, your data will be sent as is. Note that sending binary data in ASCII mode will (almost) always end up in data corruption. However, transferring ASCII data in binary mode will work as long as the sending and receiving systems use the 8th bit in the same way (on modern systems the 8th bit should stay at 0 to prevent collisions with extended ASCII charsets).

PierreBdR
ASCII transfer mode is not allowed to modify the 8th bit. ASCII mode will only result in data corruption if your file does not actually consist of text. Transferring ASCII in binary mode will not always work if the two systems don't agree on character sets.
Darron
A: 

Thanks a lot for your comments. I think binary transfer will do the trick for me.

shaleen mohan
+3  A: 

Many of the other answers to this question range from nearly correct to outright wrong.

ASCII mode means that the file should be converted to canonical text form on the wire. Among other things this means:

  • NVT-ASCII character set, even if the original file is in some other character set such as ASCII, EBCDIC or UTF-8. Technically this disallows characters with the 8th bit set, but most implementations won't enforce it.
  • CRLF line endings.

EBCDIC mode means a similar set of rules, except that the data on the wire should be in EBCDIC.

LOCAL mode allows sending data with a size other than 8 bits per byte.

IMAGE (or BINARY) mode means that the data should be sent without any changes. It is up to the user to ensure that the target system can understand the data once it arrives.

Among other things, this means that the recommendation to use BINARY mode to send text data will fail if one of the systems involved doesn't use an ASCII-based character set.
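
For what it's worth, these modes map onto raw TYPE commands on the FTP control connection. A rough sketch with Python's ftplib follows (placeholder host and login; many servers will simply refuse TYPE E and TYPE L):

    from ftplib import FTP, error_perm

    ftp = FTP("ftp.example.com")
    ftp.login("user", "password")

    # TYPE A    ASCII  - canonical NVT-ASCII text, CRLF line endings on the wire
    # TYPE E    EBCDIC - text again, but EBCDIC on the wire
    # TYPE L 8  LOCAL  - explicit logical byte size (8 bits here)
    # TYPE I    IMAGE  - a.k.a. BINARY, bytes sent completely unchanged
    for cmd in ("TYPE A", "TYPE E", "TYPE L 8", "TYPE I"):
        try:
            print(cmd, "->", ftp.sendcmd(cmd))
        except error_perm as exc:
            print(cmd, "-> refused:", exc)

    ftp.quit()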

Darron