A more detailed explanation:
The robots.txt file is a simple text file (as the name suggests) that is there just for bots and spiders (web crawlers) to read for reference. It can contain a list of the pages/directories that are not supposed to be indexed or crawled by the bots. It can also be ignored by them, and often is by 'bad' bots from unscrupulous sites like phishers and spammers, so it should not be relied on to keep anything out. It is a purely passive, 'read-only' type of file: a suggestion rather than an instruction. It is not hidden and can usually be viewed just by typing its name into the browser address bar.
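As a quick illustration, a minimal robots.txt might look like this (the directory names are just placeholders):

```
# Applies to all crawlers
User-agent: *
# Ask bots not to crawl these directories
Disallow: /admin/
Disallow: /private/
```

A well-behaved crawler honours those Disallow lines; a bad one simply ignores them.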
The .htaccess file is an Apache server configuration file that can contain instructions for the server to carry out under certain conditions, such as blocking IPs, rewriting URLs, or redirecting error pages. This can be relied on to do its job, like keeping out spammers, and is what you should use when the behaviour must be guaranteed. It cannot be ignored: it is an 'active' type of file that the server must obey. It is hidden from view and cannot be accessed by visitors.
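For example, a small .htaccess snippet along those lines (the IP address and page name here are made up for illustration; this uses the classic Apache 2.2 Allow/Deny directives, while Apache 2.4 uses Require instead):

```
# Block one specific visitor IP (example address)
Order Allow,Deny
Allow from all
Deny from 203.0.113.45

# Show a custom page when a URL is not found (404)
ErrorDocument 404 /not-found.html
```

Unlike robots.txt, the server itself enforces these rules, so the blocked visitor has no way to opt out.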
Summary: robots.txt = suggestions & .htaccess = instructions (system file).
I hope that's clear now
If I can't be a good example, I'll just have to be a terrible warning...