How important is it to have a robots.txt file installed?

Mine currently looks like this:

Code:
Sitemap: http://www.mysite.com/sitemap.xml.gz

User-agent: *
Disallow:

What do you think about blocking /privacy-policy/ and other copy-pasted (i.e. duplicate) pages?
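
If you do decide to block them, the change is just extra Disallow lines under the same User-agent record. A minimal sketch, keeping your sitemap line and assuming the duplicate pages live at paths like /privacy-policy/ and /terms/ (the /terms/ path is only an illustration, swap in whatever your duplicates actually are):

Code:
Sitemap: http://www.mysite.com/sitemap.xml.gz

User-agent: *
Disallow: /privacy-policy/
Disallow: /terms/

Note that a bare "Disallow:" with nothing after it (what you have now) allows everything, so each path you want kept out of crawling needs its own Disallow line.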