If you are testing your robots.txt file with Google's robots.txt Tester
(https://www.google.com/webmasters/tools/robots-testing-tool) and you are getting this message:
File robots.txt not found (404)
It seems like you don't have a robots.txt file. In such cases we assume that there are no restrictions and crawl all content on your site
you can solve the problem this way: click “Fetch as Google” in the menu (it's just above “robots.txt Tester”), enter robots.txt as the path, and click Fetch.
Go back to the robots.txt Tester and you'll see that it now works.
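Independently of Google's tools, you can also confirm that your robots.txt parses the way you expect, using Python's standard urllib.robotparser. A minimal sketch, with a hypothetical robots.txt body and the example.com domain standing in for your site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; replace with your actual file's body.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls under "User-agent: *" here, so /private/ is blocked
# and everything else is allowed.
print(parser.can_fetch("Googlebot", "https://example.com/"))           # True
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

If this parses and answers as expected while the Tester still reports a 404, the problem is on the fetching side (or in the tool), not in your file's syntax.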
Some say it happens because the server blocks Googlebot's access, but that raises several questions:
Why do all the other Google functions work, then?
Why, if you run “Test” on robots.txt from the same page, does it say “Allowed”?
Why does “Fetch as Google” work?
Why does Googlebot crawl the pages while only robots.txt seems blocked?
Why would a web farm block access to robots.txt only?
The answer is: a Google bug.