To exclude a single robot (block crawling by the BadBot robot only)
User-agent: BadBot
Disallow: /
To allow a single robot (permit crawling by the WebCrawler robot only)
User-agent: WebCrawler
Disallow:
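
On its own, this record only tells WebCrawler it may fetch everything; other robots are unaffected. To allow a single robot and no others, the file also needs a catch-all record excluding everyone else. A minimal sketch (the robot name WebCrawler is taken from the example above):

User-agent: WebCrawler
Disallow:

User-agent: *
Disallow: /

A robot obeys the most specific record matching its own name, so WebCrawler follows the first record (nothing disallowed) while every other robot falls through to the "*" record and is shut out of the whole site.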
To exclude all files except one (allow crawling of everything except one location)
This is currently a bit awkward, as there is no "Allow" field. The easy way is to put all files to be disallowed into a separate directory, say "docs", and leave the one file in the level above this directory:
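
# A minimal sketch, using the "docs" directory named above: every page
# meant to be hidden lives under /docs/, and the one public file sits
# in the parent directory, where it remains crawlable.
User-agent: *
Disallow: /docs/

Alternatively, each disallowed page can be listed with its own Disallow line, but collecting them all under one directory keeps the file short and easier to maintain.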