I need to make an upgrade to my site which involves some URL changes and some work I can only do whilst the site is live (don't ask me why!), so I want to temporarily stop Google and chums from indexing my site until I've finished the work and added my redirects.
What I want to know is: if I put a "disallow all" rule in my robots.txt and a search engine comes along and sees it, will it (a) respect my robots.txt and (b) come back another time? I don't want them thinking they must never crawl my site again; that would be a disaster of biblical proportions!
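For reference, this is the kind of rule I mean, served from the site root at /robots.txt (just a sketch of the standard disallow-all form):

```
# Block all well-behaved crawlers from every path on the site
User-agent: *
Disallow: /
```

My understanding is that crawlers re-fetch robots.txt periodically rather than treating it as permanent, but that's exactly what I'd like confirmed.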
Any advice very, very gratefully received!