Hi All,
My site is a real-estate listing site, so the client will be regularly adding and deleting listings. I’m using the Structure module, and the real-estate listings are set up as Structure listings.
My client’s SEO person has told me to write a robots.txt file that excludes all of the individual listing pages from being crawled. I’m not sure how to do this, since those pages are generated dynamically and change regularly. Can anyone give me some guidance? Thanks!
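In case it helps anyone answer: my rough guess is that this works by URL prefix rather than by listing individual pages, so the dynamic URLs wouldn’t matter as long as they all share a common path segment. Assuming (and this is just my assumption, based on how I think Structure builds URLs) that every listing lives under something like /listings/, I’m imagining something like this:

```
# Applies to all crawlers
User-agent: *

# Block everything under the listings section.
# robots.txt rules match by URL prefix, so newly added
# dynamic pages under this path are covered automatically.
Disallow: /listings/
```

Is that the right idea? And if the listings end up spread across several top-level sections, would I just need a separate Disallow line for each section’s path?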