Hello all, I'm in the process of launching a new site in which I've used the cms:calendar tag to implement an events calendar.
I use www.xml-sitemaps.com to generate XML sitemaps and was surprised when I hit its limit of 500 pages! The site is only quite small!
What seems to be happening is that the crawler is following the navigation links inside the cms:calendar loop and recording loads of pages suffixed like this:
?cal=2025-06-01
?cal=2006-04-01
I can obviously edit the sitemap manually to get rid of all these.
But I'm wondering whether this has implications for robots accessing the site: will they too find these 'ghost' pages?
Any thoughts, anyone? Perhaps I need to put something in the robots.txt file?
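In case it's useful to anyone with the same issue: a sketch of a robots.txt rule that should stop well-behaved crawlers from indexing those calendar URLs, assuming the calendar links all use a ?cal= query parameter as in the examples above (note the * wildcard is supported by major crawlers like Googlebot and Bingbot, though not guaranteed by every bot):

```
User-agent: *
# Block any URL whose query string starts with cal= (calendar navigation links)
Disallow: /*?cal=
```

This only hides the pages from crawlers that honour robots.txt; the links themselves would still exist on the site.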