Although my site does not have this problem right now, I am wondering whether a larger site could run into it. I have read in several places that, for SEO purposes, a large number of links on any given page can cause problems with search engines. In fact, when I submitted my page to the Analyze META Tags tool, it returned the following:
Code:
Web page analysis.
This page contains too many URLs.
This tag contains 194 urls. Some Search Engines have problems with more than 100 urls on a page.
--------------------------------------------------------------------------------
The size of the web page is too big.
The size of the web page is 104199 bytes.
--------------------------------------------------------------------------------
The web page load time.
The web page load time is 5 seconds.
What I am wondering is whether the sitemap should incorporate some form of paging so that the number of links per page is kept to a reasonable number. A rough sketch of what I mean is below.
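For example, here is a minimal sketch of the kind of paging I have in mind, assuming the sitemap links are available as a flat list. The URLs, file names, and the 100-link cutoff are all just placeholders taken from the analyzer's warning, not anything prescribed:

Code:
# Sketch: split a flat list of sitemap URLs into pages of at most
# 100 links each, so no single page exceeds the limit that some
# search engines reportedly have trouble with.
# All URLs and output file names below are hypothetical.

MAX_LINKS_PER_PAGE = 100

def paginate(links, per_page=MAX_LINKS_PER_PAGE):
    """Yield consecutive chunks of at most `per_page` links."""
    for start in range(0, len(links), per_page):
        yield links[start:start + per_page]

def render_page(chunk, page_num, total_pages):
    """Render one sitemap page as a plain HTML fragment with
    Previous/Next navigation between the paged sitemap files."""
    items = "\n".join(
        f'  <li><a href="{url}">{url}</a></li>' for url in chunk
    )
    nav = []
    if page_num > 1:
        nav.append(f'<a href="sitemap{page_num - 1}.html">Previous</a>')
    if page_num < total_pages:
        nav.append(f'<a href="sitemap{page_num + 1}.html">Next</a>')
    return f"<ul>\n{items}\n</ul>\n<p>{' | '.join(nav)}</p>"

if __name__ == "__main__":
    # 194 dummy URLs, matching the count the analyzer reported.
    links = [f"http://www.example.com/page{i}.html" for i in range(1, 195)]
    pages = list(paginate(links))
    for i, chunk in enumerate(pages, start=1):
        with open(f"sitemap{i}.html", "w") as f:
            f.write(render_page(chunk, i, len(pages)))

With 194 links this would produce two sitemap pages (100 and 94 links), keeping each one under the threshold the tool warned about and also shrinking the per-page byte size.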