When spider crawling is not smooth, clearing out the site's traps is the key

(2) Redirect or delete the dead links. If a dead link's position is not very important, it can be 301-redirected to the home page; if the positions are important and the dead links are numerous, you can choose to remove them, using a tool to delete them in bulk.
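The redirect option above can be illustrated with a minimal WSGI middleware; the paths listed are made-up examples, and on a real site this would more likely be configured in the web server itself.

```python
# Hypothetical sketch: 301-redirect known dead paths to the home page (WSGI).
DEAD_PATHS = {"/old-article.html", "/removed-category/"}  # assumed example paths

def redirect_dead_links(app):
    """Wrap a WSGI app so requests for known dead paths are redirected to '/'."""
    def middleware(environ, start_response):
        if environ.get("PATH_INFO") in DEAD_PATHS:
            start_response("301 Moved Permanently", [("Location", "/")])
            return [b""]
        return app(environ, start_response)
    return middleware
```

A permanent (301) redirect is used so that search engines transfer whatever weight the dead URL had to the home page.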


## Solution: choose an authoritative, reliable tool to generate the site map, such as the webmaster tools or the Baidu Webmaster Platform tools. If you are familiar enough with your site, it is best to build the map by hand and test it several times to ensure the site map is accurate and complete.
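As a sanity check on the hand-made map the solution recommends, a sitemap can be generated and re-parsed with the standard library; the URLs below are placeholders, not from the article.

```python
# Hypothetical sketch: build a minimal sitemap.xml and verify it parses back correctly.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string listing the given page URLs."""
    root = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(root, "url"), "loc")
        loc.text = url
    return ET.tostring(root, encoding="unicode")

def sitemap_urls(xml_text):
    """Parse a sitemap string and return its <loc> values, to verify completeness."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(f"{{{SITEMAP_NS}}}loc")]
```

Round-tripping the map through `sitemap_urls` is one cheap "test" of the kind the solution asks for: every page you intended to list should come back out.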

(1) Submit the dead links. Use the Baidu Webmaster Platform's "link submission" tool to submit the site's dead links; for details, refer to the Baidu webmaster tools documentation.

A so-called dead link is a link that returns a 404 error page. These links usually appear after a website is revamped or its domain name is changed. Dead links are bad for both the user experience and spider crawling. A dead link undoubtedly blocks the spider's crawl: when the spider keeps running into dead links that should not be there, it loses confidence in the site and eventually gives up crawling it.
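The 404 check described above can be sketched in Python; the helper names and URLs here are hypothetical examples, not part of the article.

```python
# Hypothetical sketch: find dead links by checking HTTP status codes.
import urllib.request
import urllib.error

def fetch_status(url: str) -> int:
    """Return the HTTP status code for url; an HTTPError carries the code itself."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def dead_links(statuses: dict) -> list:
    """Given {url: status_code}, return the URLs the article calls dead (404 pages)."""
    return [url for url, code in statuses.items() if code == 404]
```

Splitting fetching from classification keeps the 404 logic testable without touching the network.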

## Trap: a huge number of dead links on the site

## Trap: an incorrect site map

## Trap: URLs contain too many parameters


Baidu's official optimization guide shows that the current Baidu search robot, like the Google robot, can now also index URLs with dynamic parameters.
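Even if modern crawlers can handle dynamic parameters, trimming a URL down to an agreed whitelist of parameters keeps it short and crawlable; the parameter names below are assumptions for illustration.

```python
# Hypothetical sketch: canonicalize a dynamic URL by keeping only whitelisted parameters.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

KEEP_PARAMS = {"id", "page"}  # assumed meaningful parameters for this example

def canonicalize(url: str) -> str:
    """Drop query parameters outside KEEP_PARAMS (session ids, trackers, ...)."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in KEEP_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))
```

For example, a session id and a tracking tag are stripped while the content parameters survive.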

## Solution: keep URLs as short as possible and reduce unnecessary dynamic parameters.

A spider is just a robot made up of program code, and it must follow its rules when crawling a website. If the site contains traps, the spider's crawl becomes rough, which causes a series of problems. Which website traps have become stumbling blocks for the spider's crawl? Here I will talk about them briefly.

A site map is a very useful tool for a website, its users, and spiders: a complete and correct site map lets them recognize the site's structure and thus browse and crawl the site better. But some webmasters are not familiar with code or with the framework of their website, and simply choose an unauthoritative tool to produce an incomplete or incorrect site map; the final result is that the spider gets "lost" while crawling.

Spider crawling often determines how much of a site gets indexed, so in day-to-day optimization work we pay close attention to the IIS logs: by observing the logs we learn how the spider crawls the site, and by analyzing them we dynamically monitor the progress of the optimization. Yet many webmasters find their IIS logs show that the spider's crawl is not smooth, even though they believe the site's optimization is done perfectly. What is the reason?
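Watching the spider in the IIS logs can be sketched as a small scan for the crawler's user agent; the log lines and the simplified field order are assumed W3C-extended-format samples, not taken from a real site.

```python
# Hypothetical sketch: count Baiduspider requests per URL in a W3C-style IIS log.
from collections import Counter

def spider_hits(log_lines, agent="Baiduspider"):
    """Return a Counter of requested paths (cs-uri-stem) hit by the given crawler.

    Assumes the simplified field order:
        date time cs-method cs-uri-stem sc-status cs(User-Agent)
    IIS writes '+' in place of spaces inside the user-agent field.
    """
    hits = Counter()
    for line in log_lines:
        if line.startswith("#"):      # skip #Fields / #Date directive lines
            continue
        fields = line.split()
        if len(fields) >= 6 and agent in fields[5]:
            hits[fields[3]] += 1
    return hits
```

Comparing these counts day over day is one way to spot the "not smooth" crawl the paragraph describes.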