Yahoo Slurp – a waste of bandwidth
Looking at the stats for a couple of websites I administer, I have noticed that Yahoo Slurp uses an unfeasibly large amount of bandwidth compared to all the other crawlers, despite actually providing very few visitors.
In the worst case for one site last month (www.tinangelrecords.co.uk), Slurp used 1,000 times the bandwidth of Google yet produced only 1/20th of the number of visits; it was also responsible for almost 1/3 of the total site bandwidth. Fortunately bandwidth isn't a problem at the moment, but if it becomes one the first thing I will do is simply block Slurp, since the handful of lost visitors doesn't justify the effort of working out how to merely limit the amount of robot traffic it generates.
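For reference, the robots.txt entries involved would look roughly like this. The user-agent token for Yahoo's crawler is Slurp; the 30-second figure below is purely illustrative, and whether Slurp actually respects Crawl-delay in practice is another matter.

  # Throttle Yahoo Slurp: ask it to wait 30 seconds between requests
  User-agent: Slurp
  Crawl-delay: 30

  # Or, to block it outright, use this instead:
  # User-agent: Slurp
  # Disallow: /

Other crawlers are unaffected by a stanza addressed only to Slurp, so Google would carry on as before.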
[AWStats table: Robots/Spiders visitors (Top 25) – Full list – Last visit]
[AWStats table: Connect to site from]
Mathew Mannion
Yahoo Slurp also ignores the robots.txt instruction to wait between requests, so if you have a slow page it tends to hammer it repeatedly (it has a short timeout), making the page slower and slower.
03 Jul 2008, 10:13