Firesight categorizes search bots (Googlebot, Bingbot, Yahoo! Slurp) as "Very High" risk. I see them hitting our webservers frequently, but I would expect that to be normal indexing activity. Can anyone explain why these bots are considered "Very High"...
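
For context, part of my uncertainty is whether these hits are genuinely from the search engines they claim to be, since the user-agent alone can be spoofed. Below is a rough Python sketch of the reverse-plus-forward DNS check that Google documents for verifying Googlebot; the function name and sample IP are placeholders of mine, nothing Firesight-specific:

```python
import socket

# Sketch: confirm a client IP from the web server logs really belongs to Googlebot.
# Step 1: reverse-resolve the IP and check the PTR hostname's domain.
# Step 2: forward-resolve that hostname and confirm it maps back to the same IP.
def is_verified_googlebot(client_ip: str) -> bool:
    try:
        hostname, _, _ = socket.gethostbyaddr(client_ip)   # reverse DNS (PTR)
    except OSError:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward-confirm: collect all addresses the hostname resolves to.
        forward_ips = {info[4][0] for info in socket.getaddrinfo(hostname, None)}
    except OSError:
        return False
    return client_ip in forward_ips

if __name__ == "__main__":
    print(is_verified_googlebot("203.0.113.7"))  # placeholder IP taken from logs
```

Bing documents a similar reverse-DNS check (hostnames under search.msn.com) for Bingbot, if that's relevant to the answer.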