Despite the seemingly enormous mindshare Google’s search engine holds with the public at large, Microsoft continues to toil away at its own search engine, not only to prevent a monopoly in the search market but also to support the company’s own suite of products and services that rely on Bing.
To that end, Microsoft’s Bing team just announced that it’s working on making its web crawler more efficient. Thanks to a new algorithm developed over the past 18 months, BingBot (yes, that’s a thing) should have a better understanding of “which sites to crawl, how often and how many pages to fetch from each site,” according to Fabrice Canel, principal program manager for Bing Webmaster Tools (via Search Engine Land).
Other improvements to BingBot center on reducing its footprint on web servers as it crawls. Microsoft’s stated goal of “crawl efficiency” is about striking a balance: indexing new and updated content quickly while ensuring that site servers aren’t overtaxed in the process.
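Canel’s post doesn’t reveal the algorithm itself, but the trade-off is easy to illustrate. The sketch below is entirely hypothetical — the class name, intervals, and back-off multipliers are our own, not Bing’s — and shows one common approach: shorten the crawl interval for sites whose content changes often, and lengthen it when nothing changes or the server shows signs of strain.

```python
import time
from dataclasses import dataclass


@dataclass
class SiteCrawlState:
    """Hypothetical per-site state a crawler might use to pace itself."""
    url: str
    crawl_interval: float = 3600.0  # seconds between visits
    last_crawl: float = 0.0

    def record_result(self, content_changed: bool, server_slow: bool) -> None:
        self.last_crawl = time.time()
        # Crawl more often if the content is changing (floor: 10 minutes)...
        if content_changed:
            self.crawl_interval = max(600.0, self.crawl_interval * 0.5)
        # ...and less often if it isn't (ceiling: 24 hours).
        else:
            self.crawl_interval = min(86400.0, self.crawl_interval * 2.0)
        # Back off sharply whenever the server responds slowly.
        if server_slow:
            self.crawl_interval = min(86400.0, self.crawl_interval * 4.0)

    def due(self, now: float) -> bool:
        """Return True when the site is due for another fetch."""
        return now - self.last_crawl >= self.crawl_interval


if __name__ == "__main__":
    site = SiteCrawlState("https://example.com/")
    site.record_result(content_changed=True, server_slow=False)
    print(f"next crawl in {site.crawl_interval:.0f}s")  # halved to 1800s
```

The multiplicative increase/decrease pattern here is the same idea used in network congestion control: react quickly to trouble, probe cautiously for freshness.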
The improvements to BingBot come amid concerns from the webmaster and SEO community, which has complained about the effectiveness of Bing’s web crawl. Canel addressed those concerns in a post on the Bing Webmaster blog:
“We’ve heard concerns that bingbot doesn’t crawl frequently enough and their content isn’t fresh within the index; while at the same time we’ve heard that bingbot crawls too often causing constraints on the websites resources.”
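For site owners on the second side of that complaint — bingbot hitting their servers too often — Bing honors the Crawl-delay directive in robots.txt. A minimal example (the 10-second value is just an illustration, not a recommendation):

```
User-agent: bingbot
Crawl-delay: 10
```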
Bing has long been a relative second-class citizen as far as consumer interest in search engines goes. Perhaps the updated BingBot will help narrow the relevance gap that many perceive between Google’s and Microsoft’s search offerings.