Google, Yahoo, Microsoft Agree On Web Site Mapping Standard - InformationWeek
11/15/2006
07:48 PM

Sitemaps 0.90 provides a unified format for publishers to submit content to search engines.

The three Internet search leaders have decided to work together to make information easier to find. Google, Microsoft, and Yahoo have jointly agreed to support a single, open protocol to make Web sites more visible to search engines.

On Thursday, the three companies plan to reveal support for Sitemaps 0.90, a protocol that lets Web publishers inform search engines about content on their sites.

"We just wanted to make it easy for site owners to have one file they could maintain on the site," says Vanessa Fox, product manager for Webmaster Central at Google.

"For a long time, most people in the search industry have thought it would be great to have a unified format to submit content to search engines," says Tim Mayer, senior director of global Web search at Yahoo. "There are a lot of sites that are difficult to crawl, like shopping sites or those with dynamic URLs. This enables a publisher to give us a feed of all the URLs for a specific site so we can access them."

A Sitemap is simply an XML file that describes a Web site's file and directory structure. Beyond identifying fresh content for search engines, Sitemaps help save bandwidth for publishers and search engines alike by avoiding unnecessary crawls to identify new or changed files.
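In practice, a Sitemap is a single XML file listing a site's URLs, each with optional hints about freshness and crawl priority. A minimal sketch (the example.com address, dates, and values are illustrative, not taken from the article):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Root element declares the Sitemap 0.9 schema namespace -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required child: the page's full URL -->
    <loc>http://www.example.com/</loc>
    <!-- Optional hints for crawlers -->
    <lastmod>2006-11-15</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/catalog?item=42</loc>
    <lastmod>2006-11-01</lastmod>
  </url>
</urlset>
```

Because the file enumerates URLs explicitly, it can surface dynamic pages (such as the shopping-site URLs Mayer mentions) that a crawler following links might never discover.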

"The quality of your index is predicated by the quality of your sources, and Windows Live Search is happy to be working with Google and Yahoo on Sitemaps to not only help Webmasters, but also help consumers by delivering more relevant search results so they can find what they're looking for faster," said Ken Moss, general manager of Windows Live Search at Microsoft, in a statement. "I am sure this will be the first of many industry initiatives you will see us working and collaborating on."

Sitemaps augment rather than replace search engine crawls. And they do not limit the visibility of files to search engines. That's a function of the robots.txt file, which is used to prevent files from being indexed (though not all crawlers obey).
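The two files thus play opposite roles: a Sitemap tells crawlers what to fetch, while robots.txt tells them what to skip. A minimal robots.txt sketch (the directory name is illustrative):

```text
# Applies to all well-behaved crawlers
User-agent: *
# Ask crawlers not to index anything under /private/
Disallow: /private/
```

As the article notes, compliance is voluntary; not all crawlers honor these directives.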

Publishers have also been asking for a standard governing how search crawlers and other bots handle the directives in a robots.txt file, says Mayer, who indicates the large search engines are open to input on the matter.

Further information can be found at Sitemaps.org.
