News Websites Seek More Search Control
An AP newswire article, via The New York Times, reports that:
Leading news organizations and other publishers have proposed changing the rules that tell search engines what they can and can't collect when scouring the Web, saying the revisions would give site owners greater control over their content.
More here.
Google Inc., Yahoo Inc. and other top search companies now voluntarily respect a Web site's wishes as stated in a document known as "robots.txt," which a search engine's indexing software, called a crawler, knows to look for on a site.
Under the existing 13-year-old technology, a site can block indexing of individual Web pages, specific directories or the entire site. Some search engines have added their own commands to the rules, but they're not universally observed.
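To make the mechanism concrete, here is a minimal sketch of how a crawler checks a site's robots.txt rules, using Python's standard-library `urllib.robotparser`. The sample file and URLs (`example.com`, the `BadBot` agent) are invented for illustration and are not from the article:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt illustrating the kinds of rules described above:
# blocking an individual page, a directory, or (for one crawler) the whole site.
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /drafts/page.html

User-agent: BadBot
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)

# Any crawler may fetch ordinary pages...
print(parser.can_fetch("*", "http://example.com/news/story.html"))    # True
# ...but not the blocked directory or page.
print(parser.can_fetch("*", "http://example.com/private/memo.html"))  # False
print(parser.can_fetch("*", "http://example.com/drafts/page.html"))   # False
# The hypothetical "BadBot" crawler is barred from the entire site.
print(parser.can_fetch("BadBot", "http://example.com/news/story.html"))  # False
```

Note that, as the article says, compliance is voluntary: nothing in the protocol enforces these rules; well-behaved crawlers simply choose to consult the file before fetching.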
The Automated Content Access Protocol proposal, unveiled Thursday by a consortium of publishers at the global headquarters of The Associated Press, seeks to have those extra commands -- and more -- apply across the board.
1 Comment:
I think it's up to website owners or blog owners what information they want to offer. Of course, effing and blinding should be excluded.