Is Search Broken?

by Dave on March 5, 2007

in Search

Tom Foremski over at Silicon Valley Watcher points out the things that annoy him about search:

- Many publishers try to make sure their headlines catch the attention of the search engines rather than the attention of readers. The same is true for content: editors increasingly optimize it for the search engines rather than for the readers.

- Why should I have to tag my content, and tag it according to the specific formats that Technorati and other search engines recommend? Aren’t they supposed to do that?

- Google relies on a tremendous amount of user-helped search. Websites are encouraged to create site maps and leave the XML file on their server so that the GOOGbot can find its way around.

- The search engines ask web site owners to mask off parts of their sites that are not relevant, such as the comment sections, with nofollow and noindex tags.

- Web sites are encouraged to upload their content into the Google Base database. Nice: Google doesn’t even need to send out a robot to index the site.

- Every time I publish something, I send out notification “pings” to dozens of search engines and aggregators. Again, they don’t have to send out their robots to check if there is new content.

- Google asks users to create collections of sites within specific topics so that other users can use them to find specific types of information.

- The popularity of blogs is partly based on the fact that they gather lots of relevant links around a particular subject. Blogs are clear examples of people-powered search services.
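The site-map and masking points above can be made concrete. As a rough sketch, a minimal sitemap following the public sitemaps.org protocol (the URL and date here are made up for illustration) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page the crawler should know about -->
  <url>
    <loc>http://example.com/</loc>
    <lastmod>2007-03-05</lastmod>
  </url>
</urlset>
```

The masking is even simpler: a page-level `<meta name="robots" content="noindex, nofollow">` tag, or `rel="nofollow"` on individual links, tells the crawler to skip that content. In both cases the site owner is doing work on the search engine’s behalf.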
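The notification “pings” in the list above generally follow the weblogUpdates.ping XML-RPC convention popularised by Weblogs.com. As a hedged illustration (the blog name and URL are invented, and this only builds the payload rather than actually sending it to anyone), Python’s standard library can show roughly what such a ping looks like on the wire:

```python
import xmlrpc.client

# Build the XML-RPC request body a blog tool would POST to a ping
# service. The method name follows the weblogUpdates.ping convention;
# the blog name and URL below are illustrative only.
payload = xmlrpc.client.dumps(
    ("My Blog", "http://example.com/"),
    methodname="weblogUpdates.ping",
)
print(payload)
```

A blogging tool typically fires one of these at dozens of services on every publish, which is exactly the user-powered indexing Foremski is describing.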

It’s my view that web search has come as far as it can based on algorithms and sheer grunt alone. There needs to be a human element in terms of whether or not a result is actually a) relevant and b) useful to the searcher.

This is the thinking behind the Search Wikia project, which Jimmy Wales of Wikipedia and Wikia is running. I wrote a little about this on my personal blog here and here.

It’s also why I am working on a human-generated ‘search engine’. The aim is for people to submit links they have found useful, tag and categorise them, and allow others to vote on how useful they are. This database of links will then be searchable, producing fewer results, but ones which have been recommended by others. I think it is going to be really useful, but it will need the commitment of other people to make it work.
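As a sketch of how such a service might work under the hood (the class, method names and URLs here are purely illustrative assumptions, not anything from the actual project), a vote-ranked, tag-searchable link store could look like:

```python
class LinkIndex:
    """Toy model: people submit links with tags, others vote,
    and search returns matching links ordered by votes."""

    def __init__(self):
        # url -> {"tags": set of lowercase tags, "votes": count}
        self.links = {}

    def submit(self, url, tags):
        entry = self.links.setdefault(url, {"tags": set(), "votes": 0})
        entry["tags"].update(t.lower() for t in tags)

    def vote(self, url):
        if url in self.links:
            self.links[url]["votes"] += 1

    def search(self, tag):
        tag = tag.lower()
        hits = [(url, data["votes"])
                for url, data in self.links.items()
                if tag in data["tags"]]
        # Highest-voted matches first
        return [url for url, votes in sorted(hits, key=lambda h: -h[1])]

index = LinkIndex()
index.submit("http://example.com/a", ["search", "wiki"])
index.submit("http://example.com/b", ["search"])
index.vote("http://example.com/b")
index.vote("http://example.com/b")
results = index.search("search")
print(results)
```

The point of the design is that ranking comes entirely from human votes rather than an algorithm, which is exactly the trade-off described above: fewer results, but recommended ones.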

Watch this space.



2 comments

1 Jason Ryan March 5, 2007 at 9:10 pm

While I don’t agree with all of Tom’s points, I can see how search is critical to the delivery of government information and services. I think we have to take some of the responsibility here and ensure that the content we publish is both findable and reusable.

Microformats are one way of delivering content that empowers the users (allowing them to reuse information that they have effectively paid to have created) and is also optimised for search. It means that the govt namespace becomes a usable database, rather than a sea of brochureware.

Search isn’t necessarily broken, but our approach to it (at least in most jurisdictions) is not exactly ideal…

2 Dave Briggs March 5, 2007 at 10:15 pm

Thanks for your comments, Jason. Microformats are indeed interesting and something I really want to get to know more about. I wrote a little bit about them on my personal blog, but must get my head around it properly!
