I’ve been thinking a lot lately about the possible ways of rating, ranking or scoring blogs.
As I mentioned when speaking to Mayo Clinic librarians a few weeks ago, I grow increasingly annoyed with the way Technorati uses the word “Authority.”
It annoys me because although the Technorati rankings are interesting and useful, they do not actually measure authority (as libraryfolk understand the term). As the Technorati FAQ explains:
Technorati Authority is the number of blogs linking to a website in the last six months. The higher the number, the more Technorati Authority the blog has.
So this score doesn’t indicate authority, quality or trustworthiness. At best, it could be said to measure popularity. (It could also be argued that the Technorati rank is to blogs what Impact Factor is to journals…but I’m not going to make that argument here.)
Other ranking sites do something similar, ranking blogs based on metrics that measure popularity.
Google PageRank (0 to 10) – Google PageRank is a link analysis algorithm that interprets Web links and assigns a numerical weighting (0 to 10) to each site. High-quality sites receive a higher PageRank.
Bloglines Subscribers (1 to 20) – Bloglines displays the number of subscribers for each blog’s feed. Each blog is assigned a Bloglines value from 1 to 20 based on subscriber ranges – e.g., more than 20, more than 30, etc. The more subscribers, the higher the Bloglines value.
Technorati Authority Ranking (1 to 30) – Technorati’s authority ranking shows the number of unique blogs that have linked to a particular blog over the past six months. The more link sources referencing your blog, the higher the Technorati ranking. Similar to a blog’s Bloglines value, a blog’s Technorati value is determined based on ranges (e.g., top 10,000, top 20,000, etc.), and each range is assigned a number (1 to 30) that is part of the algorithm.
eDrugSearch.com Points (1 to 10) – To add a little spice to the Healthcare 100 rankings, we add our own subjective ranking of each blog based on the quality of content and frequency of updates.
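The range-based scoring described above (a raw count mapped to a 1–20 or 1–30 value by seeing which bracket it falls into) can be sketched in a few lines. This is a hypothetical illustration only: the thresholds below are invented, since neither Bloglines nor eDrugSearch.com publishes its exact cutoffs.

```python
# Hypothetical sketch of range-based bucketing: a raw subscriber count
# is converted to a score by counting how many cutoffs it exceeds.
# The thresholds are invented for illustration.

def bucket_score(value, thresholds):
    """Return 1 plus the number of thresholds the value exceeds."""
    score = 1
    for t in thresholds:
        if value > t:
            score += 1
    return score

# Invented cutoffs; a real 1-20 scale would need 19 of them.
BLOGLINES_THRESHOLDS = [20, 30, 50, 100, 200]

print(bucket_score(25, BLOGLINES_THRESHOLDS))   # 2: exceeds only the first cutoff
print(bucket_score(500, BLOGLINES_THRESHOLDS))  # 6: exceeds all five cutoffs
```

Note that bucketing like this throws away information: a blog with 201 subscribers and one with 20,000 land in the same bracket, which is one reason these scores track popularity only roughly.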
So with the exception of the eDrugSearch.com Points, these measures are generally objective and, like Technorati, measure only popularity.
So what’s wrong with attempting to measure blog popularity? Nothing at all; I just want us to remember that popularity does not equal quality. As an illustration of this, I’d like to point out that the New Kids on the Block had TWO #1 records in both the U.S. and the U.K.
Of course, the newest attempt to rank medical blogs is at MedGadget.nl, and this has gotten a good bit of attention from medical bloggers. Its rankings are based on:
- the number of published articles
- the number of comments
- the mean Google pagerank of the homepage
- the mean Technorati rank, inblog, and inlink numbers
- the mean number of FeedBurner subscriptions (the circulation) and the mean FeedBurner hits
In my opinion, this list is more accurate and comprehensive than that of eDrugSearch. And the reason I think so is not Scienceroll’s 22nd rank, but the objective parameters. Let us know if you know more lists about medical blogs.
I can’t agree with Berci. First, because the word “accuracy” has absolutely no place here. Neither list is “accurate” in the sense that neither list is “salty” or “orange”. Second, because I see these other problems with the metrics that MedGadget.nl uses:
- Counting the number of published articles says nothing about quality OR popularity. It might say something about how long the blog has been around, or about how prolific the blog’s author is. It also might indicate a whole ton of lousy content. Any way you look at it, this isn’t a reliably useful metric.
- Counting the number of comments says nothing about quality or popularity. If paired with the number of page views as a ratio, it might say something about how engaged readers are and how willing they are to share their thoughts. Without that ratio, it might (again) say something about how long the blog has been around or how prolific the blog’s author is. It also might indicate a whole ton of comment spam. Any way you look at it, this isn’t a reliably useful metric.
- To use the mean number of subscribers via FeedBurner as a metric is clearly a very bad idea for the simple reason that many blogs don’t use FeedBurner.
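The comments-to-page-views ratio suggested above is easy to make concrete. A minimal sketch, with entirely invented figures (no blog publishes both numbers in one place):

```python
# Hypothetical sketch of the comments-to-page-views ratio: raw comment
# counts mean little on their own, but comments per 1,000 page views is
# at least a rough engagement signal. All figures are invented.

def engagement_ratio(comments, page_views):
    """Comments per 1,000 page views; 0.0 for a blog with no views."""
    if page_views == 0:
        return 0.0
    return 1000 * comments / page_views

# Two invented blogs with the same comment count but different reach:
print(engagement_ratio(120, 40000))   # 3.0 comments per 1,000 views
print(engagement_ratio(120, 400000))  # 0.3 comments per 1,000 views
```

The same 120 comments look very different once reach is factored in, which is the point: without the denominator, a comment count is just another popularity proxy.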
Technorati and the Healthcare 100 are imperfect, but the rankings at MedGadget.nl are significantly less meaningful.
So how do you find out what the best blogs are? Heck, check out the most popular ones written on topics that interest you. They’re probably popular because they have some sort of broad appeal that you might also experience. More importantly, read and read critically. Take seriously the reading recommendations of the blog authors you most enjoy/trust. Read the “about” page of each blog carefully to check the author’s credentials and experience. (Hint: A blog without a detailed “About” page is like a “fact” without a citation.)
What do you think?
Do you have a favorite rating system for blogs? How do you go about evaluating a medical blog (or a biblioblog)?