(Via OLDaily)
Good article that points to a problem with RSS and makes a pitch for some sanity. The problem is this: “If your feed works, if you are successful in attracting subscriptions on a global scale, if you do it right, you are doomed [because] everyone subscribes to a small file on your site. The critical word there is ‘everyone’.” Now of course, content syndication doesn’t work if nobody syndicates the content! The way RSS is supposed to work is that a website is accessed by a small number of harvesters; these harvesters, in turn, feed readers who have a specialized interest in a topic. But if people subscribe to individual feeds, rather than aggregated (and syndicated) feeds, then “it is like having a permanent listing on the front page of SlashDot.” Such a situation should never happen. You should not need to hit an individual feed once an hour. “In fact, you may not need my feed at all if this aggregator buddy’s feed has collected my posts with other opensource hippie sites and can provide you with a composite feed where the news is hourly different instead of my lazy two-days-maybe publishing cycle.” Via Seb (who had a good day today). By Gary Lawrence Murphy, Teledyn, November 24, 2003
One thing to add to this interesting and scary(?) scenario. Mark Fletcher at Bloglines comments that server-based aggregators hit sites only once no matter how many subscribers there are. That obviously keeps bandwidth use down. More growing pains…
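Roughly, that server-based model is fetch-once, serve-many: the aggregator polls the origin feed on its own schedule and hands its cached copy to every subscriber. A toy sketch of the idea, where the feed URL and the one-hour interval are assumptions, not anything Bloglines has published:

```python
# A minimal sketch of the fetch-once, serve-many pattern Fletcher
# describes. The feed URL and the one-hour interval are assumptions.
import time
import urllib.request

class FeedCache:
    def __init__(self, url, ttl=3600):
        self.url, self.ttl = url, ttl
        self.body, self.fetched_at = None, 0.0

    def get(self):
        # Hit the origin site at most once per ttl seconds,
        # no matter how many subscribers call get().
        if self.body is None or time.time() - self.fetched_at > self.ttl:
            with urllib.request.urlopen(self.url) as resp:
                self.body = resp.read()
            self.fetched_at = time.time()
        return self.body

cache = FeedCache("http://example.com/index.xml")
# Ten thousand subscribers still cost the origin server one request an hour.
for _ in range(10_000):
    feed = cache.get()
```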
There seems to be no better way to get a link on Weblogg-Ed than to proclaim the end or death of something (I’ve demonstrated this myself with my “Manila is dying” line).
Anyhow, the article hits on real issues (and solutions), but it is totally overwrought because the author is stressed about having to pay more for bandwidth.
The biggest short-term fix, as he mentions, would be to get RSS aggregators to respect the existing protocols and not download the whole feed when nothing has changed.
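For what it’s worth, the protocol he means is plain HTTP conditional GET: the aggregator sends If-Modified-Since and If-None-Match headers, and an unchanged feed costs a tiny 304 response instead of the full file. A minimal sketch of the mechanism, not any particular newsreader’s code, with a made-up feed URL:

```python
# Hedged sketch of HTTP conditional GET for feed polling.
# The URL is hypothetical; everything else is standard library.
import urllib.error
import urllib.request

def fetch_if_changed(url, last_modified=None, etag=None):
    """Return (body, last_modified, etag); body is None if unchanged."""
    req = urllib.request.Request(url)
    if last_modified:
        req.add_header("If-Modified-Since", last_modified)
    if etag:
        req.add_header("If-None-Match", etag)
    try:
        with urllib.request.urlopen(req) as resp:
            # 200: the feed changed; remember the validators for next time.
            return (resp.read(),
                    resp.headers.get("Last-Modified"),
                    resp.headers.get("ETag"))
    except urllib.error.HTTPError as err:
        if err.code == 304:
            # 304 Not Modified: a status line and headers instead of
            # the whole feed.
            return None, last_modified, etag
        raise

body, lm, et = fetch_if_changed("http://example.com/index.xml")
# On the next poll, pass lm and et back in; a well-behaved aggregator
# downloads the full feed only when it has actually changed.
```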
I like to keep track of trends…;0)
I’m mystified.
There is a real problem with broken newsreaders not doing the right thing.
However, most of his complaint seems to be misplaced. He’s got a popular site; RSS, even with the broken newsreaders, is still reducing the amount of traffic. If thousands of readers hitting his RSS feed 10 times a day breaks his bandwidth limit, how much worse would it be if those same thousands loaded his whole site once every day or two?
Just the text of my index file, without any of the graphics at all, is 120K, while the index.xml file is less than 8K. Complaining that users accessing the RSS feed are a drain on resources just doesn’t make sense to me.
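To put rough numbers on that comparison (the subscriber count and polling rate here are guesses, just for scale; the file sizes are the ones above):

```python
# Back-of-envelope arithmetic for the comparison above. The reader
# count and polls-per-day are assumptions; the sizes come from the
# comment (index text ~120K, index.xml ~8K).
readers = 1000
rss_polls_per_day = 10            # badly behaved newsreaders
rss_bytes = 8 * 1024              # index.xml
page_bytes = 120 * 1024           # index page text alone, no graphics

rss_traffic = readers * rss_polls_per_day * rss_bytes   # ~78 MB/day
page_traffic = readers * 1 * page_bytes                 # ~117 MB/day
print(round(rss_traffic / 2**20), round(page_traffic / 2**20))
```

Even with newsreaders polling ten times a day, the feed still moves less data than a single daily page load per reader would.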