CNet today sums up the current debate over the future of RSS. Robert Scoble at Microsoft set off a firestorm when he claimed that RSS is broken because of the bandwidth spikes caused by readers everywhere polling on the hour.
Dave Winer, defender and champion of the protocol, quickly jumped in to say that bandwidth was a problem for Microsoft because they had aggregated all their MSDN blogs into one gigantic feed. With new content almost always present, the mega-feed became doubly attractive: (a) everyone subscribed to it as a one-stop shop, and (b) everyone set their RSS readers to pull it more frequently in order to stay up to date.
Large file + pulled by many people around the world + with great frequency = expensive bandwidth bill
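That equation is easy to put numbers to. Here is a back-of-envelope sketch; the feed size, subscriber count, and polling rate are all assumptions of mine, not MSDN’s actual figures:

```python
# Back-of-envelope bandwidth estimate for a polled mega-feed.
# All numbers are illustrative assumptions, not real MSDN figures.

FEED_SIZE_KB = 250     # assumed size of the aggregated feed
SUBSCRIBERS = 50_000   # assumed subscriber count
POLLS_PER_DAY = 24     # every reader polling on the hour

daily_gb = FEED_SIZE_KB * SUBSCRIBERS * POLLS_PER_DAY / (1024 * 1024)
monthly_gb = daily_gb * 30

print(f"~{daily_gb:.0f} GB/day, ~{monthly_gb:.0f} GB/month")
```

Even with these modest guesses you land in the hundreds of gigabytes per day, which is where the expensive bandwidth bill comes from.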
The debate is still ongoing, but I think we’re always going to see the tendency to aggregate feeds into collections. As much as Winer rails against it, the human tendency to aggregate for convenience and knowledge sharing, coupled with the equally human tendency to be lazy and go for the pre-packaged, is going to win out over the ideal, which is for everyone to roll their own.
Despite it being cheaper and better for you, no one catches their own fish anymore.
I read somewhere that Scoble reads something like 1,000 feeds a day. It makes me think that RSS has now gotten to the point where we’re downloading and caching whole segments of the web for individual perusal, which seems grossly inefficient. Perhaps it’s time to push it back out again: layer an RSS search engine on top of the “living web” of blogs and their feeds, and layer a semantic filter on top of that engine which feeds you only what is interesting and relevant. Ah yes, please define a static filter of what is interesting and relevant. . .
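The trouble with a static filter is easy to see in a sketch. Here is the most naive possible version of that filtering layer, a fixed keyword list over feed items; the item shape, the keywords, and the matching rule are all hypothetical:

```python
# Minimal sketch of a static "interest filter" over feed items.
# The item structure, keywords, and matching rule are all hypothetical.

INTERESTS = {"rss", "syndication", "bandwidth"}  # assumed static user profile

def is_relevant(item: dict) -> bool:
    """Keep an item if its title or summary mentions any interest keyword."""
    text = (item.get("title", "") + " " + item.get("summary", "")).lower()
    return any(keyword in text for keyword in INTERESTS)

items = [
    {"title": "RSS bandwidth woes", "summary": "Polling on the hour adds up."},
    {"title": "Cooking with cast iron", "summary": "A weekend project."},
]

relevant = [item["title"] for item in items if is_relevant(item)]
print(relevant)  # only the feed item mentioning RSS survives the filter
```

A fixed keyword set like this misses anything phrased differently and never learns as your interests drift, which is exactly why “define a static filter” is the hard part.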