The article is extremely detailed and involves quite a few mathematical formulas and tables of examples. The results of the analysis aren't very encouraging, however; here's a quote:
What's really stunning about the above table is the stark realization that in supporting numbers of users comparable to Napster, Gnutella would generate more than an unbelievably significant 800MB worth of data for just one of those users to search the entire network for "grateful dead live" and receive responses.
Here's one more for good measure:
From the charts above, it becomes mind-numbingly clear that the Gnutella distributed architecture is fundamentally flawed and can have a horrific impact on any network. On a slow day, a GnutellaNet would have to move 2.4 gigabytes per second in order to support numbers of users comparable to Napster. On a heavy day, 8 gigabytes per second.
This does not sound at all good for those using the increasingly popular argument of 'It doesn't matter if Napster dies, I'll just use Gnutella. It doesn't have any servers, so they can't shut it down.' Keep in mind that the above figures don't actually take any file transfers into account; all that data is just for the search traffic.
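The core of the problem is that Gnutella searches are broadcast: every node relays each query to all of its other neighbors until the message's TTL runs out, so the number of copies of a single query grows geometrically with each hop. Here is a minimal back-of-envelope sketch of that amplification, using the protocol's commonly cited defaults of a TTL of 7 and 4 connections per node; these parameters are assumptions for illustration, not figures taken from Ritter's article.

```python
def messages_per_query(ttl=7, degree=4):
    """Estimate how many query messages one search generates.

    The searcher sends the query to each of its `degree` neighbors;
    every node that receives it relays a copy to its other
    `degree - 1` neighbors until the TTL is exhausted. This assumes
    an idealized tree (no duplicate deliveries between nodes), so
    it is a lower bound on real network traffic.
    """
    total = 0
    senders = 1        # hop 0: just the node doing the search
    fanout = degree    # the searcher uses all of its connections
    for _ in range(ttl):
        sent = senders * fanout
        total += sent
        senders = sent
        fanout = degree - 1  # relays skip the link the query came in on
    return total

print(messages_per_query())  # 4 + 12 + 36 + ... over 7 hops = 4372
```

With those defaults a single search puts over four thousand messages on the wire before a single response comes back, and every response has to be relayed hop by hop along the reverse path. Multiply that by the query rate of a Napster-sized user base and the multi-gigabyte-per-second figures in the article start to look plausible.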
So after reading this, what's everyone's opinion on the usefulness of Gnutella? Is the program going to blow up without any help from the RIAA, or are there problems with Mr. Ritter's math and/or assumptions? Comment away.