Moogle1

Because Mike has too many answers and not enough questions.

Why Search and Aggregation Work Best with Small Communities

Posted by Mike Bijon January 26, 2006

In response to The Problems With 2.0, pt 345314: I’ve noticed the “getting stupid” results myself after following Memeorandum heavily. I think it’s largely due to the limited number of topics important enough to rank on their pages. There seems to be less of an effect when using Digg, but I tend to stay off the front page there, since the topics that make it there commonly seem to be of little interest to me.

As for the technical aspects of why search and aggregation work best with less content and in smaller communities:

Recent research indicates that search has the reverse of the effect commonly assumed, because it encourages a higher volume of consumption:

in spite of the rich-get-richer dynamics implicitly contained in the use of link analysis to rank search hits, the net effect of search engines on traffic appears to produce an egalitarian effect, smearing out the traffic attraction of high-degree pages. Our empirical data clearly shows a sublinear scaling relation between referral traffic from search engines and page in-degree. This seems to be in agreement with the observation that search engines lead users to visiting about 20% more pages than surfing alone

I suspect that aggregators follow the same patterns and encourage even greater consumption than the input-required world of search terms. If you feel dumber, it’s probably a result of the content itself and not the aggregator.
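To make the quoted “sublinear scaling” concrete, here’s a toy model (my own illustration, not the researchers’ code) assuming referral traffic t grows with page in-degree k as t ∝ k^α with α < 1 – popular pages still get more traffic, just far less than proportionally more:

```python
# Toy model (not the researchers' code): search-referral traffic t scaling
# sublinearly with page in-degree k, t = c * k**alpha with alpha < 1.
def referral_traffic(in_degree, c=10.0, alpha=0.8):
    """Hypothetical scaling: traffic grows more slowly than in-degree."""
    return c * in_degree ** alpha

for k in (10, 100, 1000, 10000):
    print(f"in-degree {k:>6}: traffic ~ {referral_traffic(k):,.0f}")

# Each 10x jump in in-degree buys only ~6.3x more traffic at alpha = 0.8 --
# the "egalitarian" smearing-out of traffic the quote describes.
```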

As for the mathematical aspects of aggregation and search, I agree with niblettes that the value of search and aggregators drops as the number of inputs (web pages) and the size of the community (user feedback) grow. Instead of following niblettes’ suggested attractor model, I think the decreasing value of search and aggregation is due to the nonlinearity of content contributed by a big community (like Digg’s front page), which results in unstable bifurcations and the onset of chaos. Nonlinearity may work stylistically for Tarantino, but it’s hard to follow a newspaper if you can’t tell the difference between ads, classifieds, and articles (…also why anthropologists must devote so much time to what they do).
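For the curious, the textbook toy example of what I mean by bifurcations and the onset of chaos is the logistic map – purely an analogy here, not anything Digg or any aggregator actually computes. Turn up the nonlinearity parameter and a single stable value splits into oscillations and then into noise:

```python
# Textbook logistic map x_{n+1} = r * x_n * (1 - x_n), used only as an analogy
# for how turning up nonlinearity destabilizes a system.
def settled_values(r, x=0.5, warmup=500, keep=8):
    """Iterate the map, discard transients, return the values it settles into."""
    for _ in range(warmup):
        x = r * x * (1 - x)
    seen = set()
    for _ in range(keep):
        x = r * x * (1 - x)
        seen.add(round(x, 4))
    return sorted(seen)

for r in (2.8, 3.2, 3.5, 3.9):
    print(f"r = {r}: {settled_values(r)}")

# r = 2.8 -> one stable value; 3.2 -> a 2-cycle; 3.5 -> a 4-cycle;
# 3.9 -> no repeating pattern at all (chaos).
```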

It’s Only a Bubble if the Price Goes Up – Traffic.com IPO

Posted by Mike Bijon January 20, 2006

While checking traffic for my drive home tonight, I noticed a small dialog at the bottom of the Traffic.com website. It turns out Traffic.com is going public via OpenIPO and trying to raise between $69 million and $82 million in the process. …isn’t Traffic.com built around largely free municipal traffic data? What’s all the cash for – the AJAX map interface?

Despite all the talk about whether Web 2.0 is turning into “Bubble 2.0”, there is no way to confirm it unless bubble prices start hitting the markets. Om’s Babble not Bubble 2.0 points out that startup costs, VC investments, and market exits are all low, a good sign that there isn’t a bubble. Regardless, even if VCs and M&As heat up and start throwing money around, it still won’t be a bubble until the “regular Joes” start throwing cash into the pot so the big players can cash out.

I’ll stop this speculation of my own – and just wait to see whether Traffic.com, the next guy, or the guy after that goes big.

Digg vs. Slashdot – No, It’s About the Audience

Posted by Mike Bijon January 14, 2006

Jason Kottke contrasts the “Slashdot effect” and the “Digg effect” in Digg vs. Slashdot (or, traffic vs. influence) and draws conclusions about the level of influence each site has over the webosphere (both sites are somewhat more mainstream than the blogosphere). I think the exact traffic levels delivered by each site are relatively unimportant; after all, the two have very different readers with different habits. (As a formerly active /. user through 1999-2000, I expect that a lot of Slashdot readers visit Kottke’s site, and numerous pages on it, so that they can “better” participate in the commentary in Slashdot’s post comments.)

I suspect that both the Digg and Slashdot audiences are unlikely to have Alexa’s software or toolbar installed on their PCs (Alexa was formerly a spyware company), so both sites’ real traffic is probably an order of magnitude closer to that of “generic” sites like MSN and Yahoo than Alexa shows it to be. Regardless, Digg and Slashdot represent similar types of sites, and both offer a style of news and editorial that would be difficult to accomplish on the same scale in meatspace.

Truth be told – I think the real story is how many active users Digg and Slashdot have, and how both sites handle those audiences. Most news sites that started in traditional media would be thrilled to have that many eyeballs just to flash banner ads at, while Digg and Slashdot have that many readers actively involved in their communities, and the concern is to keep delivering stories of value to that audience. There’s a lesson to be learned there, although very few websites will ever do more than just talk about it, because it’s not just about making easy money on the web.

Also worth mentioning, several commenters on Kottke’s story think Slashdot might be able to bump its mainstream readership by changing its tone, article topics, and/or layout. The editors of Slashdot, however, are probably more likely to make fun of mainstream traffic than to try to attract it. Again, that is exactly why /. can cause the Slashdot effect and CNN.com is an also-ran.

End of Net Neutrality Points to Another Bubble – I Want QoS Before I Fund It

Posted by Mike Bijon January 06, 2006

It’s not a stock-price bubble this time, but the inflated heads of network & telecom companies. Om Malik notes in “Slow Lingering Death of Net Neutrality?” that ISPs want in on the riches of all that data running across their networks. In a very carefully worded proposal, the network providers are “offering” content providers better performance for a fee. From the Wall Street Journal’s front-page article (found via Rob Hyndman’s “Still More on Network Neutrality”):

The phone companies envision a system whereby Internet companies would agree to pay a fee for their content to receive priority treatment as it moves across increasingly crowded networks. Those that don’t pay the fee would find their transactions with Internet users — for games, movies and software downloads, for example — moving across networks at the normal but comparatively slower pace. Consumers could benefit through faster access to content from companies that agree to pay the fees.
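Mechanically, the “priority treatment” being proposed boils down to something like strict-priority queuing at congested links: packets from paying content providers get dequeued first, and everyone else waits. A rough sketch of the idea (my illustration, not any carrier’s actual implementation):

```python
# Rough sketch of strict-priority queuing, the mechanism behind "priority
# treatment" for paying content providers; hypothetical, not any carrier's code.
from collections import deque

paid, unpaid = deque(), deque()

def enqueue(packet, provider_has_paid):
    (paid if provider_has_paid else unpaid).append(packet)

def dequeue():
    """Paid traffic always drains first; unpaid traffic waits out the congestion."""
    if paid:
        return paid.popleft()
    return unpaid.popleft() if unpaid else None

for packet, fee_paid in [("video-A", True), ("game-B", False), ("movie-C", True)]:
    enqueue(packet, fee_paid)

print([dequeue() for _ in range(3)])  # ['video-A', 'movie-C', 'game-B']
```

Under congestion the unpaid queue only drains once the paid queue is empty, which is exactly where the “comparatively slower pace” in the quote comes from.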

It’s absurd that the network carriers are making new promises when they have already failed to deliver on the old ones. Consumers already pay ISP subscription fees and get just 60-80% of the connection’s rated bandwidth, yet they still hear audio dropouts during VoIP calls. At the other end of the carriers’ networks, content providers do have quality-of-service (QoS) worries, but it isn’t the network carriers delivering solutions to help them. Despite all the service level agreements in the world, every major content provider is already connected to the internet via multiple carriers and uses content delivery network (CDN) providers – companies like Akamai, Digital Island, and Xcelera – to maintain its own QoS levels.

There isn’t a chance that trying to take a slice of the content delivery pie will be good business for the network and telecom carriers. It may bring in more revenue at first, but good business involves keeping your customers happy enough to want (and even demand) more product. The ISPs are having trouble doing that, yet they get greedy as they watch Google and iTunes pull in cash over their networks. If they want to be content providers, the ISPs should get into that market and see just how competitive it is. Instead, the carriers cut risk by adopting a “utility” business model with subscription services and ‘stable and predictable future revenue’. Stable and predictable earnings are low-risk, though, and don’t deliver the upside to annual earnings that execs looking to get rich on stock options want. So the carriers could start under-provisioning their networks to reduce the quality of their product – rather than making a better product, the carriers want to be “paid off” like mob intimidators.

Carriers and ISPs surely see good business as anything that juices the stock price, as their already poor service levels show, but it’s likely to bite them in the tail soon enough. I have no doubt that big ISPs will use even worse service quality to drive both content providers and subscribers to pay more for “better service”. The proposal cited above may be carefully worded, but I imagine service quality will drop unless someone pays up. The door for new network providers is opening wider and wider. Whether it’s Google turning on dark fiber, Intel building its own WiMax network, or a content provider starting direct delivery – the company that figures out “connection delivery” the way Google figured out “information delivery” stands to become a major competitor in a very short time.
