Adam Fields (weblog)

This blog is largely deprecated, but is being preserved here for historical interest. Check out my index page at for more up-to-date info. My main trade is technology strategy, process/project management, and performance optimization consulting, with a focus on enterprise and open source CMS and related technologies. More information. I write periodic long pieces here, shorter stuff goes on twitter or


On the integration of Web 2.0 apps

Filed under: — adam @ 9:48 am

Britt sent me this link lamenting the lack of interaction between Web 2.0 services:

This is an interesting and correct observation, but let’s look at an analogous situation – unix command line tools.

Unix is designed around the pipe – the ability to string long chains of commands together, each of which only does a small thing, to accomplish what you actually want to do. There are some places where this breaks down, but by and large, this method has been spectacularly successful.
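The pattern is easy to show with a toy pipeline – each tool in the chain does one small job, and the pipe composes them into something none of them does alone:

```shell
# Emit some words, sort them, collapse duplicates into a count,
# then sort again by that count, most frequent first.
printf 'apple\nbanana\napple\ncherry\napple\nbanana\n' \
  | sort | uniq -c | sort -rn
# prints: 3 apple, then 2 banana, then 1 cherry (counts first)
```

None of `printf`, `sort`, or `uniq` knows anything about word frequency; the capability emerges from the composition.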

Web2.0 apps are much better positioned to emulate this than Web1.0 apps, but they’re still not there yet.

What’s missing are the switches that enable those apps to play nice with other apps.

You’re probably familiar with ls, which lists files in a directory:

fields@server2:~$ ls /tmp
mysql-snapshot-20060621.tar.gz mysql-snapshot-20060621_master_status.txt

ls also has another mode that outputs a long listing, which includes more detailed information about the files:

fields@server2:~$ ls -l /tmp
total 841520
-rw-r--r-- 1 root root 860863512 Jun 22 19:08 mysql-snapshot-20060621.tar.gz
-rw-r--r-- 1 root root       382 Jun 22 18:50 mysql-snapshot-20060621_master_status.txt

Once you have that, you can pass the list to other programs that may want to filter it by one of those pieces of data. The default mode is useful for dealing with the files themselves, but less useful if you want to interact with their metadata. What if the -l flag were left out, and that behavior were restricted to maintain ls’s competitive advantage (in the hypothetical situation where it’s something provided by your filesystem vendor)? If the information you’re looking for isn’t returned at all, you may have no other way to get at it. Maybe you’d have to use the vendor’s lslong, which costs money. You may be just fine with that, or you may be compelled to look for a filesystem competitor that does what you want. I’d argue that ls is less useful without that ability. That’s the situation we’re looking at when a Web 2.0 API lacks core features for interacting with the data it represents.
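To make that concrete, here is the kind of composition the -l switch enables – piping the long listing into awk to filter on a metadata column. (Column positions assume the standard `ls -l` layout shown above, and the `$NF` trick assumes filenames without spaces.)

```shell
# Keep only entries whose size (column 5 of `ls -l` output)
# exceeds 1 MB, and print just their names.
ls -l /tmp | awk '$5 > 1048576 { print $NF }'
```

Without the -l switch there is no size column to filter on, and this one-liner becomes impossible – you would be stuck with whatever views `ls` chose to offer.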

Is that an acceptable tradeoff? Maybe it is for a free service. It seems less so for a service you pay for, because fundamentally, you’re paying for the ability to manage your data, not for the ability to use the particular software – that’s the whole concept behind software as a service in the first place.

This is, of course, made more complicated by the fact that Web 2.0 isn’t just about data sharing; it’s also about more dynamic interfaces. In theory, the two are interconnected: the dynamic interfaces work better because they deal in small chunks of data in more standardized formats, and the data access mechanics are decoupled from the actual interaction semantics, which should make outside non-gui access to your data easier with standard tools. In practice, that seems to rarely happen.

This is the only good rationale I’ve heard for using XML for gui/backend interchange.

These are good things to be thinking about when designing web applications. It’s not enough to think of them in a vacuum; we have to consider the implications of living in the ecosystem. It’s possible that that means opening up far more access to the underlying workings than we’re accustomed to. I would LOVE to see some applications that fully work if you take away the browser front-end, but still interact in exactly the same way via HTTP.
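As a minimal sketch of that last idea – the paths, data, and port here are made up, and Python’s built-in http.server stands in for a real backend – once the application is just resources over HTTP, the browser is merely one client among many, and curl interacts with it in exactly the same way:

```shell
# Serve a resource over plain HTTP, then fetch it from the command
# line exactly as a browser would. Hypothetical data and paths;
# python3's http.server (3.7+ for --directory) is a stand-in backend.
dir=$(mktemp -d)
printf '{"id": 42, "title": "sunset"}' > "$dir/photo42.json"
python3 -m http.server 8765 --directory "$dir" >/dev/null 2>&1 &
srv=$!
sleep 1
curl -s http://localhost:8765/photo42.json   # no browser required
kill "$srv"
```

The front-end, if present, would consume the same URLs – taking it away removes nothing.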

[Update: More on this discussion from Phil Windley.]


2 Responses to “On the integration of Web 2.0 apps”

  1. Britt Blaser Says:

    Thanks for this, Adam. So the question becomes whether or not the actual web 2.0 sites are that vital. Most of them take an obvious implementation and demonstrate a way to make it work, not too badly.

    Why not just incorporate the best capabilities of the best collaborative W2.0 apps into a master web service?

    You will recognize that as a rhetorical question.

  2. adam Says:

    I think that’s the wrong approach though – one of the great things about web2.0 is its distributed nature, and one of the great things about unix command line tools is that they generally follow the principles of “do one thing and do it well” and “if someone else does something better, then let them do it”.

    Competition is good for the ecosystem, but so is lack of user confusion, and that’s a delicate balance. If a web2.0 site is the best at what it does and it keeps improving its core functionality, it should be the repository of that function.

    Of course, this breaks down a bit when you’re charging for a service, which also costs money to run. There’s a delicate balance between contributing to the commons, which makes your service more valuable on the whole, and locking down your users to make them use your service, which probably looks like it increases retention.
