Web 2.0 in the Organization
Posted: June 17, 2008
Dave Snowden has written a very interesting post … a small one … you can read it here. The thought stuck with me: more often than not, we use things for stuff they weren't even remotely intended for.
Something we saw yesterday … I am in Bangalore right now, and I was in my room last night with a few friends, knocking off a few beers. As usual, the wall-mounted bottle opener proved inadequate for opening the beer bottles. And what did the guys open the bottles with? You wouldn't guess … a spoon. Whoever invented the spoon could never have imagined that.
On a more serious note … I was once working with a client. They were using enterprise software (read ERP), and they had come to rely on a particular feature of the application. Interestingly, it was an undocumented feature (a euphemism for a bug), and when they upgraded the software … what do you know … the “feature” went away, and they could no longer do something they had been able to do earlier. Nobody would have guessed they were actually using that.
Which is why I quite agree with Dave when he quotes …
When I worked at IBM we were asked (in 1990) to 6Sigma our CICS development team. The gurus told us that the next release of CICS could only have 6 bugs (or APARs as we called them). This was ridiculous, but luckily a colleague ran a report and showed that IBM program products had extremely strong positive correlation of profitability with APAR rate. That is, the products with the most APARs were the most profitable. This is because great products, like CICS, get used for lots of things we didn’t think of and for which we didn’t test. Mediocre products only get used for what the tests cover. Bad products don’t get used at all and so generate almost no bugs.
In all probability, you are using things in ways the people who made them never even dreamt of. And this is something I hold to even when it comes to adopting technology … especially in the web 2.0 world. More often than not, you can roll out an application to the users and find them using it in ways you never imagined in those requirements documents you had written. Which is why I believe that, especially with technology in the web 2.0 space, it would be wise to simply launch it in the organization and then wait and watch … over a period of time you would find usage emerging … new and, in all probability, innovative usage for these tools. And it is not in the interest of the knowledge managers, or the larger community, to restrict this usage.
In other words … usage, and hence benefits, tend to be emergent rather than pre-defined when it comes to collaboration or social software. This is a challenge especially for the ROI school of thought, because this very phenomenon makes it quite difficult to actually measure something like this. Remember … the ROI of the spoon? While this is something we are all grappling with, the other side of the coin is also quite relevant: how does management decide whether to invest, unless they can see benefits? Having said this, there is also the viewpoint that, whether you like it or not, social computing is here to stay … whether within or outside the firewall. It is more beneficial to adopt it and see benefits as they emerge. The only thing is, most managers are not comfortable with the idea of something emerging over a period of time. What we don't realize is that most technologies do actually emerge. The internet wasn't invented … sure, the technology was, but the usage … that's something that emerged over a period of time. The same is true of web 2.0, too.
Emergent technology means looking at how people use it over a period of time, and then at how you would like to guide it into the business processes of the organization. Which, in my opinion, will happen sooner or later … something I have written about before.