September 06, 2005

Web 2.02 (bottom up)

Peter Merholz has a response up to my Web 2.0 piece. Peter is one of the sharpest commentators and observers of the internet around, and, par for the course, his response is well worth reading. He brings up two points that need some clarification, so as Slick Rick would say, here we go:

First off, Peter is entirely right that the early web was not a place where "anyone" could build. Not in the least. But I was also pretty careful not to make that claim, so I'm not sure where the "anyone" Peter puts in quotes comes from.

The distinction between an amateur and a professional is like that between the ocean and the land: very clear most of the time, but almost impossible to pin down at the border regions. And when you talk about amateur and professional skills, rather than amateurs and professionals as people, the distinction becomes almost impossible to sort out. But in the end I think it's pretty damn clear that it takes a lot more skill to roll your own up-to-general-standards website now than it did 10 years ago. You just can't learn how to make Ajax sites or database-driven sites or write good CSS the way you used to be able to learn HTML or Flash 3. Precisely demarcating that difference is near impossible, but it's pretty clear that it exists. The somewhat arbitrary and perhaps a bit silly distinction I used, of a skill that a reasonably intelligent and motivated person could learn in a weekend, was there precisely to make it clear that the amateur web was not one that "anyone" could get onto, but was one that took a very different sort of learning process than what exists today to become a creator.

Peter's second point is well taken, and I'm afraid I'm a bit at fault. I never meant to imply in any way that Peter was intentionally arguing that companies should hand control over to his company. However, whether he likes it or not, I do believe that is part of what he in effect ended up arguing. I mean, the article appeared on the website of the company he founded, a company in the business of selling web consulting. And when he says companies should relinquish control, he's not saying they should have a gang of monkeys generate their websites, or hire 14-year-old "script kiddies" to write their code, or turn their whole ecommerce site into a wiki. Relinquishing control is not something you can just do, like it's a Nike ad. Rather, in order to do it, you need to make sure you do it right. And if you want to do it right, hiring Peter and Adaptive Path is probably one of the smartest things you could do. They are among the very best; I have a strange feeling they'll do a much better job figuring out how to relinquish control than you, or most other companies, could do on your own.

There is a reason for this: "relinquishing control" is hard, really hard. And not just psychologically; there are an awful lot of ways you can do it wrong. There is a reason why Amazon lets you add comments to book pages, but not edit the author and title of the book or use the page as a private bulletin board, and it's not because they hate their customers. Flickr lets you upload photos, but not mp3s or Java applets. eBay lets you sell your items online but requires you to register with them. None of these businesses would work if they just let customers do anything and everything. They aren't in the lose-control business, they are in the business of facilitating the flow of information. And not just any information, but specific information, quality information, information relevant to their particular focuses.

When Amazon opened up its pages to comments they radically increased the amount of information available about each book purchase. In the process they relinquished some control over to their customers, in a rather controlled manner of course. Flickr gives their users control over their own photos online, but the numerous interface innovations that in part drive their success stem from controlling exactly what type of files the users can post. By narrowing the channel of information down to a couple of similar image file types, the Flickr team was able to open up a whole array of ways in which that particular type of information could flow.
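The Flickr pattern above, relinquishing control over what users post while tightly controlling the channel it flows through, can be sketched in a few lines. This is a hypothetical illustration, not Flickr's actual code, and the list of accepted formats is my assumption:

```python
# Controlled openness, sketched: users may upload anything they like,
# so long as it fits through a deliberately narrow channel of file types.
# (Hypothetical illustration; not Flickr's actual implementation.)

ALLOWED_TYPES = {".jpg", ".jpeg", ".gif", ".png"}

def accept_upload(filename: str) -> bool:
    """Open to any photo, closed to mp3s, applets and everything else."""
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    return ext in ALLOWED_TYPES
```

Everything downstream of a filter like this gets to assume one narrow kind of input, which is exactly what makes the interface innovations buildable.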

It is important to understand that openness and control do not necessarily need to be in conflict. They are not paradoxical at all, but in fact often work together integrally. It is only in very localized circumstances, for instance in the specific decision whether to have an API to a system or not, that the two enter into a dialectical relationship. Most of the time the two coexist quite easily, often complementing each other, and sometimes quite essential to each other's operation. For instance, the distributed network that is the entire internet would be close to useless without the centralized DNS system, which dictates the address on the network of practically every publicly accessible object on the internet.

My favorite example is still Brian Eno's Music for Airports. On this record Eno set out to create a generative system for music, a way to create music without the rigid control prescribed by the western (and most other) music traditions. But to give up control completely is to give up on being music at all. Even John Cage, whose 4'33" opens the entire piece up to the audience to create, relies upon the piece being performed in a controlled environment. Outside of the performance hall, absent a performer on stage to provide a focal point, the piece is no longer music, is no longer recognizable. Eno went far beyond this; he carefully curated the sounds going into his record. He went through an elaborate and convoluted process to create long loops of sound out of rhythmic sync with each other. He hijacked the entire studio space to make the mechanics possible. He gave up control over certain key elements of the piece, the time when any given sound would play, and opened up a vast potential space for variation in the piece, but in order to achieve that liberation, he needed to control most of the process.

From a creator's point of view it might be helpful to think not of control, but of self-discipline. Mike Migurski has a piece doing exactly that, and it's quite insightful. As a creator, in order to achieve the freedom to create what you envision in your head you need to achieve a certain mastery of your discipline. Only once you have achieved a certain control over your tools are you free to create what you want. Translated into a networked environment this transforms into a slightly different discipline. Suddenly the tools are shared; in order for information to flow from site to site, system to system, a shared discipline must be developed and maintained. This discipline then becomes both a potential means to achieve freedom and a potential means for control to be implemented.

I wrote most of the above last week, before Katrina and its aftereffects disrupted all thought patterns. Since then the Web 2.0 conversation has advanced a bit, most notably with danah boyd's "Why Web2.0 Matters: Preparing for Glocalization". I have a feeling there is plenty more to come too. But overall I have the feeling there isn't really much disagreement. Expansion, yes; Web 2.0 is a pretty amorphous thing, but there is something there and everyone wants to put their finger on it. But perhaps the real answer is that old standby, "all of the above". Or perhaps not, I'm looking forward to what comes next...

Posted by William Blaze at 12:56 PM | Comments (0) | TrackBack

September 04, 2005

Anarchy, New Orleans Edition (bottom up)

The first warning sign I caught was in the midst of the hurricane build-up. I can't remember where, but buried in some article was a line about long lines to get into the Superdome, the shelter of 'last resort'. Long lines because security at the door was searching everyone for drugs and guns.

The storm of the century was blasting towards New Orleans and the police were busy searching people for drugs and guns. Something was ajar; the record skipped a groove. The impact wasn't in yet, the storm had not landed; this was supposed to be a story about a natural disaster and the human response. Where the hell did the drugs and guns, the search and seizure, come into the picture?

Welcome to New Orleans.

Beneath the jazz history, oil flows and 24-hour drinking establishments is a city of deeply entrenched poverty, distrust and inequality. It's a city where a quarter of the population lives in poverty. A city where a largely white police force plays enforcer to a population that is 70% black. As liberated as the city may seem to a drinker, it's never escaped the shadows of slavery and the equally insidious but far more subtle structures of racism that followed. As in much of the south, the Civil War never quite ended in New Orleans. Beneath the Mardi Gras facade of the city is a perpetual tension, a poverty that goes beyond economics, a poverty of communication, a poverty of politics, a poverty of trust.

The destruction of New Orleans began long before the hurricane hit. The looting, chaos and armed gangs began long before the levees broke. You could read it in the paper as Katrina approached: a storm is coming, and what are the police doing? What they always are doing, searching the population, imposing their will. The city is being evacuated, but the police and general population can never work together in this city; the divides are so deep that they stand up strong and violent even as the levees fall.

In the intensely disturbing days that followed, which as I write this still appear to continue, two news items hit even harder, even nastier, than the rest. One was the stories of New Orleans police turning in their badges; their ties to the community had been severed by the waters, and they no longer cared for the city they had sworn to serve and protect. Nothing could be a stronger indictment of just what a wounded community existed in New Orleans, of just how much the police force was there to protect property, not serve the people of the city. Perhaps even more shocking, and nearly entirely blocked from the news, is the fact that troops (Louisiana National Guard?) were blocking the bridge out of the city, preventing thousands from walking out of the disaster zone and the Red Cross from coming in. New Orleans had been turned into a prison, a war zone, an area not to be helped, but to be contained. If these reports turn out to be true, and so far the only source I've found is, of all places, Fox News' Shepard Smith, then the story evolves from disaster into one of crimes against humanity. And I suspect it's damn true; I was wondering just why no one was walking out long before that report, and Nola.com was filled with reports of people being denied entry to rescue people at confirmed locations.

What this all builds up to goes beyond just the racism, repression and persistent low-level class warfare at work and into anarchy. Anarchy is a funny word; the mainstream news was full of it for the past few days. Anarchy as chaos, loss of control, the inmates running the prison while the lights stayed out. Anarchists, however, have quite a different definition of anarchy, and, completely out of step with their philosophy, are rather insistent that others use their definition despite the fact that a vast majority of people use a quite different one.

My friend tobias c. van Veen provides a good example. In his otherwise spot-on essay "A Black Rainbow Over Downtown New Orleans", he makes the claim that no, New Orleans is not in a state of anarchy, but rather "the rupture of the facade of global capital". Which is all probably true if one follows one of the rigid definitions of anarchy favored by practitioners, but utterly incomprehensible to those of us who are still aware of the word in its common usage. New Orleans was in a state of anarchy after the disaster, a state where the law was absent, a non-force, a state of chaos.

What's really interesting to me though is that neither definition of anarchy, the anarchists' own definition or the common, more frenzied one, needs to contradict the other. In fact both anarchies are easily contained within one definition, and both are in reality potential states of one concept, potential states of anarchism.

Anarchy is the social state free of political authority, and New Orleans in the days after Katrina hit is a clear example of what can happen in such circumstances. That "can" is essential though; it does not mean that is what will always happen, and in fact there are plenty of examples quite to the contrary. New York after 9-11 is the one that immediately springs to mind, but perhaps Chalmette, Louisiana is even better, a small town seven miles east of New Orleans where Katrina tied the community together rather than dividing it.

Anarchy is by its very nature an emergent system. What emerges does not necessarily need to be intelligent or organized, but since there is no direct centralizing force, whatever group behavior exists must be emergent in some manner.* But just how anarchy emerges is not predetermined in any manner, and in fact there are a variety of potential states that it might take. What state anarchy enters into is largely determined by the environment, culture and forms of energy circulating within the anarchistic space.

In New Orleans a culture of distrust and borderline warfare was long present in the environment. Poverty, racism and drugs were part of day-to-day life. As nearly all the white people, along with the black middle class and elite, fled New Orleans, what remained was largely two groups: the helpless and the deeply repressed. Free of the persistent police presence, hungry, lacking water, plumbing and electricity, anarchy emerged. Some of the anarchy was people breaking into stores for food and water. Some was people breaking in to obtain those material goods they could never obtain in the political and economic climate that was New Orleans. And some of it was just plain people breaking. Pains and pressures snapping into the form of rapes, beatings and bullets directed at the police.

It was all there and apparent as the hurricane approached. The police officers slowly and intently searching every person entering the Superdome seeking shelter clearly illustrated the failure of this community and the vicious environment constructed to keep it that way. This was a community already at war, a long drawn out police action of a war. A community without trust. These are the forces that directed the emergence of anarchy. The forces that pushed the anarchy towards its violent emergence, its most tragic form.

Anarchists, except perhaps a few lunatics, want no part of this sort of anarchy, and in fact will go to great measures to redefine anarchy to exclude these realities. But in fact the anarchies of the anarchists are merely other potential states of the exact same anarchy that New Orleans produced. Far more positive potential states, and ones that could be glimpsed in places like Chalmette during this disaster. There, residents ignored by authorities for six days distributed food via boat, did their own rescuing and created their own shelter. Just as in New Orleans it was anarchy, the absence of political control; the parish officials had fled. But a very different state of anarchy, guided by an environment not nearly as oppressive as New Orleans.

Just who is responsible for the various police actions around New Orleans is still pretty unclear, but it's becoming evident that the various government agencies at work went out of their way to ensure the anarchy of New Orleans would be pushed towards a negative, not positive, state. The searches at the Superdome were just the prelude. The combat operations, the "little Somalia" approach of the US Army, were the most over the top. Most odious and damaging though was the sealing of the city, the turning of the city into a prison where people could not walk out. Volunteers with boats were turned away; people with confirmed locations could not enter to pick up relatives and friends. Even the Red Cross was kept out. The government it seems was far more concerned with containing the poor of New Orleans than with solving any problems. It's not a new story, it's merely a wretched retelling of the same foul story of slavery in America, and lord it's not pretty. It's a story that will get told again and again too, perhaps never with the same catastrophic energy of Katrina pulsing through it, perhaps never with the same media attention, but the same old story, same old tragedy once again.






* This, it should be noted, gets directly at one of the biggest confusions surrounding emergence: there is a massive difference between an emergent intelligence, an emergent system and an emergent property.

Posted by William Blaze at 12:07 PM | Comments (3) | TrackBack

August 31, 2005

Libertarian Disasters (bottom up)

Jared Diamond has been asking a question for years: what were the Easter Islanders thinking when they cut down their last tree? If New Orleans is any guide, the answer was that they were too busy looting to notice much.

Managers at a nursing home were prepared to cope with the power outages and had enough food for days, but then the looting began. The Covenant Home's bus driver surrendered the vehicle to carjackers after being threatened.

Bands of people drove by the nursing home, shouting to residents, "Get out!" On Wednesday, 80 residents, most of them in wheelchairs, were being evacuated to other nursing homes in the state.

"We had enough food for 10 days," said Peggy Hoffman, the home's executive director. "Now we'll have to equip our department heads with guns and teach them how to shoot."

That's the saddest reminder of how low humanity can sink when things go bad, although Diamond pointing out how the Easter Islanders' diet increasingly consisted of humans as their society fell just might beat it. It leaves me wondering what the libertarian response to this disaster might be. That the government is actually impeding the repairs, and the market would have fixed the levee faster? That looting is better called the "competitive redistribution of goods", and is actually a good thing? Or that if every nursing home aide carried a gun things would have turned out differently?

I've been addressing these issues in some very different contexts in the various "bottom up" posts. Well, New Orleans is at the bottom, in more ways than one right now, and it will be interesting to see what happens. These early reports sound more like warfare in the Congo than the sort of beautiful emergence that free marketers and high tech libertarians love to fantasize about. None of this comes as much of a surprise to me, as I've long been arguing that emergent systems don't just emerge out of the ether. When they do occur, they occur in very particular environments.

Markets (and no market is ever really "free") work in civil societies. They tend to fall apart in the face of guns, to the point of nonexistence in, again, the Congo, or to the point of deep corruption as in the mafia markets of Russia. Out of all the animals in the world only a few display the sort of emergent intelligence of ants or termites. Occasionally, as in elephant stampedes, humans rioting or perhaps the mythical lemming mass suicides, some animals display behavior that's a bit more like emergent stupidity. The point being that emergence is not nearly the simple thing that some would make it out to be. Books on the subject naturally focus on the occasions where it works, but in the process they give a distorted idea of how often it doesn't work. Which in turn leads to fans of the concept having completely unreasonable ideas of how to go about getting that magical self-organization to happen.

Self-organizing and self-regulating systems are fantastic creatures, but they take real effort to make happen. The environment needs to be right. For a market that means a stable, trusting society with a surplus of goods and a standard of equitable exchange. For a community to self-organize to prevent looting I suspect you need a sort of cohesiveness, social equality and absence of poverty that just doesn't exist in New Orleans, a city rife with centuries of unresolved social tension. Rather than chaos theory down in Louisiana, we instead get a more traditional style of chaos, and no, it's not nearly as pretty as, say, a Julia set.

update: I wish I had never wondered what the libertarian response to the hurricane was, cause it just made me a bit iller. Over at Reason, probably the premier libertarian blog, the only hurricane post out of nearly 50 in the past 3 days is entitled "Hurricane Bullshit". And it's a rant against global warming and the Kyoto accord. Main source? That most reliable of them all, the guy who wrote the book predicting the Dow Jones average would hit 36,000 in 3-5 years. He wrote it, oh, about 6 years ago...

Posted by William Blaze at 09:01 PM | Comments (1) | TrackBack

August 29, 2005

Tags (bottom up)

Tags:

- Tags are not an organizational innovation, they are an interface innovation. The difference between a tag and a category is nonexistent, except that the interface threshold to create a tag is so low that people actually do it regularly.

- How long before someone adds subtags to tags? You know, like subcategories, like structured information... You know, it might actually be useful.

- According to Clay Shirky tags are semi-structured data, but in reality all structured data is only semi-structured. And semi-structured data by definition is of course structured. A distinction between highly structured data and semi-structured data is workable, but the borderline between the two is murkier than the Mississippi delta in a hurricane.

- Clay seems intent on framing tags like it's a war. But what is he warring against? It's a war on an idea, on an ideal: the vision of a unified and complete structure for data. Why someone would want to wage a war on such an absurd and impossible idea is a bit beyond me, but I suppose Clay has spent a bit more time with librarians than is healthy.

- Is Clay's bold statement that "classification schemes are going to be largely displaced by tagging" really, as he himself puts it, "unreasonable"? More like redundant; note the plural in "schemes". All that statement says is that "classification schemes are going to be largely displaced by more classification schemes".

- Make no mistake about it, every "tagger" is creating their own classification scheme, no matter how sloppy it may be.

- Similarly it's a bit funny that the main people in this debate seem to be Clay, Peter Merholz and Gene Smith, three of the bigger tag proponents around...

- There is clearly something getting lost in the noise here. Perhaps some clarity can come by looking at it not as an issue of how people add metadata to information, but as how people (and machines) navigate information. There is organized navigation, searching a card catalog for instance, and there is algorithmic navigation, say entering a term into Google.

- Online this roughly corresponds with clicking on a link versus typing a term into a search box.

- The "I'm Feeling Lucky" button at Google is about as pure an algorithmic navigation as there is. The standard Google results however use an algorithm to generate structured data, an ordered list of results.

- Tags are structured data, but by lowering the threshold for creating structured data, that is, increasing the sheer amount of it, they decrease the utility of the structuring. At the same time though, the increased structured data increases the usefulness of algorithmic navigation.

- So if tags must be a war (and they needn't be) then it is the algorithm makers who stand to gain the most and the organizers who stand to lose the most.

- Tags are not a war, not because algorithmic and organized navigation can peacefully coexist, but rather because their existences are inextricably intermeshed together.

- Google is a great example. In some ways it is the triumph of the algorithm, yet its very existence depends upon highly structured data. Without the DNS system Google would be worthless. Without HTML standards Google would be worthless. Imagine if each web page had its own definition of the anchor tag; Google would be worthless. Or if there was no standard way to declare a language for each page. And let's not even get into the fact that the best results in Google are often pages that are directories or in other ways feature highly structured data.

- In light of this Clay's claim that "search has largely displaced directories for finding things" is a bit silly. The two just can't be separated with any neatness.

- And yeah someone should tell Clay "market populism" and "libertarianism" are the exact same thing, I'd send an email, but I think this piece has probably damaged my grades enough as is...
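The first point in the list above, that a tag and a category are the same data and differ only at the interface, can be made concrete with a sketch. Assuming the simplest possible store (an illustrative toy, not any real system), both a "category" and a "tag" reduce to the same structure: a label mapped to a set of items.

```python
# A tag and a category are the same data structure: a label pointing at
# a set of items. The difference is purely how cheaply the interface
# lets a user mint a new label. (Illustrative sketch only.)

from collections import defaultdict

index = defaultdict(set)  # label -> set of item ids

def label_item(item_id: str, *labels: str) -> None:
    for l in labels:
        index[l.lower()].add(item_id)

def find(label: str) -> set:
    return index[label.lower()]

label_item("post-42", "web2.0", "tags", "folksonomy")
label_item("post-99", "tags")
```

Whether `label_item` is called from a rigid admin form (a "category") or a free-text box (a "tag"), the index underneath is identical.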

Posted by William Blaze at 12:45 PM | Comments (2) | TrackBack

August 27, 2005

Web 2.0

Are the internet hypelords getting a bit tired? There's this funny whiff of déjà vu that comes along with the latest and greatest buzzword: Web 2.0. Web 2.0? Wasn't that like 1995? Don't they remember Business 2.0 magazine? Or how all the big companies have stopped using version numbers for software and instead hired professional marketers to make even blander and more confusing names? I hear "Web 2.0" and immediately smell yet another hit off the dotcom crackpipe...

But perhaps that's a little too harsh. While Web 2.0 might have emerged in large part from tech publisher O'Reilly's PR, underneath it is a real feeling among some that there is something going on that makes the web of today different than the web of a few years ago. Blogs, open standards, long tails and the like. The most concise and clear definition I've found is Richard MacManus': "the philosophy of Web 2.0 is to let go of control, share ideas and code, build on what others have built, free your data." Which of course doesn't sound that different than, say, the goals of the plain old unnumbered "web" back ten years ago. But the Web 2.0 crowd is right, the web is different now; the big differences just aren't necessarily found in those prosaic "information wants to be free" ideals, which actually stand as one of the biggest constants in web evolution.

What really separates the "Web 2.0" from the "web" is the professionalism, the striation between the insiders and the users. When the web first started any motivated individual with an internet connection could join in the building. HTML took an hour or two to learn, and anyone could build. In Web 2.0 they don't talk about anyone building sites, they talk about anyone publishing content. What's left unsaid is that when doing so they'll probably be using someone else's software. Blogger, TypePad, or if they are a bit more technical maybe WordPress or Movable Type. It might be getting easier to publish, but it's getting harder and harder to build the publishing tools. What's emerging is a power relationship: the insiders who build the technology and the outsiders who just use it.

The professionalization of the web has been a long and gradated process. The line between amateur and pro didn't exist at the dawn of the web, but over the course of the years, over the course of new technologies, a gap appeared and it continues to widen. There have been web professionals for a decade now, but whereas the distinction between a pro and an amateur was once a rather smooth one, it is now a highly striated one. Early HTML took an afternoon to learn. Simple JavaScript, early versions of Flash, basic database usage, PHP: these are things that took a motivated but unexceptional individual a weekend to learn. All it took to transform into a pro was a weekend, a bit of drive and the ability to sell yourself to an employer. This is a smooth separation.

It's 2005 now, and Ajax is the latest and greatest in web tech. If you want to build an Ajax site, you have two real options: be a professional or hire a professional. I'm sure there are a few people out there who could teach themselves Ajax in a weekend, but they would have to be exceptional individuals. You can't just view source and reverse engineer Gmail or Reblog. You need to be a professional programmer who understands web standards, databases, CSS and dynamic HTML... These are apps built not just by pros, but often by teams of pros. The difference between a professional and an amateur is no longer smooth, but striated.

The Web 2.0 is a professional web, a web run by insiders. In the larger space of the software industry as a whole these are still young brash upstarts pushing a somewhat radical agenda of openness and sharing. In contrast to the agendas of old-line software companies like Microsoft and Sun, AOL and Oracle, the Web 2.0 actually merits some of its hype. The world of RSS feeds, abundant APIs and open source code really is a major departure from the "own and control" approaches of an earlier generation of companies, and something I'm personally in favor of. But just how open are these technologies really? And just how many people do they empower? Take a close look and Web 2.0 looks a bit more like a power grab and a bit less like a popular revolution.

Like the proponents of "free" markets, the pushers of Web 2.0 seem to have quite an idealistic idea of just what "free" and "open" are, and how systems based around those concepts actually function. Peter Merholz is perhaps the sharpest and most thoughtful of Web 2.0 evangelists, and his essay "How I Learned To Stop Worrying and Relinquish Control" just might be the best argument for the Web 2.0 philosophy around. But it also paints a radically misleading picture of what it means to "relinquish control". For relinquishing control doesn't just mean letting go, losing control; it actually means controlling just how you let go.

Netflix is a great example. Merholz talks about how the company's success revolved around giving up on late fees; unlike traditional video stores they did not control how long a customer could keep a video. A smart move for sure, but they didn't just relinquish control, they opted to control several other key factors instead. They gave up control over the length of the rental and instead opted to control how many videos a customer could have at any given time, and took control over the final decision as to which video a customer would get. Netflix isn't giving up control, they are exchanging it; they built a highly controlled system which enabled them to allow certain vectors, namely the length of video rentals, to fluctuate freely.

Then there are Amazon.com's customer reviews, which Merholz prominently cites as an example of a company relinquishing control to its customers. And indeed, if you write a review there is a good chance your words will show up on Amazon's page for the book. Amazon will cede control of that small section of the page to you. But just how much do they really give up? In submitting a review the reviewer grants "Amazon.com and its affiliates a nonexclusive, royalty-free, perpetual, irrevocable, and fully sub-licensable right to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, and display such content throughout the world in any media." Even then Amazon requires you to follow their review guidelines and delays publication for 5 to 7 business days, quite possibly so that they can review the review in some way. Once this is all done the review is placed on a page whose layout Amazon completely controls. The reviews go near the bottom, well "below the fold". So just how much control has Amazon given away? And just how much have they gained back in return?
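The sequence of controls wrapped around a "relinquished" review can be sketched as a pipeline. The guideline words and return fields here are stand-ins I've invented for illustration; Amazon's actual moderation process is not public.

```python
# A sketch of the control points around a customer review: license grant
# on submission, guideline screening, delayed publication, controlled
# placement. (Hypothetical stand-ins; not Amazon's actual process.)

GUIDELINE_BANNED = {"spoiler", "profanity"}  # stand-ins for real guidelines
REVIEW_DELAY_DAYS = 7                        # "5 to 7 business days"

def submit_review(text: str) -> dict:
    # Submitting at all grants the perpetual, sub-licensable license.
    if any(word in text.lower() for word in GUIDELINE_BANNED):
        return {"accepted": False}
    return {
        "accepted": True,
        "license_granted": True,
        "publish_after_days": REVIEW_DELAY_DAYS,
        "placement": "below_the_fold",  # page layout stays with Amazon
    }
```

Every step the review passes through is a point where control is retained, even as the words themselves come from the customer.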

At the technological core of the Web 2.0 ideology is another innovation that Amazon has been an early leader in: public APIs, or Application Programming Interfaces. APIs are tricky concepts to grasp; they are essentially ways in which one computer program can talk to another, or one part of a computer program can talk to another part. Until recently, until Web 2.0, talking about public APIs basically meant talking about computer operating systems. Most APIs were private, things that development teams used to build complex systems of interlocking programs, things like Amazon, eBay and Google. Amazon and eBay in particular have quite complex relationships with a certain subset of their customers who happen to run businesses that rely in part or entirely on using Amazon or eBay services. Amazon has its affiliates and zShops, while eBay has its power sellers and eBay stores. I haven't been able to track down a good history of public web APIs, but I suspect Amazon and eBay released theirs mainly as a service to their power customers, as a way to help these customers make them even more money. Google on the other hand released its public API mainly as a geek toy, not as a revenue source, the sort of action that makes Web 2.0 devotees ecstatic. The public API is a way to share data; it allows independent programmers to build their own applications using information collected and sorted by the likes of Google and Amazon, and allows users to access this data in a variety of ways not fully controlled by the data holder. The public face of the public API is one of openness and sharing, of relinquishing control. Look a bit behind that facade though, and once again we find yet another system of control.

A public API is not what a company's internal developers are using to extend their systems. It doesn't give you full access to the data or full access to the functionality of the system. This is often a good thing; as an Amazon customer I'm quite happy that the Amazon public API does not include access to credit card data or purchasing habits. Despite all the Web 2.0 hype about open data I've never seen anyone argue for companies sharing this info. But the limits on what can be accessed via a public API go far beyond just protecting confidential user information. In fact the company creating the API has absolute control over what goes into it. They may be giving up a degree of control, but they are controlling exactly what that degree is.
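A sketch of that gatekeeping: the public API acts as a facade over internal data, returning only fields the creator has whitelisted. All the names and fields below are hypothetical illustrations, not Amazon's actual schema.

```python
# A public API as a facade: only whitelisted fields ever leave the system.
INTERNAL_RECORD = {
    "title": "Dark Star Safari",
    "price": 14.00,
    "purchase_history": ["user123", "user456"],  # internal only, never exposed
    "credit_card": "XXXX-XXXX",                  # internal only, never exposed
}

# The company, not the user, decides what this set contains.
PUBLIC_FIELDS = {"title", "price"}

def public_api_lookup(record):
    """Return only the fields the API creator has chosen to expose."""
    return {k: v for k, v in record.items() if k in PUBLIC_FIELDS}

print(public_api_lookup(INTERNAL_RECORD))
```

The "degree of control" given up is exactly the contents of `PUBLIC_FIELDS`, and nothing forces that set to ever grow.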

A company that allows you to access its databases and applications via an API is clearly more open than one with no API at all. But the API is also instrumental in establishing an asymmetrical power relationship between the API maker and the user. The user is free to use the API, but the creator has control over just what goes into it. In addition the use of the API is almost always governed by a license restricting just how free a user can be with it. Google's API for instance restricts the number of "automated queries" to 1,000 a day. This essentially means that it can be used to prototype an application, but not to create any sort of commercial use beyond the smallest of scales. And just in case, the license also clearly prohibits any commercial use at all. Is this a way to free the data or a way to implement another level of control over it?
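A quota like that is trivial for the provider to enforce on its side of the wall. A minimal sketch of a daily limit per API key (the 1,000-a-day figure comes from Google's license as described above; everything else here is hypothetical):

```python
from collections import defaultdict

DAILY_LIMIT = 1000  # the cap the license imposes on "automated queries"

class QuotaGate:
    """Reject API calls once a key has exceeded its daily allowance."""

    def __init__(self, limit=DAILY_LIMIT):
        self.limit = limit
        self.calls = defaultdict(int)  # api_key -> calls made today

    def allow(self, api_key):
        if self.calls[api_key] >= self.limit:
            return False  # the creator, not the user, decides the ceiling
        self.calls[api_key] += 1
        return True

gate = QuotaGate(limit=3)  # tiny limit just for demonstration
results = [gate.allow("dev-key") for _ in range(5)]
print(results)  # the first three calls succeed, the rest are refused
```

The point is structural: the ceiling lives entirely in the provider's code, and can be raised, lowered, or revoked without the user's consent.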

Any user of a public API runs the risk of entering a catch-22 position. The more useful the API is, the more dependent the user becomes on the API's creator. In the case of Ebay sellers or Amazon affiliates this is often a mutually beneficial relationship, but also an inherently unbalanced one. The API user holds a position somewhat akin to a minor league baseball team or a McDonald's franchisee: they are given the tools to run a successful operation, but are always beholden to the decisions of the parent organization. You can make a lot of money in one of those businesses, but you can't change the formula of the "beef" and you always run the risk of having your best prospects snatched away from you.

There is another asymmetrical relationship at work in the public API system, an asymmetry of data. The public API rarely, if ever, gives full access to the data the way an internal API can. Even the most open of public APIs will not give access to stored credit card numbers and passwords, at least not intentionally. Often though the gap between the two systems is far greater. Google's public API for instance allows you to do searches and dictionary lookups, but doesn't give access to any of the data mining functions at work in Google's internal system. You can't use the API to find out which terms are searched for more, what sort of searches are originating from a particular address, or what one particular user (based on Google's infamous 30 year cookie) has searched for over the past year. That sort of datamining is reserved for Google employees and their associates. And not only is the API user denied access to much of this information, they are also gifting Google with even more data to mine. With every public API call the creator gives out information it already possesses, while gaining a new piece of information back: information on what people are interested in.
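That asymmetry is easy to see in miniature: every call serves up data the provider already has, while quietly depositing a fresh record of what the caller wanted. A hypothetical sketch (none of these names correspond to a real API):

```python
import datetime

QUERY_LOG = []  # visible only to the API's owner, never to its users

def search_api(api_key, query):
    """Serve a search result, but first record what was asked and by whom."""
    QUERY_LOG.append({
        "key": api_key,
        "query": query,
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    return ["result for " + query]  # the data the caller gets back

search_api("dev-key", "web 2.0")
search_api("dev-key", "power laws")

# The provider now knows what its users are interested in; the users get
# no equivalent view of each other's activity, or of the log itself.
print(len(QUERY_LOG), "queries captured")
```

The caller walks away with answers; the provider walks away with an ever-growing map of demand.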

At the core of the API is a system of control; the API creator has a nearly limitless ability to regulate what can go in and out of their system. And it is precisely this system of control that allows the API to set certain vectors of information free. In Google's case that means the ability to obtain ranked search results, definitions and a few other things. In Amazon's case it's book data: images of the cover, author names, titles, prices, etc. Ebay's lets you build your own interface to sell via their marketplace. Flickr's lets you search photos. In no case does the public API give full access to the system. You can't find passwords, credit card info, or users' addresses, all of which is a good thing. Nor can you find much info on what other API users are doing, or what the people using the standard web interface to these systems are doing. Often the volume of your activity is restricted. Often access requires registration, meaning not only is the use of the API monitored, but it's also possible to associate that activity with a particular individual. By design, and perhaps by necessity, an API privileges the creator over the user.

Privilege is what the Web 2.0 is really about. What separates the Web 2.0 from that plain old "web" is the establishment and entrenchment of a hierarchy of power and control. This is not the same control that Microsoft, AOL and other closed system / walled garden companies tried unsuccessfully to push upon internet users. Power in the Web 2.0 comes not from controlling the whole system, but from controlling the connections in a larger network of systems. It is the power of those who create not open systems, but semi-open systems; the power of API writers, network builders and standards definers.

Nothing illustrates the paradoxes of Web 2.0 "freedom" more than the open standard. Open standards are freely published protocols that people voluntarily agree to comply with. Standards like html (for publishing web pages), css (for controlling the look and layout of webpages), rss (for subscribing to information feeds) and jpeg (for compressing and viewing photolike images). These standards are not nearly as open as their name might imply. Sometimes they are created and run by corporations (Adobe's pdf format), sometimes by nonprofits (the W3C, which governs html standards), and sometimes, as with RSS, there are public fights and competing versions. Implementing changes to an open standard at the very least requires considerable political skills; one can easily make their own version of a standard, but unless they can convince others to adopt their version, it's not a standard at all. It is only by gaining users that a protocol gains potency, and to do so the standard itself must be politicized, and frequently institutionalized.*

The real hook to the freedoms promised by the Web 2.0 disciples is that they require a nearly religious application of open standards (when of course they don't involve using a "public" API). The open standard is the control that enables the relinquishing of control. Data is not meant to circulate freely; it's meant to circulate freely via the methods prescribed by an open standard. In order to relinquish control over the data one first must establish firm control over how that data is formatted and presented. An action that increasingly requires the services of a professional, whose involvement of course adds another layer of control. This is the world of the Web 2.0, a world of extreme freedom along certain vectors, extreme freedom for certain types of information. It is also a world of hierarchies and regulations, a world in which a (new) power structure has begun to establish and stratify itself.

If we return to Peter Merholz's essay, this can be seen rather clearly. Its title indicates it's about him giving up control, but of course it's really an argument that others should give up control. But where should this control go? How should it be done? This is, in Merholz's words, "a scary prospect". In the end he's not just arguing that companies should relinquish control; rather he's arguing that they should relinquish control over to him, his company Adaptive Path, and others that share their philosophy. Relinquish control over to the professionals, those that know what they are doing, know how to control things on the internet.

None of this should in any way be construed as a critique of the Web 2.0; rather it is a critique of those who push one-sided visions of what the Web 2.0 is. If pushed into an oversimplified judgment I would come out solidly in favor of public APIs, open standards and the circulation of information along the passages these systems create. But these transformations do not come unmitigated; they do not come without hooks and catches. In many ways Web 2.0 is just another revolution. Like many revolutionaries the leaders of the Web 2.0 make broad promises of empowerment for their supporters. But history shows time and time again that once the dust clears and the dirty battles are washed away, it is the leaders, the insiders, that are by far the most empowered. At its heart this is the Web 2.0: a power grab by the internet generation, the installation of a new power structure, a new hierarchy, a new system of control.









*for a much more detailed exposition on the standards process and the issues of protocol see Alex Galloway's .

Posted by William Blaze at 04:59 PM | Comments (11) | TrackBack

August 15, 2005

The Power of Nightmares (bottom up)

Finally got around to watching The Power of Nightmares, or more accurately the final installment of the three part series. This BBC documentary is something of a fetish object among American Leftists, spoken about in hushed reverent tones as an object that will unveil the hidden truths. "Have you seen The Power of Nightmares? You must see The Power of Nightmares". The object itself circulates via transcript and torrent; a little googling and you too can be an initiate...

Criticism often says as much about the critic as it does about their target. Director Adam Curtis also directed a four hour documentary on Freud and his followers, so he surely must be aware of that fact. So is the autocratic tone of this film a deliberate maneuver or an unintentional slip on Curtis' part? This is a movie about politicians manipulating facts, but Curtis seems intent on mimicking them. Rather than raising questions it dictates an alternative history. It's clearly a successful tactic, but for me at least it deftly undercuts the purpose of the film. Is Curtis deliberately copping the style? Unconsciously aping it? Or is he projecting his own paranoia and monomania onto his targets? Regardless of the truth, it makes the film a bit hard to take seriously; both Curtis and his targets want to tell stories without questions, when in reality the facts at hand are rather uncertain.

The most powerful and effective parts of the documentary were simply the clips of Bush and Rumsfeld selling the war. That they grossly distorted the facts shouldn't come as any surprise to anyone who has followed the story in any detail, but watching them in action with a few years of hindsight is quite revealing. These are characters who understand the power of authority and how to put it on television, and the left it seems has no counterpart, with perhaps the exception of director Curtis himself. During the build up to war the left was busy working the web, trying to be bottom up, protesting in the streets. Some old ineffective tactics, some new ineffective tactics, and in online fundraising one new effective tactic. But all the while the right kept pushing the tried and true: get on TV and say it with authority.

The more I look at it the more the rhetoric of emergence, "long tails", and "bottom up" begins to resemble a far older idea: divide and conquer. Only this time the dividing is self inflicted, praised even. That's not to say I'm here to blanketly dismiss "bottom up"; there is far too much unknown, and too much potential, to do anything of the sort. But until these theories come face to face with the concept and application of power, they seem doomed to a particular ineffectiveness. In other words, a nightmare.

Posted by William Blaze at 10:45 AM | Comments (1) | TrackBack

August 02, 2005

The Long Tail (bottom up)

The latest and greatest bottom up hype is a concept called the Long Tail, and its main booster is Wired magazine's editor-in-chief Chris Anderson. Following what's fast becoming bottom up proper protocol, Chris has a blog and it's devoted to turning the Long Tail into a book. He's a smart writer and it's an interesting read as he knowledgeably tells tales from what one could call the emerging networked culture. But something has always grated on me, and to understand just why it's worth looking at a far less digital topic: abortion in America.

The debate over abortion in the US is a strange sort of conflict. On one side you have "pro-life" and on the other side you have "pro-choice". No one, it seems, is anti anything. Barring perhaps the radical fringes, you don't find pro-life protesters talking about how they want to deny women the right to make decisions, nor do you find pro-choice activists talking about how they want to kill babies. The two sides are locked in a deep conflict, but they aren't even arguing about the same thing! Or at least not over the same concepts; they are of course battling over the same action. And they are battling over how they want people, society as a whole even, to look at that particular action.

The concept of the Long Tail comes from a reading of another trendy idea in the world of technology intellectuals, the power law distribution. Power law curves show situations of profound inequality, most famously perhaps in Vilfredo Pareto's observation that 20% of all individuals in a society generally control 80% of all the wealth. That was a century ago, and it still holds true. More to the point though, power laws have come into vogue and people are finding them everywhere, especially where networks are involved. The long tail refers to the "tail" of the curve, the 80% of the people making 20% of the money.
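The 80/20 shape is easy to reproduce with a quick simulation. This sketch draws "incomes" from a Pareto distribution using a shape parameter of roughly 1.16, the textbook value that yields the 80/20 split (the parameter is a standard choice, not a figure from Anderson or Pareto's data), and measures how much of the total the top fifth holds:

```python
import random

random.seed(42)  # fix the seed so the run is repeatable

# Draw 100,000 "incomes" from a Pareto distribution. A shape parameter
# near 1.16 is the classic choice that approximates Pareto's 80/20 split.
incomes = sorted(
    (random.paretovariate(1.16) for _ in range(100_000)),
    reverse=True,
)

top_fifth = incomes[: len(incomes) // 5]
share = sum(top_fifth) / sum(incomes)
print(f"top 20% hold {share:.0%} of the total")  # roughly 80%, varies by seed
```

Because the distribution is heavy-tailed the exact share wobbles from run to run, but the gross inequality of the curve shows up every time.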

Now there is a hell of a lot going on in this area, and it makes Chris Anderson's site quite an interesting read as he details the ins and outs of the information and entertainment businesses reacting to the massive network that is the internet. But the long tail is not a neutral description; rather, much like the stances of both sides in the abortion debate, it is a deeply ideological one. Much the way the abortion warriors are fighting to control the terms of the debate, the long tail is about controlling what the power law distribution is about. "Pay no mind to the 20% with all the power, what's really interesting is what's happening over here under this long tail..."

There is a huge philosophical issue at stake for those who are best termed the technorati, the boosters of high tech and networks roughly clustered around Anderson's Wired magazine. In this circle an awful lot of hope and thought has been invested in the idea that the internet and other 'open' networks are a democratizing force. The belief that this is true underlies much of the moral framework that the technorati operate in. It gives them faith that they are doing the right thing. The discovery that networks tend to develop quickly into situations of inequality, situations that map onto the very 80-20 power laws characterizing the vast inequalities of wealth and power the internet was supposed to route around, slices straight to the heart of any faith in the democratic power of the internet.

In many ways the long tail resembles a classic magician's sleight of hand, a big distraction to call one's attention away from the relevant actions. Suddenly power laws are not illustrations of inequity, but ways to call attention away from it. It's increasingly clear that the internet is not a massive democratizing force, but rather a standard transition of power. Sure, some of the classic late 20th century media powers might fall; there is way more TV to watch and blogs hit hard at the newspaper and magazine models. But rather than having the power law curves fade we just have new powerhouses moving in: Google, Microsoft, Yahoo, Amazon and the like. It's early on; expect new powers to arise and others to merge. But when the smoke has cleared who wants to bet that the top 20% are still making 80% of the money, the top 20% of sites grabbing 80% of the traffic?

Imagine a medieval lord showing off his serfs' vegetables, talking about how he's empowered them with growing opportunities. It's not far from what you'll find on Chris Anderson's blog, except Anderson isn't the lord, just some servant nicely entrenched in the court. The long tail is cast as a vast practice of empowering users, freeing them perhaps from the clutches of old media. But ultimately the site is not about freeing anyone; it's about seducing and capturing those users, about building new media kingdoms where the users trade amongst themselves while the technorati lords reap a tax off of every harvest.

Sometimes the tax is straight monetary. Ebay is the classic example; like a casino they take a cut of every transaction. And while Ebay might just be "empowering" thousands or millions of small business people, one wonders just how much more empowered the high ranking Ebay execs and investors are than the average Ebay seller? 80/20 maybe?

Often though "long tail" business is more about information, and Anderson stresses the importance of filtering to these businesses, which is spot on. But what he misses is just how asymmetrical the filtering is. Businesses like Amazon, Yahoo and Google filter massive amounts of information and then send it back to their users. But they also keep large amounts of the information for themselves and their business partners. Sure they'll give you a slice of what they have, a chance to till some of their information, but in the end they are the lords of their domains, opening what they please (and what benefits them) to the long tail.

I'm pretty certain Anderson and most of his fellow network/technology boosters are not conscious of the fact, but there is a strong undercurrent of a power grab to their beliefs. The rhetoric speaks of democratic revolutions that empower everyone, but the reality is that it's about empowering a particular set of people. The ability of the internet and its related technologies to upset certain industries, communications systems and political structures is becoming more documented fact and less theory, and Anderson's site is great at illustrating some of this pattern. But the particulars of who gains, and more importantly who does not, are far less commented upon.

Does networked technology benefit everyone? Or does it benefit only those who have the access, knowledge and will to use it?

Posted by William Blaze at 11:38 AM | Comments (0) | TrackBack

July 02, 2005

Dark Star Safari (bottom up)

I use Amazon's wish list feature not to wish but to remember. It's a viciously effective form of enhanced memory; any book you ever noticed can get entered into a corporate database and linger. I've long since forgotten why and how Paul Theroux's Dark Star Safari ended up in my wish list, but technology it seems occasionally works, and I wound up walking out of the library with a book I wanted but couldn't remember why.

Theroux, funnily enough, wrote the book precisely to escape this sort of technology. Overland from Cairo to Cape Town is the subtitle, and it's a journey that purposely took Theroux through some of the most forgotten and dangerous parts of Africa. Theroux wanted to disappear, to be unreachable: no phone, no email, just gone. He had been a Peace Corps worker in Malawi in the 60's, kicked out for helping a political dissident escape the country. Now a successful travel writer and novelist, this was his return journey.

The Africa Theroux finds is far worse for the wear, and although it's never 100% clear if this is a function of him being a cranky aging asshole, he makes a pretty good case. And fitting the times (the book was published in 2003), what comes out is a 'bottom up' argument, although Theroux seems to prefer the term 'bare-assed'. It's an argument Theroux borrows from Graham Hancock and Michael Maren, authors of two anti-aid books, and makes part of his character. Aid doesn't work is the line; the money goes into everyone's pockets except those that need the aid, and when aid does show up it just leads to dependency among the recipients.

It's a classic anti-government argument: too corrupt and too slow to learn from mistakes. In Africa it may well be spot on; Theroux certainly is won over to the line. But half his argument seems to stem from the fact that the aid workers in white Land Cruisers never pick him up on the side of the road. The other half is interesting though, and fuels the stories that make this book quite an entertaining read. Only in the deep country, the bush mainly, does Theroux find the honest Africans he seeks; the cities, in a classical theme, are pits of corruption and thieves, the relief heavy countryside the same. That's a pretty blunt reading, but there is little subtlety to Theroux's opinion. He pushes to the back country to find what he wants, never it seems really pushing to find the urban upsides. It makes for a good set of adventure tales that way: dugout canoe down the rivers, "chicken bus" death trap rides, dodging "shifta" gunshots. "There are bad people out there".

All the adventure and gusto that launch the journey begin to twist, turn and fade as Theroux gets deeper in and more disillusioned. By the end he's riding luxury South African railcars and describing his first class dinners. Like the corrupt politicians he rails against, he's quite happy to leave Africans "bare-assed". This is bottom up thinking at its lowest: "sink or swim". Theroux gets there by being burned; the school he taught at 30 odd years before is a decaying wreck, and the people are suffering harder than in his memories.

"Sink or swim" is also a favorite of a breed of conservatives; the pro-business libertarians of America come to mind. One wonders what would happen if they were left in the midst of a dark star safari. Would they see only corrupt governments, or would they realize just how much their prosperity depends on the stability of civic society? Are the tribal warfare and massacres that mar the worst of the news from Africa bottom up or top down? What could be more bottom up than bailing out and letting people figure things out for themselves? It's exactly what Theroux advocates for Africa, but has it ever worked elsewhere? Does running away make history disappear? Or does history just disappear when people's lifespans drop to African levels? The Africa of Theroux's 60's experience it seems is almost gone, but how to get it back, and why for that matter, remain unanswered.

Posted by William Blaze at 12:28 PM | Comments (2) | TrackBack

June 29, 2005

Music For Airports (Bottom Up)

One of the earliest heads on the bottom up bandwagon was Brian Eno, the seminal music producer. In the late 1970's Eno produced a disc called Music For Airports, and set about pushing the idea of "ambient" music.

Last fall Eno rolled into NY and gave a talk about that album. The inspiration was John Conway's Game of Life. Conway is something of a patron saint to the bottom up evangelists; the Game of Life is a set of simple rules that, when run on a computer, create a variety of patterns on the screen, patterns that display a degree of self organization. The idea of a simple rule creating complex results is bottom up nirvana, and quite a few people it seems are capable of reading a lot more into Conway's game than what it is: a bunch of pixels moving on a computer screen.
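For the curious, the game really is just a handful of rules. A minimal sketch, demonstrated on the "blinker", the simplest oscillating pattern:

```python
from collections import Counter

def step(live):
    """One generation of Conway's Game of Life over a set of live (x, y) cells."""
    # Count how many live neighbors every candidate cell has.
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next generation with exactly 3 live neighbors,
    # or with 2 live neighbors if it was already alive.
    return {cell for cell, n in neighbor_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The "blinker": three cells in a row, oscillating with period two.
blinker = {(0, 1), (1, 1), (2, 1)}
print(step(blinker))                    # flips to a vertical line
print(step(step(blinker)) == blinker)   # True: back where it started
```

Two birth/survival rules, a dozen lines of code, and out come gliders, oscillators and self-organizing debris; which is exactly why the bottom up crowd finds it so seductive.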

Eno's big thought, motivated, god bless him, by laziness, was to create a system for producing an endless variety of music. It worked via loops. Create a loop of sounds, play it. Create another one that is out of sync with the first, play it. The possible sounds multiply, the progressions evolve. More sounds equals more possibilities; it's simple multiplicative math, and pretty soon your out-of-sync loops will be capable of far more potential sequences than anyone could ever listen to.
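The math is easy to sketch. Loops whose lengths share no common factor only line up again after the least common multiple of their lengths, so a few short loops can take hours to repeat. The loop lengths below are illustrative round numbers, not Eno's actual ones:

```python
from math import lcm

# Three tape loops with lengths in seconds that share no common factor.
# The combined texture only repeats when every loop lines up again,
# i.e. after the least common multiple of the lengths.
loops = [17, 19, 23]
period = lcm(*loops)

print(period, "seconds before the combined pattern repeats")
print(f"about {period / 3600:.1f} hours from just three short loops")
```

Add a fourth or fifth loop and the repeat period quickly outruns any listener's patience, which is the whole trick.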

On today's computerized sequencers and digital playback devices this is an extremely easy process. Eno played a new piece on a set of boomboxes with their cd players set on shuffle. He clearly took pleasure in the arrogance of the act. Back in the 70's though it took a bit more effort, and Eno's studio wizardry certainly calls into question his brags of extreme laziness. To make Music For Airports Eno spliced together reels of analogue tape. The loops were measured in yards or meters; he wove them through the studio furniture and across the room, a labyrinth of recording tape. Miles of it perhaps, woven together with extraordinary effort.

What Eno produced was a generative system, a means of producing music a degree outside his control. But the operative word there is "degree"; a generative system is still a system of control. In order to make his project work a huge amount of direction and control was necessary. The sounds on those tape loops were all carefully created, captured and curated by Eno. Beautiful sounds. The tape loops were carefully threaded through the studio, the machines turned on, adjusted and manipulated by professionals. The recording was then EQed and mastered by more professionals. In order to make a record that sounds great, the way many think Music For Airports does, Eno put in a lot of directed energy and controlled almost all of the process, or at least delegated control to a pro. Control was only surrendered on one prominent vector, that of the syncing of the various loops.

The system used to create Music For Airports is not bottom up at all. True bottom up music for airports gets made constantly by the travelers and airport workers themselves, random and generally unmusical. Music for Airports on the other hand is meticulously crafted for control to be given up over one particular aspect of the process. It is a system of control designed to allow a selective loss of control, a selective randomness. A generative system.

This is key to understanding what's really going on in "bottom up" phenomena: in the markets, in squatter villages, in ant colonies, in design, in filesharing, in the streets and in the news. The rhetoric of bottom up has little to do with the reality of action. What gets pitched as bottom up can often have its own top down, and maybe that's necessary for it to function. What is really interesting is not the "bottom up", but rather the relationships and interactions between the "bottom up", the situations where control is let go, and the "top down", those situations where control is retained and directed. This is the process of generations, of creation, of interaction and progression. Not top down, not bottom up, but both and neither together, working.

Posted by William Blaze at 01:28 AM | Comments (1) | TrackBack

Bottomed Up

"Bottom up": if there is one intellectual theme to this moment in time, bottom up it is. The Wired magazine hyper-capitalists chew it up, as do the neomarxist empire theorists. In science it takes the form of complexity theory and its more pop predecessor, chaos theory. In politics it's Howard Dean, MoveOn and Michael Moore, but more importantly John Kerry and the Democratic powerbase got hip to the kool aid quick and stole as many of the techniques as they could. In the media it's weblogs and "long tails". On Wall Street and in neoclassical economics it's about markets and believing in them. A lot of motherfuckers talking about "bottom up" thinking, as opposed to top down of course.

This post is likely the first in a series. I kept on reading books that begged to be tied together in a "bottom up" post, but it soon became clear there were far too many books; the post would need to become posts. Is that a top down decision, me deciding to break up the posts into sections, or is it a bottom up decision, the multiplication of books forcing me to change tactics? Or maybe, just maybe, it's sort of dumb to try and look at everything that way...

I'm not sure where the concept and phrase first emerged, but I'm guessing politics or management theory. In these contexts, in places where formal organizational hierarchies are the norm, it actually makes sense. A top down decision comes from the top of the hierarchy, and bottom up emerges from the "workers", from the depths of collective action.

The party line is that bottom up is good, top down is bad. Freedom versus control, collective intelligence versus ego driven power moves, markets versus central planning, linux versus microsoft. The reality is that it makes no sense. Bottom up is a catch phrase for a half formed idea. You can find the idea fully formed in a multitude of manifestations, and they ain't all good, and they sure as hell are not all the same either.

The plan then, the maneuver, is to bob, weave and parse through the bottom up landscape and emerge with some genuinely useful concepts, stay tuned and we'll see how it goes...

Posted by William Blaze at 12:27 AM | Comments (2) | TrackBack