The Internet as a Commons

One of the advantages of a long intercontinental flight is the uninterrupted time available to read and think. Three seemingly unrelated authors and the puzzle of data hogs and net neutrality started to connect after a couple of hours and a short nap. If only so as not to forget the line of thinking, here is an extensive post.

There is a growing interest in the work of Elinor Ostrom (winner of the Nobel Prize in economics) and her husband on social institutions: methods of cooperation between people that can be characterized as neither “state” nor “market”. Lately I have been wondering whether one could approach the issue of congestion in oversubscribed backhauls (aka “bandwidth hogging”) or the issue of Net Neutrality as particular instances of a “commons” that could be governed by social institutions.

She defines a commons problem as the question of how to arrange for the long-term productive use of a limited resource system, like ocean fishing.

In my humble opinion, the use of a limited backhaul connection (connecting a large number of subscribers to an interconnection point) can be viewed as a commons problem.

The Net Neutrality debate pertains to the commons of the Internet. In my world view the “Internet” is a commons created by the voluntary cooperation of autonomous networks in exchanging traffic to and from all destinations.

Ostrom explains in her book “Governing the Commons” how three influential ideas have led to the view that either the State as power center or the Market (private property and free trade) is the solution to commons problems. But the empirical reality is that neither the State nor the Market has been uniformly successful in long-term productive commons arrangements.

She points out that the “State” solution is prone to erroneous decision making, creating situations worse than before; if enforcement fails, the ravaging of the resource is almost guaranteed, exactly the opposite of the goal. The “privatize everything and free the market” solution creates costly transactions, inefficiencies due to fragmentation, and costly governance structures, and it is susceptible to capture by the largest players, who can influence the external governance to bend in their direction. And in many cases it is impossible to effectively cut the commons up into pieces.

She shows that voluntary commitment to a strategy and its enforcement (a social institution) is a third option that can be successful in governing a commons. She has identified a number of rules for success. One is that social institutions can be nested in larger social institutions: a good example is the exploitation of groundwater basins on the West Coast of the USA.

Van Asseldonk et al. have written several papers on the relationship between (economic) network morphology (TVA_morphology.PDF) and entropy (TVA_configuration.PDF).

Networks in the economic sense are seen as a cooperation between actors, each with their individual competences, to create higher productivity than they can achieve individually. Networks are characterized by both their interconnectivity (the number of links between actors) and their concentration (the distribution of links over actors). The combination leads to a mathematically calculated level of entropy: a level of flexibility and/or chaotic behavior. Too little entropy and the network cannot adapt to a changing environment. Too much entropy and stability is gone; every perturbation leads to erratic changes and loss of energy.
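To make the entropy idea a bit more tangible, here is a minimal sketch; it is my own illustration (assuming Shannon entropy over the distribution of links, which is one common way to formalize "concentration"), not van Asseldonk's actual formula:

```python
import math

def link_entropy(links_per_actor):
    """Shannon entropy (in bits) of the distribution of links over actors.

    links_per_actor: list of link counts, one entry per actor.
    0 means all links concentrate on a single actor; the maximum
    (log2 of the number of actors) means links are spread perfectly evenly.
    """
    total = sum(links_per_actor)
    entropy = 0.0
    for links in links_per_actor:
        if links == 0:
            continue
        p = links / total            # share of all links held by this actor
        entropy -= p * math.log2(p)
    return entropy

# A concentrated network (one hub) versus an evenly spread one, 8 actors each
hub    = [14, 1, 1, 1, 1, 1, 1, 1]
spread = [4, 4, 4, 4, 4, 4, 4, 4]
print(link_entropy(hub))     # low entropy: rigid, little adaptability
print(link_entropy(spread))  # maximum entropy (3 bits): flexible, possibly chaotic
```

The two example distributions are invented; the point is only that the same number of actors can sit anywhere between the rigid and the chaotic extreme, depending on how the links are distributed.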

The remarkable observation is that in naturally occurring complex systems a certain entropy level seems to be optimal, leading to a concentration distribution described by the famous Power Law. Barabasi has written extensively (“Linked”) on networks that have such a concentration distribution; other authors show that it can be found in many, many networks.

Van Asseldonk et al. have extended the theory to how different network morphologies (aka organizational structures) cope with the demands of decision making. Decision making is defined as searching within a given variation space for an optimal solution, using a given number of different competences of the actors in the network (aka more or less specialized people or departments).

As the searchable variation space (let's say, the complexity of the environment you want to manage) grows and the number of competences you need to involve in the search grows, the search costs grow exponentially and quickly become larger than the production cost of the solution.
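A toy calculation makes the point; this is my own simplification (assuming a naive exhaustive search in which each of k competences must jointly evaluate one of n options, while production cost grows only linearly with n), not van Asseldonk's model:

```python
# Why search cost can overtake production cost: n options per dimension and
# k competences give n**k combinations to evaluate, versus a roughly linear
# cost of actually producing the chosen solution.

def search_cost(options_per_dimension, competences, cost_per_evaluation=1.0):
    """Cost of exhaustively searching all combinations (hypothetical units)."""
    return cost_per_evaluation * options_per_dimension ** competences

def production_cost(options_per_dimension, cost_per_option=100.0):
    """Stand-in for the (roughly linear) cost of producing the chosen solution."""
    return cost_per_option * options_per_dimension

for n in (2, 5, 10, 20):
    for k in (2, 4, 8):
        s, p = search_cost(n, k), production_cost(n)
        flag = "  <- search dominates" if s > p else ""
        print(f"options={n:2d} competences={k}: search={s:>12.0f} production={p:>6.0f}{flag}")
```

All the cost constants are invented; only the shape of the growth matters, and with eight competences the search term dwarfs production almost immediately.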

The industrial hierarchy/bureaucracy is the first to succumb; a networked organization holds out longer.

How does all this tie together?

Ostrom's observations on the ineffectiveness of the State concur with the search costs observed by van Asseldonk: either you have a bloated State (too much drain), or a small State that cannot search the space, makes grave errors, and does not have the resources to enforce. The same applies to the market solution, only transferred to the external governance and transaction structures needed in a privatized solution for the commons (if such a solution is possible at all).

Her local social institution has neither of these disadvantages and is highly adaptable. It can be nested in larger institutions, which seems comparable to the “small worlds linked through larger nodes” images conjured by Barabasi in describing complex networks. The scale-free, power-law connectivity distribution relates to the natural balance between adaptability and stability.

Back to congestion and Net Neutrality.

The governance of the commons “Internet” is achieved by the autonomous networks amongst each other. There are bodies to create and modify the rules of cooperation. Some unwritten rules for exchanging traffic and interconnecting to create one addressable space are used in practice. Some of them are agreements, some of them are technical standards, and some are embedded in the implementation of technology (the TCP/IP congestion management implementation, for example).
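The congestion management example is worth spelling out, because it shows a rule of the commons enforced purely in software. Below is a deliberately simplified caricature of the additive-increase/multiplicative-decrease (AIMD) idea behind classic TCP congestion control; it is not the implementation of any real stack, and the capacity figure is invented:

```python
# Every TCP sender voluntarily backs off when it detects loss and probes
# gently for more capacity otherwise - a "rule embedded in technology" that
# keeps a shared link usable for everyone.

def aimd_step(cwnd, loss_detected, increase=1.0, decrease_factor=0.5):
    """Return the next congestion window (in segments) after one round trip."""
    if loss_detected:
        return max(1.0, cwnd * decrease_factor)  # back off sharply on loss
    return cwnd + increase                        # probe gently for more capacity

# Simulate one sender against a link that drops packets above 20 segments/RTT
cwnd, capacity = 1.0, 20.0
for rtt in range(30):
    loss = cwnd > capacity
    cwnd = aimd_step(cwnd, loss)
    print(f"RTT {rtt:2d}: cwnd = {cwnd:5.1f} {'(loss, backing off)' if loss else ''}")
```

The sawtooth pattern this produces is the technical equivalent of a shared norm: every participant restrains itself so the common resource is not ravaged.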

Contrary to what many people seem to think, the users of the commons “Internet” (including companies) have not been a party to this governance structure until now. The State is intervening (on their behalf?) now that some powerful networks want to change the rules unilaterally, which is the source of the current discussions.

The interesting intellectual challenge is to apply the guidelines identified by Ostrom for a successful commons and see if they fit the problem.

The first striking failure is that until now the users of the commons have not participated in the governance of the commons, nor can they be sure that the ISP, as entry point to the autonomous networks, represents them correctly. An institutional problem.

The second failure is that there is no transparency on how the commons is used by its users, which is a technical question. Would it be possible to see how other users of the backhaul are using this commons? And to establish, together with your ISP, rules on behavior or on investments in more capacity?
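As a rough, hypothetical sketch of what such transparency could look like: if per-subscriber traffic counters for a shared backhaul were published (the subscriber names and figures below are invented), each user could see their own share and the group would have a factual basis for agreeing on rules or on paying for extra capacity.

```python
# Hypothetical transparency report for a shared backhaul segment.

def usage_shares(bytes_per_subscriber):
    """Return each subscriber's share of total backhaul traffic, largest first."""
    total = sum(bytes_per_subscriber.values())
    shares = {sub: b / total for sub, b in bytes_per_subscriber.items()}
    return dict(sorted(shares.items(), key=lambda kv: kv[1], reverse=True))

monthly_gb = {"sub-017": 820, "sub-042": 95, "sub-003": 60, "sub-088": 25}
for subscriber, share in usage_shares(monthly_gb).items():
    print(f"{subscriber}: {share:5.1%} of backhaul traffic")
```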

More on this subject in future posts….

About Herman

Herman Wagter writes on Dadamotive about the facts and figures behind issues that interest him. His work as an interim manager and consultant has involved him directly in the impact of hyperconnectivity and sustainability on society. As an independent agent and "mobile warrior" he has experienced the pros and cons of how organizations and projects can be structured, and what the effects on the final result can be. In his opinion we are entering an era of profound change, driven by these fundamental forces. Following the trends, discovering the fun and debunking the half-truths is a passion he likes to share with others.