3.4 Legitimacy Analysis

A social system is defined here as an agreed form of social interaction that persists (Whitworth and de Moor, 2003); whether people see themselves as a community is essentially their choice. Legitimacy can be defined as a combination of fairness and public good applied to a social system (Whitworth and de Moor, 2003). In politics, a legitimate government is seen as rightful by its citizens and is thus accepted, while illegitimate governments must stay in power by force of arms and propaganda. By extension, legitimate interaction is accepted by the parties involved, who freely repeat it, e.g. fair trade. Physical and online citizens alike prefer legitimate communities because they perform better.

In physical society, legitimacy is maintained by laws, police and prisons that punish criminals. Legitimacy is the human level concept by which judges create new laws and juries decide never-before-seen cases. A "higher affects lower" principle applies: communities engender human ideas like fairness, which generate informational laws, which in turn govern physical interactions. Communities of people create rules to direct acts that benefit the community, like outlawing theft, i.e. higher level goals drive lower level operations to improve system performance. Doing the same thing online, applying social principles to technical systems, is the basis of socio-technical design.

Conversely, over time lower levels take on a “life of their own” and the tail starts to wag the dog. Copyright laws originally designed to encourage innovators have become a tool to perpetuate the profits of the corporations that purchased those creations (Lessig, 1999), e.g. Disney copyrighted public domain stories like Snow White, which it did not create, solely to stop others using them. Unless continuously “re-invented” at the human level, information level laws inevitably decay and cease to work.

Lower levels like software and hardware are more obvious, so it is easy to forget that today’s online society is a social evolution as well as a technical one. The Internet is new technology, but it is also a move to new social goals like service and freedom rather than control and profit. For the Internet to become a hawker market of web sites yelling to sell would thus be a devolution. The old ways of business, politics and academia should follow the new Internet way, not the reverse.

There are no shortcuts in this social evolution, as one cannot just “stretch” physical laws into cyberspace (Samuelson, 2003), because these laws often:

1)   Do not transfer (Burk, 2001), e.g. what exactly is online “trespass”?

2)   Do not apply, e.g. what law applies to online “cookies” (Samuelson, 2003)?

3)   Change too slowly, e.g. laws change over years but code changes in months.

4)   Depend on code (Mitchell, 1995), e.g. online anonymity means actors cannot be identified.

5)   Have no jurisdiction, e.g. U.S. law applies to U.S. soil, but cyberspace is not “in” America.

Figure 3.2: Legitimacy analysis

The software that mediates online interaction has, by definition, full control of what happens, e.g. any application could upload any file on your hard drive to any server. Code alone could create a perfect online police state, where everything is monitored, every “wrong” act is punished and all undesirables are excluded, i.e. a perfect tyranny of code. Socio-technical design is the only way to ensure this does not happen.

Yet code is also an opportunity to be better than the law, based on legitimacy analysis (Figure 3.2). Physical justice, by its nature, operates after the fact, so one must commit a crime to be punished. Currently, with long court cases and appeals, justice can take years, and justice delayed is justice denied. In contrast, code represents the online social environment directly, as it acts right away. It can also be designed to enable social acts, not just to deny anti-social ones. If online code is law (Lessig, 1999), then to get legitimacy online we must build it into the system design, knowing that legitimate online systems perform better (Whitworth and de Moor, 2003). That technology can positively support social requirements like fairness is the radical core of socio-technical design.

So is every STS designer an application law-giver? Are we like Moses coming down from the mountain with tablets of code instead of stone? Not quite, as STS directives apply to software, not to people. Telling people to act rightly is the job of ethics, not software, however “smart”. The job of right code, like right laws, is to attach outcomes to social acts, not to take over people’s life choices. Code as a social environment cannot be a social actor. Socio-technical design socializes technology to offer fair choices; it does not technologize society into a machine with no choice at all. It is the higher directing the lower, not the reverse.
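The principle of attaching outcomes to acts rather than removing choice can be sketched in code. The example below is a minimal, hypothetical illustration (the names `Page`, `Edit`, `revert` are inventions for this sketch, not from the source): a wiki-style design that lets anyone act freely (an edit is never blocked) but makes every act attributed and reversible, so accountability is built into the system rather than enforced by prior restraint.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Edit:
    author: str       # every act is attributed: accountability
    old_text: str     # keeping the prior state makes the act reversible
    new_text: str

@dataclass
class Page:
    text: str = ""
    history: List[Edit] = field(default_factory=list)

    def edit(self, author: str, new_text: str) -> None:
        # The act is permitted (freedom) but recorded (accountability):
        # code attaches an outcome to the act instead of denying the choice.
        self.history.append(Edit(author, self.text, new_text))
        self.text = new_text

    def revert(self) -> None:
        # The community, not the code, decides what to undo;
        # the code only makes undoing possible.
        if self.history:
            self.text = self.history.pop().old_text
```

The contrasting “tyranny of code” design would simply refuse edits from untrusted users; this sketch instead leaves the choice with people and keeps the social record that lets the community respond.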

To achieve online what laws do offline, STS developers must re-invoke legitimacy for each application. This seems hard, but every citizen on jury service already does it when interpreting the “spirit of the law” for specific cases. STS design does the same for application cases. That the result is not perfect does not matter: cultures differ, but all have some laws and ethics, because some higher level influence is always better than none.

To try to build a community as an engineer builds a house is the error of choosing the wrong level for the job. Social engineering, by physical coercion, oppressive laws, or propaganda and indoctrination, is people using others as objects, i.e. anti-social. A community is by definition many people seeing themselves as one, so an elite few enslaving the rest is not a community. Social engineering treats people like bricks in a wall, which denies social requirements like freedom and accountability. Communities cannot be “built” from citizens treated as objects, for that denies their humanity; they can only emerge as people interact.