6.7 Operations

Operations are actor-initiated methods on data entities, subject to access control. When an actor requests an operation on a data entity, access control either permits it by passing it to a program or denies it with a message explaining why. In this context, requests can be clustered into basic operation classes as follows:

  Create. A create generates a new entity based on data given, e.g. to create a Wikipedia stub for others to edit. Duplicate uses information from an existing entity to create a new copy. Download transfers a copy to another space, while Print transfers a copy to the printer. All are classed as a “Create” for these purposes.

  Edit. A simple edit alters the information in an entity. Append adds to the data but does not alter any existing data. Edit Version saves the changes as a new entity and keeps the previous version. Revert is the inverse operation, replacing the current copy with a previous version, as Wikipedia allows. While create adds a new entity, edit changes an existing entity.

  Delete. A simple Delete flags an entity for destruction at some later time. Undelete reverses that operation. Destroy is the operation that removes the entity permanently. De-activate invalidates a persona entity by denying the logon operation that activates it, but does not delete or destroy it as a data entity.

  View. A simple View lets a person look at the information in an entity. Variations include being able to see some but not all of the content associated with it, e.g. viewing a post about a game need not show a spoiler. Since view is a null act that does not change a target’s information, it is irrelevant on the information level, but on the social level it is highly relevant, e.g. in some cultures staring at people is an act of aggression, and it is well known that being looked at energizes the viewed party (Geen and Gange, 1983), an effect called social facilitation. In social media, that others are viewing a tweet or post is important, so view is not “nothing” at this level, and that an online video has gone viral makes others want to view it too. The effect of viewing illustrates how social requirements “flow down” to software design (see Figure 1.9).

  Move. Changes the parent of an object in a hierarchy.

  Logon. Lets a person activate a persona.

  Include. Adds a persona to a group permission set.

  Exclude. Removes a persona from a group permission set.

  Enter. Grants access to view the objects in a space.

  Ban. Denies access to view the objects in a space.

  Allocate. Grants a right to another actor, e.g. to “friend” a person on Facebook.

  Delegate. Temporarily grants a right to another.

  Transfer. Permanently grants a right to another.

More details follow, but Table 6.2 below shows how the basic operation classes apply to different entity types. Each class has many variants, but this does not affect the key access control issues.

Communication. In a simple communication, a sender creates a message that a receiver views. It is by definition a joint act where both parties have choice, so communication should be by mutual consent. The right to remain silent is the choice not to send messages, while asking “Can I talk to you?” is getting permission to communicate. The resulting access control rule is:

Rule 5: Every communication act requires prior mutual consent.
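As a minimal sketch of how Rule 5 might be enforced in code (all names here are illustrative, not the book’s API), an access control layer could record which senders a receiver has consented to hear from and check that record before passing a message on:

```python
# Illustrative sketch of Rule 5: deliver a message only with prior consent.
consents = set()  # (receiver, sender) pairs the receiver has approved

def give_consent(receiver: str, sender: str) -> None:
    """The receiver agrees to accept messages from this sender."""
    consents.add((receiver, sender))

def revoke_consent(receiver: str, sender: str) -> None:
    """The receiver withdraws consent (the right to remain silent)."""
    consents.discard((receiver, sender))

def send_message(sender: str, receiver: str, text: str) -> bool:
    """Pass the message on only if prior mutual consent exists."""
    if (receiver, sender) not in consents:
        return False  # denied, e.g. an anonymous or unwanted sender
    print(f"{sender} -> {receiver}: {text}")
    return True
```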

The evolution of telephony reflects how this social requirement affected technical design. At first phones just transmitted information, so the phone rang and one answered without knowing who was calling. This allowed telemarketing, the forerunner of spam. Cell phones then showed caller ID by default, so one could choose whether to respond, i.e. the exchange became more mutual. Social networks then added the synergy that each person could type in their own name and share it with others to add to their contact lists, while people using cell phones still had to type in contact list names personally. Just as the engineers who designed TV remotes were locked into the physical level, cell-phone companies were locked into an information-level mind-set. Now, as people show a name instead of a number when they call, technology is adjusting to social realities. Giving people the right not to accept messages from anonymous senders is a defense against spam.

Table 6.2: Operation sets by entity type

Entity Type            Operations
1. Actor Entities
   a. Persona          View, Delete, Edit, Logon, Deactivate
   b. Role             View, Delete, Edit, Include, Exclude
   c. Group            View, Delete, Edit, Join, Leave
2. Object Entities
   a. Item             View, Delete, Edit, Move
   b. Space            View, Delete, Edit, Move, Enter, Ban, Create (within)
3. Right               View, Edit, Allocate, Delegate, Transfer
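One hypothetical way to encode Table 6.2 is a map from entity type to the operation classes that apply to it, so a request can be rejected outright if the operation is not even defined for that entity type (whether the requesting actor actually holds the right is a separate check):

```python
# Hypothetical encoding of Table 6.2: operation classes that apply to each entity type.
OPERATIONS_BY_TYPE = {
    "persona": {"view", "delete", "edit", "logon", "deactivate"},
    "role":    {"view", "delete", "edit", "include", "exclude"},
    "group":   {"view", "delete", "edit", "join", "leave"},
    "item":    {"view", "delete", "edit", "move"},
    "space":   {"view", "delete", "edit", "move", "enter", "ban", "create"},
    "right":   {"view", "edit", "allocate", "delegate", "transfer"},
}

def operation_applies(entity_type: str, operation: str) -> bool:
    """True if this operation class is defined for this entity type."""
    return operation in OPERATIONS_BY_TYPE.get(entity_type, set())

# e.g. operation_applies("item", "enter") is False: only spaces can be entered.
```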


6.6 Objects

An object is an information entity that conveys meaning by evoking human cognitive processing; e.g. a family photo. Objects support operations like delete, edit and view. An object can be an item or a space.

Item. A simple object with no dependents, e.g. a discussion post. An item can be a:

1.   Post: An item whose meaning stands alone, e.g. a post saying “The Internet is a mirror to humanity”.

2.   Comment: An item whose meaning depends on another; e.g. the comment “I agree” makes no sense alone.

3.   Message: An item with sender and receiver; e.g. an email.

4.   Vote: An item that conveys a position choice from a response set, e.g. to rate a post 1-5 stars.

Online spaces contain items

Space. A space is a complex object with dependents, like an online wall that accepts photos. A space is a parent to the child entities it contains, which depend on it to exist. A space can be deleted, edited or viewed like an item, but can also contain objects; e.g. a bulletin board is a space that can accept posts. The Enter Space operation allows the objects within it to be displayed. If a post item accepts comments or votes then it becomes a space, as it now has dependents. Spaces within spaces give object hierarchies, with the system itself as the first space. An object hierarchy is like a tree where the spaces are branches and the items are leaves. That items need spaces as leaves need branches gives the rule that:

Rule 3: Every entity is dependent upon a parent space, up to the system space.

 The Move operation changes the parent of a space. That deleting a space must delete any dependent items in it gives:

   Corollary: Deleting a space also deletes any entities it contains.

For example, deleting a discussion thread also deletes any posts made in that thread. Every system entity is part of an object hierarchy where its ancestors are the set of all spaces that contain it, up to the system itself which is the first ancestor. Likewise its offspring are any child objects it contains plus any children they have, etc. All entities have ancestors and any space can have offspring.
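A sketch of how such an object hierarchy might be represented, with illustrative names only: every entity except the system space has a parent, ancestors are found by walking up to the system, and deleting a space cascades to everything it contains (Rule 3 and its corollary).

```python
# Illustrative sketch of Rule 3 and its corollary; not the book's API.
class Entity:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent          # every entity but the system has a parent space
        self.children = []
        if parent is not None:
            parent.children.append(self)

    def ancestors(self):
        """All containing spaces, up to the system space (the first ancestor)."""
        node, result = self.parent, []
        while node is not None:
            result.append(node)
            node = node.parent
        return result

    def delete(self):
        """Deleting a space also deletes any entities it contains (corollary)."""
        for child in list(self.children):
            child.delete()
        if self.parent is not None:
            self.parent.children.remove(self)

system = Entity("system")                 # the first space
thread = Entity("thread", parent=system)  # a space within it
post = Entity("post", parent=thread)      # an item within the thread
thread.delete()                           # also deletes the post
```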

Entering a space. Entering an online space is the operation of letting a persona view the items in it. Just as one can lock the door to a physical space, so the owner of an online space can “lock” it, so only certain people can enter it. Equally the owners of items in it may be able to “cover” them, so they are not visible, by limiting display rights. 

Namespace. A namespace is a space whose parent is a persona that inalienably owns it. A namespace by definition cannot be moved or given away. Facebook is based on the management of namespaces.

Ancestor rights. In social terms, a person is responsible for what they cause. Hence the owner of a space is responsible for anything offensive within it, because their space causes it to exist, albeit indirectly, e.g. YouTube is responsible for an offensive video posted within it, even though it did not directly post it, because without YouTube the video could not exist. In general, any space owner is responsible for the effect of an entity within it, whether a post, video, message or tweet. Since one cannot be responsible for what one cannot see, the space owner must be able to see it. In general, any entity added to a space must be visible to the owners of any ancestor spaces, giving the access control rule:

   Rule 4. The owners of any ancestor spaces have the right to view any offspring created.

For example, this gives the administrator the right to view anything on their system, e.g. a private message sent to a YouTube video owner is not seen by other people but is visible to the administrator, as are videos only shared with selected friends. Equally, a space owner cannot ban the administrator from entering the space. That any ancestor is responsible for any item added from the moment of its addition gives the corollary:

   Corollary. Ancestor owners have the right to be notified when a new offspring is created.
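Building on the Entity sketch above, and assuming a hypothetical owners map from each entity to the actors that own it, Rule 4 and its corollary might be checked like this:

```python
# Illustrative sketch of Rule 4 and its corollary, reusing the Entity class above.
# 'owners' maps each entity to the set of actors that own it.
def can_view(actor, entity, owners):
    """The entity's own owners, or the owners of any ancestor space, may view it."""
    if actor in owners.get(entity, set()):
        return True
    return any(actor in owners.get(space, set()) for space in entity.ancestors())

def notify_ancestor_owners(new_entity, owners):
    """On creation, notify the owners of every ancestor space of the new offspring."""
    for space in new_entity.ancestors():
        for owner in owners.get(space, set()):
            print(f"Notify {owner}: new '{new_entity.name}' created in '{space.name}'")
```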


6.5 Actor Persona


A persona is a data entity that represents an offline actor, like a discussion board avatar, an email address, a social media identity or a chat name. A persona can also represent an offline group, company or organization. A program agent can also act on behalf of an offline actor, as installation software represents the company that owns it. Since accountability for actions ultimately traces back to people, if an installation does wrong we blame the company not the software.

A persona is created when an offline person registers with the system. Registration is the creation of a persona that can be activated by a logon operation that connects it to the offline party, usually by giving a password only that party is assumed to know. Open systems let people register a persona themselves rather than the administrator doing it. The logon persona name can be an online nickname rather than the person’s real name and still retain online accountability, e.g. an eBay seller nickname that cheats loses any online reputation it has developed.
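A minimal registration and logon sketch, assuming hypothetical names and omitting everything a production system would add: registering stores a salted hash of a password only the offline party knows, and logon activates the persona by proving knowledge of it.

```python
# Illustrative register/logon sketch; not a complete or hardened implementation.
import hashlib
import secrets

personas = {}  # nickname -> (salt, password_hash)

def register(nickname: str, password: str) -> None:
    """Create a persona that only the registering party can later activate."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    personas[nickname] = (salt, digest)

def logon(nickname: str, password: str) -> bool:
    """Activate the persona by proving knowledge of the shared secret."""
    if nickname not in personas:
        return False
    salt, digest = personas[nickname]
    attempt = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return secrets.compare_digest(attempt, digest)
```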

A persona owns itself. The social principle of freedom is that every person owns themselves, i.e. is free to choose what they do. If your online persona is doing things you didn’t choose, like sending messages or accepting posts, you are an online slave, and indeed many online viruses aim to create virtual “zombies”. If I adopt an online persona, whether a Hotmail identity or game avatar, it should belong to me as an inalienable right that cannot be given or taken away, based on the natural freedom of the inner self. Applying the same social standard online, a person owns the persona that represents them online, giving the access control rule:

      Rule 2. A persona always holds all rights to itself.

“Always” makes this self-ownership inalienable, so even the system administrator cannot take it away. That the system cannot allocate a persona to another party means it can never be the slave of another. Freedom implies that people also own data properties of themselves like their name, e.g. people can apply to the state to formally change their name because they own it. Freedom implies a person has the right to control the display of personal data, or privacy:

      Privacy Corollary. A persona always holds the right to display itself.

Privacy is about choice not secrecy, so while a persona can choose to be private, i.e. not seen by others, it can also choose to be public. People who choose to publicly reveal personal data lose secrecy not privacy. Privacy is not some unimportant “fluffy” ethic but rather critical to society. Note: nature values privacy as camouflage and the military values it as stealth.
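A sketch of how Rule 2’s inalienable self-ownership might be enforced, with hypothetical names: a request to revoke a right is refused whenever that right is a persona’s right to itself, no matter who asks.

```python
# Illustrative sketch of Rule 2: a persona's rights to itself cannot be taken away.
# 'rights' maps (actor, entity) pairs to the set of operations the actor may perform.
def revoke_right(rights, actor, entity, operation) -> bool:
    """Remove a right, unless it is a persona's right to itself (Rule 2)."""
    if actor == entity:
        return False  # denied: not even the system administrator may do this
    rights.get((actor, entity), set()).discard(operation)
    return True
```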

Every online persona has the right to view and edit itself. A person who registers with an online system must always be able to view and change details about themselves that others can see.

Every online persona has the right to unregister itself. Unregistering, the undo of registering, can occur in two ways:

  1. Deleting. When a person deletes their persona, any dependent posts, pictures or messages must also be deleted. 
  2. De-activating. When a person deactivates their persona they revoke their connection to it, so it ceases to be a persona, i.e. a data entity with a logon method that can activate an online actor. It simply becomes a system data entity (see the sketch after this list).
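A small sketch of the two paths, with illustrative names only: deleting removes the persona and its dependents, while deactivating merely revokes the logon connection and leaves the data entity in place.

```python
# Illustrative sketch of the two unregister paths.
def unregister_by_delete(persona, contents):
    """Delete the persona and any dependent posts, pictures or messages."""
    for item in contents.pop(persona, []):   # dependents go with their parent
        print(f"deleted {item}")
    print(f"deleted {persona}")

def unregister_by_deactivate(persona, logon_enabled):
    """Revoke the logon connection; the record stays but is no longer a persona."""
    logon_enabled[persona] = False           # now just a system data entity
```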

Different systems handle requests to unregister a persona in different ways, for example:

  • Twitter: The Twitter “Deactivate my account” link is a permanent delete, although it can take up to a month for the tweets and the account to disappear entirely from the system.
  • LinkedIn: Has a “Close Your Account” option that removes the persona and your data.
  • Facebook: A “Deactivate Account” link immediately makes the account invisible to others on Facebook but keeps all data to let you later reactivate it. To permanently remove the data one has to complete an online form that takes 14 days to come into effect. If at any time in that period you log on to Facebook, the request is considered cancelled!
  • Wikipedia: Doesn’t let editors delete accounts at all, as that would remove all their contributions, which its system cannot do. It should however allow an account to be deactivated even if it is not deleted.
  • Gmail: Delete account permanently deletes the account and all its messages.

Since the persona has meta-rights to itself, it can in theory be:

1)   Transferred. A person can permanently transfer a persona to another, along with any reputation.

2)   Shared. One can share ownership to let another party act for you, e.g. look after your Facebook page.

Abandonment. Currently most social media deactivate and delete accounts that are not used for a period of time, e.g. Hotmail accounts inactive for over 90 days are permanently deleted, i.e. if not used they “die.” In access control terms, the administrator who owns the system can deactivate a persona by denying the logon right to enter the system. Without that, a persona becomes simply an information object that can be deleted. What the administrator cannot do, by Rule 2, is take control of the persona to use for their own purposes while it remains allocated to the person concerned.

Death. If the person behind a persona dies, their data can be deleted, made available to relatives, or even converted to a memorial as Facebook allows. A person’s physical will does not usually cover what happens to their online data, yet many online programs act as if death does not exist, e.g. one can get an eerie Facebook message from a person after going to a funeral. By some estimates there are over 30 million profiles of dead people on Facebook and tens of thousands are added each day, so death is relevant to social media. One answer is a digital will, but a better approach is to let people specify what happens to their digital estate when they die as part of the registration process.

Table 6.1 summarizes persona access rights.

Table 6.1: Persona access rights

Persona         View    Delete    Edit    Unregister    Transfer, Delegate
System Admin
Owner


6.4 The System

The system itself is the first entity and the system administrator is the first actor. Initially, the administrator has all rights to the system and anything in it, so in social terms is the dictator of the domain. Access control then begins with one actor and one object and develops from there as content is added and other actors are invited to participate. While traditional access control lets an administrator arbitrarily set any right, like an all-powerful tyrant, legitimate access control enables the benevolent dictatorship proposed by Plato. Yet it is still a software dictatorship, as even Wikipedia is a benevolent dictatorship not a democracy. For a system like the Internet that was created by many people the rules are more complex as it is “owned” by all the people who made it. This chapter covers rights for an application, not the Internet as a whole.

The access control system mediates all rights to the system. For any system that involves actors, the access control software defines what those actors are allowed to do. Since we have concluded that software has no self to act socially, it must allocate all system rights to actors that represent people. If an access control system were to itself allocate rights, it would have to respond to an access request from itself which is circular. This gives the first access control rule:

    Rule 1. All rights to access control entities must be allocated to actors at all times.

All entity rights must be allocated to an actor, so when, say, a new post is added, all rights to it must be defined immediately, although they can later be re-allocated. When a person posts a comment on a bulletin board, the access control system must know right away who can edit and delete it. It is not possible to leave rights undefined, as that would require the access control software itself to decide about a right should it be invoked.
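A sketch of Rule 1 at the moment of creation, with hypothetical names: when an entity such as a post is added, all of its rights are allocated to an actor in the same step, so nothing is ever left undefined.

```python
# Illustrative sketch of Rule 1: rights are allocated the instant an entity exists.
def create_entity(rights, creator, entity, operations):
    """Create an entity and allocate all of its rights to the creating actor."""
    rights[(creator, entity)] = set(operations)   # no right is left undefined
    return entity

rights = {}
create_entity(rights, "alice", "post-42", {"view", "edit", "delete"})
```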

Corollary: The system administrator initially has all rights to the system.

In a single-owner system, the administrator has all rights to it. For example, the administrator of a smartphone is its owner, who can uninstall any app. So when the owner installs an app, the software needs their permission to do things like download updates to itself. Current apps deny this right by taking opt-in privileges, which puts the onus on the owner to, say, turn off update downloads for every app while roaming to avoid a massive bill. An access control system that applies Rule 1 turns this around, putting the onus on the app to ask permission before the first update download. The phone owner could then set it to happen automatically, or turn off updating entirely for apps they never use.

In an online community, the administrator allocates rights to others. To develop a community, the administrator must delegate rights to others in the system. Doing this legitimately makes them a leader not a tyrant, e.g. to censor a photo in a collection delegated to another actor, the administrator can’t just delete it. The access control system requires them to first take back ownership of the collection before letting them reject the photo in it. To then put things back as they were, they have to offer ownership of the photo collection back to its original owner, who may choose not to accept it, as they may be offended at being overruled. Legitimate access control lets an administrator unravel a social structure to do what they want, but not force others to do their bidding, as will be seen.


 

6.3 The Basic Model

Expressing social rights in access control terms allows access control rules to be based on social rules developed over thousands of years in physical society, and so they should be compatible with current declarations of human rights. A right in information terms is an actor (A) applying an operation (O) to an entity (E):

Right = (Actor, Entity, Operation) = (A, E, O)

Social rights are (Actor, Entity, Operation) triplets, where an actor is a data entity that represents an accountable person or group, an entity is any meaningful information record, and an operation is any software method that applies to that entity. An information right can then be stored as a permission. Since an actor is also an entity, an actor can act upon itself, as when a person deletes their membership of say Facebook. This equates to the fact that in the physical world, a person can commit suicide, i.e. “delete” themselves.
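The triplet form suggests a direct representation. A sketch, with illustrative names, in which each stored permission is an (Actor, Entity, Operation) record and a request is permitted only if a matching permission exists; since an actor is also an entity, the same structure lets an actor act on itself.

```python
# Illustrative sketch of the basic model: rights as (Actor, Entity, Operation) triplets.
from dataclasses import dataclass

@dataclass(frozen=True)
class Permission:
    actor: str      # persona, group or role allowed to act
    entity: str     # the information record acted upon
    operation: str  # the program method applied, e.g. "edit"

permissions: set[Permission] = set()

def grant(actor: str, entity: str, operation: str) -> None:
    permissions.add(Permission(actor, entity, operation))

def is_permitted(actor: str, entity: str, operation: str) -> bool:
    return Permission(actor, entity, operation) in permissions

grant("alice", "alice", "delete")         # an actor is also an entity,
is_permitted("alice", "alice", "delete")  # so it can act upon itself
```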

Online social interactions involve entities and operations as follows:

1)   Entities. An entity is any meaningful information record.

     a)   Actor Persona. A data entity representing an offline person or group, e.g. a company registered on Facebook.

     b)   Object. A data entity that conveys information and meaning.

          i.   Item. A simple object with no dependents, e.g. a bulletin board post.

          ii.  Space. A complex object with dependents, e.g. a bulletin board thread.

     c)   Right. A system permission for an actor to operate on an entity.

          i.   Simple right. A right to operate on object or actor entities.

          ii.  Meta-right. A right to operate on a right, e.g. the right to delegate a right.

          iii. Role. A right given to a set of actors, e.g. a friend set.

2)   Operations. Program methods that operate upon entities.

     a)   Null operations do not change the target entity, e.g. view, enter a space.

     b)   Use operations change the entity meaning in some way, e.g. edit, delete.

     c)   Communication operations transfer data from sender(s) to receiver(s), e.g. email.

     d)   Social operations change a right, e.g. delegate.

The following access control rules apply to actors operating on information entities. Some conclusions may seem obvious, but recall that to software nothing is obvious and everything must be specified. The goal is to outline an access control framework that allows software designers to implement socially valid rules.


 

6.2 Access Control

Access control grants online permissions, i.e. rights. In computing, decision support systems recommend decisions, access control systems permit them, and control systems implement them. Access control began with multi-user systems, as when people sharing the same resources came into conflict it was necessary to define who could do what to what (Karp et al., 2009). Since this is exactly what a right is, access control is the way to implement rights online.

Traditional access control systems used a subject-object access permission matrix to allocate rights (Lampson, 1969). As computing evolved, this was extended to distributed systems and to roles for many-person systems. The matrix approach has worked in military (Department of Defense, 1985), commercial (Clark & Wilson, 1987), organizational (Ferraiolo & Kuhn, 2004), distributed (Freudenthal et al., 2002), peer-to-peer (Cohen, 2003) and grid environment (Thompson et al., 1999) cases. In most cases, the permission matrix was centrally controlled by an administrator.
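For concreteness, a toy version of such a subject-object matrix (illustrative data only): each cell holds the operations a subject may perform on an object, and the administrator edits the cells centrally.

```python
# Toy Lampson-style access matrix: cells hold the permitted operations.
access_matrix = {
    ("admin", "report.doc"): {"read", "write", "execute"},
    ("carol", "report.doc"): {"read"},
}

def check(subject: str, obj: str, operation: str) -> bool:
    """Permit the request only if the matrix cell contains the operation."""
    return operation in access_matrix.get((subject, obj), set())
```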

In social networks, access control is more about access than control. For friend interactions, the permission matrix grows geometrically with group size, so for hundreds of millions of people the possible combinations are astronomical. Social networks vastly increase access complexity, as millions of people want rights to billions of items. A person may add thousands of photos and comments a year, and they want to control them in a way that was previously reserved for system administrators. Giving social networkers direct local control of their resources is not feasible by allocating read, write and execute permissions from a central authority (Ahmad & Whitworth, 2011). Each person essentially wants to define their own social structure (Sanders & McCormick, 1993), e.g. to restrict a photo to family or friends. Social networks were the perfect storm for the traditional ship of access control.

Current social networks allocate rights inconsistently. Making every network person in effect the administrator of their own local domain wasn’t easy, so different systems did it differently. With no guiding principles, the rules of online social interaction were based on designer intuitions rather than formal models. Hence they vary between systems and over time, with public outrage the only check, e.g. Facebook redefining its privacy options after the Cambridge Analytica scandal is only one of a long list of design changes driven by a social failure. Since there is still no agreed scheme for allocating rights to create, edit, delete or view online entities, let alone manage roles, there is a need for access control rules that are legitimate, efficient, consistent and understandable. We need a universal description of online rights.


6.1 What are rights?

Communities grant rights to citizens as social permissions to act. I “own” my car because society has given me the permission to own it, so if someone steals it the police try to return it to me and punish the thief. A community that agrees on who owns what improves order and synergy. Rights move conflicts from the physical to the legal level, so people still argue who should own what but no-one is killed or injured in the process. Laws operate on the information level, but on the personal level people just trust each other, and on the community level people just follow norms.

Rights do not mechanize interactions, as they are choices not obligations, e.g. the right to sue does not force us to sue and owning a thing does not stop us from giving it away. Rights define what actors can do, not what they must do.

People are actors not “users”. Traditional computing calls people users as if software were a drug, when actually people choose software and can switch if they want, so they are really actors. In data flow diagrams, people are always sources and sinks outside the information flows. To call people users when online makes them software accessories, which they are not. Since this mindset causes bad design habits, it should be abandoned. Just as we encourage sales assistants to see a customer not a sale, we encourage designers to see people online as actors not accessories.

Actors can act independently of input. Actors don’t just react, but initiate acts based on their autonomy. The word autonomy, from the Greek autos meaning self and nomos meaning law, means to follow one’s own laws. As the story goes: light a fire under a table and it burns, light a fire under a dog and it runs away, light a fire under a man and he cooks his dinner. A program entirely defined by its input has no autonomy and so is not an actor, although it can be an agent.

Actors have a “self” that can be held to account for its actions. To hold citizens to account for their actions is the basis of all human society. Community justice evolved from revenge where people personally held others to account, so revenge occurs when community justice fails to work. Accountability assumes choice at some point, so a drunk can be fined because he earlier chose to drink. Even a drug addict who cannot stop now is accountable if once he could. On the other hand, if the state decides that a person is no longer accountable for their actions, it can put them in prison or care. Software is not accountable for its effects as it has no self.

Communities assume that people are accountable and govern accordingly. While philosophers argue over free will, all communities hold people to account for the effects of their acts on others, so criminals are punished and the mentally incompetent are put into care. Accountability is the social principle without which communities fail, but it only applies to people, e.g. in car accidents the driver is accountable not the car. Likewise, the company that writes an installation program is accountable for what it does, not the software itself. In colonial times, it was decided that corporations are for legal purposes “persons” and so can enter into contracts. This creates problems as a company is merely an information entity with no accountable self, e.g. punishing a scam company by “killing” it, or declaring it bankrupt, just lets its owners start another company to do the same again, as they do. Corporate personhood is the social error that allows the wealthy to deflect responsibility for their actions onto an entity with no self to care.

Rights formally express social concepts like fairness as action rules. The law is the sum total of these rules, and in traditional justice those who break the law are caught by police and punished by the courts. As societies grew bigger they needed laws to support synergy, and online communities need them for the same reason. However, on the Internet, Code is Law, as it can be police, judge and jury. Therefore we must reinvent justice in software terms, to develop what Berners-Lee calls an Internet Bill of Rights. This chapter aims to express social rights as access control rules that designers can use in social-technical systems.


6. A Model of Online Rights

Communities encourage synergy by granting citizens legitimate rights, permissions based on social inventions like freedom, fairness, privacy and transparency. Likewise synergies such as online trade require a civilized Internet that respects social rights. This chapter outlines a model of universal rights that work both online and off:

6.1 What are Rights?

6.2 Access Control   

6.3 The Basic Model

6.4 The Information System   

6.5 Actor Persona   

6.6 Objects   

6.7 Operations   

6.8 Roles   

6.9 Ownership

6.10 The Act of Creation   

6.11 The Act of Display   

6.12 Re-allocating Rights   

6.13 Implementation   

6.14 Universal Rights

Discussion Questions   

References   


Chapter 5. Discussion Questions

Research the questions from the list below and give your answer, with reasons and examples. If you are reading this chapter as part of a class – either at a university or in a commercial course – work in pairs then report back to the class.

1.   What is social synergy? How do communities encourage synergy? How do they prevent its destruction? How do trust and synergy relate? Give physical and electronic examples.

2.   Give five examples of defections in ordinary life. What happens to a community if everyone defects? Give five online examples of defections, and for two specify how technology lowers the defection rate.

3.   Would you prefer to be a middle-class citizen now or a lord three hundred years ago? Consider factors like diet, health, clothes, leisure, travel, etc. Where did the lord’s wealth mainly come from? Where does the power of your salary to buy many things come from today? How does the principle apply online?

4.   What is a social dilemma? Give three physical examples from your experience. Why cannot individuals solve them? How are they solved? Give three online social dilemmas. How can they be solved by socio-technical design?

5.   What happens if one suggests things in a group? Conversely, what happens if no-one in a group suggests anything? How can groups manage this dilemma? Answer the same questions for volunteering. Give examples in both cases from both offline and online.

6.   What percentage of online actors are lurkers who look but do not post? Go to a popular board you have not used before. What stops you contributing? Add something anyway. How could the board increase participation?

7.   Explain the statement: Personal ethics is community pragmatics. How does ethics affect performance in online systems?

8.   Analyze the case of a thief who steals a wallet and is not caught. List the thief gains and the victim losses to get the net community result. What if everyone in a community steals? Generalize to the online case where spam “steals” a few seconds of your time. How does this differ from an offline theft?

9.   Why is synergy important for larger communities and especially important for social-technical systems? How can technology help increase synergy? Report the current estimated sizes of popular social-technical systems. Clarify what is exchanged, who interacts, the synergy and the defections.

10.   Look at the objects you use every day. How many could you make? How many are even produced in your country? How hard would it be for you to make them? Compare the cost it would take you to make say a simple table with how much you pay for it. Relate this to social synergy.

11.   Discuss whether people are rational actors acting in natural self-interest. Give physical examples of common acts that are irrational, i.e. done knowing they will cause suffering, or knowing that they will not give any benefit. In general, when are people irrational? How does this affect social-technical design?

12.   Describe the volunteer dilemma. Is it that people will not volunteer or that they will? Look at the extreme case of people volunteering to go to war. Why did many Japanese and those in other armies volunteer for kamikaze missions? Explain this in terms of Rule 2. How important is volunteering in online communities? How can technology support it?

13.   Describe how online babysitting exchange cooperatives work. Based on a game theory matrix, how would one act if one followed Rule 1? How about Rule 2? How do people actually act? How does the technology affect this?

14.   Social networks like Facebook are about friends but what is a friend? Is anyone who helps you a friend? Define a friend in synergy terms. Does friendship imply trust? Is it a one-way or two-way thing? If you “friend” someone on Facebook, are they really a friend? Explain the difference referring to information and human levels.

15.   Consider how one person cheating causes others to cheat, e.g. in sports. Draw a diagram to show how social defections cumulate as each one reduces the probability that cooperation will give synergy benefits. How do social defenses alter these percentages? Use an online case to illustrate.

16.   What is social order? Explain its strengths and weaknesses, with examples. Is social order possible on the Internet? Discuss the success or not of attempts by countries like China and Iran to control the use of the Internet by their citizens. Mention proxy software designed to thwart that control.

17.   What is social hijack? Are all dictators social hijackers? Give physical examples of past communities ruled against their will that eventually rebelled. Can the same occur in online communities? How does STS design affect this?

18.   Can a social system exist if it is not a physical “thing”? If so, are new social designs like democracy social inventions? What then was invented? Make a list of the social inventions of physical society over the last two thousand years. How does social-technology add to that list?

19.   In the middle ages, whether in China, England or Russia, democracy was not only unthinkable but also impossible. Why? What changed in the last thousand years to make it possible? How is the same change factor allowing new social designs to develop on the Internet? (Hint: Consider social health)

20.   Describe some well-known peasant revolts of the past that were successfully put down. If the Arab Spring is the same, but based on modern social-technology, why is it harder to put down? Discuss how the information revolution changed how Arab states have been governed for centuries.

21.   Describe the communism vs. capitalism ideological battle that dominated the last century in Rule 1 and 2 terms. What is the result today, i.e. which social design won? Is China purely communist? Is America purely capitalist? How would you describe the successful social designs of this century?

22.   What exactly is democracy? Is the Democratic People’s Republic of Korea democratic? Were the Greeks of Athens democratic? Is the USA democratic? Are there any online democracies? Is Wikipedia a democracy? Describe online systems that have at least some democratic features.

23.   The Enron collapse lost millions but the 2007–2012 credit meltdown lost billions. Explain how one was an ethical failure and the other a competence failure. What happened to the perpetrators in both cases? Can such things happen online? Describe what would happen to, say, Wikipedia if it became (a) corrupt or (b) incompetent. Can social-technology reduce the likelihood of such “crashes”?

24.   Do the various golden rules of history follow Rule 1 or Rule 2? How do these golden rules, which began thousands of years ago, affect the design of information technology today?

25.   What is social transparency? Is it making everything visible to all? How does it relate to privacy? How can social-technical systems support both transparency and privacy?

26.   If an ant in an ant colony serves the community, are ants ethically good? Are people forced to serve a community also good? Explain why without freedom no goodness can be. Relate to the issue of whether smart software should take over human choice or whether software should always leave the ethical choices to people.
