Chapter 6. References

Ahmad, A. & Whitworth, B. (2011). Distributed Access Control for Social Networks. In Proceedings of the 2011 International Conference on Information Assurance and Security (IAS).

Benkler, Y. (2002). “Coase’s Penguin, or, Linux and ‘The Nature of the Firm’.” The Yale Law Journal 112 (3), pp. 369–446.

Clark, D. D., & Wilson, D. R. (1987). “A Comparison of Commercial and Military Computer Security Policies,” IEEE 1987 Symposium on Security and Privacy, pp. 184–194.

Cohen, B. (2003). Incentives build robustness in BitTorrent. Retrieved from http://www.ittc.ku.edu/~niehaus/classes/750-s06/documents/BT-description.pdf

Ferraiolo, D. & Kuhn, D. R. (1992). Role-Based Access Control. In Proceedings of the NIST-NSA National (USA) 1992 Computer Security Conference, pp. 554–563.

Freeden, M. (1991). Rights (Concepts in the Social Sciences). Buckingham: Open University Press.

Freudenthal, E., Pesin, T., Port, L., Keenan, E., & Karamcheti, V. (2002). dRBAC: Distributed role-based access control for dynamic coalition environments. In Proceedings of the 22nd International Conference on Distributed Computing Systems (ICDCS 2002). Vienna, Austria.

Geen, R. G., & Gange, J.J. (1983). Social facilitation: Drive theory and beyond. In H. H. Blumberg, A. P. Hare, V. Kent, and M. Davis, Eds. Small Groups and Social Interaction Volume 1. (pp. 141–153). Oxford: Wiley-Blackwell.

Karp, A. H., Haury, H., & Davis, M. H. (2009). From ABAC to ZBAC: The Evolution of access control model. Technical report HPL-2009-30, HP Labs.

Lampson, B. W. (1969). “Dynamic Protection Structures,” AFIPS Conference Proceedings, 35, pp. 27–38.

Locke, J. (1690/1963). An essay concerning the true original extent and end of civil government: Second of ‘Two Treatises on Government’. In J. Somerville & R. E. Santoni (Eds.), Social and Political Philosophy (pp. 169–204). New York: Anchor.

Rawls, J. (2001). Justice as Fairness. Cambridge, MA: Harvard University Press.

Rose, E. (2001) “Balancing internet marketing needs with consumer concerns: A property rights framework,” Computers and Society, 31 (1), pp. 17–21.

Sanders, M. S., & McCormick, E. J. (1993). Human Factors in Engineering and Design. New York: McGraw-Hill.

TCSEC (1985). Trusted Computer System Evaluation Criteria (TCSEC), DOD 5200.28-STD. Department of Defense.

Thompson, M., Johnston, W., Mudumbai, S., Hoo, G., Jackson, K., & Essiari, A. (1999). Certificate-based access control for widely distributed resources. In Proceedings of the Eighth Usenix Security Symposium, pp. 215–228.

Whitworth, B., & Bieber, M. (2002). Legitimate Navigation Links. In Proceedings of ACM Hypertext 2002, Demonstrations and Posters, University of Maryland, pp. 26–27.

Whitworth, B. & Liu, T. (2008). Politeness as a Social Computing Requirement. In R. Luppicini, (Ed.). Handbook of Conversation Design for Instructional Applications (pp. 419–436). Hershey PA: IGI.

Whitworth, B. & Liu, T. (2009). “Channel email: Evaluating social communication efficiency,” IEEE Computer, 42(7), pp. 63–72.

Whitworth, B. & Whitworth, A. (2010). “The Social Environment Model: Small Heroes and the Evolution of Human Society,” First Monday, 15 (11).


Chapter 6. Discussion Questions

Research questions from the list below and give your answers, with reasons and examples. If you are reading this chapter as part of a class – either at a university or in a commercial course – work in pairs, then report back to the class.

1)   What is access control? What types of computer systems use it? Which do not? How does it traditionally work? What is the social network challenge and how has access control responded?

2)   What is a right in human terms? Is it a directive? How can rights be represented as information? Give examples. What is a transmitted right called? Give examples.

3)   What is the difference between a user and an actor? Contrast user goals and actor goals. Why are actors necessary for online community evolution?

4)   Is a person always a citizen? How do communities hold citizens to account? If a car runs over a dog, is the car accountable? Why is the driver accountable? If online software cheats a person, is the software accountable? If not, who is? If automated bidding programs crash the stock market and millions lose their jobs, who is accountable? Can we blame technology for this?

5)   Contrast an entity and an operation. What is a social entity? Is an online persona a person? How is a persona activated? Is this like “possessing” an online body? Is the persona “really” you? If a program activated your persona, would it be an online zombie?

6)   What online programs greet you by name? Do you like that? If an online banking web site welcomes you by name each time, does that improve your relationship with it? Can such a web site be a friend?

7)   Compare how many hours a day you interact with people via technology vs. the time spent interacting with programs alone? Be honest. Can any of the latter be called conversations? Give an example. Are any online programs your friend? Try out an online computer conversation, e.g. with Siri, the iPhone app. Ask it to be your personal friend and report the conversation. Would you like a personal AI friend?

8)   Must all rights be allocated? Explain why. What manages online rights? Are AI programs accountable for rights allocated to them? In the USS Vincennes tragedy, a computer program shot down an Iranian civilian airliner. Why was it not held to account? What actually happened and what changed afterwards?

9)   Who should own a persona and why? For three social technical systems, create a new persona, use it to communicate, then try to edit it and delete it. Compare what properties you can and cannot change. If you delete it entirely, what remains? Can you resurrect it?

10)   Describe two ways to join an online community and give examples. Which is easier? More secure?

11)   Describe, with examples, current technical responses to the social problems of persona abandonment, transfer, sharing and orphaning. What do you recommend in each case?

12)   Why is choice over displaying oneself to others important for social beings? What is the right to control this called? Who has the right to display your name in a telephone listing? Who has the right to remove it? Does the same apply to an online registry listing? Investigate three online cases and report what they do.

13)   How do information entities differ from objects? How do spaces differ from items? What is the object hierarchy and how does it arise? What is the first space? What operations apply to spaces but not items? What operations apply to items but not spaces? Can an item become a space? Can a space become an item? Give examples.

14)   How do comments differ from messages? Define the right to comment as an access triad. If a comment becomes a space, what is it called? Demonstrate with three commenting social technical systems. Describe how systems with “deep” commenting (comments on comments, etc.) work. Look at who adds the depth. Compare such systems to chats and blogs – what is the main difference?

15)   For each operation set below, explain the differences, give examples, and add a variant to each set:

a.   Delete: Delete, undelete, destroy.

b.   Edit: Edit, append, version, revert.

c.   Create: Create.

Define a fourth operation set.

16)   Is viewing an object an act upon it? Is viewing a person an act upon them? How is viewing a social act? Can viewing an online object be a social act? Why is viewing necessary for social accountability?

17)   What is communication? Is a transfer like a download a communication? Why does social communication require mutual consent? What happens if it is not mutual? How does opening a channel differ from sending a message? Describe online systems that enable channel control.

18)   Answer the following for three different but well known communication systems: Can a sender be anonymous to a receiver? Can a receiver be anonymous to a sender? Can senders or receivers be anonymous to moderators? Can senders or receivers be anonymous to the transmission system?

19)   Answer the following for a landline phone, mobile phone and Skype: How does the communication request manifest? What information does a receiver get and what choices do they have? What happens to anonymous senders? How does one create an address list? What else is different?

20)   What is a role? Can it be empty or null? Give role examples from three popular social technical systems. For each, give the access control triad, stating what values vary. What other values could vary? Use this to suggest new useful roles for those systems.

21)   How can roles, by definition, vary? For three different social technical systems, describe how each role variation type might work. Give three different examples of implemented roles and suggest three future developments.

22)   If you unfriend a person, should they be informed? Test and report what actually happens on three common social networks. Must a banned bulletin board “flamer” be notified? What about someone kicked out of a chat room? What is the general principle here?

23)   What is a meta-right? Give physical and online examples. How does it differ from other rights? Is it still a right? Can an access control system act on meta-rights? Are there meta-meta-rights? If not, why not? What then does it mean to “own” an entity?

24)   Why can creating an item not be an act on that item? Why can it not be an act on nothing? What then is it an act upon? Illustrate with online examples.

25)   Who owns a newly created information entity? By what social principle? Must this always be so? Find online cases where you create a thing online but do not fully own it.

26)   In a space, who, initially, has the right to create in it? How can others create in that space? What are creation conditions? What is their justification?

27)   Find online examples of creation conditions that limit the object type, operations allowed, access, visibility and restrict edit rights. How obvious are the conditions to those creating the objects?

28)   Give three examples of creating an entity in a space. For each, specify the owner, parent, ancestors, offspring and local public. Which role(s) can the owner change?

29)   For five different social technical systems genres, demonstrate online creation conditions by creating something in each. How obvious were the creation conditions? Find examples of non-obvious conditions.

30)   For the following, explain why or why not. Suppose you are the chair of a computer conference with several tracks. Should a track chair be able to exclude you, or hide a paper from your view? Should you be able to delete a paper from their track? What about their seeing papers in other tracks? Should a track chair be able to move a paper submitted to their track by error to another track? Investigate and report comments you find on online systems that manage academic conferences.

31)   An online community has put an issue to a member vote. Discuss the social effect of these options:

a)   Voters can see how others voted, by name, before they vote.

b)   Voters can see the vote average before they vote.

c)   Voters can only see the vote average after they vote, but before all voting is over.

d)   Voters can only see the vote average after all the voting is over.

32)   An online community has put an issue to a member vote. Discuss the effect of these options:

a)   Voters are not registered, so one person can vote many times.

b)   Voters are registered, but can change their one vote any time.

c)   Voters are registered, and can only vote once, with no edits.

Which option might you use and when?

33)   Can the person calling a vote legitimately define vote conditions? What happens if they set conditions such as that all votes must be signed and will be made public?

34)   Is posting a video online like posting a notice in a local shop window? Explain, covering permission to post, to display, to withdraw and to delete. Can a post be deleted? Can it be rejected? Explain the difference. Give online examples.

35)   Give physical and online examples of rights re-allocations based on rights and meta-rights. If four authors publish a paper online, list the ownership options. Discuss how each might work out in practice. Which would you prefer and why?

36)   Should delegating give the right to delegate? Explain, with physical and online examples. What happens to ownership and accountability if delegatees can delegate? Discuss a worst case scenario.

37)   If a property is left to you in a will, can you refuse to own it, or is it automatically yours? What rights cannot be allocated without consent? What can? Which of these rights can be freely allocated: Paper author. Paper co-author. Track chair. Being friended. Being banned. Bulletin board member. Logon ID. Bulletin board moderator. Online Christmas card access? Which rights allocations require receiver consent?

38)   Investigate how social network connections multiply. For you and five others find out the number of online friends and calculate the average. Based on this, estimate the average friends of friends in general. Estimate the messages, mails, notifications etc. you get from all your friends per week, and from that calculate an average per friend per day. If you friended all your friend’s friends, potentially, how many messages could you expect per day? What if you friended your friend’s friend’s friends too? Why is the number so large? Discuss the claim of the film Six Degrees of Separation, that everyone in the world is just six links away from everyone else.

39)   Demonstrate how to “unfriend” a person in three social networks. Are they notified? Is unfriending “breaking up”? That an “anti-friend” is an enemy suggests “anti-Facebook” sites. Investigate technology support for people you hate, e.g. celebrities or a relationship ex. Try anti-organization sites, like sickfacebook.com. What purpose could technology support for anti-friendship serve?


6.14 Universal Rights

Legitimate access control is essential to the trust needed for online social interaction. The following rules suggest legitimate universal rights that can be implemented by any social technology designer, to civilize the Internet from its current Wild West status by code not guns. Others are invited to support this effort, whether by critique, development or use.

What part of “No” don’t you understand?

   Accountability Rule: All rights to access control entities must be allocated to actors at all times.

Software has no right to act of its own accord because it is not accountable. Just as it would be taking a liberty for people delivering a new TV to re-organize your lounge, so it is a liberty for a newly installed browser to change your browser defaults. In the law, might is not right and “because I can” is not a reason. Modern smartphones allow other liberties, like apps that upload address book contact lists to use for their own purposes, see here. In December 2011, it was found that Carrier IQ software was recording and uploading keystrokes made, phone numbers dialed and texts sent on 140 million smartphones. An operating system with access control that followed the accountability rule would not allow this, e.g. the Android platform requires apps to ask before accessing the owner’s data, instead of letting them just steal it.
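The accountability rule can be sketched as a simple invariant: a right is only valid if an accountable actor holds it, and software gets access only by explicit delegation. The following is a minimal illustrative sketch; the class and method names (Persona, Entity, grant, can_access) are my own, not from any actual access control system.

```python
# Minimal sketch of the accountability rule: every right must be held by
# an accountable actor (a persona backed by a person), never by software
# acting of its own accord. All names here are illustrative.

class Persona:
    def __init__(self, name, is_person):
        self.name = name
        self.is_person = is_person   # software agents are not accountable

class Entity:
    def __init__(self, name, owner):
        assert owner.is_person       # rights trace back to an accountable actor
        self.name = name
        self.owner = owner
        self.delegated = set()       # personas the owner has knowingly granted

    def grant(self, granter, persona):
        # only the owner can delegate a right, and must do so explicitly
        if granter is self.owner:
            self.delegated.add(persona)

    def can_access(self, persona):
        return persona is self.owner or persona in self.delegated

alice = Persona("Alice", is_person=True)
app = Persona("ContactsApp", is_person=False)
contacts = Entity("address book", owner=alice)

print(contacts.can_access(app))   # False: the app never asked
contacts.grant(alice, app)        # the owner consents, as Android requires
print(contacts.can_access(app))   # True: the right was allocated by an actor
```

The point of the sketch is that the app holds no rights of its own; every access it makes traces back to an explicit grant by an accountable person.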

Freedom is owning yourself

   Freedom Rule: A persona always holds all rights to itself.

In 1993, an avatar called Mr. Bungle, controlled by NYU undergraduates in an online game called LambdaMOO, gained the power of “voodoo”, the ability to control other players. They then used it to virtually “rape” several female characters, making them respond as if they enjoyed it (Dibbell, 1993). No physical law was broken as there was no physical contact, and so no legal rape, but the LambdaMOO community was so outraged that one of the “wizards” unilaterally deleted the Mr. Bungle character. After much discussion, the system was altered to make “voodoo” an illegal power. The social requirement that one should control oneself carried over into cyberspace. Likewise, when a person dies, their online persona should be deactivated as it is no longer an actor. Facebook took a while to realize that when a person dies their family doesn’t want out-of-control software agents sending jovial reminders to wish them a Happy Birthday, as here. The social logic again prevailed. Now one can memorialize an account so it can be viewed but not logged into or changed. I should not have to die to memorialize a site I own, but should be able to do it at any time, knowing it is irreversible.

The right not to be watched online

      Privacy Corollary. A persona always holds the right to display itself.

Privacy is not secrecy but the right to display oneself, so one may choose to be public. Current software tends to ignore privacy until challenged, e.g. Facebook only changed the default for new accounts from public to friends-only when its privacy practices came under scrutiny. Yet this gesture still ignores the social rule that all display is up to the person, not Facebook. How hard is it, when setting up an account, to ask whether it should be displayed to:

       Nobody (Default)

       Everybody

       Friends Only

Privacy is not hard. People have the choice to display personal data because they own themselves.

Online spaces, like physical spaces, contain things

Containment Rule: Every entity is dependent upon a parent space, up to the system space.

That everything exists in a parent space allows online spaces within spaces. For example, suppose Attila, a discussion forum owner, finds a post by Luke, an online contributor, to be offensive, but Luke disagrees. What can happen? More importantly, what should happen? Is Luke free to say what he wants? Can Attila simply delete the item because it is his forum? Can he edit the item to remove the offensive part? Can he ban Luke from entering the forum? Can he remove Luke from the system? Can he alter Luke’s name on the post to “Gross-Luke” until he learns a lesson? Can Luke fix the post and resubmit it? Since such situations arise online every day, it’s time to set some standards. The access control rules outlined here suggest that:

1.    Luke owns the “offensive” post, so Attila cannot delete or edit it.
2.    Luke owns his own persona, so Attila cannot change the author name to “Gross-Luke”.
3.    Attila owns the forum, so he can reject the display of Luke’s post.
4.    Luke can edit his post to be reconsidered for display by Attila.
5.    Attila can ban Luke from his forum.
6.    Attila can moderate Luke’s further posts.
7.    Attila can ask the administrator to remove Luke and all his posts.
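How software might enforce this separation of content rights from display rights can be sketched in a few lines. This is a hypothetical illustration under the rules above; the names and methods are my own invention, not an actual system.

```python
# Illustrative sketch: content rights belong to the item's owner, while
# display rights belong to the parent-space owner, so Attila and Luke
# each control only their own part of the conflict.

class Persona:
    def __init__(self, name):
        self.name = name

class Entity:
    def __init__(self, name, owner, parent=None):
        self.name = name
        self.owner = owner
        self.parent = parent      # containment: every entity sits in a space
        self.displayed = True

def can_edit(actor, entity):
    # only the owner may edit or delete content
    return actor is entity.owner

def can_reject(actor, entity):
    # the space owner controls display in that space, not the content itself
    return entity.parent is not None and actor is entity.parent.owner

attila, luke = Persona("Attila"), Persona("Luke")
forum = Entity("forum", owner=attila)
post = Entity("offensive post", owner=luke, parent=forum)

print(can_edit(attila, post))     # False: Luke owns the post
print(can_reject(attila, post))   # True: Attila can withdraw display
if can_reject(attila, post):
    post.displayed = False        # rejected, but Luke can still edit and resubmit
```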

To reject a post is to not display it

Balancing legitimate rights allows a better interaction. If Attila just deleted the post, Luke might assume a system error and resubmit it, unaware it had been rejected. If Attila edits the post, others may lose confidence in the forum, even more so if Attila changes Luke’s name, i.e. personally attacks him. But Attila can refuse to display Luke’s post, which may warn others. If Luke sees his post rejected for display, he can edit it and ask again to display it in Attila’s space. He also knows that submitting more such posts risks removal from the system entirely. The social balance is that if Attila controls his forum too much it will “die”, while if Luke is a troll he will be excluded.

What if Luke’s “offensive” post is under a higher board run by Genghis, who does not find the post offensive? If both Attila and Genghis could take over Luke’s post by editing it, one could get an “edit war” between them. Luke could appeal to Genghis to “depose” Attila, to take back his delegated control and give it to a more tolerant person, or even to refuse to display Attila’s board entirely. While the code answer to social issues is “whatever you want”, the social answer is legitimate rights that increase trust.

Parent space can watch you play

Offspring Rule. The owners of any ancestor spaces have the right to view any offspring created.

One can create a private video on YouTube that no one else can see, but “no one” does not include YouTube administrators. Likewise, no space owner can ban the system administrator, nor can the owner of a discussion thread ban the owner of the forum in which it is contained. In the above example, Attila cannot ban Genghis from looking at Luke’s post.
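In code, the offspring rule amounts to walking the containment chain upward: the owner of any ancestor space can view an item, however deeply nested. A minimal sketch, with illustrative names of my own:

```python
# Sketch of the offspring rule: view rights flow down the containment
# chain, so the owner of any ancestor space can always see what is
# created inside it, and cannot be locked out by a lower space owner.

class Persona:
    def __init__(self, name):
        self.name = name

class Entity:
    def __init__(self, name, owner, parent=None):
        self.name = name
        self.owner = owner
        self.parent = parent

def can_view(actor, entity):
    if actor is entity.owner:
        return True
    space = entity.parent
    while space is not None:      # climb toward the system space
        if actor is space.owner:
            return True           # an ancestor owner cannot be banned from viewing
        space = space.parent
    return False

genghis, attila, luke, troll = (Persona(n) for n in
                                ("Genghis", "Attila", "Luke", "Troll"))
board = Entity("board", owner=genghis)
thread = Entity("thread", owner=attila, parent=board)
post = Entity("post", owner=luke, parent=thread)

print(can_view(genghis, post))   # True: Attila cannot hide it from Genghis
print(can_view(troll, post))     # False: not the owner, not an ancestor owner
```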

No spam

Communication Rule: Every communication act requires prior mutual consent.

When systems like email give anyone the right to send a message to anyone the result is spam, or unwelcome messages. In contrast, in Facebook, chat, Skype and Twitter, one needs prior permission to message someone. To “follow” someone is a positive permission to receive their messages. In legitimate communication, a channel must be opened by mutual consent before messages are sent, as detailed in Whitworth and Liu (2009). Giving online communicators channel control protects them against being pestered by spam from unknown people or online agents.
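The channel model described above can be sketched as a receiver-side consent check: sending implies the sender’s consent, but nothing is delivered until the receiver has also opened the channel. A minimal illustration, with invented names (Inbox, accept, deliver):

```python
# Sketch of the communication rule: a message is only delivered over a
# channel the receiver has opened. No mutual consent, no delivery - this
# is what structurally blocks spam.

class Inbox:
    def __init__(self, owner):
        self.owner = owner
        self.open_channels = set()   # senders this receiver has accepted
        self.messages = []

    def accept(self, sender):
        # receiver-side consent opens the channel, like "following" someone
        self.open_channels.add(sender)

    def deliver(self, sender, text):
        if sender in self.open_channels:
            self.messages.append((sender, text))
            return True
        return False                 # dropped: the channel was never opened

bob = Inbox(owner="Bob")
print(bob.deliver("spammer", "Buy now!"))   # False: no mutual consent
bob.accept("alice")                          # Bob opens a channel to Alice
print(bob.deliver("alice", "Hi Bob"))        # True: the channel was open
```

Email fails this test because its protocol has no receiver-consent step, which is why spam filtering must be bolted on afterwards.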

Creative Commons

Public Domain Rule: The full transfer of non-personal information into the public domain is not reversible.

This rule is the basis of the current open access and open source movement. The Creative Commons licenses allow public domain donors to clarify whether what is given includes the right to edit it (No Derivatives means no editing) or sell it (Non-commercial means no selling), and denies them the right to take personal ownership (ShareAlike means not altering the public rights). The idea is that once information is made public, it can never again be private. The synergy of a global information commons that benefits all is the explicit goal.

By invitation discussions are better

Space Entry Rule: A space owner can unilaterally ban or accept persona entry.

The right of the owner of a space to “lock out” those who misbehave is essential, as one “troll” can in effect “kill” an online discussion. It also allows what many see as the future of online interaction, which is invitation spaces. For example, one can set up an open discussion area on a topic, then invite those who add value to an invitation only discussion space. This also addresses the problem of online “bots”, AI software that trawls discussions to add null comments that advocate some sales pitch with no relevance at all to the topic. Defenses like CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) aim to be easy on humans but hard on bots, but focusing on the trolls wastes everyone’s time. Invitation spaces focus on the people who want to discuss a topic, not make money or cause chaos.

Did Facebook let data be stolen?

Ownership Rule: The administrator of an entity can re-allocate its rights and the owner can edit it.

Current society reduces physical conflict by social ownership (Freeden, 1991), which is something software can support and people can understand (Rose, 2001). Clarifying who owns what online reduces conflict, e.g. the Internet Corporation for Assigned Names and Numbers (ICANN) registers who owns Internet names to avoid conflict. Even more complex is the case of a landlord and a tenant, where the landlord has the title and the tenant has the use. In current society, that the landlord owns the letterbox doesn’t give them the right to read a tenant’s mail. Facebook’s Cambridge Analytica problem reflects a confusion over who owns personal data, them or the people who put it there. Does being the landlord of an online space give the right to “photograph” a tenant’s possessions in order to direct ads at them? Facebook can’t forever pretend to respect privacy while denying tenants choice over how their data is used, as people wise up. Its future success in attracting tenants will depend on what sort of a landlord it wants to be. Many people feel powerless on the Internet only because software makes it so.

Who owns a photo you post online?

Creation Rule: The right to create in a space resides inherently with the space owner.

In this model, the space owner holds the right to edit the space, and submitting into a space is an edit of that space. For a physical equivalent, submitting to an online space is like an artist delivering a painting to a gallery, where it is received at the door and hung up by the gallery staff. This does not mean the artist has no rights, as they may lend their painting on the condition it is given back. Or the space may be a workshop where artists can paint as people watch. Or it may be a graffiti area where artists draw on the walls knowing it cannot be taken away. Online offers all these options, as online journals like First Monday let you submit but not take back, online galleries like Google Photos and DeviantArt let you post and take back, while most forums don’t let people edit or delete posts. An interesting case is Photobucket, which in 2017 suddenly made people pay $400/year for sharing what was previously free, so thousands of forums lost the pictures in their posts. The issue was not that they were now charging but that they were holding photos submitted under previous terms of service hostage for a $400 ransom. Imagine a free display gallery locking your photos away until you paid their new charges! They lost business, and in 2018 reduced the charge to $30 hoping to win users back. Companies ignore the social level at their peril.

Posting in the wrong place is common in forums

Innovation Rule: The creator of a new entity immediately gains all rights to it in their namespace.

A common complaint in forums is people posting in the wrong thread. Regular contributors report “newbies” who post a question in a new thread instead of an established one, but often it is just a genuine error the software doesn’t let them fix. And the moderators who can move items often feel like parents picking up after untidy children. But why is changing the parent a restricted function? Why not let the person who posted do that? If people initially owned what they posted in their namespace, they could unsubmit and resubmit it in the right place, i.e. be responsible for their own post. Then when I make a posting mistake, as we all do, I can fix it myself. In general, delegating rights encourages responsibility.

Transparency Rule: People have the right to know in advance what rights they are giving to others.

Social transparency is the right to know when rights are being exchanged. Creative Commons licenses show that giving rights needn’t be as complicated as EULAs suggest, e.g. see here for a summary of terms of service for some online apps. This model clarifies standard questions like “Do I give up administrative ownership?” and “Can I delete my post?”

Visible CCTV cameras for public good are accepted

An interesting case is whether software has the right to secretly record what you do online. The acceptance of CCTV cameras in Britain surprised privacy advocates, but to enter a public area like a park by choice is to implicitly consent to be viewed. Privacy is not violated if there is implied informed consent. People accept CCTV cameras in plain view because the aim is the public good and the recording is known, as is how the information is used. In contrast, people object to being “spied on” without their knowledge, as they want to know when they are being watched. Hence the US Fourth Amendment only allows wiretapping if formally approved by a judge acting on behalf of the community. That one should not violate everyone’s rights to catch some offenders was built into the US constitution by its founders to prevent random monitoring.

Is spying the new Internet norm?

Welcome to Spyworld? Today, technology gives the power to store every phone call, text, e-mail, tweet and online post, every day, forever. That nobody knew the US PRISM system was tapping the entire Internet, including its own citizens, was what Edward Snowden blew the whistle on. That what you post online can at any time come back to haunt you will have the same chilling effect on the Internet as a police state does – it stops the bad guys and everyone else as well. In this model, people grant others the right to view them, so any space monitoring should be declared before entry, so people can consent (or not) before entering the space. Monitoring per se is not wrong but secretly spying on people is. Unless social transparency is reinvented online, the Fourth Amendment benefit will be lost and everyone will use VPNs. If it’s not OK to wiretap a person without a warrant, it’s not OK to wiretap the Internet.

Display Rule. Displaying an entity in a space requires the consent of both its owner and the space owner.

Fake news is against society

Initially, online spaces like YouTube and Facebook considered themselves not responsible for what people posted, but this model makes clear that they are. If a space owner controls what displays in their space, they are responsible for posts against the public good because they can stop them. Hence Zuckerberg told Congress “I agree we are responsible for the content” on Facebook, despite previously claiming it was not so, and Twitter, Google and other online giants will follow suit. Tech companies are not passive platforms on which people put content, hence a UK parliamentary committee concluded there should be “clear legal liability” for tech companies “to act against harmful and illegal content”, with failure to act resulting in criminal proceedings. How can companies that harvest people’s data for advertising then disavow control of, and thus responsibility for, that content? That Facebook already faces a £500,000 fine over the Cambridge Analytica data harvesting suggests more to come, which is why on July 26, 2018 it lost US$120 billion in the biggest one-day drop in US stock-market history. It does not pay to go against society.

Delegation Rule. Delegating a simple right does not give the right to delegate that right.

Absolute monarchs are long gone

Long ago, absolute monarchs had all rights to all things in their kingdom, but those times are long gone. In 1215, Magna Carta’s habeas corpus (Latin “that you have the body”) asserted a person’s right not to be unlawfully detained. Today, states grant people rights to their own property, including “spaces” like their own home. Such rights are not absolute but delegated. Applying these ideas to online society gives each persona freedom and the right to initially own what they create, following Locke. In this model, the system administrator, who is indeed initially the absolute ruler of their kingdom, can choose to delegate as follows:

  • Create all spaces then delegate. The administrator creates all spaces then delegates them directly, to indirectly retain all control, so they can remove any space owner at will. Companies follow this model of control.
  • Create first spaces then delegate. The administrator creates the first spaces then delegates them. This lets those space owners create sub-spaces that they own. The US and its states follow this model.

In this model, delegation is a trade-off between the creative productivity of free people and the absolute control of unwilling slaves. What doesn’t work, as history shows, is to delegate creative freedom then treat people as slaves by taking what they produce. That delegating does not give the right to delegate limits what is given, but does not change the underlying balance. Previously people fought wars over land but today they fight over information. If the Internet becomes a behavior modification tool on a massive scale, whether for profit or politics, then information wars will kill creativity just as physical wars do. Perhaps one day humanity will agree that everyone winning is better than everyone losing.

Allocation Rule: Allocating a right that makes a person responsible for an existing entity requires their consent.

The Internet is a mirror to humanity

Accepting a delegated right is a choice, so one has the right to resign a delegation at any time, i.e. give the right back to its administrator. Giving people freedom not to contribute to what they don’t like allows a talent market, as those who behave badly lose competent support. This model advocates freedom in all its forms as a fundamental Internet principle. People often underestimate the power they have on the Internet to not participate, as to participate in a thing is to enable it. Gandhi’s non-violent “rebellion” in India was essentially about not participating in an illegitimate rule. The Internet is a mirror to humanity as never before. A global community deciding to boycott an online space would empty it of citizens, making it a social failure. Social earthquakes are rare, like their physical counterparts, but when everybody agrees on something, it’s hard to stop.

In conclusion, the Internet is not a new world but the old world of social interaction in a new context. Since those who do not learn from history are doomed to repeat it, it behooves us to recognize in code what is socially good. Others are invited to help develop social standards for software design, lest by ignorance the Internet experience its own dark age.


6.13 Implementation

Traditional access control was based on a security kernel that checked every access request and decided whether to grant or deny it based on a centralized access policy model. The person initiating the request then either saw it executed or got a “permission denied” message. As security was the focus, one decision point handled all requests.

Access control today is about distributing access. Social networks involve millions of people with billions of resources, so central control creates a bottleneck that slows the system down. In addition, people interacting on social media want local ownership for social reasons, as without this a community cannot develop. The focus has changed from security, or denying illegitimate access, to synergy, or allowing legitimate access. The aim is positive rather than negative. It requires the distribution of access control to local roles under the control of the people who create the resources. The access control rules outlined here aim to support the legitimate allocation of rights in an online social system. If distributed certificates of access are stored in the stakeholder’s namespace, only he or she can access and modify them (Figure 6.4).

Figure 6.4: Distributed access control model architecture
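The distributed idea above can be sketched in code. This is a minimal illustration, not the chapter’s actual implementation: all class and method names (`Certificate`, `Namespace`, `grant`, `allows`) are assumptions. Each persona keeps its own access certificates in its namespace, so an access check is local to the stakeholder rather than routed through a central kernel.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Certificate:
    """One distributed access certificate: (holder, entity, operation)."""
    holder: str      # persona the right belongs to
    entity: str      # resource acted upon
    operation: str   # e.g. "view", "edit", "delete"

@dataclass
class Namespace:
    """A stakeholder's namespace; only its owner holds its certificates."""
    owner: str
    certificates: set = field(default_factory=set)

    def grant(self, entity, operation):
        self.certificates.add(Certificate(self.owner, entity, operation))

    def allows(self, entity, operation):
        # The decision is made locally, against this namespace only.
        return Certificate(self.owner, entity, operation) in self.certificates

alice = Namespace("alice")
alice.grant("photo42", "view")
assert alice.allows("photo42", "view")
assert not alice.allows("photo42", "edit")
```

The design point is that there is no global policy table: revoking or adding a right touches only one stakeholder’s certificate set, so millions of personas never contend for one decision point.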


 

6.12 Re-allocating Rights

Dynamic social systems re-allocate rights. Re-allocating rights lets an online social system evolve from an initial state of one administrator with all rights, to a community sharing rights. Rights to an entity can be re-allocated, as follows:

1. Transfer. Irrevocably re-allocates a right to another including the right to allocate that right, e.g. selling a house.

2. Delegate. Re-allocates a right but not its meta-right, so the allocation can be reversed, e.g. renting a house.

3. Include. Includes another in a right so they must jointly agree to exercise it, e.g. couples who jointly own a house must both agree to sell it.

4. Share. Shares a right with another so that both can exercise it as if they owned it exclusively, e.g. couples who severally own a bank account can each take out all the money.

While transfer permanently gives a right away, delegate temporarily gives it away, as it can later be taken back. Transferring an edit right necessarily also transfers the meta-right, but one can transfer a meta-right without altering any other rights, e.g. when a landlord sells a rented property to another landlord, leaving the tenants in place.

Including another in a right logically divides the right by allocating it to their AND set. For example, if a couple owns a house jointly then both must sign the sale deed to sell it and either party can stop the sale. To include someone else in a right requires the meta-right to allocate, i.e. one must truly own the entity involved. One can also include someone in a meta-right, in which case the ownership can only be revoked if they jointly agree.

Sharing a right with another logically multiplies the right by allocating it to their OR set. For example, if two people own a bank account severally, either party can take all the money out of it. Again to share a right one needs the meta-right. If a meta-right is owned severally, then any party can revoke the other’s right which is obviously risky.

The above re-allocation logic works for the ownership of an online document as it does for a house or bank account. One can transfer, delegate, include or share rights with different social results, e.g. sharing the edit right of an online paper is risky but invites participation, while including other authors is safe but makes editing harder, as every change must be approved by every author. Currently, most online allocation involves the sharing of view rights.
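The four re-allocations can be made concrete in code. This is an illustrative sketch under assumed names (`Right`, `may_exercise`, etc.), not a definitive implementation: include puts owners in an AND set where all must agree, share puts them in an OR set where any may act alone, and only meta-right holders may re-allocate.

```python
class Right:
    """A right over an entity, held by one or more owners."""
    def __init__(self, operation, owners):
        self.operation = operation
        self.owners = set(owners)          # who holds the right
        self.joint = False                 # True = AND set, False = OR set
        self.meta_holders = set(owners)    # who may re-allocate this right

    def transfer(self, old, new):
        """Irrevocably move the right AND its meta-right, e.g. selling a house."""
        assert old in self.meta_holders
        self.owners = (self.owners - {old}) | {new}
        self.meta_holders = (self.meta_holders - {old}) | {new}

    def delegate(self, owner, delegatee):
        """Give the right but keep the meta-right, e.g. renting a house."""
        assert owner in self.meta_holders
        self.owners.add(delegatee)

    def revoke(self, owner, delegatee):
        """A delegation, unlike a transfer, can be reversed."""
        assert owner in self.meta_holders and delegatee not in self.meta_holders
        self.owners.discard(delegatee)

    def include(self, owner, other):
        """AND set: all owners must jointly agree to exercise the right."""
        assert owner in self.meta_holders
        self.owners.add(other)
        self.joint = True

    def share(self, owner, other):
        """OR set: either owner may exercise the right alone."""
        assert owner in self.meta_holders
        self.owners.add(other)
        self.joint = False

    def may_exercise(self, actors):
        actors = set(actors)
        if self.joint:                        # AND: every owner must act
            return actors >= self.owners
        return bool(actors & self.owners)     # OR: any one owner suffices

sell = Right("sell", {"ann"})
sell.include("ann", "bob")                    # couple jointly owning a house
assert not sell.may_exercise({"ann"})         # either party can stop the sale
assert sell.may_exercise({"ann", "bob"})      # both must sign to sell
```

Note how the AND/OR distinction from the text maps directly onto the `joint` flag: including logically divides a right, while sharing logically multiplies it.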

Delegation. Delegating, say, the right to edit does not give meta-rights, so by definition the delegatee cannot pass the right on, just as renting an apartment gives no right to sub-let and lending a book does not give the right to on-lend it. The social logic is that if delegatees could delegate, the original owner would no longer be responsible, e.g. if one lends a book to a friend who lends it to a careless other who loses it, who is responsible? The owner has lost their property due to another they may not even know. Just as renting an apartment does not give tenants the right to sub-let or destroy it, so delegating say a forum thread to another does not allow them to delete or re-allocate it, giving the access control rule:

   Rule 13. Delegating a simple right does not give the right to delegate that right.

Allocating responsibility. When one gives an object to another, that makes them responsible for it, so they must agree. For example, to register a car in another’s name, both parties must agree. If one could unilaterally allocate car ownership, a person could dump a wrecked car in a public place then register it to someone else, who would then have to pay to remove it. Likewise, allocating authorship of an online document requires consent, so one cannot add a paper co-author without their agreement. Access control would have to confirm by asking, say: “Brian offers edit rights to the paper ‘Towards Universal Online Rights’, do you accept?” The access control rule is:

   Rule 14. Allocating rights that make another personally responsible for an existing entity requires their consent.

Rights that imply responsibility include administrator rights, delete, edit, move and display. An interesting example is Wikipedia, which allocated edit rights to the universal set of all people, which doesn’t imply personal responsibility. Letting anyone edit the wiki encouraged both contributors and vandals, people who edit pages to fit an agenda rather than facts. Since letting anonymous people edit lost accountability, Wikipedia ended up banning everyone at the IP address of a vandal, which is hardly fair. It would be better to register everyone, with identifying by IP address just one option, i.e. give people the choice.

Allocating rights without consent. In contrast, rights that don’t change a target, like view or enter, can be given without consent. Likewise, the right to create can be freely given because it implies no responsibility. The corollary is:

   Corollary. Allocating rights that don’t make another personally responsible for an existing entity does not require consent.

Rights that don’t imply responsibility for an existing entity include view, create, enter, copy, download and print. This rule lets space owners share enter, view and create rights with anyone.
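Rule 14 and its corollary together give a simple decision procedure. The sketch below is an assumption-laden illustration (the function names are invented, and the right lists come from the surrounding text): rights carrying responsibility require the receiver’s acceptance, while the rest can be allocated freely.

```python
# Rights that make the receiver responsible for an existing entity (Rule 14)
# versus rights that do not (the corollary). Lists taken from the text.
RESPONSIBLE_RIGHTS = {"administer", "delete", "edit", "move", "display"}
FREE_RIGHTS = {"view", "create", "enter", "copy", "download", "print"}

def needs_consent(right: str) -> bool:
    return right in RESPONSIBLE_RIGHTS

def allocate(right, giver, receiver, accept=lambda r: False):
    """Allocate a right; ask the receiver to accept when responsibility follows.
    'accept' stands in for an interactive "do you accept?" prompt."""
    if needs_consent(right) and not accept(right):
        return False   # receiver declined, allocation fails
    return True        # allocation proceeds

assert allocate("view", "brian", "ana")                          # no consent needed
assert not allocate("edit", "brian", "ana")                      # declined by default
assert allocate("edit", "brian", "ana", accept=lambda r: True)   # accepted
assert not any(needs_consent(r) for r in FREE_RIGHTS)
```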

Friendship is not a trade. When social media say X wants to be your friend, what this implies is often not clear, except that it is a tit-for-tat trade of social rights, e.g. that if you let them view your stuff, you can view theirs. On Facebook it also lets them post on your timeline by “tagging” photos with your name, though it is never made clear why giving another the right to view your Facebook profile requires you to let them spam you with change notices from theirs. That one can turn this default off is irrelevant, and like a lot of other social media “features” it can change at any time. Rather than trial and error, why not design technology based on social experience? In physical society one can be a friend without requiring the same in return, and one can love a child even if they do not return the same. Friendship is given, not transacted, so this would favor messages like X considers you a friend. This is freely giving friendship as opposed to trading it.

FOAF isn’t my friend. FOAF (Friend of a Friend) systems appear as a “Find Friends” choice on systems like LinkedIn, but the assumption that the friends of my friends are also my friends has no social basis. Friendship is not transitive, and early friend systems that assumed it was either failed or abandoned it, as people didn’t want it. Adding someone to my friend list does not give software the right to add all their friends to my friend list. Friendship simply does not work that way.


6.11 The Act of Display

To display an object is to let others view it. The right to display is not the right to view, as viewing a video online does not let you display it on your web site. When a person walks in public they implicitly display themselves to others, i.e. let others view them. So one can view and take a photo of them without consent, but cannot, say, display that photo on a magazine cover without consent. Copyright is the recognition of Locke’s principle of creator ownership applied to display.

Display is the meta-right to view. Display is the right to give the right to view to others, and privacy is the right to display one’s personal data. For example, a phone book lets people display their phone number or have a private number that is not shown. Equally the phone company can choose not to display a listing. Social media like Facebook and LinkedIn also let people choose who they display their data to and can also choose not to display a listing. In general, to display an entity in a space is to give its display rights to the space owner, giving the access control rule:

   Rule 12. Displaying an entity in a space requires the consent of both its owner and the space owner.

For example, to put a physical notice on a shopkeeper’s notice board involves:

  • Create. Create a notice. You own it and can still change it or even rip it up.
  • Submit. Give to the board owner, to display on the notice board.
  • Display. The board owner may check the notice or let you post it yourself.
  • Stop display. As the notice is displayed by mutual consent, either can remove it. The shopkeeper’s right to take a notice down is not the right to destroy it because he or she does not own it, nor can they alter (deface) the notice.
An online video is an agreement to display

The same social logic applies online. Creating a YouTube video lets its author own it before it is displayed to the public. This is like creating a notice before giving it to the shopkeeper. The video author then submits the video to YouTube, to display it on their “notice board”. In access control terms, authors delegate display rights to YouTube, who choose to display it in their space. After a while, the video is converted to their format and displayed. At any time, YouTube or the author can stop the display of the video. That to display a video, photo or text in any online space allocates the view meta-right to the space owner gives the access control principle:

   Corollary. Displaying an entity in a space allocates the view meta-right to the space owner.

  Display result                 Space owner accepts    Space owner rejects
  Entity owner submits           YES                    NO
  Entity owner withdraws         NO                     NO

Table 6.4: Display outcomes

Table 6.4 shows the possible outcomes of an attempt to display an entity in a space.
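Table 6.4 reduces to a single conjunction, which is Rule 12 stated as code. A minimal sketch (the function name is an assumption): an entity displays only while the entity owner submits it and the space owner accepts it, so either party withdrawing consent ends the display.

```python
def displayed(owner_submits: bool, space_accepts: bool) -> bool:
    """Rule 12: display requires the consent of BOTH owner and space owner."""
    return owner_submits and space_accepts

assert displayed(True, True)        # submit + accept  -> YES
assert not displayed(True, False)   # submit + reject  -> NO
assert not displayed(False, True)   # withdraw -> NO, whatever the space says
assert not displayed(False, False)
```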

Allocating display rights is the basis of all publishing. Offline publishing generally involves:

1. Review/Edit. For example, journals may take years to review and even then reject.

2. Allocate. The document author allocates display rights to the publisher.

3. Publish. Publisher displays the document to their audience.

Online publishing simplifies and speeds up the process, e.g. ArXiv lets authors submit and display immediately and lets them withdraw submissions later, which journals do not allow. There are many rights variants, e.g. bulletin boards let people submit and display but not withdraw, and reserve the right to moderate postings, i.e. reject later.

Offline publishing often takes full ownership. By the logic above, publishing only requires delegating display rights, but many offline publishers demand more, e.g. IGI takes not just the right to publish a work once but also to re-publish it in whole or part, anywhere they choose, including online, in perpetuity, e.g. my paper Politeness as a Social Computing Requirement published in IGI’s Handbook of Conversation Design for Instructional Applications was later republished in their Selected Readings on the Human Side of Information Technology and in Human Computer Interaction: Concepts, Methodologies, Tools, and Applications without the author even being advised. Here is one sentence from a typical publisher:

Author hereby grants and assigns to Springer Science+Business Media, LLC, New York (hereinafter called Springer) the exclusive, sole, permanent, world-wide, transferable, sub-licensable and unlimited right to reproduce, publish, distribute, transmit, make available or otherwise communicate to the public, translate, publicly perform, archive, store, lease or lend and sell the Contribution or parts thereof individually or together with other works in any language, in all revisions and versions (including soft cover, book club and collected editions, anthologies, advance printing, reprints or print to order, microfilm editions, audiograms and videograms), in all forms and media of expression including in electronic form (including offline and online use, push or pull technologies, use in databases and networks for display, print and storing on any and all stationary or portable end-user devices, e.g. text readers, audio, video or interactive devices, and for use in multimedia or interactive versions as well as for the display or transmission of the Contribution or parts thereof in data networks or search engines), in whole, in part or in abridged form, in each case as now known or developed in the future, including the right to grant further time-limited or permanent rights.

In sum, offline publishing takes all possible rights and gives authors next to none for their own work. Academic publishing contracts read like what you might sign on entering prison, and often lock work up so that others can’t read it unless they pay. Hence Internet publishing is growing faster than physical publishing.

Online publishing gives more synergy. For example, adding a YouTube video involves:

  1. Registration. Create a YouTube persona.
  2. Entry. Enter YouTube (not banned).
  3. Creation. Create and upload a video.
  4. Edit. Edit video title, notes and properties.
  5. Submit. Request YouTube to display the video to their public.
  6. Display. The public sees it and can vote or comment.
Figure 6.3: A YouTube video and its rights

YouTube lets anyone register (1), enter (2) and create a video (3) which the author can edit (4) before it is displayed. At this point the video is visible to them and administrators but not to the public and they can still delete it. It is then submitted (5) to YouTube for display to its public (6). To create, edit and display a video are distinct steps, and YouTube can still reject videos that fail its copyright or decency rules. This is not a delete, as the owner can still view, edit and resubmit it. A technology design that required contributors to hand over all rights to their videos to YouTube, as book publishers do, would discourage participation.

Consistency. A critical feature of legitimate rules is that they work consistently. For example, if a video allows comments or votes then it becomes a space itself, to receive those comments or votes. Hence video owners must consent to allow comments or votes, just as YouTube had to consent to accept their video (Figure 6.3). That YouTube gives the same rights to others as it takes for itself is a key part of its success as a social, not just technical, system.


6.10 The Act of Creation

Creating something from nothing is impossible. In the physical world, one must create an entity from what already exists, and the same is true on the information level. Creation cannot be an act upon the object created, which by definition doesn’t exist before it is created, nor can one request access to an object that does not exist. What does exist before an entity is created is the space that will contain it, and so creation is an act upon that space.

Create is an edit of a space. In information terms, creating in a space changes the space by making the created entity part of it. It follows that the right to edit a space implies the right to create in it, giving the access control rule:

   Rule 9. The right to create in a space resides inherently with the space owner.

 This rule is well defined as the system itself is the first space. It allocates the right to create to every space owner:

Create = (Space Owner, Space, Create Entity)

This right is inherent because to lose it is to lose ownership of the space, defined as the right to edit. Hence one can only create in a space if its owner permits, e.g. to add a forum post always requires the forum owner’s permission. If others could edit the forum at will, the forum owner would no longer own it in the sense of controlling it. In this model, creating an object within a space appends it to the space, which is a type of edit.

Creation agreements. When a space owner gives the right to create, and people submit a post, picture or video to a blog, forum or video site, they enter into an agreement with the space owner about the rights to what is created, whether text, picture, sound, video or any combination. They may get money, as Listverse gives $100 per post, or access to an audience, as for say YouTube. The creation agreement defines what rights apply to what was created, including:

1. Display. Who can view what was created, e.g. forums let all others see a post but conferences do not let others view a submitted paper in the review phase.

2. Edit. Who can change what was created, e.g. YouTube gives authors exclusive edit rights, Wikipedia lets anyone edit any creation, blog comments are not usually editable, and ArXiv lets authors update publications as new versions.

3. Delete. Who can delete what was created, e.g. Twitter lets the author delete a tweet so it is not visible any more but an email once sent cannot be deleted.

4. Meta-right. Who truly owns what was created. A space that takes the meta-right to what people submit becomes their administrator, who may just “rent back” use rights to authors, e.g. Dropbox lets people truly own what they post but Listverse takes full ownership. Currently Facebook is unclear on whether what people post belongs to them, or whether it can use their personal data as it wishes.

As can be seen, a creation agreement can take many forms.
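The four dimensions above suggest a creation agreement is just data. The sketch below is illustrative only: the class name and the field values approximating YouTube and Wikipedia policies are assumptions drawn from the examples in the text, not those services’ official terms.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CreationAgreement:
    """Rights to a newly created entity, agreed at submit time."""
    display: str     # who can view what was created
    edit: str        # who can change it
    delete: str      # who can delete it
    meta_right: str  # who truly owns it (can re-allocate its rights)

# Rough approximations of the examples in the text:
youtube_like = CreationAgreement(display="all", edit="author",
                                 delete="author", meta_right="author")
wiki_like = CreationAgreement(display="all", edit="anyone",
                              delete="admins", meta_right="community")

assert youtube_like.edit == "author"   # exclusive author edit rights
assert wiki_like.edit == "anyone"      # anyone may edit any creation
```

Encoding the agreement as an explicit record is what lets the system later show it to the creator in plain terms, rather than burying it in a EULA.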

Creator ownership. Object creation is a simple technical act but a complex social one. How should the rights to a newly created entity be allocated? What a slave creates by work does not belong to them but in the 17th century the British philosopher John Locke argued that creators owning what they create increases community prosperity by encouraging people to create (Locke, 1690/1963). Locke’s logic applied to a farmer’s crop, a painter’s painting and a hunter’s catch – people produced more if they owned what was produced. After all, why struggle to produce for others to own? If one later chooses to sell or give a creation away, that is another matter. This social requirement gives the access control rule:

   Rule 10. The creator of a new entity immediately gains all rights to it in their namespace.

This conveniently resolves the issue of how to allocate rights to a new entity: they go to its creator by default. On a technical level, a program can act in any way, e.g. it could let the system administrator own all created objects, but on a social level all creators would then be slaves. Rule 10 encourages people to create things. Since online systems depend on free people contributing, it does not pay to treat them like slaves by taking all rights to what they create. Indeed, online systems that give more rights to creators tend to get more submissions, just as Locke postulated before the Internet.

Balancing creator and space owner rights. How then can people own what they create online when the right to create resides with the space owner? Rules 9 and 10 separate create from submit as follows:

  • Create. Different spaces provide different create operations, e.g. a forum allows create text, a video space allows create video, a gallery allows create picture, and so on. For all rights to immediately go to the creator of a new entity, it must initially reside in their namespace. This lets them check their work before submitting, and even leave and come back to submit later. For example, when YouTube lets authors create a video and save it without submitting, it recognizes creator ownership of what people choose to create. At this point, others can’t see the video.
  • Submit. The creator then agrees to give some rights to the space owner, making the video, for example, visible to others. I am unlikely to post a video of my new baby online if I lose all rights to it; conversely, if I give no rights away the space cannot display the video at all. Every online post is a rights agreement between the creator and the space owner, e.g. to post on a forum may require one to give up the right to edit or delete the text added.

Social transparency. When people give away rights in society, it is specified in a contract. In such cases, fairness dictates that both parties know in advance what rights are being exchanged. Contract transparency is the right to know what rights are being given away in that choice. That submitting online is a rights exchange gives the access control rule:

   Rule 11. People have the right to know in advance what rights they are giving to others.

One has the right to read a contract before signing it, yet few do, as it is often a long document written in complex legal language. This is even more true for online End User License Agreements (EULAs), e.g. the May 2011 iTunes agreement was 56 pages long! To make the point, in 2010 Gamestation added a clause to their EULA that purchasers agreed to give their soul to the company, and 7,500 people agreed, see here. The offline method of using complex contracts for lawyers to read doesn’t work online, as companies write EULAs customers can’t understand to keep them in the dark. Since code is law online, let the access control system ask permission to transfer rights at submit time, say for a Listverse submit:

      Submitting your post will result in the following rights allocations:

         YOU:   View
         THEM:  View, Display, Edit, Delete, Move, Copy

         Full ownership is transferred

      Do you wish to continue? YES / NO        ☐ Don’t ask me again

Making clear the results of a submit rights exchange lets people decide whether to create or not. For those who post often, the “Don’t ask me again” meta-choice makes the warning appear only the first time. Using the access control system to describe a rights exchange gives an accuracy and brevity that contracts written by sellers don’t have. Legitimate access control lets online social systems gain the social transparency needed to be trusted.
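Because the access control system already knows the rights being exchanged, the warning can be generated rather than hand-written. A minimal sketch, assuming a hypothetical `submit_summary` function; the rights shown mirror the Listverse-style dialog above, not any real site’s policy.

```python
def submit_summary(kept, given):
    """Render a plain-language summary of a submit-time rights exchange."""
    lines = ["Submitting your post will result in the following rights allocations:",
             f"  YOU:  {', '.join(kept)}",
             f"  THEM: {', '.join(given)}"]
    # If the creator keeps no right that changes the entity, ownership is gone.
    if not set(kept) & {"edit", "delete", "move"}:
        lines.append("  Full ownership is transferred")
    return "\n".join(lines)

summary = submit_summary(
    kept=["view"],
    given=["view", "display", "edit", "delete", "move", "copy"])
assert "Full ownership is transferred" in summary
```

Deriving the text from the actual certificate allocation is what gives the accuracy a seller-written EULA lacks: the summary cannot drift from what the code really does.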

Allocating roles. When a new entity is created, access control assigns these roles:

1. Owner. Has the right to edit it.

2. Administrator. Has the right to re-allocate its rights.

3. Parent. Is the immediate containing space.

4. Ancestors. Any other containing spaces.

For example, when a paper is submitted to a conference mini-track, the owner is the author, who can update it, but the administrator is the mini-track chair, who can delete it if it is rejected. The parent is the mini-track space, the ancestors are the track and conference spaces, and its offspring are any reviews attached to the paper.

Co-author rights state what the co-owners of the paper can do, as it may have several authors, e.g. co-authors may be able to edit a paper but not delete it, or perhaps they can view but only the main author can edit the paper.

Chair rights. On a social level, the mini-track chair is responsible for the papers in their mini-track, the track chair is responsible for the mini-tracks in their track, and the conference chair is responsible for all tracks in the conference, i.e. for every paper in the conference. Since one cannot be responsible for what one cannot see, when a new paper is added the access control system will give view rights to the mini-track, track and conference chairs. So a paper posted on a mini-track is visible to mini-track, track and conference chairs but not to other track or mini-track chairs.

Author rights. An author who creates a paper in a mini-track and can edit it gets the right to enter that space. If the mini-track space bans the owner of a paper in it, they can no longer see it and so cannot be responsible for it. That those who author papers are responsible for them also requires them to be able to enter the track and conference spaces.
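The role allocations above can be sketched as code. All names here are assumptions for illustration: on creation, the creator becomes owner, the parent space’s chair becomes administrator, and view rights flow up the containment chain because one cannot be responsible for what one cannot see.

```python
def allocate_roles(creator, spaces):
    """Assign roles for a new entity.
    spaces: containment chain, innermost first,
    e.g. ["mini-track", "track", "conference"]."""
    roles = {
        "owner": creator,            # can edit the paper
        "administrator": spaces[0],  # parent space's chair re-allocates rights
        "parent": spaces[0],         # immediate containing space
        "ancestors": spaces[1:],     # other containing spaces
    }
    # Author plus the chair of every containing space can view the paper.
    viewers = {creator, *spaces}
    return roles, viewers

roles, viewers = allocate_roles("author", ["mini-track", "track", "conference"])
assert roles["owner"] == "author"
assert "conference" in viewers       # conference chair sees every paper
assert "other-track" not in viewers  # chairs of other tracks do not
```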

An online vote is a group creation. Programs like SurveyMonkey let people vote on an issue that affects them. The vote result is essentially created by the voters who produced it, so they as a group should own it. This gives them the right to view it, but many online voting systems ignore this social requirement by allocating the vote result to a controller who has no obligation to display it to its creators. We have all taken part in marketing surveys where we never saw the result. Locke’s principle then applies: people are less likely to participate in surveys whose result they do not get to see, and in practice they are. In contrast, a legitimate online vote would automatically advise people of the vote result, because they created it. Note that during the vote, the system can restrict viewing to people who have already voted, to avoid biasing voters.


6.9 Ownership

Full ownership is to have all rights to an entity. Physical ownership is generally defined as exclusive rights to and control over a property: it includes possession and control, and title, which is the right to possess. In this model, full ownership of a data entity can be stored as a triplet of the owner persona, the entity and the set of all operations on it:

Full Ownership = (Owner, Entity, All Operations)

Full ownership lets the owner use an entity as they wish. Yet this ownership right is itself a data entity that can also be given away.
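The ownership triplet can be written out directly. This is a minimal sketch; the `permits` helper and the operation names are illustrative assumptions.

```python
# Illustrative sketch of the Full Ownership = (Owner, Entity, All
# Operations) triplet, with a check that a right permits an act.

ALL_OPERATIONS = frozenset({"view", "edit", "delete", "allocate"})

full_ownership = ("Alice", "photo1", ALL_OPERATIONS)

def permits(right, persona, entity, operation):
    """A right permits an act if the persona, entity and operation match."""
    owner, ent, ops = right
    return persona == owner and entity == ent and operation in ops
```

So `permits(full_ownership, "Alice", "photo1", "edit")` holds, while the same act by another persona does not.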

   Owned entities can be given away. That I own a car, for example, lets me lend it, or even transfer my ownership to another when I sell it. Likewise, an author who writes and owns a document can give it away to a publisher. Or one can give away only some rights, as letting a friend view your profile gives them view rights. In information terms, giving away a right is an act upon the right itself. A meta-right is then the right to allocate entity rights:

MetaRight = (Owner, Entity Rights, Allocate)

Allocate is the operation of changing the rights to an entity. The logic is exactly as before, except that now the data entity acted upon is a right. The meta-right to an entity is the right to allocate its rights to other persona.

   Meta-rights. Fully owning an entity includes the right to give away rights such as view or edit. A meta-right is the right to change who holds the rights to an entity. Does this then imply meta-meta-rights? Or meta-meta-meta-rights? Such an endless iteration contradicts the information level requirement that every program must halt, i.e. not run endlessly. To avoid this regress, a meta-right is defined as the right to allocate all rights to an entity, including the meta-right itself.
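How the regress halts can be shown concretely: the meta-right's scope is the set of all rights to the entity, and that set contains the meta-right itself. The `entity_rights` helper is an illustrative assumption, not part of the model.

```python
# Illustrative sketch: the meta-right governs every right to an entity,
# itself included, so no meta-meta-right is ever needed.

def entity_rights(entity):
    """Return all rights to an entity plus the meta-right that
    allocates them; the meta-right sits inside its own scope."""
    base = {(entity, op) for op in ("view", "edit", "delete")}
    meta = (entity, "allocate")      # the meta-right
    return base | {meta}, meta

rights, meta = entity_rights("paper1")
```

Since `meta in rights` holds, allocating "all rights" already covers re-allocating the meta-right, and the iteration stops at one meta level.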

   The owner of a data entity has control over it. If the registered owner of a car lends it to an adult who runs a red light, the charge is against the driver, not the person who lent the car. In general, the person in control of a thing is responsible for it, and for a data entity that is the person who can edit it. The person who can change a post owns it in the sense that they are responsible for its effect on others. Likewise the owner of a space is the person who can change it by creating objects within it. In this model, the owner of an entity is the persona who can edit it.

   The administrator of a data entity has the meta-right to it. For example, renting a physical apartment gives a tenant control over it, but the landlord retains the meta-right to re-allocate the apartment, so they can ask a tenant to leave. The tenant owns the apartment in that they control it, but the landlord administers it in being able to allocate rights. In this model, the administrator of an entity is the persona who can re-allocate its rights, e.g. a system administrator.

   The above gives the access control rule:

Rule 8. The administrator of an entity can re-allocate its rights and the owner can edit it.

Physical-world tenancy agreements usually require pages of text, but in the information world giving another control over your data just means giving them edit rights while keeping the meta-right. Thus one can delegate ownership but remain the administrator of an entity. However, to give away the right to delete an entity lets another destroy it along with all its rights, including the meta-right, i.e. giving away the delete right gives away administrative control. This gives a corollary to Rule 8:

   Corollary. The administrator of an entity cannot give away the right to delete it.

An administrator who wishes to let another delete an entity must give them the meta-right, i.e. make them the administrator.
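The corollary can be sketched as an allocate operation that refuses to hand over the delete right alone. The `allocate` function and its dict layout are illustrative assumptions, not a specified API.

```python
# Illustrative sketch of the Rule 8 corollary: deleting an entity
# destroys all its rights, meta-right included, so the delete right
# cannot travel without the meta-right.

def allocate(grants, grantee, ops):
    """Give grantee the operations in ops; delete implies the meta-right."""
    if "delete" in ops and "allocate" not in ops:
        raise ValueError("the delete right cannot be given without the "
                         "meta-right: make the grantee administrator")
    grants.setdefault(grantee, set()).update(ops)
    return grants
```

Delegating edit rights succeeds, while delegating delete alone is rejected; granting both delete and allocate together makes the grantee the new administrator.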

In conclusion, the owner of an entity who by definition can edit it is directly responsible for it, while the administrator who by definition can allocate rights such as edit is indirectly responsible for it. In full ownership, one person both owns and administers the entity.


6.8 Roles

Roles simplify rights management by applying rights to persona sets. Roles like family, friend or acquaintance are social sets that are understandable to the people who review, define and accept them. Roles are useful online, e.g. Wikipedia citizens can aspire to steward, bureaucrat or sysop roles by good acts. Slashdot’s automated rating system offers the moderator role to readers who are registered (not anonymous), regular participants (for a time), and who have a positive “karma” (how others rate their comments) (Benkler, 2002). Every registered reader has five influence points to spend on others as desired over a three-day period (or they expire). Such roles encourage a meritocracy, where highly rated commenters get more karma points and so more say on what is seen by others. In access control terms, a role is a rights statement applied to an actor set, e.g. a friend role defines what your set of friends can do:

Friend Role = (Actor Set, Entities, Operations)

To “friend” another person is then simply to add them to the friend role set.
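A role as rights applied to an actor set can be sketched directly, so that "friending" reduces to a set addition. The dict layout and helper names below are illustrative assumptions.

```python
# Illustrative sketch: a friend role is a rights statement applied to an
# actor set; friending and unfriending just edit that set.

friend_role = {
    "actors": set(),                   # the actor set, initially empty
    "entities": {"timeline"},          # what the rights apply to
    "operations": {"enter", "view"},   # what the actors may do
}

def friend(role, persona):
    """To 'friend' a person is to add them to the role's actor set."""
    role["actors"].add(persona)

def unfriend(role, persona):
    """To 'unfriend' is to remove them from the actor set."""
    role["actors"].discard(persona)
```

Note that neither operation touches the other persona's own entity, which is why, in this model, no consent is needed to friend or unfriend.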

Public domain role. Defining roles as rights applied to actor sets allows two extreme cases. First, a role can apply to a null actor set, as when one has no friends defined. In this case the role still exists and can be viewed and edited, but has no effect. Second, a role can apply to the universal actor set of all people, now and in the future. This is the public domain role, as when works are put into the public domain to be available to all. Copyright law at first gave rights to the creator of a work (Copyright Law, 2016) for sixty years or until they died. This gave creators time to benefit from their work, after which it was open for public use, since if previous works stayed copyrighted, new creators could not build on them to benefit society (Lessig, 2012). Then Disney discovered that it could freely take public domain stories like Snow White and take ownership of them by copyright, raising questions of fair use (Masnick, 2012). Its use of copyright reduced the public good by taking private ownership of what others had given to the public. Today, open-source advocates use licenses such as the GNU GPL and Creative Commons to ensure that no-one appropriates public domain items. In access control terms, a right granted fully to the universal actor set cannot be reversed, as that would require all people to agree, which is impossible. This gives the access control rule:

Rule 6: The full transfer of non-personal information into the public domain is not reversible.

What people have given into the public domain cannot later be appropriated by other parties. Since private data is inalienable, it cannot be fully transferred, so people can ask web sites to remove personal data, e.g. a European Union court ruled that data on individuals held by Google must be deleted on request.

Roles allow local control. The friend role lets one easily change who can see photos posted on a Facebook timeline:

Friend Role = (Friend, Timeline, Enter)

Figure 6.1: Unfriending in Facebook

In this specification, Friend is a persona set that is initially empty if no friends are set, Timeline is your Facebook timeline, and Enter is the operation that lets a person access it. So to “friend” another person is to add them to your Friend role set, and to “unfriend” them is to remove them from this set (Figure 6.1). To “friend” is spoken of as an act on a person, but it does not change the persona entity, so on the information level it is an act upon a local role. Since a person owns their friend role, they do not need permission to friend or unfriend anyone. So when you unfriend a person, they do not need to be told, and indeed Facebook does not tell them. That the owner of a space can unilaterally define who can enter it gives the access control rule:

Rule 7. A space owner can unilaterally ban or accept persona entry.

Note that by Rule 4 a person cannot ban themselves or the administrator. Likewise, the owner of a bulletin board can ban people at will by changing the Guest role to exclude them. If banning were an act on another persona, it would need their consent. Social transparency then also requires that the owner of a bulletin board publish the social reasons that justify a ban on that board (Figure 6.2).
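Rule 7 with its Rule 4 proviso can be sketched as a guarded edit of the Guest role's actor set. The `ban` helper and the space's dict layout are illustrative assumptions.

```python
# Illustrative sketch of Rule 7: a space owner can unilaterally ban any
# persona by editing the Guest role, except themselves or the
# administrator (the Rule 4 proviso).

def ban(space, persona):
    """Remove a persona from a space's Guest role, a unilateral local act."""
    if persona in (space["owner"], space["admin"]):
        raise ValueError("cannot ban the owner or administrator (Rule 4)")
    space["guests"].discard(persona)

board = {"owner": "Alice", "admin": "Sysop",
         "guests": {"Alice", "Bob", "Carol"}}
ban(board, "Bob")
```

Since the ban only edits the owner's own Guest role, no consent from the banned persona is required; publishing the ban rules is a separate, social-transparency obligation.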

Figure 6.2: Bulletin board ban rules should be transparent

Local roles also allow entity and operation sets. In general, a role can refer to:

  • Actor sets. A set of actors.
  • Entity sets. A set of entities.
  • Operation sets. A set of operations.

Few current systems fully use the power of local roles, e.g. social networks could let actors define an acquaintance role, with fewer rights than a friend but more than the public, or an extended family role that, say, denies commenting. This would allow people to limit some photos to family only. Note that Facebook currently builds into the friend role the right to post comments or criticisms, which not everyone likes. Instead of the current “one-size-fits-all” approach that forces common local rules on everyone, it is better to make local roles merely defaults, and let each individual define their own roles according to need. People want choice, not a “Father knows best” approach.
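The graded roles suggested above can be sketched with a small role constructor; the role contents below are illustrative choices, not fixed defaults of any real system.

```python
# Illustrative sketch: user-defined local roles over actor, entity and
# operation sets, with graded rights per role.

def make_role(actors=(), entities=(), operations=()):
    """Build a role as (actor set, entity set, operation set)."""
    return {"actors": set(actors), "entities": set(entities),
            "operations": set(operations)}

# A friend can enter, view and comment; an acquaintance can enter and
# view but not comment; extended family can view photos but not comment.
friend = make_role(entities=["timeline", "photos"],
                   operations=["enter", "view", "comment"])
acquaintance = make_role(entities=["timeline"],
                         operations=["enter", "view"])
family = make_role(entities=["timeline", "photos"],
                   operations=["enter", "view"])
```

Making such roles editable defaults, rather than fixed system rules, is what lets each individual tune local rights to their own needs.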
