5.1 Homo Economicus

Figure 5.1: Individuals competing in a world environment

The Homo economicus model of human social behavior is shown in Figure 5.1, where two individuals compete in a limited-resource world environment, e.g. two beetles foraging for the same food in the same area. If one gets the food, the other loses out and gets none. As the beetle with the food is more likely to survive, natural selection has individuals compete for advantage. The farmer growing the food also competes with the beetles, and both compete with bacteria that would also consume it. Limited-resource environments reward individual competencies, like strength or speed, that increase success and survival. In Figure 5.1, competition for limited resources develops individuals who are competent to succeed in the world.

Homo economicus, or “economic man”, is an individual who seeks to benefit himself or herself by reducing effort, increasing gain, or both (Persky, 1995). Mill’s economic man seeks wealth, leisure, luxury and procreation above all else, and Adam Smith adds that such individuals competing in a free market also help society: when everyone competes to become more competent, the society produces more (Smith, 1776/1986). The model assumes people are rational actors who calculate their own best interests, although in practice people use heuristics, psychologically efficient versions of pure logic (Tversky & Kahneman, 1982). Competition drives self-interested individuals to competence gains by natural selection, giving a social evolution.

That free individuals act in self-interest can be stated as a rule:

Rule 1: If freely acting individuals {I1, I2 …} face action choices {a1, a2 …} with expected individual utility outcomes {IU(a1), IU(a2), …} then:

If IU(ai) > IU(aj) then prefer ai over aj

In words: Free individuals prefer acts expected to give more utility value to themselves.   

The concept “utility value” here is deliberately left vague, so it may include physical gains like food, social information tokens like money, psychological gains like appreciation, or social gains like reputation, as well as the “gain” of expending less effort or money.
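As a sketch, Rule 1 is simple enough to state in code. The following minimal Python illustration assumes utilities are given as plain numbers; the function and action names are ours, invented for illustration:

    def preferred_action(actions, expected_utility):
        """Rule 1: a free individual prefers the act with the
        highest expected individual utility IU(a)."""
        return max(actions, key=expected_utility)

    # Example: a beetle choosing where to forage.
    utility = {"forage_east": 3.0, "forage_west": 5.0, "rest": 1.0}
    print(preferred_action(utility, utility.get))  # forage_west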


5. The Social Environment

Every society, whether modern or traditional, technical or physical, has a social environment. This chapter links the arcane role of society to modern technology. It presents Enron and the credit meltdown as social environment errors at the information level. In what follows, bear in mind that a social system is always people interacting with people, regardless of the base architecture. Social implementations tried in the physical world, like communism or capitalism, can be tried online, and those tried online can be applied offline, as the Arab Spring illustrates.

5.1 Homo Economicus

5.2 Homo Sociologicus

5.3 The Prisoner’s Dilemma

5.4 The Tragedy of the Commons

5.5 Social Synergy

5.6 Social Defection

5.7 Social Dilemmas are Common

5.8 The Zero-sum Barrier

5.9 The Social Environment

5.10 Social Order

5.11 Social Hijack

5.12 Social Inventions

5.13 Social Health

5.14 Communism and Capitalism

5.15 Social Inflation

5.16 Higher Social Levels

5.17 The Golden Rule

5.18 Free-Giving

5.19 Profit and the Internet

5.20 Beyond Profit

Chapter 5. Key Terms

Chapter 5. Discussion Questions

Chapter 5. References

 

Chapter 4. References

Alexander, C. (1964). Notes on the Synthesis of Form. Cambridge, Ma: Harvard University Press.

Cooper, A. (1999). The Inmates are Running the Asylum: Why High-Tech Products Drive Us Crazy and How to Restore the Sanity. Indianapolis, IN: Sams Publishing.

Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press.

Foreman, B., & Whitworth, B. (2005). Information Disclosure and the Online Customer Relationship. Paper presented at the Quality, Values and Choice Workshop, Computer Human Interaction (CHI).

Forester, T., & Morrison, P. (1994). Computer Ethics. London: MIT Press.

Fukuyama, F. (1992). The End of History and the Last Man. New York: Avon Books Inc.

Gauze, C. F. (2003). I See You’re Writing an Article… Retrieved 2006.

Goldstein, M., Alsio, G., & Werdenhoff, J. (2002). “The media equation does not always apply: People are not polite to small computers.” Personal and Ubiquitous Computing, 6, pp. 87-96.

Jenkins, S. (2006). “Concerning Interruptions.” IEEE Computer, 39 (11), pp. 114-116.

Larkin, E. (2005). “Spear Phishing” PC World, November, 2005.

Lessig, L. (1999). Code and other laws of cyberspace. New York: Basic Books.

Mayer, R. E., Johnson, W. L., Shaw, E., & Sandhu, S. (2006). “Constructing computer-based tutors that are socially sensitive: Politeness in educational software.” International Journal of Human Computer Studies, 64(1), pp. 36-42.

Miller, C. A. (2004). “Human-Computer Etiquette: Managing expectations with intentional agents.” Communications of the ACM, 47(4), pp. 31-34.

Nass, C. (2004). “Etiquette Equality: Exhibitions and expectations of computer politeness.” Communications of the ACM, 47(4), pp. 35-37.

Pallatto, J. (2007). “Monthly Microsoft Patch Hides Tricky IE 7 Download.” eWeek.com, January 22.

Power, R. (2000). Tangled Web: Tales of digital crime from the shadows of cyberspace. Indianapolis: QUE Corporation.

Pratley, C. (2004). Chris_Pratley’s OneNote WebLog.

Raskin, J. (2000). The Humane Interface. Boston: Addison-Wesley.

Rawls, J. (2001). Justice as Fairness. Cambridge, MA: Harvard University Press.

Rose, G., Khoo, H., & Straub, D. (1999). “Current Technological Impediments to Business-to-Consumer Electronic Commerce.” Communications of the AIS, I(5).

Shectman, N., & Horowitz, L. M. (2003). Media inequality in conversation: How people behave differently when interacting with computers and people. Paper presented at the CHI (Computer Human Interaction) 2003 Conference, Ft Lauderdale, Florida.

Technology Threats to Privacy (2002, February 24). New York Times, p. 12.

Transparency-International. (2001). Corruption Perceptions.

Tzeng, J. (2004). “Toward a more civilized design: Studying the effects of computers that apologize.” International Journal of Human-Computer Studies, 61(3), pp. 319-345.

Whitworth, B. (2005). “Polite Computing.” Behaviour & Information Technology, 24(5), pp. 353-363.

Whitworth, B. (2008). “Some implications of comparing human and computer processing.” In Proceedings of the 41st Hawaii International Conference on System Sciences.

Whitworth, B., deMoor, A., & Liu, T. (2006). “Towards a Theory of Online Social Rights.” In R. Meersman, Z. Tari, P. Herrero et al. (Eds.), OTM Workshops 2006, LNCS 4277 (pp. 247-256). Berlin/Heidelberg: Springer-Verlag.

Whitworth, B., & deMoor, A. (2003). Legitimate by design: Towards trusted virtual community environments. Behaviour & Information Technology, 22(1), pp. 31-51.

Whitworth, B., & Whitworth, E. (2004). “Spam and the socio-technical gap.” IEEE Computer, 37(10), pp. 38-45.

Wright, R. (2001). Nonzero: The logic of human destiny. New York: Vintage Books.

 

4.13 Politeness – A New Software Requirement

Polite computing as a new requirement of software design requires an attitude change, e.g. to stop seeing “users” as little children unable to exercise choice. Inexperienced users may let software take charge, but experienced users want to make their own choices. The view that “software knows best” does not work for computer-literate users. Perhaps users were once child-like, but today they have grown up.

Software has to stop trying to go it alone. Too-clever software acting beyond its ability is already turning core applications like Word into a magic world, where moved figures jump about or even disappear entirely, resized table column widths reset themselves, and moving text produces an entirely new format. Increasingly, only Ctrl-Z (Undo) saves the day, rescuing us from “too clever” software errors. Software that acts beyond its ability thinks its role is to lead, when really it is to assist.

Rather than using complicated Bayesian logic to predict users, why not simply follow the user’s lead? I repeatedly change Word’s numbered-paragraph default indents to my preferences, but it never remembers. How hard is it to copy what the boss does? It always knows better; e.g. if I ungroup and regroup a figure, it takes the opportunity to reset my text wrap-around options to its defaults, so my picture overlaps the text again. Software should leverage user knowledge, not ignore it.

Polite software does not act unilaterally, is visible, does not interrupt, offers understandable choices, remembers the past, and responds to user direction. Impolite software acts without asking, works in secret, interrupts unnecessarily, confuses users, has interaction amnesia, and repeatedly ignores user corrections. It is not hard to guess which type of software most people will prefer, given a choice.

Social software requirements should be taught in system design alongside engineering requirements. To encourage this, a “politeness seal” could mark applications that give choice to users rather than take it. The Internet will only realize its social potential when software is polite as well as useful and usable.


4.12 Politeness in a Software Democracy

Modern software, whether web sites or apps, can no longer dictate terms to the people who use it; e.g. President Bush chose not to use e-mail because he did not trust it. The days when software could hold people hostage to its power are gone. Given this, successful online traders find politeness profitable. eBay’s customer reputation feedback gives people optional access to valued information relevant to their purchase choice, which by the previous definition is polite. Amazon gives customers information on the books similar buyers bought, not as pop-up ads but as a view option below. Rather than a demand to buy, it is a polite reminder of same-time purchases that could save the customer postage. Politeness is not about forcing people to buy but about improving the seller-customer relationship, which is social. Polite companies win business because customers given choices come back. Perhaps one reason the Google search engine swept all before it was that its simple white interface, without annoying flashing or pop-up ads, made it pleasant to interact with. Google ads sit quietly at screen right, as options not demands. Yet while many online companies know that politeness pays, for others hit-and-run rudeness remains an online way of life.

Polite software does not act pre-emptively but lets people choose, is visible in what it does, makes user actions like editing easy rather than throwing up conditions, remembers people personally, and responds to human direction rather than trying to foist preconceived “good” actions on people. Social computing features like post-checks (allowing an act and then checking it later), versioning and rollback, tag clouds, optional registration, reviewer reputations, view filters and social networks illustrate how polite computing gives choices to people. In the movement from software autocracy to democracy, it pays to be polite because polite software is used more and deleted less.


 

4.11 The Sorcerer’s Apprentice

The sorcerer’s apprentice

The problem of computers taking over what they do not understand is embodied in the story of the sorcerer who left his apprentice in charge of his laboratory. Thinking he knew what he was doing, the apprentice began casting spells, but they soon got out of hand, and only the sorcerer’s return prevented total disaster. When smart software, like that apprentice, acts on its own, things get out of control and people have to pick up the pieces.

As software evolves, it can get more things wrong. For example, Endnote manages citations in documents like this one by embedding links to a reference database. Endnote Version X ran itself whenever I opened the document, and if it found a problem it took the focus from whatever I was doing to let me know right away (Figure 4.8). Since it used square brackets for citations, it assumed any square brackets in the document were for it, as little children assume any words spoken are addressed to them. After I had told it, every time, to ignore some [ ] brackets I had in the document, it closed, dropping the cursor wherever it happened to be and leaving me to find my own way back to where I was when it interrupted.

I could not turn this activation off, so when I edited my document on the computer at work, EndNote could not find its citation database. It complained, then handled the problem by clearing all the embedded links that generate references in the document! If I had carried on without noticing, every citation would have had to be re-entered later. Yet if you wanted to clear the Endnote links, as when publishing the document, it had to be done manually. Selfish software comes with everything but an off button: it takes no advice and preemptively changes your data without asking. I fixed the problem by uninstalling Endnote and installing Zotero. Cooper compares smart software to a dancing bear: we clap not because it dances well, but because we are amazed it can dance at all (Cooper, 1999). We need to stop clapping.

Figure 4.8: Endnote X takes charge

For many reasons, people should control computers, not the reverse. First, computers manage vast amounts of data with ease but handle context changes poorly (Whitworth, 2008), so smart computing invariably needs a human minder. Second, computers are not accountable for what they do, as they have no “self” to bear any loss; if people are accountable for what computers do, they need control over computer choices. Third, people will always resist computer domination. Software designers who underestimate the importance of human choice invite a grass-roots rebellion; an Internet movement against software arrogance is not inconceivable.

Today, too many people are at war with their software: removing things they did not want added, resetting changes they did not want changed, closing windows they did not want opened, and blocking e-mails they did not want to receive. User weapons in this war, such as third-party blockers, cleaners, filters and tweakers, are among the most frequent downloads on Internet software sites. Their main aim is to put users back in charge of the computing estate they paid for. If software declares war on users, it will not win. If the Internet becomes a battlefield, no one will go there. The solution is for software to give choices, not take them, i.e. polite computing.

The future of computing lies not in it becoming so clever that people are obsolete but in a human-computer combination that performs better than people or computers alone. The runaway IT successes of the last decade (cell-phones, Internet, e-mail, chat, bulletin boards etc.) all support people rather than supplant them. As computers develop this co-participant role, politeness is a critical success factor.


4.10 Impolite Computing

Impolite computing has a long history. Spam is impolite because it takes choice away from email owners. Pop-up windows are impolite because they hijack the user’s cursor or point of focus. Changing your device settings without asking is impolite. Impolite computer programs:

1) Use your memory. Software may use your hard drive to store information (cookies), or your phone service to download data, without asking.

2) Change your settings. Software can change your browser home page, email preferences or file associations.

3) Spy on what you do online. Spyware, stealthware and software back doors can gather information from your computer without your knowledge, record your mouse clicks as you surf the web, or, even worse, give your private information to others.

For example, Microsoft’s Windows XP Media Player was reported to quietly record the DVDs it played and use the computer’s connection to “phone home”, i.e. send data back to Microsoft (Technology Threats to Privacy, 2002). Such problems differ from security threats, where hackers or viruses break in to damage information; this problem concerns those we invite into our information home, not those who break in. A similar concern is “software bundling”, where users choose to install one product but are forced to get many:

“When we downloaded the beta version of Triton [AOL’s latest instant messenger software], we also got AOL Explorer – an Internet Explorer shell that opens full screen, to AOL’s AIM Today home page when you launch the IM client – as well as Plaxo Helper, an application that ties in with the Plaxo social-networking service. Triton also installed two programs that ran silently in the background even after we quit AIM and AOL Explorer.” (Larkin, 2005)

Yahoo’s “typical” installation of its instant messenger also used to download its Search Toolbar, anti-spyware and anti-pop-up software, desktop and system tray shortcuts, as well as Yahoo Extras, which inserted Yahoo links into the browser, altered the user’s home page and pointed auto-search functions to Yahoo by default. Even Yahoo employee Jeremy Zawodny disliked this:

“I don’t know which company started using this tactic, but it is becoming the standard procedure for lots of software out there. And it sucks. Leave my settings, preferences and desktop alone.”

Even today, many downloads require the user to opt out rather than opt in to offers. One must carefully review all the check boxes, lest one of them says something like: “Please send me endless spam on your products”.

A similar scheme is to use security updates to install new products: “Microsoft used the January 2007 security update to induce users to try Internet Explorer 7.0 whether they wanted to or not. But after discovering they had been involuntarily upgraded to the new browser, they next found that application incompatibility effectively cut them off from the Internet.” (Pallatto, 2007)

After installing Windows security update KB971033, many Windows 7 owners got a nag screen claiming their copy of Windows was not genuine, even though it was (Figure 4.5). Clicking the screen gave an option to purchase Windows (again). The update was installed silently: even with a “Never check for updates” setting, Windows checked online and installed it anyway. Despite the legitimacy principle that my PC belongs to me, Windows users found that Microsoft can unilaterally validate, alter or even shut down their software under its End User License Agreement (EULA). As a result, people developed Windows hack tools like RemoveWAT, Windows customers in general became more suspicious of updates, and some chose to avoid Microsoft entirely and investigate more trustworthy options like Linux. A later study found that while 22% of computers failed the test, less than 0.5% had pirated software. Disrespecting and annoying honest customers while trying to catch thieves has never been good business policy.

Figure 4.5: Windows “genuine advantage” software accused honest customers of fraud

Security cannot defend against people one invites in, especially if the offender is the security system itself. However, in a connected society, social influence can be very powerful. In physical society, the withering looks given to the impolite are not toothless, as what others think of you affects how they behave towards you. In traditional societies, banishment was often considered worse than a death sentence. An online company with a reputation for riding roughshod over user rights may find this is not good for business.

Blameware

Blameware is software that, when things go wrong, reports “your error”. It is interesting to compare messages when things are going well with times when they are not. While software seems delighted to be in charge in good times, when things go wrong it seems universally to agree that you have an error, not that we have an error. Brusque and incomprehensible error messages like “HTTP 404 – File not Found” suggest that you need to fix the problem you have clearly created. Although software itself often causes errors, some software designers recognize little obligation to give people the information the software has in a useful form, let alone suggest solutions. To “take the praise and pass the blame” is not polite computing. Polite software sees all interactions as involving “we”. Indeed, studies of computer-based tutoring show that users respond better to “Let’s click the Enter button” than to “Click the Enter button” (Mayer et al., 2006). When there is a problem, software should try to help the user, not blame them.
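As an illustration, the same failure can be reported in blameware style or in polite “we” style. A small Python sketch with invented messages (the polite wording is ours, not from any actual product):

    # Blameware style: terse, "your error", no help offered.
    def blameware_404():
        return "HTTP 404 - File not Found"

    # Polite style: "we" language, the information the software has,
    # and a suggested way forward.
    def polite_404(url):
        return ("We couldn't find a page at " + url + ". "
                "Let's check the address for typos, or try the site search.")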

Pryware

Pryware is software that asks for information it does not need for any reasonable purpose. Figure 4.6 shows a special-interest site that wants people to register but, for no obvious reason, asks for their work phone number and job title. Some sites are even more intrusive, wanting to know your institution, work address and Skype telephone. Why? If such fields are mandatory rather than optional, people choose not to register, being generally unwilling to divulge data like home phone, cell phone and home address (Foreman & Whitworth, 2005). Equally, they may enter a fake address like “123 Mystreet, Hometown, zip code 246”. Polite software does not pry, because intruding on another’s privacy is offensive.

Figure 4.6: Pryware asks for unneeded details

Nagware

Nagware is software that keeps making the same request until you give in and agree. To ask the same question over and over, for the same reply, is to pester or nag, like the “Are we there yet?” of children on a car trip: it forces the other party to give the same reply again and again. Many users did not update to Windows Vista because of its reputation as nagware that asked too many questions. Polite people do not bug others, but software does; e.g. when reviewing email offline in Windows XP, actions like using Explorer triggered a “Do you want to connect?” request every few minutes. No matter how often one said “No!”, it kept asking, because it had no memory of its own past. Yet software has already solved this problem elsewhere: uploading a batch of files creates a series of “Overwrite Y/N?” questions that would force the user to reply “Yes” repeatedly, but a “Yes to All” meta-choice remembers the answer for the whole choice set. Such choices about choices (meta-choices) are polite. A general meta-choice console (GMCC) would give users a common place to see or set meta-choices (Whitworth, 2005).
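Such a meta-choice memory is simple to build. Below is a minimal Python sketch of the idea, assuming choices are keyed by the question asked; the class and method names are illustrative, not from any actual GMCC implementation:

    class MetaChoiceMemory:
        """Remembers a user's answer to a repeated question, in the
        spirit of "Yes to All", so the software never nags."""

        def __init__(self):
            self._remembered = {}  # question -> remembered answer

        def ask(self, question, prompt_user):
            # Prompt only if no meta-choice already covers this question.
            if question in self._remembered:
                return self._remembered[question]
            answer, remember = prompt_user(question)
            if remember:  # user ticked "apply to all"
                self._remembered[question] = answer
            return answer

    memory = MetaChoiceMemory()
    always_yes = lambda q: ("yes", True)  # stub: user answers yes to all
    print(memory.ask("Overwrite file?", always_yes))  # prompts once
    print(memory.ask("Overwrite file?", always_yes))  # remembered, no nag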

Strikeware

Strikeware is software that executes a pre-emptive strike on user resources. An example is a zip-extract product that, without asking, put all the files it extracted as icons on the desktop! Such software tends to be used only once. Installation programs are notorious for pre-emptive acts; e.g. the RealOne Player added desktop icons and browser links, installed itself in the system tray and commandeered all video and sound file associations. Customers resent such invasions, which while not illegal are impolite. An installation program changing your PC settings is like furniture deliverers rearranging your house because they happen to be in it. Software upgrades continue the tradition; e.g. Internet Explorer upgrades that make MSN your browser home page without asking. Polite software does not do this.

Forgetware

Forgetware is software that forgets every last thing you just told it. It’s like talking to a person with amnesia. Working with forgetware makes the simplest interaction hard; e.g. when you put a USB flash drive into a Windows XP computer, it was hard to get it back, as you had to select the drive and stop it (Figure 4.7); just pulling it out could corrupt its data. Imagine a human assistant who worked like this:

Boss: Hi Bill, please read this book for me (puts book in Bill’s left hand).

Assistant: Bill has found a book in his hand!

Boss: OK, I want the book back now, Bill.

Assistant: Should I safely remove the book in my left or right hand?

Boss: The book in your left hand, Bill. The one I just gave you.

Assistant: Please confirm the book you mean by pointing to it, or the hand, or both, and say “OK”. Then I will try to give it to you.

Boss: (points to left hand) OK!

Assistant: (left hand tries to give book but the right hand is holding it too) I can’t give the book to you right now; try again later.

Boss: Give me the book back!

Assistant: Do you want me to safely remove the book in my left hand or my right hand …

Boss: (grabs book and walks away.)

A human helper who acted like this would be insolent. If the Windows software can discover that a USB drive is busy after the user selects it, why not check before? Is its time more important than the user’s? Most users, of course, just check the USB drive light and, if it is not flashing, pull the drive out, avoiding all the above. In a similar vein, a computer voice reports my phone messages like this:

There are five new messages. The first message received at 12.15pm on Wednesday the 14th of November is “<hang-up click>” To save this message press 1, to forward it press 3, to reply to it press 5, …, to delete it press 76.

Note: “76” was the actual delete-message code number, even though delete is probably the most used option, especially for hang-up calls. Again, imagine a human secretary who felt the need to report every detail of a call before telling you that the caller just hung up.

Figure 4.7: Windows XP eject USB interface


4.9 Polite Computing Requirements

Based on the previous definition (Whitworth, 2005), polite software should:

  • Respect the owner. Polite software respects owner rights, does not act preemptively, and does not change data without the permission of its owner.
  • Be visible. Polite software does not sneak around changing things in secret, but openly declares what it is doing and who it represents.
  • Be understandable. Polite software helps people make informed choices by giving information that is useful and understandable.
  • Remember past interactions. Polite software remembers its past interactions and so carries forward your past choices to future interactions.
  • Respond to you. Polite software responds to human directions rather than trying to pursue its own agenda.
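Read as a specification, these five properties could form an interface that any polite application implements. A minimal Python sketch of that reading follows, with names invented for illustration; each property is then discussed in detail below.

    from abc import ABC, abstractmethod

    class PoliteSoftware(ABC):
        """The five polite computing properties as abstract duties."""

        @abstractmethod
        def request_permission(self, change): ...      # 1. respect the owner

        @abstractmethod
        def declare_action(self, action, source): ...  # 2. be visible

        @abstractmethod
        def explain_choices(self, options): ...        # 3. be understandable

        @abstractmethod
        def recall_choice(self, context): ...          # 4. remember the past

        @abstractmethod
        def accept_correction(self, correction): ...   # 5. respond to the user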

1. Respect the user

Respect includes not taking another’s rightful choices. If two parties jointly share a resource, one party’s choices can deny the other’s; e.g. if I delete a shared file, you can no longer print it. Polite software should not preempt rightful user information choices regarding common resources such as the desktop, registry, hard drive, task bar, file associations, quick launch and other user configurable settings. Pre-emptive acts, like changing a browser home page without asking, act unilaterally on a mutual resource and so are impolite.

Information choice cases are rarely simple; e.g. a purchaser can use software but not edit, copy or distribute it. Such rights can be specified as privileges, in terms of specified information actors, methods, objects and contexts (see Chapter 6). Applying politeness in such cases requires a legitimacy baseline; e.g. a provider has no unilateral right to upgrade software on a computer the user owns (though the Microsoft Windows Vista End User License Agreement (EULA) seems to imply this). Likewise, users have no right to unilaterally alter the product source code. In such cases politeness applies; e.g. the software suggests an update and the user agrees, or the user requests an update and the software agrees (for the provider). Similarly, while a company that creates a browser owns it, the same logic means users own the data they create with the browser, e.g. a cookie. Hence, software cookies require user permission to create, and users can view, edit or delete them.

2. Be visible

Part of a polite greeting in most cultures is to introduce oneself and state one’s business. Holding out an open hand, to shake hands, shows that the hand has no weapon, and that nothing is hidden. Conversely, to act secretly behind another’s back, to sneak, or to hide one’s actions, for any reason, is impolite. Secrecy in an interaction is impolite because the other has no choice regarding things they do not know about. Hiding your identity reduces my choices, as hidden parties are untouchable and unaccountable for their actions. When polite people interact, they declare who they are and what they are doing.

If polite people do this, polite software should do the same. Users should be able to see what is happening on their computer. Yet when Windows Task Manager attributes a cryptic process like CTSysVol.exe to the user, it could be a system-critical process or one left over from a long-uninstalled product. This lack of transparency is why, after two or three years, Windows becomes “old”. Every installation of selfish software puts itself everywhere: it fills the taskbar with icons, the desktop with images, the disk with files and the registry with records. In the end, the computer owner has no idea which software is responsible for which files or registry records.

Selfish applications consider themselves important enough to load at start-up and run continuously, in case you need them. Many applications doing this slow down a computer considerably, whether a mobile phone or a desktop. Taskbar icon growth is just the tip of the iceberg, as some start-ups do not show on the taskbar. Selfish programs put files where they like, so uninstalled applications are not removed cleanly, and over time Windows accretes a “residue” of files and registry records left over from previous installs. Eventually, only reinstalling the entire operating system recovers system performance.

The problem is that the operating system keeps no transparent record of what applications do. An operating system Source Registry could link every process to its social source, giving contact and other details. Source could be a property of every desktop icon, context menu item, taskbar icon, hard drive file or any other resource. If each source creates its own resources, a user could then delete all resources allocated by a source they have uninstalled, without concern that they were system critical. Windows messages could also state their source, so that users know who a message is from. Application transparency would let users decide what to keep and what to drop.
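In essence, a Source Registry is a reverse index from every resource to the source that created it. A minimal Python sketch of the idea, under our own naming assumptions:

    class SourceRegistry:
        """Maps every resource (file, icon, registry record, process)
        back to the social source that created it."""

        def __init__(self):
            self._owner = {}    # resource -> source
            self._contact = {}  # source -> contact details

        def register_source(self, source, contact):
            self._contact[source] = contact

        def record(self, source, resource):
            self._owner[resource] = source

        def source_of(self, resource):
            return self._owner.get(resource, "unknown source")

        def uninstall(self, source):
            """Remove every resource a source allocated, and nothing else."""
            removed = [r for r, s in self._owner.items() if s == source]
            for r in removed:
                del self._owner[r]
            return removed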

3. Be Understandable

A third politeness property is to help users by offering understandable choices, as a user cannot properly choose from options they do not understand. Offering options that confuse is inconsiderate and impolite; e.g. a course text web site offers the choices:

  • OneKey Course Compass
  • Content Tour
  • Companion Website
  • Help Downloading
  • Instructor Resource Centre

It is unclear how the “Course Compass” differs from the “Companion Website”, and why both seem to exclude “Instructor Resources” and “Help Downloading”. Clicking on these choices, as is typical for such sites, leads only to further confusing menu choices. The impolite assumption is that users enjoy clicking links to see where they go. Yet information overload is a serious problem for web users, who have no time for hyperlink merry-go-rounds.

Not offering choices at all, on the grounds that users are too stupid to understand them, is also impolite. Installing software can be complex, but so is installing satellite TV technology, and those who install the latter do not just come in and take over. Satellite TV installers know that the user who pays expects to hear his or her choices presented in an understandable way. If not, the user may decide not to have the technology installed.

Complex installations are simplified by choice dependency analysis (how choices are linked), as Linux installers do. Letting a user choose to install an application minus a critical system component is not a choice but a trap. Application-critical components are part of the higher choice to install or not; e.g. a user’s permission to install an application may imply access to the hard drive, registry and start menu, but not to the desktop, system tray, favorites or file associations.
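Choice dependency analysis can be sketched as a transitive closure over a component dependency graph: choosing an application automatically selects everything it cannot run without. A minimal Python illustration, with hypothetical component names:

    def required_components(choice, depends_on):
        """Return a choice plus everything it transitively requires,
        so a user cannot pick an app minus a critical component."""
        needed, stack = set(), [choice]
        while stack:
            item = stack.pop()
            if item not in needed:
                needed.add(item)
                stack.extend(depends_on.get(item, []))
        return needed

    # Hypothetical install choices and their hard dependencies.
    depends_on = {
        "editor": ["spell_checker", "core_runtime"],
        "spell_checker": ["core_runtime"],
    }
    print(required_components("editor", depends_on))
    # {'editor', 'spell_checker', 'core_runtime'}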

4. Remember past interactions

It is not enough for software to give choices now but forget them later. If previous responses are forgotten, the user is forced to restate them, which is inconsiderate. Software that actually listens and remembers past user choices is a wonderful thing. Polite people remember previous encounters, but each time Explorer opens, it fills its preferred directory with files I do not want to see, and then asks me which directory I want, which is never the one displayed. Each time I tell it, and each time Explorer acts as if it were the first time I had used it. Yet I am the only person it has ever known. Why can’t it remember the last time and return me there? Because it is impolite by design.

Such “amnesia” is a hallmark of impolite software. Any document-editing software could automatically open the user’s last open document and put the cursor where they left off, or at least give that option (Raskin, 2000, p. 31). The user logic is simple: if I close the file, I am finished, but if I just exit without closing the document, then put me back where I was last time. It is amazing that most software cannot even remember the last user interaction. Even within an application like email, if one moves from inbox to outbox and back, it “forgets” the original inbox message, so one must scroll back to it; cf. browser tabs, which remember the user’s web page position.
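The “put me back where I was” logic is a few lines of state saved on exit and restored on start. A minimal Python sketch, assuming the remembered state is kept in a small JSON file (the file name and keys are our invention):

    import json, os

    STATE_FILE = os.path.expanduser("~/.last_session.json")

    def save_session(document, cursor_position):
        """On exit without closing: remember the document and position."""
        with open(STATE_FILE, "w") as f:
            json.dump({"document": document, "cursor": cursor_position}, f)

    def restore_session():
        """On start: reopen the last document at the last cursor position."""
        try:
            with open(STATE_FILE) as f:
                return json.load(f)
        except (FileNotFoundError, json.JSONDecodeError):
            return None  # first run, or nothing remembered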

5. Respond to you

Current “intelligent” software tries to predict user wants but cannot itself take correction; e.g. Word’s auto-correct function changes i = 1 to I = 1, but if you change it back, the software ignores your act. This software is clever enough to give corrections but not clever enough to take correction itself. Being responsive means responding to the user’s direction, not ignoring it.
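A responsive auto-correct needs only to notice when a user reverts its change and stop re-applying that rule. A minimal Python sketch of such behavior, with illustrative names:

    class ResponsiveAutoCorrect:
        """Auto-correct that takes correction: a rule the user
        reverts is suppressed rather than silently re-applied."""

        def __init__(self, rules):
            self.rules = dict(rules)  # typed text -> corrected text
            self.suppressed = set()   # rules the user has rejected

        def correct(self, word):
            if word in self.rules and word not in self.suppressed:
                return self.rules[word]
            return word

        def user_reverted(self, word):
            """The user changed the 'correction' back: respect that."""
            self.suppressed.add(word)

    ac = ResponsiveAutoCorrect({"i": "I"})
    print(ac.correct("i"))  # 'I' (software corrects)
    ac.user_reverted("i")   # user changes it back
    print(ac.correct("i"))  # 'i' (software now follows the user)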

The classic example of non-responsiveness was Mr. Clippy, Office 97’s paper clip assistant (Figure 4.3).

Figure 4.3: Mr. Clippy takes charge.

Searching the Internet for “Mr. Clippy” at the time gave comments like “Die, Clippy, Die!” (Gauze, 2003), yet its Microsoft designer still wondered: “If you think the Assistant idea was bad, why exactly?”

The answer, as one user noted, is: “It wouldn’t go away when you wanted it to. It interrupted rudely and broke your train of thought.” (Pratley, 2004)

To interrupt inappropriately disturbs the user’s train of thought. For complex work like programming, even short interruptions cause a mental “core dump”, as the user drops one thing to attend to another. The cost of an interruption is then not just the interruption time but also the recovery time (Jenkins, 2006); e.g. if it takes three minutes to refocus after an interruption, a one-second interruption every three minutes can reduce productivity to zero. Mr. Clippy was impolite, and this is why in XP it was replaced by side tags smart enough to know their place. In contrast to Mr. Clippy, tag clouds and reputation systems illustrate software that reflects rather than directs online users.
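The arithmetic behind this claim is easy to check. A quick Python sketch, assuming a fixed refocus time after every interruption:

    def productive_fraction(interval_s, interruption_s, refocus_s):
        """Fraction of focused work time when interruptions arrive
        every interval_s seconds and each costs its own duration
        plus refocus_s seconds of recovery."""
        cycle = interval_s + interruption_s
        focused = max(0, interval_s - refocus_s)
        return focused / cycle

    # A one-second interruption every three minutes, with a
    # three-minute refocus time, leaves no focused work at all:
    print(productive_fraction(180, 1, 180))  # 0.0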

Mr. Clippy, like all selfish software, was oblivious to how users saw it. Like Peter Sellers in the film “Being There”, selfish software likes to watch but cannot itself relate to others. Someone should explain to programmers that spying on a user is not a relationship. Mr. Clippy watched the user’s acts on the document but could not see its own interaction with the user, and so was oblivious to the rejection and scorn it evoked. It is a truism that most software today is less aware of its users than an airport toilet.

Selfish software, like a spoilt child, also repeatedly interrupts; e.g. a Windows update that advises the user when it starts, as it progresses, and when it finishes. Such modal windows interrupt users, seize the cursor and lose current typing. Since each time the update only needs the user to press OK, it is like being repeatedly interrupted to pat a self-absorbed kiddie on the head. The lesson of Mr. Clippy, that software serves the person and not the other way around, still needs to be learned.

It is hard for selfish software to keep appropriately quiet; e.g. Word can generate a table of contents from a document’s headings. But if one sends just the first chapter of a book to someone, with the book’s full table of contents to show its scope, every table of contents heading line without a page number loudly declares “ERROR! BOOKMARK NOT DEFINED”, which of course completely spoils the impression the sample document makes (Figure 4.4). Even worse, the effect is not apparent until the document is received. Why could the software not quietly put a blank instead of a page number? Why announce its needs so rudely? What counts is not what the software needs but what the user needs, and in this case the user needs the software to be quiet. The solution to such problems is polite computing.

Figure 4.4: A table of contents as emailed to a colleague (Word)


4.8 A Definition of Polite Computing

Politeness has generally been defined as being considerate of another in a social situation. If the person being considered knows what is “considerate” for them, politeness can be defined abstractly as the giving of choice to another in a social interaction. This is always considerate, given only that the other knows what is good for him or her; the latter assumption may not always hold, e.g. for a young baby. In a conversation, where the locus of channel control passes back and forth between parties, it is polite to give control to the other party (Whitworth, 2005): it is impolite to interrupt someone, as that removes their choice to speak, and polite to let them finish talking, as they then choose when to stop. This gives a definition of politeness that can be used in computing:

“… any unrequired support for situating the locus of choice control of a social interaction with another party to it, given that control is desired, rightful and optional.” (Whitworth, 2005, p. 355)

Unrequired means the choice given is more than required by the law, as a required choice is not politeness.

Optional means the polite party has the ability to choose, as politeness is voluntary.

Desired by the receiver means giving choice is only polite if the other wants it, e.g. “After you” is not polite when facing a difficult task. Politeness means giving desired choices, not forcing the locus of control, with its burden of action, upon others.

Finally, rightful means that consideration of someone acting illegally is not polite, e.g. considerately handing a gun to a serial killer about to kill someone is not politeness.

Based on this definition, we can formulate politeness for online cases.


 

4.7 Politeness and Etiquette

It is a mistake to define politeness as “being nice” to the other party, as some do (Nass, 2004). Nass gives the example where someone says “I’m a good teacher; what do you think?” and argues that polite people respond “You’re great”, even when they do not agree. He calls agreeing with another’s self-praise one of the “fundamental rules of politeness” (Nass, 2004, p. 36). Yet one can politely refuse, beg to differ, respectfully object and humbly criticize, i.e. disagree politely. Conversely, one can give to a charity in a rude way, i.e. be nice but rude. Being polite is thus different from being nice, as parents who are polite to their child may still not let it choose its own bedtime.

To apply politeness to computer programming, we must define it in information terms. Given the current definition of considering others, and since different societies consider others differently, what is polite in one culture may be rude in another. In one culture it may be polite to kiss on the cheek, while in another that could be taken as rude. There is no universal polite behavior, so there seems no basis for applying politeness to the logic of programming. Yet while different countries have different laws, the human goal of fairness that lies behind the law can be attributed to every society (Rawls, 2001). Likewise, different cultures can have different etiquettes but a common goal of considering others, i.e. politeness. In Figure 4.1, the physical practices of vengeance, law and etiquette derive from the human-level concepts of unfairness, legitimacy and politeness.

So while societies implement different practices of vengeance, law and etiquette, the aims of avoiding unfairness, enabling legitimacy and encouraging politeness remain the same. Just as legitimacy is the spirit behind the law, so politeness is the spirit behind etiquette; and just as legitimacy lets us generate new laws for online cases, so politeness lets us generate new etiquettes for online cases. Etiquette and law are the information-level reflections of the human-level concepts of politeness and legitimacy.

If politeness can take different forms in different societies, asking which implementation applies online is the wrong question. It is like asking a country which of another country’s laws it wants to adopt, when laws are generally home-grown for each community. Countries do not copy laws from other countries; they appropriate them, i.e. adapt them to their own case. Likewise, the question is not how to copy customs like shaking hands to online meetings, but how to reinvent politeness online, whether for chat, wiki, email or other groupware. Just as different physical societies develop different local etiquette and laws, so online communities will develop their own ethics and practices, with software playing a critical support role. While different applications may need different politeness implementations, we can develop general design “patterns” to specify politeness in information terms (Alexander, 1964).
