2019:Thriving in Safety/Three Contradictions of Safety
This is a Closed submission for Wikimania 2019. It has been reviewed and was not accepted.
Description
In this session we will discuss three fundamental contradictions of safety.
1. Content integrity -versus- "The Encyclopedia anyone can edit".
We are beginning to recognize the threat posed to our projects by misinformation and bias campaigns, and the danger that nation states can block, filter, or distort our content as it travels to the reader. However, proposed technical mitigations to these problems often complicate user contribution and editing. For example, an ideal cryptographically-signed and authenticated version of Wikipedia, which could be widely distributed via IPFS and sneakernet, would also be almost immediately out of date. Many distribution strategies fail to envision two-way information flows: content goes out, but new edits can't get back in. (Consider reading via Tor versus editing via Tor.) Guaranteeing content integrity by strongly authenticating immutable snapshots of our content runs into many practical obstacles: the "right to be forgotten", libel laws, biographies of living persons, DMCA takedowns, vandalism, and so on.
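To make the staleness problem concrete, here is a minimal sketch of signing and verifying an immutable snapshot. It assumes the third-party pyca/cryptography package; the snapshot format and all names are illustrative, not an existing Wikimedia scheme.

```python
# Minimal sketch of signing an immutable content snapshot.
# Assumes the third-party pyca/cryptography package; the snapshot
# format and names are illustrative, not a Wikimedia API.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


def snapshot_digest(articles: dict) -> bytes:
    """Hash a title->wikitext snapshot deterministically (sorted by title)."""
    h = hashlib.sha256()
    for title in sorted(articles):
        h.update(title.encode())
        h.update(b"\x00")
        h.update(articles[title].encode())
        h.update(b"\x00")
    return h.digest()


# The publisher signs the digest once; mirrors and readers verify it.
signing_key = Ed25519PrivateKey.generate()
snapshot = {"Alan Turing": "Alan Mathison Turing was ..."}
signature = signing_key.sign(snapshot_digest(snapshot))

# Verification succeeds on the pristine snapshot ...
public_key = signing_key.public_key()
public_key.verify(signature, snapshot_digest(snapshot))

# ... but the moment anyone edits, the signature no longer matches:
snapshot["Alan Turing"] += " [new edit]"
try:
    public_key.verify(signature, snapshot_digest(snapshot))
except InvalidSignature:
    print("edited snapshot fails verification; a new signed release is needed")
```

This is exactly the tension described above: every edit invalidates the signed release, so an authenticated distribution channel is out of date as soon as the wiki moves on.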
Furthermore, stronger anticensorship measures can backfire: by drawing attention to Wikipedia or to the underlying technology, they can get both blocked, often more broadly than would otherwise occur. Editors can thus be cut off from contributing by countermeasures that were aimed at preventing the reading of unrelated content.
2. Reputation system -versus- protecting our contributors.
Wikipedia is built on long-lived persistent identifiers for contributors: IP addresses and usernames. A variety of our processes depend on the long-term persistence of these identifiers: checkuser, autopatrolled, autoconfirmed, extended confirmed, and so on, and our existing IP-based reputation systems are incompatible with privacy-protecting technologies such as Tor. These processes make our contributors vulnerable to doxxing and worse. We also preserve user edit history over extremely long time scales and publish it indiscriminately; even careful users' real-world identities can often be inferred from something they let slip once over years of work on the project.
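An illustrative sketch of how this trust accrues to a persistent identifier appears below. The thresholds match English Wikipedia's defaults for autoconfirmed (4 days, 10 edits) and extended confirmed (30 days, 500 edits), but the code itself is a hypothetical simplification, not MediaWiki's implementation.

```python
# Illustrative sketch of identifier-persistence-based trust, in the
# style of autoconfirmed/extended-confirmed checks. Thresholds match
# English Wikipedia's defaults; the code is not MediaWiki's.
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class Account:
    registered: datetime
    edit_count: int


def is_autoconfirmed(a: Account, now: datetime) -> bool:
    # Trust accrues only to a long-lived, linkable identifier:
    # the very property that makes contributors traceable.
    return now - a.registered >= timedelta(days=4) and a.edit_count >= 10


def is_extended_confirmed(a: Account, now: datetime) -> bool:
    return now - a.registered >= timedelta(days=30) and a.edit_count >= 500


now = datetime(2019, 8, 1)
alice = Account(registered=datetime(2019, 7, 1), edit_count=42)
print(is_autoconfirmed(alice, now))       # True
print(is_extended_confirmed(alice, now))  # False: too few edits
```

An anonymity network such as Tor breaks exactly the linkage these checks rely on, which is why privacy for contributors and reputation for the project pull in opposite directions.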
In particular, the number of active editors and active patrollers on our projects is extremely small. Wikipedia is vulnerable to targeted attacks on these individuals, and we do very little to protect this valuable community.
Further, proposed peer-to-peer distribution strategies compound the legal jeopardy of our contributors: distributing content also means distributing the liability for content deemed outré in your particular legal regime; peer-to-peer systems make readers liable for pages they may not even be aware they are sharing.
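A toy model of how this happens follows; it is hypothetical, loosely modeled on Kademlia-style XOR-distance DHTs rather than on any specific deployed system. Content-addressed chunks are assigned to whichever peer's ID is "closest" to the chunk's hash, so a peer ends up hosting pages it never requested and may never have read.

```python
# Toy sketch (hypothetical, loosely DHT-like): each page is stored on
# whichever peer's hashed ID is closest to the page's hash by XOR
# distance, regardless of what that peer chose to read or share.
import hashlib


def h(data: str) -> int:
    """Derive a 64-bit ID from arbitrary content or a peer name."""
    return int.from_bytes(hashlib.sha256(data.encode()).digest()[:8], "big")


peers = ["peer-A", "peer-B", "peer-C", "peer-D"]
pages = ["Article about topic X", "Article about topic Y",
         "Article about topic Z"]

for page in pages:
    # XOR distance between page hash and peer hash picks the host;
    # the host has no say in, and may have no awareness of, the content.
    host = min(peers, key=lambda p: h(page) ^ h(p))
    print(f"{host} is made responsible for: {page!r}")
```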
The Wikimedia project is not just the content but a particular social model, embedded both in the MediaWiki codebase and in countless templates and policy pages on-wiki. Any shift in the reputation system, or in how edits are distributed or compensated, shifts the incentives in this social system and can have unexpected consequences.
3. "Every wiki is its own community" -versus- "scalability"
Decentralization has helped safeguard equity and diversity in our projects by ensuring the autonomy of each language group. However, it limits the ability of our communities and content to grow, and it often introduces needless incompatibilities and roadblocks as each community individually encounters, and must solve, the same problems. Centralization (global templates, shared workflows, and the like) promises to save redundant work and help our communities grow, but it comes at a price: centralization, by design, binds everyone to a single legal regime, and there may not be one single best regime.
In this session we will discuss the various trade-offs involved in resolving these three contradictions, and how specific technologies (peer-to-peer distribution, offline edit queues, blinded tokens) can contribute to solutions or complicate them.
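For the last of these, the sketch below shows a Chaum-style blinded token (an RSA blind signature): a server can certify a contributor's standing without being able to link the later redemption back to the signing request. The parameters are deliberately tiny and insecure; real deployments use large keys or schemes such as Privacy Pass.

```python
# Toy sketch of a Chaum-style blinded token (RSA blind signature).
# Parameters are tiny and insecure; this only illustrates the flow.
import hashlib
import secrets
from math import gcd

# Issuer key (toy primes, for illustration only).
p, q = 1000003, 1000033
n, e = p * q, 65537
d = pow(e, -1, (p - 1) * (q - 1))

# 1. Client hashes a fresh token and blinds it with a random factor r.
token = secrets.token_bytes(16)
m = int.from_bytes(hashlib.sha256(token).digest(), "big") % n
while True:
    r = secrets.randbelow(n - 2) + 2
    if gcd(r, n) == 1:
        break
blinded = (m * pow(r, e, n)) % n

# 2. Issuer signs the blinded value; it never sees m or the token,
#    so it cannot link the eventual redemption to this request.
blind_sig = pow(blinded, d, n)

# 3. Client unblinds to obtain an ordinary signature on m ...
sig = (blind_sig * pow(r, -1, n)) % n

# 4. ... which anyone can verify with the public key at redemption.
assert pow(sig, e, n) == m
print("token verifies; issuer cannot link it to the signing request")
```

Such tokens could, in principle, let an anonymous reader prove "some trusted party vouched for me" without revealing which account was vouched for, which is why they are interesting for the reputation-versus-privacy contradiction above.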
Relationship to the theme
This session will address the conference theme, "Wikimedia, Free Knowledge and the Sustainable Development Goals", in the following manner:
SDG #10: Reduced inequalities: by circumventing censorship and providing both read and edit access to everyone, we can reduce inequalities. Balancing local autonomy against global solutions that enhance scalability also helps ensure we don't propagate existing inequalities into our projects.
SDG #16: Peace, justice, and strong institutions: protecting the fundamental freedoms of our readers and editors.
Session outcomes
At the end of the session, the following will have been achieved:
- A better understanding of the trade-offs involved in safety, privacy, and security, and the ability to balance them when planning projects or interventions.
Session leader(s)
- C. Scott Ananian, Wikimedia Foundation
Session type
The format of this submission is a:
- Workshop to identify and try to solve a problem
- Roundtable discussion forum
Requirements
The session will work best with these conditions:
- Room (please be aware that the main room available will be a 30-person board room; we can try to arrange for other rooms, but cannot guarantee success):
Board room is fine. A small classroom or round table would also work.
- Audience:
10-20 people, no special prior knowledge.
- Recording:
Should be fine to record, although discussion of vulnerabilities might be easier if the session were not recorded.