Request for Proposals: Content removal affordances and semantics #259

Open · wesbiggs opened this issue Aug 30, 2023 · 2 comments

wesbiggs commented Aug 30, 2023

This issue is derived from #184 and ensuing discussions.

Definitions

Note: These are my interpretations of these concepts for the purposes of this discussion, and may not completely cover all colloquial usage of the terms. If you disagree with these assertions or feel different terminology is needed, please feel free to comment.

Regulatory compliance refers to the ability of a government or related institution to regulate and enforce the removal of speech it deems illegal. The specific acts of speech deemed illegal vary from jurisdiction to jurisdiction and may change over time. Regulatory compliance often extends liability to, or creates obligations for, service providers involved in hosting or disseminating content. It may also create obligations with respect to the handling of user data and data removal requests from users.

Contractual compliance refers to the ability of a service provider to enforce a contract (typically in the form of terms of service) with its customers (users), and often extends into rules for conduct and speech. Contractual compliance implies certain powers that are available to the service provider, including content removal, contract termination, and potentially fines.

Censorship is typically the name given to enforcement of regulations that are politically motivated and not universally accepted; through that lens, censorship is by nature an opinion one holds about the legitimacy of regulatory or contractual limitations on speech.

Moderation is the ability of an appointed entity to enforce speech norms in a specific context. Content that is otherwise compliant (legally and contractually) may still be subject to moderation depending on the context. Moderation can take the form of content screening, or can be applied post hoc. Traditionally, the operators of social networking platforms have either performed their own moderation or provided tools for users to self-moderate groups or discussions.

Curation has some conceptual overlap with moderation, but can be thought of as the ability of an entity that controls a user experience to prioritize or promote certain content, and to deprioritize or demote other content, in order to affect the likelihood that it is seen by the viewer. Curation is increasingly done algorithmically by social media platforms, but often has manual aspects. Users may have agency (alongside platform operators) in curation, such as choosing which users or topics to follow.

Content, for purposes of this discussion, means any speech conveyed via DSNP in the form of announcements.

Goals

We want DSNP to be practical for service providers and safe for users. What do we mean by this, and how do we achieve these goals?

Practical for service providers: Service providers should have the ability to enforce regulatory and contractual rules about the content that they (1) host and/or (2) disseminate. They should also be free to moderate and curate the experiences of users that they contract with.

Safe for users: Users should be free to seek out service providers that align with their values. They should feel safe that the DSNP ecosystem is not a haven for illegal content or activity. They should have agency over their own data and content.

Content removal

Content removal can take several forms.

  • View removal: A provider could choose to simply not show or reveal the existence of a content item via its user interface, regardless of what other providers choose to display.
  • Hosting removal: A provider could choose to delete or unpin content so that it is no longer available via its announced URL. It might have some mechanism to communicate this action to other applications or services, perhaps with appropriate information so that other providers (who may be caching the content) can decide whether they should also remove the item.
  • Network removal: An authorized entity could have a means of communicating a removal request that should be honored by all compliant participants. (Currently, a user or their delegate may do this via a Tombstone Announcement; a sketch follows this list.)
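For reference, here is a rough TypeScript sketch of the shape of a Tombstone Announcement. The field names reflect my reading of the spec and are illustrative only; the spec text itself is authoritative.

```typescript
// Illustrative only: field names follow my reading of the DSNP spec's
// Tombstone Announcement; consult the spec for the normative definition.
interface TombstoneAnnouncement {
  announcementType: 0;            // Tombstone
  fromId: bigint;                 // DSNP User Id of the announcer (or their delegate)
  targetAnnouncementType: number; // type of the announcement being retracted
  targetSignature: string;        // signature uniquely identifying the target announcement
}
```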

Note that the nature of a DSNP system precludes the ability to remove content announcements (that is, metadata containing the content URL and hash) from batch publications, so that is out of scope.

Affordances

Specifically, we are seeking proposals for the technical tooling, at the protocol level, that is required to provide the affordances necessary to meet these goals.

This might include:

  • New or updated Announcement Types.
  • New or updated Operations.
  • New or updated semantics and compliance requirements for the processing of announcements.

User stories for content removal

The proposal should address one or more (and maybe even all) of the following scenarios; a rough sketch of one possible shape for a provider-initiated removal follows the list:

  1. As a user, I want to remove an item of content I have previously announced via DSNP. This should be possible regardless of my current relationship with the provider hosting this content. (Currently implemented via Tombstone Announcement, with semantics of network removal.)
  2. As a service provider, I want to remove an item of content that violates my terms of service. This should be possible regardless of my past or current relationship with the user who created this content.
  3. As a designated moderator, I want to remove an item of content that violates my community's rules.
  4. As a service provider, I want to remove an item of content for regulatory compliance reasons. This should be possible regardless of my past or current relationship with the user who created this content.
  5. As a service provider, I no longer wish to host a piece of content because I am no longer contractually required to do so.
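To make the discussion concrete, here is one hypothetical shape for a provider- or moderator-initiated removal announcement covering scenarios 2 through 5. Nothing like this exists in the DSNP spec today; every name below is invented for discussion and offered as a starting point, not a definitive proposal.

```typescript
// Hypothetical sketch only: no announcement type like this exists in the
// DSNP spec today, and all names below are invented for discussion.
type RemovalReason =
  | "terms_of_service"  // scenario 2
  | "moderation"        // scenario 3
  | "regulatory"        // scenario 4
  | "hosting_expired";  // scenario 5

interface ProviderRemovalAnnouncement {
  announcementType: number;  // a new, as-yet-unassigned announcement type
  fromId: bigint;            // DSNP User Id of the provider or moderator
  targetContentHash: string; // hash identifying the content item
  reason: RemovalReason;     // machine-readable rationale
  scope: "view" | "hosting" | "network"; // which removal semantics apply
}
```

Whether a scope of "network" should ever be available to anyone other than the original announcer is exactly the kind of question a proposal would need to answer.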

Submissions welcome

Discussion, ideas, or submissions are all gratefully accepted. Submissions should describe the changes required to the DSNP specification, but a formal pull request to the spec repo is not necessary.

wesbiggs commented Mar 2, 2024

Partially addressed with #273

wesbiggs commented Mar 2, 2024

A good discussion on blocklists in IPFS: ipfs/notes#284

A rejected proposal on blocklist format: ipfs/notes#284

Various rabbit holes descend from those issues.

Protocol Labs operates https://badbits.dwebops.pub/, which has a deny list that's roughly 28MB in size as of this posting (428k entries). Note that there is no metadata included in this file.
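For a sense of how a consumer might use such a list, here is a minimal membership-check sketch. It assumes the badbits convention of one entry per line in the form //&lt;hex SHA-256 digest&gt;, with the digest taken over the CID plus a path separator; the exact hashing rule should be verified against the badbits format documentation before relying on this.

```typescript
import { createHash } from "node:crypto";
import { readFileSync } from "node:fs";

// Minimal deny-list check. Assumes each entry is "//" followed by a hex
// SHA-256 digest of "<cid>/<path>" (empty path here); confirm the exact
// preimage rule against the badbits format documentation.
function loadDenyList(path: string): Set<string> {
  const entries = new Set<string>();
  for (const line of readFileSync(path, "utf8").split("\n")) {
    if (line.startsWith("//")) entries.add(line.slice(2).trim());
  }
  return entries;
}

function isDenied(denyList: Set<string>, cid: string): boolean {
  const digest = createHash("sha256").update(`${cid}/`).digest("hex");
  return denyList.has(digest);
}
```

The double hashing means a consumer can block matching content without the list itself republishing the identifiers of the bad content, which also helps explain the lack of metadata noted above.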

One schema for takedown notices can be observed at https://lumendatabase.org/, which is a repository for voluntary submissions of things like DMCA requests.
