MSC4276: Soft unfailure for self redactions #4276

Open · wants to merge 2 commits into base: main

Conversation

turt2live
Member

@turt2live turt2live commented Mar 20, 2025

Rendered

Author: @matrix-org/trust-safety
Shepherd: @matrix-org/trust-safety

Conflict of interest disclosure: Many members of the T&S team are Element employees and may serve additional roles outside their primary responsibility. Not all members of the T&S team are publicly known and disclosing all conflicts may reveal their identities - they have been excluded for this reason.

@turt2live turt2live changed the title from "MSC: Soft unfailure for self redactions" to "MSC4276: Soft unfailure for self redactions" on Mar 20, 2025
@turt2live turt2live added labels on Mar 20, 2025: proposal (A matrix spec change proposal), s2s (Server-to-Server API (federation)), kind:core (MSC which is critical to the protocol's success), needs-implementation (This MSC does not have a qualifying implementation for the SCT to review; the MSC cannot enter FCP), safety
@turt2live turt2live marked this pull request as ready for review March 20, 2025 18:19
Member Author

Implementation requirements:

  • Server

Comment on lines +3 to +5
When a user is removed from a room, the server may issue several redactions to their messages to clean
up rooms. Users may also use similar functionality, if supported by their server, to remove their own
messages after being removed.
Contributor

What is the context for a remote server or remote user redacting their own messages?

Member Author

Some servers apply redactions as an erasure technique.

Contributor

But why are the servers erasing messages? Can this please be added to the context in the MSC?

Comment on lines +3 to +5
When a user is removed from a room, the server may issue several redactions to their messages to clean
up rooms. Users may also use similar functionality, if supported by their server, to remove their own
messages after being removed.
Contributor

What does remove mean explicitly? Does it mean kicked, banned or both?

Member Author

No longer joined.

Contributor

Can this please be made explicit in the text?

Comment on lines +3 to +5
When a user is removed from a room, the server may issue several redactions to their messages to clean
up rooms. Users may also use similar functionality, if supported by their server, to remove their own
messages after being removed.
Contributor

If it means kicked or banned, then is good faith being assumed of remote users and servers?

Contributor

Ok, I guess the context is something like this: a user has registered on a server for use as a spam/brigade throwaway. This user has been banned by a bunch of remote servers. The server admins discover the throwaway and send redactions from the same user (hijacking the account) to clean up and try to undo some of the damage. In the rooms where the user has been banned, the redactions will have to be authorized against prior state and will be soft failed.

So I guess "good faith" here means both discovery of the throwaway account by server admins, and then also their cooperation. But that's not the right way to frame the situation. This is really a proposal to assist server admins in cleaning up accounts that have violated their terms of service.

Member Author

I'm not following the concern here, sorry.

Comment on lines +22 to +23
* Only the redactions received within 1 hour of the most recent membership event change can bypass
soft failure.
Contributor

What tooling is expected to be deployed by remote users or servers such that they send redactions within 1 hour?

Member Author

This is to deal with possible federation delays, not tooling delays. The redactions may take a little while to send.

Contributor
@Gnuxie Gnuxie Mar 20, 2025

I guess the problem I'm trying to communicate to you here is that if the user parts from a room an hour before the server admin responsible for them discovers that the account was used for abuse, then the redactions will be soft failed.

Additionally, not all server admins respond within 1 hour; the user may already have been banned hours and hours before being addressed by a server admin, potentially leaving soft-failed media (hi, race conditions) in place for other servers in the room.
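
To make the timing rule being debated here concrete, below is a minimal sketch (plain Python, not Synapse code) of the check the quoted lines describe: a redaction may bypass soft failure only when it targets one of the sender's own events and is received within one hour of the sender's most recent membership change. The store interface and helper names (`get_event`, `get_latest_membership_change`) and the use of a locally recorded receipt timestamp are assumptions made for illustration only.

```python
from datetime import timedelta

# A minimal sketch, assuming a hypothetical event-store interface; not a
# real homeserver API.
SELF_REDACTION_WINDOW_MS = int(timedelta(hours=1).total_seconds() * 1000)


def may_bypass_soft_failure(redaction: dict, store, received_ts_ms: int) -> bool:
    """Return True if this redaction qualifies for the proposed bypass."""
    if redaction.get("type") != "m.room.redaction":
        return False

    # `redacts` is a top-level key in older room versions and lives in
    # `content` from room version 11 onwards.
    redacts = redaction.get("redacts") or redaction.get("content", {}).get("redacts")
    if redacts is None:
        return False

    # Only redactions of the sender's *own* events are in scope.
    target = store.get_event(redaction["room_id"], redacts)
    if target is None or target["sender"] != redaction["sender"]:
        return False

    # The sender's most recent membership change in this room.
    membership = store.get_latest_membership_change(
        redaction["room_id"], redaction["sender"]
    )
    if membership is None:
        return False

    # "Received within 1 hour of the most recent membership event change":
    # compare when we received the redaction against the membership event's
    # origin_server_ts (both in milliseconds since the epoch).
    return received_ts_ms - membership["origin_server_ts"] <= SELF_REDACTION_WINDOW_MS
```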

Comment on lines +33 to +36
## Alternatives

Another approach could be to modify auth rules to exempt same-sender `m.room.redaction` events from the requirement
to pass authorization at the current resolved state. This approach may not work well with [how redactions work](https://spec.matrix.org/v1.13/rooms/v11/#handling-redactions).
Contributor
@Gnuxie Gnuxie Mar 20, 2025

MSC4194 is an alternative that does not require good faith in remote users and servers. For use by room moderators.

Member Author

MSC4194 requires this MSC in order to work - it's not an alternative.

Contributor
@Gnuxie Gnuxie Mar 20, 2025

MSC4194 is for use by room moderators, using the room moderator to send redactions rather than the target user. So it does not require this MSC, because the redaction events all use valid auth state. This is true even when redactions are sent for events that have locally been soft failed.

due to the authorization rules preventing the redaction event from being validated, despite being part of the
DAG at a legal point.

This proposal suggests that servers be less strict about soft failing self-redactions in particular.

I think servers should be more strict about self-redactions due to legal requirements, room policy (it would suck if someone removed useful information or framed someone by removing context), or moderation purposes (reporting messages after a ban, machine learning, etc.).
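
For readers following the thread, a rough sketch of where the proposed relaxation would sit in a server's receive path may help. Soft failure normally marks an event that fails authorization against the current resolved room state; the proposal only carves out qualifying self-redactions from that marking. The function below is illustrative: it takes the result of the usual current-state auth check as a parameter and reuses the hypothetical `may_bypass_soft_failure` helper from the earlier sketch.

```python
def decide_soft_failure(
    event: dict,
    store,
    passes_current_state_auth: bool,
    received_ts_ms: int,
) -> bool:
    """Return True if the event should be soft failed.

    Illustrative sketch only: `passes_current_state_auth` stands in for the
    usual authorization check against the current resolved room state, and
    `may_bypass_soft_failure` is the self-redaction check sketched earlier.
    """
    if passes_current_state_auth:
        return False  # authorized normally; no soft failure

    if may_bypass_soft_failure(event, store, received_ts_ms):
        # Proposed relaxation: a self-redaction that is valid at its position
        # in the DAG is accepted rather than soft failed, even though it fails
        # auth against the *current* state (e.g. the sender has since left).
        return False

    return True  # all other events keep today's soft-failure behaviour
```

This covers only the federation soft-failure decision itself; any additional local policy a server applies (retention, moderation review) is out of scope for the sketch.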
