
chore(sentry apps): add SLO context manager for send alert event (issue alerts) #86356

Merged: 40 commits into master from crl/slo-send-alert-webhook, Mar 14, 2025

Conversation

Christinarlong
Contributor

After I looked further into what notify_sentry_app was doing and who was calling it, it didn't feel super useful to add a context manager there. So currently the bounds are the send_alert_event task for PREPARE_WEBHOOK and send_and_save_webhook_request for the SEND_WEBHOOK bound.
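(A minimal sketch of what those two bounds could look like; the names measure_slo and InteractionType, and the logging calls, are illustrative stand-ins, not Sentry's actual internals.)

import logging
from contextlib import contextmanager
from enum import Enum

logger = logging.getLogger(__name__)


class InteractionType(Enum):
    PREPARE_WEBHOOK = "prepare_webhook"
    SEND_WEBHOOK = "send_webhook"


@contextmanager
def measure_slo(interaction: InteractionType):
    """Record the wrapped block as an SLO success or failure."""
    try:
        yield
    except Exception:
        logger.warning("slo.%s.failure", interaction.value)
        raise
    else:
        logger.info("slo.%s.success", interaction.value)


def send_alert_event(**kwargs):
    # PREPARE_WEBHOOK bound: build and validate the alert payload.
    with measure_slo(InteractionType.PREPARE_WEBHOOK):
        ...


def send_and_save_webhook_request(**kwargs):
    # SEND_WEBHOOK bound: deliver the request and persist the result.
    with measure_slo(InteractionType.SEND_WEBHOOK):
        ...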

@github-actions github-actions bot added the Scope: Backend Automatically applied to PRs that change backend components label Mar 4, 2025
@Christinarlong Christinarlong marked this pull request as ready for review March 5, 2025 00:02
@Christinarlong Christinarlong requested review from a team as code owners March 5, 2025 00:02
Member

@iamrajjoshi iamrajjoshi left a comment

lookin' good

@iamrajjoshi iamrajjoshi requested a review from a team March 5, 2025 00:07

codecov bot commented Mar 5, 2025

Codecov Report

Attention: Patch coverage is 98.91304% with 1 line in your changes missing coverage. Please review.

✅ All tests successful. No failed tests found.

Files with missing lines                      Patch %   Lines
src/sentry/sentry_apps/tasks/sentry_apps.py   96.87%    1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master   #86356      +/-   ##
==========================================
+ Coverage   87.79%   87.95%   +0.15%     
==========================================
  Files        9782     9753      -29     
  Lines      554157   553677     -480     
  Branches    21730    21292     -438     
==========================================
+ Hits       486528   486972     +444     
+ Misses      67237    66323     -914     
+ Partials      392      382      -10     

Base automatically changed from crl/sa-slos-context-manager to master March 10, 2025 18:06
@@ -210,7 +203,7 @@ def send_alert_webhook(
     "alert_rule_ui_component_webhook.sent",
     organization_id=organization.id,
     sentry_app_id=sentry_app_id,
-    event=f"{request_data.resource}.{request_data.action}",
+    event=str(SentryAppEventType.EVENT_ALERT_TRIGGERED),
Member

nit: don't need to cast a StrEnum to str
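(Side note on the nit: an enum.StrEnum member already is a str, so the cast is redundant. A standalone illustration on Python 3.11+; the member value here is assumed for the example.)

from enum import StrEnum


class SentryAppEventType(StrEnum):
    EVENT_ALERT_TRIGGERED = "event_alert.triggered"  # value assumed for illustration


event = SentryAppEventType.EVENT_ALERT_TRIGGERED
assert isinstance(event, str)                 # StrEnum members are real strs
assert event == "event_alert.triggered"       # they compare equal to their value
assert f"{event}" == "event_alert.triggered"  # and format as the value, no str() needed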

@@ -498,6 +491,7 @@ def send_resource_change_webhook(
 def notify_sentry_app(event: GroupEvent, futures: Sequence[RuleFuture]):
     for f in futures:
         if not f.kwargs.get("sentry_app"):
+            logger.info("notify_sentry_app.future_missing_sentry_app", extra={"future": f})
Member

should we raise an exception? calling action_inst.after (which calls notify_sentry_app for sentry app actions) is wrapped in safe_execute so i believe raising an error will log an exception

results = safe_execute(
action_inst.after,
event=event,
notification_uuid=notification_uuid,
)

i suppose this means the action is broken anyway, so hopefully we wouldn't migrate it cc @iamrajjoshi
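(For readers unfamiliar with safe_execute: it swallows exceptions from the wrapped callable and logs them instead of letting them propagate. A simplified sketch of that behavior, not Sentry's exact implementation:)

import logging

logger = logging.getLogger(__name__)


def safe_execute(func, *args, **kwargs):
    """Call func; if it raises, log the exception and return None."""
    try:
        return func(*args, **kwargs)
    except Exception:
        logger.exception("%s.failed", getattr(func, "__name__", "callable"))
        return None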

Contributor (Author)

if one future is missing a sentry app, does that mean the rest are broken? If we raise, we'll stop processing all the other futures. I can capture here though, if that works?

Member

you could call logger.error which would capture a sentry error and log, but not raise an exception
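(Concretely, that suggestion is a log-level change at the same call site. Assuming the standard Sentry logging integration, an error-level record is captured as an event while the loop keeps going; the trailing continue is assumed here.)

for f in futures:
    if not f.kwargs.get("sentry_app"):
        # error level is captured by the logging integration as a Sentry
        # event but does not raise, so the remaining futures still run.
        logger.error(
            "notify_sentry_app.future_missing_sentry_app",
            extra={"future": f},
        )
        continue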

Member

i would be curious if we are ever missing a future in the first place, i have a suspicion that check exists because of mypy
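(One plausible reading of that suspicion: dict.get returns an optional, so without the guard mypy rejects attribute access on the result. Illustrative types only, not the actual RuleFuture signature.)

from dataclasses import dataclass


@dataclass
class SentryApp:
    webhook_url: str


def deliver(kwargs: dict[str, SentryApp]) -> None:
    sentry_app = kwargs.get("sentry_app")  # inferred as SentryApp | None
    if not sentry_app:
        return  # without this guard, mypy flags the attribute access below
    print(sentry_app.webhook_url)  # OK: narrowed to SentryApp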

@Christinarlong Christinarlong merged commit 7736c94 into master Mar 14, 2025
49 checks passed
@Christinarlong Christinarlong deleted the crl/slo-send-alert-webhook branch March 14, 2025 20:44
@Christinarlong Christinarlong added the Trigger: Revert Add to a merged PR to revert it (skips CI) label Mar 14, 2025
@getsentry-bot
Contributor

PR reverted: fa6491f

getsentry-bot added a commit that referenced this pull request Mar 14, 2025
Revert "chore(sentry apps): add SLO context manager for send alert event (issue alerts) (#86356)"

This reverts commit 7736c94.

Co-authored-by: Christinarlong <[email protected]>
@github-actions github-actions bot locked and limited conversation to collaborators Mar 30, 2025