Fix and log kinesis duplicates #50

Merged · 4 commits · Dec 19, 2023
82 changes: 76 additions & 6 deletions app/models/kinesis_stream.rb
@@ -9,14 +9,84 @@ def initialize

  def create_events(payload)
    receive(payload)
-    # TO DO (possibly?): We may want to consider doing these upserts/inserts in batches to improve performance.
-    CommentEvent.upsert_all(@comment_events, unique_by: %i[comment_id event_time]) unless @comment_events.empty?
-    unless @classification_events.empty?
-      ClassificationEvent.upsert_all(@classification_events,
-                                     unique_by: %i[classification_id event_time])

    # ERAS is one of the only apps consuming the Kinesis stream that bulk upserts
    # (a Rails 6+ feature), which is how it has caught duplicates in the Kinesis payload.
    # See: https://zooniverse-27.sentry.io/issues/4717869260/?project=4506117954011141&query=is%3Aunresolved&referrer=issue-stream&statsPeriod=14d&stream_index=3
    # De-duplicating the payload by id before upserting would be enough for ERAS itself,
    # since ERAS only cares about counting each classification/comment once. Unfortunately,
    # other apps that rely on the Kinesis stream (e.g. Caesar, Tove) may have their results
    # affected by duplicates in the payload, and ERAS is one of the only places this error
    # can be caught (because of the bulk upsert). The team therefore decided that when
    # duplicates occur we log the error to Sentry along with the offending payload.
    # Note that this duplicate error has been seen before:
    # https://github.com/zooniverse/zoo-stats-api-graphql/pull/128
    # This catch, log, de-dupe and retry-the-upsert approach is TEMPORARY and is only in
    # place to find out what the duplicates in the Kinesis payload are.
    # Finally, per the Kinesis docs it is very possible for the stream to deliver duplicate
    # records, and AWS recommends that consumers handle them appropriately:
    # https://docs.aws.amazon.com/streams/latest/dev/kinesis-record-processor-duplicates.html
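    # [Editor's illustration, not part of this change; hypothetical values] The error
    # handled in the upsert_* methods below comes from Postgres: a single bulk upsert
    # whose rows repeat the same conflict key cannot be applied, e.g.
    #
    #   CommentEvent.upsert_all(
    #     [{ comment_id: 1, event_time: t }, { comment_id: 1, event_time: t }],
    #     unique_by: %i[comment_id event_time]
    #   )
    #   # => ActiveRecord::StatementInvalid (PG::CardinalityViolation):
    #   #    ON CONFLICT DO UPDATE command cannot affect row a second time
    #
    # Consumers that write one record at a time never hit this, which is why ERAS is
    # where the stream's duplicates become visible.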

    upsert_comments unless @comment_events.empty?
    upsert_classifications unless @classification_events.empty?
    upsert_classification_user_groups unless @classification_user_groups.empty?
  end

  def upsert_comments
    CommentEvent.upsert_all(@comment_events, unique_by: %i[comment_id event_time])
  rescue StandardError => e
    # Log the failure and the offending payload to Sentry.
    crumb = Sentry::Breadcrumb.new(
      category: 'upsert_error_in_comments',
      message: 'Comment Events Upsert Error',
      data: {
        payload: @comment_events,
        error_message: e.message
      },
      level: 'warning'
    )
    Sentry.add_breadcrumb(crumb)
    Sentry.capture_exception(e)
    # If the failure was caused by duplicate rows in the payload, de-dupe by comment_id and retry the upsert.
    if e.message.include?('ON CONFLICT DO UPDATE command cannot affect row a second time')
      @comment_events = @comment_events.uniq { |comment| comment[:comment_id] }
      retry
    end
  end

-    ClassificationUserGroup.upsert_all(@classification_user_groups.flatten, unique_by: %i[classification_id event_time user_group_id user_id]) unless @classification_user_groups.empty?

  def upsert_classifications
    ClassificationEvent.upsert_all(@classification_events, unique_by: %i[classification_id event_time])
  rescue StandardError => e
    crumb = Sentry::Breadcrumb.new(
      category: 'upsert_error_in_classifications',
      message: 'Classification Events Upsert Error',
      data: {
        payload: @classification_events,
        error_message: e.message
      },
      level: 'warning'
    )
    Sentry.add_breadcrumb(crumb)
    Sentry.capture_exception(e)
    if e.message.include?('ON CONFLICT DO UPDATE command cannot affect row a second time')
      @classification_events = @classification_events.uniq { |classification| classification[:classification_id] }
      retry
    end
  end

  def upsert_classification_user_groups
    ClassificationUserGroup.upsert_all(@classification_user_groups.flatten, unique_by: %i[classification_id event_time user_group_id user_id])
  rescue StandardError => e
    crumb = Sentry::Breadcrumb.new(
      category: 'upsert_error_in_classifications_user_groups',
      message: 'Classification User Groups Upsert Error',
      data: {
        payload: @classification_user_groups,
        error_message: e.message
      },
      level: 'warning'
    )
    Sentry.add_breadcrumb(crumb)
    Sentry.capture_exception(e)
    if e.message.include?('ON CONFLICT DO UPDATE command cannot affect row a second time')
      @classification_user_groups = @classification_user_groups.uniq { |cug| [cug[:classification_id], cug[:user_group_id]] }
      retry
    end
  end

  def receive(payload)
    # … (remaining lines collapsed in the diff view)
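
Editor's note: the three upsert_* methods in this diff share the same shape — bulk upsert, add a Sentry breadcrumb carrying the payload, capture the exception, then de-dupe the payload and retry when the duplicate-row error is hit. A possible consolidation is sketched below; it is illustrative only (not part of this PR), and upsert_with_dedupe, its parameters, and the single-retry guard are the editor's assumptions rather than anything in the diff.

  # Hypothetical helper (editor's sketch, not in this PR): one shared retrying upsert.
  def upsert_with_dedupe(model, rows, unique_by:, dedupe_by:, category:, message:)
    deduped = false
    begin
      model.upsert_all(rows, unique_by: unique_by)
    rescue StandardError => e
      Sentry.add_breadcrumb(
        Sentry::Breadcrumb.new(
          category: category,
          message: message,
          data: { payload: rows, error_message: e.message },
          level: 'warning'
        )
      )
      Sentry.capture_exception(e)
      # De-dupe on the chosen keys and retry once if Postgres rejected duplicate rows.
      if !deduped && e.message.include?('ON CONFLICT DO UPDATE command cannot affect row a second time')
        rows = rows.uniq { |row| dedupe_by.map { |key| row[key] } }
        deduped = true
        retry
      end
    end
  end

  # Example call, mirroring upsert_comments above:
  # upsert_with_dedupe(CommentEvent, @comment_events,
  #                    unique_by: %i[comment_id event_time],
  #                    dedupe_by: %i[comment_id],
  #                    category: 'upsert_error_in_comments',
  #                    message: 'Comment Events Upsert Error')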