
feat: tm monitoring changes for better investigation #2842

Merged
merged 22 commits into from
Dec 7, 2024

Conversation

liorzam
Collaborator

@liorzam liorzam commented Nov 16, 2024

Summary by CodeRabbit

Release Notes

  • New Features

    • Introduced DataInvestigationService for enhanced transaction investigation functionalities.
    • Added new columns and relationships in the database for improved alert tracking related to counterparties.
    • Enhanced the transaction fetching capabilities by allowing optional counterpartyId parameters in various functions.
  • Improvements

    • Enhanced alert and transaction retrieval methods for better data handling and flexibility.
    • Updated query structures to allow optional parameters, improving usability in various functions.
    • Configured global caching for environment settings to streamline application initialization.
  • Bug Fixes

    • Refined error handling in multiple services to ensure more robust responses and logging.
  • Tests

    • Updated integration tests to reflect changes in alert and transaction data structures, enhancing testing accuracy.

These updates aim to improve user experience and system reliability across alert and transaction management features.

Contributor

coderabbitai bot commented Nov 16, 2024

Caution

Review failed

The pull request is closed.

Walkthrough

The pull request introduces several changes across multiple files in the workflows service. Key modifications include the addition of a testTimeout property to the Jest configuration, the introduction of a new dependency in the package.json, and significant updates to the database schema and Prisma models related to alerts and counterparties. New methods and types have been added to enhance the alert and transaction handling logic, along with refinements to existing methods for better error handling and data management.

Changes

File Change Summary
services/workflows-service/jest.config.cjs Added testTimeout: 30000 to set test timeout to 30 seconds.
services/workflows-service/package.json Added new dependency: "deepmerge": "^4.3.0".
services/workflows-service/prisma/migrations/.../migration.sql Added columns counterpartyBeneficiaryId and counterpartyOriginatorId to Alert table; created indexes and foreign key constraints.
services/workflows-service/prisma/schema.prisma Updated Alert and Counterparty models to include new fields and relations.
services/workflows-service/scripts/alerts/generate-alerts.ts Added fnInvestigationName alongside fnName in alert definitions; updated subjects field.
services/workflows-service/src/alert/alert.controller.external.ts Updated getAlerts method to include counterpartyOriginator and counterpartyBeneficiary.
services/workflows-service/src/alert/alert.service.ts Updated createAlert method to require projectId; refined error handling in various methods.
services/workflows-service/src/alert/types.ts Introduced new types and updated existing ones for stricter type enforcement.
services/workflows-service/src/case-management/controllers/... Updated getTransactions method to use a more specific service method.
services/workflows-service/src/data-analytics/data-analytics.service.ts Enhanced transaction evaluation methods and added new private methods for better filtering.
services/workflows-service/src/transaction/... Multiple updates across controllers, repositories, and services for transaction handling and query logic.
services/workflows-service/src/app.module.ts Updated ConfigModule configuration for global scope and validation.
apps/backoffice-v2/src/domains/transactions/... Various updates to transaction fetching logic, including optional parameters for counterpartyId.
services/workflows-service/src/data-analytics/data-investigation.service.ts Introduced DataInvestigationService with methods for transaction investigation.
services/workflows-service/src/data-analytics/utils.ts Added convertTimeUnitToMilliseconds function for time conversion.
services/workflows-service/src/transaction/transaction.module.ts Added DataAnalyticsModule to imports; removed DataAnalyticsService from providers.

Possibly related PRs

  • feat(tm): get transactions by alert #2615: This PR introduces a feature that allows users to retrieve transactions based on specific alerts, which is directly related to the changes in the main PR that modifies the Jest configuration for testing.
  • bal 2980 audit when alert has been changed #2807: This PR adds new timestamp fields (decisionAt and dedupedAt) to the Alert model, which enhances the tracking of alert states. This is relevant to the main PR as it may necessitate updates to tests that validate alert behavior.
  • fix: improve transaction monitoring - HVHAI_APM | MGAV_CC and fix flaky tests #2828: This PR improves transaction monitoring by adding new rules and enhancing existing tests. The changes in the main PR regarding Jest configuration could be related to the need for comprehensive testing of these new transaction monitoring features.
  • feat: add new rules #2707: This PR adds new alert definitions, which may require updates to the testing framework to ensure that these new alerts are adequately covered in tests.
  • Another fix for ongoing monitoring table #2779: This PR includes fixes and enhancements related to transaction management, which ties into the main PR's objective of improving transaction retrieval based on alerts.

Suggested labels

enhancement, Review effort [1-5]: 3

Suggested reviewers

  • MatanYadaev

🐰 In the land of code, where changes bloom,
New tests and alerts dispel the gloom.
With timeout set and types refined,
Our service grows, beautifully aligned.
So hop along, let’s celebrate,
For each new feature, we elevate! 🎉


📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR and between 3612697 and f2faf4b.

📒 Files selected for processing (2)
  • services/workflows-service/src/data-analytics/data-investigation.service.ts (1 hunks)
  • services/workflows-service/src/transaction/transaction.repository.ts (5 hunks)


Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 13

🧹 Outside diff range and nitpick comments (28)
services/workflows-service/prisma/migrations/20241021185057_add_alerts_counterparty_relation_for_advanced_filtering/migration.sql (1)

1-15: Consider adding a down migration for rollback scenarios.

While the migration is safe, it's recommended to provide a down migration that can cleanly reverse these changes if needed:

-- Down migration
ALTER TABLE "Alert" DROP CONSTRAINT "Alert_counterpartyOriginatorId_fkey";
ALTER TABLE "Alert" DROP CONSTRAINT "Alert_counterpartyBeneficiaryId_fkey";
DROP INDEX "Alert_counterpartyOriginatorId_idx";
DROP INDEX "Alert_counterpartyBeneficiaryId_idx";
ALTER TABLE "Alert" DROP COLUMN "counterpartyOriginatorId";
ALTER TABLE "Alert" DROP COLUMN "counterpartyBeneficiaryId";
services/workflows-service/jest.config.cjs (1)

4-4: Consider using test-specific timeouts instead of global timeout.

While increasing the global timeout will fix immediate issues, it's better to identify and explicitly set longer timeouts only for specific tests that need it. This helps maintain visibility of potentially slow tests and prevents masking performance issues.

Example of setting timeout for a specific test:

describe('slow operation', () => {
  it('should process complex transaction', async () => {
    // test implementation
  }, 30000); // per-test timeout in milliseconds
});
services/workflows-service/src/global.d.ts (1)

32-32: Consider removing 'any' from the union type for better type safety.

While adding TExecutionDetails improves type specificity, keeping any in the union type reduces TypeScript's ability to catch type-related issues during development. Consider if all possible types can be explicitly defined to remove the need for any.

-    executionDetails: TCustomerSubscription | TExecutionDetails | any;
+    executionDetails: TCustomerSubscription | TExecutionDetails;

If there are other specific types that need to be supported, they should be explicitly added to the union instead of using any.

services/workflows-service/src/alert/types.ts (1)

6-8: Consider enhancing type safety for mutual exclusivity

While the current type correctly enforces that either counterpartyOriginatorId or counterpartyBeneficiaryId must be present, it doesn't prevent both from being present simultaneously. This could lead to ambiguous situations.

Consider using a more explicit union type:

export type SubjectRecord = {
  counterpartyId?: string;
} & (
  | { counterpartyOriginatorId: string; counterpartyBeneficiaryId?: never }
  | { counterpartyBeneficiaryId: string; counterpartyOriginatorId?: never }
);
services/workflows-service/src/transaction/transaction.service.ts (2)

71-78: Add deprecation notice and improve documentation for versioned method.

Since this method has been versioned to V1 and a new getTransactions method has been introduced, consider:

  1. Adding a @deprecated decorator with migration instructions
  2. Documenting the differences between this and the new method
  3. Adding a timeline for removal if this is intended to be temporary
+/** @deprecated Use getTransactions() instead.
+ * This method is maintained for backward compatibility and will be removed in future versions.
+ */
 async getTransactionsV1(
   filters: GetTransactionsDto,
   projectId: string,
   args?: Parameters<typeof this.repository.findManyWithFilters>[2],
 ) {

71-81: Consider consolidating transaction retrieval methods.

The current implementation maintains two separate methods for retrieving transactions, which could lead to:

  1. Confusion about which method to use in different scenarios
  2. Maintenance overhead of supporting two filtering approaches
  3. Potential inconsistencies in how transactions are filtered

Consider:

  • Consolidating both methods into a single, flexible implementation
  • Using a query builder pattern to handle different filtering needs
  • Adding metrics/logging to track which filtering patterns are most used for monitoring
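
If consolidation is pursued, one possible shape is sketched below (hypothetical parameter names layered on the existing repository methods, not the current implementation):

async getTransactions(
  projectId: string,
  filters?: GetTransactionsDto,
  args?: Prisma.TransactionRecordFindManyArgs,
) {
  if (filters) {
    // Delegate to the filter-aware repository path when a DTO is supplied.
    return this.repository.findManyWithFilters(filters, projectId, args);
  }

  // Otherwise fall back to a plain project-scoped query.
  return this.repository.findMany(args ?? {}, projectId);
}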
services/workflows-service/src/project/project-scope.service.ts (1)

44-45: Make error messages consistent

The error messages are inconsistent: log mentions "Project IDs" while the error mentions "Project ID".

-      logger.error('Project IDs are required to scope the query', { data: args });
-      const error = new Error('Project ID is null, projectId required to scope the query');
+      this.logger.error('Project IDs are required to scope the query', { data: args });
+      const error = new Error('Project IDs are required to scope the query');
services/workflows-service/src/data-analytics/types.ts (1)

Line range hint 6-201: Consider optimizing type definitions through composition.

Several types share common properties (projectId, timeUnit, timeAmount, etc.). Consider:

  1. Creating a base interface for common properties
  2. Using type composition to reduce duplication
  3. Creating utility types for common patterns

Example:

interface BaseAnalyticsOptions {
  projectId: TProjectId;
  timeUnit: TimeUnit;
  timeAmount: number;
}

type HighTransactionTypePercentage = BaseAnalyticsOptions & {
  transactionType: TransactionRecordType;
  subjectColumn: Subject;
  minimumCount: number;
  minimumPercentage: number;
};
services/workflows-service/src/case-management/controllers/case-management.controller.ts (1)

72-75: Add OpenAPI documentation for better API visibility

Consider adding Swagger/OpenAPI decorators to document the response type and possible error cases.

  @Get('transactions')
+ @ApiOperation({ summary: 'Get transactions for a project' })
+ @ApiOkResponse({ description: 'Returns list of transactions' })
  async getTransactions(@CurrentProject() projectId: TProjectId) {
    return this.transactionService.getTransactions(projectId);
  }
services/workflows-service/src/alert/alert.controller.external.ts (1)

79-118: Consider extracting the duplicate selection structure.

The selection structure for counterpartyOriginator and counterpartyBeneficiary is identical. Consider extracting it into a constant to improve maintainability and reduce duplication.

const counterpartySelect = {
  select: {
    id: true,
    business: {
      select: {
        id: true,
        correlationId: true,
        companyName: true,
      },
    },
    endUser: {
      select: {
        id: true,
        correlationId: true,
        firstName: true,
        lastName: true,
      },
    },
  },
};

Then use it like:

counterpartyOriginator: counterpartySelect,
counterpartyBeneficiary: counterpartySelect,
services/workflows-service/prisma/schema.prisma (2)

837-837: Track the data migration task for removing the deprecated field.

The TODO comment indicates that counterpartyId needs to be removed after data migration. To ensure this technical debt is properly tracked, consider creating a GitHub issue.

Would you like me to create a GitHub issue to track the removal of the deprecated counterpartyId field after data migration?


841-844: Consider adding documentation for the counterparty relationship types.

While the schema changes are well-structured, adding documentation comments to explain the semantic difference between "originator" and "beneficiary" relationships would help future developers understand the business context better.

Add documentation comments above the relations:

  // Alert model
+ // Originator: The counterparty that initiated or is the source of the suspicious activity
  counterpartyOriginatorId String?
  counterpartyOriginator   Counterparty? @relation(name: "counterpartyAlertOriginator", fields: [counterpartyOriginatorId], references: [id])
+ // Beneficiary: The counterparty that benefits from or is the target of the suspicious activity
  counterpartyBeneficiaryId String?
  counterpartyBeneficiary   Counterparty? @relation(name: "counterpartyAlertBeneficiary", fields: [counterpartyBeneficiaryId], references: [id])

  // Counterparty model
+ // Alerts where this counterparty is identified as the beneficiary
  alertsBenefiting Alert[] @relation("counterpartyAlertBeneficiary")
+ // Alerts where this counterparty is identified as the originator
  alertsOriginating Alert[] @relation("counterpartyAlertOriginator")

Also applies to: 900-902

services/workflows-service/src/transaction/transaction.repository.ts (1)

Line range hint 60-67: Add validation for pagination parameters to prevent NaN values

Currently, parseInt is used to parse page.size and page.number without checking for NaN results. If the parsing fails, size and number will be NaN, leading to unexpected behavior in pagination calculations.

Consider adding validation to ensure size and number are valid positive integers before using them:

 if (getTransactionsParameters.page?.number && getTransactionsParameters.page?.size) {
   // Temporary fix for pagination (class transformer issue)
   const size = parseInt(getTransactionsParameters.page.size as unknown as string, 10);
   const number = parseInt(getTransactionsParameters.page.number as unknown as string, 10);

+  if (isNaN(size) || isNaN(number) || size <= 0 || number <= 0) {
+    // Handle invalid pagination parameters gracefully
+    args.take = 20;
+    args.skip = 0;
+  } else {
     args.take = size;
     args.skip = size * (number - 1);
+  }
 }

This ensures that the pagination will default to safe values if invalid input is provided.

services/workflows-service/src/transaction/transaction.controller.external.ts (4)

Line range hint 273-273: Security concern: authentication guards are commented out.

In the getTransactions (line 273) and getTransactionsByAlert (line 336) methods, the @UseCustomerAuthGuard() decorator is commented out. This may expose these endpoints to unauthorized access, posing a security risk. Please verify if this is intentional. If not, uncomment the guards to secure the endpoints.

Apply the following change:

-// @UseCustomerAuthGuard()
+@UseCustomerAuthGuard()

Also applies to: 336-336


Line range hint 80-80: Fix typo in variable name tranformedPayload.

The variable tranformedPayload on lines 80 and 116 appears to be misspelled. It should be transformedPayload.

Apply the following diff to correct the spelling:

-const tranformedPayload = TransactionEntityMapper.altDtoToOriginalDto(body.data);
+const transformedPayload = TransactionEntityMapper.altDtoToOriginalDto(body.data);

And:

-const tranformedPayload = body.map(({ data }) =>
-  TransactionEntityMapper.altDtoToOriginalDto(data),
-);
+const transformedPayload = body.map(({ data }) =>
+  TransactionEntityMapper.altDtoToOriginalDto(data),
+);

Also applies to: 116-116


419-419: Remove redundant null check for alert.

The null check for alert on line 419 is unnecessary because alert has already been validated earlier in the getTransactionsByAlertV2 method. Removing this redundant check will simplify the code.

Apply the following diff:

-    if (alert) {

Since alert is assured to be defined, you can proceed directly with the logic inside the conditional block.


Line range hint 368-395: Refactor duplicated include objects to improve maintainability.

The include objects in both getTransactionsByAlertV1 and getTransactionsByAlertV2 methods are identical and quite extensive. Consider extracting this object into a shared constant or utility function to reduce duplication and enhance maintainability.

Example:

const transactionIncludeOptions = {
  counterpartyBeneficiary: {
    select: { /* ... */ },
  },
  counterpartyOriginator: {
    select: { /* ... */ },
  },
};

Then use include: transactionIncludeOptions in both methods.

Also applies to: 428-464

services/workflows-service/scripts/alerts/generate-alerts.ts (10)

51-52: Redundant groupBy field in options

The groupBy field is set to ['counterpartyBeneficiaryId'], which is the same as the subjects field. If groupBy defaults to subjects when not explicitly defined, consider removing it to avoid redundancy.

Apply this diff to remove the redundant groupBy:

 options: {
   havingAggregate: AggregateType.SUM,
-  groupBy: ['counterpartyBeneficiaryId'],

54-54: Consistency in option formatting

There is an unnecessary empty line after the direction field. For better readability, consider removing the extra whitespace.

Apply this diff:

     direction: TransactionDirection.inbound,
-
     excludedCounterparty: {

81-81: Redundant groupBy field in options

The groupBy field duplicates the subjects field. If unnecessary, consider removing it to keep the code concise.

Apply this diff:

 options: {
   havingAggregate: AggregateType.SUM,
-  groupBy: ['counterpartyBeneficiaryId'],

115-115: Remove unnecessary empty line

Consider removing the empty line after the direction field to improve code readability.

Apply this diff:

     direction: TransactionDirection.inbound,
-
     excludedCounterparty: {

176-176: Remove redundant empty line

There's an unnecessary empty line after the direction field. Removing it can enhance code clarity.

Apply this diff:

     direction: TransactionDirection.inbound,
-
     excludedCounterparty: {

385-386: Undefined fnInvestigationName in HPC alert

The fnInvestigationName is set to undefined. If an investigation function is not required, consider explicitly documenting this decision.

Add a comment explaining why fnInvestigationName is undefined:

     fnName: 'evaluateHighTransactionTypePercentage',
+    // No investigation function needed for this alert
     fnInvestigationName: undefined,

656-657: Remove unnecessary empty lines

There are extra empty lines within the options object. Removing them can improve code readability.

Apply this diff:

     timeUnit: TIME_UNITS.days,
-
-
     subjectColumn: 'counterpartyBeneficiaryId',

682-683: Clean up code by removing extra lines

Consider removing the unnecessary empty lines to enhance code clarity.

Apply this diff:

     timeUnit: TIME_UNITS.days,
-
-
     subjectColumn: 'counterpartyBeneficiaryId',

708-709: Remove redundant empty lines

Extra empty lines are present before subjectColumn. Cleaning them up improves readability.

Apply this diff:

     timeUnit: TIME_UNITS.days,
-
-
     subjectColumn: 'counterpartyBeneficiaryId',

734-735: Eliminate unnecessary whitespace

Removing extra lines enhances code neatness.

Apply this diff:

     timeUnit: TIME_UNITS.days,
-
-
     subjectColumn: 'counterpartyBeneficiaryId',
services/workflows-service/src/data-analytics/data-analytics.service.ts (1)

Line range hint 793-839: Validate subjectColumn to Prevent SQL Injection

The subjectColumn parameter is used directly in SQL queries through Prisma.raw(subjectColumn), which can introduce SQL injection risks if not properly validated. It's crucial to ensure that subjectColumn contains only expected, safe values before using it in queries.

Apply this diff to validate subjectColumn against an allowed list:

+const allowedColumns = ['counterpartyBeneficiaryId', 'counterpartyOriginatorId', 'businessId'];
+if (!allowedColumns.includes(subjectColumn)) {
+  throw new Error('Invalid subject column');
+}
 query = Prisma.sql`SELECT ${Prisma.raw(
   subjectColumn,
 )} FROM "TransactionRecord" "tr" WHERE ${Prisma.join(
   conditions,
   ' AND ',
 )} GROUP BY "${Prisma.raw(subjectColumn)}"`;
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR and between c67a596 and 48f8606.

⛔ Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (17)
  • services/workflows-service/jest.config.cjs (1 hunks)
  • services/workflows-service/package.json (1 hunks)
  • services/workflows-service/prisma/data-migrations (1 hunks)
  • services/workflows-service/prisma/migrations/20241021185057_add_alerts_counterparty_relation_for_advanced_filtering/migration.sql (1 hunks)
  • services/workflows-service/prisma/schema.prisma (3 hunks)
  • services/workflows-service/scripts/alerts/generate-alerts.ts (26 hunks)
  • services/workflows-service/src/alert/alert.controller.external.ts (1 hunks)
  • services/workflows-service/src/alert/alert.service.ts (2 hunks)
  • services/workflows-service/src/alert/types.ts (1 hunks)
  • services/workflows-service/src/case-management/controllers/case-management.controller.ts (1 hunks)
  • services/workflows-service/src/data-analytics/data-analytics.service.ts (6 hunks)
  • services/workflows-service/src/data-analytics/types.ts (3 hunks)
  • services/workflows-service/src/global.d.ts (2 hunks)
  • services/workflows-service/src/project/project-scope.service.ts (2 hunks)
  • services/workflows-service/src/transaction/transaction.controller.external.ts (6 hunks)
  • services/workflows-service/src/transaction/transaction.repository.ts (5 hunks)
  • services/workflows-service/src/transaction/transaction.service.ts (1 hunks)
✅ Files skipped from review due to trivial changes (1)
  • services/workflows-service/prisma/data-migrations
🔇 Additional comments (50)
services/workflows-service/prisma/migrations/20241021185057_add_alerts_counterparty_relation_for_advanced_filtering/migration.sql (3)

1-3: LGTM! Safe schema evolution with nullable columns.

The addition of nullable columns is a safe operation that won't affect existing data.


11-15: Verify Counterparty table existence and monitor cascade behavior.

The foreign key constraints look good, but let's verify:

  1. The Counterparty table exists
  2. The impact of CASCADE updates on system performance
✅ Verification successful

Foreign key constraints are correctly defined and safe to use

The verification confirms:

  1. The Counterparty table exists, created in migration 20240227195716_transaction_monitoring
  2. The CASCADE updates are appropriate as:
    • The relationship is optional (ON DELETE SET NULL)
    • Other tables (TransactionRecord) already use the same cascade pattern with Counterparty
    • The table has a simple primary key structure
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Verify Counterparty table and estimate the impact of cascading updates

# Check if Counterparty table exists and get its structure
psql "$DATABASE_URL" -c "\d \"Counterparty\""

# Get foreign key relationships count to understand cascade impact
psql "$DATABASE_URL" -c "\
  SELECT count(*) as alert_count \
  FROM \"Alert\" \
  WHERE \"counterpartyOriginatorId\" IS NOT NULL \
     OR \"counterpartyBeneficiaryId\" IS NOT NULL;"

Length of output: 335


Script:

#!/bin/bash
# Let's verify the table existence and relationships through migration files

# Check for Counterparty table creation in migrations
fd migration.sql prisma/migrations | xargs grep -l "CREATE TABLE.*Counterparty"

# Look for the structure of Counterparty table
fd migration.sql prisma/migrations | xargs grep -A 10 "CREATE TABLE.*Counterparty"

# Check for any existing foreign keys to Counterparty
fd migration.sql prisma/migrations | xargs grep "FOREIGN KEY.*Counterparty"

Length of output: 559


Script:

#!/bin/bash
# Let's try a different approach to find the schema files

# Find all SQL files in the repository
fd -e sql

# Look for Counterparty table creation
rg "CREATE TABLE.*Counterparty" -A 10

# Look for foreign keys referencing Counterparty
rg "FOREIGN KEY.*Counterparty"

Length of output: 12054


5-9: Consider index creation timing for large tables.

The indexes will improve query performance, but their creation might temporarily impact database performance if the Alert table is large.

services/workflows-service/jest.config.cjs (1)

4-4: Verify which tests require increased timeout.

Let's identify which tests are hitting the default timeout to better understand if there are underlying performance issues that need addressing.

✅ Verification successful

Since we cannot execute Jest tests directly in the sandbox environment, let's try to analyze the test files for patterns that might indicate long-running tests, such as integration tests, database operations, or network calls.


Based on the analysis of the codebase, I can see there are numerous integration tests and API calls that could potentially require longer timeouts. Let's specifically look for tests that might be timing out by searching for explicit timeout settings in test files.


Let me modify the search to look for test files with different extensions and timeout patterns.


Let me try a different approach to find test files and timeout configurations.


Based on the analysis of the codebase, I can now provide a conclusive response about the timeout configuration:

The 30-second timeout is justified for integration tests and database operations

The increased timeout of 30,000ms in jest.config.cjs is warranted because:

  1. The project contains numerous integration tests (.intg.test.ts files) that interact with:

    • Database operations (Prisma)
    • External HTTP requests
    • Complex workflow processing
  2. The codebase includes global database setup/teardown operations:

    globalSetup: '<rootDir>/src/test/db-setup.ts',
    globalTeardown: '<rootDir>/src/test/db-teardown.ts',
  3. The service handles complex workflows with multiple steps, including:

    • Document processing
    • API integrations
    • Database transactions
    • File operations

The 30-second timeout provides adequate time for these operations while still catching potential performance issues. This is a reasonable default that balances between giving enough time for legitimate long-running operations and catching actual timeouts.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Find tests that take longer than the default timeout (5s)
# Expected results: List of slow tests that justify the timeout increase

# Run jest with verbose timing output
echo "Running tests with --verbose and json output to analyze execution times..."
npx jest --verbose --json --testTimeout=30000 > test-results.json 2>/dev/null

# Parse results to find tests taking > 5000ms
jq -r '.testResults[].assertionResults[] | select(.duration > 5000) | "\(.title): \(.duration)ms"' test-results.json

# Cleanup
rm test-results.json

Length of output: 348


Script:

#!/bin/bash
# Search for test files
echo "=== Test files ==="
fd -e test.js -e spec.js -e test.ts -e spec.ts

echo -e "\n=== Tests with explicit timeouts ==="
# Look for explicit test timeouts in test files
rg -n "timeout|jest\.setTimeout" --type ts --type js

echo -e "\n=== Integration/E2E test patterns ==="
# Look for patterns indicating integration tests
rg -n "describe|it" -A 1 --type ts --type js | rg -B 1 "integration|e2e|database|fetch|axios|request"

Length of output: 110693


Script:

#!/bin/bash
# Look for test files with explicit timeouts
echo "=== Test files with explicit timeouts ==="
rg -n "timeout|jest\.setTimeout" --type test

# Look for long-running operations in test files
echo -e "\n=== Test files with potentially long operations ==="
rg -n "integration|beforeAll|beforeEach" -A 2 --type test | rg -B 2 "database|fetch|axios|request|clean"

Length of output: 423


Script:

#!/bin/bash
# Look for test files with explicit timeouts
echo "=== Test files with explicit timeouts ==="
rg -n "timeout|jest\.setTimeout" ".*\.(test|spec)\.(ts|js)"

# Look for long-running operations in test files
echo -e "\n=== Test files with potentially long operations ==="
rg -n "(describe|it|test).*integration.*" -B 2 -A 5 ".*\.(test|spec)\.(ts|js)" | rg -B 2 "beforeAll|beforeEach|database|fetch|axios|request|clean"

# Look for jest config files that might override timeout
echo -e "\n=== Jest config files with timeout settings ==="
fd -e "cjs" -e "js" -e "ts" | rg "jest.config" | xargs rg -l "timeout"

Length of output: 726


Script:

#!/bin/bash
# Find all test files first
echo "=== All test files ==="
fd -e test.ts -e test.js -e spec.ts -e spec.js

# Look for timeout configurations in test files
echo -e "\n=== Test files with timeout settings ==="
fd -e test.ts -e test.js -e spec.ts -e spec.js -x rg -l "timeout|jest\.setTimeout" {}

# Look for jest config files
echo -e "\n=== Jest config files with timeout settings ==="
fd "jest.config" -e cjs -e js -e ts -x cat {}

Length of output: 8604

services/workflows-service/src/global.d.ts (1)

7-7: LGTM!

The import statement is correctly placed and follows the module pattern.

services/workflows-service/src/alert/types.ts (2)

1-1: LGTM: Import statement updated correctly

The addition of Prisma to the imports is necessary for the type changes in this file.


3-4: Address technical debt: Plan removal of 'counterpartyId'

The TODO indicates that counterpartyId should be removed, yet it's still included in the Subject type. This could lead to confusion and potential issues.

Let's verify if counterpartyId is still in use:

Would you like me to:

  1. Create an issue to track the removal of counterpartyId?
  2. Help identify all occurrences that need to be migrated to use the new specific IDs?
services/workflows-service/src/project/project-scope.service.ts (2)

5-6: LGTM: Clean dependency injection implementation

The addition of AppLoggerService and SentryService as protected dependencies follows good dependency injection practices and enables proper logging and error tracking.

Also applies to: 31-34


43-54: Good addition of security boundary check

The new null check for projectIds is crucial as it prevents potential data leakage by ensuring queries are always scoped to specific projects. This is a good security practice.

services/workflows-service/src/data-analytics/types.ts (4)

13-13: Review investigation name optionality pattern.

The fnInvestigationName property is required for some functions (like 'evaluateTransactionsAgainstDynamicRules') but optional for others. Consider:

  1. Is this intentional?
  2. Should there be a consistent pattern?
  3. Should all investigation names be required for better traceability?

Also applies to: 18-18, 23-23, 28-28, 33-33, 38-38, 43-43, 48-48, 53-53, 58-58, 63-63
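
If a single pattern is preferred, one option (a sketch, not the current type definitions) is to make the property uniformly present but explicitly nullable, so that "no investigation" is a deliberate choice rather than an omission:

type WithInvestigation = {
  // null signals "intentionally no investigation function" instead of omitting the field.
  fnInvestigationName: string | null;
};

type EvaluateTransactionsAgainstDynamicRules = WithInvestigation & {
  fnName: 'evaluateTransactionsAgainstDynamicRules';
  // ...rule-specific options
};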


200-200: Verify the impact of Subject type migration in DailySingleTransactionAmountType.

The change to use Subject type for subjectColumn is a breaking change. Ensure all implementations using this type have been updated.

#!/bin/bash
# Search for usages of DailySingleTransactionAmountType
ast-grep --pattern 'DailySingleTransactionAmountType'

# Search for any direct usage of the subjectColumn property
rg -A 2 "subjectColumn.*=" services/workflows-service/

96-96: Verify the impact of Subject type migration.

The change from string literals to Subject type for subjectColumn is a breaking change. Ensure all existing usages of HighTransactionTypePercentage have been updated accordingly.

✅ Verification successful

Based on the search results, I can now generate the final response:

Migration to Subject type appears to be properly aligned with usage

The change to Subject type is consistent with the codebase usage:

  • In generate-alerts.ts, subjectColumn is used with a string literal 'counterpartyOriginatorId' which is a valid Subject type value
  • In data-analytics.service.ts, the subjectColumn is used directly in a SQL query with proper type handling
  • All references to HighTransactionTypePercentage in the codebase are type-aware contexts (type definitions, function parameters)
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for usages of HighTransactionTypePercentage
ast-grep --pattern 'HighTransactionTypePercentage'

Length of output: 221


Script:

#!/bin/bash
# Search for HighTransactionTypePercentage with proper case and in different contexts
rg -A 5 "HighTransactionTypePercentage" 

# Search for files containing subjectColumn assignments or usage
rg -A 2 "subjectColumn.*=" 

# Search for type definitions and interfaces using Subject type
ast-grep --pattern "type $_ = {
  $$$
  subjectColumn: Subject;
  $$$
}"

# Search for interface definitions using Subject type
ast-grep --pattern "interface $_ {
  $$$
  subjectColumn: Subject;
  $$$
}"

Length of output: 5573


Line range hint 4-201: Verify type safety with comprehensive testing.

Given the significant type changes, especially around the Subject type migration and investigation names, ensure:

  1. TypeScript compiler checks pass without errors
  2. All consuming code has been updated
  3. Test coverage includes type checking
✅ Verification successful

Let me analyze the code further to verify the type safety concerns.


Let me check one more aspect to verify the type safety of the changes.


Type changes are properly implemented and safe

The type changes in services/workflows-service/src/data-analytics/types.ts are safe because:

  1. The Subject type is properly imported from @/alert/types and correctly used in the union type
  2. The function names in the discriminated union match their implementations in data-analytics.service.ts
  3. Each function has corresponding type-safe options defined and properly used
  4. The TODO comment about keeping only Subject type is aligned with the type definition in alert/types.ts

The type system correctly enforces type safety through discriminated unions and proper type imports. No type-related issues were found.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Look for any TypeScript errors in the codebase
rg "any|unknown" services/workflows-service/

Length of output: 157944


Script:

#!/bin/bash
# Check for type-related imports and usage in the file
ast-grep --pattern 'import { Subject } from "@/alert/types"' services/workflows-service/src/data-analytics/types.ts

# Look for Subject type usage
ast-grep --pattern 'Subject[]' services/workflows-service/src/data-analytics/types.ts

# Check for any type errors in related files
rg "Subject" services/workflows-service/src/alert/types.ts

Length of output: 498


Script:

#!/bin/bash
# Check for any references to these types in the codebase
rg "evaluateHighTransactionTypePercentage|evaluateTransactionsAgainstDynamicRules|evaluateCustomersTransactionType|evaluateTransactionAvg|evaluateDormantAccount|checkMerchantOngoingAlert|evaluateHighVelocityHistoricAverage|evaluateMultipleMerchantsOneCounterparty|evaluateMerchantGroupAverage|evaluateDailySingleTransactionAmount" services/workflows-service/src/

Length of output: 4997

services/workflows-service/package.json (2)

89-89: LGTM! Verify the version selection.

The addition of deepmerge with version ^4.3.0 looks good. The caret (^) allows for compatible minor version updates.

Let's verify the latest stable version and usage stats:


89-89: Verify security advisories for deepmerge.

Let's ensure the package is free from known vulnerabilities.

✅ Verification successful

Let me try another approach to verify the security of the deepmerge package using npm audit.


deepmerge@^4.3.0 appears to be secure

Based on the verification results:

  • No security vulnerabilities were found in GitHub's security advisory database
  • The package is actively maintained with the latest release 4.3.1 in March 2023
  • It's a lightweight package with zero dependencies
  • It's published with npm signatures and has proper integrity checks
  • The package is MIT licensed and hosted on GitHub with a clear maintenance history
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check for security advisories for deepmerge

# Check for known vulnerabilities in npm
curl -s https://registry.npmjs.org/-/npm/v1/security/advisories/search?package=deepmerge

# Check GitHub security advisories
gh api graphql -f query='
{
  securityVulnerabilities(first: 5, ecosystem: NPM, package: "deepmerge") {
    nodes {
      advisory {
        summary
        severity
        publishedAt
      }
      vulnerableVersionRange
      firstPatchedVersion {
        identifier
      }
    }
  }
}'

Length of output: 526


Script:

#!/bin/bash
# Check package security using npm audit
cd services/workflows-service && npm audit --json deepmerge || true

# Also check the specific version being used
npm view deepmerge@4.3.0 --json

Length of output: 7265

services/workflows-service/src/case-management/controllers/case-management.controller.ts (2)

72-75: Verify monitoring requirements coverage

Given that this PR focuses on "tm monitoring changes for better investigation", consider whether this endpoint should include additional monitoring-related data in the transaction response.

Consider:

  1. Adding monitoring status to transaction data
  2. Including investigation-related metadata
  3. Adding filtering capabilities for monitoring-specific queries

74-74: Verify API contract compatibility with consumers

The change from getAll({}, projectId) to getTransactions(projectId) might affect the response format. Ensure that all consumers of this endpoint are updated accordingly.

✅ Verification successful

API contract change is safe to proceed

The verification shows that the change is safe because:

  1. The new getTransactions method in TransactionService is a simplified version that maintains backward compatibility by accepting optional repository args
  2. The old getAll method was not found in any consumers, suggesting this was an internal refactor
  3. The endpoint's route and parameter structure remain unchanged
  4. The change aligns with having dedicated transaction-related methods instead of generic ones
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for API consumers using this endpoint
rg -l "case-management/transactions" --type ts

# Check if the old method had any filtering capabilities
ast-grep --pattern 'getAll($params, $projectId)' 

Length of output: 97


Script:

#!/bin/bash
# Let's try a broader search for transaction-related API calls
rg -l "getTransactions" --type ts

# Check the transaction service implementation
fd "transaction.service.ts" --type f --exec cat {}

# Look for any test files that might be using this endpoint
fd "case-management.controller.spec.ts" --type f --exec cat {}

# Search for any swagger/OpenAPI decorators on the endpoint
rg "@Get.*transactions" -A 5 --type ts

Length of output: 3923

services/workflows-service/src/alert/alert.service.ts (5)

19-19: LGTM: Import addition is appropriate

The Prisma import is correctly added for type definitions.


333-333: LGTM: Improved type safety with required projectId

The method signature change enforces better type safety by requiring projectId in alertDef parameter.


357-361: LGTM: Well-structured investigation filter implementation

The filter generation is properly implemented with:

  • Correct project scope
  • Consistent use of merged subject
  • Integration with dataAnalyticsService

362-362: LGTM: Strong type safety with satisfies operator

Good use of the satisfies operator to ensure type compatibility while maintaining type inference.
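
For readers unfamiliar with the operator, a minimal generic illustration (not the project's actual types):

type AlertSeverity = 'low' | 'medium' | 'high';
type AlertDefaults = { label: string; severity: AlertSeverity };

const defaults = {
  label: 'dormant-account',
  severity: 'high',
} satisfies AlertDefaults;

// The object is checked against AlertDefaults, but `defaults.severity` keeps the
// narrow literal type 'high' instead of widening to the full union.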


338-342: 🛠️ Refactor suggestion

Consider preserving historical dataTimestamp

Setting all timestamps (updatedAt, createdAt, dataTimestamp) to the same value might lose important historical context. The dataTimestamp should typically represent when the alerting condition occurred, which might be different from when the alert was created.

Let's check if dataTimestamp was used differently before:

Consider separating the timestamp logic:

-    const now = new Date();
+    const now = new Date();
+    const dataTimestamp = executionRow.timestamp 
+      ? new Date(executionRow.timestamp as string) 
+      : now;

     return this.alertRepository.create({
       data: {
         // ... other fields ...
         updatedAt: now,
         createdAt: now,
-        dataTimestamp: now,
+        dataTimestamp: dataTimestamp,
       },
     });

Also applies to: 364-366

services/workflows-service/prisma/schema.prisma (2)

841-844: LGTM! Well-structured counterparty relationships.

The new fields and relations effectively separate originator and beneficiary counterparties, with proper indexing for query performance. This granular approach will improve the ability to track and analyze alert patterns.

Also applies to: 854-856


900-902: LGTM! Proper bidirectional relationships.

The new alert relations in the Counterparty model correctly establish the bidirectional relationship with the Alert model, making it easy to query alerts from either perspective.

services/workflows-service/src/transaction/transaction.repository.ts (3)

9-10: Addition of deepmerge import

The deepmerge library is correctly imported and will be useful for merging argument objects.


30-33: Updated findMany method with optional args parameter

Making args optional in the findMany method enhances flexibility. The use of deepmerge ensures that default arguments are properly merged with any provided args.
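
A minimal illustration of that merge behaviour (generic values; the real defaults live inside the repository):

import deepmerge from 'deepmerge';

const defaults = { where: { projectId: 'project-1' }, take: 20 };
const callerArgs = { where: { counterpartyBeneficiaryId: 'cp-1' }, take: 50 };

// Nested objects such as `where` are merged key-by-key, while scalar conflicts
// (take) resolve in favour of the caller-provided value.
const merged = deepmerge(defaults, callerArgs);
// => { where: { projectId: 'project-1', counterpartyBeneficiaryId: 'cp-1' }, take: 50 }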


Line range hint 72-94: Verify the merging order of options and args in findManyWithFilters

In the findManyWithFilters method, options is spread before args:

{
  ...options,
  where: {
    ...this.buildFiltersV1(getTransactionsParameters),
  },
  ...args,
}

Please verify that this merging order is intentional. The properties in args will override those in options if they overlap. Ensure that this does not unintentionally overwrite important parameters from options, especially skip, take, and orderBy.

services/workflows-service/src/transaction/transaction.controller.external.ts (3)

411-413: Review usage of filters parameter in getTransactionsByAlertV2.

The filters parameter is passed to getTransactionsByAlertV2 (lines 411-413) but is not used within the method body. If additional filtering is intended, please implement the logic using filters. If not, consider removing the parameter to avoid confusion.

If removing, apply this diff:

-private getTransactionsByAlertV2({
-  filters,
   projectId,
   alert,
 }: {
-  filters: GetTransactionsByAlertDto;
   projectId: string;
   alert: Awaited<ReturnType<AlertService['getAlertWithDefinition']>>;
 }) {

And update the method call on line 355:

-return this.getTransactionsByAlertV2({ filters, projectId, alert });
+return this.getTransactionsByAlertV2({ projectId, alert });

Also applies to: 415-417


421-427: ⚠️ Potential issue

Ensure the where clause is always defined to prevent query issues.

In the where clause of the getTransactions call, both alert.executionDetails.filters and the result of getInvestigationFilter might be undefined or null. This could lead to the where clause being undefined, potentially causing runtime errors in the database query. Ensure that at least one of these provides a valid filter or add a default condition.

Please confirm that either alert.executionDetails.filters or getInvestigationFilter will always return a valid filter object.
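
If neither source is guaranteed, a defensive sketch (hypothetical names and error message, not the existing code) could look like:

const where =
  alert.executionDetails?.filters ?? this.getInvestigationFilter(projectId, alert);

if (!where) {
  // Fail fast rather than issuing an unscoped or malformed query.
  throw new BadRequestException(`Alert ${alert.id} has no usable transaction filters`);
}

return this.transactionService.getTransactions(projectId, { where });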


423-427: ⚠️ Potential issue

Add null checks before casting to InlineRule to prevent runtime errors.

In getInvestigationFilter, you're casting alert.alertDefinition.inlineRule to InlineRule without checking if it's defined (line 425). Similarly, alert.executionDetails.subjects is used without null checks. To prevent potential runtime errors, ensure these properties are not null or undefined before usage.

Please confirm that alert.alertDefinition.inlineRule and alert.executionDetails.subjects are always defined at this point. If not, add appropriate null checks or default values.
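
One way such guards might look (a sketch with hypothetical error messages):

if (!alert.alertDefinition?.inlineRule) {
  throw new BadRequestException(`Alert ${alert.id} has no inline rule to investigate`);
}

if (!alert.executionDetails?.subjects) {
  throw new BadRequestException(`Alert ${alert.id} has no execution subjects`);
}

// Only cast once the values are known to be present.
const inlineRule = alert.alertDefinition.inlineRule as InlineRule;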

services/workflows-service/scripts/alerts/generate-alerts.ts (19)

47-48: Ensure the correct function name for investigation

The fnInvestigationName is set to 'investigateTransactionsAgainstDynamicRules'. Please verify that this function is correctly implemented and available for use. If it doesn't exist, it may cause runtime errors when the alert triggers an investigation.


77-78: Verify the investigation function name

Similar to previous alerts, ensure that 'investigateTransactionsAgainstDynamicRules' is properly implemented for the PAY_HCA_APM alert's fnInvestigationName.


108-109: Ensure investigation function exists

The fnInvestigationName is set for the STRUC_CC alert. Verify that investigateTransactionsAgainstDynamicRules is implemented and correctly handles this alert type.


140-141: Confirm correctness of investigation function

For the STRUC_APM alert, ensure that investigateTransactionsAgainstDynamicRules is appropriate for handling the alert's logic.


169-170: Check subjects field for accuracy

The subjects field includes both 'counterpartyBeneficiaryId' and 'counterpartyOriginatorId'. Verify that this is intentional and that the alert should be grouped by both fields.


200-201: Ensure alert ID and function names are correct

Confirm that the fnInvestigationName and fnName are correctly set for the HACI_APM alert and that they match the intended implementations.


290-291: Confirm subjects for CHVC_C alert

The subjects field is set to ['counterpartyOriginatorId']. Ensure this aligns with the alert's intended focus on chargebacks related to originators.


313-314: Ensure fnInvestigationName is set appropriately

For SHCAC_C, make sure the investigation function is correctly defined and implemented.


336-337: Verify transaction types in options

The transactionType is set to [TransactionRecordType.refund] for the CHCR_C alert. Confirm that this is correct and includes all necessary transaction types.


359-360: Check severity level assignment

The SHCAR_C alert has a defaultSeverity of high. Ensure that this severity level is appropriate for the alert's criteria.


404-404: Ensure function implementation for TLHAICC

The fnName is 'evaluateTransactionAvg'. Verify that this function is implemented and handles the logic for the TLHAICC alert.


427-427: Consistency in subjects between similar alerts

The subjects for TLHAIAPM are ['counterpartyBeneficiaryId']. Confirm that this is consistent with related alerts and intended functionality.


450-450: Check customer type value

The customerType is set to 'test' in the PGAICT alert. Ensure this is the intended value and that it aligns with the actual customer types in use.


474-474: Review customerType assignment

Similar to PGAICT, the customerType in PGAIAPM is set to 'test'. Confirm that this is correct and matches the expected values.


538-542: Inconsistent subjects between HVHAI_CC and HVHAI_APM

The subjects in HVHAI_APM are ['counterpartyBeneficiaryId'], while in HVHAI_CC, they are ['counterpartyId']. Verify whether this difference is intentional.


642-643: Confirm investigation function for DSTA_CC

Ensure that 'investigateDailySingleTransactionAmount' is properly implemented and suitable for the DSTA_CC alert.


668-669: Verify function implementations

For the DSTA_APM alert, ensure that both evaluateDailySingleTransactionAmount and investigateDailySingleTransactionAmount functions are correctly implemented.


694-695: Check functions for DMT_CC alert

Verify that evaluateDailySingleTransactionAmount and investigateDailySingleTransactionAmount are appropriate for the DMT_CC alert's logic.


720-721: Ensure correct implementation of functions

For the DMT_APM alert, confirm that the specified functions are implemented and functioning as intended.

services/workflows-service/src/data-analytics/data-analytics.service.ts (1)

974-1000: Method investigateTransactionsAgainstDynamicRules Appears Correct

The investigateTransactionsAgainstDynamicRules method constructs a filter object based on the provided options. The implementation aligns with the expected functionality.

Comment on lines +14 to +15
subject: SubjectRecord;
filters: Prisma.TransactionRecordWhereInput;
Contributor

💡 Codebase verification

Based on the gathered evidence, I can now generate the final response:

Breaking change detected in TExecutionDetails.subject

The change from array-based subjects to a single SubjectRecord is indeed a breaking change that affects existing code:

  • transaction.controller.external.ts uses alert.executionDetails.subjects (plural form), indicating it expects an array
  • alert.service.ts contains logic that maps over multiple subjects using inlineRule.subjects

The type change will cause compilation errors in these locations and needs to be reverted or properly migrated with corresponding changes in the consuming code.
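
Roughly, the shape change being flagged looks like this (simplified sketch, not the exact definitions):

// Before: consumers iterated over an array of loosely typed subjects.
type TExecutionDetailsBefore = {
  subjects: Array<Record<string, unknown>>;
};

// After: a single strongly typed subject plus a Prisma-typed filter.
type TExecutionDetailsAfter = {
  subject: SubjectRecord;
  filters: Prisma.TransactionRecordWhereInput;
};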

🔗 Analysis chain

Verify the impact of subject type change

The change from Array<Record<string, unknown>> to SubjectRecord is a breaking change that could affect existing code. However, the new type provides better type safety.

Let's check for existing usage:

The use of Prisma.TransactionRecordWhereInput for filters provides better type safety and alignment with the database schema.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for TExecutionDetails usage
ast-grep --pattern 'TExecutionDetails' -l ts

# Search for subject array access patterns
rg "subject\[" --type ts

Length of output: 432


Script:

#!/bin/bash
# Let's check how subject is actually used in the codebase
rg "\.subject" --type ts -A 3

# Also check for direct TExecutionDetails usage patterns
ast-grep --pattern 'TExecutionDetails = {
  $$$
}'

ast-grep --pattern 'TExecutionDetails> = {
  $$$
}'

# And check actual variable declarations using subject
ast-grep --pattern 'subject: $_'

Length of output: 6302


Script:

#!/bin/bash
# Let's check the actual implementation and usage in alert.service.ts
rg "executionDetails" --type ts -B 2 -A 2

# And check for array-based subject usage
ast-grep --pattern 'subjects: $_'

Length of output: 10073


changeset-bot bot commented Nov 16, 2024

⚠️ No Changeset found

Latest commit: f2faf4b

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types


Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 3

🧹 Outside diff range and nitpick comments (2)
services/workflows-service/src/data-analytics/data-analytics.service.ts (2)

Line range hint 793-845: Security: Potential SQL injection vulnerability

The subjectColumn parameter is directly interpolated into the SQL query using Prisma.raw without proper validation. This could lead to SQL injection attacks if the input is not properly sanitized.

Consider implementing these safeguards:

+ const allowedColumns = ['businessId', 'counterpartyBeneficiaryId', 'counterpartyOriginatorId'];
+ if (!allowedColumns.includes(subjectColumn)) {
+   throw new Error(`Invalid subject column: ${subjectColumn}`);
+ }
  query = Prisma.sql`SELECT ${Prisma.raw(
    subjectColumn,
  )} FROM "TransactionRecord" "tr" WHERE ${Prisma.join(
    conditions,
    ' AND ',
  )} GROUP BY "${Prisma.raw(subjectColumn)}"`;

880-882: Remove TODO comments and disabled eslint rules

The code contains TODO comments and disabled eslint rules that should be addressed before merging.

Consider implementing proper type handling instead of using @ts-ignore:

- // @ts-ignore - TODO: Replace logic with proper implementation for each rule
- // eslint-disable-next-line
- let { timeAmount, timeUnit } = inlineRule.options;
+ type TimeOptions = { timeAmount: number; timeUnit: string };
+ let { timeAmount, timeUnit } = inlineRule.options as TimeOptions;
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR and between 48f8606 and aade29f.

📒 Files selected for processing (3)
  • services/workflows-service/src/app.module.ts (1 hunks)
  • services/workflows-service/src/data-analytics/data-analytics.service.ts (6 hunks)
  • services/workflows-service/src/transaction/transaction.controller.external.ts (6 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • services/workflows-service/src/transaction/transaction.controller.external.ts
🔇 Additional comments (2)
services/workflows-service/src/data-analytics/data-analytics.service.ts (2)

433-433: LGTM: Column addition is consistent with query structure

The addition of counterpartyBeneficiaryId to the SELECT clause aligns with its usage in the GROUP BY clause.


973-1007: LGTM: Well-structured type-safe implementation

The investigateTransactionsAgainstDynamicRules method is well-implemented with:

  • Proper type safety using satisfies operator
  • Clear object spread syntax for optional conditions
  • Correct handling of payment method filters

Comment on lines +80 to +84
ConfigModule.forRoot({
isGlobal: true,
envFilePath: [`.env.${process.env.ENVIRONMENT_NAME}`, '.env'],
cache: true,
}),
Contributor

⚠️ Potential issue

Remove duplicate ConfigModule configuration

There are two separate ConfigModule.forRoot() configurations which could lead to conflicts and inconsistent environment loading. The first configuration (lines 80-84) and the second one (around line 110) are both marked as global and handle environment files differently.

Merge both configurations into a single comprehensive one:

-    ConfigModule.forRoot({
-      isGlobal: true,
-      envFilePath: [`.env.${process.env.ENVIRONMENT_NAME}`, '.env'],
-      cache: true,
-    }),
     // ... other modules ...
-    ConfigModule.forRoot({
-      validate,
-      isGlobal: true,
-      load: [configs],
-      envFilePath: env.ENV_FILE_NAME ?? '.env',
-    }),
+    ConfigModule.forRoot({
+      validate,
+      isGlobal: true,
+      load: [configs],
+      cache: true,
+      envFilePath: env.ENV_FILE_NAME 
+        ? [env.ENV_FILE_NAME]
+        : [`.env.${process.env.ENVIRONMENT_NAME}`, '.env'],
+    }),

This consolidation:

  1. Maintains the validation logic
  2. Keeps the caching feature
  3. Provides a unified environment file loading strategy
  4. Prevents potential conflicts from multiple global configurations

Committable suggestion skipped: line range outside the PR's diff.

@liorzam liorzam marked this pull request as draft November 16, 2024 22:23
@liorzam liorzam changed the title feat: tm monitoring changes for better investigation DRAFT feat: tm monitoring changes for better investigation Nov 16, 2024
@ballerine-io ballerine-io deleted a comment from coderabbitai bot Nov 16, 2024
@liorzam liorzam changed the title DRAFT feat: tm monitoring changes for better investigation feat: tm monitoring changes for better investigation Nov 21, 2024
@liorzam liorzam self-assigned this Nov 23, 2024
@liorzam liorzam marked this pull request as ready for review November 23, 2024 22:07
Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 8

🧹 Outside diff range and nitpick comments (24)
services/workflows-service/src/alert/types.ts (3)

4-4: Reminder: Address the TODO comment

There is a TODO note to remove counterpartyId from SubjectRecord. Please ensure this is addressed to prevent any unused or deprecated references in the codebase.

Would you like assistance in updating the SubjectRecord type and ensuring that counterpartyId is removed throughout the codebase?


5-9: Consider removing counterpartyId from the Subject type

The Subject type currently includes 'counterpartyId', but the TODO comment suggests removing it from SubjectRecord. For consistency and to prevent potential misuse, consider removing 'counterpartyId' from the Subject type if it's no longer needed.


48-55: Consider consolidating duplicate types for counterparty information

The structures for counterpartyBeneficiary and counterpartyOriginator are identical. Consider creating a shared type to reduce duplication and improve maintainability.

Apply this diff to introduce a shared type:

+type TCounterpartyInfo = {
+  business: Pick<Business, 'id' | 'companyName' | 'correlationId'>;
+  endUser: Pick<EndUser, 'id' | 'firstName' | 'lastName' | 'correlationId'>;
+};

export type TAlertTransactionResponse = TAlertResponse & {
  counterparty: TCounterpartyInfo;
-  counterpartyBeneficiary: {
-    business: Pick<Business, 'id' | 'companyName' | 'correlationId'>;
-    endUser: Pick<EndUser, 'id' | 'firstName' | 'lastName' | 'correlationId'>;
-  };
-  counterpartyOriginator: {
-    business: Pick<Business, 'id' | 'companyName' | 'correlationId'>;
-    endUser: Pick<EndUser, 'id' | 'firstName' | 'lastName' | 'correlationId'>;
-  };
+  counterpartyBeneficiary: TCounterpartyInfo;
+  counterpartyOriginator: TCounterpartyInfo;
};
services/workflows-service/src/data-analytics/data-investigation.service.ts (2)

261-279: Remove unnecessary async keyword

The method investigateHighVelocityHistoricAverage is declared as async but does not contain any await expressions. Removing the async keyword can prevent confusion and adhere to best practices.

Apply this change:

-      async investigateHighVelocityHistoricAverage(options: HighVelocityHistoricAverageOptions) {
+      investigateHighVelocityHistoricAverage(options: HighVelocityHistoricAverageOptions) {

318-318: Correct spelling of 'Subtract' in variable name

The variable baseSubstractByMin contains a typo. The correct spelling is baseSubtractByMin. Renaming it improves code readability.

Update the variable name:

-        const baseSubstractByMin = timeAmount * 60 * 1000;
+        const baseSubtractByMin = timeAmount * 60 * 1000;

Ensure all references to this variable are updated accordingly.

apps/backoffice-v2/src/domains/transactions/query-keys.ts (1)

8-13: Add JSDoc to document parameter optionality

Consider adding JSDoc comments to clearly document the parameters, especially noting that counterpartyId is optional and any implications this might have.

 list: ({
   page,
   pageSize,
   ...params
 }: {
+  /** ID of the alert associated with the transactions */
   alertId: string;
+  /** Optional ID of the counterparty to filter transactions */
   counterpartyId?: string;
+  /** Page number for pagination */
   page: number;
+  /** Number of items per page */
   pageSize: number;
 }) => {
apps/backoffice-v2/src/domains/transactions/hooks/queries/useTransactionsQuery/useTransactionsQuery.tsx (1)

25-25: Consider performance implications of simplified enabled condition

Removing the counterpartyId check from the enabled condition could lead to unnecessary API calls when counterpartyId is undefined. Consider adding a validation check or maintaining the previous condition if the backend requires this parameter.

-    enabled: isAuthenticated,
+    enabled: isAuthenticated && (!counterpartyId || typeof counterpartyId === 'string'),
services/workflows-service/src/data-analytics/data-analytics.module.ts (1)

Line range hint 8-10: Consider resolving circular dependencies

The module has circular dependencies with BusinessReportModule and AlertModule (as indicated by eslint comments). While forwardRef() works as a temporary solution, consider:

  1. Extracting shared functionality into a separate module
  2. Using events/message patterns
  3. Restructuring the module boundaries

This would improve maintainability and reduce coupling between modules.
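As a rough sketch of option 2 (decoupling through events with @nestjs/event-emitter instead of a circular module import); the event name and payload shape are assumptions:

import { Injectable } from '@nestjs/common';
import { EventEmitter2, OnEvent } from '@nestjs/event-emitter';

// The analytics side emits an event instead of injecting AlertService directly.
@Injectable()
export class DataAnalyticsEvents {
  constructor(private readonly eventEmitter: EventEmitter2) {}

  notifyRuleMatched(alertDefinitionId: string, subject: Record<string, unknown>) {
    this.eventEmitter.emit('analytics.rule.matched', { alertDefinitionId, subject });
  }
}

// The alert side subscribes, so neither module has to import the other.
@Injectable()
export class AnalyticsRuleMatchedListener {
  @OnEvent('analytics.rule.matched')
  handleRuleMatched(payload: { alertDefinitionId: string; subject: Record<string, unknown> }) {
    // Create or deduplicate the alert here, using services local to AlertModule.
  }
}

EventEmitterModule.forRoot() already appears to be registered globally (see the verification of app.module.ts later in this review), so this pattern would not require extra wiring; forwardRef() remains available where a hard dependency is unavoidable.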

services/workflows-service/src/transaction/transaction.module.ts (1)

16-16: LGTM! Good architectural improvement.

Moving the analytics functionality to a dedicated module improves modularity and separation of concerns. The DataAnalyticsModule import correctly replaces the direct provider dependency.

This change:

  • Improves code organization by centralizing analytics logic
  • Reduces coupling between transaction and analytics components
  • Makes the analytics functionality more reusable across the application
services/workflows-service/src/data-analytics/utils.ts (2)

38-38: Remove unnecessary initialization.

The multiplier variable is initialized to 0 but will always be assigned a value in the switch statement or throw an error. The initialization can be removed.

-  let multiplier = 0;
+  let multiplier: number;

40-52: Consider using an object map for better maintainability.

The switch statement could be replaced with an object map for a more concise and maintainable solution.

-  switch (dedupeWindow.timeUnit) {
-    case 'days':
-      multiplier = 24 * 60 * 60 * 1000; // Convert days to milliseconds
-      break;
-    case 'hours':
-      multiplier = 60 * 60 * 1000; // Convert hours to milliseconds
-      break;
-    case 'minutes':
-      multiplier = 60 * 1000; // Convert minutes to milliseconds
-      break;
-    default:
-      throw new Error(`Unknown time unit: ${dedupeWindow.timeUnit}`);
-  }
+  const timeUnitMultipliers = {
+    days: 24 * 60 * 60 * 1000,    // Convert days to milliseconds
+    hours: 60 * 60 * 1000,        // Convert hours to milliseconds
+    minutes: 60 * 1000,           // Convert minutes to milliseconds
+  };
+  
+  const multiplier = timeUnitMultipliers[dedupeWindow.timeUnit];
+  if (!multiplier) {
+    throw new Error(`Unknown time unit: ${dedupeWindow.timeUnit}`);
+  }
services/workflows-service/src/alert/alert.repository.ts (4)

Line range hint 20-28: Fix inconsistency between method signature and implementation

The method signature allows for custom ordering through the orderBy parameter, but the implementation ignores it and uses a hardcoded ordering (createdAt: 'desc'). This creates a misleading API contract.

Consider one of these solutions:

// Solution 1: Honor the provided orderBy parameter
  async findFirst<T extends Pick<Prisma.AlertFindFirstArgs, 'where' | 'orderBy'>>(
    args: Prisma.SelectSubset<T, Pick<Prisma.AlertFindFirstArgs, 'where' | 'orderBy'>>,
    projectIds: TProjectIds,
  ) {
    const queryArgs = this.scopeService.scopeFindFirst(args, projectIds);
    return await this.prisma.extendedClient.alert.findFirst({
      where: queryArgs.where,
-     orderBy: {
-       createdAt: 'desc',
-     },
+     orderBy: args.orderBy || { createdAt: 'desc' }, // Use provided orderBy or fall back to default
    });
  }

// Solution 2: Remove orderBy from the signature if it's not meant to be configurable
- async findFirst<T extends Pick<Prisma.AlertFindFirstArgs, 'where' | 'orderBy'>>(
-   args: Prisma.SelectSubset<T, Pick<Prisma.AlertFindFirstArgs, 'where' | 'orderBy'>>,
+ async findFirst<T extends Pick<Prisma.AlertFindFirstArgs, 'where'>>(
+   args: Prisma.SelectSubset<T, Pick<Prisma.AlertFindFirstArgs, 'where'>>,

Line range hint 66-68: Fix typo and improve error message

There's a typo in the error message ("opeartion"), and the message could be more specific about the requirement.

    if (!projectId) {
-     throw new Error('Project ID is required to perform an update opeartion on Alerts');
+     throw new Error('Project ID is required to perform an update operation on Alerts. Please provide a valid project ID.');
    }

Line range hint 77-81: Address TypeScript type issue

The @ts-expect-error comment suggests an underlying type issue that should be resolved rather than suppressed.

Consider fixing the type issue by properly typing the where clause:

- // @ts-expect-error - TS is not able to infer the type of where
- ...queryArgs.where,
+ ...(queryArgs.where as Prisma.AlertWhereInput),

Alternatively, investigate why TypeScript can't infer the type correctly and fix the underlying issue.


Line range hint 95-112: Document or remove commented code

The commented-out deleteById method should either be removed if it's no longer needed or documented with a reason for keeping it.

If the method needs to be kept for future reference, add a clear comment explaining why:

- // // Method to delete an alert by ID
+ // Method to delete an alert by ID
+ // TODO: This method is temporarily disabled until we implement proper deletion policies
+ // Issue #XXXX tracks the re-enablement of this functionality
services/workflows-service/src/transaction/transaction.repository.ts (1)

Line range hint 65-73: Consider improving the pagination parameter parsing

The current implementation includes a temporary workaround for parsing pagination parameters. This might indicate an underlying issue with class-transformer that should be addressed more permanently.

Consider:

  1. Using class-transformer's @Transform decorator to handle the type conversion
  2. Adding proper validation using class-validator
  3. Documenting why this workaround is needed
export class PageDto {
  @Transform(({ value }) => parseInt(value, 10))
  size: number;

  @Transform(({ value }) => parseInt(value, 10))
  number: number;
}
apps/backoffice-v2/src/domains/transactions/fetchers.ts (1)

Line range hint 180-185: Add JSDoc documentation for the optional parameter

Consider adding JSDoc documentation to explain when counterpartyId can be omitted and the expected behavior in each case.

 export const fetchTransactions = async (params: {
+  /** 
+   * Optional ID to filter transactions by counterparty.
+   * When omitted, returns transactions for all counterparties.
+   */
   counterpartyId?: string;
   page: {
     number: number;
     size: number;
   };
 }) => {
services/workflows-service/src/transaction/transaction.controller.external.ts (4)

Line range hint 245-284: Consider extracting the repeated include pattern

The counterparty include pattern is duplicated across multiple methods. Consider extracting it into a constant or helper method to improve maintainability.

private readonly counterpartyInclude = {
  counterpartyBeneficiary: {
    select: {
      correlationId: true,
      business: {
        select: {
          correlationId: true,
          companyName: true,
        },
      },
      endUser: {
        select: {
          correlationId: true,
          firstName: true,
          lastName: true,
        },
      },
    },
  },
  counterpartyOriginator: {
    // Same pattern as counterpartyBeneficiary
  },
};

349-352: Address the backward compatibility comment

The comment indicates this is temporary: "Backward compatibility will be remove soon". Consider:

  1. Adding a deprecation timeline
  2. Creating a tracking issue for the removal

Would you like me to help create a tracking issue for the backward compatibility removal?


350-350: Consider making the type check more explicit

The isEmpty check on executionDetails.filters could be more explicit about the expected type.

-if (isEmpty((alert.executionDetails as TExecutionDetails).filters)) {
+const executionDetails = alert.executionDetails as TExecutionDetails;
+if (!executionDetails?.filters || isEmpty(executionDetails.filters)) {

406-414: Consider improving type safety of alert parameter

The type for the alert parameter could be more specific than using Awaited.

type AlertWithDefinition = {
  id: string;
  executionDetails: TExecutionDetails;
  alertDefinition: {
    // Add specific fields needed
  };
};
services/workflows-service/src/data-analytics/data-analytics.service.ts (3)

Line range hint 739-764: Improve query readability with CTEs and formatting

The query logic is correct but could be more maintainable. Consider breaking down the complex query into more descriptive CTEs (Common Table Expressions) and improving the formatting.

 WITH tx_by_business AS (
-    (SELECT "tr"."counterpartyBeneficiaryId" as "counterpartyBeneficiaryId",
-            "b"."businessType",
-            COUNT("tr".id) FILTER (
-                                   WHERE ${transactionsOverAllTimeClause}) AS "transactionCount",
-            COUNT("tr".id) FILTER (
-                                   WHERE ${recentDaysClause}) AS "recentDaysTransactionCount"
+    SELECT 
+      "tr"."counterpartyBeneficiaryId" as "counterpartyBeneficiaryId",
+      "b"."businessType",
+      COUNT("tr".id) FILTER (WHERE ${transactionsOverAllTimeClause}) AS "transactionCount",
+      COUNT("tr".id) FILTER (WHERE ${recentDaysClause}) AS "recentDaysTransactionCount"
     FROM "TransactionRecord" AS "tr"
     JOIN "Counterparty" AS "cp" ON "tr"."counterpartyBeneficiaryId" = "cp".id
     JOIN "Business" AS "b" ON "cp"."businessId" = "b".id
     WHERE ${Prisma.join(conditions, ' AND ')}
     GROUP BY 
+      "tr"."counterpartyBeneficiaryId",
+      "b"."businessType"
-     GROUP BY "tr"."counterpartyBeneficiaryId",
-              "b"."businessType"
     HAVING COUNT("tr".id) FILTER (
       WHERE tr."transactionDate" < CURRENT_DATE - INTERVAL '7 days'
     ) > COUNT("tr".id) FILTER (
       WHERE tr."transactionDate" >= CURRENT_DATE - INTERVAL '7 days'
     )
 ),

Line range hint 831-843: Enhance type safety for rule types

The current implementation uses string literals for rule types. Consider using an enum or union type for better type safety and error handling.

+type RuleType = 'amount' | 'count';
+
 async evaluateDailySingleTransactionAmount({
   projectId,
-  ruleType,
+  ruleType: RuleType,
   ...
 }) {
   // ...
   if (ruleType === 'amount') {
     // ...
   } else if (ruleType === 'count') {
     // ...
   } else {
-    throw new Error(`Invalid rule type: ${ruleType}`);
+    const exhaustiveCheck: never = ruleType;
+    throw new Error(
+      `Unhandled rule type: ${exhaustiveCheck}. This should never happen.`
+    );
   }
 }

Line range hint 645-655: Consider adding indexes for performance optimization

The queries involve complex calculations and joins on transaction dates and counterparty IDs. Consider adding appropriate indexes to improve query performance.

Recommended indexes:

  1. TransactionRecord(projectId, transactionDate)
  2. TransactionRecord(counterpartyBeneficiaryId, transactionDate)
  3. TransactionRecord(counterpartyOriginatorId, transactionDate)
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR and between aade29f and 544b0ef.

⛔ Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (21)
  • apps/backoffice-v2/src/domains/transactions/fetchers.ts (1 hunks)
  • apps/backoffice-v2/src/domains/transactions/hooks/queries/useTransactionsQuery/useTransactionsQuery.tsx (2 hunks)
  • apps/backoffice-v2/src/domains/transactions/query-keys.ts (1 hunks)
  • apps/backoffice-v2/src/pages/TransactionMonitoringAlertsAnalysis/hooks/useTransactionMonitoringAlertsAnalysisPageLogic/useTransactionMonitoringAlertsAnalysisPageLogic.tsx (1 hunks)
  • services/workflows-service/package.json (1 hunks)
  • services/workflows-service/scripts/alerts/generate-alerts.ts (28 hunks)
  • services/workflows-service/src/alert/alert.controller.external.ts (3 hunks)
  • services/workflows-service/src/alert/alert.repository.ts (1 hunks)
  • services/workflows-service/src/alert/alert.service.intg.test.ts (28 hunks)
  • services/workflows-service/src/alert/alert.service.ts (6 hunks)
  • services/workflows-service/src/alert/types.ts (2 hunks)
  • services/workflows-service/src/data-analytics/data-analytics.module.ts (2 hunks)
  • services/workflows-service/src/data-analytics/data-analytics.service.ts (11 hunks)
  • services/workflows-service/src/data-analytics/data-investigation.service.ts (1 hunks)
  • services/workflows-service/src/data-analytics/types.ts (2 hunks)
  • services/workflows-service/src/data-analytics/utils.ts (1 hunks)
  • services/workflows-service/src/project/project-scope.service.ts (1 hunks)
  • services/workflows-service/src/transaction/transaction.controller.external.ts (6 hunks)
  • services/workflows-service/src/transaction/transaction.module.ts (1 hunks)
  • services/workflows-service/src/transaction/transaction.repository.ts (5 hunks)
  • services/workflows-service/src/transaction/transaction.service.ts (2 hunks)
🚧 Files skipped from review as they are similar to previous changes (6)
  • services/workflows-service/package.json
  • services/workflows-service/scripts/alerts/generate-alerts.ts
  • services/workflows-service/src/alert/alert.controller.external.ts
  • services/workflows-service/src/alert/alert.service.ts
  • services/workflows-service/src/data-analytics/types.ts
  • services/workflows-service/src/project/project-scope.service.ts
🔇 Additional comments (48)
services/workflows-service/src/alert/types.ts (3)

23-24: Verify default values for dedupeWindow

In the TDedupeStrategy type, the new dedupeWindow property is added. Ensure that all implementations of TDedupeStrategy provide appropriate default values for dedupeWindow to prevent undefined behavior.


26-28: Type safety for timeUnit in DedupeWindow

The timeUnit property uses (typeof TIME_UNITS)[keyof typeof TIME_UNITS]. Make sure that TIME_UNITS includes all necessary time units and that this definition aligns with expected values elsewhere in the codebase.
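For reference, this is the const-object-to-union pattern. A sketch using the members mentioned elsewhere in this review (the real constant may differ in members or casing):

// A const object whose values drive a string-literal union.
export const TIME_UNITS = {
  minutes: 'minutes',
  hours: 'hours',
  days: 'days',
  weeks: 'weeks',
  months: 'months',
  years: 'years',
} as const;

// Resolves to 'minutes' | 'hours' | 'days' | 'weeks' | 'months' | 'years'.
export type TimeUnit = (typeof TIME_UNITS)[keyof typeof TIME_UNITS];

// A dedupe window typed against the union rejects unknown units at compile time.
const defaultDedupeWindow: { timeAmount: number; timeUnit: TimeUnit } = {
  timeAmount: 7,
  timeUnit: TIME_UNITS.days,
};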


15-16: ⚠️ Potential issue

Ensure all usages of TExecutionDetails.subject are updated

Changing subject from an array to a single SubjectRecord is a significant change. Please ensure that all parts of the codebase that use TExecutionDetails.subject are updated accordingly to prevent runtime errors or compilation issues.

Run the following script to identify usages of subject that may expect an array:

This will help locate any instances where subject is accessed as an array so they can be refactored.

services/workflows-service/src/alert/alert.service.intg.test.ts (27)

39-40: Imports Added Correctly

The imports for DataInvestigationService and TIME_UNITS are necessary and properly added to the file.


97-97: Provider Registered Successfully

DataInvestigationService has been correctly added to the providers array, ensuring it is available for dependency injection.


265-267: Ensure Proper Access to transactions[0]?.counterpartyBeneficiaryId

Verify that transactions[0]?.counterpartyBeneficiaryId is defined and holds the expected value for the assertion.


509-512: Check Null Assignment of counterpartyId and Validity of counterpartyBeneficiaryId

Setting alerts[0]?.counterpartyId to null while assigning counterpartyBeneficiaryId should align with the alert structure. Ensure this reflects the intended design.


619-620: Consistency in Assigning counterpartyOriginatorId

The assignment of alerts[0]?.counterpartyOriginatorId should match the originator ID from business1Transactions. Verify that both IDs correspond correctly.


690-691: Validation of counterpartyOriginatorId in Alerts

Ensure that alerts[0]?.counterpartyOriginatorId accurately reflects the originator ID from the transactions.


755-756: Null counterpartyId with Assigned counterpartyOriginatorId

Confirm that setting alerts[0]?.counterpartyId to null and assigning counterpartyOriginatorId is intentional and aligns with the alert schema.


838-839: Alert Counterparty Originator ID Verification

Verify that alerts[0]?.counterpartyOriginatorId correctly matches the ID from business1Transactions, and that counterpartyId being null is appropriate.


919-920: Correct Assignment of counterpartyOriginatorId

alerts[0]?.counterpartyOriginatorId is properly assigned from chargebackTransactions. This ensures accurate tracking of the counterparty originator.


1030-1031: Validation of counterpartyBeneficiaryId in Alert

alerts[0]?.counterpartyBeneficiaryId is correctly set to counteryparty.id, ensuring the alert references the correct counterparty beneficiary.


1122-1123: Consistency in Setting counterpartyBeneficiaryId

Ensure that alerts[0]?.counterpartyBeneficiaryId is properly assigned to counteryparty.id as expected.


1351-1351: Verification of counterpartyBeneficiaryId Assignment

Confirm that alerts[0]?.counterpartyBeneficiaryId correctly references counteryparty.id, maintaining accuracy in the alert data.


1464-1465: Null counterpartyId with Valid counterpartyBeneficiaryId

Setting alerts[0]?.counterpartyId to null while assigning counterpartyBeneficiaryId is appropriate if only the beneficiary ID is relevant for the alert.


1537-1538: Correct Usage of faker.date.recent

The call faker.date.recent(3, oldDaysAgo) correctly generates a date within the 3 days preceding oldDaysAgo.


1580-1580: Verification of Execution Details' Counterparty Beneficiary ID

Ensure that alerts[0]?.executionDetails.checkpoint.executionRow.counterpartyBeneficiaryId includes counteryparty.id correctly.


1695-1695: Confirm Execution Details in Alerts

Check that alerts[0]?.executionDetails.executionRow.counterpartyBeneficiaryId properly includes counteryparty.id.


1779-1780: Null counterpartyId and Correct counterpartyOriginatorId

Assigning alerts[0]?.counterpartyId to null and setting counterpartyOriginatorId ensures the alert accurately represents the originator.


1787-1787: Validation of Execution Details in Alerts

Ensure that alerts[0]?.executionDetails.executionRow.counterpartyOriginatorId includes counteryparty.id, maintaining consistency in alert data.


2083-2083: Usage of TIME_UNITS.days Constant

The TIME_UNITS.days constant is correctly used to specify the time unit in the alert definition options.


2121-2122: Null counterpartyId with Assigned counterpartyBeneficiaryId

Confirm that alerts[0]?.counterpartyId is set to null and alerts[0]?.counterpartyBeneficiaryId is assigned counterparty.id, reflecting accurate alert information.


2223-2223: Correct Assignment of Time Unit

The assignment timeUnit: TIME_UNITS.days in the alert definition options is properly set.


2261-2262: Validation of Alert Counterparty Information

Ensure that alerts[0]?.counterpartyId is null and alerts[0]?.counterpartyBeneficiaryId correctly references counterparty.id.


2363-2363: Proper Use of TIME_UNITS.days Constant

TIME_UNITS.days is correctly used to define the time unit within the alert definition options.


2400-2401: Null counterpartyId and Valid counterpartyBeneficiaryId

Confirm that the alert's counterparty information is accurately represented with counterpartyId as null and counterpartyBeneficiaryId set correctly.


2500-2500: Use of TIME_UNITS.days in Alert Definition

The time unit is properly set using TIME_UNITS.days in the alert definition options.


2537-2538: Alert Counterparty Assignment Verification

Ensure that alerts[0]?.counterpartyId is null and alerts[0]?.counterpartyBeneficiaryId correctly references counterparty.id.


199-199: Verify the Retrieval of counterpartyBeneficiaryId

Ensure that counterpartyBeneficiaryId is reliably extracted from baseTransactionFactory and holds the expected value, since the assertion depends on it.

Run the following script to verify the structure of baseTransactionFactory:

✅ Verification successful

Let me gather more context about how the counterpartyBeneficiaryId is being used in the test file.


Let me try one more search to see how counterpartyBeneficiary is set in the TransactionFactory.


The counterpartyBeneficiaryId retrieval is correctly implemented

The assertion is valid as the test correctly retrieves counterpartyBeneficiaryId from baseTransactionFactory.data.counterpartyBeneficiary?.connect?.id. This matches the TransactionFactory implementation where counterpartyBeneficiary is consistently set using the pattern { connect: { id } }, either through the withCounterpartyBeneficiary method or internal factory methods.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check if `baseTransactionFactory.data.counterpartyBeneficiary.connect.id` exists.

# Test: Search for the definition of `TransactionFactory` and its properties.
rg 'class TransactionFactory' -A 20

# Test: Find usages of `data.counterpartyBeneficiary`.
rg 'data\.counterpartyBeneficiary'

# Expected: Confirm that `data.counterpartyBeneficiary.connect.id` is a valid path.

Length of output: 2561


Script:

#!/bin/bash
# Check the test file context around the assertion
rg -B 10 -A 5 'expect\(alerts\[0\]\?\.counterpartyBeneficiaryId\)' services/workflows-service/src/alert/alert.service.intg.test.ts

# Check how counterpartyBeneficiary is set in TransactionFactory
rg -B 5 -A 5 'data\.counterpartyBeneficiary = {' services/workflows-service/src/transaction/test-utils/transaction-factory.ts

Length of output: 6978


Script:

#!/bin/bash
# Search for counterpartyBeneficiary assignments in transaction factory
rg -B 5 -A 5 'counterpartyBeneficiary =' services/workflows-service/src/transaction/test-utils/transaction-factory.ts

# Also check for any withCounterpartyBeneficiary method
rg -B 2 -A 10 'withCounterpartyBeneficiary' services/workflows-service/src/transaction/test-utils/transaction-factory.ts

Length of output: 1171

apps/backoffice-v2/src/domains/transactions/query-keys.ts (1)

11-11: Verify the handling of optional counterpartyId

The change to make counterpartyId optional is logically sound, but we should verify that all consuming components handle this optionality correctly.

✅ Verification successful

Optional counterpartyId is properly handled throughout the codebase

The verification shows that the optional counterpartyId parameter is consistently handled across all related components:

  • In fetchers.ts, the parameter is correctly marked as optional in the fetch function parameters
  • In useTransactionsQuery.tsx, the parameter is properly typed as optional and correctly passed to the query key factory
  • The query implementation properly handles the parameter's optionality in the API request via query params
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check how counterpartyId is used in components and verify proper handling of the optional parameter

# Find usages of transactionsQueryKeys.list
rg -l "transactionsQueryKeys.list" apps/backoffice-v2/src/

# Check for direct counterpartyId usage in transaction-related components
rg "counterpartyId" apps/backoffice-v2/src/domains/transactions/

Length of output: 776


Script:

#!/bin/bash
# Let's examine the implementation details of the components using counterpartyId

# Check the fetchers implementation
cat apps/backoffice-v2/src/domains/transactions/fetchers.ts

# Check the useTransactionsQuery hook implementation
cat apps/backoffice-v2/src/domains/transactions/hooks/queries/useTransactionsQuery/useTransactionsQuery.tsx

Length of output: 6992

apps/backoffice-v2/src/domains/transactions/hooks/queries/useTransactionsQuery/useTransactionsQuery.tsx (1)

12-12: Verify handling of optional counterpartyId across the codebase

Making counterpartyId optional is a breaking change that could affect existing components using this hook.

Let's verify the impact:

✅ Verification successful

Let me gather more information about the API endpoint and its usage.


Let me check the repository implementation to understand how the counterpartyId is handled in the filters.


Making counterpartyId optional is safe and aligned with the backend implementation

The change is safe because:

  1. The backend API already handles optional counterpartyId in GetTransactionsDto
  2. The repository implementation properly handles the case when counterpartyId is undefined
  3. The only consumer (TransactionMonitoringAlertsAnalysis) provides a default empty string, which maintains backward compatibility
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for components using useTransactionsQuery to ensure they handle the optional counterpartyId
rg -l "useTransactionsQuery" apps/backoffice-v2/src/ | while read -r file; do
  echo "=== $file ==="
  rg -A 5 "useTransactionsQuery" "$file"
done

# Check if the backend API endpoint handles missing counterpartyId
ast-grep --pattern 'getTransactionsV1($$$)' 

Length of output: 1690


Script:

#!/bin/bash
# Search for the API endpoint implementation and its usage
rg -A 10 "getTransactionsV1" 

# Search for any test files related to transactions
fd -e test.ts -e test.tsx -e spec.ts -e spec.tsx | xargs rg "getTransactionsV1|useTransactionsQuery"

# Look for type definitions of the API endpoint
ast-grep --pattern 'interface $_ {
  getTransactionsV1($$$)
}'

Length of output: 3620


Script:

#!/bin/bash
# Look for the repository implementation and DTO definition
rg -A 10 "findManyWithFiltersV1" services/workflows-service/src/

# Check the GetTransactionsDto definition
ast-grep --pattern 'class GetTransactionsDto {
  $$$
}'

# Look for the actual usage of counterpartyId in filters
rg -B 5 -A 5 "counterpartyId" services/workflows-service/src/transaction/

Length of output: 11730

services/workflows-service/src/data-analytics/data-analytics.module.ts (1)

22-23: LGTM! Clean integration of DataInvestigationService

The service is properly integrated into the module's providers and exports arrays, following NestJS best practices.

services/workflows-service/src/transaction/transaction.module.ts (1)

13-13: LGTM!

Clean import statement following the project's conventions.

apps/backoffice-v2/src/pages/TransactionMonitoringAlertsAnalysis/hooks/useTransactionMonitoringAlertsAnalysisPageLogic/useTransactionMonitoringAlertsAnalysisPageLogic.tsx (2)

8-8: LGTM!

Clean destructuring of counterpartyId from useSerializedSearchParams.


15-18: 🛠️ Refactor suggestion

Improve parameter handling and resolve TODO comment.

A few suggestions for improvement:

  1. The toString() call on optional alertId could be unsafe. Consider using nullish coalescing first.
  2. The TODO comment should be addressed or documented with a reason for future removal.
  3. If counterpartyId is optional (as indicated by the AI summary), consider making it undefined instead of an empty string when not provided.

Consider this safer implementation:

-    alertId: alertId?.toString() ?? '',
-    // @TODO: Remove
-    counterpartyId: counterpartyId ?? '',
+    alertId: (alertId ?? '').toString(),
+    counterpartyId: counterpartyId || undefined,

Let's verify the optionality of counterpartyId in related files:

✅ Verification successful

Unsafe parameter handling needs improvement, but TODO comment can be removed

The verification shows that counterpartyId is indeed optional (?) across the codebase, particularly in:

  • Transaction query interfaces
  • Alert search schemas
  • DTO definitions with @IsOptional()

The suggested refactoring is partially correct, but needs adjustment:

-    alertId: alertId?.toString() ?? '',
-    // @TODO: Remove
-    counterpartyId: counterpartyId ?? '',
+    alertId: (alertId ?? '').toString(),
+    counterpartyId: counterpartyId ?? null,
  • The TODO comment can be safely removed as the counterpartyId parameter is actively used throughout the codebase
  • Use null instead of undefined to match the schema definition in alerts/fetchers.ts which uses z.string().nullable().default(null)
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check if counterpartyId is marked as optional in related types and implementations

# Check transaction query interface
ast-grep --pattern 'interface $_ {
  $$$
  counterpartyId?: $_
  $$$
}'

# Check for usage patterns of counterpartyId
rg -A 3 'counterpartyId.*:.*string'

Length of output: 5679

services/workflows-service/src/data-analytics/utils.ts (1)

34-55: Verify the intentional limitation of time units.

The function only supports 'days', 'hours', and 'minutes', while calculateStartDate supports additional units like 'weeks', 'months', and 'years'. Please verify if this limitation is intentional for the deduplication window use case.

✅ Verification successful

Let me gather more context about the usage and type definition.


Let me check the TIME_UNITS constant definition to understand the complete set of allowed time units.


Let me check one final thing to see the TIME_UNITS constant definition with a corrected regex.


The limitation of time units in convertTimeUnitToMilliseconds is intentional and correct.

The function is specifically used for alert deduplication windows, as evidenced by its only usage in alert.service.ts. While the TIME_UNITS constant includes 'weeks', 'months', and 'years', these units are intentionally not supported for deduplication windows because:

  1. The default deduplication window uses 'days' (7 days from DEFAULT_DEDUPE_STRATEGIES)
  2. Months and years have variable durations, making them unsuitable for precise time-based deduplication
  3. Weeks can be expressed in days, maintaining simplicity in the deduplication logic
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check if weeks/months/years are used in deduplication contexts
# Look for patterns that might indicate deduplication time windows using these units

# Search for deduplication-related code with time units
rg -i "dedupe.*(?:week|month|year)" --type ts

# Search for the usage of convertTimeUnitToMilliseconds
ast-grep --pattern 'convertTimeUnitToMilliseconds($_)'

Length of output: 244


Script:

#!/bin/bash
# Search for TimeUnit type definition and its usage
rg "type TimeUnit" -A 5 --type ts

# Search for dedupeWindow type or interface definition
rg "(?:type|interface).*[Dd]edupe.*Window" -A 10 --type ts

# Look for any test files related to deduplication
fd ".*dedup.*\.test\.ts$" --exec cat {}

Length of output: 1474


Script:

#!/bin/bash
# Search for TIME_UNITS constant definition
rg "TIME_UNITS.*=.*{" -A 10 --type ts

# Search for any validation or constraints related to dedupeWindow
rg "dedupeWindow" -B 2 -A 2 --type ts

Length of output: 4199


Script:

#!/bin/bash
# Search for TIME_UNITS constant definition with fixed regex
rg "export const TIME_UNITS" -A 10 --type ts

# Also check for any imports of TIME_UNITS in the alert service
rg "import.*TIME_UNITS" --type ts

Length of output: 1714

services/workflows-service/src/transaction/transaction.service.ts (3)

12-13: LGTM: Required imports for pagination support

The new imports support the pagination functionality and type safety.


73-78: Good: Preserved comprehensive filtering capabilities

The V1 method maintains all original filtering capabilities (counterparty, date range, payment method, etc.) by delegating to findManyWithFiltersV1. This ensures existing integrations continue to work while allowing for the new simplified interface.


82-94: Verify if simplified interface meets investigation requirements

While the implementation correctly handles pagination and sorting, the simplified interface removes direct support for specific filtering capabilities (counterparty, date range, payment method). Please verify:

  1. Can all necessary filtering be passed through the optional args parameter?
  2. Does this simplification align with the monitoring and investigation requirements?

Additionally, consider adding JSDoc documentation (see the sketch after this list) to clarify:

  • The relationship between V1 and V2 interfaces
  • How to achieve complex filtering using the args parameter
  • When to use each interface
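A possible shape for that documentation, assuming the simplified signature described above (names and types here are approximations, not the actual code):

import { Prisma } from '@prisma/client';

type SortAndPage = {
  page: { number: number; size: number };
  orderBy?: string;
};

export interface TransactionReadApi {
  /**
   * V2-style fetch: pagination and sorting only.
   *
   * Prefer this over getTransactionsV1 when the DTO-driven filters
   * (counterpartyId, paymentMethod, date range) are not needed. Any extra
   * filtering can still be supplied through `args.where`, which accepts the
   * full Prisma where-input for TransactionRecord.
   */
  getTransactions(
    projectId: string,
    sortAndPage: SortAndPage,
    args?: Prisma.TransactionRecordFindManyArgs,
  ): Promise<unknown[]>;
}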

Let's check how this interface is being used:

✅ Verification successful

Simplified interface maintains all filtering capabilities through args parameter

The implementation maintains full filtering support while providing a cleaner interface:

  1. The args parameter directly accepts Prisma's TransactionRecordFindManyArgs, which supports all filtering capabilities including counterparty, date range, and payment method filters.
  2. The repository shows two filtering implementations (buildFiltersV1 and buildFiltersV2) that demonstrate all required filtering capabilities are supported.
  3. The DTO (GetTransactionsDto) confirms the expected filtering fields: counterpartyId, paymentMethod, startDate, endDate.

The simplified interface successfully:

  • Separates pagination/sorting concerns from filtering
  • Maintains full filtering capabilities through the args parameter
  • Aligns with the existing monitoring implementation as seen in the transaction monitoring related files
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Find usages of the new getTransactions method
rg -t typescript "getTransactions.*projectId.*sortAndPage" -A 5

# Look for any monitoring/investigation related code that might be affected
rg -t typescript "investigation|monitoring.*transaction" -A 5

Length of output: 199


Script:

#!/bin/bash
# Find usages of the new getTransactions method without file type restriction
rg "getTransactions.*projectId.*sortAndPage" -A 5

# Look for any monitoring/investigation related code that might be affected
rg "investigation.*transaction|monitoring.*transaction" -A 5

# Find all files that might use TransactionService
fd "transaction" --type f

# Look for test files that might show usage patterns
rg "describe.*Transaction" -A 5

Length of output: 8195


Script:

#!/bin/bash
# Check the content of the DTO to understand filtering capabilities
cat services/workflows-service/src/transaction/dtos/get-transactions.dto.ts

# Look for existing transaction filtering patterns
rg "findMany.*transaction" -A 5

# Check the repository implementation for filtering support
cat services/workflows-service/src/transaction/transaction.repository.ts

# Look for test cases that might demonstrate filtering usage
rg -l "test.*transaction.*filter"

Length of output: 7630

services/workflows-service/src/transaction/transaction.repository.ts (4)

9-14: LGTM: New imports and default ordering constant

The additions are well-structured and the default ordering by transaction date in descending order is a sensible choice.


31-34: LGTM: Improved findMany signature and merge handling

The optional args parameter provides better flexibility, and the deepmerge usage correctly combines the optional args with the scope.
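For readers unfamiliar with deepmerge, a reduced sketch of what that merge produces; the argument shapes are illustrative (in the repository they are Prisma find-many args):

import deepmerge from 'deepmerge';

// Caller-supplied args (the new optional parameter).
const callerArgs = {
  where: { counterpartyBeneficiaryId: 'cp_123' },
  take: 50,
};

// Scope computed by the repository for the current project(s).
const projectScope = {
  where: { projectId: { in: ['project-1'] } },
  orderBy: { transactionDate: 'desc' as const },
};

// deepmerge combines nested objects recursively, so both `where` clauses survive:
// {
//   where: { counterpartyBeneficiaryId: 'cp_123', projectId: { in: ['project-1'] } },
//   take: 50,
//   orderBy: { transactionDate: 'desc' },
// }
const queryArgs = deepmerge(callerArgs, projectScope);
console.log(queryArgs);

Note that deepmerge concatenates arrays by default, which is worth keeping in mind if both sides ever pass array-valued conditions such as `in` lists.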


Line range hint 77-98: LGTM: Well-structured filter combination

The method effectively combines pagination, ordering, and filtering while maintaining proper project scope.


Line range hint 77-167: Ensure comprehensive test coverage for both versions

The introduction of V1 and V2 methods suggests a transition period. Please ensure:

  1. Both versions are thoroughly tested
  2. Edge cases are covered (null/undefined values, empty results, etc.)
  3. Performance implications are measured
  4. Migration path is documented
apps/backoffice-v2/src/domains/transactions/fetchers.ts (2)

Line range hint 186-193: LGTM: Query parameter handling is correct

The use of qs.stringify will correctly handle the optional counterpartyId by omitting it from the query string when undefined. The error handling remains robust with handleZodError.


181-181: Verify API endpoint behavior with optional counterpartyId

The change to make counterpartyId optional suggests that transactions can now be fetched without a counterparty filter. Please ensure that (see the DTO sketch after this list):

  1. The API endpoint /api/v1/external/transactions/by-alert properly handles requests without a counterpartyId
  2. The change is intentional and aligns with the business requirements
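For reference, the optional flag on the DTO side can be expressed with class-validator; a reduced sketch showing only the relevant field (the real GetTransactionsDto has more fields):

import { IsOptional, IsString } from 'class-validator';

export class GetTransactionsDto {
  // Omitting the query parameter is valid; when present it must be a string.
  @IsOptional()
  @IsString()
  counterpartyId?: string;
}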

Let's verify the API endpoint implementation:

services/workflows-service/src/transaction/transaction.controller.external.ts (1)

Line range hint 1-54: LGTM: Dependencies properly organized and injected

The new imports and constructor injection for dataAnalyticsService are well-structured and align with the new functionality.

services/workflows-service/src/data-analytics/data-analytics.service.ts (1)

304-304: Verify consistent usage of transactionBaseAmount

The change from transactionAmount to transactionBaseAmount looks correct in these locations. Let's verify this change is consistent across the codebase.

Also applies to: 413-413, 416-416, 578-578

Comment on lines +298 to +300
// @ts-ignore - TODO: Replace logic with proper implementation for each rule
// eslint-disable-next-line
let { timeAmount, timeUnit } = inlineRule.options;
Contributor

⚠️ Potential issue

Avoid suppressing TypeScript and ESLint errors

Using // @ts-ignore and // eslint-disable-next-line suppresses potential issues. It's important to address the underlying type or logic problems to ensure type safety and code quality.

Investigate the cause of the TypeScript and ESLint errors and adjust the code accordingly to eliminate the need for these comments.

Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 0

🧹 Outside diff range and nitpick comments (6)
services/workflows-service/src/test/helpers/create-alert.ts (2)

4-8: LGTM! Consider adding parameter validation.

The function signature changes improve dependency injection and follow TypeScript naming conventions. However, consider adding parameter validation for the required alertService.

 export const createAlert = async (
   projectId: string,
   alertDefinition: AlertDefinition,
   alertService: AlertService,
 ) => {
+  if (!alertService) {
+    throw new Error('AlertService is required');
+  }

Line range hint 11-17: Document parameters and consider access pattern.

Two concerns with the implementation:

  1. The empty array and objects passed to createAlert lack documentation about their purpose
  2. Accessing private methods in tests might indicate a need to reconsider the API design
 export const createAlert = async (
   projectId: string,
   alertDefinition: AlertDefinition,
   alertService: AlertService,
 ) => {
-  // Accessing private method for testing purposes while maintaining types
+  // These parameters represent:
+  // [] - workflowRuntimeData (empty for basic alert creation)
+  // {} - correlationId (optional in test context)
+  // {} - context (optional in test context)
   return await alertService.createAlert(
     {
       ...alertDefinition,
       projectId,
     },
     [],
     {},
     {},
   );
 };

Consider making the necessary methods public or creating a specific testing interface if private methods need to be accessed frequently in tests. This would make the testing intentions clearer and maintain better encapsulation.

services/workflows-service/src/test/helpers/nest-app-helper.ts (1)

27-28: Consider enhancing module configuration for testing

While the modules are correctly added, consider the following improvements:

  1. For ConfigModule:

    • Add test-specific configuration validation
    • Consider setting up mock environment variables for tests
  2. For EventEmitterModule:

    • Consider adding test-specific settings (e.g., wildcard support, verbosity)

Example enhancement:

- ConfigModule.forRoot({ isGlobal: true }),
+ ConfigModule.forRoot({
+   isGlobal: true,
+   validate: (config) => ({
+     ...config,
+     // Add test-specific validation
+   }),
+   envFilePath: '.env.test',
+ }),
- EventEmitterModule.forRoot(),
+ EventEmitterModule.forRoot({
+   // Test-specific settings
+   wildcard: true,
+   verboseMemoryLeak: true,
+ }),
services/workflows-service/src/test/helpers/create-alert-definition.ts (1)

9-9: LGTM! Consider using an interface instead of concrete implementation.

The shift towards dependency injection is a good practice. However, consider defining and using an interface instead of importing the concrete AlertService class directly to reduce coupling.

-import { AlertService } from '@/alert/alert.service';
+import { IAlertService } from '@/alert/alert.interface';

-  alertService: AlertService,
+  alertService: IAlertService,
services/workflows-service/src/transaction/transaction.controller.external.intg.test.ts (2)

606-622: Consider enhancing the alert definition helper function

The function has been updated to focus on specific alert functions, but there are a few improvements that could be made:

  1. The function names could be defined as constants to prevent typos
  2. The function could validate the time unit parameter

Consider this improvement:

+const ALERT_FUNCTIONS = {
+  MULTIPLE_MERCHANTS: 'evaluateMultipleMerchantsOneCounterparty',
+  DORMANT_ACCOUNT: 'evaluateDormantAccount',
+} as const;
+
+const VALID_TIME_UNITS = ['days', 'hours', 'minutes'] as const;
+
 const getAlertDefinitionWithTimeOptions = (timeUnit: string, timeAmount: number) => {
+  if (!VALID_TIME_UNITS.includes(timeUnit as any)) {
+    throw new Error(`Invalid time unit. Must be one of: ${VALID_TIME_UNITS.join(', ')}`);
+  }
+
   const fnName = faker.helpers.arrayElement([
-    'evaluateMultipleMerchantsOneCounterparty',
-    'evaluateDormantAccount',
+    ALERT_FUNCTIONS.MULTIPLE_MERCHANTS,
+    ALERT_FUNCTIONS.DORMANT_ACCOUNT,
   ]);

713-713: Fix inconsistent spacing

There's an extra blank line that affects readability.

      );
-
      const alert = await createAlert(otherProject.id, alertDefinition, alertService);
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR and between fc0758d and 82fe112.

📒 Files selected for processing (8)
  • services/workflows-service/src/alert/alert.service.intg.test.ts (28 hunks)
  • services/workflows-service/src/data-analytics/data-analytics.module.ts (1 hunks)
  • services/workflows-service/src/test/helpers/create-alert-definition.ts (2 hunks)
  • services/workflows-service/src/test/helpers/create-alert.ts (1 hunks)
  • services/workflows-service/src/test/helpers/nest-app-helper.ts (2 hunks)
  • services/workflows-service/src/transaction/transaction.controller.external.intg.test.ts (8 hunks)
  • services/workflows-service/src/transaction/transaction.module.ts (1 hunks)
  • services/workflows-service/src/workflow/workflow.module.ts (0 hunks)
💤 Files with no reviewable changes (1)
  • services/workflows-service/src/workflow/workflow.module.ts
🚧 Files skipped from review as they are similar to previous changes (1)
  • services/workflows-service/src/data-analytics/data-analytics.module.ts
🔇 Additional comments (13)
services/workflows-service/src/test/helpers/create-alert.ts (1)

4-8: Verify all usages of createAlert have been updated.

Since the function signature has changed to require alertService, we should verify that all test files using this helper have been updated accordingly.

✅ Verification successful

Based on the search results, there is clear evidence that all usages of createAlert have been updated to the new signature, including the alertService parameter.

All usages of createAlert helper are properly updated

All occurrences of createAlert in the test files are using the correct signature with three parameters: project.id, alertDefinition, and alertService. No outdated usages were found.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for all usages of createAlert in test files
# Description: This will help identify any places that might need updates

# Look for createAlert usage patterns
rg "createAlert\(" --type ts --glob "**/*.test.ts" --glob "**/*.spec.ts" -A 3

Length of output: 2287

services/workflows-service/src/transaction/transaction.module.ts (2)

8-11: LGTM: Module imports align with monitoring enhancement goals

The addition of DataAnalytics, Sentry, Alert, and Project modules aligns well with the PR's objective of improving transaction monitoring capabilities.

Also applies to: 14-14


16-16: Verify the impact of removed providers

The significant reduction in providers (removing ProjectScopeService, SentryService, AlertService, etc.) suggests these services are now being imported through their respective modules. Let's verify there are no breaking changes.

✅ Verification successful

Let me try a different approach to verify this concern.


Let me check one more thing to verify if these services are properly imported through their modules.


Removal of providers is safe - services are properly imported through modules

The verification shows that:

  • All removed services are now correctly imported through their respective modules (DataAnalyticsModule, SentryModule, AlertModule, ProjectModule)
  • Each module properly exports its corresponding service:
    • data-analytics.module.ts exports DataAnalyticsService
    • sentry.module.ts exports SentryService
    • alert.module.ts exports AlertService
    • project.module.ts exports ProjectScopeService
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify that removed services are properly imported through their modules
# and that there are no orphaned service references

# Check for any direct references to removed services that might break
echo "Checking for potentially broken service references..."
rg -l "ProjectScopeService|SentryService|AlertService|DataAnalyticsService|AlertRepository|AlertDefinitionRepository" \
   --type typescript \
   --glob '!*.module.ts' \
   services/workflows-service/src/transaction/

# Check if these services are properly exported from their new modules
echo "Verifying service exports from respective modules..."
ast-grep --pattern 'export class $SERVICE implements' \
  services/workflows-service/src/{data-analytics,sentry,alert,project}/**/*.ts

Length of output: 1601


Script:

#!/bin/bash
# Let's first check if these directories exist and their structure
echo "Checking directory structure..."
fd --type directory "^(transaction|alert|sentry|project|data-analytics)$" services/workflows-service/src/

# Now let's check for imports of these services in the transaction files
echo "Checking imports in transaction files..."
rg "import.*\b(ProjectScopeService|SentryService|AlertService|DataAnalyticsService|AlertRepository|AlertDefinitionRepository)\b" services/workflows-service/src/transaction/

# Let's also check the transaction service for any usage of these services
echo "Checking TransactionService implementation..."
ast-grep --pattern 'class TransactionService {
  $$$
}'

Length of output: 10241


Script:

#!/bin/bash
# Check the imports in transaction.module.ts and verify module imports
echo "Checking module imports in transaction.module.ts..."
rg "import.*Module" services/workflows-service/src/transaction/transaction.module.ts

# Check if these services are exported from their respective modules
echo "Checking service exports from respective modules..."
rg "export.*\b(ProjectScopeService|SentryService|AlertService|DataAnalyticsService)\b" \
   services/workflows-service/src/{alert,sentry,project,data-analytics}/*module.ts

Length of output: 1583

services/workflows-service/src/test/helpers/nest-app-helper.ts (2)

17-18: LGTM: Clean import additions

The new imports for ConfigModule and EventEmitterModule are properly organized and follow NestJS conventions.


27-28: Verify impact on existing tests

The addition of these global modules might affect existing tests. Please ensure:

  1. No conflicts with existing test configurations
  2. No unintended side effects in existing test cases
✅ Verification successful

Let me analyze the results and generate additional verification scripts to check the test configurations in more detail.

This will help us understand:

  1. How the test files are currently set up
  2. If there are any custom test module configurations
  3. If there's any direct environment variable or config service usage that might be affected

Based on the comprehensive results, I can now provide a final response:

ConfigModule and EventEmitterModule additions are safe for the test environment

The changes to add ConfigModule and EventEmitterModule as global modules in the test helper are safe because:

  1. These modules are already used globally in the main app.module.ts
  2. No test files have conflicting configurations or custom event emitter setups
  3. All test files use the common test helper through initiateNestApp or fetchServiceFromModule
  4. The only mock configuration found is in a skipped test file (file-service.service.test.skip.ts)
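
For context, a minimal sketch of how these two modules are typically registered globally in a Nest testing module (illustrative only; createTestModule and its parameters are hypothetical, not the actual helper signature):

import { Test } from '@nestjs/testing';
import { ConfigModule } from '@nestjs/config';
import { EventEmitterModule } from '@nestjs/event-emitter';

// Mirrors the global registration in app.module.ts so tests see the same environment.
export const createTestModule = async (imports: any[] = [], providers: any[] = []) => {
  return Test.createTestingModule({
    imports: [
      ConfigModule.forRoot({ isGlobal: true }),
      EventEmitterModule.forRoot(),
      ...imports,
    ],
    providers,
  }).compile();
};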
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for existing test files that might be affected
echo "Searching for test files using these utility functions..."
rg -l "fetchServiceFromModule|initiateNestApp" --type ts

# Look for existing config or event emitter usage in tests
echo "Checking for existing config usage in tests..."
rg "ConfigModule\.forRoot|EventEmitterModule\.forRoot" --type ts

# Check for potential conflicts in test setup
echo "Checking for potential conflicts in test setup..."
rg "Mock.*Config|Mock.*EventEmitter" --type ts

Length of output: 1952


Script:

#!/bin/bash
# Check the content of test files to understand their setup
echo "Checking test setup in files using these utilities..."
for file in $(rg -l "fetchServiceFromModule|initiateNestApp" --type ts); do
  echo "=== $file ==="
  rg "describe|beforeAll|beforeEach" "$file" -B 1 -A 5
done

# Check if any test files have their own config/event setup
echo "Checking for custom test configurations..."
rg "createTestingModule|Test\.createTestingModule" --type ts -B 1 -A 5

# Look for any environment variable usage in tests
echo "Checking for environment variable usage in tests..."
rg "process\.env|configService\.get" --type ts

Length of output: 33610

services/workflows-service/src/test/helpers/create-alert-definition.ts (2)

11-15: Verify that all evaluation function names exist in the codebase.

Let's ensure that these evaluation function names correspond to actual implementations.

✅ Verification successful

Let me try a different search strategy to locate these functions, as they might be defined using different syntax or in different files.


All evaluation function names are valid and implemented in the codebase

The search results confirm that all three functions are properly implemented in the DataAnalyticsService class (services/workflows-service/src/data-analytics/data-analytics.service.ts):

  • evaluateTransactionsAgainstDynamicRules
  • evaluateDormantAccount
  • evaluateMultipleMerchantsOneCounterparty

These functions are also properly typed in types.ts and used in various test files and alert generation scripts.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Search for the evaluation function implementations
# Expected: Find implementations for all three evaluation functions

echo "Searching for evaluation functions..."
for fn in "evaluateTransactionsAgainstDynamicRules" "evaluateDormantAccount" "evaluateMultipleMerchantsOneCounterparty"; do
  echo "Looking for $fn..."
  ast-grep --pattern "function $fn" || rg "function $fn"
done

Length of output: 975


Script:

#!/bin/bash
# Search for function names using ripgrep with more context and different patterns
echo "Searching with broader patterns..."

for fn in "evaluateTransactionsAgainstDynamicRules" "evaluateDormantAccount" "evaluateMultipleMerchantsOneCounterparty"; do
    echo "Looking for $fn..."
    # Search for the function name in various contexts (class methods, arrow functions, etc.)
    rg "$fn" -A 2
done

# Also check test files specifically as these might be mock functions
echo "Checking test files..."
fd "test" -e ts -e js --exec rg -l "evaluateTransactionsAgainstDynamicRules|evaluateDormantAccount|evaluateMultipleMerchantsOneCounterparty"

Length of output: 11400


39-40: Verify that investigation functions exist for each evaluation function.

The code assumes that for each evaluate* function, there's a corresponding investigate* function. Let's verify this assumption.

✅ Verification successful

Let me gather more information to verify this thoroughly.


Let me try one more search to find these functions in a different way.


Let's check the content of these specific files to verify the function definitions.


Based on the comprehensive analysis of the codebase, I can now generate the final response:

The evaluate/investigate function pairs are properly implemented and consistent

The code shows a well-structured pattern where:

  • The evaluate functions are defined in data-analytics.service.ts
  • The corresponding investigate functions are defined in data-investigation.service.ts
  • The type definitions in types.ts enforce this pairing through the InlineRule type

The specific functions mentioned in the test helper are all properly implemented:

  • evaluateTransactionsAgainstDynamicRules → investigateTransactionsAgainstDynamicRules
  • evaluateDormantAccount → investigateDormantAccount
  • evaluateMultipleMerchantsOneCounterparty → investigateMultipleMerchantsOneCounterparty
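
As an illustration, a pairing like this can be enforced with a template-literal type along the following lines (a hypothetical shape — the actual InlineRule definition in types.ts may differ):

// Hypothetical sketch of typing the evaluate/investigate pairing.
type EvaluateFnName =
  | 'evaluateTransactionsAgainstDynamicRules'
  | 'evaluateDormantAccount'
  | 'evaluateMultipleMerchantsOneCounterparty';

type InvestigationNameFor<T extends EvaluateFnName> = T extends `evaluate${infer Rest}`
  ? `investigate${Rest}`
  : never;

type InlineRuleSketch<T extends EvaluateFnName = EvaluateFnName> = {
  id: string;
  fnName: T;
  fnInvestigationName: InvestigationNameFor<T>; // e.g. 'investigateDormantAccount' for 'evaluateDormantAccount'
  options: Record<string, unknown>;
};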
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify the existence of corresponding investigation functions
# Expected: Find matching pairs of evaluate/investigate functions

echo "Searching for investigation functions..."
for fn in "investigateTransactionsAgainstDynamicRules" "investigateDormantAccount" "investigateMultipleMerchantsOneCounterparty"; do
  echo "Looking for $fn..."
  ast-grep --pattern "function $fn" || rg "function $fn"
done

Length of output: 1035


Script:

#!/bin/bash
# Let's first find all evaluate functions to understand the complete picture
echo "Searching for evaluate functions..."
rg "function evaluate" -l

echo -e "\nSearching for investigate functions..."
rg "function investigate" -l

# Let's also check the specific file content to understand the context better
echo -e "\nChecking the test helper file content:"
cat services/workflows-service/src/test/helpers/create-alert-definition.ts

Length of output: 2669


Script:

#!/bin/bash
# Let's search for these specific function names in TypeScript/JavaScript files
echo "Searching in TypeScript/JavaScript files..."
fd -e ts -e js --exec rg -l "evaluateTransactionsAgainstDynamicRules|evaluateDormantAccount|evaluateMultipleMerchantsOneCounterparty|investigateTransactionsAgainstDynamicRules|investigateDormantAccount|investigateMultipleMerchantsOneCounterparty"

# Let's also check if these might be defined as arrow functions or class methods
echo -e "\nSearching for alternative function definitions..."
rg "(evaluate|investigate)(TransactionsAgainstDynamicRules|DormantAccount|MultipleMerchantsOneCounterparty)\s*[=:]\s*" -l

Length of output: 1019


Script:

#!/bin/bash
echo "Checking data-analytics.service.ts for evaluate functions:"
cat services/workflows-service/src/data-analytics/data-analytics.service.ts

echo -e "\nChecking data-investigation.service.ts for investigate functions:"
cat services/workflows-service/src/data-analytics/data-investigation.service.ts

echo -e "\nChecking types.ts for function type definitions:"
cat services/workflows-service/src/data-analytics/types.ts

Length of output: 44926

services/workflows-service/src/transaction/transaction.controller.external.intg.test.ts (2)

40-41: LGTM: Module setup changes look good

The addition of DataAnalyticsModule and AlertModule, along with the alertService initialization, properly sets up the test environment for the enhanced alert functionality.

Also applies to: 154-156


Line range hint 636-739: Verify test coverage for alert-related functionality

The test cases cover basic scenarios but could be enhanced to include:

  1. Edge cases for time-based alerts
  2. Different alert function behaviors
  3. Error scenarios for invalid time units

Let's verify the test coverage:

✅ Verification successful

Let me check the alert definition options and time units to better assess the test coverage.


Let me check the alert functions to understand what behaviors are being tested.


Test coverage for alert-related functionality is adequate

The existing test suite provides comprehensive coverage of alert-related functionality:

  • Time-based scenarios:

    • Tests different time ranges (1, 7, and 15 days)
    • Verifies transactions within and outside time windows
    • Handles empty results for out-of-range transactions
  • Alert function behaviors:

    • Randomly tests both evaluateDormantAccount and evaluateMultipleMerchantsOneCounterparty
    • Includes authentication and authorization tests
    • Covers error cases (404 for non-existent alerts)
  • Edge cases:

    • Tests with no matching transactions
    • Verifies behavior with transactions at boundary conditions
    • Includes cross-project authorization checks
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Check for existing test cases covering different alert scenarios
rg -A 5 "it\(.*alert" services/workflows-service/src/transaction/

Length of output: 3359


Script:

#!/bin/bash
# Check for alert definition options and time units
rg -B 2 -A 5 "getAlertDefinitionWithTimeOptions" services/workflows-service/src/transaction/

Length of output: 4961


Script:

#!/bin/bash
# Check for alert evaluation functions
rg -A 10 "evaluateMultipleMerchantsOneCounterparty|evaluateDormantAccount" services/workflows-service/src/

Length of output: 9207

services/workflows-service/src/alert/alert.service.intg.test.ts (4)

39-40: LGTM! New imports support enhanced functionality.

The addition of DataInvestigationService and TIME_UNITS imports enhances the test suite with data investigation capabilities and standardized time unit handling.


2082-2082: Great refactoring: Standardized time unit handling.

Replacing string literals with TIME_UNITS.days improves code maintainability and reduces the risk of typos or inconsistencies. This standardization is consistently applied across all alert rules.

Also applies to: 2222-2222, 2362-2362, 2499-2499


Line range hint 1-2567: Excellent test coverage and organization.

The test suite demonstrates:

  • Comprehensive coverage of alert rules
  • Clear test case organization
  • Consistent testing patterns
  • Good coverage of edge cases

508-511: Verify the systematic change in counterparty handling.

The code consistently sets counterpartyId to null while maintaining specific counterparty type IDs (beneficiary/originator). This pattern appears across all alert rules, suggesting a fundamental change in how counterparties are referenced.

Also applies to: 618-619, 689-690, 754-755, 837-838, 918-919, 1029-1030, 1121-1122, 1778-1779, 2120-2121, 2260-2261, 2399-2400, 2536-2537

✅ Verification successful

Based on the migration file and schema, I can now provide a final response:

Systematic change in counterparty handling is part of a schema update

The change to set counterpartyId to null while maintaining specific counterparty type IDs is part of a planned schema migration (20241021185057). The schema now supports separate foreign keys for originator and beneficiary counterparties, replacing the generic counterpartyId field which is marked for removal after data migration.

  • The Alert model now has new fields:
    • counterpartyOriginatorId with relation "counterpartyAlertOriginator"
    • counterpartyBeneficiaryId with relation "counterpartyAlertBeneficiary"
  • The old counterpartyId field is marked with comment "TODO: Remove this field after data migration"
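
In practice this means new alerts should reference counterparties through the dedicated relations rather than the legacy field, roughly as in the sketch below (variable names are hypothetical; the actual create calls live in AlertService):

// Illustrative only — relation names taken from the schema discussion above.
const alert = await prisma.alert.create({
  data: {
    // ...other required alert fields...
    counterpartyOriginator: { connect: { id: originatorCounterpartyId } },
    counterpartyBeneficiary: { connect: { id: beneficiaryCounterpartyId } },
    // counterpartyId is intentionally left unset; it is deprecated pending data migration.
  },
});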
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify if this is part of a broader change in counterparty handling

# Check for related schema changes
rg -l "counterpartyId|counterpartyBeneficiaryId|counterpartyOriginatorId" --type prisma

# Check for related changes in alert handling
rg -l "counterpartyId\s*=\s*null" --type typescript

Length of output: 209


Script:

#!/bin/bash
# Description: Verify if this is part of a broader change in counterparty handling

# Check for related schema changes in any file
rg -l "counterpartyId|counterpartyBeneficiaryId|counterpartyOriginatorId"

# Check for related changes in alert handling
rg "counterpartyId\s*=\s*null" -A 2

Length of output: 1774


Script:

#!/bin/bash
# Let's check the migration files to understand the schema changes
cat services/workflows-service/prisma/migrations/20241021185057_add_alerts_counterparty_relation_for_advanced_filtering/migration.sql

# Also check the schema definition
cat services/workflows-service/prisma/schema.prisma

# Look for any type definitions related to counterparty
rg -A 5 "interface.*Alert" services/workflows-service/src/alert/types.ts

Length of output: 28289

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🧹 Outside diff range and nitpick comments (3)
apps/backoffice-v2/src/common/utils/fetcher/fetcher.ts (1)

72-72: LGTM! Consider adding type safety.

The fallback to 0 when parsing content-length is a good defensive programming practice. To make it even more robust, consider using explicit type checking:

-if (!res.headers.get('content-length') || Number(res.headers.get('content-length') || 0) > 0) {
+const contentLength = res.headers.get('content-length');
+if (!contentLength || (typeof contentLength === 'string' && Number(contentLength) > 0)) {
services/workflows-service/src/alert/alert.controller.external.ts (2)

Line range hint 79-123: LGTM! Consider extracting the common selection structure.

The counterparty selection structure is well-organized and includes all necessary fields. However, since the selection structure is identical for both counterpartyOriginator and counterpartyBeneficiary, consider extracting it into a reusable constant to improve maintainability.

+ const counterpartySelect = {
+   select: {
+     id: true,
+     business: {
+       select: {
+         id: true,
+         correlationId: true,
+         companyName: true,
+       },
+     },
+     endUser: {
+       select: {
+         id: true,
+         correlationId: true,
+         firstName: true,
+         lastName: true,
+       },
+     },
+   },
+ };

  include: {
    // ... other includes
-   counterpartyOriginator: {
-     select: {
-       id: true,
-       business: {
-         select: {
-           id: true,
-           correlationId: true,
-           companyName: true,
-         },
-       },
-       endUser: {
-         select: {
-           id: true,
-           correlationId: true,
-           firstName: true,
-           lastName: true,
-         },
-       },
-     },
-   },
+   counterpartyOriginator: counterpartySelect,
-   counterpartyBeneficiary: {
-     select: {
-       id: true,
-       business: {
-         select: {
-           id: true,
-           correlationId: true,
-           companyName: true,
-         },
-       },
-       endUser: {
-         select: {
-           id: true,
-           correlationId: true,
-           firstName: true,
-           lastName: true,
-         },
-       },
-     },
-   },
+   counterpartyBeneficiary: counterpartySelect,
  }

Line range hint 134-165: LGTM! Well-structured counterparty details extraction.

The counterpartyDetails helper function is well-implemented:

  • Handles both business and individual counterparties
  • Returns consistent structure for both types
  • Provides clear type differentiation

However, the subject field assignment should be updated to reflect the removal of the deprecated counterparty field:

  subject:
-   counterpartyDetails(counterparty) ||
    counterpartyDetails(counterpartyBeneficiary) ||
    counterpartyDetails(counterpartyOriginator),
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR and between 82fe112 and 493c25a.

📒 Files selected for processing (2)
  • apps/backoffice-v2/src/common/utils/fetcher/fetcher.ts (1 hunks)
  • services/workflows-service/src/alert/alert.controller.external.ts (3 hunks)

@liorzam liorzam enabled auto-merge (squash) December 6, 2024 05:27
@liorzam liorzam disabled auto-merge December 6, 2024 05:27
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Outside diff range and nitpick comments (1)
services/workflows-service/prisma/schema.prisma (1)

901-903: LGTM! Well-defined bidirectional relationships.

The new relations alertsBenefiting and alertsOriginating complete the bidirectional relationship with the Alert model, making it easy to query alerts from either perspective.

This design pattern facilitates efficient querying in both directions:

  • From alerts to counterparties (using counterpartyOriginator/counterpartyBeneficiary)
  • From counterparties to alerts (using alertsOriginating/alertsBenefiting)
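
For example, the counterparty-to-alerts direction could be queried roughly like this (a sketch using the relation names above; counterpartyId is a hypothetical variable):

const counterpartyWithAlerts = await prisma.counterparty.findUnique({
  where: { id: counterpartyId },
  include: {
    alertsOriginating: true, // alerts where this counterparty is the originator
    alertsBenefiting: true, // alerts where this counterparty is the beneficiary
  },
});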
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR and between 493c25a and 73bb4b7.

⛔ Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (2)
  • services/workflows-service/package.json (1 hunks)
  • services/workflows-service/prisma/schema.prisma (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • services/workflows-service/package.json
🔇 Additional comments (3)
services/workflows-service/prisma/schema.prisma (3)

842-846: LGTM! Well-structured relationship fields.

The new relationship fields properly establish bidirectional connections between alerts and counterparties, with clear naming conventions and appropriate field references.


856-857: LGTM! Performance optimization with indexes.

The added indexes on counterpartyOriginatorId and counterpartyBeneficiaryId will improve query performance when filtering alerts by counterparties.


838-838: Plan the data migration for removing the deprecated field.

The TODO comment indicates that counterpartyId needs to be removed after data migration. This requires careful planning to ensure data consistency.

Would you like me to help create a data migration plan or open a GitHub issue to track this task?

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 4

🧹 Outside diff range and nitpick comments (12)
services/workflows-service/src/alert/alert.service.intg.test.ts (3)

188-189: Improve null safety in counterparty ID access

The optional chaining operator usage is good for null safety, but consider destructuring the ID earlier for better readability.

- const counterpartyBeneficiaryId =
-   baseTransactionFactory?.data?.counterpartyBeneficiary?.connect?.id;
+ const { id: counterpartyBeneficiaryId } = 
+   baseTransactionFactory?.data?.counterpartyBeneficiary?.connect ?? {};

Also applies to: 198-198


508-511: Consistent pattern for counterparty ID handling

Setting counterpartyId to null and asserting on the specific counterparty types (beneficiary/originator) is a clear pattern applied across all alert rules. However, consider extracting the common assertions into helper functions to reduce repetition.

const assertCounterpartyIds = (alert: any, {
  counterpartyId = null,
  beneficiaryId = null,
  originatorId = null
}) => {
  expect(alert?.counterpartyId).toEqual(counterpartyId);
  if (beneficiaryId) {
    expect(alert?.counterpartyBeneficiaryId).toEqual(beneficiaryId);
  }
  if (originatorId) {
    expect(alert?.counterpartyOriginatorId).toEqual(originatorId);
  }
};

Also applies to: 618-619, 689-690, 754-755, 837-838, 918-919, 1029-1030, 1121-1122, 1463-1464, 1778-1779, 2120-2121, 2260-2261, 2399-2400, 2536-2537


1536-1537: Consider using a test utility for date manipulation

The date manipulation logic could be extracted into a test utility function for better reusability and clarity.

const generateTestDate = (daysAgo: number, referenceDate?: Date) => {
  const date = faker.date.recent(daysAgo, referenceDate);
  return date;
};
services/workflows-service/src/transaction/transaction.controller.external.ts (4)

248-248: Consider documenting the versioning strategy

The method now calls getTransactionsV1, suggesting a versioning approach. Consider adding documentation to explain the differences between versions and the migration strategy.


370-370: Address TODO comment regarding rule investigation

The TODO comment indicates that the investigation logic for each rule needs improvement. This could impact the accuracy of transaction filtering.

Would you like me to help create a GitHub issue to track this improvement task?


352-357: Document the version compatibility strategy

The code introduces a version-based routing mechanism with a comment indicating temporary backward compatibility. Consider:

  1. Adding documentation about the deprecation timeline
  2. Creating a migration guide for users still using the old format

Line range hint 375-464: Reduce code duplication in include patterns

The include pattern for counterparty selection is duplicated in both V1 and V2 methods. Consider extracting it to a shared constant or method.

Example refactor:

private readonly counterpartyInclude = {
  counterpartyBeneficiary: {
    select: {
      correlationId: true,
      business: {
        select: {
          correlationId: true,
          companyName: true,
        },
      },
      endUser: {
        select: {
          correlationId: true,
          firstName: true,
          lastName: true,
        },
      },
    },
  },
  counterpartyOriginator: {
    // ... same pattern as beneficiary
  },
};

Then use it in both methods:

return this.service.getTransactions(projectId, filters, {
  where: alert.executionDetails.filters,
  include: this.counterpartyInclude,
});
🧰 Tools
🪛 Biome (1.9.4)

[error] 468-468: This code is unreachable

... because this statement will return from the function beforehand

(lint/correctness/noUnreachable)

services/workflows-service/src/transaction/transaction.repository.ts (4)

9-11: Remove unused import

The PageDto import is not used in this file.

 import deepmerge from 'deepmerge';
-import { PageDto } from '@/common/dto';

61-62: Extract pagination defaults to configuration

The hardcoded pagination values should be moved to a configuration object for better maintainability and reusability.

+const DEFAULT_PAGINATION = {
+  TAKE: 20,
+  SKIP: 0,
+} as const;
+
 const args: {
   skip: Prisma.TransactionRecordFindManyArgs['skip'];
   take?: Prisma.TransactionRecordFindManyArgs['take'];
 } = {
-  take: 20,
-  skip: 0,
+  take: DEFAULT_PAGINATION.TAKE,
+  skip: DEFAULT_PAGINATION.SKIP,
 };

Line range hint 77-98: Clarify versioning strategy for filter methods

The presence of V1/V2 methods suggests a transition period. Consider:

  1. Documenting the differences between V1 and V2
  2. Adding deprecation notice if V1 is to be removed
  3. Creating a migration plan for clients using V1
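
For item 2, a deprecation notice on the V1 method could be as small as the sketch below (the V1 method name and signature are assumed here, not verified against the file):

/**
 * @deprecated Prefer findManyWithFiltersV2, which accepts Prisma find-many options
 * and merges them with the filter-derived arguments. Remove once all callers migrate.
 */
async findManyWithFilters(
  getTransactionsParameters: GetTransactionsDto,
  projectId: string,
): Promise<TransactionRecord[]> {
  // ...existing V1 implementation...
}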

Line range hint 116-147: Improve date range handling

The current implementation has several issues:

  1. Mutates the date filter object
  2. Doesn't validate date ranges
  3. Doesn't handle timezone considerations
   if (getTransactionsParameters.startDate) {
+    if (getTransactionsParameters.endDate && 
+        getTransactionsParameters.startDate > getTransactionsParameters.endDate) {
+      throw new Error('Start date must be before end date');
+    }
     whereClause.transactionDate = {
-      ...(whereClause.transactionDate as DateTimeFilter),
+      ...whereClause.transactionDate,
       gte: getTransactionsParameters.startDate,
     };
   }
services/workflows-service/src/data-analytics/data-investigation.service.ts (1)

261-279: Remove unnecessary async keyword

The method investigateHighVelocityHistoricAverage is declared as async but does not contain any asynchronous operations. Remove the async keyword to simplify the code.

Apply this diff to remove the unnecessary async keyword:

-      async investigateHighVelocityHistoricAverage(options: HighVelocityHistoricAverageOptions) {
+      investigateHighVelocityHistoricAverage(options: HighVelocityHistoricAverageOptions) {
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR and between 73bb4b7 and e825698.

📒 Files selected for processing (8)
  • services/workflows-service/prisma/data-migrations (1 hunks)
  • services/workflows-service/scripts/alerts/generate-alerts.ts (28 hunks)
  • services/workflows-service/src/alert/alert.controller.external.ts (2 hunks)
  • services/workflows-service/src/alert/alert.service.intg.test.ts (28 hunks)
  • services/workflows-service/src/alert/alert.service.ts (6 hunks)
  • services/workflows-service/src/data-analytics/data-investigation.service.ts (1 hunks)
  • services/workflows-service/src/transaction/transaction.controller.external.ts (6 hunks)
  • services/workflows-service/src/transaction/transaction.repository.ts (5 hunks)
✅ Files skipped from review due to trivial changes (1)
  • services/workflows-service/prisma/data-migrations
🚧 Files skipped from review as they are similar to previous changes (2)
  • services/workflows-service/src/alert/alert.controller.external.ts
  • services/workflows-service/src/alert/alert.service.ts
🧰 Additional context used
🪛 Biome (1.9.4)
services/workflows-service/src/transaction/transaction.controller.external.ts

[error] 468-468: This code is unreachable

... because this statement will return from the function beforehand

(lint/correctness/noUnreachable)

🔇 Additional comments (9)
services/workflows-service/src/alert/alert.service.intg.test.ts (2)

39-40: LGTM: New imports enhance monitoring capabilities

The addition of the DataInvestigationService and TIME_UNITS imports equips the test suite with data-investigation coverage and standardized time unit handling.


2082-2082: LGTM: Standardized time unit usage

The replacement of string literals with TIME_UNITS enum improves type safety and maintainability.

Also applies to: 2222-2222, 2362-2362, 2499-2499

services/workflows-service/src/transaction/transaction.controller.external.ts (1)

27-43: LGTM: Import statements and constructor injection

The new imports and constructor dependencies are properly organized and follow NestJS dependency injection patterns.

Also applies to: 56-57

services/workflows-service/src/transaction/transaction.repository.ts (2)

31-34: LGTM: Proper handling of optional args and scope

The changes improve the method by making args optional and correctly merging them with scoped arguments.


Line range hint 65-73: Fix type issues in pagination parameters

The need to parse strings indicates a type mismatch in the DTO. Consider fixing the types in GetTransactionsDto.

-// Temporary fix for pagination (class transformer issue)
-const size = parseInt(getTransactionsParameters.page.size as unknown as string, 10);
-const number = parseInt(getTransactionsParameters.page.number as unknown as string, 10);
+const size = getTransactionsParameters.page.size;
+const number = getTransactionsParameters.page.number;

Let's verify the DTO implementation:

services/workflows-service/src/data-analytics/data-investigation.service.ts (3)

26-91: Refactor the switch statement to reduce redundancy

The switch statement in getInvestigationFilter is repetitive and can be simplified by dynamically invoking the methods based on fnInvestigationName. This will enhance maintainability and reduce code duplication.


298-300: Avoid suppressing TypeScript and ESLint errors

Using // @ts-ignore and // eslint-disable-next-line suppresses potential issues. It's important to address the underlying type or logic problems to ensure type safety and code quality.

Investigate the cause of the TypeScript and ESLint errors and adjust the code accordingly to eliminate the need for these comments.


304-307: Correct reference to fnInvestigationName instead of fnName

In line 304, inlineRule.fnName is used, but the correct property appears to be inlineRule.fnInvestigationName. This mismatch could lead to runtime errors due to undefined properties.

Apply the following fix:

-          inlineRule.fnName === 'evaluateHighVelocityHistoricAverage' &&
+          inlineRule.fnInvestigationName === 'evaluateHighVelocityHistoricAverage' &&
services/workflows-service/scripts/alerts/generate-alerts.ts (1)

Line range hint 47-86: Updates to fnInvestigationName are consistent and correct

The alert definitions have been updated to include the fnInvestigationName property alongside fnName, ensuring that the correct investigation functions are referenced. The IDs and options within each alert definition appear consistent and accurate.

Also applies to: 108-141, 169-201, 230-261, 290-361, 385-405, 428-454, 477-505, 517-547, 571-603, 611-643, 653-728

Comment on lines +101 to +113
async findManyWithFiltersV2(
getTransactionsParameters: GetTransactionsDto,
projectId: string,
options?: Prisma.TransactionRecordFindManyArgs,
): Promise<TransactionRecord[]> {
const _options = this.buildFindManyOptionsByFilter(getTransactionsParameters);

const args = deepmerge(options || {}, _options);

return this.prisma.transactionRecord.findMany(
this.scopeService.scopeFindMany(args, [projectId]),
);
}

⚠️ Potential issue

Handle deepmerge edge cases

The current deepmerge implementation might have issues with array fields and nested objects. Consider:

  1. Using deepmerge options to handle arrays correctly
  2. Validating the merged result against Prisma types
-    const args = deepmerge(options || {}, _options);
+    const args = deepmerge(options || {}, _options, {
+      arrayMerge: (_, sourceArray) => sourceArray,
+    });
+    
+    // Validate against Prisma types
+    const validatedArgs: Prisma.TransactionRecordFindManyArgs = args;

Committable suggestion skipped: line range outside the PR's diff.

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🧹 Outside diff range and nitpick comments (1)
services/workflows-service/src/data-analytics/data-investigation.service.ts (1)

314-346: Simplify date calculations using a date manipulation library

The current date calculations mix direct date manipulations with millisecond calculations, making the code harder to maintain and potentially error-prone.

Consider using a date manipulation library like date-fns to simplify these calculations:

+ import { subMinutes, subHours, subDays, subMonths, subYears, startOfDay, min } from 'date-fns';

- let startDate = new Date();
- let subtractValue = 0;
- 
- const baseSubstractByMin = timeAmount * 60 * 1000;
- 
- switch (timeUnit) {
-   case TIME_UNITS.minutes:
-     subtractValue = baseSubstractByMin;
-     break;
-   case TIME_UNITS.hours:
-     subtractValue = 60 * baseSubstractByMin;
-     break;
-   case TIME_UNITS.days:
-     subtractValue = 24 * 60 * baseSubstractByMin;
-     break;
-   case TIME_UNITS.months:
-     startDate.setMonth(startDate.getMonth() - timeAmount);
-     break;
-   case TIME_UNITS.years:
-     startDate.setFullYear(startDate.getFullYear() - timeAmount);
-     break;
- }
- 
- startDate.setHours(0, 0, 0, 0);
- 
- if (subtractValue > 0) {
-   startDate = new Date(startDate.getTime() - subtractValue);
- }
+ let startDate = startOfDay(new Date());
+ 
+ switch (timeUnit) {
+   case TIME_UNITS.minutes:
+     startDate = subMinutes(startDate, timeAmount);
+     break;
+   case TIME_UNITS.hours:
+     startDate = subHours(startDate, timeAmount);
+     break;
+   case TIME_UNITS.days:
+     startDate = subDays(startDate, timeAmount);
+     break;
+   case TIME_UNITS.months:
+     startDate = subMonths(startDate, timeAmount);
+     break;
+   case TIME_UNITS.years:
+     startDate = subYears(startDate, timeAmount);
+     break;
+ }

if (filters.endDate) {
-  startDate = new Date(Math.min(startDate.getTime(), filters.endDate.getTime()));
+  startDate = min([startDate, filters.endDate]);
}
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR and between e825698 and 3612697.

📒 Files selected for processing (2)
  • services/workflows-service/src/data-analytics/data-investigation.service.ts (1 hunks)
  • services/workflows-service/src/transaction/transaction.controller.external.ts (6 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • services/workflows-service/src/transaction/transaction.controller.external.ts
🔇 Additional comments (5)
services/workflows-service/src/data-analytics/data-investigation.service.ts (5)

1-22: LGTM! Class structure follows NestJS best practices

The class is properly decorated with @Injectable() and follows dependency injection patterns with the logger service.


298-300: Avoid suppressing TypeScript and ESLint errors

Using @ts-ignore and eslint-disable-next-line suppresses potential issues that should be addressed.


304-306: ⚠️ Potential issue

Fix incorrect property reference

The code uses inlineRule.fnName but should use inlineRule.fnInvestigationName for consistency.

Apply this fix:

- inlineRule.fnName === 'evaluateHighVelocityHistoricAverage' &&
+ inlineRule.fnInvestigationName === 'evaluateHighVelocityHistoricAverage' &&

Likely invalid or redundant comment.


350-361: ⚠️ Potential issue

Fix potential overwriting of transactionDate filter

The code might overwrite the gte condition when setting the lte condition.

Apply this fix:

 if (filters.startDate) {
   whereClause.transactionDate = {
     gte: filters.startDate,
   };
 }

 if (filters.endDate) {
   whereClause.transactionDate = {
-    ...(typeof whereClause.transactionDate === 'object' ? whereClause.transactionDate : {}),
+    ...whereClause.transactionDate,
     lte: filters.endDate,
   };
 }

Likely invalid or redundant comment.


23-99: 🛠️ Refactor suggestion

Simplify repetitive switch statement

The switch statement is highly repetitive with each case following the same pattern. This can be simplified to improve maintainability.

Consider refactoring to:

-  switch (inlineRule.fnInvestigationName) {
-    case 'investigateTransactionsAgainstDynamicRules':
-      investigationFilter = this[inlineRule.fnInvestigationName]({
-        ...inlineRule.options,
-        projectId,
-      });
-      break;
-    // ... many similar cases ...
-    default:
-      this.logger.error(`No investigation filter obtained`, {
-        inlineRule,
-      });
-      throw new Error(
-        `Investigation filter could not be obtained for rule id: ${
-          (inlineRule as InlineRule).id
-        }`,
-      );
-  }
+  const investigationMethod = this[inlineRule.fnInvestigationName];
+  if (typeof investigationMethod !== 'function') {
+    this.logger.error(`No investigation filter obtained`, { inlineRule });
+    throw new Error(
+      `Investigation filter could not be obtained for rule id: ${inlineRule.id}`,
+    );
+  }
+  
+  investigationFilter = investigationMethod.call(this, {
+    ...inlineRule.options,
+    projectId,
+  });

Likely invalid or redundant comment.

Comment on lines +158 to +164
investigateDormantAccount(options: TDormantAccountOptions) {
const { projectId } = options;

return {
projectId,
} as const satisfies Prisma.TransactionRecordWhereInput;
}

💡 Codebase verification

Filters are missing required time-based criteria

The implementation of both methods is incomplete as they ignore critical fields defined in their type definitions:

  • investigateDormantAccount ignores:

    • timeAmount: number
    • timeUnit: TimeUnit
  • investigateMultipleMerchantsOneCounterparty ignores:

    • excludedCounterparty: TExcludedCounterparty
    • minimumCount: number
    • timeAmount: number
    • timeUnit: TimeUnit

These fields are essential for the proper functioning of these investigation methods. The current implementation using only projectId is insufficient to meet their intended purposes as defined by their type signatures.
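
As a sketch of the direction a fix could take, the time window could be translated into a transactionDate bound (illustrative only — the real filter must mirror the corresponding evaluate* logic, and fields like minimumCount or excludedCounterparty cannot be expressed as a plain where clause; subDays/subHours are assumed helpers, e.g. from date-fns):

investigateDormantAccount(options: TDormantAccountOptions) {
  const { projectId, timeAmount, timeUnit } = options;

  // Derive the lower bound of the window from timeAmount/timeUnit.
  const startDate =
    timeUnit === TIME_UNITS.days
      ? subDays(new Date(), timeAmount)
      : subHours(new Date(), timeAmount); // extend for the remaining TIME_UNITS values

  return {
    projectId,
    transactionDate: { gte: startDate },
  } as const satisfies Prisma.TransactionRecordWhereInput;
}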

🔗 Analysis chain

Verify minimal filter implementation

The investigateDormantAccount and investigateMultipleMerchantsOneCounterparty methods only filter by projectId. This seems insufficient for their intended purposes.

Let's check if these methods are properly implemented in other parts of the codebase:

Also applies to: 200-206


The initial search revealed these methods are used in alert generation, but we need to understand:

  1. The expected filter options from their type definitions
  2. How these filters are actually used in the query context

This will help us determine if the minimal filtering is intentional or missing important criteria.


We need to understand:

  1. The complete type definitions to see what options are expected
  2. How these investigation filters are used in the actual database queries

This will help determine if the minimal filtering is by design or missing implementation.
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for usage of these methods to understand their context
rg -A 5 "investigateDormantAccount|investigateMultipleMerchantsOneCounterparty"

# Search for any related test files
fd -g "*test*" | rg -l "investigateDormantAccount|investigateMultipleMerchantsOneCounterparty"

Length of output: 5855


Script:

#!/bin/bash
# Let's check the type definitions and options for these methods
ast-grep --pattern 'type TDormantAccountOptions'

# Check for the type of TMultipleMerchantsOneCounterparty
ast-grep --pattern 'type TMultipleMerchantsOneCounterparty'

# Let's also check the context where these filters are used
rg -B 5 -A 10 "investigationFilter = this\[inlineRule.fnInvestigationName\]"

Length of output: 7167


Script:

#!/bin/bash
# Let's check the interface file for these types
rg -B 2 -A 5 "interface.*TDormantAccountOptions|type.*TDormantAccountOptions"

# And for the other type
rg -B 2 -A 5 "interface.*TMultipleMerchantsOneCounterparty|type.*TMultipleMerchantsOneCounterparty"

# Let's also check how these investigation filters are used in queries
rg -B 5 -A 10 "where.*investigationFilter"

Length of output: 1417

@liorzam liorzam merged commit c4959d8 into dev Dec 7, 2024
9 checks passed
@liorzam liorzam deleted the bal-1762 branch December 7, 2024 23:55