Revised cron logic for Status Mismatch bugs

What is the purpose of this article?

The purpose of this article is to provide a resolution for the order sync issue in Algolia. The idea is to sync the order items in Algolia without missing any of them. To achieve this, we will save an entry with a new job type in the order_return_algolia_sync table for these two APIs - /shipments/algolia/sync and /order-items/algolia/sync/update:

Details of the APIs to be updated: project - esb client; service - order-return; endpoints - orderReturnEndpointV2 and shipmentUpdateReconEndpoint

In this article, we will thoroughly discuss the steps to achieve this task.

What are the steps to achieve this task?

Note - 1. Changes will be made in the existing APIs mentioned above.

2. For the first run, a manual entry will have to be made in the order_return_algolia_sync table.

  1. Every 5 minutes, the Google Cloud Scheduler job begins.

  2. The Google Cloud Scheduler job then targets the shipments/algolia/sync API in ShipmentUpdateReconEndpoint.

    GET API Path - /v1/shipments/algolia/sync
  3. The API will fetch the last-run cron information from the order_return_algolia_sync table.

    SELECT * FROM order_return_algolia_sync WHERE job_type = 'shipment_status_update_job_v2' AND status = 1 ORDER BY id DESC LIMIT 1;
  4. After the query returns the result set, store the value of date_range_end minus 2 minutes in a variable named dateRangeStart.
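The 2-minute overlap in the step above can be sketched with java.time (a minimal sketch; the method and class names are illustrative, not the actual service code):

```java
import java.time.LocalDateTime;
import java.time.temporal.ChronoUnit;

public class DateRangeStart {
    // Subtract a 2-minute overlap from the previous run's date_range_end so
    // that records written while the last cron run was finishing are not missed.
    static LocalDateTime dateRangeStart(LocalDateTime dateRangeEnd) {
        return dateRangeEnd.minus(2, ChronoUnit.MINUTES);
    }

    public static void main(String[] args) {
        LocalDateTime end = LocalDateTime.of(2024, 1, 1, 10, 0);
        System.out.println(dateRangeStart(end)); // prints 2024-01-01T09:58
    }
}
```

The overlap trades a few duplicate syncs (Algolia upserts are idempotent per object) for the guarantee that no item falls between two cron windows.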

  5. From this step onward, all the steps below will run in a single method. Now, run a query to fetch the order items and the max shipment_status_logs.created_at date for each, and store the result in a list of DTOs - List<ItemWithCreatedDate>:

    SELECT order_items.id AS orderItemId_PK, MAX(shipment_status_logs.created_at) AS created_at FROM order_items LEFT JOIN shipment_status_logs ON order_items.shipment_id = shipment_status_logs.shipment_id WHERE order_items.created_at >= (NOW() - INTERVAL 180 DAY) AND order_items.shipment_id != 0 AND shipment_status_logs.created_at >= (:dateRangeStart - INTERVAL 4 MINUTE) GROUP BY order_items.id ORDER BY MAX(shipment_status_logs.id) ASC LIMIT {offset}, LIMIT_CONSTANT; -- offset is dynamic; LIMIT_CONSTANT should be either 5000 or 10000
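The result set and pagination contract of the query above can be modeled as follows (a sketch only; LIMIT_CONSTANT, ItemWithCreatedDate, and the method names are assumptions, not the actual service code):

```java
import java.time.LocalDateTime;
import java.util.List;

public class SyncBatch {
    // Per the steps above, this should be either 5000 or 10000.
    static final int LIMIT_CONSTANT = 5000;

    // One row of the query: order item primary key plus the max
    // shipment_status_logs.created_at for that item.
    record ItemWithCreatedDate(long orderItemId, LocalDateTime createdAt) {}

    // A completely full page means more rows may remain,
    // so a next batch must be queued.
    static boolean hasNextBatch(List<ItemWithCreatedDate> page) {
        return page.size() == LIMIT_CONSTANT;
    }

    // Offset to pass to the next run of the same query.
    static int nextOffset(int currentOffset) {
        return currentOffset + LIMIT_CONSTANT;
    }
}
```

Keeping page size and offset arithmetic in one place avoids the classic off-by-one where the next batch re-reads or skips a page boundary.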
  6. If the fetched result is empty, then return; else move to step 7.

  7. Iterate over the list of DTOs and store the orderItemIds (primary keys) in a List<Long> orderItemIds.

  8. If the orderItemIds list size > 0, then pass the collected order item IDs to the task queue to sync them in Algolia (via the API below).
    taskqueue name -

    POST API:

    curl --location 'https://order-return-dot-eshopbox-client-portal-prod.appspot.com/_ah/api/esb/v2/algolia/sync/orderItemIds' \
      --header 'Content-Type: application/json' \
      --data '{ "data": "[23921777]" }'
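Note that in the curl example the "data" field carries the order item IDs as a stringified JSON array, not a nested array. A minimal sketch of building that body with plain string formatting (the real service presumably uses its own JSON library; class and method names here are illustrative):

```java
import java.util.List;
import java.util.stream.Collectors;

public class AlgoliaSyncPayload {
    // Builds a body like {"data": "[23921777]"} - the inner array is a
    // quoted string, matching the curl example above.
    static String buildBody(List<Long> orderItemIds) {
        String ids = orderItemIds.stream()
                .map(String::valueOf)
                .collect(Collectors.joining(",", "[", "]"));
        return "{\"data\": \"" + ids + "\"}";
    }

    public static void main(String[] args) {
        System.out.println(buildBody(List.of(23921777L))); // prints {"data": "[23921777]"}
    }
}
```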
  9. If the record count from the above query = LIMIT_CONSTANT (the standard limit of the query):

    1. Then pass the following data to the new task queue - shipment_status_update_job_v2_next_batch (concurrency should be 1):

      1. offset = current offset + LIMIT_CONSTANT

      2. dateRangeStart = value obtained from the above query

      3. Make a listener API which will trigger step 5 again, replacing the offset and dateRangeStart in the corresponding query.

    2. Else:

      1. Store the value of the maximum shipment_status_logs.created_at (index 0 of the list) in a variable named dateRangeEnd.

      2. Insert the job_type - shipment_status_update_job_v2 with status = 1, dateRangeStart = :dateRangeStart and dateRangeEnd = :dateRangeEnd into the order_return_algolia_sync table in the WMS DB.
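The branch in step 9 boils down to: full page, enqueue the next batch at an advanced offset; partial page, record a successful run. A sketch of that decision (names and the returned description strings are hypothetical; the real code would enqueue a task or insert a DB row instead):

```java
public class NextBatchDecision {
    // Summarizes what step 9 does for a given page size.
    static String decide(int recordCount, int limitConstant, int currentOffset) {
        if (recordCount == limitConstant) {
            // Full page: queue shipment_status_update_job_v2_next_batch
            // (concurrency 1) with the offset advanced by one page.
            return "enqueue next batch at offset " + (currentOffset + limitConstant);
        }
        // Partial page: the date range is finished; persist job_type
        // shipment_status_update_job_v2 with status = 1.
        return "insert success row (status = 1)";
    }
}
```

Concurrency 1 on the next-batch queue matters: two batches running in parallel would race on the offset and could sync overlapping or skipped pages.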

If an exception occurs anywhere during the sync, then in the catch block make an entry in the DB with status = 2:

  1. Insert the job_type - shipment_status_update_job_v2 with status = 2, dateRangeStart = :dateRangeStart and dateRangeEnd = :dateRangeEnd into the order_return_algolia_sync table in the WMS DB.
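The failure path above amounts to wrapping the whole sync in a try/catch and persisting the same row with status = 2. A sketch under that assumption (Sync and runAndRecord are hypothetical placeholders for the actual service code):

```java
public class SyncWithFailureMarker {
    // Placeholder for the actual sync work (steps 5-9).
    interface Sync { void run() throws Exception; }

    // Runs the sync; on any exception, the range is recorded with status = 2
    // so a failed window is visible to later runs. Returns the status written.
    static int runAndRecord(Sync sync) {
        try {
            sync.run();
            return 1; // success row: status = 1
        } catch (Exception e) {
            // Insert job_type shipment_status_update_job_v2 with status = 2,
            // keeping the same dateRangeStart/dateRangeEnd as the failed run.
            return 2;
        }
    }
}
```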

Note - The same approach will be taken for the cancelled cron.

 
