Problem Statement

...

operation-mode | workflow
upload         | create-upload content
publish        | create-upload-publish content
link           | create-upload-publish content and link it to textbook
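The mapping of operation modes to workflow steps above can be sketched as a simple lookup. The step names below are illustrative, not taken from the actual implementation:

```python
# Hypothetical mapping of the operation-mode header value to workflow steps.
# Step names are illustrative stand-ins for the real pipeline stages.
WORKFLOWS = {
    "upload": ["create", "upload"],
    "publish": ["create", "upload", "publish"],
    "link": ["create", "upload", "publish", "link_to_textbook"],
}

def steps_for(operation_mode: str) -> list:
    """Return the workflow steps for an operation-mode; reject unknown modes."""
    try:
        return WORKFLOWS[operation_mode]
    except KeyError:
        raise ValueError(f"unsupported operation-mode: {operation_mode}")
```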


Design

...

1. Validations

File-related validations to be done are:

  1. Validate the format of the file
  2. Validate whether the file is readable
  3. Validate whether the file has data

Data-related validations to be done are:

  1. Check whether the file conforms to the bulk content upload template (the template should be configurable)
  2. The number of rows in the file should be less than the maximum rows allowed (configurable)
  3. Duplicate check within the file. The key is Taxonomy (BGMS) + ContentName
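The file- and data-level validations above can be sketched as follows. The template column names and the row limit are assumptions for the sketch; the real values come from configuration:

```python
import csv
import io

MAX_ROWS = 300  # illustrative limit; the real value is read from configuration
TEMPLATE_HEADERS = ["Board", "Grade", "Medium", "Subject", "ContentName"]  # assumed template columns

def validate_csv(raw: bytes, headers=TEMPLATE_HEADERS, max_rows=MAX_ROWS):
    """Run the file- and data-level validations; return a list of error strings."""
    errors = []
    # File is readable as text
    try:
        text = raw.decode("utf-8")
    except UnicodeDecodeError:
        return ["file is not readable as UTF-8 text"]
    rows = list(csv.DictReader(io.StringIO(text)))
    # File has data
    if not rows:
        return ["file has no data rows"]
    # File conforms to the (configurable) template
    if list(rows[0].keys()) != headers:
        errors.append("file does not match the bulk content upload template")
    # Row count must stay below the configured maximum
    if len(rows) >= max_rows:
        errors.append(f"row count {len(rows)} exceeds the allowed maximum {max_rows}")
    # Duplicate check: key is Taxonomy (BGMS) + ContentName
    seen = set()
    for i, row in enumerate(rows, start=2):  # data starts on line 2 of the CSV
        key = tuple(row.get(h, "") for h in ("Board", "Grade", "Medium", "Subject", "ContentName"))
        if key in seen:
            errors.append(f"line {i}: duplicate Taxonomy+ContentName key")
        seen.add(key)
    return errors
```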


2. Synchronous Processing

  1. Upload the CSV file to blob storage
  2. Make an entry in the bulk_upload_process table

column           | data to insert             | remarks
id               | auto-generated unique id   | processId
createdby        | uploader id                |
createdon        | current timestamp          |
data             | blobstore URL of CSV file  |
failureresult    |                            | failedCount to be updated here
lastupdatedon    |                            | last-updated timestamp, to be updated here on each update
objecttype       | content                    |
organisationid   | tenant id                  |
processendtime   |                            | endTime - current timestamp to be inserted here while moving this process to completed state
processstarttime |                            | startTime - current timestamp to be inserted here while moving this process to processing state
retrycount       | 0                          | Not used
status           | queued                     | status - possible values: queued, processing, completed
storagedetails   |                            | report - blobstore URL of result file
successresult    |                            | successCount to be updated here
taskcount        | number of records in file  | totalCount
uploadedby       | uploader id                |
uploadeddate     | current timestamp          |


3. Make entries into bulk_upload_process_task table (One record per content)

column        | data to insert             | remarks
processid     | id from master table       |
sequenceid    | auto-generated sequence id |
createdon     | current timestamp          |
data          | data in JSON format        |
failureresult |                            | JSON data + failed message
iterationid   | 0                          | Not used
lastupdatedon |                            | last-updated timestamp, to be updated here on each update
status        | queued                     | possible values: queued, success, failed
successresult |                            | JSON data + success message


4. For the LINK operation-mode, get the draft hierarchy of the Textbooks mentioned in the CSV and cache the dialCode-TextBookUnitDoId mapping in Redis

5. Push events to Kafka with the Textbook Id as the partition key for the LINK operation-mode. For other operation modes, use the hashed value generated during the duplicate check as the partition key.
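The partition-key choice in the step above can be sketched as follows. Keying LINK events by textbook id keeps all updates to one textbook's hierarchy on a single partition (so they are processed in order, without concurrent hierarchy writes), while the hashed duplicate-check key spreads the other modes evenly across partitions. The field names (`textbookId`, the BGMS columns) and the use of MD5 are assumptions for the sketch:

```python
import hashlib

def partition_key(operation_mode: str, record: dict) -> str:
    """Choose the Kafka partition key for a bulk-upload event.

    LINK mode keys by textbook id; other modes reuse a hash of the
    Taxonomy+ContentName key computed during the duplicate check.
    Field names are illustrative assumptions.
    """
    if operation_mode == "link":
        return record["textbookId"]
    dedup_key = "|".join(record.get(f, "") for f in
                         ("Board", "Grade", "Medium", "Subject", "ContentName"))
    return hashlib.md5(dedup_key.encode("utf-8")).hexdigest()
```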



3. Asynchronous Processing - Samza

  1. Validate mandatory fields
  2. Validate the DIAL code (first against the Redis cache; if not present, get the draft hierarchy from Cassandra and validate)
  3. Validate the file size and file format in Google Drive
  4. Validate the Taxonomy by creating the content - hit the REST API
  5. Download the AppIcon from Google Drive
  6. Create an asset with the downloaded image
  7. Update the content with the AppIcon image URL
  8. Download the content file from Google Drive
  9. Upload the content - hit the REST API
  10. Publish the content - hit the Java API
  11. Get the draft hierarchy of the TextBook from Cassandra
  12. Get the metadata of the published content
  13. Update the draft hierarchy of the TextBook in Cassandra
  14. Update the status back to LMS Cassandra - bulk_upload_process_task table
  15. Retire the content in case of any exception in the flow
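The cache-first DIAL-code lookup in step 2 above can be sketched as follows, with a dict standing in for Redis and a callable standing in for the Cassandra draft-hierarchy read (the real code would use the respective clients):

```python
def resolve_dial_code(dial_code, cache, load_hierarchy):
    """Resolve a DIAL code to a textbook-unit do_id, cache-first.

    `cache` is a dict standing in for Redis; `load_hierarchy` is a callable
    standing in for the Cassandra draft-hierarchy read and must return a
    {dialCode: textbookUnitDoId} mapping for the textbook.
    """
    do_id = cache.get(dial_code)
    if do_id is not None:
        return do_id
    # Cache miss: fall back to the draft hierarchy and warm the cache.
    mapping = load_hierarchy()
    cache.update(mapping)
    if dial_code not in mapping:
        raise ValueError(f"invalid DIAL code: {dial_code}")
    return mapping[dial_code]
```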


4. Scheduler

  1. A scheduler runs at periodic intervals to consolidate the results from the bulk_upload_process_task table and update the master table (bulk_upload_process) with success_count, failed_count, process_end_time, result_file_url and status
  2. While a process is being marked as completed, the result file has to be generated, uploaded to blobstore, and its URL updated back in the bulk_upload_process table
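The consolidation step above can be sketched as a fold over the per-content task statuses for one processid; a process is marked completed only once no task is still queued. Column and status names follow the tables earlier in this document:

```python
from collections import Counter

def consolidate(task_statuses):
    """Fold per-content task rows into the master-table update.

    `task_statuses` is the list of status values read from the
    bulk_upload_process_task table for one processid.
    """
    counts = Counter(task_statuses)
    done = counts["queued"] == 0
    return {
        "successresult": counts["success"],
        "failureresult": counts["failed"],
        "status": "completed" if done else "processing",
    }
```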


5. Status Check API

  1. Data from the bulk_upload_process table to be served based on the processId


6. Status List API

  1. The userId of the user should be deduced from the Keycloak access token passed in the header.
  2. Statuses of all uploads done by the user have to be served from the bulk_upload_process table
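Deducing the userId from the Keycloak access token amounts to reading the `sub` claim of the JWT. A minimal sketch, decoding the payload only; a real deployment must verify the token's signature (e.g. against Keycloak's public key) before trusting any claim:

```python
import base64
import json

def user_id_from_token(jwt: str) -> str:
    """Extract the userId (the `sub` claim) from a Keycloak access token.

    Decodes the payload segment only; signature verification is deliberately
    omitted in this sketch and is mandatory in production.
    """
    payload_b64 = jwt.split(".")[1]
    # JWT segments are base64url without padding; restore it before decoding.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    claims = json.loads(base64.urlsafe_b64decode(payload_b64))
    return claims["sub"]
```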


...

API Specifications

Bulk Content Upload API - POST - /v1/textbook/content/bulk/upload

Request Headers

Content-Type: multipart/form-data
Authorization: Bearer {{api-key}}
x-authenticated-user-token: {{keycloak-token}}
x-channel-id: {{channel-identifier}}
x-framework-id: {{framework-identifier}}
x-hashtag-id: {{tenant-id}}
operation-mode: upload/publish/link

...

Bulk Content Upload Status Check API - GET - /v1/textbook/content/bulk/upload/status/:processId

Request Headers

Accept: application/json
Authorization: Bearer {{api-key}}
x-authenticated-user-token: {{keycloak-token}}


Response : Success Response - OK (200) - In Queue

...

Bulk Content Upload Status List API - GET - /v1/textbook/content/bulk/upload/status/list

Request Headers

Accept: application/json
Authorization: Bearer {{api-key}}
x-authenticated-user-token: {{keycloak-token}}


Response : Success Response - OK (200)

...