...

API Endpoint: <domain_name>/ingestion/csvnew_programs

HTTP Method: POST       

...

...

This API will import the dimension CSV and upload it to the combined_input folder in the cloud if there are no errors. The adapter will then use the same file to break the combined input down into multiple input files, which are later used by the NiFi processor to process and ingest the data into the database.
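
For illustration, a minimal sketch of uploading the dimension CSV from a script instead of Postman is shown below. The form field names (file, ingestion_type, ingestion_name) and the file name are assumptions made for this sketch only; confirm the exact request shape against the YAML spec linked below.

    # Hedged sketch: upload a dimension CSV to the ingestion endpoint.
    # The form field names below are assumptions; verify them against spec.yaml.
    import requests

    DOMAIN = "https://<domain_name>"                      # replace with the actual domain
    url = f"{DOMAIN}/ingestion/csvnew_programs"

    with open("dimension_district.csv", "rb") as fh:      # hypothetical file name
        response = requests.post(
            url,
            files={"file": fh},                           # assumed field name
            data={"ingestion_type": "dimension",          # assumed parameter
                  "ingestion_name": "district"},          # assumed parameter
        )
    print(response.status_code, response.text)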

...

API Endpoint: <domain_name>/ingestion/csvnew_programs

HTTP Method: POST  

...

This API will import the event CSV and upload it to the combined_input folder in the cloud if there are no errors. The adapter will then use the same file to break the combined input down into multiple input files, which are later used by the NiFi processor to process and ingest the data into the database.

...

Step 2: Build the request body with reference to the YAML file. The request body for the above API is defined in the YAML spec: https://github.com/Sunbird-cQube/spec-ms/blob/dev/spec.yaml

...

API Endpoint: <domain_name>/ingestion/csvnew_programs

HTTP Method: POST  

...

This API will import the event CSV and upload it to the combined_input folder in the cloud if there are no errors. The adapter will then use the same file to break the combined input down into multiple input files, which are later used by the NiFi processor to process and ingest the data into the database.

...

Step 3: Click on the Send button for the request. If the request is successful, the user should see a response message. Please refer to the screenshot below.

After successful execution of the CSV import API we get the response, and we can check whether the file was uploaded by using the GET file status API. If the file was uploaded successfully the response will indicate that it is uploaded; if there are any errors the response will indicate that there was an error in the file.
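
As an illustration, a minimal sketch of polling the file status from a script is shown below. The use of the /ingestion/file-status path for the GET call and the filename query parameter are assumptions not stated on this page; confirm them against the YAML spec.

    # Hedged sketch: poll the file status after a CSV import.
    # The path and query parameter name are assumptions; check spec.yaml.
    import requests

    DOMAIN = "https://<domain_name>"                     # replace with the actual domain
    response = requests.get(
        f"{DOMAIN}/ingestion/file-status",
        params={"filename": "dimension_district.csv"},   # hypothetical file name and parameter
    )
    print(response.status_code, response.text)           # expect an uploaded status or an error message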

...

This API is used to write events into a CSV file, which is stored in the input folder and uploaded to the combined_input folder in the cloud if there are no errors. The adapter then uses the same files to break the combined input down into multiple input files, which are later used by the NiFi processor to ingest the data into the database. The API can be used to add individual events into the CSV.
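
As a sketch of how an individual event might be posted, the example below assumes an /ingestion/event path and an event_name/event payload shape, neither of which is confirmed by this page; the exact body is defined in the YAML spec.

    # Hedged sketch: add a single event through the event API.
    # The path, field names, and payload shape are assumptions; verify against spec.yaml.
    import requests

    DOMAIN = "https://<domain_name>"              # replace with the actual domain
    payload = {
        "event_name": "student_attendance",       # hypothetical event name
        "event": {                                # hypothetical event fields
            "school_id": "1001",
            "date": "2023-01-10",
            "total_students": 45,
        },
    }
    response = requests.post(f"{DOMAIN}/ingestion/event", json=payload)
    print(response.status_code, response.text)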

...

This API is used to write dimensions into a CSV file, which is stored in the input folder and uploaded to the combined_input folder in the cloud if there are no errors. The adapter then uses the same files to break the combined input down into multiple input files, which are later used by the NiFi processor to ingest the data into the database. This API can be used to add individual dimensions into the CSV.

...

This API is used to write datasets into a CSV file, which is stored in the input folder and uploaded to the combined_input folder in the cloud if there are no errors. The adapter then uses the same files to break the combined input down into multiple input files, which are later used by the NiFi processor to ingest the data into the database. This API can be used to add individual datasets into the CSV.

...

If the CSV files are very large, uploading them takes more time, which would make users wait longer to receive a response from the API. For this reason the upload process runs asynchronously, and this API was developed to check the status of a file at any particular time.

PUT file status API

Step 1:  Open the specified request & add the details

API Endpoint: <domain_name>/ingestion/file-status

HTTP Method: PUT

...

Step 2: Build the request body with reference to the YAML file. The request body for the above API is defined in the YAML spec: https://github.com/Sunbird-cQube/spec-ms/blob/dev/spec.yaml. Provide valid input details for the parameters shown below.

...

Step 3: Click on the Send button for the request. If the request is successful, the user should see a response message indicating that the file status has been updated.

...

This API is required to update the file status once a file has been processed by a processor group, so that it can be moved to the ready-to-archive state and subsequently uploaded to S3. It helps keep track of processed files and identify which files have to be uploaded to the S3 archive bucket once processing is complete.
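
For illustration, a minimal sketch of marking a processed file as ready to archive is shown below. The body field names and the status value are assumptions for this sketch; verify them against the YAML spec.

    # Hedged sketch: mark a processed file as ready to archive via the PUT file status API.
    # The body field names and status value are assumptions; verify against spec.yaml.
    import requests

    DOMAIN = "https://<domain_name>"                 # replace with the actual domain
    body = {
        "filename": "student_attendance.csv",        # hypothetical file name
        "status": "Ready_To_Archive",                # assumed status value
    }
    response = requests.put(f"{DOMAIN}/ingestion/file-status", json=body)
    print(response.status_code, response.text)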

Schedule API

This API helps to schedule the processor group at any particular time and ingest the data into the database.

...

The schedule API helps run the processor group in NiFi at a scheduled time. The schedule can be updated by changing the cron expression of the scheduled_at property in the request body. The processor group processes the CSV file at the scheduled time and ingests the data into the database.
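
As an illustration, a minimal sketch of scheduling a processor group is shown below. The /spec/schedule path and the processor_group_name field are assumptions not stated on this page; only scheduled_at and its cron expression come from the description above, so confirm the exact body against the YAML spec.

    # Hedged sketch: schedule a NiFi processor group with a cron expression.
    # The endpoint path and processor_group_name field are assumptions; verify against spec.yaml.
    import requests

    DOMAIN = "https://<domain_name>"                 # replace with the actual domain
    body = {
        "processor_group_name": "Run_adapters",      # hypothetical processor group name
        "scheduled_at": "0 0 2 * * ?",               # cron expression: every day at 02:00
    }
    response = requests.post(f"{DOMAIN}/spec/schedule", json=body)
    print(response.status_code, response.text)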

Upload to S3 API

This API helps upload the processed files to the archive bucket in S3 and the error files to the error bucket in S3.

Step 1: Open the specified request & add the details

API Endpoint: <domain_name>/spec/s3

HTTP Method: POST

...

Step 2: Build the request body with reference to YAML file. The request body for the above api is attached in Link for yaml: https://github.com/Sunbird-cQube/spec-ms/blob/dev/spec.yaml . Provide the valid input details for the parameters shown below.

The scheduled_type key accepts two values: archive and error. A request sketch is shown after this list.

  • If the value is archive, a processor group is created to upload all the archived files present in the archived folder on the server to the S3 archive bucket.

  • If the value is error, a processor group is created to upload the error files present in the error folder on the server to the S3 error bucket.
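
As referenced above, a minimal sketch of calling the upload-to-S3 API is shown below. The scheduled_at field is carried over from the schedule API description and the overall body shape is an assumption for this sketch; confirm the exact parameters against the YAML spec.

    # Hedged sketch: trigger the upload of archived files to the S3 archive bucket.
    # The body shape is an assumption; verify the exact parameters against spec.yaml.
    import requests

    DOMAIN = "https://<domain_name>"          # replace with the actual domain
    body = {
        "scheduled_type": "archive",          # or "error" to target the error bucket
        "scheduled_at": "0 0 3 * * ?",        # assumed cron schedule, as with the schedule API
    }
    response = requests.post(f"{DOMAIN}/spec/s3", json=body)
    print(response.status_code, response.text)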

...

Step 3: Build the request body with reference to the YAML file. The request body for the above API is defined in the YAML spec: https://github.com/Sunbird-cQube/spec-ms/blob/dev/spec.yaml. Provide valid input details for the parameters shown below.

...

This API also schedules the processor group to run at any particular time. The processor group automatically picks up all the files in the archived folder or error folder and uploads them to the S3 archive bucket or S3 error bucket respectively.