Objective
This document describes how the product workflow has been designed around the use cases of end users. It explains the flow of the application, starting from ingesting the events, dimensions & datasets.
Purpose: The purpose of this document is to explain how to ingest the data and access the resulting datasets.
Step-wise Ingestion process
The ingestion of the data can be done using the APIs. To run the APIs, please use Postman as a tool. If Postman is already installed, skip the Setting up Postman section; otherwise, install Postman by following the steps below.
Setting up Postman
Download the Postman application and import the collection.
Select the Import option in Postman to import the collection. Please refer to the below screenshot.
The data can be ingested in two ways:
Ingestion of data using CSV
Ingestion of data using API
Ingestion of data using CSV
Ingestion of Dimension Data using CSV
Select the csv_import folder in the Postman collection.
Step 1: Open the specified request & add the details
API Endpoint: <domain_name>/ingestion/new_programs
HTTP Method: POST
This API will import the dimension CSV and upload it into the combined_input folder in the cloud storage if there are no errors.
Step 2: Build the request body with reference to the YAML specification for this API: https://github.com/Sunbird-cQube/spec-ms/blob/dev/spec.yaml
Provide the valid input details for the parameters shown below.
file : Attach the CSV file to be imported
ingestion_type : Specify the type of ingestion
ingestion_name : Name of the dimension (the dimension name must already be present in the database)
Step 3: Click on the Send button for the request. If the request is successful, the user should see a response message. Please refer to the below screenshot.
After successful execution of the CSV import API, the file status can be checked through the GET file status API, which indicates whether the file was uploaded. If the file was uploaded successfully, the response will be "uploaded"; if there is any error, the response will indicate that there was an error in the file.
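For reference, the same dimension-ingestion request can be made outside Postman. The sketch below uses Python's requests library; the domain, file name, and parameter values are placeholders, and the exact field names should be confirmed against spec.yaml.

import requests

# Base URL of the cQube ingestion service; replace with your domain.
BASE_URL = "https://<domain_name>"

# Multipart form fields as described in Step 2. The values are
# placeholders -- the dimension name must already exist in the database.
with open("state_dimension.data.csv", "rb") as f:  # hypothetical file
    response = requests.post(
        f"{BASE_URL}/ingestion/new_programs",
        files={"file": f},
        data={"ingestion_type": "dimension", "ingestion_name": "state"},
    )
print(response.status_code, response.json())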
Ingestion of Event Data using CSV
Select the csv_import folder from the Postman collection.
Step 1: Open the specified request & add the details
API Endpoint: <domain_name>/ingestion/new_programs
HTTP Method: POST
This API will import the event CSV and upload it into the combined_input folder in the cloud storage if there are no errors. The adapter will then use these files to break the combined input down into multiple input files.
Step 2: Build the request body with reference to the YAML specification for this API: https://github.com/Sunbird-cQube/spec-ms/blob/dev/spec.yaml
Provide the valid input details for the parameters shown below.
file : Attach the CSV file to be imported
ingestion_type : Specify the type of ingestion
ingestion_name : Name of the event (the event name must already be present in the database)
Step 3: Click on the Send button for the request. If the request is successful, the user should see a response message. Please refer to the below screenshot.
After successful execution of the CSV import API, the file status can be checked using the GET file status API, which indicates whether the file was uploaded. If the file was uploaded successfully, the response will be "uploaded"; if there are any errors, the response will indicate that there was an error in the file.
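The request has the same shape as the dimension sketch above; only the form values change. A minimal sketch in Python (the event name and file are hypothetical):

import requests

BASE_URL = "https://<domain_name>"  # placeholder domain

# Same endpoint as dimension ingestion; only the form values differ.
with open("student_attendance.data.csv", "rb") as f:  # hypothetical file
    response = requests.post(
        f"{BASE_URL}/ingestion/new_programs",
        files={"file": f},
        data={"ingestion_type": "event", "ingestion_name": "student_attendance"},
    )
print(response.json())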
Things to take care of while ingesting Data / Debugging:
The date format should be correct. The accepted date format is DD/MM/YY (a quick local check is sketched after this list).
The file name and the schema name stored in the table should be the same.
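Date problems can be caught before upload by parsing the dates locally. A minimal sketch in Python, assuming the date column is literally named "date" (adjust to your schema):

import csv
from datetime import datetime

# Check that every value in the (assumed) "date" column parses as DD/MM/YY
# before sending the file to the ingestion API.
with open("student_attendance.data.csv", newline="") as f:  # hypothetical file
    for line_number, row in enumerate(csv.DictReader(f), start=2):
        try:
            datetime.strptime(row["date"], "%d/%m/%y")
        except ValueError:
            print(f"Line {line_number}: invalid date {row['date']!r}")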
Error Monitoring
The error file stores all the error records produced during the ingestion process and is uploaded to the appropriate cloud storage. The user can log in to the respective cloud storage and download the file to look at the error records present in the CSV. The steps below specify how to access the error file.
The cloud storage bucket/container will be named cQube-edu.
Inside the bucket there will be multiple folders for the different processing stages; the ingestion error files are stored in the ingestion_error folder. This folder in turn contains a folder for each program, named after the corresponding <program_name>.
The <program_name> folder internally contains a folder whose name is the current date.
The user can open the current-date folder, if it is present, to see the error files in it and download them to view the errors present in the CSV.
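The error files can also be fetched programmatically instead of through the console. The sketch below uses the MinIO Python client; the endpoint, credentials, program name, and date are placeholders, and Azure and AWS offer equivalent SDK calls.

from minio import Minio

# Connect to the MinIO server; endpoint and credentials are placeholders.
client = Minio("minio.example.com:9000",
               access_key="<access_key>", secret_key="<secret_key>", secure=False)

# List and download the error files for one program and date.
# The bucket name is whatever was configured during setup (cQube-edu above).
prefix = "ingestion_error/<program_name>/2023-01-31/"  # hypothetical program/date
for obj in client.list_objects("cqube-edu", prefix=prefix, recursive=True):
    client.fget_object("cqube-edu", obj.object_name, obj.object_name.split("/")[-1])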
Please refer to the below screenshots of the different storage types (MinIO, Azure, and AWS) on how to access the error files.
Accessing error files in MinIO Storage
Accessing error files in Azure
Accessing error files in AWS
To get the count of processed records and error records, the GET /ingestion/file-status API can be used.
File Status API
GET file status API
Step 1: Open the specified request & add the details
API Endpoint: <domain_name>/ingestion/file-status
HTTP Method: GET
Step 2: Send the query parameters with reference to the YAML specification for this API: https://github.com/Sunbird-cQube/spec-ms/blob/dev/spec.yaml
Provide the valid input details for the parameters shown below.
Step 3: Click on the Send button for the request. If the request is successful, the user should see a response message which contains the status of the file.
If a CSV file is very large, its upload takes more time, which would make users wait a long duration to receive a response from the API. For this reason, the upload process runs asynchronously, and this API was developed to check the status of a file at any particular time.
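A sketch of the same status check in Python; the query parameter names below are assumptions and should be confirmed against spec.yaml.

import requests

BASE_URL = "https://<domain_name>"

# Query parameter names are assumptions -- confirm them in spec.yaml.
params = {"filename": "student_attendance.data.csv", "ingestion_type": "event"}
response = requests.get(f"{BASE_URL}/ingestion/file-status", params=params)
print(response.json())  # expected to contain the file status and record counts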
Schedule API
This API helps to schedule the processor group at any particular time.
Step 1: Open the specified request & add the details
API Endpoint: <domain_name>/spec/schedule
HTTP Method: POST
Step 2: Build the request body with reference to the YAML specification for this API: https://github.com/Sunbird-cQube/spec-ms/blob/dev/spec.yaml. Provide the valid input details for the parameters shown below.
Step 3: Click on the send button for the request and if the request is successful the user should see a response message. The schedule time will be updated in the processor group.
The schedule API helps to run the processor group in NiFi at a scheduled time. The schedule time can be updated by changing the cron expression in the scheduled_at property of the request body.
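A sketch of the scheduling request in Python; the processor group name and the body field names are assumptions to be verified against spec.yaml. The cron expression here would run the processor group daily at 02:00.

import requests

BASE_URL = "https://<domain_name>"

# Body field names are assumptions -- verify them in spec.yaml.
body = {
    "processor_group_name": "Run_adapters",  # hypothetical NiFi processor group
    "scheduled_at": "0 0 2 * * ?",           # cron expression: daily at 02:00
}
response = requests.post(f"{BASE_URL}/spec/schedule", json=body)
print(response.json())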
V4-Data Emission API
This API is used for data migration. It helps to migrate the data to the cQube 4.0 version.
Step 1: Open the specified request and add the details.
API Endpoint: <domain_name>/ingestion/v4-data-emission
HTTP Method: GET
Step 2: Click on the Send button for the request. If the request is successful, the user should see the response message “Files uploaded successfully”. The files will be uploaded to the emission folder created in the respective cloud storage.
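Since the endpoint takes no request body or parameters, the call is a single GET; a minimal sketch in Python:

import requests

# Trigger the v4 data emission; no body or query parameters are required.
response = requests.get("https://<domain_name>/ingestion/v4-data-emission")
print(response.json())  # expected message: "Files uploaded successfully"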
National Programs API
This API accepts the event data in the zip file format and adds it to the emission folder in the respective cloud storage.
Step 1: Open the specified request and add the details.
API Endpoint: <domain_name>/ingestion/national_programs
HTTP Method: POST
Step 2: Build the request body with reference to the YAML specification for this API: https://github.com/Sunbird-cQube/spec-ms/blob/dev/spec.yaml. Provide the valid input details for the parameters shown below.
Step 3: Click on the Send button for the request. If the request is successful, the user should see the response message “File uploaded successfully”. The files will be uploaded to the emission folder created in the respective cloud storage.
The zip file will be extracted and read by the adapters, and the contents will then be moved into the input folders.
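A sketch of the zip upload in Python; the file name is a placeholder, and any additional form fields should be taken from spec.yaml.

import requests

BASE_URL = "https://<domain_name>"

# Upload the event data as a zip file; the name is a placeholder.
with open("national_program_events.zip", "rb") as f:
    response = requests.post(f"{BASE_URL}/ingestion/national_programs",
                             files={"file": f})
print(response.json())  # expected message: "File uploaded successfully"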