...
This document details how the product workflow has been designed around the use cases of end users. It explains the process flow of the application, starting from ingesting the events, dimensions & datasets, and shows how the data is uploaded to cloud storage.
Purpose: The purpose of this document is to explain how to upload data to cloud storage using the ingestion APIs, so that it can be processed, ingested, and made accessible in the end datasets.
Step-wise Ingestion process
...
This API imports the dimension CSV and uploads it into the combined_input folder in the cloud if there are no errors. The adapter then uses these files to break the combined input down into multiple input files. Those files are later used by the NiFi processor to process and ingest the data into the database.
Step 2: Build the request body with reference to the YAML file. The request body for the above API is defined in the YAML spec: https://github.com/Sunbird-cQube/spec-ms/blob/march-release/spec.yaml
Provide the valid input details for the Parameters shown below.
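As a sketch, the pieces of the CSV import request can be assembled in Python before sending. The endpoint path (`/ingestion/csv`) and the form-field names (`file`, `ingestion_type`, `ingestion_name`) below are assumptions for illustration only; confirm the exact contract against the spec YAML linked above.

```python
# Hedged sketch of a multipart CSV-import request. The endpoint path and
# form-field names are ASSUMPTIONS -- verify them against the spec YAML.
import io

BASE_URL = "https://<domain_name>"  # replace with the deployed domain


def build_csv_import_request(csv_bytes: bytes, ingestion_name: str) -> dict:
    """Assemble the URL, file part, and form data for the CSV import POST."""
    return {
        "url": f"{BASE_URL}/ingestion/csv",  # assumed path
        "files": {"file": ("dimension.csv", io.BytesIO(csv_bytes), "text/csv")},
        "data": {"ingestion_type": "dimension", "ingestion_name": ingestion_name},
    }


req = build_csv_import_request(b"district_id,district_name\n101,Chennai\n", "district")
# A library such as requests could then send it:
#   requests.post(req["url"], files=req["files"], data=req["data"])
```

The sketch only builds the request; sending it requires a reachable cQube deployment.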
...
This API imports the event CSV and uploads it into the combined_input folder in the cloud if there are no errors. The adapter then uses these files to break the combined input down into multiple input files. Those files are later used by the NiFi processor to process and ingest the data into the database.
...
Step 2: Build the request body with reference to the YAML file. The request body for the above API is defined in the YAML spec: https://github.com/Sunbird-cQube/spec-ms/blob/march-release/spec.yaml
Provide the valid input details for the Parameters shown below.
...
After successful execution of the CSV import API we receive a response, and we can check whether the file was uploaded using the GET file status API. If the file was uploaded successfully the response is "uploaded"; if there are any errors, the response indicates that there was an error in the file.
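The status check described above can be sketched as two small helpers: one builds the GET file status URL, the other interprets the response. The path `/ingestion/file-status`, the `filename` parameter, and the `status` response field are assumptions; take the exact names from the spec YAML.

```python
# Hedged sketch of the GET file-status check. Endpoint path, query parameter,
# and response field names are ASSUMPTIONS -- confirm them in the spec YAML.
from urllib.parse import urlencode


def build_status_url(base_url: str, filename: str) -> str:
    """Construct the file-status query URL for an uploaded CSV."""
    return f"{base_url}/ingestion/file-status?" + urlencode({"filename": filename})


def interpret_status(response_json: dict) -> str:
    """The doc says the response indicates 'uploaded' or an error in the file."""
    status = str(response_json.get("status", ""))
    return "ok" if status.lower() == "uploaded" else "error"


url = build_status_url("https://<domain_name>", "dimension.csv")
# A real check would GET `url` and pass the parsed JSON to interpret_status().
```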
Ingestion of data using API
The data can also be ingested using the APIs developed for events, dimensions and datasets.
Ingestion of Events using API
Step 1: Open the specified request & add the details
API Endpoint: <domain_name>/ingestion/event
HTTP Method: POST
...
This API is used to write events into the CSV file and upload it into the combined_input folder in the cloud if there are no errors. The adapter then uses these files to break the combined input down into multiple input files. Those files are later used by the NiFi processor to ingest the data into the database. The API can be used to add individual events to the CSV.
Step 2: Build the request body with reference to the YAML file. The request body for the above API is defined in the YAML spec: https://github.com/Sunbird-cQube/spec-ms/blob/march-release/spec.yaml
Provide the valid input details for the Parameters shown below. The request body should conform to the schema stored in the database for the particular event name.
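A request body for POST <domain_name>/ingestion/event might look like the sketch below. The event name and field names here are illustrative assumptions; the real body must match the schema stored in the database for that event, as defined in the spec YAML.

```python
# Hedged sketch of an event-ingestion request body. The event name and the
# record fields are ILLUSTRATIVE ASSUMPTIONS -- the actual body must conform
# to the schema stored in the database for the particular event name.
import json

event_request = {
    "event_name": "student_attendance",  # assumed example name
    "event": [
        {"date": "02/01/23", "school_id": "SCH001", "attendance": 45},
    ],
}

# Serialized payload to send with Content-Type: application/json.
payload = json.dumps(event_request)
```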
...
Step 3: Click the Send button for the request; if the request is successful, the user should see a response message. Please refer to the screenshot below.
After successful execution of the event API we receive a response, and the data sent in the request body is written to the CSV file. Invalid records are written to a separate error CSV file, while valid data is written to the main CSV file.
...
Ingestion of Dimensions using API
Step 1: Open the specified request & add the details
API Endpoint: <domain_name>/ingestion/dimension
HTTP Method: POST
...
This API is used to write dimensions into the CSV file and upload it into the combined_input folder in the cloud if there are no errors. The adapter then uses these files to break the combined input down into multiple input files. Those files are later used by the NiFi processor to ingest the data into the database. This API can be used to add individual dimensions to the CSV.
Step 2: Build the request body with reference to the YAML file. The request body for the above API is defined in the YAML spec: https://github.com/Sunbird-cQube/spec-ms/blob/march-release/spec.yaml. Provide the valid input details for the parameters shown below. The request body should conform to the schema stored in the database for the particular dimension name.
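Since the body must conform to the stored schema, a lightweight pre-flight check can catch missing columns before the POST to <domain_name>/ingestion/dimension. The dimension name and required columns below are illustrative assumptions, not the actual stored schema.

```python
# Hedged sketch: check dimension records for required columns locally before
# sending, so schema mismatches fail fast. The dimension name and the column
# set are ILLUSTRATIVE ASSUMPTIONS -- the real schema lives in the database.
REQUIRED_COLUMNS = {"district_id", "district_name"}


def missing_columns(record: dict) -> set:
    """Return the required columns absent from a single dimension record."""
    return REQUIRED_COLUMNS - record.keys()


dimension_request = {
    "dimension_name": "district",  # assumed example name
    "dimension": [{"district_id": "101", "district_name": "Chennai"}],
}

problems = [m for r in dimension_request["dimension"] if (m := missing_columns(r))]
# An empty `problems` list means every record carries the required columns.
```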
...
Step 3: Click the Send button for the request; if the request is successful, the user should see a response message. Please refer to the screenshot below.
After successful execution of the dimension API we receive a response, and the data sent in the request body is written to the CSV file. Invalid records are written to a separate error CSV file, while valid data is written to the main CSV file.
...
Ingestion of Datasets using API
Step 1: Open the specified request & add the details
API Endpoint: <domain_name>/ingestion/dataset
HTTP Method: POST
...
This API is used to write datasets into the CSV file and upload it into the combined_input folder in the cloud if there are no errors. The adapter then uses these files to break the combined input down into multiple input files. Those files are later used by the NiFi processor to ingest the data into the database. This API can be used to add individual datasets to the CSV.
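For symmetry with the event and dimension requests, a dataset-ingestion body for POST <domain_name>/ingestion/dataset might be shaped as below. The dataset name and record fields are illustrative assumptions; the real body must conform to the dataset's stored schema.

```python
# Hedged sketch of a dataset-ingestion request body. The dataset name and the
# record fields are ILLUSTRATIVE ASSUMPTIONS -- the actual body must conform
# to the schema stored in the database for the particular dataset name.
dataset_request = {
    "dataset_name": "attendance_by_district",  # assumed example name
    "dataset": [
        {"district_id": "101", "date": "02/01/23", "attendance_pct": 87.5},
    ],
}
# This dict would be sent as the JSON body of the POST request.
```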
Things to take care of while ingesting Data / Debugging:
The date format should be correct. The accepted date format is DD/MM/YY
Make sure the data that you are trying to upload into the system passes all the foreign key constraints.
Make the necessary changes in the script related to the file name and folder name.
Don’t try to re-upload the same data: re-uploading appends the new data rather than updating it, so each data file should be uploaded only once.
When uploading data into the system, keep the connection to the server alive by keeping focus on the terminal; if the connection to the server breaks, the data ingestion stops. Alternatively, use the screen utility on the server for uninterrupted data ingestion.
If you have large data files and want to split them by month, you can use this script to break up the data files. Make the necessary changes in the script related to the file name and folder name.
The schema name stored in the table should be the same as the name used in the ingestion request.
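The date-format rule in the checklist above can be enforced locally before uploading. A minimal sketch, assuming only the stated DD/MM/YY format:

```python
# Hedged helper: validate the DD/MM/YY date format the ingestion pipeline
# expects, so malformed rows can be caught before upload.
from datetime import datetime


def is_valid_ingestion_date(value: str) -> bool:
    """Return True if `value` parses as a real date in DD/MM/YY format."""
    try:
        datetime.strptime(value, "%d/%m/%y")
        return True
    except ValueError:
        return False


# "02/01/23" is accepted; an ISO date or an impossible day is rejected.
```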
Error Monitoring
The error file stores all the error records produced during the ingestion process and is uploaded to the appropriate cloud storage. The user can log in to the respective cloud storage and download the file to inspect the error records in the CSV. The steps below specify how to access the error file.
...
Step 2: Send the query parameters with reference to the YAML file. The query parameters for the above API are defined in the YAML spec: https://github.com/Sunbird-cQube/spec-ms/blob/dev/spec.yaml
Provide the valid input details for the parameters shown below.
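Building the query string for this GET request can be sketched as below. The endpoint path and both parameter names (`ingestion_type`, `filename`) are assumptions for illustration; take the exact names from the spec YAML.

```python
# Hedged sketch: assemble query parameters for the error-file GET request.
# The path and parameter names are ASSUMPTIONS -- verify in the spec YAML.
from urllib.parse import urlencode

params = {
    "ingestion_type": "event",                    # assumed parameter name
    "filename": "student_attendance_errors.csv",  # assumed parameter name
}
error_file_url = f"https://<domain_name>/ingestion/file-status?{urlencode(params)}"
# A real check would GET this URL and inspect the returned error records.
```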
...
Step 2: Build the request body with reference to the YAML file. The request body for the above API is defined in the YAML spec: https://github.com/Sunbird-cQube/spec-ms/blob/march-release/spec.yaml. Provide the valid input details for the parameters shown below.
...
Step 3: Click the Send button for the request; if the request is successful, the user should see the response message "File uploaded successfully". The files will be uploaded to the emission folder created in the respective cloud storage.
...