...

This document describes how the product workflow has been designed around the use cases of end users. It explains the flow of the application, starting from ingesting the events, dimensions, and datasets, through to how the data is uploaded to cloud storage.

Purpose: The purpose of this document is to explain how to upload data to cloud storage using the ingestion APIs, so that the data can be processed and accessed in the end datasets.

Step-wise Ingestion process

...

  • The date format must be correct. The accepted date format is DD/MM/YY.

  • Make sure the data you are trying to upload passes all foreign key constraints.

  • Make the necessary changes in the script for the file name and folder name.

  • Do not re-upload the same data: ingestion appends new records rather than updating existing ones, so each data file should be uploaded only once.

  • While uploading data, keep the connection to the server alive by keeping focus on the terminal; if the connection breaks, data ingestion stops. Alternatively, use the screen utility on the server for uninterrupted ingestion.
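The pre-upload checks above (date format and foreign key constraints) can be sketched in a short validation script. This is only an illustration: the column names `event_date` and `dataset_id`, and the `known_dataset_ids` reference set, are assumptions and should be replaced with the actual schema.

```python
from datetime import datetime

def validate_rows(rows, known_dataset_ids):
    """Split rows into (valid, errors) using the ingestion guidelines:
    dates must be DD/MM/YY, and dataset_id must reference a known dataset.
    Column names here are hypothetical examples, not the real schema."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        # 1. Date must be in DD/MM/YY format.
        try:
            datetime.strptime(row["event_date"], "%d/%m/%y")
        except ValueError:
            errors.append((i, "bad date format, expected DD/MM/YY"))
            continue
        # 2. Foreign key must reference an existing dataset.
        if row["dataset_id"] not in known_dataset_ids:
            errors.append((i, "unknown dataset_id (foreign key violation)"))
            continue
        valid.append(row)
    return valid, errors

rows = [
    {"event_date": "05/08/23", "dataset_id": "ds1"},
    {"event_date": "2023-08-05", "dataset_id": "ds1"},  # wrong date format
    {"event_date": "06/08/23", "dataset_id": "ds9"},    # missing foreign key
]
valid, errors = validate_rows(rows, known_dataset_ids={"ds1", "ds2"})
```

Running checks like these locally before uploading avoids polluting the error file with records that could have been caught up front.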

...

  • The schema name stored in the table should be the same.

Error Monitoring

The error file stores all records that failed during the ingestion process and is uploaded to the appropriate cloud storage. Log in to the respective cloud storage and download the file to review the error records in the CSV. The steps below explain how to access the error file.

...
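Once the error file has been downloaded, a short script can summarize the most common failure reasons. This is a minimal sketch: the error file's layout (columns `row_number` and `error_reason`) is an assumption for illustration and should be adjusted to the actual CSV header.

```python
import csv
from io import StringIO
from collections import Counter

# Stand-in for the downloaded error file; the column names below
# are hypothetical and may differ from the real error CSV.
error_csv = StringIO(
    "row_number,error_reason\n"
    "12,bad date format\n"
    "40,foreign key violation\n"
    "77,bad date format\n"
)

reader = csv.DictReader(error_csv)
reasons = Counter(r["error_reason"] for r in reader)

# Print failure reasons, most frequent first.
for reason, count in reasons.most_common():
    print(f"{count:>4}  {reason}")
```

Grouping errors by reason this way makes it easier to decide whether to fix the source data (e.g. date formats) or the reference tables (e.g. missing foreign keys) before re-attempting ingestion.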