Introduction
This wiki details how to surface the usage details of the assets/contents that are contributed on the sourcing platform and published to the consumption platform.
Background & Problem statement
Currently, the Sourcing platform doesn't know the usage details of the assets/contents published for consumption. Because of this, contributors have no visibility into how many of their contributed assets are being used, or how much they are used.
Key Design Problems:
Surface the usage details of the assets/contents published from the sourcing platform to the consumption platform.
Proposed Design:
Approach 1: Enhance existing Update Content Rating data product:
The existing Update Content Rating data product fetches the content consumption metrics and writes them to Elasticsearch. It can be enhanced to filter the VidyaDaan-specific (Sourcing) contents and make an additional call to write those contents' metrics to the dock Elasticsearch.
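As a rough illustration, the added step could look like the sketch below, assuming the Elasticsearch 7.x high-level REST client; the dock host, the dock-content-metrics index name, and the ContentMetric fields are hypothetical placeholders rather than the data product's actual schema.

```scala
import org.apache.http.HttpHost
import org.elasticsearch.action.index.IndexRequest
import org.elasticsearch.client.{RequestOptions, RestClient, RestHighLevelClient}
import org.elasticsearch.common.xcontent.XContentType

// Illustrative shape of one content's consumption metrics; not the data
// product's actual schema.
case class ContentMetric(identifier: String, originId: Option[String],
                         totalTimeSpent: Long, totalPlaySessions: Long)

object DockMetricsWriter {

  // Hypothetical dock Elasticsearch host and index name.
  private val dockClient = new RestHighLevelClient(
    RestClient.builder(new HttpHost("dock-es-host", 9200, "http")))

  // Sourcing (VidyaDaan) contents are assumed to carry an origin id.
  private def isSourcingContent(m: ContentMetric): Boolean = m.originId.isDefined

  // Extra step added to the data product: index the sourcing contents'
  // metrics into the dock Elasticsearch alongside the existing ES write.
  def writeToDock(metrics: Seq[ContentMetric]): Unit =
    metrics.filter(isSourcingContent).foreach { m =>
      val doc =
        s"""{"identifier":"${m.identifier}","originId":"${m.originId.get}",""" +
          s""""totalTimeSpent":${m.totalTimeSpent},"totalPlaySessions":${m.totalPlaySessions}}"""
      val request = new IndexRequest("dock-content-metrics")
        .id(m.identifier)
        .source(doc, XContentType.JSON)
      dockClient.index(request, RequestOptions.DEFAULT)
    }
}
```

In practice this write would sit next to the existing Elasticsearch write inside the data product, reusing the same batch of computed metrics.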
Approach 2: Using the transaction events and processing them:
Currently, the transaction event generated as part of each Neo4j transaction is pushed into the learning.graph.event Kafka topic by Logstash. We can write a Flink job at the consumption layer (sketched below) which will:
1. Read the learning.graph.event Kafka topic of the consumption layer.
2. Search the object in the consumption layer and find its origin id.
3. If the origin id exists in the sourcing database, update the metric-specific data on the corresponding sourcing database object.
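A minimal sketch of such a Flink job, assuming the standard Flink Kafka connector; the broker address, the consumer group, and the parseEvent, lookupOriginId and updateSourcingMetrics helpers are hypothetical placeholders standing in for the real event parsing, the consumption-layer search call, and the sourcing-database update.

```scala
import java.util.Properties

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer

object SourcingUsageMetricsJob {

  // Illustrative shape of the parsed graph transaction event.
  case class GraphEvent(identifier: String, metrics: Map[String, Long])

  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    val props = new Properties()
    props.setProperty("bootstrap.servers", "kafka-broker:9092") // assumed broker
    props.setProperty("group.id", "sourcing-usage-metrics")     // assumed consumer group

    // 1. Read the learning.graph.event topic of the consumption layer.
    val events: DataStream[String] = env.addSource(
      new FlinkKafkaConsumer[String]("learning.graph.event", new SimpleStringSchema(), props))

    events
      .map(json => parseEvent(json))
      .filter(_.identifier.nonEmpty)
      .map { event =>
        // 2. Search the object in the consumption layer and find its origin id.
        lookupOriginId(event.identifier) match {
          // 3. If the origin id exists in the sourcing database, update the
          //    metric-specific data on the sourcing object.
          case Some(originId) => updateSourcingMetrics(originId, event); s"updated $originId"
          case None           => s"skipped ${event.identifier}"
        }
      }
      .print()

    env.execute("sourcing-usage-metrics-job")
  }

  // Placeholder helpers; the real job would parse the transaction event,
  // call the consumption-layer search API, and write to the sourcing database.
  def parseEvent(json: String): GraphEvent = GraphEvent("", Map.empty)
  def lookupOriginId(identifier: String): Option[String] = None
  def updateSourcingMetrics(originId: String, event: GraphEvent): Unit = ()
}
```

The lookup and update steps call external services, so a production job would typically use Flink's async I/O (or batch the writes) so that these calls do not block the stream.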