Discussion middleware session
Introduction
This document describes the design of the discussion middleware (DMW) session.
https://project-sunbird.atlassian.net/browse/SB-22319
Problem Statement
Store the required NodeBB information and make it available whenever it is needed.
Key design solutions
Identify the information required by NodeBB.
Store that information in the DMW cache.
Limitations of the existing approach
The entire DMW depends on the portal session. If any issue occurs in the portal session, the discussion forum is also affected. For this reason, we are implementing a cache in the DMW itself.
Solution
Initially we stored some of the discussion middleware data in the portal session, which made the DMW dependent on the portal session. To make the DMW independent, we are implementing a cache within the DMW.
Architecture Diagram
Current architecture diagram Link
Proposed Changes:
Redis:
To pass data from the DMW to the new notification service, we planned to use Kafka; Redis Streams or Redis Pub/Sub could be used in the same way.
Redis Pub/Sub:
In the DMW we can publish an event that the notification service listens for, and vice versa.
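As an illustration, a minimal Pub/Sub sketch using the ioredis client; the channel name and event payload here are assumptions for illustration, not final names:

import Redis from "ioredis";

// Separate connections are required: a subscribed ioredis connection
// cannot issue other commands.
const pub = new Redis();
const sub = new Redis();

async function main() {
  // Notification service side: listen for events published by the DMW.
  await sub.subscribe("dmw:notifications"); // channel name is an assumption
  sub.on("message", (_channel, message) => {
    const event = JSON.parse(message);
    console.log("Received event:", event);
  });

  // DMW side: publish an event.
  await pub.publish(
    "dmw:notifications",
    JSON.stringify({ service: "discussion-forum", type: "topic-created" })
  );
}

main().catch(console.error);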
Redis Stream:
We need to create a Redis stream in the DMW and add the data to the stream. On the other side, the notification service will receive the data from the stream.
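A minimal sketch of the stream variant, again with ioredis; the stream name and field layout are assumptions:

import Redis from "ioredis";

// Separate connections: a blocking XREAD would otherwise stall the producer.
const producer = new Redis();
const consumer = new Redis();

// DMW side: append an event to the stream ("*" lets Redis assign the entry id).
async function publishToStream(event: object) {
  await producer.xadd(
    "dmw:notification-stream", // stream name is an assumption
    "*",
    "payload",
    JSON.stringify(event)
  );
}

// Notification service side: block until new entries arrive, then process them.
async function consumeStream() {
  let lastId = "$"; // only read entries added after we connect
  for (;;) {
    const result = await consumer.xread(
      "BLOCK", 0, "STREAMS", "dmw:notification-stream", lastId
    );
    if (!result) continue;
    for (const [, entries] of result) {
      for (const [id, fields] of entries) {
        const payload = JSON.parse(fields[1]); // fields = ["payload", "<json>"]
        console.log("Processing", id, payload);
        lastId = id;
      }
    }
  }
}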
What data are we storing in the DMW cache?
We are storing two main objects in the DMW cache.
| Keys | Data |
|---|---|
| User | {"userId": { "uid": "2" }, "userSlug": "content_creator_tn3941", "userName": "content_creator_tn3941" } |
| Context | [{ "sbIdentifier": "c4320769-13cd-43a9-bc73", "sbType": "group", "cid": "150" }] |
We are storing the DMW cache data in a Redis database.
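As a sketch, the two objects could be written to Redis as JSON strings; the user:/context: key prefixes and the TTL below are hypothetical choices for illustration, not decided naming:

import Redis from "ioredis";

const redis = new Redis();

// Hypothetical TTL; the real expiry is a DMW design decision.
const TTL_SECONDS = 24 * 60 * 60;

async function cacheUser(user: {
  sbUserIdentifier: string; uid: string; userSlug: string; userName: string;
}) {
  // Keyed by sbUserIdentifier so lookups match the access pattern below.
  await redis.set(`user:${user.sbUserIdentifier}`, JSON.stringify(user), "EX", TTL_SECONDS);
}

async function cacheContext(context: {
  cid: string; sbIdentifier: string; sbType: string;
}) {
  // Keyed by cid (category id).
  await redis.set(`context:${context.cid}`, JSON.stringify(context), "EX", TTL_SECONDS);
}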
Discussion Middleware complete session object
Schema
| Key | Type | Cache key | Schema |
|---|---|---|---|
| context | Object | cid | { "cid": String, "sbIdentifier": String, "sbType": String } |
| user | Object | sbUserIdentifier | { "sbUserIdentifier": String, "uid": String, "userSlug": String, "userName": String } |
1. How to access user information from the DMW cache
We can access the user data from the cache using sbUserIdentifier as the key, and it will return the data below.
{
  "uid": "2",
  "userSlug": "content_creator_tn3941",
  "userName": "content_creator_tn3941",
  "sbUserIdentifier": "dfer23-dsds332-33drt-td899w"
}
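A sketch of this lookup with ioredis, assuming the hypothetical user: key prefix from the earlier sketch:

import Redis from "ioredis";

const redis = new Redis();

async function getUserFromCache(sbUserIdentifier: string) {
  const raw = await redis.get(`user:${sbUserIdentifier}`); // "user:" prefix is an assumption
  return raw ? JSON.parse(raw) : null; // null when the entry is missing or has expired
}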
2. How to access context information from the DMW cache
We can access the context data from the cache using cid as the key, and it will return the data below.
{
  "sbIdentifier": "c4320769-13cd-43a9-bc73",
  "sbType": "group",
  "cid": "150"
}
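The context lookup is symmetric; this sketch reuses the redis client from the previous snippet and assumes a context: key prefix:

async function getContextFromCache(cid: string) {
  const raw = await redis.get(`context:${cid}`); // "context:" prefix is an assumption
  return raw ? JSON.parse(raw) : null;
}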
Different ways of triggering the notification service:
Type1:
If the parent service has the actual notification data along with the action data, it has to send the format below. The Flink job will take this object and process it.
{
  service: "discussion-forum", // Service name
  headers: { // Request traceability
    sid: "X-Session-id", // From request headers
    traceID: "X-Request-id" // From request headers
  },
  notificationData: { // Actual notification request payload
    ids: ["sunbird-user-identifiers"], // Sunbird identifiers of the users to notify
    createdBy: { // The user performing the action
      id: "sunbird-user-identifier",
      name: "sunbird-username",
      type: "User"
    },
    action: {
      category: "discussion-forum", // Service name
      type: "", // Action type
      template: {
        type: "JSON",
        params: { // The notification service forms the notification message from these params
          param1: "",
          param2: "",
          param3: ""
        }
      }
    }
  },
  additionalInfo: { // Specific to the service and common to all APIs of the service. The Flink job will parse this object and construct the request specific to the notification API
    context: { // Mainly used to know in what context this notification was created
      cid: "", // Category id
      sbIdentifier: "", // Sunbird Group/Course/Batch identifier
      sbType: "" // Type: Group/Course/Batch etc.
    },
    // Derive the fields below from the response.
    // The Flink job has to call APIs to get the names for these (cid, tid, pid etc.).
    category: {
      cid: "", // Category under which this notification was created
      title: "", // Category title
      pid: "" // Parent category id of cid
    },
    topic: {
      tid: "", // Topic id
      title: "" // Topic title
    },
    post: {
      pid: "", // Post id
      title: "" // Post title
    }
  }
}
Example:
Publish the above data to a Kafka topic.
The Flink job is triggered and receives the Kafka topic data (the notification object).
Using the notification object, the Flink job triggers the notification service.
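A sketch of the publish step using the kafkajs client; the broker address and topic name are assumptions for illustration:

import { Kafka } from "kafkajs";

const kafka = new Kafka({
  clientId: "discussion-middleware",
  brokers: ["localhost:9092"], // broker address is an assumption
});
const producer = kafka.producer();

// notificationEvent is the Type1 object shown above.
async function publishNotificationEvent(notificationEvent: object) {
  await producer.connect();
  await producer.send({
    topic: "notification-events", // topic name is an assumption
    messages: [{ value: JSON.stringify(notificationEvent) }],
  });
  await producer.disconnect();
}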
Type2:
Upstream services will use the Type2 structure when they do not have the template information. The Flink job will process the data, generate the notification API structure, and then trigger the notification service. The Flink job may call APIs, or the information should be provided in additionalInfo.
Schema Request:
Examples:
Step 1: The group service will push the event in the format below.
Group Activity Added Notification Use Case:
Step 2: The Flink job will receive the data from Kafka, and the group processor job will pick it up based on the value of the service field.
Step 3: The processor job creates the necessary template for the group operation action.type, builds a notification API request object, and calls the notification service API for processing.
Clarification needed
When the DMW cache has expired but the user tries to perform an action, we need to show a message. We also need the message string: @Manimegalai Mohan.