...
Asset Model: The asset model will be enhanced to support interception points in content. We will start with videos, but this work will be done keeping in mind that, in the future, we will extend this concept to support interactive content of any kind. We could even extend the concept to more than two assets: for example, a PDF intercepted with questions, or a Question Set intercepted with videos, and so on. A child could read a few pages in a document (PDF), watch a quick video to visually understand the concept, and quickly practice what she learnt, all in sequence.
Creator: A new content creation UI will be created to enable any user of Sunbird to create interactive videos. Creators will pick any pre-existing video and add any existing question set at any timestamp in the video. A possibility here is to also allow new question sets to be created within this flow.
Player: We do not need a new consumption experience for interactive videos. The current video player can be enhanced to support the interactive video content model.
Telemetry and Analysis: Detailed interaction events will be captured when an interactive video is consumed. This will include, but is not limited to, video playback quality, scrubbing behaviour (pause, rewind, and fast forward), question events (answering, skipping, etc.), and score-related data.
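To make the asset model and the player behaviour above concrete, here is a minimal sketch of what the content model for an interactive video could look like, together with the check a player might run on each time update to decide when to pause playback and show the intercepting asset. All names here (InterceptionPoint, InteractiveVideo, dueInterception, and their fields) are illustrative assumptions, not the final Sunbird schema.

```typescript
// Illustrative sketch only: field and type names are assumptions, not the
// final Sunbird content schema.

/** A point in the video where another asset (e.g. a question set) is shown. */
interface InterceptionPoint {
  timestampSec: number;      // playback position (seconds) at which to pause
  assetId: string;           // identifier of the intercepting asset
  assetType: 'QuestionSet';  // only question sets are planned initially
}

/** An interactive video: a base video plus its interception points. */
interface InteractiveVideo {
  videoId: string;
  durationSec: number;
  interceptionPoints: InterceptionPoint[];
}

/**
 * Returns the interception point (if any) crossed between the previous and
 * current playback positions, so the player can pause and render its asset.
 */
function dueInterception(
  content: InteractiveVideo,
  previousSec: number,
  currentSec: number
): InterceptionPoint | undefined {
  return content.interceptionPoints.find(
    (p) => p.timestampSec > previousSec && p.timestampSec <= currentSec
  );
}
```

Modelling interception points as references to existing assets, rather than embedding them, is what would later let the same idea apply to a PDF or a Question Set intercepted with videos.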
...
Srl. No. | User Action | Expected Result |
---|---|---|
1 | User navigates to available content | |
2 | User clicks on interactive video | |
3 | Question set shows up on interactive video | User can answer, skip, or re-attempt questions in the question set depending on the properties of the question. Feedback is provided to the user on their performance |
4 | User finishes interactive video | Feedback is provided to the user on their score |
...
Creator Dashboard
We should also have a “Creator Dashboard” section where creators can see metrics for their interactive videos and generate insights about what to improve.
Srl. No. | User Action | Expected Result |
---|---|---|
1 | User navigates to their Dashboard | Relevant metrics are available for interactive videos created by them including (but not limited to): engagement with questions, number of video watchers, scores, etc. |
Exception Scenarios
Wireframes
For Future Release
In future releases, we will aim to enhance all types of content with interactivity
...
as outlined in the Proposal section above. So instead of just an “interactive video”, we will have “interactive content”, which will be a powerful learning tool for the user.
We will allow users to add interactivity to a video during the upload process itself.
We will allow different question types: Match the Following, Multiple Answer Questions, and so on.
Localization Requirements
...
We will collect click-level interaction events when an interactive video is played. Some basic events are listed here, but this list will grow as implementation progresses.
Event Name | Description | Purpose |
---|---|---|
User attempts a question | User clicks/attempts to answer, for example, clicking a choice in an MCQ | To capture user behaviour while answering questions |
User submits their response to a question | User clicks on the submission button to store their answer | To capture user behaviour while answering questions |
User navigates in the video | User pauses, rewinds, or forwards in a video | To capture engagement of different segments in the video |
User finishes the interactive video | User reaches the final timestamp of the video | To capture percentage of users finishing the interactive video |
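As an illustration of the data these events could carry, the sketch below shows hypothetical payloads for two of the events above. Actual events will follow the existing Sunbird telemetry format; the event names and fields here are assumptions used only to indicate what we intend to capture.

```typescript
// Hypothetical event payloads; names and fields are illustrative assumptions.

/** Emitted when the user attempts a question at an interception point. */
interface QuestionAttemptEvent {
  eventName: 'QUESTION_ATTEMPT';
  contentId: string;          // id of the interactive video
  questionSetId: string;      // id of the intercepting question set
  questionId: string;         // id of the attempted question
  videoTimestampSec: number;  // playback position of the interception point
  optionSelected?: string;    // e.g. the MCQ choice that was clicked
  occurredAt: string;         // ISO-8601 time of the interaction
}

/** Emitted when the user pauses, rewinds, or fast-forwards the video. */
interface VideoNavigationEvent {
  eventName: 'VIDEO_NAVIGATION';
  contentId: string;
  action: 'pause' | 'rewind' | 'fast-forward';
  fromSec: number;            // playback position before the action
  toSec: number;              // playback position after the action
  occurredAt: string;
}
```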
Impact on other Products/Solutions
Product/Solution Impacted | Impact Description |
---|---|
Interactive Videos will emit all telemetry that is already emitted by the included components, i.e. the video player and the QuML player. In addition, some custom events specific to the behaviour of interactive videos will be defined.
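One way to realise this is a thin wrapper that relays whatever the embedded video and QuML players already emit and adds the interactive-video-specific events on top. The sketch below only illustrates that composition; the emitter API shown is an assumption, not the actual player interface.

```typescript
// Sketch of telemetry composition; the API shown here is an assumption.
type TelemetryEvent = { eid: string; [key: string]: unknown };

class InteractiveVideoTelemetry {
  constructor(private sink: (event: TelemetryEvent) => void) {}

  /** Forward events from the embedded video player or QuML player unchanged. */
  relay(event: TelemetryEvent): void {
    this.sink(event);
  }

  /** Emit a custom event specific to interactive video behaviour. */
  emitCustom(name: string, data: Record<string, unknown>): void {
    this.sink({ ...data, eid: name });
  }
}
```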
Key Metrics
The main metric to track here will be engagement with the interactive video. The goal will be to understand which interactive videos are working well, which are not, and, in general, what makes for a good interactive video. Some of the metrics we can use for this purpose are listed below:
Srl. No. | Metric | Purpose of Metric |
---|---|---|
1 | Video watch duration | To know what percentage of the video is watched by a user, and on aggregate what percentage of users have watched different portions of the video |
2 | Number of questions answered | |
3 | Score | On an aggregate level, this will help us understand users' performance, and also the level of the questions in the video |
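As a sketch of how metric 1 (video watch duration) could be computed from playback telemetry, the function below merges the intervals a user actually played (reconstructed from play/pause/seek events) and reports the percentage of the video covered, so re-watched segments are not double-counted. The names and the event-to-interval reconstruction are assumptions for illustration.

```typescript
// Illustrative sketch; interval reconstruction from telemetry is assumed done.
interface PlayedInterval {
  startSec: number;
  endSec: number;
}

/** Percentage of the video covered by the user's played intervals. */
function watchPercentage(intervals: PlayedInterval[], durationSec: number): number {
  const sorted = [...intervals].sort((a, b) => a.startSec - b.startSec);
  let watched = 0;
  let coveredUpTo = 0; // everything before this position is already counted
  for (const { startSec, endSec } of sorted) {
    const start = Math.max(startSec, coveredUpTo);
    if (endSec > start) {
      watched += endSec - start;
      coveredUpTo = endSec;
    }
  }
  return durationSec > 0 ? (100 * watched) / durationSec : 0;
}
```

For example, a user who played 0-120 s and 300-360 s of a 600-second video would register a watch duration of 30%.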
Using these metrics, we want to understand which interactive videos are working well, which are not, and, in general, what makes for a good interactive video. For example:
1. Can we measure the sweet spot for interception points in terms of driving engagement? That is, should I put a question in the first 1 minute, after 3 minutes, or after 5 minutes?
2. How long should my video be: 3-5 minutes or 8-10 minutes? (Depending on subject and age group / class.)
For example, NROER / NCERT has many videos that are 45 minutes long (recorded lectures). I want to know how many pieces I should split them into to get to the ideal length (available from 2 above) and where to put the questions (available from 1 above).