Test Prep Creation
Test Prep Consumption
Print Practice Questions
How print is initiated from consumption clients is tracked in the linked Jira issue.
Report card
When students play questions, they would like a summary and a detailed report of their performance. The summary should include the score and the time taken. The detailed report should show, question by question, which ones were answered correctly or incorrectly, and also provide a link to each solution.
This report should be available at the end of the Practice Set content.
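The report card described above can be sketched as follows. This is a minimal illustration, assuming a hypothetical per-question response record (the field names here are not the actual schema):

```python
# Sketch of a report-card builder. The response record shape below is an
# assumption for illustration, not the product's real data model.

def build_report_card(responses):
    """Return a summary (score, time taken) plus a question-by-question detail."""
    summary = {
        "score": sum(r["score"] for r in responses),
        "max_score": sum(r["max_score"] for r in responses),
        "time_taken_sec": sum(r["time_sec"] for r in responses),
    }
    detail = [
        {
            "question_id": r["question_id"],
            "correct": r["score"] == r["max_score"],
            "solution_url": r.get("solution_url"),  # link to solution, if any
        }
        for r in responses
    ]
    return {"summary": summary, "detail": detail}

responses = [
    {"question_id": "q1", "score": 1, "max_score": 1, "time_sec": 30,
     "solution_url": "https://example.org/solutions/q1"},
    {"question_id": "q2", "score": 0, "max_score": 1, "time_sec": 45},
]
report = build_report_card(responses)
print(report["summary"])  # {'score': 1, 'max_score': 2, 'time_taken_sec': 75}
```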
Test Prep Data Driven Decision Making (Reports & Dashboards)
Question Analysis
Questions are tagged to various categories of the framework; for example, in the case of K-12 school education, categories such as Board, Medium, Class, Subject, Topic, and Learning Outcome. It is important to denormalise these details for each question. By providing the following analysis for each question, we can enable administrators to analyse learner performance question by question.
It is also important to maintain a roll-up of Questions to Content to Textbook, in order to filter by textbooks specific to a program.
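A denormalised question record and the roll-up filter might look like the sketch below. The field names are assumptions for illustration, not the actual framework schema:

```python
# Illustrative denormalised question record: framework categories are copied
# onto each question, and the question -> content -> textbook roll-up is kept
# as plain fields. All names here are hypothetical.
question = {
    "question_id": "q_001",
    "board": "CBSE",
    "medium": "English",
    "class": "10",
    "subject": "Mathematics",
    "topic": "Quadratic Equations",
    "learning_outcome": "Solve quadratic equations by factorisation",
    # Roll-up for filtering by programme-specific textbooks:
    "content_id": "practice_set_42",
    "textbook_id": "tb_math_10",
}

def questions_for_textbook(questions, textbook_id):
    """Filter questions via the question -> content -> textbook roll-up."""
    return [q for q in questions if q["textbook_id"] == textbook_id]

print(len(questions_for_textbook([question], "tb_math_10")))  # 1
```

Because the categories are denormalised onto each question, a report can group or filter on any of them without joining back to the framework.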
For each question, we should have
Question Summary
In simpler words: how many people saw the question? How many attempted it? How much time did they spend on it? More details below:
Number of times question was seen
Number of times question was attempted
Time spent on the question
This will give the overall gross number of times the question was seen / attempted. Note that a new session ID is not generated each time a piece of content is played.
Number of sessions in which question was seen
Number of sessions in which question was attempted
Time spent on the question
This will help in getting the average number of times a question is played within an app session.
Number of people (unique users) who have seen the question
Number of people (unique users) who have attempted the question
Time spent on the question
This will help in finding the average number of times a user opens / attempts a question, and whether there are any repeat users.
Number of devices opening the question
Number of devices attempting the question
Time spent on the question
This will help in finding whether devices are shared or used by a single user.
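The four grains above (raw counts, sessions, unique users, devices) can all be derived from the same event stream. Here is a minimal sketch, assuming a hypothetical flat telemetry event shape (the event types and field names are illustrative, not the real telemetry spec):

```python
# Sketch of the question summary at four grains. Events below are invented
# for illustration; real telemetry events would carry equivalent fields.
events = [
    {"question_id": "q1", "type": "seen",    "session": "s1", "user": "u1", "device": "d1", "time_sec": 10},
    {"question_id": "q1", "type": "attempt", "session": "s1", "user": "u1", "device": "d1", "time_sec": 20},
    {"question_id": "q1", "type": "seen",    "session": "s2", "user": "u2", "device": "d1", "time_sec": 15},
]

def question_summary(events, question_id):
    ev = [e for e in events if e["question_id"] == question_id]
    seen = [e for e in ev if e["type"] == "seen"]
    attempted = [e for e in ev if e["type"] == "attempt"]
    return {
        # Gross counts (a session can replay the same question):
        "times_seen": len(seen),
        "times_attempted": len(attempted),
        # Session-level distinct counts:
        "sessions_seen": len({e["session"] for e in seen}),
        "sessions_attempted": len({e["session"] for e in attempted}),
        # Unique users:
        "users_seen": len({e["user"] for e in seen}),
        "users_attempted": len({e["user"] for e in attempted}),
        # Devices (shared vs single-user):
        "devices_seen": len({e["device"] for e in seen}),
        "devices_attempted": len({e["device"] for e in attempted}),
        "time_spent_sec": sum(e["time_sec"] for e in ev),
    }

print(question_summary(events, "q1"))
```

Comparing the grains answers the questions in the text: times_seen / sessions_seen gives replays per session, users_seen / devices_seen hints at device sharing.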
Answer / Solution Summary
In simpler words: how many people saw the solution? How many played the video in the solution (if available)? How much time did they spend on the Answer altogether? How many people saw questions with a video solution but did / did not view the video?
Number of times / sessions / people / devices seeing or navigating to the Answer (this applies to MCQ and Subjective questions, as these are the only question types in QuML)
Number of times / sessions / people / devices seeing an Answer which has a Video
Number of times / sessions / people / devices navigating to or watching the Video
Time spent on Answer + Time spent on Video (if available)
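The answer / solution counts above can be sketched at the user grain as follows. The per-user view records are a hypothetical shape for illustration:

```python
# Sketch of the answer / solution summary. The view records below are
# invented; real data would come from telemetry on the Answer screen.
answer_views = [
    {"user": "u1", "has_video": True,  "watched_video": True,  "answer_sec": 20, "video_sec": 60},
    {"user": "u2", "has_video": True,  "watched_video": False, "answer_sec": 10, "video_sec": 0},
    {"user": "u3", "has_video": False, "watched_video": False, "answer_sec": 5,  "video_sec": 0},
]

def solution_summary(views):
    return {
        "users_seen_answer": len(views),
        "users_seen_answer_with_video": sum(v["has_video"] for v in views),
        "users_watched_video": sum(v["watched_video"] for v in views),
        # Had a video solution available but did not view it:
        "users_skipped_video": sum(v["has_video"] and not v["watched_video"] for v in views),
        # Time spent on Answer + time spent on Video (if available):
        "time_spent_sec": sum(v["answer_sec"] + v["video_sec"] for v in views),
    }

print(solution_summary(answer_views))
```

The same aggregation repeated over times, sessions, and devices gives the other grains listed above.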
Performance / Response Summary
In simpler words: how many people saw the question? How many attempted it? How many got it correct? How many got it incorrect? What are the most common incorrect responses? What is the time taken by those who answered correctly versus those who answered incorrectly?
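These performance questions reduce to a split of attempts into correct and incorrect, plus a frequency count of wrong responses. A minimal sketch, assuming hypothetical attempt records for a single question:

```python
from collections import Counter

# Sketch of the performance / response summary for one question.
# The attempt records are invented for illustration.
attempts = [
    {"correct": True,  "response": "B", "time_sec": 40},
    {"correct": False, "response": "C", "time_sec": 70},
    {"correct": False, "response": "C", "time_sec": 65},
    {"correct": False, "response": "A", "time_sec": 80},
]

def response_summary(attempts):
    correct = [a for a in attempts if a["correct"]]
    incorrect = [a for a in attempts if not a["correct"]]
    return {
        "attempted": len(attempts),
        "correct": len(correct),
        "incorrect": len(incorrect),
        # Most common incorrect responses, e.g. to surface misconceptions:
        "common_incorrect": Counter(a["response"] for a in incorrect).most_common(2),
        "avg_time_correct_sec": sum(a["time_sec"] for a in correct) / max(len(correct), 1),
        "avg_time_incorrect_sec": sum(a["time_sec"] for a in incorrect) / max(len(incorrect), 1),
    }

print(response_summary(attempts))
```

Surfacing the most common incorrect responses is what lets an administrator spot a shared misconception rather than just an error rate.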
Learning Outcome report
Here is a sample
This report summarises the performance of all users across the learning objectives covered in the test prep content.
Content Usage report
Here is a sample