
Bulk Upload Questions

https://project-sunbird.atlassian.net/browse/SB-22801

Teachers have a large pool of questions that they have already created for conducting tests, quizzes, and exams. We want to leverage these assets, available in abundance in the community of educators, through ecosystem participation. We have seen the following scenarios where the need for bulk upload of questions has come up:

  1. An organization already has questions in an existing system, so creating them manually through the UI is a lot of effort.

  2. An organization is creating new questions but wants to import them into multiple systems, not just Sunbird (e.g. DIKSHA). Hence they would like to keep the questions in a common spreadsheet format and bulk upload them into multiple systems.

  3. An organization is creating new questions and simultaneously getting them translated into multiple languages. Having the questions in a spreadsheet makes it easy to share them with multiple translators and get the translations done.

Sunbird enables creation of question sets through various workflows in the Sourcing solution. We plan to complement those workflows by enabling bulk upload of questions in the following workflows (in order of priority):

  1. Bulk upload questions within a question set, where the question set might be created in a target collection driven sourcing project or in a taxonomy driven sourcing project.

  2. Bulk upload questions within a question set, where the question set is the target object in a sourcing project.

  3. Bulk upload questions in a framework driven sourcing project.

Enabling the above capabilities fundamentally requires:

  1. Ability to bulk upload questions

  2. Ability to link them to a question set

We will enable bulk upload of questions and linking them to a question set using CSV (comma-separated values) input. Users are likely to use tools such as Google Sheets, Microsoft Excel, and other spreadsheet editors. The key milestones are detailed below.

Milestone 1: Bulk Upload Questions

The goal is to upload questions along with their associated media (images).

  1. Support Multiple Choice Questions (MCQ) with validations such as:

    1. Minimum 2 options, maximum 8 options

    2. Exactly one correct option

  2. Questions and options can have images. The user provides a Google Drive path for each image, which is extracted by the system (bulk upload tool). This is already available as a component. Read more here.

  3. Every question can have the following details (metadata):

    1. Name

    2. Taxonomy Framework Categories: Board, Medium, Class, Subject, Topic, Learning Outcome. These will be derived from the question set if questions are being uploaded within a question set.

    3. Keywords

    4. Author

    5. Attributions

  4. The user provides a CSV in the prescribed format, filled with the required details.

  5. Basic validations such as the following (a validation sketch follows this list):

    1. Text contains only Unicode characters

    2. No cell contains embedded images

    3. All mandatory columns are filled in for each row of the CSV

  • Kartheek Palla: Please share the final CSV format for Bulk Upload Questions, similar to the one for bulk upload content.
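
To make the MCQ and basic validations above concrete, below is a minimal per-row validation sketch in Python. The column names used (name, question, option_1..option_8, correct_answer) are assumptions for illustration only; the authoritative column list is the template spreadsheet linked under “Bulk Upload Questions template” below.

  # Minimal sketch of the per-row validations listed above.
  # Column names are assumptions; the real ones come from the template sheet.
  MANDATORY_COLUMNS = ["name", "question", "correct_answer"]  # assumed

  def validate_mcq_row(row: dict) -> list:
      """Return a list of human-readable validation errors for one CSV row."""
      errors = []

      # All mandatory columns are filled for this row
      for column in MANDATORY_COLUMNS:
          if not (row.get(column) or "").strip():
              errors.append(f"mandatory column '{column}' is empty")

      # Minimum 2 options, maximum 8 options
      options = [(row.get(c) or "").strip() for c in row if c.startswith("option_")]
      options = [o for o in options if o]
      if not 2 <= len(options) <= 8:
          errors.append(f"MCQ must have 2 to 8 options, found {len(options)}")

      # Exactly one correct option, and it must match one of the options
      correct = (row.get("correct_answer") or "").strip()
      if options.count(correct) != 1:
          errors.append("MCQ must have exactly one correct option")

      # Text contains only characters that survive Unicode (UTF-8) encoding
      for column, value in row.items():
          try:
              (value or "").encode("utf-8")
          except UnicodeEncodeError:
              errors.append(f"column '{column}' contains invalid characters")

      return errors

Each row of the uploaded CSV would be passed through such a check before any question objects are created.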

Milestone 2: Link bulk uploaded Questions to a Question Set

The goal is to link questions at the relevant place in the question set hierarchy structure, based on the following (a placement sketch follows this list):

  1. Level 1 Question Set Unit

  2. Org_FW_topics

  3. Target_FW_Medium

  4. Target_FW_gradeLevel

  5. Target_FW_subject

  6. Target_FW_topic
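
Below is a rough sketch, in Python, of how these details could be used to locate the unit in a question set hierarchy under which uploaded questions get linked. The hierarchy node fields (name, children) are assumptions for illustration; actual reads and updates go through the Question Set APIs linked at the end of this page.

  # Hypothetical sketch: find the node in a question set hierarchy where a
  # CSV row's questions should be attached. Node field names are assumptions.
  PLACEMENT_COLUMNS = [
      "Level 1 Question Set Unit",
      "Org_FW_topics",
      "Target_FW_Medium",
      "Target_FW_gradeLevel",
      "Target_FW_subject",
      "Target_FW_topic",
  ]

  def find_target_unit(hierarchy: dict, row: dict):
      """Depth-first search for the unit whose name matches the row's
      'Level 1 Question Set Unit' value. Returns the node, or None."""
      wanted = (row.get("Level 1 Question Set Unit") or "").strip()
      stack = [hierarchy]
      while stack:
          node = stack.pop()
          if (node.get("name") or "").strip() == wanted:
              # The remaining framework columns (topics, medium, gradeLevel,
              # subject, topic) can be used here to disambiguate units that
              # share a name.
              return node
          stack.extend(node.get("children", []))
      return None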

Bulk Upload Questions template

https://docs.google.com/spreadsheets/d/1ndzapGGV6q8698x-NQzK_ufln4YX1HQ09jsFsC7kA60/edit?usp=sharing

User flow - Contributor

Flow 1: Contributors will be able to upload questions within a question set

As a contributor, I should be able to bulk upload questions for a question set in a sourcing project. The following is the flow to enable this:

  1. When a contributor logs into the contribution portal and opens the target collection page of a project to which she can contribute (i.e. her nomination is accepted), each target collection currently has an “Upload Content” option that opens the collection hierarchy.

  2. Given the user has access to a sourcing project where she can contribute, When she creates a question set within the sourcing project, Then she can upload questions to the question set.

  3. The user will see a “Bulk Upload Question” option in the question set, similar to the “QR codes” option in the collection asset editor.

  4. In addition, there should be a “Bulk Upload Question” option. Clicking it should open the Bulk Upload Question screen, which should have the following options:

    1. Option to select a Bulk Upload Question (metadata) file from a local folder (on the user’s system)
      Assumption: the metadata file will have publicly accessible URLs to the question-related files

    2. There is a link to a sample metadata file: “Sample Bulk Upload Question metadata file”

    3. The user selects the metadata file and clicks “Upload”. The system shows the message “Validating file”.

    4. The system should first validate the selected metadata file (a validation sketch follows this flow). The validations are:

      1. All the columns are available

      2. All the mandatory columns have values filled in

      3. There are no duplicate URLs in the file path column

  5. If there are errors in the metadata file validation, display the relevant error message on the Upload dialog:

    1. Some columns are not available:
      “Metadata file validation failed. Following columns are not found in the file. Please check and upload again: <list the missing column names>”

    2. Some mandatory columns have values missing:
      “Metadata file validation failed. Following rows have missing values. Please check and upload again: <list the row numbers (starting from 1) with missing values>”

    3. Some duplicate URLs in File path column:
      “Metadata file validation failed. Following rows have duplicate file links. Please check and upload again: <list the row numbers (starting from 1) with duplicate links>”

  6. In case of metadata file validation errors, the “Upload” button is disabled until the user re-selects a metadata file.

  7. If the metadata file doesn’t have any validation errors, the dialog shows:
    “Bulk Upload is in progress.
    Number of content uploaded successfully: <no.>
    Number of content failed: <no.>
    Number of content pending: <no.>”

  8. There is an option to download the status report as a CSV.

  9. There is a “Close” button to close the dialog. When the user clicks it, she is returned to the Textbook page of the project.

  10. In the Textbook page, whenever the user clicks “Bulk Upload Content” while a bulk upload for that textbook is in progress, the status dialog described above is shown.

Limits on number of contents and size: the number of contents per textbook for one bulk upload job is 300; the maximum size of each content is the same as the maximum size supported by the system.
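
Below is a minimal sketch, in Python, of the metadata file validation and the exact error messages described in steps 4 and 5 of the flow above. The expected column names are assumptions; the real list comes from the Bulk Upload Questions template.

  import csv

  EXPECTED_COLUMNS = ["Name", "Question Text", "File Path"]   # assumed, from template
  MANDATORY_COLUMNS = ["Name", "Question Text"]               # assumed
  FILE_PATH_COLUMN = "File Path"                              # assumed

  def validate_metadata_file(path: str) -> list:
      """Return the user-facing error messages from step 5, if any."""
      errors = []
      with open(path, newline="", encoding="utf-8") as f:
          reader = csv.DictReader(f)
          columns = reader.fieldnames or []
          rows = list(reader)

      # 5.1 Some columns are not available
      missing_columns = [c for c in EXPECTED_COLUMNS if c not in columns]
      if missing_columns:
          errors.append(
              "Metadata file validation failed. Following columns are not found in the "
              "file. Please check and upload again: " + ", ".join(missing_columns))

      # 5.2 Some mandatory columns have values missing (row numbers start from 1)
      rows_with_gaps = [str(i) for i, row in enumerate(rows, start=1)
                        if any(not (row.get(c) or "").strip() for c in MANDATORY_COLUMNS)]
      if rows_with_gaps:
          errors.append(
              "Metadata file validation failed. Following rows have missing values. "
              "Please check and upload again: " + ", ".join(rows_with_gaps))

      # 5.3 Duplicate URLs in the file path column (row numbers start from 1)
      seen, duplicate_rows = set(), []
      for i, row in enumerate(rows, start=1):
          url = (row.get(FILE_PATH_COLUMN) or "").strip()
          if url in seen:
              duplicate_rows.append(str(i))
          elif url:
              seen.add(url)
      if duplicate_rows:
          errors.append(
              "Metadata file validation failed. Following rows have duplicate file links. "
              "Please check and upload again: " + ", ".join(duplicate_rows))

      return errors

If the returned list is empty, the dialog moves to the progress view (“Bulk Upload is in progress …”); otherwise the messages are shown and the “Upload” button stays disabled until a new file is selected.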

Reference material

  1. Bulk Upload Content related page: /wiki/spaces/DO/pages/1581350917

    1. https://project-sunbird.atlassian.net/browse/DP-18

    2. https://project-sunbird.atlassian.net/browse/DP-947

    3. https://project-sunbird.atlassian.net/browse/DP-967

    4. https://project-sunbird.atlassian.net/browse/DP-1480

Implementation details

  1. Create a ‘Generate QuML’ API for various interaction types, such that it takes the required parameters as input and generates a QuML output. For example, a Multiple Choice Question fundamentally contains a question, options, and the correct answer. The API takes these 3 as inputs in HTML or JSON format and generates a QuML spec. This also allows QuML to evolve rapidly: just by updating this ‘Generate QuML’ API with the latest QuML spec, we can upgrade the various places where questions and question sets are created, such as the Question Set Editor and Bulk Upload of Questions & Question Sets. A rough sketch for the MCQ interaction type follows below.

    1. This logic exists today, embedded in the question creation component. It needs to be extracted and made available as an API (or a similar reusable service).
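
Below is a rough sketch, in Python, of what such a ‘Generate QuML’ helper could look like for the MCQ interaction type. The output structure only approximates a QuML question body; the exact field names and schema are owned by the QuML spec and the existing question creation component, which remain the source of truth.

  # Hypothetical sketch of a 'Generate QuML' helper for the MCQ interaction type.
  # The output shape is an approximation of a QuML question body, not the spec.
  def generate_mcq_quml(question_html: str, options_html: list, correct_index: int) -> dict:
      """Build a QuML-like body from a question, its options, and the correct answer."""
      if not 2 <= len(options_html) <= 8:
          raise ValueError("MCQ must have between 2 and 8 options")
      if not 0 <= correct_index < len(options_html):
          raise ValueError("correct_index must point to one of the options")

      return {
          "interactionTypes": ["choice"],
          "body": question_html,
          "interactions": {
              "response1": {
                  "type": "choice",
                  "options": [{"label": opt, "value": i}
                              for i, opt in enumerate(options_html)],
              }
          },
          "responseDeclaration": {
              "response1": {
                  "cardinality": "single",
                  "correctResponse": {"value": correct_index},
              }
          },
      }

  # Example call (hypothetical):
  # generate_mcq_quml("<p>2 + 2 = ?</p>", ["<p>3</p>", "<p>4</p>"], correct_index=1)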


Question Set APIs:

http://docs.sunbird.org/latest/apis/questionapi//#tag/QuestionSet-APIs

Question APIs:

http://docs.sunbird.org/latest/apis/questionapi//#tag/Question-APIs