Summary

The Assessment Engine allows LearningBuilder to administer online, low-stakes assessments, typically used for Practice Exams and other supporting evaluations.

Page Properties

  • Target release:
  • Epic:
  • Document status: DRAFT
  • Document owner:
  • Designer:
  • Developers:
  • QA:

Goals

  • Promote learning by delivering formative assessments

  • Increase engagement by adding interaction to learning activities

  • Benchmark performance before and after learning activities

Capability Summary

  • Support multi-section Assessments with multiple navigation modes ranging from "free range" navigation to tightly controlled, one-question-at-a-time, forward-only progression.

  • "Reflective Feedback" allows answer-level feedback to be presented to the user, which can be used as an educational aid.
    Question Item weighting allows different questions, and different answers within each question, to deliver variable points towards the overall score.

  • Users can save their progress and return later, picking up where they left off.

  • Supports an optional, per-question time limit.

  • Responsive UI.
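
To make the weighting and cut-score behavior concrete, here is a minimal, hypothetical sketch (in TypeScript) of one way the pieces described above could be modeled and scored. The type names, the 0-to-1 cut-score convention, and the score function are illustrative assumptions, not the actual LearningBuilder schema or API.

```typescript
// Illustrative data model and scoring sketch; not the actual LearningBuilder schema.

type NavigationMode = "free" | "forwardOnly";

interface Answer {
  text: string;
  points: number;            // answer-level weighting
  feedback?: string;         // "Reflective Feedback" shown after answering
}

interface Question {
  prompt: string;
  weight: number;            // question-level weighting
  timeLimitSeconds?: number; // optional per-question time limit
  answers: Answer[];
}

interface Section {
  title: string;
  navigationMode: NavigationMode;
  questions: Question[];
}

interface Assessment {
  title: string;
  cutScore: number;          // fraction (0..1) of possible points needed to pass
  sections: Section[];
}

// responses[sectionIndex][questionIndex] = index of the chosen answer, or null if skipped
function score(
  assessment: Assessment,
  responses: (number | null)[][]
): { earned: number; possible: number; passed: boolean } {
  let earned = 0;
  let possible = 0;
  assessment.sections.forEach((section, s) =>
    section.questions.forEach((question, q) => {
      // The best available answer defines the question's maximum contribution.
      const maxPoints = Math.max(...question.answers.map(a => a.points));
      possible += question.weight * maxPoints;
      const chosen = responses[s]?.[q];
      if (chosen != null) {
        earned += question.weight * question.answers[chosen].points;
      }
    })
  );
  const passed = possible > 0 && earned / possible >= assessment.cutScore;
  return { earned, possible, passed };
}
```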

Background and strategic fit

LearningBuilder is an assessment platform. One of its modes of assessment is administering online, low-stakes assessments. These assessments are typically used for Practice Exams and to validate learning in situations where organizations curate learning activities but wish to promote and/or measure engagement with learning materials. Adoption has been slow, but a handful of customers use the assessment engine (ABO and ABOM). Other customers have inquired about it, and it comes up frequently with prospects (e.g., NBCRNA). (As of May 2018.)

Assumptions

...

 

Limitations

...

  • Although measures have been taken to protect data integrity and to prevent tampering, Assessments are not designed to manage highly secure tests.

    The intent is to make the assessments annoying to hack, not to dedicate significant resources to preventing cheating (see the illustrative sketch below).
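
As a purely illustrative sketch, and not LearningBuilder's confirmed mechanism, one common way to make client-side tampering with saved responses inconvenient is to HMAC-sign the response payload on the server and verify the signature on submission. The key name and payload shape below are hypothetical.

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Hypothetical tamper-resistance sketch: sign saved progress so a client
// cannot silently alter answers. Not the actual implementation.
const SECRET = process.env.ASSESSMENT_SIGNING_KEY ?? "dev-only-secret";

function sign(payload: string): string {
  return createHmac("sha256", SECRET).update(payload).digest("hex");
}

function verify(payload: string, signature: string): boolean {
  const expected = Buffer.from(sign(payload), "hex");
  const actual = Buffer.from(signature, "hex");
  // Constant-time comparison; lengths must match before timingSafeEqual.
  return expected.length === actual.length && timingSafeEqual(expected, actual);
}

// Usage: sign when the server saves progress, verify when answers are submitted.
const saved = JSON.stringify({ sittingId: 123, answers: [2, 0, 3] });
const sig = sign(saved);
console.log(verify(saved, sig));                   // true
console.log(verify(saved.replace("3", "1"), sig)); // false: payload was tampered with
```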

Requirements


Sample Use Cases

  • Practice Exams

  • Journal Article Review, as part of ongoing Continuing Education

  • Surveys

User Stories

...

Each story below lists its title, user story, implementation status (Implemented?), and notes; the Importance column is not populated for any story.

1. Create an Assessment
   User Story: As an Administrator, I create an Assessment so that I can evaluate a person's knowledge.
   Implemented: Yes

2. Create a Section
   User Story: As an Administrator, I can create a Section of questions that groups a set of questions around common content.
   Implemented: Yes

3. Create a Question
   User Story: As an Administrator, I can create a Question so that I can measure a specific component of knowledge.
   Implemented: Yes

4. Re-Order Questions
   User Story: As an Administrator, I can order questions so that there is a sequential experience of questions.
   Implemented: Yes

5. Create a Response
   User Story: As an Administrator, I can create a Response so that a person can answer a question.
   Implemented: Yes
   Notes: Probably move to the Question page.

6. Specify a Cut Score
   User Story: As an Administrator, I can create a cut score to establish the score necessary for a passing result.
   Implemented: Yes

7. Specify a Competency Model
   User Story: As an Administrator, I can assign a Competency Model to the Assessment to govern the scope of Competencies available to assign to Questions.
   Implemented: Yes

8. Define Result
   User Story: As an Administrator, I can manage a Results page so that the results can be tailored to the kind of assessment being taken.
   Implemented: Yes

9. Publish an Assessment
   User Story: As an Administrator, I can publish an Assessment so that it is available to be completed.
   Implemented: Yes

10. Un-publish an Assessment
    User Story: As an Administrator, I can un-publish an Assessment so that it is not available to be completed while changes are made.
    Implemented: Yes

11. Automated Scoring
    User Story: As an Administrator, by indicating the "correct" answer for each question (or by assigning each answer a value and specifying a Cut Score for the Assessment as a whole), I can create an Assessment that is immediately and automatically scored so that the user receives immediate, real-time feedback about their performance.
    Implemented: Yes

12. Manual Scoring
    User Story: As an Administrator, I can indicate that certain questions (such as essays or other complex interaction types) require human evaluation before a score can be assigned.
    Implemented: Not yet
    Notes: Designed, but not implemented.

13. Manage Forms
    User Story: As an Administrator, I can create multiple Forms within a single Assessment so that I can administer the Assessment multiple times without duplicating the Assessment Experience.
    Implemented: Yes

14. Specify that the Assessment has a Time Limit
    User Story: As an Administrator, I specify the amount of time in minutes a person is allowed to complete the Assessment so that the Assessment is not open indefinitely.
    Implemented: Partially
    Notes: Currently supports a per-question time limit only.

15. Track assessment-level timing metrics
    User Story: The System records the amount of time it takes to answer each Question.
    Implemented: Yes
    Notes: Tracked, but not displayed anywhere in the UI.

16. Specify that the Assessment will use Confidence Ratings
    User Story: As an Administrator, I specify that questions will include a Confidence Rating to understand how confident the Test Taker is in the selected answer during an Assessment.
    Implemented: No
    Notes: Can be modeled using custom Questions, but there is no way to present those questions only a percentage of the time.

17. Specify the Confidence Rating Scale
    User Story: As an Administrator, I specify the Rating Scale used for the Assessment so that my ratings can use language and specificity I am comfortable with.
    Implemented: Partially
    Notes: Scales are managed per-question. There is no facility for centrally defining a scale and then reusing it.

18. Specify that the Assessment will use Relevance Ratings
    User Story: As an Administrator, I specify that questions will include a Relevance Rating to understand how relevant the Test Taker believes the content is to his or her practice.
    Implemented: No
    Notes: Can be modeled using custom Questions, but there is no way to present those questions only a percentage of the time.

19. Specify the Relevance Rating Scale
    User Story: As an Administrator, I specify the Rating Scale used for the Assessment so that my ratings can use language and specificity I am comfortable with.
    Implemented: Partially
    Notes: Same as the Confidence Rating Scale.

20. Answer Confidence Rating
    User Story: As a Test Taker, when I answer a Question and Confidence Ratings are turned on, I will answer my Confidence Rating so that I can share my level of confidence in my answer with the Board.
    Implemented: No

21. Answer Relevance Rating
    User Story: As a Test Taker, when I answer a Question and Relevance Ratings are turned on, I will answer my Relevance Rating so that I can share my belief in the relevance of the Question to my practice.
    Implemented: No
    Notes: Same as Confidence Ratings.

22. View Time to Complete
    User Story: As a Test Taker or an Administrator, I can see how long it took to complete a sitting of an Assessment.
    Implemented: Partially
    Notes: Timing data are tracked, but not displayed. (They can be exposed using custom reports.)

23. View Time on each Question
    User Story: As an Administrator, I can see how long each test taker took to complete each Question.
    Implemented: Partially
    Notes: Timing data are tracked, but not displayed. (They can be exposed using custom reports.)

24. Record Item Response Strings
    User Story: As the System, I record the responses to each Question in the QTI standard for the Item type so that the results can be shared with any system using the QTI standard.
    Implemented: Not yet
    Notes: A QTI export could be constructed using custom reports or the Integration Hub, but no native QTI representation exists.

25. Record Post-Test Surveys
    User Story: As a Test Taker, I can complete a Post-Test Survey so that I can provide feedback to the Board regarding the Assessment Experience.
    Implemented: Partially
    Notes: This can be done using unscored questions or Workflow Questions instead of a first-class feature.

User interaction and design

Questions

Below is a list of questions to be addressed as a result of this requirements document:

Related content: pages labeled "assessments".