
Assessment Web Services Projects

Rowin Young
Last modified 10 Jan, 2006
Published 09 Jan, 2006
Assessment is a relatively mature e-learning area, so it's not surprising that there have been a number of successful JISC-funded web service projects in this domain. In this article Rowin Young, co-ordinator of the CETIS Assessment Special Interest Group, discusses why assessment is important for the student learning experience with the aid of two scenarios, and summarises the progress of some of the web service projects to date.


This article provides an overview of current and completed projects within the JISC e-Learning Programme which have concentrated on the assessment domain. The article begins by explaining the purpose of the IMS Question and Test Interoperability (QTI) specification, and then outlines two simple scenarios to illustrate the use of assessment web services projects, before giving a brief overview of each project.

Assessment standards and specifications

IMS QTI [1] is one of the most mature of the IMS specifications, and currently the only one to exist in two significant versions.

Version 1 was first released in June 2000, and went through a number of revisions until the final form, the v1.2.1 addendum, was released in March 2003. A number of tools, including several successful commercial products, have been produced to comply with v1.2.

Version 2.0 was released in February 2005, and addressed issues which could not be dealt with under the v1 point releases. This new version defined a new information model based on interactions, and included support for dynamic material. It also offered guidance on integration with other IMS specifications, including Content Packaging [2], Learning Design [3] and Simple Sequencing [4]. V2.0 focused only on the individual assessment item ('question'); work on v2.1, which covers assessment construction and outcomes reporting, is ongoing, with the final specification due for release in Spring 2006.

Unlike other IMS specifications, QTI not only describes the behaviour of assessment content but also the structure of that content itself. Both versions define a range of item types, with v2 also providing a custom interaction type to allow developers to integrate new or proprietary item types more easily.
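To make the interaction-based model concrete, the sketch below parses a minimal QTI v2.0 multiple-choice item with Python's standard library. The item itself is invented for illustration (the identifiers and prompt text are not from any real item bank), but the element and attribute names follow the v2.0 information model.

```python
import xml.etree.ElementTree as ET

# A minimal, invented QTI v2.0 item: one choiceInteraction with
# three simpleChoice options and a declared correct response.
QTI_ITEM = """\
<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p0"
                identifier="greekVocab01" title="Greek vocabulary"
                adaptive="false" timeDependent="false">
  <responseDeclaration identifier="RESPONSE" cardinality="single"
                       baseType="identifier">
    <correctResponse><value>B</value></correctResponse>
  </responseDeclaration>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="true"
                       maxChoices="1">
      <prompt>Which word means 'city'?</prompt>
      <simpleChoice identifier="A">logos</simpleChoice>
      <simpleChoice identifier="B">polis</simpleChoice>
      <simpleChoice identifier="C">demos</simpleChoice>
    </choiceInteraction>
  </itemBody>
</assessmentItem>
"""

NS = {"qti": "http://www.imsglobal.org/xsd/imsqti_v2p0"}
root = ET.fromstring(QTI_ITEM)

# The interaction element tells a rendering engine which widget to draw.
interaction = root.find(".//qti:choiceInteraction", NS)
choices = [c.get("identifier")
           for c in interaction.findall("qti:simpleChoice", NS)]
print(root.get("identifier"), choices)
```

Because the content structure is standardised in this way, any conformant tool can discover what kind of interaction an item contains without knowing anything about the tool that authored it.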

Which specification? QTI 1.x or 2.x?

For development projects within the eLearning Programme, it is probably preferable to adopt v2: this enables easier integration both with existing assessment toolkits, which have been produced in that format, and with toolkits which will implement other specifications informed by v2.

In practice, however, commercial products are likely to remain in v1.2 format for the foreseeable future, until there is customer demand for v2. Migration tools [5] support the conversion of material from v1.2 to v2.0, so resources produced in v1.2 can be reused in tools which conform to the later version.


Assessment is fundamental to the learning process. It supports reflection on learning by allowing learners to evaluate their knowledge and understanding, and provides evidence of learning to employers and educational institutions. A robust and effective e-assessment system with content designed to reinforce the curriculum can provide excellent opportunities for self-assessment as well as a reliable and efficient mechanism for summative assessment.

Scenario 1: Sophia

Sophia is studying Classics at the University of Blankshire. The University does not have an institutional e-learning or e-assessment strategy.

The convenor of Sophia's course is one of a small number of lecturers within the University who use a 'vanilla' implementation of an open source Virtual Learning Environment which has been made available and is supported through the good will and availability of staff within the Computing Science department.

Some learning materials have been placed on the VLE, mainly Word documents containing lecture notes and handouts and tutorial worksheets. Some assessment items have been created for delivery within the VLE's quiz system, but there are problems with items containing Greek characters because of the way in which the system interprets Unicode, and where original texts are presented as scanned images, the image link is often broken. Her tutor tries to edit the items, but has not been trained and lacks the confidence to make changes. The lack of a formal support mechanism for the VLE means that by the time the broken questions have been fixed, the class has moved on to the next part of the course.

Because of the difficulty in assessing free text responses, most of the questions Sophia is able to attempt are in multiple choice format, and so do not provide good practice for her final exam. Her tutor agrees to set sample tests for manual marking, but is then off sick for several weeks, taking the scripts with him, and they are never returned. A pass mark for the two course essays is required for students to be allowed to sit the final exam; her original tutor has not recorded her grade for the first essay or returned the scripts, and Sophia's worries about not being permitted to sit the exam disrupt her studies.

Sophia sits her final examination feeling unprepared and unsure of what is really expected of her. The practice materials to which she has had online access are not representative of the final examination, and she struggles to adjust to the different format. She achieves only an average grade for the course, and has to revise her plans for future study.

Scenario 2: Jonathan

Jonathan is studying a module in Forensic Science at the University of Bigton. The University has a keen commitment to e-learning and e-assessment, and to integrating learning systems with e-admin and library resources.

Bigton uses the same open source VLE as Blankshire, but has invested in dedicated support staff and training for teaching staff. All new students have to attend a compulsory computer skills session which includes an introduction to using the VLE; they receive a certificate based on their performance in the end-of-session quiz they take through the VLE. Support staff are encouraged to attend relevant events, make contacts and participate in development work, and so have integrated community developments such as a QTI interface for the VLE and a QTI assessment management system, remote marking services and plagiarism awareness tools.

The VLE is extensively used, and content is carefully structured. As well as lecture notes and handouts and tutorial worksheets, students can access specially designed online modules with integrated quizzes. Items are both produced in-house in QTI format and drawn from a subject-specific item bank supported by the subject's accreditation body and contributed to by organisations around the world. These assessment materials help learners identify strengths and weaknesses and plan revision activities in conjunction with an integrated eportfolio module. Where assessment outcomes indicate that learners have weaknesses in particular areas such as chemistry or biology, they are directed to further remedial resources including formative assessments offered within the VLE by the relevant department. Answers requiring manual marking are submitted via the VLE's drop box, and when Jonathan's tutor is off sick for several weeks, his replacement is able to access them and return them promptly. Simulation items replicate expensive laboratory techniques, and allow Jonathan to practise experiments until he is confident he is able to perform them accurately and understands the principles involved. Outcomes from practical laboratory work are recorded at the time of assessment in the VLE's gradebook.

Part of Jonathan's final examination is delivered online in a secure environment. His familiarity with both the assessment management system and the assessment format ensures that his only worry on the day is demonstrating his learning to the best of his abilities. In the practical part of the examination, his experience with the simulation exercises means that he is able to demonstrate his skill and learning to a good standard. He achieves an excellent grade for his course, and is well-positioned to achieve employment in his chosen field.

The projects

JISC have funded a number of projects looking at a range of assessment-related web services.

APIS: Assessment Provision through Interoperable Segments

The APIS project, based at the University of Strathclyde, provides a modular online assessment rendering engine and services which transform IMS QTI v2.0 content into XHTML [6]. This means that assessment questions in QTI format can then be integrated into various learning systems (see below).
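The following sketch illustrates the kind of transformation a QTI-to-XHTML rendering service performs. It is not the APIS codebase or API, just a minimal, hypothetical example: one interaction type (choiceInteraction) rendered as XHTML radio buttons, with an invented sample item.

```python
import xml.etree.ElementTree as ET

NS = {"qti": "http://www.imsglobal.org/xsd/imsqti_v2p0"}

# Invented sample item; identifiers and text are illustrative only.
ITEM = """\
<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p0"
                identifier="demo1" title="Demo">
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" maxChoices="1">
      <prompt>Pick one.</prompt>
      <simpleChoice identifier="A">first</simpleChoice>
      <simpleChoice identifier="B">second</simpleChoice>
    </choiceInteraction>
  </itemBody>
</assessmentItem>
"""

def render_choice(item_xml: str) -> str:
    """Turn the first choiceInteraction of a QTI v2 item into an
    XHTML fragment of radio buttons (one interaction type only)."""
    root = ET.fromstring(item_xml)
    inter = root.find(".//qti:choiceInteraction", NS)
    name = inter.get("responseIdentifier")
    prompt = inter.findtext("qti:prompt", default="", namespaces=NS)
    lines = [f"<p>{prompt}</p>"]
    for choice in inter.findall("qti:simpleChoice", NS):
        lines.append(
            f'<label><input type="radio" name="{name}" '
            f'value="{choice.get("identifier")}"/>{choice.text}</label>'
        )
    return "\n".join(lines)

xhtml = render_choice(ITEM)
print(xhtml)
```

A modular engine generalises this idea: each interaction type gets its own renderer, so new item types can be supported by plugging in another module rather than rewriting the delivery system.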

APIS-MS: Assessment Provision through Interoperable Segments – Marking Services

APIS-MS extended the functionality offered by the original APIS project, and produced a toolkit for integrating external marking web services into e-learning applications such as VLEs or assessment management systems. It also worked on extending the Remote Question Protocol specification developed by the Serving Maths project (see below) and successfully integrated it into Moodle. APIS-MS was a collaboration between the Universities of Strathclyde and York [7].

ASAP: Automated System for the Assessment of Programming

The ASAP project, based at Kingston University, produced a set of application services to automate part of the assessment process for the teaching of computer programming languages, including unit testing of student code and the provision of feedback; the automated generation of questions; and plagiarism detection and reporting. A significant outcome was the production of tools that can be used within a number of VLEs and interfaces to relevant third party software [8].

ASSIS: Assessment and Simple Sequencing Integration Services

ASSIS, led by the University of Hull with partners including Loughborough, Stanford and Strathclyde Universities, Newark and Sherwood College, Icodeon Ltd and the RELOAD project, provides tools to enable a teacher to browse, search, preview and select assessment items from item banks for incorporation within content packages which may include associated sequencing instructions. ASSIS uses WS-BPEL (Business Process Execution Language) to support the integration of QTI content with Content Packaging, SCORM and Simple Sequencing, orchestrating a range of web services behind a user-friendly interface [9].

ISIS: Integrating Simple Sequencing

ISIS produced libraries to enable the production and consumption of content-sequencing services using IMS Simple Sequencing alongside tools for the visualisation and running of sequences. The project's outputs include an integration guide for IMS Simple Sequencing and QTI. ISIS was a collaboration between the University of Hull and Icodeon Ltd [10].


PyAssess

PyAssess, based at the University of Cambridge, provides an open source toolkit for implementing QTI v2-based assessment (particularly marking) services in Python, building on the Python QTI v1.2 to v2.0 migration tool produced earlier. The project also produced a command line demonstrator to illustrate the use of the toolkit and to support the testing and debugging of complex questions. By separating response processing from delivery, PyAssess provides a very secure method of assessment delivery which minimises the possibility of cheating by technically aware candidates [11].
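The security benefit of separating response processing from delivery can be sketched in a few lines. This is not PyAssess itself, just a hedged illustration of the principle using QTI's 'match correct' pattern: the answer key lives only on the server, and the client submits nothing but the candidate's chosen identifier.

```python
import xml.etree.ElementTree as ET

NS = {"qti": "http://www.imsglobal.org/xsd/imsqti_v2p0"}

# Invented sample item; only the parts needed for scoring are shown.
ITEM = """\
<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p0"
                identifier="demo2" title="Demo">
  <responseDeclaration identifier="RESPONSE" cardinality="single"
                       baseType="identifier">
    <correctResponse><value>B</value></correctResponse>
  </responseDeclaration>
</assessmentItem>
"""

def score(item_xml: str, candidate_response: str) -> int:
    """'Match correct' response processing done server-side: the key
    never travels to the client; only the candidate's answer comes in."""
    root = ET.fromstring(item_xml)
    correct = root.findtext(
        ".//qti:correctResponse/qti:value", namespaces=NS)
    return 1 if candidate_response == correct else 0

print(score(ITEM, "B"), score(ITEM, "A"))
```

Because the correct response never reaches the browser, a technically aware candidate inspecting the delivered page has nothing useful to find.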

Serving Maths

Serving Maths is a collaboration led by the University of York and including partners from Imperial College, London, the LTSN MSOR Network and Durham, Edinburgh and Sheffield Universities. The project developed open source software tools to address issues around the use of mathematical expressions in assessments. One of the outputs of the project was the Remote Question Protocol [12], which supports remote processing of assessment items on behalf of assessment systems and therefore maximises the range of item types and formats it is possible to deliver within a single assessment session [13].

SPAID: Storage and Packaging of Assessment Item Data

SPAID provides a range of web services to enable the development, population and operation of assessment item banks. The project's outputs include a metadata tagger, which generates a manifest file for a QTI v2.0 item (including QTI-specific extensions), automatically generating those fields which can be completed from the item's XML and providing an interface for manual completion of other fields; a content packager which packages the item according to QTI v2.0 guidelines; a separate zipping service; and basic item bank functions such as upload and search. SPAID was produced by the University of Strathclyde and the Scottish Qualifications Authority [14].
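The packaging step can be illustrated with a minimal in-memory sketch. This is not SPAID's code, and a real IMS content package would carry the proper manifest namespaces, metadata and schema references, all omitted here for brevity; the resource `type` value is also an assumption.

```python
import io
import zipfile
import xml.etree.ElementTree as ET

def package_item(item_id: str, item_xml: str) -> bytes:
    """Wrap a single QTI item plus a stripped-down manifest into a
    zip content package, entirely in memory."""
    manifest = ET.Element("manifest", identifier=f"man-{item_id}")
    resources = ET.SubElement(manifest, "resources")
    # One resource entry pointing at the item file inside the zip.
    ET.SubElement(resources, "resource", identifier=item_id,
                  type="imsqti_item_xmlv2p0", href=f"{item_id}.xml")
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        zf.writestr("imsmanifest.xml",
                    ET.tostring(manifest, encoding="unicode"))
        zf.writestr(f"{item_id}.xml", item_xml)
    return buf.getvalue()

pkg = package_item("item01", "<assessmentItem/>")
names = zipfile.ZipFile(io.BytesIO(pkg)).namelist()
print(names)
```

Splitting the manifest generation, packaging and zipping into separate services, as SPAID does, lets other tools reuse each step independently.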


JISC has funded a number of assessment-related projects which cover a wide range of web services required for assessment construction, delivery and response processing. Considerable expertise is being developed within this area, and emerging networks of developers and projects are helping to extend the range of functions supported by the JISC. As new projects emerge they are added to the CETIS Assessment special interest group website [15]; please let us know of any that are missing.


[1] IMS Question and Test Interoperability specification (all versions).

[2] IMS Content Packaging specification.

[3] IMS Learning Design specification.

[4] IMS Simple Sequencing specification.

[5] Migration tools: original Python version; Windows version.

[6] APIS project.

[7] APIS Marking Services project.

[8] ASAP project.

[9] ASSIS project.

[10] ISIS project.

[11] PyAssess project.

[12] Remote Question Protocol.

[13] Serving Maths project.

[14] SPAID project.

[15] CETIS Assessment SIG.

