From EdFutures

TEST stands for Technology Enhanced Summative Testing.


Where accountability is linked to assessment (i.e. to students' performance), summative assessment becomes the major driver of practice in schools (see 'The assessment problem' on the Schome website for an analysis of some of the issues, leading to a suggestion that Social Learning Analytics might be the answer).

Inspired by the work of Andrew Fluck in Tasmania on the eExam system, TEST aims to establish on-screen testing in place of paper-based examinations for high stakes assessments in England.

Some evidence

[Image: Exam room]

Paper-based tests cannot assess the new disciplinary knowledge and the new skills and competences that digital technology makes possible, which is why Luckin et al (2012 p.64) observed that “Much existing teaching practice may well not benefit greatly from new technologies”. Referring to paper-based assessment, Jencks (1972, in Gardner 1999 p.97) noted that “there is ample evidence that formal testing alone is an indifferent predictor for success once a student has left school”; this lack of authenticity has been exacerbated by the impact that digital technology has had on society. On-screen testing is being adopted in many sectors, including HE and FE, but has had very little impact on high stakes assessment in schools in the UK (though it is being implemented in the USA, Finland and Australia).

Twining et al (2006 p.20) discussed problems with current (paper-based) assessment models in their report for Becta on the Harnessing Technology Strategy:

There is strong support for the view that we are using the wrong ‘measures’ when it comes to evaluating the benefits of ICT in education (for example Hedges et al 2000; Heppell 1999; Lewis 2001; Loveless 2002; McFarlane 2003; Ridgway and McCusker 2004). Heppell has long argued that the ways in which we assess learning that has been mediated by ICT is problematic, and has illustrated the problem using the following analogy:
Imagine a nation of horse riders with a clearly defined set of riding capabilities. In one short decade the motor car is invented and within that same decade many children become highly competent drivers extending the boundaries of their travel as well as developing entirely new leisure pursuits (like stock-car racing and hot rodding). At the end of the decade government ministers want to assess the true impact of automobiles on the nation’s capability. They do it by putting everyone back on the horses and checking their dressage, jumping and trotting as before. Of course, we can all see that it is ridiculous.
(Heppell 1994 p.154)

It seems clear that current assessment practices do not match well with the learning that ICT facilitates (for example Ridgway and McCusker 2004; Venezky 2001) and there is strong support for the need to change how we assess learning in order to rectify this problem (for example Kaiser 1974; Lemke and Coughlin 1998; Lewin et al 2000; McFarlane et al 2000; Barton 2001; ICTRN 2001; Trilling and Hood 2001).

Assessment is linked to accountability in schools, which makes assessment a key driver of schools’ behaviours and practices. Evidence from the 22 Vital Case Studies in English schools (Twining 2014a) and the 13 Snapshot Studies in Australian schools (Twining 2014b) indicates that current (paper-based) high stakes assessment inhibits the use of digital technology in schools. Schools that were making extensive use of digital technology reported a mismatch between what they were doing and what their students would be assessed on. As a result, they reduced their use of digital technology and moved back to traditional forms of teaching about three months before high stakes national assessments. At the most basic level this was to allow their students to develop their handwriting muscles so that they could write for up to six hours per day during the exam period. More fundamentally, they needed to teach their pupils how to write exam answers on paper, as the process differs from word-processing answers (even if the spell checker and grammar checker are disabled). This provides a disincentive for schools to invest in digital technology (as it will not be fully utilised for a high proportion of the time).

A great deal of work on computer-based assessment in schools took place in England prior to 2010. However, there has been little growth in on-screen testing in schools since then. This reflects changes in the political and regulatory climate that increased the risk to exam boards of continuing to work in this area.

What are we doing

We are liaising with relevant bodies (e.g. DfE, Ofqual, Exam Boards, researchers, teachers, students) to develop an understanding of the key issues and start to identify possible solutions. This includes work under the auspices of ETAG.

Some of the challenges

There are a wide range of challenges to moving towards on-screen assessment in schools. These include (but are not limited to):

  • Policy issues (e.g. a focus on 'terminal' rather than 'when ready' testing)
  • Regulatory barriers (Ofqual)
  • Concerns about validity of on-screen assessments
  • Concerns about the need for pedagogical and curriculum changes (so that you are doing the things that on-screen assessment can test)
  • Practical and logistical problems in schools (e.g. insufficient digital technology infrastructure)
  • Commercial concerns (e.g. 'first mover' risks for Exam Boards; challenges linking in with existing systems)
  • Inertia in the system (which might be framed as lack of strategic leadership)

Possible solutions

On-screen assessment is well established in many sectors (almost all sectors other than schools, within England at least). Indeed, some countries have already moved to on-screen assessment in schools (e.g. Lithuania, Slovenia, and Georgia), while others are moving in that direction (e.g. the USA for the Common Core; Finland; parts of Australia, specifically Tasmania). From 2015 the PISA tests will all be taken on-screen, and new PISA tests being introduced from 2018 will take account of the digital environment (read the TES Connect article).

Pearson (one of the two big suppliers of on-screen testing services in the UK) recently published a report (Barber & Hill 2014) in which they argue that current assessment methods are no longer working, and set out a proposal for how to move to on-screen testing in schools (Download the full report or read this brief summary).

Our approach

[Image: Exam hall]

Based on work by Andrew Fluck on his eExam System, we are adopting an approach that has two separate components:

  • a technical solution
  • a change management pathway

The change management pathway

This is the most critical element. Traditionally, proponents of on-screen testing have focussed on the advantages it offers, which foregrounds the scale of the change involved. Our approach, building on Andrew Fluck's work, is to break the process of moving from paper-based to on-screen testing into smaller steps:

  1. Paper based (everyone does the exam on paper)
  2. Paper-replication (some people do the exam on paper, whilst others do 'the same' exam on-screen) - this reduces the need for multiple computers, allows a Bring Your Own (BYO) approach, and provides a failsafe (reverting to answering on paper) if there is a technical failure
  3. On-screen replication (everyone does the exam on-screen, but it would be possible to revert to providing answers on paper if there were a technical failure)
  4. On-screen (everyone does the exam on-screen, and full use is made of the potential of on-screen assessment)

This route allows a gradual build-up of the necessary robust digital technology infrastructure and of the associated logistics, expertise, and confidence in the new assessment model(s).

Get involved

If you are interested in on-screen assessment in schools then let us know (for example by posting a message in the discussion forum for this page) and share your experiences within EdFutures.net (e.g. by editing or adding to this page, or creating other pages on the topic).