Curriculum & Leadership Journal
An electronic journal for leaders in education
ISSN: 1448-0743

SMART online formative assessments for teaching mathematics

Kaye Stacey
Vicki Steinle
Eugene Gvozdenko
Beth Price
Author contact: priceb@unimelb.edu.au

Emeritus Professor Kaye Stacey, Dr Vicki Steinle, Beth Price and Dr Eugene Gvozdenko work at the University of Melbourne, where they form the development team for the Specific Mathematics Assessments that Reveal Thinking (SMART) formative assessment system. Between them, Kaye Stacey and Vicki Steinle have deep knowledge of the research on the teaching and learning of mathematics. With the programming and web design skills of Eugene Gvozdenko and the teaching experience of Beth Price, the team has developed approximately 60 pairs of short topic tests. The work reported here began as part of a project funded by the Australian Research Council (LP0882176) and the Victorian Department of Education and Early Childhood Development (DEECD); in recent years the Catholic Education Office, Melbourne, has also been involved.


In classrooms today it is expected that a student’s learning program will be planned with their current level of understanding in mind. This is often a challenge for teachers when mathematics is the subject under consideration. The mathematics curriculum is particularly demanding, with its focus on developing increasingly sophisticated and refined mathematical understanding, fluency, logical reasoning, analytical thought and problem-solving.

Teachers may react to these demands in various ways. If a topic has been covered in earlier years, they may assume too high a level of understanding from their students before work on that topic begins in their own class. Alternatively, teachers may lower the level of challenge so that the tasks set for the bulk of the lesson are straightforward enough for anyone to tackle. This can lead to ‘shallow teaching’ (Vincent & Stacey, 2008). Teachers may also sidestep the challenge of establishing each student’s level of mathematical understanding by emphasising routine mathematical skills, which can be readily tested and for which progress can be easily monitored.

It is with all of this in mind that 'smart tests' have been developed for use in year 5 to year 9 maths classes (Stacey, Price, Steinle, Chick and Gvozdenko, 2009). This assessment tool has been designed to give teachers information about the understanding of their individual students in key mathematics topics. A smart test is a specific mathematics assessment that reveals thinking. Most commonly, smart tests focus on students’ understanding of fundamental ideas, like ratio and proportion. Some assessments also target simple skills (identifying adjacent and opposite sides of triangles, for example). Feedback to teachers consists in part of a description of each student’s ‘developmental stage’. These developmental stages are topic specific, and not linked to any overall system of development or government achievement targets. Any relevant diagnosis of a misconception for individual students is also reported to teachers, along with teaching suggestions on how these misconceptions might be eliminated and how to move students on to the next developmental stage. This information is designed to increase the pedagogical content knowledge of mathematics teachers, including those whose initial teacher training did not include mathematics. There are also links to the Mathematics Developmental Continuum, a valuable online resource made available by the DEECD to all teachers regardless of where, and in which educational system, they teach. These smart tests supplement, and certainly do not replace, other assessment tasks that teachers have developed and used over many years.

Smart tests now cover about 60 topics, with two versions of most tests. These matched pairs of tests can be used as pre-test and post-test, so teachers can track students’ progress. Teachers read descriptions of the available smart tests, choose one that is appropriate, and give students a password to access it. The students’ attempts are marked by computer and the patterns of results are electronically analysed to diagnose the developmental stage in the topic and any misconceptions and common errors (Stacey, Price & Steinle, 2012). Each student’s results, with relevant links to address the issues raised, are available to the teacher as soon as he or she logs in. Together, these components highlight the purpose of the smart tests as an ‘assessment for learning’ tool (Stacey & Wiliam, 2013).
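To give a sense of how such automated analysis might work, the short sketch below maps a pattern of item responses to a stage and a misconception label. It is a minimal illustration only: the item codes, stage rules and misconception labels are hypothetical and are not the rules used by the smart test engine.

# A hypothetical, simplified sketch of rule-based diagnosis from a pattern of
# item responses. Item codes, stage rules and misconception labels are
# illustrative only; they are not the rules used by the smart test engine.

from dataclasses import dataclass, field

@dataclass
class Diagnosis:
    stage: int                                     # topic-specific developmental stage (0 = below stage 1)
    misconceptions: list = field(default_factory=list)

def diagnose(correct, coded_errors):
    """correct: item id -> True/False; coded_errors: item id -> code for a recognised wrong answer."""
    stage = 0
    # Each stage requires the items that test it to be answered correctly.
    if correct.get("order_positives") and correct.get("order_small_negatives"):
        stage = 1
    if stage == 1 and correct.get("order_mixed_integers"):
        stage = 2
    if stage == 2 and correct.get("steps_across_zero") and correct.get("pattern_across_zero"):
        stage = 3  # ready for formal work on operations with directed number
    result = Diagnosis(stage)
    # A characteristic wrong ordering suggests negatives are being ordered by absolute value.
    if coded_errors.get("order_mixed_integers") == "negatives_by_absolute_value":
        result.misconceptions.append("orders negatives by absolute value (places -2 before -6)")
    return result

print(diagnose({"order_positives": True, "order_small_negatives": True},
               {"order_mixed_integers": "negatives_by_absolute_value"}))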


An example: Readiness for operations with directed number

Many teachers will be aware that for some students, operations with directed numbers remain a difficulty throughout their study of mathematics. Before students advance to studying operations, it is sensible to check that they understand how negative numbers are ordered and how they fit into the sequence of whole numbers.

The smart test ‘Readiness for directed number’ is intended to be used before secondary school teachers formally address operations on directed numbers in their teaching. ‘Readiness for directed number’ is a typical smart test, as it identifies both developmental stages and misconceptions. Figure 1 shows an item from this test about students’ understanding of integers, inspired by earlier ideas from Küchemann (1981). We have used the task of ordering positive and negative integers as a quick indicator of the student’s underlying conceptual understanding. The item shown in Figure 1 requires students to drag the number cards into the correct order; the cards snap into place. The precise choice of numbers in this item is important because it enables us to identify relevant misconceptions.


Figure 1. The first ‘Readiness for operations with directed number (test B)’ item probes understanding of the order of negative integers and identifies some misconceptions.
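As an illustration of how the choice of numbers can expose a misconception, the fragment below classifies one hypothetical set of dragged cards. The card values and the misconception label are our own example, not the actual item shown in Figure 1.

# Illustrative check of an integer-ordering response. The card values and the
# misconception label are hypothetical; they are not the actual test item.

CORRECT_ORDER = [-6, -2, 0, 3, 5]
# A student who believes -6 is larger than -2 (because 6 > 2) produces this order:
BY_ABSOLUTE_VALUE = [-2, -6, 0, 3, 5]

def classify(order):
    if order == CORRECT_ORDER:
        return "correct"
    if order == BY_ABSOLUTE_VALUE:
        return "misconception: negatives ordered by absolute value"
    return "other error"

print(classify([-6, -2, 0, 3, 5]))   # correct
print(classify([-2, -6, 0, 3, 5]))   # misconception flagged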

We have also used several other items to test students’ ability to move up and down a number line in preparation for operations. Students move the mercury level up and down on virtual thermometers to show temperature increases and decreases. In these items, a labelled temperature scale is provided so that students can directly count the number of steps being taken. Other items about continuing number patterns across zero have no such visual support. We know that students have other difficulties with non-integer (decimal) negative numbers, so a separate test reports on that topic.
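The arithmetic underlying the thermometer items is simply counting unit steps across zero; a minimal sketch of that counting (our own illustration, not the test software) is shown below.

# Count one-degree steps on a thermometer scale, as a student counting on the
# visual scale would. Values are illustrative only.

def count_steps(start, change):
    """List each intermediate reading for a rise (positive change) or fall (negative change)."""
    step = 1 if change >= 0 else -1
    return [start + step * k for k in range(1, abs(change) + 1)]

print(count_steps(-3, 5))    # [-2, -1, 0, 1, 2]        -> final reading 2
print(count_steps(2, -6))    # [1, 0, -1, -2, -3, -4]   -> final reading -4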

Students doing the smart test complete the items, starting with the item shown in Figure 1. By looking at the exact responses and the patterns across all of a student’s responses, it is possible to diagnose, with reasonable confidence, what the student can do and whether or not they hold any misconceptions.

The results allow teachers to identify any global problems in the whole class, or to group students for targeted teaching, and also to move students on more quickly if they demonstrate good understanding. Figure 2 shows the stages that are reported for this test.


Figure 2. Information for teachers from the ‘Readiness for directed number’ smart test.

At the start of 2013, this smart test was used by 163 year 8 students of volunteer teachers from Melbourne metropolitan Education Department schools. In this sample, 49 per cent had reached Stage 3 and so were ready for formal work on operations. Teachers were made aware of this and suggestions were given to help the other 51 per cent to reach readiness before the new topic was introduced.

While a computer diagnosis from a smart test gives good information, its value is reduced if a student misses questions or makes careless errors such as typing a wrong number or selecting a wrong option from a dropdown box. Where a student’s result surprises the teacher, the teacher can view the responses the student actually gave and perhaps follow up with a short interview.


Feedback from teachers

Nearly all teachers who responded to a voluntary questionnaire reported that they had learned something useful through using the smart tests (Steinle & Stacey, 2012). Forty-nine per cent said that they had learned something they considered valuable; only four per cent reported that they had not learned anything by using the smart tests.

The feedback also shed light on the tendencies, discussed at the start of this article, for teachers to set the level of challenge too high or too low. The researchers had expected that the test results would alert many teachers to students’ inadequate preparation for a topic, and would lead them to start their teaching at a lower level than they had planned. The questionnaires revealed that in some cases this was indeed what happened. In other cases, however, the opposite was true: teachers commented that the smart test results led them to start their teaching at a higher level than they had previously intended.

As well as informing whole class teaching, many teachers reported using the results of the smart tests to form groups of students with similar needs. The most common strategy was to target problem areas and discuss misconceptions with groups of students who had the same diagnosis.

The most transformative use of smart tests in schools has occurred when they have been used to provide data on student understanding to a team of teachers working on curriculum improvement in their school. Smart tests can also be very helpful to professional learning teams with a focus on mathematics education and formative assessment.

Smart tests can be a powerful resource for diagnosing students’ thinking. They are also easy to use and informative for teachers, which makes them an important component of the assessment for learning process. For more information about smart tests go to our website www.smartvic.com.


References

Küchemann, DE 1981, 'Positive and negative numbers', in KM Hart et al. (eds), Children's Understanding of Mathematics: 11–16, John Murray, Oxford, pp. 102–119.

Stacey, K, Price, B, Gvozdenko, E, Steinle, V & Chick, H (undated), Specific Mathematics Assessments that Reveal Thinking. Retrieved 18 November 2013 from http://www.smartvic.com/

Stacey, K, Price, B & Steinle, V 2012, 'Identifying stages in a learning hierarchy for use in formative assessment – the example of line graphs', in J Dindyal, L-P Cheng & S-F Ng (eds), Proceedings of the 35th Annual Conference of the Mathematics Education Research Group of Australasia, MERGA, Adelaide, pp. 393–400.

Stacey, K, Price, B, Steinle, V, Chick, H & Gvozdenko, E 2009, 'SMART Assessment for Learning', paper presented at the Conference of the International Society for Design and Development in Education, Cairns, Australia, 28 September – 1 October 2009. Retrieved 21 March 2013 from http://www.isdde.org/isdde/cairns/pdf/papers/isdde09_stacey.pdf

Stacey, K & Wiliam, D 2013, 'Technology and assessment in mathematics', in MA Clements et al. (eds), Third International Handbook of Mathematics Education, Springer, New York, pp. 721–751.

Steinle, V & Stacey, K 2012, 'Teachers' views of using an on-line, formative assessment system for mathematics', in Pre-proceedings of the 12th International Congress on Mathematical Education, Topic Study Group 33, 8–15 July 2012, COEX, Seoul, Korea, pp. 6721–6730.

Vincent, J & Stacey, K 2008, 'Do mathematics textbooks cultivate shallow teaching? Applying the TIMSS video study criteria to Australian eighth-grade mathematics textbooks', Mathematics Education Research Journal, 20 (1), pp. 82–107.

Key Learning Areas

Mathematics

Subject Headings

Assessment for learning (formative assessment)
Mathematics teaching
Teaching and learning
Middle schooling