Curriculum & Leadership Journal
An electronic journal for leaders in education
ISSN: 1448-0743

Using school data to inform students' learning

Daniel Balacco
Leadership Consultant, Department of Education and Children's Services SA

Educators are now awash with data. It is important that they are confident and skilled in applying this data to the decisions they make about teaching and learning, so that they become 'active players in the data-rich environment that surrounds them and incorporate a system of use for interpreting and acting on information' (Earl & Katz 2006, pp. 2–3).

This article presents emerging lessons on the effective use of data to inform teaching and learning at the system, school, class and learner levels, based on policy frameworks, inquiry processes and resources trialled by the Department of Education and Children's Services (DECS) in South Australia.

Using multiple data measures

For approximately eight years, resources from the DECS Improvement and Accountability Framework (DIAf), together with current workforce development leadership programs, have supported site leaders and regions to apply the concept of 'multiple measures' of data (Bernhardt 2002). The multiple measures model aims to ensure that all relevant data is considered for improvement purposes. It also aims to encourage educators to embrace data as a means to improve learning outcomes, rather than simply as a means of meeting accountability requirements.

The multiple measures model comprises four categories of data, including both qualitative and quantitative evidence, that are crucial for a deep understanding of the key drivers of improvement and contributing root causes:

  • Demographic data sets the context and describes the characteristics of learners, staff and the community. Sources include records of enrolments and attendance; figures for students with English as a second language, students from non-English speaking or Indigenous backgrounds, and students with disabilities; site financial data; and community information from the Australian Early Development Index (AEDI).
  • Perception data covers stakeholders' attitudes, experiences and beliefs about the teaching and learning environment. It can be obtained from sources such as opinion surveys of students, teachers and parents, focus groups and records of parental concerns.
  • Learner achievement data describes the outcomes learners are achieving as a result of teaching and learning. Sources have included SACSA Developmental Learning Outcomes, SACSA Standard Achievement Levels, NAPLAN results and teacher assessments.
  • Process data describes the practices, programs, pedagogy and policies used to deliver learning and outcomes at the learner, class and site levels. Possible data sets include observation tools, rubrics and descriptions of pedagogy, class lesson plans and curriculum plans.

For further details see the DECS Guide to Self Review, p. 4.

The multiple measures model allows various categories of data to be combined in flexible ways to help educators and leaders understand what is needed to improve results in particular areas of inquiry or evaluation, whether at the learner, class, site or regional level. For example, the question 'What teaching practices/processes achieve the most successful outcomes in science for Year 9 girls/boys?' can be answered through a combination of process, achievement and demographic data.
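As an illustration only, the kind of combination described above amounts to a simple cross-tabulation: joining process data (teaching approach) with achievement data (science results) for a demographic group (Year 9 girls). The records, approach names and scores below are entirely hypothetical.

```python
from statistics import mean

# Hypothetical records combining three of the four data categories:
# demographic (year, gender), process (teaching approach) and
# learner achievement (science score). All values are invented.
records = [
    {"year": 9, "gender": "F", "approach": "inquiry", "science": 78},
    {"year": 9, "gender": "F", "approach": "direct",  "science": 65},
    {"year": 9, "gender": "M", "approach": "inquiry", "science": 71},
    {"year": 9, "gender": "M", "approach": "direct",  "science": 69},
    {"year": 9, "gender": "F", "approach": "inquiry", "science": 82},
]

def mean_score_by_approach(records, year, gender):
    """Average science score per teaching approach for one demographic group."""
    groups = {}
    for r in records:
        if r["year"] == year and r["gender"] == gender:
            groups.setdefault(r["approach"], []).append(r["science"])
    return {approach: mean(scores) for approach, scores in groups.items()}

# Which teaching practices achieve the best outcomes for Year 9 girls?
print(mean_score_by_approach(records, year=9, gender="F"))
# prints {'inquiry': 80, 'direct': 65}
```

In practice the joins would run over a school's actual demographic, process and achievement data sets, but the principle is the same: each category of data answers a different part of the inquiry question.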

Collecting and using a large range of multiple measures across various sites can lead to data overload. To prevent this, it is important to develop a shared understanding of, and agreement on, a selected set of key measures, so that energy can be refocused on the critical step of interpreting and acting on them.

Cycles of improvement

Educational researchers have recently called for a more rigorous use of data in ways that promote an ongoing cycle of improvement (US Department of Education 2009; Bernhardt 2005). Using data within a cycle of improvement ensures that what gets collected is then analysed and used to enhance learning.

The DECS DIAf processes apply the concept of a cycle of improvement. The processes encourage regions, schools and other educational sites to collect multiple measures of data to inquire into practice and performance as part of self review and performance reporting against standards. The findings are intended to inform plans, strategies and priorities as part of improvement planning and targeted actions for intervention and support.

Using data at classroom level

Hattie (2005) suggests that data needs to be available at the classroom level to improve teachers' understanding of learners' needs. With this concept in mind, a DECS inquiry (Spencer & Balacco 2009) in a high-SES primary school set out to investigate the effective use of multiple measures of data in classrooms. The inquiry applied Bernhardt's (2002) data framework and highlighted the importance of presenting and analysing data at the classroom level to influence change and drive improvement.

It found that an extensive range of learner achievement data and demographic data about students was consistently gathered across the school in all classes. Learning data was gathered through observations, students' responses to questions in the classroom, rubrics, checklists, test scores and other measures of student work. Teachers used different instruments to identify students' differing learning styles.

The study also found, however, that perception and process data were not effectively gathered or used at the class level, and there was limited explicit connection between teachers' practices in the classroom and the school's directions and goals.

Based on these initial findings, the school trialled the use of classroom-level achievement targets, which had been developed by teachers and students as a means to build stronger connections to school improvement priorities. The school also commenced the collection of classroom perception and process data about engagement and learning in mathematics, and the school's quality teaching principles.

The findings indicated that the additional data provided valuable information for meeting particular student needs and resulted in refinements in teaching practice to support improved student achievement and engagement.

The successful practices highlighted by this inquiry were then promoted nationally and across DECS, including statewide presentations of 'SMARTa Targets: Connecting the Site Learning Plan to the Classroom' (Spencer & Balacco 2009; DECS 2008).

The inquiry reinforced the need for DECS and other education systems to develop rigorous, consistent and timely process and perception data sets coupled with support for their interpretation and use by central and regional offices.

Strategies and resources to support data use

Connecting the presentation and interpretation of data at the class level is a developing focus for information systems and resources developed by state education authorities (see, for example, DET NSW 2010). In DECS, most information systems and resources already support data provision and use at the school level, particularly for demographic data and NAPLAN data. For example, one important resource offers a set of questions designed to guide school staff in the use of NAPLAN data at the classroom level, at the year level and for the whole school. Sample questions include:

  • What NAPLAN question items are most commonly answered correctly/incorrectly by year level (or class) compared to national results and/or index category over time?
  • What does the analysis of relevant teacher based assessment data highlight for the whole school, year or class level? How does this compare with NAPLAN analysis?
  • What particular skills or aspects of the curriculum and possible teaching practices does the class/question level analysis indicate a need to focus on?
  • What are the next steps or implications for improvement planning at the whole school, year or class level groupings? Consider the implications for a whole-school approach and the processes necessary for effective planning for improvement.
  • What are the next steps or implications for teaching and learning for the whole school, year or class level? Consider what high-leverage strategies, best practice or research might suggest about possible ways to improve student learning in the areas identified.

Feedback from DECS regions and selected principals has indicated that these resources are highly valuable. However, research suggests that resources alone are not sufficient, and that effective data use in schools requires support and influence from regional and central offices to create opportunities for networking and data sharing between schools (Anderson, Leithwood & Strauss 2010; US Department of Education 2009). DECS recognised this need over 10 years ago, and has since implemented a significant number of strategies and programs covering data use, underpinned by school improvement and effectiveness research and by DECS policy frameworks. Currently, opportunities for networking are created by regions, where data expertise is now available for sites to draw on, and centrally, through a range of coaching strategies and leadership development programs.

Summary

DECS has learnt the importance of using multiple measures of data, including learner achievement data, in an ongoing cycle of improvement in order to understand what makes a difference in our preschools, schools and regions. In DECS and other jurisdictions, teacher-friendly information systems, professional learning resources and site leader expertise are beginning to emerge for presenting and analysing data at the classroom level in order to influence change and drive improvement. DECS also recognises that preschools' and schools' use of data is assisted by networking and the sharing of ideas and experience about data use. This networking is supported both centrally and by regions.

This learning has the potential to support teachers, leaders and education systems to become effective users of data for improvement purposes and to take closer control of the data-driven accountability context in which preschools, schools, regions and educational systems find themselves.

References

Anderson, S, Leithwood, K & Strauss, T 2010, 'Leading Data use in Schools: Organizational Conditions and Practices at the School and District Levels', Leadership and Policy in Schools, vol. 9, pp. 292–329.

Bernhardt, VL 2002, Data Analysis for Continuous School Improvement, Eye on Education, California.

Bernhardt, VL 2005, Using Data to Improve Student Learning: Middle Schools, Eye on Education, California.

Department of Education and Children's Services 2008, SMARTa Targets: Connecting the Site Improvement Plan to the Classroom. Accessed from http://www.decs.sa.gov.au/quality/

Department of Education and Training NSW 2010, School Measurement, Assessment and Reporting Toolkit (SMART). Accessed from http://www.schools.nsw.edu.au/learning/7-12assessments/smart/index.php

Earl, L & Katz, S 2006, Leading Schools in a Data Rich World. Accessed from http://www.eed.state.ak.us/nclb/2008wc/Focus_On_Leadership.pdf

Hattie, J 2005, 'What is the Nature of Evidence that Makes a Difference to Learning?', research conference paper, Australian Council for Educational Research. Accessed from http://www.acer.edu.au/documents/RC2005_Hattie.pdf

Spencer, K & Balacco, D 2009, 'Next Practice: What are we learning about teaching from student data', research conference paper, Australian Council for Educational Research. Accessed from http://research.acer.edu.au/research_conference/RC2009/17august/14/

US Department of Education 2009, Using Student Achievement Data to Support Instructional Decision Making. Accessed from http://educationnorthwest.org/webfm_send/1035


Subject Headings

Educational evaluation
Educational planning
South Australia
Statistics
Standards