Assessment Analysis


Assessment Analysis allows for deeper analysis of the school’s external results.



Performance Progress - Growth Tool



  1. The blue menu bar enables the user to move between the areas within Assessment Analysis. The pages refresh as the user moves between tabs, so the selected assessment item does not carry over from one tab to the next. The first tab looks at Performance Progress, which indicates the degree to which students have improved from one assessment item to the next.
  2. Select the cohort, result period and subject for analysis.
    For instance, Year 10 → Semester One 2020 → Mathematical Methods.
    Select the provider, section and data type.
    For instance, NAPLAN → Numeracy → Numeracy Score.
    In this example, the user is looking at the Year 10s of Semester One 2020 (who studied Mathematical Methods). They are analysing the degree to which these students’ Numeracy scores improved over time.
  3. Click Create Report.


  4. The drop-down menus above the chart indicate the initial and latest assessment item being used to measure improvement.
    In the example shown:
    o The initial assessment item is their Year 7 test (which they sat in 2017); and
    o The latest assessment item is their Year 9 test (which they sat in 2019).
  5. The vertical axis measures effect size (improvement from one test to the next). A positive effect size signals improvement in performance, whilst a negative result shows decline.
    (For additional information on the effect size calculation, please contact TrackOne Studio).
    o The solid red line shows the average effect size (at the national level).
    o The dotted red line shows the average effect size (at the school’s level).
  6. The horizontal axis indicates the student’s result on the latest assessment item.
    In the example shown, this would be the student’s Year 9 Numeracy scale score.
    o The solid black line shows the average scale score (at the national level).
    o The dotted black line shows the average scale score (at the school’s level).


  7. Students in the green quadrant have scored:
    o Above the national average in terms of their scale score; and
    o Above the national average in terms of their effect size.
    These results are pleasing. The scale scores are above average and they have improved across the two tests.
  8. Students in the yellow quadrant have scored:
    o Above the national average in terms of their scale score; but
    o Below the national average in terms of their effect size.
    These results can often go unnoticed. Whilst the scale scores are above the national average, the improvement across the two tests has been below average.
  9. Students in the blue quadrant have scored:
    o Below the national average in terms of their scale score; but
    o Above the national average in terms of their effect size.
    These results show promise. The scale scores may be below average; however, these students have improved significantly across the two tests.
  10. Students in the red quadrant have scored:
    o Below the national average in terms of their scale score; and
    o Below the national average in terms of their effect size.
    These results are concerning. The scale scores are below average and the improvement across the two tests has also been below average. (A brief sketch of this quadrant logic follows this list.)
  11. Summary statistics for the assessment items are listed below the graph.
  12. Individual student results (on the latest assessment item) are then listed in the next table down. These results can be exported using the green Excel icon in the top right-hand corner of this section.
  13. These results can be sorted in ascending or descending order.
  14. Clicking on an individual bubble will condense these results to those of a single student.
  15. Clicking on a student’s ID number will open their transcript in a separate tab.
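

The quadrant colours above follow a simple rule: compare each student’s latest scale score and effect size with the corresponding national averages. The sketch below is purely illustrative; it is not TrackOne Studio’s implementation, the student data and national averages are hypothetical, and it assumes (for simplicity) that a result equal to the national average counts as “above”.

```python
# Illustrative sketch of the quadrant logic described above.
# NOT TrackOne Studio's code; all figures and field names are hypothetical.

from dataclasses import dataclass

@dataclass
class StudentResult:
    student_id: str
    latest_scale_score: float   # e.g. Year 9 Numeracy scale score
    effect_size: float          # improvement from the initial to the latest test

def quadrant(result: StudentResult,
             national_avg_score: float,
             national_avg_effect: float) -> str:
    """Return the quadrant colour for a student, per the definitions above."""
    above_score = result.latest_scale_score >= national_avg_score
    above_effect = result.effect_size >= national_avg_effect
    if above_score and above_effect:
        return "green"    # above-average score, above-average improvement
    if above_score:
        return "yellow"   # above-average score, below-average improvement
    if above_effect:
        return "blue"     # below-average score, above-average improvement
    return "red"          # below-average score, below-average improvement

# Hypothetical students and hypothetical national averages
students = [
    StudentResult("1001", 620.0, 0.55),
    StudentResult("1002", 540.0, 0.10),
]
for s in students:
    print(s.student_id, quadrant(s, national_avg_score=583.0, national_avg_effect=0.40))
```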


Performance Progress - Comparison Tool


Now select the Comparison Tool tab. This tool allows teachers to compare student results between two assessments (the initial assessment and the latest assessment). In the current example, the user is comparing the Year 10 Mathematical Methods students’ results on the NAPLAN Numeracy test they sat in 2017 (Year 7) with their results on the Numeracy test they sat in 2019 (Year 9).



Each student in the class is indicated by a blue dot, with the vertical axis showing the latest assessment result and the horizontal axis showing the initial assessment result. The black line indicates the trend between the two assessment items for the cohort.
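

For readers who want to see the idea behind the trend line, the sketch below fits a straight line through hypothetical (initial, latest) score pairs. It is a generic least-squares fit, not necessarily the method the application itself uses.

```python
# A minimal sketch of the kind of trend line shown in the Comparison Tool.
# NOT TrackOne Studio's code; the scores below are hypothetical.

import numpy as np

# (initial score, latest score) for each student, e.g. Year 7 and Year 9 Numeracy
initial = np.array([480.0, 510.0, 545.0, 560.0, 600.0])
latest = np.array([520.0, 555.0, 580.0, 610.0, 645.0])

# Fit a straight line: latest ≈ slope * initial + intercept
slope, intercept = np.polyfit(initial, latest, deg=1)

# A point on the trend line: the predicted latest score for a given initial score
predicted = slope * 530.0 + intercept
print(f"trend line: latest = {slope:.2f} * initial + {intercept:.1f}")
print(f"expected latest score for an initial score of 530: {predicted:.0f}")
```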


Cohort Performance Over Time


The second tab on the blue menu bar is Cohort Performance Over Time, which charts a cohort’s performance on a particular assessment item over the years.



  1. Use the blue horizontal menu bar to move to Cohort Performance Over Time.
  2. Select the cohort, result period and subject for analysis.
    For instance, Year 9 → Semester One 2020 → Mathematics.
    Select the provider, section and data type.
    For instance, NAPLAN → Numeracy → Numeracy Score.
    In this example, the user is looking at the Year 9s of Semester One 2020 who were enrolled in Mathematics. They are charting how these students’ Numeracy scores have changed over time (a brief sketch of this kind of year-by-year averaging follows this list).
  3. Click Create Report.
  4. There is the option to overlay additional assessment item results (from the same Provider). In the example above, the user may choose to overlay Spelling, Grammar, Writing or Reading results.
  5. There is also the option to overlay State and National average results for a particular assessment item. In the example above, the user may choose to overlay State and National average Numeracy Score results.
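

As a rough illustration of the charting described in step 2, the sketch below averages hypothetical Numeracy scores by sitting year to produce one point per year. It is not the application’s own calculation; the records are invented for the example.

```python
# Illustrative year-by-year averaging for a cohort's chart over time.
# NOT TrackOne Studio's code; all records are hypothetical.

from collections import defaultdict

# (sitting year, Numeracy scale score) for the students in the selected cohort
records = [
    (2015, 498.0), (2015, 512.0), (2015, 530.0),
    (2017, 541.0), (2017, 555.0), (2017, 570.0),
    (2019, 579.0), (2019, 590.0), (2019, 605.0),
]

scores_by_year = defaultdict(list)
for year, score in records:
    scores_by_year[year].append(score)

# One chart point per year: the cohort's average score for that sitting
for year in sorted(scores_by_year):
    scores = scores_by_year[year]
    print(year, round(sum(scores) / len(scores), 1))
```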


Academic Comparison


The third tab on the blue menu bar is Academic Comparison, which plots a cohort’s performance in a particular subject against their performance on an external assessment item.



  1. Use the blue horizontal menu bar to move to Academic Comparison.
  2. Select the cohort, result period and subject for analysis.
    For instance, Year 10 → Semester One 2020 → Mathematical Methods.
    Select the provider, assessment item and data type.
    For instance, NAPLAN → Year 9 2019 → Numeracy score.
    In this example, the user is looking at the Year 10s of Semester One 2020 who were enrolled in Mathematical Methods. They are plotting their Mathematics results against their Year 9 2019 NAPLAN Numeracy scores.
  3. Click Create Chart.
  4. The trend line indicates the expected internal and external results. In the example shown, a student who received a B- in Mathematics should have received a NAPLAN Numeracy score of approximately 720.
    Students above the trend line are performing better than expected externally.
    Students below the trend line are performing worse than expected externally. (A brief sketch of this trend-line reading follows this list.)
  5. The students’ results are listed in the corresponding table.
  6. The size of the bubble indicates the number of students. Clicking on an individual bubble will reduce the table to the students within that bubble.
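

To make the trend-line reading in step 4 concrete, the sketch below maps hypothetical internal grades to numeric points, fits a straight line against hypothetical external scores, and flags each student as above or below the trend. The grade-to-points scale and the least-squares fit are assumptions for the illustration, not the application’s actual method.

```python
# Illustrative sketch of comparing internal grades with external scores via a trend line.
# NOT TrackOne Studio's code; the grade scale and all results are hypothetical.

import numpy as np

# Assumed mapping of internal grades to numeric points so a line can be fitted
GRADE_POINTS = {"E": 1, "D": 2, "C-": 3, "C": 4, "C+": 5,
                "B-": 6, "B": 7, "B+": 8, "A-": 9, "A": 10}

# (internal Mathematics grade, external NAPLAN Numeracy score) per student
results = [("C", 620.0), ("C+", 650.0), ("B-", 700.0), ("B", 735.0), ("A-", 780.0)]

x = np.array([GRADE_POINTS[grade] for grade, _ in results], dtype=float)
y = np.array([score for _, score in results])
slope, intercept = np.polyfit(x, y, deg=1)

for grade, score in results:
    expected = slope * GRADE_POINTS[grade] + intercept
    status = ("above trend (better than expected externally)"
              if score > expected
              else "below trend (worse than expected externally)")
    print(f"{grade}: actual {score:.0f}, expected {expected:.0f} -> {status}")
```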


Performance At Risk


The fourth tab on the blue menu bar is Performance At Risk, which identifies those students who have gained or lost a certain number of points on an external assessment item.



  1. Use the blue horizontal menu bar to move to Performance At Risk.
  2. Select the provider, as well as the earliest and latest assessment item.
    For instance, NAPLAN → Earliest: Year 7 2017 → Latest: Year 9 2019.
  3. Select whether the application should return students who have gained or lost points on the chosen assessment item. Then select the number of points.
  4. Select the section (e.g. Numeracy) and the data type (e.g. Numeracy score).
    In the example above, the application is returning students who gained at least 50 NAPLAN Numeracy scale score points (between sitting the test in Year 7 2017 and Year 9 2019). A brief sketch of this kind of threshold filter follows this list.
  5. Clicking on an individual student’s ID will produce their transcript below.
  6. These results may be exported to Excel.
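

As an illustration of the gained/lost filter described in steps 3 and 4, the sketch below flags students whose score change between the earliest and latest test meets a chosen threshold. The student IDs and scores are hypothetical and this is not TrackOne Studio’s implementation.

```python
# Illustrative sketch of the Performance At Risk threshold filter.
# NOT TrackOne Studio's code; all IDs and scores are hypothetical.

# (student ID, earliest score, latest score), e.g. Year 7 2017 and Year 9 2019 Numeracy
results = [
    ("1001", 520.0, 585.0),
    ("1002", 560.0, 565.0),
    ("1003", 610.0, 595.0),
]

def students_at_risk(results, direction: str, points: float):
    """Return students who gained (or lost) at least `points` between the two tests."""
    flagged = []
    for student_id, earliest, latest in results:
        change = latest - earliest
        if direction == "gained" and change >= points:
            flagged.append((student_id, change))
        elif direction == "lost" and change <= -points:
            flagged.append((student_id, change))
    return flagged

# Mirrors the example above: students who gained at least 50 scale score points
print(students_at_risk(results, direction="gained", points=50))
```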