Assessment Data “Pain Points” for Educators
1. Simplicity / Training / Learning
Problem: Having to learn and use a different system to obtain results for each assessment.
Solution: Pre-built Report Library.
We do the work, on the customer’s behalf, to pre-build an assortment of the most-used reports, and we deliver them to each stakeholder’s MMARS™ Report Inbox. There is no training required. And, MMARS™ reports all annual and periodic assessments in a similar fashion – so all test data are in a customer’s MMARS™ Assessment Data Warehouse. One tool, one location, one method for retrieval, similar and understandable report formats. For digging deeper, MMARS™ users can ask us for additional reports, or they can build them, using our simple DIY who/what/when/where interface.
2. Speed / Delivery
Problem: Not having access to the data soon enough to implement effective changes.
Solution: Automated, iterative, efficient "batch" reporting.
Once the student results data are available electronically from the publisher, we can load the data into MMARS™ within 3 days (typically faster). We do not just load the data; we organize, pre-build, and deliver a thorough set of summary and pupil reports for each stakeholder, in each distinct audience. Administrators and teachers can be using the data almost immediately. There is no waiting on the publisher or the State to reveal information.
3. Student Growth
Problem: Most charts/reports show only achievement. They contain no clear measurement or quantification of the ± growth of students, groups, and subgroups.
Solution: MMARS FPL™ and DFS reports.
MMARS™ computes year-over-year and term-to-term growth at the per-pupil level. For any scale-score-based test, we also compute this fractionally – “between the bands”. Each student gets a growth score that clearly articulates their growth, to the hundredth of a performance level (change in Fractional Performance Level™ – ∆FPL™). We also show the change in each pupil’s Distance from Standard (∆DFS), the measurement that is the foundation of the CA School Dashboard. These scores are reported at the pupil level and are also aggregated into various growth thresholds, producing summary improvement/decline reports for groups and subgroups.
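To make the idea concrete, here is a minimal sketch of how a fractional performance level and a change in Distance from Standard could be computed. The actual MMARS™ computation is proprietary; the band cutoffs and the standard-met score below are invented for illustration, not real test values.

```python
# Hypothetical sketch of the Fractional Performance Level (FPL) idea.
# Band cutoffs and the standard-met score are illustrative only.

from bisect import bisect_right

# Illustrative performance-band cutoffs (scale scores), lowest to highest.
# Band i spans BANDS[i]..BANDS[i+1] and corresponds to performance level i+1.
BANDS = [2300, 2400, 2500, 2600, 2700]

def fractional_level(score: float) -> float:
    """Whole performance level plus fractional position within the band."""
    i = bisect_right(BANDS, score) - 1
    i = max(0, min(i, len(BANDS) - 2))          # clamp to defined bands
    low, high = BANDS[i], BANDS[i + 1]
    return round((i + 1) + (score - low) / (high - low), 2)

def delta_fpl(prior: float, current: float) -> float:
    """Growth, to the hundredth of a performance level (∆FPL)."""
    return round(fractional_level(current) - fractional_level(prior), 2)

def delta_dfs(prior: float, current: float, standard: float = 2500) -> float:
    """Change in Distance from Standard (∆DFS), year over year."""
    return round((current - standard) - (prior - standard), 2)

# A pupil moving from 2450 to 2520 gains 0.7 of a performance level
# and improves their Distance from Standard by 70 scale-score points.
growth = delta_fpl(2450, 2520)
dfs_change = delta_dfs(2450, 2520)
```

Per-pupil results like these can then be bucketed into growth thresholds to produce the summary improvement/decline reports described above.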
4. Comparative Data
Problem: No access to comparative data (by year/school/grade/teacher/subgroup/student).
Solution: MMARS Jux™.
Jux™ is our short form for “juxtaposition”. We deliver a series of side-by-side comparisons – by year, school, grade, teacher, subgroup, even by individual student. The teacher reports, when used term-to-term or spring-to-spring (for annual tests), clearly show the Instructional Impact. Jux™ can reorient the presentation format, so you can see the data just the way you want.
Questions/needs/concerns, by audience, with references to the solutions above.
School Board (liaison to the media & public)
Are we growing? Where? Are we serving all subgroups effectively? (3: Growth, 4: Comparisons)
Superintendents and districtwide administrators (asst supts, directors, etc.)
Need to coordinate delivery of useful reports to principals and teachers, in a timely fashion. (2: Speed, 1: Simplicity)
Need comparative results to identify trends by year, school, grade level, subgroups. (3: Growth, 4: Comparisons)
Must identify curriculum / systemic issues to direct and allocate resources. (3: Growth, 4: Comparisons)
Principals and site administrators
Need to coordinate review of reports with teachers, in a timely fashion. (2: Speed, 1: Simplicity)
Need teacher effectiveness data, showing instructional impact. (3: Growth, 4: Comparisons)
Must identify best-known methods, share them, and guide professional development where indicated. (3: Growth, 4: Comparisons)
Teachers
No time to learn a new reporting system (should be teaching). Need reports “pushed” to them. (1: Simplicity)
Need data while it’s fresh, so instructional intervention can be effective. (2: Speed)
Need at-risk pupil data to deliver or outsource remediation. (3: Growth)
Need comparative results to differentiate instruction, including remediation and enrichment. (3: Growth, 4: Comparisons)