With the push towards mastery learning as an instructional strategy, educators are constantly looking at data to find weaknesses in their instruction, struggling students, or problems with test questions (a.k.a. “items”). Students must achieve a level of mastery (ex: 90% on action verbs) before moving forward to new material. If students struggle, they are given additional support and re-tested.
Districts are now using “managed assessments,” the same tests given to all students, so the analytics can better pinpoint problem areas. With item analysis, educators and assessment teams can see whether a problem stems from a badly written or wrongly scored item by using two crucial metrics: item difficulty (the proportion of students who answer the question correctly) and item discrimination (the degree to which students with high overall assessment scores also answer that item correctly; low values flag problematic items).
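For readers unfamiliar with the two metrics, here is a minimal sketch using standard classical test theory formulas. The variable names and the upper/lower 27% grouping are illustrative assumptions on my part, not Schoology’s actual implementation:

```python
# Classical item analysis sketch (standard textbook formulas,
# not necessarily how Schoology computes them internally).
# Rows = students, columns = items; 1 = correct, 0 = incorrect.
responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [1, 0, 0, 0],
]

def item_difficulty(responses, item):
    """Proportion of students who answered the item correctly."""
    col = [row[item] for row in responses]
    return sum(col) / len(col)

def item_discrimination(responses, item, fraction=0.27):
    """Upper-lower discrimination index: the item's difficulty among
    top-scoring students minus its difficulty among bottom-scoring
    students. Values near 0 (or negative) flag problematic items."""
    ranked = sorted(responses, key=sum, reverse=True)
    n = max(1, round(len(ranked) * fraction))
    upper, lower = ranked[:n], ranked[-n:]
    return item_difficulty(upper, item) - item_difficulty(lower, item)
```

An item that even high scorers miss (low discrimination) is a candidate for rewriting or rescoring; an item everyone misses (low difficulty value) is a candidate for re-teaching.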
Existing item analysis tools are often spreadsheets and long tables of numbers that are hard to decipher and act on. Educators need a way to quickly assess which students are struggling and why. Item discrimination and item difficulty are also not terms most teachers know well. Currently, there is no way to filter the data by demographics to identify struggling groups (ex: Special Ed or PHLOTE/Primary Home Language Other Than English).
Goals: Create a report within Schoology’s Managed Assessment Reports tool that can be quickly understood by all types of educators (district admins, assessment teams, and instructors) to identify problematic items or struggling students in order to take necessary action. All designs and functionality must be accessible.
Expected impact: Scores on subsequent assessments will increase (as will item discrimination values). Users will re-teach areas that all students missed, rewrite questions, give resources to struggling students, and assign advanced material to students who are excelling.
While Schoology users range from students to parents to educators, the Managed Assessment Reports tool is only useful and accessible to educators. Within the educator group, there are a variety of roles and permissions.
Schema & flow
Because of the many user roles and data points, I mapped out all the possibilities. The user flow was built from mobile screen thumbnails.
Wireframes & testing
Mobile-first, always. Given our initial research and what we’d heard Schoology clients request, I created detailed wireframes for testing. After many rounds of iteration, we tested with educators over the phone using a script, sending them an InVision link and having them share their screens as they navigated.
We received insightful, useful feedback, and overall it was incredibly positive; testers were excited for the future. When they got to the crucial “Item details” page (metrics about an individual question, such as the number of students assessed, student responses and percentages, distractor rationale, etc.), we heard enthusiastic replies like, “THIS is HOT. This is what I wanted. This is why we’re going to buy.”
My product manager and I refined the definitions of item discrimination and item difficulty to be friendlier and to point users toward an action.
I was the lead designer on this project: I conducted numerous user tests, did the research, created wireframes, designed many role-based user journeys, and applied the new Schoology UI (an ongoing effort). To facilitate communication about functionality, I created a detailed annotation guide with JIRA ticket references for the developers.