Using Assessment Analysis Features on eConestoga


Recent updates to eConestoga mean all course shells now include three assessment analysis features visible only to faculty. These features provide valuable data on students’ performance on course assessments across an academic semester. This post explores how you can use this data to reflect on assessment pain points and successes, and respond accordingly to support students throughout the assessment process.

eConestoga’s Assessment Analysis Features

The three new assessment analysis features are: 

  1. An “Assessment Analysis” tab under “Settings” that allows you to analyze achievement metrics for students.
  • Assessment analysis metrics will flag grade items:
    • Above or below a threshold average (e.g., above 90% or below 55%)
    • With a fail rate above a certain percentage of students (e.g., 20%)
    • With no scores published within a set number of days (e.g., no scores posted more than 15 days after the assessment end date)
    • Note: The default metric thresholds have been set through consultation with Senior Leadership and can be adjusted to more effectively represent your specific assessment or course. For example, for a highly supported, small-stakes formative assessment, you might decide that more than 10% of students being unsuccessful warrants further consideration. Note also that these percentages include students receiving a grade of zero, but exclude students with no grade entered.
  2. A “Flagged Assessments” widget on the course landing page that gives faculty a quick look at assessments of concern based on the achievement metrics set under the “Assessment Analysis” tab. 
    • The widget flags high and low averages, grade items with high fail rates, and unpublished assessment scores. 
  3. A pop-out “Assessment Analysis Report” available through the “Flagged Assessments” widget that aggregates and analyzes assessment data. 
    • You can view fail rates and averages, among other data points, and choose to include or exclude zeros from the analysis. 
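To make the flagging rules concrete, here is a minimal sketch of how such metrics could be computed. This is a hypothetical illustration only: the function name, thresholds, and pass mark are assumptions for demonstration, not eConestoga's actual implementation, and the defaults in your course shell may differ.

```python
from datetime import date

# Illustrative thresholds mirroring the defaults described above
# (assumed values, not eConestoga's actual configuration).
HIGH_AVG = 90.0    # flag averages above this percentage
LOW_AVG = 55.0     # flag averages below this percentage
FAIL_RATE = 20.0   # flag when at least this % of graded students fail
STALE_DAYS = 15    # flag when no scores are published this long after the end date

def flag_grade_item(scores, end_date, scores_published, pass_mark=50.0, today=None):
    """Return the flags a grade item would raise under the sketched rules.

    `scores` holds published grades only: zeros are included, but students
    with no grade entered are absent (per the note above).
    """
    today = today or date.today()
    flags = []
    if not scores_published:
        if (today - end_date).days > STALE_DAYS:
            flags.append("unpublished")
        return flags  # no averages to compute without published scores
    avg = sum(scores) / len(scores)
    if avg > HIGH_AVG:
        flags.append("high average")
    elif avg < LOW_AVG:
        flags.append("low average")
    fails = sum(1 for s in scores if s < pass_mark)
    if 100 * fails / len(scores) >= FAIL_RATE:
        flags.append("high fail rate")
    return flags
```

For example, a class set of `[0, 40, 48, 95, 72]` averages 51% and has three failing grades out of five, so it would raise both a low-average and a high-fail-rate flag; excluding the zero from the list would change both results, which is why the report lets you toggle zeros in and out.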

Reflect, Analyze, Adjust 

The assessment analysis data for your course offers an opportunity to reflect on assessments and adjust teaching and feedback strategies as needed. 

Here are some steps to guide in the process:

  1. Using the “Flagged Assessments” widget, review flagged items and identify any potential sources of concern. For example, has an assessment with a high fail rate been flagged? Or an assessment whose average score is well below the set threshold? 
  2. If students have struggled with a particular assessment, consider:
    • Were they assessed on concepts that were not taught? How might you rectify this in future?
    • Were there sufficient opportunities for students to practice and receive feedback before the assessment? Consider how you might incorporate more activities or practice opportunities related to this (and other) assessments into future lessons.
    • Were examples of successful assignments provided? Exemplars and examples from previous students can help students better understand expectations and parameters.
    • Was the rubric or marking scheme shared with the students before the assessment was started, and were they given adequate opportunity to ask questions in order to fully understand how they would be marked?  
    • Were assessment instructions unclear or questions poorly written? If so, in fairness to students, you could consider discounting those questions from the assessment grade. 
    • Did students prepare inadequately? If so, why? Could it be they did not see value in the assessment (e.g., its relative weight in the grading scheme is too low)? Was adequate preparation provided in class? For level 1 students, do they know how to effectively study and prepare for that type of assessment? Consider what you could do to support this important skill set.
    • Was attendance/completion of the assessment so poor that most students failed?
    • Consider whether you are grading appropriately to the level and in line with Conestoga’s Grading Procedure.
  3. If you created the assessment:
    • Review the clarity of the assessment (e.g., wording of questions). 
    • Survey students on their experience completing the assessment: Where were they most challenged?  
    • Gauge the perceived value of the assessment for students. Did the assessment matter to them? If not, why?
    • Identify trends in student performance across the assessment. For example, was there a high fail question or section?
    • Verify that the assessment aligns with what is being instructed (in amount, content, level of learning outcomes). 
    • Re-examine the rubric or other grading tool: Does it effectively convey expectations for the assignment? 
    • Discount grades if certain components of the assessment are determined to be unfair.
    • Consider whether external factors, like high rates of absenteeism, are skewing grades. 
    • Implement scaffolded preparation for subsequent assessments, such as small, no-stakes practice assessments. 
    • Provide formative feedback so that students can adjust their learning approach and improve on subsequent assessments. 
    • In preparing for subsequent assessments, revisit learning from the assessment of concern with students to improve study skills.
  4. If the assessment was created by someone else: 
    • Summarize the quantitative data that is cause for concern (e.g., high fail rates, unusually high averages, large numbers of non-submissions).
    • Consider reaching out to the person(s) who created or were involved in creating the assessment to share your feedback – perhaps others who teach the same course are experiencing similar concerns. If you are unsure who created the assessment, or are uncomfortable reaching out to them, consider sharing your feedback on that particular assessment with your Chair and asking for confidential guidance.
    • You can supplement the quantitative data with a summary of relevant qualitative data regarding student performance on the assessment in question. What trends emerged in your feedback to students? What patterns do you observe across student work on this assessment? 

Ada Sharpe

Ada Sharpe, Ph.D. (English and Film Studies), has worked in faculty and support staff roles in the post-secondary sector for over a decade. She has taught and researched in literary studies and writing studies and co-led a university writing centre. Ada specializes in understanding how assessment shapes the teaching and learning experience for faculty and students.
