
Learning Data Framework

The College of Education and Human Development (CEHD) Learning Data Framework was created to help CEHD instructors use learning data to answer instructional questions. The guide provides a practical framework instructors can use to identify questions about an area of their program, course, or teaching practice, and to structure their coursework so that it yields interpretable results.

 

Successfully Use Learning Data

[Figure: the five Learning Data Framework steps forming a continuous circle]

  1. Identify an instructional question
  2. Identify potential data sources that will inform the question
  3. Assure course design will capture information needed
  4. Interpret learning data
  5. Take Action!


Each of these steps is outlined in more detail in the following materials. The framework was shaped by a pilot conducted in CEHD that paired instructors’ deep knowledge of their content and students with the data visualization tools available in the university’s LMS, Canvas. The pilot set out to determine how instructors use learning data to ask and answer questions about their teaching practice and, as the semester progresses, to shape interventions aimed at improving student achievement. Each step includes specific insights or examples from this pilot in its ‘Getting to …’ section.

Step 1. Identify an Instructional Question

While it may be valuable to simply review learning data looking for patterns or anomalies, data becomes more meaningful to instructors when it addresses questions they have about their teaching practice, course, program, or activities. 

  • Determine if your question applies to a program, course, or assignment. 
  • Decide if your question is descriptive, diagnostic, predictive, or prescriptive. 
  • If your question is nebulous, break it down into component parts.
  • If your question has multiple components, determine if any one aspect is dependent on another.  
  • Try to rank or order the questions; doing so can surface what is most important to you right now.
  • Articulate your question in a sentence.
  • Identify whether you can act on the answer to your question during the semester or only after it ends.

Getting to Step 1

“How do students participate in the course?”  Several instructors in our pilot found it difficult to define and narrow their focus. A number of them, for example, had questions related to student participation, but in articulating those questions, most turned out to be interested in specific types of participation. Breaking the idea into component parts and articulating it in a single sentence allowed for more actionable next steps: “How does the timing of assignments affect student participation?”

Step 2. Identify Potential Data Sources

Learning data can be collected from a multitude of sources. In some cases there may not be enough data, or the right type of data, to address a particular question. Availability of data is constantly changing, so be sure to reassess data sources periodically.

  • Identify the basic components of the data needed. 
  • Determine at what level the data might be collected: assignment, course, program, etc.
  • Identify if or how you may gain access to the data needed. This is particularly important if your question relates to a program or curriculum. 
  • Determine what actions students take to create the data that interests you.
  • Determine if the data you need is collected and stored automatically or if you must create a way to collect it.
  • Identify a means to capture data, be it an LMS, a spreadsheet, a form, a survey, etc. 
  • When designing a form or survey to collect data, work backwards: decide what data is ultimately needed in a report, then build the corresponding questions.
  • Review what data is available within your LMS. If you are using Canvas, review this Canvas Data overview.
  • When designing a form or survey, the more specific and granular your questions, the more easily the data can typically be reviewed and used.

Example: 

“What is the semester you are participating in the pilot?” Fall

“What is the year you are participating?” 2018

Yields more distinct data than:

“What is the semester and year you are participating in the pilot?” Fall 2018
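To see why, consider how each version looks once responses are tabulated. Below is a minimal sketch in Python using pandas (the column names and values are hypothetical, invented for this illustration):

import pandas as pd

# Two granular questions yield two clean, typed columns
granular = pd.DataFrame({
    "semester": ["Fall", "Fall", "Spring"],
    "year": [2018, 2018, 2019],
})

# Filtering and grouping are straightforward
fall_2018 = granular[(granular["semester"] == "Fall") & (granular["year"] == 2018)]
print(len(fall_2018))  # 2

# One combined question yields free text that must be parsed first,
# and respondents rarely format it consistently
combined = pd.DataFrame({"term": ["Fall 2018", "fall, 2018", "F18"]})

The granular version can be filtered or grouped immediately; the combined version requires cleanup before any of that is possible.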

Getting to Step 2

“Does increased communication with students who are struggling influence their performance?”  It was helpful first to identify the basic components of the data needed: in this case, a measure of communication and a measure of student performance within a span of time. A survey of students might also yield data that helps address this question. In preparation for the pilot kick-off, we mapped known instructor questions to potential data sources based on the capabilities of the learning management system in use, and we used these mappings to guide the initial conversations with instructors.
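As one illustration of mapping a question to a source: Canvas’s Analytics API exposes per-student activity summaries that can serve as a rough participation measure. A minimal sketch in Python (the instance URL, token, and course ID are placeholders; confirm with your institution that this endpoint is enabled for your role):

import requests

BASE = "https://canvas.example.edu/api/v1"  # placeholder instance URL
TOKEN = "YOUR_API_TOKEN"                    # generated under Canvas account settings
COURSE_ID = 12345                           # placeholder course ID

# Course-level summaries: page views and participations per student
resp = requests.get(
    f"{BASE}/courses/{COURSE_ID}/analytics/student_summaries",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

for student in resp.json():
    print(student["id"], student["page_views"], student["participations"])

Output like this supplies one half of the data needed (an activity measure); gradebook scores over the same span of time supply the performance measure.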

Step 3. Assure Course Design Will Capture Information Needed

Courses are often designed to suit a particular content area and an instructor’s style, which means no two courses are alike. When designing a course with learning data review in mind, the following recommended practices will help:

  • Imagine how data might look in your course after setting up all of your materials.
  • Assure assignments have due dates. In most situations, the LMS will provide a timestamp, which allows you to see patterns in the data (see the audit sketch after this list).
  • Only activity on materials hosted in the LMS can be collected by the LMS. Readings in Google, a library course pack, or external videos and links, for instance, won’t provide information in the LMS’s data reporting tool.
  • Consider using modules to organize information. Modules let you see the components of data collection across your course sessions at a glance.
  • Allow time for both pre- and post-surveys, quizzes, or questionnaires if you seek to measure change over time.
  • Seek the assistance of an instructional designer and/or data analyst.
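One of these practices can be checked mechanically: whether every assignment has a due date. A minimal sketch using the standard Canvas assignments endpoint (the URL, token, and course ID are placeholders):

import requests

BASE = "https://canvas.example.edu/api/v1"  # placeholder instance URL
TOKEN = "YOUR_API_TOKEN"
COURSE_ID = 12345  # placeholder course ID

resp = requests.get(
    f"{BASE}/courses/{COURSE_ID}/assignments",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"per_page": 100},
)
resp.raise_for_status()

# Assignments without a due date won't produce the timestamps
# needed to see patterns across the semester
for assignment in resp.json():
    if assignment["due_at"] is None:
        print("Missing due date:", assignment["name"])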

Getting to Step 3

Instructors deliberately design courses with access to materials in mind. Many limit access to coursework weekly and restrict access to future course materials. Yet these same instructors often note that their students would rather see the course materials in full and submit assignments before the official due date when they are able. This paradox led several instructors to ask, “Does allowing early submission of assignments adversely affect student grades?” To answer this question, course design was important. Pilot instructors worked with an instructional designer, used course modules within the LMS, used firm (as opposed to flexible or differentiated) due dates, and, most importantly, published their courses in full so they could review how students interacted with upcoming content and whether early submission affected overall grades.

Step 4. Interpret Learning Data

Large tables of data can be a treasure trove of information, but they can also be overwhelming. Data that has been transformed into a visual of some kind is often easier to read and understand, but deep understanding may take time and a little research. Knowing your content, teaching style, and students well will help in the analysis of the data made available to you; context is vital in determining what the data means and how it might be used. When interpreting data, keep the following in mind:

  • Start with data you feel comfortable interpreting, are more familiar with, or for which you have a very firm expectation of the outcome.
    • Example: individual assignment completion compared to individual grade. What does that look like in the data or data visualizations available? (A sketch follows this list.)
  • Practice data interpretation in real time. Looking at data every week or two as the semester progresses makes it easier to interpret what is happening because you are situated within the exact scenario.
  • Look for anomalies. Anomalies tend to make the overall pattern of data clearer, and attempting to explain an anomaly may give greater insight into the whole.
  • Invite a colleague or instructional designer who isn’t as familiar with the course to review as well; they may have a different interpretation that is insightful.
  • Seek expert advice: many colleges have services that provide assistance in visualizing large data sets.
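To make the completion-versus-grade example concrete, here is a minimal sketch assuming a gradebook export in CSV form (the filename and column names are hypothetical):

import pandas as pd

# One row per student; assignment columns hold scores, blank if unsubmitted
df = pd.read_csv("gradebook_export.csv")  # placeholder filename

# Completion: how many assignments each student actually submitted
assignment_cols = [c for c in df.columns if c.startswith("assignment_")]
df["completed"] = df[assignment_cols].notna().sum(axis=1)

# A strong positive correlation is the firm expectation to check against
print(df[["completed", "final_grade"]].corr())

If the correlation turns out far weaker than anticipated, that gap itself becomes a lead worth investigating.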

Getting to Step 4

Instructors in the pilot were first asked to review learning data from a previous course offering and try to answer their instructional question. In many cases, a lack of context (what was happening at that exact time) or unfamiliarity with the data visualizations meant instructors were less confident in the ‘answer’ the data was giving them. To alleviate this, instructors were asked to check in on their learning data periodically throughout the term.

Step 5. Take Action!

Instructional questions typically seek not only information for information’s sake, but also a means to adjust or modify practice or materials based on the answers gained. Learning data can be used to inform large-scale changes, like program sequencing and curriculum adjustment, or large individual changes, like course or activity redesign. Importantly, learning data can also be used to make smaller, but nonetheless significant, changes within a semester.

  • Build time for yourself to take action. This may be as simple as setting aside an hour each week to review data and determine your next steps.
  • Before the semester begins, brainstorm what actions you might take based on your instructional question.
  • Determine whether you intend to communicate or share data with students.
  • Create a communication plan before the semester begins so you can act more effectively as the semester progresses.
  • As you review learning data, keep a list of insights gained that might inform a change to your program, course, or activity.
  • Post-semester action might involve larger changes, such as:
    • Adjusting the order of content/activities/courses
    • Inserting activities that scaffold content
    • Adjusting course readings
    • Modifying assignments
    • Adjusting the length of time spent on topics
    • Modifying the sequence of a curriculum or program
  • Schedule time to evaluate after the semester is complete, readdressing your question and any actions taken during the semester.
  • Remember that you don’t necessarily need to redesign your course to use learning data effectively. Start small with semester-based interventions; these may ultimately affect student outcomes.

In-semester action might look like:

  • Communicating with students who are struggling or falling behind
  • Communicating with students who are doing well
  • Communicating with students before or after a particularly difficult activity
  • Readdressing a topic
  • Adjusting a discussion topic based on class interest
  • Inserting a check for understanding
  • Flipping aspects of in-class content and homework
  • Providing a follow-up post/video/message of clarification
  • Adding a student discussion or question board
  • Adding/adjusting course readings for clarification or to allow a deeper dive into content
  • Adjusting quiz language or quiz questions
  • Reordering course content

Getting to Step 5

“What are the signals of a student struggling in class?” After checking learning data roughly every other week during the semester, one instructor noticed a student had virtually stopped participating in the course site, something that hadn’t been noticed in the physical classroom given the large number of enrollees. Another instructor saw several students whose scores and participation were gradually and persistently decreasing. In both instances, the instructors reached out to the students via email and within class to try to get them back on track, and both were unsure they would have readily made the observation had they not been reviewing the data. At least one instructor also set a reminder to review for signs of student disengagement at the same point in future semesters.
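A first pass at this kind of disengagement check can also be scripted, as a complement to reviewing dashboards by hand. A minimal sketch assuming weekly participation counts exported to a CSV (the filename and layout are hypothetical):

import pandas as pd

# One row per student, one column per week of participation counts
df = pd.read_csv("weekly_participation.csv", index_col="student")

recent = df.iloc[:, -3:]   # the three most recent weeks
earlier = df.iloc[:, :-3]  # everything before that

# Flag students whose recent activity fell well below their own
# earlier baseline: candidates for a check-in email
flagged = recent.mean(axis=1) < 0.5 * earlier.mean(axis=1)
print(df.index[flagged].tolist())

The threshold (half of a student’s own baseline) is arbitrary; the point is to surface candidates for a human look, not to automate the judgment.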

 
