
Designing an initiative for analysis using Illume Impact

Review these recommendations for effective initiative design using Illume Impact for impact analysis. Some of these recommendations may not be feasible for all initiatives. If you have additional questions about whether your initiative is appropriate for analysis using Illume Impact, please see the Initiative Design Questions article.

  • Improving persistence outcomes was a goal.
    • Note that the persistence outcomes used by Illume Impact match the definition of persistence your institution uses in Illume Students.
    • For example, if your institution is on a Fall → Fall persistence model, Illume Impact will be measuring the impact of an initiative offered in one Fall term on persistence into the next Fall term.
  • The analysis term(s) have already ended and persistence outcomes are available.
    • If you are interested in analyzing the impact of an initiative being offered during the current term, you will have to wait to submit the initiative until the next term. Persistence outcomes for students enrolled during the current term will be known after the next term's add/drop period is over.
    • If your institution is on a Fall → Fall persistence model, you will have to wait to submit the initiative until the next Fall term when persistence outcomes are measured.
  • The initiative occurred in the last 4 years.
  • The initiative is a program that occurred or was offered throughout the term.
    • Since the PPSM used by Illume Impact matches students on the census date of a term and then measures persistence outcomes after the term has ended, the most accurate analysis requires that data for the initiative be distributed throughout that period.
  • The eligibility criteria are clearly defined.
    • In order to match participants with appropriate comparison students, it is important to understand exactly who was eligible for this initiative.
    • For example, was the initiative:
      • Accessible to any student?
      • Mandatory for all first-year students?
      • Targeted towards students in dev-ed?
      • Targeted towards students with a GPA above 3.0?
    • Consider cases that might be less clear:
      • Orientation is open to all first-year students, but transfer students and Business students have different orientation programs and do not participate in first-year orientation. For analysis in Illume Impact, transfer and Business students should not be considered eligible.
      • Writing Center services are open to any student on campus (i.e. all majors, graduate or undergraduate, full-time or part-time, etc.) provided the student is taking an English or Rhetoric course. This means that any type of student can be included as an eligible comparison student if they were enrolled in an English or Rhetoric course during the specified term.
      • All students with a GPA above 3.0 were targeted with an ongoing outreach campaign during the Fall 2016 semester. Rather than comparing against other students who were enrolled this term with GPAs below 3.0, the eligible comparison group should include students from previous terms who had above a 3.0 GPA to ensure students available for matching are as similar as possible to initiative participants.
  • The participant selection criteria are clearly understood.
    • For example, was the initiative:
      • Voluntary (e.g. drop-in tutoring)?
      • Randomly selected/assigned (e.g. half of ENG 101 sections were randomly selected to use the new course design)?
      • Selected/assigned through a specific, non-random criterion (e.g. all advisors of first-year, FTIC students will try the new nudge campaign)?
    • Consider how participation is defined in these examples:
      • A free online tutoring resource is offered to 5,000 randomly selected undergraduate students. If you are interested in analyzing the impact of making this tutoring resource available, participants would include all 5,000 students and comparison students would include all remaining undergraduate students. However, if analyzing the impact of using the tutoring resource, participants would be a subset of the 5,000 students who actually logged in and comparison students would include the remaining students who were given access and never logged in.
      • The Writing Center offers three review sessions for any writing assignment. Before analyzing the impact of Writing Center tutoring on persistence, determine whether participation is defined as attending a single review session or completing the series of three sessions.
  • The required data for impact analysis is ready as a .tsv or .csv file.
    • Label the first column student_id, the second column term, and the third column is_participant.
      • The student_id and term values should match those you typically see in your SIS source system. When you’re uploading your file, you will see example data specific to your institution to help you provide the expected values.
      • The is_participant column should contain a Boolean value: 1 for students who participated in the initiative and 0 for students who were eligible to participate but did not. Only students who meet the defined eligibility criteria should be included in the file. If the initiative was available to any student at the institution, then only the participating students need to be defined in the file because Illume Impact will pull in the full student body for comparison.
  • For best results, the number of participants is at least 1000. The number of eligible comparison students should be at least as large as the number of participants.
    • Illume Impact requires at least 100 participants for analysis. However, if fewer than 1000 students are used for analysis, there is a lower likelihood of achieving statistically significant results. Drill-down results (i.e. Impact by Student Group) will also be affected, as the sample size for a specific student group could be very small.
  • Additional context about the motivation and implementation of the initiative, as well as any suspected confounding factors, is documented.
    • Include these details in the optional Additional Description section when uploading a new initiative.
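As a sketch of how the required upload file can be assembled, the snippet below writes the three expected columns (student_id, term, is_participant). The roster, participant set, term code, and file name are hypothetical examples; the IDs and term values in your file should match your SIS source system.

```python
import csv

# Hypothetical inputs: students who met the eligibility criteria for the
# initiative in a given term, and the subset who actually participated.
eligible_students = ["1001", "1002", "1003", "1004"]
participants = {"1002", "1004"}
term = "FA2016"

# Write the three required columns: student_id, term, is_participant.
# is_participant is 1 for participants and 0 for eligible non-participants.
with open("initiative_upload.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["student_id", "term", "is_participant"])
    for sid in eligible_students:
        writer.writerow([sid, term, 1 if sid in participants else 0])
```

If the initiative was open to every student at the institution, the file can list only the participants (all with is_participant = 1), since Illume Impact will pull in the full student body for comparison.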

Consider confounding factors

Even with these recommendations, there may be confounding factors, that is, other circumstances that could affect the initiative participants or comparison group and make it difficult to determine what exactly had an effect on outcomes. For example, if an institution wanted to pilot Inspire for Advisors, but intentionally made it available to only a single advisor, then any difference in outcomes between students who were assigned that advisor and those who weren’t could be attributed to Inspire for Advisors, to the advisor, or to both. In other words, since the advisor’s characteristics can’t be accounted for when identifying the comparison group through PPSM, impact analysis results must include caveats about the potential confounding factors.

Consider the following common confounding factors during initiative design prior to impact analysis:

  • Either the participating group or comparison group contains a single study unit, e.g. one of the groups is representative of a single advisor/faculty member/course/department/etc. In this case, it would be difficult to attribute the difference in outcomes to the initiative or the advisor/faculty member/course.
  • The participating group and the comparison group are systematically different in a way that may be directly related to persistence outcomes, e.g. one of the groups is taught by the most experienced and qualified faculty, or the intervention is initiated by faculty voluntarily. Since Illume Impact measures persistence outcomes, if the participation criteria are based on something that may correlate with persistence, it will be challenging to isolate the effectiveness of the initiative.
  • Another initiative is offered to the same group at the same time, e.g. first-time full-time students are required to attend a Student Success Course and special advising sessions during their first term. This can be a problem because it will be difficult to isolate the effects of one initiative.
  • The participating group and the comparison group are from different time periods or terms and persistence outcomes were measured at different points in time. This can make analysis difficult because there may have been different factors that affected students based upon when they participated in the initiative.

Even using PPSM with Illume Impact, these confounding factors cannot simply be eliminated or ignored from analysis. If confounding factors are a possibility, any impact analysis should include appropriate caveats and results should be interpreted accordingly.
