Review analyzed initiatives

After logging in to Illume Impact, you will see a home page showing all Completed initiatives by default. This includes initiatives submitted by all Illume Impact users at your institution. An initiative appears in the Completed tab once it has been submitted and its persistence lift has been calculated following PPSM.

Each Completed initiative has a corresponding card, showing basic details about the initiative and the results of impact analysis.

The top of the card indicates the lift in persistence that was measured for this initiative using Illume Impact. Persistence is defined as a student re-enrolling for the next term and staying enrolled past your institution's census date (or add/drop period), or graduating. It's important to understand how this lift is calculated and which students are included in analysis. Refer to How is persistence lift calculated? to see a detailed explanation of the values used in this calculation.
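As a simplified illustration, the sketch below shows how a lift in persistence can be computed as the difference between the persistence rate of matched participants and the persistence rate of their matched comparison students. The column names and values are hypothetical, and this is not the exact computation Illume Impact performs; see the article linked above for the full methodology.

```python
import pandas as pd

# Hypothetical matched pairs: one row per participant-comparison pair,
# with 1/0 flags indicating whether each student persisted.
pairs = pd.DataFrame({
    "participant_persisted": [1, 1, 0, 1, 1, 0, 1, 1],
    "comparison_persisted":  [1, 0, 0, 1, 0, 0, 1, 0],
})

participant_rate = pairs["participant_persisted"].mean()
comparison_rate = pairs["comparison_persisted"].mean()

# Lift in persistence: participant rate minus matched-comparison rate,
# expressed in percentage points.
lift = (participant_rate - comparison_rate) * 100
print(f"Persistence lift: {lift:+.1f} percentage points")
```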

An icon on each initiative card indicates the initiative's effect on persistence outcomes.

  • A green icon with an upward arrow shows a statistically significant (p-value less than 0.05) increase in the persistence rate of participants.
  • A red icon with a downward arrow shows a statistically significant (p-value less than 0.05) decrease in the persistence rate of participants.
  • A gray icon with a dash shows that the effect on persistence for participants was not statistically significant. An initiative may not have statistically significant impact results if the number of matched participants and comparison students was low (e.g., fewer than 1,000 students), if there was a lot of variability in the results of bootstrapped samples, or if the persistence lift was very small. Conversely, an initiative with a small number of students included in analysis may still show statistically significant results if the persistence lift was very pronounced. Results that are not statistically significant can still be meaningful; at this point, you may consider collecting more data before redoing the analysis, or rethinking the initiative design. (A bootstrap-based significance check is sketched after this list.)
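To make the role of bootstrapped samples more concrete, here is a minimal sketch of one common way to approximate a p-value from bootstrapped samples: resample the matched pairs with replacement, recompute the lift for each resample, and measure how often the bootstrapped lift crosses zero. The data and variable names are hypothetical, and this is an illustration only, not the exact procedure used by Illume Impact.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical persistence flags (1 = persisted) for matched participant-comparison pairs.
participant = np.array([1, 1, 0, 1, 1, 0, 1, 1] * 50)
comparison  = np.array([1, 0, 0, 1, 0, 0, 1, 0] * 50)

n_pairs = len(participant)
boot_lifts = np.empty(5000)
for b in range(5000):
    # Resample matched pairs with replacement and recompute the lift.
    idx = rng.integers(0, n_pairs, n_pairs)
    boot_lifts[b] = participant[idx].mean() - comparison[idx].mean()

# Approximate two-sided p-value: how often the bootstrapped lift falls on
# the opposite side of zero from the observed direction.
p_value = 2 * min((boot_lifts <= 0).mean(), (boot_lifts >= 0).mean())
print(f"Mean lift: {boot_lifts.mean():+.3f}, approximate p-value: {p_value:.3f}")
print("Statistically significant" if p_value < 0.05 else "Not statistically significant")
```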

Beneath the lift in persistence, find other information about the initiative:

  • Confidence interval: The percentage indicated beneath the lift in persistence (e.g., +/- 1.6%) shows the 95% confidence interval for this initiative's impact results. This is calculated by multiplying the standard deviation of measured impact across multiple bootstrapped samples by 1.96 (see the sketch after this list).
  • Initiative Name: The name given to this initiative by the user who added it.
  • Number of Analyzed Participants: The number of participant students submitted in the uploaded student list who were matched with a comparison student through PPSM. This number represents the total number of participant-comparison student pairs used for impact analysis.
  • Initiative Goal: The goal for this initiative that was entered by the user who added it.
  • Analyzed Terms: The start and end terms included in the uploaded file, if these terms are verified and kept during data validation. If you choose to ignore either of these terms during data validation, you will see the earliest and latest terms that were actually submitted.
  • Submission Details: The date this initiative was added and the name of the user who added it.
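As a concrete illustration of the confidence-interval calculation described above, the sketch below multiplies the standard deviation of the lift across bootstrapped samples by 1.96 to get the interval's half-width. The sample values are made up; Illume Impact performs this calculation for you.

```python
import numpy as np

# Hypothetical lifts (in percentage points) measured across bootstrapped samples.
bootstrapped_lifts = np.array([2.1, 1.8, 2.5, 1.9, 2.3, 2.0, 1.7, 2.4])

# 95% confidence interval half-width: standard deviation of the measured
# impact across bootstrapped samples, multiplied by 1.96.
half_width = 1.96 * bootstrapped_lifts.std()
print(f"Lift: {bootstrapped_lifts.mean():+.1f}% +/- {half_width:.1f}%")
```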

Click the search icon in the upper right-hand corner to find a specific initiative. Type the name of the initiative into the search box to see matches.

Narrow down the initiatives you see on the home page by clicking 'Filter' to the left of the search icon.

Available filters include:

  • Lift in Persistence: View only the initiatives where a positive lift in persistence was observed or only those where a negative lift was observed, or hide initiatives that did not have statistically significant results.
  • Submission Time Period: View only the initiatives submitted this week, this month, or this year.
  • Start and End Terms: Select a start and end term to view only the initiatives that took place during the specified range.
  • Number of Analyzed Participants: Move the slider to view only the initiatives where the number of analyzed participants was within the specified range.

Select as many filters from each category as you would like. After selecting filters, click the Apply Filters button to refresh the home page. To modify your filter selections, click the 'x' to the right of any selection or click 'Clear All Filters' to the right of your selections.

If you're looking for a recently added initiative, its impact analysis may not be complete yet. Check the Pending tab to see initiatives still in progress.

If initiative analysis failed, the initiative card will remain in the Pending tab and you'll see a red 'x' icon in the upper right-hand corner of the card.

To resubmit the initiative with the same data file and previously submitted information, click the card flip icon in the lower right-hand corner. Select the option to "Resubmit Initiative," and initiative analysis will begin again using the same data set and details.
