
Enhancements to Impact to Support Analysis Design & Interpretation

Product Update - Released on 03/16/17

We’re excited to launch Illume Impact with our first cohort of partners. We’re already releasing enhancements that support analysis design and interpretation and improve Impact’s overall usability.

Alerts to Ensure Accurate Analyses: Participating Students

To ensure users have enough data for accurate analysis, Illume Impact now warns users when the number of analyzed participants is too small (fewer than 200), since small samples can produce inaccurate results.

Alerts to Ensure Accurate Analyses: Comparison Students

Illume Impact also warns users when the pool of eligible comparison students is too small to support an accurate, unbiased analysis. The "Overall Student Counts" page of the data validation wizard now displays an error when the total number of identified eligible comparison students is fewer than 100, and a warning when that total is fewer than 1,000 or when the ratio of eligible comparison students to participants is less than 80%.
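The alert thresholds above can be sketched as a small validation routine. This is a hypothetical illustration of the rules as stated in these notes, not the actual product code; the function name and message strings are assumptions.

```python
def validate_counts(participants: int, comparisons: int) -> list[str]:
    """Illustrative sketch of the Impact sample-size alerts.

    Thresholds from the release notes: fewer than 200 analyzed
    participants warns; fewer than 100 eligible comparison students
    errors; fewer than 1,000 comparison students, or a
    comparison-to-participant ratio under 80%, warns.
    """
    alerts = []
    if participants < 200:
        alerts.append("warning: fewer than 200 analyzed participants")
    if comparisons < 100:
        alerts.append("error: fewer than 100 eligible comparison students")
    elif comparisons < 1000:
        alerts.append("warning: fewer than 1,000 eligible comparison students")
    if participants > 0 and comparisons / participants < 0.8:
        alerts.append("warning: comparison-to-participant ratio below 80%")
    return alerts
```

For example, an initiative with 500 participants and 1,200 eligible comparison students would pass all checks, while one with 150 participants and 90 comparison students would trigger both the participant warning and the comparison-student error.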

Data validation page for "Student Counts by Term" now displays terms with 0 participants

The "Student Counts by Term" page of the data validation wizard now displays terms from the user-uploaded file that have no participants but at least one identified eligible comparison student. This lets users upload a list of participants from one term and a list of eligible comparison students from other terms to match against, which may be necessary for initiatives that were not limited pilots, or for initiatives in which most of the eligible students were participants. If any term with one or more participants has too few eligible comparison students to match against within the same term (specifically, a ratio of eligible comparison students to participants below 80%), a warning and a "let's fix this" prompt appear for that term on the "Student Counts by Term" page. The user can then select additional eligible comparison students from other terms, including terms with no participants.
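The per-term check that triggers the "let's fix this" prompt can be sketched as follows. This is an illustrative helper, not product code; the function name and data shape (term name mapped to participant and comparison counts) are assumptions.

```python
def terms_needing_fix(term_counts: dict[str, tuple[int, int]]) -> list[str]:
    """Flag terms where the within-term ratio of eligible comparison
    students to participants falls below 80%, per the release notes.
    Terms with zero participants are never flagged; they exist only to
    contribute comparison students to other terms.
    """
    flagged = []
    for term, (participants, comparisons) in term_counts.items():
        if participants > 0 and comparisons / participants < 0.8:
            flagged.append(term)
    return flagged
```

For example, a term with 100 participants and only 60 eligible comparison students (a 60% ratio) would be flagged, while a participant-free term uploaded purely as a comparison pool would not.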

Exported impact results file now includes "95% Confidence Interval" column

To perform additional reporting beyond Impact, you can export the raw data to a spreadsheet by clicking "Export Raw Data" on the "Initiative Impact" tab of the initiative detail page. With this enhancement, the exported file includes a "95% Confidence Interval" column containing the +/- values shown next to each lift-in-persistence result, both overall and for each analyzed student group and term, so you can include those individual and overall confidence intervals in your reports.
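Because the exported column holds the +/- half-width of each 95% confidence interval, turning a lift estimate and its exported value into lower and upper bounds for a report is a simple subtraction and addition. The helper below is an illustrative sketch, not part of the export itself.

```python
def ci_bounds(lift: float, half_width: float) -> tuple[float, float]:
    """Combine a lift-in-persistence estimate with the exported
    '95% Confidence Interval' +/- value to get reporting bounds.
    E.g., a lift of 2.5 points with a +/- value of 1.1 yields
    an interval of roughly (1.4, 3.6).
    """
    return lift - half_width, lift + half_width
```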

Default sort "Completed" initiatives by submission date

The "Completed" initiatives tab is now sorted by submission date by default, with the most recently submitted initiatives displayed at the top left.

