Decide What You Are Predicting
The most common predictive learning analytics models predict whether learners will be successful in their current courses, but other models are possible, e.g. learner on-time graduation, effective course designs, or effective instructor behaviors. The examples that follow will refer to learner success in courses.
Consulting services are available if you would like to explore Predictive Learning Analytics with us. Please contact your Account Manager or HelpDesk@IntelliBoard.net for additional details.
Choose or Create a Source Report
The IntelliBoard Pro Library includes several reports which can be used as a basis for creating predictive models. Choose a report that matches your LMS features and institutional policies.
Choose a Source Report
The following reports may be used as-is or edited to match the needs of your institution.
- Predictive Learning Analytics Input: Passing by Course End Date (IB): This report may be used with any LMS that provides Course End Dates, such as Moodle.
- Predictive Learning Analytics Input: Passing by Term End Date (Canvas) (IB): This report may be used with any LMS that provides Term End Dates, such as Canvas.
- Predictive Learning Analytics Input: Passing 90 Days (Enrollment Start) (IB): This report is useful when courses do not have fixed end dates, but learners are expected to complete the course within a defined span of time after Enrollment Start, e.g. 90 days. This length of time can be adjusted.
- Predictive Learning Analytics Input: Passing 90 days (Enrollment Creation) (IB): This report is useful when courses do not have fixed end dates, but learners are expected to complete the course within a defined span of time after Enrollment Creation, e.g. 90 days. This length of time can be adjusted. Use this report if your LMS does not support a defined Enrollment Start Date.
Create a Source Report
If you choose to build a new report, it must contain the following data in columns. Other data, such as Course Name or Student Name, may be included in the report, but will be ignored by the Predictive Learning Analytics system. More information about each requirement follows.
- Unique Identifier
- Success Criterion (Target or Label)
- Case Process Criterion
- Multiple Predictor Variables (Features)
Unique Identifier
Each prediction will require a unique identifier. For example, if you are going to make predictions about students in courses, the best unique identifier is the Enrollment ID.
Success Criterion (Target or Label)
You must have one column that defines whether the learner was successful in binary terms, where 1 = successful and 0 = unsuccessful. In our sample reports, a passing course grade of 60% is used. This is defined using a formula in the report.
Other possible criteria include completion of all required activities, achievement of all assigned competencies, continuous participation to the end of the course, etc.
You cannot train a model if every row is successful or every row is unsuccessful. Our default reports provide an aggregation in a panel at the top of the report; check that this value (the proportion of successful rows) is between 0.10 and 0.90.
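As a sketch, the Success Criterion and the base-rate check can be expressed in plain Python. The field names and grades below are hypothetical, not actual IntelliBoard columns; in your report, the equivalent logic lives in a formula:

```python
# Illustrative rows; field names are hypothetical, not exact IntelliBoard columns.
rows = [
    {"enrollment_id": 101, "final_grade": 72.5},
    {"enrollment_id": 102, "final_grade": 48.0},
    {"enrollment_id": 103, "final_grade": 91.0},
    {"enrollment_id": 104, "final_grade": 59.9},
]

# Success Criterion: 1 = successful (grade >= 60%), 0 = unsuccessful.
for row in rows:
    row["passed"] = 1 if row["final_grade"] >= 60 else 0

# Base-rate check: the proportion of successful rows should fall between 0.10 and 0.90.
success_rate = sum(r["passed"] for r in rows) / len(rows)
print(success_rate)  # 0.5
assert 0.10 <= success_rate <= 0.90
```

Note that a grade of 59.9 maps to 0: the threshold is a strict cutoff, so decide deliberately where borderline grades should land.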
Case Process Criterion
You must have one column that distinguishes rows that will be used to train the model from rows that will be used to generate predictions. (The same report will be used to train the model and to generate predictions.) In our sample reports, this column is labeled "Current" and a value of 1 = a current, active enrollment that can be used to generate predictions; 0 = an enrollment that has concluded and can be used to train the model. This value is set using a formula that references the course end date, term end date, enrollment start date, etc., depending on the LMS and the institutional requirements.
You will need many rows with this value set to 0 to train a model. We recommend at least 1,000 rows of historical data.
You will need at least some rows with this value set to 1 to predict outcomes using the model.
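The Case Process flag boils down to a date comparison. The sketch below uses a course end date and hypothetical column names; your report would express the same logic as a formula against whichever date your LMS provides:

```python
from datetime import date

today = date(2024, 5, 1)  # pinned so the example is reproducible

rows = [
    {"enrollment_id": 201, "course_end_date": date(2024, 3, 15)},
    {"enrollment_id": 202, "course_end_date": date(2024, 6, 30)},
    {"enrollment_id": 203, "course_end_date": date(2023, 12, 1)},
]

# Current = 1 -> active enrollment, used to generate predictions;
# Current = 0 -> concluded enrollment, used to train the model.
for row in rows:
    row["current"] = 1 if row["course_end_date"] > today else 0

print([r["current"] for r in rows])  # [0, 1, 0]
```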
Multiple Predictor Variables (Features)
You may have multiple "predictor" variables that will be analyzed to determine how they affect the outcome. Our default reports include the following:
- Visits per Activity
- Participations per Activity
- Time Spent per Activity
- Submissions per Assignment
- Attempts per Quiz
- Posts per Forum
Each of these columns must have a range of values to be used in a model. In our default reports we provide a Range aggregation -- make sure each column you want to use has a Range >= 1.
Usually it is best to "standardize" these values, because course design can differ so much between courses, programs, and instructors. We recommend dividing by the number of activities in the course.
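The standardization is a simple division per row. For example, with hypothetical column names:

```python
# Hypothetical raw counts; the activity count would come from your report.
rows = [
    {"enrollment_id": 301, "visits": 120, "activities_in_course": 40},
    {"enrollment_id": 302, "visits": 45,  "activities_in_course": 9},
]

# Standardize: divide by the number of activities in the course, so feature
# values are comparable across courses with very different designs.
for row in rows:
    row["visits_per_activity"] = row["visits"] / row["activities_in_course"]

print([r["visits_per_activity"] for r in rows])  # [3.0, 5.0]
```

Without this step, a learner in a 40-activity course would always look more active than one in a 9-activity course, even at the same level of engagement.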
Create a New Model

Once your report is complete, you can create a new model. Go to Apps > Predictive Models > Create Model. Each model is associated with one connection. Follow the steps outlined below:
- Predictive Model Description: Give your model a name that you can easily recognize and a description that summarizes what the model is based on.
- Algorithm: You have a choice of algorithms to build your model. Currently, IntelliBoard Pro supports two algorithms: Logistic Regression and Neural Network. Neural Network tends to produce more accurate predictions but can require more data; Logistic Regression is more "interpretable". The choice is yours.
- Case Identifier: On this screen, select the "Case Identifier" column you will be making predictions about. In our examples, this is Enrollment ID.
- Outcome: On this screen, select the "Success Criterion" column, e.g. "Passed". This column defines which learners were successful in the historical data and is what will be predicted for future learners.
- Feature Columns: You may select multiple columns here. Remember, each column must contain a range of values.
- Case Process Criterion: On this screen, select the column that indicates whether the row is historical (used to train the model) or current (used to generate predictions). In our default reports, this column is named "Current."
- Check Data and Create Model: Review your data on the final page and create your model.
Train Model
You are now able to train your model. In this process, historical data is gathered from your report and analyzed. Training takes some time; you do not need to remain on this page or stay logged in to IntelliBoard while it runs.
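Conceptually, the training step partitions your report rows and reserves a holdout set for validation. The pure-Python sketch below uses made-up data and an assumed 80/20 split; IntelliBoard performs this internally and its exact split may differ:

```python
import random

random.seed(0)

# Toy dataset: each row has a Current flag, one feature, and the outcome label.
rows = [{"current": random.choice([0, 1]),
         "visits_per_activity": random.random(),
         "passed": random.choice([0, 1])} for _ in range(20)]

# Training uses only concluded enrollments (Current = 0)...
historical = [r for r in rows if r["current"] == 0]
# ...while active enrollments (Current = 1) are scored later.
to_predict = [r for r in rows if r["current"] == 1]

# A portion of the historical rows is held back to validate the model afterwards.
random.shuffle(historical)
split = int(len(historical) * 0.8)
train, holdout = historical[:split], historical[split:]
print(len(train), len(holdout), len(to_predict))
```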
Validate Model
Once your model training is complete, you can validate your model. The training process holds out some training data and uses it to test the accuracy of the model. The overall quality of the model is summarized in the "AUC" value below the chart. Please contact Helpdesk if you have additional questions about this process.
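For intuition, AUC is the probability that a randomly chosen successful case receives a higher predicted score than a randomly chosen unsuccessful one. A minimal sketch with made-up scores and labels:

```python
# Made-up predicted success probabilities and actual outcomes from held-out data.
scores = [0.9, 0.8, 0.35, 0.6, 0.2, 0.1]
labels = [1,   1,   0,    1,   0,   0]

pos = [s for s, y in zip(scores, labels) if y == 1]
neg = [s for s, y in zip(scores, labels) if y == 0]

# Count positive/negative pairs ranked correctly (ties count as half).
wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
auc = wins / (len(pos) * len(neg))
print(auc)  # 1.0 here; 0.5 would mean the model is no better than chance
```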
Trigger or Schedule Predictions
After your model is trained, you may trigger predictions manually or schedule predictions from the Models page.
View Predictions in Reports
In the Library you can find the default report Prediction Outcome: Learners At Risk (Per Enrollment) (IB). This report displays the most recent prediction for each learner, along with other data that can be used to help interpret the prediction.
Use Predictions to Trigger Notifications
To create notifications based on predictions, edit the Prediction Outcome: Learners At Risk (Per Enrollment) (IB) report (or your own output report) to include the email address the notification should be sent to (learner, instructor, advisor, etc.), then follow the directions for Creating Notifications.
Need more information on Predictive Analytics? Please contact helpdesk@intelliboard.net.