In this phase, you will evaluate and compare model performance using statistical metrics such as AUC, and apply explainable AI techniques to understand model predictions. This allows you to identify the most reliable models and to extract biologically meaningful insights from them.
1. Select queue for exploration
Navigate to the Dashboard and select your predictive analysis from the queue
The queue number selected is indicated in the pink box at the top right of the PANDORA interface.
Navigate to Predictive -> Exploration
Configure Exploration space
Select all Response outcomes
Select metrics of interest
Select dataset
Select models to evaluate
2. Evaluate performance of the models
Compare metrics
Compare models based on the metrics you selected when configuring the Exploration space, shown in the metrics table. Pay special attention to each model's Predictive AUC and Training AUC scores (Area Under the ROC Curve). More information about each metric is available in the PANDORA documentation.
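PANDORA computes these metrics for you, but as a point of reference, here is a minimal sketch of how Training AUC and Predictive AUC can be compared outside PANDORA using scikit-learn; the synthetic data, model choice, and split are illustrative assumptions, not PANDORA's internals. A large gap between the two scores typically signals overfitting.

```python
# Illustrative sketch only (not PANDORA code): comparing Training vs. Predictive AUC
# for a single classifier, with synthetic data standing in for your PANDORA dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42
)

model = RandomForestClassifier(n_estimators=500, random_state=42).fit(X_train, y_train)

# Training AUC: measured on the data the model was fit to (optimistic).
train_auc = roc_auc_score(y_train, model.predict_proba(X_train)[:, 1])
# Predictive AUC: measured on held-out data (a more honest estimate of generalization).
test_auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Training AUC: {train_auc:.3f}  Predictive AUC: {test_auc:.3f}")
```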
Select the ROC Curve Analysis tab in Exploration
Compare the ROC curves for each model to assess classification performance and identify the best-performing models.
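If you want to reproduce this kind of comparison outside the interface, here is a minimal sketch, assuming synthetic data and three arbitrary scikit-learn classifiers, of overlaying ROC curves for several models; it illustrates the concept rather than the code behind PANDORA's ROC Curve Analysis tab.

```python
# Illustrative sketch only: overlaying ROC curves for several candidate models.
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import auc, roc_curve
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=42)

models = {
    "Logistic regression": LogisticRegression(max_iter=1000),
    "Random forest": RandomForestClassifier(random_state=42),
    "Gradient boosting": GradientBoostingClassifier(random_state=42),
}

plt.figure()
for name, clf in models.items():
    clf.fit(X_train, y_train)
    fpr, tpr, _ = roc_curve(y_test, clf.predict_proba(X_test)[:, 1])
    plt.plot(fpr, tpr, label=f"{name} (AUC = {auc(fpr, tpr):.2f})")

plt.plot([0, 1], [0, 1], "k--", label="Chance")  # diagonal = no discriminative ability
plt.xlabel("False positive rate")
plt.ylabel("True positive rate")
plt.legend()
plt.show()
```

A curve that bows further toward the top-left corner, and hence a larger AUC, indicates better separation of the two outcome classes.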
Ensure multiple models are selected, then select the Training Summary tab in Exploration
Compare the metrics shown on the box plots for multiple models.
The Performance measurements section can help determine whether there are statistically significant differences between the models' metric values.
The Model fitting results summary provides the five-number summary (minimum, lower quartile, median, upper quartile, maximum) of each model's metric values, which is what the box plots visualize.
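As a rough analogue of what the box plots, Performance measurements, and Model fitting results summary show, the sketch below computes cross-validated AUC for two models on synthetic data, prints each model's five-number summary, and runs a Wilcoxon signed-rank test on the paired fold scores; the resampling scheme and statistical test PANDORA actually uses may differ.

```python
# Illustrative sketch only: comparing cross-validated AUC distributions for two models.
import numpy as np
from scipy.stats import wilcoxon
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

X, y = make_classification(n_samples=300, n_features=20, random_state=42)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)  # shared folds

scores = {
    "Logistic regression": cross_val_score(
        LogisticRegression(max_iter=1000), X, y, cv=cv, scoring="roc_auc"
    ),
    "Random forest": cross_val_score(
        RandomForestClassifier(random_state=42), X, y, cv=cv, scoring="roc_auc"
    ),
}

for name, s in scores.items():
    # Five-number summary: min, Q1, median, Q3, max -- exactly what a box plot draws.
    print(name, np.percentile(s, [0, 25, 50, 75, 100]).round(3))

# Paired non-parametric test on the fold-wise scores.
stat, p = wilcoxon(scores["Logistic regression"], scores["Random forest"])
print(f"Wilcoxon signed-rank p-value: {p:.3f}")
```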
3. Explore variable importance
Select the top model, then open the Variable Importance tab in Exploration.
While on the Variable Importance tab, locate the Variable Importance sub-tab
A bar plot will appear showing the top features ranked by their contribution to the model's predictions (a conceptual sketch of this kind of ranking follows the list of predictors below)
List the top predictors for your model
In this example, the top predictors, as shown in the bar graph below, are:
h3_hai_v0_gmt
hmnp_v0_cd4_ifng
z_score_continuous
h1_v0_cd4_ifng
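The sketch below shows one common way to produce such a ranking, permutation importance with scikit-learn; PANDORA's own importance calculation may differ, and the data here are synthetic placeholders that merely reuse the example predictor names for illustration.

```python
# Illustrative sketch only: ranking features by permutation importance and plotting the top ones.
import matplotlib.pyplot as plt
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

feature_names = [
    "h3_hai_v0_gmt", "hmnp_v0_cd4_ifng", "z_score_continuous", "h1_v0_cd4_ifng",
    "feature_5", "feature_6", "feature_7", "feature_8",  # hypothetical filler features
]
X, y = make_classification(n_samples=300, n_features=len(feature_names),
                           n_informative=4, random_state=42)
X = pd.DataFrame(X, columns=feature_names)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=42)

model = RandomForestClassifier(random_state=42).fit(X_train, y_train)

# Permutation importance: the drop in held-out AUC when a feature's values are shuffled.
result = permutation_importance(model, X_test, y_test, scoring="roc_auc",
                                n_repeats=20, random_state=42)
ranking = pd.Series(result.importances_mean, index=feature_names).sort_values()
ranking.tail(4).plot.barh(title="Top predictors (permutation importance)")
plt.xlabel("Mean decrease in AUC")
plt.tight_layout()
plt.show()
```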
Locate the Features across dataset sub-tab
Select the top features you listed above, and click the redraw plot button
Examine the dot plots to visualize how the top predictive features vary between responder outcomes
The dot plot below is based on the top features listed above
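To illustrate the idea behind this view, the sketch below draws a dot (strip) plot of a single feature split by responder outcome, using synthetic values and seaborn; it mimics the style of the plot rather than PANDORA's implementation, and the feature name is reused purely as a placeholder.

```python
# Illustrative sketch only: one top predictor plotted by responder outcome (synthetic values).
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import seaborn as sns

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "h3_hai_v0_gmt": np.concatenate([rng.normal(2.0, 0.5, 60), rng.normal(3.0, 0.5, 60)]),
    "outcome": ["non-responder"] * 60 + ["responder"] * 60,
})

ax = sns.stripplot(data=df, x="outcome", y="h3_hai_v0_gmt", jitter=True)
ax.set_title("Top predictor by responder outcome")
plt.show()
```

A clear shift in the dots between outcome groups supports a feature's high importance ranking.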
4. Interpret the model - Explainable AI (xAI)
Navigate to the Model Interpretation tab
Utilize the various analysis tools to understand how features in the model influence predictions.
Example (Heatmap): Helps the user understand how the joint variation of two variables may influence predictions (a conceptual sketch follows the steps below)
In Vars, select 2 features of interest like h3_hai_v0_gmt & hmnp_v0_cd4_ifng
Select Heatmap from the Analysis options
Click the Plot Image button
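To show the underlying idea, the sketch below uses a two-way partial dependence plot from scikit-learn to visualize how the joint variation of two features shifts a model's predicted probability; PANDORA's heatmap may be computed differently, and the data and feature names here are synthetic placeholders.

```python
# Illustrative sketch only: joint effect of two features on predictions via 2-way partial dependence.
import matplotlib.pyplot as plt
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import PartialDependenceDisplay

feature_names = ["h3_hai_v0_gmt", "hmnp_v0_cd4_ifng", "z_score_continuous", "h1_v0_cd4_ifng"]
X, y = make_classification(n_samples=300, n_features=4, n_informative=4,
                           n_redundant=0, random_state=42)
X = pd.DataFrame(X, columns=feature_names)
model = RandomForestClassifier(random_state=42).fit(X, y)

# Two-way partial dependence of the predicted probability on the two selected features.
PartialDependenceDisplay.from_estimator(
    model, X, features=[("h3_hai_v0_gmt", "hmnp_v0_cd4_ifng")]
)
plt.show()
```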
You've now assessed model performance using AUC scores, ROC curves, and summary statistics, followed by deeper exploration of variable importance and feature-level patterns. By selecting top predictors and visualizing their variation across outcome groups, you've gained insight into how specific biological variables drove your model's decisions.