CalibrationEngine class
Calibration analysis utilities.
Compares a learner's self-reported confidence with their actual performance to detect overconfidence and underconfidence patterns.
Constructors
Properties
- hashCode → int
  The hash code for this object.
  no setter; inherited
- runtimeType → Type
  A representation of the runtime type of the object.
  no setter; inherited
Methods
- noSuchMethod(Invocation invocation) → dynamic
  Invoked when a nonexistent method or property is accessed.
  inherited
- toString() → String
  A string representation of this object.
  inherited
Operators
- operator ==(Object other) → bool
  The equality operator.
  inherited
Static Methods
- calibrationCurve(List<ConfidenceEvent> events) → Map<int, double>
  Calibration curve: for each confidence level (1-5), the average actual performance at that level. Returns a map such as {1: 0.32, 2: 0.45, ...}.
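The curve groups events by their discrete confidence rating and averages the observed outcomes in each bucket. A minimal Python sketch of that computation (ConfidenceEvent here is a hypothetical stand-in for the Dart class, assumed to carry a 1-5 confidence rating and a 0-1 actual score):

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class ConfidenceEvent:
    confidence: int   # self-reported rating, 1-5 (assumed field name)
    actual: float     # observed performance, 0.0-1.0 (assumed field name)

def calibration_curve(events):
    """For each confidence level present, average the actual performance."""
    buckets = defaultdict(list)
    for e in events:
        buckets[e.confidence].append(e.actual)
    return {level: sum(v) / len(v) for level, v in sorted(buckets.items())}

events = [ConfidenceEvent(1, 0.30), ConfidenceEvent(1, 0.34),
          ConfidenceEvent(2, 0.45)]
print(calibration_curve(events))  # {1: 0.32, 2: 0.45}
```

A well-calibrated learner's curve rises roughly linearly with the confidence level; flat or inverted segments indicate miscalibration.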
- calibrationError(List<ConfidenceEvent> events) → double
  Mean absolute error between predicted confidence (normalized to 0-1) and actual performance.
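The error metric averages the absolute gap between each normalized prediction and its outcome. A Python sketch under one stated assumption: the 1-5 rating is mapped to 0-1 linearly via (c - 1) / 4, which is an illustrative choice, not taken from the library's source.

```python
from dataclasses import dataclass

@dataclass
class ConfidenceEvent:   # hypothetical stand-in for the Dart class
    confidence: int      # 1-5
    actual: float        # 0.0-1.0

def calibration_error(events):
    """Mean absolute error between normalized confidence and actual."""
    if not events:
        return 0.0
    # Assumed normalization: map rating 1..5 onto 0..1 linearly.
    return sum(abs((e.confidence - 1) / 4 - e.actual) for e in events) / len(events)

print(calibration_error([ConfidenceEvent(5, 0.5), ConfidenceEvent(3, 0.5)]))  # 0.25
```

An error near 0 means confidence tracks performance closely; the sign of each gap (dropped by the absolute value here) is what distinguishes over- from underconfidence.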
- forTopic(List<ConfidenceEvent> events, String topicId) → List<ConfidenceEvent>
  Filters events for a specific topic.
- isOverconfident(List<ConfidenceEvent> events, {double threshold = 0.25}) → bool
  Whether the learner is overconfident on the given events (e.g. a single topic's events, as returned by forTopic).
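A natural reading of the threshold is a signed gap: the learner is overconfident when mean normalized confidence exceeds mean actual performance by more than `threshold`. A Python sketch under that assumption (the signed-gap definition and the (c - 1) / 4 normalization are illustrative, not the library's verified implementation):

```python
from dataclasses import dataclass

@dataclass
class ConfidenceEvent:   # hypothetical stand-in for the Dart class
    confidence: int      # 1-5
    actual: float        # 0.0-1.0

def is_overconfident(events, threshold=0.25):
    """True when mean normalized confidence exceeds mean actual by > threshold."""
    if not events:
        return False
    gap = sum((e.confidence - 1) / 4 - e.actual for e in events) / len(events)
    return gap > threshold

# Ratings of 5 and 4 (normalized 1.0 and 0.75) against actuals 0.4 and 0.5:
# mean gap = (0.6 + 0.25) / 2 = 0.425 > 0.25
print(is_overconfident([ConfidenceEvent(5, 0.4), ConfidenceEvent(4, 0.5)]))  # True
```

The default threshold of 0.25 corresponds to one full confidence level on the 1-5 scale after normalization, so small rating/performance mismatches do not trip the flag.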