What Is Sensitivity In Machine Learning?

Sensitivity

Sensitivity measures the proportion of actual positive cases that a model correctly identifies as positive. It is also called the true positive rate (or recall).

Sensitivity = true positives / (true positives + false negatives)
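
To make the formula concrete, here is a minimal sketch in Python that counts true positives and false negatives from binary labels and computes sensitivity. The example labels and the helper name `sensitivity` are illustrative assumptions, not part of the original text.

```python
def sensitivity(y_true, y_pred):
    """Compute sensitivity (true positive rate): TP / (TP + FN)."""
    # True positives: actual positive cases the model predicted as positive
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    # False negatives: actual positive cases the model missed
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fn) if (tp + fn) > 0 else 0.0


# Hypothetical example: 4 actual positives, 3 of them correctly predicted
y_true = [1, 1, 1, 1, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 1, 0]
print(sensitivity(y_true, y_pred))  # 3 / (3 + 1) = 0.75
```

The same value can also be obtained with `sklearn.metrics.recall_score(y_true, y_pred)`, since recall and sensitivity are the same quantity for binary classification.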
