Glossary of AI Terminology

What Is Sensitivity In Machine Learning?

Sensitivity

Sensitivity measures the proportion of actual positive cases that a model correctly identifies as positive. It is also called the true positive rate or recall.

Sensitivity = true positives / (true positives + false negatives)
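A minimal sketch of this formula in Python, assuming binary labels where 1 marks the positive class; the function name and the example labels below are illustrative, not from any particular library:

```python
def sensitivity(y_true, y_pred):
    """Sensitivity (true positive rate) = TP / (TP + FN)."""
    # True positives: actual positives the model predicted as positive.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    # False negatives: actual positives the model missed.
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fn)

# Example: the model catches 3 of the 4 actual positives.
y_true = [1, 1, 1, 1, 0, 0]
y_pred = [1, 1, 1, 0, 0, 1]
print(sensitivity(y_true, y_pred))  # 0.75
```

Note that only the actual positives appear in the denominator, so false positives do not affect sensitivity; in scikit-learn the same quantity is computed by sklearn.metrics.recall_score.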

