Sensitivity index

The sensitivity index or d′ (pronounced 'dee-prime') is a statistic used in signal detection theory. It measures the separation between the means of the signal and noise distributions, expressed in units of the standard deviation of the signal or noise distribution. For normally distributed signal and noise with means and standard deviations μS and σS, and μN and σN, respectively, d′ is defined as:[1]

d′ = (μS − μN) / √(½(σS² + σN²))
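
As a minimal sketch (not drawn from the cited sources), the definition can be evaluated directly from the four distribution parameters; the helper name d_prime_from_params is an arbitrary choice for illustration:

  # Minimal sketch: d' from the means and standard deviations of the
  # signal and noise distributions, with the root-mean-square of the
  # two standard deviations in the denominator.
  import math

  def d_prime_from_params(mu_s, sigma_s, mu_n, sigma_n):
      return (mu_s - mu_n) / math.sqrt(0.5 * (sigma_s**2 + sigma_n**2))

  # Example: equal unit standard deviations and means one unit apart
  # give d' = 1.0.
  print(d_prime_from_params(1.0, 1.0, 0.0, 1.0))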

When the standard deviations for signal and noise are equal (σS = σN = σ), the denominator above reduces to σ, and d′ can be estimated from the observed hit rate and false-alarm rate as follows:[1]: 7

d′ = Z(hit rate) − Z(false alarm rate),

where the function Z(p), p ∈ [0,1], is the inverse of the cumulative distribution function of the standard Gaussian distribution.
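
As an illustrative sketch (not part of the cited sources), this estimate can be computed with a standard-normal quantile function; here scipy.stats.norm.ppf plays the role of Z, and the helper name d_prime is assumed for illustration:

  # Illustrative sketch: estimating d' from hit and false-alarm rates,
  # with scipy.stats.norm.ppf serving as the standard-normal inverse
  # cumulative distribution function Z(p).
  from scipy.stats import norm

  def d_prime(hit_rate, false_alarm_rate):
      # d' = Z(hit rate) - Z(false-alarm rate)
      return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

  # Example: a hit rate of 0.85 and a false-alarm rate of 0.20 give
  # d' = Z(0.85) - Z(0.20) ≈ 1.04 - (-0.84) ≈ 1.88.
  print(d_prime(0.85, 0.20))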

d′ can be related to the area under the receiver operating characteristic curve, or AUC, via:[2]

d′ = √2 · Z(AUC),

where Z is the inverse cumulative distribution function defined above.
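
A brief sketch of this relation, again assuming scipy.stats.norm for the standard-normal cumulative distribution and quantile functions:

  # Illustrative sketch of the relation between d' and AUC:
  # AUC = Phi(d'/sqrt(2)), equivalently d' = sqrt(2) * Z(AUC),
  # with Phi the standard-normal CDF and Z its inverse.
  import math
  from scipy.stats import norm

  d = 1.88                          # d' from the example above
  auc = norm.cdf(d / math.sqrt(2))  # ≈ 0.91
  print(auc)
  print(math.sqrt(2) * norm.ppf(auc))  # recovers d' ≈ 1.88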

d′ is a dimensionless statistic. A higher d′ indicates that the signal can be more readily detected.

References

  1. Macmillan, N. A.; Creelman, C. D. (2005). Detection Theory: A User's Guide. Lawrence Erlbaum Associates. ISBN 9781410611147.
  2. Simpson, A. J.; Fitter, M. J. (1973). "What is the best index of detectability?". Psychological Bulletin. 80 (6): 481–488. doi:10.1037/h0035203.