Relative Performance Of Mutual Information Estimation Methods For Quantifying The Dependence Among Short And Noisy Data, Shiraj Khan, Sharba Bandyopadhyay, Auroop R. Ganguly, Sunil Saigal, David J. Erickson III, Vladimir Protopopescu, George Ostrouchov
Commonly used dependence measures, such as linear correlation, the cross-correlogram, or Kendall's τ, cannot capture the complete dependence structure in data unless that structure is restricted to linear, periodic, or monotonic forms. Mutual information (MI) has frequently been used to capture the complete dependence structure, including nonlinear dependence. Recently, several methods have been proposed for MI estimation, such as kernel density estimators (KDEs), k-nearest neighbors (KNNs), the Edgeworth approximation of differential entropy, and adaptive partitioning of the XY plane. However, outstanding gaps in the current literature have precluded the ability to effectively automate these methods, which, in turn, has limited their adoption …
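As context for the estimators compared in the abstract, a minimal sketch of the simplest MI estimator, a plug-in (fixed-binning histogram) estimate, is shown below. This is a baseline only, not one of the paper's methods; the function name, bin count, and sample sizes are illustrative assumptions.

```python
import numpy as np

def mutual_information_hist(x, y, bins=16):
    """Plug-in (histogram) estimate of mutual information, in nats.

    A simple fixed-binning baseline; the estimators compared in the
    article (KDE, KNN, Edgeworth, adaptive partitioning) are designed
    to behave better on short and noisy data than this approach.
    """
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                 # joint probabilities p(x, y)
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x), shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y), shape (1, bins)
    nz = pxy > 0                          # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = x + rng.normal(size=5000)   # nonzero dependence on x
z = rng.normal(size=5000)       # independent of x

# The dependent pair should yield a clearly larger MI estimate.
print(mutual_information_hist(x, y), mutual_information_hist(x, z))
```

Note that this fixed-binning estimator is positively biased for independent variables (the estimate for the independent pair is small but not exactly zero), which is one reason the article evaluates more careful estimators on short, noisy series.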