A method for continuous-range sequence analysis with Jensen-Shannon divergence
Date
2021
Author
Ré, Miguel A.
Aguirre Varela, Guillermo G.
Abstract
Mutual Information (MI) is a useful Information Theory tool for recognizing mutual dependence between data sets. Several methods have been developed for the estimation of MI when both data sets are of the discrete type or when both are of the continuous type. However, MI estimation between a discrete-range data set and a continuous-range data set has not received as much attention. We therefore present here a method for the estimation of MI in this case, based on the kernel density approximation. This calculation may be of interest in diverse contexts. Since MI is closely related to the Jensen-Shannon divergence, the method developed here is of particular interest in problems of sequence segmentation and set comparison.
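The following is a minimal sketch (not the authors' implementation) of the general idea the abstract describes: estimating MI between a discrete variable X and a continuous variable Y by approximating the continuous densities with kernel density estimates and plugging them into I(X;Y) ≈ (1/N) Σᵢ log[ p̂(yᵢ | xᵢ) / p̂(yᵢ) ]. All function and variable names are illustrative assumptions.

```python
# Sketch of a kernel-density plug-in estimator of I(X;Y) for discrete X,
# continuous 1-D Y (illustrative only; assumes every label has enough samples
# to fit a KDE).
import numpy as np
from scipy.stats import gaussian_kde


def mi_discrete_continuous(x, y):
    """Estimate I(X;Y) in nats for discrete labels x and continuous samples y."""
    x = np.asarray(x)
    y = np.asarray(y, dtype=float)
    n = len(y)

    # Marginal density p(y), estimated from all samples.
    p_y = gaussian_kde(y)(y)

    # Conditional densities p(y | x = label), one KDE per discrete label.
    p_y_given_x = np.empty(n)
    for label in np.unique(x):
        mask = (x == label)
        p_y_given_x[mask] = gaussian_kde(y[mask])(y[mask])

    # Plug-in (resubstitution) estimate of E[ log p(y|x) - log p(y) ].
    return float(np.mean(np.log(p_y_given_x / p_y)))


# Toy check: y shifts with the discrete label, so the estimate should be positive.
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=2000)
samples = rng.normal(loc=1.5 * labels, scale=1.0)
print(mi_discrete_continuous(labels, samples))
```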