Learning dependence from samples

Sohan Seth*, José C. Príncipe

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

3 Citations (Scopus)

Abstract

Mutual information, conditional mutual information and interaction information have been widely used in the scientific literature as measures of dependence, conditional dependence and mutual dependence. However, these concepts suffer from several computational issues: they are difficult to estimate in the continuous domain, the existing regularised estimators are almost always defined only for real- or vector-valued random variables, and these measures address what dependence, conditional dependence and mutual dependence imply in terms of the random variables, not in terms of finite realisations. In this paper, we address the question of what characteristic, given a set of realisations in an arbitrary metric space, makes them dependent, conditionally dependent or mutually dependent. With this novel understanding, we develop new estimators of association, conditional association and interaction association. Attractive properties of these estimators are that they do not require choosing free parameters, they are computationally simpler, and they can be applied to arbitrary metric spaces.
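
As a rough illustration of the setting the abstract describes (measuring dependence directly from pairwise distances between realisations, with no free parameters), the sketch below computes the biased sample distance correlation of Székely et al. from two distance matrices. This is not the association estimator proposed in the paper; it is only a generic example of a parameter-free dependence statistic that operates on arbitrary metric spaces.

```python
# Illustrative sketch only: a generic, parameter-free dependence statistic
# (biased sample distance correlation, Szekely et al. 2007) computed from
# pairwise distance matrices, so it applies to samples from any metric space.
# This is NOT the estimator proposed in the paper.
import numpy as np


def double_center(D):
    """Double-center a pairwise distance matrix."""
    row = D.mean(axis=0, keepdims=True)
    col = D.mean(axis=1, keepdims=True)
    return D - row - col + D.mean()


def distance_correlation(DX, DY):
    """Biased sample distance correlation from distance matrices DX, DY."""
    A, B = double_center(DX), double_center(DY)
    dcov2 = max((A * B).mean(), 0.0)   # guard against tiny negative round-off
    denom = np.sqrt((A * A).mean() * (B * B).mean())
    return np.sqrt(dcov2 / denom) if denom > 0 else 0.0


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(200, 1))
    y = x ** 2 + 0.1 * rng.normal(size=(200, 1))   # nonlinear dependence
    DX = np.abs(x - x.T)                           # pairwise distances of x
    DY = np.abs(y - y.T)                           # pairwise distances of y
    print(distance_correlation(DX, DY))            # markedly above 0 under dependence
```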

Original language: English
Pages (from-to): 43-58
Number of pages: 16
Journal: International Journal of Bioinformatics Research and Applications
Volume: 10
Issue number: 1
DOIs
Publication status: Published - 2014
MoE publication type: A1 Journal article-refereed

Keywords

  • Association
  • Causality
  • Conditional association
  • Conditional dependence
  • Conditional mutual information
  • Dependence
  • Interaction association
  • Interaction information
  • Metric space
  • Mutual dependence
  • Mutual information
  • Variable selection
