Table 9 Comparison of signal pairs 〈log(U1(τ)), log(U2(τ))〉 and 〈log(U1(τ)), log(U3(τ))〉 with each distance measure

From: Using information theoretic distance measures for solving the permutation problem of blind source separation of speech signals

| Distance measure | 〈log(U1(τ)), log(U2(τ))〉 | 〈log(U1(τ)), log(U3(τ))〉 |
| --- | --- | --- |
| Bhattacharyya coefficient | 0.03 | 0.71 |
| Kullback-Leibler divergence | 0.0 | 0.03 |
| Log of the maximum ratio | 0.0 | 0.30 |
| Jensen-Rényi divergence, α = 0.5 | 17.83 | 7.48 |
| Jensen-Rényi divergence, α = 1 | 7.54 | 1.58 |
| Jensen-Rényi divergence, α = 2 | 0.99 | 0.01 |
| Mod. Jensen-Rényi divergence, α = 0.5 | 4.79 | 1.26 |
| Mod. Jensen-Rényi divergence, α = 1 | 1.23 | 0.26 |
| Mod. Jensen-Rényi divergence, α = 2 | 0.11 | 0.01 |
| Mutual information | 4.59 | 6.43 |

Note: The most dependent value of each distance measure is marked in bold in the original table.
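For reference, the standard textbook forms of several of the distance measures compared in the table can be sketched for discrete probability distributions as below. This is only an illustrative sketch: the paper defines the exact variants it uses (including the modified Jensen-Rényi divergence and the log of the maximum ratio, which are not reproduced here), and the example distributions `p` and `q` are made up for demonstration, not taken from the article's data.

```python
import numpy as np

def bhattacharyya(p, q):
    # Bhattacharyya coefficient: sum_i sqrt(p_i * q_i);
    # equals 1 for identical distributions, smaller for dissimilar ones.
    return np.sum(np.sqrt(p * q))

def kl_divergence(p, q):
    # Kullback-Leibler divergence D(p || q), summed over the support of p.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def renyi_entropy(p, alpha):
    # Rényi entropy of order alpha; alpha = 1 is the Shannon entropy limit.
    if alpha == 1:
        mask = p > 0
        return -np.sum(p[mask] * np.log(p[mask]))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def jensen_renyi(p, q, alpha):
    # Jensen-Rényi divergence: Rényi entropy of the equal-weight mixture
    # minus the mean of the individual Rényi entropies.
    m = 0.5 * (p + q)
    return renyi_entropy(m, alpha) - 0.5 * (renyi_entropy(p, alpha)
                                            + renyi_entropy(q, alpha))

# Hypothetical example distributions (not from the article).
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])
print("Bhattacharyya:", bhattacharyya(p, q))
print("KL divergence:", kl_divergence(p, q))
print("Jensen-Rényi (α = 1):", jensen_renyi(p, q, alpha=1))
```

All three measures vanish (or, for the Bhattacharyya coefficient, reach 1) when the two distributions coincide, which is the sense in which the table's values indicate how dependent the two log-envelope signals are.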