Table 8 Accuracy, precision, recall, and F1 scores of classifiers with low-level acoustic features and i-vector features, using the MIM, mRMR, and JMI feature selection algorithms. Feature fusion is used in the first and second panels; score fusion is used in the third panel. Experiments were done using the individual-based approach. The best F1 score in each panel is shown in bold

From: Automatic detection of attachment style in married couples through conversation analysis

 

**Low-level acoustic feature fusion**

| Classifier | Feature type | Accuracy | Precision | Recall | F1 score |
| --- | --- | --- | --- | --- | --- |
| SVM | MIM (30) | 0.73 | 0.72 | 0.72 | 0.72 |
|  | mRMR (15) | 0.68 | 0.69 | 0.65 | 0.67 |
|  | JMI (15) | 0.67 | 0.65 | 0.70 | 0.68 |
| Decision tree | MIM (5) | 0.75 | 0.74 | 0.74 | 0.76 |
|  | mRMR (30) | 0.74 | 0.77 | 0.69 | 0.73 |
|  | JMI (30) | 0.71 | 0.70 | 0.72 | 0.71 |
| Random forest | MIM (30) | 0.80 | 0.81 | 0.76 | 0.78 |
|  | mRMR (15) | 0.80 | 0.84 | 0.72 | 0.78 |
|  | JMI (15) | 0.78 | 0.80 | 0.74 | 0.77 |
| AdaBoost | MIM (30) | 0.85 | 0.87 | 0.81 | **0.84** |
|  | mRMR (30) | 0.82 | 0.84 | 0.79 | 0.81 |
|  | JMI (50) | 0.80 | 0.83 | 0.76 | 0.79 |
| Gradient boosting | MIM (30) | 0.81 | 0.83 | 0.77 | 0.80 |
|  | mRMR (50) | 0.81 | 0.82 | 0.79 | 0.81 |
|  | JMI (30) | 0.82 | 0.82 | 0.81 | 0.82 |
| Extra tree | MIM (30) | 0.84 | 0.85 | 0.81 | 0.83 |
|  | mRMR (15) | 0.81 | 0.82 | 0.79 | 0.80 |
|  | JMI (15) | 0.78 | 0.81 | 0.74 | 0.77 |
| XGBoost | MIM (30) | 0.82 | 0.82 | 0.81 | 0.82 |
|  | mRMR (30) | 0.79 | 0.78 | 0.79 | 0.79 |
|  | JMI (15) | 0.77 | 0.75 | 0.79 | 0.77 |
| Artificial neural network | MIM (15) | 0.70 | 0.71 | 0.63 | 0.67 |
|  | mRMR (15) | 0.70 | 0.70 | 0.65 | 0.68 |
|  | JMI (15) | 0.63 | 0.63 | 0.62 | 0.62 |
| SD-DNN | - | 0.85 | 0.97 | 0.72 | 0.83 |

**Low-level acoustic and i-vector feature fusion**

| Classifier | Feature type | Accuracy | Precision | Recall | F1 score |
| --- | --- | --- | --- | --- | --- |
| SVM | MIM (30) | 0.73 | 0.72 | 0.72 | 0.72 |
|  | mRMR (15) | 0.69 | 0.69 | 0.66 | 0.67 |
|  | JMI (15) | 0.67 | 0.65 | 0.71 | 0.68 |
| Decision tree | MIM (50) | 0.72 | 0.70 | 0.74 | 0.72 |
|  | mRMR (30) | 0.79 | 0.79 | 0.77 | 0.78 |
|  | JMI (10) | 0.73 | 0.73 | 0.71 | 0.72 |
| Random forest | MIM (30) | 0.80 | 0.81 | 0.76 | 0.78 |
|  | mRMR (15) | 0.80 | 0.81 | 0.76 | 0.78 |
|  | JMI (10) | 0.77 | 0.76 | 0.77 | 0.77 |
| AdaBoost | MIM (30) | 0.84 | 0.84 | 0.83 | **0.83** |
|  | mRMR (30) | 0.82 | 0.82 | 0.81 | 0.82 |
|  | JMI (30) | 0.80 | 0.80 | 0.80 | 0.80 |
| Gradient boosting | MIM (30) | 0.84 | 0.84 | 0.83 | **0.83** |
|  | mRMR (50) | 0.83 | 0.85 | 0.79 | 0.82 |
|  | JMI (30) | 0.82 | 0.81 | 0.83 | 0.82 |
| Extra tree | MIM (30) | 0.84 | 0.84 | 0.83 | **0.83** |
|  | mRMR (15) | 0.81 | 0.82 | 0.79 | 0.81 |
|  | JMI (15) | 0.79 | 0.82 | 0.74 | 0.77 |
| XGBoost | MIM (30) | 0.82 | 0.82 | 0.81 | 0.82 |
|  | mRMR (30) | 0.79 | 0.79 | 0.79 | 0.78 |
|  | JMI (15) | 0.77 | 0.77 | 0.76 | 0.76 |
| Artificial neural network | MIM (15) | 0.69 | 0.71 | 0.64 | 0.67 |
|  | mRMR (15) | 0.69 | 0.70 | 0.65 | 0.68 |
|  | JMI (15) | 0.63 | 0.63 | 0.62 | 0.63 |
| SD-DNN | - | - | - | - | - |

**Low-level acoustic and i-vector score fusion**

| Classifier | Feature type | Accuracy | Precision | Recall | F1 score |
| --- | --- | --- | --- | --- | --- |
| SVM | MIM (10) | 0.61 | 0.62 | 0.52 | 0.57 |
|  | mRMR (30) | 0.57 | 0.56 | 0.53 | 0.55 |
|  | JMI (5) | 0.65 | 0.65 | 0.62 | 0.64 |
| Decision tree | MIM (30) | 0.75 | 0.75 | 0.74 | 0.75 |
|  | mRMR (10) | 0.64 | 0.63 | 0.67 | 0.65 |
|  | JMI (10) | 0.74 | 0.72 | 0.77 | 0.75 |
| Random forest | MIM (30) | 0.76 | 0.75 | 0.77 | 0.76 |
|  | mRMR (30) | 0.79 | 0.76 | 0.83 | 0.79 |
|  | JMI (30) | 0.77 | 0.78 | 0.74 | 0.76 |
| AdaBoost | MIM (50) | 0.80 | 0.81 | 0.79 | 0.80 |
|  | mRMR (30) | 0.79 | 0.77 | 0.81 | 0.79 |
|  | JMI (15) | 0.81 | 0.80 | 0.83 | **0.81** |
| Gradient boosting | MIM (30) | 0.78 | 0.77 | 0.79 | 0.78 |
|  | mRMR (30) | 0.80 | 0.81 | 0.79 | 0.80 |
|  | JMI (15) | 0.80 | 0.80 | 0.81 | 0.80 |
| Extra tree | MIM (30) | 0.80 | 0.80 | 0.77 | 0.79 |
|  | mRMR (30) | 0.78 | 0.78 | 0.76 | 0.77 |
|  | JMI (30) | 0.77 | 0.79 | 0.72 | 0.76 |
| XGBoost | MIM (30) | 0.79 | 0.77 | 0.81 | 0.79 |
|  | mRMR (30) | 0.79 | 0.78 | 0.79 | 0.79 |
|  | JMI (30) | 0.75 | 0.75 | 0.73 | 0.74 |
| Artificial neural network | MIM (50) | 0.73 | 0.73 | 0.71 | 0.72 |
|  | mRMR (50) | 0.68 | 0.69 | 0.62 | 0.65 |
|  | JMI (50) | 0.77 | 0.78 | 0.74 | 0.76 |
| SD-DNN | - | - | - | - | - |
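The feature types above are mutual-information-based selection algorithms: MIM ranks each feature by its univariate mutual information with the label, while mRMR and JMI additionally penalize redundancy with features already selected; the parenthesized value is the number of features kept. As a rough illustration, MIM-style selection can be sketched with scikit-learn's univariate MI scorer. This is a minimal sketch on synthetic data, not the paper's pipeline, and mRMR/JMI are not shown:

```python
# MIM-style selection: keep the k features with the highest univariate
# mutual information with the label. Data and dimensions are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif

X, y = make_classification(n_samples=200, n_features=88, random_state=0)

# Keep the 30 highest-MI features, analogous to the "MIM (30)" setting.
selector = SelectKBest(score_func=mutual_info_classif, k=30)
X_top = selector.fit_transform(X, y)
print(X_top.shape)  # (200, 30)
```

The three panels also contrast two ways of combining the acoustic and i-vector views. Below is a minimal sketch of both strategies, assuming random stand-ins for the two feature views and a binary label; all names, dimensions, and the classifier choice are illustrative, not the paper's:

```python
# Feature fusion vs. score fusion over two feature views.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 200
X_acoustic = rng.normal(size=(n, 88))   # stand-in for low-level acoustic features
X_ivector = rng.normal(size=(n, 100))   # stand-in for i-vectors
y = rng.integers(0, 2, size=n)          # stand-in for the binary label

# Feature fusion: concatenate the views, then train a single classifier.
X_fused = np.hstack([X_acoustic, X_ivector])
clf_fused = RandomForestClassifier(random_state=0).fit(X_fused, y)

# Score fusion: train one classifier per view and average the
# predicted class probabilities before taking the argmax.
clf_a = RandomForestClassifier(random_state=0).fit(X_acoustic, y)
clf_i = RandomForestClassifier(random_state=0).fit(X_ivector, y)
scores = (clf_a.predict_proba(X_acoustic) + clf_i.predict_proba(X_ivector)) / 2
y_pred = scores.argmax(axis=1)
```

In practice, the selection step would be run inside cross-validation (per view, or on the fused matrix) so that feature choice does not leak information from the test folds.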