Method | Precision | Recall | F1 |
---|---|---|---|
BoW-MNB | 0.44 | 0.31 | 0.36 |
Feature-rich-MN | 0.69 | 0.32 | 0.44 |
Feature-rich-SVM | 0.71 | 0.31 | 0.44 |
User model-CNN | 0.59 | 0.45 | 0.51 |
LSTM | 0.50 | 0.39 | 0.44 |
Bi-LSTM | 0.56 | 0.40 | 0.47 |
LSTM-attention | 0.54 | 0.35 | 0.42 |
SGL-CNN | 0.51 | 0.56 | 0.53 |
MGL-CNN | 0.63 | 0.48 | 0.54 |
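The F1 column above is the harmonic mean of each method's precision and recall. As a quick sanity check (a minimal sketch; the method names and scores are taken directly from the table, the helper function is illustrative), the reported F1 values can be reproduced from the other two columns:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Reproduce the F1 column from the precision/recall values in the table.
for name, p, r in [("BoW-MNB", 0.44, 0.31), ("MGL-CNN", 0.63, 0.48)]:
    print(f"{name}: F1 = {f1_score(p, r):.2f}")
# → BoW-MNB: F1 = 0.36
# → MGL-CNN: F1 = 0.54
```

Both recomputed values match the table to two decimal places.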