Automatika
Journal for Control, Measurement, Electronics, Computing and Communications
Volume 65, 2024 - Issue 3
Regular Paper

Deep neural network-based emotion recognition using facial landmark features and particle swarm optimization

Pages 1088-1099 | Received 19 Dec 2023, Accepted 11 Apr 2024, Published online: 22 Apr 2024

Figures & data

Table 1. Prior work on facial emotion recognition.

Figure 1. MUG database basic expressions with one sample image.

Figure 2. GEMEP Corpus with four sample frames.

Figure 3. Proposed approach for emotion recognition.

Figure 4. Face localization and landmark detection.

Figure 5. Manually annotated 66-dimensional geometrical face points.

Table 2. Facial landmark coordinates for feature extraction.

Figure 6. Flowchart for optimizing the DNN hyperparameters using PSO.
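The hyperparameter search summarized in Figure 6 follows the standard particle swarm optimization loop: each particle encodes a candidate hyperparameter configuration, and its velocity is pulled toward both its personal best and the swarm's global best. The sketch below is a minimal, generic illustration of that loop, not the authors' implementation; the surrogate objective, its optimum (learning rate 0.01, 64 hidden units), and all parameter names are hypothetical stand-ins for a real validation-loss evaluation.

```python
import random

def pso(objective, bounds, n_particles=10, n_iters=30,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `objective` over the box `bounds` with basic PSO."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # per-particle best position
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive pull (pbest) + social pull (gbest)
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Hypothetical surrogate for validation loss: pretend the optimum sits at
# learning rate 0.01 and 64 hidden units (illustrative values only).
def surrogate_loss(params):
    lr, units = params
    return (lr - 0.01) ** 2 + ((units - 64) / 100.0) ** 2

best, loss = pso(surrogate_loss, bounds=[(1e-4, 0.1), (8, 128)])
```

In a real run, `surrogate_loss` would be replaced by training the DNN with the candidate hyperparameters and returning its validation loss, which is why PSO is attractive here: it needs only objective evaluations, no gradients.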

Figure 7. Feature distribution of emotions in the MUG dataset.

Figure 8. Feature distribution of emotions in the GEMEP dataset.

Figure 9. Deep neural network architecture.

Table 3. Hyperparameter settings for training the DNN using the PSO algorithm.

Figure 10. Confusion matrix for the MUG dataset with 66 features.

Table 4. Performance measures for the seven basic emotions.

Figure 11. Confusion matrix for the GEMEP dataset.

Table 5. Performance measures for micro-coded emotions.

Figure 12. (a) and (b) Model performance for the MUG dataset; (c) and (d) model performance for the GEMEP dataset.

Figure 13. Emotion recognition from the real-time dataset.

Table 6. State-of-the-art results achieved on the emotion datasets.