Journal Article

Application of artificial intelligence CNN model in emotional recognition of instrumental music

2025
Overview
To enhance the accuracy and expressiveness of emotional recognition in instrumental music, this study proposes a multi-layer music emotion recognition model that integrates a Convolutional Neural Network (CNN), a Bidirectional Gated Recurrent Unit (BiGRU), and an attention mechanism, with the aim of accurately capturing the complex emotional information in audio. First, the CNN extracts local emotional features from the audio. Then, the BiGRU models the contextual information of the time series, strengthening the temporal continuity of emotional expression. Finally, the attention mechanism is introduced to focus dynamically on key emotional segments. To achieve multi-scale feature fusion, the model combines low-level audio features and high-level semantic features through weighted summation during the feature extraction stage. The experiments are conducted on three music emotion datasets: two publicly available datasets, Instrument Recognition in Musical Audio Signals (IRMAS) and the Multitrack Dataset for Musical Audio (MedleyDB), and the large-scale Database for Emotional Analysis of Music (DEAM), in order to comprehensively evaluate the model's performance, generalization ability, and robustness. These datasets cover a large number of instrumental audio samples across multiple categories. The model is evaluated on three continuous emotional dimensions: Valence, Arousal, and Dominance. The experimental results show that the proposed model achieves Pearson correlation coefficients of 0.871, 0.832, and 0.784, respectively, outperforming the comparison models. The corresponding Mean Squared Error (MSE) values are 0.0187, 0.0208, and 0.0243, indicating higher prediction accuracy. In conclusion, the proposed fused deep neural network model significantly improves the accuracy and generalization ability of emotional recognition in instrumental music, providing an effective method and practical inspiration for emotion modeling in complex musical environments.
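The abstract does not include the authors' implementation. As a rough illustration of the kind of CNN-BiGRU-attention pipeline it describes, the following PyTorch sketch extracts frame-level features with a small CNN, fuses them with projected low-level spectrogram features by weighted summation, models temporal context with a BiGRU, pools over time with additive attention, and regresses the three continuous dimensions (Valence, Arousal, Dominance). All layer sizes, the mel-spectrogram input shape, and the fusion weight are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch only (not the authors' code): a CNN + BiGRU + attention
# regressor for continuous music emotion dimensions (Valence, Arousal, Dominance).
# Input shape, layer sizes, and the fusion weight `alpha` are assumptions.
import torch
import torch.nn as nn


class CnnBiGruAttention(nn.Module):
    def __init__(self, n_mels=128, cnn_channels=64, gru_hidden=128, n_outputs=3, alpha=0.5):
        super().__init__()
        self.alpha = alpha  # assumed weight for fusing low-level and high-level features
        # CNN front end: extracts local emotional features from a mel-spectrogram
        self.cnn = nn.Sequential(
            nn.Conv2d(1, cnn_channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(cnn_channels),
            nn.ReLU(),
            nn.MaxPool2d((2, 1)),            # pool over frequency, keep time resolution
            nn.Conv2d(cnn_channels, cnn_channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(cnn_channels),
            nn.ReLU(),
            nn.MaxPool2d((2, 1)),
        )
        feat_dim = cnn_channels * (n_mels // 4)
        # Project the low-level (raw spectrogram) frames to the same dimension so the
        # weighted-summation fusion described in the abstract can be applied per frame.
        self.low_proj = nn.Linear(n_mels, feat_dim)
        # BiGRU: models the temporal context of the fused feature sequence
        self.bigru = nn.GRU(feat_dim, gru_hidden, batch_first=True, bidirectional=True)
        # Additive attention over time steps: focuses on key emotional segments
        self.attn_score = nn.Sequential(
            nn.Linear(2 * gru_hidden, 64), nn.Tanh(), nn.Linear(64, 1)
        )
        self.head = nn.Linear(2 * gru_hidden, n_outputs)  # Valence, Arousal, Dominance

    def forward(self, spec):                    # spec: (batch, n_mels, time)
        x = self.cnn(spec.unsqueeze(1))         # (batch, C, n_mels // 4, time)
        b, c, f, t = x.shape
        high = x.permute(0, 3, 1, 2).reshape(b, t, c * f)     # high-level frame features
        low = self.low_proj(spec.transpose(1, 2))             # low-level frame features
        fused = self.alpha * low + (1.0 - self.alpha) * high  # weighted-summation fusion
        seq, _ = self.bigru(fused)                            # (batch, time, 2 * hidden)
        weights = torch.softmax(self.attn_score(seq), dim=1)  # (batch, time, 1)
        context = (weights * seq).sum(dim=1)                  # attention-pooled summary
        return self.head(context)                             # continuous V/A/D predictions


# Example: a batch of four clips, 128 mel bands, 430 frames each.
model = CnnBiGruAttention()
dummy = torch.randn(4, 128, 430)
print(model(dummy).shape)  # torch.Size([4, 3])
```

The attention-pooled context vector is one common way to realize "dynamically focusing on key emotional segments"; the paper may use a different attention formulation or fusion scheme.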