Recognition of Music Types
Conference Paper, Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '98), Vol. 2, pp. 1137-1140, May 1998
Abstract
This paper describes a music type recognition system that can be used to index and search multimedia databases. A new approach to temporal structure modeling is proposed. The so-called ETM-NN (explicit time modelling with neural network) method abstracts acoustical events to the hidden units of a neural network. This new set of abstract features, representing temporal structures, can then be learned by a traditional neural network to discriminate between different types of music. The experiments show that this method significantly outperforms HMMs.
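The abstract only outlines the two-stage idea (frame-level acoustic events abstracted to hidden units, then a second network trained on a summary of their temporal structure). The sketch below illustrates that idea in PyTorch; the layer sizes, the sigmoid event units, the mean/delta temporal summary, and the class count are illustrative assumptions, not the authors' exact ETM-NN design.

import torch
import torch.nn as nn


class EventAbstractor(nn.Module):
    """Maps per-frame acoustic features to activations of abstract
    'acoustical event' units (the hidden units mentioned in the abstract)."""

    def __init__(self, n_features: int = 13, n_events: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 64),
            nn.ReLU(),
            nn.Linear(64, n_events),
            nn.Sigmoid(),  # event-unit activations in [0, 1]
        )

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (time, n_features) -> (time, n_events)
        return self.net(frames)


class MusicTypeClassifier(nn.Module):
    """Discriminates music types from a fixed-length summary of the temporal
    structure of event activations (here: mean activation plus mean
    frame-to-frame change, an assumed stand-in for explicit time modelling)."""

    def __init__(self, n_events: int = 32, n_types: int = 4):
        super().__init__()
        self.classifier = nn.Sequential(
            nn.Linear(2 * n_events, 64),
            nn.ReLU(),
            nn.Linear(64, n_types),
        )

    def forward(self, events: torch.Tensor) -> torch.Tensor:
        # events: (time, n_events)
        mean_act = events.mean(dim=0)
        mean_delta = (events[1:] - events[:-1]).abs().mean(dim=0)
        summary = torch.cat([mean_act, mean_delta])  # (2 * n_events,)
        return self.classifier(summary)  # unnormalised score per music type


if __name__ == "__main__":
    frames = torch.randn(500, 13)           # 500 frames of 13-dim features
    events = EventAbstractor()(frames)      # abstract event activations
    scores = MusicTypeClassifier()(events)  # one score per music type
    print(scores.shape)                     # torch.Size([4])

Both stages would be trained on labeled music clips; the key point, per the abstract, is that classification operates on learned abstractions of acoustical events rather than on raw frame features, which is what lets the second network capture temporal structure.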
BibTeX
@conference{Soltau-1998-16603,
  author    = {Hagen Soltau and Tanja Schultz and Martin Westphal and Alex Waibel},
  title     = {Recognition of Music Types},
  booktitle = {Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '98)},
  year      = {1998},
  month     = {May},
  volume    = {2},
  pages     = {1137--1140},
}
Copyright notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.