VASC Seminar

Onur C. Hamsici, PhD Candidate, Ohio State University
Monday, May 19
3:30 pm
Spherical-Homoscedastic Distributions and the Design of Bayes Optimal Classifiers

Event Location: NSH 1507
Bio: Onur C. Hamsici received the BS degree in Electrical and Electronics
Engineering from Middle East Technical University, Ankara, Turkey in
2003, and the MS degree in Electrical and Computer Engineering from The
Ohio State University (OSU), in 2005. He is currently a PhD candidate at
OSU. His research interests are statistical pattern recognition, machine
learning, and vision.

Abstract: Many feature representations, as in genomics, describe directional data
where all feature vectors share a common norm. In other cases, as in
computer vision, a norm or variance normalization step, where all
feature vectors are normalized to a common length, is generally used.
These representations and this pre-processing step map the original data from
a Euclidean space to the surface of a hypersphere. Such representations
should then be modeled using spherical distributions. However, the
difficulty associated with such spherical representations has impaired
their use in practice. Instead, researchers commonly model their
spherical data using Gaussian distributions — as if the data were
represented in R^p rather than on S^(p-1). This raises the question of
whether the classification results calculated with the Gaussian
approximation are the same as those obtained when using the original
spherical distributions. In this talk, I will show that in some
particular cases (which we have named spherical-homoscedastic) the
answer to this question is positive. In the more general case, however,
the answer is negative. I will then show that the more the data deviates
from the spherical-homoscedastic model, the less advisable it is to employ
the Gaussian approximation. This leads to the derivation of a set of
Bayes optimal classifiers for spherical-homoscedastic and
spherical-heteroscedastic distributions, the latter being derived by means of
an appropriate kernel function that maps the original feature space into
one where the data adapts to the spherical-homoscedastic model. The
resulting non-linear classifiers have applications in a large number of
real problems; I will show examples in classifying images of objects,
gene expression sequences, and text data. Time permitting, I will sketch
the problem of Bayes optimality in feature extraction.
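The norm-normalization step described in the abstract can be sketched as follows. This is an illustrative NumPy example of mapping Euclidean feature vectors onto the unit hypersphere S^(p-1), not the speaker's code; the data here is random and purely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))  # 100 hypothetical feature vectors in R^5

# Normalize each row to unit length: the data now lies on the sphere S^4.
X_sphere = X / np.linalg.norm(X, axis=1, keepdims=True)

# After normalization, all feature vectors share a common norm of 1,
# so a model over R^5 (e.g., a Gaussian) no longer matches the geometry;
# a spherical distribution is the natural fit.
print(np.allclose(np.linalg.norm(X_sphere, axis=1), 1.0))
```

Fitting a Gaussian to `X_sphere` anyway is exactly the approximation the talk examines: it treats points constrained to the sphere as if they filled the ambient Euclidean space.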