
PhD Thesis Proposal

Jack Henry Good
PhD Student, Robotics Institute, Carnegie Mellon University
Tuesday, November 14
4:30 pm to 6:00 pm
GHC 8102
Trustworthy Learning using Uncertain Interpretation of Data

Abstract:
Non-parametric models are popular in real-world applications of machine learning. However, many modern ML methods for ensuring that models are pragmatic, safe, robust, fair, and otherwise trustworthy in increasingly critical applications assume parametric, differentiable models. We show that, by interpreting data as locally uncertain, we can achieve many of these properties without being limited to parametric or inherently differentiable models. In particular, we focus on decision trees, which are popular for their good performance on tabular data as well as their ease of use, low design cost, low computational requirements, fast inference, and interpretability. We propose a new kind of fuzzy decision tree that we call a kernel density decision tree (KDDT), because its uncertain interpretation of the input is similar to kernel density estimation.
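
To make the idea of an uncertain input interpretation concrete, here is a minimal sketch of soft prediction in a fuzzy tree, assuming a box kernel of fixed bandwidth centered at each observed feature value. The Node structure, function names, and kernel choice are illustrative assumptions, not the KDDT implementation from the thesis.

    import numpy as np
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Node:
        # Hypothetical node layout for a soft (fuzzy) decision tree.
        feature: Optional[int] = None        # index of the split feature (None at a leaf)
        threshold: float = 0.0               # split threshold
        left: Optional["Node"] = None
        right: Optional["Node"] = None
        value: Optional[np.ndarray] = None   # class-probability vector stored at a leaf

    def soft_predict(node: Node, x: np.ndarray, bandwidth: float = 0.5) -> np.ndarray:
        """Route a sample fractionally down both branches of every split.

        The observed value is treated as uncertain: a box kernel of width
        2 * bandwidth is centered at x[feature], and the kernel mass falling
        on each side of the threshold becomes the membership of each child.
        """
        if node.value is not None:           # leaf: return its stored distribution
            return node.value
        lo = x[node.feature] - bandwidth
        hi = x[node.feature] + bandwidth
        p_left = float(np.clip((node.threshold - lo) / (hi - lo), 0.0, 1.0))
        return (p_left * soft_predict(node.left, x, bandwidth)
                + (1.0 - p_left) * soft_predict(node.right, x, bandwidth))

    # Tiny usage example with a depth-1 tree over one feature.
    tree = Node(feature=0, threshold=0.0,
                left=Node(value=np.array([0.9, 0.1])),
                right=Node(value=np.array([0.2, 0.8])))
    print(soft_predict(tree, np.array([0.1])))   # blends both leaves near the threshold

Because each sample contributes fractional mass to both children of a split, the predicted output varies smoothly with the input, which is what makes gradient-based methods applicable to such trees.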

We organize the completed and proposed contributions of this thesis into three pillars. The first pillar is robustness and verification: we show improved robustness under various adverse conditions and discuss verification of safety properties for fuzzy decision trees (FDTs) and KDDTs. The second pillar is interpretability: by leveraging the efficient fitting and differentiability of our trees, we alternate between optimizing a parametric feature transformation by gradient descent and refitting the tree, obtaining compact, interpretable single-tree models with competitive performance. The third pillar is pragmatic advancements: we make advances in semi-supervised learning, federated learning, and ensemble merging for decision trees.
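
As a rough illustration of the alternating scheme in the second pillar (not the thesis' KDDT procedure), the sketch below pairs a standard scikit-learn decision tree, used as a stand-in for the refitting step, with a linear feature transformation trained by gradient descent through a sigmoid relaxation of the fitted tree's splits. The dataset, names, temperature, and the relaxation itself are assumptions for illustration only.

    import numpy as np
    import torch
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic data purely for illustration.
    X_np, y_np = make_classification(n_samples=300, n_features=10,
                                     n_informative=4, random_state=0)
    X = torch.tensor(X_np, dtype=torch.float32)
    y = torch.tensor(y_np, dtype=torch.long)

    transform = torch.nn.Linear(10, 3, bias=False)   # parametric feature transformation
    optimizer = torch.optim.Adam(transform.parameters(), lr=0.05)

    def soft_tree_probs(node, Z, t, temperature=0.1):
        """Differentiable relaxation of a fitted sklearn tree via sigmoid gates."""
        if t.children_left[node] == -1:                       # leaf node
            counts = torch.tensor(t.value[node][0], dtype=Z.dtype)
            return (counts / counts.sum()).expand(Z.shape[0], -1)
        feat, thr = int(t.feature[node]), float(t.threshold[node])
        p_left = torch.sigmoid((thr - Z[:, feat]) / temperature).unsqueeze(1)
        return (p_left * soft_tree_probs(t.children_left[node], Z, t, temperature)
                + (1.0 - p_left) * soft_tree_probs(t.children_right[node], Z, t, temperature))

    for outer in range(5):
        # Step 1: refit the tree to the currently transformed features.
        Z_np = transform(X).detach().numpy()
        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(Z_np, y_np)
        # Step 2: update the transformation by gradient descent through the relaxed tree.
        for _ in range(50):
            optimizer.zero_grad()
            probs = soft_tree_probs(0, transform(X), tree.tree_)
            loss = torch.nn.functional.nll_loss(torch.log(probs + 1e-9), y)
            loss.backward()
            optimizer.step()
        print(f"round {outer}: loss {loss.item():.3f}")

In the thesis' setting the tree itself is differentiable, so no such relaxation would be needed; the stand-in here only conveys the alternation between gradient updates of the transformation and refitting of the tree.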

Thesis Committee Members:
Artur Dubrawski, Chair
Jeff Schneider
Tom Mitchell
Gilles Clermont, University of Pittsburgh