Carnegie Mellon University
Current Projects, Sorted Alphabetically
Depression Assessment
This project aims to compute quantitative behavioral measures related to depression severity from facial expression, body gestures, and vocal prosody in clinical interviews.
Detailed Wall Modeling in Cluttered Environments
The goal of this project is to develop methods to accurately model wall surfaces even when they are partially occluded and contain numerous openings, such as windows and doorways.
Distributed SensorWebs
The Sensor Web initiative develops and implements wireless technology for distributed sensing and actuation in horticultural enterprises.
DRC Tartan Rescue Team
During the Fukushima Daiichi nuclear accident, robots were unable to inspect the facility, assess damage, or fix problems. DARPA wants to change this.
Dynamic Biped
We are developing a new series of bipedal walking robots that use passive-dynamic principles.
Dynamically-Stable Mobile Robots in Human Environments
We are developing novel dynamically stable rolling-machine and walking-machine research platforms to study interaction with people and operation in ordinary home and workplace environments.
E57 Standard for 3D Imaging System Data Exchange
The goal of this project is to develop a vendor-neutral data exchange format for data produced by 3D imaging systems, such as laser scanners.
Ember
The Ember project uses multi-agent teams, comprising autonomous and human agents, to achieve effective results in emergency situations.
Event Detection in Videos
Our event detection method can detect a wide range of actions in video by correlating spatio-temporal shapes to over-segmented videos without background subtraction.
Exploration of Planetary Skylights and Caves
The NREC is developing an untethered, long-range (2,500+ ft) gas line visual inspection robot system that provides real-time video from inside the line, can be deployed in live lines, and can pass through all angles and bends of both 6" and 8" lines.
Extrinsic Dexterity
"Extrinsic Dexterity" is a way to get dexterous manipulation with a very simple hand, by coordinating finger motion with arm motion. The more common approach is to depend entirely on the fingers of the hand, which requires at least three fingers and at least nine motors. We have demonstrated Extrinsic Dexterity using the single motor of the MLab Hand, coordinated with the motions of the arm.
Face Recognition
Recognizing people from images and videos.
Facial Expression Analysis
Automatic facial expression encoding, extraction, recognition, and intensity estimation for applications such as MPEG-4 coding, teleconferencing, and human-computer interaction.
Facial Feature Detection
Detecting facial features in images.
Factory Automation
We are developing the next generation of mobile robots for factory environments. These robots can localize without modifying the factory and navigate any path within it, replanning to avoid unexpected obstacles. These new capabilities will increase factory throughput and reduce the time required to deploy (and re-deploy) the robots.
Feature Selection
Feature selection in component analysis.
Feature-based 3D Head Tracking
A feature-based head tracking algorithm that can handle occlusions and fast motion of the face.
Fine Outreach for Science
The Fine Outreach for Science, sponsored by the Fine Foundation, provides GigaPan units to scientists and documents the evolution of GigaPan as a research tool.
We are developing videotactile fingertip sensors which will enable people to interact with the visible world via their fingertips.
Footstep Planning for Biped Robots
Navigation strategies for bipeds through complex environments, planning for the full capabilities of the biped.
Forecasting the Anterior Cruciate Ligament Rupture Patterns
Use of machine learning techniques to predict the injury pattern of the Anterior Cruciate Ligament (ACL) using non-invasive methods.
Formal Models of Human Control and Interaction with Cyber-Physical Systems
Cyber-Physical Systems (CPS) encompass a wide variety of systems, including future energy systems (e.g., the smart grid), homeland security and emergency response, smart medical technologies, smart cars, and air transportation. The goal of this project is to develop cognitively based analytic models of human operators that can be integrated with models of the physical/robotic system, so that the whole mixed human-CPS system can be formally verified.
Formal Verification of Autonomous Systems
We are developing tools and techniques to support formal verification of autonomous systems.
Foundation for MEMS Synthesis (MEMSYN)
This project aims to shorten the MEMS development cycle.
Free-Roaming Planar Motors
We are developing autonomous planar motors for precision positioning.
Frontal Face Alignment
This face alignment method detects generic frontal faces with large appearance variations and 2D pose changes and identifies detailed facial structures in images.