AT&T Labs Fellowship Award Winner
Course you most regretted not taking? None, yet. Designing Multimodal Interfaces for More Natural ... a summer project at AT&T Research centers on multimodal interfaces.
At AT&T Labs - Research, we apply our speech, language and media technologies to give people with disabilities more independence, privacy and autonomy.
The AT&T speech mashup is a web service that performs speech tasks for web applications, enabling applications on smartphones and other devices to send speech to a recognizer and play back synthesized voice responses.
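A minimal sketch of how a client might talk to such a speech web service. The endpoint URL, parameter names (`grammar`, `resultFormat`), and the `build_recognize_request` helper are all illustrative assumptions for this sketch, not AT&T's actual speech mashup API.

```python
# Hypothetical speech-web-service client sketch: package recorded audio
# as an HTTP POST to a recognizer endpoint. Nothing here is AT&T's real
# API; the host and parameter names are placeholders.
import urllib.parse
import urllib.request

ASR_ENDPOINT = "https://speech.example.com/recognize"  # placeholder host

def build_recognize_request(audio_bytes: bytes,
                            grammar: str = "dictation",
                            content_type: str = "audio/wav") -> urllib.request.Request:
    """Build (but do not send) an HTTP request carrying the audio buffer."""
    query = urllib.parse.urlencode({"grammar": grammar, "resultFormat": "json"})
    return urllib.request.Request(
        f"{ASR_ENDPOINT}?{query}",
        data=audio_bytes,          # raw audio payload in the POST body
        headers={"Content-Type": content_type},
        method="POST",
    )

req = build_recognize_request(b"\x00" * 16)
print(req.full_url)
print(req.get_method())
```

In a real deployment the client would send this request and parse the recognizer's JSON result; the request-building step is separated out here so the shape of the exchange is visible without a live service.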
Johnston, Michael J.
My research concerns natural language processing for multimodal dialogue systems and its application to the creation of prototype multimodal interfaces for next-generation interactive services.
Research on interactive video indexing and retrieval, multimodal interfaces (text, image, speech, touch, visual gesture), content processing, machine learning, biometrics, data mining, and NLP.
Living rooms getting smarter with multimodal and multichannel signal processing
Speech and multimodal interfaces could provide a more natural means for addressing such challenges.
Florham Park, NJ 07932. ABSTRACT: Multimodal interfaces combining natural ... over the years ([4, 13, 17]), building multimodal interfaces remains a ...
More advanced multimodal interfaces will demand an approach to modeling and handling those ... in multimodal interfaces, rather than location reference in pa...