Call for RAs: Alexa- vs. human-directed speech project (Dr. Michelle Cohn, UCD Phonetics Lab)
Postdoctoral fellow Dr. Michelle Cohn is recruiting undergraduate research assistants (RAs) to serve as confederates in the Alexa- (Amazon Echo) vs. human-directed speech project for Spring 2020 (with the possibility of a paid STDT 2 position in Summer 2020 and LIN 199 units in Fall 2020).
RAs are expected to work approximately 6–8 hours a week on the project and will receive 2 units of LIN 199 credit per quarter (note that 4 units count as an upper-division elective toward the LIN major).
Details (Spring Qtr 2020):
Application deadline: March 8, 2020 (by midnight PST)
To apply, email the following to the project PI, Dr. Cohn (email@example.com):
Georgia Zellou, Michelle Cohn, and Bruno Ferenc Segedin presented four papers at Interspeech in Graz, Austria.
See below for links to the papers:
Congrats to Michelle Cohn for receiving a two-year NSF Postdoctoral Fellowship to work with Dr. Georgia Zellou (PI: UC Davis Phonetics Lab, Dept. of Linguistics), Dr. Zhou Yu (PI: UC Davis Language and Multimodal Interaction Lab, Dept. of Computer Science), and Dr. Katharine Graf Estes (PI: UC Davis Language Learning Lab, Dept. of Psychology).
Dr. Georgia Zellou and Dr. Michelle Cohn gave invited talks at the June 2019 Voice User Interface (VUI) Summit at the Amazon headquarters.
A team led by Georgia Zellou has begun a collaborative research effort exploring how people adjust their speech to digital devices, such as Siri or Alexa.
For example, recent Ph.D. graduate Michelle Cohn has been recording conversations with Gunrock, a social bot created by Zhou Yu's lab that is currently in the semifinals for the Amazon Alexa Prize. You can talk to it yourself if you have an Alexa-enabled device: just say "Let's chat!" and it will randomly invoke one of the three social bots in the running for the grand prize.
Above: our microphone next to the Amazon Echo, capturing the interaction.