Voice Interaction Research at the MRL

Our research on voice interaction has recently been featured in various places online, and in talks and presentations. This is an attempt at an archive of highlights (most recent up top). Stuart on Computerphile on “How Alexa Works”: Stuart Reeves appeared on Computerphile in November 2019. Progressivity for Voice User Interface Design: our paper with the above title ...

RoboClean project

RoboClean: Human-Robot Collaboration for Allergen-Aware Factory Cleaning. This project will investigate and demonstrate the potential of human-robot collaboration, integrated with IoT sensors for cleaning and allergen detection on a factory floor. The outcomes of this project will include the design, implementation, and evaluation of an interactive connected system enabling novel human-robot collaboration and sensor data ...

The CharIoT Project: Energy Advice for Households

The CharIoT Project has developed the CharIoT Energy Kit (available here) to support people in managing their energy use. The platform relies on a sensor kit to capture environmental data (such as humidity and temperature), and a visual user interface to interrogate the data for problems such as damp and cold. CharIoT was a collaboration ...
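To illustrate the kind of check such a platform can run over its humidity and temperature readings, here is a minimal, hypothetical sketch: it flags a room as at risk of damp when the measured air temperature is close to the dew point implied by the relative humidity. The reading fields, threshold, and function names are illustrative assumptions, not the CharIoT kit's actual logic.

```python
import math
from dataclasses import dataclass

@dataclass
class Reading:
    room: str
    temp_c: float        # air temperature in degrees Celsius
    humidity_pct: float  # relative humidity, 0-100

def dew_point_c(temp_c: float, humidity_pct: float) -> float:
    """Magnus-formula approximation of the dew point in Celsius."""
    a, b = 17.62, 243.12
    gamma = (a * temp_c) / (b + temp_c) + math.log(humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

def damp_risk(reading: Reading, margin_c: float = 3.0) -> bool:
    """Flag a room whose temperature is within margin_c of the dew point,
    i.e. where condensation (and hence damp/mould) becomes likely."""
    return reading.temp_c - dew_point_c(reading.temp_c, reading.humidity_pct) <= margin_c

# Illustrative readings; in practice these would come from the sensor kit.
readings = [Reading("bedroom", 16.0, 85.0), Reading("kitchen", 21.0, 55.0)]
for r in readings:
    if damp_risk(r):
        print(f"{r.room}: possible damp/condensation risk")
```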

CSCW 2017 Paper and Workshop on ‘talking with conversational agents’

Some of us at the Mixed Reality Lab including Martin Porcheron and Stuart Reeves have become interested in the use of Conversation Analysis to study so-called ‘conversational’ agents. Together with colleagues from Stockholm, Edinburgh, São Paulo, and Cambridge we have just hosted a workshop at CSCW in Portland on the topic (all participants’ position papers are here), ...

HCI Journal Special Issue on Collocated Interaction: New Challenges in ‘Same Time, Same Place’ Research

Guest editors: Joel Fischer, Barry Brown, Andrés Lucero and Stuart Reeves. This is a tentative version of the Call for Papers for the Human-Computer Interaction Special Issue on Collocated Interaction. The definitive version is published in this Taylor & Francis Google Doc. CALL FOR PAPERS In the 25 years since Ellis, Gibbs, and Rein proposed the time-space taxonomy [3], ...

Collaborations with machines

This position was presented at the CSCW 2016 panel on “Innovations in autonomous systems: Challenges and opportunities for human-agent collaboration”. For an overview of the positions presented, see this abstract in the ACM DL. The time could not be more pertinent for the CSCW community to engage with the many challenges posed by ...

CHI ’16 paper: “Just whack it on until it gets hot”: Working with IoT Data in the Home

We have a forthcoming paper at this year’s CHI conference. You can download the PDF from the link below.

CSCW 2016 Workshop on Collocated Interaction: New Challenges in ‘Same Time, Same Place’ Research

San Francisco, CA, USA, 27th February 2016. Submission due dates: 22nd December 2015 or 8th January 2016. https://collocatedinteraction.wordpress.com/ Call for Participants

New grant: Future Everyday Interaction with the Autonomous Internet of Things

Excited to announce that we have been awarded an EPSRC research grant. The team brings together researchers from the Mixed Reality Lab at the University of Nottingham with researchers from the Agents, Interaction and Complexity Group at the University of Southampton. The project will run from April 2016 for three years. Proposal summary: This project ...

Mini Documentary on our Work with Rescue Global

We have made a short documentary on our collaboration with Rescue Global, a disaster response charity from London, thanks to the support of filmmaker Raj. The film shows researchers from the EPSRC-funded ORCHID project doing ethnographic fieldwork and developing technologies to support Rescue Global’s planning work. The result is the Augmented Bird Table (ABT), a ...