A physical computing tool called ‘The Emotional City’, designed to help city planners and urban designers gauge the potential emotional responses that their neighborhood designs elicit from specific user personas  |  December 2013

The Emotional City consists of a large board modeling a city grid populated with configurable building types, and a robot fitted with an LCD screen and an NFC reader. The designer or planner physically moves the robot step by step along a possible route to simulate wayfinding, commuting or walking through the city. Unlike digital tools, which favor running simulations quickly, The Emotional City makes the designer take time to consider how each aspect of the user journey affects the user emotionally.

Each configurable building type, such as ‘high-rise office’, ‘low-rise residential’ or ‘park’, is associated with an NFC tag and a database ID. As the robot moves through the city space it ‘reacts’ to each building type through a set of emotional metrics drawn from a user-persona database. For example, moving the robot closer to a park may raise its engagement level, while passing too many skyscrapers in a row will lower it. The output of moving the character through the space is a real-time visualization: an emotional face on a monochromatic LCD screen representing levels of engagement, boredom and belonging.

My role: Research, low-fidelity prototyping, testing with RFID and NFC technology, fabricating the board on a CNC router, laser-cutting the robot’s body and computer programming (Arduino)