Latest release: 2009.08.18 (version 0.9.1)
MyExperience is a context-aware data collection platform for capturing objective and subjective data as it's experienced.

about

MyExperience is a BSD-licensed, open source mobile data collection tool for Windows Mobile devices (including PDAs and mobile phones), built on the .NET Compact Framework 2.0 and Microsoft SQL Server Compact Edition. MyExperience is available for free on SourceForge as a beta release. Please see our wiki for documentation and our blog for updates.

MyExperience combines sensing and self-report to collect both quantitative and qualitative data on human behaviors, attitudes, and activities in the field. Using a mobile phone's wireless internet connectivity, researchers can access MyExperience data as it is collected, allowing for ongoing analysis of study data and early detection of subject compliance problems or technology issues.

MyExperience is based on a three-tier architecture of sensors, triggers, and actions: triggers use sensor event data to conditionally launch actions. One novel aspect of MyExperience is that its behavior and user interface are specified via XML and a lightweight scripting language, similar to the HTML/JavaScript paradigm on the web.
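
The skeleton below is a minimal structural sketch of this architecture. The element names mirror the full, runnable configuration shown in the example section further down; the "SketchStudy" study name and the placeholder sensor, trigger, and action are illustrative only.

  <!--Structural sketch only; see the full, runnable example below.-->
  <myexperience name="SketchStudy" version="1.0">
    <sensors>
      <!--Tier 1: sensors produce timestamped event data.-->
      <sensor name="Location" type="GpsSensor"/>
    </sensors>
    <actions>
      <!--Tier 3: actions, such as launching a self-report survey.-->
      <action name="ShortSurvey" type="SurveyAction">
        <property name="EntryQuestionId">FirstQuestion</property>
      </action>
    </actions>
    <triggers>
      <!--Tier 2: triggers run script when sensor values change and
      conditionally launch actions.-->
      <trigger name="LocationTrigger" type="Trigger">
        <script>
          loc = GetSensor("Location");
          CreateAction("ShortSurvey").Run();
        </script>
      </trigger>
    </triggers>
    <!--The questions referenced by the SurveyAction are defined in a
    questions section, omitted here for brevity.-->
  </myexperience>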

data collection

As a mobile data collection platform, MyExperience has been designed to record a wide range of data including sensor readings, images, video, audio, and user surveys. Sensor data is automatically timestamped and recorded to a local SQL Server Compact Edition database running on the mobile phone without any user intervention (the data can also be synchronized wirelessly with a web server).

The beta release of MyExperience ships with over 50 built-in sensors, including support for GPS, GSM-based motion sensing (based on cellular signals), and device usage information (e.g., button presses, battery life, etc.). The sensor events themselves can be used to trigger custom actions such as initiating wireless database synchronization, sending SMS messages to the research team, and/or presenting in situ self-report surveys.
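
As an illustration of this pattern, the fragment below sketches a trigger that notifies the research team over SMS when the device battery runs low. This is a hedged sketch: the BatteryLevelSensor and SmsAction type names, their properties, and the battery.Level field are hypothetical stand-ins rather than the exact identifiers shipped in the beta.

  <sensors>
    <!--Hypothetical device-usage sensor exposing the battery level.-->
    <sensor name="Battery" type="BatteryLevelSensor"/>
  </sensors>

  <actions>
    <!--Hypothetical action that sends an SMS to the research team.-->
    <action name="NotifyTeam" type="SmsAction">
      <property name="PhoneNumber">+15550100</property>
      <property name="Message">Battery low; please check the study device.</property>
    </action>
  </actions>

  <triggers>
    <trigger name="LowBatteryTrigger" type="Trigger">
      <script>
        battery = GetSensor("Battery");
        // fire when the battery level drops below 10 percent
        if(10 > battery.Level){
          CreateAction("NotifyTeam").Run();
        }
      </script>
    </trigger>
  </triggers>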

Other sensors can easily be added via our plug-in architecture. For example, researchers at Intel Research, Seattle developed a MyExperience software sensor to interface with a Bluetooth-based activity-inference hardware sensor that could recognize activities such as running, walking, and bicycling (see Figure 1).

Figure 1. The Mobile Sensing Platform (MSP), developed by Intel Research, Seattle, is worn on the belt and is capable of inferring human activities. This data is wirelessly communicated via Bluetooth to the user's mobile phone running MyExperience. In this example, MyExperience is configured to trigger a short self-report survey for the user after a detected walking episode completes.
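
A hedged sketch of the configuration described in Figure 1 might look like the fragment below. The MspActivitySensor type and its CurrentActivity/PreviousActivity properties are hypothetical stand-ins for the actual plug-in's interface, and the sketch assumes a SurveyAction named "PostWalkSurvey" is defined elsewhere in the file.

  <sensors>
    <!--Hypothetical plug-in sensor wrapping the MSP's Bluetooth activity inference.-->
    <sensor name="Activity" type="MspActivitySensor"/>
  </sensors>

  <triggers>
    <trigger name="WalkEndedTrigger" type="Trigger">
      <script>
        act = GetSensor("Activity");
        // launch the survey when a walking episode has just completed
        if(act.PreviousActivity == "Walking"){
          if(act.CurrentActivity != "Walking"){
            CreateAction("PostWalkSurvey").Run();
          }
        }
      </script>
    </trigger>
  </triggers>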

self-report surveys

The beta version of MyExperience provides fourteen separate survey response widgets (a selection of which are shown below), ranging from radio button lists and text fields to widgets that allow the subject to take pictures, record video, or even record their responses as audio. This response data is stored in a local database on the mobile device, which can be synchronized wirelessly via WiFi or the cellular network to the research team's servers.

Figure 2. MyExperience allows researchers to specify a variety of self-report response widgets to gather both closed-form (e.g., Likert scale) and open-form data (e.g., audio recordings).
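
As a hedged illustration, the question below uses a picture-capture widget; the "CameraCapture" widget name is a hypothetical stand-in for whichever of the fourteen built-in widgets actually captures photos.

  <questions>
    <!--Hypothetical widget name; the beta's picture widget may be named differently.-->
    <question id="MealPhoto"
        text="Please take a picture of what you are eating right now.">
      <response widget="CameraCapture"/>
    </question>
  </questions>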

example

The XML file below demonstrates how a researcher would program MyExperience for a study relating heart rate and perceived levels of pain. Note that the file is only around 50 lines long (including comments) and collects both sensor and user response data. Here, two sensors are used: a GPS sensor and a heart rate sensor. A trigger is constructed to invoke a "pain survey" whenever the subject's heart rate exceeds 150 beats per minute. The pain survey involves two questions: the first asks if the subject is currently experiencing pain; if so, a follow-up question asks for a verbal description of this pain. The heart rate, location data, and survey responses are automatically recorded to a SQL database on the phone, which can then be synchronized with a server-side database.
  <?xml version="1.0" encoding="utf-8"?>
  <myexperience name="PainStudy" version="1.0">
    <sensors>
      <!--Define our two sensors.-->
      <sensor name="LocationSensor" type="GpsSensor"/>
      <sensor name="HeartSensor" type="HeartRateSensor"/>
    </sensors>

    <actions>
      <!--Define our pain survey action. Make sure to set
      the EntryQuestionId property as that is required-->
      <action name="PainSurvey" type="SurveyAction">
        <property name="EntryQuestionId">PainLocation</property>
      </action>
    </actions>

    <triggers>
      <!--Define our one trigger. Triggers are automatically called when
      their sensor values change. In this case, the trigger gets a
      reference to the "HeartSensor" and checks to see if the heart
      rate is above 150bpm. If so, the pain survey is launched-->
      <trigger name="HeartRateTrigger" type="Trigger">
        <script>
          hrSensor = GetSensor("HeartSensor");
          if(hrSensor.StateEntered > 150){
            painSurveyAction = CreateAction("PainSurvey");
            painSurveyAction.Run();
          }
        </script>
      </trigger>
    </triggers>

    <!--In this example, we only ask two questions.-->
    <questions>
      <!--Ask the subject if they are currently experiencing pain.
      If so, we branch to the "AudioDescription" question. Otherwise,
      the self-report survey ends.-->
      <question id="PainLocation"
          text="Are you currently experiencing pain?">
        <response widget="RadioButtonList">
          <option goto="AudioDescription">Yes</option>
          <option>No</option>
        </response>
      </question>

      <!--This question is only asked if the subject responded with
      "Yes" to the "PainLocation" question. It launches an AudioRecorder
      widget so that the subject can verbally respond with their answer.-->
      <question id="AudioDescription"
          text="Please describe the pain you are feeling.">
        <response widget="AudioRecorder"/>
      </question>
    </questions>
  </myexperience>

More examples and documentation on how to write the MyExperience.xml file can be found on the project wiki.

studies

Although still in development, MyExperience has already been successfully used in a wide range of studies including:
  • A study investigating the use of wearable activity-inference devices and mobile phone technology to promote physical activity. This work was conducted by Intel Research, Seattle and the Computer Science and Engineering department and Information School at the University of Washington
  • A study by the Digital Health Group at Intel using Bluetooth-enabled wearable heart rate variability monitors to trigger mobile therapy sessions related to stress and/or anger management
  • A joint project by the University of Washington Exploratory Center for Obesity Research and the Department of Urban Design and Planning looking at the correspondence between sensor measured physical activity levels and geospatial location
  • An investigation of the link between a person's place visit behaviors and their preference for those places (e.g., if I frequently visit Pagliacci's pizza, can we infer that I like pizza or, further, that I like Italian food in general?). This study was jointly conducted by Intel Research, Seattle and the Computer Science and Engineering department at the University of Washington.
  • Two pilot studies exploring automatically inferred context and mobile phone usage: one investigated the use of SMS in relation to the user's motion, and the other studied battery charging behavior and location.

methodologies

The Experience Sampling Method (ESM), also referred to as Ecological Momentary Assessment (EMA), was developed primarily by Csikszentmihalyi and Larson at the University of Chicago Department of Psychology in the early 1980s. ESM, as a research method, is largely characterized by in situ sampling of a subject's thoughts, feelings or behaviors as they are experienced. Compared to other self-report techniques (e.g., retrospective surveys, interviews), ESM can provide more accurate assessments of everyday behaviors because the data does not suffer from recall bias. In the 1980s, ESM research was typically conducted using a pager and paper/pencil; subjects would carry around and fill out small notebooks, typically formatted with predefined questions or scantron sheets. Since the late 1990s, however, electronic data collection tools such as the Experience Sampling Program (ESP) have often replaced the paper/pencil method. Electronic self-report offers numerous advantages over its non-digital counterpart including time-stamped data, access to data as it is being collected, multimedia capture (e.g., audio or video), and the incorporation of sensor data.

Context-aware experience sampling extends the traditional sampling strategies used in ESM by incorporating sensing technologies such as GPS, accelerometers, and heart-rate sensors to automatically trigger sampling events (e.g., automatically detecting when a subject arrives at home to remind them to take their medicine and fill out a short survey). The sensor data can also be used as an additional source of data for analysis to augment self-report (e.g., correlating the subject's heart rate with their self-reported activity). Context-aware experience sampling was pioneered by Professor Stephen Intille and colleagues at MIT with the Context-Aware Experience Sampling (CAES) tool. The MyExperience project is heavily influenced by this work and other past tools and is currently focused on offering a new generation of data collection methods for mobile devices. For more information on in situ self-report methods, see An Overview of In Situ Self Report and the MyExperience Tool (PDF).
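
As a hedged illustration of the arrive-at-home scenario above, the fragment below sketches a context-triggered survey. The PlaceSensor type and its CurrentPlace property are hypothetical stand-ins (the beta's GPS or GSM sensors could play a similar role), and the sketch assumes a SurveyAction named "MedicationSurvey" is defined elsewhere in the configuration.

  <sensors>
    <!--Hypothetical place sensor reporting the subject's current semantic location.-->
    <sensor name="Place" type="PlaceSensor"/>
  </sensors>

  <triggers>
    <trigger name="ArrivedHomeTrigger" type="Trigger">
      <script>
        place = GetSensor("Place");
        // prompt the medication reminder survey when the subject arrives at home
        if(place.CurrentPlace == "Home"){
          CreateAction("MedicationSurvey").Run();
        }
      </script>
    </trigger>
  </triggers>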

The MyExperience tool supports all of the popular sampling strategies found in ESM, including the following (a hedged configuration sketch of variable/random-interval sampling appears after the list):

  • Regular Intervals: recordings scheduled at regular intervals
  • Daily: recording once daily, usually at the end of the day
  • Interval: recording multiple times per day
  • Intensive: recording very frequently (e.g., once an hour or more)
  • Variable/Random: recording scheduled at random, variable intervals
  • Event: recording triggered by an event of interest (e.g., the subject is asked to journal every time s/he smokes)
  • Context/Sensed: like event-contingent recording but uses automatically sensed context to signal the subject (e.g., the subject is prompted when his/her heart rate spikes)
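
The fragment below sketches how variable/random-interval sampling might be configured. It is a hedged sketch: the RandomTimeSensor type and its properties are hypothetical (the testimonials below mention "random time sensors", but the exact names and configuration options may differ), and a SurveyAction named "DailySurvey" is assumed to be defined elsewhere in the file.

  <sensors>
    <!--Hypothetical random-time sensor that fires several times per day
    at variable intervals within a waking-hours window.-->
    <sensor name="RandomPrompt" type="RandomTimeSensor">
      <property name="StartTime">09:00</property>
      <property name="EndTime">21:00</property>
      <property name="SamplesPerDay">6</property>
    </sensor>
  </sensors>

  <triggers>
    <!--Following the pattern of the heart-rate example above, the trigger is
    invoked when its sensor fires and simply launches the survey.-->
    <trigger name="RandomSampleTrigger" type="Trigger">
      <script>
        CreateAction("DailySurvey").Run();
      </script>
    </trigger>
  </triggers>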
We believe MyExperience offers tremendous new data collection opportunities for researchers interested in employing the ESM technique in their work. However, MyExperience can also be used strictly as a journaling or diary application or, alternatively, in any situation where structured, form-based mobile data collection is needed (as is becoming quite popular in developing-world research).

tool history

The MyExperience project was started by Intel Research, Seattle and the University of Washington in the spring of 2005 out of a need to collect sensor-based location information along with self-report data on a user's cell phone. At the time, MyExperience was a secondary effort, driven in large part by the needs of a study called "Vote with your Feet" (PDF) led by Jon Froehlich, Mike Chen, and Ian Smith. After this study was completed, it became clear that many other interesting studies could be conducted that incorporated both sensor data and user-response data in the field. Moreover, the cell phone proved to be a near-perfect platform for data collection, as it was virtually always with the participant and intrinsically had the ability to wirelessly synchronize data back to the research team in real time. Thus, MyExperience soon became a primary focus, and funds were appropriated in the spring and fall of 2006 to make MyExperience more generalizable for field studies. In February 2007, MyExperience was open sourced under the BSD license and the source code repository moved from Intel Research to SourceForge, where it still resides today.

testimonials

It kept surprising me with how flexible MyExperience was -- your program is really neat.
-Dr. Stuart Ferguson, Psychologist, University of Pittsburgh
People express interest in the tool frequently.
-Dr. Margaret Morris, Psychologist, Digital Health Group at Intel
MyExperience is such a good platform. It is very easy to get into the code.
-Jürgen Stumpp, Universität Karlsruhe
What's most interesting about MyExperience is that it can trigger anything in response to such a wide range of events or combinations of events, as well as that it captures all of the data too, to help with analysis.
-Adrienne Andrews, University of Washington
The structure of the XML is excellent and is deeply expandable through C# extensions to the MyExperience system. With the starting version of MyExperience and with custom updates, our team has produced some very complex trigger-based journaling. We are using the manual and random time sensors. We also created, integrated, and are using several new sensors: a general phone button sensor, an Outlook tagged appointment sensor, and a BT iMote beacon detector (location). We also added new script behavior and several new widgets such as an image map, animation view, and time input. Finally, we have significantly updated the smart button list control to enable some new UI capabilities such as a thermometer scale selection and better check box behavior.
-Bill Deleeuw, Intel Engineer, Digital Health Group

acknowledgements

We gratefully acknowledge the resources and support provided by the following organizations:

University of Washington Computer Science & Engineering (UW CSE), the dub group, and Intel.