Mobile Research - State of the Union Pulse - The Good, the Bad, and the Ugly

So, we just wrapped up our first mobile pulse polling project, run during the State of the Union -- our first app-based, real-time research project. I'll document some of the issues, bugs, and ideas that came out of it.

Emails + Push Notification

We decided to send emails as well as push notifications (for iPhone users) -- turns out this pisses people off (and confuses them). We should _either_ send push notifications _or_ send emails; sending both does not make sense (obvious in hindsight).

Opt-Out link on Email:

The opt-out link was not getting appended to the emails -- again, a rookie mistake. We were so busy focusing on the iPhone/Android user experience that we missed the opt-out piece on the emails. We also need an opt-out in the app itself: deleting the app does not help, because (from a technical standpoint) we are not guaranteed notification when the app is deleted. That is a simple technical issue and will be fixed next time around.
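Since the link was simply never appended, a guard in the email-building step would have caught it. Here's a minimal sketch of the idea -- the function names, the URL format, and the mail-service hand-off are all hypothetical, not our actual stack:

```python
# Sketch: make the opt-out footer impossible to forget.
# build_email / send_email and the URL format are invented for illustration.

def build_email(body: str, recipient_id: str) -> str:
    """Append an opt-out footer to every outgoing survey email."""
    opt_out_url = f"https://example.com/opt-out?uid={recipient_id}"
    footer = f"\n\n--\nTo stop receiving these surveys, visit: {opt_out_url}"
    return body + footer

def send_email(body: str, recipient_id: str) -> None:
    message = build_email(body, recipient_id)
    # Fail loudly if the footer is ever missing, instead of silently
    # mailing without it (our rookie mistake).
    assert "opt-out" in message, "refusing to send email without opt-out link"
    # ... hand message off to the mail service here ...

print(build_email("How did the speech land for you?", "u123"))
```

The point is less the footer itself than the assertion in the send path: a compliance requirement should be checked where the message leaves the system, not trusted to every caller.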

Multiple Emails:

OK -- for some reason, some folks were getting multiple emails for each survey. The way we set this up was to send out a "pulse" each time we ran a survey. When we do these live polling models, we should NOT be sending emails at all -- just rely on push notifications. A single email at launch would probably have sufficed. We ended up sending about 5 emails (we did a PRE survey, 3 pulses, and a POST survey) -- probably too much, but we also had to keep all our partners on board. Next time, we'd limit the PRE-event survey to about 5 questions, send the pulses via push notification only (no email), and deliver the post-event survey via push notification as well.
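That plan -- one email at launch, push for everything else, never both -- boils down to a simple routing rule. A sketch of it, where the `User` shape, the channel names, and the survey kinds are all illustrative rather than our real schema:

```python
# Sketch of the "email only at launch, push for everything else" rule.
# The User shape and survey kinds are invented for illustration.

from dataclasses import dataclass
from typing import Optional

@dataclass
class User:
    email: str
    push_token: Optional[str]  # None if the user has no app installed

def channels_for(survey_kind: str, user: User) -> list:
    """Pick exactly one delivery channel, so nobody gets email AND push."""
    if user.push_token:
        # App users only ever get push notifications.
        return ["push"]
    if survey_kind == "PRE":
        # Non-app users get a single launch email; pulses and the
        # POST survey are push-only, so they simply skip those.
        return ["email"]
    return []

print(channels_for("PULSE", User("a@example.com", "tok-1")))  # ['push']
print(channels_for("PRE", User("b@example.com", None)))       # ['email']
print(channels_for("PULSE", User("b@example.com", None)))     # []
```

Because each user resolves to at most one channel per survey, the "both email and push" problem and the duplicate-email problem disappear by construction.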

The data-set:

This is where things became interesting. Each of the pulses was filled out almost instantly: we got 85% of the responses within 5 minutes of sending the pulse. We'll be sharing the data-set after we scrub it for privacy artifacts like IP addresses.
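The scrub mentioned above might look something like this -- a hedged sketch that drops identifying columns outright and masks anything IP-shaped hiding in free-text answers; the column names are invented:

```python
# Sketch: scrub privacy artifacts (IP addresses) from responses
# before sharing a data-set. Column names are invented for illustration.

import re

IP_RE = re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}\b")
DROP_COLUMNS = {"ip_address", "user_agent"}  # never ship these

def scrub_row(row: dict) -> dict:
    clean = {}
    for key, value in row.items():
        if key in DROP_COLUMNS:
            continue  # drop identifying columns entirely
        if isinstance(value, str):
            # Mask anything IP-shaped inside free-text answers too.
            value = IP_RE.sub("[redacted]", value)
        clean[key] = value
    return clean

row = {"answer": "Loved it, sent from 10.0.0.7", "ip_address": "203.0.113.5"}
print(scrub_row(row))  # {'answer': 'Loved it, sent from [redacted]'}
```

Dropping the column is the safe default; the regex pass is a second line of defense for identifiers that leak into fields you intended to keep.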

The other thing we did was create the pulse questions on the fly. All the partners were in a group chat session, and we decided on each pulse survey there. This is probably where we got carried away and sent about 3 pulses. For example, after seeing the seating arrangement, we pulsed a question about it. This is where real-time polling gets interesting -- the questions can be defined and executed in real time.

Top Line data:

We did 5 surveys in all: a PRE, 3 pulses, and a POST. In addition, we had a dial test running in parallel. (Yes, a lot going on -- but this was a first pilot, and we wanted to push the limits to see where things break!) We had told everyone this was a beta test and the first time anyone had attempted a mobile research project at this scale: over 20 partners collectively deciding the questions on the fly over a real-time chat system and pushing the surveys out to users.

Here are some top line reports:



I'll be posting more data as we clean things up.

Finally, the mobile software SurveySwipe is still in beta; in many ways, this project was a full-scale field test. With that in mind, I'm not surprised there are still a few bugs -- I've yet to see a mobile survey app that doesn't have them. I suspect the opt-out challenge is just part of that beta process.



