some USSD testing and some delays

I ended my last post with the promise to share our experiences with the distribution of mobile phones and also to talk about the testing of new technologies for data gathering (particularly USSD). Unfortunately, there have been some delays in setting a date for resuming the panel (which is currently not active), and the distribution of phones has stalled as well. As things stand now, the 50 phones will most likely be distributed during the course of next week, which means that I will not be involved in the exercise personally, as I will be leaving for Amsterdam in a couple of days.

When it comes to the testing of USSD, however, there is more to say. About three weeks ago, we conducted a pilot test of USSD for administering a two-question survey to 50 respondents selected from our panel. As these respondents had so far always been called for the interviews and had never used any other channel for submitting their responses, we first sent a text message explaining the procedure and asking them to initiate the questionnaire by punching in a specific number code. However, it soon became clear that a simple text message is not sufficient to make respondents comfortable with switching technologies: quite a few respondents sent their responses as a text message, and others called the call centre number to ask what they were supposed to do. In the end, only one person answered both questions using USSD.

While this seems like a highly disappointing result at first, the main lesson I would take away from this is not that USSD cannot be used for conducting surveys. I still believe that the technology has great potential (see my earlier posts on this) for quickly gathering simple data from large samples. The main lesson, rather, seems to be that personal on-the-spot training of respondents is crucial whenever anything other than voice is being used. This seems all the more important given that many respondents have probably used USSD for other purposes in the past (e.g. for transferring money). While this is admittedly not based on solid evidence, I believe that if respondents had been trained during the face-to-face interviews to use USSD as a surveying tool, response rates would have been much higher. But this remains to be seen, as Data Vision now has an automated system in place capable of further testing the technology for surveying purposes.
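To give a concrete idea of what such an automated flow can look like, here is a minimal sketch of a two-question USSD session handler. This is purely illustrative and is not Data Vision's actual system: the survey questions, the `save_responses` helper, and the convention that replies starting with "CON" keep the session open while "END" closes it are all assumptions on my part (the latter is a pattern used by several USSD gateways).

```python
# Illustrative sketch of a two-question USSD survey flow (not Data Vision's
# actual system). Assumes a gateway that calls handle_ussd() with a session
# id and the text entered so far, and that "CON ..." keeps the session open
# while "END ..." closes it -- a convention used by several USSD gateways.

QUESTIONS = [
    "Q1: How many meals did your household eat yesterday?",
    "Q2: Is your local water point working? 1=Yes 2=No",
]

def handle_ussd(session_id: str, text: str) -> str:
    """Return the next USSD screen for a session.

    `text` is the '*'-joined sequence of inputs the respondent has sent
    so far in this session (empty string when the session starts).
    """
    answers = [a for a in text.split("*") if a] if text else []

    if len(answers) < len(QUESTIONS):
        # Ask the next unanswered question and keep the session open.
        return "CON " + QUESTIONS[len(answers)]

    # All questions answered: store the responses and end the session.
    save_responses(session_id, answers)
    return "END Thank you, your answers have been recorded."

def save_responses(session_id: str, answers: list) -> None:
    # Hypothetical storage step; a real system would write to a database.
    print(f"{session_id}: {answers}")

# Simulated session (the gateway would normally drive these calls):
print(handle_ussd("s1", ""))      # -> CON Q1 ...
print(handle_ussd("s1", "3"))     # -> CON Q2 ...
print(handle_ussd("s1", "3*1"))   # -> END Thank you ...
```

The appeal of a flow like this is that the respondent only ever types a single digit per screen, which is exactly the kind of simple, large-sample data gathering I have in mind above.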

As I mentioned above, and much to my frustration, I will not be present for the actual distribution of the mobile phones. As this is such an important step, not only in increasing the panel size (currently at about 310 active respondents) before the new round of surveying starts, but above all in making the sample more representative, I will make sure to soon post some of the results and field experiences of my colleagues at Uwazi/Twaweza and Data Vision.
