
Mobile vs. PC Responses To a Survey Using Long Lists

By Don Phipps, CEO, Applied Marketing Research

 

Applied recently completed a study designed to let respondents answer via mobile smartphones (Apple iPhone and Android phones) and PCs (Apple and Windows-based). While this article will not disclose any findings from that study, I wanted to issue a short paper on my experience designing a survey for mobile and PC administration, along with a demographic comparison of the answer sets for the two types of respondents.

One major difference between a PC-only survey and a dual-purpose mobile/PC survey is the additional testing required. Visualizing how the same survey appears on a smartphone screen as opposed to a PC screen, with scrolling on the smartphone that may not be required on a PC, is an important consideration in mixed-device survey design. Mobile respondents may tire of the scrolling and drop out of the survey. Testing is therefore more involved: it covers not only the programming logic but also the visual display of the survey on two radically different screen sizes.

For this survey, testing involved iPhones, Android phones, and iPads, as well as Windows and macOS laptops. This essentially doubled the amount of time we typically allocate to testing. The survey instrument also incorporated a number of long lists, as well as piping and other programming that needed thorough review.

The sample size for this study was robust: an N of 700, which yields an overall margin of error of ±3.1%.
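For readers who want the arithmetic, the conventional margin-of-error formula for a simple random sample is shown below. Assuming the maximum-variance case p = 0.5, the reported ±3.1% for N = 700 corresponds to roughly a 90% confidence level (z ≈ 1.645); a 95% level (z = 1.96) would give about ±3.7%.

\[
\text{MoE} = z\,\sqrt{\frac{p\,(1-p)}{N}} = 1.645\,\sqrt{\frac{0.5 \times 0.5}{700}} \approx 0.031
\]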

Even though respondents were given the opportunity to respond via mobile phone or PC, only 4 in 10 (39%) chose to respond by smartphone, while 6 in 10 (61%) responded via PC.

Respondents ages 45-54 were more likely to respond via PC, while those ages 35-44 were more likely to respond via mobile. No significant differences in response device were observed among those ages 18-24 or 25-34.

Compared to those responding via mobiles, PC respondents were:

  • More likely to hold graduate degrees
  • More likely to say they did not have kids age 18 or younger living with them
  • More likely to report household income of $150,000 or higher.

Compared to those responding via PCs, mobile respondents were:

  • More likely to hold technical or associates degrees
  • More likely to say they had kids age 18 or younger living with them
  • More likely to report household income of $15,000-$29,999.

Those responding via PC were more likely to say they made most of the household decisions related to day-to-day finances, while those responding via mobile were more likely to say they made those decisions in collaboration with another member of the household.

We had two other concerns in approaching the survey. One was drop rates. Our hypothesis was that, no matter how much care went into survey development and testing, we would experience higher drop rates among mobile users than among PC users. This was in fact the case. Only 16% of PC users dropped out before completing the survey; the rate for mobile users was nearly double that, with 30% terminating before completion.

The other concern was how much longer the survey would take on a mobile device than on a PC. Post-survey data reveal that mobile users spent 18.9 minutes on the survey while PC users spent only 12.1 minutes, meaning it took mobile users 56% longer to complete the survey. That's a lot of extra time.
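For clarity, the 56% figure follows directly from the two reported means:

\[
\frac{18.9 - 12.1}{12.1} = \frac{6.8}{12.1} \approx 0.56
\]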

Be that as it may, we are now in a world where mobile phones are used to complete surveys on a regular basis. Being cognizant of the additional testing required to run a survey on mobile as well as PC, the extra time mobile respondents need to complete a survey, and the tendency toward higher drop rates on mobile should help researchers complete projects successfully while minimizing sampling effects based on the device itself.