
94% said they'd buy, only one-third did

 

94% of consumers said they were willing to pay a price premium for an energy-efficient light bulb, but only one-third actually purchased one.

This is the problem with “willingness-to-pay” (WTP) research, and why not all behavioural studies are the same.

When people are asked a hypothetical question like “would you buy this product?” or “how much would you pay?”, they give you their best guess. However well intentioned that answer might be, it doesn’t mean it will translate into real behaviour.

The light bulb moment

In the case of the light bulbs, the researchers wanted to put this gap to the test.

A random selection of hardware store customers was asked about their attitudes towards energy-saving products and their willingness to pay for energy-efficient light bulbs.

The researchers thanked them with a discount coupon for light bulbs, with discounts ranging from 10% to 90%, and used coupon redemptions to track how many customers actually bought the globes over the following month.

For half of the customers, the coupon brought the price below what they had said they were willing to pay. And remember, 94% had claimed they were willing to pay a premium for an energy-efficient bulb in the first place.

However, only one-third of customers surveyed actually purchased a globe. Worse, only 19% of those who said they’d be happy to pay a premium went on to buy.

According to the researchers, “...after controlling for the size of the coupon discount itself, self-reported willingness to pay had an insignificant effect on coupon redemption behaviour.”

Now, this study had a small sample size (342 people), was conducted in a small developing island state (Saint Lucia in the Caribbean) and tested only one product (energy-efficient light bulbs), but it illustrates the precarious nature of survey-based, stated-preference research.

As the researchers note, “the results from self-reports of WTP consistently overestimate consumers' willingness to pay, and this bias has made stated preference methods controversial in economic and marketing literatures due to the extent to which predictions fail to correspond to actual market behaviour.”

Implications for you

If you are basing your decisions on ASKING your customers (or colleagues, for that matter), you are likely getting a distorted and unreliable gauge of behaviour.

Three lessons from this:

  1. When you see a headline or read an article about what consumers are doing, question the methodology. If it’s based on a survey, take the findings with a big grain of salt.
  2. Look for research that is based on experimental observation, ideally a randomised controlled trial (RCT). Better still, run experiments yourself (see the sketch after this list).
  3. To get a complete view of customer behaviour, make sure you have all four bases of my customer insights landscape covered. Learn about it in my Influencing Action course.
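
If you want to run this kind of check yourself, here is a minimal sketch in Python of the comparison at the heart of the study: stated intent versus actual coupon redemption under randomly assigned discounts. The sample, the discount levels and the purchase model are all simulated purely for illustration; this is not the researchers’ data or analysis.

    # Minimal sketch: stated willingness to pay vs. observed purchases in a
    # randomised coupon experiment. All numbers are invented for illustration;
    # this is NOT the Saint Lucia data or the authors' statistical model.
    import random

    random.seed(1)

    N = 342  # same sample size as the study; the behaviour below is simulated

    customers = []
    for _ in range(N):
        says_premium = random.random() < 0.94                 # survey answer
        discount = random.choice([0.1, 0.3, 0.5, 0.7, 0.9])   # randomly assigned coupon
        # Assumed purchase model: the size of the discount drives behaviour
        # far more than the stated preference does.
        p_buy = 0.05 + 0.40 * discount + (0.05 if says_premium else 0.0)
        bought = random.random() < p_buy
        customers.append((says_premium, bought))

    said_yes = [c for c in customers if c[0]]
    print(f"Said they'd pay a premium:      {len(said_yes) / N:.0%}")
    print(f"Actually redeemed a coupon:     {sum(b for _, b in customers) / N:.0%}")
    print(f"Redeemed among the 'yes' group: {sum(b for _, b in said_yes) / len(said_yes):.0%}")

The point is simply that the survey column and the behaviour column can diverge wildly, which is exactly the gap the coupon data revealed.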



This article was also published in Smartcompany: https://www.smartcompany.com.au/people-human-resources/dont-rely-on-what-customers-say/

Ref: Reynolds, T., Kolodinsky, J., & Murray, B. (2012). “Consumer preferences and willingness to pay for compact fluorescent lighting: Policy implications for energy efficiency promotion in Saint Lucia”, Energy Policy, 41(C), 712–722.


