Every once in a while, a great discussion takes place over at the eM+C LinkedIn Group. One that happened recently is no exception.
On July 10, Chris Goward, CEO and co-founder of web optimization firm WiderFunnel Marketing and an eM+C LinkedIn Group member, asked a great question in the discussion area of the group. It was titled “Do McAfee or Hacker Safe Security Badges Increase E-commerce Conversion Rate[s]?”
“We’ve all heard the claims from security badge vendors that adding these icons will lift your conversion rate,” he said. “But, do they actually deliver in real-world A/B tests?”
He also posted new results his firm had recently published on the topic and asked members for their thoughts. “Have you seen positive (or negative) results from adding security badges?” he asked.
Rick Isenberg, eM+C LinkedIn Group member and president at RBI Marketing Consulting, responded shortly thereafter.
“Hacker Safe case studies, as I recall, are now years old,” he said. “It went from being a ‘what’s that up there?’ to ‘everyone has it’ to ‘maybe I don’t care.’ And now you have browsers determining safety, etc.”
All of the following will impact the results, Isenberg noted:
1. Who is your customer? Experienced web shopper, or inexperienced web shopper?
2. Did you put it in a place that makes sense? Does it conflict with the other items in that location?
Multivariate results, not just A/B results, also would be important to see with regard to this issue, Isenberg said. Why? Because he ran an A/B test on a website a while ago with and without a security logo, and the version without the logo fared better.
From that result, he concluded that the logo should not be used. “I’d like to see testing done with all the possible logos and in many different locations,” he said. “Does having it next to the cart image hurt you, for example, but having it on the top of the page actually help? I’ve seen too many times where people do an A/B test, both online and offline, and then draw conclusions that aren’t correct, i.e., they take the conclusion too far.”
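One reason a single A/B result can mislead is plain sampling noise: with modest traffic, the “winning” variation may not be statistically distinguishable from the loser. As a rough illustration (the conversion counts below are entirely hypothetical, not from Isenberg’s test), a two-proportion z-test sketch in Python shows how an apparent lift can fall short of significance:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates.

    conv_a / conv_b: number of conversions in each variation
    n_a / n_b: number of visitors in each variation
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical data: the no-logo variation "wins" 105 conversions to 90,
# each variation seen by 1,000 visitors.
z = two_proportion_z(90, 1000, 105, 1000)

# |z| below 1.96 means the difference is not significant at the 95% level,
# so concluding "drop the logo" from this test alone would be taking the
# conclusion too far.
print(abs(z) < 1.96)
```

This is only a sketch; a real program would also report a p-value and check sample-size assumptions before calling a winner.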
Goward then responded. “You’re describing an A/B test with multiple variations of similar elements. A multivariate test, in contrast, would have multiple sections of the page that vary independently and concurrently, which could end up with odd combinations showing multiple placements of similar logos around the page,” he said. “It may seem like splitting hairs, but they’re very different test structures.”
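The structural distinction Goward draws can be made concrete with a few lines of Python. In an A/B/n test, one element varies while everything else stays fixed; in a multivariate test, independent elements vary at once, so the number of combinations is the product of the option counts. The element names and options below are illustrative assumptions, not from either test discussed above:

```python
from itertools import product

# Hypothetical page elements that could vary in a badge test
badge_options = ["mcafee", "hacker_safe", "none"]
placement_options = ["next_to_cart", "top_of_page", "footer"]

# A/B/n test: only the badge varies; placement is held constant
ab_variations = [{"badge": b} for b in badge_options]

# Multivariate test: badge and placement vary independently and
# concurrently, yielding every combination (3 x 3 = 9 recipes)
mvt_combinations = [
    {"badge": b, "placement": p}
    for b, p in product(badge_options, placement_options)
]

print(len(ab_variations))    # 3 variations to split traffic across
print(len(mvt_combinations)) # 9 combinations to split traffic across
```

The combinatorial growth is exactly why multivariate tests need far more traffic than A/B tests to reach significance on each recipe.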
He added, “You’re correct about taking conclusions too far. It’s a common temptation that we often coach clients through. No single test would show that security logos always lower or raise conversions. Only through an ongoing commitment to testing new variations can you start to see patterns and learning.”
Great points! What do you think about this subject? Any real-world stories? If so, feel free to leave a comment here.
Want to participate in similar discussions? Then join the eM+C LinkedIn Group. There’s always something interesting going on there.