When I was writing last week’s post about poor customer service, I remembered a study I had written about back in 2019. Released by the Norwegian CRM provider SuperOffice, it documented how terrible many companies were at responding to customer service emails.
At the time, the study was mentioned in a number of articles. The findings were compelling:
Sixty-two percent of companies didn’t respond to customer service emails at all. Ninety percent didn’t even let the customer know their email had been received. Given the topic of my post, this was exactly the type of empirical evidence I was looking for.
There was just one problem. The original study was done in 2018. I wondered if it had been updated. After a quick search, I thought I had hit pay dirt. Based on the landing page (which came at the top of the results for “customer service benchmark report”), a new 2023 study was available.
Perfect, I thought. I filled in the lead contact form, knowing I was tossing my name into a lead-generation mill. I figured, “What the hell. I’m willing to trade that for some legit research.” I eagerly downloaded the report.
It was the same one I had seen four years earlier. Nothing was new.
Puzzled, I carefully went over the landing page wording. Sure enough, it said a new report had just been released. It gave some tidbits of the new findings, all of which were exactly the same as the 2018 report. After each “finding,” I was told “Tweet this!”
I was starting to get the whiff of something rotten from the State of Norway.
I tracked down the post author through LinkedIn. He was an SEO contractor based in Estonia. He replied saying he thought the company was still working on the new report.
I then reached out to the company. I not only wanted to see what they said about the report; I also wanted to see whether they would respond to my email at all. Did they walk their own talk?
To their credit, they did respond: “We are sorry that the report have [sic] not been updated, and right now we have no plans to do that.”
So, the landing page was a bald-faced lie? I pointed this out in an email back to them. They apologized and said they would update the landing page to be more accurate. Judging by the current version, it has been nudged in that direction, but it is still exceedingly misleading.
This is just one example of how corporate white papers are churned out to grab some attention, get some organic search rankings, and collect leads. I fell for it, and I should have known better. I had already seen the inside of this sausage factory.
Back in the days when we did usability research, more than one company asked us to do a commissioned study. Those discussions generally started with the same words: “Here is what we’d like the research to say.”
I’m guessing things haven’t changed much since then. Most of the corporate research I quote in this column is commissioned by companies who are selling solutions to the problems the research highlights.
For any of you in the research biz, you know ethically what a slippery slope it can be. Even in the supposedly pristine world of academic research, you don’t have to turn over too many rocks to uncover massive fraud, as documented in this Nature post. Imagine, then, the world of corporate commissioned white paper research, where there is no such thing as peer review or academic rigor. It’s the gloves-off, no-holds-barred, grimy underbelly of research.
With our own research, I always tried to make sure the work itself was done well. When we did take on commissioned research, we tried to satisfy the people who paid the bills through the rigor of our approach, not by slanting the interpretation. That’s probably why we didn’t get a lot of commissions. Most of the research we did was for our own purposes, and we did our best to keep it legit. When we did get sponsors, they went in understanding that we would let the results frame the narrative, rather than the other way around. I wanted to produce research that people could trust.
That was the biggest letdown of the SuperOffice experience. When I saw how cavalier the company was about how they presented the research on their landing page, I realized that not only could I not trust their promotion of the research, but I also had trouble trusting the original research itself. I suspected I might have been duped into passing along questionable information the first time.
Fool me once…