Some surveys are designed and commissioned to collect data. Others are designed and commissioned to collect money.
Every other month, at least, we get news articles and commentary fretting about some recent poll showing that some allegedly worrisome percentage of some group either believes in something apparently anathema or else refuses to believe in something that’s supposed to be elementary and fundamental for that group.
The polls themselves are designed to produce those articles, which, in turn, are designed to flatter, frighten, and outrage their readers. That’s the sweet spot. Give people a chance to feel superior, scared, and angry all at once and they will reach for their checkbooks.
This is a reliably lucrative business model. Poll, publicize, fundraise. Cash the checks and commission the next poll.

The most effective practitioners of this fundraising art keep two mailing lists. The first consists of a not-at-all random sample of people selected for their potential to provide “alarming” responses to the loaded questions in your survey. The second consists of those likely to find those solicited and elicited answers alarming. You conduct your polls with the first group and send your fundraising letters to the second.
You’ll also want to cultivate a third mailing list. That one consists of journalists, pundits, and media platforms who have demonstrated an eagerness to take your poll results at face value and to amplify the flattery, fear, and outrage you’re banking on. This will turn out to be a rather long list.
I suspect that at least some of the practitioners of this business model approach this work with a calculating cynicism that exceeds even my cynical description of it above. But I don’t think that’s true of all of them. Many of them, I think, started out with some intention of using surveys to compile data about public opinion and beliefs, but gradually succumbed to the subtle incentives that turned that project into the incompatible project of poll-driven fundraising. You can do a data poll and produce data or you can do a money poll and produce money. But you can’t do both at the same time and, well, you can’t pay the rent with data.
All of which is to say that the next time you see a headline lamenting that “45% of Christians don’t believe in Jesus” or “72% of clergy don’t believe in ordination,” feel free to ignore it. That alarming “statistic” is the product of people who set out to create an alarming statistic. Their “survey” was refined by trial and error until its fuzzy questions produced the apparent clarity of the desired results, which are, in turn, intended to produce the desired emotional response.
That desired emotional response is the key to recognizing a money poll. There will be other hints — such as the murkiness of the actual polling questions and the difficulty you’ll have even finding out what they were. But the biggest indicator will be the way the results of the poll are presented to short-circuit the better angels of your nature. It will flatter you with assurances of your superiority while encouraging you to imagine the worst of your neighbors. It will tempt you to think “Thank God I am not like those publicans” and “Kids these days” and “This country is going to Hell in a handbasket” because none of those thoughts reflects or encourages the best version of you, and that best version of you wouldn’t be inclined to write a big check in response to their fundraising.
A good rule of thumb here is the ZOMG!!1! test. If the headline announcing the poll results wouldn’t read materially different with a “ZOMG!!1!” tacked on before or after it, then you’re looking at a money poll.
Money polls never tell us anything reliably true about the beliefs and opinions they claim to be measuring. But they tell us everything we need to know about the groups commissioning — and profiting from — those polls.