For all sorts of reasons, opinion polls mean far less than we think.

Some people don’t trust promises of anonymity, and either refuse to participate or tell the pollsters what they think the pollsters want to hear.  This is especially true in an era when some opinions are subject to harsh penalties.  You can lose your job.

Some think they understand the question but don’t, especially when news coverage has been inadequate.  Are they being asked whether black lives matter, whether they agree with the actions of people who say black lives matter, whether they agree with the reported actions of people who say black lives matter, or whether they agree with their demands?

Some are systematically underrepresented because of how the pollsters divide people into groups and weight them.  Do they weight them according to how they are registered, how they say they are likely to vote, or how they voted in the last election?

Perhaps the best-known reason is that the answers people give depend on how the question is worded.  Are they asked whether they support restrictions on abortion on demand, on “reproductive liberty,” or on “a woman’s right to choose”?

These problems aren’t easy to solve, partly because of the biases of the pollsters themselves.  Quis custodiet ipsos custodes?  It would be interesting to poll the pollsters, but whom could you trust to poll them?

As to the problem of how to word the questions, though, here’s an idea:  Assign people randomly to groups.  Word the question differently for each one.  Don’t report “the answer.”  Instead report the differences in the answers.  Let readers make of them what they will.
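To make the mechanics concrete, here is a minimal sketch in Python of how that might work.  The wordings, the function names, and the report format are illustrative assumptions, not anyone’s actual polling code.

```python
import random
from collections import defaultdict

# Hypothetical wordings of the same underlying question; a real poll
# would substitute its own variants.
WORDINGS = [
    "Do you support restrictions on abortion on demand?",
    "Do you support restrictions on reproductive liberty?",
    "Do you support restrictions on a woman's right to choose?",
]

def assign_wordings(respondent_ids, wordings, seed=0):
    """Randomly assign each respondent to exactly one wording of the question."""
    rng = random.Random(seed)
    return {rid: rng.choice(wordings) for rid in respondent_ids}

def report_differences(responses):
    """Report the 'yes' rate for each wording and the spread between wordings,
    instead of a single headline number.

    responses: iterable of (wording, answered_yes) pairs.
    """
    counts = defaultdict(lambda: [0, 0])  # wording -> [yes count, total count]
    for wording, answered_yes in responses:
        counts[wording][1] += 1
        if answered_yes:
            counts[wording][0] += 1
    rates = {w: yes / total for w, (yes, total) in counts.items() if total}
    for wording, rate in sorted(rates.items(), key=lambda item: -item[1]):
        print(f"{rate:6.1%}  {wording}")
    if len(rates) > 1:
        spread = max(rates.values()) - min(rates.values())
        print(f"Spread between wordings: {spread:.1%}")
```

The point of the design is in the last few lines: the output is not one number but the gap between wordings, which is exactly the information a single headline figure hides.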