"First, do no harm" is a pretty good rule for everyone in all circumstances, but it is especially appropriate for those allegedly committed to the old standards of journalism. Certainly no Pulitzer judge would honor a manufactured story, especially one that misled news consumers about the real facts of any situation.
Almost all "national" polls about the presidential race, however, are at best entertainment; some are downright disingenuous, and a few are intentionally misleading.
"[H]ere's my bottom line," The Weekly Standard's Jay Cost wrote last week in the conclusion of his analysis of the most recent Washington Post/ABC News poll, which found President Obama to have a significant lead over Mitt Romney among all adults, and a huge lead on key measures such as likability.
"ABC News/WaPo has again offered up a pro-Democratic sample that helps Team Obama spin the day's news," Cost concluded.
Cost reached this "bottom line" because of the "hugely Democratic tilt" of the poll's sample, which included 11 percentage points more Democrats than Republicans among its respondents, "an unjustifiable number," Cost concluded (his emphasis, not mine).
These are fighting words among the statistically inclined like Cost and the alchemists at the Post/ABC poll. Along with the New York Times' Nate Silver and RealClearPolitics' Sean Trende, Cost is the best of the new generation of Michael Barones, and his searing takedown of the Post/ABC poll should trigger an internal intervention by anyone concerned that the pollsters are going to hurt the brands peddling the results.
Many observers questioned the motivation of the pollsters, but almost all fair observers also credit the Washington Post's Dan Balz as among the country's best political reporters, and he defended the poll in a fairly long interview on my radio program that spread over two days (the transcript of which is available at my website). Judge for yourself whether Cost or Balz has the better argument, but the real question is, why is a news organization spending any money at all on such an obviously useless data set, no matter how fair or unbalanced the sample?
All serious observers of the looming Obama-Romney contest admit it will come down to the results in 10 to 15 states.
Those states depart significantly from the national electorate in makeup and turnout history, and key data such as the unemployment rate and gas prices vary significantly among them.
Polls of likely voters in Florida, Virginia, North Carolina, New Hampshire, Pennsylvania, Ohio, Wisconsin, Iowa, Missouri, Colorado, New Mexico and Nevada, and perhaps one or two more, would be worth reading right now.
Modeling turnout in these states is hard, defenders of the status quo assert. But if the battleground states cannot be polled in any way that produces valuable results, why conduct even less predictive polls of national electorates that matter not at all to November's outcome?
Either meaningful polling can be done, or it can't. If it can be done, it should be done in a way that tells people which way the election really is heading.
In fact, such polling can be done, and it is routinely conducted by the campaigns themselves.
"News organizations," however, choose to spend their resources on absurdly tilted samples and nearly meaningless questions.
Why? Cost suggests the obvious answer, and given the absence of any equally plausible alternative explanation, old-school journalists would have to conclude that Cost is correct.
Examiner Columnist Hugh Hewitt is a law professor at Chapman University Law School and a nationally syndicated radio talk show host who blogs daily at HughHewitt.com.