
  ALERT

ASSEMBLYMAN GARRICK'S FAIRGROUND SURVEY DISCREDITED BY EXPERT

posted March 6, 2011

full text of Dr. Hofstetter's report | Dr. Hofstetter's credentials



Assemblyman Martin Garrick released "survey results" from readers of his newsletter which purport to show little support for Del Mar's proposed acquisition of the fairgrounds, with governance by a board composed of representatives from around the County.

An analysis of Garrick's poll by a renowned expert in survey research, Dr. C. Richard Hofstetter, concluded that "professional probability sampling procedures" were not followed. He states that Garrick used a "convenience sample," in which "persons who are intensely involved are more likely to respond." In fact, the Fair Board did exactly that in this case, sending an eblast to 170,000 vendors, employees, exhibitors, and event attendees encouraging them to fill out the survey opposing the sale. He says the results rarely provide accurate views of any population except the minority who choose to respond.

Dr. Hofstetter says such polls do not meet the Code of Ethics of the American Association for Public Opinion Research. Polls such as Garrick's "are generally understood as supporting political goals rather than representing opinion," he said.

Dr. Hofstetter examined each of the six questions in Garrick's so-called survey and determined that they "censor the range and distribution of response." In an interview, Dr. Hofstetter cautioned "caveat emptor" (let the buyer beware).

Assemblyman Garrick is circulating the flawed results of his survey (called by some "a push poll") among elected officials in Sacramento and San Diego County to garner support for his opposition to the proposed fairgrounds transaction.

 

The full text of Dr. Hofstetter's analysis

Dr. Hofstetter's credentials

Members of Congress frequently, and state legislators less frequently, circulate survey “instruments” to their constituents worded in such a way as to create apparent support from groups of their constituents.  While I cannot read Assemblyman Garrick’s mind, the survey that he reported in his February 2, 2011, newsletter appears to have been designed to have that impact, based on the information contained in the report that was included in his newsletter.

Scientific surveys are based on two prerequisites:  1) Sampling a population in such a way that one can claim that the sample responses can be generalized accurately, within a small margin of error, to the larger population.  In this case, the implied population is Assemblyman Garrick’s constituents: persons of Republican, Democratic, independent, or other persuasions, as well as persons with no partisan disposition, who are American citizens and eligible to vote in his district.  2) The responses capture the range and intensity of opinion, as well as those who have no opinion and/or no information about the issues involved.  Pollsters often measure both the existence of opinions about issues and the depth of knowledge about the issue, and report that information.  To accomplish these ends, analysts will scrutinize cross tabulations of information and intensity by the distribution of responses.  Both of these components are necessary in order to claim the generalizability and validity of responses to any poll.
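[Editorial illustration, not part of Dr. Hofstetter's report: the first prerequisite above is what lets a genuine probability sample claim a "small margin of error," a figure that can be computed from the sample size alone. The sketch below uses the standard formula for a sample proportion at 95% confidence; the sample size of 1,000 is hypothetical.]

# Illustrative sketch: 95% margin of error for a proportion
# estimated from a simple random sample.
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of the 95% confidence interval for a sample proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# A simple random sample of 1,000 voters supports roughly a +/-3 point claim;
# a self-selected newsletter mail-back of unknown size supports no such claim.
print(round(100 * margin_of_error(1000), 1))   # about 3.1 percentage points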

Sampling problems in polls of the kind that Assemblyman Garrick has circulated involve what is called a “convenience sample,” since the set of persons to whom the survey was sent is not well defined.  It is not clear what the population of “constituents” was.  Nor does it appear that professional probability sampling procedures were followed.  Exactly to whom was his survey sent, and how were they selected?  How was it sent?  Was a probability sampling procedure used?  I doubt it, since legitimate survey sampling is an expensive exercise (I hope as a taxpayer I did not get stuck with the bill).  What proportion of persons who received the survey returned responses?  We do know that persons who are intensely involved are more likely to respond to such polls, so that they rarely provide accurate views of any population except the minority who choose to return responses.  Answers to these questions for respectable scientific survey reports are ethically required by the American Association for Public Opinion Research (AAPOR) Code of Ethics.  Professional pollsters do not take polls of the kind that Assemblyman Garrick reported seriously in light of AAPOR requirements, so that such “polls” are generally understood as supporting political goals rather than representing opinion.
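[Editorial illustration, not part of Dr. Hofstetter's analysis: the sketch below, using entirely hypothetical response rates, shows how a self-selected convenience sample can badly misstate opinion when intensely involved people respond far more often than everyone else.]

# Illustrative sketch of self-selection bias in a convenience sample.
# All numbers here are assumptions chosen for illustration only.
import random

random.seed(1)
population = 100_000
true_support = 0.60            # assumed true share of the population supporting the sale

respond_if_support = 0.02      # mild supporters rarely bother to reply
respond_if_oppose = 0.15       # intense opponents reply far more often

yes = no = 0
for _ in range(population):
    supports = random.random() < true_support
    responds = random.random() < (respond_if_support if supports else respond_if_oppose)
    if responds:
        yes += supports
        no += not supports

print(f"true support in the population: {true_support:.0%}")
print(f"support among self-selected respondents: {yes / (yes + no):.0%}")
# The returned "poll" reports roughly 17% support even though 60% of the
# population supports the sale, because only the most motivated replied.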

The issue of survey design also raises questions about what the data represent.  Assemblyman Garrick’s poll consists of six issue preference questions.  Closed-ended questions pose a question and present the response alternatives along with it.  Using closed-ended questions is a fairly standard approach to survey measurement, but it has severe limits and can present highly biased information if that is the only approach to measuring opinion about the issue.  The distribution of opinion can easily be biased if a broad range of responses is not included in questions.  I find no fault with “Do you support the idea of the state selling the Del Mar fairgrounds” (YES/NO), for instance, in question 2 (although I would have deleted “the idea of” to focus the question a bit more on the sale), but question 1, concerning what the Del Mar fairgrounds “mean to you,” probably greatly censors the range and distribution of response, and professional pollsters would have asked that as an open-ended (i.e., free response) question prior to presenting alternatives.  Questions 4-6 are subject to the same criticism.  Second, on any such policy issue, additional questions that measure the salience of, and information about, the specific issues are vital to understanding what any population really “thinks” about an issue.  How much information does any public have about the sale issue?  Americans tend to over-report opinions on issues, since we tend to think we should have opinions on issues whether we do or not.  The sale issue is fairly esoteric, and I think most have very little information on the issues, so that they are extremely susceptible to selecting response alternatives that impress them for some reason that has little or nothing to do with the purpose of the survey.

 

Dr. C. Richard Hofstetter's credentials:

He began conducting scientific sample surveys in 1967 and has continued to do so in his research to the present day, and has published nearly 250 papers in refereed scientific journals, in addition to others that were not as rigorously refereed.  Some have been published in communication, public health, medicine, economics, and sociology journals, as well as political science journals, in Europe, Asia, and the United States.  He has taught polling and statistics to undergraduate and graduate students at Ohio State University, the University of Houston, and San Diego State University since 1967, in addition to teaching basic statistical proficiency to upwardly mobile Foreign Service officers in Washington and to San Diego City Schools research personnel during this time.  He also worked to develop survey research units at each university.  Since 2000, he has been the principal investigator on projects funded for nearly $4.5 million, and has worked in support of many other funded projects as co-investigator, methodologist, analyst, etc.  He is a professor of political science and adjunct professor in the graduate school of public health at San Diego State University.

 

 
 
