Lecture 6 Data Collection: Procedures, Instrumentation, Practical Considerations

Gang He

April 20, 2024

Recap of lecture 5

  • Evaluation design
  • RCT
  • Quasi-experimental (RD, DiD)
  • Observational
  • Proposal
  • Progress on evaluation project

Today’s agenda

  • Data sources
  • Statistical data
  • Survey
  • Interview
  • Evaluation project consultation

Literature review

  • Google Scholar
  • Review articles
  • Expert interviews
  • Examples

Existing data

Example: Prof. MacBride’s Queens Curbside Recycling study

Collecting program data

  • Experiments
  • Interview
  • Survey
  • Field
  • Case study

Probabilistic sampling

  • Random sampling
  • Stratified random sampling
  • Cluster sampling

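A minimal sketch in Python (pandas/NumPy) of the three sampling schemes above, using a hypothetical frame of households grouped into districts; the frame, column names, and sample sizes are illustrative assumptions, not part of the lecture materials:

```python
import numpy as np
import pandas as pd

# Hypothetical sampling frame: 10,000 households spread across 20 districts.
frame = pd.DataFrame({
    "household_id": range(10_000),
    "district": [i % 20 for i in range(10_000)],
})

# Simple random sampling: every household has the same selection probability.
srs = frame.sample(n=500, random_state=42)

# Stratified random sampling: draw the same fraction within each district,
# so every stratum is represented proportionally.
stratified = frame.groupby("district", group_keys=False).sample(frac=0.05, random_state=42)

# Cluster sampling: randomly select whole districts, then survey every
# household inside the chosen clusters.
rng = np.random.default_rng(42)
chosen = rng.choice(frame["district"].unique(), size=4, replace=False)
clustered = frame[frame["district"].isin(chosen)]
```
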
Sample size: Power calculations

Power calculations identify the smallest sample, and hence the lowest budget, that can still reliably detect the program’s expected impact.

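A minimal sketch of such a power calculation using statsmodels, under assumed inputs (a 0.3 standard-deviation effect, 5% significance level, 80% power) that would in practice come from the program’s expected impact and the evaluation budget:

```python
from statsmodels.stats.power import TTestIndPower

# Assumed design parameters (illustrative, not from the lecture):
# detect a 0.3 standard-deviation effect with 80% power at alpha = 0.05
# in a two-arm (treatment vs. control) comparison.
analysis = TTestIndPower()
n_per_arm = analysis.solve_power(effect_size=0.3, alpha=0.05, power=0.8,
                                 alternative="two-sided")
print(f"Required sample size per arm: {n_per_arm:.0f}")  # about 175
```

Note that the required sample grows roughly with the inverse square of the detectable effect size, which is why the assumed effect size drives the budget.
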
Survey Challenges

  • Response rate
  • Quality and consistency
  • Data security

Survey strategies

  • Keep it concise and short
  • Start with an easy question (a yes/no item to get respondents “hooked”)
  • Do not use leading questions
  • Show progress/time remaining
  • Brainstorm ideas to improve survey quality

Comparing different survey methods

Compare mail, internet, telephone, and in-person surveys along three criteria:

  • Quality of data
  • Opportunities for analyses
  • Resources needed

Focus groups

Interview

  • Do your homework
  • Prepare and test your questions
  • Semi-structured interview
  • Plan, but be flexible
  • Note taking
  • Seek feedback

Fieldwork

  • To describe what happens at the level being examined (local office, local program, local agency, local community, state office, state agency, and so on) by collecting information about procedures and data on activities, services, institutional features, and outcomes.
  • To explain why situations are as they are.

Strategies

  • Provide incentives
  • Optimize survey length
  • Follow up
  • Multiple channels
  • Targeted audience
  • Avoid repetition

Tools of the trade

  • Google/Microsoft Forms
  • Survey Monkey
  • Qualtrics
  • Amazon

Train staff

  • Goals, process, and responses
  • Assessment
  • Mock interviews

Test instruments

  • Remove vague and biased wording
  • Logical order
  • Completeness of intended data

Run pilot

  • Get feedback and comments
  • Revise
  • Refine

Then the instrument is ready for full implementation.

References

Newcomer, Kathryn E., Harry P. Hatry, and Joseph S. Wholey. 2015. Handbook of Practical Program Evaluation. 4th edition. San Francisco: Jossey-Bass.