Lecture 9 Data Collection: Procedures, Instrumentation, Practical Considerations

Gang He

April 1, 2025

Recap lecture 8

  • Guest Speaker: Beatrice Teston
  • Overcome Resistance and Improve Organization Capacity
  • Story: James March and Organization Theory
  • Case: American Red Cross (ARC) Haiti Relief Program

Today’s agenda

  • Data sources
  • Statistical data
  • Survey
  • Interview
  • Evaluation project consultation
  • Case: AI and Big Data for Development Evaluation

Literature review

  • Google Scholar
  • Review articles
  • Expert interviews
  • Examples

Existing data

Collecting program data

  • Experiments
  • Interview
  • Survey
  • Field
  • Case study

Probabilistic sampling

  • Random sampling
  • Stratified random sampling
  • Cluster sampling
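The three designs above differ in how the sampling frame is partitioned before drawing units. A minimal Python sketch, using a small illustrative frame of 12 respondents spread across 3 hypothetical sites:

```python
import random

# Illustrative sampling frame: 12 respondents across 3 sites (made-up data).
frame = [{"id": i, "site": f"site{i % 3}"} for i in range(12)]
random.seed(42)

# Simple random sampling: every unit has an equal chance of selection.
srs = random.sample(frame, k=4)

# Stratified random sampling: draw separately within each stratum (here, site),
# guaranteeing representation of every stratum.
strata = {}
for unit in frame:
    strata.setdefault(unit["site"], []).append(unit)
stratified = [u for units in strata.values() for u in random.sample(units, k=2)]

# Cluster sampling: randomly pick whole sites, then survey every unit in them.
chosen_sites = random.sample(sorted(strata), k=2)
cluster = [u for site in chosen_sites for u in strata[site]]

print(len(srs), len(stratified), len(cluster))  # 4 6 8
```

Stratification is typically used when subgroup estimates matter; clustering when travel or access costs dominate, at the price of a larger design effect.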

Sample size: Power calculations

Power calculations identify the smallest sample (lowest budget) that can still reliably detect the program’s impact.
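One common version is the normal-approximation formula for a two-arm comparison of means, n = 2((z₁₋α/₂ + z₁₋β)·σ/δ)² per arm. A minimal sketch (the function name and defaults are illustrative):

```python
import math
from statistics import NormalDist

def sample_size_per_arm(effect, sd, alpha=0.05, power=0.80):
    """Approximate n per arm for a two-arm comparison of means
    (normal approximation; 'effect' is the minimum detectable difference)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value for a two-sided test
    z_beta = z.inv_cdf(power)            # quantile corresponding to desired power
    n = 2 * ((z_alpha + z_beta) * sd / effect) ** 2
    return math.ceil(n)

# e.g., detect a 0.5-SD effect with 80% power at alpha = 0.05
print(sample_size_per_arm(effect=0.5, sd=1.0))  # 63 per arm
```

Note the budget logic: halving the minimum detectable effect quadruples the required sample.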

Survey Challenges

  • Response rate
  • Quality and consistency
  • Data security
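Response rate feeds directly back into the sample-size budget: if only a fraction of invitees complete the survey, the number of invitations must be inflated accordingly. A simple illustrative calculation (assuming a uniform expected response rate, which is a simplification):

```python
import math

def invitations_needed(target_n, response_rate):
    """Invitations required to reach target_n completed surveys,
    assuming a uniform expected response rate (illustrative simplification)."""
    return math.ceil(target_n / response_rate)

# e.g., 400 completed surveys at an expected 25% response rate
print(invitations_needed(400, 0.25))  # 1600 invitations
```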

Survey strategies

  • Keep it concise/short
  • Start with an easy question (yes or no, to get respondents “hooked”)
  • Avoid leading questions
  • Show progress/time remaining
  • Brainstorm ideas to improve survey quality

Comparing different survey methods

Criteria                   | Mail | Internet | Telephone | In Person
Quality of data            |      |          |           |
Opportunities for analyses |      |          |           |
Resources needed           |      |          |           |

Focus groups

Interview

  • Homework
  • Prepare and test your questions
  • Semi-structured interview
  • Plan, but stay flexible
  • Note taking
  • Seek feedback

Fieldwork

  • To describe what happens at the level being examined (local office, local program, local agency, local community, state office, state agency, and so on) by collecting information about procedures and data on activities, services, institutional features, outcomes.
  • To explain why the situations are as they are.

Strategies

  • Provide incentives
  • Optimize survey length
  • Follow up
  • Multiple channels
  • Targeted audience
  • Avoid repetition

Tools of the trade

  • Google/Microsoft Forms
  • Survey Monkey
  • Qualtrics
  • Amazon

Train staff

  • Goals, process, and responses
  • Assessment
  • Mock interviews

Test instruments

  • Clear out vague and biased wording
  • Logical order
  • Completeness of intended data

Run pilot

  • Gather feedback and comments
  • Revise
  • Refine

Then you are ready for full implementation.

Survey and interview simulation

NYCHA Energy Burden Project Example

References

Newcomer, Kathryn E., Harry P. Hatry, and Joseph S. Wholey. 2015. Handbook of Practical Program Evaluation. 4th edition. San Francisco: Jossey-Bass.