Survey implementation

Interested in more webinars and material on M&E? Sign up for our newsletter!

About the webinar

The Monitoring and Evaluation webinar series “Survey Design and Implementation” consists of three live sessions addressed to M&E professionals working in the social sector. Together, the webinars form a course that will give you a comprehensive understanding of all the steps involved in survey design, such as developing questionnaires and addressing ethical considerations, and in survey implementation, such as designing data collection tools and methods, and monitoring and analyzing results. The third session brings in real-life examples from organizations that have been developing surveys using ActivityInfo.

The series is addressed to entry- and intermediate-level professionals. We highly recommend that you join the webinars, or watch the recordings, in their consecutive order, so that you benefit from the complete course.

About this session

In this second session we introduce you to the basic considerations of survey implementation, covering both the preparation and the implementation phase. We also look into sector-specific surveys and cross-cutting surveys.

In summary, we explore:

Preparing to implement M&E surveys:

  • Resourcing and logistics
  • Training enumerators and data collectors

Implementing M&E surveys:

  • Data collection methods and tools design
  • Ensuring data quality and reliability
  • Data quality enforcement (examples with ActivityInfo)
  • Validation (examples with ActivityInfo)

Additional topics:

  • Ethical considerations in survey implementation
  • Monitoring results
  • Analysis of results (examples with ActivityInfo)

View the presentation slides of the webinar.

Is this webinar series for me?

  • Are you working on projects or programs in which you are called upon to develop, monitor, and analyze surveys?
  • Are you looking for guidance on good practices for survey design and implementation?
  • Do you wish to ask questions about these topics?

Then, watch our webinar!

List of sessions

Survey Design and Implementation series:

  1. Survey design for quantitative data collection
  2. Survey implementation (May 23rd)
  3. Real life examples for survey design and implementation (June 27th)

Questions and answers

How do you clean data before carrying out quantitative analysis?

While some of the discrepancies that you will find in your database are legitimate, reflecting real variation in the context, others will likely reflect a measurement or entry error. These can range from mistakes due to human error, to poorly designed recording systems, to incomplete control over the format and type of data imported from external sources. Data cleaning is therefore essential for accurate quantitative analysis, and it involves several key steps. First, understand the data context and the variables it contains. Inspect the data to identify missing values and outliers and to verify that data types are correct. Clean the data by removing or imputing missing values, addressing outliers, converting data types, standardizing units and formats, removing duplicates, and correcting inaccuracies. Transform the data by normalizing numerical values and aggregating where necessary. Verify the cleaned data against the original sources and use statistical methods for validation. Finally, document all cleaning steps and update the metadata. Tools like R and SQL, along with automation through scripts, can facilitate this process; the sketch below illustrates the idea.
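As an illustration, here is a minimal Python sketch of these cleaning steps using pandas. The file name and the column names (respondent_id, district, household_size) are hypothetical placeholders, not references to any particular dataset discussed in the webinar:

```python
import pandas as pd

# Hypothetical survey export; file and column names are placeholders.
df = pd.read_csv("survey_export.csv")

# 1. Inspect: data types and missing values per column.
print(df.dtypes)
print(df.isna().sum())

# 2. Standardize formats: trim whitespace and normalize case.
df["district"] = df["district"].str.strip().str.title()

# 3. Convert types: coerce to numeric, turning entry errors into NaN.
df["household_size"] = pd.to_numeric(df["household_size"], errors="coerce")

# 4. Handle missing values: drop rows without an identifier,
#    impute household size with the median.
df = df.dropna(subset=["respondent_id"])
df["household_size"] = df["household_size"].fillna(df["household_size"].median())

# 5. Flag outliers for manual review using the interquartile range rule.
q1, q3 = df["household_size"].quantile([0.25, 0.75])
iqr = q3 - q1
flagged = df[(df["household_size"] < q1 - 1.5 * iqr) |
             (df["household_size"] > q3 + 1.5 * iqr)]
print(f"{len(flagged)} rows flagged for manual review")

# 6. Remove duplicates and save the cleaned dataset.
df = df.drop_duplicates(subset=["respondent_id"])
df.to_csv("survey_clean.csv", index=False)
```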

Can you go back to the example about the sample size calculator?

You can use this sample size calculator for free.
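For context, many free sample size calculators implement a variant of Cochran's formula with a finite population correction. The sketch below shows that computation in Python; the defaults (a 95% confidence level, a 5% margin of error, and p = 0.5) are the conservative conventions such calculators typically use, and are our assumptions rather than values from the webinar:

```python
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Cochran's formula with finite population correction.

    z=1.96 corresponds to 95% confidence; p=0.5 yields the most
    conservative (largest) required sample size.
    """
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    # Adjust for a finite population.
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# For a population of 5,000 this gives 357 respondents.
print(sample_size(5000))
```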

How do we use quasi-experimental methods during evaluation?

Remember that quasi-experimental designs are used when it is not logistically feasible or ethical to conduct randomized controlled trials. As the name suggests, a quasi-experimental design is almost a true experiment; however, researchers do not randomly assign elements or participants the way methods requiring randomization do. To use a quasi-experimental method, you may consider designs without control groups, designs that use control groups but no pretest, designs that use both control groups and pretests, and interrupted time-series designs. The sketch below illustrates how one such design is commonly analyzed.
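For example, a design with a control group and a pretest is often analyzed with a difference-in-differences estimator: the change in the treated group minus the change in the comparison group. Here is a minimal Python sketch using statsmodels; the data are an illustrative toy example, not taken from the webinar:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Toy panel: an outcome measured before and after an intervention,
# for a treated group and a non-randomized comparison group.
df = pd.DataFrame({
    "outcome": [10, 11, 9, 10, 14, 15, 10, 11],
    "treated": [1, 1, 0, 0, 1, 1, 0, 0],   # 1 = received the program
    "post":    [0, 0, 0, 0, 1, 1, 1, 1],   # 1 = after the intervention
})

# The coefficient on the interaction treated:post is the
# difference-in-differences estimate of the program effect.
model = smf.ols("outcome ~ treated * post", data=df).fit()
print(model.params["treated:post"])  # 3.0 for this toy data
```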

What is your advice as I plan to change my career to become an M&E officer?

While we are not a career advisory organization, we would give you the same tips we give to anyone starting out in M&E: follow courses on the basics of M&E, and make use of the many resources ActivityInfo provides for beginners, including a one-month free trial of the software. Explore these resources, keep practicing, and keep following our free M&E webinars.

How do we ensure validity in the survey that we have developed or adopted from other sources?

This is a very good question, because we often find ourselves conflating “adopting” with “adapting”, and they mean two different things. When adopting, you copy the survey instrument verbatim, in both face and content; when adapting, you modify it to reflect a different context. In general, adopting (using the instrument verbatim) is preferable to adapting for a few reasons. First, when the instrument is adopted, the reliability and validity studies that have been conducted on that instrument can be applied to your study, so you do not have to collect validity evidence yourself. When an instrument has been adapted, it has been significantly changed, so the existing reliability and validity evidence will not apply to your study. Second, adopting an instrument links your evaluation to all other evaluations that have used the same instrument. Finally, adopting the instrument saves you the time and energy of making significant changes.

About the Speaker

Victoria Manya has a diverse background and extensive expertise in data-driven impact, project evaluation, and organizational learning. She holds a Master's degree in local development strategies from Erasmus University in the Netherlands and is currently pursuing a Ph.D. at the African Studies Center at Leiden University. With over ten years of experience, Victoria has collaborated with NGOs, law firms, SaaS companies, tech-enabled startups, higher education institutions, and governments across three continents, specializing in research, policy, strategy, knowledge valorization, evaluation, customer education, and learning for development. In her previous roles as a knowledge valorization manager at the INCLUDE platform and as an Organizational Learning Advisor at Sthrive B.V., she delivered high-quality M&E reports and trainings, ensured practical knowledge management, and moderated learning platforms. Today, as a Customer Education Specialist at ActivityInfo, Victoria leverages her experience and her understanding of how organizations can put their data to work to assist customers in successfully deploying ActivityInfo.

Sign up for our newsletter

Sign up for our newsletter and get notified about new resources on M&E and other interesting articles and ActivityInfo news.
