In recent years, organizations and employees have come to be measured not on their stories, but on their numbers. When you introduce your organization, people will always ask about your numbers: What's your reach? What's your impact? Meanwhile, funders and senior management are chasing the promise of big data: year-over-year growth in reach, impact measurements, geographic comparisons, and more.

Quantifying the qualitative has become a norm rather than a choice.

To measure these numbers, organizations are increasingly implementing monitoring and evaluation frameworks. These frameworks collect large amounts of data and produce insights that help the organization's employees learn more about their programs. The irony, however, is that the very people this data is supposed to help treat these frameworks as a burden, which eventually renders the data useless.


Why are monitoring and evaluation frameworks failing?

Today’s M&E frameworks are drawn from the online world. This is a problem because they are being used to measure and evaluate the offline world.

Monitoring how websites work is different from monitoring how humans function.

Here are four things that today’s monitoring and evaluation frameworks are doing wrong:

1. Tracking unnecessary information

M&E frameworks often end up capturing a lot of information that is not necessary for a program. For example, if we are conducting an education program that works to improve students’ learning outcomes by providing books, we don’t have to collect information about students’ mid-day meals.

2. Capturing data at wrong intervals

Data can be captured at weekly, monthly, or even quarterly intervals. It's essential to choose the right interval for your specific project. For example, school infrastructure rarely changes within a year, so it shouldn't be captured quarterly. On the other hand, student learning outcomes can change weekly or monthly, so it makes no sense to capture that data point only once a year.

3. One-way data flows

Often, the people who collect data don’t get to see the results of their data or how their data contributed to larger program changes. That makes the task of data collection seem uneventful and useless. If people don’t believe that their data has any value, they will not have an incentive to capture good data.

4. Unclear balance between metrics and programs

In my interviews with field staff, I have found that field officers often don’t know whether meeting their M&E metrics or doing quality work is more important. Obviously, both are important. However, field officers usually tend to focus on one and ignore the other.

How can we fix monitoring and evaluation frameworks?

Clearly, monitoring and evaluation frameworks have plenty of potential pitfalls. Here are five easy fixes that will help you make sure your M&E framework is actionable and effective.

1. First things first: change the mindset

A fundamental problem with M&E frameworks is that field staff don't know why M&E is important or how it can improve their organization's programs. It's important to show field staff the larger picture: explain exactly how this data will be helpful for them, not just for the funder report.

2. Get the data, give the insights

Make sure that any data that you get from the field goes back to the field as insights. People will start valuing the M&E framework once it starts giving insights. These insights do not need to be complex. They can be as simple as “There are 45 new beneficiaries this month, which is 20% better than the previous month”. Once data is turned into clear insights, field officers will realize its importance.
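
As an illustration, a simple insight like the one above is just a percent-change calculation. The sketch below is hypothetical (the function name and numbers are made up, not part of any real M&E tool), but it shows how little is needed to turn raw counts into a sentence field officers can act on:

```python
# A minimal sketch of turning raw monthly counts into a plain-language
# insight. The function name and numbers are illustrative, not part of
# any real M&E tool.

def monthly_insight(current, previous):
    """Compare this month's beneficiary count with last month's."""
    if previous == 0:
        return f"There are {current} new beneficiaries this month."
    change = (current - previous) / previous * 100
    direction = "better" if change >= 0 else "worse"
    return (f"There are {current} new beneficiaries this month, "
            f"which is {abs(change):.0f}% {direction} than the previous month.")

print(monthly_insight(48, 40))
# -> There are 48 new beneficiaries this month, which is 20% better than the previous month.
```

Even a one-line summary like this, delivered back to the people who collected the data, goes a long way toward making the framework feel worthwhile.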

3. Create a human-centric data collection framework

M&E frameworks are created with data as the focus. That seems logical, since the framework is meant to collect and analyze data. However, humans are ultimately using the framework to input data and gain insights. If the framework isn’t designed around humans — the way they work, how they think, and what they need — it will be useless.

To design a human-centric data collection framework, it is important to do the following:

  • Design data collection around your field staff. It’s important to create workflows that suit your field staff, and the best people to tell you that are the field staff themselves.
  • Optimize data collection with your staff’s field visits and daily workload. If someone goes to the field three times a week, make sure that their work can be done in three days. If your field staff only goes to a particular area once a month, you should not ask them to collect data in that area on a weekly basis.
  • Make the data collection form as small as possible by getting rid of information that can be calculated or repetitive information. For example, if you are collecting the number of students and teachers in a school, don’t ask your staff to also collect the student-teacher ratio. This can be calculated. As another example, if you are measuring students’ learning outcomes on a weekly basis, don’t re-collect each student’s basic information (name, age, grade) each time. Instead, collect each student’s basic information once, then just update each student’s learning outcomes in the future. (Our Collect tool makes this possible through its monitoring feature. Read more here.)
  • Use a platform that makes M&E happen in real time. Field officers often wonder whether the data that they submitted has errors or is even being used. Ensure that your framework tells your data collectors that data has been received, verified, and used. This will help you gamify data collection to keep your field staff engaged.
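
To illustrate the point about calculable fields, here is a minimal, hypothetical sketch (the record structure and field names are assumptions, not any tool's actual schema) of computing the student-teacher ratio on the server instead of asking field staff to collect it:

```python
# Sketch of computing derived fields automatically instead of asking
# field staff to collect them. The record structure is hypothetical.

def add_derived_fields(record):
    """Add the student-teacher ratio computed from the raw counts."""
    enriched = dict(record)
    teachers = record["teachers"]
    enriched["student_teacher_ratio"] = (
        round(record["students"] / teachers, 1) if teachers else None
    )
    return enriched

print(add_derived_fields({"students": 240, "teachers": 8}))
# -> {'students': 240, 'teachers': 8, 'student_teacher_ratio': 30.0}
```

Keeping derived values out of the form both shortens data collection and removes a source of arithmetic errors in the field.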

4. Iterate. Iterate. Iterate.

The most important part of building a good M&E framework is constantly learning from your data and improving your processes.

  • Learn from the collected data. If the data you collect is not changing significantly over time, you might want to reduce the frequency of collecting that data. If particular fields are being ignored over and over, you might want to get rid of those fields.
  • Learn from the field staff. Take regular feedback from field staff to understand what’s working and what’s not working. For example, you might have set that a certain data parameter should be collected early in the month. However, your field staff might have learned that it makes more sense to collect that data point at the end of the month. Listen to them and incorporate regular feedback.
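
To sketch the first point above, a simple check like the one below (illustrative only; a real framework would look at variance and thresholds rather than exact equality) can flag fields whose values never change, so their collection frequency can be reduced:

```python
# Sketch of flagging fields whose values never change across recent
# collection periods. The data below is illustrative.

def is_stale(values):
    """True when a field's recent values are all identical."""
    return len(set(values)) <= 1

classrooms = [12, 12, 12, 12]   # reported every quarter, never changes
test_scores = [41, 44, 48, 52]  # changes every quarter

print(is_stale(classrooms))   # True: candidate for less frequent collection
print(is_stale(test_scores))  # False: keep collecting at this frequency
```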

Pro Tip: Select a tool that lets you change your data collection form while you are collecting data. This means you won’t have to stop the data collection process any time you get feedback.

5. Make the entire process engaging

The final step is to engage everyone in your organization, not just your funders and senior management. Start by figuring out which insights will make sense for the organization's various employees:

  • Create reports that cater to various levels in the organization. The information that is most relevant to a field worker is different from what is most relevant for a program manager. It’s important to understand the outcomes and indicators for various types of people — program heads, field heads, administration, etc.
  • Engaging insights draw employees to your data. Always present comparisons rather than absolute numbers. For example, show the percentage change from last month rather than presenting absolute numbers for each month. Show geospatial comparisons rather than a table that lists data for each geography. The more engaging your data is, the more people will use it.
  • One of the best ways to make data engaging is to visualize it. A dry number-filled report will only result in unopened emails and reports. Use maps, bar graphs, pie charts, and other visualizations to make it easier for users to interpret data.
  • A/B test your reports to see what employees like more. Try different versions of reports or send them at different frequencies to see what leads to better engagement.
  • Use reports during program review sessions. This helps employees get in a habit of referring to reports. It’s important for data to become a part of daily organizational planning, rather than a one-off exercise.

There is no single formula to make the most actionable monitoring and evaluation framework. However, we have seen organizations facing all of these problems and have iterated with them to make actionable, effective M&E frameworks. The key is to ensure that you catch problems as they come up and iterate to solve them.

Let us know what you do to improve your M&E framework in the comments section!

Author

Resident Entrepreneur at Atlan. Believes in the potential of data & technology to solve many human problems. The harder the problem, the better it is.

26 Comments

  1. Pingback: How to Create an Effective Monitoring and Evaluation Framework – Emeka Onu

  2. Tsering Yankey

    One of the functions of M&E is to learn and improve, but I don't understand how we should do it. Can you please give me an example of learning from M&E and improving?

    • Richa Verma

      Hi Tsering,

      Thanks for your comment. You are absolutely right: one of the main functions of M&E is to learn and improve. The best way to do so is to collect relevant data that lets you measure the progress of the important aspects of your program. For example, suppose a non-profit works with children to improve their learning using special books that the non-profit has designed. The best way to learn and improve is to collect data on which books children like the most and why. The non-profit can also collect granular information from teachers about the chapters in these books and understand which topics are received well and which are not. Doing such an exercise year over year and improving the books based on this feedback would be an example of monitoring and improving. 🙂

  3. I work for a very small organization, and we have an Employee Evaluation system, not an M&E system, so I would like to know what the link is between these two systems. Is there a need to implement both?

    • Richa Verma

      Hi Leeto,

      There are a couple of differences between Employee Evaluation and an M&E system:

      (1) An M&E system tracks and evaluates an organization as a whole and not just the performance of an employee.

      (2) The basic aim of an M&E system is to improve the performance of various projects/programs run by organizations. However, the basic aim of an Employee Evaluation system is to track and measure the performance of employees.

      Having said this, an Employee Evaluation system can typically be a sub-part of an M&E system. And, yes, I recommend implementing both systems, though not in isolation.

      • You are quite right. M&E is not personnel management. Actually, most people loathe M&E because managers use it for personnel appraisal, which I think is wrong.

  4. I work for a big INGO as an M&E professional (I recently joined the org) that implements projects entirely through national NGOs, and I am tasked with developing an M&E framework. As I reviewed the org's M&E systems and practices, I found that I have to start things from scratch. Most of the M&E activities are limited to individual projects, and there is no national-level M&E framework in use. What do you advise on how to go about the M&E development process?

    • Starting things from scratch would be the way to go. M&E has only recently been introduced, and there is so much experience out there that is needed to build the literature base.

      I’m speaking from experience as I too recently joined an INGO and found myself in the same shoes.

      • Richa Verma

        I agree with Bahle, Talile. Re-thinking the process while setting up M&E processes is very important. It helps you look at the framework more broadly, since you have a shot at changing things beyond just fixing the current loopholes.

        However, to make sure that you are not causing a lot of dissatisfaction among the staff, I recommend you phase in the changes that you want to bring. It might also be wise to first make the small, evident fixes and then bring about the major changes.

  5. Vijay Kumar

    There is nothing new in M&E here; all of this is already being done, even in government systems. National and international organizations have been doing it for a long time.

  6. After having been associated with a number of grassroots NGOs, here are my realizations:

    1. The M&E part has never been taken seriously by NGOs; rather, it is perceived as an obligatory exercise because funders/donors ask for it.
    2. There is a commitment deficit on the part of management to execute and learn from this course-correction task, M&E.
    3. Integration of programs and M&E is a hugely neglected area, which is why the two often fail to complement each other.
    4. Data quality is a huge issue. The disconnect among M&E designers, surveyors, program staff, and management could be the single biggest reason for this debacle.

    But let's not be gloomy about it, as there are now more spaces and platforms where talk and action on robust M&E are happening.

    • Richa Verma

      Thanks for sharing your learnings, Mosharaf. Hope this article helps to tackle these challenges. 🙂

    • I totally agree. It takes time to bridge the gap between program teams and M&E teams.

  7. Ian C Davies

    Monitoring and evaluation are distinct and fundamentally different practices, although complementary. Conflating them, e.g. calling and treating M&E as one, creates confusion rather than clarity, reduces the value and potential of each, and ultimately does not help with our understanding conceptually and in practice. On the evaluation side I recommend reading Scriven, Schwandt, Greene and Patton for starters. On the monitoring side you may wish to read Deming, Drucker and Mintzberg. Cheers, Ian C Davies, Credentialed Evaluator (CE)

    • Richa Verma

      Ian, thanks for bringing this up. I agree that monitoring and evaluation, in principle, are two different practices and have fairly different individual goals. However, many organizations unfortunately see M&E as one thing and set employees to handle both parts of M&E. The aim of this article was to share practical learnings to help people working in these organizations to come up with their own approach towards M&E. My goal wasn’t to impart technical knowledge for both monitoring and evaluation separately.

  8. Nicole Drakes

    An excellent article. I particularly agree with the need to involve those who are collecting the data in designing the data collection instruments, and to feed back the information collected via M&E. Often, people view M&E as they would a UFO if they ever saw one. They don't understand what it is, and if they do understand, they do not see why they should do it. Selling people on the benefits of M&E to them, in my experience, helps a lot, especially if they work in an organisation that is not employee-focused.

    I agree that the ultimate goal of M&E is to facilitate improvements. I have found, however, that people tend to see the M&E process as a criticism of them as individuals and of their competence. The challenge lies in helping them understand that M&E is not a punishment but a self-improvement tool which, if used appropriately, can benefit everybody. What can I say? Building a sustainable M&E culture is hard but essential.

    • Richa Verma

      Thanks, Nicole! I am in agreement with your point about building and sustaining M&E cultures.

      • Thanks, Nicole Drakes.
        I really appreciate M&E experts who are ready to devote their time and energy to improving the skills of M&E staff.
        As an M&E staff member, I have found the comments here very helpful. My understanding is that M&E tracks projects before they start and after they end: at the start, monitoring serves as baseline monitoring, and at the end, as post-monitoring.
        As for evaluation, my understanding is that the first measurement is a formative evaluation, and the second is a final evaluation that discovers the project's changes and improvements.

  9. Kebba N Sima

    This is indeed quite brilliant and insightful, and worth reading and paying heed to as one develops an M&E framework.
    Thanks!

  10. This is a good read. I have worked as both an M&E specialist and a programmer, and I have realised that in most cases the people who are supposed to be champions of M&E lack knowledge of programming and are less involved in the development of the M&E framework. This results in a poor response to M&E needs from programme staff; at the same time, M&E people feel it is a prerequisite that whatever tool has been developed should be used religiously by programmes, regardless of whether it makes sense to them. Buy-in is the buzzword here. Get programmers involved and they will adhere to the requirements of the framework.

  11. Humayoon Iqbal

    I wish I could get this article as a PDF so that I can study it offline whenever I want.

  12. Acquilus Peter Barasa

    Thanks a lot for this relevant piece of information. Other than data collection, analysis, and report writing, what are the other key aspects of an M&E framework? I concur with the writer on the importance of giving feedback or insights to the field staff on the data collected; this is always overlooked, delinking the data collectors from the final product. Always a pleasure reading your informative and well-researched articles.

    • The most important aspect of an M&E framework is to understand what data should be collected and why, and then to map different users to the pieces of information that are relevant to them.

  13. Sahr Kassama Junior

    Hi Team,
    Thanks so much for the guide you sent me on how to create an effective monitoring and evaluation framework. It was worth receiving and reading.
    Regards,
    Sahr Kassama Junior

  14. Stephen Selmah Wilfred

    I work with a health organization in my country, and I really enjoyed how you outlined everything.
    Your advice on the M&E framework is very interesting. Thank you so much for these notes.
