Monitoring and evaluation (M&E) has changed dramatically during the Covid-19 pandemic. In some instances it has been dropped as budgets have been cut and programmes have been adjusted, yet it continues to play a vital role in helping funders and non-profit organisations (NPOs) understand the new and changing needs of service users. As we rethink our responses, it is important to ask how we can use M&E to design more effective programmes and whether existing M&E frameworks are still relevant.

On 10 September 2020, responsible business consultancy Trialogue hosted a webinar on how M&E has adapted during the crisis, and what lessons have emerged for funders, NPOs and other stakeholders. The panellists were Fatima Adam (Programme Director: Research and Evaluations at Zenex), Lawrence Leseka (Design, Monitoring and Evaluation Coordinator at JAM South Africa) and Zulaikha Brey (Senior Consultant: M&E at Trialogue).

A snap poll conducted at the beginning of the webinar canvassed attendees about the M&E challenges posed by Covid-19. Close to half had experienced programme cuts or delays (45%), while 36% had shifted to digital data collection, and 31% struggled with access for onsite data collection. In addition, 17% had experienced M&E budget cuts (see graphic below).

Adapting M&E to changing programmes

For Zenex, M&E has been crucial in tracking the effect that Covid-19 has had on education outcomes. The organisation has explored new ways of conducting M&E, starting by asking “What can be evaluated under these circumstances? Is evaluation feasible?” It shifted towards rapid assessments and changed its methodologies. Existing surveys did not suffice as they could not be triangulated with other methods due to social distancing. “You cannot have M&E as usual if you have changed your entire project design,” Adam asserted.

Zenex has had to work more rapidly and make do with imperfect data. “We have had to be more discerning about the data we collect, thinking about what is essential and what is nice to have. We have applied the ‘less is more’ principle,” said Adam. The organisation has largely relied on telephones and SMS for data collection, in both urban and rural areas.

Joint Aid Management (JAM) SA, which serves around 120 000 children in more than 3 000 Early Childhood Development (ECD) centres nationwide, changed its approach to food distribution when ECD centres were closed during lockdown. “With the consent of donors, we repurposed our funds so we could give food parcels and nutrient-rich porridge to entire families, so they did not starve,” Leseka said.

JAM introduced an SMS voucher system to control food distribution, prevent the spread of the virus, and protect the dignity of service users. It also educated parents and guardians about starting or maintaining food gardens to ensure food security, and intends to monitor household progress over time. “It was our funders who provided us with options for changing our programming, encouraging us to focus on what could be delivered,” Leseka pointed out, adding that a larger budget was necessary to help feed more people.

Rather than focusing on outcomes, JAM’s M&E set out to ensure that planned activities actually took place. It conducted remote monitoring and relied on ECD practitioners to collect data and submit reports. It also made use of a software programme to keep track of food delivery or collection, scanning virtual vouchers delivered via SMS. “Our reporting cycle changed from monthly to weekly so we could stay up to date with what was happening in each province,” Leseka explained.

Webinar attendees took part in a second poll that asked how organisations have adapted their M&E practices as a result of Covid-19. Almost half (47%) said they were conducting online surveys, with 32% conducting phone/SMS surveys and 31% indicating that they were putting safety measures in place for data collection (see graphic below).

Finding new ways to add value to evaluations 

For Trialogue, an M&E provider, it was necessary to review how it conducted evaluations, ensuring they were relevant and still a priority for clients. “How did the methodologies that we had originally proposed to clients prior to Covid-19 need to change? What value could we add to the evaluation?” were questions the company asked.

While some evaluations were put on hold – in the education sector, teaching and learning time was limited and therefore precious – others went ahead because of the contribution M&E makes to programme improvement. Movement restrictions complicated the costing and timing of evaluations, and outcomes had to be assessed in the light of the current rather than the historical context, as market shocks such as Covid-19 job losses skewed the data in employment programmes. With in-person data collection no longer possible, Trialogue used a range of digital methods: incentivised WhatsApp focus groups, virtual observations of training, virtual walkabouts of infrastructure projects, and online and telephonic surveys.

Brey stressed the need to base findings on sound evidence, with triangulation where possible, and to be mindful of convenience bias when collecting data remotely, as some respondents do not have internet access or may be technophobic. The pandemic has provided an opportunity to collect outcome-level data but has also underlined the vital importance of accurate user data, and the need to place more emphasis on secondary data at this time.

Key M&E learnings for funders and NPOs

According to Adam, you need to be clear about your purpose when aligning project design and evaluation design. “Ask if it is feasible and evaluable. Be flexible but focus on using quality instruments and data as far as possible. There can be a messiness to doing things more rapidly, so be clear about where and how you are going to collect data.”

For Leseka, it is important not to drop M&E: doing so leads to a lack of accountability and does not remove the need to report later. “Listen to what your donors and field staff think can work and engage with what you think is possible or not possible,” he recommended. “Where it was not possible to verify data, we told our funders we could not be sure of reliability but were doing the best we could under the circumstances.”

Brey pointed out that, when it comes to adapting M&E, no one size fits all. “Every organisation and every context is going to be unique, and there are always factors you need to take into consideration, like costs, available resources, and so on,” she said. “Adapt according to your own unique situation, make sure your framework is rigorous, and don’t be afraid to ask for help.”