As it does every year, the SEO recently transmitted a report on its activities for 2017 to the Minister for Development Cooperation and to Parliament. Glo.be went to meet Cécilia De Decker, Special Evaluator since 2016.
Cécilia De Decker, Head of the Special Evaluation Office (SEO), is mandated to evaluate Belgian Development Cooperation. Although her external evaluation office is placed, for administrative matters, under the authority of the Chairman of the Executive Committee of the FPS Foreign Affairs, Foreign Trade and Development Cooperation, her position guarantees independence in the choice, implementation and distribution of her evaluations.
What are the aims of the Special Evaluation Office?
Our main responsibility is to evaluate strategies, programmes or themes related to funding for development cooperation. Once findings have been made, we need to make recommendations. These recommendations are of course drawn up independently, but also in collaboration with the relevant departments of the Directorate General for Development (DGD) and with other development cooperation actors, such as NGOs or the Belgian development agency Enabel (formerly BTC). Our recommendations need to be realistic to implement, taking into account each actor's room for manoeuvre.
A new mission has been entrusted to us since 2014. By Royal Decree, we are now responsible for certifying the evaluation capacities of organisations that are funded by Belgian Development Cooperation. This means that Belgian NGOs must internally evaluate their own interventions, in a consistent manner.
Do the evaluation procedures, whether carried out by the SEO and Belgian development cooperation actors or by your colleagues abroad and at supranational level, therefore follow the same rationale, with comparable criteria?
It is indeed at the level of the OECD Development Assistance Committee (DAC) that we discuss and exchange our experiences in order to establish the best possible approaches. As such, there is a range of criteria that everyone relies on. Currently, all evaluations must meet five criteria:
- Relevance: does the intervention, strategy or policy meet essential needs? Will the beneficiaries have a better quality of life as a result? Is it in accordance with the priorities of the partner State on the one hand, and with those of Belgian development cooperation on the other?
- Effectiveness: did the intervention achieve the expected results?
- Efficiency: did it achieve those results as well as could be expected with regard to the means and resources used?
- Sustainability: does it achieve lasting results, which continue after the end of the intervention?
- Impact: does it have a long-term effect for the beneficiaries, and an impact on the community or society as a whole? Can these effects clearly be attributed to the intervention?
You mention that these are the criteria used today. Should these evaluation criteria evolve or be changed in the future?
The interpretation of the current criteria often varies from person to person. Take the relevance criterion, for example: it implies that a programme must meet certain needs. But which needs? Those identified at government level, or in national policies, or should we instead be looking at local needs? There is consequently still some reflection to be done on the current "instruction manual".
This reflection is happening within the OECD's DAC, and is also taking into account broader perspectives. The current criteria may also need to be adapted or adjusted in line with the United Nations Sustainable Development Goals (SDGs). In these SDGs, which set goals to be achieved over a period of fifteen years, there is also an emphasis on fairness, including ideas relating to human rights in particular. It would clearly be appropriate to incorporate these aspects into our evaluations.
In terms of resources and aims, is the SEO in Belgium comparable to what is done in other countries?
I have a team of six people, including an assistant. In addition to this relatively small team, I have a budget of just under €1,200,000 to recruit external experts for evaluations. I am not complaining about our situation, but by comparison the team in the Netherlands numbers 28 people, and Germany has a service of about 50, although it does not use external experts as we do here and in the Netherlands.
We should also acknowledge that the Anglo-Saxon and Nordic countries have more substantial resources. This is reflected in the discussions within the OECD's DAC, in which their members are the most active.
Returning to the Belgian development actors: does this new mission of certifying their internal evaluation procedures therefore fit within the same rationale of standardising approaches at all levels?
Internal evaluations within organisations have always existed. But in the past, they were seen more as a way for organisations to be accountable, to justify themselves in relation to the money spent. The work carried out in recent years has resulted in things being better formalised, with the aim of improving the quality of internal evaluations. Today, there is more emphasis on the quality of the data on which the evaluations are based. This helps evaluations to be more credible and therefore more usable. The important thing is obviously that they are part of a learning process, and not simply a justification with regard to the public authorities. The aim is to be able to change course if there is still room for improvement.
On the subject of lessons learned, have you encountered any catastrophic situations in terms of results since the start of your mandate?
In recent years, the evaluations have been mixed in terms of impact. One project in particular lacked convincing results: a project supporting technical and vocational education in Congo. We found that the equipment had indeed been delivered, but it was often unused, or at best underused. There were various explanations: teachers had not been trained (and were therefore unable to use it), students and schools did not have the means to buy consumables (or the consumables were not available on the local market), the electricity supply was unreliable, and so on.
At the other end of the scale, have many projects been veritable success stories in your opinion?
Some evaluations make it possible to highlight very positive effects on populations. An example that springs to mind is the water supply projects in villages in Peru. The aim of ensuring a better water supply resulted in real added value for the local populations. Besides the time saved for the villagers, the impacts on health and hygiene were also observable thanks to this access to drinking water.
But rather than focusing on the results of specific projects, our work at the SEO consists of more comprehensive evaluations, with a cross-cutting approach involving multiple organisations on the same theme. For the 2017 report, the theme of support for the private sector was chosen, with a selection of projects deemed highly relevant at their respective levels (supporting cooperatives, producer groups, local companies or local financial institutions). Results have been observed in terms of increased productivity and incomes, essentially thanks to the introduction of new practices, the acquisition of new skills, access to new markets and the resulting increase in production. However, the results are quite mixed depending on the organisational capacities of the cooperatives that receive support. For example, in Peru, the results were very positive for a coffee and cocoa cooperative with a strong leader and the trust of its members. On the other hand, the results were very poor in another cooperative whose organisational problems had undermined the trust of members and partners.
How do you choose the themes that will be evaluated in more detail from one year to the next?
This is done through multiple consultations with the Minister's Strategic Unit and the DGD's Strategic Committee. The NGO umbrella federations are involved in the reflection, as are our diplomatic posts and Enabel. Certain timing constraints also need to be taken into account, as the programmes must be sufficiently advanced to be the subject of an evaluation. What matters is seeing the progress made between a decision taken and the effects achieved. We try to carry out the evaluations at the end of the cycle; they last about nine months, including preparation and finalisation. Although, at the end of the day, an evaluation is still just a snapshot of a given period!
How is the follow-up of recommendations organised?
At the end of the evaluations, the reports are sent to the Minister, the administration (DGD) and the organisations in question. These different actors respond to the recommendations by indicating whether or not they accept them, and the actions they plan in order to improve the situation in the future. Until now, there had been no subsequent follow-up of these management responses. This year, the DGD took this concern on board and carried out a first follow-up exercise on the actions undertaken.
This exercise has allowed a fairly positive assessment: some evaluations proved genuinely useful, even if important challenges remain with regard to certain key findings, such as the attention given to the sustainability of interventions, particularly in the least developed countries, where conditions are less conducive to lasting effects once the interventions end.
The 2016 regulations on non-governmental cooperation have evolved favourably, taking into account one of our recommendations which aimed to shift the focus from reporting on activities to reporting more on the effects of interventions on populations.
See the SEO's annual report (rapport annuel du SES, in French).