Going beyond reporting requirements to gain insights
I was recently paying a credit card bill when I saw the report on my spending categories – some questionable splurges had happened – and I decided to keep a close eye on my spending from then on. My phone, too, guilt-trips me daily with screen-time data, which eventually pushed me to cut back. So if data can help us improve our own behavior, why aren't we using data and metrics to assess our effectiveness when it comes to impact incubation? During one of our training sessions in the ‘Incubating Incubators’ program, when I brought up the topic of assessing the performance of incubators themselves, few participants had a clear plan for collecting and measuring data on their own performance, indicating a lack of clarity on why this is crucial.
Interestingly, in this month’s Wednesday Wisdom Session with Abby Davidson of Aspen Network of Development Entrepreneurs (ANDE) and Nicholas Colloff of Argidius Foundation, one key learning was that a lot could be learnt about incubator performance by intentionally designing and monitoring metrics with a clear learning objective in mind.
I decided to dig further into the topic, and in this interview, Harry Devonshire, Evaluation & Learning Manager at the Argidius Foundation, offers more insights on how well-designed metrics can help improve impact. Here is an excerpt from the interview:
Q: Metrics are often treated as an afterthought, added to satisfy funder reporting requirements, or both. Why should ESOs/incubators consider ‘conscious design’ of metrics? What are the benefits? Any examples of where this has really helped?
Data-informed learning is a powerful way to improve impact. For example, a number of years ago, Technoserve decided to prioritize cost-effectiveness across their programs using a metric called Return on Total Investment (ROTI). The average ROTI, an indicator of how cost-effectively impact is achieved, grew from 3.4 in 2017 to 5.2 in 2021, despite the disruption wrought by COVID. This means that for every $1 spent on program costs, supported enterprises increase their revenues by roughly $5. An enterprise development program that costs $5,000 per enterprise helps that enterprise grow its revenues by $25,000. Seventy-five percent of the latest generation of programs are achieving ROTIs higher than 6, with some achieving ROTIs over 30.
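To make the arithmetic behind the ROTI figures concrete, here is a minimal sketch. The formula is inferred from the description above (incremental enterprise revenue divided by total program cost); Technoserve's actual methodology may differ, and the function name is ours.

```python
def roti(incremental_revenue: float, program_cost: float) -> float:
    """Return on Total Investment: revenue growth of supported
    enterprises generated per $1 of total program cost (as described
    in the interview; the exact methodology is not specified there)."""
    return incremental_revenue / program_cost

# The article's worked example: a program costing $5,000 per enterprise
# that helps the enterprise grow revenues by $25,000.
print(roti(25_000, 5_000))  # 5.0
```

Read the other way around, a program reporting an average ROTI of 5.2 is claiming that supported enterprises gained about $5.20 in revenue for every dollar of program cost.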
To not just generate but strategically use this metric, Technoserve collected the underlying impact and cost data across their programs, regardless of whether funders required it.
In this case, metrics were approached from the angle of R&D rather than reporting.
Q: What do funders look for in addition to impact metrics? What are they learning about an ESO based on the maturity of their metrics design?
It depends. If they are accountable to the public, such as a sovereign donor or national government, then accountability is often the priority. A certain level of capacity is necessary to meet their requirements.
Private funders can be more flexible. One community of funders is on a journey from "proving impact" to "improving impact." At Argidius, we are interested in impact and learning in equal parts. If an activity is not going to teach us something that will help us and the sector become more effective, impactful, and sustainable, then we don’t do it.
We are particularly impressed by ESOs that are driving their own learning agendas. For example, we first met Balloon Ventures when they were seeking research funding for a randomized controlled trial to test if and how the Lean Start-Up Methodology could cost-effectively grow small businesses in developing countries. A commendable commitment, especially given that the results will soon be published whether they show success or failure.
Q: What are 3 principles that incubators can use in designing metrics for monitoring and evaluation?
The three principal questions to ask oneself are:
1. What does success look like?
2. How would you know?