Let’s start with a story about learning and failure at one foundation, and then move on to a renewed effort by IssueLab to share knowledge.
The story: Recently, I had an enjoyable conversation with Reeta Roy, who is president and CEO of the Mastercard Foundation. Based in Toronto and dedicated to poverty alleviation in Africa, the Mastercard Foundation was formed in 2006 when MasterCard, the credit card company, went public. The company endowed the foundation with shares of its stock, which have grown like crazy: up by about 1,900 percent in the last decade. Consequently, the Mastercard Foundation now has assets of about $11 billion, making it the fourth-biggest foundation in North America, bigger than Hewlett, Packard, Kellogg, Rockefeller, etc. (Credit cards are a very nice business, particularly when people carry balances at interest rates of 11% to 23%.)
A PR firm retained by the foundation had emailed me, offering the interview. I was interested for several reasons, without having a particular story in mind. I wondered how a foundation that has grown so rapidly can learn fast enough to spend its money wisely. The foundation was about to announce $10.6 million in grants to five for-profit companies that serve clients in rural Africa; that’s newsworthy, too, because foundations typically make grants to nonprofits. Finally, I noted that the foundation had initially focused on education (it is the biggest foundation funder of education in Africa) and on microfinance, which made sense, given its roots in the credit industry. But doubt has arisen about the benefits of microcredit, and I was curious to know what the foundation had learned about microfinance, and whether it had pivoted in a different direction.
Roy and I spoke for about 25 minutes. She told me that the foundation has, among other things, broadened from microfinance to what is now called financial inclusion, a concept that includes financial literacy, mobile money, insurance for smallholder farmers and the like. “All of it is really focused on the needs of the client,” Roy said. “It’s not a magic bullet, but it’s a massive enabler in getting people out of poverty.” The foundation partners with others, including the Gates Foundation and the Equity Group Foundation, which have programs on financial inclusion, so they can learn together. And, she told me, the foundation isn’t afraid to fail. “We’re trying to create an environment at the foundation where people are not afraid to say that something did not work,” she said. So far, so good.
Then things got interesting. In an email to the PR firm, I suggested this approach to a story or blogpost:
I think the real story here is the challenge of ramping up quickly, and learning as you go. Maybe one way to do this would be to home in on a program that worked, and another that didn’t. I saw the report on your website about the Stryde project that seems to have been a big success. Maybe you can point me to an example of a program where you did research and, as a result, changed its direction or focus, or dropped it entirely.
The PR representative responded: “This is a great idea!” And later: “We think this is a good direction to go in with the piece.” But his next email said people at the foundation “didn’t decide they wanted to go this route with the piece.” When I wondered why the foundation was reluctant to open up about failure, he replied: “I’m not sure if it’s reluctance on The Foundation’s part as much as it is a lack of time since these programs have taken effect to really have some concrete measurements, positive or negative.” Which would make sense if I were asking for a randomized controlled trial or a peer-reviewed study, but I wasn’t: All I was seeking was a report about learning from failure, or an internal document that said, in effect, this grant has run into headwinds, so let’s tack in another direction. And the foundation has no hesitancy to share success stories on its website.
Pardon me for generalizing, but this reluctance to talk about failure is too often part of the culture of foundations and nonprofits. Partly that’s because foundations and nonprofits don’t do enough evaluations; they are hard to do, and they cost money, and donors are reluctant to divert money away from programs into research. I get that. But foundations have nothing to lose by sharing their insights, which brings me to IssueLab.
Formed in 2005 as a searchable website to collect and share knowledge from the social sector, IssueLab was absorbed by the Foundation Center in 2012 and is being revamped this week. The idea behind IssueLab is simple, as Gabriela Fitz, who started it, told me by phone:
The products that result from grants are a public good, and they should be shared as such. This knowledge belongs to the field.
Alas, it’s not easy to persuade foundations to publish what they know, despite efforts by IssueLab to eliminate as many obstacles as possible to sharing.
“Foundations have, historically, wanted to control that knowledge,” Fitz said.
This is troubling.
Why don’t foundations share what they learn?
Foundations and nonprofits spend tax-advantaged money to study the world’s most important problems. We can presume that they want to better understand how to solve them. Why, then, wouldn’t they want to share what they are learning, in any way they can? As a newcomer to the social sector, I remain puzzled. (See my previous blogposts, Should foundations be subject to Freedom of Information laws? and When foundations are uncharitable.) About the only explanation I can come up with for the reluctance to talk more widely about their results is the personal embarrassment that talking about failure might bring. I assume that people don’t get fired from foundations for making grants that don’t accomplish their aims, but perhaps I’m wrong about that?
Since its inception, IssueLab has collected nearly 20,000 “resources” from about 5,500 publishing organizations. That sounds impressive, but Fitz told me that it is “barely scratching the surface.” To test that proposition, I searched IssueLab for research on several topics that are interests of mine — clean cookstoves, cash transfers and animal welfare — and found a couple of reports of value, but not much.
But do we need IssueLab when we have search engines such as Google, I wondered? Fitz explained that Google’s algorithms tend to drive users to the websites of well-established think tanks like the Urban Institute and Brookings. They produce valuable work, but too often the lesser-known yet incredibly valuable evidence about programs operated by nonprofits winds up buried deep in search results.
The revamped IssueLab, which rolls out this week, makes it easier than ever to upload, discover, and share research. Search functions are improved, and publishers can use Digital Object Identifiers, or DOIs, to ensure long-term discoverability across the Internet.
What would help even more would be for foundations to make it standard practice to publish all of their research to IssueLab. They could quite simply require all (or nearly all) of their grantees to publicly share any evaluations, as a condition of accepting grants. There’s precedent for this in the public sector: The National Institutes of Health has a public access policy to ensure public access to the results of NIH-funded research, through a digital archive called PubMed Central. IssueLab could then rank research by timeliness and relevance, making it more useful.
In fairness, some foundations are transparent and trying to make all of philanthropy more so. Among those cited by Fitz are Hewlett, Ford, Carnegie, Rockefeller and the group of foundations that formed the Fund for Shared Insight, which funds IssueLab.
This isn’t a trivial matter. As Fitz and her colleague Lisa Brooks wrote last year in The Foundation Review:
Despite our best intentions, the program officer who is considering new areas for investment still can’t do a quick search on what’s already been learned about an issue, problem, or attempted solution. The nonprofit practitioner who is shifting toward an earned-income model still can’t easily track down existing models from which to borrow. The evaluator who has been hired to understand the impact of an initiative still has no way to easily review existing evaluations of similar efforts. And the people we, as a sector, serve – those who rely on us to build on and improve the services we deliver – still bear the brunt of our failure to learn from mistakes and successes.