In our March 17 webinar, TABridge co-organizer Allen “Gunner” Gunn presented developing work by Aspiration on an issue that is widespread but rarely addressed: the ability of philanthropies to thoroughly assess technology proposals from NGOs and smaller advocacy groups. Aspiration helps nonprofits and foundations use technology more effectively and sustainably.
A full archive of the webinar is available to play online. You can also download the slide presentation (PDF). The tools presented continue to undergo refinement, so if you are interested in using them and providing feedback, please see below for ways to get involved.
Aspiration’s “donor checklists” have been in development since a 2013 “Cautionary Story Sprint” event led by James Vasile of OpenITP, said Gunner, with refinements taking place during the 2014 Nonprofit Development Summit. Both events were sponsored by the Ford Foundation, which also supports the Transparency and Accountability Initiative (T/AI).
TABridge’s “Fundamentals” Guide provided additional material for the donor recommendations, Gunner said. Our TABridge network was designed with donors as required participants in the transparency sector’s progress toward smarter, more strategic uses of technology. See notes from our second Bridging Session event for more about donor/grantee collaboration.
Because digital technology serves an increasing number of roles for civil society organizations, foundations and donors who support these groups face an increasing demand to consider technology-centric or technology-backed funding proposals. But the programs and proposals that donors encounter may suffer from one or more of these common challenges:
- The relationship between proposed tools and the strategies they support is often unclear, or missing.
- The proposal for a new technology isn’t supported by a clear understanding of the community to be served.
- Ambitious claims about new functions or broad impacts are difficult to verify or to rebut.
- Capacity questions about the ability of the proposing parties to bring technology to bear go unaddressed.
- Up-to-date field scans of existing technology options in a given context are the rare exception rather than the rule.
To give philanthropy teams a framework for solving these challenges, Aspiration and its community of experts have been drafting “technology proposal review checklists” for donors. Their goal, said Gunner, is to provide context and coverage in considering appropriateness, uniqueness, viability and sustainability of technology-related funding proposals.
As the TABridge team has noted on many occasions, the technology plan that sounds most exciting or sophisticated is frequently not the one most likely to succeed. When reviewing a plan, you should “look for outcomes, not acronyms” or other jargon, said Gunner. “Innovation is good to look for,” he added, “but look for precedents too”—examples that demonstrate the proposed tactics have some value.
Gunner then walked our international webinar audience through the six donor checklists currently under development, which suggest reviewing tech proposals based on: strategy and focus; scope and implementation; organizational factors; security; intellectual property; and “red-flag” words.
“This framework is a work in progress,” Gunner said. “It has not yet been used ‘in the wild,’ and we welcome feedback and questions.”
To assess the Strategy and Focus of a tech proposal, a funder should ask if the technology supports a (believable) larger plan. Checklist notes included:
- Will the technology really support that strategy?
- Does the strategy stand on its own?
- Can the applicant articulate the abilities, needs, and challenges of the intended users of this technology?
- Does the organization know what unique barriers its desired audience might face in engaging, and does it have a plan to overcome them?
Specificity can be one sign of strategic thinking, Gunner said. “We’ll write an app for the Cloud!” is not a strategy.
When looking at the Scope and Implementation of a proposed plan, Aspiration suggests asking if it is “iterative, collaborative and transparent.” Does it, for instance:
- Propose creating new or improving existing software or web-based tools?
- Build on existing, proven components?
- Clearly articulate defined milestones that convey an intentional, incremental approach to project realization?
- Explain how users and target communities will be involved throughout the course of the project?
- Include a commitment to provide regular updates to all stakeholders?
- Provide for ways to track success indicators, such as new users or data uploaded?
While an NGO seeking support may count on new tools and outside experts to take it to a new level, Organizational Factors are crucial to understanding if a proposal is viable. As Gunner put it, “Are the principal stakeholders up to the task?” For instance:
- Will the applying organization be the group to use the technology, the developer building the technology, or some combination of both?
- Do they have the internal technical or external advisory capacity to execute the project?
- Have they done similar or related projects to establish a track record?
- How will the project be maintained after this funding has finished? Is there a long term support plan?
- Can staff operate the technology by themselves, or will they require long-term developer support?
Aspiration warns that “it is really hard to be good at ‘tech-ing’ and nonprofitting. Beware nonprofits developing technology in-house.” Another risk mentioned was vendors who seek to “bake in” the need (and cost) of a permanent role for themselves even after a project is complete.
The most comprehensive set of considerations presented were in the area of Digital and Information Security. These ranged from basic questions, like whether a project has associated security risks or whether an organization has conducted any “threat modeling,” to deeper questions, like the need for ongoing audits of new technology as it evolves, or whether provisions exist to train administrators to move information into or out of new systems—for backups, for instance—with adequate encryption. Other key questions for privacy and security included:
- Does the project involve a sensitive topic or subject that could put staff, users, allies, or anyone else in physical or other danger?
- If data is being collected that could potentially put a user at risk, is the user told in clear language before their information is collected? (This practice is called “informed consent.”)
- Does this project collect or store any personal or identifying information that, if intercepted by an adversary, could get anyone in trouble?
- Does the handling of this data bring enough benefit to offset the risk of it being intercepted by a third party?
- Is there a training plan to ensure collaborators apply security tools properly and securely?
- If the project incorporates pre-existing tools for anonymity, privacy, cryptography, or security, have those tools been adequately reviewed and tested?
- How does the security of this project compare to others in this space?
The last two checklist topics under development by Aspiration included Intellectual Property and “Red-Flag” Words—words used by organizations that, “if they write them, you should probably question them,” as Gunner put it.
The questions around Intellectual Property (IP) focused on whether the proposed project produces “reusable assets”: code, content, or data that would or should be reused in the future. Is there any reason to restrict such use, for instance if IP terms in any related contracts preclude it?
Returning to the dangers of jargon, Aspiration’s checklist of “Red-Flag” Words offered a primer in phrases that should raise eyebrows during proposal reviews: over-used terms like “encryption,” “anonymized data,” “future-proof,” “NSA-proof,” the especially-jargony “best-of-breed,” and even the term favored by so many in the transparency community, “open.”
For “open,” said Gunner, “no universally applied definition exists. When a project describes itself as open, work to understand what that is intended to imply.”
Aspiration will be updating and disseminating these checklists over the next several months, and seeking feedback from donors, technologists and large and small organizations. They are also seeking “early adopters” to try out the checklists, so feel free to contact them at email@example.com, or visit aspirationtech.org.
Allen Gunn (@allengunn) of Aspiration is a co-organizer of the #TABridge network and works to help NGOs, activists, foundations and software developers make more effective use of technology for social change.
Ruth Miller (@mcplanner) is Lead Technology Strategist for TABridge co-organizer Aspiration, where she researches and documents Aspiration’s best practices for justice-oriented technical development.
Illustration by Valentina Cavallini, from TABridge Guide to Fundamentals for Using Technology in Transparency and Accountability Organisations.