Practical Guide to AI for Grant Writing

expert development, grant writing, marketing & communications

Welcome to a Practical Guide to AI for Grant Writing

Version 1.0

This is a resource expected to evolve with both feedback and new developments. A new version is already underway with further discussion of policy guidelines, suggested prompts, equity issues, and new tools. It may be greatly enriched by your suggestions and insights. What did I miss? What relevant things are you doing and wondering? Please don’t hesitate to use the form at the bottom or email me to share your own experiences, questions, and suggestions so we can level up this resource. 

Jen Jayme
January 2026

Introduction

If you’d told my younger self that during my lifetime, I’d use artificial intelligence to help write grant proposals, I’d have laughed my leg warmers off. I thought I’d grow up to be a Solid Gold Dancer* instead. Yet here we are.

*Solid Gold was a hit ’80s TV show featuring professional dancers in fabulous costumes performing choreographed disco routines to Top 10 hits.

We can’t deny it. AI is transforming every sector, including nonprofits, and grant writing is one area seeing rapid adoption. AI promises much-needed efficiencies for perennially overloaded nonprofit staffers, yet also raises questions about authenticity, quality, ethics and environmental impacts. In this guide, I explore key things to know about using AI for grant writing, evaluate some leading platforms, and consider the environmental impact of AI.

To be clear, this is a practical guide to AI adoption from the view of an ambivalent Gen Xer. I won’t pretend to be an early adopter—I’m more like a technically proficient curmudgeon. Like other Gen Xers, I’ve been watching the dawn of this new AI age with a mix of curiosity and crotchety hesitation. Our generation bridged the analog and digital worlds; we learned to adapt as technology rapidly evolved from beige microwave-sized personal computers to today’s smartphones. 

Yet our view of AI is tempered by Neuromancer, The Terminator, and lived experience: we also saw 8-tracks, cassettes, Walkmans, floppy disks, dial-up AOL, and Palm Pilots come and go. Technology can be an amazing enhancement, yet it can also be a distraction with hidden costs. It may save us time on tasks…but too often we fill that saved time with more work. Just when we learn the latest thing, some new technology whips around the corner. And who knows the toll it takes on our environment, health, relationships and communities?

Furthermore, AI raises critical questions around equity. As noted wisely by Michelle Flores Vryn in the AI Equity Project 2025, “AI isn’t just a tool but an infrastructure. And once infrastructure sets in—like roads, electricity, or the internet—it doesn’t just change what we do. It changes who has power, who has access, and who gets left behind.” 

Yet the AI movement and its potential are too great to ignore. It’s not an overstatement to say “Artificial intelligence is the new electricity.” And I believe that to engage meaningfully in this movement, we have to roll up our sleeves and start learning. Publicly. So I tried out ChatGPT, Microsoft Copilot, Claude, Google’s Gemini, DeepSeek and Grantable specifically for grant writing. After months of training and experimenting, here’s what I learned.

“Artificial intelligence is the new electricity.”
Dr. Andrew Ng
Computer Scientist, Entrepreneur, Educator
Co-Founder, Coursera

Overview: A Perspective on AI

There’s a proliferation of AI tools, with important differences between them. Fortunately, a basic understanding of the underlying architecture, plus prompt-writing skills, carries across all of them. It’s important to understand three things about any AI tool:

  1. Like a chainsaw, AI is a tool that needs a human operator supervising each job. AI can generate very plausible-sounding content that contains factual errors. Don’t expect to automate everything or use AI results without applying your human judgment. It’s essential to have a human-in-the-loop to provide strategic direction, fact-checking, contextual knowledge, and an authentic voice. 
  2. AI tools differ in what universe of data they search for source information. Make sure you understand: Is it searching the web? Does it have access to some proprietary data that lives behind a paywall? Does it access your own email and cloud? 
  3. AI tools vary in whether they retain and “train” on the information you input.  Data retention policies differ – not just from one tool to another, but between free and paid subscription levels. Be sure to ask about each tool’s data retention policy, and take care with the information you input. 

TIP: There’s a list of recommended free and low-cost trainings at the bottom of this article. For a good introduction, I recommend Anthropic’s “AI Fluency for Nonprofits” course, which lays out the art of prompt writing and emphasizes the need to apply diligent human judgment to every result. 

The Upside: AI Is The Smart Assistant I Always Needed

After taking the time to watch a training video, I personally found it pretty easy to get the hang of using various AI tools. The results improved as I took more online trainings and learned to write better prompts. Now, I won’t go back to working without using AI for grant writing – it definitely offers advantages:

  • Huge Time Saver on Drafts: At lightning speed, AI tools can produce drafts that fit character limits using the same sources I’d use – e.g., a past proposal, an annual report – and I was surprised by narrative quality that surpassed what I’ve seen from some human assistants. A McKinsey study found that AI-driven automation reduced the time for routine grant writing tasks—like research and initial drafting—by 30–50% (McKinsey Global Institute, 2023).
  • Smart Analyst: AI can also perform analytical tasks like summarizing a lengthy government RFP, aligning an answer to funder interests, and creating data visualizations much faster and more effectively than many humans. AI is a game-changer for drafting things like Work Plans and Budget Narratives for Federal grant applications. AI can even take a haphazard list of resources and organize them into a bibliography with citations in proper APA or MLA format. It not only saves time but also precious mental bandwidth.
  • Creativity Booster: To my surprise, I found AI tools provided a good muse and thought partner. It turns out AI can actually outshine human experts when it comes to generating novel ideas (Si et al., 2024).  In one study, researchers found that ChatGPT-4 generated more creative solutions to complex problems than most human participants, with AI responses scoring 15% higher on novelty and usefulness, and in another study, research revealed that ChatGPT-4 outperformed humans in divergent thinking tests, scoring significantly higher on both fluency and flexibility (Hubert et al., 2024).  I found it particularly helpful when facing a blank page or a question I didn’t immediately know how to answer.
The key thing I learned: Initially, I assumed the value of using AI was that it could completely automate grant writing tasks, yet I learned that any narrative produced by AI should always be regarded as a preliminary draft that requires refinement and verification. Even with that approach, the time and mental load AI saved me on a few proposals very quickly justified the time I invested in taking a few webinars. Now it feels like working without AI assistance is choosing to do things on Hard Mode. In fact, one of the most compelling arguments for using AI is that, as MIT research has shown, using AI can increase productivity, quality of work AND happiness in work (Brynjolfsson et al., 2023; Dell’Acqua et al., 2023). Case in point: when AI put all of my references in proper APA citation form in seconds with no effort on my part, it left me a happy camper. 
"While much of the conversation about AI has centered on efficiency, the real opportunity may lie in something far more human: happiness...The research reveals a clear correlation between frequent AI use and elevated job satisfaction."
AI at Work: New Global Study Links AI to Greater Happiness
Jabra & The Happiness Research Institute

The Downside: Sometimes Enthusiastically Misguided

I did find some problems with using AI for grant seeking. Many of these may be resolved in time as AI tools evolve. For now, these flaws should be regarded – not as reasons to avoid using AI – but rather, as potential issues to learn how to mitigate:

  • Hallucinations: AI tools sometimes plant inaccurate, false or fabricated data in very confident and credible-sounding statements or citations. It happens often enough that there’s a common name for this: “hallucinations.” I experienced this with research for this blog, and when I asked for a list of nonprofit CEOs in my county; maybe 80% of the results were correct, but enough random misstatements killed my confidence in the results. All AI tools can hallucinate, and some are worse than others. Today, data points and references produced by AI need fact-checks. Ultimately, AI tools need to be held accountable to a common reliability or trust score that hasn’t yet been developed.
  • Heartless: AI writes clean copy but often lacks the passion essential in our cause-based work. Without an added human touch, an AI-written proposal can fail to stand out to funders and may even stir a negative “uncanny valley” response. You can just tell when there’s a human behind the writing and when there isn’t. 
  • Context Cluelessness: AI tries but doesn’t really get nuance, and nuance is absolutely critical for understanding donor motivation, conveying values, and building genuine relationships with funders. 
  • Slop: “Slop” is a term used to describe low-quality digital content, and the fact that Merriam-Webster dictionary named it the word of the year indicates its unfortunate prevalence (CNN Business, 2025). My example: I tried testing various AI tools to generate section heading ideas for this article that referenced 1980’s pop culture. Each one cheerfully dished out a dozen groaners, citing the Flux Capacitor and DeLorean, Miyagi, Nintendo, even “Who You Gonna Call? AI Grant Busters.”  They were all just cringey. AI doesn’t know how or when to reply, “Human, this is just generally a bad idea and I’m going to save you from yourself by not responding.”
  • Ethics and Environmental Impact Questions: With each prompt, I couldn’t help but wonder how funders regard use of AI in proposals: do they consider it cheating, plagiarism or laziness? And what about the environmental impact?  I keep hearing AI is terrible for our planet. It turns out, as with everything else AI-related, the answers are complex and rapidly evolving. A discussion of the environmental impacts follows.
Because of these weaknesses, use of AI works best with a human in the loop. There are plenty of “work laundry” tasks that can be handled autonomously by AI, but at some point in the process, it’s best to have a human shaping decisions and controlling quality. Without a human in the loop, we end up with “workslop” – AI content that appears polished but lacks real substance and hasn’t benefited from human judgment. And workslop isn’t harmless; according to Harvard Business Review, AI-generated workslop is destroying productivity, creating more work, and wasting time for colleagues who have to fix or interpret it. 
 
So use AI, but use it sparingly and thoughtfully. As one champion of nonprofit digital transformation, Beth Kanter, capably sums up: “It’s like hot sauce, not ketchup” (Microsoft, 2025; Northern California Grantmakers, 2025). AI should be used strategically, intentionally, and in just the right doses.
"Not everything can be done with AI - it is hot sauce, not ketchup. To reap the benefits of AI and use it responsibly you have to select the right use case and mitigate risks."
Beth Kanter
Consultant & Author, The Smart Nonprofit
Microsoft Global Nonprofit Leaders Summit

Is This Trending or Not?

How Many Nonprofits Are Using AI for Grant Seeking?

Although talk of AI is ubiquitous, personally, I’ve seen relatively few grant professionals actively using AI for their daily work. According to the AI Equity Project, in 2024, most (92%) nonprofits reported that they felt unprepared for AI, and 60% expressed uncertainty and mistrust. The AI Equity Project’s freshly published 2025 report indicates most still face barriers to adoption: 60% of nonprofits report a lack of in-house expertise to assess AI tools, and only 4% have AI-specific training budgets (Das & Vryn, 2025). A broader 2025 report shows nearly three times as many Americans reject AI’s growing use (49%) as embrace it (17%) (Edelman, 2025). Meanwhile, preliminary findings in the soon-to-be-released Nonprofit Tech for Good Survey show that “the vast majority of nonprofits are not using AI for fundraising. In fact, the nonprofit sector has barely entered the early adoption phase.” (Mansfield, 2026)

Yet, at the same time, there’s research that shows adoption of AI among nonprofits is accelerating rapidly: separate studies by Blackbaud and Bonterra demonstrated that nearly 7 out of 10 fundraisers—about 70%—are leveraging AI tools in their work (Blackbaud Institute, 2025; Bonterra, 2025). Of course, both are online platforms that have integrated AI capabilities and stand to benefit from a nonprofit’s fear of missing out, and their statistics may refer to adoption among their client base, or individual adoption rather than adoption at the institutional level. There’s also some research showing nonprofits are actually adopting AI even faster than the private sector, with 58% of nonprofits using some form of AI compared to 47% of B2C businesses (Fast Company, 2025; Twilio, 2024). However, those statistics may also be somewhat skewed to reflect larger nonprofits, which adopt AI at nearly twice the rate of smaller ones (66% vs. 34%) (2025 AI Benchmark Report). 

Overall, according to the 2025 AI Benchmark Report, 1 in 4 (24.6%) nonprofits are already using AI for grant writing, and 60% express strong interest in leveraging AI for grant seeking and fundraising. This trend is driven by both the need for greater efficiency and the preponderance of nonprofit work that revolves around communication tasks, for which AI is particularly well-suited.

It’s clear that AI is here to stay, and adoption is expected to accelerate each quarter. Nonprofits’ ability to adopt AI will correlate largely with the resources they have previously invested in IT staff and tools. Those understaffed in IT are ill-equipped to develop AI policies and implementation plans. For workers, now is the time to begin experimenting with different tools to build fluency and learn your preferences. For leaders, now is the time to establish an AI strategy that addresses the opportunities, risks, and standards such as data privacy protocols, disclosure to funders, and staff training requirements.

“Artificial Intelligence, deep learning, machine learning — whatever you’re doing if you don’t understand it — learn it. Because otherwise, you’re going to be a dinosaur within 3 years.”
Mark Cuban​
American Businessman, Co-Owner of the Dallas Mavericks

Reviews: My Take on Specific Tools

Here’s what I found as a seasoned grant writer and AI beginner using five tools for grant writing and prospect research tasks:

Claude: Excels in conversational tone and nuanced understanding. Favored by AI insiders for emotional intelligence, and often produces less generic prose that feels more human and engaging. Also excels at handling longer documents and complex topics requiring careful explanation. Claude can process entire RFPs and long documents while maintaining high safety standards and low hallucination rates.

Claude also caters a bit more to the nonprofit sector, and integrates with Candid and Blackbaud to provide deeper insights for funder research and best practices. This makes it particularly valuable for grant writing. Claude became my personal favorite.

Data is only used in model training if users explicitly opt in, and in that case, data is retained for up to 5 years. If you opt out, the existing 30-day retention period continues.

Explore Claude for Nonprofits: toolkit & special offers.

Costs: Free tier; Pro at $20/month ($17/month annually); Team at $25-30/user/month.

Google Gemini: Touted as the “Swiss Army Knife” for those living in Google Workspace, Gemini’s strength is its seamless integration with Google Docs, Sheets, Gmail, and Drive. Responses are grounded in your own emails and files saved in your Drive as well as the broader internet.

I found Gemini to produce the best results on funding opportunity searches, with the most accurate and nuanced information.

There’s also a nice “Double-check” button feature that uses Google Search to cross-reference its own claims.

However, Gemini’s answers can sound a bit patronizing (it told me every prompt I wrote was an absolutely fantastic idea), and its narrative can lean towards generic-sounding “corporate-speak.”

Also, be aware that Google may use personal account data and user prompts for model training unless you manually opt out in the settings.

Costs: Free tier; Pro (Advanced) tier $19.99/month (includes 2TB of storage); high-end Ultra tier $249.99/month; Workspace business users typically pay $20–$30/user/month.

Microsoft Copilot: If you already use Microsoft Office, Copilot offers distinct advantages as it integrates seamlessly with Word, Excel, Outlook and Teams. It can draw upon data from the web as well as your “grid” of files on OneDrive and SharePoint and your calendar and chats – to craft narrative and data visualizations within the security of your existing system. It also purportedly learns your writing style over time.

It can be pricier than other options, and if you’re not already wed to the Microsoft environment, the advantages are fewer.

Copilot is not specifically trained on grant writing or nonprofit-specific language.

Prompts and responses are typically retained for up to 30 days for service improvement purposes, with data deleted after this period. Customer data is not used to train foundation large language models.

Cost: Free basic version; Pro at $20/month; Microsoft 365 Copilot at $30/user/month.

Grantable is designed specifically for grant professionals and quickly gained my respect. It features a “Smart Content Library,” tools for tracking grant opportunities, collaboration features, and AI-powered narrative writing. You can upload a PDF application and Grantable will smartly break it into sections you can work on separately.

It also lets you build a library of modular paragraphs and key data points for consistent use.

While Grantable’s AI capabilities and built-in features aren’t as extensive as the other general-purpose AI tools and it doesn’t currently offer grant search, it’s a solid investment if you manage lots of proposals and reports and use another resource for funder research.

Data privacy is a strength: Grantable doesn’t sell user data or use it for AI training—your content remains yours and is handled with strict privacy standards.

Cost: Pricing includes a free tier with limited AI credits and a Starter plan at $24/month, with Pro paid plans available.

ChatGPT: Excels at rewriting content to improve clarity or shorten to fit character limits. Free version is great for emails, fine for first drafts and brainstorming.

Just don’t let it write your mission statement unless you want something that sounds like a motivational poster. And don’t trust it blindly—fact-check everything. I found ChatGPT to generate the most inaccurate information of all of these tools.

In fact, a study of mental health literature reviews found that ChatGPT (GPT-4o) fabricated roughly one in five academic citations, and more than half of all citations (56%) were either fake or contained errors (Linardon et al., 2025). Also, be aware that ChatGPT retains user chats for Free and Plus users indefinitely unless manually deleted, with a 30-day server removal process after deletion. And by default, conversations may be used for model training unless you opt out in Settings.

Cost: Free basic version; Plus at $20/month; Pro at $200/month.

DeepSeek: A newer, cost-effective AI platform—free to use with very low API costs, generating answers three times faster than GPT-4. It supports multiple languages and excels at technical and logical tasks, making it ideal for high-volume, technical work where budget matters.

However, I personally found DeepSeek less user-friendly, with a less intuitive interface, and didn’t love the text-only limitation (it cannot generate images or data visualizations like other tools).

It also lacks nonprofit-specific features like grant database integration, and there are privacy concerns due to data sent to China and content restrictions on certain topics.

Costs: Free tier with limits; API pricing is $0.28 per million input tokens and $2.19 per million output tokens, making it much cheaper than competitors.
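For readers weighing API costs, the per-request arithmetic is simple. Here’s a quick sketch using the DeepSeek prices listed above; the token counts in the example are my own rough assumptions for a typical proposal-drafting request, not measured figures:

```python
# Rough cost estimate for one AI-drafted proposal section via a pay-per-token API.
# Prices are the DeepSeek rates listed above (USD per 1M tokens); token counts
# below are illustrative assumptions (a token is roughly 3/4 of an English word).

INPUT_PRICE_PER_M = 0.28   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 2.19  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of a single API request."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M + \
           (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# Example: a 4,000-token prompt (RFP excerpt plus a past proposal) producing
# a 1,500-token draft costs well under a penny.
cost = request_cost(input_tokens=4_000, output_tokens=1_500)
print(f"${cost:.4f}")  # about $0.0044
```

Even at high volume, the token math shows why DeepSeek is attractive for budget-conscious technical work: hundreds of drafting requests would still total only a few dollars.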

Top Picks: Tools of Choice

  • Best for compelling narrative: Claude
  • Best for Microsoft ecosystem integration: Copilot 
  • Best for Google ecosystem integration and upcoming deadline research: Gemini
  • Best for grant-specific workflows: Grantable
  • Where to start? I recommend you choose the AI tool that integrates with most of your work – either Microsoft Copilot or Google Gemini. Develop some comfort and fluency with prompt writing in that platform, then branch out and experiment to see what you like. 
  • Personally: I’ve come to favor different tools for different jobs, and I don’t mind switching around. I use Claude for narrative – hands down the best writer – and Gemini for prospect research. When I need to produce a slide deck or a complete Word document, like a project timeline, or extract info from emails, I use Microsoft Copilot. The prompt-writing skillset works across all of these. I also found Grantable uniquely helpful in organizing a pipeline of funders, deadlines, and grant writing tasks, and surely some would favor it. I just have a system for funders and deadlines that I prefer to use. Lastly, for the work I do, I haven’t found myself compelled to use ChatGPT or DeepSeek beyond my initial testing. 

What About the Environmental Impact of AI?

As residents of this fine planet, particularly ones in the business of social good, we cannot overlook the environmental impact of AI. It takes massive computational power to train and run large language models. Millions of mathematical operations performed by graphics processing units (GPUs) and servers collectively contribute to drains on three resources:

  1. High Energy Consumption: The powerful server farms (data centers) housing the AI models require immense amounts of electricity—often up to ten times more per query than a standard search engine query (Bourcier, 2025)—and in many regions, this power is still sourced from fossil fuels, increasing carbon emissions (Kanoppi, 2025; Polytechnique Insights, 2024).
  2. Water Usage for Cooling: To prevent these high-power servers from overheating, data centers use huge quantities of water to cool them. Every 100-word prompt uses the equivalent of roughly one bottle of water. This puts a significant strain on local water supplies, particularly in drought-prone areas, as much of that water is lost to evaporation (World Economic Forum, 2024).
  3. Hardware & E-Waste: The specialized, high-performance hardware (GPUs/TPUs) needed to run AI is material-intensive to manufacture, requiring the mining of rare earth minerals and eventually contributing to the growing problem of electronic waste.

A single complex AI prompt may result in emission of several grams of CO₂, and processing just three prompts could add up to 10–20 grams—roughly the energy used to brew a cup of coffee or power a standard LED light bulb for several hours. These impacts add up quickly as more people and organizations adopt AI.

The AI platforms I’ve explored all say they’re working on various strategies to improve their energy-efficiency and mitigate their environmental impacts. Mainly, they’re all striving to optimize infrastructure to reduce the carbon footprint for each prompt. Both Microsoft and Google have stated goals to be carbon-free by 2030. Google also currently matches 100% of its annual global electricity use with renewable energy purchases.

So which tool is the best for the environment?  Hard to say. The carbon footprint appears pretty similar across these AI tools today, and there’s no reliable universal framework – like a trusted environmental impact score – that enables climate-conscious users to compare the eco-friendliness of AI tools. But there are developing efforts to allow for “apples-to-apples” comparison like the AI Energy Score and the Functional Unit (FUEL) Framework, and both the European Union and the U.S. government are working on regulations to mandate reporting on energy use, resource consumption, and lifecycle impacts. Here’s a call for social impact organizations to play a critical role in pushing for a common ecological impact framework as AI adoption inevitably progresses. And it seems AI use will, inevitably, progress.

In the meantime, for the everyday worker, know that a human writer working without AI actually produces a much larger environmental impact. A human in the US authoring a single page typically generates about 1,400 grams of CO₂ given the energy required to support human activity during the writing process. That’s nearly the CO₂ generated by taking a 15-minute hot shower (using an electric boiler), or driving a standard gas-powered car for about 4 miles. AI systems actually emit between 130 and 1,500 times less CO₂ equivalent per page of text generated compared to human writers, and between 310 and 2,900 times less CO₂ per image than human illustrators. Even the act of using a laptop is more carbon-intensive for human writers than for AI systems performing the same task, as AI handles computations more efficiently at scale.

No matter what AI tool you choose, the per-page emissions will be hundreds or even thousands of times lower than human writing on a laptop. In this sense, use of AI could actually help cut down overall emissions (Luccioni et al., 2023).
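The per-page comparison above can be sanity-checked with simple division. This is a minimal sketch using only the figures cited in this section (the 1,400 g human figure and the 130x–1,500x ratios come from the text; nothing here is an independent measurement):

```python
# Back out the implied AI emissions per page from the figures cited above:
# a US-based human writer at ~1,400 g CO2e per page, and AI systems emitting
# between 130 and 1,500 times less per page (Luccioni et al., 2023).

HUMAN_G_PER_PAGE = 1400.0  # grams CO2e per page, per the text

def ai_emissions_range(low_ratio: float = 130.0, high_ratio: float = 1500.0):
    """Return the (min, max) implied AI grams-per-page from the cited ratios."""
    return HUMAN_G_PER_PAGE / high_ratio, HUMAN_G_PER_PAGE / low_ratio

low_g, high_g = ai_emissions_range()
print(f"Implied AI emissions: {low_g:.2f} to {high_g:.2f} g CO2e per page")
# roughly 0.93 to 10.77 g per page, versus 1,400 g for a human writer
```

Put differently: on these figures, even a heavy day of prompting stays within the footprint of drafting a single page by hand, which is why the cumulative-usage caveat in the next paragraph matters more than any one prompt.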

Nevertheless, it’s still essential for organizations to remain mindful of their total AI usage and prioritize vendors that demonstrate a clear commitment to environmentally responsible operations. The cumulative impact of frequent prompts can still add up. Further, the conversation doesn’t stop at emissions alone. Analyses focused on direct outputs don’t always account for wider considerations such as professional displacement or the legal complexities around AI training data.

AI is still a rapidly evolving space. For me, the environmental advantage of AI is clear for certain writing and illustrating duties, and true leadership demands we balance productivity, efficiency, and ethical responsibility.

Let’s pursue digital transformation with a balance of boldness and care to ensure our progress is both purposeful and sustainable. 

Conclusion: Use the Tech, Keep the Soul

As a generation that navigated the shift from card catalogs to Google, Gen Xers bring a valuable perspective to this emerging AI movement: we know that every “miracle” tool comes with a terms-of-service agreement written in fine print. AI is no different. It’s an incredible yet flawed intern—one that’s super fast, occasionally brilliant, but prone to making things up if it thinks it’ll please you.

Use AI to streamline grant writing and improve efficiency, but remember it’s no substitute for human passion, creativity and judgment. In the world of philanthropy, our greatest asset isn’t our ability to type faster; it’s our ability to build relationships, understand nuance, and tell a story that moves a human reviewer. Use AI to clear the “blank page” hurdle or to summarize that 80-page RFP, but don’t let it sit in the driver’s seat.

We’ve survived the transition to the digital age by staying adaptable and keeping our “BS detectors” set to high. Let’s approach AI the same way: with our sleeves rolled up, our eyes wide open, and our hands firmly on the keyboard. After all, the robots might be able to write the narrative, but they’ll never have the heart for the mission.

Right now, AI won’t replace us grant writers. But it can free us from the grind so we can focus on what really matters: building relationships, telling authentic stories, and maybe even leaving work before sunset.

Some Learning Links

LIVE
February 12, 2026 | 2:00 pm ET
AI Can Turbocharge Your Grantseeking: Here’s How
Chronicle of Philanthropy
Early bird price of $69 ends on January 15, 2026 at midnight ET. Specifically focused on use of AI for grant seeking. 

January 28 – February 11, 2026 | 3 Wednesdays, 1.5 hour sessions
Certificate in AI for Marketing and Fundraising
Nonprofit Tech For Good
$229 for the series. Certificate upon completion.

ON DEMAND

AI Fluency for Nonprofits
Anthropic Academy
Focused specifically on use by nonprofits. Self-paced in five 15-min increments. Certificate upon completion.

Introduction to Prompt Engineering
Learn Prompting
60+ content modules, 13 languages, interactive examples, Discord community

Elements of AI
University of Helsinki & MinnaLearn
1.8M+ students enrolled, ~25 hours, available in all EU languages

Google AI Essentials
Google (Coursera)
10-15 hours, certificate upon completion, free or $49/month with 7-day trial on Coursera

Prompt Engineering for ChatGPT
Vanderbilt University (Coursera)
Beginner-friendly, ~20 hours, 576K+ students enrolled, teaches prompt patterns and applications, certificate upon completion

Enhancing Productivity for Grant Management with AI (Parts 1 & 2)
For AFP Members. Specifically addresses using AI for grant writing and provides a few good tips not found elsewhere.

Bibliography

Blackbaud Institute. (2025). The status of fundraising 2025: Global research insights on fundraising in the AI era. https://help.blackbaud.com/docs/0/assets/guides/re/reports.pdf

Bonterra. (2025). The AI readiness path: Key insights for nonprofits and funders.

Bourcier, L. (2025, August 19). What impact does AI have on the environment? Gonzaga University. https://researchguides.gonzaga.edu/az/databases

Brynjolfsson, E., Li, D., & Raymond, L. R. (2023). Generative AI at work. National Bureau of Economic Research. https://doi.org/10.3386/w31161

CNN Business. (2025, December). Merriam-Webster’s 2025 Word of the Year takes aim at poor AI content.

Das, M., & Vryn, M. F. (2025). The AI Equity Project report. https://aiequityproject.my.canva.site/2025

Dell’Acqua, F., McFowland, E., Mollick, E. R., Lakhani, K. R., & Resutek, C. J. (2023). Navigating the jagged technological frontier: Field experimental evidence of the effects of AI on knowledge worker productivity and quality. Harvard Business School Technology & Operations Mgt. Unit Working Paper No. 24-013. https://doi.org/10.2139/ssrn.4573321

Edelman. (2025). 2025 trust barometer flash poll.

Fast Company. (2025, July). The AI-native nonprofit era is coming.

Hubert, K. F., Awa, K. N., & Zabelina, D. L. (2024). The current state of artificial intelligence generative language models is more creative than humans on divergent thinking tasks. Scientific Reports, 14(1).

Jabra & Happiness Research Institute. (2024). AI at work: New global study links AI to greater happiness. https://www.jabra.com/ai-at-work

Kanoppi. (2025). Search engines vs AI: Energy consumption compared. https://kanoppi.co/

Linardon, J., Jarman, H., McClure, Z., Anderson, C., Liu, C., & Messer, M. (2025). Influence of topic familiarity and prompt specificity on citation fabrication in mental health research using large language models: Experimental study. JMIR Mental Health, 12, Article e80371. https://doi.org/10.2196/80371

Luccioni, A. S., Jernite, Y., & Strubell, E. (2023). Power hungry processing: Watts driving the cost of AI deployment? arXiv preprint arXiv:2311.16863. https://doi.org/10.48550/arXiv.2311.16863

Mansfield, H. (2026, January 9). New data reveals most nonprofits aren’t using AI for fundraising. Nonprofit Tech for Good. https://www.nptechforgood.com/2026/01/09/new-data-reveals-most-nonprofits-arent-using-ai-for-fundraising/

McKinsey Global Institute. (2023, June 14). The economic potential of generative AI: The next productivity frontier. McKinsey & Company.

Microsoft. (2025, March). Global nonprofit leaders summit.

MIT News. (2025, January 17). Explained: Generative AI’s environmental impact. https://news.mit.edu/

Morrison, J. (2025). Holistically evaluating the environmental impact of creating language models.

Nonprofit Pro. (2025, February). 2025 AI benchmark report: How artificial intelligence is changing the nonprofit sector.

Northern California Grantmakers. (2025, December). Panel on AI readiness.

Polytechnique Insights. (2024, November 13). Generative AI: Energy consumption soars. https://www.polytechnique-insights.com/en/columns/society/managers-and-information-have-times-changed/

Si, C., Yang, D., & Hashimoto, T. (2024). Can LLMs generate novel research ideas? A large-scale human study with 100+ NLP researchers. Stanford University.

Twilio. (2024, September). State of nonprofit digital engagement report 2024.

World Economic Forum. (2024, November 7). Circular water solutions key to sustainable data centres. https://www.weforum.org/

What Do You Think?

As mentioned at the start, this guide is expected to evolve with both feedback and new developments. It would be greatly enriched by your insights! What did we miss? What relevant things are you doing and wondering? Please share your own experiences, questions, and suggestions so we can further develop this resource.
