What funders really think about AI-generated grant applications

Posted on 15 Oct 2025

By Matthew Schulz, journalist, Institute of Community Directors Australia

Image: "Film Noir Typist", generated with DALL·E (2023)

Artificial intelligence (AI) is becoming an essential tool for not-for-profits seeking to win grants, but the trend is not without risks.

In a white paper presented to the Education Network’s recent Charities and Not-for-Profits Conference, Catherine Brooks, CEO of Equitable Philanthropy, outlined both the opportunities and challenges of using AI in fundraising, and how funders are responding.

AI as a support, not a substitute

Catherine Brooks

Brooks said AI should support, not replace, organisational expertise. Funders want to see strategic clarity and community insight in applications, she said, describing these as human skills.

The CEO of the Geelong Community Foundation, Amy Waters, says in the white paper, “It is essential that charities do the initial work to clearly define the activity that they are seeking funding for. This can’t be AI driven. It needs to be informed by the applicant’s understanding of their target group and their real-world experiences in supporting their beneficiaries.”

Brooks said human insight and connection build the foundation of trust, something AI cannot do.

Funders’ perspectives on AI in grant applications

To develop the white paper, Equitable Philanthropy surveyed major funders on their views about AI use in grant submissions.

Brooks said most funders were now comfortable with the use of generalist tools such as ChatGPT and specialist grantwriting tools such as Drafter, developed by the Funding Centre, and she believed these tools could improve the quality of applications. At this stage, very few funders require applicants to disclose their use of AI.

Gill Whelan from the Decjuba Foundation told Equitable Philanthropy, “AI tools should be used as a starting point only. Make sure you imbue your writing with the unique tone of your organisation to avoid sounding like everyone else.”

Brooks stresses in the white paper that authenticity is critical, suggesting that AI could help generate drafts and structure responses, but that the organisation’s mission and voice must stand out.


Practical guidelines for responsible use

Brooks provided several recommendations to guide ethical and effective use of AI in grant writing:

  • Do not outsource program logic to AI: Define your objectives and theory of change using your own knowledge.
  • Use AI to polish, not plan: Refine your writing, but develop strategy independently.
  • Check and validate outputs: Review AI-generated content against real-world experience.
  • Stay human: Prioritise emotional understanding and lived experience in applications.

She said these principles reflected a broader trend in the sector of treating AI as a tool rather than a decision-maker.

Trust and transparency in the AI era

Brooks said trust remained central to philanthropy, and organisations must approach AI with transparency and care. She noted that while funders were increasingly expecting NFPs to be open about how they use AI and data, there were no compulsory disclosure requirements.

Nevertheless, she encouraged organisations to publish responsible AI usage policies, predicting that the practice would soon be seen as a marker of ethical governance.

The SmartyGrants-affiliated Institute of Community Directors Australia provides guidelines, policies and templates on the use of AI.

Sector implications and next steps

Brooks said the Funding Centre’s Drafter project, which aims to help organisations write stronger grant applications, was a sign that AI would become more integrated into grantmaking processes.

She recommended that NFPs:

  • begin trialling AI tools under clear internal guidelines
  • develop or update their AI fundraising policies
  • proactively seek funder feedback
  • preserve their mission, voice and community connection in all communications.

Brooks said that while AI offered clear efficiency gains for NFPs, those benefits depended on how the technology was used.

She said using AI was “not just a technical matter, but a question of governance, integrity and long-term impact”.

Responsible use of AI must be grounded in trust, transparency and authenticity, she said.

“Technology should amplify your values, not distort them.”
