AI Is Here (and it’s not going away!): How Canadian Non-Profit Leaders Can Use It—Safely and Strategically
That happened fast, didn’t it? Artificial Intelligence isn’t coming—it’s already here.
For Canadian non-profits, it’s easy to assume AI is for big tech companies or heavily funded start-ups. But in today’s economy, where demand for services is growing and resources are harder to secure, it makes sense for Executive Directors and Boards of Directors to look for intentional, ethical ways to integrate AI.
Practical Ways Non-Profits Can Use AI Right Now
Save Time on Admin & Reporting Tasks
The most obvious way to “dip the proverbial toe” into the AI game is to use it to save time on administrative tasks. AI tools like ChatGPT, Microsoft Copilot, and Google’s Gemini for Workspace (formerly Duet AI) can help draft board reports, donor letters, grant applications, and internal communications. I’ve found AI especially helpful for taking meeting notes and creating meeting minutes.
Resource: The Project Management Institute has many courses focused on using AI to manage projects.
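For teams with a bit of technical comfort, here’s a minimal sketch of what the meeting-minutes use case can look like under the hood. It assumes an OpenAI account and the official openai Python package; the model name and prompt wording are illustrative choices of mine, and remember that anything you send this way goes to a third-party service (more on privacy below).

```python
# A minimal sketch, assuming an OpenAI API key in the OPENAI_API_KEY
# environment variable and `pip install openai`. The model and prompt
# are illustrative choices, not recommendations.
from openai import OpenAI

client = OpenAI()

def draft_minutes(raw_notes: str) -> str:
    """Turn rough meeting notes into draft minutes for human review."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Draft concise board meeting minutes: attendees, "
                        "decisions, and action items."},
            {"role": "user", "content": raw_notes},
        ],
    )
    return response.choices[0].message.content

notes = "Present: J. Lee, A. Singh. Approved Q3 budget. Gala moved to October."
print(draft_minutes(notes))  # always review AI drafts before circulating
```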
Improve Fundraising and Donor Stewardship
For larger non-profits that maintain a CRM system, Canadian tools like Fundraising KIT use AI to analyze donor behaviour, suggest optimal outreach timing, and personalize messages at scale. Even smaller organizations that want to grow their resource development activities should keep an eye on this technology as it develops, because it changes quickly.
Resource: Fundraising KIT has several free resources on their website – take a look and give one of them a try!
Strengthen Volunteer Management and Engagement
AI can help smaller non-profits by automating volunteer coordination tasks such as scheduling shifts, sending reminders, and tracking volunteer hours. Some AI tools even analyze volunteer data to identify engagement patterns and improve retention.
Resource: Check out online systems such as SignUpGenius and Track It Forward – both use AI for components of their analysis.
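To make “engagement patterns” concrete, here’s a hypothetical sketch of one check these kinds of tools automate: flagging volunteers whose activity has lapsed. The data and the 60-day threshold are my own illustrative assumptions, not how SignUpGenius or Track It Forward actually work.

```python
# A hypothetical retention check: flag volunteers with no completed
# shift in the last 60 days. Data and threshold are illustrative.
from datetime import date, timedelta

shifts = {  # volunteer name -> dates of completed shifts
    "Priya": [date(2024, 5, 1), date(2024, 5, 15), date(2024, 6, 2)],
    "Marc":  [date(2024, 1, 10), date(2024, 2, 3)],
}

def lapsed(shifts: dict, today: date, days: int = 60) -> list[str]:
    """Return volunteers whose most recent shift is older than `days`."""
    cutoff = today - timedelta(days=days)
    return [name for name, dates in shifts.items() if max(dates) < cutoff]

print(lapsed(shifts, today=date(2024, 6, 30)))  # -> ['Marc']
```

A follow-up email or phone call to the “Marc”s on that list is often all retention takes.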
Risks to Watch Out For
Reproducing Bias or Exclusion in Program Delivery
AI systems can unintentionally reinforce systemic inequities—especially when built on biased data or applied without community input. For example, an AI chatbot that only “understands” formal English might exclude newcomers, youth, or people with disabilities. One way to mitigate this risk is to involve people with lived experience in testing the tool and to keep asking, “Who might this tool leave out?”
Data Privacy and Compliance
Uploading sensitive data into free AI tools may violate Canadian privacy laws or funder agreements. Using AI often means sending data to third-party platforms—sometimes outside of Canada or outside your control—which can expose sensitive donor, volunteer, or program participant information to unintended use. To mitigate this risk, use only PIPEDA-compliant tools and ensure that staff are trained on what data can and cannot be entered into each system. Finally, communicate transparently with stakeholders about your AI use and privacy practices.
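As one concrete example of “what data can and cannot be entered,” here’s a minimal sketch of a pre-send scrub that strips obvious identifiers before text reaches a third-party tool. The patterns are illustrative and catch only simple cases (note the donor’s name passes through untouched); treat this as a training aid, not a compliance guarantee.

```python
# Illustrative pre-send scrub: replace obvious identifiers with
# placeholders before text leaves your organization. Simple regexes
# like these miss plenty (names, addresses), so they supplement --
# never replace -- PIPEDA-compliant tools and staff training.
import re

PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
}

def scrub(text: str) -> str:
    """Replace matched identifiers with labelled placeholders."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label.upper()} REMOVED]", text)
    return text

print(scrub("Donor Jane Doe, jane@example.org, 416-555-0199, gave $500."))
# -> Donor Jane Doe, [EMAIL REMOVED], [PHONE REMOVED], gave $500.
```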
Overuse and/or Staff Resistance
I’d bet that nearly everyone on your team already has an opinion about AI. Some are likely using it already—quietly drafting emails or brainstorming with ChatGPT—while others may view it as a threat, not a tool, and avoid it altogether. There’s a wide spectrum of engagement levels taking place in the non-profit sector right now. The examples shared in this article are intentionally low-risk and supportive—but even these can raise concerns. To reduce resistance, leadership should be clear and proactive: AI is here to support your team, not replace it. Involve staff in exploring how these tools can be used safely and meaningfully in your context. When people feel included and informed, they’re more likely to lean in than push back.
Three Things EDs Can Do Right Now to Set the Tone for Safe, Strategic AI Use
Pilot something small. Try using AI to draft a policy, write a thank-you letter, or analyze a survey. Invite a few members of the team to join in and provide feedback.
Create internal guidance. Who can use AI, and for what? What data is off-limits?
Connect with peers. Ask colleagues in professional groups how they are using AI in their non-profits. There are also many online resources that can help.
Three Questions Every Non-Profit Board Should Be Asking About AI Right Now
Are we already using AI tools—and how?
You might be surprised what staff have already started testing.
Do we have the right governance in place?
If not, start with light-touch principles: equity, privacy, transparency, mission-alignment.
Are we supporting strategic innovation?
AI should be a board-level conversation—not just a back-office experiment.
AI may not be a silver bullet for your non-profit – but it can be a smart tool in the toolbox. Start small. Stay strategic. And above all, make sure the use of AI strengthens – not sidesteps – the organization’s mission and values. If you’d like support integrating AI into your non-profit, or developing your AI policy and procedures, reach out to chat – I’m here to help!
Is Your Clinic Ready for AI?
Practical tips and reminders for Canadian health clinic owners to support data protection and privacy for their clients.
Disclaimer: I’m not a legal expert or privacy regulator. This article is to help distill key themes and offer some practical starting points for clinic leaders.
We Create a LOT of Data—But Who’s Using It?
As humans interacting with phones, wearables, and digital health systems, we create copious amounts of data every day.
The average person generates hundreds of gigabytes of data daily—at even 100 GB a day, that adds up to more than 36 terabytes per year (IDC & Seagate, 2020).
In healthcare, even though patient records and imaging contain rich insights, only 3–5% of this information is actively used to improve care (IBM Watson Health, 2021; Deloitte, 2023).
A 2023 Canadian Medical Association survey found that only 20% of Canadians feel confident they understand how their health data is being used.
And the truth is—many health providers don’t either.
As AI tools enter clinics and data volumes grow, these transparency and trust gaps are only getting more urgent.
Canada’s AI Momentum in Healthcare
AI is reshaping healthcare in Canada—from drug development and diagnostics to administrative workflows, telehealth, and triage. We're seeing new technologies emerge across research, clinical, and operational functions.
That’s not necessarily a bad thing.
When implemented ethically and safely, AI can help scale research, detect patterns in patient populations, reduce clinician burden, and improve care coordination. It can even enhance administrative efficiency through automated note-taking and scheduling.
But it’s not all smooth sailing.
Federal AI-specific legislation (like Bill C-27) is still in progress.
Provincial regulators are updating privacy guidance in real-time.
And EMR vendors or AI startups may use data in ways clinicians and patients don’t expect—or consent to.
The Privacy & Security Reality for Clinics
Privacy and data security are among the top concerns in AI adoption across Canadian healthcare.
Only 21% of Canadian physicians feel confident that AI tools can protect patient privacy (CMA, 2023).
The sector has experienced over a dozen high-profile cyberattacks in recent years—including attacks on SickKids Hospital and Newfoundland’s provincial health system.
Canada’s privacy regulators are actively investigating how data from clinics is used by vendors, including for AI model training.
For small- to mid-sized clinics, this can feel overwhelming. But you don’t need to be a privacy lawyer to take meaningful action.
What Can Clinic Owners Do?
Here are five practical actions to consider:
1. Conduct a Privacy & Security Review
Read your EMR’s Data Processing Agreement (DPA) or Terms of Service, and confirm that the vendor won’t use any of the data without explicit consent from you and/or the patient.
Ask whether data is stored in Canada (this affects legal jurisdiction).
Ensure the vendor isn’t using patient data for AI model training or analytics—unless you’ve explicitly agreed to it (see the first bullet above).
2. Ensure Informed Consent
If you’re using AI tools (like note-taking assistants), be transparent with patients.
Explain how their data is used, whether it’s stored, and if it’s shared.
3. Appoint an AI/Privacy/Security Designate
This could be you as the clinic owner, or a clinic manager.
Give this person clear expectations and empower them to stay informed in a quickly changing landscape.
4. Track Evolving Regulations (see point #3)
Stay informed about provincial laws (like PHIPA, PIPA, Law 25 in Québec).
Monitor updates from the Office of the Privacy Commissioner, CMA, and your own professional college or association.
5. Create an AI Policy for Your Clinic
Even a one-pager outlining how you use AI and manage patient data builds accountability and trust.
Communicating With Patients
Trust is the foundation of all healthcare relationships.
Clearly explain how digital tools and AI enhance care—from reminders and record-keeping to personalized recommendations—while emphasizing your commitment to protecting personal health information.
A Final Note for Patients (really, all of us!)
If you’re a patient reading this, I encourage you: ask questions. You have a right to know:
What data is being collected
Where it’s stored
Whether it’s being used for anything beyond your care
Read the forms you sign when you start with a new clinician. Healthcare is becoming more digital and data-driven every day. The more informed we all are, the better choices we can make—for our care and our privacy.