I may be late to the party (fashionably so I hope). I’ve begun thinking about what policies and guidance are going to be needed and might work best for community-led charities using AI. So I asked ChatGPT!

Using slightly different language and perspectives each time (service clients, project participants, staff and board members, funders, ethical and strategic angles), I asked the app to write guidance or policy around AI for community organisations. Co-authored by ChatGPT and me, here are some things you might want to think about.

*ChatGPT and I are both flawed, so this won’t cover everything and, as ever, it’s best to make sure you are considering things from your own unique understanding and perspective.

Knowing what AI is

Talk about ‘AI’ is so commonplace that it’s easy to assume that everyone knows what it is. You and your colleagues might want to discuss and adopt a simple definition. You can use your definition as a preface to policies or guidance you develop to help ensure shared understanding. Something like this:

“At [our organisation] when we talk about AI, what we mean is [your definition].”

ChatGPT said: AI, or Artificial Intelligence, is a type of technology that allows machines to learn from experience, adjust to new information, and perform tasks. This includes things like understanding language, recognizing patterns, solving problems, and making decisions. AI is accessible via online platforms such as ChatGPT.

Purpose

Why are we using AI? What do we want it to help us to do? Some AI uses might be great for your organisation, and others might not be. It’s important to talk about this and make decisions. You might use it:

  • to assist in supporting clients;
  • to help with communications;
  • to help with management systems, policies and guidance;
  • to help with fundraising;
  • to help with presenting information.

Many community organisations run on trust and engagement between individuals. Talk about how using AI could support or create barriers to your relationships with people and other organisations. When is using AI not right for you? Using AI should serve the overall mission of your organisation. If it doesn’t, then why are you using it?

ChatGPT said: The return on investment for AI may be uncertain. The benefits gained might not justify the initial and ongoing costs, especially if the AI applications do not significantly enhance the charity’s operations.

Respect and Privacy

Nobody wants to be conned into thinking a machine is a real person, or to think a person did something when in fact it was a machine. How are you going to be open and transparent about the ways that you are using AI? It’s important to be open with all your stakeholders (members, clients, staff, board, funders etc) about when and how you are using AI.

How are you going to ensure that the privacy of anyone interacting with AI is protected? What are you doing to make certain you are knowledgeable and assured in your AI use? Can you create ‘opt-out’ provisions for clients or stakeholders, and what would it mean for them if they don’t want to engage with AI?

ChatGPT said:  If an AI tool influences a decision or recommendation, explain how and why. This helps everyone understand the role AI is playing.

Ethical and Responsible Use

AI is flawed. It is being built by people who live in flawed societies. AI has been shown to create images and materials which are stereotypical and discriminatory. It’s very important to think about this when considering images or materials provided by AI. How are you going to be sure that your AI use supports you to be fair and equitable, and that your organisational values are being upheld?

AI is amazing! It’s tempting to be in awe of how it can make things easier, but remember, it is flawed. It can spit out things that sound like facts but are not reality, and sadly it stereotypes and perpetuates discrimination. If we are going to use AI we need to think about risks and how we are going to manage them. How are we going to ‘bake in’ human oversight, critical assessment and quality control? How will we ensure the fairness and accuracy of materials or systems we have used AI to create? AI is a tool and, just as with any other tool, we need to evaluate its effectiveness, gather feedback and plan for continuous assessment and improvement.

ChatGPT said: Ensure AI is used in a manner that respects all staff members and promotes inclusivity. Avoid using AI in ways that could lead to unfair treatment or bias.

Support, Training and Resources

We need help! AI is such a fast-developing technology. Community organisations need to support their teams and stakeholders to understand its use and be effective. Policies and guides can be a massive help, but we need other types of support too. Think about group discussions, training courses, webinars and reflective feedback, as well as practical things like platform licences and software.

ChatGPT said: AI technology is always evolving. Stay informed and adjust your practices to be sure it serves you.
