CCA Portal

Responsible Use of Generative AI at CCA

Last updated on Oct 24, 2025


New Generative AI tools and platforms are rolling out faster than any previous technology. While the capabilities of these tools are very compelling, they raise significant and specific concerns around security, privacy, accuracy, and bias.

The following guidelines for the use of Generative AI at CCA are intended to promote best practices for using generative AI in a way that protects everyone's personal and institutional privacy and security.

Please note that while this document provides best practices and example uses, you should also refer to CCA policies that address allowable and prohibited use of generative AI, such as the Acceptable Use of Campus Technology (AUP) and Information Security Policy.

Data privacy and security

While Generative AI may be new, it is just one of many technologies in use at CCA. As with all technologies, the use of AI is subject to CCA’s Acceptable Use of Campus Technology (AUP) and Information Security Policy.

Additionally, the use of generative AI brings with it specific security and privacy concerns. Most Generative AI tools learn from the information shared with them, incorporating this content into their data models. This may lead to the information being reused and exposed in unpredictable ways.

For example, an instructor wants to quickly gather examples of artwork that students in their class have shared online. They consider uploading their class roster to ChatGPT and asking it to identify which students have posted art publicly. This would be a clear violation of FERPA and CCA’s Acceptable Use Policy, as student enrollment information is protected and may not be shared with public tools. Even if the AI tool doesn’t explicitly share this information with others, uploading such data to a third-party platform risks exposure, storage, or misuse beyond the college’s control.

Unless you are using AI tools specifically noted in our policy and below, you must not provide sensitive or confidential information to AI platforms or tools. We have entered into contracts with the vendors of the AI tools listed below that specifically exclude the data you provide from being used to train their models or shared externally.

Note that even though we specifically allow you to share CCA data while using the tools listed below, this allowance ONLY applies when you use these tools while logged into your @cca.edu account. Using these tools under a personal account does not provide the same data security protections.

Using a personal account

If you are using a generative AI tool with a personal account, such as your personal Gmail or Yahoo account, you are prohibited from providing it with CCA data (as with any online application). You may ONLY share CCA information that is already publicly available, for example, academic program descriptions on cca.edu or course descriptions posted publicly on https://portal.cca.edu. This protects not only the privacy and security of CCA data, but you as well.

For example, suppose you are a manager and need to compose a performance improvement letter for an employee. Because you already have an established ChatGPT profile under your @gmail.com account, you provide the details to ChatGPT and ask it to draft the letter. Later, during a legal or HR-related review, it is discovered that you used your personal account in this way. As a result, your personal account, including all of your ChatGPT threads, could become subject to investigation or disclosure.

The same holds true when using your CCA account for personal work. Your CCA account is always subject to revocation, monitoring, and investigation. Avoid losing valuable personal work when you depart CCA, or having your personal information swept into CCA litigation proceedings.

As a general best practice, use your CCA account for CCA work, and your personal account for non-CCA work. This is not only true for AI tools, but also your email, document management (e.g. Google Drive), and other online tools.

Accuracy

Even the best AI tools can generate inaccurate output. It is your responsibility to review, understand, and validate the output, particularly before sharing it with others or reusing that content in other works. “Hallucinations” are a common phenomenon in AI-generated content, where false, nonsensical, or ungrounded information is presented as fact.

Most generative AI tools provide direct links to the sources they used to compose a response. Check those sources, and verify that the information is still current and that the summary is accurate. For example, AI-generated content could be derived from a website cited as an example of what NOT to do, from an untrustworthy source, or from a vendor promoting its own products.

Bias

AI tools reflect the biases inherent in the models used to train them. Be particularly aware of overly simplified or biased AI-generated output that may omit important data or reinforce harmful stereotypes.

Generative AI tools are trained on large datasets that may reflect existing cultural, racial, and gender biases. This can lead to outputs that reinforce stereotypes or present a narrow perspective. For instance, image generation tools have been shown to default to Western beauty standards or to underrepresent people of color and non-Western settings. Similarly, text-based AI may unintentionally mirror biased language patterns in job descriptions, healthcare data, or crime statistics.

When working with generative AI, it’s important to review results critically and consider whose perspectives may be missing or misrepresented. Avoid assuming that AI output is neutral or comprehensive.

Clearly identify AI-generated output

When appropriate, clearly identify AI-generated output and cite the source, allowing others to trust and validate the output.

For example, at the bottom or footnote of a letter drafted in AI, state “This letter was initially drafted using Google Gemini AI, and subsequently reviewed and revised.”

Generative AI and Images

Generative AI tools can produce visual, audio, and written content. However, they raise ongoing concerns related to copyright, attribution, and the reuse of creative work without consent. The legal status of AI-generated images and media remains uncertain. In most cases, this type of content is not considered copyrightable.

At this time, we recommend that AI-generated images and creative content not be used in public-facing materials. This includes marketing, exhibitions, and external communications. Internal use is permitted, as long as the content is clearly labeled as AI-generated and includes attribution to the tool or source used.

Whenever possible, we encourage the use of work created by people, especially members of the CCA community. Always ensure that proper permission is obtained.

Recording content with generative AI

CCA currently allows only one AI-enabled tool for meeting recording and summarization: Zoom AI Companion. This is the only approved platform under CCA’s contractual agreements that meets our privacy and data protection standards. Other AI tools (such as Otter.ai, Fireflies, or ChatGPT) may not be used to record or summarize meetings containing CCA data.

When using Zoom AI Companion:

Always log in with your @cca.edu Zoom account.

Verbally inform participants that the meeting is being recorded and summarize how the recording or summary will be used. Respect any participant's objections to being recorded or summarized by AI.

Meeting transcripts and summaries generated using Zoom AI Companion may still include sensitive or confidential content. This material must be handled according to CCA’s Acceptable Use Policy (AUP) and data handling requirements.

Recording Classes or Student Meetings:

Instructors may not record student classes or meetings using Zoom or any other application without prior consent and compliance with CCA’s policies. If recording an in-person class or event, you must inform participants and explain how the recording will be shared.

Please refer to the Audio Recording / Note-Taking Assistance page.

Allowed Generative AI Tools at CCA

Many of CCA’s systems, and systems available to you through your CCA account, have introduced AI tools, including Workday, Mural, and JSTOR AI research tools. These are allowed, but may not be used to record meetings or to process confidential information. See below for the generative AI tools that CAN be used to record meetings or process confidential information.

Zoom AI Companion

Zoom’s AI Companion is currently available to all staff and faculty at CCA. In order to use Zoom’s AI Companion under our contractual agreement, you must be logged into Zoom with your CCA account.

To ensure the security and privacy of all the participants, this is the only AI tool that is allowed to record CCA meetings. Zoom's implementation of Generative AI and our contractual agreement with Zoom provides a high level of security and privacy that is consistent with our AUP.

Zoom AI Companion is an effective tool for recording meetings. It can automatically generate meeting transcripts and summaries, and offers a growing number of features, including smart recordings and in-meeting questions. More information about using Zoom AI Companion features can be found here.

Google Gemini

When you need a general purpose Generative AI tool, we strongly recommend you use Google Gemini for Education, available to all CCA staff and faculty at https://gemini.google.com. Using Google Gemini for Education while logged into your CCA Google Workspace account ensures that your prompts and generated content are not used to train Google’s AI platform.

Additionally, Google provides tools to help you evaluate the accuracy of Gemini's responses. Look for “Sources and related content” and the “Double Check” button following your results.

As Google says, “Gemini may display inaccurate info, including about people, so double-check its responses.”

NotebookLM

NotebookLM is another Google AI product available to CCA staff and faculty at https://notebooklm.google.com/. NotebookLM is an AI-powered research and note-taking tool. Its core strength is its ability to act as a "virtual research assistant" that is grounded exclusively in the sources you provide. It doesn't pull information from the broader internet, which helps prevent "hallucinations" and ensures that its outputs are verifiable against your own materials.

At its heart, NotebookLM is designed to help you quickly understand, organize, and synthesize information from a variety of documents. Its content summarization can also be used to generate study guides, audio overviews, flashcards, quizzes, or FAQs.