These guidelines and FAQs do not replace or supersede agreements made with the university and vendors such as Microsoft Corporation. For more details about security, privacy, and additional documentation, visit these links:
What AI Products Should I Use?
Microsoft Copilot Chat with Enterprise Data Protection (EDP) is Calvin University's only approved Generative AI platform.
Ensure you are signed in to your Calvin University account before searching or uploading files to Copilot Chat. Once signed in to your Calvin account, a "green shield" icon indicates that you are using Copilot Chat with EDP.

Why Use Microsoft Copilot Chat?
As of October 2025, using Microsoft Copilot Chat with EDP ensures that Calvin University's data remains protected, private, and compliant with institutional and regulatory standards when used as intended. Unlike many publicly available AI platforms that may use your prompts, uploaded files, and interactions to train their models — potentially exposing sensitive data — Copilot Chat with EDP keeps all data within the Microsoft 365 ecosystem. This means your queries and files are not used to train large language models, not shared with third parties, and remain governed by Calvin's enterprise agreement with Microsoft. This level of data residency and security is critical for protecting student records, research, and institutional information.
Despite the protections offered by EDP and Microsoft, it is against university policy to upload any sensitive or personal information into any system not approved for such use; this policy includes any AI systems used by Calvin employees. Refer to Calvin's Information Security Policy for more details.
Generally, faculty, staff, and students logged in with a Microsoft license provided to them by the university have access to Copilot Chat with EDP. Always ensure that your Copilot session is protected with EDP by confirming that the "green shield" appears at the top of your screen.
Microsoft 365 Copilot with EDP provided by Calvin University is fundamentally different from the commercial, publicly available versions of Copilot and other generative AI tools. The enterprise version is integrated into Calvin University's Microsoft 365 environment, meaning:
- Your data stays within Calvin's tenant — it is not shared with Microsoft or used to train public models.
- Prompts, files, and interactions are protected by enterprise-grade security controls, including encryption, access logging, and compliance with regulations like FERPA and GLBA, when used as intended.
- Microsoft 365 Copilot with EDP prevents data leakage: unlike free AI tools, which may retain and use your inputs to improve their models, it does not use your data for training or expose it to external systems.
Using free versions of Copilot or other AI platforms without an enterprise agreement often means:
- Your prompts and uploaded files may be stored, analyzed, and used to train the provider's models.
- You may lose control over your data once it's submitted.
- There are few guarantees about privacy, data residency (where data is physically stored), or regulatory compliance.
For Calvin faculty, staff, and students, using Microsoft 365 Copilot Chat with EDP ensures that AI use aligns with institutional policies and protects sensitive information.
Files may be uploaded to Copilot Chat with EDP, with discretion. Files without sensitive data can be analyzed, but be aware that any uploaded file is copied to your individual OneDrive folder. Microsoft states that no uploaded files are used to train language models.
Do not register with other AI platforms using your Calvin email address.
AI and Privacy: How to Protect Information
Creators of AI systems such as ChatGPT and other large language models (LLMs) use data from across the internet to train their systems, including the data, prompts, and other information added by end users.
Please review the Calvin University Information Security Policy before interacting with AI.
You can protect student information by handling all student data in accordance with university policy and applicable regulations such as FERPA.
- Avoid querying any AI with student information or adding student data to AI models, including ChatGPT and Microsoft Copilot Chat.
- Avoid adding student work, student data, or student information to non-Calvin approved systems or platforms, including AI detection tools.
- Use approved tools only. Students should generally not be required to sign up for third-party platforms outside of the applications approved or provided by CIT, including third-party AI platforms like ChatGPT. Students already have access to Microsoft Copilot Chat through the university's Microsoft 365 environment.
You can protect university data by keeping it within approved and secure systems. Avoid uploading or entering protected data — such as employee, payroll, financial, legal, contract, or partner data — into AI tools, including Microsoft Copilot Chat.
You can protect your own information in many ways, including:
- Before signing up for a new AI service, consider whether Microsoft Copilot Chat with EDP would be a safer alternative.
- Be careful about uploading your own work product or copyrighted materials into tools other than Microsoft 365 Copilot Chat; this includes academic, teaching, research, and similar materials.
What Else Should I Know About Generative AI and Copilot Chat?
As of October 2025, Calvin University has decided to leverage tools already available through CIT rather than building its own LLM. The university relies on enterprise-wide agreements, such as its agreement with Microsoft for Copilot Chat with EDP.
Files uploaded to Microsoft Copilot Chat are saved in the OneDrive account of the user who uploaded them.
Oversharing occurs when data is shared with people who shouldn't have access to it. This often happens when file permissions are set more broadly than intended.
In Copilot Chat, only the person who uploads a file will have it stored in their OneDrive. However, oversharing can still occur if that file, or any query results containing its information, are shared with others.
Microsoft 365 is designed so that users only see what they already have permission to access, or what's shared with them in the future.
AI for Different Roles: How Might Writers, Learners, and Researchers Use AI?
For a full overview of Calvin's Rhetoric Across the Curriculum discussion of generative AI for writers, including syllabus templates, review the following:
Use Microsoft Copilot Chat if an AI platform is needed. Requiring that students use additional services not approved or provided by CIT (such as ChatGPT), even for assignments, is not recommended.
Please review the Calvin University Information Security Policy before interacting with AI.
Generative AI offers opportunities for innovative academic research. When conducting research, be sure to follow Calvin University's IRB standards.