OpenAI is nudging ChatGPT further into the workplace with a trio of new features designed for real team workflows: shared projects that remember group context, native connectors to popular productivity apps, and enterprise-grade security controls for administrators. The end product is a chatbot that operates more like a shared work surface than a one-time prompt box.
Shared Projects: Persistent Team Memory
The new shared projects mode lets teams create a persistent workspace where ChatGPT maintains context over time and across people.

Documents, datasets, brand guidelines: whatever you upload to a project, the assistant will ground its responses in that material for every member of your team.
Access works like a cloud doc: project owners invite by link or email and assign chat-only or edit permissions. That structure makes it possible to keep long-running threads around client deliverables, proposals, incident retrospectives, or product specs without losing history as teammates cycle in and out.
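OpenAI hasn't published the underlying access model, but conceptually it resembles a simple role map. A hypothetical sketch (names and fields are illustrative, not OpenAI's API):

```python
from dataclasses import dataclass, field

@dataclass
class SharedProject:
    """Hypothetical cloud-doc-style access: an owner invites members
    and assigns each either "chat" (read/ask) or "edit" permission."""
    owner: str
    members: dict = field(default_factory=dict)  # email -> "chat" | "edit"

    def invite(self, email: str, role: str = "chat") -> None:
        if role not in ("chat", "edit"):
            raise ValueError(f"unknown role: {role}")
        self.members[email] = role

    def can_edit(self, email: str) -> bool:
        # Owners always edit; members only if explicitly granted.
        return email == self.owner or self.members.get(email) == "edit"

project = SharedProject(owner="pm@example.com")
project.invite("writer@example.com", role="edit")
project.invite("reviewer@example.com")           # defaults to chat-only
print(project.can_edit("writer@example.com"))    # True
print(project.can_edit("reviewer@example.com"))  # False
```

The two-tier split mirrors how shared documents usually work: everyone in the project sees the same context, but only editors can change it.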
In early tests featured by OpenAI, teams have used shared projects to keep proposals and contracts in lockstep, keep outbound communications consistent in tone and style, and coauthor work from a knowledge base that stays synchronized as new information arrives. In practice, a marketing team might anchor each phase of drafting to an approved style guide, or a legal department might maintain an up-to-date clause library the model references automatically.
Connectors Turn ChatGPT Into a Central Work Hub
ChatGPT now integrates with Gmail, Google Calendar, Microsoft Outlook, Microsoft Teams, SharePoint, GitHub, Dropbox, and Box. Once permissions are granted, the assistant can retrieve contextually relevant information from these sources as needed and produce summaries, action lists, schedules, or cross-app updates.
OpenAI says the system can automatically choose the correct connector, "which requires significant cross-part knowledge beyond just string matching and doesn’t require being micromanaged." Ask about the current status of a client project, and ChatGPT can now scan recent emails and calendar invites, reconcile them with files saved in the linked project’s folder, and return a short update with next steps, all without copy-pasting between tabs.
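OpenAI hasn't disclosed how routing works, and its own description stresses that it goes beyond string matching. Purely to show the shape of the problem, here is a deliberately naive keyword-overlap router; the connector names and hint sets are made up for illustration:

```python
# Toy connector router: score each source by keyword overlap with the query.
# A real system would use semantic signals, not this bag-of-words heuristic.
CONNECTOR_HINTS = {
    "gmail":    {"email", "thread", "reply", "inbox"},
    "calendar": {"meeting", "invite", "schedule", "availability"},
    "github":   {"issue", "pull", "commit", "repo", "sprint"},
    "dropbox":  {"file", "folder", "document", "deck"},
}

def pick_connectors(query: str, top_n: int = 2) -> list[str]:
    words = set(query.lower().split())
    scores = {name: len(words & hints) for name, hints in CONNECTOR_HINTS.items()}
    ranked = sorted(scores, key=scores.get, reverse=True)
    # Only return connectors that matched at all.
    return [name for name in ranked if scores[name] > 0][:top_n]

print(pick_connectors("summarize github issue activity for the sprint"))
# → ['github']
```

The gap between this sketch and production routing is exactly where OpenAI claims its advantage: knowing that "the client project" implicates email, calendar, and file storage at once, even when none of those words appear in the query.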
This is part of a wider industry push to make AI assistants the gateway to corporate data. Competitors are moving in the same direction: Anthropic recently bolstered Claude’s connectors for design and task tools, while Zoom’s AI Companion links to an ever-growing list of third-party apps. The differentiators will emerge in retrieval accuracy, speed, and how well assistants respect enterprise governance boundaries.

Security Certifications And Admin Controls
For security-focused customers, OpenAI has added certifications and controls to ease the buying process. The company says ChatGPT is now certified against ISO/IEC 27001, 27017, 27018, and 27701, and also meets SOC 2 requirements for Security, Confidentiality, Availability, and Privacy. That is the bar enterprise IT expects for sensitive data and regulated workloads.
For Enterprise and Edu plans, admins get granular control over which users and groups can access each connector. Optional IP allowlisting can block requests even when a user’s credentials are valid. Together with single sign-on and least-privilege permissions, those controls make it easier to plug ChatGPT into existing identity, DLP, and audit workflows.
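The allowlisting logic itself is standard network gating: valid credentials alone are not enough, the source address must also fall inside an approved range. A minimal sketch using Python's standard library, with placeholder CIDR ranges (not OpenAI defaults):

```python
import ipaddress

# Illustrative allowlist: an office egress range and a VPN range.
ALLOWED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),   # example office range
    ipaddress.ip_network("198.51.100.0/24"),  # example VPN range
]

def request_permitted(client_ip: str, credentials_valid: bool) -> bool:
    """Reject unless credentials check out AND the IP is on the allowlist."""
    if not credentials_valid:
        return False
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETWORKS)

print(request_permitted("203.0.113.42", credentials_valid=True))  # True
print(request_permitted("192.0.2.7", credentials_valid=True))     # False: off-network
```

The practical effect is that a stolen password is useless from outside the corporate network or VPN, which is why IP allowlisting pairs well with single sign-on rather than replacing it.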
Practically, that means a finance department can restrict email and storage connectors to a small set of preapproved users, route compliance reviews through a shared project space, and ensure every query originates from company networks. It also reduces the risk of data spills when working with contractors or other external partners.
Availability And What It Means For Teams
Shared projects are rolling out first to Business, Enterprise, and Edu plans, before reaching Go, Plus, Pro, and Free accounts. The new connectors arrive alongside them and can be enabled or restricted by role or department.
The timing matters. Enterprises are operationalizing generative AI at breakneck speed, and analyst firms predict that model APIs and embedded assistants will move from a niche footing today to near-universal adoption among organizations. Vendors are competing to become the control plane for that shift: Microsoft and AWS are both building marketplaces for AI agents and apps, while OpenAI is assembling a stack that combines authoring, retrieval, and governance.
For everyday users, the upgrades make life (read: busywork) easier. A project manager could request a weekly digest pulled from Gmail and Teams, a developer might have GitHub issues summarized into a sprint plan, and a sales lead could generate a tailored proposal by drawing on pricing sheets and case studies in a shared project. The point is not to dazzle with a newer model or flashier features, but to weave the assistant into the data and tools people already use, without ceding control.
