United States: House staff banned from using Microsoft's Copilot AI
Broader U.S. Government AI regulatory efforts
Chief Administrative Officer of the House of Representatives Catherine Szpindor:
"The Microsoft Copilot application has been deemed a risk to users by the Office of Cybersecurity due to the potential leak of House data to cloud storage not approved by the House."
The US House of Representatives has issued a strict ban on staff use of Microsoft's Copilot AI chatbot. It is the latest example of the federal government navigating its own internal use of AI while also working to develop regulations for the fast-growing technology.
Copilot is Microsoft's AI assistant, built on technology from OpenAI, the creator of ChatGPT.
Microsoft offers free and paid versions for consumers, as well as a range of paid plans for businesses.
Microsoft's Copilot works as a standalone chatbot on the web and on mobile devices, and paid versions also work directly within Office applications such as Word, Excel, Outlook, and PowerPoint.
Previously, in June last year, the US House of Representatives also restricted employees to the paid version of ChatGPT, prohibiting use of the free version altogether.
These actions underscore a broader effort by the U.S. federal government to balance the internal adoption of AI technologies with the development of appropriate regulations for the sector.
The ban also reflects concerns similar to those in the corporate sector, where many organizations are wary of consumer-grade AI tools because of data privacy risks.
Many businesses are evaluating or purchasing enterprise versions that come with guarantees that their data will not be used to train AI models.
Microsoft's response to Copilot chatbot security concerns
Before the House announced it was blocking and disabling the Copilot chatbot on all House-operated devices, Microsoft had expressed its intention to introduce a set of AI tools designed specifically for government use, built to comply with strict federal data security standards.
Additionally, Microsoft recently launched Copilot for Security, a solution designed to support security teams in tackling cyber threats. With it, Microsoft aims to put the power of AI into the hands of IT and security professionals.
As artificial intelligence develops rapidly, most governments are trying to legislate for it and find ways to "live peacefully" with AI, a process that takes considerable time and testing. In that context, the ban on Microsoft's AI assistant is understandable.