By Thomas Claburn
Your job may not support BYOD, but how about BYOC? Microsoft has declared that people can bring their personal Microsoft 365 subscriptions to work to access various Copilot features at companies that fail to provide an AI fix.
Redmond has done so unilaterally, effectively endorsing “shadow IT” – the practice of bringing unapproved software and devices into the workplace.
Earlier this year, Microsoft said it had adopted a new approach to shadow IT. “While earlier eras of our IT history focused on trying to prevent shadow IT, we are now concentrating on managing it,” the biz said in a blog post. By “managing,” Microsoft also means “enabling.”
Samer Baroudi, senior product marketing manager at Microsoft, insists this is for your own good.
“This offers a safer alternative to other bring-your-own-AI scenarios, and empowers users with Copilot in their daily jobs while keeping IT firmly in control and all enterprise data protections intact,” Baroudi explained in a blog post.
Makers of competing AI products might disagree.
Microsoft says employees can sign into Microsoft 365 apps with both personal and work accounts, and can now use Copilot features from their personal plan (Personal, Family, or Premium) on business documents – even if their work account lacks a Copilot license.
IT admins miffed at having their authority usurped by a diktat from Redmond can console themselves with the knowledge that Copilot’s level of access “is strictly governed by the user’s work account permissions, ensuring enterprise data remains protected.” The user’s Entra (work) identity governs file permissions and access controls.
Also, “IT retains full control and oversight” – apart from the bit about allowing this to happen in the first place.
Admins can disallow personal Copilot usage on work documents using cloud policy controls. They can also audit personal Copilot interactions and apply enterprise identity, permission, and compliance policies.
Government tenants (GCC/DoD) for some reason don’t support this capability, the one that Baroudi insists “does not create new data exposure risks.”
Meanwhile, employees who decide to fire up their personal Copilot accounts within the workplace should be mindful that their prompts and responses will be captured by their employer.
As to why Microsoft would bother, Baroudi provides a hint in the FAQs detailing the bring-your-own-Copilot-to-work initiative that accompany his post.
Can use of Copilot from personal Microsoft 365 subscriptions help drive AI adoption?
Yes. It allows users to experience AI productivity benefits while IT retains control.
Of course, when Microsoft next cites enterprise adoption statistics for its AI products, it will be worth asking whether the company is counting personal usage of Copilot.