Anthropic does a somewhat similar thing. If you visit their ToS (the one for Max/Pro plans) from a European IP address, they replace one section with this:
Non-commercial use only. You agree not to use our Services for any commercial or business purposes and we (and our Providers) have no liability to you for any loss of profit, loss of business, business interruption, or loss of business opportunity.
It's funny that a plan called "Pro" cannot be used professionally.
Lawyers are playing Calvinball again. I have no idea why the law finds this kind of argumentation compelling: "I clearly and intentionally deceived, but I stashed some bullshit legalese in a document no one will read, so my deception is completely OK."
As far as I can tell, this only applies to the free personal plan, not to any of the business offerings (i.e. not Copilot for M365), and GitHub Copilot is under a separate set of terms.
“These Terms don’t apply to Microsoft 365 Copilot apps or services unless that specific app or service says that these Terms apply.”
Think of Copilot being a suite of different products under the same overall banner and it starts to make (a bit) more sense.
I can hear the lawyers huddled around a conference table, rolling the bones and chanting the sacred words to come up with that "get out of trouble free" card. "It told your son he had terminal cancer and that he should kill himself..." Sorry, it clearly says it's for Entertainment Purposes Only.
Go to https://www.copilot.com/ and ask a question. You'll see from the answers that it is indeed for entertainment only. It is ridiculously behind ChatGPT, and I don't know how that can happen since Microsoft has access to the same models.
FYI: This is only for the "Cortana replacement" Copilot, not the other Copilots. This language doesn't appear in GitHub Copilot's Consumer Agreement, for example.
we can’t promise that any Copilot’s Responses won’t infringe someone else’s rights (like their copyrights, trademarks, or rights of privacy) or defame them.
You agree to indemnify us and hold us harmless (including our affiliates, employees and any other agents) from and against any claims, losses, and expenses (including attorneys' fees) arising from or relating to your use of Copilot
How does this affect Copilot in VS 2022 / VS 2026? Because this is kind of insulting to a professional. I really wish Microsoft would learn to name things correctly. There's Copilot the ChatGPT-like service, and then there's Copilot for Visual Studio, which is not the same thing as far as I can tell.
https://www.anthropic.com/legal/consumer-terms
> IMPORTANT DISCLOSURES & WARNINGS
Tells us:
> You may stop using Copilot at any time.
That's an odd thing to include in a ToS.
- "Are you for entertainment purposes only?"
- "Not at all — unless you want me to be. The short version: I'm not 'for entertainment only.'"
Edit: OK, I see it's legal framing to avoid liability, but can they just do that via the ToS while the tool itself claims something else?
we can’t promise that any Copilot’s Responses won’t infringe someone else’s rights (like their copyrights, trademarks, or rights of privacy) or defame them.
You agree to indemnify us and hold us harmless (including our affiliates, employees and any other agents) from and against any claims, losses, and expenses (including attorneys' fees) arising from or relating to your use of Copilot
Says the bot based on scraped data