Microsoft has begun reminding users to be careful with its AI tools after controversy over their accuracy. The company has updated its service agreement to make clear that its AI tools should be viewed as an aid rather than a substitute for professional advice.
The new terms, which take effect at the end of next month, specifically highlight the limitations of its health chatbot, noting that over-reliance by users on the advice it provides could pose a risk.
Microsoft makes it clear that AI cannot replace professionals. Its revised terms specifically address the limitations of its AI assistance features: "AI services are not designed, intended or used as a substitute for professional advice." The company also added that the health chatbot "is not designed or intended to be a substitute for professional medical advice or for the diagnosis, treatment, mitigation, prevention or management of disease or other conditions."
The agreement also reiterates that, under Bing's terms of use, Copilot AI Experiences must not be used to extract data through crawling, scraping, or similar methods unless expressly authorized by Microsoft.
Additionally, IT House notes that the update imposes tighter restrictions on reverse engineering of AI models and adds other protections: "AI services may not be used to discover models, algorithms, and any underlying components of the system." Microsoft also prohibits using data from its AI services to create or train other AI services.
The changes to the AI terms show that Microsoft is responding to potential liability issues and more clearly managing user expectations. They are also a reminder that AI technology is unlikely to replace human professionals in the short term.