Microsoft Plans to Sell a Private Version of ChatGPT for Security-Conscious Customers

The company aims to address data security concerns by developing a private version of OpenAI's ChatGPT, targeted at industries that prioritize data protection.


In response to growing concerns among businesses over the potential for sensitive data leaks, Microsoft is planning to sell a private version of OpenAI's popular ChatGPT, according to a recent report from The Information. The offering will cater to companies hesitant to adopt ChatGPT, particularly those in heavily regulated industries like banking and healthcare that worry about inadvertently sharing proprietary information with the chatbot.

Microsoft's new solution will involve running a version of ChatGPT on dedicated cloud servers, ensuring that each customer's data remains separate from others. While this private version will provide an extra layer of security, it comes with a significant price increase—potentially up to 10 times more than the regular version of ChatGPT.

Likely priced at several cents per token, Microsoft's private ChatGPT would carry a significant markup over the regular version's fraction of a cent per token. That premium reflects what businesses are prepared to pay for increased security and data protection. Several financial institutions are already testing the product, underscoring strong demand for a more secure ChatGPT offering, and Microsoft salespeople have reportedly fielded inquiries from financial institutions and healthcare providers seeking a private version of the service.

The Information's report comes as OpenAI has hinted at a similar private ChatGPT offering aimed at businesses. Morgan Stanley has already signed on as an early customer. This move illustrates the potential market for privacy-focused versions of ChatGPT in industries that prioritize data security.

Corporate concerns regarding data leakage stem from the fact that ChatGPT is trained on vast amounts of text from the internet and user interactions. Although OpenAI has recently changed its privacy policy to stop using customer data for training its software by default and introduced more privacy controls, this measure has not fully assuaged the fears of companies in regulated industries.

Microsoft has been researching privacy-preserving methods for training OpenAI's machine-learning software since their partnership began in 2019. These efforts have led to the development of dedicated, isolated servers for training AI models, which have been found to protect against data leaks more effectively than shared servers.

With the introduction of this private ChatGPT offering, Microsoft aims to differentiate its services from OpenAI's while still selling the startup's software. The existing relationships many large customers, including banks, have with Azure could provide Microsoft with an advantage in convincing them that their data will be handled securely and in compliance with local regulations.

As companies increasingly seek to leverage AI-powered tools for various tasks, the demand for privacy-focused solutions will continue to grow. Microsoft's private ChatGPT is a step towards addressing these concerns and ensuring data security for businesses across regulated industries.
