Yes, your data is secure when you use the OpenAI API (which is the case within the botx platform). OpenAI places a high priority on data security and has implemented robust measures to protect user data. The models, like ChatGPT, do not retain information about individual queries. Every question you ask is processed without access to past interactions. This design ensures that the data you send to the API isn't stored or remembered by the model. More at https://openai.com/enterprise-privacy
If you are using an on-premise version of the model, the data remains fully under your control and doesn't leave your infrastructure. This ensures an added layer of data protection, as you have full oversight and governance of data storage, processing, and transmission.
Yes, you currently need your own OpenAI API key to use OpenAI models such as GPT-4 or GPT-3.5 Turbo. We are working on a solution that will let you use our services without your own API key; that feature is likely coming later this year.
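Supplying your own key typically means reading it from an environment variable rather than hard-coding it, and attaching it to each request. A minimal sketch, assuming the standard `OPENAI_API_KEY` environment variable; the helper name and message are illustrative, not part of the botx platform:

```python
import os

def build_chat_request(user_message, model="gpt-3.5-turbo"):
    """Illustrative helper: builds the JSON payload for an OpenAI
    Chat Completions request using one of the supported models."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

# Your own API key is read from the environment, never committed to code.
api_key = os.environ.get("OPENAI_API_KEY")

payload = build_chat_request("Hello!")
```

In practice this payload is POSTed to `https://api.openai.com/v1/chat/completions` with the key sent in an `Authorization: Bearer <key>` header.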
We support OpenAI chat models and the open-source Llama 2 models (70B, 13B, and 7B).
We are working on integrating Bard and Mistral, as well as letting you use other open-source models available through Replicate.
Beyond that, we can launch most other open-source models for you in secure on-premise environments.
Our pricing lets you pick a monthly subscription plan. If you exceed your monthly limits, you will be billed on a pay-as-you-go basis.
Get in touch; we are happy to help.