OpenAI LLM
Same as with Archbee LLM, the OpenAI integration will have access to both AI Addon features (see /ai-addon.md).

Security and privacy

If you want to disable the LLM (Archbee LLM or OpenAI), go to https://app.archbee.com/settings/team, open Integrations, and select LLM Disabled under the Large Language Models tab.

If you want to keep one or more spaces private, go to Space Settings, open the Large Language Model tab, and disable the LLM integration. This keeps that space out of the LLM indexing, and all information inside it remains private.

Once activated, our OpenAI integration indexes your content in the background. To optimize the cost of creating and updating your embeddings, they are updated once every 30 minutes, not as you type. The generative search function is then enabled.

On top of that, OpenAI has clarified its policy for embeddings, as described at https://openai.com/policies/api-data-usage-policies.

Extract from the OpenAI data usage policies (as of 10.05.2023): "Starting on March 1, 2023, we are making two changes to our data usage and retention policies: OpenAI will not use data submitted by customers via our API to train or improve our models, unless you explicitly decide to share your data with us for this purpose. You can opt in to share data. Any data sent through the API will be retained for abuse and misuse monitoring purposes for a maximum of 30 days, after which it will be deleted (unless otherwise required by law)."

How to configure the OpenAI integration

- Go to https://app.archbee.com/settings/team, open Integrations, and select the LLM model OpenAI.
- Insert your OpenAI API key in the box and hit Save.
- Enjoy the AI Addon features (see /ai-addon.md).

How does the OpenAI integration work?

It's a feature available for internal and customer-facing docs, opt-in at the space level.

When the OpenAI integration is activated, your team or customers can use the search function to ask questions like "How do I get started with your public API? Can you give me an example in Java or Haskell?" and get back an answer that feels human and is grounded in real context about your team's work or product.

So what are some benefits?

- Automated, quicker onboarding of employees and collaborators, with questions answered on the spot from your internal knowledge base and wikis;
- Fewer support tickets, because users can ask the docs directly. This means your CS people focus more on building and maintaining an accurate knowledge base that feeds into LLMs, and less on answering questions individually.

What is the cost?

We don't charge for the extra functionality of integrating LLMs like OpenAI GPT. Since we ask you for the API keys, you will be charged directly by OpenAI; take a look at their pricing at https://openai.com/pricing. We mostly use the embeddings API: https://platform.openai.com/docs/guides/embeddings/what-are-embeddings
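
For context on what the embeddings API is and how it is billed, here is a minimal sketch of calling OpenAI's embeddings endpoint with the official openai Python SDK. This is an illustration only, not Archbee's actual indexing code; the model name and sample text are assumptions.

```python
# Minimal sketch: creating an embedding for a piece of documentation text
# with the official `openai` Python SDK (v1.x). Illustrative only; this is
# not Archbee's indexing pipeline, and the model name is an assumption.
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # the same API key you paste into Archbee's settings

response = client.embeddings.create(
    model="text-embedding-ada-002",  # assumed embedding model
    input="How do I get started with the public API?",
)

vector = response.data[0].embedding
print(len(vector))  # dimensionality of the returned embedding vector
```

Because the embeddings endpoint is billed per token sent to it, re-indexing content every 30 minutes rather than on every keystroke keeps the cost of maintaining your index low.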