Of course! Head over to the self-hosting guide to get started.
Whether you use the cloud or the community version, the IDE integration lets you control the data you share (you can even cut it off entirely). By default, we collect your GitHub user ID and guideline-editing events to improve the product. Additional details are available here.
The cloud version doesn’t require you to orchestrate any additional services: install the VSCode extension and you’re good to go. It always ships with our best-performing models by default. The community edition gets experimental features a bit later, so we can make sure the service runs locally without damaging your hardware or slowing it down.
For now, we support VSCode, and we’ll gradually add support for other development environments.
The cloud version handles this for you. For the community version, the choice is up to you! The best options are the OpenAI API (GPT-3.5 Turbo and more recent) or locally-run open-source models. We plan to add support for additional LLM APIs; among open-source models, here are the ones we recommend: Starling LM 7B and Mistral 7B Instruct v0.2.
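One reason both options work side by side: most local model servers (Ollama, llama.cpp server, and others) expose an OpenAI-compatible chat-completions endpoint, so switching backends is mostly a matter of changing the base URL and model name. The sketch below illustrates that shared request format; the function and constant names are illustrative, not Quack’s actual API, and the local URL assumes a default Ollama setup.

```python
# Illustrative sketch: the same chat-completions payload works against
# the hosted OpenAI API and a local OpenAI-compatible server.
# build_chat_request and the base URLs below are hypothetical examples.

OPENAI_BASE = "https://api.openai.com/v1"    # hosted option
LOCAL_BASE = "http://localhost:11434/v1"     # e.g. a default Ollama server (assumption)

def build_chat_request(prompt: str, model: str = "gpt-3.5-turbo") -> dict:
    """Build an OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }

# Hosted backend:
hosted = build_chat_request("Review this function for style issues.")

# Local open-source backend: same payload shape, different model name,
# sent to LOCAL_BASE instead of OPENAI_BASE.
local = build_chat_request(
    "Review this function for style issues.",
    model="mistral:7b-instruct-v0.2",
)
```

Because the payload shape is identical, only the endpoint and model identifier need to change when moving between the OpenAI API and a locally-run model.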
You could see Quack as GitHub Copilot with a consistent, dynamic context of your team’s software development culture. We help you establish clear, comprehensive expectations and seamlessly embed them in every keystroke. If you’re coding on your own, though, you’ll notice little difference in benefits.