Support for Groq chat completions is now baked into AIProxySwift. Groq completions are fast and free, but rate limited on Groq's end. At low volumes, Groq may work well for your use case, and the performance will impress your users.
Chat completions and streaming chat completions are both available, and are close to parity with OpenAI's request/response structure. We have given them separate structs in AIProxySwift because the two APIs diverge slightly, and will likely diverge further given the pace at which OpenAI ships new features. Take a look at `GroqChatCompletionRequestBody.swift` for docstrings on how to control your chat completions.
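As a rough sketch of what a chat completion looks like, the shape below follows the library's conventions for other providers. The service constructor, the `.init` parameter names, and the model string are assumptions here, not the canonical API; consult the snippets on the docs site for the exact form.

```swift
import AIProxy

// Hypothetical sketch: the constructor and parameter names are assumptions
// based on AIProxySwift's conventions for other providers. See the docs
// site for the canonical snippet.
let groqService = AIProxy.groqService(
    partialKey: "partial-key-from-your-aiproxy-dashboard",
    serviceURL: "service-url-from-your-aiproxy-dashboard"
)

do {
    let response = try await groqService.chatCompletionRequest(body: .init(
        messages: [.user(content: "Hello, Groq!")],
        model: "mixtral-8x7b-32768"  // assumed model name; pick one Groq hosts
    ))
    print(response.choices.first?.message.content ?? "")
} catch {
    // Groq rate limits requests on their end, so handle failures gracefully.
    print("Could not create Groq chat completion: \(error)")
}
```

The streaming variant follows the same request body; see `GroqChatCompletionRequestBody.swift` for the full set of controls.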
Groq support ships in AIProxySwift version 0.25.0 and later. You can find snippets to get started on our docs site or in the PR description here.
To update to this version of AIProxySwift, see the How to update the package section of the README.