What is Groq Chat?
This alpha demo lets you experience ultra-low latency performance using the foundational LLM, Llama 2 70B (created by Meta AI), running on the Groq LPU™ Inference Engine.
Maker shoutouts
Storipress uses Groq to power our 'Grammarly for brand voice and compliance' feature, which detects text snippets that breach your style guide and suggests changes.
Recent launches
Groq®
An LPU™ (Language Processing Unit) Inference Engine is a new type of end-to-end processing system that delivers the fastest inference, at roughly 500 tokens per second.
Groq Chat