Grok-os

Open source release of xAI's LLM

216 followers

This is the base model weights and network architecture of Grok-1, xAI's large language model. Grok-1 is a 314 billion parameter Mixture-of-Experts model trained from scratch by xAI.
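xAI's published reference code for Grok-1 is JAX-based, and the headline architectural detail is the Mixture-of-Experts layer: a router scores the experts for each token, only the top-scoring experts run, and their outputs are mixed by the (renormalised) router weights. The toy JAX sketch below is only an illustration of that routing idea; the dimensions, the names (NUM_EXPERTS, TOP_K, moe_layer), and the dense run-every-expert shortcut are assumptions for readability, not Grok-1's actual configuration or code.

```python
# Toy Mixture-of-Experts routing sketch in JAX.
# All sizes and names here are illustrative assumptions, not Grok-1's real config.
import jax
import jax.numpy as jnp

NUM_EXPERTS = 8   # assumed number of experts
TOP_K = 2         # assumed experts activated per token
D_MODEL = 16      # toy hidden size
D_FF = 32         # toy feed-forward size

def moe_layer(params, x):
    """Route each token to its top-k experts and combine their outputs.

    x: (tokens, d_model); params holds a router matrix and per-expert MLP weights.
    """
    # Router: one logit per expert for every token, softmaxed into weights.
    logits = x @ params["router"]                      # (tokens, num_experts)
    weights = jax.nn.softmax(logits, axis=-1)
    top_w, top_idx = jax.lax.top_k(weights, TOP_K)     # (tokens, k)
    # Renormalise the selected experts' weights so they sum to 1 per token.
    top_w = top_w / top_w.sum(axis=-1, keepdims=True)

    # Run every expert on every token (fine for a toy example; real MoE
    # implementations dispatch tokens to experts to avoid the wasted work).
    def expert(w1, w2):
        return jax.nn.gelu(x @ w1) @ w2                # (tokens, d_model)
    all_out = jnp.stack([expert(w1, w2) for w1, w2 in params["experts"]])

    # Gather each token's chosen experts and mix them with the router weights.
    tok = jnp.arange(x.shape[0])
    chosen = all_out[top_idx.T, tok[None, :], :]       # (k, tokens, d_model)
    return jnp.einsum("kt,ktd->td", top_w.T, chosen)   # (tokens, d_model)

key = jax.random.PRNGKey(0)
k_r, k_e = jax.random.split(key)
params = {
    "router": jax.random.normal(k_r, (D_MODEL, NUM_EXPERTS)) * 0.02,
    "experts": [
        (jax.random.normal(jax.random.fold_in(k_e, i), (D_MODEL, D_FF)) * 0.02,
         jax.random.normal(jax.random.fold_in(k_e, i + 100), (D_FF, D_MODEL)) * 0.02)
        for i in range(NUM_EXPERTS)
    ],
}
x = jax.random.normal(key, (4, D_MODEL))               # 4 toy tokens
print(moe_layer(params, x).shape)                      # (4, 16)
```

The point of the routing step is that only a fraction of the 314B total parameters is active for any given token, which is how an MoE model keeps per-token compute well below what a dense model of the same size would need.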

Chris Messina
To @aaronoleary's question, "Can Elon Musk challenge OpenAI?", apparently the answer is indeed yes.
Aaron O'Leary
@chrismessina appreciate you remembering my question!
Saif Khan
This is huge! If only I had the computing power to give it a try :(
Abidariaz Abida
Aoa, can I talk to you?
Daniel
Great hunt! I like that they open-sourced it, though not the reason behind open-sourcing it. Outside of that, it's awesome to see models of this size being released to the public. Interested to see what people will do with it!
Abhilash Chowdhary
Going to give this a try, team Grok-1. Looks interesting.
Bob Wilsey
Great find! I appreciate that they've open-sourced it, although the rationale behind doing so remains unclear. Nevertheless, it's fantastic to witness models of this magnitude being made available to the public. I'm curious to see what people will create with it!
Moneshkumar Natarajan
The open-source release of Grok-1, xAI's LLM, is a boon for those interested in advanced prediction and decision-making AI models. Despite its hardware demands (like multiple H200 GPUs), this unlocks access for experimentation and the creation of groundbreaking applications.
Borey Washington
I'd like to see a smaller version of this LLM that uses less memory; not everyone can afford a GPU machine with this much memory to run it.