You can now modify Grok-1
xAI, the company behind Grok-1, has released the AI openly on GitHub. The company announced the release on its official website on March 17th and published instructions for using the model. In practical terms, developers can now build on Grok-1's foundation and extend its capabilities. For example, you could integrate it with an image recognition system so that users can aim their phones at historical monuments and see information about them.
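As a minimal sketch of getting started, the weights are also mirrored on Hugging Face. The snippet below assumes the "xai-org/grok-1" repo id and the "ckpt-0" checkpoint directory mentioned in the project's release notes; the GitHub repository remains the authoritative source for download instructions.

```python
# Sketch only: repo id and checkpoint path are assumptions taken from public
# release notes and may change; check the xAI GitHub README before relying on them.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="xai-org/grok-1",   # assumed Hugging Face mirror of the weights
    allow_patterns="ckpt-0/*",  # the released checkpoint directory
    local_dir="checkpoints",
)
```

Note that the checkpoint runs to hundreds of gigabytes, so plan disk space and bandwidth accordingly.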
From all indications, the open-source version of Grok-1 is the base model from the AI's pre-training phase, which ended in October 2023. In short, it isn't tailored to any particular use case, such as conversation alone.
We are releasing the weights and architecture of our 314 billion parameter Mixture-of-Experts model.
xAI
AI models adjust their parameters during training; parameters act like dials that control how the model processes data and generates output. More parameters generally mean more capacity to capture complexity and detail. Grok-1's massive 314 billion parameters therefore suggest highly advanced language processing capabilities.
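To get a sense of that scale, here is a back-of-the-envelope calculation (an illustration, not an official figure from xAI) of the memory needed just to store 314 billion weights at common numeric precisions:

```python
PARAMS = 314e9  # Grok-1's reported parameter count

# Bytes per parameter at common precisions; the actual released checkpoint
# size depends on how xAI stored the weights.
for precision, nbytes in [("int8", 1), ("bf16", 2), ("fp32", 4)]:
    print(f"{precision}: ~{PARAMS * nbytes / 1e9:,.0f} GB")
# int8: ~314 GB, bf16: ~628 GB, fp32: ~1,256 GB
```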
Its Mixture-of-Experts (MoE) architecture combines multiple expert sub-networks into one model and makes them work together efficiently. Instead of activating every expert for every input, a gating network decides which experts to use for each token.
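The sketch below, written in JAX (the library Grok-1 itself is built on), shows the core idea of top-k routing. The layer dimensions are toy values chosen for illustration; only the 8-expert, 2-active-per-token configuration reflects what xAI has reported about Grok-1.

```python
import jax
import jax.numpy as jnp

# Toy dimensions for illustration; Grok-1 reportedly uses 8 experts with
# 2 active per token, but its real layer sizes are far larger than these.
NUM_EXPERTS, TOP_K, D_MODEL, D_HIDDEN = 8, 2, 16, 64

def init_params(key):
    k1, k2, k3 = jax.random.split(key, 3)
    return {
        "router": 0.02 * jax.random.normal(k1, (D_MODEL, NUM_EXPERTS)),
        "w_in":   0.02 * jax.random.normal(k2, (NUM_EXPERTS, D_MODEL, D_HIDDEN)),
        "w_out":  0.02 * jax.random.normal(k3, (NUM_EXPERTS, D_HIDDEN, D_MODEL)),
    }

def moe_layer(params, x):
    """x: (tokens, d_model). Each token is processed by its top-k experts."""
    logits = x @ params["router"]                     # (tokens, num_experts)
    top_vals, top_idx = jax.lax.top_k(logits, TOP_K)  # best k experts per token
    gate = jax.nn.softmax(top_vals, axis=-1)          # mixing weights over the k

    def per_token(t, idx, w):
        # Use only the selected experts' weights and mix their outputs.
        outs = jnp.stack([
            jax.nn.gelu(t @ params["w_in"][idx[k]]) @ params["w_out"][idx[k]]
            for k in range(TOP_K)
        ])
        return (w[:, None] * outs).sum(axis=0)

    return jax.vmap(per_token)(x, top_idx, gate)

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (4, D_MODEL))      # 4 toy token embeddings
print(moe_layer(init_params(key), x).shape)   # (4, 16)
```

Because only two of the eight experts run per token, a model like this carries far more parameters than it activates on any single input, which is how Grok-1 keeps inference cost below what its 314 billion parameters would otherwise imply.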
In a hypothetical scenario, you could configure Grok-1 to improve personalization on X. The AI would analyze your past activity on X (accounts you follow, hashtags you engage with, and content you repost or like), then match content to your interests to make better recommendations.
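To make that pipeline concrete, here is a purely hypothetical sketch of the final matching step: ranking posts by cosine similarity between a user-interest embedding and content embeddings. Neither the embeddings nor the ranking logic come from X or xAI.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vectors: in practice these would be embeddings derived from
# a user's activity and from candidate posts.
user_interest = rng.normal(size=64)
posts = {f"post_{i}": rng.normal(size=64) for i in range(5)}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Rank candidate posts by how closely they match the user's interests.
ranked = sorted(posts, key=lambda p: cosine(user_interest, posts[p]), reverse=True)
print("recommend first:", ranked[:3])
```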
Grok-1 gives you a clean slate
xAI started from scratch with Grok-1 rather than building on a pre-existing language model, which is an uncommon approach. Many AI systems are built on foundation models such as GPT-3 or Jurassic-1 because those models have already been trained on massive datasets. Even Gemini wasn't built from scratch; Google developed it from its own Language Model for Dialogue Applications (LaMDA).
With these foundation models, you only need to build on top of them to create a new AI, saving time and resources. The obvious drawback is that the architecture and training data of the underlying model limit what you can do. Grok-1, by contrast, has JAX, a high-performance numerical computing library, and Rust, a systems programming language, at its core.
Why did Musk make Grok open-source?
Earlier this month, Elon Musk sued OpenAI and its CEO, Sam Altman, for allegedly breaching their founding agreement. Altman had approached Musk with a proposal to develop AI for humanity's benefit, and Musk's lawsuit aims to hold OpenAI accountable for shifting its focus to profit-making.
Musk claims that OpenAI breached the agreement by releasing GPT-4 as, effectively, a Microsoft product. Now, he wants to compel OpenAI to share its technology with the public. His move to make Grok open-source aligns with the original mission he says he shared with Altman.