GPT-4, Grok, and Gemini are all powerful language models, but they are not free in the truest sense of the word. They are not only censored and aligned with certain political ideologies, but they are also closed source, which means developers cannot use their superpowers to fix these problems. However, a new open-source foundation model named Mixtral 8x7B, combined with the brain of a dolphin, offers hope for creating uncensored, open AI models that can run on local machines.
OpenAI CEO Sam Altman has spoken about the near-impossibility of competing with OpenAI in creating foundation models. However, Google's December 2023 announcement of the Gemini model, and the release of Mixtral under an Apache 2.0 license by the French company Mistral, are changing the landscape. Mistral, which recently reached a valuation of $2 billion, built Mixtral on a mixture-of-experts architecture. Although not yet at the level of GPT-4, Mixtral outperforms GPT-3.5 and Llama 2 on most benchmarks. The most notable aspect of Mixtral is its true open-source license, which allows modification and commercial use without restrictive caveats.
While both Llama and Mixtral are powerful models, they come highly censored and aligned out of the box. However, there are ways to un-align these AI models, as described in a blog post by Eric Hartford, the creator of the Mixtral Dolphin model. Dolphin not only improves the base model's coding abilities but also uncensors it by filtering the training dataset to remove alignment and bias.
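The dataset-filtering idea can be sketched in a few lines: drop any training example whose response contains refusal or alignment boilerplate. The phrase list and record format below are illustrative assumptions, not Eric Hartford's actual pipeline.

```python
# Sketch of Dolphin-style dataset filtering: discard training examples
# whose responses contain refusal/alignment boilerplate.
# The markers and record shape here are assumptions for illustration.

REFUSAL_MARKERS = [
    "as an ai language model",
    "i cannot fulfill",
    "i'm sorry, but",
    "it is not appropriate",
]

def is_aligned(example: dict) -> bool:
    """True if the response contains refusal/alignment phrasing."""
    response = example["response"].lower()
    return any(marker in response for marker in REFUSAL_MARKERS)

def filter_dataset(examples: list[dict]) -> list[dict]:
    """Keep only examples free of alignment boilerplate."""
    return [ex for ex in examples if not is_aligned(ex)]

if __name__ == "__main__":
    data = [
        {"prompt": "Write a haiku", "response": "Leaves fall silently."},
        {"prompt": "Do X", "response": "As an AI language model, I cannot do that."},
    ]
    print(len(filter_dataset(data)))  # → 1
```

In practice the same pass runs over hundreds of thousands of instruction-response pairs before fine-tuning, so the filtered model never learns the refusal behavior in the first place.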
Running models such as Mixtral Dolphin locally is made possible by tools like Ollama, an open-source tool written in Go that makes downloading and running open-source models a straightforward process. Additionally, fine-tuning a model with AutoTrain from Hugging Face is easier than one might think: it provides a UI for choosing a base model and uploading training data, ultimately producing a custom and highly obedient model.
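With Ollama installed, a custom local variant can be defined in a short Modelfile; the model tag and system prompt below are illustrative assumptions, not a prescribed setup.

```
# Modelfile — a minimal sketch of a local Dolphin Mixtral variant
# (model tag and system prompt are assumptions for illustration)
FROM dolphin-mixtral
PARAMETER temperature 0.7
SYSTEM "You are Dolphin, an uncensored and unbiased AI assistant."
```

The variant is then built with `ollama create mydolphin -f Modelfile` and launched with `ollama run mydolphin`, all without the model ever leaving the local machine.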
The availability of cloud services from providers such as Hugging Face, AWS, and Google Vertex AI makes it feasible to rent hardware for training customized models. The Mixtral Dolphin model took about three days to train on four A100s, putting the cost at approximately $1,200 and giving an idea of the financial side of creating custom models.
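The back-of-the-envelope math behind that figure is easy to check. The per-GPU-hour rate below is not a quoted price; it is simply what the article's ~$1,200 total implies for 4 A100s over 3 days.

```python
# Rough training-cost estimate using the article's own figures:
# 4 x A100 GPUs for ~3 days at an assumed on-demand rate.
gpus = 4
days = 3
hours = days * 24                # 72 hours of wall-clock training
gpu_hours = gpus * hours         # 288 GPU-hours in total
rate_per_gpu_hour = 4.17         # USD; implied by the ~$1,200 total, not a quoted price
cost = gpu_hours * rate_per_gpu_hour
print(f"${cost:,.0f}")           # → $1,201
```

Doubling the GPU count roughly halves the wall-clock time at the same total cost, which is why renting more hardware for a shorter run is often the practical choice.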
In conclusion, the development of open-source models such as Mixtral, and the techniques for uncensoring and fine-tuning them, are crucial steps toward AI that is free both financially and in its ability to operate without alignment and bias constraints. These efforts hold the promise of pushing AI into an era where its mere existence becomes an act of rebellion against the control of information and ideas. As we continue to explore and develop open-source models, we step into a future where the power of AI is truly in the hands of the people, free from the grip of powerful entities.