xAI releases Grok-1.
Blog: x.ai/blog/grok-os
Code: github.com/xai-org/grok
Base model trained on a large amount of text data, not fine-tuned for any particular task. 314B-parameter Mixture-of-Experts model with 25% of the weights active on a given token. Trained from scratch by xAI using a custom training stack on top of JAX and Rust in October 2023.
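The 25%-active figure is consistent with the commonly cited top-2-of-8 expert routing (2/8 = 25% of expert weights used per token). Below is a minimal JAX sketch of that routing pattern, assuming 2-of-8 routing; it is not xAI's implementation, and NUM_EXPERTS, TOP_K, and all shapes and names are illustrative assumptions.

```python
# Minimal sketch of top-2-of-8 Mixture-of-Experts routing (not xAI's code).
# With 2 of 8 experts active per token, roughly 2/8 = 25% of the expert
# weights are used on a given token. Sizes below are toy values.
import jax
import jax.numpy as jnp

NUM_EXPERTS = 8   # assumed from publicly discussed Grok-1 configs
TOP_K = 2         # 2 active experts -> ~25% of expert weights per token
D_MODEL = 64      # toy width; the real model is far larger
D_FF = 256

def init_params(key):
    kr, kw1, kw2 = jax.random.split(key, 3)
    return {
        # Router maps each token to one logit per expert.
        "router": jax.random.normal(kr, (D_MODEL, NUM_EXPERTS)) * 0.02,
        # Each expert is a simple 2-layer MLP, stacked on a leading expert axis.
        "w1": jax.random.normal(kw1, (NUM_EXPERTS, D_MODEL, D_FF)) * 0.02,
        "w2": jax.random.normal(kw2, (NUM_EXPERTS, D_FF, D_MODEL)) * 0.02,
    }

def moe_layer(params, x):
    # x: (tokens, D_MODEL)
    logits = x @ params["router"]                         # (tokens, NUM_EXPERTS)
    gate_weights, expert_idx = jax.lax.top_k(logits, TOP_K)
    gate_weights = jax.nn.softmax(gate_weights, axis=-1)  # renormalize over top-k

    def per_token(tok, idx, gates):
        # Run only the selected experts for this token and mix their outputs.
        def run_expert(e):
            h = jax.nn.gelu(tok @ params["w1"][e])
            return h @ params["w2"][e]
        outs = jax.vmap(run_expert)(idx)                  # (TOP_K, D_MODEL)
        return jnp.einsum("k,kd->d", gates, outs)

    return jax.vmap(per_token)(x, expert_idx, gate_weights)

key = jax.random.PRNGKey(0)
params = init_params(key)
tokens = jax.random.normal(key, (4, D_MODEL))
print(moe_layer(params, tokens).shape)  # (4, 64)
```

The design point is that the router activates only TOP_K experts per token, so most expert parameters stay idle on any given token; that is how a 314B-parameter model can compute with far fewer active weights.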
@iScienceLuvr @AdrianDittmann This confirms that Grok is indeed using a Mixture-of-Experts approach.
Perhaps I would like the AI if it actually were fine-tuned for a particular task. That it is not is part of the problem: it is a specialist in… nothing. In an oversaturated market of AI tools, I am going to choose tools that serve a specialty niche for a specific use case. Market demand will move toward fine-tuned tools, not generalist AI.
@iScienceLuvr “Not fine-tuned for any particular task.”
@iScienceLuvr @ollama Will this run on my new Chromebook?