AI computing startup Cerebras releases open source ChatGPT-like models
By Jane Lanhee Lee
OAKLAND, California (Reuters) - Artificial intelligence chip startup Cerebras Systems on Tuesday said it released open source ChatGPT-like models for the research and business community to use for free in an effort to foster more collaboration.
Silicon Valley-based Cerebras released seven models, all trained on its AI supercomputer called Andromeda, ranging from a smaller 111 million parameter language model to a larger 13 billion parameter model.
"There is a big movement to close what has been open sourced in AI...it's not surprising as there's now huge money in it," said Andrew Feldman, founder and CEO of Cerebras. "The excitement in the community, the progress we've made, has been in large part because it's been so open."
Models with more parameters are able to perform more complex generative functions.
OpenAI's chatbot ChatGPT, launched late last year, for example, has 175 billion parameters and can produce poetry and research, which has helped draw large interest and funding to AI more broadly.
Cerebras said the smaller models can be deployed on phones or smart speakers while the bigger ones run on PCs or servers, although complex tasks like large passage summarization require larger models.
However, Karl Freund, a chip consultant at Cambrian AI, said bigger is not always better.
"There's been some interesting papers published that show that (a smaller model) can be accurate if you train it more," said Freund. "So there's a trade off between bigger and better trained."
Feldman said his biggest model took a little over a week to train, work that can typically take several months, thanks to the architecture of the Cerebras system, which includes a chip the size of a dinner plate built for AI training.
Most AI models today are trained on Nvidia Corp's chips, but a growing number of startups like Cerebras are trying to take share in that market.
The models trained on Cerebras machines can also be used on Nvidia systems for further training or customization, said Feldman.
(Reporting By Jane Lanhee Lee; Editing by Sam Holmes)