August 13th, 2024

Show HN: Demo App for PLaMo-100B – A New Japanese 100B-Parameter Language Model

PLaMo-100B is a large language model with 100 billion parameters; its instruction-tuned variant, PLaMo-100B-Instruct, outperforms GPT-4 on certain Japanese-language benchmarks. Development spanned from February to August 2024, and a demo and limited trial API are available online.


This Show HN entry is an unofficial Python demo app for PLaMo-100B, a large language model developed by Preferred Elements, a subsidiary of Preferred Networks. The model has 100 billion parameters and has been in development since February 2024, with its post-training process concluding on August 7, 2024. Notably, the instruction-tuned variant, PLaMo-100B-Instruct, has outperformed GPT-4 on specific benchmarks tailored to Japanese language models, including Jaster and Rakuda. Post-training refers to additional training on extensive datasets to improve the model's capabilities on targeted tasks. A demo of PLaMo-100B is accessible online, although the trial API is available only for a limited duration; a hedged example of calling such an API appears after the summary points below. Users are encouraged to support the project by starring the GitHub repository if they find it beneficial.
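At its core, post-training of this kind is continued supervised training on instruction/response pairs. The sketch below is an illustration only: the stand-in model, learning rate, and example pair are assumptions, since PLaMo-100B's actual recipe, data, and prompt format are not described in the post.

```python
# Minimal sketch of supervised post-training (instruction tuning).
# The stand-in model, learning rate, and example pair are illustrative
# assumptions; PLaMo-100B's actual recipe and data are not public here.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "sshleifer/tiny-gpt2"  # tiny stand-in; the real model has 100B parameters

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

# Hypothetical instruction/response pairs, e.g. Japanese task data.
examples = [
    ("次の文を英訳してください: 猫が好きです。", "I like cats."),
]

model.train()
for instruction, response in examples:
    text = f"{instruction}\n{response}{tokenizer.eos_token}"
    batch = tokenizer(text, return_tensors="pt")
    # Standard causal-LM loss over the whole sequence; production
    # recipes typically mask the prompt tokens out of the loss.
    loss = model(**batch, labels=batch["input_ids"]).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

In practice this runs over large curated datasets with prompt masking and careful scheduling; the point here is only that post-training is ordinary gradient descent on task-formatted text.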

- PLaMo-100B has 100 billion parameters.

- Development began in February 2024; post-training concluded on August 7, 2024.

- PLaMo-100B-Instruct outperforms GPT-4 on certain Japanese-language benchmarks (Jaster and Rakuda).

- A demo and limited-duration trial API are available online (see the usage sketch after this list).

- Users can star the GitHub repository to show support.
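For readers who want to try the trial API, the snippet below shows one plausible way a hosted chat-completion endpoint might be called. The URL, authentication header, model identifier, and response shape are all assumptions for illustration; the project's README and demo page define the real interface.

```python
# Hedged sketch of calling a hosted chat-completion trial API.
# The endpoint URL, auth header, model name, and JSON schema below are
# assumptions for illustration, not the project's documented interface.
import requests

API_URL = "https://example.com/v1/chat/completions"  # placeholder, not the real endpoint
API_KEY = "YOUR_TRIAL_KEY"  # hypothetical trial credential

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "plamo-100b-instruct",  # assumed model identifier
        "messages": [{"role": "user", "content": "日本の首都はどこですか？"}],
    },
    timeout=30,
)
resp.raise_for_status()
# Assumes an OpenAI-style response shape; adjust to the actual API.
print(resp.json()["choices"][0]["message"]["content"])
```

A real client would add retries and error handling; the exact request and response fields will depend on the API the project actually exposes.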
