Mozilla Builders' LocalScore: An Interesting Local AI LLM Benchmark
([Mozilla] 3 Hours Ago - LocalScore)
- Reference: 0001539277
- News link: https://www.phoronix.com/news/LocalScore-AI-LLM-Benchmark
- Source link:
Out of Mozilla's Mozilla Builders initiative for fostering open-source AI projects comes LocalScore, an interesting local AI large language model (LLM) benchmark for Windows and Linux systems. LocalScore has a lot of potential and builds off the Mozilla Ocho [1]Llamafile project as [2]an easy-to-distribute LLM framework. LocalScore is still in its early stages but is already working well and will be used in future hardware reviews on Phoronix.
[3]Llamafile 0.9.2 was released this past week and brings the LocalScore benchmarking utility into the codebase. LocalScore facilitates large language model (LLM) benchmarks on both CPUs and GPUs. It's a simple and portable way to evaluate LLM system performance.
[4] (LocalScore screenshot)
LocalScore can be triggered from Llamafile packages, or there is an independent LocalScore binary for Windows and Linux to facilitate easy AI benchmarking.
[5] (LocalScore screenshot)
As part of this addition to Llamafile, there is now [6]LocalScore.ai, an opt-in repository for CPU/GPU results from LocalScore using the official tiny / small / medium models based on Meta Llama 3.1.
The [7]LocalScore.ai download page outlines the simple steps for running the CPU and/or GPU AI LLM benchmarks with the official models. The benchmarking can be done easily on Windows and Linux systems.
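For anyone wanting to script benchmark runs rather than launch them by hand, a minimal Python sketch along these lines could wrap the standalone binary and capture its console output. Note that the binary name "localscore" and the "-m" model flag are assumptions borrowed from the llamafile / llama.cpp convention, not confirmed LocalScore options; check the download page above for the actual commands on your platform.

    # Hypothetical wrapper around the standalone LocalScore binary (sketch only).
    import shutil
    import subprocess
    import sys
    from pathlib import Path

    def run_localscore(model: Path, binary: str = "localscore") -> str:
        """Run the benchmark against a local GGUF model and return its raw output.
        The "-m" flag is an assumption based on llamafile/llama.cpp conventions."""
        exe = shutil.which(binary)
        if exe is None:
            sys.exit(f"{binary} not found on PATH; grab it from localscore.ai")
        result = subprocess.run([exe, "-m", str(model)],
                                capture_output=True, text=True, check=True)
        return result.stdout

    if __name__ == "__main__":
        # Point this at whichever GGUF model you want to benchmark.
        print(run_localscore(Path("my-model.Q4_K_M.gguf")))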
LocalScore is open-source and meets all of my standards/needs for benchmarking, so it will be incorporated into future Phoronix benchmarks for Linux hardware reviews and the like. (Well, there are just a few tweaks needed that should get folded into the next release of LocalScore / Llamafile, but in any event look for LocalScore usage in the coming weeks.)
LocalScore is a nice addition to the [8]Mozilla Builders initiative, and I'm all for seeing more open-source AI/LLM benchmarks that are easy to use, quick to deploy, and cross-platform. Hopefully others check out LocalScore as well.
[1] https://www.phoronix.com/search/Llamafile
[2] https://www.phoronix.com/news/Llamafile-0.8.17-Released
[3] https://github.com/Mozilla-Ocho/llamafile/releases/tag/0.9.2
[4] https://www.phoronix.com/image-viewer.php?id=2025&image=localscore_1_lrg
[5] https://www.phoronix.com/image-viewer.php?id=2025&image=localscore_2_lrg
[6] https://www.localscore.ai/
[7] https://www.localscore.ai/download
[8] https://builders.mozilla.org/projects/