Altiplano: tooling to work with language models. The Altiplano is the place where the llamas live.
Goinfer: an inference server in Go to run local language models.
| Name | Description |
|---|---|
| @altiplano/types | The shared data structures |
| @altiplano/usellama | A composable to use Llama.cpp |
| @altiplano/inferserver | An inference server library |
| @altiplano/taskserver | A tasks server library |
Note: these packages are built on top of Llama-node. As that project no longer seems to be maintained, development of the TypeScript packages is currently paused and may be reoriented.