Project Overview - Board Game Finds
After Llama 2 was open-sourced, I wanted to see what kind of novel content I could create with it and how the results would compare against OpenAI's offering. Of the 2,500+ pages on boardgamefinds.com, half of the content is generated by OpenAI's GPT-3.5 Turbo and the other half by Llama 2. Getting to "quality" outputs with Llama took more trial and error, but the content is nearly indistinguishable from OpenAI's. I initially intended this to be a small project; if I were to invest more time, I'd start with a vector database and LangChain, and try in-browser recommendations using ONNX Runtime Web.
- Next.js 13 + TypeScript
- Tailwind CSS
- MongoDB Atlas
- Running on Vercel/Edge
Generation took place in three steps. First, I compiled a dataset of board games and their relationships using sites like Board Game Atlas, Board Game Geek, and others.
Next, I converted these relationships into JSON and prepared a prompt template. Finally, I fed them into Guidance or llama.cpp to generate the Markdown. This Markdown is served from MongoDB Atlas and converted to HTML on the fly using react-markdown.
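The templating step can be sketched in TypeScript. This is a minimal, hypothetical version: the `RelationshipRecord` shape and the prompt wording are illustrative assumptions, not the exact template used on the site.

```typescript
// Hypothetical sketch of the templating step: a relationship record
// (mechanic -> related games) becomes a prompt for the model.
type RelationshipRecord = Record<string, string[]>;

function buildPrompt(record: RelationshipRecord): string {
  // Take the first mechanic/games pair from the record.
  const [mechanic, games] = Object.entries(record)[0];
  return [
    `Write a Markdown article about the "${mechanic}" board game mechanic.`,
    `Discuss how it appears in these games: ${games.join(", ")}.`,
    `Use H2 headings and keep the tone informative.`,
  ].join("\n");
}

const prompt = buildPrompt({
  "Hand Management": ["Terraforming Mars", "Brass: Birmingham", "Ark Nova"],
});
console.log(prompt);
```

A prompt like this can then be passed to Guidance or a llama.cpp completion call to produce the page body.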
The result is a site with 2500+ pages of content, all generated by either GPT-3.5 Turbo or Llama 2.
For example, a single relationship record maps a mechanic to the games that use it:

    { "Hand Management": ["Terraforming Mars", "Brass: Birmingham", "Ark Nova"] }
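A mapping like this can fan out into many pages, one per mechanic and one per game, which is how a modest dataset yields thousands of pages. The sketch below shows one plausible way to derive page slugs from such records; the slug format and route prefixes are assumptions for illustration.

```typescript
// Hypothetical sketch: expand mechanic -> games records into page slugs.
type Relationships = Record<string, string[]>;

// Turn a display name into a URL-safe slug (illustrative rule).
function slugify(name: string): string {
  return name
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/^-+|-+$/g, "");
}

// One page per mechanic plus one per unique game.
function pageSlugs(rels: Relationships): string[] {
  const slugs = new Set<string>();
  for (const [mechanic, games] of Object.entries(rels)) {
    slugs.add(`mechanics/${slugify(mechanic)}`);
    for (const game of games) slugs.add(`games/${slugify(game)}`);
  }
  return [...slugs];
}

console.log(
  pageSlugs({ "Hand Management": ["Terraforming Mars", "Brass: Birmingham"] })
);
// → [ 'mechanics/hand-management', 'games/terraforming-mars', 'games/brass-birmingham' ]
```

Each slug would then map to a stored Markdown document in MongoDB Atlas, rendered on request with react-markdown.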