philippzagar / spezillm

This project forked from stanfordspezi/spezillm

A collection of machine learning-related modules usable in the Spezi ecosystem

Home Page: https://swiftpackageindex.com/StanfordSpezi/SpeziML/documentation/speziopenai

License: MIT License

Swift 98.05% C++ 1.95%

spezillm's Introduction

Spezi LLM


Overview

The Spezi LLM Swift Package includes modules that are helpful for integrating LLM-related functionality into your application. The package provides all the necessary tools for local LLM execution as well as for the use of remote OpenAI-based LLMs.

Screenshots: the OpenAI LLM Chat View (SpeziLLMOpenAI), the Language Model Download View (SpeziLLMLocalDownload), and the Local LLM Chat View using a locally executed LLM (SpeziLLMLocal).

Setup

1. Add Spezi LLM as a Dependency

You need to add the SpeziLLM Swift package to your app in Xcode or as a dependency in your Swift package manifest.
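If you manage the dependency through a Package.swift manifest rather than Xcode, a minimal sketch could look like the following. The package name, platform requirement, version requirement, and the selection of products are assumptions; adjust them to the release and targets you actually use.

// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "ExampleApp",                  // Hypothetical package name
    platforms: [.iOS(.v17)],             // Assumed platform requirement
    dependencies: [
        // The version requirement is a placeholder; pick the release you target.
        .package(url: "https://github.com/StanfordSpezi/SpeziLLM.git", from: "0.1.0")
    ],
    targets: [
        .target(
            name: "ExampleApp",
            dependencies: [
                // Add only the products your app actually uses.
                .product(name: "SpeziLLM", package: "SpeziLLM"),
                .product(name: "SpeziLLMLocal", package: "SpeziLLM"),
                .product(name: "SpeziLLMOpenAI", package: "SpeziLLM")
            ]
        )
    ]
)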

Important

If your application is not yet configured to use Spezi, follow the Spezi setup article to set up the core Spezi infrastructure.

2. Follow the setup steps of the individual targets

As Spezi LLM contains a variety of targets for specific LLM functionality, please follow the additional setup steps in the respective target section of this README.

Targets

Spezi LLM provides a number of targets to help developers integrate LLMs in their Spezi-based applications:

  • SpeziLLM: Base LLM infrastructure within the Spezi ecosystem, including the LLMRunner
  • SpeziLLMLocal: Local, on-device LLM execution via llama.cpp
  • SpeziLLMLocalDownload: Downloading and storing local Language Model files
  • SpeziLLMOpenAI: Interaction with remote OpenAI GPT-based LLMs

The sections below highlight the setup and basic use of the SpeziLLMLocal and SpeziLLMOpenAI targets to integrate Language Models into a Spezi-based application.

Note

To learn more about the usage of the individual targets, please refer to the [DocC documentation of the package](https://swiftpackageindex.com/stanfordspezi/spezillm/documentation).

Spezi LLM Local

The target enables developers to easily execute medium-sized Language Models (LLMs) locally on-device via the llama.cpp framework. The module allows you to interact with the locally run LLM through purely Swift-based APIs; no interaction with low-level C or C++ code is necessary.

Setup

You can configure the Spezi local LLM execution within the typical SpeziAppDelegate. In the example below, the LLMRunner from the SpeziLLM target, which is responsible for providing LLM functionality within the Spezi ecosystem, is configured with the LLMLocalRunnerSetupTask from the SpeziLLMLocal target. This prepares the LLMRunner to locally execute Language Models.

import Spezi
import SpeziLLM
import SpeziLLMLocal

class TestAppDelegate: SpeziAppDelegate {
    override var configuration: Configuration {
        Configuration {
            LLMRunner {
                LLMLocalRunnerSetupTask()
            }
        }
    }
}

Spezi will then automatically inject the LLMRunner into the SwiftUI environment, making it accessible throughout your application. The example below also showcases how to use the LLMRunner to execute a SpeziLLM-based LLM.

struct ExampleView: View {
    @Environment(LLMRunner.self) var runner
    @State var model: LLM = LLMLlama(
        modelPath: URL(string: "...") // The locally stored Language Model file in the ".gguf" format
    )

    var body: some View {
        EmptyView()
            .task {
                do {
                    // Returns an `AsyncThrowingStream` which yields the produced output of the LLM.
                    let stream = try await runner(with: model).generate(prompt: "Some example prompt")
                    
                    // ...
                } catch {
                    // Handle potential generation errors here.
                }
            }
    }
}
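For completeness, a slightly fuller sketch of consuming the stream is shown below. The output state property, the Text view, and the assumption that the stream yields String pieces are illustrative and not part of the package's documented API surface.

import SpeziLLM
import SpeziLLMLocal
import SwiftUI

struct LocalLLMOutputView: View {
    @Environment(LLMRunner.self) var runner
    @State var model: LLM = LLMLlama(
        modelPath: URL(string: "...")!  // Placeholder path to the locally stored ".gguf" model file
    )
    @State private var output = ""      // Illustrative: accumulates the generated text

    var body: some View {
        Text(output)
            .task {
                do {
                    let stream = try await runner(with: model).generate(prompt: "Some example prompt")

                    // Assumption: the stream yields `String` pieces of the response.
                    for try await token in stream {
                        output.append(token)
                    }
                } catch {
                    output = "Generation failed: \(error.localizedDescription)"
                }
            }
    }
}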

Note

To learn more about the usage of SpeziLLMLocal, please refer to the [DocC documentation](https://swiftpackageindex.com/stanfordspezi/spezillm/documentation/spezillmlocal).

Spezi LLM OpenAI

A module that allows you to interact with GPT-based large language models (LLMs) from OpenAI within your Spezi application.

Setup

You can configure the OpenAIModule in the SpeziAppDelegate as follows. In the example, we configure the OpenAIModule to use the GPT-4 model with a default API key.

import Spezi
import SpeziLLMOpenAI

class ExampleDelegate: SpeziAppDelegate {
    override var configuration: Configuration {
        Configuration {
            OpenAIModule(apiToken: "API_KEY", openAIModel: .gpt4)
        }
    }
}

The OpenAIModule injects an OpenAIModel into the SwiftUI environment to make it accessible throughout your application. The model is queried via an instance of Chat from the SpeziChat package.

struct ExampleView: View {
    @Environment(OpenAIModel.self) var model
    let chat: Chat = [
        .init(role: .user, content: "Example prompt!"),
    ]

    var body: some View {
        EmptyView()
            .task {
                do {
                    // Returns an `AsyncThrowingStream` which yields the produced output of the LLM.
                    let stream = try model.queryAPI(withChat: chat)
                    
                    // ...
                } catch {
                    // Handle potential query errors here.
                }
            }
    }
}
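As above, a fuller sketch of consuming the resulting stream might look as follows. The output property, the Text view, and the assumption that the stream yields String pieces are illustrative.

import SpeziChat
import SpeziLLMOpenAI
import SwiftUI

struct OpenAIOutputView: View {
    @Environment(OpenAIModel.self) var model
    @State private var output = ""  // Illustrative: accumulates the assistant's reply

    let chat: Chat = [
        .init(role: .user, content: "Example prompt!")
    ]

    var body: some View {
        Text(output)
            .task {
                do {
                    let stream = try model.queryAPI(withChat: chat)

                    // Assumption: the stream yields `String` pieces of the assistant's response.
                    for try await token in stream {
                        output.append(token)
                    }
                } catch {
                    output = "Query failed: \(error.localizedDescription)"
                }
            }
    }
}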

Note

To learn more about the usage of SpeziLLMOpenAI, please refer to the [DocC documentation](https://swiftpackageindex.com/stanfordspezi/spezillm/documentation/spezillmopenai).

Contributing

Contributions to this project are welcome. Please make sure to read the contribution guidelines and the contributor covenant code of conduct first.

License

This project is licensed under the MIT License. See Licenses for more information.


spezillm's People

Contributors

adritrao, philippzagar, pschmiedmayer, vishnuravi
