
guinmoon / llmfarm_core.swift

156.0 4.0 18.0 3.39 MB

Swift library to work with llama and other large language models.

License: MIT License

Swift 1.43% Objective-C 3.91% C++ 45.01% C 45.77% Metal 3.87% Objective-C++ 0.01%
ai gpt-2 gptneox llama rwkv starcoder swift falcon llama2

llmfarm_core.swift's People

Contributors

guinmoon


llmfarm_core.swift's Issues

Any chance that you would bring Metal support to llama.cpp?

Hey, I saw your repo, and I think that you're working on really cool stuff!

I was trying to figure out how to enable Metal on llama.cpp for Swift apps, and it looks like you were able to hack something together. Do you have any plans to bring your changes to llama.cpp, and if not, would you please share any insights into how you overcame the obstacles? I'm new to Swift, so I can't really make heads or tails of the code, but I was able to run your repo with Metal successfully.

Crashes when loading llama-2-7b.q4_K_M.gguf

Hello!

The current implementation crashes when trying to load llama-2-7b.Q4_K_M.gguf:

                .onAppear {
                    Task {
                        // Locate the bundled model; a guard avoids crashing on a
                        // missing resource (the original force-unwrap would trap here)
                        guard let url = Bundle.main.url(forResource: "llama-2-7b.Q4_K_M",
                                                        withExtension: "gguf") else {
                            print("Model file not found in bundle")
                            return
                        }
                        let scribe = AI(_modelPath: url.path, _chatName: "chat")
                        var params: ModelContextParams = .default
                        params.use_metal = true

                        do {
                            try scribe.loadModel(ModelInference.LLama_gguf, contextParams: params)
                        } catch {
                            print("An error occurred: \(error)")
                        }
                    }
                }

The crash occurs at line 484 in ggml-metal:

    ctx->buffers[ctx->n_buffers].metal =
        [ctx->device newBufferWithBytesNoCopy:(void *) ((uint8_t *) data + i)
                                       length:size_step_aligned
                                      options:MTLResourceStorageModeShared
                                  deallocator:nil];

Feature Request: Apple Silicon Neural Engine - Core ML model package format support

Description

Please consider adding Core ML model package format support to utilize the Apple Silicon Neural Engine + GPU.

Success Criteria
Utilize both ANE & GPU, not just GPU on Apple Silicon

Additional Context

List of Core ML package format models
https://github.com/likedan/Awesome-CoreML-Models

A Core ML implementation is in progress for whisper.cpp, which sees roughly 3x performance improvements for some models (ggerganov/whisper.cpp#548); you might be interested in it.

You might also be interested in another implementation, Swift Transformers; an example of a Core ML application is
https://github.com/huggingface/swift-chat
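On the ANE point generally: in Core ML, the compute units a model may use are chosen at load time via MLModelConfiguration.computeUnits. A minimal sketch of that knob (this is plain Core ML API, not part of llmfarm_core; the function name is a placeholder):

    import CoreML
    import Foundation

    // Load a compiled Core ML model, asking Core ML to schedule work on the
    // Neural Engine, GPU, and CPU as it sees fit.
    func loadModelPreferringANE(at url: URL) throws -> MLModel {
        let config = MLModelConfiguration()
        config.computeUnits = .all   // .cpuAndGPU would exclude the ANE
        return try MLModel(contentsOf: url, configuration: config)
    }

Note that even with .all, Core ML decides per-layer whether the ANE can run an op, so ANE utilization depends on how the model is converted.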

Reopen issue 12

#12
Although the package is added via a specific commit in SPM, SwiftPM still treats the library as an unstable-version package when the dependency relies on a commit.

Thanks.
BR
buhe

Removing unsafe flags when debugging

Hi there,

The README mentions removing the unsafe flags, but I am unable to do so if I add it as a package dependency.

Are you suggesting I have to manually build the source without the unsafe flags to use in my project?

Apologies if this sounds too simple of a question, I am figuring out how to build a Mac OS chat desktop application that interacts with a mistral GGUF model.

Thanks,
Chaks
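One approach worth noting here (a sketch, not a maintainer-endorsed setup): SwiftPM's unsafe-flags restriction is relaxed for branch- and revision-based dependencies, which is why other issues on this page mention pinning a commit. A minimal Package.swift assuming a hypothetical app target named MyChatApp and the main branch:

    // swift-tools-version:5.9
    import PackageDescription

    let package = Package(
        name: "MyChatApp",
        platforms: [.macOS(.v13), .iOS(.v16)],
        dependencies: [
            // Branch-based (or revision-based) dependencies are exempt from
            // SwiftPM's "unsafe build flags" check for versioned dependencies.
            .package(url: "https://github.com/guinmoon/llmfarm_core.swift",
                     branch: "main")
        ],
        targets: [
            .executableTarget(
                name: "MyChatApp",
                dependencies: [
                    .product(name: "llmfarm_core", package: "llmfarm_core.swift")
                ]
            )
        ]
    )

If that still fails, the fallback is the one you describe: clone the repo, delete the .unsafeFlags entries from its Package.swift, and add the clone as a local path dependency.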

Crash when setting use_metal to true

I tried running it on iPhone and Mac. The iPhone simulator worked fine, but both the Mac and a real iPhone (12) crashed.

[Screenshot 2024-01-24, 12:34 PM]

The package product cannot be used as a dependency of this target because it uses unsafe build flags

I am getting this error in Xcode when trying to build the newly created project with this package imported:

The package product 'llmfarm_core' cannot be used as a dependency of this target because it uses unsafe build flags.

It doesn't work even when I import the package locally and comment out all the .unsafeFlags lines in Package.swift.

I also tried the approach suggested here of importing a specific commit, but it didn't help.

How do you add this package to the Xcode project?

Sorry, I'm not proficient with packages 🙂

Cannot find type 'ModelContextParams' in scope

Hey! I'm trying to import the package but still face some issues.

What I do:

  1. Create a new SwiftUI app
  2. File → Add Package Dependencies...
  3. Add llmfarm_core.swift package, specify the latest commit hash
  4. Inside ContentView.swift:
  • import llmfarm_core
  • Add code:
    .onAppear {
        let maxOutputLength = 256
        var total_output = 0

        var input_text = "State the meaning of life."

        let ai = AI(_modelPath: "llama-2-7b-chat.Q4_K_M.gguf", _chatName: "chat")
        var params: ModelContextParams = .default
        params.use_metal = true

        try? ai.loadModel(ModelInference.LLama_gguf, contextParams: params)
        ai.model.promptFormat = .LLaMa
    }
  5. The errors I get:
  • Cannot find type 'ModelContextParams' in scope
  • Value of type 'LLMBase' has no member 'promptFormat'

How do I resolve this?

LoRA adapter support

Not a real issue, more of a question.
The current llama.cpp code also supports loading LoRA adapters. Is this supported by llmfarm_core.swift as well? I don't see it in the LLMFarm code/application.
If it is supported, is there an example of how to use it?
