
cookbook's Introduction

AudioKit

Build Status License Platform Reviewed by Hound Twitter Follow

AudioKit is an audio synthesis, processing, and analysis platform for iOS, macOS (including Catalyst), and tvOS.

Installation

Using Xcode, you can add AudioKit and any of the other AudioKit libraries to your project using Swift Package Collections:

  1. Select File -> Add Packages...
  2. Click the + icon on the bottom left of the Collections sidebar on the left.
  3. Choose Add Swift Package Collection from the pop-up menu.
  4. In the Add Package Collection dialog box, enter https://swiftpackageindex.com/AudioKit/collection.json as the URL and click the "Load" button.
  5. Xcode will warn you that the collection is not signed; that is fine. Click "Add Unsigned Collection".
  6. Now you can add any of the AudioKit Swift Packages you need and read about what they do, right from within Xcode.
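
If you would rather declare the dependency in your own Package.swift instead of going through a collection, a minimal sketch might look like this (the minimum version, platform versions, and target names here are assumptions; pin whichever release you actually need):

// swift-tools-version:5.5
import PackageDescription

let package = Package(
    name: "MyAudioApp",                       // hypothetical package name
    platforms: [.iOS(.v14), .macOS(.v12)],    // assumed deployment targets
    dependencies: [
        .package(url: "https://github.com/AudioKit/AudioKit", from: "5.0.0")
    ],
    targets: [
        .target(
            name: "MyAudioApp",
            dependencies: [.product(name: "AudioKit", package: "AudioKit")]
        )
    ]
)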

Documentation

Docs appear on the AudioKit.io Web Site. You can also generate the documentation in Xcode by pulling down the Product menu and choosing "Build Documentation".

Examples

The AudioKit Cookbook contains many recipes for simple uses for AudioKit components.

Getting help

  1. Post your problem to StackOverflow with the #AudioKit hashtag.

  2. Once you are sure the problem is not in your implementation, but in AudioKit itself, you can open a GitHub issue.

  3. If you, your team or your company is using AudioKit, please consider sponsoring Aure on GitHub Sponsors.

cookbook's People

Contributors

aure, benedictst, eljeff, emurray2, fernandolguevara, forevertangent, mahal, marcolabreu, markjeschke, matt54, misteu, mjeschke-millermedia, nickculbertson, rex4539, ronyeh, ryanfrancesconi, sigmonkycnbc, skyefreeman, timdubbins, tomduncalf, wtholliday


cookbook's Issues

Missing package product 'CookbookCommon'

macOS Version(s) Used to Build

macOS 12 Monterey

Xcode Version(s)

Xcode 13.4.1

Description

I wanted to try the Cookbook project on my machine but it won't compile.

../Cookbook.xcodeproj Missing package product 'CookbookCommon'

Showing Recent Messages
Couldn’t get revision ‘833aa2d007a3d82df232d788f4e0b006bdfa91f7^{commit}’:

Crash Logs, Screenshots or Other Attachments (if applicable)

Screenshot 2022-09-04 at 10 05 50

Cookbook no longer runs in Xcode 14

macOS Version(s) Used to Build

macOS 12 Monterey

Xcode Version(s)

Xcode 14

Description

Error is:
/Cookbook/Cookbook/Cookbook.xcodeproj Missing package product 'CookbookCommon'

It expects that CookbookCommon is a package that should be added separately rather than integrated with the rest of AudioKit.

Crash Logs, Screenshots or Other Attachments (if applicable)

No response

Changing Drums sound samples

Hello,

When adding a wav file and then changing the "file" string of a DrumSample in Drums.swift (line 38), the sound doesn't play properly. When played, the note is tuned way down. The file is also played any time another note is pressed.

Steps to reproduce:

  • add a wav file to the project
  • change the file string to the wav file's path
  • play the instrument

Do the DrumSample file string names need to follow a particular naming convention? i.e. "Samples/open_hi_hat_A#1.wav"

Let me know if you need any additional details.

Thanks,
Nick
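
For reference, a minimal sketch of loading a bundled wav into a sampler, mirroring the loadAudioFiles call used in the Cookbook's DrumSequencer; the "Samples/snare.wav" path and the drums instrument are assumptions for illustration, not the Cookbook's actual code:

import AVFoundation
import AudioKit

// Minimal sketch: load a wav that was added to the app bundle under Samples/.
// "Samples/snare.wav" and `drums` are placeholders.
if let url = Bundle.main.resourceURL?.appendingPathComponent("Samples/snare.wav") {
    do {
        let file = try AVAudioFile(forReading: url)
        try drums.loadAudioFiles([file])
    } catch {
        Log("Could not load sample: \(error)")
    }
}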

Settings.AudioFormat change causes NodeParameter to fail

I am attempting to modify the default AudioKit sample rate via Settings and am encountering an issue where the Delay Node Parameter property wrappers no longer update the AUParameters of the underlying AVAudioUnit. I have reproduced this issue in the Cookbook project Delay recipe with only the following modifications to the AppDelegate:

func application(_ application: UIApplication,
                 didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    // Override point for customization after application launch.

    #if os(iOS)
    do {
        /// The only modification to the Cookbook project
        Settings.audioFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 2)!

        Settings.bufferLength = .short
        try AVAudioSession.sharedInstance().setPreferredIOBufferDuration(Settings.bufferLength.duration)
        try AVAudioSession.sharedInstance().setCategory(.playAndRecord,
                                                        options: [.defaultToSpeaker, .mixWithOthers])
        try AVAudioSession.sharedInstance().setActive(true)
    } catch let err {
        print(err)
    }
    #endif

    return true
}

After some debugging I have found that Delay.time and Delay.feedback do not update, but Delay.dryWetMix does. I've debugged the property wrapper and NodeParameters, and it appears their references no longer match the AUParameter references returned in NodeParameter.avAudioUnit.auAudioUnit.parameterTree!.allParameters.

Is my approach to modifying the default sample rate appropriate or am I missing something?

KeyboardWidget help

I'm working on building an app that has a keyboard in it. I want to start with Recipes/OscillatorView.swift, add it to my app, and then change much of it. That being said, OscillatorView.swift calls a "KeyboardWidget" and no matter what I include, I can't find where KeyboardWidget is defined or declared. Am I supposed to use KeyboardView instead?

Feature Request: Playlist

Would it be possible to have a basic "playlist" recipe?
Here's how it would work:

- Given a [user-selected] folder on the filesystem, I would like to see a list of all the audio files in that folder, along with a play button for each one.

Currently the closest thing is the Drum Pad view, but in that recipe the audio file locations are hard-coded in the conductor rather than being dynamic, and there is also the added complication of a grid with columns and rows, so it is not as beginner-friendly as what should almost be the "hello world" of sample playback.
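
A minimal sketch of what such a recipe's conductor could look like, assuming a user-selected folder URL (the class and property names here are illustrative, not part of the Cookbook):

import AVFoundation
import AudioKit
import Combine

// Minimal sketch: list the audio files in a folder and play whichever one is tapped.
class PlaylistConductor: ObservableObject {
    let engine = AudioEngine()
    let player = AudioPlayer()
    @Published var files: [URL] = []

    init(folder: URL) {
        engine.output = player
        let contents = (try? FileManager.default.contentsOfDirectory(
            at: folder, includingPropertiesForKeys: nil)) ?? []
        files = contents.filter { ["wav", "mp3", "m4a", "aif"].contains($0.pathExtension.lowercased()) }
        try? engine.start()
    }

    func play(_ url: URL) {
        do {
            let file = try AVAudioFile(forReading: url)
            player.stop()
            try player.load(file: file)
            player.play()
        } catch {
            Log(error.localizedDescription, type: .error)
        }
    }
}

A SwiftUI view could then show a List over conductor.files with a play button per row calling conductor.play(url).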

Tuner Plots

Should plot the recent frequencies (and maybe amplitudes)

Cookbook Recorder Demo fails with "No file URL/settings to record to"

To reproduce:
open Recorder, tap record
Output console shows:
"NodeRecorder.swift:record():155:🛑 Error: No file URL/settings to record to (NodeRecorder.swift:record():155)"

Cookbook version is the latest as of May 20, 2022, using AudioKit 5.4.2.

Sorry, I don't know this project that well, but it looks like the temp audio file created in createAudioFile() does not get assigned to the recordedFileURL variable.

Pause/unpause audio recording

Hi Aure,
How can I pause and un-pause a recording? I've noticed that once I call recorder?.stop(), it completely stops the recording and I can't append to that file anymore, even though I can toggle record and stop a few times. Is this expected behavior?
I'd like to have pause functionality. Any suggestions?

thank you very much,
-Vittal

Beef up the Keyboard View

Add a polyphony button, the ability to increase/decrease the number of octaves, and up/down octave buttons to the keyboard view.

Build issues with latest OS and XCode versions

I am trying to build the app for iOS 14 on macOS 11 with Xcode 12.

Class 'WorkGroup' cannot conform to @objc protocol 'Repeatable' because the class is only visible via the Objective-C runtime

and

Failed to build module 'os' from its module interface; the compiler that produced it, 'Apple Swift version 5.3.1 (swiftlang-1200.2.40 clang-1200.0.32.7)', may have used features that aren't supported by this compiler, 'Apple Swift version 5.3 (swiftlang-1200.0.16.109 clang-1200.0.22.20)'

Is there a particular setting I need to use to successfully build the app?

Which versions do you recommend for building the app?

AVAudioNodeCompletionHandler example

I'm being pretty dense and am trying to figure out how to use the completion handler.

Would love an example. Thanks for any help or response!
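
For what it's worth, a minimal sketch using the plain AVFoundation API (a bare AVAudioPlayerNode rather than an AudioKit wrapper), where the trailing closure is the AVAudioNodeCompletionHandler; fileURL is an assumption:

import AVFoundation

// Minimal sketch: schedule a file on an AVAudioPlayerNode and get a callback once
// the node has consumed the scheduled data. The handler fires on a background
// thread, so hop to the main queue before touching UI.
let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
engine.attach(playerNode)
engine.connect(playerNode, to: engine.mainMixerNode, format: nil)

do {
    let file = try AVAudioFile(forReading: fileURL)   // fileURL is a placeholder
    try engine.start()
    playerNode.scheduleFile(file, at: nil) {
        DispatchQueue.main.async {
            print("Finished playing the scheduled file")
        }
    }
    playerNode.play()
} catch {
    print(error)
}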

Example of a streaming PCM buffer player

Right now all the examples seem to play buffers loaded from files. Please provide an example of using the 'scheduleBuffer' function to show how to feed a streaming buffer from another data source (network, another buffer, ...).

Also, I found that Int16 PCM does not work for the player, but Float32 does. It seems to be a limitation of AVAudioPlayerNode. Please show the best practice for converting the format.
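
In the meantime, a minimal sketch of the Int16-to-Float32 conversion using AVAudioConverter before handing the buffer to scheduleBuffer; the sample rate and channel count are assumptions, so adjust them to match your stream:

import AVFoundation

// Minimal sketch: convert a streamed Int16 PCM buffer to the Float32 format an
// AVAudioPlayerNode expects, then schedule it.
let int16Format = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                sampleRate: 44_100,
                                channels: 2,
                                interleaved: true)!
let floatFormat = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!
let converter = AVAudioConverter(from: int16Format, to: floatFormat)!

func schedule(_ int16Buffer: AVAudioPCMBuffer, on player: AVAudioPlayerNode) throws {
    let floatBuffer = AVAudioPCMBuffer(pcmFormat: floatFormat,
                                       frameCapacity: int16Buffer.frameLength)!
    // Same sample rate on both sides, so the simple throwing convert(to:from:) overload is enough.
    try converter.convert(to: floatBuffer, from: int16Buffer)
    player.scheduleBuffer(floatBuffer, completionHandler: nil)
}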

tb303filter peaks in wrong direction in example. How to solve that?

Been looking at that filter emulation. Would like to use it, but it has "issues". In the example in the Cookbook, it peaks in the wrong direction. On the actual instrument, the filter "screams" when the filter is high, not when it is low, but the emulation in the example does the opposite. Maybe that is why it sounds so horribly "wrong".

Was looking at the code in this project and also the filter emulation code. It wasn't obvious how to solve this. Are there more parameters in that filter that could be added to the init to make it more usable?

Thanks in advance.

PS: Was looking at it now. Maybe there is a link between something in the "distortion" or resonance and the cutoff frequency that is wrongly backwards in the filter emulation.

Update: Initial guess after tracing the issue through the source code is that the issue is located somewhere between line 87 and 109 in tbvcf.c in /Sources/soundpipe/modules/tbvcf.c located at: https://github.com/AudioKit/AudioKit/blob/b7b68ce544f5009e12ef439d56130187dedaec25/Sources/soundpipe/modules/tbvcf.c

Either that, or the original code it was converted from wasn't right either, or it's simply not compatible with soundpipe as both currently stand.

This also seems to be a good reference. It has some info showing that the original emulation (http://www.csounds.com/manual/html/tbvcf.html) had some issues, and maybe shows useful info or workarounds.

Here is a VCS3 (also a diode ladder) filter emulation to also reference. Maybe it is better/easier also:
https://github.com/VoggLyster/DiodeLadderFilter/blob/master/Source/PluginProcessor.cpp

Cannot build latest version from master

Before this I was using CocoaPods, so maybe there is something I don't understand regarding Swift Packages. I get the following errors after trying to compile the project:

Cookbook project Group
Validate Project Settings Group
/Users/patryk/Developer/workspace/ios/playground/Cookbook/Cookbook/Cookbook.xcodeproj Update to recommended settings
Cookbook Project Group
Package Loading Group
terminated(128): XPC_FLAGS=0x0 LOGNAME=patryk PATH=/Applications/Xcode.app/Contents/Developer/usr/bin:/usr/bin:/bin:/usr/sbin:/sbin MallocNanoZone=0 HOME=/Users/patryk XPC_SERVICE_NAME=com.apple.xpc.launchd.oneshot.0x10000003.Xcode LD_LIBRARY_PATH=/Applications/Xcode.app/Contents/Developer/../SharedFrameworks/ CA_DEBUG_TRANSACTIONS=1 SHELL=/bin/zsh SECURITYSESSIONID=186a6 LaunchInstanceID=DEC86BD3-5BF1-4D93-9C92-45C669F6C961 CA_ASSERT_MAIN_THREAD_TRANSACTIONS=1 TMPDIR=/var/folders/p9/qr_6zy4x0pz4_q2kdt15tl5c0000gn/T/ COMMAND_MODE=unix2003 __CF_USER_TEXT_ENCODING=0x1F5:0x0:0x0 USER=patryk SSH_AUTH_SOCK=/private/tmp/com.apple.launchd.UWKGEX5Jhw/Listeners /Applications/Xcode.app/Contents/Developer/usr/bin/git -C /Users/patryk/Library/Developer/Xcode/DerivedData/Cookbook-cqoqdubjrdtiisghouzsouljfkfr/SourcePackages/repositories/AudioKit-ba28193d rev-parse --verify '9e9f1ab8312c71c02288f73034b0393faa5e0bd3^{commit}' output:
fatal: Needed a single revision
Cookbook Group
Package Loading Group
/Users/patryk/Developer/workspace/ios/playground/Cookbook/Cookbook/Cookbook.xcodeproj Missing package product 'AudioKit'
x-xcode-log://E7637D31-3A0F-4101-9E9F-C78FF1D105F1 Package resolution errors must be fixed before building
/Users/patryk/Developer/workspace/ios/playground/Cookbook/Cookbook/Cookbook.xcodeproj Missing package product 'Sliders'
x-xcode-log://E7637D31-3A0F-4101-9E9F-C78FF1D105F1 Package resolution errors must be fixed before building

After opening the project, Swift Package Manager seems to correctly fetch both packages, but I am still not able to build the project.

Using macOS Catalina 10.15.6 and Xcode Version 12.1 (12A7403).

On latest commit of 'develop' branch the Cookbook/Cookbook/Recipes directory seems to be missing

I watched @NickCulbertson's awesome YouTube video on making an instrument app with AudioKit.

After I cloned the Cookbook repo and switched to the 'develop' branch, it seems like the Recipes folder went missing.

With commit 8e1a027 the file InstrumentEXS.swift is in the Recipes directory.

However, with commit d6a74c9 the Recipes directory seems to have gone missing from Cookbook/Cookbook.

My git skills are not strong. I apologize if this is just user error on my end.

Recorder error -50 on macOS 12.3.1

Hello, I'm trying to run the Recorder demo, building it with Xcode 13.3.1 on macOS 12.3.1 (Apple Silicon).

Pressing the record button in the demo doesn't seem to work, and I'm getting these errors spammed in the console indefinitely:

[avae]            AVAEInternal.h:109   [AVAudioFile.mm:480:-[AVAudioFile writeFromBuffer:error:]: (ExtAudioFileWrite(_imp->_extAudioFile, buffer.frameLength, buffer.audioBufferList)): error -50
[general] NodeRecorder.swift:process(buffer:time:):177:Write failed: error -> The operation couldn’t be completed. (com.apple.coreaudio.avfaudio error -50.) (NodeRecorder.swift:process(buffer:time:):177)

Not sure if I'm doing anything wrong. I also tried creating a new project, adding AudioKit from SPM, setting it up with audio input in the App Sandbox settings, and using the Recorder.swift file, but I'm getting the same errors.

Other demos seem to work (like BaseTap, for example).

I also tried it on an Intel Mac running Xcode 13.0 / Big Sur, but I run into the same issues.

Fix up the recorder example

  • Add recording plots and recorded file waveforms.
  • Add the ability to export the recorded file (save to Documents).
  • Allow recording to have some effects applied to it (manually rendered).

Playing a clip does not start if one already playing

I submitted this to Stack Overflow with the #AudioKit tag a couple of weeks ago but got no replies:

It would seem a simple question, but perhaps I am missing something. I am simply trying to have a second track play on a button press, stopping the current one and playing the new one!

The first time the code is executed, the first track plays fine and the status is "playing".

The second time the code is executed, the first track stops and the status is "playing", but there is no sound; shortly after, the status is "stopped".

The third time it is executed, the desired track plays fine.

clipPlayer.stop()
do {
    try clipPlayer.load(file: file)
    clipPlayer.play()
    print("status:\(clipPlayer.status)")
} catch {
    Log(error.localizedDescription, type: .error)
}

I see this same odd behaviour in the AudioKit 5 Cookbook / AudioPlayer Playlist too, having to click the desired track twice. Surely such a basic problem would have been picked up? Or am I missing something simple?

Additional info: if the current track has already stopped by reaching its end point, clicking the new track starts it just fine.

Cookbook does not contain any examples of the Dunne `Sampler()`

The only use of a sampler in all of the Cookbook examples is AppleSampler(). Considering how excellent the native Sampler() is, we should make an example.

I will assign myself to this task as I have a good working example with a .sfz and compressed sample files.

I will just wait for the current issue with the envelope generator to be resolved before getting this out. Thanks!

Add more audio player options

In addition to the stock beats, perhaps add guitar, vocals, or even the ability to process any song in iTunes (if that's even really possible anymore) or any audio file in the user's iCloud.

how do I change the Black background for the plots in Recorder

Hi,
I am playing with the Cookbook. It is extremely useful. Thank you.
I would like to change the black background of the plots in Recorder, Tuner, etc. I tried a few things in the Cookbook files, but I was unsuccessful. Please let me know how to change it. I would like to parameterize the background color to any color, including clear.

Thank you,
-Vittal

MIDI and MIDI monitor example not working on macOS (Big Sur)

Hi,

I have been trying AudioKit but I am struggling to make it work with my MIDI devices on macOS. I have seen this issue both in the Cookbook and when following examples and applying them to my "playground app" where I've been testing the features.

I can see this error being logged in any of the "test" apps I create on macOS, and specifically only when instantiating the MIDI class:

2020-12-16 09:56:08.457326+0000 Cookbook[72432:2113908] [plugin] AddInstanceForFactory: No factory registered for id <CFUUID 0x6000032cc560> F8BB1C28-BAE8-11D6-9C31-00039315CD46
2020-12-16 09:56:08.505460+0000 Cookbook[72432:2113908]  HALC_ShellDriverPlugIn::Open: opening the plug-in failed, Error: 2003329396 (what)

Please let me know if I can add more detail to clarify the issue here. I have looked at the main AudioKit repo and it seems related to this issue: AudioKit/AudioKit#2261

Recipe request: Read notes from a local MIDI file, loop over them and play the song on external output

A basic demo of how to open a .mid file and then loop over all notes for the given tracks, displaying key number, velocity, timestamp, duration, track and channel; basically having direct access to the notes themselves for further processing. Then play the notes (the file) on an output device (preferably external, e.g. an e-piano).
Right now, it seems very hard to find the relevant information to achieve this probably very simple task. Thanks!
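
As a starting point, here is a rough sketch using AppleSequencer; the method names (loadMIDIFile(fromURL:), getMIDINoteData(), setGlobalMIDIOutput(_:)) are written from memory of the AudioKit API and should be verified against the current release, and midiFileURL / instrument are placeholders:

import AudioKit

// Rough sketch: load a MIDI file, dump its note data, and play it through an instrument.
let sequencer = AppleSequencer()
sequencer.loadMIDIFile(fromURL: midiFileURL)          // midiFileURL is a placeholder

for (trackIndex, track) in sequencer.tracks.enumerated() {
    for note in track.getMIDINoteData() {
        print("track \(trackIndex) ch \(note.channel) note \(note.noteNumber) " +
              "vel \(note.velocity) pos \(note.position.beats) dur \(note.duration.beats)")
    }
}

// Route the sequencer to a sampler (or a hardware MIDI output) and play.
sequencer.setGlobalMIDIOutput(instrument.midiIn)      // `instrument` is a placeholder
sequencer.play()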

Example of how to combine two WAV files into one?

I was wondering what the easiest way would be to load two (or more) WAV files and combine them into a single output WAV file, e.g. taking a vocal stem and combining it with an instrumental stem. I don't see a specific recipe for this, and to be honest it's a bit intimidating how many features and options there are. Could I get a couple of pointers here? If I manage to get it working, I could try to contribute the example back to the Cookbook!
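
Pending a proper recipe, a minimal sketch that mixes two files of the same sample rate and channel count by summing their samples; the URLs are assumptions, and real code would need format conversion, length padding, and clipping protection:

import AVFoundation

// Minimal sketch: read both files into float buffers, average them, write one output file.
func mix(_ urlA: URL, _ urlB: URL, into outputURL: URL) throws {
    let fileA = try AVAudioFile(forReading: urlA)
    let fileB = try AVAudioFile(forReading: urlB)
    let format = fileA.processingFormat            // assumes both files share this format

    let frames = AVAudioFrameCount(max(fileA.length, fileB.length))
    let bufA = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames)!
    let bufB = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames)!
    try fileA.read(into: bufA)
    try fileB.read(into: bufB)

    let mixBuf = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames)!
    mixBuf.frameLength = frames
    for ch in 0 ..< Int(format.channelCount) {
        for frame in 0 ..< Int(frames) {
            let a = frame < Int(bufA.frameLength) ? bufA.floatChannelData![ch][frame] : 0
            let b = frame < Int(bufB.frameLength) ? bufB.floatChannelData![ch][frame] : 0
            mixBuf.floatChannelData![ch][frame] = 0.5 * (a + b)   // average to avoid clipping
        }
    }

    let output = try AVAudioFile(forWriting: outputURL, settings: fileA.fileFormat.settings)
    try output.write(from: mixBuf)
}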

Metronome example

Hey there,

I'm pretty new to AudioKit and trying to get my head around the AppleSequencer. In the examples there is a "Drum Sequencer" that would do the job if I could change the audio files. What I did is just drag and drop a new audio file into the Samples folder and reference it from, e.g., the hi-hat track. But when I play the file I hear only a dull bass drum sound. Also, just swapping the existing file references, e.g. a bass drum with a hi-hat, doesn't work. I think I'm missing some basic knowledge about the functionality. Here is what I did inside DrumSequencer.swift:

do {
    let metroURL = Bundle.main.resourceURL?.appendingPathComponent("Samples/metronome.wav")
    let metroFile = try AVAudioFile(forReading: metroURL!)

    try drums.loadAudioFiles([metroFile])
} catch {
    Log("Files Didn't Load")
}

sequencer.clearRange(start: Duration(beats: 0), duration: Duration(beats: 100))
sequencer.debug()
sequencer.setGlobalMIDIOutput(drums.midiIn)
sequencer.enableLooping(Duration(beats: 4))
sequencer.setTempo(120)

for i in 0 ... 7 {
    sequencer.tracks[0].add(
        noteNumber: 30,
        velocity: 127,
        position: Duration(beats: Double(i)),
        duration: Duration(beats: 1))
}

Basically, I removed everything to just use the new sound. Why isn't that working?

Thanks a lot!

example of AudioPlayer fade in and fade out?

I'm trying to use automation events to automate fader gain, but I have a player with a seek bar, and if I seek, it messes up the fades, as they are scheduled with AVAudioTime based on the current time.
Is there something I am missing?
Perhaps adding an example to the cookbook may help noobs like myself.
Thanks!

Tuner not producing output values

macOS Version(s) Used to Build

macOS 12 Monterey

Xcode Version(s)

Xcode 14

Description

The app I'm working on is based on the Tuner mini app.
(The official tuner does not work properly for me there either; the three text fields show 0.0, 0.0 and -/-.)

With AudioKit v5.3.0 there are three text fields showing amplitude, pitch and note name,

e.g.:
Text("\(conductor.data.pitch, specifier: "%0.1f")")

With AudioKit 5.5.4 the fields do not work.

They still work if I revert to AudioKit 5.3.0 and add back the start and stop functions, which have been moved into BaseTap in 5.5.4.

The conductor runs and the other tuner graphics (NodeOutputView, etc.) pick up the data from it. The text fields do not.

Do you get the same fault when running the Cookbook Tuner? Maybe it is my specific setup.

Crash Logs, Screenshots or Other Attachments (if applicable)

Screenshot 2022-09-28 at 20 53 19

Can't redirect output to bluetooth earphones

After installing this Cookbook application on an iPhone X, I am able to use the app; however, I am unable to hear the output (for example, the drums output from the first mini app) using Bluetooth earphones. I can hear sound from the phone speakers. I had the same issue with the AudioKit Playgrounds and Bluetooth earphones, but I could resolve it by changing the microphone to the system microphone on the Mac. However, I didn't find any way to change the input source of the iOS application. Is there any way to make it work?
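
One thing worth checking is whether the AVAudioSession category options include the Bluetooth routes; the Cookbook's session setup uses [.defaultToSpeaker, .mixWithOthers], and a sketch of adding Bluetooth output (assuming you keep .playAndRecord) would be:

import AVFoundation

// Minimal sketch: allow .playAndRecord to route output to Bluetooth devices.
do {
    try AVAudioSession.sharedInstance().setCategory(
        .playAndRecord,
        options: [.defaultToSpeaker, .mixWithOthers, .allowBluetooth, .allowBluetoothA2DP])
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print(error)
}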

ParameterSlider not very responsive on macOS Big Sur

Testing Cookbook App built from Xcode 12.4 in macOS Big Sur 11.2, iMac (Retina 5K, 27-inch, 2019).

Not sure if this is just a SwiftUI bug or not...?

Most of the sliders (which use ParameterSlider) in the Cookbook app seem quite unresponsive in the latest macOS. The click hit zone seems small, and it doesn't seem possible to make the slider jump to a position by clicking.

e.g. Open Drum Sequencer and try dragging the slider with the mouse/trackpad - it's hit-and-miss whether the slider moves!

To fix this in ParameterSlider.swift, I ended up replacing:

ValueSlider(value: self.$parameter, in: self.range)
    .valueSliderStyle(HorizontalValueSliderStyle(
        thumbSize: CGSize(width: 20, height: 20),
        thumbInteractiveSize: CGSize(width: 44, height: 44)))

with the default SwiftUI slider:

Slider(value: self.$parameter, in: self.range)

dragging.mov

PeakLimiter does not have an audible effect

Adjusting pre-gain has no effect on the sound.

In my tests, you can set pre-gain before the app runs and it will have an effect, but you cannot adjust this parameter in realtime.

Other Apple effects work fine (PeakLimiter is an Apple effect).

Feature Request: Waveform Display

I was trying to do a waveform display of an audio file, and tried the example in the Cookbook (Table recipe).
While it does indeed display a waveform of an audio file, the method seems very slow to me.

While I was looking for a better and faster way, I found the Waveform and WavetableView in AudioKitUI, but no recipes for those. Wouldn't it be great to have some?

Distorted Sound

Within the last week or so, the sound of the Vocal Tract, Vocal Tract Operation, and Recorder recipes has gotten distorted. The Tuner shows an intermittent incoming signal from the mic too.

SwiftUI lifecycle

Any chance of instructions on how to set up the Cookbook using the SwiftUI app lifecycle? Specifically, where should this bit from the AppDelegate go?

#if os(iOS)
        do {
            Settings.bufferLength = .short
            try AVAudioSession.sharedInstance().setPreferredIOBufferDuration(Settings.bufferLength.duration)
            try AVAudioSession.sharedInstance().setCategory(.playAndRecord,
                                                            options: [.defaultToSpeaker, .mixWithOthers])
            try AVAudioSession.sharedInstance().setActive(true)
        } catch let err {
            print(err)
        }
        #endif
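
One possible arrangement (a sketch, not an official recommendation): with the SwiftUI lifecycle, that block can run in the App struct's init(), or in an AppDelegate attached via @UIApplicationDelegateAdaptor. ContentView below is a placeholder for the Cookbook's root view:

import SwiftUI
import AVFoundation
import AudioKit

@main
struct CookbookApp: App {
    init() {
        // Same session setup as the AppDelegate version above.
        #if os(iOS)
        do {
            Settings.bufferLength = .short
            try AVAudioSession.sharedInstance().setPreferredIOBufferDuration(Settings.bufferLength.duration)
            try AVAudioSession.sharedInstance().setCategory(.playAndRecord,
                                                            options: [.defaultToSpeaker, .mixWithOthers])
            try AVAudioSession.sharedInstance().setActive(true)
        } catch let err {
            print(err)
        }
        #endif
    }

    var body: some Scene {
        WindowGroup {
            ContentView()   // placeholder for the Cookbook's root view
        }
    }
}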
