pixelkit's Introduction

Check out AsyncGraphics, inspired by PixelKit & RenderKit.

PixelKit

Live Graphics for iOS, macOS and tvOS
Runs on RenderKit, powered by Metal

PixelKit combines custom shaders, Metal Performance Shaders, Core Image filters, and Vision to create tools for real-time rendering.

Examples: Camera Effects - Green Screen
Info: Coordinate Space - Blend Operators - Effect Convenience Funcs - High Bit Mode

CameraPIX DepthCameraPIX ImagePIX VideoPIX ScreenCapturePIX StreamInPIX SlopePIX
ColorPIX CirclePIX RectanglePIX PolygonPIX ArcPIX LinePIX GradientPIX StackPIX
NoisePIX TextPIX MetalPIX TwirlPIX FeedbackPIX DelayPIX SharpenPIX StreamOutPIX
LevelsPIX BlurPIX EdgePIX ThresholdPIX QuantizePIX TransformPIX KaleidoscopePIX
ChannelMixPIX ChromaKeyPIX CornerPinPIX ColorShiftPIX FlipFlopPIX RangePIX StarPIX
SepiaPIX ConvertPIX ReducePIX ClampPIX FreezePIX FlarePIX AirPlayPIX RecordPIX
BlendPIX CrossPIX LookupPIX DisplacePIX RemapPIX ReorderPIX ResolutionPIX CropPIX
BlendsPIX LumaLevelsPIX LumaBlurPIX LumaTransformPIX TimeMachinePIX ArrayPIX

Install

Swift Package

.package(url: "https://github.com/heestand-xyz/PixelKit", from: "3.0.0")
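If you are declaring the dependency in a Package.swift manifest rather than through Xcode's UI, a minimal sketch might look like this (the package name, target name, and platform versions below are placeholder assumptions, not PixelKit requirements):

```swift
// swift-tools-version:5.5
import PackageDescription

let package = Package(
    name: "MyApp", // placeholder name
    platforms: [.iOS(.v14), .macOS(.v11)], // assumed minimum platforms
    dependencies: [
        .package(url: "https://github.com/heestand-xyz/PixelKit", from: "3.0.0")
    ],
    targets: [
        .target(name: "MyApp", dependencies: ["PixelKit"])
    ]
)
```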

Setup

SwiftUI

import SwiftUI
import PixelKit

struct ContentView: View {
    
    @StateObject var circlePix = CirclePIX()
    @StateObject var blurPix = BlurPIX()
    
    var body: some View {
        PixelView(pix: blurPix)
            .onAppear {
                blurPix.input = circlePix
                blurPix.radius = 0.25
            }
    }
}

UIKit

import UIKit
import PixelKit

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        let circlePix = CirclePIX()

        let blurPix = BlurPIX()
        blurPix.input = circlePix
        blurPix.radius = 0.25

        let finalPix: PIX = blurPix
        finalPix.view.frame = view.bounds
        view.addSubview(finalPix.view)
    }
}

Resolution

In PixelKit all PIXs have a resolution. Some PIXs have defined resolutions (defaulting to .auto) and some PIXs have derived resolutions.

The .auto resolution fills the view and derives the correct pixel resolution from the view's point size. If a view is 100x100 points, the resolution will be 200x200 pixels on macOS (at a 2x display scale) and 300x300 pixels on an iPhone (at 3x).
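The point-to-pixel mapping above can be sketched in plain Swift; the helper below is illustrative (not a PixelKit API), with the 2x and 3x display scales as assumptions:

```swift
import Foundation

// Hypothetical helper (not a PixelKit API): converts a view size in points
// to a pixel resolution, the way an .auto resolution resolves on a display
// with the given scale factor.
func autoResolution(width: Int, height: Int, displayScale: Int) -> (width: Int, height: Int) {
    (width * displayScale, height * displayScale)
}

// A 100x100 point view:
let mac = autoResolution(width: 100, height: 100, displayScale: 2)   // assumed Retina Mac, 2x
let phone = autoResolution(width: 100, height: 100, displayScale: 3) // assumed iPhone, 3x
print(mac)   // (width: 200, height: 200)
print(phone) // (width: 300, height: 300)
```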

Import the resolution package to work with resolutions:

import Resolution

You can multiply and divide resolutions with a CGFloat or Int.

There are predefined resolutions like ._1080p & ._4K.

Rendered Image

.renderedImage // UIImage or NSImage
.renderedTexture // MTLTexture

Example: Camera Effects

import SwiftUI
import PixelKit

class ViewModel: ObservableObject {
    
    let camera: CameraPIX
    let levels: LevelsPIX
    let colorShift: ColorShiftPIX
    let blur: BlurPIX
    let circle: CirclePIX
    
    let finalPix: PIX
    
    init() {
        
        camera = CameraPIX()
        camera.cameraResolution = ._1080p

        levels = LevelsPIX()
        levels.input = camera
        levels.brightness = 1.5
        levels.gamma = 0.5

        colorShift = ColorShiftPIX()
        colorShift.input = levels
        colorShift.saturation = 0.5

        blur = BlurPIX()
        blur.input = colorShift
        blur.radius = 0.25

        circle = CirclePIX(at: .square(1080))
        circle.radius = 0.45
        circle.backgroundColor = .clear

        finalPix = blur & (camera * circle)
    }
}

struct ContentView: View {
    
    @StateObject var viewModel = ViewModel()
    
    var body: some View {
        PixelView(pix: viewModel.finalPix)
    }
}

This can also be done with Effect Convenience Funcs:

let pix = CameraPIX().pixBrightness(1.5).pixGamma(0.5).pixSaturation(0.5).pixBlur(0.25)

Remember to add NSCameraUsageDescription to your Info.plist

Example: Green Screen

import RenderKit
import PixelKit

let cityImage = ImagePIX()
cityImage.image = UIImage(named: "city")

let supermanVideo = VideoPIX()
supermanVideo.load(fileNamed: "superman", withExtension: "mov")

let supermanKeyed = ChromaKeyPIX()
supermanKeyed.input = supermanVideo
supermanKeyed.keyColor = .green

let blendPix = BlendPIX()
blendPix.blendingMode = .over
blendPix.inputA = cityImage
blendPix.inputB = supermanKeyed

let finalPix: PIX = blendPix
finalPix.view.frame = view.bounds
view.addSubview(finalPix.view)

This can also be done with Blend Operators and Effect Convenience Funcs:

let pix = cityImage & supermanVideo.pixChromaKey(.green)

Example: Depth Camera

import RenderKit
import PixelKit

let cameraPix = CameraPIX()
cameraPix.camera = .front

let depthCameraPix = DepthCameraPIX.setup(with: cameraPix)

let levelsPix = LevelsPIX()
levelsPix.input = depthCameraPix
levelsPix.inverted = true

let lumaBlurPix = cameraPix.pixLumaBlur(pix: levelsPix, radius: 0.1)

let finalPix: PIX = lumaBlurPix
finalPix.view.frame = view.bounds
view.addSubview(finalPix.view)

The DepthCameraPIX was added in PixelKit v0.8.4 and requires an iPhone X or newer.

Note: use the setup(with:filter:) method of DepthCameraPIX.
It takes care of orientation and color, and enables depth on the CameraPIX.

To gain access to depth values outside of the 0.0 and 1.0 bounds,
enable 16-bit mode like this: PixelKit.main.render.bits = ._16

Example: Multi Camera

let cameraPix = CameraPIX()
cameraPix.camera = .back

let multiCameraPix = MultiCameraPIX.setup(with: cameraPix, camera: .front)

let movedMultiCameraPix = multiCameraPix.pixScale(by: 0.25).pixTranslate(x: 0.375 * (9 / 16), y: 0.375)

let finalPix: PIX = cameraPix & movedMultiCameraPix
finalPix.view.frame = view.bounds
view.addSubview(finalPix.view)

Note: MultiCameraPIX requires iOS 13.

Coordinate Space

The PixelKit coordinate space is normalized to the vertical axis (1.0 in height) with the origin (0.0, 0.0) in the center.
Note that compared to native UIKit and SwiftUI views, the vertical axis is flipped and the origin is moved; this is more convenient when working with graphics in PixelKit. A full rotation is defined by 1.0.

Center: CGPoint(x: 0, y: 0)
Bottom Left: CGPoint(x: -0.5 * aspectRatio, y: -0.5)
Top Right: CGPoint(x: 0.5 * aspectRatio, y: 0.5)

Tip: Resolution has an .aspect property:
let aspectRatio: CGFloat = Resolution._1080p.aspect
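As an illustrative sketch (plain Swift, not a PixelKit API), mapping a UIKit-style point (origin top-left, y down, in points) into this coordinate space looks like:

```swift
import Foundation

// Hypothetical helper: maps a point in a view (origin top-left, y down)
// to PixelKit's normalized space (origin center, y up, height = 1.0).
func toPixelKitSpace(_ point: CGPoint, in viewSize: CGSize) -> CGPoint {
    let aspectRatio = viewSize.width / viewSize.height
    let x = (point.x / viewSize.width - 0.5) * aspectRatio
    let y = 0.5 - point.y / viewSize.height // flip the vertical axis
    return CGPoint(x: x, y: y)
}

let size = CGSize(width: 200, height: 100) // 2:1 aspect ratio
print(toPixelKitSpace(CGPoint(x: 100, y: 50), in: size)) // center -> (0.0, 0.0)
print(toPixelKitSpace(CGPoint(x: 0, y: 100), in: size))  // bottom left -> (-1.0, -0.5)
```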

Blend Operators

A quick and convenient way to blend PIXs
These are the supported BlendingMode operators:

& !& + - * ** !** % ~ °
.over .under .add .subtract .multiply .power .gamma .difference .average .cosine
<> >< ++ -- <-> >-< +-+
.minimum .maximum .addWithAlpha .subtractWithAlpha .inside .outside .exclusiveOr

let blendPix = (CameraPIX() !** NoisePIX(at: .fullHD(.portrait))) * CirclePIX(at: .fullHD(.portrait))

The default global blend operator fill mode is .fit, change it like this:
PIX.blendOperators.globalPlacement = .fill
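These operators are ordinary Swift operator overloads that build a blend effect from their two operands. A simplified sketch of the pattern, using toy stand-in types rather than PixelKit's actual implementation:

```swift
// Toy model of how a blend operator can construct an effect node.
// `Pix`, `BlendMode`, and `BlendPix` here are illustrative stand-ins.
enum BlendMode { case over, add, multiply }

class Pix {
    let name: String
    init(name: String) { self.name = name }
}

class BlendPix: Pix {
    let mode: BlendMode
    let inputA: Pix
    let inputB: Pix
    init(mode: BlendMode, inputA: Pix, inputB: Pix) {
        self.mode = mode
        self.inputA = inputA
        self.inputB = inputB
        super.init(name: "blend")
    }
}

// `&` blends "over" and `*` multiplies, mirroring the table above.
func & (lhs: Pix, rhs: Pix) -> BlendPix { BlendPix(mode: .over, inputA: lhs, inputB: rhs) }
func * (lhs: Pix, rhs: Pix) -> BlendPix { BlendPix(mode: .multiply, inputA: lhs, inputB: rhs) }

let blended = Pix(name: "camera") & Pix(name: "circle")
print(blended.mode) // over
```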

Effect Convenience Funcs

  • .pixScaleResolution(to: ._1080p * 0.5) -> ResolutionPIX
  • .pixScaleResolution(by: 0.5) -> ResolutionPIX
  • .pixBrightness(0.5) -> LevelsPIX
  • .pixDarkness(0.5) -> LevelsPIX
  • .pixContrast(0.5) -> LevelsPIX
  • .pixGamma(0.5) -> LevelsPIX
  • .pixInvert() -> LevelsPIX
  • .pixOpacity(0.5) -> LevelsPIX
  • .pixBlur(0.5) -> BlurPIX
  • .pixEdge() -> EdgePIX
  • .pixThreshold(at: 0.5) -> ThresholdPIX
  • .pixQuantize(by: 0.5) -> QuantizePIX
  • .pixPosition(at: CGPoint(x: 0.5, y: 0.5)) -> TransformPIX
  • .pixRotate(by: 0.5) -> TransformPIX
  • .pixRotate(byRadians: .pi) -> TransformPIX
  • .pixRotate(byDegrees: 180) -> TransformPIX
  • .pixScale(by: 0.5) -> TransformPIX
  • .pixKaleidoscope() -> KaleidoscopePIX
  • .pixTwirl(0.5) -> TwirlPIX
  • .pixSwap(.red, .blue) -> ChannelMixPIX
  • .pixChromaKey(.green) -> ChromaKeyPIX
  • .pixHue(0.5) -> ColorShiftPIX
  • .pixSaturation(0.5) -> ColorShiftPIX
  • .pixCrop(CGRect(x: 0.25, y: 0.25, width: 0.5, height: 0.5)) -> CropPIX
  • .pixFlipX() -> FlipFlopPIX
  • .pixFlipY() -> FlipFlopPIX
  • .pixFlopLeft() -> FlipFlopPIX
  • .pixFlopRight() -> FlipFlopPIX
  • .pixRange(inLow: 0.0, inHigh: 0.5, outLow: 0.5, outHigh: 1.0) -> RangePIX
  • .pixRange(inLow: .clear, inHigh: .gray, outLow: .gray, outHigh: .white) -> RangePIX
  • .pixSharpen() -> SharpenPIX
  • .pixSlope() -> SlopePIX
  • .pixVignetting(radius: 0.5, inset: 0.25, gamma: 0.5) -> LumaLevelsPIX
  • .pixLookup(pix: pixB, axis: .x) -> LookupPIX
  • .pixLumaBlur(pix: pixB, radius: 0.5) -> LumaBlurPIX
  • .pixLumaLevels(pix: pixB, brightness: 2.0) -> LumaLevelsPIX
  • .pixDisplace(pix: pixB, distance: 0.5) -> DisplacePIX
  • .pixRemap(pix: pixB) -> RemapPIX

Keep in mind that these funcs will create new PIXs.
Be careful of overloading GPU memory, some funcs create several PIXs.

High Bit Mode

Some effects like DisplacePIX and SlopePIX can benefit from a higher bit depth.
The default is 8 bits. Change it like this: PixelKit.main.render.bits = ._16

Enable high bit mode before you create any PIXs.

Note: resources do not support higher bits yet.
There is currently some gamma offset with resources.

MetalPIXs

let metalPix = MetalPIX(at: ._1080p, code:
    """
    pix = float4(u, v, 0.0, 1.0);
    """
)
let metalEffectPix = MetalEffectPIX(code:
    """
    float gamma = 0.25;
    pix = pow(input, 1.0 / gamma);
    """
)
metalEffectPix.input = CameraPIX()
let metalMergerEffectPix = MetalMergerEffectPIX(code:
    """
    pix = pow(inputA, 1.0 / inputB);
    """
)
metalMergerEffectPix.inputA = CameraPIX()
metalMergerEffectPix.inputB = ImagePIX("img_name")
let metalMultiEffectPix = MetalMultiEffectPIX(code:
    """
    float4 inPixA = inTexs.sample(s, uv, 0);
    float4 inPixB = inTexs.sample(s, uv, 1);
    float4 inPixC = inTexs.sample(s, uv, 2);
    pix = inPixA + inPixB + inPixC;
    """
)
metalMultiEffectPix.inputs = [ImagePIX("img_a"), ImagePIX("img_b"), ImagePIX("img_c")]

Uniforms:

var lumUniform = MetalUniform(name: "lum")
let metalPix = MetalPIX(at: ._1080p, code:
    """
    pix = float4(in.lum, in.lum, in.lum, 1.0);
    """,
    uniforms: [lumUniform]
)
lumUniform.value = 0.5

Notes:

  • To gain camera access on macOS, check Camera under App Sandbox in your Xcode project settings, under Capabilities.

Inspired by TouchDesigner. Created by Anton Heestand XYZ.

pixelkit's People

Contributors

heestand-xyz, janakmshah


pixelkit's Issues

Video audio not included after export

I'm sorry about this issue,

but when I export the video, it exports only the video without the audio channel.

Note: if I set this flag to true,
RecordPIX.recordAudio = true

it will record the microphone audio along with the video's audio.
Is it possible to export the video and its audio without enabling the microphone?

Question: Support for CoreML / Vision?

I was hacking together a demo where I had to do portrait segmentation (Zoom like background removal) without chroma key.
I decided to use Deeplabv3 and found this project: https://github.com/jsharp83/MetalCamera which has all the parts properly set up. With a few minor changes I was able to make it macOS compatible as well.

This would make a great addition to PixelKit.

My use case is: remove the background from realtime video and merge it with a custom image/video.

Relevant File:
https://github.com/jsharp83/MetalCamera/blob/master/MetalCamera/Classes/CoreML/CoreMLClassifierHandler.swift

Unresolved identifier 'ImagePIXUI'

I've added PixelKit to my Podfile and installed it. Xcode does pick it up and I can import it and use the UIKit related part without problems. However, I can't use SwiftUI related types as those don't get resolved by Xcode.

Invalid Metal draw Size

On iOS 16 I get

CAMetalLayer ignoring invalid setDrawableSize width=0.000000 height=0.000000

or


2023-07-10 15:15:22.745864-0400 app[41287:14957907] [Graphics] Invalid size provided to UIGraphicsBeginImageContext(): size={353, 0}, scale=3.000000
2023-07-10 15:15:22.746013-0400 app[41287:14957907] [Unknown process name] CGContextGetType: invalid context 0x0. If you want to see the backtrace, please set CG_CONTEXT_SHOW_BACKTRACE environmental variable.
-----> (1059.0, 0.0) (128.0, 128.0) (353.0, 0.0)

code:

struct Modified<V: View>: View {
	let child: V

	var body: some View {
		let view = ViewPIX { child }
		let pix = view.pixEdge()
		return PixelView(pix: pix)
	}
}

.gitmodules probably wrong URL

.gitmodules contains:

https://github.com/heestand-xyz/MetlShaders.git

but it looks like it should be:

https://github.com/heestand-xyz/MetalShaders.git

('Metal' instead of 'Metl')

Memory Leak (iPhone XS Max crash)

Screen Shot 2020-04-22 at 4 50 18 AM

I'm using the following code to create an initial alpha layer called threshold and then multiplying it with border color. The code below is only run once.

let fullscreenBackground = ColorPIX(at: .fullscreen)
fullscreenBackground.color = .clear

let blended: BlendPIX = fullscreenBackground & imagePix

let mask: ReorderPIX = ReorderPIX()
mask.inputA = blended
mask.inputB = blended
mask.redChannel = .alpha
mask.greenChannel = .alpha
mask.blueChannel = .alpha
mask.alphaChannel = .one

threshold = blended._blur(0.1)._threshold(0.1)._lumaToAlpha()

let borderColor = LiveColor(color)
let colorEdge = threshold! * borderColor

let final: PIX = colorEdge & imagePix
final.delegate = self

Later by changing the color I'm calling this following method on a debounce of 2 seconds:

guard let threshold = threshold else { return }
let borderColor = LiveColor(color)
let colorEdge = threshold * borderColor

let final: PIX = colorEdge & imagePix
final.delegate = self

Metal View res not set

Hi,

Trying to set this up with SceneKit; I think I am missing one or two steps in the init.

Crash on the below in ScenePIX.swift , line 91
renderer.render(atTime: 0, viewport: viewport, commandBuffer: commandBuffer, passDescriptor: renderPassDescriptor)
<--- renderer seems nil <--- Thread 1: Fatal error: Unexpectedly found nil while implicitly unwrapping an Optional value

Log:
RenderKit ready to render.
PixelKit #00 WARNING [1] ScenePIX Render >>> Metal View res not set
PixelKit #1 INFO [1] ScenePIX Res >>> Applied: custom(w: 1500, h: 1000) [1500x1000]
Fatal error: Unexpectedly found nil while implicitly unwrapping an Optional value: file PixelKit/ScenePIX.swift, line 91

Code
DispatchQueue.global(qos: .background).async {
    let scenePix = ScenePIX(at: .custom(w: 1500, h: 1000))
    scenePix.setup(cameraNode: self.sceneView.currentPointOfView()!, scene: self.sceneView.currentScene()!, sceneView: self.sceneView)
    let res: Resolution = .custom(w: 1500, h: 1000)
    let circle = CirclePIX(at: res)
    circle.radius = 0.45
    circle.bgColor = .clear
    let finalPix: PIX = scenePix * circle
    finalPix.view.frame = self.view.bounds
    self.view.addSubview(finalPix.view)
}

May I have a hint where to look, please? There is no readme or examples around ScenePIX.

Thank you very much.

Low frame rate

Hi all,

We are trying to use the CameraPIX together with a MOV that has an alpha channel.

let camera = CameraPIX()
camera.camera = .front

let overlay = VideoPIX()
overlay.load(fileNamed: "FreeSmokeYoutube", withExtension: "mov") { done in
    let blend = BlendPIX()
    blend.blendMode = .addWithAlpha
    blend.placement = .aspectFill
    blend.extend = .mirror
    blend.inputA = camera
    blend.inputB = overlay

    self.record = RecordPIX()
    self.record?.input = blend

    try? self.record?.startRec(name: "tempvid")
}

The outcome, though, has a really low frame rate. This is basically the default configuration, so what could it be?

Thanks

How to install?

Hi,

Sorry, I have been out of the Swift/SPM loop for two years, as I was still on Mojave, and a lot has changed; querying Google gives various options, none of which seem to work.

First off, I'm on Catalina and using Xcode 12.4 (AFAIK the last version compatible with Catalina).
My project is a basic SwiftUI macOS app.

In Xcode I went to File > Swift Packages > Add Package Dependency... and searched for PixelKit.

In the version selection panel I tried a few things:

  • Exact == 2.0.4
  • Up to Next Major from 2.0.4
  • branch "main"

All returned the following:

[screenshot]

Very interesting project.

Thanks

Text generation?

Hi, is it planned to add a function for generating text as a texture using the Metal API? What methods do you know of for generating text using Metal? I would be very happy if you gave an example. My goal is to overlay a fire effect on text. Thanks for the answer.

ScenePIX setup

Hi,

first off, congrats for the great work done.

I want to apply flarePIX into Scenekit via ScenePIX , but ScenePIX is undocumented.

I see

public func setup(cameraNode: SCNNode, scene: SCNScene, sceneView: SCNView)

what is the correct way to apply scenePIX to Scenekit?

Thanks.

I want to save the processed photos to the album

This is a very good framework. I want to save the processed photos to the photo album, but I can't find the corresponding method. I see the code directly loads a view:

let image = ImagePIX()
image.image = self.myImage
let edge = EdgePIX()
edge.input = image
edge.strength = 3
edge.view.frame = myImageView.bounds
self.view.addSubview(edge.view)

I can't get image information out of edge.view.

install PixelKit error

I'd love to use PixelKit, but I get: Unable to find a pod with name, author, summary, or description matching PixelKit.

Compatibility with Catalina ?

Hi heestand,

Thanks for the fix regarding the swift version, it still won't build, here are the errors, I have linked to the source files:

Screenshots of the errors with permalinks to source files:

[screenshot]

public convenience init(blendMode: BlendMode = .add,

[screenshot]

[screenshot]

[screenshot]

func extract(_ name: String, in text: String) -> Double? {

And this still tries to compile even with the safeguards (no idea how that's possible; I will thoroughly check my Xcode project):

[screenshot]

Sorry it's very ugly 😅. Do you think it will be possible to remediate those issues to be compatible with an older SDK than Big Sur?

Thanks

SPM error

Hi Hexagons,

Trying to install the package and getting the following error. Can you please advise?

because PixelKit >=1.0.13 contains incompatible tools version and root depends on PixelKit 1.1.2..<1.2.0, version solving failed.
