
fastttcamera's Introduction

Open Source at IFTTT

FastttCamera Logo

CocoaPods Version Build Status Coverage Status

FastttCamera is a wrapper around AVFoundation that allows you to build your own powerful custom camera app without all the headaches of using AVFoundation directly.

FastttCamera now supports awesome photo filters!

FastttCamera

FastttCamera powers the camera in the new Do Camera app for iOS from IFTTT.

App Store

Major headaches that FastttCamera automatically handles for you:

AVFoundation Headaches
  • Configuring and managing an AVCaptureSession.
  • Displaying the AVCaptureVideoPreviewLayer in a sane way relative to your camera's view.
  • Configuring the state of the AVCaptureDevice and safely changing its properties as needed, such as setting the flash mode and switching between the front and back cameras.
  • Adjusting the camera's focus and exposure in response to tap gestures.
  • Zooming the camera in response to pinch gestures.
  • Capturing a full-resolution photo from the AVCaptureStillImageOutput.
Device Orientation Headaches
  • Changing the AVCaptureConnection's orientation appropriately when the device is rotated.
  • Detecting the actual orientation of the device when a photo is taken even if orientation lock is on by using the accelerometer, so that landscape photos are always rotated correctly.
  • (Optional) Returning a preview version of the image rotated to match the orientation of what was displayed by the camera preview, even if the user has orientation lock on.
  • (Optional) Asynchronously returning an orientation-normalized version of the captured image rotated so that the image orientation is always UIImageOrientationUp, useful for reliably displaying images correctly on web services that might not respect EXIF image orientation tags.
Image Processing Headaches
  • (Optional) Cropping the captured image to the visible bounds of your camera's view.
  • (Optional) Returning a scaled-down version of the captured image.
  • Processing high-resolution images quickly and efficiently without overloading the device's memory or creating app-terminating memory leaks.

FastttCamera does many operations faster than UIImagePickerController's camera, such as switching between the front and back camera, and provides you the captured photos in the format you need, returning a cropped full-resolution image as quickly as UIImagePickerController returns the raw captured image on most devices. It allows all of the flexibility of AVFoundation without the need to reinvent the wheel, so you can focus on making a beautiful custom UI and doing awesome things with photos.

While both UIImagePickerController's camera and AVFoundation give you raw images that may not even be cropped the same as the live camera preview your users see, FastttCamera gives you a full-resolution image cropped to the same aspect ratio as your live preview's viewport as well as a preview image scaled to the pixel dimensions of that viewport, whether you want a square camera, a camera sized to the full screen, or something else.

FastttCamera is also smart about handling image orientation, a notoriously tricky part of working with images from both AVFoundation and UIImagePickerController. The camera's orientation is magically detected correctly even if the user is taking landscape photos with orientation lock turned on, because FastttCamera checks the accelerometer to determine the real device orientation.

Installation

FastttCamera is available through CocoaPods. To install it, simply add the following line to your Podfile:

pod "FastttCamera"

Example Project

To run the example project, clone the repo, and run pod install from the Example directory.

Usage

Add an instance of FastttCamera as a child of your view controller. Adjust the size and layout of FastttCamera's view however you'd like, and FastttCamera will automatically adjust the camera's preview window and crop captured images to match what is visible within its bounds.

#import "ExampleViewController.h"
#import <FastttCamera.h>

@interface ExampleViewController () <FastttCameraDelegate>
@property (nonatomic, strong) FastttCamera *fastCamera;
@end

@implementation ExampleViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    _fastCamera = [FastttCamera new];
    self.fastCamera.delegate = self;
    
    [self fastttAddChildViewController:self.fastCamera];
    self.fastCamera.view.frame = self.view.frame;
}

Switch between the front and back cameras.

if ([FastttCamera isCameraDeviceAvailable:cameraDevice]) {
	[self.fastCamera setCameraDevice:cameraDevice];
}

Set the camera's flash mode.

if ([FastttCamera isFlashAvailableForCameraDevice:self.fastCamera.cameraDevice]) {
	[self.fastCamera setCameraFlashMode:flashMode];
}

Set the camera's torch mode.

if ([FastttCamera isTorchAvailableForCameraDevice:self.fastCamera.cameraDevice]) {
	[self.fastCamera setCameraTorchMode:torchMode];
}

Tell FastttCamera to take a photo.

[self.fastCamera takePicture];

Use FastttCamera's delegate methods to retrieve the captured image object after taking a photo.

#pragma mark - FastttCameraDelegate

- (void)cameraController:(FastttCamera *)cameraController
 didFinishCapturingImage:(FastttCapturedImage *)capturedImage
{
	/**
 	*  Here, capturedImage.fullImage contains the full-resolution captured
 	*  image, while capturedImage.rotatedPreviewImage contains the full-resolution
 	*  image with its rotation adjusted to match the orientation in which the
 	*  image was captured.
 	*/
}

- (void)cameraController:(FastttCamera *)cameraController
 didFinishScalingCapturedImage:(FastttCapturedImage *)capturedImage
{
	/**
 	*  Here, capturedImage.scaledImage contains the scaled-down version
 	*  of the image.
 	*/
}

- (void)cameraController:(FastttCamera *)cameraController
 didFinishNormalizingCapturedImage:(FastttCapturedImage *)capturedImage
{
	/**
 	*  Here, capturedImage.fullImage and capturedImage.scaledImage have
 	*  been rotated so that they have image orientations equal to
 	*  UIImageOrientationUp. These images are ready for saving and uploading,
 	*  as they should be rendered more consistently across different web
 	*  services than images with non-standard orientations.
 	*/
}
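
Since the normalized images have orientation UIImageOrientationUp, they can go straight to disk, the photo library, or an upload with no further rotation handling. A minimal sketch of what you might do inside that callback, using standard UIKit rather than any FastttCamera API:

// Hedged example: write the orientation-normalized image to the photo library.
UIImageWriteToSavedPhotosAlbum(capturedImage.fullImage, nil, NULL, NULL);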

Filters!

FastttCamera now supports awesome photo filters!

Filter Camera

FastttFilterCamera is a wrapper around the super-speedy GPUImage project that supports fast and easy creation of a camera app using filters based on GPUImage's Lookup Filters.

GPUImage is a powerful framework that processes images by using the GPU so quickly that it's possible to filter the live video preview straight from the camera. With all of its powerful features, it can be a bit tricky to get it configured correctly to make this work, and then you still need to solve the same image orientation and cropping challenges that you would using AVFoundation. FastttFilterCamera uses the same interface as FastttCamera, but allows you to apply a simple filter to the camera's preview or to your images.

Lookup Filters

Lookup Filters are a clever feature of GPUImage that makes creating beautiful filters as easy as editing a photo. Using your favorite photo editing application, you create whatever effect you want using actions or layers, and test it out on photos until you like the look. If you use Photoshop, check out this example lookup filter image creation file.

When you're ready, you take a special png image that has one pixel for each possible color value, and apply your desired effects to it. The Lookup Filter then filters each pixel of a photo by looking at the pixel location in your lookup image that corresponds to the color in the photo, and replacing it with the color it finds in the lookup image.

It's super fast because it doesn't need to do any on-the-fly calculations such as adjusting contrast or brightness, it just looks up the pre-calculated replacement color, so you can do more complex effects without slowing down your app. Pretty cool stuff!

The only limitation of Lookup Filters is that the effects you apply must not be affected by the location of the pixel. Blur, vignette, noise/grain, texture, edge detection, and other similar effects that are either dependent on neighboring pixels or aren't uniform over the entire image won't work, but Contrast, Levels, Color Overlay, Hue, Brightness, and Saturation would all be perfect effects to use here. Remember to save it as an uncompressed 512 x 512 png image when you're done.
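
To make the lookup mechanics concrete, here is a rough conceptual sketch of the per-pixel mapping (a simplification for illustration, not GPUImage's actual GPU shader, which also interpolates between neighboring lookup cells). It assumes the standard 512 x 512 lookup image laid out as an 8 x 8 grid of 64 x 64 tiles, where blue selects the tile and red and green select the position within it:

#include <stdint.h>

// Conceptual sketch only: replace one RGBA pixel with its lookup color.
static void applyLookupToPixel(const uint8_t *lookupRGBA, // 512x512 lookup image, 4 bytes per pixel
                               uint8_t *pixel)            // one RGBA pixel, modified in place
{
    int r = pixel[0] / 4, g = pixel[1] / 4, b = pixel[2] / 4; // quantize 0-255 down to 0-63
    int tileX = b % 8, tileY = b / 8;                         // blue picks one of the 64 tiles
    int x = tileX * 64 + r, y = tileY * 64 + g;               // red/green pick the cell inside the tile
    const uint8_t *replacement = &lookupRGBA[(y * 512 + x) * 4];
    pixel[0] = replacement[0];                                // copy the pre-calculated color back
    pixel[1] = replacement[1];
    pixel[2] = replacement[2];
}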

Filters Installation

FastttCamera provides a CocoaPods subspec for photo filtering support. To install FastttCamera with filters support, simply add the following line to your Podfile:

pod "FastttCamera/Filters"

This will also include all of the standard FastttCamera classes, so the FastttCamera/Filters subspec is the only FastttCamera CocoaPod you need to include in your Podfile.

Usage

The filters camera uses the same interface as the regular FastttCamera. To create a camera that live-filters the camera's preview, just include the Filters CocoaPods subspec, and create a FastttFilterCamera instead of a FastttCamera.

#import "ExampleViewController.h"
#import <FastttFilterCamera.h>

@interface ExampleViewController () <FastttCameraDelegate>
@property (nonatomic, strong) FastttFilterCamera *fastCamera;
@end

@implementation ExampleViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    UIImage *lookupFilterImage = [UIImage imageNamed:@"YourLookupImage"];
    _fastCamera = [FastttFilterCamera cameraWithFilterImage:lookupFilterImage];
    self.fastCamera.delegate = self;
    
    [self fastttAddChildViewController:self.fastCamera];
    self.fastCamera.view.frame = self.view.frame;
}

To change to a different filter, just change the filterImage property of the FastttFilterCamera to the lookup image of the new filter.

- (void)switchFilter
{    
    UIImage *newLookupFilterImage = [UIImage imageNamed:@"NewLookupImage"];
    self.fastCamera.filterImage = newLookupFilterImage;
}

Filtering Captured Photos

If you don't want the live camera preview to have a set filter, you can use a regular FastttCamera for photo taking, then apply your filters to the captured photos afterwards.

Include the FastttCamera/Filters CocoaPods subspec, then create a regular FastttCamera for photo capturing. After your user takes the photo, you can present filter options for them to apply to the static image on your photo edit/confirm screen, using the fastttFilteredImageWithFilter: method found in UIImage+FastttFilters.h.

#import <UIImage+FastttFilters.h>

- (void)applyFilter
{
    UIImage *preview = [UIImage imageWithContentsOfFile:self.imageFileName];
    UIImage *lookupFilterImage = [UIImage imageNamed:@"LookupImage"];
    preview = [preview fastttFilteredImageWithFilter:lookupFilterImage];
    [self.imageView setImage:preview];
}

If you let users switch between many filters, remember to apply each filter to the original UIImage rather than stacking new filters on top of an image you've already filtered (unless you intend for the filters' effects to combine), as sketched below.
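
A minimal sketch of that pattern, assuming a hypothetical originalImage property that holds the unfiltered capture:

#import <UIImage+FastttFilters.h>

- (void)switchToFilterNamed:(NSString *)lookupImageName
{
    // Always filter from the untouched original so effects don't stack
    // when the user switches between filters.
    UIImage *lookupFilterImage = [UIImage imageNamed:lookupImageName];
    self.imageView.image = [self.originalImage fastttFilteredImageWithFilter:lookupFilterImage];
}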

Filters Example Project

To run the filters example project, clone the repo, and run pod install from the FiltersExample directory.

You can see a few examples of different lookup images used for filtering in the Images.xcassets directory. To make your own custom lookup filters, start with this unfiltered lookup image and save it as an uncompressed 512 x 512 PNG image in your project once you've applied your desired effects.

Contributors

barrettj, davelyon, jhersh, lauraskelton, mdelmaestro, ssathy2, tony-yan-yu

License

FastttCamera is available under the MIT license. See the LICENSE file for more info.

Copyright 2015 IFTTT Inc.


fastttcamera's Issues

Can't Lock Orientation

I love this library, thanks for all the hard work!

I'm having a problem though. I'm unable to lock the camera in Landscape. My entire project supports LandscapeLeft/Right only. On my fastttCamera instance, I've set interfaceRotatesWithOrientation = NO.

If I start the camera holding the device in Landscape mode, everything works great. But if I hold the device in portrait mode, and start the camera, the viewport is rotated 90 degrees. I can now rotate the phone in any direction, and the viewport will stay offset 90 degrees.

Here's a dandy image taken, after starting in Landscape mode & staying in landscape mode.

photo 1

Here's a not so dandy one, taken after starting holding the phone in portrait, and then rotating back to landscape.

photo 2

Note that these are screenshots, not images saved using the delegate callbacks.

Here's a gist of my CameraViewController.m >> https://gist.github.com/jonstoked/43cb9674183f984c1006.

Any ideas?

Thanks,
Jon

Race condition on older phones

Sometimes _checkDeviceAuthorizationWithCompletion takes too long before invoking the completion handler, resulting in _session being nil at the point startRunning is called.
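
A hedged sketch of a defensive guard for this situation (not the library's actual fix), using the _session and startRunning names from the report:

- (void)startRunning
{
    // If the authorization check hasn't completed yet, the session may not exist.
    if (!_session) {
        return;
    }
    [_session startRunning];
}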

Orienting and/or scaling an image taken in landscape in a portrait-only app

As the title suggests, my app is portrait-only; however, there are instances where I would like to tell the library it is about to take landscape pictures, for example for business cards. In that case I make the camera fullscreen and encourage users to take a landscape picture. I would then like the returned picture to be rotated to portrait. Reading some of the variable names, it looks as if this is possible, but I can't seem to get it to work.

Any help is appreciated.

Swift implementation

Hi!

I'm trying to implement the camera in Swift. So far I've been able to instantiate it, take a picture, and manipulate the resulting pic.

Where I'm running into trouble is with the functions in AVCaptureDevice+FastttCamera.m. I can't seem to get the syntax right to call these functions in Swift.

var myCam = FastttCamera()
if FastttCamera.isFlashAvailableForCameraDevice(myCam.cameraDevice) {
    myCam.setCameraFlashMode(FastttCameraFlashMode.Off)
}

'FastttCamera' does not have a member named 'setCameraFlashMode'

I know I'm probably approaching this incorrectly but figured I'd ask for help since I'm stuck. If someone could take a quick look I'd really appreciate it.

First startup of the camera: it is not possible to zoom

Zooming works after returning from the background.

Could the problem be with the following? In the _setupCaptureSession method of FastttCamera.m, the block below is not executed at the time of the first startup:

            if (self.isViewLoaded && self.view.window) {
                [self startRunning];
                [self _insertPreviewLayer];
                [self _setPreviewVideoOrientation];
                [self _resetZoom];
            }

In the Xcode 7 console:
(lldb) po [(UIView *)self.view window]
nil

"use of @import when modules are disabled error"

I've just updated the pods for my project and now I'm getting a "use of @import when modules are disabled" error because of FastttCamera.h:

@import UIKit;

BUT, modules are switched ON for my project and pods:

(two screenshots of the build settings, showing modules enabled for both the project and the Pods targets)

Any idea how to fix this?

Crash on [AVCaptureSession addInput:]

There are no checks for errors when AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil] so if an error occurs, deviceInput will be nil and the following call to [_session addInput:deviceInput] will crash.

We cannot reproduce it in development, but we see several crashes in our production app on Crashlytics:

Fatal Exception: NSInvalidArgumentException
0  CoreFoundation                 0x184aa51b8 __exceptionPreprocess
1  libobjc.A.dylib                0x1834dc55c objc_exception_throw
2  AVFoundation                   0x18c287c70 -[AVCaptureSession addInput:]
3  FastttCamera                   0x100f42170 __36-[FastttCamera _setupCaptureSession]_block_invoke.150 (FastttCamera.m:436)
4  libdispatch.dylib              0x18392e1fc _dispatch_call_block_and_release
5  libdispatch.dylib              0x18392e1bc _dispatch_client_callout
6  libdispatch.dylib              0x183932d68 _dispatch_main_queue_callback_4CF
7  CoreFoundation                 0x184a52810 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__
8  CoreFoundation                 0x184a503fc __CFRunLoopRun
9  CoreFoundation                 0x18497e2b8 CFRunLoopRunSpecific
10 GraphicsServices               0x186432198 GSEventRunModal
11 UIKit                          0x18a9c57fc -[UIApplication _run]
12 UIKit                          0x18a9c0534 UIApplicationMain

Is there any reason why there are no such checks?

Thanks in advance
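
For reference, a hedged sketch of the kind of check being asked for here (not FastttCamera's actual code):

NSError *error = nil;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!deviceInput || ![_session canAddInput:deviceInput]) {
    // Bail out instead of letting addInput: throw on a nil or incompatible input.
    NSLog(@"Could not add capture device input: %@", error);
    return;
}
[_session addInput:deviceInput];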

`AVCaptureSession` `startRunning` and `stopRunning` performed on Main queue

The Apple docs state:

startRunning: "The startRunning method is a blocking call which can take some time, therefore you should perform session setup on a serial queue so that the main queue isn't blocked (which keeps the UI responsive). See AVCam for iOS for an implementation example."

stopRunning: "This method is synchronous and blocks until the receiver has completely stopped running."

I think this project is great and can be improved even more by moving some of the session setup into a dedicated NSOperationQueue for better performance. Keep it up! :)
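
As an illustration of the pattern the docs describe (a sketch, not FastttCamera's current implementation; the queue label is hypothetical), the blocking calls could be pushed onto a private serial queue:

// Created once, e.g. during setup.
dispatch_queue_t sessionQueue = dispatch_queue_create("com.example.fastttcamera.session", DISPATCH_QUEUE_SERIAL);

// Start and stop the session off the main thread so the UI stays responsive.
dispatch_async(sessionQueue, ^{
    [_session startRunning];
});

dispatch_async(sessionQueue, ^{
    [_session stopRunning];
});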

Return both the original and the filtered image when capturing

Hi there! First of all, amazing job with this camera. Really straightforward and fastttt! I wanted to see if there was a way to call takePhoto on the FastttFilterCamera and have it return both the original, unfiltered version of the image, as well as the filtered image. My best guess would be to capture the image without a filter, and then process the image with the self.fastttfilter.filter attribute to generate a second image. Not sure how much that would slow down the image capture process though.

I was going to fork and implement this on my own, but I figured I'd ask to see if there was a simpler way to do this before I embarked on that journey.

Thanks

EDIT: Just to clarify, my goal is to be able to show the user a live preview of the filters before capturing the photo, but to also allow the user to change the filter after the photo was captured without having to take a new photo.

CGBitmapContextCreate return NULL

CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(imageRef) & kCGBitmapAlphaInfoMask;

//Since iOS8 it's not allowed anymore to create contexts with unmultiplied Alpha info
if (bitmapInfo == kCGImageAlphaLast) {
    bitmapInfo = kCGImageAlphaPremultipliedLast;
}
if (bitmapInfo == kCGImageAlphaFirst) {
    bitmapInfo = kCGImageAlphaPremultipliedFirst;
}

CocoaPods install doesn't work

pod install
Analyzing dependencies
Fetching podspec for FastttCamera from ../FastttCamera.podspec
[!] Unable to find a specification for Masonry (= 0.6.1)

Swift not recognizing enum FastttCameraFlashMode

Hi - I'm trying to check the cameraFlashMode while using Swift 2.0. For some reason, I get the error
Enum case 'On' is not a member of type 'FastttCameraFlashMode?' with the following:

@IBAction func switchFlash(sender: AnyObject) {
        var flashMode: FastttCameraFlashMode
        var flashTitle: String
        switch self.camera?.cameraFlashMode {
        case FastttCameraFlashMode.On:
            flashMode = FastttCameraFlashMode.Off
            flashTitle = "Flash Off"
        default:
            flashMode = FastttCameraFlashMode.On
            flashTitle = "Flash On"
        }

        if let camera = camera {
            if camera.isFlashAvailableForCurrentDevice() {
                camera.cameraFlashMode = flashMode
                flashButton.setTitle(flashTitle, forState: UIControlState.Normal)
            }
        }
    }

The odd thing is that I can set the flashMode perfectly fine, but the switch is not working. The same happens for Off and Auto. I'm not sure if this is an iOS 9 issue or a Fasttt issue. If it's an Apple issue, then I can file a Radar.

Thanks,
Ahan

Captured image has wrong size / aspect ratio

Hi! Great library!
I have a problem with the output image. I am trying to get a square photo but the output is slightly off.
These are the logs from the various delegate methods:

fastCamera.view.frame: (0.0, 80.0, 320.0, 320.0)
RAW JPEG Data size: Optional((2448.0, 3264.0))
capturedImage size: (2448.0, 2452.0)
Normalized full capturedImage size: (2448.0, 2452.0)
Normalized scaled capturedImage size: (320.0, 321.0)

Any idea why the image is slightly wider? Thanks!
I am testing on an iPhone 5S running iOS 9.1

Pinch To Zoom

Hi,

Is there any way to add pinch to zoom support to FastttCamera?

Love this btw :)

Square preview image

How do I automatically crop the camera to a square UIView as it says in the README?

Preferably in swift.

add a framework header file

I use it in Swift, and I have to import all the files one by one into the bridging header. Why not add a header such as FasttCameraKit.h which imports all the public header files?

Support for asynchronous processing of images

I'm working on an app that needs rapid-fire photo taking. This isn't possible with the current pod, given that the image needs to be processed before isCapturingImage is set back to NO.

Why not make that processing asynchronous, with a queue to handle the photos as they come in from the camera? I'm probably going to try to implement this myself with your library, but I'm wondering why you decided against it. Or maybe it is possible and I'm just missing something.

Lag on preview

I noticed a lag on the preview: it's slow, not fluid, and a bit laggy on my iPhone 6. With a homemade framework similar to this one, I didn't have a bug like that.

Filter image example does not work

The sample code does not give a preview as expected:

- (void)viewDidLoad {
    [super viewDidLoad];

    _filterCamera = [FastttFilterCamera cameraWithFilterImage:[UIImage imageNamed:@"MonochomeHighContrast"]];
    self.filterCamera.delegate = self;
    [self fastttAddChildViewController:self.filterCamera];
    self.filterCamera.view.frame = self.view.frame;
}

Instead, setting the filterImage once more works:

- (void)viewDidLoad {
    [super viewDidLoad];

    _filterCamera = [FastttFilterCamera cameraWithFilterImage:[UIImage imageNamed:@"MonochomeHighContrast"]];
    self.filterCamera.delegate = self;
    [self fastttAddChildViewController:self.filterCamera];
    self.filterCamera.view.frame = self.view.frame;
    self.filterCamera.filterImage = [UIImage imageNamed:@"MonochomeHighContrast"];
}

Why?

Tested on iPhone 6 Plus, installed via Pod.

Photo capture crashes on iPhone 6 Plus

The delegate method returns a capturedImage whose size is 2448 x 2448 on an iPhone 6 Plus device, which causes a memory warning and then a crash. Could you help me or offer some advice?

  • (void)cameraController:(id)cameraController didFinishCapturingImage:(FastttCapturedImage *)capturedImage

Main Thread Checker: UI API called on a background thread

It seems like FastttCamera is using a background thread to access UIKit API. I enabled the Main Thread Checker in Xcode 9 (see the Apple docs).

This happens at FastttCamera.m line 610.

Here's the backtrace after pausing execution.

=================================================================
Main Thread Checker: UI API called on a background thread: -[UIView bounds]
PID: 2754, TID: 1011164, Thread name: (none), Queue name: com.apple.root.default-qos, QoS: 21
Backtrace:
4   FastttCamera                        0x00000001026d7d98 __107-[FastttCamera _processImage:withCropRect:maxDimension:fromCamera:needsPreviewRotation:previewOrientation:]_block_invoke + 708
5   libdispatch.dylib                   0x0000000109cc12cc _dispatch_call_block_and_release + 24
6   libdispatch.dylib                   0x0000000109cc128c _dispatch_client_callout + 16
7   libdispatch.dylib                   0x0000000109ccd3dc _dispatch_queue_override_invoke + 984
8   libdispatch.dylib                   0x0000000109cd29d0 _dispatch_root_queue_drain + 624
9   libdispatch.dylib                   0x0000000109cd26f4 _dispatch_worker_thread3 + 136
10  libsystem_pthread.dylib             0x0000000185beb06c _pthread_wqthread + 1268
11  libsystem_pthread.dylib             0x0000000185beab6c start_wqthread + 4
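
The usual remedy for this class of warning (a sketch of the general pattern, not necessarily how FastttCamera addressed it) is to read the UIKit geometry on the main queue and hand only plain values to the background processing block:

dispatch_async(dispatch_get_main_queue(), ^{
    // UIKit geometry is read on the main thread...
    CGRect previewBounds = self.view.bounds;
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_DEFAULT, 0), ^{
        // ...and only the plain CGRect value crosses over to the background work.
        NSLog(@"processing capture with crop bounds %@", NSStringFromCGRect(previewBounds));
        // (image cropping/scaling would go here)
    });
});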

Video feed & photo taken are wrong orientation when iPad is faceUp

Laura thanks for your awesome work! This is a great & useful piece of code you have written.

Fixed by pull request #63

iPads with deviceOrientation of faceUp are often in landscape orientation. The code currently assumes portrait orientation. This pull request solves the issue by using statusBarOrientation for faceUp and faceDown deviceOrientation (only).

'Expected a Type' error since Xcode 7

After opening project on Xcode 7 and updating pods - I received an 'Expected a Type' error on this line:

- (CGRect)fastttCropRectFromPreviewLayer:(AVCaptureVideoPreviewLayer *)previewLayer;

I thought Cocoapods might be to blame so I've reverted pod to the version I was using before (0.2.9) to no effect. I've removed all pods from the project, cleaned and then re-added - but no joy. I'm at a bit of a loss!

Image below is what I see - and I can provide more detail if required. Any thoughts would be much appreciated. Thanks.

image

Support for non-default capture session presets

Hello

_session.sessionPreset = AVCaptureSessionPresetPhoto;

AVCaptureSessionPresetPhoto
AVCaptureSessionPresetHigh
AVCaptureSessionPresetMedium
AVCaptureSessionPresetLow
AVCaptureSessionPreset352x288
AVCaptureSessionPreset640x480
...
AVCaptureSessionPresetInputPriority

The ability to change this parameter is really needed. Please expose it in a separate method or property.
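
For anyone patching this locally, setting the preset on the underlying session is plain AVFoundation (the sketch below is not an API FastttCamera currently exposes):

if ([_session canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    _session.sessionPreset = AVCaptureSessionPreset640x480;
}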

Add support for Torch instead of Flash?

This is actually not a huge change and can of course just be bolted on by a user but I figure it might be worth it to include an option to turn on the Torch before taking a picture rather than just flashing it (I do it this way in my apps because I find it lets me compose my shot better on the first try, since the picture has time to refocus nicely and you can adjust your angle to minimize glare). Any thoughts?

Camera not fully initializing in Swift project

Strange one here, hoping you can shed some light...

This block of code in FastttCamera.m doesn't execute in my Swift project:

if (self.isViewLoaded && self.view.window) {
    [self startRunning];
    [self _insertPreviewLayer];
    [self _setPreviewVideoOrientation];
    [self _resetZoom];
}

I've verified that self.isViewLoaded is true. However, when I po self.view.window, the response is property 'window' not found on object of type 'UIView *'

OK, fine, that would explain why the block doesn't run. If I remove the self.view.window check, everything seems to function normally again. However, within the example app provided, self.view.window is also not found on view when I po it in lldb (same error as in my app), BUT the block still executes.

To test this, set a breakpoint here, run the example app, and then po self.view.window.

The example app does work, and my app mostly works without this although zooming is broken because FastttZoom.maxScale is not initialized. Any ideas what would cause this or how to fix?

Pinch to zoom doesn't appear to be working :/

I just built this awesome library into my project and everything seems to be working great, except that zooming does not seem to be working. When I use a pinch gesture, nothing happens at all. Tap to focus works, but literally nothing occurs when I pinch. I have not messed with any of the setup, but I checked to ensure that zooming was enabled, and it is. Any thoughts on why this feature isn't working for me? iPhone 5S running iOS 9.

Filter image causes high memory usage

When I use FastttCamera's filter image, my app's memory usage becomes very high, and this memory can't be released.

I tested the demo and it shows the same problem. Please help me solve this, thanks so much.
