Come learn about what’s new in iOS 11 with AVFoundation, as David Okun walks us through new ways to capture still images, live photos, and depth maps via his open source framework, Lumina.
```swift
// AVCapturePhotoSettings
init()
var flashMode: AVCaptureDevice.FlashMode
var isAutoStillImageStabilizationEnabled: Bool
var isHighResolutionPhotoEnabled: Bool
var isAutoDualCameraFusionEnabled: Bool // iOS 10.2+
```
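A minimal sketch of putting these settings to work for a single capture. It assumes `photoOutput` is an `AVCapturePhotoOutput` already added to a running session and that `self` conforms to `AVCapturePhotoCaptureDelegate`; the method name `captureStillImage()` is just for illustration:

```swift
func captureStillImage() {
    let settings = AVCapturePhotoSettings()

    // Only request flash modes the output actually supports.
    if photoOutput.supportedFlashModes.contains(.auto) {
        settings.flashMode = .auto
    }
    settings.isAutoStillImageStabilizationEnabled = true

    // High-resolution capture must also be enabled on the output itself.
    if photoOutput.isHighResolutionCaptureEnabled {
        settings.isHighResolutionPhotoEnabled = true
    }

    photoOutput.capturePhoto(with: settings, delegate: self)
}
```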
```swift
// AVCaptureResolvedPhotoSettings
var livePhotoMovieDimensions: CMVideoDimensions
var isFlashEnabled: Bool
var isStillImageStabilizationEnabled: Bool
var isDualCameraFusionEnabled: Bool // iOS 10.2+
```
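The resolved settings are read-only and tell you what the capture pipeline actually decided to do for a given request. One place to inspect them is the `willBeginCaptureFor` callback of `AVCapturePhotoCaptureDelegate`; the prints below are only illustrative:

```swift
func photoOutput(_ output: AVCapturePhotoOutput,
                 willBeginCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings) {
    print("Flash will fire: \(resolvedSettings.isFlashEnabled)")
    print("Stabilization applied: \(resolvedSettings.isStillImageStabilizationEnabled)")

    // Non-zero dimensions mean a Live Photo movie will be captured alongside the still.
    let dimensions = resolvedSettings.livePhotoMovieDimensions
    if dimensions.width > 0 && dimensions.height > 0 {
        print("Live Photo movie: \(dimensions.width)x\(dimensions.height)")
    }
}
```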
```swift
// AVCapturePhoto (iOS 11+)
var previewPixelBuffer: CVPixelBuffer?
var metadata: [String : Any]
var resolvedSettings: AVCaptureResolvedPhotoSettings
var photoCount: Int
var depthData: AVDepthData?
```
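These `AVCapturePhoto` properties all surface in the single iOS 11 delegate callback, which replaces the separate sample-buffer callbacks from iOS 10. A sketch, with error handling kept to a minimum:

```swift
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    guard error == nil else { return }

    // Encoded image data, ready to write to disk or turn into a UIImage.
    if let imageData = photo.fileDataRepresentation() {
        print("Captured \(imageData.count) bytes, photo \(photo.photoCount) of the request")
    }

    // Only present when depth data delivery was enabled for this capture.
    if let depthData = photo.depthData {
        print("Got a depth map: \(depthData.depthDataMap)")
    }
}
```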
```swift
// in your capture session configuration
if self.photoOutput.isDepthDataDeliverySupported {
    self.photoOutput.isDepthDataDeliveryEnabled = true
}

// in your preparation for capturing a still image with AVCapturePhotoSettings
if self.photoOutput.isDepthDataDeliverySupported {
    settings.isDepthDataDeliveryEnabled = true
}
```
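Note that the order matters here: depth data delivery has to be enabled on the output while the session is being configured before a per-capture `AVCapturePhotoSettings` is allowed to request it, and `isDepthDataDeliverySupported` only returns true on hardware that can actually produce depth, such as dual-camera or TrueDepth devices.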
• AVCaptureMetadataOutput -> QR codes, barcodes, faces
• AVCapturePhotoOutput -> still photos
• AVCaptureDepthDataOutput -> depth data maps

You see the problem here, right?
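To make the problem concrete, here is a rough sketch of what juggling all three outputs on one session looks like by hand. The `CameraController` class and its `configure(with:)` method are invented for illustration, and device discovery plus the three separate delegate protocols are left out; this is roughly the bookkeeping Lumina wraps up for you:

```swift
import AVFoundation

final class CameraController: NSObject {
    let session = AVCaptureSession()
    let metadataOutput = AVCaptureMetadataOutput()  // QR codes, barcodes, faces
    let photoOutput = AVCapturePhotoOutput()        // still photos
    let depthOutput = AVCaptureDepthDataOutput()    // depth data maps (iOS 11+)

    func configure(with device: AVCaptureDevice) throws {
        session.beginConfiguration()
        defer { session.commitConfiguration() }

        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }

        // One output, one delegate protocol, and one queue per kind of result you want back.
        if session.canAddOutput(metadataOutput) {
            session.addOutput(metadataOutput)
            metadataOutput.metadataObjectTypes = [.qr] // must be set after adding the output
        }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
        if session.canAddOutput(depthOutput) { session.addOutput(depthOutput) }
    }
}
```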