MetalCamera
Motivation
MetalCamera is an open source project for performing GPU-accelerated image and video processing on Mac and iOS.
There are many ways to use the GPU on Apple platforms, such as CIFilter, but they are closed source and difficult to extend or contribute to.
The main goal of this repository is to provide an easy interface, and a way to test performance, so that ideas about image processing and machine learning on iOS can be developed and applied to real services more easily.
At this stage, the focus is on providing the following features in a simple form:
- SwiftUI support
- Camera input/output handling
- Saving image frames to video
- Basic image processing and filters
- Downloading and running CoreML models
- Visualizing CoreML model results
- Benchmarking algorithms
There are still many bugs and many things left to implement, but I created this repository because I want to develop camera and vision features on iOS together with many people.
Feel free to use it, and open an issue or PR when you have an idea.
Thanks.
Example
To run the example project, clone the repo and open Example.xcodeproj in the Example directory.
Camera
- SwiftUI case
import SwiftUI
import MetalCamera

struct CameraSampleView: View {
    let camera = try! MetalCamera(videoOrientation: .portrait, isVideoMirrored: true)

    var body: some View {
        VideoPreview(operation: camera)
            .onAppear {
                camera.startCapture()
            }
            .onDisappear {
                camera.stopCapture()
            }
    }
}
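Since the MetalCamera initializer throws, production code may prefer to handle the failure instead of using try!. A minimal sketch of one way to do this (the fallback text is arbitrary):

// A sketch: fall back to a plain view when the camera can't be created.
struct SafeCameraView: View {
    let camera = try? MetalCamera(videoOrientation: .portrait, isVideoMirrored: true)

    var body: some View {
        if let camera = camera {
            VideoPreview(operation: camera)
                .onAppear { camera.startCapture() }
                .onDisappear { camera.stopCapture() }
        } else {
            Text("Camera unavailable")
        }
    }
}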
- UIKit case
import MetalCamera

@IBOutlet weak var preview: MetalVideoView!
var camera: MetalCamera!

override func viewDidLoad() {
    super.viewDidLoad()
    guard let camera = try? MetalCamera(useMic: useMic) else { return }
    camera-->preview
    self.camera = camera
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    camera?.startCapture()
}

override func viewDidDisappear(_ animated: Bool) {
    super.viewDidDisappear(animated)
    camera?.stopCapture()
}
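As the snippets above suggest, operations are connected into a pipeline with the --> operator, so a filter can be inserted between the camera and the preview. A minimal sketch using the Gray operation from the compositing example further below:

// Insert a grayscale filter between the camera and the preview.
// Gray() and the --> operator appear in the examples below.
let gray = Gray()
camera-->gray-->preview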
Download and load a CoreML model from a web URL
import MetalCamera

let url = URL(string: "https://ml-assets.apple.com/coreml/models/Image/ImageSegmentation/DeepLabV3/DeepLabV3Int8LUT.mlmodel")!
do {
    // coreMLLoader is kept as a property so the loader stays alive during the download.
    coreMLLoader = try CoreMLLoader(url: url, isForcedDownload: true)
    coreMLLoader?.load({ (progress) in
        debugPrint("Model downloading.... \(progress)")
    }, { (loadedModel, error) in
        if let loadedModel = loadedModel {
            debugPrint(loadedModel)
        } else if let error = error {
            debugPrint(error)
        }
    })
} catch {
    debugPrint(error)
}
Segmentation test (DeepLabV3Int8LUT model, iPhone XS, avg. 63 ms)
func loadCoreML() {
    do {
        let modelURL = URL(string: "https://ml-assets.apple.com/coreml/models/Image/ImageSegmentation/DeepLabV3/DeepLabV3Int8LUT.mlmodel")!
        let loader = try CoreMLLoader(url: modelURL)
        loader.load { [weak self] (model, error) in
            if let model = model {
                self?.setupModelHandler(model)
            } else if let error = error {
                debugPrint(error)
            }
        }
    } catch {
        debugPrint(error)
    }
}

func setupModelHandler(_ model: MLModel) {
    do {
        let modelHandler = try CoreMLClassifierHandler(model)
        camera.removeTarget(preview)
        camera-->modelHandler-->preview
    } catch {
        debugPrint(error)
    }
}
Compositing images or video, and rotation
// Rotate the camera frame, convert it to grayscale,
// then composite an image and a video frame on top before previewing.
let rotation90 = RotationOperation(.degree90_flip)
let imageCompositor = ImageCompositor(baseTextureKey: camera.textureKey)
guard let testImage = UIImage(named: "sampleImage") else {
    fatalError("Check image resource")
}
let gray = Gray()
let compositeFrame = CGRect(x: 50, y: 100, width: 250, height: 250)
imageCompositor.addCompositeImage(testImage)
imageCompositor.sourceFrame = compositeFrame

videoCompositor = ImageCompositor(baseTextureKey: camera.textureKey)
videoCompositor.sourceFrame = CGRect(x: 320, y: 100, width: 450, height: 250)

camera-->rotation90-->gray-->imageCompositor-->videoCompositor-->preview
Filter
- Lookup filter (a hypothetical usage sketch follows)
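The example project demonstrates a lookup (LUT) filter. A hypothetical sketch of how such an operation would slot into the chain; the LookupFilter name and its initializer are assumptions, not the library's confirmed API, so check the example project for the actual class:

// Hypothetical: the LookupFilter class name and initializer are assumptions.
guard let lutImage = UIImage(named: "lookupTable") else {
    fatalError("Check lookup table resource")
}
let lookup = LookupFilter(lutImage)
camera-->lookup-->preview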
Recording video and audio
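The snippet below assumes a recordingURL pointing to a writable file. A minimal sketch of one way to construct it (the file name recording.mp4 is an assumption for illustration):

// Build a destination URL in the app's Documents directory.
// The file name is an assumption for illustration.
let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
let recordingURL = documents.appendingPathComponent("recording.mp4")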
do {
    if FileManager.default.fileExists(atPath: recordingURL.path) {
        try FileManager.default.removeItem(at: recordingURL)
    }

    recorder = try MetalVideoWriter(url: recordingURL, videoSize: CGSize(width: 720, height: 1280), recordAudio: useMic)
    if let recorder = recorder {
        // Video frames flow into the writer via -->; audio uses the ==> operator.
        preview-->recorder
        if useMic {
            camera==>recorder
        }
        recorder.startRecording()
    }
} catch {
    debugPrint(error)
}
Requirements
- Swift 5
- Xcode 12.5.1 or higher on Mac
- iOS: 14.0 or higher
Installation
The Swift Package Manager is a tool for automating the distribution of Swift code and is integrated into the swift compiler.
Once you have your Swift package set up, adding MetalCamera as a dependency is as easy as adding it to the dependencies value of your Package.swift:
dependencies: [
    .package(url: "https://github.com/jsharp83/MetalCamera.git", .upToNextMinor(from: "0.2.0"))
]
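For context, a complete minimal Package.swift might look like the following; the package and target names are placeholders:

// swift-tools-version:5.3
// A minimal sketch; "MyApp" is a placeholder name.
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v14)],
    dependencies: [
        .package(url: "https://github.com/jsharp83/MetalCamera.git", .upToNextMinor(from: "0.2.0"))
    ],
    targets: [
        .target(name: "MyApp", dependencies: ["MetalCamera"])
    ]
)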
References
When creating this repository, I referenced the following repositories a lot. First of all, thanks to those who did this work and opened up so much of it in advance, and please let me know if there are any problems.
Author
jsharp83, [email protected]
License
MetalCamera is available under the MIT license. See the LICENSE file for more info.