I only need 20 of the 60 frames per second for processing (the CVPixelBuffer).
How can I capture every third ARFrame in an ARKit session? I need a capture rate of about 20 fps (I understand frames may be dropped).
Here is a code snippet:
```swift
func updateCoreML() {
    // Grab the current camera image; bail out if no frame is available yet.
    guard let pixelBuffer = sceneView.session.currentFrame?.capturedImage else { return }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let imageRequestHandler = VNImageRequestHandler(ciImage: ciImage, options: [:])
    do {
        try imageRequestHandler.perform(self.visionRequests)
    } catch {
        print(error)
    }
}
```
Answer:
The simplest way I know of is to use RxARKit and apply the .throttle() operator to session.rx.didUpdateFrame. I wouldn't skip a fixed number of events, because the frame rate isn't guaranteed to stay at 60 fps; .throttle() instead ensures you get at most one frame per time interval, whatever the actual frame rate is. You can then feed the result into RxVision, which makes sure that frame is the one consumed by CoreML.
```swift
import RxSwift
import RxARKit
import RxVision

let disposeBag = DisposeBag()

// Options for the Vision handler (empty here; add camera intrinsics if you have them).
let requestOptions: [VNImageOption: Any] = [:]

let mlRequest: RxVNCoreMLRequest<CVPixelBuffer> =
    VNCoreMLRequest.rx.request(model: model, imageCropAndScaleOption: .scaleFit)

mlRequest
    .observable
    .subscribe { [unowned self] event in
        switch event {
        case .next(let completion):
            let cvPixelBuffer = completion.value
            if let result = completion.request.results?[0] as? VNClassificationObservation {
                os_log("results: %@", type: .debug, result.identifier)
            }
        default:
            break
        }
    }
    .disposed(by: disposeBag)

session
    .rx
    .didUpdateFrame
    // One frame per 1/20 s, i.e. ~20 fps. RxSwift's throttle needs a scheduler.
    .throttle(1 / 20, scheduler: MainScheduler.instance)
    .subscribe { event in
        switch event {
        case .next(let didUpdateFrame):
            let frame: ARFrame = didUpdateFrame.frame
            let imageRequestHandler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                                            orientation: .up,
                                                            options: requestOptions)
            do {
                try imageRequestHandler.rx.perform([mlRequest], with: frame.capturedImage)
            } catch {
                print(error)
            }
        default:
            break
        }
    }
    .disposed(by: disposeBag)
```
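If you'd rather not pull in Rx, the same interval-based throttling can be done with a plain timestamp check in the ARSessionDelegate callback. Below is a minimal sketch; the `FrameThrottler` type and its names are my own invention, not part of ARKit, and only `ARFrame.timestamp` is assumed from the framework:

```swift
import Foundation

// Hypothetical helper: accepts a frame only if at least `minInterval`
// seconds have elapsed since the last accepted frame.
struct FrameThrottler {
    let minInterval: TimeInterval          // e.g. 1.0 / 20.0 for ~20 fps
    private var lastAccepted: TimeInterval = -.infinity

    mutating func shouldProcess(timestamp: TimeInterval) -> Bool {
        guard timestamp - lastAccepted >= minInterval else { return false }
        lastAccepted = timestamp
        return true
    }
}

// Usage sketch inside your ARSessionDelegate:
//
// var throttler = FrameThrottler(minInterval: 1.0 / 20.0)
//
// func session(_ session: ARSession, didUpdate frame: ARFrame) {
//     guard throttler.shouldProcess(timestamp: frame.timestamp) else { return }
//     // Run your Vision/CoreML request on frame.capturedImage here.
// }
```

Like .throttle(), this adapts to whatever frame rate the session actually delivers, instead of assuming a fixed 60 fps and counting every third frame.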