## Technologies
- Swift 5+
- ARKit + RealityKit
- Face anchor tracking (`ARFaceAnchor`).
- Eye transform extraction (`leftEyeTransform`, `rightEyeTransform`).
- SceneKit — used for 3D visualization of eyes & PD cylinder.
- TensorFlow Lite (TFLite) via MediaPipe — image classifier (EfficientNet Lite) to detect sunglasses.
- UIKit — for overlays, labels, user-facing guidance.
- MVVM pattern — `MetricsViewModel` separates business logic from AR/UI rendering.
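To make the MVVM split concrete, here is a minimal, self-contained sketch of a view model that accumulates per-frame PD samples without any ARKit or UIKit dependency. `PDMeasurementViewModel`, its members, and the plausibility range are illustrative assumptions, not the SDK's actual `MetricsViewModel` API.

```swift
// Hypothetical, simplified view model: measurement state and averaging
// live here, keeping business logic out of the AR/UI rendering layer.
final class PDMeasurementViewModel {
    private(set) var samples: [Double] = []   // PD samples in millimeters
    let requiredSampleCount = 30              // frames needed for a result

    /// Records one per-frame PD sample; ignores implausible readings.
    /// The 40–90 mm plausibility window is an illustrative assumption.
    func record(pdMillimeters: Double) {
        guard (40.0...90.0).contains(pdMillimeters) else { return }
        samples.append(pdMillimeters)
    }

    /// True once enough valid frames have been captured.
    var isComplete: Bool { samples.count >= requiredSampleCount }

    /// Averaged PD in millimeters, or nil until capture is complete.
    var averagePD: Double? {
        guard isComplete else { return nil }
        return samples.reduce(0, +) / Double(samples.count)
    }
}
```

Because the type has no framework dependencies, it can be unit-tested without a device or a TrueDepth camera.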
## Example Usage
```swift
import ARShadesPDKit
import UIKit

class MyViewController: UIViewController {
    var metricsVC: FacialMetricsViewController?

    override func viewDidLoad() {
        super.viewDidLoad()
        let vm = MetricsViewModel()
        metricsVC = FacialMetricsViewController(vm: vm)
        if let metricsVC = metricsVC {
            // Standard child view controller containment.
            addChild(metricsVC)
            metricsVC.view.frame = view.bounds
            view.addSubview(metricsVC.view)
            metricsVC.didMove(toParent: self)
        }
    }
}
```
## Workflow
1. Initialize `FacialMetricsViewController`.
2. Run `ARFaceTrackingConfiguration`.
3. Classify frames → detect whether the user is wearing glasses.
4. Guide the user → adjust distance, orientation, or remove glasses.
5. Capture 30 valid frames.
6. Compute PD → millimeter output available via `MetricsViewModel`.
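The final computation step above boils down to the Euclidean distance between the two eye positions, converted from ARKit's meters to millimeters. The sketch below shows just that math; in the SDK the positions would come from the face anchor's `leftEyeTransform` / `rightEyeTransform` (translation column), but here they are plain values so the example stands alone. `EyePosition` and the function name are illustrative, not the SDK's API.

```swift
// Illustrative 3D point; in practice this is the translation component
// of ARKit's leftEyeTransform / rightEyeTransform (meters, face space).
struct EyePosition {
    var x: Double
    var y: Double
    var z: Double
}

/// Pupillary distance in millimeters from two eye positions given in meters.
func pupillaryDistanceMillimeters(left: EyePosition, right: EyePosition) -> Double {
    let dx = right.x - left.x
    let dy = right.y - left.y
    let dz = right.z - left.z
    return (dx * dx + dy * dy + dz * dz).squareRoot() * 1000.0
}
```

In the full pipeline this per-frame value would be fed into the averaging over the 30 captured frames.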
## Asset Management
- TFLite models — `efficientnet_lite0.tflite`, `efficientnet_lite2.tflite` (shipped inside the SDK).
- Constants — path definitions for bundled assets.
- 3D rendering — temporary nodes (`SCNSphere` + `SCNCylinder`) used only for measurement visualization.
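One way to picture the constants/asset layer is a small namespace that lists the bundled models and resolves their URLs. This is a hypothetical sketch — `ModelAssets` is not the SDK's actual `Constants` type, and the lookup simply uses Foundation's `Bundle` API.

```swift
import Foundation

// Hypothetical asset namespace; names are illustrative, not the SDK's API.
enum ModelAssets {
    // TFLite classifiers shipped inside the SDK bundle.
    static let classifiers = ["efficientnet_lite0.tflite", "efficientnet_lite2.tflite"]

    /// Resolves a bundled asset URL; returns nil if the resource is missing.
    static func url(forModel name: String, in bundle: Bundle = .main) -> URL? {
        bundle.url(forResource: name, withExtension: nil)
    }
}
```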
## Integration Notes
- Requires an ARKit-capable device with a TrueDepth camera (iPhone X or newer, or iPad Pro models with Face ID).
- PD results are device-calibrated but may vary slightly across devices.
- Always prompt the user to:
- Remove glasses.
- Keep head straight.
- Stay within 40 cm distance.
- The SDK is designed to integrate into:
- Eyewear retail apps.
- Tele-optometry flows.
- PD pre-measurement tools.
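The user-guidance checks above (no glasses, head straight, within 40 cm) can be sketched as a single pre-capture validation. Everything here is a hedged illustration: `CaptureCondition`, the 10° yaw tolerance, and the function signature are assumptions, not the SDK's actual checks.

```swift
// Hypothetical pre-capture validation mirroring the guidance prompts.
enum CaptureCondition {
    case ok
    case glassesDetected   // classifier flagged glasses/sunglasses
    case tooFar            // beyond the 40 cm working distance
    case headTilted        // head not straight enough (yaw tolerance assumed)

    static func evaluate(distanceMeters: Double,
                         headYawDegrees: Double,
                         wearingGlasses: Bool) -> CaptureCondition {
        if wearingGlasses { return .glassesDetected }
        if distanceMeters > 0.40 { return .tooFar }
        if abs(headYawDegrees) > 10 { return .headTilted }
        return .ok
    }
}
```

A host app could map each non-`ok` case to one of the user prompts listed above before counting a frame as valid.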