Version: 1.0.0

Technologies

  • Swift 5+
  • ARKit + RealityKit
    • Face anchor tracking (ARFaceAnchor).
    • Eye transform extraction (leftEyeTransform, rightEyeTransform).
  • SceneKit — used for 3D visualization of eyes & PD cylinder.
  • TensorFlow Lite (TFLite) via MediaPipe — image classifier (EfficientNet Lite) to detect sunglasses.
  • UIKit — for overlays, labels, user-facing guidance.
  • MVVM pattern — MetricsViewModel separates business logic from AR/UI rendering.
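
The face-anchor and eye-transform items above correspond to a standard ARSessionDelegate callback. A minimal sketch, for orientation only — the delegate wiring here is illustrative, not the SDK's internal code:

```swift
import ARKit

// Illustrative sketch: reading eye transforms from ARFaceAnchor updates.
// The SDK performs this internally; shown only to clarify the ARKit API in use.
final class FaceTrackingDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Eye transforms are expressed relative to the face anchor.
            let left = faceAnchor.leftEyeTransform
            let right = faceAnchor.rightEyeTransform
            // The translation column (columns.3) gives each eye's position in meters.
            let leftPos = simd_make_float3(left.columns.3)
            let rightPos = simd_make_float3(right.columns.3)
            _ = simd_distance(leftPos, rightPos) // inter-eye distance, in meters
        }
    }
}
```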

Example Usage

import ARShadesPDKit

class MyViewController: UIViewController {
    var metricsVC: FacialMetricsViewController?

    override func viewDidLoad() {
        super.viewDidLoad()
        let vm = MetricsViewModel()
        metricsVC = FacialMetricsViewController(vm: vm)

        if let metricsVC = metricsVC {
            self.addChild(metricsVC)
            metricsVC.view.frame = self.view.bounds // size the child view
            self.view.addSubview(metricsVC.view)
            metricsVC.didMove(toParent: self)
        }
    }
}

Workflow

  1. Initialize FacialMetricsViewController.
  2. Run ARFaceTrackingConfiguration.
  3. Classify frames → detect whether the user is wearing glasses.
  4. Guide user → adjust distance, orientation, or remove glasses.
  5. Capture 30 valid frames.
  6. Compute PD → millimeter output available via MetricsViewModel.
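
Step 6 reduces to averaging the inter-eye distance across the 30 valid frames and converting meters to millimeters. A hedged sketch of that arithmetic — the SDK's actual averaging and outlier-rejection logic is not documented here, and the `averagePD` helper is hypothetical:

```swift
import simd

// Sketch: average inter-pupillary distance over captured frames, in millimeters.
// `samples` would hold per-frame (left, right) eye positions in meters,
// taken from ARFaceAnchor's eye transforms.
func averagePD(samples: [(left: SIMD3<Float>, right: SIMD3<Float>)]) -> Float? {
    guard !samples.isEmpty else { return nil }
    let totalMeters = samples.reduce(Float(0)) { acc, s in
        acc + simd_distance(s.left, s.right)
    }
    return (totalMeters / Float(samples.count)) * 1000 // meters → millimeters
}
```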

Asset Management

  • TFLite models
    • efficientnet_lite0.tflite, efficientnet_lite2.tflite (shipped inside SDK).
  • Constants — path definitions for bundled assets.
  • 3D rendering — temporary nodes (SCNSphere + SCNCylinder) only for measurement visualization.
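
Loading one of the bundled models with MediaPipe's Tasks API might look like the following sketch. The option names follow MediaPipe's public iOS API (MediaPipeTasksVision); the bundle lookup and error handling are assumptions, since the SDK resolves paths through its own Constants type:

```swift
import MediaPipeTasksVision

// Sketch: creating an image classifier from a bundled EfficientNet Lite model.
// Bundle-lookup details are assumptions; the SDK handles asset paths internally.
func makeSunglassesClassifier() throws -> ImageClassifier {
    guard let modelPath = Bundle.main.path(forResource: "efficientnet_lite0",
                                           ofType: "tflite") else {
        throw NSError(domain: "ARShadesPDKit.Example", code: 1)
    }
    let options = ImageClassifierOptions()
    options.baseOptions.modelAssetPath = modelPath
    options.maxResults = 3 // top-3 labels are enough to spot a "sunglasses" class
    return try ImageClassifier(options: options)
}
```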

Integration Notes

  • Requires an ARKit-capable device with a TrueDepth camera (iPhone X or newer, or an iPad Pro with Face ID).
  • PD results are device-calibrated, but may vary slightly across devices.
  • Always prompt the user to:
    • Remove glasses.
    • Keep head straight.
    • Stay within about 40 cm of the device.
  • SDK is designed to integrate into:
    • Eyewear retail apps.
    • Tele-optometry flows.
    • PD pre-measurement tools.
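
The 40 cm guidance above can be enforced by comparing the face anchor's world position with the camera transform. A sketch, assuming access to the current ARFrame — the threshold and helper name are illustrative:

```swift
import ARKit

// Sketch: check that the tracked face is within ~40 cm of the camera,
// so the host app can prompt the user to adjust. Threshold is illustrative.
func faceIsWithinRange(faceAnchor: ARFaceAnchor, frame: ARFrame,
                       maxDistanceMeters: Float = 0.4) -> Bool {
    let facePos = simd_make_float3(faceAnchor.transform.columns.3)
    let cameraPos = simd_make_float3(frame.camera.transform.columns.3)
    return simd_distance(facePos, cameraPos) <= maxDistanceMeters
}
```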