Camera FOV and Distortion

Key Takeaways

  • Field of View (FOV) defines the observable scene area and is primarily determined by sensor size and lens focal length.
  • Lens distortion alters geometric accuracy, especially at wide FOVs, requiring calibration and correction.
  • Proper FOV and distortion management are critical for measurement accuracy, perception reliability, and system integration.

What is it?

Field of View (FOV) refers to the angular extent of the observable scene captured by a camera. It is typically expressed in degrees (e.g., 60°, 90°, 120°) and can be defined horizontally, vertically, or diagonally. FOV is determined by the combination of lens focal length and image sensor size.
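The relationship between focal length, sensor size, and FOV can be made concrete with the standard pinhole-camera formula, FOV = 2·atan(d / 2f), where d is the sensor dimension along the chosen axis and f is the focal length. A minimal sketch (the sensor and lens values in the example are illustrative, not from any specific product):

```python
import math

def fov_degrees(sensor_dim_mm: float, focal_length_mm: float) -> float:
    """Angular FOV along one sensor axis, assuming an ideal pinhole model."""
    return math.degrees(2 * math.atan(sensor_dim_mm / (2 * focal_length_mm)))

# Classic reference point: a full-frame sensor (36 mm wide) with a 50 mm lens
# gives a horizontal FOV of roughly 39.6 degrees.
h_fov = fov_degrees(36.0, 50.0)

# Shorter focal length on the same sensor -> wider FOV
wide_fov = fov_degrees(36.0, 24.0)
```

The same formula works for the vertical or diagonal FOV by substituting the corresponding sensor dimension.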
Lens distortion describes the deviation of a real lens from the ideal pinhole camera model: straight lines in the scene may appear curved in the image. The most common types are:
  • Barrel distortion (typical of wide-angle lenses): straight lines bow outward from the image center
  • Pincushion distortion (typical of telephoto lenses): straight lines bow inward toward the center
  • Complex (mustache) distortion: a mix of both, requiring higher-order terms to model
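The two basic types above are commonly captured by a polynomial radial model in which a point's distance from the image center is rescaled by a factor that depends on that distance. A minimal sketch, using the widely used radial terms (k1, k2) of the Brown–Conrady family and the common sign convention where negative k1 produces barrel and positive k1 produces pincushion distortion:

```python
def distort_radial(x: float, y: float, k1: float, k2: float = 0.0):
    """Apply radial distortion to normalized image coordinates (x, y).

    k1 < 0 pulls points toward the center (barrel);
    k1 > 0 pushes points outward (pincushion).
    """
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# A point at x = 0.5 moves inward under barrel distortion (k1 = -0.2)
bx, by = distort_radial(0.5, 0.0, k1=-0.2)   # bx < 0.5
# ...and outward under pincushion distortion (k1 = +0.2)
px, py = distort_radial(0.5, 0.0, k1=0.2)    # px > 0.5
```

Real calibration models usually add a third radial term and two tangential terms, but the structure is the same.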
FOV and distortion are tightly coupled. Increasing FOV, especially beyond 90°, typically introduces stronger distortion.

How does it work?

FOV is governed by geometric optics. For a given sensor size, shorter focal lengths produce wider FOVs, while longer focal lengths result in narrower FOVs. This relationship is fundamental in lens design and camera selection.
Lens distortion arises from non-ideal optical behavior. Real lenses consist of multiple elements, and light rays passing through different regions of the lens are refracted unevenly. This leads to spatial warping of the image.
In practical systems, distortion is modeled mathematically using intrinsic camera parameters, including radial and tangential distortion coefficients. During camera calibration:
  • A known pattern (e.g., checkerboard) is captured
  • Distortion parameters are estimated
  • Image correction (undistortion) is applied in software or in the camera's ISP (image signal processor)
For vision systems, this calibration step is essential to restore geometric consistency.
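The undistortion step inverts the distortion model: given an observed (distorted) point, recover where the ideal pinhole camera would have imaged it. Because the radial polynomial has no closed-form inverse, a common approach is fixed-point iteration. A minimal sketch for the radial-only model shown earlier (iteration count is an assumption; production pipelines typically use library routines with tuned stopping criteria):

```python
def undistort_radial(xd: float, yd: float, k1: float,
                     k2: float = 0.0, iters: int = 20):
    """Invert x_d = x * (1 + k1*r^2 + k2*r^4) by fixed-point iteration.

    Starts from the distorted point and repeatedly divides out the
    distortion scale evaluated at the current estimate.
    """
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        scale = 1 + k1 * r2 + k2 * r2 * r2
        x = xd / scale
        y = yd / scale
    return x, y
```

Re-applying the forward model to the recovered point should reproduce the observed coordinates, which is a useful sanity check on any calibration pipeline.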

Why does it matter?

FOV directly affects scene coverage. A wider FOV enables a single camera to observe more area, reducing the number of cameras required. However, for a fixed sensor resolution it also reduces angular resolution (pixels per degree) and increases distortion.
Distortion impacts measurement accuracy and algorithm performance. In applications such as 3D reconstruction, SLAM, or object detection:
  • Uncorrected distortion leads to spatial errors
  • Feature matching becomes less reliable
  • Depth estimation accuracy degrades
There is a trade-off between coverage and accuracy. Engineering design must balance:
  • FOV (coverage)
  • Resolution (detail)
  • Distortion (geometric fidelity)
  • Processing cost (correction and calibration)
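The coverage-versus-detail side of this trade-off can be quantified directly: at a fixed sensor resolution, doubling the FOV halves the average pixels per degree, while the scene width covered at a given distance grows with the tangent of the half-angle. A minimal illustration (the resolutions and distances are example values):

```python
import math

def pixels_per_degree(h_resolution_px: int, h_fov_deg: float) -> float:
    """Average angular resolution across the horizontal FOV."""
    return h_resolution_px / h_fov_deg

def coverage_width_m(h_fov_deg: float, distance_m: float) -> float:
    """Scene width covered at a given distance (flat-plane approximation)."""
    return 2 * distance_m * math.tan(math.radians(h_fov_deg / 2))

# Same 1920-px-wide sensor behind two different lenses:
narrow = pixels_per_degree(1920, 60)    # 32 px/deg
wide = pixels_per_degree(1920, 120)     # 16 px/deg: twice the coverage,
                                        # half the detail per degree
```

This is why wide-FOV systems often compensate with higher-resolution sensors, at the cost of bandwidth and processing load.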

Applications

FOV and distortion considerations vary significantly by application:
  • Robotics: Wide FOV for navigation and obstacle avoidance, with calibrated distortion for accurate mapping
  • Industrial inspection: Narrow or moderate FOV with low distortion for precise measurement
  • Smart retail and security: Wide FOV for area coverage, distortion correction applied for analytics
  • AR/VR and smart glasses: Ultra-wide FOV with aggressive distortion correction and real-time processing
  • Barcode and logistics scanning: Optimized FOV for working distance and target size
Each use case requires a different balance between optical design and algorithmic compensation.
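For cases like barcode scanning, the FOV requirement can be derived backward from the target size and working distance: the lens must subtend at least the target width at that distance. A sketch of this sizing calculation (the margin factor and example dimensions are assumptions for illustration, assuming a flat target centered on the optical axis):

```python
import math

def required_fov_deg(target_width_m: float, working_distance_m: float,
                     margin: float = 1.2) -> float:
    """Minimum horizontal FOV so a target of the given width fits in frame
    at the working distance, with a safety margin for misalignment."""
    width = target_width_m * margin
    return math.degrees(2 * math.atan(width / (2 * working_distance_m)))

# Example: a 10 cm label read from 30 cm away, no margin
fov = required_fov_deg(0.10, 0.30, margin=1.0)   # ~18.9 degrees
```

The same calculation, run in reverse, gives the maximum working distance for a camera whose FOV is already fixed.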

SGI Solution

SGI approaches FOV and distortion as part of a complete vision system design rather than isolated camera selection.
At the hardware level, SGI supports lens selection across a wide FOV range, from narrow-angle precision optics to ultra-wide lenses (>120°), matched with appropriate sensor formats. This ensures alignment between optical coverage and pixel density.
At the system level, SGI integrates:
  • Camera calibration pipelines (intrinsic and extrinsic)
  • Distortion correction algorithms (real-time or offline)
  • ISP tuning for image quality consistency
  • Interface optimization (USB, MIPI) for bandwidth and latency control
For multi-camera or 3D systems, SGI also considers cross-camera alignment and distortion consistency, which are critical for depth fusion and spatial perception.
The result is a calibrated, application-specific vision system where FOV and distortion are engineered parameters, not afterthoughts.

Related Topics