Technology

Our core technology is the application of pulsed time-of-flight techniques to industrial vision systems, enabling innovative products to be created.

 


Pulsed Time-of-Flight

Exploiting time-of-flight technology was the main driver behind the formation of Odos Imaging. Here is a simple explanation of how time-of-flight 3D imaging works:

1. The system triggers the illumination units to emit an intense pulse of near-infrared laser light (wavelength 905nm).

2. The pulse floods the scene and is reflected from objects within it.

3. Pulses reflected from closer objects return to the time-of-flight sensor before pulses reflected from more distant objects. Since the speed of light is known, the round-trip delay measured at each pixel converts directly to range (see the sketch below).
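
The arithmetic behind step 3 is simple: range is half the measured round-trip delay multiplied by the speed of light. A minimal Python sketch (the 20 ns delay is an illustrative value):

```python
# Range is half the round-trip delay multiplied by the speed of light,
# because the pulse travels to the object and back.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def range_from_delay(round_trip_s: float) -> float:
    """Convert a measured round-trip pulse delay (seconds) to range (metres)."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# A pulse returning after 20 ns corresponds to an object ~3 m away.
print(range_from_delay(20e-9))  # ~2.998
```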

In between the capture of frames used to extract range from the scene, the sensor is able to capture a conventional machine vision intensity image.

Considerations for Applications

Range information can only be extracted if a return pulse is detected. This means that objects that are very “dark” (meaning they do not reflect light well), or objects that are very reflective (reflecting all emitted light away from the sensor rather than back to it), may be “invisible” in the range image. Of course, such objects will still be visible in the intensity image (assuming sufficient ambient light).
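
A practical consequence is that range pixels without a detected return should be masked out before further processing. A minimal NumPy sketch, assuming the camera delivers a per-pixel range image alongside a per-pixel return-amplitude image; the array contents and threshold here are illustrative:

```python
import numpy as np

# Illustrative per-pixel range (metres) and return-pulse amplitude images,
# standing in for what a pulsed time-of-flight camera might deliver.
range_m = np.random.uniform(0.5, 6.0, size=(240, 320))
amplitude = np.random.uniform(0.0, 100.0, size=(240, 320))

AMPLITUDE_THRESHOLD = 5.0  # below this, no reliable return pulse was detected

# Mask pixels with too weak a return (very dark or mirror-like objects);
# NaN marks "no range available" so it cannot be mistaken for 0 m.
valid = amplitude >= AMPLITUDE_THRESHOLD
range_masked = np.where(valid, range_m, np.nan)
print(f"{100.0 * (~valid).mean():.1f}% of pixels have no usable range")
```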

Range information is extracted using proprietary algorithms implemented in hardware in the sensor. Internally, the sensor utilises a sequence of frames, captured at a very high internal frame rate, to do this.
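
Those algorithms are proprietary, but the general idea can be illustrated with one widely used pulsed time-of-flight scheme: integrate the return pulse in two consecutive shutter windows and recover the delay from the charge ratio. The sketch below assumes a rectangular pulse and two back-to-back gates of equal width; it illustrates the principle only, not the sensor's actual implementation:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def gated_range(q1: float, q2: float, pulse_width_s: float) -> float:
    """Estimate range from two gated integrations of the return pulse.

    Assumes a rectangular pulse, with gate 1 open during the emission
    window and gate 2 open immediately afterwards. The return pulse
    straddles the gate boundary, so the charge ratio encodes the delay.
    """
    total = q1 + q2
    if total <= 0:
        raise ValueError("no return pulse detected")
    delay_s = pulse_width_s * q2 / total
    return SPEED_OF_LIGHT * delay_s / 2.0

# Example: a 30 ns pulse whose charge splits 25/75 between the two gates
# implies a 22.5 ns delay, i.e. an object roughly 3.4 m away.
print(gated_range(25.0, 75.0, 30e-9))
```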

An ambient light intensity image can be captured as part of the sequence – machine vision with depth. (It is also possible to utilise the laser illumination as an intensity flash to capture the intensity image with IR illumination – so-called active IR.)
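
As an illustration of how such an interleaved sequence might look from application code, here is a hypothetical acquisition loop in Python. FakeCamera and its methods are invented stand-ins, not the real camera API:

```python
import numpy as np

# Hypothetical acquisition loop; FakeCamera and its method names are
# placeholders for illustration only, not the actual camera API.
class FakeCamera:
    def grab_range_frame(self) -> np.ndarray:
        return np.zeros((240, 320), dtype=np.float32)   # range in metres

    def grab_intensity_frame(self, active_ir: bool = False) -> np.ndarray:
        return np.zeros((240, 320), dtype=np.uint16)    # grey levels

def acquire_pairs(cam: FakeCamera, n: int, use_active_ir: bool = False):
    """Alternate range and intensity captures: machine vision with depth."""
    pairs = []
    for _ in range(n):
        depth = cam.grab_range_frame()
        # Between range captures, grab a conventional intensity image,
        # lit either by ambient light or by the laser acting as an IR flash.
        grey = cam.grab_intensity_frame(active_ir=use_active_ir)
        pairs.append((depth, grey))
    return pairs

frames = acquire_pairs(FakeCamera(), n=3, use_active_ir=True)
```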

The use of short but intense pulses of light ensures that the system is able to operate even in high levels of ambient light (outdoor applications are possible). In very bright situations, it may be useful to incorporate a bandpass filter centred around 905nm.

Very busy or cluttered scenes, especially those containing reflective objects, can result in multipath returns (where the emitted pulse is reflected multiple times from multiple objects before detection at the sensor). Multipath is undesirable in a time-of-flight measurement, as it adds uncertainty to the measured range.

The illumination profile should be well matched to the field of view of the objective lens to ensure that light emitted into the scene is not “wasted”.
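
Matching the two is straightforward geometry: the lens field of view follows from the sensor width and focal length, and the illumination divergence should be comparable. A small sketch, using illustrative numbers rather than actual StarForm specifications:

```python
import math

def lens_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Full horizontal field of view of the objective lens, in degrees."""
    return 2.0 * math.degrees(math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# Illustrative numbers only: a sensor 1280 pixels wide with 14 µm pixels
# is ~17.9 mm across; behind a 25 mm lens it sees a ~39.5 degree horizontal
# field of view, so the illumination should diverge by roughly the same
# angle. A wider beam wastes light; a narrower one leaves the image edges unlit.
sensor_width_mm = 1280 * 0.014
print(f"{lens_fov_deg(sensor_width_mm, 25.0):.1f} deg")
```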

High Resolution Sensors

Each and every pixel on the image sensor is used to capture both range (3D) and intensity information. However, with only a finite area behind the objective lens, pixels on high resolution sensors are small (for example, each pixel on the 1.3MP StarForm camera is just 14µm across). The better the quality of the reflected pulse, the better the quality of the range measurement: the signal generated by a pixel is higher for a “brighter” returned pulse, while lower quality returns result in “noisier” pixels in the range measurement. This is generally not a concern in the intensity image, as the exposure time is much longer and the intensity of (ambient) light incident on the pixel is much higher.

Frame accumulation in the sensor hardware is a user-controlled feature that allows multiple samples of the range image to be captured and accumulated together to improve the quality of the range measurement. The number of frames used is user-programmable: a higher number of accumulations results in an improved range measurement, with a small impact on output frame rate.
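
The effect can be illustrated with a simple simulation: averaging N noisy range frames reduces the random error by roughly √N, while each output frame now requires N internal captures. A sketch with synthetic data (frame size and noise level are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_range_frame(true_range_m=2.0, noise_std_m=0.05, shape=(240, 320)):
    """One synthetic range frame, standing in for a single internal capture."""
    return true_range_m + rng.normal(0.0, noise_std_m, size=shape)

def accumulate(n_frames: int) -> np.ndarray:
    """Average n_frames range samples, as frame accumulation does in hardware."""
    stack = np.stack([noisy_range_frame() for _ in range(n_frames)])
    return stack.mean(axis=0)

# Noise falls roughly as sqrt(n_frames): ~0.05 m, ~0.025 m, ~0.0125 m,
# while each output frame now costs n_frames internal captures.
for n in (1, 4, 16):
    print(n, round(float(accumulate(n).std()), 4))
```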