The proposed distance-measuring sensor has a lens system of the astronomical-telescope type, with the distance between lenses L1 and L2 fixed at the sum of their focal lengths, f1 + f2. Through this optical system, an object point Ps(rs, θs, Zs1) in the object space, measured from the front focal point F11 of L1, is converted linearly into a real inverted image point Pr(rr, θr, Zr2), measured from the rear focal point F22 of L2 in the image space; that is, {(f2/f1)rs, θs + π, −(f2/f1)²Zs1}. To determine Zs1 of the laser beam spot on an object, Zr2 must be detected; the sensor therefore has a 1-D CCD placed horizontally so that its pixel line is aligned with the optical axis, which coincides with the laser beam axis. The problem is what aspects of image formation occur on such an unusual horizontal plane when the laser beam spot on the object has a finite diameter, under conditions such as the distance Zs1, the object surface orientation, misalignment of the CCD, and so on. This paper examines the problem through spot diagrams and illumination intensity distributions of the image, computed by the ray-tracing method with a multi-point-source model applied to the beam spot. In conclusion, the peak location of the computed illumination intensity of the image corresponds to the theoretical one, whereas the height and shape of the peak are affected by Zs1 and the object surface orientation, provided that every alignment in the optical system is ideally perfect.
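The object-to-image point mapping stated above can be sketched as a short Python function. The transform {(f2/f1)rs, θs + π, −(f2/f1)²Zs1} is taken directly from the abstract; the focal lengths and coordinate values below are hypothetical numbers chosen only for illustration.

```python
import math

def image_point(r_s, theta_s, z_s1, f1, f2):
    """Map an object point Ps(r_s, theta_s, Z_s1), measured from the front
    focal point F11 of L1, to the real inverted image point
    Pr(r_r, theta_r, Z_r2), measured from the rear focal point F22 of L2."""
    m = f2 / f1                   # lateral magnification of the telescope-type system
    r_r = m * r_s                 # radial coordinate scales by f2/f1
    theta_r = theta_s + math.pi   # image is inverted (rotated by pi)
    z_r2 = -(m ** 2) * z_s1       # longitudinal magnification is -(f2/f1)^2
    return r_r, theta_r, z_r2

# Hypothetical example: the sensor's task is the inverse step, recovering
# Z_s1 from the detected Z_r2 on the horizontally placed 1-D CCD.
f1, f2 = 100.0, 50.0  # assumed focal lengths in mm
_, _, z_r2 = image_point(r_s=2.0, theta_s=0.0, z_s1=80.0, f1=f1, f2=f2)
z_s1_recovered = -z_r2 / (f2 / f1) ** 2  # invert the longitudinal relation
```

Because the longitudinal mapping is linear in Zs1, a single detected image position Zr2 determines the object distance, which is what makes the axial 1-D CCD arrangement workable.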