A Novel Method of Light Source Calibration under Near-field Lighting Condition

This paper presents a novel method of photometric stereo calibration that works under the near-field lighting condition, for a system equipped with eight LED lights. To estimate the pose of each light source, light source position estimation based on iPad imaging is proposed. However, it is difficult to estimate the principal optical axis direction of the light source, which influences the reconstruction result. To address this problem, a new convolution kernel is used to find the brightest point on the chessboard paper, and the principal optical axis direction of the light source can then be precisely calculated through a geometric transformation. Experiments show that high-quality 3-D reconstruction can be obtained using the proposed calibration method.


Introduction
Compared with traditional 3D scanning techniques, such as laser scanning and structured light scanning [1,2], photometric stereo is known for its ability to recover fine details [3]. Since the principle of photometric stereo is to estimate the surface normals of the measured object, and the reconstruction accuracy is largely determined by the surface normal estimation [4], the important role of light source calibration becomes obvious.
In most photometric stereo systems developed so far, the light source is treated as an ideal point or parallel light source when the distance between the light source and the measured object is sufficiently large [5]; this approximation becomes invalid under the near-field lighting condition. In this paper, a novel light source calibration method is proposed, and the traditional photometric stereo model is modified to fit the near-field lighting condition. To achieve highly accurate estimation of surface normals, the lighting condition in each image must be modeled precisely.
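The near-field modification replaces the single global light direction of classical photometric stereo with a per-pixel direction, distance, and angular attenuation. A minimal sketch of such a model is given below; the specific attenuation form (inverse-square falloff times a cosine-power emission profile), the function name, and the default parameter values are illustrative assumptions, not the paper's exact formulas.

```python
import numpy as np

def nearfield_irradiance(point, normal, light_pos, axis, E0=1.0, g=1.2):
    """Irradiance at a surface point from a near-field LED (illustrative model).

    Combines inverse-square distance falloff, the LED's angular emission
    profile (cos^g of the angle off its principal optical axis), and the
    Lambertian cosine term at the surface.
    """
    to_light = light_pos - point
    d = np.linalg.norm(to_light)
    l = to_light / d                        # unit direction towards the light
    cos_emit = max(np.dot(-l, axis), 0.0)   # angle off the principal axis
    cos_inc = max(np.dot(normal, l), 0.0)   # Lambertian incidence term
    return E0 * (cos_emit ** g) * cos_inc / d ** 2
```

Unlike the far-field model, the result depends on where the surface point sits relative to the light, which is why the light source position and principal optical axis must both be calibrated.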
There are many methods for calibrating light source information. Powell et al. present a methodology for calibrating multiple light source locations, in which three standard spheres with known relative positions are used in the calibration process [6]. A matte sphere is used to calculate the imaging system parameters, and the other two reflective spheres are used to estimate the locations of the light sources by calculating the intersection of two incident rays. Their results show that the error of the vector from a point in the scene to a light source can be kept within 2.7°. Xie and Chung propose to employ a shiny sphere, where the albedo variance of the Lambertian plate is used as the basis of the objective function. Through photometric stereo reconstruction and image rendering, the accuracy of their estimation framework is demonstrated. Song et al. present a computational framework for a photometric stereo system [8], with a two-step calibration procedure: a multiple-sphere-based approach is used to estimate the light source position, and a reference-plane-based approach is then applied to estimate the principal optical axis direction of each light source.
To improve the accuracy and robustness of light source calibration under the near-field lighting condition, a chessboard image is displayed on an iPad for light source pose estimation, and a light spot image, obtained by turning on the light source at a low exposure, is used to estimate the incident light direction. To reduce errors in the principal optical axis direction, a chessboard printed on white paper is used in the experiment. A new convolution kernel is applied to find the brightest point on the chessboard paper, and the grayscale centroid method is used to achieve subpixel accuracy [9]. Once the brightest point on the paper is obtained, the principal optical axis can be solved as in [10].
The paper is organized as follows. Section 2 introduces the calibration of light source information, which aims to estimate the light source position and the principal optical axis direction. Section 3 verifies the robustness of the method and its performance in photometric stereo reconstruction. Finally, conclusions are given in Section 4.

Calibration of light source information
In this section, the light source calibration method is explained in detail. First, light source position estimation based on iPad imaging is introduced; each light source position is solved by calculating the intersection of the incident rays. Then, a new convolution kernel is applied to coarsely locate the brightest point on the chessboard paper, and the grayscale centroid method is used to refine it to subpixel accuracy, which benefits the principal optical axis estimation.
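The two-stage brightest-point search described above can be sketched as follows. The paper's specific convolution kernel is not reproduced here, so a plain box filter stands in for it as an assumption; the coarse argmax and the grayscale-centroid refinement follow the procedure the text describes.

```python
import numpy as np

def brightest_point_subpixel(img, ksize=5, win=7):
    """Locate the brightest spot: coarse argmax after smoothing by
    convolution, then grayscale-centroid refinement to subpixel accuracy.
    The box kernel is a stand-in for the paper's kernel."""
    # Box-filter smoothing via 2-D convolution (edge-padded).
    k = np.ones((ksize, ksize)) / ksize ** 2
    pad = ksize // 2
    padded = np.pad(img.astype(float), pad, mode='edge')
    smooth = np.zeros(img.shape, dtype=float)
    for dy in range(ksize):
        for dx in range(ksize):
            smooth += k[dy, dx] * padded[dy:dy + img.shape[0],
                                         dx:dx + img.shape[1]]
    y0, x0 = np.unravel_index(np.argmax(smooth), smooth.shape)
    # Grayscale (intensity-weighted) centroid in a window around the peak.
    r = win // 2
    ys = slice(max(y0 - r, 0), min(y0 + r + 1, img.shape[0]))
    xs = slice(max(x0 - r, 0), min(x0 + r + 1, img.shape[1]))
    patch = img[ys, xs].astype(float)
    yy, xx = np.mgrid[ys, xs]
    total = patch.sum()
    return (xx * patch).sum() / total, (yy * patch).sum() / total  # (x, y)
```

The centroid step recovers a fractional-pixel peak position, which is what makes the subsequent principal-optical-axis geometry accurate.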

Light source position estimation based on iPad imaging
As shown in figure 1(a), an image of a chessboard with known size is displayed on the iPad, and the pose of the camera coordinate system relative to the world coordinate system, shown in figure 1(b), can be obtained using EPnP [11]. The world coordinate system is established on the iPad imaging plane, so $Z_W = 0$ for all points on that plane. A light spot is formed when the LED is on, and the camera exposure must be set to a suitable value to obtain an approximately elliptical spot on the imaging plane. In figure 1(b), $R$ and $t$ represent the rotation and translation of the optical center relative to the iPad imaging plane; $R$ is also given in formula 1. From formula 2, the world coordinate of the spot center can be solved because only $Z_C$, $X_W$ and $Y_W$ are unknown, and formula 3 is converted from formula 2. $M$ represents the intrinsic parameter matrix, which is calculated accurately in the camera calibration process, and the spot center in the $o\text{-}uv$ image coordinate system is obtained using the moments of the spot outline.
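Because $Z_W = 0$ on the iPad plane, the projection collapses to a plane-to-image homography, and the spot center's world coordinates follow by inverting it. The sketch below illustrates this step; the function name and the world-to-camera convention ($X_C = R X_W + t$) are our assumptions, consistent with the notation above.

```python
import numpy as np

def spot_center_world(uv, M, R, t):
    """Recover (X_W, Y_W) of a pixel known to lie on the Z_W = 0 plane.

    With Z_W fixed at 0 the projection s*[u, v, 1]^T = M (R X_W + t)
    reduces to the homography H = M [r1 r2 t], which is inverted directly.
    """
    H = M @ np.column_stack((R[:, 0], R[:, 1], t))
    xyw = np.linalg.solve(H, np.array([uv[0], uv[1], 1.0]))
    xyw /= xyw[2]                 # de-homogenize
    return xyw[0], xyw[1]         # world coordinates on the iPad plane
```

For example, with $M = \mathrm{diag}$-style intrinsics (focal 800, principal point (320, 240)), $R = I$ and $t = (0, 0, 2)$, the pixel (360, 220) maps back to the plane point (0.1, -0.05).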
The spot center in the camera coordinate system is given in formula 4. In figure 1(b), the vectors of the incident light, the reflected light $\mathbf{r}$ and the normal $\mathbf{n}$ are given in formula 5, where $\mathbf{n}$ is a unit vector. The projection of the reflected light onto the normal vector gives the reflection relation presented in formula 6. From formulas 5 and 6, the incident light vector is obtained in formula 7, and the equation of the incident ray in the camera coordinate system follows because the camera coordinate of the spot center is known. To estimate the light source position, the iPad imaging plane is placed in different poses, and the position is obtained by solving the intersection of the incident rays in the least squares sense. To solve for the principal optical axis direction $l_0$, formulas 8-10 are used, where $g$ is an inherent feature related to the illuminance attenuation of the light source [10]. For an LED, $g$ can be solved from formula 9, where $\theta_{1/2}$ is the angle between $l_0$ and the ray whose illuminance is $E_0/2$. Once $\theta_{1/2}$ is solved, $l_0$ can be obtained from formula 10 using the Rodrigues rotation formula. All parameters of formula 10 are shown in figure 2.
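The least-squares "intersection" of the incident rays is the point minimizing the sum of squared distances to every ray, which has a closed-form solution via orthogonal projectors. The sketch below shows that standard construction (the function name is ours; the paper does not specify its solver).

```python
import numpy as np

def lines_intersection_lsq(points, dirs):
    """Least-squares intersection of 3-D lines: the point minimizing the
    sum of squared perpendicular distances to every incident-light ray.

    Each line is given by a point on it and a direction vector; the normal
    equations sum the projectors orthogonal to each ray direction.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, dirs):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A += P
        b += P @ p
    return np.linalg.solve(A, b)
```

With exact, noise-free rays through a common point the solver returns that point; with measurement noise it returns the closest compromise, which is exactly what the calibration needs.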

Light source position estimation and calculation of the principal optical axis
In the light source position estimation process, the exposure time of the camera is set to 40 ms while the chessboard image is displayed on the iPad. Then one LED is turned on and the exposure time is set to 1 ms under the same pose; two images are taken according to these two steps. The pose of the iPad is then changed and the procedure repeated, so that the light source position can be solved by the least squares method. For the calculation of the principal optical axis direction, the exposure time of the camera is set to 100 ms and all LEDs are turned on as external light sources. Then only the LED whose principal optical axis direction is to be calculated is turned on, the other LEDs are turned off, and the exposure time is set to 60 ms. The two resulting images are shown in figure 4. Following section 2.2, the principal optical axis direction of the LED is obtained. As described in [10], the maximum illuminance $E_0$ emitted along the principal optical axis must be solved before reconstruction, and formula 11 is taken from [10].
As shown in figure 6 and formula 11, for any ray $l$, $I$ is the intensity of point $P$ captured by the camera. After calibration, $g$ is 1.2 in the experiment. According to formula 11, for all surface points captured by the camera, the summed intensity is given in formula 12. Therefore, the maximum illuminance of each light source can be calculated with formula 13, using a white paper with known pose in the experiment. Figure 7 shows the process of calculating $E_0$.
To verify the robustness of the proposed method, the LED labeled L1 in table 1 is calibrated 6 times, with the method in [8] used for comparison. Tables 2 and 3 compare the position and the principal optical axis direction estimated by our method against the method in [8], respectively. From the experimental results, the standard deviation of our method is lower, which demonstrates its good robustness.
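Since $E_0$ multiplies every per-point geometric term in the intensity model, it can be recovered as a least-squares ratio between the observed intensities on the white reference paper and the purely geometric factors. The sketch below assumes an intensity model of the form $I = E_0 f$, where $f$ bundles the attenuation and cosine terms for one point; this factorized form and the function name are our assumptions, not the paper's formulas 12-13.

```python
import numpy as np

def estimate_E0(intensities, geom_factors):
    """Estimate the LED's maximum illuminance E0 from intensities observed
    on a white reference plane with known pose.

    Each geometric factor collects everything except E0 for one surface
    point (e.g. angular attenuation, incidence cosine, 1/d^2), so the
    model is I = E0 * f and E0 is the least-squares ratio."""
    I = np.asarray(intensities, dtype=float)
    f = np.asarray(geom_factors, dtype=float)
    return float(I @ f / (f @ f))   # minimizes sum over points of (I - E0*f)^2
```

Summing over many points, as formula 12 suggests, averages out pixel noise on the white paper rather than trusting a single measurement.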

Evaluation of 3D reconstruction
In figure 6, the geometric relationship between $h$ and $l$ can be expressed by formula 14. From the experimental results, it is observed that our method has the advantages of high accuracy and good robustness under the near-field lighting condition. In future work, objects with high reflection will be studied, and the reconstruction algorithm will be simplified to improve reconstruction speed.

Conclusion
In the light source position estimation process, the method uses a chessboard image displayed on the iPad for pose estimation, and a light spot image taken at a low exposure to estimate the incident light direction. By changing the pose of the iPad, the position of the light source in the camera coordinate system is calculated by solving the intersection of the incident rays in the least squares sense. To estimate the principal optical axis of the light source more accurately, a new convolution kernel is applied to find the brightest point on the chessboard paper. The grayscale centroid method refines the brightest point to subpixel accuracy, and the principal optical axis is then solved via the Rodrigues rotation formula. The results in tables 2 and 3 show that our method has better robustness, and the reconstructions in figure 9 show that it achieves better accuracy, making it applicable to deformation inspection.