Research Article

On-orbit geometric calibration and preliminary accuracy verification of GaoFen-14 (GF-14) optical two linear-array stereo camera

Article: 2289013 | Received 03 Dec 2022, Accepted 23 Nov 2023, Published online: 03 Dec 2023

ABSTRACT

The GaoFen-14 (GF-14) satellite is China’s most recent high-resolution earth observation satellite system. It is equipped with a two linear-array stereo camera and is intended for 1:10,000-scale topographic mapping without ground control points (GCPs). Because the technical parameters of the payloads change once the satellite enters orbit, strict on-orbit geometric calibration is necessary. This study performs on-orbit geometric calibration of the GF-14 stereo camera using ground calibration data: exterior orientation errors are corrected by a comprehensive bias matrix, and interior orientation errors are described by a fifth-order polynomial. The results from the Yinchuan calibration field show that the planar accuracy, better than 286 m (RMS) before calibration, improves to 2.11 m (front camera) and 1.51 m (back camera) after external calibration, and further to 1.45 m (front camera) and 0.99 m (back camera) after internal calibration. The forward intersection accuracy reaches 0.67 m in plane (RMS) and 1.10 m in elevation (RMS). Checked against multiple global check points (CKPs), the planar and elevation RMS reach 2.38 m (5.09 m in CE90) and 2.08 m (3.43 m in LE90), respectively.

Introduction

Optical satellite photogrammetry is an effective means of obtaining global geospatial data. Owing to advances in satellite remote sensing technology, optical mapping satellites are widely used in fields such as topographic mapping (Baltsavias et al., Citation2009; Zhang & Gruen, Citation2006), three-dimensional city modelling (Gong et al., Citation2021; Poli et al., Citation2015), disaster monitoring (Lim & Seock Lee, Citation2018), and bathymetric mapping (Cao et al., Citation2019, Citation2023; Chen et al., Citation2019).

According to the way stereo images are acquired, optical photogrammetry satellites can be divided into an agile imaging class and a thematic mapping class. Agile imaging satellites (such as Pleiades, GeoEye-1 and WorldView) image the same target multiple times on the same or different tracks by slewing the platform, which is flexible and manoeuvrable but places high demands on the mechanical structure and control capability. Thematic mapping satellites typically do not slew; instead, they use three or two linear-array cameras to obtain stereo images in a single pass. For example, the MOMS-02 (Büyüksalih & Jacobsen, Citation2000; Kornus et al., Citation2000), ALOS-1 (Kocaman & Gruen, Citation2010), TH-1 (Wang et al., Citation2017) and ZY-3 (Yang et al., Citation2017; Li & Wang, Citation2012) satellites are equipped with three linear-array cameras, while Cartosat-1 (Willneff et al., Citation2008) and GF-7 (Tang et al., Citation2021) are equipped with two linear-array cameras.

Launched on 6 December 2020, GF-14 is by far the most technically challenging optical mapping satellite in China. Its main engineering goal is to map and update global 1:10,000-scale topographic map products without ground control points (GCPs), which places high requirements on absolute geolocation accuracy and accuracy retention. Although the sensor-related parameters are strictly calibrated in the laboratory before launch, the technical parameters of the payload change due to thermal, mechanical and other environmental changes during launch and on-orbit operation. To achieve this high-precision goal, on-orbit geometric calibration must be carried out (Crespi et al., Citation2010; Tadono et al., Citation2009).

Although geometric calibration methods using sparse control points or no calibration field exist, they are aimed at agile satellites, require fast multi-angle imaging, and are not suitable for stereo push-broom satellites (Lussy et al., Citation2012; Wang et al., Citation2017). A more rigorous and general method is the following: based on high-precision reference data of a ground calibration field, calibration-field images acquired by the satellite in orbit are automatically matched to obtain dense control points, which constrain the accurate determination of the calibration parameters. Common high-resolution earth observation systems (such as SPOT-5, IKONOS, ALOS-1, WorldView, GeoEye-1 and ZY-3) use ground calibration fields to carry out on-orbit geometric calibration regularly or irregularly. After calibration and block adjustment, the GCP-free accuracy of the Pleiades series and SPOT-6/7 reaches 10 m (CE90); ALOS-1 reaches 8 m in plane and 10 m in elevation (RMS) (Gruen et al., Citation2007); the WorldView series reaches 3.5 m (CE90) (DigitalGlobe, Citation2010); GeoEye-1 reaches 2.5 m (CE90) (Crespi et al., Citation2010); the planar and elevation RMS of ZY-3 are both better than 5 m (Gong et al., Citation2017); and the TH-1 03 satellite reaches 3.7 m in plane and 2.4 m in elevation (RMS) (Wang et al., Citation2017, Citation2019). Through an overall solution of the stereo camera calibration parameters, GF-14 reaches 2.3 m in plane (RMS) and 1.9 m in elevation (Lu et al., Citation2023). These missions have accumulated considerable experience in geometric calibration. For example, for SPOT-5 the calibration parameters are classified into static parameters (external and internal calibration) and dynamic parameters (orbit, attitude), and ALOS-1 uses 30 additional parameters to compensate for systematic error (Gruen et al., Citation2007).
TH-1 has carried out an overall calibration of the camera’s principal distance, principal point position, intersection angle, and the angle between the star camera and the ground camera (Wang & Wang, Citation2012).

The essence of on-orbit geometric calibration is to compensate the systematic errors in geometric positioning using satellite imagery. Owing to the complexity of, and correlation among, the internal and external geometric parameters, it is difficult to model them individually and separate them absolutely. The most common approach is to use a rotation matrix as the external calibration model to compensate the external error, and a polynomial as the internal calibration model to fit the viewing angle of each detector on the CCD of the line-scan camera. This approach has been successfully applied to camera parameter calibration for SPOT-5, IKONOS (Wang et al., Citation2014) and ZY-3 (Cao et al., Citation2014). This paper carries out high-precision on-orbit calibration of the GF-14 stereo camera and evaluates its geopositioning accuracy. We adopt the traditional “first external, then internal” processing sequence, perform the calibration using ground calibration field data, and use several global GCPs to evaluate the stereo geopositioning accuracy, thereby providing processing methods and an accuracy reference for surveying and mapping applications of the GF-14 satellite.

GF-14 satellite payload overview

The main mission of the GF-14 satellite is to carry out global 1:10,000-scale topographic mapping without relying on GCPs. The main payloads include a two linear-array stereo camera, a hyperspectral camera, a laser ranging system, an optical axis position measuring instrument, attitude measurement equipment (star camera, star sensor + gyro), and GNSS position measurement equipment (BeiDou + GPS) (Wang et al., Citation2023).

The two linear-array stereo camera has a convergence angle of 31°, with the front camera tilted at +26° and the back camera at −5°; the panchromatic cameras have a ground resolution of 0.6 m. A 4-band linear-array sensor placed on the focal plane of the panchromatic camera acquires multispectral images with a ground resolution of 2.4 m. Both the panchromatic and multispectral images have a swath width of 40 km. The hyperspectral camera acquires 100 bands, with a visible/near-infrared resolution of 5 m and a shortwave-infrared resolution of 10 m, both over a 9.9 km swath.

To improve elevation accuracy, the satellite also carries a 3-beam laser ranging system with a repetition frequency of 2 Hz and a laser footprint of approximately 60 m. All three laser beams adopt a linear detection regime and record the echo waveform. The laser ranging system is also equipped with three footprint cameras with a ground resolution of 7.2 m. By matching the footprint camera images with the two linear-array stereo camera images, the ground position of each laser point can be accurately determined.

The satellite platform also carries two high-precision star cameras and a set of optical axis position measurement devices. Combined on-orbit attitude determination from the two star cameras enables accurate calculation of the attitude angles at the moment of exposure. The optical axis position measurement device records the relative position changes (in three axes) between the front and back cameras and the star camera, as well as the focal length changes of these cameras, thus measuring and monitoring changes in camera pointing and focal length in real time during in-orbit imaging.

Rigorous imaging geometry model

The rigorous imaging geometric model of optical satellite imagery is constructed from the collinearity equation, which essentially states that three points are collinear: the camera projection center, the image point, and the associated object point. It serves as the fundamental model of satellite image geometric processing. Equivalently, the image vector \(V_{image}\), from the projection center to the image point in the camera coordinate system, is collinear with the object vector \(V_{object}\), from the projection center to the object point in the object coordinate system (Gong et al., Citation2017; Poli, Citation2014), as shown in Figure 1.

Figure 1. Vector collinearity diagram.

The GF-14 satellite adopts linear-array push-broom imaging: at each imaging instant the satellite acquires a single image line, and continuous strip images are formed as the satellite moves. Each image row satisfies the projective collinearity equation, and the rigorous model is given in Equation (1).

$$
\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}_{WGS84} =
\begin{bmatrix} X_{GPS} \\ Y_{GPS} \\ Z_{GPS} \end{bmatrix} +
\lambda\, R_{J2000}^{WGS84} R_{body}^{J2000}
\left(
\begin{bmatrix} D_x \\ D_y \\ D_z \end{bmatrix} +
\begin{bmatrix} d_x \\ d_y \\ d_z \end{bmatrix} +
R_{cam}^{body} \begin{bmatrix} x \\ y \\ f \end{bmatrix}
\right)
\tag{1}
$$

where \([X\ Y\ Z]_{WGS84}^{T}\) is the geodetic coordinate vector of the target point on the WGS84 ellipsoid; \([X_{GPS}\ Y_{GPS}\ Z_{GPS}]^{T}\) is the position of the GPS (GNSS) antenna phase center in the WGS-84 Cartesian coordinate system; \(\lambda\) is the imaging scale factor; \([x\ y\ f]^{T}\) is the image point coordinate in the image coordinate system, with \(f\) the principal distance; \([D_x\ D_y\ D_z]^{T}\) and \([d_x\ d_y\ d_z]^{T}\) are the installation offsets of the GPS antenna phase center and the camera projection center, respectively; \(R_{cam}^{body}\), \(R_{body}^{J2000}\) and \(R_{J2000}^{WGS84}\) are \(3\times3\) matrices representing the transformations from the camera coordinate system to the satellite body coordinate system, from the body coordinate system to the J2000 coordinate system, and from the J2000 coordinate system to the WGS84 coordinate system, respectively.
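As an illustration, the positioning chain of Equation (1) can be sketched in a few lines of Python. This is a toy sketch with identity rotations and placeholder values, not GF-14 parameters, and the small lever-arm offsets D and d are ignored:

```python
import numpy as np

def ground_point(p_gps, R_j2000_to_wgs84, R_body_to_j2000, R_cam_to_body,
                 look_cam, scale):
    """Eq. (1) without the small lever-arm offsets D and d:
    ground point = GNSS position + scale * (rotation chain @ look vector)."""
    R = R_j2000_to_wgs84 @ R_body_to_j2000 @ R_cam_to_body
    return p_gps + scale * (R @ look_cam)

# Toy example: identity rotations, a 500 km "orbit", nadir-pointing look vector.
I = np.eye(3)
p_gps = np.array([0.0, 0.0, 500e3])
look = np.array([0.0, 0.0, -1.0])     # unit look vector, pointing down
xyz = ground_point(p_gps, I, I, I, look, scale=500e3)
```

With identity rotations and a nadir look vector, the ray simply descends 500 km from the antenna position, which makes the scale factor's role in Equation (1) easy to see.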

When no GCPs are available, the rigorous imaging geometric model is intersected with the earth ellipsoid model, and target geopositioning is carried out with the support of a high-precision DEM (digital elevation model). The ellipsoid model is given in Equation (2).

$$
\frac{X^2 + Y^2}{A^2} + \frac{Z^2}{B^2} = 1
\tag{2}
$$

where \(A = a_e + h\) and \(B = b_e + h\); \(a_e = 6378137.0\,\mathrm{m}\) and \(b_e = 6356752.3\,\mathrm{m}\) are the semi-major and semi-minor axes of the WGS-84 earth ellipsoid, respectively; and \(h\) is the ellipsoidal height of the observed target, obtained by geographic coordinate interpolation from DEM data. If GCPs are involved in the calculation, \(h\) is assigned the ellipsoidal height of the control point. This single-image geolocation algorithm is referred to below as the look direction method (Riazanoff, Citation2002).
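The look direction method amounts to intersecting the image ray with the height-extended ellipsoid of Equation (2). A minimal sketch follows; the function name and the closed-form quadratic solution are ours, and in practice \(h\) would come from DEM interpolation rather than a constant:

```python
import numpy as np

def ray_ellipsoid_intersect(p, u, h=0.0, a=6378137.0, b=6356752.3):
    """Intersect the ray X = p + lam*u with (X^2+Y^2)/A^2 + Z^2/B^2 = 1,
    where A = a + h, B = b + h (Eq. (2)). Returns the nearer intersection,
    or None if the ray misses the ellipsoid."""
    A, B = a + h, b + h
    # Work in coordinates scaled by the semi-axes, so the surface is a sphere.
    px, py, pz = p[0] / A, p[1] / A, p[2] / B
    ux, uy, uz = u[0] / A, u[1] / A, u[2] / B
    q2 = ux * ux + uy * uy + uz * uz
    q1 = 2.0 * (px * ux + py * uy + pz * uz)
    q0 = px * px + py * py + pz * pz - 1.0
    disc = q1 * q1 - 4.0 * q2 * q0
    if disc < 0.0:
        return None
    lam = (-q1 - np.sqrt(disc)) / (2.0 * q2)   # nearer root = visible surface
    return p + lam * u
```

For a sensor 500 km above the equator looking straight down, the returned point lies on the equatorial radius, which is a quick sanity check on the quadratic.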

External parameters calibration model

If the parameters in the rigorous geometric model (Equation (1)) were accurate enough, the ideal three-dimensional coordinates of ground points could be obtained directly by stereo positioning. In practice, after the satellite enters orbit, the relative position and attitude between the linear-array sensor, the star sensor/star camera and the GPS change, and there are also systematic errors in the on-board attitude and position measurement equipment. These errors are collectively called external errors. Among them, the mounting error between payloads and the attitude measurement error both appear as attitude angle observation errors and can be combined into a single attitude error; likewise, the orbit measurement error and the GPS eccentricity error both appear as sensor position observation errors and can also be combined. Since line elements and angle elements have a similar influence on geopositioning accuracy, a bias matrix \(R_{off}\) is commonly introduced to compensate the external errors uniformly: it merges the GPS eccentric vector, position and attitude errors, and payload mounting-angle errors into one matrix. After introducing the bias matrix, the rigorous geometric imaging model of the satellite sensor becomes:

$$
\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}_{WGS84} =
\begin{bmatrix} X_{GPS} \\ Y_{GPS} \\ Z_{GPS} \end{bmatrix} +
\lambda\, R_{cam}^{WGS84} R_{off} \begin{bmatrix} x \\ y \\ f \end{bmatrix}
\tag{3}
$$

where \(R_{cam}^{WGS84}\) is the rotation matrix from the camera coordinate system to the WGS84 coordinate system. \(R_{off}\) is also a \(3\times3\) matrix and can be parameterised with any of the three-axis rotation orders. Taking the Y-X-Z order as an example, with \(\varphi\), \(\omega\) and \(\kappa\) the three rotation angles, \(R_{off}\) is:

$$
R_{off} =
\begin{bmatrix} \cos\varphi & 0 & -\sin\varphi \\ 0 & 1 & 0 \\ \sin\varphi & 0 & \cos\varphi \end{bmatrix}
\begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\omega & -\sin\omega \\ 0 & \sin\omega & \cos\omega \end{bmatrix}
\begin{bmatrix} \cos\kappa & -\sin\kappa & 0 \\ \sin\kappa & \cos\kappa & 0 \\ 0 & 0 & 1 \end{bmatrix}
\tag{4}
$$

In the calculation, the error equations are constructed after linearization, and the three angle elements of the bias matrix are solved using several ground control points. Because of the correlation between interior and exterior orientation elements, some interior parameter errors (such as offsets and rotations) are also partially compensated by the bias matrix \(R_{off}\).
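For reference, the bias matrix of Equation (4) can be assembled as below. The signs of the sine terms follow the standard right-handed rotation convention, which is our assumption since the extracted equation does not preserve them:

```python
import numpy as np

def r_off(phi, omega, kappa):
    """Bias matrix of Eq. (4), Y-X-Z rotation order (angles in radians).
    Sign convention of the sine terms is the standard right-handed one."""
    cp, sp = np.cos(phi), np.sin(phi)
    co, so = np.cos(omega), np.sin(omega)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Ry = np.array([[cp, 0.0, -sp], [0.0, 1.0, 0.0], [sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, co, -so], [0.0, so, co]])
    Rz = np.array([[ck, -sk, 0.0], [sk, ck, 0.0], [0.0, 0.0, 1.0]])
    return Ry @ Rx @ Rz
```

Two useful sanity checks: `r_off(0, 0, 0)` is the identity (the pre-calibration initial state described later), and any `r_off` output is orthonormal, as a rotation matrix must be.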

Internal parameters calibration model

The internal errors of the sensor include the optical distortion of the camera, changes of the principal distance, offsets of the principal point, and translation, rotation and scaling of the linear array. Two internal parameter models are commonly used: the physical model and the viewing-angle model. The physical model is constructed directly from the physical meaning of the internal parameters; although physically interpretable, its expression is complex and its parameters are correlated. In this paper, the viewing-angle model is used to describe the internal geometric distortion of the GF-14 two linear-array camera. This method models the various internal deformations of the camera directly, without regard to their type or physical cause. The pixel viewing angle is the pointing direction of each CCD detector’s line of sight in the sensor coordinate system; Figure 2 illustrates the viewing angle in the camera coordinate system.

Figure 2. The diagram of viewing angle in the camera coordinate system.

From Figure 2, the viewing angles can be computed as:

$$
\tan\varphi_x = \frac{x}{f}, \qquad \tan\varphi_y = \frac{y}{f}
\tag{5}
$$

Substituting equation (5) into equation (3), we get:

$$
\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}_{WGS84} =
\begin{bmatrix} X_{GPS} \\ Y_{GPS} \\ Z_{GPS} \end{bmatrix} +
\lambda\, R_{cam}^{WGS84} R_{off} \begin{bmatrix} \tan\varphi_x \\ \tan\varphi_y \\ 1 \end{bmatrix}
\tag{6}
$$

In the actual computation, a third- to fifth-degree polynomial model is used to represent the viewing angle of each detector. Taking the fifth-degree polynomial as an example:

$$
\begin{aligned}
\tan\varphi_x &= a_0 + a_1 s + a_2 s^2 + a_3 s^3 + a_4 s^4 + a_5 s^5 \\
\tan\varphi_y &= b_0 + b_1 s + b_2 s^2 + b_3 s^3 + b_4 s^4 + b_5 s^5
\end{aligned}
\tag{7}
$$

where \(a_0\)–\(a_5\) and \(b_0\)–\(b_5\) are the viewing-angle model parameters to be estimated, and \(s\) is the pixel number (from 1 at the leftmost detector to 69,728 at the rightmost). Since the front and back cameras are calibrated separately, for convenience the origin \(O_c\) of the image coordinate system is set at the camera principal point, so the initial values are set as \(b_0 = y_0\mu\), \(b_1 = -\mu\), where \(y_0\) is the pixel number of the principal point of the (front or back) camera in the image coordinate system and \(\mu\) is the pixel size. The principal point pixel numbers were calibrated in the laboratory; \(y_0\) of the front and back cameras are 34,864.5 and 35,614.5, respectively.
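The fifth-order viewing-angle model of Equation (7), with the stated initial values, can be sketched as below. The pixel size `MU` of 7 µm and the sign convention (one of \(b_0\), \(b_1\) must be negative so that the principal-point detector maps to a zero across-track angle) are our assumptions:

```python
import numpy as np

MU = 7.0e-6      # assumed pixel size of 7 um, for illustration only
Y0 = 34864.5     # front-camera principal-point pixel number (from the text)

def tan_view_angles(s, a, b):
    """Eq. (7): fifth-order polynomials in the detector number s."""
    powers = np.vander(np.atleast_1d(float(s)), 6, increasing=True)  # 1..s^5
    return powers @ a, powers @ b

# Pre-calibration state: the CCD is assumed to lie on the y-axis, so
# tan(phi_x) = 0, and tan(phi_y) is linear in s, vanishing at s = Y0.
a_init = np.zeros(6)
b_init = np.zeros(6)
b_init[0], b_init[1] = Y0 * MU, -MU

tx, ty = tan_view_angles(Y0, a_init, b_init)   # angles at the principal point
```

Evaluating the model at the principal-point detector returns zero viewing angles, which is exactly the design condition the initial values encode; calibration then perturbs all twelve coefficients away from this state.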

The front and back cameras are calibrated separately to obtain their external and internal calibration parameters. The directions of the corresponding image rays can then be accurately recovered, and the forward intersection method can be used to calculate the three-dimensional coordinates of the target point. Note that since the true coordinates of the target point in the geodetic Cartesian system \(XYZ\) are unknown, computing the scale coefficient by projecting onto a fixed reference plane (such as \(XOY\)) may introduce large errors due to the projection angle. The projection-coefficient method commonly used in aerial photogrammetry should therefore be avoided in forward intersection; instead, the rigorous solution based on the collinearity equations should be adopted (Zhang et al., Citation2008).
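As a simplified stand-in for the rigorous collinearity-based forward intersection, the sketch below finds the point minimizing the summed squared distances to the recovered image rays (two or more); the function name and formulation are ours:

```python
import numpy as np

def forward_intersection(origins, directions):
    """Least-squares point closest to a set of 3-D rays.
    Each ray is given by an origin p and a direction u; the normal
    equations sum the projectors orthogonal to each ray direction."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, u in zip(origins, directions):
        u = u / np.linalg.norm(u)
        M = np.eye(3) - np.outer(u, u)   # projector orthogonal to the ray
        A += M
        b += M @ p
    return np.linalg.solve(A, b)
```

When the two rays genuinely intersect (no measurement noise), the least-squares solution coincides with the geometric intersection; with noisy rays it returns the midpoint-like closest point, which is the behaviour a rigorous bundle solution also converges to for two images.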

On-orbit geometric calibration test and accuracy verification

Experimental data and study area

GF-14 is a comprehensive optical mapping satellite whose high-precision stereo positioning capability is obtained mainly by combined processing of the two linear-array stereo camera, the laser altimeter and the optical axis position recorder. This paper focuses only on the on-orbit calibration and accuracy evaluation of the two linear-array stereo camera, whose main parameters are listed in Table 1. The satellite orbit altitude is about 500 km. The tilt angles of the front and back cameras are +26° and −5°, respectively (Figure 3). The ground resolution of both cameras is 0.6 m, the radiometric resolution is 12 bit, and the ground coverage exceeds 40 km. The front and back cameras are composed of 9 and 6 TDI CCD chips, respectively, combined by a full-reflection/full-transmission optical splicing method, yielding 69,728 effective pixels in the cross-track direction.

Figure 3. Schematic diagram of the geometric relationships of GF-14 front camera and back camera.

Table 1. Main technical parameters of the two linear-array cameras.

The on-orbit calibration test was carried out using calibration field data from Yinchuan City, Ningxia Province, China. The geographical location of the remote sensing images and the distribution of GCPs are shown in Figure 4. The experimental stereo images were acquired on 22 March 2021. A standard scene is 69,728 × 69,728 pixels, about 40 km on each side. The basic control data for Yinchuan is a high-precision point record file. There are 119 GCPs within the image coverage, located mostly in the plain area of Yinchuan City at elevations of 1072–1214 m. The GCPs were measured in the field with GPS equipment, with accuracies of 30 cm in plane and 5 cm in elevation.

Figure 4. Geographical location of Yinchuan calibration field and distribution of GCPs.

To verify the stereo geopositioning accuracy of the two linear-array images after on-orbit calibration, 9 sets of check data were selected and named Data 1 to Data 9 in chronological order. Data 1 covers the Jiangsu region of China and Data 6 is the Yinchuan field at another time; the remaining 7 sets are outside China: Data 2, 5 and 9 are in the United States, Data 3 in Japan, Data 4 in Singapore, Data 7 in Egypt, and Data 8 in Saudi Arabia. The CKPs are mostly road corner points in cities, with accuracy comparable to that of the Yinchuan area, meeting the accuracy verification requirements.

Table 2. Geometric geopositioning accuracy of Yinchuan area before and after on-orbit calibration (unit: m, RMS).

Table 3. Results of external calibration parameters/(units: degree).

Table 4. Results of internal calibration parameters.

Table 5. Single image planar positioning accuracy of all CKPs before and after calibration (unit: m, RMS).

On-orbit calibration results

On-orbit calibration was carried out on the front and back test images of the Yinchuan area acquired on 22 March 2021. The geometric model was constructed for each camera separately, and GCP-based external and internal orientation were then conducted for the front and back images, respectively. Finally, with the precise on-orbit calibration parameters obtained, the stereo-pair forward intersection method was used to evaluate the satellite geopositioning accuracy. In the calibration stage, to obtain the best calibration accuracy, all 119 points in the Yinchuan area were used as control points to solve the calibration parameters, and the positioning residuals of all points were then computed to assess the accuracy (i.e. the internal consistency of the calibration). In the verification stage over other areas, the CKPs were used to evaluate the forward intersection accuracy of the stereo images, reflecting the absolute positioning ability. Table 2 shows the geopositioning accuracy of each step before and after calibration; Table 3 gives the initial values and solved results of the external orientation parameters of the stereo cameras; Table 4 shows the initial values and solved results of the internal orientation parameters; Figure 5 shows the viewing angle and residual of each detector of the CCD array; and Figure 6, corresponding to Table 2, shows the planar error distribution at each calibration step.

Figure 5. Calibrated true viewing angle and difference with initial value of each detector of CCD array.

Figure 6. Error distribution before and after on-orbit calibration.

It should be noted that, in keeping with the accuracy indicators customarily used in China, we mainly report RMS in the following analysis, computed as

$$
RMS = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\Delta_i^{2}}
\tag{8}
$$

where \(\Delta_i\) is the difference between a calculated value and the true value, and \(n\) is the total number of points. Examples of RMS calculations can be found in the result tables below. The more internationally common CE90 and LE90 can be obtained by the following conversion (Wang et al., Citation2019):

$$
CE90 \approx 2.14 \times RMS_{planar}, \qquad LE90 \approx 1.65 \times RMS_{vertical}
\tag{9}
$$
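Equations (8) and (9) translate directly into code; the helper names below are ours:

```python
import numpy as np

def rms(errors):
    """Eq. (8): root-mean-square of the residuals."""
    e = np.asarray(errors, dtype=float)
    return float(np.sqrt(np.mean(e ** 2)))

def ce90(rms_planar):
    """Eq. (9): approximate circular error at 90% confidence."""
    return 2.14 * rms_planar

def le90(rms_vertical):
    """Eq. (9): approximate linear (vertical) error at 90% confidence."""
    return 1.65 * rms_vertical
```

Applying the conversion to the paper's overall figures, a planar RMS of 2.38 m gives a CE90 of about 5.09 m and an elevation RMS of 2.08 m gives an LE90 of about 3.43 m, matching the reported values.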

Overall, as shown in Table 2, the planar accuracies of the front and back cameras before calibration are 285.26 m and 224.27 m, respectively; after geometric calibration they reach 1.45 m and 0.99 m (single-image positioning method). The stereo geopositioning accuracy of the two linear-array camera reaches 0.67 m in plane and 1.10 m in elevation; the improvement is substantial and the final accuracy is very high.

For the rigorous model construction, this paper used the post-processed high-precision attitude data provided by the image supplier, combined with the laboratory-calibrated transformation matrices from the front/back cameras to the satellite body coordinate system, to obtain the high-precision external transformation matrix \(R_{cam}^{WGS84}\) in Equation (6). The look direction method based on Equation (2) was used to solve the ground coordinates corresponding to each image point. As shown in Table 2, the planar accuracy (RMS) of the stereo cameras before calibration is about 286 m or better, which indicates a relatively good direct positioning capability and shows that the laboratory calibration parameters and the attitude and orbit data are all of high accuracy. By comparison, the direct geopositioning accuracy of China’s previous generation of optical mapping satellites is at the kilometer level; for instance, the planar accuracy of the ZY-3 01 satellite is about ±1 km (Cao et al., Citation2014; Li, Citation2012). At the same time, Figure 6 shows that the planar error vectors of the GCPs before calibration are clearly biased in one direction, revealing a systematic error caused mainly by errors in the attitude data and the equipment mounting angles.

For the external orientation, the initial values of the external parameters \(\varphi\), \(\omega\), \(\kappa\) of \(R_{off}\) were all set to 0 (Table 3). After linearization, the calculations converged in no more than three iterations. After geometric calibration, the three angle elements of the front and back cameras were obtained (last column of Table 3); they form the rotation matrix \(R_{off}(\varphi,\omega,\kappa)\) through Equation (4), which compensates the systematic error caused by the external attitude data. Figure 6 shows clearly that after external orientation the planar error vectors of the GCPs no longer exhibit an obvious systematic bias but are distributed in all directions. Table 2 shows that the planar geopositioning accuracies of the front and back cameras are reduced from 285.26 m and 224.27 m to 2.11 m and 1.51 m, respectively, a significant improvement.

Figure 5 shows the calibrated true viewing angle of each CCD detector along and across the CCD array, together with the numerical difference between the true viewing angle and the ideal state (converted to pixels). The true viewing angle is almost perfectly linear in the CCD detector number. The residual is smallest towards the middle pixels. The error of the back camera grows towards both sides, while the front camera has a local minimum of about −2 pixels around detector 15,000, reflecting a significant difference in the linear deformation of the front and back cameras along the CCD array direction. Without internal orientation calibration, a maximum along-CCD error of about 1.2 m (2 × 0.6 m) would result. The along-track curves of the two cameras are identical in trend, because the ideal along-track position should be 0. The leftmost end of the CCD exhibits the largest along-track residual, which suggests local rotational deformation at the leftmost end of the front and back cameras and degraded linearity of the whole CCD.

For the internal orientation, only the initial values of \(b_0\) and \(b_1\) among the 12 parameters of each camera were non-zero; the rest were set to 0. Per Equation (7), in the ideal design state the CCD line array should coincide with the y-axis, i.e. \(\varphi_x = 0\), so the initial values of \(a_0\)–\(a_5\) were set to 0; meanwhile, the CCD principal-point pixel should coincide with the origin of the coordinate system, so \(b_0 = 34{,}864.5 \times 0.000007 = 0.2440515\) for the front camera. After geometric calibration, none of the internal parameters in Table 4 remains 0. The final solved values of \(b_0\) and \(b_1\) differ only slightly from their initial values, and the magnitudes of the remaining parameters range from \(10^{-28}\) to \(10^{-5}\). These results show that the CCD array does undergo geometric variations such as rotation, scaling and translation in the image coordinate system, which are difficult to correct individually but can be compensated synthetically by a polynomial model.

Although the error vectors in Figure 6 no longer show a clear bias in a single direction after external orientation, they nevertheless exhibit local systematic behaviour: the error vectors of control points with x-coordinates greater than \(6.05\times10^{5}\) are biased to the right for one camera and to the left for the other. Such an error, clearly related to the CCD column number, is typically caused by interior orientation errors and can be eliminated by the viewing-angle model. After internal calibration with the viewing-angle model, the error vectors show obvious randomness over the whole image, indicating that the camera systematic errors have been well eliminated.

As shown in Table 2, the planar geopositioning accuracies of the front and back cameras improve from 2.11 m and 1.51 m to 1.45 m and 0.99 m, respectively, a considerable gain. Note that since GF-14 has 69,728 pixels, appreciable local deformation at both ends of the CCD is possible, so the higher, fifth-order model is the more reasonable choice for internal orientation. This is also illustrated by the accuracy comparison of different polynomial orders: although the improvement of the fifth-order model over the third-order polynomial is not large, it still yields the smallest error and the highest accuracy.

Accuracy assessment outside the calibration field

Acquiring high-precision CKPs on a global scale is very challenging, and this paper uses a small number of CKPs in several regions to verify the true accuracy of GF-14 after geometric calibration. The validation sites are labelled Area 1 to Area 9 in order of data acquisition time, with Area 1 and Area 6 located within China and the rest outside China; their approximate geographical locations are as described above. We use the look direction method to calculate the planar accuracy of the front or back camera, and the forward intersection method to calculate the three-dimensional accuracy of the stereo cameras.

Table 5 compares the single-camera positioning accuracy before and after calibration, with coordinates projected into the local UTM coordinate system. The planar accuracies of the front and back cameras before calibration are 316.33 m and 256.33 m, respectively, improving to 3.48 m and 2.09 m after calibration. This result is as expected and slightly worse than the planar accuracy within the calibration field (Table 2).

Table 6 presents the stereo positioning accuracy of Area 1 before and after on-orbit calibration. Here, \(\Delta X\) and \(\Delta Y\) are the differences between the calculated ground point coordinates and the check point coordinates along the X (east) and Y (north) axes of the UTM coordinate system, respectively; \(\Delta XY = \sqrt{\Delta X^2 + \Delta Y^2}\) is the planar error, and \(\Delta h\) is the geodetic height difference between the calculated point and the check point. As Table 6 shows, \(\Delta XY = 285.52\,\mathrm{m}\) and \(\Delta h = 223.33\,\mathrm{m}\) before calibration, dropping to 2.72 m and 2.58 m, respectively, after calibration.

Table 6. Stereo positioning accuracy of Area 1 before and after on-orbit calibration (unit: m, RMS).

Figure 7 shows the error distribution of some CKPs before and after calibration; for example, two of its panels show the error plots of Area 1 before and after calibration, respectively. Since there is only one checkpoint in Area 5, it is not shown. Both the planar errors (red arrows) and elevation errors (blue arrows) before geometric calibration have an obvious common direction, reflecting systematic positioning errors: the planar RMS (RMS_XY) ranges from 273 m to 292 m and the elevation RMS (RMS_h) from 219 m to 227 m. After geometric calibration, the red arrows point in random directions and the blue arrows point up or down without preference, indicating that the systematic errors have been eliminated.

Figure 7. Error distribution of some check points 440 × 367 mm (300 × 300 DPI).


Stereo positioning accuracy of all CKPs after calibration is shown in . Here, RMSXY and RMSXYh represent planar and 3D accuracy, respectively. RMSXY of stereo positioning ranges from 1.01 m to 3.80 m, and RMSh ranges from 0.91 m to 3.09 m. The best accuracy occurs in Area 6, with 1.01 m in plane and 1.19 m in elevation. Over all CKPs, the mean RMSXY is 2.38 m and the mean RMSh is 2.08 m. According to Equation (9), the final CE90 = 5.09 m and LE90 = 3.43 m.

Table 7. Stereo positioning accuracy of all CKPs after calibration (unit: m, RMS).
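Equation (9) is not reproduced in this section. Assuming it applies the standard 90%-confidence multipliers to the planar and elevation RMS (an assumption on our part), the reported CE90 and LE90 values are approximately reproduced:

```python
# Standard 90%-confidence multipliers (assumed form of Equation (9)):
# CE90 from the planar (radial) RMS, LE90 from the elevation RMS.
CE90_FACTOR = 2.1460
LE90_FACTOR = 1.6449

def ce90(rms_xy: float) -> float:
    """Circular error at 90% confidence from the planar RMS."""
    return CE90_FACTOR * rms_xy

def le90(rms_h: float) -> float:
    """Linear error at 90% confidence from the elevation RMS."""
    return LE90_FACTOR * rms_h

# With the reported mean RMS values of 2.38 m and 2.08 m these give
# roughly 5.11 m and 3.42 m, close to the published 5.09 m and 3.43 m
# (small differences are attributable to rounding of the mean RMS).
```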

Discussion and conclusion

Before on-orbit calibration, the planar accuracy of direct ground target positioning of the GF-14 front and back cameras reached 285.26 m (RMS) and 224.27 m (RMS), respectively, an order-of-magnitude improvement over the previous generation of Chinese optical mapping satellites. Taking ZY-3 as an example, the direct planar accuracy of its three linear-array imagery before calibration is only about 1 km (Cao et al., Citation2019). The GF-14 satellite carries a star camera and a star sensor with a gyro, both meticulously calibrated in the laboratory. They output two independent sets of attitude data for mutual backup and verification, and the two yield comparable geopositioning accuracy in actual processing. The above results show that China's domestic attitude measurement equipment has reached a high level, and also indicate that the placement parameters between the payloads during on-orbit operation do not differ much from the laboratory calibration values.

After using the comprehensive bias matrix to calibrate the exterior orientation error, the planar positioning accuracies of the two cameras improved significantly: for the front and back cameras, the error dropped from 285.26 m and 224.27 m to 2.11 m and 1.51 m, respectively. This indicates that most of the errors in direct positioning are caused by the exterior orientation angles or payload placement relations. From , it can be seen that the maximum internal orientation error of the front and back cameras is about 2–2.5 pixels, corresponding to about 1.5 m of planar error. Therefore, once the exterior orientation error is compensated, good planar accuracy can be obtained.
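A comprehensive bias matrix of this kind can be sketched as a small compensation rotation applied on top of the measured attitude. The rotation sequence and angle names below are assumptions for illustration, since they are not specified in this section.

```python
import numpy as np

def bias_matrix(phi, omega, kappa):
    """Comprehensive bias matrix built from three small compensation
    angles (radians). The Rz @ Ry @ Rx sequence is a hypothetical
    convention chosen for this sketch."""
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(phi), -np.sin(phi)],
                   [0, np.sin(phi),  np.cos(phi)]])
    Ry = np.array([[ np.cos(omega), 0, np.sin(omega)],
                   [0, 1, 0],
                   [-np.sin(omega), 0, np.cos(omega)]])
    Rz = np.array([[np.cos(kappa), -np.sin(kappa), 0],
                   [np.sin(kappa),  np.cos(kappa), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx

# A 1e-4 rad (about 20 arcsec) bias slightly deflects the boresight;
# at ~500 km altitude such an angle alone maps to ~50 m on the ground,
# which is why exterior orientation dominates the raw positioning error.
R = bias_matrix(1e-4, 0.0, 0.0)
u = R @ np.array([0.0, 0.0, 1.0])
```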

After calibrating the viewing angle of each CCD detector with the fifth-order polynomial, the planar accuracies of the front and back cameras reached 1.46 m (RMS) and 0.99 m (RMS), respectively. The accuracy of the front camera is slightly worse, possibly because its large earth-observation inclination angle leads to larger errors in GCP selection, which in turn affects the calibration accuracy. After viewing angle calibration, the stereo geopositioning accuracy reaches 0.67 m (RMS) in plane and 1.10 m (RMS) in elevation. The planar accuracy is close to 1 pixel, confirming the correctness of the calibration method and results.
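The fifth-order viewing-angle model can be sketched as an ordinary least-squares polynomial fit per camera; the synthetic detector angles below are illustrative, not GF-14 calibration data.

```python
import numpy as np

def fit_look_angles(sample_idx, psi_x, psi_y, degree=5):
    """Fit fifth-order polynomials psi(s) to calibrated viewing angles
    at a set of CCD detector (sample) positions. Returns the two
    coefficient vectors, highest degree first (numpy.polyfit order)."""
    cx = np.polyfit(sample_idx, psi_x, degree)
    cy = np.polyfit(sample_idx, psi_y, degree)
    return cx, cy

# Synthetic check: a degree-5 fit recovers a known cubic distortion
# of the along-sample viewing angle essentially exactly.
s = np.linspace(0.0, 1.0, 50)            # normalised detector index
true_x = 1e-4 * s**3 - 2e-5 * s          # hypothetical distortion
cx, cy = fit_look_angles(s, true_x, np.zeros_like(s))
resid = np.polyval(cx, s) - true_x
```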

The inspection results of a limited number of regional CKPs show that the ground target positioning accuracy of the GF-14 stereo camera is 2.38 m in plane (RMS) and 2.08 m in elevation (RMS); the corresponding CE90 and LE90 are 5.09 m and 3.43 m, respectively. The accuracy of the overall parameter solution method provided by Lu et al. (Citation2023) is 2.34 m in plane and 1.97 m in elevation, which is very close to the result of the pointing angle method used in this paper. This result meets the requirements of the camera design accuracy index and lays a solid foundation for the satellite to realize global 1:10,000 mapping without GCPs.

The time span of the test data is only slightly more than two months, so the test data in this paper alone cannot reveal how the accuracy varies with time. This paper also does not yet incorporate the optical axis position recorder data. Follow-up work should collect a longer time series of images and GCPs to carry out the relevant research.

It should be noted that the calibration accuracy in this paper does not represent the final accuracy of GF-14 stereo positioning. To meet the requirements of 1:10,000 topographic mapping without GCPs, especially the 1.6 m elevation requirement, laser ranging data and optical axis data should be comprehensively applied in post-processing. In fact, after the combined processing of stereo imagery, laser altimetry data and optical axis data, the single-route geopositioning accuracy without GCPs reaches 1.8 m (RMS) in plane (CE90 = 3.85 m) and 0.8 m (RMS) in elevation (LE90 = 1.32 m) (Wang et al., Citation2023). GF-14 achieves this high geopositioning accuracy without GCPs or multi-strip block adjustment, and the accuracy is the highest level reported so far. The technical details will be presented in a separate paper.

Acknowledgments

The authors are grateful to the editors and reviewers of this manuscript. We would also like to thank Ding Lei, Li Guoyuan and Shi Yulin, engineers of Aerospace Stellar Technology Co., Ltd. (503), for their data and technical support.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the Independent Project of the State Key Laboratory of Geographic Information Engineering (SKLGIE2022-ZZ-01) and the Youth Science Innovation Fund (2013-01).

References

  • Baltsavias, E., Zhang, L., & Eisenbeiss, H. (2009). DSM generation and interior orientation determination of IKONOS images using a testfield in Switzerland. Photogrammetrie - Fernerkundung - Geoinformation, 2006(1). https://www.isprs.org/proceedings/2005/hannover05/paper/112-baltsavias.pdf
  • Büyüksalih, G., & Jacobsen, K. (2000). Geometric aspects of MOMS-2P three-line imagery for mapping applications. Proceedings of Annual Meeting of the Remote Sensing Society. https://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=922431F57609CEE7F615F473843E5B7D?doi=10.1.1.512.1955&rep=rep1&type=pdf
  • Cao, B., Fang, Y., Jiang, Z., Gao, L., & Hu, H. (2019). Shallow water bathymetry from WorldView-2 stereo imagery using two-media photogrammetry. European Journal of Remote Sensing, 52(1), 506–14. https://doi.org/10.1080/22797254.2019.1658542
  • Cao, B., Wang, J., Hu, Y., Lv, Y., Yang, X., Gong, H., Li, G., & Lu, X. (2023). ICESAT-2 shallow bathymetric mapping based on a size and direction adaptive filtering algorithm. IEEE Journal of Selected Topics in Applied Earth Observations & Remote Sensing, 16, 6279–6295. https://doi.org/10.1109/JSTARS.2023.3290672
  • Cao, J., Yuan, X., Gong, J., & Duan, M. (2014). The look-angle calibration method for on-orbit geometric calibration of ZY-3 satellite imaging sensors. Acta Geodaetica et Cartographica Sinica, 43(10), 1039–1045. https://doi.org/10.13485/j.cnki.11-2089.2014.0147
  • Chen, B., Yang, Y., Dewei, X., & Huang, E. (2019). A dual band algorithm for shallow water depth retrieval from high spatial resolution imagery with no ground truth. ISPRS Journal of Photogrammetry and Remote Sensing, 151, 1–13. https://doi.org/10.1016/j.isprsjprs.2019.02.012
  • Crespi, M., Colosimo, G., Vendictis, L. D., Fratarcangeli, F., & Pieralice, F. (2010). GeoEye-1: Analysis of radiometric and geometric capability. Paper presented at the Personal Satellite Services: Second International ICST Conference, Revised Selected Papers.
  • DigitalGlobe. 2010. WorldView-2 overview. Retrieved November 28, 2015. http://global.digitalglobe.com/sites/default/files/DG-WorldView2-DS-PROD.pdf
  • Gong, D., Han, Y., & Xu, X. (2021). Global refinement of building boundary with line feature constraints for stereo dense image matching. Acta Geodaetica et Cartographica Sinica, 50(6), 833–846.
  • Gong, J., Wang, M., & Yang, B. (2017). High-precision geometric processing theory and method of high-resolution optical remote sensing satellite imagery without GCP. Acta Geodaetica et Cartographica Sinica, 46(10), 1255–1261. https://doi.org/10.11947/j.AGCS.2017.20170307
  • Gruen, A., Kocaman, S., & Wolff, K. (2007). Calibration and validation of early ALOS/PRISM images. Journal of the Japan Society of Photogrammetry & Remote Sensing, 46(1), 24–38. https://www.research-collection.ethz.ch/handle/20.500.11850/6907
  • Kocaman, S., & Gruen, A. (2010). Orientation and self-calibration of ALOS PRISM imagery. Photogrammetric Record, 23(123), 323–340.
  • Kornus, W., Lehner, M., & Schroeder, M. (2000). Geometric in-flight calibration of the stereoscopic line-CCD scanner MOMS-2P. ISPRS Journal of Photogrammetry and Remote Sensing, 55(1), 59–71.
  • Li, D. (2012). China’s first civilian three-line-array stereo mapping satellite: ZY-3. Acta Geodaetica et Cartographica Sinica, 41(3), 317–322. http://xb.chinasmp.com/CN/article/downloadArticleFile.do?attachType=PDF&id=6042
  • Lim, J., & Seock Lee, K. (2018). Flood mapping using multi-source remotely sensed data and logistic regression in the heterogeneous mountainous regions in North Korea. Remote Sensing, 10, 1–17. https://doi.org/10.3390/rs10071036
  • Li, D., & Wang, M. (2012). On-orbit geometric calibration and accuracy assessment of ZY-3. Spacecraft Recovery & Remote Sensing, 33(3), 1–6. https://doi.org/10.3969/j.issn.1009-8518.2012.03.001
  • Lussy, F. D., Greslou, D., Dechoz, C., Amberg, V., & Fourest, S. (2012). Pléiades HR in-flight geometrical calibration: Location and mapping of the focal plane. ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, XXXIX-B1(4), 519–523.
  • Lu, X., Wang, J., Yang, X., Lv, Y., Hu, Y., Wei, Y., & Cao, B. (2023). High-precision on-orbit geometric calibration of the GF-14 satellite dual-line-array cameras. Acta Geodaetica et Cartographica Sinica, 52(1), 15–21.
  • Poli, D. (2014). A rigorous model for spaceborne linear array sensors. Photogrammetric Engineering & Remote Sensing, 73(2), 187–196. https://doi.org/10.14358/PERS.73.2.187
  • Poli, D., Remondino, F., Angiuli, E., & Agugiaro, G. (2015). Radiometric and geometric evaluation of GeoEye-1, WorldView-2 and Pléiades-1A stereo images for 3D information extraction. ISPRS Journal of Photogrammetry and Remote Sensing, 100, 35–47.
  • Riazanoff, S. (2002). SPOT satellite geometry handbook. https://www.intelligence-airbusds.com/files/pmedia/public/r439_9_spot_geometry_handbook.pdf
  • Tadono, T., Shimada, M., Murakami, H., & Takaku, J. (2009). Calibration of PRISM and AVNIR-2 onboard ALOS “Daichi”. IEEE Transactions on Geoscience & Remote Sensing, 47(12), 4042–4050. https://doi.org/10.1109/TGRS.2009.2025270
  • Tang, X., Xie, J., & Mo, F. (2021). GF-7 dual-beam laser altimeter on-orbit geometric calibration and test verification. Acta Geodaetica et Cartographica Sinica, 50(3), 384–395.
  • Wang, M., Bo, Y., Fen, H., & Xi, Z. (2014). On-orbit geometric calibration model and its applications for high-resolution optical satellite imagery. Remote Sensing, 6(5), 4391–4408. https://doi.org/10.3390/rs6054391
  • Wang, J., & Wang, R. (2012). EFP multi-functional bundle adjustment of mapping satellite-1 without ground control points. Journal of Remote Sensing, 16(supplement), 112–115.
  • Wang, J., Wang, R., Hu, X., & Su, Z. (2017). The on-orbit calibration of geometric parameters of the Tian-Hui 1 (TH-1) satellite. ISPRS Journal of Photogrammetry and Remote Sensing, 124, 144–151. https://doi.org/10.1016/j.isprsjprs.2017.01.003
  • Wang, R., Wang, J., & Li, J. (2019). Improvement strategy for location accuracy without ground control points of 3rd satellite of TH-1. Acta Geodaetica et Cartographica Sinica, 48(6), 671–675. https://doi.org/10.11947/j.AGCS.2019.20190058
  • Wang, J., Yang, Y., Hu, Y., LV, Y., Yang, X., Lu, X., & Cao, B. (2023). Preliminary location accuracy assessments of GF-14 stereo mapping satellite without ground control points. Acta Geodaetica et Cartographica Sinica, 52(1), 8–14.
  • Willneff, J., Weser, T., Rottensteiner, F., & Fraser, C. S. (2008). Precise georeferencing of Cartosat imagery via different orientation models. ISPRS Beijing Congress.
  • Yang, B., Pi, Y., Li, X., & Yang, Y. (2020). Integrated geometric self-calibration of stereo cameras onboard the ZiYuan-3 satellite. ISPRS Journal of Photogrammetry and Remote Sensing, 162, 173–183. https://doi.org/10.1016/j.isprsjprs.2020.02.015
  • Yang, B., Wang, M., Xu, W., Li, D., Gong, J., & Pi, Y. (2017). Large-scale block adjustment without use of ground control points based on the compensation of geometric calibration for ZY-3 images. ISPRS Journal of Photogrammetry and Remote Sensing, 134, 1–14. https://doi.org/10.1016/j.isprsjprs.2017.10.013
  • Zhang, B., Gong, Z., & Guo, H. (2008). Photogrammetry (pp. 84–89). Beijing: Surveying and Mapping Press.
  • Zhang, L., & Gruen, A. (2006). Multi-image matching for DSM generation from IKONOS imagery. ISPRS Journal of Photogrammetry and Remote Sensing, 60(3), 195–211.