GrandLeez

Description

This dataset will allow you to process a UAV mission and, more generally, an aerial mission. You will go through direct georeferencing, GNSS delay correction and coordinate system changes. We apply all these concepts to canopy model generation.

Download

You can find this dataset at http://micmac.ensg.eu/data/uas_grand_leez_dataset.zip
Once you have downloaded it, you have to unzip the ".zip" archive.
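
For example, on Linux or macOS you can extract the archive from the command line (assuming the standard unzip tool is available) :

unzip uas_grand_leez_dataset.zip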

Presentation

This dataset is kindly provided by "l’Unité Gestion des Ressources Forestières et des Milieux Naturels (GRFMN), Université de Liège". Contact: jo.lisein@ulg.ac.be. The files present in the directory are :

  • 200 images (800x600 px) captured by a RICOH GR DIGITAL 3

  • 2 files to change coordinate system : SysCoBL72_EPSG31370.xml and SysCoRTL.xml
  • Image neighbours : FileImagesNeighbour.xml
  • Image geolocation : GPS_WPK_Grand-Leez.csv
  • 2 command scripts : UASGrandLeez.sh (Linux) and UASGrandLeez.bat (Windows)
  • 1 file with detailed commands : cmd_UAS_Grand-Leez.txt

This dataset is a UAV mission carried out in Grand-Leez, Belgium, for a forest canopy survey.

Tutorial

Conversion of image coordinates

OriConvert is used for 5 purposes:

  1. Conversion of the embedded GPS data into the micmac format : OriTxtInFile
  2. Generate the image pairs file
  3. Change the coordinate system (from WGS84 to a locally tangent system) with the argument : ChSys=DegreeWGS84@SysCoRTL.xml
  4. Compute relative speed of the camera (for GPS delay determination) : MTD1=1 CalcV=1
  5. Select a sample of the image block (PATC) for camera calibration : NameCple=FileImagesNeighbour.xml ImC=R0040536.JPG NbImC=25

mm3d OriConvert OriTxtInFile GPS_WPK_Grand-Leez.csv Nav-Brut-RTL ChSys=DegreeWGS84@SysCoRTL.xml MTD1=1 NameCple=FileImagesNeighbour.xml CalcV=1 ImC=R0040536.JPG NbImC=25

See OriConvert for more details on arguments and file format.

Tie Point Generation with Tapioca

The file FileImagesNeighbour.xml contains, for each image, its different neighbours. If you open the file, you can see :

     <Cple>R0040439.JPG R0040519.JPG</Cple>
     <Cple>R0040439.JPG R0040514.JPG</Cple>
     <Cple>R0040439.JPG R0040444.JPG</Cple>
     <Cple>R0040439.JPG R0040517.JPG</Cple>
     <Cple>R0040439.JPG R0040438.JPG</Cple>
     <Cple>R0040439.JPG R0040440.JPG</Cple>
     <Cple>R0040439.JPG R0040441.JPG</Cple>
     <Cple>R0040439.JPG R0040516.JPG</Cple>
     <Cple>R0040439.JPG R0040442.JPG</Cple>
     <Cple>R0040439.JPG R0040515.JPG</Cple>
     <Cple>R0040439.JPG R0040443.JPG</Cple>

This means that image R0040439.JPG is connected to all the images listed in the <Cple> tags. You can then run the tie point generation with Tapioca using this file :

mm3d Tapioca File FileImagesNeighbour.xml -1

The processing time is shorter because MicMac knows which pictures to match.
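
For comparison, without a neighbour file you would match every possible pair at full resolution with the All mode of Tapioca, which is much slower on 200 images (shown only as an illustration, it is not needed here) :

mm3d Tapioca All "R.*JPG" -1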

Camera Calibration

To run a camera calibration with Tapas, you can either use another dataset acquired with exactly the same camera settings, or use a subset of the main dataset (here a block of 25 images). We use the same images as selected by OriConvert to determine the Internal Orientation Parameters (IOP) :

mm3d Tapas RadialBasic "R0040536.JPG|R0040537.JPG|R0040535.JPG|R0040578.JPG|R0040498.JPG|R0040499.JPG|R0040579.JPG|R0040538.JPG|R0040577.JPG|R0040534.JPG|R0040497.JPG|R0040500.JPG|R0040580.JPG|R0040456.JPG|R0040616.JPG|R0040576.JPG|R0040496.JPG|R0040617.JPG|R004045.JPG|R0040457.JPG|R0040615.JPG|R0040539.JPG|R0040501.JPG|R0040581.JPG|R0040533.JPG" Out=Sample4Calib-Rel

This is the result of the last iteration :

| |  Residual = 0.474718 ;; Evol, Moy=5.50743e-015 ,Max=3.70866e-014
| |  Worst, Res 0.618139 for R0040576.JPG,  Perc 99.446 for R0040496.JPG
| |  Cond , Aver 6.46061 Max 42.4603 Prop>100 0

Orientation of the complete block in a relative system

You can directly integrate the IOP determination in the relative orientation processing, by using Tapas and the argument InCal :

mm3d Tapas RadialBasic "R.*.JPG" Out=All-Rel InCal=Sample4Calib-Rel

This is the result of the last iteration :

| |  Residual = 0.420786 ;; Evol, Moy=3.64623e-014 ,Max=4.34387e-013
| |  Worst, Res 0.662578 for R0040576.JPG,  Perc 98.5075 for R0040472.JPG
| |  Cond , Aver 5.84769 Max 47.5369 Prop>100 0

Go further

When you use a pre-calibration in Tapas, you give an initial solution to the least-squares algorithm, which improves the convergence speed and the chances of convergence. For comparison, you can run the same orientation without the pre-calibration :

mm3d Tapas RadialBasic "R.*.JPG" Out=All-Rel-b

The result of the last iteration is :

| |  Residual = 0.420786 ;; Evol, Moy=4.39195e-013 ,Max=1.7138e-012
| |  Worst, Res 0.662578 for R0040576.JPG,  Perc 98.5075 for R0040472.JPG
| |  Cond , Aver 5.40335 Max 52.3331 Prop>100 0

The processing time is longer, but the residuals are the same, which shows the efficiency of the algorithm.

We will now compute a sparse point cloud with the relative image positions and orientations, to check that the block is correctly computed :

mm3d AperiCloud "R.*.JPG" All-Rel

Optionally, if MeshLab is installed, you can visualise the sparse cloud :

meshlab All-Rel.ply

Absolute orientation and correction of the GNSS delay

The positions of the UAV are computed for the phase center of the GNSS antenna, which does not correspond to the camera center. Moreover, the UAV is moving when each picture is taken, so you have to compute the relative speed of each camera in order to determine and correct the systematic GNSS error (delay).

First, we use the embedded GNSS data to transform ("bascule") our relative orientation into the geographical coordinate system of choice (here a local radial tangential system, RTL) :

mm3d CenterBascule "R.*.JPG" All-Rel Nav-Brut-RTL tmp CalcV=1

Note : the target system NEEDS to be at least pseudo-Euclidean (orthonormal axes). Lat/long/height is not such a system.

This is the result of the bascule :

...
BEGIN Compensation
BEGIN AMD
END AMD
APPLI APERO, NbUnknown = 1208
delay init :::    -0.0787348
...

Here we compute a pre-absolute orientation from the relative one and compare it to the image geolocation. The delay is estimated from the residuals and the UAV speed (here -0.0787348 s). OriConvert is used (again) to take the delay into account and generate a new (accurate) orientation :

mm3d OriConvert OriTxtInFile GPS_WPK_Grand-Leez.csv Nav-adjusted-RTL ChSys=DegreeWGS84@SysCoRTL.xml MTD1=1 Delay=-0.0787348
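
As an illustration of why this correction matters, assuming a ground speed of roughly 5 m/s (an assumed value, not taken from the dataset), the estimated delay corresponds to a positional shift of about 0.4 m along the flight direction (shift = speed x delay) :

echo "5 * 0.0787348" | bc -l   # ~0.39 m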

Now we can use the precise georeferencing to compute the absolute orientation of the aerotriangulated model with CenterBascule :

mm3d CenterBascule "R.*.JPG" All-Rel Nav-adjusted-RTL All-RTL

Change the coordinate system

Here we want to use the canopy model in other tools such as OTB or QGIS for image segmentation/classification, so we have to transform our orientation coordinate system back from the local Euclidean system to a geographic system. The tool to perform this transformation is ChgSysCo. Here we transform to Belgian Lambert 72 (EPSG:31370). More information about EPSG codes is available here : http://spatialreference.org/ref/epsg/

mm3d ChgSysCo  "R.*JPG" All-RTL SysCoRTL.xml@SysCoBL72_EPSG31370.xml All-BL72

Compute an orientation visualisation :

mm3d AperiCloud "R.*.JPG" All-BL72 Out=All-BL72-cam.ply WithPoints=0

The argument WithPoints=0 allows you to export only the image positions and orientations (without tie points). Optionally, if MeshLab is installed, you can visualise the orientation cloud :

meshlab All-BL72-cam.ply

Canopy Surface Model

With Old pipeline

Dense-matching with Malt.

mm3d Malt Ortho "R.*JPG" All-BL72 DirMEC=MEC DefCor=0 AffineLast=1 Regul=0.005 HrOr=0 LrOr=0 ZoomF=1

This command line generates depth maps by iterating on sub-sampled models, so we have to use the one with the highest resolution, which is the file MEC/Z_Num8_DeZoom1_STD-MALT.tif. Here we are not interested in the generation of an orthophoto; we want to compute a Digital Elevation Model (DEM) for the canopy survey, so we convert the depth map to 8 bits :

mm3d to8Bits MEC/Z_Num8_DeZoom1_STD-MALT.tif Out=Canopy_dem.tif
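
Optionally, if the GDAL command line tools are installed (an assumption, they are not part of MicMac), you can inspect the size and metadata of the exported raster :

gdalinfo Canopy_dem.tif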

We can also export the dense point cloud and color it with Nuage2Ply :

mm3d Nuage2Ply "MEC/NuageImProf_STD-MALT_Etape_8.xml" Scale=8 Attr="Canopy_dem.tif" Out=CanopySurfaceModel.ply

One can compute the orthoimage with:

mm3d Tawny Ortho-MEC

The orthoimage will be the file Ortho-MEC/Orthophotomosaic.tif.


Optionally, if MeshLab is installed :

meshlab CanopySurfaceModel.ply

With New pipeline

If you want to use the new pipeline, you have to use PIMs and PIMs2MNT :

mm3d Pims MicMac "R.*JPG" All-BL72 DefCor=0 ZoomF=1
mm3d Pims2MNT MicMac DoOrtho=1
mm3d to8Bits PIMs-TmpBasc/PIMs-Merged_Prof.tif Out=Canopy_dem.tif
mm3d Nuage2Ply PIMs-TmpBasc/PIMs-Merged.xml Attr="Canopy_dem.tif" Out=CanopySurfaceModel.ply

Optionally, if MeshLab is installed :

meshlab CanopySurfaceModel.ply