<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="fr">
		<id>http://micmac.ensg.eu/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Mdaakir</id>
		<title>MicMac - Contributions de l’utilisateur [fr]</title>
		<link rel="self" type="application/atom+xml" href="http://micmac.ensg.eu/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Mdaakir"/>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php/Sp%C3%A9cial:Contributions/Mdaakir"/>
		<updated>2026-04-15T15:44:41Z</updated>
		<subtitle>Contributions de l’utilisateur</subtitle>
		<generator>MediaWiki 1.26.2</generator>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Tutorials&amp;diff=3168</id>
		<title>Tutorials</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Tutorials&amp;diff=3168"/>
				<updated>2022-07-14T18:20:58Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;To start with MicMac, here are some tutorials, illustrating the basic process. &lt;br /&gt;
*[[Gravillons_tutorial|01 Gravillons (Base tutorial with 4 images)]]&lt;br /&gt;
*[[Fontaine tutorial|02 Fontaine]]&lt;br /&gt;
*[[Pierrerue tutorial|03 Pierrerue]]&lt;br /&gt;
*[[Zhenjue_tutorial|04 Zhenjue]]&lt;br /&gt;
*[[GrandLeez|05 GrandLeez (Fixed wing drone with embedded GNSS)]]&lt;br /&gt;
*[[Historical_Orthoimage|06 Historical Orthoimage (Scanned photogrammetric films)]]&lt;br /&gt;
*[[Processing data from video files|07 Processing data from video files]]&lt;br /&gt;
*[[MMASTER|08 How to compute a MMASTER DEM from ASTER L1A data]]&lt;br /&gt;
*[[Unwrap Vault|09 How to unwrap a texture]]&lt;br /&gt;
*[[Camera Calibration|10 How to calibrate your camera (will be added soon)]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
You can find some commented workflows in bash form (.sh, for UNIX systems) here : [https://github.com/luc-girod/MicMacWorkflowsByLucGirod MicMacWorkflowsByLucGirod]&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=GrandLeez&amp;diff=3167</id>
		<title>GrandLeez</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=GrandLeez&amp;diff=3167"/>
				<updated>2022-07-14T18:19:37Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : /* Download */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Image:picto-liste.png|25px|link=Tutorials]] [[Tutorials|Tutorials index]]&lt;br /&gt;
=Description=&lt;br /&gt;
This dataset will allow you to process a UAV mission or, more generally, an aerial mission. You will go through direct georeferencing, GNSS delay correction and coordinate system changes. We apply all these concepts to canopy model generation.&lt;br /&gt;
&lt;br /&gt;
==Download==&lt;br /&gt;
You can find this dataset at &amp;lt;code&amp;gt;https://micmac.ensg.eu/data/uas_grand_leez_dataset.zip&amp;lt;/code&amp;gt; &amp;lt;br&amp;gt;&lt;br /&gt;
Once you have downloaded it, you have to unzip the &amp;quot;.zip&amp;quot; archive.&lt;br /&gt;
&lt;br /&gt;
=Presentation=&lt;br /&gt;
This dataset is kindly provided by  &amp;quot;l’Unité Gestion des Ressources Forestières et des Milieux Naturels (GRFMN), Université de Liège&amp;quot;. Contact: jo.lisein@ulg.ac.be&lt;br /&gt;
Files present in the directory are:&lt;br /&gt;
*200 images (800x600 px) captured by a RICOH GR DIGITAL 3&lt;br /&gt;
[[Image:Carroussel_grandleez.png|x200px]]&lt;br /&gt;
*2 files to change the coordinate system: &amp;lt;i&amp;gt;SysCoBL72_EPSG31370.xml&amp;lt;/i&amp;gt; and &amp;lt;i&amp;gt;SysCoRTL.xml&amp;lt;/i&amp;gt;&lt;br /&gt;
*Image neighbours: &amp;lt;i&amp;gt;FileImagesNeighbour.xml&amp;lt;/i&amp;gt;&lt;br /&gt;
*Image geolocation: &amp;lt;i&amp;gt;GPS_WPK_Grand-Leez.csv&amp;lt;/i&amp;gt;&lt;br /&gt;
*2 command scripts: &amp;lt;i&amp;gt;UASGrandLeez.sh&amp;lt;/i&amp;gt; (Linux) and &amp;lt;i&amp;gt;UASGrandLeez.bat&amp;lt;/i&amp;gt; (Windows)&lt;br /&gt;
*1 file with detailed commands: &amp;lt;i&amp;gt;cmd_UAS_Grand-Leez.txt&amp;lt;/i&amp;gt;&lt;br /&gt;
This dataset is a UAV mission carried out in Grand-Leez, Belgium, for a forest canopy survey.&lt;br /&gt;
&lt;br /&gt;
=Tutorial=&lt;br /&gt;
==Conversion of image coordinates==&lt;br /&gt;
OriConvert is used for 5 purposes:&lt;br /&gt;
#Conversion of the embedded GPS data into the MicMac format: OriTxtInFile&lt;br /&gt;
#Generate the image pairs file&lt;br /&gt;
#Change the coordinate system (from WGS84 to a locally tangent system) with the argument : ChSys=DegreeWGS84@SysCoRTL.xml&lt;br /&gt;
#Compute relative speed of the camera (for GPS delay determination) : MTD1=1 CalcV=1&lt;br /&gt;
#Select a sample of the image block (PATC) for camera calibration : NameCple=FileImagesNeighbour.xml ImC=R0040536.JPG NbImC=25&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d OriConvert OriTxtInFile GPS_WPK_Grand-Leez.csv Nav-Brut-RTL ChSys=DegreeWGS84@SysCoRTL.xml MTD1=1 NameCple=FileImagesNeighbour.xml CalcV=1 ImC=R0040536.JPG NbImC=25&amp;lt;/pre&amp;gt;&lt;br /&gt;
See [[OriConvert]] for more details on arguments and file format.&lt;br /&gt;
&lt;br /&gt;
==Tie Point Generation with Tapioca==&lt;br /&gt;
The file &amp;lt;i&amp;gt;FileImagesNeighbour.xml&amp;lt;/i&amp;gt; contains, for each image, its different neighbours. If you open the file, you can see:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
     &amp;lt;Cple&amp;gt;R0040439.JPG R0040519.JPG&amp;lt;/Cple&amp;gt;&lt;br /&gt;
     &amp;lt;Cple&amp;gt;R0040439.JPG R0040514.JPG&amp;lt;/Cple&amp;gt;&lt;br /&gt;
     &amp;lt;Cple&amp;gt;R0040439.JPG R0040444.JPG&amp;lt;/Cple&amp;gt;&lt;br /&gt;
     &amp;lt;Cple&amp;gt;R0040439.JPG R0040517.JPG&amp;lt;/Cple&amp;gt;&lt;br /&gt;
     &amp;lt;Cple&amp;gt;R0040439.JPG R0040438.JPG&amp;lt;/Cple&amp;gt;&lt;br /&gt;
     &amp;lt;Cple&amp;gt;R0040439.JPG R0040440.JPG&amp;lt;/Cple&amp;gt;&lt;br /&gt;
     &amp;lt;Cple&amp;gt;R0040439.JPG R0040441.JPG&amp;lt;/Cple&amp;gt;&lt;br /&gt;
     &amp;lt;Cple&amp;gt;R0040439.JPG R0040516.JPG&amp;lt;/Cple&amp;gt;&lt;br /&gt;
     &amp;lt;Cple&amp;gt;R0040439.JPG R0040442.JPG&amp;lt;/Cple&amp;gt;&lt;br /&gt;
     &amp;lt;Cple&amp;gt;R0040439.JPG R0040515.JPG&amp;lt;/Cple&amp;gt;&lt;br /&gt;
     &amp;lt;Cple&amp;gt;R0040439.JPG R0040443.JPG&amp;lt;/Cple&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
This means that image &amp;lt;i&amp;gt;R0040439.JPG&amp;lt;/i&amp;gt; is connected to all the images listed in the &amp;lt;Cple&amp;gt; tags. So you can run the tie point generation with [[Tapioca]] using this file:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tapioca File FileImagesNeighbour.xml -1&amp;lt;/pre&amp;gt;&lt;br /&gt;
The processing time is shorter because MicMac knows which image pairs to match.&lt;br /&gt;
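The pair file can also be inspected programmatically. Below is a minimal Python sketch (not part of MicMac) that counts how many pairs each image appears in; it assumes the pair file is XML whose root element wraps the &amp;lt;Cple&amp;gt; entries, and inlines a tiny excerpt for illustration (the root name "Pairs" is hypothetical):

```python
# Minimal sketch: count, for each image, the number of pairs it appears in.
# We assume the pair file is XML whose root wraps the <Cple> entries; the
# snippet inlines a tiny excerpt of FileImagesNeighbour.xml for illustration.
import xml.etree.ElementTree as ET
from collections import Counter

xml_text = """<Pairs>
  <Cple>R0040439.JPG R0040519.JPG</Cple>
  <Cple>R0040439.JPG R0040514.JPG</Cple>
  <Cple>R0040440.JPG R0040441.JPG</Cple>
</Pairs>"""

counts = Counter()
for cple in ET.fromstring(xml_text).iter("Cple"):
    left, right = cple.text.split()  # each <Cple> holds two image names
    counts[left] += 1
    counts[right] += 1

print(counts["R0040439.JPG"])  # number of pairs involving this image
```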
&lt;br /&gt;
==Camera Calibration==&lt;br /&gt;
To run a camera calibration with [[Tapas]], you can either use another dataset acquired with exactly the same camera settings, or use a subset of the main dataset (here a block of 25 images). Here we use the same images selected with [[OriConvert]] to determine the Interior Orientation Parameters (IOP):&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tapas RadialBasic &amp;quot;R0040536.JPG|R0040537.JPG|R0040535.JPG|R0040578.JPG|R0040498.JPG|R0040499.JPG|R0040579.JPG|R0040538.JPG|R0040577.JPG|R0040534.JPG|R0040497.JPG|R0040500.JPG|R0040580.JPG|R0040456.JPG|R0040616.JPG|R0040576.JPG|R0040496.JPG|R0040617.JPG|R004045.JPG|R0040457.JPG|R0040615.JPG|R0040539.JPG|R0040501.JPG|R0040581.JPG|R0040533.JPG&amp;quot; Out=Sample4Calib-Rel&amp;lt;/pre&amp;gt;&lt;br /&gt;
This is the result of the last iteration :&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
| |  Residual = 0.474718 ;; Evol, Moy=5.50743e-015 ,Max=3.70866e-014&lt;br /&gt;
| |  Worst, Res 0.618139 for R0040576.JPG,  Perc 99.446 for R0040496.JPG&lt;br /&gt;
| |  Cond , Aver 6.46061 Max 42.4603 Prop&amp;gt;100 0&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Orientation of the complete block in a relative system==&lt;br /&gt;
You can directly integrate the IOP determination into the relative orientation processing by using [[Tapas]] with the argument InCal:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tapas RadialBasic &amp;quot;R.*.JPG&amp;quot; Out=All-Rel InCal=Sample4Calib-Rel&amp;lt;/pre&amp;gt;&lt;br /&gt;
This is the result of the last iteration:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
| |  Residual = 0.420786 ;; Evol, Moy=3.64623e-014 ,Max=4.34387e-013&lt;br /&gt;
| |  Worst, Res 0.662578 for R0040576.JPG,  Perc 98.5075 for R0040472.JPG&lt;br /&gt;
| |  Cond , Aver 5.84769 Max 47.5369 Prop&amp;gt;100 0&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;div class=&amp;quot;toccolours mw-collapsible mw-collapsed&amp;quot; style=&amp;quot;background-color: Lavender&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h6 style=&amp;quot;font-family: Helvetica; font-size: 40px&amp;quot;&amp;gt;Go further&amp;lt;/h6&amp;gt;&lt;br /&gt;
&amp;lt;div class=&amp;quot;mw-collapsible-content&amp;quot;&amp;gt;When you use a pre-calibration in Tapas, you give an initial solution to the least squares algorithm, which improves the convergence speed and the chances of convergence. For comparison, here is the same orientation without the pre-calibration:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tapas RadialBasic &amp;quot;R.*.JPG&amp;quot; Out=All-Rel-b&amp;lt;/pre&amp;gt;&lt;br /&gt;
The result of the last iteration is:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
| |  Residual = 0.420786 ;; Evol, Moy=4.39195e-013 ,Max=1.7138e-012&lt;br /&gt;
| |  Worst, Res 0.662578 for R0040576.JPG,  Perc 98.5075 for R0040472.JPG&lt;br /&gt;
| |  Cond , Aver 5.40335 Max 52.3331 Prop&amp;gt;100 0&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
The processing time is longer, but the residuals are the same, which shows that the pre-calibration mainly speeds up convergence without changing the solution.&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
We will now compute a sparse cloud with the relative image positions and orientations, to check that the block is correctly computed:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d AperiCloud &amp;quot;R.*.JPG&amp;quot; All-Rel&amp;lt;/pre&amp;gt;&lt;br /&gt;
[[Image:05 ori rel.png|x300px]]&amp;lt;br&amp;gt;&lt;br /&gt;
Optionally, if MeshLab is installed, you can visualize the sparse cloud:&lt;br /&gt;
&amp;lt;pre&amp;gt;meshlab All-Rel.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Absolute orientation and correction of the GNSS delay==&lt;br /&gt;
The positions of the UAV are computed for the phase center of the GNSS antenna, which does not correspond to the camera center. Moreover, the UAV is moving when each picture is taken, so you have to compute the relative speed of each camera in order to determine and correct the systematic GNSS error (delay).&lt;br /&gt;
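The impact of such a delay is easy to estimate: the along-track position error is simply the platform speed multiplied by the delay. A quick back-of-envelope check in Python (the delay value is the one estimated further down in this tutorial; the 10 m/s UAV speed is an assumed figure for illustration only):

```python
# Back-of-envelope estimate of the position error caused by a GNSS delay:
# error = platform speed x delay.
delay = -0.0787348  # seconds, delay estimated by CenterBascule with CalcV=1
speed = 10.0        # m/s, assumed UAV ground speed (hypothetical figure)

offset = speed * abs(delay)  # along-track offset in metres
print(round(offset, 2))  # ~0.79 m
```

So even a sub-tenth-of-a-second delay shifts the camera centers by several decimetres, which is why it must be corrected before the absolute orientation.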
&lt;br /&gt;
First, we have to use the embedded GNSS data to set (&amp;quot;bascule&amp;quot;) our system in the geographical coordinate system of choice (here a local radial tangential system - RTL) :&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d CenterBascule &amp;quot;R.*.JPG&amp;quot; All-Rel Nav-Brut-RTL tmp CalcV=1&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note: the target system NEEDS to be at least pseudo-Euclidean (orthonormal axes). Lat/long/height is not such a system.&lt;br /&gt;
&lt;br /&gt;
This is the result of the bascule&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
BEGIN Compensation&lt;br /&gt;
BEGIN AMD&lt;br /&gt;
END AMD&lt;br /&gt;
APPLI APERO, NbUnknown = 1208&lt;br /&gt;
delay init :::    -0.0787348&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Here we compute a pre-absolute orientation from the relative one and we compare it to the image geolocation. The delay is estimated from the residuals and the UAV speed (here -0.0787348s).&lt;br /&gt;
OriConvert is used again to take the delay into account and generate a new, more accurate orientation:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d OriConvert OriTxtInFile GPS_WPK_Grand-Leez.csv Nav-adjusted-RTL ChSys=DegreeWGS84@SysCoRTL.xml MTD1=1 Delay=-0.0787348&amp;lt;/pre&amp;gt;&lt;br /&gt;
Now we can use the precise georeferencing to compute the absolute orientation from the aerotriangulated model with CenterBascule&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d CenterBascule &amp;quot;R.*.JPG&amp;quot; All-Rel Nav-adjusted-RTL All-RTL&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Change the coordinate system==&lt;br /&gt;
Here we want to use the canopy model in other tools such as OTB or QGIS for image segmentation/classification. So we have to transform our orientation back from the local Euclidean system to a geographic system. The tool to perform this transformation is [[ChgSysCo]]. Here we transform to Belgian Lambert 72 (EPSG 31370). More information about EPSG codes here: &amp;lt;code&amp;gt;http://spatialreference.org/ref/epsg/&amp;lt;/code&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d ChgSysCo  &amp;quot;R.*JPG&amp;quot; All-RTL SysCoRTL.xml@SysCoBL72_EPSG31370.xml All-BL72&amp;lt;/pre&amp;gt;&lt;br /&gt;
Compute an orientation visualization:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d AperiCloud &amp;quot;R.*.JPG&amp;quot; All-BL72 Out=All-BL72-cam.ply WithPoints=0&amp;lt;/pre&amp;gt;&lt;br /&gt;
The argument WithPoints=0 allows you to export only the image positions and orientations.&lt;br /&gt;
Optionally, if MeshLab is installed, you can visualize the orientation cloud:&lt;br /&gt;
&amp;lt;pre&amp;gt;meshlab All-BL72-cam.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Canopy Surface Model==&lt;br /&gt;
===With Old pipeline===&lt;br /&gt;
Dense-matching with [[Malt]].&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Malt Ortho &amp;quot;R.*JPG&amp;quot; All-BL72 DirMEC=MEC DefCor=0 AffineLast=1 Regul=0.005 HrOr=0 LrOr=0 ZoomF=1&amp;lt;/pre&amp;gt;&lt;br /&gt;
This command generates depth maps by iterating on sub-sampled models, so we use the highest-resolution one, i.e. the file &amp;lt;i&amp;gt;MEC/Z_Num8_DeZoom1_STD-MALT.tif&amp;lt;/i&amp;gt;&lt;br /&gt;
Here we are not interested in generating an orthophoto, but we want to compute a Digital Elevation Model (DEM) for canopy survey, so we convert the depth map to 8 bits:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d to8Bits MEC/Z_Num8_DeZoom1_STD-MALT.tif Out=Canopy_dem.tif&amp;lt;/pre&amp;gt;&lt;br /&gt;
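For intuition, this conversion essentially rescales the floating-point depth values to the 0-255 range. A rough Python sketch of the principle (mm3d to8Bits has its own options and exact mapping, so this is only an approximation, not its actual implementation):

```python
# Illustrative sketch of an 8-bit conversion: min-max rescale of
# floating-point depth values to the 0-255 range.
def to_8bits(depths):
    lo, hi = min(depths), max(depths)
    # Guard against a flat depth map (all values equal).
    scale = 255.0 / (hi - lo) if hi > lo else 0.0
    return [round((z - lo) * scale) for z in depths]

print(to_8bits([12.5, 14.0, 15.5]))  # [0, 128, 255]
```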
[[Image:Canopy DEM.png|x200px]]&lt;br /&gt;
&amp;lt;br&amp;gt;We can also export the dense point cloud and color it with [[Nuage2Ply]] :&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Nuage2Ply &amp;quot;MEC/NuageImProf_STD-MALT_Etape_8.xml&amp;quot; Scale=8 Attr=&amp;quot;Canopy_dem.tif&amp;quot; Out=CanopySurfaceModel.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
One can compute the orthoimage with:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tawny Ortho-MEC&amp;lt;/pre&amp;gt;&lt;br /&gt;
The orthoimage will be the file Ortho-MEC/Orthophotomosaic.tif.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Optionally, if MeshLab is installed:&lt;br /&gt;
&amp;lt;pre&amp;gt;meshlab CanopySurfaceModel.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===With New pipeline===&lt;br /&gt;
If you want to use the new pipeline, you have to use [[PIMs]] and [[PIMs2MNT]]:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Pims MicMac &amp;quot;R.*JPG&amp;quot; All-BL72 DefCor=0 ZoomF=1&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Pims2MNT MicMac DoOrtho=1&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d to8Bits PIMs-TmpBasc/PIMs-Merged_Prof.tif Out=Canopy_dem.tif&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Nuage2Ply PIMs-TmpBasc/PIMs-Merged.xml Attr=&amp;quot;Canopy_dem.tif&amp;quot; Out=CanopySurfaceModel.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Optionally, if MeshLab is installed:&lt;br /&gt;
&amp;lt;pre&amp;gt;meshlab CanopySurfaceModel.ply&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Zhenjue_tutorial&amp;diff=3166</id>
		<title>Zhenjue tutorial</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Zhenjue_tutorial&amp;diff=3166"/>
				<updated>2022-07-14T18:19:07Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : /* Download */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Image:picto-liste.png|25px|link=Tutorials]] [[Tutorials|Tutorials index]]&lt;br /&gt;
&lt;br /&gt;
==Description==&lt;br /&gt;
This dataset contains images with different focal lengths (24 and 100 mm).&lt;br /&gt;
The purpose of this tutorial is to reconstruct each statue independently (warrior and musician).&lt;br /&gt;
We will show two methods to reconstruct objects in 3D in image geometry (Malt and PIMs).&lt;br /&gt;
&lt;br /&gt;
==Download==&lt;br /&gt;
You can find this dataset at &amp;lt;code&amp;gt;https://micmac.ensg.eu/data/zhenjue_dataset.zip&amp;lt;/code&amp;gt; &amp;lt;br&amp;gt;&lt;br /&gt;
Once you have downloaded it, you have to unzip the &amp;quot;.zip&amp;quot; archive.&lt;br /&gt;
&lt;br /&gt;
==Tutorial==&lt;br /&gt;
===1. Relative orientation===&lt;br /&gt;
As in every MicMac processing chain, the pipeline begins by calling the tool [[Tapioca]] to detect tie points:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tapioca All  &amp;quot;.*JPG&amp;quot; 1500&amp;lt;/pre&amp;gt;&lt;br /&gt;
For this dataset, images have different focal lengths, so we first have to compute an orientation for the 24 mm images.&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tapas RadialStd &amp;quot;.*JPG&amp;quot; Focs=[20,30] Out=F24&amp;lt;/pre&amp;gt;&lt;br /&gt;
Check the residuals and the number of points used per image.&amp;lt;br&amp;gt;&lt;br /&gt;
We use the 24 mm orientation as an input to our command in order to indicate to MicMac that there are different focal lengths:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tapas RadialStd &amp;quot;.*JPG&amp;quot; InOri=F24 Out=All&amp;lt;/pre&amp;gt;&lt;br /&gt;
We will now generate a sparse cloud to visualize the relative orientation.&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d AperiCloud &amp;quot;.*JPG&amp;quot; All&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===2. Dense correlation in image geometry (old method)===&lt;br /&gt;
For this part and the rest of this tutorial, we will focus only on the warrior.&lt;br /&gt;
Defining a mask for dense correlation can be done with the command [[SaisieMasqQT]]. Here we define an image mask:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d SaisieMasqQT &amp;quot;DSC_3128.JPG&amp;quot;&amp;lt;/pre&amp;gt;&lt;br /&gt;
The previous tool for dense correlation was [[Malt]]. Here we are working in image geometry.&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Malt GeomImage &amp;quot;DSC_313[2-9].JPG&amp;quot; All Master=DSC_3135.JPG ZoomF=4 AffineLast=0&amp;lt;/pre&amp;gt;&lt;br /&gt;
We can compute a dense cloud with the command [[Nuage2Ply]] :&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Nuage2Ply NuageImProf_STD-MALT_Etape_6.xml Attr=../DSC_3135.JPG RatioAttrCarte=4 Out=../Warrior.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===3. Dense correlation in image geometry (new method)===&lt;br /&gt;
Defining a mask for dense correlation can be done with the command [[SaisieMasqQT]]. Here we define a 3D mask:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d SaisieMasqQT &amp;quot;DSC_3128.JPG&amp;quot;&amp;lt;/pre&amp;gt;&lt;br /&gt;
The new tool [[C3DC]] doesn't need a master image for 3D reconstruction:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d C3DC BigMac &amp;quot;DSC_313[2-9].JPG&amp;quot; All ZoomF=4&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===4. Comparison===&lt;br /&gt;
[[Image:ply_warrior.png|thumb|180px|Sparse cloud]]&lt;br /&gt;
So let's compare the files &amp;quot;Warrior.ply&amp;quot; and &amp;quot;C3DC-BigMac.ply&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
===5. Compute a depth map===&lt;br /&gt;
[[Image:GrShade_warrior.png|thumb|180px|GrShade]]&lt;br /&gt;
To visualize the depth map, we apply the tool [[GrShade]] to the highest-resolution map (Z_Num6... here):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd MM-Malt-Img-DSC_3135&lt;br /&gt;
mm3d GrShade Z_Num6_DeZoom4_STD-MALT.tif ModeOmbre=IgnE Mask=AutoMask_STD-MALT_Num_5.tif FZ=2 Out=../ShadeWarrior.tif&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;toccolours mw-collapsible mw-collapsed&amp;quot; style=&amp;quot;background-color: Lavender&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h6 style=&amp;quot;font-family: Helvetica; font-size: 40px&amp;quot;&amp;gt;Go further&amp;lt;/h6&amp;gt;&lt;br /&gt;
&amp;lt;div class=&amp;quot;mw-collapsible-content&amp;quot;&amp;gt;If you want to process the musician, use:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mm3d SaisieMasqQT DSC_3128.JPG&lt;br /&gt;
mm3d Malt GeomImage &amp;quot;DSC_31((2[5-9])|(3[0-1])).JPG&amp;quot; All Master=DSC_3128.JPG ZoomF=4&lt;br /&gt;
cd MM-Malt-Img-DSC_3128&lt;br /&gt;
mm3d GrShade Z_Num6_DeZoom4_STD-MALT.tif ModeOmbre=IgnE Mask=AutoMask_STD-MALT_Num_5.tif FZ=2 Out=../ShadeMusician.tif&lt;br /&gt;
mm3d Nuage2Ply NuageImProf_STD-MALT_Etape_6.xml Attr=../DSC_3128.JPG RatioAttrCarte=4 Out=../Musician.ply&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Conclusion==&lt;br /&gt;
Unfortunately, the tools [[C3DC]] and PIMs don't have a radiometric equalization module. So if you need it, for orthophotos or dense clouds, you can still use the old pipeline ([[Malt]], [[Nuage2Ply]], [[Tawny]], etc.)&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Pierrerue_tutorial&amp;diff=3165</id>
		<title>Pierrerue tutorial</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Pierrerue_tutorial&amp;diff=3165"/>
				<updated>2022-07-14T18:18:36Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : /* Download */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Image:picto-liste.png|25px|link=Tutorials]] [[Tutorials|Tutorials index]]&lt;br /&gt;
==Description==&lt;br /&gt;
This dataset allows you to produce a georeferenced orthophoto of a facade. It was acquired by ENSG students during their summer internship in Forcalquier. They used targets surveyed with a total station.&lt;br /&gt;
&lt;br /&gt;
==Download==&lt;br /&gt;
You can find this dataset at &amp;lt;code&amp;gt;https://micmac.ensg.eu/data/pierrerue_dataset.zip&amp;lt;/code&amp;gt; &amp;lt;br&amp;gt;&lt;br /&gt;
Once you have downloaded it, you have to unzip the &amp;quot;.zip&amp;quot; archive.&lt;br /&gt;
&lt;br /&gt;
==Presentation==&lt;br /&gt;
[[Image:Pierrerue1.png|thumb|180px||alt=Pierrerue|Pierrerue Chapel]]&lt;br /&gt;
*31 JPG images&lt;br /&gt;
*1 file containing the support points (Pierrerue.xml)&lt;br /&gt;
*We will use the folder 001_Elements-de-georeferencement&lt;br /&gt;
&lt;br /&gt;
==About the data==&lt;br /&gt;
The survey contains 31 images, taken with a Sony Alpha 850 with a 24 mm lens.&lt;br /&gt;
Check that the folder contains images covering:&lt;br /&gt;
*facade n°1&lt;br /&gt;
*facade n°2&lt;br /&gt;
*the corner between these 2 facades (linking images).&lt;br /&gt;
Support points are available and allow you to georeference the survey (see the folder 001_Elements-de-georeferencement).&lt;br /&gt;
&lt;br /&gt;
==Tutorial==&lt;br /&gt;
===Set up the images===&lt;br /&gt;
====Tie-Points search====&lt;br /&gt;
All the images should be oriented simultaneously, so that the georeferencing process is expressed in a single coordinate system. First, we can run the tie-point search: &amp;lt;pre&amp;gt;mm3d Tapioca MulScale &amp;quot;.*JPG&amp;quot; 600 2000&amp;lt;/pre&amp;gt;&lt;br /&gt;
====Internal orientation and relative orientation====&lt;br /&gt;
Then, we have to determine the calibration of the camera from the images covering the corner between the 2 facades (more suitable than the others, because of the depth). &amp;lt;pre&amp;gt;mm3d Tapas RadialStd &amp;quot;angle.*JPG&amp;quot; Out=Calib24mm&amp;lt;/pre&amp;gt; In the command prompt, we can check that the image residuals are acceptable (around half a pixel). We can also check the number of tie points, as well as the percentage of points kept (&amp;quot;99.8258 of 28466&amp;quot;: 99.8% of the 28466 tie points kept). We can now orient all the images, starting from the calibration we computed before. &amp;lt;pre&amp;gt;mm3d Tapas RadialStd &amp;quot;.*JPG&amp;quot; InCal=Calib24mm Out=MEP&amp;lt;/pre&amp;gt; In the command prompt, we can monitor the residuals during the process. At the last step, we can see that the image residuals are, for all the images, less than half a pixel. We also control the number of tie points, as well as the percentage of points kept.&lt;br /&gt;
&lt;br /&gt;
====Visualization of relative orientation====&lt;br /&gt;
The [[AperiCloud]] command generates a 3D cloud containing all the tie points obtained with [[Tapioca]] and the positions of the cameras obtained from the [[Tapas]] output.&lt;br /&gt;
[[Image:Pierrerue2.png|thumb|180px||alt=Pierrerue|Meshlab visualization]] &lt;br /&gt;
 &amp;lt;pre&amp;gt;mm3d AperiCloud &amp;quot;.*JPG&amp;quot; MEP&amp;lt;/pre&amp;gt;The result of this command can be seen, for example, with MeshLab.&lt;br /&gt;
&lt;br /&gt;
===Set up the images into the coordinates system of the support points===&lt;br /&gt;
Now we have to measure the available support points to georeference the images, and thus the derived products, in the reference coordinate system.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====Explaining the process====&lt;br /&gt;
*1. We measure 3 well-distributed support points ([[SaisieAppuisInit]] command);&lt;br /&gt;
*2. We can now compute a 3D similarity (a 7-parameter transformation: one scale factor, one 3D translation and one 3D rotation) between the arbitrary system of the relative orientation and the chosen coordinate system ([[GCPBascule]] command); &lt;br /&gt;
*3. We now measure the remaining points: the absolute orientation computed in the previous step suggests an approximate position for each point ([[SaisieAppuisPredic]] command); &lt;br /&gt;
*4. We refine the absolute orientation ([[GCPBascule]] command); &lt;br /&gt;
*5. We run the final adjustment, which finds the best position/orientation of the cameras using the measurements of both the tie points and the support points ([[Campari]] command).&lt;br /&gt;
&lt;br /&gt;
====Measurement process====&lt;br /&gt;
*1. Measurement of at least 3 support points on facade n°1 (each point should be measured on at least two images to be valid). To locate the points, use the folder 001_Elements-de-georeferencement. &amp;lt;pre&amp;gt;mm3d SaisieAppuisInitQT &amp;quot;facade1DSC0589[8|9].JPG&amp;quot; MEP 1001 MesureFacade.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
To measure two other support points (on two images each): &amp;lt;pre&amp;gt;mm3d SaisieAppuisInitQT &amp;quot;facade1DSC0588[3|5].JPG&amp;quot; MEP 1002 MesureFacade.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d SaisieAppuisInitQT &amp;quot;facade1DSC0589[3|6].JPG&amp;quot; MEP 1121 MesureFacade.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
*2. Computation of the 3D similarity (absolute orientation) &amp;lt;pre&amp;gt;mm3d GCPBascule &amp;quot;.*JPG&amp;quot; MEP MEP-Basc Pierrerue.xml MesureFacade-S2D.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
*3. Measurement of all available points: &amp;lt;pre&amp;gt;mm3d SaisieAppuisPredicQT &amp;quot;facade.*JPG&amp;quot; MEP-Basc Pierrerue.xml MesureFacade-Final.xml&amp;lt;/pre&amp;gt; We must now validate the remaining points. &lt;br /&gt;
*4. Updated computation of the absolute orientation.&lt;br /&gt;
This time, we use all the support points to calculate the 3D similarity of the absolute orientation.&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d GCPBascule &amp;quot;.*JPG&amp;quot; MEP MEP-Basc2 Pierrerue.xml MesureFacade-Final-S2D.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
*5. Final adjustment&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Campari &amp;quot;.*JPG&amp;quot; MEP-Basc2 MEP-Terrain GCP=[Pierrerue.xml,0.02,MesureFacade-Final-S2D.xml,0.5]&amp;lt;/pre&amp;gt;&lt;br /&gt;
This adjustment finds the best position/orientation of the cameras at shooting time, assuming that the support points have an accuracy of 0.02 m and the tie points an accuracy of 0.5 pixel. These values weight the measurements. At the end of the process, check the residuals on the support points as well as the image residuals.&lt;br /&gt;
The image residuals look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;| |  RESIDU LIAISON MOYENS = 0.547721 pour Id_Pastis_Hom Evol, Moy=2.38308e-07 ,Max=0.00295916&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===3D Reconstruction===&lt;br /&gt;
As we did for the Fontaine exercise, we perform the 3D reconstruction in image geometry with the [[C3DC]] tool. First, we have to limit the reconstruction area. To do so, we create a mask on the point cloud ([[AperiCloud]]), which we recompute with the new orientation. &amp;lt;pre&amp;gt;mm3d AperiCloud &amp;quot;.*JPG&amp;quot; MEP-Terrain&amp;lt;/pre&amp;gt;&lt;br /&gt;
To limit the computation area, we will create a 3D mask :&amp;lt;pre&amp;gt;mm3d SaisieMasqQT AperiCloud_MEP-Terrain.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
Once the mask is created, we can launch the 3D reconstruction : &amp;lt;pre&amp;gt;mm3d C3DC MicMac &amp;quot;facade.*JPG&amp;quot; MEP-Terrain Masq3D=AperiCloud_MEP-Terrain.ply Out=C3DC_MicMac_Pierrerue.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
The C3DC_MicMac_Pierrerue.ply file can be opened with MeshLab.&lt;br /&gt;
&lt;br /&gt;
===Orthorectification===&lt;br /&gt;
To produce orthorectified images of the Pierrerue facades, we have to define a temporary local frame for each facade, in which the Z axis is perpendicular to the facade. We will first work on facade n°1. We process it in 2 steps: &lt;br /&gt;
&lt;br /&gt;
1. Creation of a mask on the facade: &amp;lt;pre&amp;gt;mm3d SaisieMasqQT facade1DSC05893.JPG Attr=Facade1&amp;lt;/pre&amp;gt;&lt;br /&gt;
2. Computation of a local frame whose Z axis is perpendicular to the plane fitted through the support points included in the mask: &amp;lt;pre&amp;gt;mm3d RepLocBascule &amp;quot;facade1.*JPG&amp;quot; Ori-MEP-Terrain HOR Repere-Facade1.xml PostPlan=_MasqFacade1&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The HOR setting indicates that the Ox axis of our orthoimage follows the horizontal of the worksite. This is possible here because the orientation MEP-Terrain was obtained from support points. The orthoimage will be calculated in this new frame, so the 3D reconstruction must be reprojected into it. A depth map will be calculated in the orthorectification frame: it is an image draped on the object, where each pixel gives the distance to the reference plane. We describe it as 2.5D (3D information is available only for a finite number of positions). &lt;br /&gt;
&lt;br /&gt;
The [[PIMs2MNT]] command creates a depth map in the frame of facade n°1 (the one calculated before: ''Repere-Facade1.xml''): &lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Pims2MNT MicMac DoOrtho=1 Repere=Repere-Facade1.xml Pat=&amp;quot;facade1.*JPG&amp;quot;&amp;lt;/pre&amp;gt;&lt;br /&gt;
The resulting correlation map, calculated for facade n°1, can be found in the folder ''PIMs-TmpBasc'' under the name ''PIMs-Merged_Correl.tif''. This file contains the correlation results: white corresponds to very good correlation scores; the darker the grey, the worse the matching. &lt;br /&gt;
&lt;br /&gt;
Once the depth map is computed, MicMac writes into the PIMs-ORTHO directory an orthoimage for each input image, as well as incidence images giving the angle between the facade and the perspective ray (images Incid_facade1DSC###.tif) and hidden-part images showing in white the parts hidden in each image (images PC_facade1DSC###.tif).&lt;br /&gt;
[[Image:Pierrerue1.png|thumb|180px||alt=Pierrerue|Pierrerue Chapel]]&lt;br /&gt;
After this computation, the individual orthoimages must be mosaicked. The image used for each pixel is chosen according to these criteria: no hidden parts, best incidence angle, and continuity in the choice of images:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tawny PIMs-ORTHO/&amp;lt;/pre&amp;gt;&lt;br /&gt;
The result is an image called Orthophotomosaic.tif, created in the ''PIMs-ORTHO'' folder. The associated georeferencing metadata are stored in the file Orthophotomosaic.tfw, where we can read the resolution chosen for the orthoimage computation (1.1 mm).&lt;br /&gt;
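As an aside, a world file such as Orthophotomosaic.tfw is six plain-text lines: pixel size along X, two rotation terms, pixel size along Y (negative, since row indices grow downward), then the world X and Y of the center of the top-left pixel. A minimal sketch for reading one (the function name is ours, not a MicMac tool):

```python
def read_world_file(text):
    """Parse the six lines of a world file (.tfw) into named fields."""
    values = [float(line) for line in text.split()]
    a, d, b, e, c, f = values  # standard world-file ordering
    return {
        "pixel_size_x": a,   # ground size of one pixel along X
        "rot_y": d,          # rotation terms, usually 0
        "rot_x": b,
        "pixel_size_y": e,   # negative: rows go downward
        "origin_x": c,       # world X of the top-left pixel center
        "origin_y": f,       # world Y of the top-left pixel center
    }

tfw = read_world_file("0.0011\n0\n0\n-0.0011\n100.0\n200.0\n")
```

With the values above, `tfw["pixel_size_x"]` gives the 1.1 mm resolution mentioned in the text.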
&lt;br /&gt;
[[Image:Pierrerue5.png|thumb|180px||alt=Pierrerue|Meshlab visualization]]&lt;br /&gt;
We can now create a shaded relief image; it helps evaluate the quality of the reconstruction, in particular by revealing noise on the reconstructed surface.&lt;br /&gt;
 &amp;lt;pre&amp;gt;mm3d GrShade PIMs-TmpBasc/PIMs-Merged_Prof.tif ModeOmbre=IgnE Mask=PIMs-TmpBasc/PIMs-Merged_Masq.tif Out=Facade1_Shade.tif&amp;lt;/pre&amp;gt;&lt;br /&gt;
Another product can be created: an image colorized by facade depth (each colour corresponds to a depth range):&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d to8Bits PIMs-TmpBasc/PIMs-Merged_Prof.tif Coul=1 Circ=1 Mask=PIMs-TmpBasc/PIMs-Merged_Masq.tif Out=Facade1_8Bits.tif&amp;lt;/pre&amp;gt;&lt;br /&gt;
The files ''Facade1_Shade.tif'' and ''Facade1_8Bits.tif'' can be opened with any image viewer. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Finally, it can be useful to regenerate a 3D point cloud from the depth map, colorized with the orthoimage. The advantage is that the radiometric equalization computed on the orthoimage (during [[Tawny]]) yields an equalized point cloud. The disadvantage is that the scene is only 2.5D, so objects perpendicular to the facade plane do not appear in the cloud. &lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Nuage2Ply PIMs-TmpBasc/PIMs-Merged.xml Attr=PIMs-ORTHO/Orthophotomosaic.tif Out=Facade1.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
The file ''Facade1.ply'' can be opened with Meshlab and compared to the file ''C3DC_MicMac_Pierrerue.ply''. &lt;br /&gt;
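To see why the result is described as 2.5D: conceptually, each masked pixel (row, column, depth) of the depth map becomes exactly one 3D point, placed using the frame origin and ground resolution, so surfaces folding back on themselves cannot be represented. A simplified Python sketch of the idea (the names and planar geometry are illustrative assumptions, not MicMac's actual code):

```python
def depth_to_points(depth, origin, res, mask=None):
    """Convert a 2.5D depth grid into (x, y, z) points.

    depth  : 2D list of depth values (row-major)
    origin : (x0, y0) world coordinates of the top-left pixel
    res    : ground size of one pixel
    mask   : optional 2D list; a point is kept where mask is truthy
    """
    x0, y0 = origin
    points = []
    for i, row in enumerate(depth):
        for j, z in enumerate(row):
            if mask is not None and not mask[i][j]:
                continue  # pixel outside the region of interest
            # one depth per pixel: this is why the result is only 2.5D
            points.append((x0 + j * res, y0 - i * res, z))
    return points

pts = depth_to_points([[1.0, 2.0], [3.0, 4.0]], origin=(10.0, 20.0), res=0.5)
```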
&lt;br /&gt;
Everything done for facade n°1 can now be repeated for facade n°2.&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Gravillons_tutorial&amp;diff=3164</id>
		<title>Gravillons tutorial</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Gravillons_tutorial&amp;diff=3164"/>
				<updated>2022-07-14T18:18:04Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : /* Download */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Image:picto-liste.png|25px|link=Tutorials]] [[Tutorials|Tutorials index]]&lt;br /&gt;
==Description==&lt;br /&gt;
In this tutorial, we will cover general concepts, basic tools, and how to process an overlapping image dataset with MicMac. The dataset is deliberately small (4 images), so that we can focus on the MicMac tools.&lt;br /&gt;
This tutorial is designed especially for MicMac beginners with a light photogrammetry background.&lt;br /&gt;
&lt;br /&gt;
==Download==&lt;br /&gt;
You can find this dataset at &amp;lt;code&amp;gt;https://micmac.ensg.eu/data/gravillons_dataset.zip&amp;lt;/code&amp;gt; &amp;lt;br&amp;gt;&lt;br /&gt;
Once you have downloaded it, you have to extract the &amp;quot;.zip&amp;quot; archive.&lt;br /&gt;
&lt;br /&gt;
==Presentation==&lt;br /&gt;
This dataset was created by L.Girod at the University of Oslo, Norway, to model a physical volcano model built by O.Galland.&lt;br /&gt;
Files present in the directory are:&lt;br /&gt;
*4 images : 1.JPG, 2.JPG, 3.JPG, 4.JPG.&lt;br /&gt;
[[Image:01_gravillons_caroussel.png]]&lt;br /&gt;
*GCP coordinates: Dico-Appuis.xml.&lt;br /&gt;
*Measures of GCPs in images: Mesure-Appuis.xml.&lt;br /&gt;
*1 Mask: 1_Masq.tif/1_Masq.xml&lt;br /&gt;
*2 command scripts: gravillons.sh (Linux) and gravillons.bat (Windows)&lt;br /&gt;
&lt;br /&gt;
==Tutorial==&lt;br /&gt;
&lt;br /&gt;
===1 Tie-Points search===&lt;br /&gt;
The first step of every MicMac pipeline is to look for tie points (points that are seen in more than one image). This step is called image matching and is performed by the command [[Tapioca]]:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tapioca All &amp;quot;.*.JPG&amp;quot; 1500&amp;lt;/pre&amp;gt;&lt;br /&gt;
The All option is used here because we know that all the images are going to have tie points with each other (they all depict the same area).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;toccolours mw-collapsible mw-collapsed&amp;quot; style=&amp;quot;background-color: Lavender&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h6 style=&amp;quot;font-family: Helvetica;font-size: 40px&amp;quot;&amp;gt;Go further&amp;lt;/h6&amp;gt;&lt;br /&gt;
&amp;lt;div class=&amp;quot;mw-collapsible-content&amp;quot;&amp;gt;To process tie points at full resolution, use &amp;quot;-1&amp;quot; (instead of 1500 here):&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tapioca All &amp;quot;.*.JPG&amp;quot; -1&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===2 Internal Orientation+Relative Orientation===&lt;br /&gt;
Photogrammetry is composed of three steps:&lt;br /&gt;
*Internal Orientation: to determine the camera parameters (focal length, PPA, PPS, distortion center, and distortion parameters).&lt;br /&gt;
*Relative Orientation: to determine the relative position of each camera in an arbitrary coordinate system.&lt;br /&gt;
*Absolute Orientation: to map the relative orientations into a scaled and oriented coordinate system (typically WGS84).&lt;br /&gt;
In digital photogrammetry, the first two steps are generally processed at the same time. In MicMac, the tool that performs internal and relative orientation is called [[Tapas]]:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tapas FraserBasic &amp;quot;.*.JPG&amp;quot; Out=Arbitrary&amp;lt;/pre&amp;gt;&lt;br /&gt;
This tool uses a least-squares adjustment to determine the camera parameters and the relative orientations. The option &amp;quot;FraserBasic&amp;quot; corresponds to a distortion model for our camera. The option Out specifies the name of the orientation directory (here it will be Ori-Arbitrary).&lt;br /&gt;
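As a toy illustration of the least-squares principle behind Tapas (which solves a much larger, non-linear system of camera parameters and orientations), here is a line fit by the normal equations in plain Python:

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a*x + b via the normal equations.

    This minimizes the sum of squared residuals, the same principle
    that bundle adjustment applies to camera parameters, only on a
    tiny linear problem instead of a large non-linear one.
    """
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    det = n * sxx - sx * sx
    a = (n * sxy - sx * sy) / det
    b = (sxx * sy - sx * sxy) / det
    return a, b

a, b = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

On these noiseless points, the fit recovers the exact line y = 2x + 1.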
&lt;br /&gt;
===3 Visualize Relative Orientation===&lt;br /&gt;
[[Image:01_Gravillonn_RO.jpg|thumb|250px||alt=Relative Orientation|Meshlab visualization]]&lt;br /&gt;
MicMac includes a tool that creates a sparse point cloud (the tie points) for visualization: [[AperiCloud]]:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d AperiCloud &amp;quot;.*.JPG&amp;quot; Arbitrary&amp;lt;/pre&amp;gt;&lt;br /&gt;
After this step, a &amp;quot;.ply&amp;quot; file will appear in your working directory; open it with Meshlab (Screenshot 1: see [[Install|Useful softwares for MicMac]]).&lt;br /&gt;
&lt;br /&gt;
===4 Absolute Orientation===&lt;br /&gt;
[[Image:01_Gravillonn_AO.jpg|thumb|250px||alt=Absolute Orientation|Absolute Orientation]]&lt;br /&gt;
For this dataset, the Ground Control Points are already measured in the images (file &amp;quot;Mesure-Appuis.xml&amp;quot;). With 3 points (X,Y,Z), we can determine the 3D transformation between the arbitrary system (Relative Orientation) and the georeferenced system. This operation is called a &amp;quot;Bascule&amp;quot; and is performed by the command [[GCPBascule]]:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d GCPBascule &amp;quot;.*.JPG&amp;quot; Arbitrary Ground_Init Dico-Appuis.xml Mesure-Appuis.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
This tool computes a first Bascule using only the GCPs (directory Ori-Ground_Init); we will now refine the orientation using both GCPs and tie points with the command [[Campari]]:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Campari &amp;quot;.*.JPG&amp;quot; Ground_Init Ground&amp;lt;/pre&amp;gt;&lt;br /&gt;
The new orientation is stored in the directory &amp;quot;Ori-Ground&amp;quot;. We can visualize it with [[AperiCloud]]:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d AperiCloud &amp;quot;.*.JPG&amp;quot; Ground&amp;lt;/pre&amp;gt;&lt;br /&gt;
You can visualize the resulting point cloud in Meshlab.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;toccolours mw-collapsible mw-collapsed&amp;quot; style=&amp;quot;background-color: Lavender;width: 1400px&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h6 style=&amp;quot;font-family: Helvetica;font-size: 40px&amp;quot;&amp;gt;Go further&amp;lt;/h6&amp;gt;&lt;br /&gt;
&amp;lt;div class=&amp;quot;mw-collapsible-content&amp;quot;&amp;gt;The file &amp;quot;Mesure-Appuis.xml&amp;quot; is already provided in the dataset; it contains the measurements of each GCP in image coordinates (px). If you want to measure the GCPs yourself, you can use the tool [[SaisieAppuisInitQT]] before GCPBascule:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d SaisieAppuisInitQT &amp;quot;.*.JPG&amp;quot; Arbitrary Dico-Appuis.xml Mesure-Appuis.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
It launches a GUI in which you click on the GCPs; when you are finished, don't forget to save before leaving. It will create two files:&lt;br /&gt;
*Mesure-Appuis-S2D.xml: measurements of the GCPs in image coordinates.&lt;br /&gt;
*Mesure-Appuis-S3D.xml: measurements of the GCPs in the Relative Orientation system (here &amp;quot;Arbitrary&amp;quot;). Warning: no unit.&lt;br /&gt;
For the following commands, don't forget to use &amp;quot;Mesure-Appuis-S2D.xml&amp;quot; instead of &amp;quot;Mesure-Appuis.xml&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
This command is best explained in the [[Pierrerue_tutorial|Pierrerue tutorial]].&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===5 Create a depth map===&lt;br /&gt;
[[Image:01_Gravillonn_3DC.jpg|thumb|250px|3D Points Cloud]]&lt;br /&gt;
From any orientation directory, you can compute a depth map. The method that uses all the images to create a 3D model is called dense correlation, or densification. In MicMac, it is performed by the command [[Malt]]:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Malt GeomImage &amp;quot;.*.JPG&amp;quot; Ground Master=&amp;quot;1.JPG&amp;quot; ZoomF=2&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===6 Create a Dense Points Cloud===&lt;br /&gt;
Malt does not directly create a 3D point cloud. To generate one, you have to run another tool, Nuage2Ply:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Nuage2Ply &amp;quot;MM-Malt-Img-1/NuageImProf_STD-MALT_Etape_7.xml&amp;quot; Attr=&amp;quot;1.JPG&amp;quot; Out=1.ply RatioAttrCarte=2&amp;lt;/pre&amp;gt;&lt;br /&gt;
Then visualize the 3D model &amp;quot;1.ply&amp;quot; in Meshlab.&lt;br /&gt;
&lt;br /&gt;
===Conclusion===&lt;br /&gt;
With this tutorial, you went through a complete photogrammetric process with MicMac. This first tutorial was deliberately simple, on a minimal dataset, to help you kickstart your MicMac skills. To go further, try the next tutorials.&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Fontaine_tutorial&amp;diff=3163</id>
		<title>Fontaine tutorial</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Fontaine_tutorial&amp;diff=3163"/>
				<updated>2022-07-14T18:17:05Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : /* Download */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Image:picto-liste.png|25px|link=Tutorials]] [[Tutorials|Tutorials index]]&lt;br /&gt;
==Description==&lt;br /&gt;
In this tutorial, we go further into command details, processing, results, products, etc.&lt;br /&gt;
Here you will learn how to process a dataset of images taken all around an object.&lt;br /&gt;
It is also a good way to compare the older [[Malt]] pipeline with the newer [[Pims]] pipeline, especially for image geometry.&lt;br /&gt;
&lt;br /&gt;
==Download==&lt;br /&gt;
You can find this dataset at &amp;lt;code&amp;gt;https://micmac.ensg.eu/data/fontaine_dataset.zip&amp;lt;/code&amp;gt; &amp;lt;br&amp;gt;&lt;br /&gt;
Once you have downloaded it, you have to unzip the &amp;quot;.zip&amp;quot; archive.&lt;br /&gt;
&lt;br /&gt;
==Presentation==&lt;br /&gt;
The folder contains 30 JPG files. No separate calibration data is provided for the camera, but all 30 images were taken with a Canon 70D camera and an 18 mm lens, and the camera saves metadata in every picture (EXIF data). If you look at the properties of each picture, you will find the 18 mm lens as well as the various settings chosen when the dataset was produced (aperture, shutter speed, ...). &amp;lt;br&amp;gt;&lt;br /&gt;
There are 5 parts in this dataset: &lt;br /&gt;
*4 parts correspond to 4 different points of view of the fountain. Each part was acquired as a cross-shaped set: one master image at the center and 4 images around it (above, below, left and right of the master image).&lt;br /&gt;
*1 part contains the remaining images (IMG.*JPG). These are link images: they tie the 4 other parts together, but they will not be used to generate the dense point clouds. &amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
We will process this dataset in image geometry: we choose one master image, and a depth map is computed in the geometry of that image. A single master image cannot cover the whole object over 360 degrees, so to model the 3D object we need to compute several depth maps in image geometry. In this exercise, we will compute 4 depth maps: one for each part.&lt;br /&gt;
&lt;br /&gt;
==Tutorial==&lt;br /&gt;
===1. Tie-points search===&lt;br /&gt;
The tie-point search should be run on all the images simultaneously, so that the 4 parts are linked with each other. The orientation will be in an arbitrary system, but the same system will be kept for all the images. The 4 depth maps generated (as well as the 4 clouds) will therefore be in the same coordinate system.&lt;br /&gt;
&lt;br /&gt;
First, we need to run the tie-points search :&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tapioca MulScale &amp;quot;.*JPG&amp;quot; 500 2500&amp;lt;/pre&amp;gt;&lt;br /&gt;
The MulScale mode first searches for tie points on sub-sampled images: here, 500 pixels on the longest side instead of the 5472 pixels of the original images. This makes it possible to determine which images share tie points, and to run the tie-point search at a higher resolution (here 2500) only on those image pairs.&lt;br /&gt;
&lt;br /&gt;
===2. Internal Orientation+Relative Orientation===&lt;br /&gt;
We now want to determine the positions of the cameras relative to each other, as well as the calibration of the camera used:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tapas RadialStd &amp;quot;.*JPG&amp;quot; Out=Fontaine&amp;lt;/pre&amp;gt;&lt;br /&gt;
In [[Tapas]], we have to choose a calibration model. Here, RadialStd is the model generally used for standard cameras. The Out option sets the name of the output orientation (here Fontaine).&lt;br /&gt;
The calibration is determined directly from the images that will be used for the 3D reconstruction. In some cases, it can be better to calibrate the camera on a separate scene with more depth and texture, and then pass that calibration as an input (with the InCal option) to a Tapas run on the images of the object.&lt;br /&gt;
In the terminal, we can monitor the residuals as the computation progresses. At the last step, we can see that the image residuals are below half a pixel for all images. We should also check the number of tie points and the percentage of points kept (&amp;quot;99.8258 of 38466&amp;quot;: 99.8% of the 38466 computed tie points were kept).&lt;br /&gt;
&lt;br /&gt;
===3. Visualize Relative Orientation=== &lt;br /&gt;
[[Image:Exercice_Fontaine1.png|thumb|180px||alt=AperiCloud|AperiCloud visualization]]&lt;br /&gt;
The AperiCloud command generates a 3D point cloud containing all the tie points obtained with [[Tapioca]] and the positions of the cameras computed by [[Tapas]].&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d AperiCloud &amp;quot;.*JPG&amp;quot; Fontaine &amp;lt;/pre&amp;gt;&lt;br /&gt;
The result of this command can be viewed, for example, with Meshlab. We can see that the 4 parts of 5 images around the fountain are connected to each other thanks to the link images.&lt;br /&gt;
&lt;br /&gt;
===4. 3D Reconstruction of one part===&lt;br /&gt;
[[Image:SaisieMasqQT_Fontaine.png|thumb|180px|SaisieMasqQT]]&lt;br /&gt;
We will now work on each part separately. The link images are no longer used: their role was to orient all the parts in the same system. Let's start with the first part. Its master image is AIMG_2470.JPG, on which we need to draw a mask to limit the correlation area. &amp;lt;br&amp;gt; &lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d SaisieMasqQT AIMG_2470.JPG&amp;lt;/pre&amp;gt;&lt;br /&gt;
It can be useful to check that everything was saved by verifying that the file AIMG_2470_Masq.xml was created. The contents of AIMG_2470_Masq.tif (a binary image) can also be inspected.&lt;br /&gt;
&lt;br /&gt;
We can now calculate the dense correlation with the image geometry pattern :&lt;br /&gt;
&amp;lt;pre&amp;gt; mm3d Malt GeomImage &amp;quot;A.*JPG&amp;quot; Fontaine Master=AIMG_2470.JPG ZoomF=2&amp;lt;/pre&amp;gt;&lt;br /&gt;
We choose an image pattern that matches the master image as well as the secondary images, and we set the master image with the Master option. The ZoomF option defines the last level of the image pyramid to be used; for this computation, we do not go down to full resolution. [[Malt]] starts computing on strongly sub-sampled images and then increases the image size, stopping at images sub-sampled by a factor of 2.&lt;br /&gt;
&lt;br /&gt;
While the process runs, you can check that the correlation is working by inspecting the files MM-Malt-Img-AIMG_2470/Correl_STD-MALT_Num_#.tif; each file corresponds to one level of the image pyramid. These files contain correlation scores: white means a very good matching score; the darker the grey, the worse the matching.&lt;br /&gt;
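These scores come from comparing small image patches between views. A simple measure of this kind is the normalized cross-correlation (NCC), sketched below in plain Python; MicMac's actual matching cost is more elaborate, so this is only an illustration of the principle:

```python
from math import sqrt

def ncc(patch_a, patch_b):
    """Normalized cross-correlation of two equally sized patches.

    Returns a score in [-1, 1]; 1 means the patches match perfectly
    up to an affine change of brightness/contrast.
    """
    n = len(patch_a)
    mean_a = sum(patch_a) / n
    mean_b = sum(patch_b) / n
    da = [a - mean_a for a in patch_a]
    db = [b - mean_b for b in patch_b]
    num = sum(x * y for x, y in zip(da, db))
    den = sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return num / den if den else 0.0

# identical patches up to a brightness offset still score 1.0
score = ncc([10, 20, 30, 40], [110, 120, 130, 140])
```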
&lt;br /&gt;
===5. Visualize 3D products results===&lt;br /&gt;
[[Image:GrShade_Fontaine.png|thumb|140px|GrShade]]&lt;br /&gt;
[[Image:8Bits_Fontaine.png|thumb|140px|8bits]]&lt;br /&gt;
*From the depth map computed previously, we can generate more 3D products, after moving into the Malt output folder: &lt;br /&gt;
&amp;lt;pre&amp;gt; cd MM-Malt-Img-AIMG_2470/&amp;lt;/pre&amp;gt;&lt;br /&gt;
*Create a shaded relief image: &lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d GrShade Z_Num7_DeZoom2_STD-MALT.tif ModeOmbre=IgnE Mask=AutoMask_STD-MALT_Num_6.tif&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
*Create a hypsometric color image: &lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d to8Bits Z_Num7_DeZoom2_STD-MALT.tif Circ=1 Coul=1 Mask=AutoMask_STD-MALT_Num_6.tif&amp;lt;/pre&amp;gt;&lt;br /&gt;
The file Z_Num7_DeZoom2_STD-MALT_8Bits.tif can be opened in any image viewer.&lt;br /&gt;
&lt;br /&gt;
*Create a 3D points cloud :&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Nuage2Ply NuageImProf_STD-MALT_Etape_7.xml Attr=../AIMG_2470.JPG RatioAttrCarte=2&amp;lt;/pre&amp;gt;&lt;br /&gt;
The [[Nuage2Ply]] command takes the depth map as input and converts it into a point cloud. The cloud is colorized with the image given via the Attr option. The setting RatioAttrCarte=2 accounts for the fact that our point cloud is at half the resolution of the original image, because of ZoomF=2. The .ply file generated in the folder MM-Malt-Img-AIMG_2470 can be viewed in Meshlab, for example.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;toccolours mw-collapsible mw-collapsed&amp;quot; style=&amp;quot;background-color: Lavender&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h6 style=&amp;quot;font-family: Helvetica;font-size: 40px&amp;quot;&amp;gt;Go further: cleaning clouds&amp;lt;/h6&amp;gt;&lt;br /&gt;
&amp;lt;div class=&amp;quot;mw-collapsible-content&amp;quot;&amp;gt;Even after applying a mask to your image, the cloud can be more or less ragged at the edges. A first solution is to clean up the cloud directly in Meshlab or CloudCompare with their editing tools. Another solution is to draw a new mask on the shaded relief image, where the ragged areas are visible:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d SaisieMasqQT MM-Malt-Img-AIMG_2470/Z_Num7_DeZoom2_STD-MALTShade.tif Out=MasqCorrelA.tif&amp;lt;/pre&amp;gt;&lt;br /&gt;
Then, use this mask when converting the depth map into points cloud :&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Nuage2Ply MM-Malt-Img-AIMG_2470/NuageImProf_STD-MALT_Etape_7.xml Attr=AIMG_2470.JPG RatioAttrCarte=2 Mask=MasqCorrelA.tif&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===6. Automatic method===&lt;br /&gt;
[[Image:SaisieMasqQT_ply_Fontaine.png|thumb|180px|SaisieMasqQT]]&lt;br /&gt;
We work here with the same dataset of 30 images taken with the Canon 70D and an 18 mm lens, but this time we make no distinction between the points of view. The orientation computation is the same as in the previous exercise: we reuse the results in Ori-Fontaine/ produced by the [[Tapas]] command, as well as the AperiCloud_Fontaine.ply file produced by the [[AperiCloud]] command we ran before.&lt;br /&gt;
The computation mode we will use is still based on image geometry, but MicMac will itself choose the master images and their associated images, based on a 3D mask that we will draw on the point cloud AperiCloud_Fontaine.ply.&lt;br /&gt;
&lt;br /&gt;
We start by drawing a 3D mask on the AperiCloud_Fontaine.ply file.&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d SaisieMasqQT AperiCloud_Fontaine.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
Then we can run the 3D reconstruction with the C3DC command:&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d C3DC BigMac &amp;quot;(A|B|C|D).*JPG&amp;quot; Fontaine Masq3D=AperiCloud_Fontaine_selectionInfo.xml&amp;lt;/pre&amp;gt; &lt;br /&gt;
[[Image:C3DC_Fontaine.png|thumb|180px|C3DC]]&lt;br /&gt;
This command computes the depth maps, converts each one into a ply file, and merges all the ply files. The result, C3DC_BigMac.ply, can be viewed in Meshlab.&lt;br /&gt;
&lt;br /&gt;
===7. 3D Reconstruction===&lt;br /&gt;
NB: the tools we use now are not completely finished yet, so please report any errors to the developers.&amp;lt;br&amp;gt;&lt;br /&gt;
Once you have a 3D point cloud, you can compute a textured 3D model. First, you have to create a mesh; the tool for this is [[TiPunch]]:&lt;br /&gt;
&amp;lt;div class=&amp;quot;toccolours mw-collapsible mw-collapsed&amp;quot; style=&amp;quot;background-color: Lavender&amp;quot;&amp;gt;&lt;br /&gt;
&amp;lt;h6 style=&amp;quot;font-family: Helvetica;font-size: 40px&amp;quot;&amp;gt;Go further: triangulation&amp;lt;/h6&amp;gt;&lt;br /&gt;
&amp;lt;div class=&amp;quot;mw-collapsible-content&amp;quot;&amp;gt;&lt;br /&gt;
To compute a mesh, MicMac performs a Delaunay triangulation and then a Poisson 3D surface reconstruction.&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Pierrerue_tutorial&amp;diff=2509</id>
		<title>Pierrerue tutorial</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Pierrerue_tutorial&amp;diff=2509"/>
				<updated>2017-06-06T17:28:07Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : /* Measurement process */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Image:picto-liste.png|25px|link=Tutorials]] [[Tutorials|Tutorials index]]&lt;br /&gt;
==Description==&lt;br /&gt;
This dataset allows you to produce a georeferenced orthophoto of a facade. It was acquired by students of ENSG during their summer internship in Forcalquier, using targets surveyed with a total station.&lt;br /&gt;
&lt;br /&gt;
==Download==&lt;br /&gt;
You can find this dataset at &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/pierrerue_dataset.zip&amp;lt;/code&amp;gt; &amp;lt;br&amp;gt;&lt;br /&gt;
Once you have downloaded it, you have to unzip the &amp;quot;.zip&amp;quot; archive.&lt;br /&gt;
&lt;br /&gt;
==Presentation==&lt;br /&gt;
[[Image:Pierrerue1.png|thumb|180px||alt=Pierrerue|Pierrerue Chapel]]&lt;br /&gt;
*31 JPG images&lt;br /&gt;
*1 file containing the control points (Pierrerue.xml)&lt;br /&gt;
*We will use the folder 001_Elements-de-georeferencement&lt;br /&gt;
&lt;br /&gt;
==About the data==&lt;br /&gt;
The image set contains 31 images, taken with a Sony alpha850 camera and a 24 mm lens.&lt;br /&gt;
Check that the folder contains images covering:&lt;br /&gt;
*facade n°1&lt;br /&gt;
*facade n°2&lt;br /&gt;
*the corner between these 2 facades (link images).&lt;br /&gt;
Control points are available to georeference the survey (see the folder 001_Elements-de-georeferencement).&lt;br /&gt;
&lt;br /&gt;
==Tutorial==&lt;br /&gt;
===Set up the images===&lt;br /&gt;
====Tie-Points search====&lt;br /&gt;
All the images should be oriented simultaneously, so that the georeferencing is expressed in a single coordinate system. First, we run the tie-point search: &amp;lt;pre&amp;gt;mm3d Tapioca MulScale &amp;quot;.*JPG&amp;quot; 600 2000&amp;lt;/pre&amp;gt;&lt;br /&gt;
====Internal orientation and relative orientation====&lt;br /&gt;
Then, we determine the calibration of the camera from the images covering the corner between the 2 facades (better suited than the others because of their depth). &amp;lt;pre&amp;gt;mm3d Tapas RadialStd &amp;quot;angle.*JPG&amp;quot; Out=Calib24mm&amp;lt;/pre&amp;gt; In the terminal, we can check that the image residuals are acceptable (around half a pixel). We can also check the number of tie points and the percentage of points kept (&amp;quot;99.8258 of 38466&amp;quot;: 99.8% of the 38466 computed tie points were kept). We can now orient all the images, starting from the calibration computed before. &amp;lt;pre&amp;gt;mm3d Tapas RadialStd &amp;quot;.*JPG&amp;quot; InCal=Calib24mm Out=MEP&amp;lt;/pre&amp;gt; In the terminal, we can monitor the residuals during the process. At the last step, we can see that the image residuals are below half a pixel for all images, and we again check the number of tie points and the percentage of points kept.&lt;br /&gt;
&lt;br /&gt;
====Visualization of relative orientation====&lt;br /&gt;
The [[AperiCloud]] command generates a 3D point cloud containing all the tie points obtained with [[Tapioca]] and the positions of the cameras from the [[Tapas]] output.&lt;br /&gt;
[[Image:Pierrerue2.png|thumb|180px||alt=Pierrerue|Meshlab visualization]] &lt;br /&gt;
 &amp;lt;pre&amp;gt;mm3d AperiCloud &amp;quot;.*JPG&amp;quot; MEP&amp;lt;/pre&amp;gt;The result of this command can be viewed, for example, with Meshlab.&lt;br /&gt;
&lt;br /&gt;
===Orient the images in the coordinate system of the control points===&lt;br /&gt;
Now we have to measure the available control points, in order to georeference the images, and hence the subsequent products, in the reference coordinate system.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====Explaining the process====&lt;br /&gt;
*1. We measure 3 well-distributed control points ([[SaisieAppuisInit]] command);&lt;br /&gt;
*2. We compute a 3D similarity (a 7-parameter transformation: one scale factor, one 3D translation and one 3D rotation) between the arbitrary system of the relative orientation and the chosen coordinate system ([[GCPBascule]] command); &lt;br /&gt;
*3. We measure the remaining points: the absolute orientation computed at the previous step suggests an approximate position for each point ([[SaisieAppuisPredic]] command); &lt;br /&gt;
*4. We refine the absolute orientation ([[GCPBascule]] command); &lt;br /&gt;
*5. We run the final adjustment of the orientation, which finds the best position/orientation of the cameras using the measurements of both the tie points and the control points ([[Campari]] command).&lt;br /&gt;
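As an illustration of step 2, a 7-parameter 3D similarity maps a point X to s·R·X + t. A minimal pure-Python sketch (for brevity the rotation is about the Z axis only; the function name is illustrative, not a MicMac API):

```python
from math import cos, sin, radians

def similarity_3d(point, scale, rz_deg, translation):
    """Apply a 3D similarity X' = s * R * X + t.

    For brevity, the rotation R is a single angle rz_deg about the
    Z axis; a full 3D similarity has 3 rotation parameters, for
    7 parameters in total (1 scale + 3 translations + 3 rotations).
    """
    x, y, z = point
    ca, sa = cos(radians(rz_deg)), sin(radians(rz_deg))
    # rotate about Z, then scale, then translate
    xr, yr, zr = ca * x - sa * y, sa * x + ca * y, z
    tx, ty, tz = translation
    return (scale * xr + tx, scale * yr + ty, scale * zr + tz)

p = similarity_3d((1.0, 0.0, 0.0), scale=2.0, rz_deg=90.0,
                  translation=(10.0, 0.0, 5.0))
```

GCPBascule estimates these 7 parameters by matching the measured control points to their known ground coordinates.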
&lt;br /&gt;
====Measurement process====&lt;br /&gt;
*1. Measure at least 3 control points on facade n°1 (each point must be measured on at least two images to be valid). To locate the points, refer to the folder 001_Elements-de-georeferencement. &amp;lt;pre&amp;gt;mm3d SaisieAppuisInitQT &amp;quot;facade1DSC0589[8|9].JPG&amp;quot; MEP 1001 MesureFacade.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
Then measure two other control points (on two images each): &amp;lt;pre&amp;gt;mm3d SaisieAppuisInitQT &amp;quot;facade1DSC0588[3|5].JPG&amp;quot; MEP 1002 MesureFacade.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d SaisieAppuisInitQT &amp;quot;facade1DSC0589[3|6].JPG&amp;quot; MEP 1121 MesureFacade.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
*2. Compute the 3D similarity (absolute orientation): &amp;lt;pre&amp;gt;mm3d GCPBascule &amp;quot;.*JPG&amp;quot; MEP MEP-Basc Pierrerue.xml MesureFacade-S2D.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
*3. Measure all the available points: &amp;lt;pre&amp;gt;mm3d SaisieAppuisPredicQT &amp;quot;facade.*JPG&amp;quot; MEP-Basc Pierrerue.xml MesureFacade-Final.xml&amp;lt;/pre&amp;gt; We now have to validate the remaining points. &lt;br /&gt;
*4. Recompute the absolute orientation.&lt;br /&gt;
This time, all the control points are used to compute the 3D similarity of the absolute orientation.&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d GCPBascule &amp;quot;.*JPG&amp;quot; MEP MEP-Basc2 Pierrerue.xml MesureFacade-Final-S2D.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
*5. Final adjustment&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Campari &amp;quot;.*JPG&amp;quot; MEP-Basc2 MEP-Terrain GCP=[Pierrerue.xml,0.02,MesureFacade-Final-S2D.xml,0.5]&amp;lt;/pre&amp;gt;&lt;br /&gt;
This adjustment finds the best position/orientation of the cameras at the time of the shooting, assuming that the control points have an accuracy of 0.02 m and the tie points an accuracy of 0.5 pixel. These values weight the measurements. At the end of the process, check both the residuals on the control points and the image residuals.&lt;br /&gt;
The image residuals look like this:&lt;br /&gt;
&amp;lt;pre&amp;gt;| |  RESIDU LIAISON MOYENS = 0.547721 pour Id_Pastis_Hom Evol, Moy=2.38308e-07 ,Max=0.00295916&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===3D Reconstruction===&lt;br /&gt;
As in the Fontaine exercise, we perform the 3D reconstruction in image geometry with the [[C3DC]] tool. First, we have to limit the reconstruction area. To do so, we create a mask on the [[AperiCloud]] point cloud, which must be recomputed in the new orientation. &amp;lt;pre&amp;gt;mm3d AperiCloud &amp;quot;.*JPG&amp;quot; MEP-Terrain&amp;lt;/pre&amp;gt;&lt;br /&gt;
To limit the computation area, we will create a 3D mask :&amp;lt;pre&amp;gt;mm3d SaisieMasqQT AperiCloud_MEP-Terrain.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
Once the mask is created, we can launch the 3D reconstruction : &amp;lt;pre&amp;gt;mm3d C3DC MicMac &amp;quot;facade.*JPG&amp;quot; MEP-Terrain Masq3D=AperiCloud_MEP-Terrain.ply Out=C3DC_MicMac_Pierrerue.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
The C3DC_MicMac_Pierrerue.ply file can be opened with Meshlab.&lt;br /&gt;
&lt;br /&gt;
===Orthorectification===&lt;br /&gt;
To produce orthorectifications of the Pierrerue facades, we have to define a temporary coordinate frame for each facade, in which the Z axis is perpendicular to the facade. We will first work on facade n°1, processing it in 2 steps : &lt;br /&gt;
&lt;br /&gt;
1. Creation of a mask on the facade : &amp;lt;pre&amp;gt;mm3d SaisieMasqQT facade1DSC05893.JPG Attr=Facade1&amp;lt;/pre&amp;gt;&lt;br /&gt;
2. Computation of a local frame whose Z axis is perpendicular to the plane fitted through the support points included in the mask : &amp;lt;pre&amp;gt;mm3d RepLocBascule &amp;quot;facade1.*JPG&amp;quot; Ori-MEP-Terrain HOR Repere-Facade1.xml PostPlan=_MasqFacade1&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The HOR setting indicates that the Ox axis of the orthoimage follows the horizontal of the worksite. This is possible here because the orientation MEP-Terrain was obtained from support points. The orthoimage will be computed in this new frame, so the 3D reconstruction has to be reprojected into it. A depth map is computed along with the orthorectification : an image applied on the object, whose pixels give the distance to the reference plane. We describe it as 2.5D (3D information is available only for a finite number of positions). &lt;br /&gt;
&lt;br /&gt;
The [[PIMs2MNT]] command creates a depth map in the frame of facade n°1 (the one computed before : ''Repere-Facade1.xml'') : &lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Pims2MNT MicMac DoOrtho=1 Repere=Repere-Facade1.xml Pat=&amp;quot;facade1.*JPG&amp;quot;&amp;lt;/pre&amp;gt;&lt;br /&gt;
The resulting correlation map, computed in the frame of facade n°1, can be found in the folder ''PIMs-TmpBasc'' under the name ''PIMs-Merged_Correl.tif''. This file contains the correlation results : white corresponds to very good correlation scores ; the darker the grey, the worse the matching. &lt;br /&gt;
&lt;br /&gt;
Once the depth map is computed, MicMac generates in the PIMs-ORTHO directory an orthoimage for each image, as well as incidence images giving the angle between the facade and the viewing ray (images Incid_facade1DSC###.tif) and hidden-parts images, which show in white the parts hidden in each image (images PC_facade1DSC###.tif).&lt;br /&gt;
[[Image:Pierrerue1.png|thumb|180px||alt=Pierrerue|Pierrerue Chapel]]&lt;br /&gt;
After this computation, the individual orthoimages must be mosaicked. The image used for each pixel is chosen according to these criteria : no hidden parts, best incidence angle, continuity in the choice of images :&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tawny PIMs-ORTHO/&amp;lt;/pre&amp;gt;&lt;br /&gt;
The result is an image called Orthophotomosaic.tif, created in the ''PIMs-ORTHO'' folder. The associated metadata are available in the file Orthophotomosaic.tfw, where we can see the resolution chosen for the orthoimage computation (1.1 mm).&lt;br /&gt;
&lt;br /&gt;
[[Image:Pierrerue5.png|thumb|180px||alt=Pierrerue|Meshlab visualization]]&lt;br /&gt;
We can now create a shaded relief image ; it helps to evaluate the quality of the reconstruction, in particular by revealing noise on the reconstructed surface.&lt;br /&gt;
 &amp;lt;pre&amp;gt;mm3d GrShade PIMs-TmpBasc/PIMs-Merged_Prof.tif ModeOmbre=IgnE Mask=PIMs-TmpBasc/PIMs-Merged_Masq.tif Out=Facade1_Shade.tif&amp;lt;/pre&amp;gt;&lt;br /&gt;
Another product can be created : an image colorized by facade depth (each colour corresponds to a depth range) :&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d to8Bits PIMs-TmpBasc/PIMs-Merged_Prof.tif Coul=1 Circ=1 Mask=PIMs-TmpBasc/PIMs-Merged_Masq.tif Out=Facade1_8Bits.tif&amp;lt;/pre&amp;gt;&lt;br /&gt;
The files ''Facade1_Shade.tif'' and ''Facade1_8Bits.tif'' can be viewed with any image viewer. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Finally, it can be helpful to regenerate a 3D point cloud from the depth map, colorized with the orthoimage. The advantage is to benefit from the radiometric equalization computed on the orthoimage (during [[Tawny]]) and obtain an equalized point cloud. The disadvantage is that the scene is only 2.5D : objects perpendicular to the facade plane do not appear in the cloud. &lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Nuage2Ply PIMs-TmpBasc/PIMs-Merged.xml Attr=PIMs-ORTHO/Orthophotomosaic.tif Out=Facade1.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
The file ''Facade1.ply'' can be seen with Meshlab, and can be compared to the file ''C3DC_MicMac_Pierrerue.ply''. &lt;br /&gt;
&lt;br /&gt;
Everything done for facade n°1 can now be repeated for facade n°2.&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Pierrerue_tutorial&amp;diff=2508</id>
		<title>Pierrerue tutorial</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Pierrerue_tutorial&amp;diff=2508"/>
				<updated>2017-06-06T17:26:13Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : /* Visualization of relative orientation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Image:picto-liste.png|25px|link=Tutorials]] [[Tutorials|Tutorials index]]&lt;br /&gt;
==Description==&lt;br /&gt;
This dataset allows you to produce a georeferenced orthophoto of a building front. It was acquired by students of ENSG during their summer internship in Forcalquier, using targets surveyed with a total station.&lt;br /&gt;
&lt;br /&gt;
==Download==&lt;br /&gt;
You can find this dataset at &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/pierrerue_dataset.zip&amp;lt;/code&amp;gt; &amp;lt;br&amp;gt;&lt;br /&gt;
Once you have downloaded it, you have to unzip the &amp;quot;.zip&amp;quot; archive.&lt;br /&gt;
&lt;br /&gt;
==Presentation==&lt;br /&gt;
[[Image:Pierrerue1.png|thumb|180px||alt=Pierrerue|Pierrerue Chapel]]&lt;br /&gt;
*31 JPG images&lt;br /&gt;
*1 file containing the support points (Pierrerue.xml)&lt;br /&gt;
*We will use the folder 001_Elements-de-georeferencement&lt;br /&gt;
&lt;br /&gt;
==About the data==&lt;br /&gt;
The image set contains 31 images, taken with a Sony Alpha 850 and a 24mm lens.&lt;br /&gt;
Check that the folder contains images covering :&lt;br /&gt;
*facade n°1&lt;br /&gt;
*facade n°2&lt;br /&gt;
* the corner between these 2 facades (link images).&lt;br /&gt;
Support points are available to georeference the survey (see the folder 001_Elements-de-georeferencement).&lt;br /&gt;
&lt;br /&gt;
==Tutorial==&lt;br /&gt;
===Set up the images===&lt;br /&gt;
====Tie-Points search====&lt;br /&gt;
All the images should be oriented simultaneously, so that the georeferencing is expressed in a single coordinate system. First, we run the tie-point search : &amp;lt;pre&amp;gt;mm3d Tapioca MulScale &amp;quot;.*JPG&amp;quot; 600 2000&amp;lt;/pre&amp;gt;&lt;br /&gt;
====Internal orientation and relative orientation====&lt;br /&gt;
Then, we calibrate the camera using the images covering the corner between the 2 facades (more suitable than the others because of their depth). &amp;lt;pre&amp;gt;mm3d Tapas RadialStd &amp;quot;angle.*JPG&amp;quot; Out=Calib24mm&amp;lt;/pre&amp;gt; In the terminal output, we can check that the image residuals are acceptable (around half a pixel). We can also check the number of tie points and the percentage of points kept (&amp;quot;99.8258 of 28466&amp;quot; : 99.8% of the tie points kept out of the 28466 computed). We can now orient all the images, starting from the calibration computed before. &amp;lt;pre&amp;gt;mm3d Tapas RadialStd &amp;quot;.*JPG&amp;quot; InCal=Calib24mm Out=MEP&amp;lt;/pre&amp;gt; In the terminal output, we can monitor the residuals during the process. At the last step, the image residuals are below half a pixel for all images. We also check the number of tie points and the percentage of points kept (&amp;quot;99.8258 of 38466&amp;quot; : 99.8% of the tie points kept out of the 38466 computed).&lt;br /&gt;
&lt;br /&gt;
====Visualization of relative orientation====&lt;br /&gt;
The [[AperiCloud]] command generates a 3D point cloud containing all the tie points obtained with [[Tapioca]] and the camera positions obtained from the [[Tapas]] output.&lt;br /&gt;
[[Image:Pierrerue2.png|thumb|180px||alt=Pierrerue|Meshlab visualization]] &lt;br /&gt;
 &amp;lt;pre&amp;gt;mm3d AperiCloud &amp;quot;.*JPG&amp;quot; MEP&amp;lt;/pre&amp;gt;The result of this command can be viewed, for example, with the Meshlab software.&lt;br /&gt;
&lt;br /&gt;
===Set up the images in the coordinate system of the support points===&lt;br /&gt;
Now we have to measure the available support points to georeference the images, and hence the derived products, in the reference coordinate system.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====Explaining the process====&lt;br /&gt;
*1. We measure 3 well-distributed support points ([[SaisieAppuisInit]] command) ;&lt;br /&gt;
*2. We can now compute a 3D similarity (a 7-parameter transformation : one scale factor, a 3D translation and a 3D rotation) between the arbitrary system of the relative orientation and the chosen coordinate system ([[GCPBascule]] command) ; &lt;br /&gt;
*3. We then measure the remaining points : the absolute orientation computed at the previous step suggests an approximate position for each point ([[SaisieAppuisPredic]] command) ; &lt;br /&gt;
*4. We refine the absolute orientation ([[GCPBascule]] command) ; &lt;br /&gt;
*5. We run the final adjustment of the orientation (it finds the best position/orientation of the cameras using both the tie-point and the support-point measurements) ([[Campari]] command).&lt;br /&gt;
&lt;br /&gt;
====Measurement process====&lt;br /&gt;
*1. Measurement of at least 3 support points on facade n°1 (a point must be measured on at least two images to be valid). To input the points, use the folder 001_Elements-de-georeferencement to identify the position of the 3 points. &amp;lt;pre&amp;gt;mm3d SaisieAppuisInitQT &amp;quot;facade1DSC0589[8|9].JPG&amp;quot; Pierrerue 1001 MesureFacade.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
To validate two other support points (on two images each) : &amp;lt;pre&amp;gt;mm3d SaisieAppuisInitQT &amp;quot;facade1DSC0588[3|5].JPG&amp;quot; Pierrerue 1002 MesureFacade.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d SaisieAppuisInitQT &amp;quot;facade1DSC0589[3|6].JPG&amp;quot; Pierrerue 1121 MesureFacade.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
*2. Computation of the 3D similarity (absolute orientation) &amp;lt;pre&amp;gt;mm3d GCPBascule &amp;quot;.*JPG&amp;quot; MEP MEP-Basc Pierrerue.xml MesureFacade-S2D.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
*3. Measurement of all available points : &amp;lt;pre&amp;gt;mm3d SaisieAppuisPredicQT &amp;quot;facade.*JPG&amp;quot; MEP-Basc Pierrerue.xml MesureFacade-Final.xml&amp;lt;/pre&amp;gt; We must now validate the remaining points. &lt;br /&gt;
*4. Updated computation of the absolute orientation.&lt;br /&gt;
This time, we use all the support points to calculate the 3D similarity of the absolute orientation.&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d GCPBascule &amp;quot;.*JPG&amp;quot; MEP-Basc MEP-Basc2 Pierrerue.xml MesureFacade-Final-S2D.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
*5. Final adjustment&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Campari &amp;quot;.*JPG&amp;quot; MEP-Basc2 MEP-Terrain GCP=[Pierrerue.xml,0.02,MesureFacade-Final-S2D.xml,0.5]&amp;lt;/pre&amp;gt;&lt;br /&gt;
This adjustment computes the best position/orientation of the cameras at the time of shooting, assuming an accuracy of 0.02 m for the support points and 0.5 pixel for the tie points. These values are used to weight the measurements. At the end of the process, it is advisable to check the residuals on the support points and the image residuals.&lt;br /&gt;
The residuals reported in the log look like this :&lt;br /&gt;
&amp;lt;pre&amp;gt;| |  RESIDU LIAISON MOYENS = 0.547721 pour Id_Pastis_Hom Evol, Moy=2.38308e-07 ,Max=0.00295916&amp;lt;/pre&amp;gt;&lt;br /&gt;
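As an aside, the image patterns passed to the mm3d commands (for example &amp;quot;facade1DSC0589[8|9].JPG&amp;quot;) are regular expressions. A quick way to preview which files such a pattern selects, before launching a command, is to filter a file listing with grep -E. A minimal sketch with an illustrative file list (in a real project, pipe ls instead of printf) ; note that inside brackets the | is matched literally, which is harmless here :

```shell
# Preview which images a MicMac-style pattern selects.
# The file names below are illustrative; pipe `ls` in a real project.
printf '%s\n' facade1DSC05893.JPG facade1DSC05898.JPG facade1DSC05899.JPG |
  grep -E '^facade1DSC0589[8|9]\.JPG$'
# prints facade1DSC05898.JPG and facade1DSC05899.JPG, but not ...893.JPG
```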
&lt;br /&gt;
===3D Reconstruction===&lt;br /&gt;
As we did for the Fountain exercise, we perform the 3D reconstruction in image geometry with the [[C3DC]] tool. First, we have to limit the reconstruction area. To do so, we create a mask on the point cloud ([[AperiCloud]]), which must be recomputed in the new orientation. &amp;lt;pre&amp;gt;mm3d AperiCloud &amp;quot;.*JPG&amp;quot; MEP-Terrain&amp;lt;/pre&amp;gt;&lt;br /&gt;
To limit the computation area, we will create a 3D mask :&amp;lt;pre&amp;gt;mm3d SaisieMasqQT AperiCloud_MEP-Terrain.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
Once the mask is created, we can launch the 3D reconstruction : &amp;lt;pre&amp;gt;mm3d C3DC MicMac &amp;quot;facade.*JPG&amp;quot; MEP-Terrain Masq3D=AperiCloud_MEP-Terrain.ply Out=C3DC_MicMac_Pierrerue.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
The C3DC_MicMac_Pierrerue.ply file can be opened with Meshlab.&lt;br /&gt;
&lt;br /&gt;
===Orthorectification===&lt;br /&gt;
To produce orthorectifications of the Pierrerue facades, we have to define a temporary coordinate frame for each facade, in which the Z axis is perpendicular to the facade. We will first work on facade n°1, processing it in 2 steps : &lt;br /&gt;
&lt;br /&gt;
1. Creation of a mask on the facade : &amp;lt;pre&amp;gt;mm3d SaisieMasqQT facade1DSC05893.JPG Attr=Facade1&amp;lt;/pre&amp;gt;&lt;br /&gt;
2. Computation of a local frame whose Z axis is perpendicular to the plane fitted through the support points included in the mask : &amp;lt;pre&amp;gt;mm3d RepLocBascule &amp;quot;facade1.*JPG&amp;quot; Ori-MEP-Terrain HOR Repere-Facade1.xml PostPlan=_MasqFacade1&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The HOR setting indicates that the Ox axis of the orthoimage follows the horizontal of the worksite. This is possible here because the orientation MEP-Terrain was obtained from support points. The orthoimage will be computed in this new frame, so the 3D reconstruction has to be reprojected into it. A depth map is computed along with the orthorectification : an image applied on the object, whose pixels give the distance to the reference plane. We describe it as 2.5D (3D information is available only for a finite number of positions). &lt;br /&gt;
&lt;br /&gt;
The [[PIMs2MNT]] command creates a depth map in the frame of facade n°1 (the one computed before : ''Repere-Facade1.xml'') : &lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Pims2MNT MicMac DoOrtho=1 Repere=Repere-Facade1.xml Pat=&amp;quot;facade1.*JPG&amp;quot;&amp;lt;/pre&amp;gt;&lt;br /&gt;
The resulting correlation map, computed in the frame of facade n°1, can be found in the folder ''PIMs-TmpBasc'' under the name ''PIMs-Merged_Correl.tif''. This file contains the correlation results : white corresponds to very good correlation scores ; the darker the grey, the worse the matching. &lt;br /&gt;
&lt;br /&gt;
Once the depth map is computed, MicMac generates in the PIMs-ORTHO directory an orthoimage for each image, as well as incidence images giving the angle between the facade and the viewing ray (images Incid_facade1DSC###.tif) and hidden-parts images, which show in white the parts hidden in each image (images PC_facade1DSC###.tif).&lt;br /&gt;
[[Image:Pierrerue1.png|thumb|180px||alt=Pierrerue|Pierrerue Chapel]]&lt;br /&gt;
After this computation, the individual orthoimages must be mosaicked. The image used for each pixel is chosen according to these criteria : no hidden parts, best incidence angle, continuity in the choice of images :&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tawny PIMs-ORTHO/&amp;lt;/pre&amp;gt;&lt;br /&gt;
The result is an image called Orthophotomosaic.tif, created in the ''PIMs-ORTHO'' folder. The associated metadata are available in the file Orthophotomosaic.tfw, where we can see the resolution chosen for the orthoimage computation (1.1 mm).&lt;br /&gt;
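A .tfw world file is six plain-text lines : x pixel size, two rotation terms, negative y pixel size, then the x and y coordinates of the centre of the upper-left pixel. A minimal sketch reading the resolution back ; the values written here are illustrative, not taken from this dataset :

```shell
# Write an illustrative world file (made-up values for the example).
cat > example.tfw <<'EOF'
0.0011
0.0
0.0
-0.0011
912345.678
6543210.987
EOF
# The ground resolution is the first line:
awk 'NR==1 {print "pixel size: " $1 " m"}' example.tfw
# prints: pixel size: 0.0011 m
```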
&lt;br /&gt;
[[Image:Pierrerue5.png|thumb|180px||alt=Pierrerue|Meshlab visualization]]&lt;br /&gt;
We can now create a shaded relief image ; it helps to evaluate the quality of the reconstruction, in particular by revealing noise on the reconstructed surface.&lt;br /&gt;
 &amp;lt;pre&amp;gt;mm3d GrShade PIMs-TmpBasc/PIMs-Merged_Prof.tif ModeOmbre=IgnE Mask=PIMs-TmpBasc/PIMs-Merged_Masq.tif Out=Facade1_Shade.tif&amp;lt;/pre&amp;gt;&lt;br /&gt;
Another product can be created : an image colorized by facade depth (each colour corresponds to a depth range) :&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d to8Bits PIMs-TmpBasc/PIMs-Merged_Prof.tif Coul=1 Circ=1 Mask=PIMs-TmpBasc/PIMs-Merged_Masq.tif Out=Facade1_8Bits.tif&amp;lt;/pre&amp;gt;&lt;br /&gt;
The files ''Facade1_Shade.tif'' and ''Facade1_8Bits.tif'' can be viewed with any image viewer. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Finally, it can be helpful to regenerate a 3D point cloud from the depth map, colorized with the orthoimage. The advantage is to benefit from the radiometric equalization computed on the orthoimage (during [[Tawny]]) and obtain an equalized point cloud. The disadvantage is that the scene is only 2.5D : objects perpendicular to the facade plane do not appear in the cloud. &lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Nuage2Ply PIMs-TmpBasc/PIMs-Merged.xml Attr=PIMs-ORTHO/Orthophotomosaic.tif Out=Facade1.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
The file ''Facade1.ply'' can be seen with Meshlab, and can be compared to the file ''C3DC_MicMac_Pierrerue.ply''. &lt;br /&gt;
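The per-facade commands above differ only in the facade number, so they can be scripted. A sketch that prints the command chain for each facade (a dry run with echo ; remove the echo prefixes to actually execute, assuming a mask has been drawn for each facade with SaisieMasqQT, and run one facade at a time since the PIMs output folders are shared) :

```shell
# Dry run: print the orthorectification chain for each facade.
# Remove the `echo` prefixes to execute for real (assumes a mask exists
# for each facade; process one facade at a time, the output folders are shared).
for N in 1 2; do
  echo mm3d RepLocBascule "\"facade${N}.*JPG\"" Ori-MEP-Terrain HOR "Repere-Facade${N}.xml" "PostPlan=_MasqFacade${N}"
  echo mm3d PIMs2MNT MicMac DoOrtho=1 "Repere=Repere-Facade${N}.xml" "Pat=\"facade${N}.*JPG\""
  echo mm3d Tawny PIMs-ORTHO/
done
```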
&lt;br /&gt;
Everything done for facade n°1 can now be repeated for facade n°2.&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Pierrerue_tutorial&amp;diff=2507</id>
		<title>Pierrerue tutorial</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Pierrerue_tutorial&amp;diff=2507"/>
				<updated>2017-06-06T17:25:40Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : /* Internal orientation and relative orientation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Image:picto-liste.png|25px|link=Tutorials]] [[Tutorials|Tutorials index]]&lt;br /&gt;
==Description==&lt;br /&gt;
This dataset allows you to produce a georeferenced orthophoto of a building front. It was acquired by students of ENSG during their summer internship in Forcalquier, using targets surveyed with a total station.&lt;br /&gt;
&lt;br /&gt;
==Download==&lt;br /&gt;
You can find this dataset at &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/pierrerue_dataset.zip&amp;lt;/code&amp;gt; &amp;lt;br&amp;gt;&lt;br /&gt;
Once you have downloaded it, you have to unzip the &amp;quot;.zip&amp;quot; archive.&lt;br /&gt;
&lt;br /&gt;
==Presentation==&lt;br /&gt;
[[Image:Pierrerue1.png|thumb|180px||alt=Pierrerue|Pierrerue Chapel]]&lt;br /&gt;
*31 JPG images&lt;br /&gt;
*1 file containing the support points (Pierrerue.xml)&lt;br /&gt;
*We will use the folder 001_Elements-de-georeferencement&lt;br /&gt;
&lt;br /&gt;
==About the data==&lt;br /&gt;
The image set contains 31 images, taken with a Sony Alpha 850 and a 24mm lens.&lt;br /&gt;
Check that the folder contains images covering :&lt;br /&gt;
*facade n°1&lt;br /&gt;
*facade n°2&lt;br /&gt;
* the corner between these 2 facades (link images).&lt;br /&gt;
Support points are available to georeference the survey (see the folder 001_Elements-de-georeferencement).&lt;br /&gt;
&lt;br /&gt;
==Tutorial==&lt;br /&gt;
===Set up the images===&lt;br /&gt;
====Tie-Points search====&lt;br /&gt;
All the images should be oriented simultaneously, so that the georeferencing is expressed in a single coordinate system. First, we run the tie-point search : &amp;lt;pre&amp;gt;mm3d Tapioca MulScale &amp;quot;.*JPG&amp;quot; 600 2000&amp;lt;/pre&amp;gt;&lt;br /&gt;
====Internal orientation and relative orientation====&lt;br /&gt;
Then, we calibrate the camera using the images covering the corner between the 2 facades (more suitable than the others because of their depth). &amp;lt;pre&amp;gt;mm3d Tapas RadialStd &amp;quot;angle.*JPG&amp;quot; Out=Calib24mm&amp;lt;/pre&amp;gt; In the terminal output, we can check that the image residuals are acceptable (around half a pixel). We can also check the number of tie points and the percentage of points kept (&amp;quot;99.8258 of 28466&amp;quot; : 99.8% of the tie points kept out of the 28466 computed). We can now orient all the images, starting from the calibration computed before. &amp;lt;pre&amp;gt;mm3d Tapas RadialStd &amp;quot;.*JPG&amp;quot; InCal=Calib24mm Out=MEP&amp;lt;/pre&amp;gt; In the terminal output, we can monitor the residuals during the process. At the last step, the image residuals are below half a pixel for all images. We also check the number of tie points and the percentage of points kept (&amp;quot;99.8258 of 38466&amp;quot; : 99.8% of the tie points kept out of the 38466 computed).&lt;br /&gt;
&lt;br /&gt;
====Visualization of relative orientation====&lt;br /&gt;
The [[AperiCloud]] command generates a 3D point cloud containing all the tie points obtained with [[Tapioca]] and the camera positions obtained from the [[Tapas]] output.&lt;br /&gt;
[[Image:Pierrerue2.png|thumb|180px||alt=Pierrerue|Meshlab visualization]] &lt;br /&gt;
 &amp;lt;pre&amp;gt;mm3d AperiCloud &amp;quot;.*JPG&amp;quot; Pierrerue&amp;lt;/pre&amp;gt;The result of this command can be viewed, for example, with the Meshlab software.&lt;br /&gt;
&lt;br /&gt;
===Set up the images in the coordinate system of the support points===&lt;br /&gt;
Now we have to measure the available support points to georeference the images, and hence the derived products, in the reference coordinate system.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====Explaining the process====&lt;br /&gt;
*1. We measure 3 well-distributed support points ([[SaisieAppuisInit]] command) ;&lt;br /&gt;
*2. We can now compute a 3D similarity (a 7-parameter transformation : one scale factor, a 3D translation and a 3D rotation) between the arbitrary system of the relative orientation and the chosen coordinate system ([[GCPBascule]] command) ; &lt;br /&gt;
*3. We then measure the remaining points : the absolute orientation computed at the previous step suggests an approximate position for each point ([[SaisieAppuisPredic]] command) ; &lt;br /&gt;
*4. We refine the absolute orientation ([[GCPBascule]] command) ; &lt;br /&gt;
*5. We run the final adjustment of the orientation (it finds the best position/orientation of the cameras using both the tie-point and the support-point measurements) ([[Campari]] command).&lt;br /&gt;
&lt;br /&gt;
====Measurement process====&lt;br /&gt;
*1. Measurement of at least 3 support points on facade n°1 (a point must be measured on at least two images to be valid). To input the points, use the folder 001_Elements-de-georeferencement to identify the position of the 3 points. &amp;lt;pre&amp;gt;mm3d SaisieAppuisInitQT &amp;quot;facade1DSC0589[8|9].JPG&amp;quot; Pierrerue 1001 MesureFacade.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
To validate two other support points (on two images each) : &amp;lt;pre&amp;gt;mm3d SaisieAppuisInitQT &amp;quot;facade1DSC0588[3|5].JPG&amp;quot; Pierrerue 1002 MesureFacade.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d SaisieAppuisInitQT &amp;quot;facade1DSC0589[3|6].JPG&amp;quot; Pierrerue 1121 MesureFacade.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
*2. Computation of the 3D similarity (absolute orientation) &amp;lt;pre&amp;gt;mm3d GCPBascule &amp;quot;.*JPG&amp;quot; MEP MEP-Basc Pierrerue.xml MesureFacade-S2D.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
*3. Measurement of all available points : &amp;lt;pre&amp;gt;mm3d SaisieAppuisPredicQT &amp;quot;facade.*JPG&amp;quot; MEP-Basc Pierrerue.xml MesureFacade-Final.xml&amp;lt;/pre&amp;gt; We must now validate the remaining points. &lt;br /&gt;
*4. Updated computation of the absolute orientation.&lt;br /&gt;
This time, we use all the support points to calculate the 3D similarity of the absolute orientation.&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d GCPBascule &amp;quot;.*JPG&amp;quot; MEP-Basc MEP-Basc2 Pierrerue.xml MesureFacade-Final-S2D.xml&amp;lt;/pre&amp;gt;&lt;br /&gt;
*5. Final adjustment&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Campari &amp;quot;.*JPG&amp;quot; MEP-Basc2 MEP-Terrain GCP=[Pierrerue.xml,0.02,MesureFacade-Final-S2D.xml,0.5]&amp;lt;/pre&amp;gt;&lt;br /&gt;
This adjustment computes the best position/orientation of the cameras at the time of shooting, assuming an accuracy of 0.02 m for the support points and 0.5 pixel for the tie points. These values are used to weight the measurements. At the end of the process, it is advisable to check the residuals on the support points and the image residuals.&lt;br /&gt;
The residuals reported in the log look like this :&lt;br /&gt;
&amp;lt;pre&amp;gt;| |  RESIDU LIAISON MOYENS = 0.547721 pour Id_Pastis_Hom Evol, Moy=2.38308e-07 ,Max=0.00295916&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===3D Reconstruction===&lt;br /&gt;
As we did for the Fountain exercise, we perform the 3D reconstruction in image geometry with the [[C3DC]] tool. First, we have to limit the reconstruction area. To do so, we create a mask on the point cloud ([[AperiCloud]]), which must be recomputed in the new orientation. &amp;lt;pre&amp;gt;mm3d AperiCloud &amp;quot;.*JPG&amp;quot; MEP-Terrain&amp;lt;/pre&amp;gt;&lt;br /&gt;
To limit the computation area, we will create a 3D mask :&amp;lt;pre&amp;gt;mm3d SaisieMasqQT AperiCloud_MEP-Terrain.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
Once the mask is created, we can launch the 3D reconstruction : &amp;lt;pre&amp;gt;mm3d C3DC MicMac &amp;quot;facade.*JPG&amp;quot; MEP-Terrain Masq3D=AperiCloud_MEP-Terrain.ply Out=C3DC_MicMac_Pierrerue.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
The C3DC_MicMac_Pierrerue.ply file can be opened with Meshlab.&lt;br /&gt;
&lt;br /&gt;
===Orthorectification===&lt;br /&gt;
To produce orthorectifications of the Pierrerue facades, we have to define a temporary coordinate frame for each facade, in which the Z axis is perpendicular to the facade. We will first work on facade n°1, processing it in 2 steps : &lt;br /&gt;
&lt;br /&gt;
1. Creation of a mask on the facade : &amp;lt;pre&amp;gt;mm3d SaisieMasqQT facade1DSC05893.JPG Attr=Facade1&amp;lt;/pre&amp;gt;&lt;br /&gt;
2. Computation of a local frame whose Z axis is perpendicular to the plane fitted through the support points included in the mask : &amp;lt;pre&amp;gt;mm3d RepLocBascule &amp;quot;facade1.*JPG&amp;quot; Ori-MEP-Terrain HOR Repere-Facade1.xml PostPlan=_MasqFacade1&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The HOR setting indicates that the Ox axis of the orthoimage follows the horizontal of the worksite. This is possible here because the orientation MEP-Terrain was obtained from support points. The orthoimage will be computed in this new frame, so the 3D reconstruction has to be reprojected into it. A depth map is computed along with the orthorectification : an image applied on the object, whose pixels give the distance to the reference plane. We describe it as 2.5D (3D information is available only for a finite number of positions). &lt;br /&gt;
&lt;br /&gt;
The [[PIMs2MNT]] command creates a depth map in the frame of facade n°1 (the one computed before : ''Repere-Facade1.xml'') : &lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Pims2MNT MicMac DoOrtho=1 Repere=Repere-Facade1.xml Pat=&amp;quot;facade1.*JPG&amp;quot;&amp;lt;/pre&amp;gt;&lt;br /&gt;
The resulting correlation map, computed in the frame of facade n°1, can be found in the folder ''PIMs-TmpBasc'' under the name ''PIMs-Merged_Correl.tif''. This file contains the correlation results : white corresponds to very good correlation scores ; the darker the grey, the worse the matching. &lt;br /&gt;
&lt;br /&gt;
Once the depth map is computed, MicMac generates in the PIMs-ORTHO directory an orthoimage for each image, as well as incidence images giving the angle between the facade and the viewing ray (images Incid_facade1DSC###.tif) and hidden-parts images, which show in white the parts hidden in each image (images PC_facade1DSC###.tif).&lt;br /&gt;
[[Image:Pierrerue1.png|thumb|180px||alt=Pierrerue|Pierrerue Chapel]]&lt;br /&gt;
After this computation, the individual orthoimages must be mosaicked. The image used for each pixel is chosen according to these criteria : no hidden parts, best incidence angle, continuity in the choice of images :&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tawny PIMs-ORTHO/&amp;lt;/pre&amp;gt;&lt;br /&gt;
The result is an image called Orthophotomosaic.tif, created in the ''PIMs-ORTHO'' folder. The associated metadata are available in the file Orthophotomosaic.tfw, where we can see the resolution chosen for the orthoimage computation (1.1 mm).&lt;br /&gt;
&lt;br /&gt;
[[Image:Pierrerue5.png|thumb|180px||alt=Pierrerue|Meshlab visualization]]&lt;br /&gt;
We can now create a shaded relief image ; it helps to evaluate the quality of the reconstruction, in particular by revealing noise on the reconstructed surface.&lt;br /&gt;
 &amp;lt;pre&amp;gt;mm3d GrShade PIMs-TmpBasc/PIMs-Merged_Prof.tif ModeOmbre=IgnE Mask=PIMs-TmpBasc/PIMs-Merged_Masq.tif Out=Facade1_Shade.tif&amp;lt;/pre&amp;gt;&lt;br /&gt;
Another product can be created : an image colorized by facade depth (each colour corresponds to a depth range) :&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d to8Bits PIMs-TmpBasc/PIMs-Merged_Prof.tif Coul=1 Circ=1 Mask=PIMs-TmpBasc/PIMs-Merged_Masq.tif Out=Facade1_8Bits.tif&amp;lt;/pre&amp;gt;&lt;br /&gt;
The files ''Facade1_Shade.tif'' and ''Facade1_8Bits.tif'' can be viewed with any image viewer. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Finally, it can be helpful to regenerate a 3D point cloud from the depth map, colorized with the orthoimage. The advantage is to benefit from the radiometric equalization computed on the orthoimage (during [[Tawny]]) and obtain an equalized point cloud. The disadvantage is that the scene is only 2.5D : objects perpendicular to the facade plane do not appear in the cloud. &lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Nuage2Ply PIMs-TmpBasc/PIMs-Merged.xml Attr=PIMs-ORTHO/Orthophotomosaic.tif Out=Facade1.ply&amp;lt;/pre&amp;gt;&lt;br /&gt;
The file ''Facade1.ply'' can be seen with Meshlab, and can be compared to the file ''C3DC_MicMac_Pierrerue.ply''. &lt;br /&gt;
&lt;br /&gt;
Everything done for facade n°1 can now be repeated for facade n°2.&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Presentation&amp;diff=2506</id>
		<title>Presentation</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Presentation&amp;diff=2506"/>
				<updated>2017-06-06T13:14:32Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : /* Introduction */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Introduction ==&lt;br /&gt;
&lt;br /&gt;
MicMac is a free open-source ([https://en.wikipedia.org/wiki/CeCILL Cecill-B licence]) photogrammetric suite that can be used in a variety of 3D reconstruction scenarios. It aims mainly at professional or academic users, but constant efforts are made to make it more accessible to the general public. &lt;br /&gt;
&lt;br /&gt;
One of MicMac's strengths is its high degree of versatility: it can be used in many fields, such as cartography, environment, industry, forestry, heritage and archaeology.&lt;br /&gt;
&lt;br /&gt;
MicMac allows both the creation of 3D models and of ortho-imagery when appropriate. &lt;br /&gt;
&lt;br /&gt;
The software is suitable for objects of every type and scale: from small objects or statues captured from the ground, through churches and castles surveyed by drone, up to buildings, cities or natural areas covered by aerial or satellite acquisitions. The tools also allow the georeferencing of the end products in local/global/absolute coordinate systems. Complementary tools open up the fields of metrology and site surveying.&lt;br /&gt;
&lt;br /&gt;
To discover MicMac, its environment, its philosophy and the main tools it offers: [http://download.springer.com/static/pdf/241/art%253A10.1186%252Fs40965-017-0027-2.pdf?originUrl=http%3A%2F%2Fopengeospatialdata.springeropen.com%2Farticle%2F10.1186%2Fs40965-017-0027-2&amp;amp;token2=exp=1496754285~acl=%2Fstatic%2Fpdf%2F241%2Fart%25253A10.1186%25252Fs40965-017-0027-2.pdf*~hmac=06e4aa9ab6253649cb5b66085279293fe869176cc5baacf8bc4b0b89a023db67]&lt;br /&gt;
&lt;br /&gt;
To access the MicMac reference documentation : [https://github.com/micmacIGN/Documentation/blob/master/DocMicMac.pdf]&lt;br /&gt;
&lt;br /&gt;
== Examples ==&lt;br /&gt;
These are some examples of surveys where MicMac was used for the photogrammetric processing.&lt;br /&gt;
&lt;br /&gt;
=== Digitizing the castle of Chambord ===&lt;br /&gt;
Two two-week survey campaigns were organised at the castle of Chambord during the autumns of 2014 and 2015 in order to digitize this national heritage monument. These surveys were conducted by students of the [http://www.ensg.eu/ ENSG] PPMD master as educational fieldwork. The following methods were used: laser scanning, photogrammetry, topography, geodesy, remote sensing, etc. The students, for example, surveyed the fronts of the castle:&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
[[Image:ApericloudChambord.jpg|x200px]][[Image:Front_ortho.png|x200px]][[Image:Front_depthmap.png|x200px]][[Image:Front_depthmapshade.png|x200px]]&amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== 3D modelling of the Mourres site ===&lt;br /&gt;
Here are some results from a project carried out by [http://www.ensg.eu/ ENSG] students at Forcalquier, with both ground and UAV acquisitions.&lt;br /&gt;
The photogrammetric workflow was done with MicMac, and the layout with QGIS.&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
[[Image:Mourres1.jpg|x200px]] [[Image:Mourres2.jpg|x200px]] [[Image:MourresDroneOblique.jpg|x200px]][[Image:MourresOrtho.jpg|x250px]] [[Image:MourresMNE.jpg|x250px]]&amp;lt;br&amp;gt;&lt;br /&gt;
{{#ev:youtube|https://youtu.be/es3RByteOu0|500x300px}} &amp;lt;br&amp;gt;&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Canopy Survey ===&lt;br /&gt;
A UAV survey was organised at Grand-Leez in Belgium by the Unité Gestion des Ressources Forestières et des Milieux Naturels (GRFMN), Université de Liège. The aim of this survey was to generate a DCM (Digital Canopy Model) for forestry applications.&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
[[Image:Canopy DEM.png|x200px]]&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Bibliography&amp;diff=2505</id>
		<title>Bibliography</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Bibliography&amp;diff=2505"/>
				<updated>2017-06-06T13:05:39Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : /* Articles */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Articles=&lt;br /&gt;
{|class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Author&lt;br /&gt;
!Title&lt;br /&gt;
!URL&lt;br /&gt;
!Use of MicMac&lt;br /&gt;
|-&lt;br /&gt;
|A. Stumpf, E. Augereau, C. Delacourt, J. Bonnier&lt;br /&gt;
|&amp;lt;i&amp;gt;Photogrammetric discharge monitoring of small tropical mountain rivers: A case study at Rivière des Pluies, Réunion Island&amp;lt;/i&amp;gt;&lt;br /&gt;
|https://www.researchgate.net/publication/303093292_Photogrammetric_discharge_monitoring_of_small_tropical_mountain_rivers_A_case_study_at_Riviere_des_Pluies_Reunion_Island&lt;br /&gt;
|Reconstruction of mobile river beds at La Reunion&lt;br /&gt;
|-&lt;br /&gt;
|A. Stumpf, J.-P. Malet, C. Delacourt&lt;br /&gt;
|&amp;lt;i&amp;gt;Correlation of satellite image time-series for the detection and monitoring of slow-moving landslides&amp;lt;/i&amp;gt;&lt;br /&gt;
|https://www.researchgate.net/publication/310800399_Correlation_of_satellite_image_time-series_for_the_detection_and_monitoring_of_slow-moving_landslides&lt;br /&gt;
|Proposes methods for analyzing time series of displacement fields for landslide detection and monitoring; MicMac is used to derive the displacement fields&lt;br /&gt;
|-&lt;br /&gt;
|A. Stumpf,  J.-P. Malet, P. Allemand, G. Skupinski&lt;br /&gt;
|&amp;lt;i&amp;gt;Ground-based multi-view photogrammetry for the monitoring of landslide deformation and erosion&amp;lt;/i&amp;gt;&lt;br /&gt;
|https://www.researchgate.net/publication/269762133_Ground-based_multi-view_photogrammetry_for_the_monitoring_of_landslide_deformation_and_erosion&lt;br /&gt;
|Comparison of different MVS methods for monitoring landslides and erosion&lt;br /&gt;
|-&lt;br /&gt;
|Luc Girod&lt;br /&gt;
|&amp;lt;i&amp;gt;IMPROVEMENT OF DEM GENERATION FROM ASTER IMAGES USING SATELLITE JITTER ESTIMATION AND OPEN SOURCE IMPLEMENTATION&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-1-W5/249/2015/&lt;br /&gt;
|Describes a method to process ASTER images into DEMs&lt;br /&gt;
|-&lt;br /&gt;
|Olivier Galland&lt;br /&gt;
|&amp;lt;i&amp;gt;APPLICATION OF OPEN-SOURCE PHOTOGRAMMETRIC SOFTWARE MICMAC FOR MONITORING SURFACE DEFORMATION IN LABORATORY MODELS&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://dx.doi.org/10.1002/2015JB012564&lt;br /&gt;
|MicMac for monitoring surface deformation (Gravillons dataset)&lt;br /&gt;
|-&lt;br /&gt;
|Mehdi Daakir&lt;br /&gt;
|&amp;lt;i&amp;gt;UAV ONBOARD PHOTOGRAMMETRY AND GPS POSITIONNING FOR EARTHWORKS&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-3-W3/293/2015/isprsarchives-XL-3-W3-293-2015.pdf&lt;br /&gt;
|Computes the lever arm for UAV onboard photogrammetry&lt;br /&gt;
|-&lt;br /&gt;
|Marc Pierrot-Deseilligny&lt;br /&gt;
|&amp;lt;i&amp;gt;A MULTIRESOLUTION AND OPTIMIZATION-BASED IMAGE MATCHING APPROACH: AN APPLICATION TO SURFACE RECONSTRUCTION FROM SPOT5-HRS STEREO IMAGERY&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.isprs.org/proceedings/xxxvi/1-w41/makaleler/Pierrot_multiresolution_matching.pdf&lt;br /&gt;
|Describes the multi-scale approach that MicMac uses&lt;br /&gt;
|-&lt;br /&gt;
|Marc Pierrot-Deseilligny&lt;br /&gt;
|&amp;lt;i&amp;gt;APERO, AN OPEN SOURCE BUNDLE ADJUSMENT SOFTWARE FOR AUTOMATIC CALIBRATION AND ORIENTATION OF SET OF IMAGES&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XXXVIII-5-W16/269/2011/isprsarchives-XXXVIII-5-W16-269-2011.pdf&lt;br /&gt;
|General presentation of APERO-MicMac, some algorithmic aspects and examples of applications&lt;br /&gt;
|-&lt;br /&gt;
|Antoine Pinte&lt;br /&gt;
|&amp;lt;i&amp;gt;Orthoimages of the outer walls and towers of the château de Chambord&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/II-5-W3/243/2015/isprsannals-II-5-W3-243-2015.pdf&lt;br /&gt;
|On the use of MicMac to produce a photogrammetric documentation in the context of architectural/cultural heritage, example of the château de Chambord&lt;br /&gt;
|-&lt;br /&gt;
|Raphaele Héno&lt;br /&gt;
|&amp;lt;i&amp;gt;COSTLESS PLATFORM FOR HIGH RESOLUTION STEREOSCOPIC IMAGES OF A HIGH GOTHIC FACADE&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XXXIX-B5/559/2012/isprsarchives-XXXIX-B5-559-2012.pdf&lt;br /&gt;
|On the use of MicMac to produce a photogrammetric documentation in the context of architectural/cultural heritage, example of the cathedral of Amiens&lt;br /&gt;
|-&lt;br /&gt;
|Ana-Maria Rosu&lt;br /&gt;
|&amp;lt;i&amp;gt;COASTAL DIGITAL SURFACE MODEL ON LOW CONTRAST IMAGES&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-3-W3/307/2015/isprsarchives-XL-3-W3-307-2015.pdf&lt;br /&gt;
|Advanced tie-point extraction and use of MicMac in sandy coastal environments&lt;br /&gt;
|-&lt;br /&gt;
|Vincent Tournadre&lt;br /&gt;
|&amp;lt;i&amp;gt;UAV PHOTOGRAMMETRY TO MONITOR DYKES – CALIBRATION AND COMPARISON TO TERRESTRIAL LIDAR&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-3-W1/143/2014/isprsarchives-XL-3-W1-143-2014.pdf&lt;br /&gt;
|Using MicMac to process UAV acquisitions in the context of dykes monitoring&lt;br /&gt;
|-&lt;br /&gt;
|Vincent Tournadre&lt;br /&gt;
|&amp;lt;i&amp;gt;UAV LINEAR PHOTOGRAMMETRY&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-3-W3/327/2015/isprsarchives-XL-3-W3-327-2015.pdf&lt;br /&gt;
|Using MicMac to process complex linear axis UAV acquisitions&lt;br /&gt;
|-&lt;br /&gt;
|Jonathan Lisein&lt;br /&gt;
|&amp;lt;i&amp;gt;A Photogrammetric Workflow for the Creation of a Forest Canopy Height Model from Small Unmanned Aerial System Imagery&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.mdpi.com/1999-4907/4/4/922/htm&lt;br /&gt;
|Using MicMac in forest context to model the forest canopy surface&lt;br /&gt;
|-&lt;br /&gt;
|Athanasios Georgantas&lt;br /&gt;
|&amp;lt;i&amp;gt;AN ACCURACY ASSESSMENT OF AUTOMATED PHOTOGRAMMETRIC TECHNIQUES FOR 3D MODELING OF COMPLEX INTERIORS&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XXXIX-B3/23/2012/isprsarchives-XXXIX-B3-23-2012.pdf&lt;br /&gt;
|Comparison of MicMac 3D point clouds with terrestrial laser scanning for the modelling of complex interior spaces&lt;br /&gt;
|-&lt;br /&gt;
|Mariam Samaan&lt;br /&gt;
|&amp;lt;i&amp;gt;CLOSE-RANGE PHOTOGRAMMETRIC TOOLS FOR SMALL 3D ARCHEOLOGICAL OBJECTS&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-5-W2/549/2013/isprsarchives-XL-5-W2-549-2013.pdf&lt;br /&gt;
|Using MicMac in the context of macrophotography for small archaeological objects&lt;br /&gt;
|-&lt;br /&gt;
|Anna Mouget&lt;br /&gt;
|&amp;lt;i&amp;gt;PHOTOGRAMMETRIC ARCHAEOLOGICAL SURVEY WITH UAV&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/II-5/251/2014/isprsannals-II-5-251-2014.pdf&lt;br /&gt;
|Using MicMac to process an archaeological site survey with a UAV&lt;br /&gt;
|-&lt;br /&gt;
|Ewelina Rupnik, Mehdi Daakir and Marc Pierrot Deseilligny&lt;br /&gt;
|&amp;lt;i&amp;gt;MicMac – a free, open-source solution for photogrammetry&amp;lt;/i&amp;gt;&lt;br /&gt;
|https://opengeospatialdata.springeropen.com/articles/10.1186/s40965-017-0027-2&lt;br /&gt;
|An article to discover MicMac and become familiar with its environment, its philosophy and its main tools.&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Datasets&amp;diff=2477</id>
		<title>Datasets</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Datasets&amp;diff=2477"/>
				<updated>2017-04-03T16:13:37Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : added the Ramses dataset&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Here you can find some datasets:&lt;br /&gt;
* Gravillons : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/gravillons_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Fontaine : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/fontaine_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Pierrerue : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/pierrerue_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Zhenjue : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/zhenjue_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* GrandLeez : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/uas_grand_leez_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Chambord Tower : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Chambord_Tower_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Saint Michel De Cuxa : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Saint_Michel_De_Cuxa_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Mur Saint Martin : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Mur_Saint_Martin_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Street Saint Martin : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Street_Saint_Martin_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Tortue Hue : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Tortue_Hue_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Vincennes : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Vincennes_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Viabon : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Viabon_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Mars : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Mars_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Gulya Earthquake : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Guyla_Earthquake_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Concrete : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Concrete_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* DemoScanned : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/DemoScanned_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Ramses : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Ramses_Dataset.zip&amp;lt;/code&amp;gt;&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Datasets&amp;diff=2476</id>
		<title>Datasets</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Datasets&amp;diff=2476"/>
				<updated>2017-04-03T15:57:55Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : added some datasets from the MicMac documentation&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Here you can find some datasets:&lt;br /&gt;
* Gravillons : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/gravillons_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Fontaine : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/fontaine_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Pierrerue : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/pierrerue_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Zhenjue : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/zhenjue_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* GrandLeez : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/uas_grand_leez_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Chambord Tower : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Chambord_Tower_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Saint Michel De Cuxa : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Saint_Michel_De_Cuxa_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Mur Saint Martin : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Mur_Saint_Martin_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Street Saint Martin : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Street_Saint_Martin_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Tortue Hue : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Tortue_Hue_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Vincennes : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Vincennes_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Viabon : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Viabon_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Mars : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Mars_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Gulya Earthquake : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Guyla_Earthquake_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Concrete : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Concrete_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* DemoScanned : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/DemoScanned_Dataset.zip&amp;lt;/code&amp;gt;&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Datasets&amp;diff=2475</id>
		<title>Datasets</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Datasets&amp;diff=2475"/>
				<updated>2017-04-03T14:15:58Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : added the Saint Michel de Cuxa dataset&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Here you can find some datasets:&lt;br /&gt;
* Gravillons : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/gravillons_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Fontaine : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/fontaine_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Pierrerue : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/pierrerue_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Zhenjue : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/zhenjue_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* GrandLeez : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/uas_grand_leez_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Chambord Tower : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Chambord_Tower_Dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Saint Michel De Cuxa : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Saint_Michel_De_Cuxa_Dataset.zip&amp;lt;/code&amp;gt;&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Datasets&amp;diff=2474</id>
		<title>Datasets</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Datasets&amp;diff=2474"/>
				<updated>2017-04-03T12:46:55Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : added a cylindrical-unwrapping dataset&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Here you can find some datasets:&lt;br /&gt;
* Gravillons : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/gravillons_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Fontaine : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/fontaine_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Pierrerue : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/pierrerue_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Zhenjue : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/zhenjue_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* GrandLeez : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/uas_grand_leez_dataset.zip&amp;lt;/code&amp;gt;&lt;br /&gt;
* Chambord Tower : &amp;lt;code&amp;gt;http://micmac.ensg.eu/data/Chambord_Tower_Dataset.zip&amp;lt;/code&amp;gt;&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Forum&amp;diff=2216</id>
		<title>Forum</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Forum&amp;diff=2216"/>
				<updated>2016-11-10T16:52:24Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : added the Sketchfab link&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The MicMac forum is a user-to-user help forum. It can be found here: [http://forum-micmac.forumprod.com/ http://forum-micmac.forumprod.com/]&lt;br /&gt;
&lt;br /&gt;
The MicMac Sketchfab page is available here: [https://sketchfab.com/micmac https://sketchfab.com/micmac]. If you want to participate and add a 3D model generated with MicMac, please send it to mehdi.daakir(at)ensg.eu (the size limit is 50MB).&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=MicMac_tools&amp;diff=2185</id>
		<title>MicMac tools</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=MicMac_tools&amp;diff=2185"/>
				<updated>2016-10-19T13:38:26Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : /* Commands */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== About all commands ==&lt;br /&gt;
All commands come with inline help that can be accessed by typing:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d CommandName -help&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Commands have unnamed and named arguments. The unnamed arguments are mandatory and must be given in order, while the named ones can be given in any order. For instance:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d CommandName UnnamedValue1 UnnamedValue2 NamedArg1=NamedValue1 NamedArg2=NamedValue2&amp;lt;/pre&amp;gt;&lt;br /&gt;
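As a concrete illustration (the tool and argument values below are only an example; check each tool's inline help for its actual arguments), a call to the [[Tapioca]] tie-point tool mixing both kinds of arguments could look like:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d Tapioca MulScale &amp;quot;.*.JPG&amp;quot; 500 1500 ExpTxt=1&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here ''MulScale'', ''&amp;quot;.*.JPG&amp;quot;'', ''500'' and ''1500'' are the unnamed arguments, given in order, while ''ExpTxt=1'' is a named argument.&lt;br /&gt;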
&lt;br /&gt;
If you have a version of MicMac that includes the Qt tools (binaries from the [http://logiciels.ign.fr/?Telechargement,20 IGN download page], or self-compiled with the Qt option activated), each command comes with a GUI containing the options to fill in and a file selection tool where appropriate. These GUIs can be called by prefixing the command name with ''v'':&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;mm3d vCommandName&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Commands ==&lt;br /&gt;
*[[AperiCloud]]&lt;br /&gt;
*[[Apero]]&lt;br /&gt;
*[[Bascule]]&lt;br /&gt;
*[[C3DC]]&lt;br /&gt;
*[[Campari]]&lt;br /&gt;
*[[CenterBascule]]&lt;br /&gt;
*[[CheckDependencies]], to do...&lt;br /&gt;
*[[ChgSysCo]], to do...&lt;br /&gt;
*[[GCPBascule]]&lt;br /&gt;
*[[GCPConvert]]&lt;br /&gt;
*[[GCPCtrl]], to do...&lt;br /&gt;
*[[GrShade]]&lt;br /&gt;
*[[Malt]]&lt;br /&gt;
*[[Nuage2Ply]]&lt;br /&gt;
*[[NuageBascule]]&lt;br /&gt;
*[[OriConvert]]&lt;br /&gt;
*[[PIMs]], to improve...&lt;br /&gt;
*[[PIMs2MNT]]&lt;br /&gt;
*[[PIMs2Ply]]&lt;br /&gt;
*[[RepLocBascule]]&lt;br /&gt;
*[[SaisieAppuisInit]]&lt;br /&gt;
*[[SaisieAppuisInitQT]]&lt;br /&gt;
*[[SaisieAppuisPredic]]&lt;br /&gt;
*[[SaisieAppuisPredicQT]]&lt;br /&gt;
*[[SaisieBasc]]&lt;br /&gt;
*[[SaisieBascQT]]&lt;br /&gt;
*[[SaisieMasq]]&lt;br /&gt;
*[[SaisieMasqQT]]&lt;br /&gt;
*[[SBGlobBascule]]&lt;br /&gt;
*[[ScaleIm]]&lt;br /&gt;
*[[ScaleNuage]]&lt;br /&gt;
*[[SEL]]&lt;br /&gt;
*[[Tapas]]&lt;br /&gt;
*[[Tapioca]]&lt;br /&gt;
*[[Tarama]]&lt;br /&gt;
*[[Tawny]]&lt;br /&gt;
*[[Tequila]]&lt;br /&gt;
*[[TestKey]]&lt;br /&gt;
*[[TiPunch]]&lt;br /&gt;
*[[To8Bits]]&lt;br /&gt;
*[[CmpCalib]]&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=CmpCalib&amp;diff=2184</id>
		<title>CmpCalib</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=CmpCalib&amp;diff=2184"/>
				<updated>2016-10-19T13:34:00Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : Page created with « comparison of calibrations (coming soon) »&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;comparison of calibrations (coming soon)&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Bibliography&amp;diff=1825</id>
		<title>Bibliography</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Bibliography&amp;diff=1825"/>
				<updated>2016-06-12T13:02:38Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : /* Articles */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Articles=&lt;br /&gt;
{|class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Author&lt;br /&gt;
!Title&lt;br /&gt;
!URL&lt;br /&gt;
!Use of MicMac&lt;br /&gt;
|-&lt;br /&gt;
|Luc Girod&lt;br /&gt;
|&amp;lt;i&amp;gt;IMPROVEMENT OF DEM GENERATION FROM ASTER IMAGES USING SATELLITE JITTER ESTIMATION AND OPEN SOURCE IMPLEMENTATION&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-1-W5/249/2015/&lt;br /&gt;
|Describes a method to process ASTER images into DEMs&lt;br /&gt;
|-&lt;br /&gt;
|Olivier Galland&lt;br /&gt;
|&amp;lt;i&amp;gt;APPLICATION OF OPEN-SOURCE PHOTOGRAMMETRIC SOFTWARE MICMAC FOR MONITORING SURFACE DEFORMATION IN LABORATORY MODELS&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://dx.doi.org/10.1002/2015JB012564&lt;br /&gt;
|MicMac for monitoring surface deformation (Gravillons dataset)&lt;br /&gt;
|-&lt;br /&gt;
|Mehdi Daakir&lt;br /&gt;
|&amp;lt;i&amp;gt;UAV ONBOARD PHOTOGRAMMETRY AND GPS POSITIONNING FOR EARTHWORKS&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-3-W3/293/2015/isprsarchives-XL-3-W3-293-2015.pdf&lt;br /&gt;
|Computes the lever arm for UAV onboard photogrammetry&lt;br /&gt;
|-&lt;br /&gt;
|Marc Pierrot-Deseilligny&lt;br /&gt;
|&amp;lt;i&amp;gt;A MULTIRESOLUTION AND OPTIMIZATION-BASED IMAGE MATCHING APPROACH: AN APPLICATION TO SURFACE RECONSTRUCTION FROM SPOT5-HRS STEREO IMAGERY&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.isprs.org/proceedings/xxxvi/1-w41/makaleler/Pierrot_multiresolution_matching.pdf&lt;br /&gt;
|Describes the multi-scale approach that MicMac uses&lt;br /&gt;
|-&lt;br /&gt;
|Marc Pierrot-Deseilligny&lt;br /&gt;
|&amp;lt;i&amp;gt;APERO, AN OPEN SOURCE BUNDLE ADJUSMENT SOFTWARE FOR AUTOMATIC CALIBRATION AND ORIENTATION OF SET OF IMAGES&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XXXVIII-5-W16/269/2011/isprsarchives-XXXVIII-5-W16-269-2011.pdf&lt;br /&gt;
|General presentation of APERO-MicMac, some algorithmic aspects and examples of applications&lt;br /&gt;
|-&lt;br /&gt;
|Antoine Pinte&lt;br /&gt;
|&amp;lt;i&amp;gt;Orthoimages of the outer walls and towers of the château de Chambord&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/II-5-W3/243/2015/isprsannals-II-5-W3-243-2015.pdf&lt;br /&gt;
|On the use of MicMac to produce a photogrammetric documentation in the context of architectural/cultural heritage, example of the château de Chambord&lt;br /&gt;
|-&lt;br /&gt;
|Raphaele Héno&lt;br /&gt;
|&amp;lt;i&amp;gt;COSTLESS PLATFORM FOR HIGH RESOLUTION STEREOSCOPIC IMAGES OF A HIGH GOTHIC FACADE&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XXXIX-B5/559/2012/isprsarchives-XXXIX-B5-559-2012.pdf&lt;br /&gt;
|On the use of MicMac to produce a photogrammetric documentation in the context of architectural/cultural heritage, example of the cathedral of Amiens&lt;br /&gt;
|-&lt;br /&gt;
|Ana-Maria Rosu&lt;br /&gt;
|&amp;lt;i&amp;gt;COASTAL DIGITAL SURFACE MODEL ON LOW CONTRAST IMAGES&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-3-W3/307/2015/isprsarchives-XL-3-W3-307-2015.pdf&lt;br /&gt;
|Advanced tie-point extraction and use of MicMac in sandy coastal environments&lt;br /&gt;
|-&lt;br /&gt;
|Vincent Tournadre&lt;br /&gt;
|&amp;lt;i&amp;gt;UAV PHOTOGRAMMETRY TO MONITOR DYKES – CALIBRATION AND COMPARISON TO TERRESTRIAL LIDAR&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-3-W1/143/2014/isprsarchives-XL-3-W1-143-2014.pdf&lt;br /&gt;
|Using MicMac to process UAV acquisitions in the context of dykes monitoring&lt;br /&gt;
|-&lt;br /&gt;
|Vincent Tournadre&lt;br /&gt;
|&amp;lt;i&amp;gt;UAV LINEAR PHOTOGRAMMETRY&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-3-W3/327/2015/isprsarchives-XL-3-W3-327-2015.pdf&lt;br /&gt;
|Using MicMac to process complex linear axis UAV acquisitions&lt;br /&gt;
|-&lt;br /&gt;
|Jonathan Lisein&lt;br /&gt;
|&amp;lt;i&amp;gt;A Photogrammetric Workflow for the Creation of a Forest Canopy Height Model from Small Unmanned Aerial System Imagery&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.mdpi.com/1999-4907/4/4/922/htm&lt;br /&gt;
|Using MicMac in forest context to model the forest canopy surface&lt;br /&gt;
|-&lt;br /&gt;
|Athanasios Georgantas&lt;br /&gt;
|&amp;lt;i&amp;gt;AN ACCURACY ASSESSMENT OF AUTOMATED PHOTOGRAMMETRIC TECHNIQUES FOR 3D MODELING OF COMPLEX INTERIORS&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XXXIX-B3/23/2012/isprsarchives-XXXIX-B3-23-2012.pdf&lt;br /&gt;
|Comparison of MicMac 3D point clouds with terrestrial laser scanning for the modelling of complex interior spaces&lt;br /&gt;
|-&lt;br /&gt;
|Mariam Samaan&lt;br /&gt;
|&amp;lt;i&amp;gt;CLOSE-RANGE PHOTOGRAMMETRIC TOOLS FOR SMALL 3D ARCHEOLOGICAL OBJECTS&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-5-W2/549/2013/isprsarchives-XL-5-W2-549-2013.pdf&lt;br /&gt;
|Using MicMac in the context of macrophotography for small archaeological objects&lt;br /&gt;
|-&lt;br /&gt;
|Anna Mouget&lt;br /&gt;
|&amp;lt;i&amp;gt;PHOTOGRAMMETRIC ARCHAEOLOGICAL SURVEY WITH UAV&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/II-5/251/2014/isprsannals-II-5-251-2014.pdf&lt;br /&gt;
|Using MicMac to process an archaeological site survey with a UAV&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Bibliography&amp;diff=1824</id>
		<title>Bibliography</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Bibliography&amp;diff=1824"/>
				<updated>2016-06-12T11:14:05Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : /* Article */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Articles=&lt;br /&gt;
{|class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
!Author&lt;br /&gt;
!Title&lt;br /&gt;
!URL&lt;br /&gt;
!Use of MicMac&lt;br /&gt;
|-&lt;br /&gt;
|Luc Girod&lt;br /&gt;
|&amp;lt;i&amp;gt;IMPROVEMENT OF DEM GENERATION FROM ASTER IMAGES USING SATELLITE JITTER ESTIMATION AND OPEN SOURCE IMPLEMENTATION&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-1-W5/249/2015/&lt;br /&gt;
|Describes a method to process ASTER images into DEMs&lt;br /&gt;
|-&lt;br /&gt;
|Olivier Galland&lt;br /&gt;
|&amp;lt;i&amp;gt;APPLICATION OF OPEN-SOURCE PHOTOGRAMMETRIC SOFTWARE MICMAC FOR MONITORING SURFACE DEFORMATION IN LABORATORY MODELS&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://dx.doi.org/10.1002/2015JB012564&lt;br /&gt;
|MicMac for monitoring surface deformation (Gravillons dataset)&lt;br /&gt;
|-&lt;br /&gt;
|Mehdi Daakir&lt;br /&gt;
|&amp;lt;i&amp;gt;UAV ONBOARD PHOTOGRAMMETRY AND GPS POSITIONNING FOR EARTHWORKS&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-3-W3/293/2015/isprsarchives-XL-3-W3-293-2015.pdf&lt;br /&gt;
|Computes the lever arm for UAV onboard photogrammetry&lt;br /&gt;
|-&lt;br /&gt;
|Marc Pierrot-Deseilligny&lt;br /&gt;
|&amp;lt;i&amp;gt;A MULTIRESOLUTION AND OPTIMIZATION-BASED IMAGE MATCHING APPROACH: AN APPLICATION TO SURFACE RECONSTRUCTION FROM SPOT5-HRS STEREO IMAGERY&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.isprs.org/proceedings/xxxvi/1-w41/makaleler/Pierrot_multiresolution_matching.pdf&lt;br /&gt;
|Describes the multi-scale approach that MicMac uses&lt;br /&gt;
|-&lt;br /&gt;
|Marc Pierrot-Deseilligny&lt;br /&gt;
|&amp;lt;i&amp;gt;APERO, AN OPEN SOURCE BUNDLE ADJUSMENT SOFTWARE FOR AUTOMATIC CALIBRATION AND ORIENTATION OF SET OF IMAGES&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XXXVIII-5-W16/269/2011/isprsarchives-XXXVIII-5-W16-269-2011.pdf&lt;br /&gt;
|General presentation of APERO-MicMac, some algorithmic aspects and examples of applications&lt;br /&gt;
|-&lt;br /&gt;
|Antoine Pinte&lt;br /&gt;
|&amp;lt;i&amp;gt;Orthoimages of the outer walls and towers of the château de Chambord&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/II-5-W3/243/2015/isprsannals-II-5-W3-243-2015.pdf&lt;br /&gt;
|On the use of MicMac to produce a photogrammetric documentation in the context of architectural/cultural heritage, example of the château de Chambord&lt;br /&gt;
|-&lt;br /&gt;
|Raphaele Héno&lt;br /&gt;
|&amp;lt;i&amp;gt;COSTLESS PLATFORM FOR HIGH RESOLUTION STEREOSCOPIC IMAGES OF A HIGH GOTHIC FACADE&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XXXIX-B5/559/2012/isprsarchives-XXXIX-B5-559-2012.pdf&lt;br /&gt;
|On the use of MicMac to produce a photogrammetric documentation in the context of architectural/cultural heritage, example of the cathedral of Amiens&lt;br /&gt;
|-&lt;br /&gt;
|Ana-Maria Rosu&lt;br /&gt;
|&amp;lt;i&amp;gt;COASTAL DIGITAL SURFACE MODEL ON LOW CONTRAST IMAGES&amp;lt;/i&amp;gt;&lt;br /&gt;
|http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-3-W3/307/2015/isprsarchives-XL-3-W3-307-2015.pdf&lt;br /&gt;
|Advanced tie-point extraction and use of MicMac in sandy coastal environments&lt;br /&gt;
&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	<entry>
		<id>http://micmac.ensg.eu/index.php?title=Fichier:Vinci-construction-logo.png&amp;diff=1823</id>
		<title>Fichier:Vinci-construction-logo.png</title>
		<link rel="alternate" type="text/html" href="http://micmac.ensg.eu/index.php?title=Fichier:Vinci-construction-logo.png&amp;diff=1823"/>
				<updated>2016-06-08T12:27:16Z</updated>
		
		<summary type="html">&lt;p&gt;Mdaakir : Mdaakir uploaded a new version of Fichier:Vinci-construction-logo.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Mdaakir</name></author>	</entry>

	</feed>