Advanced knowledge - Scientific papers


Classification of aerial photogrammetric 3D point clouds

Click here to download the pdf.

Photogrammetric accuracy and modeling of rolling shutter cameras

Click here to download the pdf.

Terrestrial 3D mapping using fisheye and perspective sensors

Click here to download the pdf.

UAV collision and crime scene investigation

Click here to download the pdf.

Simplified building models extraction from ultra-light UAV

Click here to download the pdf.

Automatic mapping from ultra-light UAV imagery

Click here to download the pdf.

Photogrammetric performance of an ultra-light weight swinglet UAV

Click here to download the pdf.

Wide-baseline stereo from multiple views: a probabilistic account

Click here to download the pdf.

A generative model for true orthorectification

Click here to download the pdf.

The accuracy of automatic photogrammetric techniques on ultra-light UAV imagery

Click here to download the pdf.

Dense matching of multiple wide-baseline views

Click here to download the pdf.

The Chillon project: aerial / terrestrial and indoor integration

Click here to download the pdf.

Dynamic and scalable large scale image reconstruction

Click here to download the pdf.

How accurate are UAV surveying methods

Click here to download the pdf.

3D error estimation from tie points

Click here to download the pdf.

Assessment of the radiometric accuracy in a target-less workflow using Pix4D software

Click here to download the pdf.

A new method to determine multi-angular reflectance factor from lightweight multispectral cameras with sky sensor in a target-less workflow applicable to UAV

Abstract

A new physically based method is presented to estimate the hemispherical-directional reflectance factor (HDRF) from lightweight multispectral cameras equipped with a downwelling irradiance sensor. It combines radiometry with photogrammetric computer vision to derive geometrically and radiometrically accurate data purely from the images, without requiring reflectance targets or any other additional information apart from the imagery. The sky sensor orientation is initially computed using photogrammetric computer vision and then refined with a non-linear regression that combines radiometric and photogrammetry-derived information. The method works under both clear sky and overcast conditions. In a ground-based test, a Spectralon target was observed from different viewing directions and with different sun positions using a typical multispectral sensor configuration, for both clear sky and overcast conditions; both the overall value and the directionality of the reflectance factor, as reported in the literature, were well retrieved. An RMSE of 3% for clear sky and up to 5% for overcast sky was observed.

To access the full article, visit the journal Remote Sensing of Environment or read the preprint.
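The radiometric quantity at the heart of the abstract above can be illustrated with a minimal sketch: for a Lambertian reference, the reflectance factor of a target is π·L/E, where L is the target radiance and E the downwelling irradiance measured by the sky sensor. The function and variable names below are illustrative only; the paper's actual workflow additionally refines the sky sensor orientation via non-linear regression, which is not shown here.

```python
import numpy as np

def reflectance_factor(radiance, irradiance):
    """Per-band reflectance factor for a Lambertian target.

    radiance:   target radiance per band (W m^-2 sr^-1 nm^-1)
    irradiance: downwelling irradiance per band from the sky sensor
                (W m^-2 nm^-1)

    Returns pi * L / E per band. Names and values here are
    illustrative, not the API or data of the cited paper.
    """
    radiance = np.asarray(radiance, dtype=float)
    irradiance = np.asarray(irradiance, dtype=float)
    return np.pi * radiance / irradiance

# Hypothetical example: four spectral bands (green, red, red-edge, NIR)
L = np.array([0.030, 0.045, 0.060, 0.080])  # target radiance per band
E = np.array([0.35, 0.40, 0.38, 0.42])      # sky-sensor irradiance per band
hdrf = reflectance_factor(L, E)
```

In a target-less workflow such as the one described above, E comes from the camera's irradiance sensor rather than from a reflectance panel on the ground, which is why the sensor's orientation relative to the sky must be estimated accurately.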

Seeing the Invisible - Multispectral Image Analysis in Arable Farming

Click here to download the pdf.