1. Low Cost Solutions for
LiDAR Point Cloud Analysis
-Utilizing Open Source and Low Cost software-
Lidar Technologies 2012
Cairns, 7th June, 2012
Rahman Schionning and George Corea
Atherton Tablelands Geographic Information Services
2. Overview
Where we were a year ago
The search for new software
SAGA Demo – LAS Capabilities
Python integration
Dealing with outputs and derivative products
The 32bit Windows shortfall
More tools we are exploring
A future option
3. So...
You've gone and gotten yourself a point cloud!
Now What?
4. What software have we already got?
• ArcGIS and MapInfo could handle derivative
products like the 1m ASCII DEM, 12.5cm Imagery
and the 25cm Contours (in small portions)
• Erdas Imagine could import .las for rasterization and
subsequent analysis
• Global Mapper worked well but we only had one
license at the time.
• Mars Freeview
9. Customized Tools
• SAGA/Python-based extraction of .las files
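A minimal sketch of how SAGA can be driven from Python for batch .las extraction, by shelling out to `saga_cmd` once per tile. The module and tool names here (`io_shapes_las`, `Import LAS Files`) are assumptions; check `saga_cmd --help` against your SAGA build before relying on them.

```python
import subprocess
from pathlib import Path

def build_saga_las_import(las_path, out_points):
    # Module/tool names are assumptions -- verify against your SAGA version.
    return ["saga_cmd", "io_shapes_las", "Import LAS Files",
            "-FILES", str(las_path), "-POINTS", str(out_points)]

def import_tiles(tile_names, las_dir, out_dir):
    """Run the SAGA import once per lidar tile name, as in a batch workflow
    driven by a list of key tile names."""
    for name in tile_names:
        cmd = build_saga_las_import(Path(las_dir) / (name + ".las"),
                                    Path(out_dir) / (name + ".spc"))
        subprocess.run(cmd, check=True)
```

Because the command is built separately from where it is executed, the same list of tile names can be re-run unattended (e.g. overnight) with `check=True` stopping the batch on the first failed tile.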
10. Dealing with outputs of Lidar
- ArcGIS-centred approaches
• Even 10 km² blocks of 25 cm data (usually) crashed in vector processing.
• The solution is to “chunk” the datasets and then merge them after analysis.
• The issue is that for the area of TRC this means…
hundreds of files =
high operator time for running processes =
increased risk of operator error :-)
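The “chunk, process, merge” approach above can be sketched as simple extent tiling. In our workflow the actual clipping and merging were done with ArcGIS tools; this hypothetical helper only shows the arithmetic of cutting one bounding box into sub-extents that can be processed independently.

```python
def tile_extent(xmin, ymin, xmax, ymax, nx, ny):
    """Split one bounding box into nx * ny sub-extents so each chunk can be
    clipped and processed separately, then merged after analysis."""
    dx = (xmax - xmin) / nx
    dy = (ymax - ymin) / ny
    return [(xmin + i * dx, ymin + j * dy,
             xmin + (i + 1) * dx, ymin + (j + 1) * dy)
            for i in range(nx) for j in range(ny)]
```

For a 10 km block at 2 x 2 tiles this yields four 5 km sub-extents; each tuple can be fed straight to a clip tool as a processing extent.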
13. Other complex “Point Clouds”
- ArcGIS couldn’t process datasets of ~1 million points at 30 m spacing (the same was true of MapInfo and QGIS)
Utilized…
• the UNIX “sed” command to reformat the txt dataset into a geocodable format,
• then Python to “chunk” the datasets into segments of 500k points,
• finally models to rejoin all the processed data.
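The first two steps above might look like the following sketch. The `sed` expression in the comment is only illustrative (the real one depends on the source layout), and the chunker simply splits a large point text file into fixed-size segments, here parameterised rather than hard-coded to 500k.

```python
# Reformat step (UNIX), e.g. drop a header line and swap the first two
# comma-separated fields so the coordinates geocode -- illustrative only:
#   sed -e '1d' -e 's/^\([^,]*\),\([^,]*\)/\2,\1/' raw.txt > points.txt

def _flush(buf, prefix, n):
    path = f"{prefix}_{n:04d}.txt"
    with open(path, "w") as out:
        out.writelines(buf)
    return path

def chunk_file(src, out_prefix, lines_per_chunk=500_000):
    """Split a large point text file into segments of lines_per_chunk points."""
    paths, buf, n = [], [], 0
    with open(src) as fh:
        for line in fh:
            buf.append(line)
            if len(buf) == lines_per_chunk:
                paths.append(_flush(buf, out_prefix, n))
                buf, n = [], n + 1
    if buf:  # final partial segment
        paths.append(_flush(buf, out_prefix, n))
    return paths
```

Streaming line by line keeps memory flat regardless of input size, which matters when the source file itself is too big for the GIS packages to load.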
14. Some hard lessons…
Dataset size:
- Spatially splitting data significantly reduces processing time even when
multiple parts have to be merged at the completion of the process. Models
including python components have helped significantly.
“[If] there were originally 65,000 * 1000^2 = 6.5 E10 cells in the DEM. To
represent each of these requires at least four ordered pairs of either 4-byte integer or 8-
byte floating coordinates, or 32-64 bytes. That's a 1.3 E12 - 2.6 E12 byte (1.3 - 2.5 TB)
requirements. We haven't even begun to account for file overhead (a feature is stored as
more than just its coordinates), indexes, or the attribute values, which themselves could
need 0.6 TB (if stored in double precision) or more (if stored as text), plus storage for
identifiers. Oh, yes--ArcGIS likes to keep two copies of each intersection around, thereby
doubling everything. You might need 7-8 TB just to store the output.
Even if you had the storage needed, (a) you might use twice this (or more) if ArcGIS is
caching intermediate files and (b) it's doubtful that the operation would complete in any
reasonable time, anyway.”
A response to my query (http://gis.stackexchange.com/questions/16110/issues-with-large-datasets) which succinctly describes the issues with large datasets.
15. Some hard lessons…
- For example:
A dataset which took 7 hours to process (and sometimes crashed)
processed in about 100 minutes when split into 6 parts, and then took 10 minutes to merge.
- We need to investigate multi-core, multi-threaded operations, but the currently
available modules are too cumbersome to integrate easily into our models.
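The kind of multi-core operation we were after can be sketched with Python's standard `multiprocessing.Pool`, fanning the already-chunked datasets out across cores. The `process_chunk` body here is a placeholder for the real per-chunk geoprocessing step, which is what made the integration cumbersome in practice.

```python
from multiprocessing import Pool

def process_chunk(chunk):
    # Placeholder for the real per-chunk geoprocessing step.
    return sum(chunk)

def process_all(chunks, workers=4):
    """Fan the chunked datasets out across CPU cores, then collect results
    in the original chunk order for the merge step."""
    with Pool(workers) as pool:
        return pool.map(process_chunk, chunks)
```

`pool.map` preserves input order, so the merge step can rejoin outputs without tracking which worker produced which chunk.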
16. Added “intelligence” to models so that unnecessary processes aren’t run “brute force” in ArcGIS ModelBuilder.
• Some steps took ~12 h to run, so this quick (~10 min) check saved ~10 h per model on runs that took 30+ hours to complete.
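The “intelligence” described above was built inside ArcGIS ModelBuilder; the idea itself is just a cheap precondition check before an expensive step. A hypothetical sketch, using output existence and file timestamps as the skip test (the real models used model-specific conditions):

```python
import os

def needs_processing(input_path, output_path):
    """Cheap check: rerun the expensive step only if the output is missing
    or older than the input."""
    if not os.path.exists(output_path):
        return True
    return os.path.getmtime(input_path) > os.path.getmtime(output_path)

def run_step(input_path, output_path, expensive_fn):
    """Guard a ~12 h step behind the quick check above."""
    if needs_processing(input_path, output_path):
        expensive_fn(input_path, output_path)
        return "ran"
    return "skipped"
```

When a model chains many such steps, rerunning after a crash only repeats the steps whose outputs are actually stale, which is where the ~10 h per model was recovered.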
18. More Tools
• Global Mapper ~$350
• Great for conversion and clipping.
• 3D viewing exists but is not as good as MARS FreeView; also does flood and viewshed analysis.
• Meshlab ~$0
• Only useful for visualization of small areas and objects at high resolution.
• Google Sketchup ~$0-$500
• Similar to Meshlab in usefulness for GIS.
•Commercial > $2500
o Makai Voyager – resamples las files to proprietary format. Highly
functional for detailed analysis but commercial version is not yet
launched
o Point Tools – resamples las files to proprietary format. Highly
functional for detailed analysis
o LASTools -
o Mars Pro -
* This process was initiated almost a year ago, and only recently have tools been launched to do this (e.g. LASTools). It required only a list of the key lidar tile names, used an ArcView licence and SAGA/open-source tools, reduced operator time to approx. 5 mins per project, and processed overnight. It is quicker in the new LASTools, as the SAGA/DOS/Python route required multiple conversions.
* OK, so we have created high-density outputs from the Lidar, such as contours, DEMs etc. Now we can’t do any large-scale analysis due to system issues…