Webinar | Drone LiDAR Survey: The Modern Drone Automation Guide to Achieve the Best Data Results
As LiDAR has become more accessible than ever, more businesses are able to benefit from this technology.
In this webinar, SPH Engineering and MODUS discussed how to build a successful drone LiDAR survey workflow and achieve the best LiDAR data collection results.
- Drone LiDAR optimization: the ultimate guide to choosing the right drone LiDAR sensor combination for your use case
- The critical elements your flight automation must have to capture high-accuracy LiDAR data
- Top 4 most beneficial LiDAR planning elements to maximize data capture for your use case
- UgCS LiDAR tools in action
- How do you start your drone LiDAR journey?
Dan Hubert, Owner at MODUS
MODUS is an enterprise geospatial drone consulting firm. Mapping Operations and Data Unmanned Solutions (a.k.a. MODUS) is a logistics automation and geospatial intelligence innovation company, offering solutions that reduce the cost of transporting sensors and cargo while providing business growth opportunities through increased business intelligence. To achieve this, MODUS fuses its expertise in drones, geospatial sensors, business analysis, digital workflow design, geographic information visualization, and learning systems into a cohesive operational intelligence product.
Alexey Yankelevich, Co-Founder and Head of Software Development at SPH Engineering
SPH Engineering is a multiproduct drone software company and UAV integration services provider. Founded in 2013 in Latvia (EU) as a UAV mission planning and flight control start-up, the company has evolved from a developer of a single flagship product, UgCS, into a market leader in multiple drone solutions. To provide high-quality solutions for UAV professionals, SPH Engineering's team advances four key product lines: UgCS (mission planning and flight control software), UgCS Integrated Systems (airborne integrated systems with sensors from diverse manufacturers), Drone Show Software (the only commercially available software to manage drone swarm flights) and ATLAS (an AI platform to process and analyze geospatial data).
How do you account for the true grade elevation with the varying elevation of the drone?
The typical approach is to use ground control points (GCPs).
What are the units for the accuracy line? / So, is an accuracy of 50 or 60 in mm?
When we talk about survey accuracies under US survey standards, we talk in terms of contour interval. If you have a one-foot contour, for example, your data has to be within six inches, or half a foot. If you are going for a six-inch contour, you will want three-inch accuracy; three inches is about 76 mm.
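The rule of thumb above (required vertical accuracy is half the contour interval) is just a unit conversion; a minimal sketch, assuming only that half-interval rule and the standard 25.4 mm per inch:

```python
# Rule of thumb described above: required vertical accuracy
# is half the contour interval.
MM_PER_INCH = 25.4

def required_accuracy_mm(contour_interval_inches: float) -> float:
    """Return the required vertical accuracy in millimetres
    for a given contour interval in inches."""
    return (contour_interval_inches / 2) * MM_PER_INCH

# One-foot (12 in) contour -> 6 in, i.e. ~152 mm accuracy
print(required_accuracy_mm(12))
# Six-inch contour -> 3 in, i.e. ~76 mm accuracy, as quoted above
print(required_accuracy_mm(6))
```

The "76 mm" figure in the answer is this 76.2 mm value rounded.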
Are these [LiDAR] units RTK compatible, or PPK only?
If you buy an RTK system, you will need a radio to communicate directly with your drone or your payload package, whereas with a PPK system all you need to do is download your data. The bottom line is that the better your telemetry data is, the better your LiDAR data is likely to be. This is why there is somewhat of a debate over RTK vs. PPK, but at the end of the day, if it meets your data accuracy requirements, then you are good. That's the main thing you should be looking for.
How does LiDAR perform in the mining industry and can we integrate AI to it?
The preferred method is photogrammetry unless you're doing volume at scale.
Photogrammetry is better for collecting aggregates because it's cheaper, but LiDAR is faster: you can pull aggregate volumes that are relatively correct within 10 minutes of landing and start delivering an aggregates report. Therefore, if you need timely reporting of larger sites, LiDAR is the way to go, because the turnaround on the information is better.
Could you summarize the correct action sequence for programming a DJI Zenmuse L1 LiDAR mission? In particular, how do you start LiDAR + RGB data acquisition within UgCS, and how/when do you do IMU calibration? How do DJI Pilot and UgCS interact?
Have we seen any hybrid processing uniting Photogrammetry and LiDAR in accuracy improvements?
If you use GCPs, you can tie the photogrammetry cloud and the LiDAR cloud together.
Can UgCS be used with the Zenmuse L1 LiDAR sensor?
Do you have any side-by-side error data for high-quality photogrammetry results vs. entry-, mid-, and pro-level LiDAR results?
Both LiDAR and photogrammetry have their place in the market, and we recommend using both. If you do bare-earth scans, then photogrammetry is the way to go, unless you're dealing with things that have very low contrast, or you need something more timely. A LiDAR system is best suited for verticals. It's also good for structures: if you're doing towers and that kind of thing, you are going to get much more accurate data with LiDAR. And the last area is vegetation penetration.
What will the strip-to-strip accuracy be in terms of planimetry and height?
It depends on the sensor.
In the few flights I've done, my DJI Matrice seems to almost stop and pause when making turns. Is that most effective for the best accuracy?
The best approach is to use smooth turns. For sharp turns, we recommend turns with loops. Both are supported in UgCS.
Is there a preferable scanning pattern for a LiDAR (circular, linear, etc.) for better vegetation penetration, apart from the "exposure" time of the area? / So if you want a point cloud over a forest (for tree metrics), do you also suggest a double-grid flight for better results?
We do not recommend any specific pattern.
When it comes to forestry applications, what’s the best path to growth to follow, when funding is limited? What kind of reliable information can be obtained with the most basic possible solution?
Even with the DJI L1, tree heights can be determined.
What’s the map reference in UGCS for terrain follow?
By default, we use SRTM. However, pilots can upload a GeoTIFF from photogrammetry processing into UgCS.
Are there any specific tips for the selection of GCPs for drone LiDAR?
There are many tips on this, but the main thing to remember is that you need to have ground control. The subject is very broad, though, and MODUS teaches a dedicated course on it.
Most LiDAR suppliers suggest flying an initial figure-eight pattern before entering the mission for better IMU initialization. What are your thoughts on that?
Yes, this is standard practice. In UgCS, you can do it with an automatic command that will build the figure-eight for you.
We have only done photogrammetry. Instead of making pit stops on a long flight to change batteries, we split one long mission into multiple overlapping missions that can each be flown on one battery. Is this an option for LiDAR missions?
Yes, this is also possible. With UgCS, you can resume long flights or split them into parts.
How would the figure-eight calibration process be done?
In UgCS, you can do it with an automatic command that will build the figure-eight for you.
Can you review the importance of a flight line that crosses the parallel flight lines? Is it better to fly the offset-angle line before or after the parallel flight lines?
Are calibration patterns required for LiDAR systems that have two GNSS antennas to estimate the heading?
Yes, we recommend using calibration patterns.
What type of target do you recommend using?
LiDAR does not see targets the same way photogrammetry does. Daniel will post a video comparing a LiDAR target and a photogrammetry target on the MODUS YouTube channel.
What format of DEM can you consume with UgCS, and can you show us how to bring them in and use them?
By default, we use SRTM. However, pilots can upload a GeoTIFF with elevation data from photogrammetry processing or a previous LiDAR mission into UgCS.
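As an illustration only (not UgCS code), terrain following boils down to sampling the elevation model under each waypoint and adding the desired height above ground. A minimal sketch, using a hypothetical in-memory DEM grid in place of the SRTM or GeoTIFF data UgCS would consume:

```python
# Illustrative sketch of terrain following: hold a constant height
# above ground level (AGL) by sampling a DEM under each waypoint.
# The DEM here is a hypothetical grid standing in for real
# SRTM/GeoTIFF elevation data.

def terrain_follow_altitudes(dem, waypoints, agl_m):
    """For each (row, col) waypoint, return the target flight
    altitude: terrain elevation at that cell plus desired AGL."""
    return [dem[r][c] + agl_m for r, c in waypoints]

# Terrain elevations in metres (e.g. resampled from a GeoTIFF)
dem = [
    [100.0, 102.0, 105.0],
    [101.0, 107.0, 110.0],
    [103.0, 112.0, 118.0],
]

# Fly a diagonal line at a constant 50 m AGL
altitudes = terrain_follow_altitudes(dem, [(0, 0), (1, 1), (2, 2)], 50.0)
print(altitudes)  # [150.0, 157.0, 168.0]
```

The quality of the result depends entirely on the DEM, which is why uploading a GeoTIFF from a prior survey beats the default SRTM for tight terrain following.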