Bandwagon's Drones Thread

Bandwagon

From what I've heard, most forest inventory depends on changing the light spectrum so you can see different colors of leaves and determine species. And this is mostly a manual process?

And even then they still have to have someone go out and manually plot the stand of timber to get an idea of DBH (diameter at breast height), or in other words the diameter of the trees about 4.5' off the ground.
DBH and a count of trees are all we're aiming for as a first step. Both require the trunks to have enough points to at least define a semi-circle, enough that it's human-interpretable or semi-reliable via AI. We're probably not going to get the point density from airborne lidar to do that reliably. I'm sure we'll pick some up, but I wouldn't scope a project and plan to do it that way.

The GeoSLAM is absolutely capable of doing that, just like any terrestrial scanner. The advantage of the GeoSLAM is that you can scan while you walk and cover a lot of ground. It's just impractical to do with an SX10, at least on the tree inventory projects we do. The last one was 6k trees. The issue with the GeoSLAM is that it's harder to classify ground and, based on that classification, filter to breast height in undulating terrain. So I've been thinking of either using both on a big project, or just using USGS 3DEP terrain if it's good enough in the area.
That's about as close as I've looked at the GeoSLAM dataset for that one though... filter to elevations at breast height and see if we have enough points to extract centroid and DBH. We did. I'll try with the M300 L1 too, but I'm not holding my breath.
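
For that first step (tree count + DBH), here's a minimal sketch of what the breast-height slice and circle fit could look like, assuming the points are already normalized to height above ground (from a ground classification or the 3DEP terrain mentioned above). The library choices (numpy, scikit-learn/DBSCAN) and all thresholds are my own assumptions, not the actual workflow:

[CODE]
import numpy as np
from sklearn.cluster import DBSCAN

def estimate_trunks(points_xyz, slice_lo=1.2, slice_hi=1.5, eps=0.15, min_pts=20):
    """points_xyz: Nx3 array of (x, y, height above ground) in meters."""
    # 1. Keep only the breast-height band (~1.37 m / 4.5 ft above ground).
    band = points_xyz[(points_xyz[:, 2] >= slice_lo) & (points_xyz[:, 2] <= slice_hi)]

    # 2. Cluster the slice in plan view; each dense cluster is a trunk candidate.
    labels = DBSCAN(eps=eps, min_samples=min_pts).fit_predict(band[:, :2])

    trunks = []
    for lbl in set(labels) - {-1}:                      # -1 = DBSCAN noise
        xy = band[labels == lbl, :2]
        # 3. Algebraic circle fit: x^2 + y^2 + D*x + E*y + F = 0, solved by least squares.
        A = np.c_[xy, np.ones(len(xy))]
        b = -(xy[:, 0] ** 2 + xy[:, 1] ** 2)
        (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
        cx, cy = -D / 2.0, -E / 2.0
        r = np.sqrt(cx ** 2 + cy ** 2 - F)
        if 0.03 < r < 1.0:                              # discard implausible trunk radii
            trunks.append((cx, cy, 2 * r))              # centroid + DBH, meters
    return trunks                                       # len(trunks) ~= stem count
[/CODE]

Whether each trunk has enough points to define at least a semi-circle is exactly what decides if the fit is usable, which is the same check described above.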

For the species classification, I'm guessing that comes down to some very expensive lasers, a spectral library and some AI training? My friend is on the cutting edge of this stuff and we were just talking last week about the DBH thing, and he mentioned "supposedly they're able to extract species now too, with the new [whatever wavelength] lasers". There's a newish "green" laser that's supposed to be pretty decent at collecting bathymetry, too. It's mainly limited by the turbidity of the water and is supposed to penetrate about 50% further than you can see with the human eye. No idea if they're one and the same.
 

Tmac

For the species classification, I'm guessing that comes down to some very expensive lasers, a spectral library and some AI training? My friend is on the cutting edge of this stuff and we were just talking last week about the DBH thing, and he mentioned "supposedly they're able to extract species now too, with the new [whatever wavelength] lasers". There's a newish "green" laser that's supposed to be pretty decent at collecting bathymetry, too. It's mainly limited by the turbidity of the water and is supposed to penetrate about 50% further than you can see with the human eye. No idea if they're one and the same.

Yeah, they use some sort of light-spectrum laser to identify different species. Honestly, whoever cracks the code on generating legit inventories on large tracts of timber (100,000+ acres), using drones, is going to make billions. They'd make hundreds of millions from federal and state agencies alone.
 

Bandwagon

Yeah, they use some sort of light-spectrum laser to identify different species. Honestly, whoever cracks the code on generating legit inventories on large tracts of timber (100,000+ acres), using drones, is going to make billions. They'd make hundreds of millions from federal and state agencies alone.
I don't think it'll make sense to do areas that size with drones for a good while. Manned aircraft can carry the same sensors, and much better ones. We set control for hundreds of miles of transmission line surveys done by helicopter, and those guys fly lower than I do with the drones.

I usually start requesting quotes from manned aircraft providers once we get over 3k acres. Once you get over 6k acres, it's pretty much always cheaper to go manned, in my experience at least. Also fewer targets to place. We did a 6k acre survey last year and my estimate was ~80 targets and 4 days to acquire. The manned aircraft needed 18 targets and one day to fly, including 400 miles to mobilize.

I actually just talked to those guys this morning. I was trying to hire their main compiler, but he's getting a stake in the company next month. We talked about the possibility of acquiring them, though. And also, we might be cheaper for them to use for acquisition on ~45 of their annual projects than the airplane they've been contracting. We priced out 5 sites today and we would be 70% of the cost of doing it with manned. I sent him 2x datasets from the new drone to take a look and see what he thinks.

I was a busy boy this week. I'm kinda reinvigorated. The ball is rolling on a geospatial group. The jury is out on whether the new hire is going to get anywhere himself, but all of the groups are hiring staff with GIS backgrounds as a priority now. Now that I'm not the only one pushing that shit along, I'm getting back to focusing on UAS stuff and being ready for staff that can fully utilize what I can make. ;)

And my friend that did the training is pumped we're getting set up with the same equipment. Our workflows were already the same, but we're moving into the same software suites too. They have a much narrower focus, so fewer applications for drones than us... but a crazy capable data science and remote sensing team. We're going to have a meeting next week to see if we can include them as a teaming partner on a bunch of different stuff and see if we can help fund their R&D projects with paid project work: tree inventories would be one, plus methane sampling and thermal mapping on solid waste sites, hazmat sampling both indoor and outdoor, etc. We're on opposite sides of the country and not direct competitors, so I'd appreciate being able to tap into their expertise for the office tech work. The people here that do that type of work can either use CAD or nothing at all, and it really limits what I can provide.

I'm more excited than I have been in a long time, actually.

Oh... and I think they're hiring my friend that's a remote sensing professor in Canada. I'm pretty sure I mentioned both of these guys in this thread like 5 years ago... but one of them taught me a lot about multispectral, gave me a really expensive camera and referred a shit ton of work to my company for a year-long study. The other is a photogrammetrist, lidar tech, UAS program lead and GIS developer that is better than me at literally everything, and very happy to teach. Both great guys, and I introduced them a couple months ago. One just flew out to interview with the other last Friday and I think he's going to take the offer. I just played matchmaker with two of my mentors and now we're talking about formalizing a company partnership. I'm pumped.
 

Bandwagon

Alrighty, I'm switching over to Metashape Pro.
I had a really big project that was kind of a shit show. Controlled airspace with a 0ft ceiling that we had to request permission to exceed. Denied 3 times, then I talked to ATC and they said they'd approve 200ft. Had to reschedule 4x because of rain, then ended up being forced to fly on a Saturday because it was the only day that looked decent before it was due. Not great conditions, mapping at 190ft, 2x different drones/pilots to get it done in one day, lots of shadows, etc. Everything came in pretty well EXCEPT the road surfaces, which are the main thing we wanted, to keep the crew out of the roads. Too much wet pavement, too many cars, too many shadows. So the surface I had to work with looked like absolute garbage, and I told the engineer I don't want to give him anything from it and he should let me go back and re-fly it with the new drone.

We were going to make the client pay for it since they're the ones that signed off on trying for that Saturday flight, even though I advised against it because of the weather. But I told him to let me re-process a portion of it in Metashape and see if it does a better job than Pix4D. Still not fantastic, but it's a big improvement:

Pix4D:
(screenshot: 1656709304371.png)

Metashape Pro:
(screenshot: 1656709284912.png)




And I'm working on a different project right now that I flew yesterday with the P1. Ran this one off a local base (RoboDOT). Connected the RoboDOT to VRS to average the base position, then just ran off of it with the drone. I half-assed it too because we're not even doing anything with this beyond using it as a basemap. I didn't have any time to plan anything out or make sure I knew the new workflow....the surveyor just asked me to hit it on my way by when I was in town. Didn't even care about the vertical.

The north target is a mon in a case, so that makes sense. The south is 2x shots from the field crew, an "X" scribed in concrete. Pretty damn good hit considering how I did it all. I mean... there's about 0.02 just between their 2x shots. ~0.05 from my stuff when I'm half-assing a brand new workflow is a pretty damn good starting point. I have zero control other than the RTK, and that base position was set by VRS. We're not even using any common control.
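
(For anyone following along, that ~0.05 is just the straight-line misfit between the target coordinates picked from the drone data and the crew's shots; something like this, with placeholder coordinates:)

[CODE]
import numpy as np

drone_xy = np.array([[5012.34, 8870.12], [4980.07, 8421.55]])   # picked from the drone data (placeholder)
field_xy = np.array([[5012.30, 8870.15], [4980.11, 8421.51]])   # field crew check shots (placeholder)

print(np.linalg.norm(drone_xy - field_xy, axis=1))   # per-target horizontal misfit, ~0.05
[/CODE]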

And the damn road surface is CRISPY.

(screenshot: 1656709531865.png)
 

Vepil

What speed and overlap are you flying? Also, are you flying perpendicular to the first flight to cross-hatch the site? I know with ours we fly at 8mph with 50% overlap at 200 feet. Another thing you can do is grab some USGS lidar data, which is usually classified to ground, and control it to your own to help fill in those thick areas.

On the GeoSLAM front, Rock Robotic figured out how to use our current lidar in a GeoSLAM-type configuration.
 

Bandwagon

I fly at max speed for the shutter speed, to keep pixel blur under 1/2 GSD. With the Phantoms, that's about 11mph @ 200ft with a 1/800 shutter. With the M300/P1, I haven't seen a reason yet to slow that fucker down. As long as it's not moving fast enough to cause pixel blur, I want it flying fast.
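
If anyone wants to plug in their own numbers, the blur math is just "ground covered during one exposure stays under half a GSD". The GSD below is an assumed value for roughly 200 ft with a 1" sensor, so it won't land exactly on the 11 mph figure:

[CODE]
# Keep forward motion during one exposure under blur_fraction * GSD.
def max_speed_mps(gsd_m, shutter_s, blur_fraction=0.5):
    return blur_fraction * gsd_m / shutter_s

gsd = 0.015                       # ~1.5 cm/px at ~200 ft -- assumed, not measured
v = max_speed_mps(gsd, 1 / 800)   # 1/800 s shutter
print(f"{v:.1f} m/s (~{v * 2.237:.0f} mph)")   # ~6 m/s, roughly 13 mph
[/CODE]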

Why are you flying at 8mph? How did you come up with that number?

I don't do cross-hatch flights anymore, at all. I think mixing nadir and oblique makes the calibration worse, and I haven't ever seen much of an improvement to vertical. I did that for 2 or 3 years straight though, and had all my guys doing it too. Now, if I want really tight vertical, I stack nadir flights: I'll do one at 200 and another at 225, with perpendicular lines.


I do use that USGS terrain all the time, though. With Global Mapper, you can stream it right in. You don't need to go download it unless you want the metadata.
 

Vepil

Those numbers are for flying lidar, not photo. I was saying that cross-hatching with the L1 might help ya get more on the ground in tree areas and tall fields.
 

Bandwagon

Those numbers are for flying lidar, not photo. I was saying that cross-hatching with the L1 might help ya get more on the ground in tree areas and tall fields.
Ohhhh, gotcha. I really don't know yet, since the lidar part is new to me and I've only done 2 flights with that, both with my friend. He actually recommends lower overlap and not duplicating flights over areas, to produce less noise. The idea being that we're going to do a vertical adjustment during post-processing QA/QC anyway, so a single pass produces a TIN with fewer spikes / a more normalized surface and works better for the final vertical shift.
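
For what it's worth, that "final vertical shift" is basically just sampling the lidar surface at the check shots and applying one block dz. Whether it's a median shift or a plane fit is my assumption, not his workflow:

[CODE]
import numpy as np

def vertical_shift(surface_z_at_checks, control_z):
    """Median dz between control elevations and the lidar surface sampled at the same XY."""
    dz = np.asarray(control_z) - np.asarray(surface_z_at_checks)
    return np.median(dz)          # median is robust to the odd spike or bad shot

# adjusted_cloud_z = cloud_z + vertical_shift(sampled_z, check_elevations)
[/CODE]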

He'll be the first to admit he doesn't have a ton of time with this specific sensor, though. Maybe 50 projects with it. He's excited to get one in my hands because we do a lot more flying than they do, so hopefully I can fiddle with it more and start showing him what works better.

I guess that's mostly aimed at roads, though. What you're saying makes sense for canopy cover. Are you using the repetitive or non-repetitive setting? Do you guys have wheat out there? Have you found anything that actually works in crops that dense, or do you just save those for after the growing season?

Do you keep it nadir when you're trying to get under canopy?
 

Vepil

Odd, we have zero spikes in our TINs. It's lidar: it bounces off and records that flight time. Doing those overlaps helps with accuracy, as does the slower flight speed. We have a lot of what they call sagebrush and it gets thick, but we still get a pretty good 3rd return.
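
(The "records that flight time" part is just two-way time of flight, range = c * t / 2. Made-up return time for illustration:)

[CODE]
C = 299_792_458.0        # speed of light, m/s
t_return = 400e-9        # 400 ns round trip -- made-up number
print(C * t_return / 2)  # ~60 m slant range
[/CODE]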

EDIT:

Removed the video as the client would go ballistic if they knew what else is posted here lol.
 

Bandwagon

I'm hoping I can get out over the next 2 weeks and do a lot of test flights with that thing. I'll try his settings and yours on the same site and see how they look.

Avoiding excessive overlap was just the recommendation for the L1, though. The one you have has a better module in it.
 

Bandwagon

Bandwagon, does the DJI software auto-align the point cloud to the GCPs after you upload and pick them?
I think it does, but we're doing very little in DJI Terra right now. I'm following the provided workflow to a T, until I feel like I have enough experience and familiarity with it to start making changes.

But we're using DJI Terra to process the initial point cloud, align and filter, and colorize and classify the points. Adjusting to control happens in TerraScan (I think) afterwards. I still haven't done any lidar beyond the first 2 test flights we did, though. I've done a few small photogrammetry flights since, but no lidar. And I've been out sick this week.
 

Vepil

Very cool camera I saw today: five 24MP lenses shooting various angles for up to a 120MP image. The model they built from it is absolutely crazy. Looks to be around $14k. We don't really have a use for it, but I would like to have one to play with.


 

Vepil

Vepil, what are you doing for a base on your M300?
We don't use a base with the M300 for it to receive corrections. The base we put out just records at 1 second for the PPK solution, to create our trajectory for the IMU. We send the RINEX off to OPUS for a lat/long/vert, input that with the RINEX in the software, and it solves the rest.

We use old Trimble units that were replaced by newer units for everyday work. As long as the batteries last and they log 1-second epochs, we can use any GPS out there.
 

Bandwagon

We don't use a base with the M300 for it to receive corrections. The base we put out just records at 1 second for the PPK solution, to create our trajectory for the IMU. We send the RINEX off to OPUS for a lat/long/vert, input that with the RINEX in the software, and it solves the rest.

We use old Trimble units that were replaced by newer units for everyday work. As long as the batteries last and they log 1-second epochs, we can use any GPS out there.
PPK, but you're still not running RTK off that base during flights?

The first day out, I used the RoboDOT to average a HERE point, then ran RTK off that base position while doing 1-sec logs for OPUS. I still PPK'd that one, but was flying with RTK too.

Anyways, I was just out flying and added a note to my calendar for 3pm today to order a low-profile tripod for the RoboDOT (I normally just have a rod and bipod with me when I go out flying). 10 minutes later, a gust of wind blew my bipod over and broke the RoboDOT. Lol. Just talked to the guy at Robota and he's really cool. He's going to fix it for me, but I was just wondering if I could use our SP80s or R6s in the meantime.

I ended up switching over to network RTK after I broke the RoboDOT. The main flight was already done when that happened, so no biggie on this project. I was just playing with smart oblique and the L1 after that.
 

Vepil

PPK, but you're still not running RTK off that base during flights?

The first day out, I used the RoboDOT to average a HERE point, then ran RTK off that base position while doing 1-sec logs for OPUS. I still PPK'd that one, but was flying with RTK too.

Anyways, I was just out flying and added a note to my calendar for 3pm today to order a low-profile tripod for the RoboDOT (I normally just have a rod and bipod with me when I go out flying). 10 minutes later, a gust of wind blew my bipod over and broke the RoboDOT. Lol. Just talked to the guy at Robota and he's really cool. He's going to fix it for me, but I was just wondering if I could use our SP80s or R6s in the meantime.

I ended up switching over to network RTK after I broke the RoboDOT. The main flight was already done when that happened, so no biggie on this project. I was just playing with smart oblique and the L1 after that.
Nahh, we never fly the M300 using RTK or anything like that; we don't need it to know where it is beyond flying the mission we plan. The base is only there to run a solution on the lidar. I have heard of some people in the mountains using RTK to fly the drone. Our lidar unit has a GPS antenna we attach, using a GoPro bicycle mount hooked to an arm. The software calculates the offset to the antenna, resolves that to the IMU, and corrects using the base PPK.
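
That offset step is conceptually just rotating the body-frame antenna offset by the IMU attitude at each epoch and pulling the PPK position back to the sensor origin. Rough illustration with made-up numbers (using scipy for the rotation; not the actual vendor software):

[CODE]
import numpy as np
from scipy.spatial.transform import Rotation as R

lever_arm_body = np.array([0.12, 0.00, -0.08])   # antenna offset in the sensor frame (m), made up
attitude = R.from_euler("zyx", [85.0, 2.0, -1.0], degrees=True)   # yaw/pitch/roll at one epoch, made up

antenna_pos = np.array([1000.00, 2000.00, 150.00])          # PPK antenna position, local frame
sensor_pos = antenna_pos - attitude.apply(lever_arm_body)   # solution pulled back to the lidar origin
[/CODE]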

Gotta remember, we don't even have cameras for the 3 M300s we have. We only use them for lidar, and we occasionally make an ortho from the pictures it collects.
 

Bandwagon

Ahhhh, I forgot you don't have a P1.

My friend has been giving me shit for 2 years for not really caring about RTK on the drones, since we're going to use control on everything anyway.

I gotta say... I'm a believer now.
Just getting on the road. I'll probably post more about this one next week.
 