Taking the LiDAR out for a spin

We took the LiDAR out for a spin on Friday. We had a beautiful (read: high tech) rig that consisted of Neil driving, Kellan holding the LiDAR out the sunroof, and Nic spread across the back seat with the drone (Kia), laptop, and router, collecting data from the LiDAR. We re-learned that Kia doesn't like having a lot of metal around, so our plan to collect GPS data was challenged; more on why GPS data is important below.

This is us trying to get a good GPS signal on Kia, not us actually driving with Kia sitting on the roof…
Note: Kia is the name of our drone; we were not driving a Kia car.

After a lot of indoor practice with the LiDAR, we determined that the best way to set up Kia in Iceland is to have a balanced rig with weight evenly distributed to the sides. The components we have to consider are the LiDAR, a wifi transmitter, and battery packs for both. Neil and Charlie are working on building these into a reliable prototype.

Our ultimate goal is to create point clouds from the LiDAR data based on GPS coordinates. During our test run on Friday we collected data, but we are still mapping it based on time instead of GPS. Our plan for Monday is to pull the GPS data from the Friday test run off Kia and align the GPS/telemetry data with the LiDAR data in a point cloud. This will let us place the data correctly. We can't map the LiDAR data using time alone: the LiDAR spins faster than our time readings resolve, and time advances linearly while our flight path almost certainly won't. If we built the point cloud from time, we would get a straight line along which the data is stretched out and repeated.
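Aligning the two streams mostly comes down to interpolating the slower GPS/telemetry fixes to the timestamp of each LiDAR reading. A minimal sketch of that idea, with invented coordinates and rates (the variable names and sample data are ours, not from any particular SDK):

```python
import numpy as np

def interpolate_positions(scan_times, gps_times, gps_lat, gps_lon, gps_alt):
    """Estimate the sensor's position at each LiDAR timestamp by linearly
    interpolating between the slower GPS/telemetry fixes."""
    lat = np.interp(scan_times, gps_times, gps_lat)
    lon = np.interp(scan_times, gps_times, gps_lon)
    alt = np.interp(scan_times, gps_times, gps_alt)
    return np.column_stack([lat, lon, alt])

# Toy data: GPS fixes at 1 Hz, LiDAR readings at 10 Hz (values invented)
gps_t = np.array([0.0, 1.0, 2.0])
lat = np.array([65.000, 65.001, 65.002])
lon = np.array([-14.000, -14.001, -14.002])
alt = np.array([30.0, 31.0, 32.0])
scan_t = np.arange(0.0, 2.0, 0.1)

positions = interpolate_positions(scan_t, gps_t, lat, lon, alt)
```

Each LiDAR reading then gets a position estimate instead of a bare timestamp, which is what lets the points land where the drone actually was rather than along a straight time axis.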

Driving, of a sort

This is the first of a couple of short clips from the GoPro dashboard cam that we used in 2016 (we plan to use it again this year). The wheat-to-chaff ratio is pretty low, but the kernels you do find tend to be gems. In this clip we are driving down to the red sand beach on the west coast of Iceland. It's near the end of a long day of driving from Akureyri; Nic is at the wheel with "support" from Erin, Deeksha, and the Talking Heads.

Some setbacks

This week has been a bit frustrating, but it all seems to be looking up now.

After taking some test images with the UAV last weekend I realised that there was image fragmenting going on with the camera. This meant that one of the cables was either loose, dirty, or damaged. So, after quite a few hours of hardware debugging (cleaning plus some cable re-seating, etc.) I'm reasonably certain the ribbon cable that transfers the actual image to the tablet receiver needs to be replaced. I still have no idea what caused the damage in the first place, but that's a little beside the point. Unfortunately, that means the camera will be out of order until the replacement arrives. (…Charlie ordered two just in case…)

BUT FORTUNATELY, that gave me an excuse to take the whole camera gimbal off, which means it's time to fly without the camera and stress-test how much of a payload it can carry! *cough* I mean, purely in the interest of knowing our total payload for the UAV + camera + LiDAR.

In other news, after setting up a physical router I was able to solve the uplink problem. Essentially, it turns out that having complete control over your ports is really useful when you are trying to juggle ~6 different Java servers' worth of information transferring all at the same time.

Super large backlog update

Okay, so a lot has happened over the last month!

Early this month I started to dive a bit into how we are going to stitch our images together. I approached this through OpenCV so that I have access to Python libraries like numpy. At its core, this takes a directory of images as input, detects key points, matches descriptors and features, estimates a homography, and warps the images to fit. The results are PRETTY good right now, but could be better. I've since spent quite a bit of time working with the code to get it to accept more images on the first run. [Image: skalanes_stitch_1]

The Earlham College poster fair was also earlier this month. I wrote up and designed the poster detailing some highlights of our research; a pdf of the poster can be found here –> uas-poster-2016.

I suggested that Vitalii and Niraj use OpenCV for object detection of archaeological sites. I've also started to look into the different ways things like animal activity and vegetation health might be assessed through computer vision. Interestingly enough, vegetation is not only easy for computer vision to find, it can also yield rough estimates of how nitrogen-heavy the plants are.

I've continued to work with UGCS's open software packages. I recently got custom map layers (i.e., GIS layers) to appear in the flight program. As the majority of the SDK is undocumented, this feels a lot like reverse engineering and takes more time than I would like.

This weekend I intended to fly a programmed flight over our campus to make a high resolution map that Charlie had asked me for. I've since run into networking problems, though, and cannot proceed further until I get them figured out. I cannot get an uplink to the UAV but can receive download and telemetry data. I suspect this is mostly a networking error, but it's one that continues to confuse me.

Moving forward: first, my plan is to get the ground station working correctly again; this is my highest priority. Second, while Niraj and Vitalii handle the computer vision software, I will start familiarising myself a bit more with QGIS.

I guess this is why they call it research

Starting off, this week has been spent reading and studying up on the current tech that's useful for us. The pun in the title is fully intended, as this is maybe the 5th time I've had to look at all the new technology and advances made (damn you, Moore's law).
Looking into a more advanced LiDAR system had me sifting through nearly all the viable models that fit our budget. Sample rate, beam power, light shielding, and mass were a few of the deciding factors in comparing models.

  • A high sample rate allows the LiDAR to travel faster without suffering a loss of resolution.
  • The power output of the laser beam itself is important for greater distance; our original system only had a usable range of ~1.5m above the ground in broad daylight.
  • Light shielding, though it can be added later, is best built in as close to the laser output as possible. The shield passes only the light the laser receiver recognises and blocks out as much ambient UV light as possible.
  • Because the goal is to have the LiDAR mounted on the UAS, mass needs to be taken into consideration.
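To make the first point concrete, the along-track spacing between successive samples is just platform speed divided by sample rate, so doubling the speed without a faster sensor doubles the gap between points. A tiny illustration (all the numbers are made up):

```python
def along_track_spacing(speed_m_s, sample_rate_hz):
    """Ground distance (m) between successive LiDAR samples: a faster
    platform or a slower sensor both stretch the spacing out."""
    return speed_m_s / sample_rate_hz

# Hypothetical numbers: a UAS at 5 m/s with a 10 Hz sweep rate leaves half
# a metre between sweeps; an 8 kHz sample rate within a sweep is sub-mm.
sweep_spacing = along_track_spacing(5.0, 10.0)      # 0.5 m between sweeps
sample_spacing = along_track_spacing(5.0, 8000.0)   # 0.000625 m per sample
```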

After I made up a list and handed it to Charlie to look over, he submitted it to the scientific equipment fund.

I also took the time this week to put together a new abstract for our research. Since first starting on the research, our knowledge and capabilities have expanded drastically, and I felt our abstract should reflect this. As of today we have submitted to be a part of the Earlham College "Natural Sciences Division Undergraduate Research Poster Conference."

Back into the swing of things

Now a few weeks into the new semester, it's time to stop procrastinating and get back to work!

I've started off by doing what is essentially a mass update of hardware, software, firmware, etc. The DJI drone system and the ground control software both had to be brought up to date. Initially there were a lot of errors in getting them talking to one another again, but as of Wednesday this week they are fully happy and working properly. I've also taken the chance to update the plug-ins and various parts of the website, and I've made accounts on the blog for Vitalii and Niraj, who will be joining us to work on some of the back end coding.

In an effort to streamline our massive photo cache I've copied all photos onto our media server here on campus so we can organise things. After talking with Charlie and Erin, we have decided to sort first by day and then by capture device. This means, though, that we have to map all photos to their correct day, which has proved to be more difficult than I thought… If the EXIF data on an image doesn't contain a date, the date is just set to the most recent "creation date", which is when I copied the files. That's left me and Erin with a whole lot of images that still need to be sorted and sifted by hand. I'm also still finding places where images are stored, such as the Google Drive, and have been working on getting all of those onto the server as well. It's sort of like a big game of cat and mouse and is tedious at best.
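The day-sorting logic itself is simple once a trustworthy EXIF date is available; the tricky part is that the tag may be absent. A sketch of the bucketing rule (reading the DateTimeOriginal tag out of a file is left to a library such as Pillow and is not shown here; the folder-name format is our choice):

```python
from datetime import datetime

def day_bucket(exif_datetime):
    """Map an EXIF DateTimeOriginal string ('YYYY:MM:DD HH:MM:SS') to a
    per-day folder name; return None when the tag is missing, so the image
    lands in the hand-sorting pile instead of under a wrong date."""
    if not exif_datetime:
        return None
    taken = datetime.strptime(exif_datetime, "%Y:%m:%d %H:%M:%S")
    return taken.strftime("%Y-%m-%d")
```

Returning None for missing tags is the important bit: trusting the filesystem's creation date is exactly what produced the "everything was taken on copy day" mess.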

With Niraj and Vitalii joining us, I have also been doing my best to catch them up to speed on all the projects and the details therein. Much of this involves working with the UAV images and how to post-process them: trying things out on proprietary software and then figuring out how best to do them in large batches with open source programs. I've most recently been trying to figure out whether it's possible to get a 3D topographic image without needing to apply a LiDAR point cloud layer. While not as accurate, it would certainly be quick and could serve as an approximation.

As time becomes available, I've also started going through all the video footage we have saved and throwing together a very (VERY) rough storyboard for the documentary. I'm hoping to have a solid chunk of time this weekend to work on this a bit more. Really I'm just focused on getting some generic shots worked together so that we can use them as filler for voice over and context.

As things move along I’ll keep posting! Cheers!

Blog 7/5/16: Drones, LiDAR, dirt

This is now our 5th day at Skalanes and we are all feeling rather settled in and have our workflows… er… flowing. We awoke to a bit of a surprise today: the tent was actually warm! This is the first day in Iceland where a jacket wasn't necessary without having to hike a volcano. With hardly a cloud in the sky, the weather was perfect for taking aerial images with the UAVs. Oli, Rannveig, and Bjarki took some of us out (at different times) to explain the archaeological dig sites and which areas would be helpful to have high resolution mappings of. I did an initial fly-over survey of the known archaeological site. Later, in post-processing, I started to look into tweaking the photos to highlight ground structures, cancel out water reflections, etc. It's all looking promising. Unfortunately, it's so bright that the sun would mess with the LiDAR's receiver, so I will have to wait for the sun to "set" before going out. This will probably be around midnight. Tara continues to work with the soil samples and to devise ways to process the soil.

Visualisation and flight

This past week has been a busy one. I spent a great deal of time working with Erin on the UAV Lundi, where we were able to do test flights in fairly heavy wind, get used to the camera system, and learn how to stitch images together. I got a chance to look into some of the customisable flight controls such as "point of interest", "follow me", and "waypoints." Waypoints is particularly interesting, as it should allow us to define flight plans, save them, and then have Lundi fly them the same way on its own. Most of the custom commands will come from a "ground control station", which is just input from a laptop. I spent time looking into open source ground stations that have already been created but didn't find anything exactly like what I wanted. To remedy this I've started coding up a few things using the DJI API, to either create our own standalone ground station or add to an open system. The goal is to be able to click waypoints on a map and have the drone fly the path with user-defined specifications.
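As a sketch of what the ground station needs to hand the autopilot, a flight plan is really just an ordered list of waypoint records that can be saved and replayed. The field names and coordinates below are illustrative only, and the actual upload to the aircraft goes through the DJI API, which is not shown:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Waypoint:
    lat: float          # decimal degrees (values below are invented)
    lon: float
    alt_m: float        # altitude above the takeoff point
    speed_m_s: float = 5.0

def plan_to_json(waypoints):
    """Serialise a clicked-out flight plan so it can be saved to disk and
    replayed later; uploading it to the UAV is a separate, SDK-specific step."""
    return json.dumps([asdict(w) for w in waypoints], indent=2)

plan = [
    Waypoint(65.2902, -13.7004, 40.0),
    Waypoint(65.2910, -13.6990, 40.0),
]
serialized = plan_to_json(plan)
```

Keeping the plan as plain data like this is what makes "fly it the same way on its own" possible: the saved file is the flight.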

I'll update shortly about LiDAR visualisation and information. I'm working on creating a nice image and detailed instructions on what's going on!

Tests of light detection and ranging

Charlie and I had a conversation at the beginning of the week about whether the LiDAR module we are using is the "right" one for our purposes. The bottom line is that we don't exactly know, so I set out to answer that question.

[Images: IMG_4301, IMG_4300]

I built a controlled slider that gives us nearly 20cm of movement that we can control when taking LiDAR measurements. My first test was to place a pencil on the ground and take 1 minute of scans at each centimetre. The hope was to be able to pick out the pencil given "perfect" circumstances. After collecting the scan data, which consisted of only the distance in mm and theta (the angle in degrees at which the scan was taken), I artificially created a Y value corresponding to the 20cm of positions sampled.
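Turning those raw (distance, theta) readings plus the slider position into 3-D points is a small polar-to-Cartesian conversion. A sketch, with axis conventions chosen arbitrarily for illustration (the sensor only reports a 2-D slice; Y is synthesised from the slider step, exactly as described above):

```python
import math

def scans_to_points(scans, slider_step_cm=1.0):
    """Convert slider-test scans into 3-D points. Each scan is a list of
    (distance_mm, theta_deg) pairs; Y is synthesised from the slider
    position since the sensor itself only reports a 2-D slice."""
    points = []
    for i, scan in enumerate(scans):
        y = i * slider_step_cm * 10.0       # slider position in mm
        for dist_mm, theta_deg in scan:
            t = math.radians(theta_deg)
            points.append((dist_mm * math.sin(t),    # x: across the slice
                           y,                        # y: along the slider
                           dist_mm * math.cos(t)))   # z: toward the ground
    return points

# Two slider positions, one reading each (distances and angles invented)
pts = scans_to_points([[(1000.0, 0.0)], [(1000.0, 90.0)]])
```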

After I got the data into a pleasing form I began trying to visualise it. This proved to be a tad difficult and I still don't have it quite right yet, but I was able to see something:

[Screenshot: early point cloud visualisation, 2016-05-22]

I am fairly certain that the object towards the top of that screenshot is one of the legs of the tripod that the beam picked up. I am not entirely sure, though, and will continue to run tests until something is conclusive.

Lundi and LiDAR!

I'll start with the UAV. Now known as Lundi*, it has finally started flying. Our initial idea was to fly indoors; however, our Science Centre has some intense electromagnetic fields (who knew a mostly metal high tech structure would be filled with metal and electricity??), so we ended up flying outside after a few initial tests indoors. The test flights have been in areas we deemed low wind (plenty of buildings to fly into, though) with minimal spectators, so as not to distract our pilots. Speaking of pilots, Erin and I will be the main controllers of Lundi (and possibly a sister UAV soon) and will be practicing more and more. The rest of the team will also learn to fly, but only as time permits. The RAW digital negative (.DNG) images are around 25MB each and we can take 3 photos per second. Our next step is to explore the autopilot features of the UAV that will allow us to plot flight patterns.

Now onto LiDAR (we are now styling it as LiDAR). I built a prototype housing for the sensor that gives us roughly a 30º output angle. After many frustrating hours with ROS I decided to put it on the shelf for a bit and write my own code. Currently I just take sample readings in an .xyz format, but the ultimate goal is to pull information from Lundi to produce a full .las file, which includes all sorts of useful metadata. At the moment the sensor only knows how to take "slices" that are the size of the laser itself, but I'm working on getting it to realise it's scanning from the top down (part of what the x and y values do); I can then import the result into a point cloud viewer and we should be good to go! In the .xyz format we are getting 600KB/s, which translates into 4.5MB/m. I've also started to prototype a sort of "slider" for the LiDAR that would let us move smoothly across a set distance. This will then be mounted at our 3m height to scan a patch of grass with a pencil hidden inside; the ultimate goal will be to pick out the pencil from amongst the blades of grass.
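The .xyz format itself is about as simple as point cloud formats get: one whitespace-separated "x y z" line per point, no header, which is why most viewers accept it directly. A minimal writer (the file name and decimal precision here are arbitrary choices, not ours specifically):

```python
import os
import tempfile

def write_xyz(points, path):
    """Write (x, y, z) tuples as plain-text .xyz: one 'x y z' line per
    point with no header, a format most point cloud viewers can ingest."""
    with open(path, "w") as f:
        for x, y, z in points:
            f.write(f"{x:.3f} {y:.3f} {z:.3f}\n")

pts = [(0.0, 0.0, 1000.0), (12.5, 10.0, 998.2)]
path = os.path.join(tempfile.gettempdir(), "sample.xyz")
write_xyz(pts, path)
```

The step up to .las is mostly about adding the metadata (GPS time, intensity, coordinate reference system) that this bare format throws away.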

I'll be looking into photogrammetry a bit more ASAP as well; it's proving to be a VERY useful tool.

*We are under the impression that Lundi means puffin in Icelandic. If any of our Icelandic friends know differently, please let us know… I'd hate to be flying something called "your mother's pants".
