Catching up


In some sense, the amount of activity going on is inversely proportional to the number of posts we’re making here to document it… Here’s a quick summary of what we’ve been working on since my last post:

- LiDAR – We have a new unit which is much more capable. We built a test rig in Hopper to simulate having it on Kia, and we’re just starting to collect data. Nic and Kellan are working on the algorithms/workflow to analyze it. Charlie is working on mounting it on Kia.
- Image processing – Kellan and Nic are working on a workflow for extracting features from the images. Charlie built a map (headed to QGIS) with known POIs marked on the Skalanes peninsula to use as part of the training data.
- Glacier forefield sampling – Tara and Charlie are developing a sampling plan and a 16S rRNA workflow for processing the soil samples.
- Benchmarks for registering multi-modal data collected by sensors on the RPA.
- Discovered Siggi’s yoghurt.
- Kristin and Charlie are working on Field Day, mostly on the BLE plumbing.
- Checklist App – More news to follow.
- Gail and Charlie have done lots of logistics and planning.
- Emi is working on the avian surveys and the MinION workflow for processing glacier forefield soil samples.
- New ambiance platform (single chip) is designed; construction to follow.

More regular updates to follow; we’re all psyched.

Some setbacks


This week has been a bit frustrating, but it all seems to be looking up now.

After taking some test images with the UAV last weekend I realised that there was image fragmenting going on with the camera. This meant that one of the cables was either loose, dirty, or damaged. So, after quite a few hours of hardware debugging (cleaning plus some cable re-seating, etc.), I’m reasonably certain the ribbon cable that transfers the actual image to the tablet receiver needs to be replaced. I still have no idea what caused the damage in the first place, but that’s a little beside the point. Unfortunately, that means the camera will be out of order until the replacement comes in. (…Charlie ordered two just in case…)

BUT FORTUNATELY, that gave me an excuse to take the whole camera gimbal off, which means it’s time to fly without the camera and stress-test how much of a payload it can carry! *cough* I mean, purely in the interest of knowing our total payload for the UAV+CAMERA+LIDAR.

In other news, after setting up a physical router I was able to solve the uplink problem. Essentially, it turns out that having complete control over your ports is really useful when you’re trying to manage ~6 different Java servers’ worth of information transferring all at the same time.

Super large backlog update


Okay, so a lot has happened over the last month!

Early this month I started to dive into how we are going to stitch our images together. I approached this through OpenCV so that I have access to Python libraries like NumPy. At its core this takes a directory of images as input, detects keypoints, and matches descriptors and features as vectors, before processing them through homography estimation and warping the images to fit. The results are PRETTY good right now, but could be better. I’ve since spent quite a bit of time working with the code to get it to accept more images on the first run.

(Image: skalanes_stitch_1)
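
For a sense of what that pipeline looks like, here is a minimal two-image sketch using OpenCV’s ORB detector and RANSAC homography estimation. It’s illustrative rather than our actual script, and the filenames, feature counts, and canvas size are assumptions:

```python
# Minimal two-image stitch: detect keypoints, match descriptors,
# estimate a homography with RANSAC, then warp one image onto the other.
import cv2
import numpy as np

def stitch_pair(img_a, img_b):
    orb = cv2.ORB_create(4000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)

    # Brute-force Hamming matching suits ORB's binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)[:200]

    src = np.float32([kp_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Warp image A into image B's frame on a canvas wide enough for both.
    h, w = img_b.shape[:2]
    canvas = cv2.warpPerspective(img_a, H, (w * 2, h))
    canvas[0:h, 0:w] = img_b
    return canvas

result = stitch_pair(cv2.imread("img_001.jpg"), cv2.imread("img_002.jpg"))
cv2.imwrite("stitched.jpg", result)
```

Accepting more images on the first run is mostly a matter of folding each new image into the running canvas the same way.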

The Earlham College poster fair was also earlier this month, and I wrote up and designed a poster detailing some highlights of our research; a PDF of the poster can be found here –> uas-poster-2016.

I suggested that Vitali and Niraj use OpenCV for object detection of archaeological sites. I’ve also started to look into the different ways in which things like animal activity and vegetation health might be detected through computer vision. Interestingly enough, vegetation is not only easy for computer vision to find, but it can also give rough estimates of how nitrogen-heavy the plants are.
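
To make the vegetation claim concrete, here is a hedged sketch using the excess-green index (ExG), a common RGB-only proxy for vegetation cover. The threshold and filename are made up for illustration, and an actual nitrogen estimate would need calibrated spectral data rather than plain RGB:

```python
# Rough vegetation masking with the excess-green index (ExG = 2g - r - b),
# computed on per-pixel channel fractions so lighting matters less.
import cv2
import numpy as np

img = cv2.imread("survey_tile.jpg").astype(np.float32)  # OpenCV loads BGR
b, g, r = cv2.split(img)
total = b + g + r + 1e-6              # avoid division by zero
exg = (2 * g - r - b) / total         # higher values = greener pixels

veg_mask = exg > 0.1                  # illustrative threshold; needs tuning
print(f"Vegetation covers roughly {veg_mask.mean() * 100:.1f}% of the tile")
cv2.imwrite("veg_mask.png", (veg_mask * 255).astype(np.uint8))
```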

I’ve continued to work with UGCS’s open software packages. I recently got custom map layers (i.e., GIS) integrated so they appear in the flight program. As the majority of the SDK is undocumented, it feels a lot like reverse engineering and takes more time than I would like.

This weekend I had intended to fly a programmed flight over our campus to make a high-resolution map that Charlie had asked me for. I’ve since run into networking problems, though, and cannot proceed further until I get them figured out. I can receive download and telemetry data but cannot get an uplink to the UAV. I suspect this is mostly a networking error, but it’s one that’s continuing to confuse me.

Moving forward: first off, my plan is to get the ground station working correctly again; this is my highest priority. Second, while Niraj and Vitali handle the computer vision software, I will start familiarising myself a bit more with QGIS.

Lundi and LiDAR!


I’ll start with the UAV, now known as Lundi*. We have finally started flying. Our initial idea was to fly indoors; however, our Science Centre has some intense electromagnetic fields (who knew a mostly metal high-tech structure would be filled with metal and electricity??), so we ended up flying outside after a few initial tests indoors. The test flights have been in areas we deemed low wind (plenty of buildings to fly into, though) with minimal spectators, so as not to distract our pilots. Speaking of pilots, Erin and I will be the main controllers of Lundi (and possibly a sister UAV soon) and will begin practicing more and more. The rest of the team will also learn to fly, but only as time permits. The RAW digital negative (.DNG) images are around 25MB each, and we can take 3 photos per second. Our next step is to explore the autopilot features of the UAV that will allow us to plot flight patterns.

Now onto LiDAR (we are now styling it as LiDAR). I built a prototype housing for the sensor that gives us roughly a 30° output angle. After many frustrating hours with ROS, I decided to put it on the shelf for a bit and write my own code. Currently I just take sample readings in .xyz format, but the ultimate goal is to pull information from Lundi to give us a full .las file, which includes all sorts of useful metadata. At the moment the sensor only knows how to take “slices” the size of the laser itself, but I’m working on getting it to realise it’s scanning from the top down (part of what the x and y values do); I can then import the result into a point cloud viewer and we should be good to go! In the .xyz format we are currently getting 600KB/s, which translates to 4.5MB/m. I’ve also started to prototype a sort of “slider” for the LiDAR that would let us move it smoothly across a set distance. This will then be mounted at our 3m height to scan a patch of grass with a pencil hidden inside; the ultimate goal is to be able to pick out the pencil from amongst the blades of grass.
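
As a rough illustration of what “writing my own code” looks like here, a minimal sketch that turns per-slice range readings into .xyz rows by offsetting each slice along the travel axis. The serial port, wire format, and slice spacing are assumptions, not our actual setup:

```python
# Convert successive LiDAR "slices" into an .xyz point cloud by
# advancing the y coordinate as the sensor moves along the slider.
import math
import serial  # pyserial; port name below is an assumption

SLICE_SPACING_M = 0.005   # assumed slider travel between slices

def read_slice(port):
    """Read one slice as a list of (angle_deg, distance_m) pairs.
    Wire format is hypothetical: one 'angle,distance' pair per line,
    with a blank line ending the slice."""
    points = []
    while True:
        line = port.readline().decode().strip()
        if not line:
            return points
        angle, dist = map(float, line.split(","))
        points.append((angle, dist))

with serial.Serial("/dev/ttyUSB0", 115200) as port, open("scan.xyz", "w") as out:
    y = 0.0
    for _ in range(200):                      # 200 slices for this test
        for angle, dist in read_slice(port):
            x = dist * math.cos(math.radians(angle))
            z = dist * math.sin(math.radians(angle))
            out.write(f"{x:.4f} {y:.4f} {z:.4f}\n")
        y += SLICE_SPACING_M                  # sensor has moved down the slider
```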

I’ll be looking into photogrammetry a bit more ASAP as well; it’s proving to be a VERY useful tool.


*We are under the impression that Lundi means puffin in Icelandic; if any of our Icelandic friends know differently, please let us know… I’d hate to be flying something called “your mother’s pants”.

UAV Status and info!


We recently purchased and received the UAV, a Phantom 3 Advanced. In choosing the drone we looked at a variety of specifications, but the important ones were price, flight time, range, camera, and hack-ability. The P3 Advanced hit right in the sweet spot.

It’s priced cheaply enough not to break the bank (Charlie’s eyes nearly popped out of his head at the cost of some systems… Charlie might be the bank in this analogy) while providing adequate, if not superb, performance in other areas. It has a flight time of ~23 minutes, is capable of speeds of roughly 16m/s (35mph), and has ascent/descent speeds of 5m/s and 3m/s respectively. When hovering it has a vertical accuracy of +/- 0.1m and a horizontal accuracy of +/- 1.5m (more on this with LIDAR onboard). Though it has no built-in wind resistance (fancy speak for the ability to rapidly increase speed to offset sudden gusts of wind), a pilot monitoring the system will be able to adapt to such things. According to data we have from the Icelandic Met Office, though the winds there are frequent, they have rarely been stronger than 9m/s during the months we will be there.

In terms of range the Advanced has one of the best combo systems. On board it has both GPS and GLONASS (read: Russian GPS) capabilities, and under perfect conditions it can travel up to 5000 meters (yeah, that’s not an extra 0) away from the ground station. Its ceiling is set at 120m above the ground station, but it is capable of working anywhere up to 6000 meters above sea level. This means we will be able to set up virtually any flight path for the drone within our 23-minute flight before a battery switch is needed. Side note/idea: this will probably be shot down, but given our need for solar panels for the energy survey, if the panels work well we might be able to set up a remote charging station for discharged batteries.
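
A back-of-envelope check on what that flight window buys us, using the figures above (the 70% cruise assumption is mine, not a measured number):

```python
# Rough survey-line length per battery, from the specs quoted above.
flight_time_s = 23 * 60          # ~23 minute flight time
max_speed_ms = 16                # ~16 m/s top speed
cruise_fraction = 0.7            # assume we survey at ~70% of top speed

line_length_m = flight_time_s * max_speed_ms * cruise_fraction
print(f"~{line_length_m / 1000:.1f} km of flight line per battery")
# prints: ~15.5 km of flight line per battery
```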

The biggest weather/flight obstacle will be working in rain. I am looking into possible “water resistance” techniques for the drone, similar to what videographers have done when filming around waterfalls or during mist and light rain. The most common is coating the electronics in some sort of $ever_dry_product, but before we go spraying our drone with strange substances I’d like to be absolutely sure it works. (Big note: this is only weather resistance in the same way that if you go swimming in a raincoat you will still get wet.)

The Advanced’s camera is also a pretty neat piece of technology. First off, it’s TINY and lightweight, but capable of 2.7K (30fps) video and 12MP stills (roughly 4000 x 3000 pixels). The Advanced can store the stills in DNG RAW format so they retain as much digital information as possible, from which we can then harvest IR and other spectra in post-processing. With images and video of this quality we will be able to apply photogrammetry natively to measure distances between objects for mapping purposes. With a LIDAR sensor added in, we should be able to combine the two and, through machine learning, get something incredibly comprehensive.
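
To give a feel for the photogrammetry side, here is a quick ground-sampling-distance estimate. The sensor width and focal length are the commonly cited figures for the Phantom 3’s 1/2.3″ camera, used here as assumptions rather than measured values:

```python
# Ground sampling distance (GSD): ground size of one pixel at a given altitude.
# GSD = (sensor_width * altitude) / (focal_length * image_width_px)
sensor_width_mm = 6.17    # assumed 1/2.3" sensor width
focal_length_mm = 3.61    # assumed Phantom 3 lens focal length
image_width_px = 4000     # 12MP still, ~4000 x 3000

for altitude_m in (30, 60, 120):
    gsd_cm = (sensor_width_mm * altitude_m * 100) / (focal_length_mm * image_width_px)
    print(f"{altitude_m:>4} m altitude -> {gsd_cm:.1f} cm/pixel")
# 30 m -> 1.3 cm/pixel, 60 m -> 2.6 cm/pixel, 120 m -> 5.1 cm/pixel
```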

Hack-ability is pretty important for our needs, given that we as a group could probably be called science hackers. There are two flavours of hack-ability for the drone: software and hardware. Software is pretty straightforward: DJI is kind enough to provide a developer SDK that gives us the ability to customise and use the drone’s capabilities. What’s going to be important is finding out how we can get this information into the FieldDay app and how that’s going to look. Hardware is another thing entirely, though. The Phantom series is slightly like Apple in that it’s a beautifully designed chassis that’s not really meant to be played around with (in fact, Google only returns hardware changes that deal with aesthetics). So, of course, our plan is to stick a modular LIDAR system on the back of it, which may require quite a few body modifications!

Looking forward, I’ll be planning out how our ground station will work with the drone. This most likely means a laptop + tablet + controller + batteries in the field (my laptop is a 13″ MacBook Pro with 16GB RAM and a 256GB SSD). The waypoint guidance or “drone” part of the UAV will probably be easiest to manage from the laptop, and we can get some fancy features from having an actual computer near at hand. The controller itself will be nearby for manual piloting; it uses a tablet as a viewfinder and for some controls. At highest quality video (2.7K resolution, 30fps) the Advanced records about 3GB per 10 minutes. Given the ~23-minute flight time, that’s probably ~8 to 10GB per flight captured with video + LIDAR. Luckily, the Advanced can hold a 64GB high-speed SD card on board to help with storage. The flight data will most likely be stored on the laptop in the field and then transferred to backups once at basecamp. As part of the ground station the laptop will probably run some form of mapping program for real-time mapping (this will be looked at further down the road).
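
Sanity-checking that storage estimate with straight arithmetic on the figures above (the LIDAR rate is a placeholder guess, not a measured number):

```python
# Per-flight storage estimate from the numbers quoted above.
video_gb_per_min = 3 / 10        # ~3GB per 10 minutes of 2.7K/30fps video
flight_min = 23                  # ~23 minute flight time
lidar_gb_per_flight = 1.5        # placeholder guess for LIDAR data

video_gb = video_gb_per_min * flight_min
total_gb = video_gb + lidar_gb_per_flight
flights_per_card = 64 // total_gb
print(f"~{video_gb:.1f} GB video + LIDAR => ~{total_gb:.1f} GB per flight")
print(f"A 64GB card holds about {flights_per_card:.0f} flights")
# ~6.9 GB video + LIDAR => ~8.4 GB per flight; about 7 flights per card
```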

Lastly, and most important: the drone needs a name. I’ll be taking suggestions. (Kristin says Bob is not an option, FYI.)

I’ve mentioned a LIDAR system several times, so please look at this post for further reading!