Catching up


In some sense, the amount of activity going on is inversely proportional to the number of posts we’re making here to document it… Here’s a quick summary of what we’ve been working on since my last post:

o LiDAR – We have a new unit which is much more capable. We built a test rig in Hopper to simulate having it on Kia, and we’re just starting to collect data. Nic and Kellan are working on the algorithms/workflow to analyze it. Charlie is working on mounting it on Kia.
o Image processing – Kellan and Nic are working on a workflow for extracting features from the images. Charlie built a map (headed to QGIS) with known POIs marked on the Skalanes peninsula to use as part of the training data.
o Glacier forefield sampling – Tara and Charlie are developing a sampling plan and 16S rRNA workflow for processing the soil samples.
o Benchmarks for registering multi-modal data collected by sensors on the RPA.
o Discovered Siggi’s yoghurt.
o Kristin and Charlie are working on Field Day, mostly on the BLE plumbing.
o Checklist App – More news to follow.
o Gail and Charlie have done lots of logistics and planning.
o Emi is working on the avian surveys and the MinION workflow for processing glacier forefield soil samples.
o New ambiance platform (single chip) is designed; construction to follow.

More regular updates to follow; we’re all psyched.

Where are we?


I learned this week the breadth of coordinate systems that are used to geolocate a “point” on the surface of the earth. A very nice archeological survey was done of the Skalanes peninsula about 10 years ago (as near as I can tell, it is in Icelandic). The locations of the features they catalog are given in eastings and northings from a point that I have not found documented yet. It appears to be UTM, but when you use 28W, the UTM grid zone for eastern Iceland, you get kind-of-sort-of close, say within a couple of km, but nowhere near exact. The survey team used a differential GPS; if we can obtain a copy of their database we should be able to figure out how to convert from their coordinate system.
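Just to make the coordinate math concrete, here is a minimal sketch (assuming Python with the pyproj library, which is not necessarily what we will end up using) of converting eastings/northings in UTM zone 28N to WGS84 latitude/longitude; the sample values are placeholders, not points from the survey.

```python
from pyproj import Transformer

# EPSG:32628 is WGS84 / UTM zone 28N, the zone covering eastern Iceland.
to_wgs84 = Transformer.from_crs("EPSG:32628", "EPSG:4326", always_xy=True)

easting, northing = 700000.0, 7250000.0   # placeholder values, not survey points
lon, lat = to_wgs84.transform(easting, northing)
print(lat, lon)
```

If the survey turns out to use a different datum or a local grid, only the source CRS code in the transformer would need to change.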

In other news Gail and I have finished making the hostel, ferry, and car reservations. Next up is exploring bulk airfare.

What do our computer eyes see?


I’ve begun stepping into the world of machine learning algorithms; Kellan has been helping with some of the mathematical aspects, which has made debugging a breeze. This post is going to be a bit image-heavy, so take it easy and look through some of the results!

These first two images are SIFT-based implementations, with the lupin plant as the target.

Original:

The next is based on SURF and targets the rocks in the image. Parts of the bridge are ‘found’, but mostly due to similarities in pixel colours. This can be worked out eventually.
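For anyone following along at home, the SIFT pass looks roughly like this minimal OpenCV sketch in Python (file names are placeholders; older OpenCV builds expose SIFT through the opencv-contrib package instead):

```python
import cv2

# Load an aerial frame in grayscale (the path is a placeholder).
img = cv2.imread("skalanes_frame.jpg", cv2.IMREAD_GRAYSCALE)

# Recent OpenCV builds ship SIFT in the main module; older builds need
# opencv-contrib and cv2.xfeatures2d.SIFT_create() instead.
sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(img, None)

# Draw the keypoints (with size and orientation) for a quick visual check.
annotated = cv2.drawKeypoints(img, keypoints, None,
                              flags=cv2.DRAW_MATCHES_FLAGS_DRAW_RICH_KEYPOINTS)
cv2.imwrite("skalanes_sift.jpg", annotated)
```

The SURF version is the same shape of code with a different detector, and the interesting work is in filtering which keypoints actually correspond to the things we care about.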

We’re back…


After a bit of a break from blogging we’re back. Nic and I have been puddling along with image analysis, LiDAR and the ambiance platform over the past couple of months. Recently Kellan joined us working on machine learning/feature extraction (she is also a part of the group that will be traveling to Iceland with us). Gail and I have been busy on the logistics side, we have van & car reservations, ferry tickets to Heimaey, and a few of our hostel reservations done. We have also been working with Oli on the details related to Skalanes and Gummi (Colder Climates) about the glacier work and a hike to the Laki craters.

just a quick update for the blog


This is more of just a short blip for the blog as there is a lot to go over for the first research meeting in about two weeks!
As we move towards the end of the 2016 winter semester there’s still a lot of work to be done. I started mainly with learning how to use QGIS, first the basics and then focusing on scripting it so that things happen automatically. What has been the focus more recently is how to get everything talking to one another. The way I see it, because of the infrastructure system there needs to be a middle program or script that acts as the control point. This control script is what will allow the drone to fly and produce images; then the script takes over and, within a matter of $TIME, outputs the final result. Already we can stitch images together, get flight maps and geolocations, apply the image corrections for viewing in NIR, and a few other things.
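To give a feel for what that “middle program” might look like, here is a hypothetical skeleton in Python; every function, path, and file name is a placeholder standing in for the real flight software, the stitching step, and the QGIS/PyQGIS scripts.

```python
from pathlib import Path

# Every function, path, and file name below is a placeholder; the real steps
# are the UAV flight software, the image stitcher, and QGIS/PyQGIS scripts.

def fly_and_capture(flight_plan: Path, out_dir: Path) -> None:
    # Stand-in for running the programmed flight and pulling images off the UAV.
    print(f"flying {flight_plan}, saving raw images to {out_dir}")

def stitch(image_dir: Path, mosaic: Path) -> None:
    # Stand-in for the image-stitching step.
    print(f"stitching {image_dir} -> {mosaic}")

def build_map(mosaic: Path, project: Path) -> None:
    # Stand-in for loading the mosaic into a QGIS project as a new layer.
    print(f"adding {mosaic} to {project}")

if __name__ == "__main__":
    images = Path("flight_01/images")
    mosaic = Path("flight_01/mosaic.tif")
    fly_and_capture(Path("flight_01/plan.json"), images)
    stitch(images, mosaic)
    build_map(mosaic, Path("skalanes.qgs"))
```

The point of the skeleton is the ordering and hand-offs between stages, not the stubs themselves; each stub gets replaced as the corresponding piece comes online.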

Some setbacks


This week has been a bit frustrating, but it all seems to be looking up now.

After taking some test images with the UAV last weekend I realised that there was image fragmenting going on with the camera. This meant that one of the cables was either loose, dirty, or damaged. So, after quite a few hours of hardware debugging (cleaning plus some cable re-seating, etc.) I’m reasonably certain the ribbon cable that transfers the actual image to the tablet receiver needs to be replaced. I still have no idea what caused the damage in the first place, but that’s a little beside the point. Unfortunately that means the camera will be out of order until the replacement comes in. (…Charlie ordered two just in case…)

BUT FORTUNATELY, that gave me an excuse to take the whole camera gimbal off, which means it’s time to fly without the camera and stress-test how much of a payload it can carry! *cough* I mean, purely in the interest of knowing our total payload for the UAV+CAMERA+LIDAR.

In other news, after setting up a physical router I was able to solve the uplink problem. Essentially, it turns out that having complete control over your ports is really useful when you’re trying to manage ~6 different Java servers’ worth of information transferring all at the same time.

Super large backlog update


Okay, so a lot has happened over the last month!

Early this month I started to dive a bit into how we are going to stitch our images together. I approached this through OpenCV so that I have access to Python libraries like numpy. At its core this takes a directory of images as input, detects keypoints, matches descriptors and features, then computes a homography and warps the images to fit. The results are PRETTY good right now, but could be better. I’ve since spent quite a bit of time working with the code, getting it to accept more images on the first run. (image: skalanes_stitch_1-copy)
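For anyone curious what those steps look like in code, here is a stripped-down, pairwise version of that pipeline using OpenCV in Python; it is a sketch of the general technique, not our production script.

```python
import cv2
import numpy as np

def stitch_pair(img1, img2):
    # Detect keypoints and descriptors in both frames.
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Match descriptors and keep the best pairs via Lowe's ratio test.
    matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]
    if len(good) < 4:
        raise ValueError("not enough matches to estimate a homography")

    # Estimate the homography that maps img2's pixels into img1's frame.
    src = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Warp img2 onto a canvas wide enough for both, then drop img1 on top.
    h1, w1 = img1.shape[:2]
    h2, w2 = img2.shape[:2]
    canvas = cv2.warpPerspective(img2, H, (w1 + w2, max(h1, h2)))
    canvas[:h1, :w1] = img1
    return canvas
```

Folding a sorted directory of frames through this pairwise is the basic idea; seam blending, exposure differences, and accumulated drift are what make a multi-image run harder than the sketch suggests.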

The Earlham College poster fair was also earlier this month; I wrote and designed a poster detailing some highlights of our research. A PDF of the poster can be found here –> uas-poster-2016.

I suggested that Vitali and Niraj use OpenCV for object detection of archaeological sites. I’ve also started to look into the different ways in which things like animal activity and vegetation health might be assessed through computer vision as well. Interestingly enough, vegetation is not only easy for computer vision to find; you can also get rough estimates of how nitrogen-heavy the plants are.
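As a rough illustration of why vegetation is easy to find, a simple excess-green index is often enough to pull plants out of an RGB frame; this is a common baseline, not the approach we have settled on, and the threshold is a made-up starting value.

```python
import cv2
import numpy as np

def excess_green_mask(bgr, threshold=20):
    """Rough vegetation mask from the excess-green index, ExG = 2G - R - B."""
    b, g, r = cv2.split(bgr.astype(np.float32))
    exg = 2 * g - r - b
    return (exg > threshold).astype(np.uint8) * 255
```

Anything more quantitative (like the nitrogen estimates) needs calibrated bands rather than a plain RGB heuristic like this.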

I’ve continued to work with UgCS’s open software packages. I recently got custom map layers (i.e., GIS layers) to appear in the flight program. As the majority of the SDK is undocumented, it feels a lot like reverse engineering and takes more time than I would like.

This weekend I had intended to fly a programmed flight over our campus to make a high-resolution map that Charlie had asked me to do. I’ve since run into networking problems, though, and cannot proceed further until I get them figured out. I cannot get an uplink to the UAV, but I can receive download and telemetry data. I suspect this is mostly a networking error, but it’s one that continues to confuse me.

Moving forward: my first priority is to get the ground station working correctly again. Second, while Niraj and Vitali handle the computer vision software, I will start familiarising myself a bit more with QGIS.

Script for batch conversion and algorithms to find rectilinear shapes


This week, Vitalii and I started off by writing a bash script that takes a set of images from a directory, converts them (one at a time), and saves the converted images in a sub-directory or any other directory. The converted files are saved as TIFF and keep the same file names as the originals. Because a single image takes more than 3 minutes to convert using ImageMagick, a batch of images could take quite a while. So, for now we will use the nohup command to run the conversion in the background, but we will work on parallelizing the conversion, which should not be difficult considering that it is embarrassingly parallel.
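When we do parallelize it, something along these lines should work; this Python sketch shells out to ImageMagick’s convert from a small process pool. The directories, worker count, and the .jpg extension are placeholders for whatever the camera actually produces, and this is an alternative to the bash script rather than the script itself.

```python
import subprocess
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

SRC = Path("raw_images")          # placeholder input directory
DST = Path("raw_images/tiff")     # placeholder output directory
DST.mkdir(parents=True, exist_ok=True)

def to_tiff(src: Path) -> None:
    # Shell out to ImageMagick; the output keeps the original base name.
    subprocess.run(["convert", str(src), str(DST / (src.stem + ".tif"))], check=True)

if __name__ == "__main__":
    # Each conversion is independent (embarrassingly parallel), so a small
    # process pool is enough; list() forces any conversion errors to surface.
    with ProcessPoolExecutor(max_workers=4) as pool:
        list(pool.map(to_tiff, sorted(SRC.glob("*.jpg"))))
```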

For the rest of the week, we looked at algorithms which could be useful for detecting rectilinear shapes in aerial images. We found that most of the algorithms, for accuracy, take multiple aerial images of the same region, often shot from different angles, to determine any man-made objects on the ground. However, we also found a convincing research article which uses the Boldt algorithm to find rectilinear shapes in aerial imagery.
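We have not implemented the Boldt algorithm yet. As a much simpler starting point, and explicitly not Boldt’s method, straight-edge candidates can be pulled out of a single image with a Canny + probabilistic Hough transform in OpenCV; the thresholds here are guesses to be tuned on real imagery.

```python
import cv2
import numpy as np

def straight_edge_candidates(gray):
    """Return (x1, y1, x2, y2) segments that might bound rectilinear features."""
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=40, maxLineGap=5)
    return [] if lines is None else [tuple(seg) for seg in lines[:, 0]]
```

Grouping those segments into closed, roughly rectangular outlines is where the real algorithmic work (and the appeal of Boldt’s hierarchical grouping) comes in.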

Up next, analyzing aerial images from Skalanes


Vitalii, Niraj and I talked about how to proceed with the aerial images from Skalanes. For now we are going to focus on two aspects: identifying possibly anthropogenic surface features and measuring the extent of the lupine. There are a couple of algorithms that look promising for feature ID, but they require stereo images; we will look into doing that next year. There is also at least one approach that uses mono images, which they will start with. The automated image conversion scripts they wrote will be used to tune the input characteristics of the images for the algorithm(s). We haven’t decided on an approach yet for the lupine, but I did have an idea for doing it based on a color map seeded with human input. It’s on the back of an envelope in the Hopper lab…
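One possible reading of that back-of-the-envelope idea, sketched in Python with OpenCV: a human picks a representative lupine colour from a frame, and the script counts pixels within a tolerance of it. The seed value and tolerances below are made up, and this is only an illustration of the seeded-color-map idea, not a settled method.

```python
import cv2
import numpy as np

def lupine_fraction(bgr, seed_hsv=(130, 120, 150), tol=(10, 60, 60)):
    """Fraction of pixels close to a human-picked lupine colour (HSV seed)."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    # Ignores hue wrap-around; fine for purples near the middle of OpenCV's 0-179 hue range.
    lower = np.clip(np.array(seed_hsv) - np.array(tol), 0, 255).astype(np.uint8)
    upper = np.clip(np.array(seed_hsv) + np.array(tol), 0, 255).astype(np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    return np.count_nonzero(mask) / mask.size
```

Run per-tile over a georeferenced mosaic, a count like this would give a rough areal extent for the lupine to track year over year.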

The Icelandic Field Studies May Term looks good, so far…


Over the past couple of weeks Gail and I have spent almost all of our time working on materials describing the (almost) inaugural Icelandic Field Studies May Term, submitting them to the various campus entities, advertising it, and holding information sessions. As of tonight we have 16 applications for 14 spots; that’s much better than we hoped for given the late start. We’ll see how things go from here. Gummi, Oli and Rannveig, we’re on our way! We are especially excited that a few faculty have shown interest in participating with an eye towards leading the program themselves at some point in the future.

Next up is working with Nic, Niraj, and Vitalii on the image processing; with Erin on the GIS layers for Skalanes overall (using Nic’s composite image) and the gardens; and with Andy Clifford to learn about measuring glaciers.
