TI Nano works // Bench plan


This week, with help from Charlie, I got the TI Nano near-IR scanner working well on the bench. We can now scan the same samples we have been benchmarking with the FTIR, so we will have good calibration curves to compare with our field samples.

I have also been working on the bench plan in general. The workflow I have come up with is below:

[Image: bench workflow diagram (IMG_1240)]

LIDAR on mac


As I mentioned in an earlier post, I had only managed to get a working ROS lidar workstation (environment) running on an Ubuntu virtual machine. There are several issues with this, though: all of our other development environments are already in OS X; getting the /dev/ ports passed through to Ubuntu from OS X is brittle at best, and if something goes wrong it borks the entire virtual machine (it’s just faster to re-install than to actually fix the problem manually); and running the Ubuntu VM cuts into the battery life, processor, and RAM of the hardware. So with this in mind, getting the environment up and running on the Mac has been a priority.

I was successful in building a working “basic” workstation on a Mac running OS X El Capitan (I’ll put instructions in the GitLab). The basic workstation can’t do much except launch the visualisation app rviz and the other tutorial apps. The next step is to dive into the lidar’s available SDK for the Mac architecture and build a node from it that is capable of pushing information from the sensor into the visualisation (and other nodes). Luckily the SDK is available in Python and C flavours, so hopefully it won’t be too complex.

LIDAR Mapping


Okay, so success has smiled upon me (for once)! After many, many, many, many trials, I finally got the LIDAR to map in a real-time SLAM visualisation (I will upload a video example asap). ROS works using a set of “nodes” for each application, sensor input, etc. So in order to get the SLAM vis running you have to have the main LIDAR input-stream node running and the SLAM node running, and then tell them all where to dump their data before finally launching the ROS visualisation and hoping that it all works together. Rather than starting them all one by one manually, I made a launch file (a variation on a bash script) that does everything in the background. I’ve also created a git repository on the GitLab with some simple instructions, a workspace, and launch files that I’ll update and add to as time goes on.

Now that the mapping is up, I can work on getting things calibrated. The LIDAR needs to be able to map when held sideways and scanning the ground rather than in a single 360º plane. The plan is to take advantage of its positioning: when the LIDAR is not perpendicular to the ground, it will simply shut off the laser and not record data points, so that our final “scan” will only contain points taken within a tighter angle boundary. Once this is working, I’ll try the LIDAR scanning at 3 m (~10 ft) and see how accurately it picks up small objects. This will translate into how slowly the drone has to fly to achieve maximum map resolution.
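The angle gating itself is simple to sketch (done here in Java for convenience). Everything in this illustration is an assumption for the example, not the actual node code: the 90º centre, the ±15º window, and the (angle, distance) sample layout.

```java
import java.util.ArrayList;
import java.util.List;

public class ScanFilter {
    // Hypothetical window: keep points within ±15º of straight down (90º),
    // i.e. roughly the ~30º total beam angle described below.
    private static final double CENTER_DEG = 90.0;
    private static final double HALF_WINDOW_DEG = 15.0;

    // Each sample is {angleDegrees, distanceMeters}.
    public static List<double[]> keepGroundPoints(List<double[]> samples) {
        List<double[]> kept = new ArrayList<>();
        for (double[] s : samples) {
            if (Math.abs(s[0] - CENTER_DEG) <= HALF_WINDOW_DEG) {
                kept.add(s);  // inside the window: record the point
            }
            // outside the window: drop it, as if the laser were shut off
        }
        return kept;
    }
}
```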

The other week, when I was forced to take the LIDAR unit apart, I was able to examine the internal design. The end goal with the LIDAR is to be able to sweep a flat ground surface, which means we will have to limit the beam to a ~30º beam angle. The simplest way to do this will be a custom top housing. The inside part of the top housing will spin freely on its own, while the outer part will be fixed, with a limited-view “window.” It will have to be made from a material that won’t reflect the laser, so as not to gather bad data points while it’s spinning inside the device (though it would be fairly simple to find these data points and ignore them in the code). Charlie also made the suggestion that the viewing window of the outer housing have shutters of a sort that allow for a change in view angle.

Iceland Soil Standards


This past week I was able to use a small sample of the aDNA samples to take an IR profile of Icelandic soil. I was really excited to see a small peak in the part of the spectrum we have been focusing on for characterizing organic content. Using the calibration curves we have developed, we were able to calculate that the soil has around 24% organic composition. That is on the higher side of what we would expect to see for Icelandic soil, but it makes sense given the location of the archaeological site.
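Applying a calibration curve is just evaluating the fitted line at the measured peak. A toy sketch of the arithmetic; the slope, intercept, and peak-area value below are made up for illustration, not our actual fit:

```java
public class Calibration {
    // Hypothetical coefficients from a linear fit of integrated peak area
    // against known % organic content in our standards (not the real values).
    static final double SLOPE = 52.0;
    static final double INTERCEPT = 1.5;

    static double percentOrganic(double peakArea) {
        return SLOPE * peakArea + INTERCEPT;
    }

    public static void main(String[] args) {
        double peakArea = 0.43;  // example integrated peak area
        System.out.printf("~%.0f%% organic%n", percentOrganic(peakArea));
    }
}
```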

The next steps will be to run the same standards with the DLP Nano. The Nano covers a much shorter-wavelength portion of the IR spectrum, the near-IR. We will not be able to observe the same peaks we can see with the FTIR, but hopefully we will identify others and can ‘stitch together’ the near and mid regions. One day it would be nice to develop a cheap, Arduino-controlled visible-light spectrometer so we can cover the whole visible-to-mid-IR region of the EM spectrum. I’ve looked into the design of the vis spec and I think we could do it with a Sony barcode scanner and a few lenses. Dreams for the future!
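Conceptually, the stitching is just merging the two instruments’ (wavelength, absorbance) readings into one spectrum ordered by wavelength. A toy sketch under that assumption (the {wavelength, absorbance} row layout is invented for the example):

```java
import java.util.Arrays;
import java.util.Comparator;

public class StitchSpectra {
    // Each row is {wavelength, absorbance}; units and layout are assumed
    // for illustration only.
    public static double[][] stitch(double[][] nearIR, double[][] midIR) {
        double[][] combined = new double[nearIR.length + midIR.length][];
        System.arraycopy(nearIR, 0, combined, 0, nearIR.length);
        System.arraycopy(midIR, 0, combined, nearIR.length, midIR.length);
        // Sort by wavelength so the stitched spectrum reads as one curve.
        Arrays.sort(combined, Comparator.comparingDouble(row -> row[0]));
        return combined;
    }
}
```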

I haven’t had much time to physically work on the soil platforms in the last week, but I have worked out some more of the conceptual kinks around the workflow. I feel really confident that we can do everything we want to do in terms of collecting soil metadata, now it’s just a matter of putting all the pieces together.

One piece that has just been added to the picture is using CO2 measurements as a proxy for microbial life. This would be particularly useful for the glacial forefield (a prime site for DNA extractions in the future). I will be reaching out to Chris Smith for advice about how to maintain the samples while CO2 is measured.

Light Blue Bean!


In my last post, I said that I was going to switch all of our individual sensor fragments to a single ‘Bluetooth Sensors’ fragment. Since then, I have finished that and redesigned even more of Field Day.

After I finished consolidating the fragments, I dove into working on the BLE connection with the Light Blue Bean (LBB). Unfortunately, the LBB does not do Bluetooth the same way the Red Bear Labs shield does. The LBB uses something called ‘scratch characteristics.’ These are five built-in characteristics that can be written and read by the client (in this case, Field Day) and the server (the LBB). These characteristics have set UUIDs, which means I can’t use the custom UUIDs I set for the RBL’s sensor. After browsing some of the Android SDK code from Punchthrough (the people that made the LBB), I was able to determine the UUIDs that are used for the characteristics. Since Field Day is going to have to determine what kind of device it is connected to, I redesigned it so that the fragment is cleaner and doesn’t do any of that work. There are separate classes for a Bluetooth Sensor, GATT Client, and Bluetooth Service. The Service talks to the Sensor, and the Sensor talks to the GATT Client and determines what type of device it is connected to by checking the UUIDs. All the fragment does now is say when it wants to write a message or read a message.
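The UUID check in the GATT Client looks roughly like this (standard Android BLE API; the service UUIDs and class names below are placeholders, not Field Day’s actual values):

```java
import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCallback;
import java.util.UUID;

public class SensorGattCallback extends BluetoothGattCallback {
    // Placeholder UUIDs: the real ones come from our RBL firmware and from
    // Punchthrough's SDK source, and are not reproduced here.
    private static final UUID RBL_SERVICE  = UUID.fromString("0000aaaa-0000-1000-8000-00805f9b34fb");
    private static final UUID BEAN_SERVICE = UUID.fromString("0000bbbb-0000-1000-8000-00805f9b34fb");

    @Override
    public void onServicesDiscovered(BluetoothGatt gatt, int status) {
        if (status != BluetoothGatt.GATT_SUCCESS) return;
        // Decide what we are talking to by which service UUID is present.
        if (gatt.getService(BEAN_SERVICE) != null) {
            // Light Blue Bean: talk through its scratch characteristics.
        } else if (gatt.getService(RBL_SERVICE) != null) {
            // Red Bear Labs shield: use our custom TX/RX characteristics.
        }
    }
}
```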

Punchthrough has sample code available for Arduino for reading and writing the scratch characteristics. The examples, along with the SDK, have proven helpful in rewriting the Android code to accommodate the Bean. Unfortunately, not many people have used the Bean with Android; all of the examples are for iOS. Field Day is able to read scratch characteristics just fine from the Bean, but it is currently unable to write to any characteristics. One huge problem with the Bean is that everything is Bluetooth. It’s not like an Arduino, where you plug the device into your computer to upload code and power it on. This means that the Bean can only be connected to one thing at a time and is thus really hard to debug. Typically when debugging Arduino devices, I’m able to plug the device into my computer and watch the Serial monitor for debugging output I’ve added in. With the Bean, I cannot. It’s blind debugging and it sucks. I’m still working on figuring out how to write a characteristic and have the Bean read it, but I think I’m getting closer. I hope to finish that this week.
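For reference, the Android-side write I’m trying to get working looks roughly like this (a sketch; the scratch UUID is a placeholder, and confirmation of the write actually arrives later in the onCharacteristicWrite() callback):

```java
import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCharacteristic;
import android.bluetooth.BluetoothGattService;
import java.util.UUID;

public class ScratchWriter {
    // Placeholder UUID for one of the Bean's five scratch characteristics.
    static final UUID SCRATCH_1 = UUID.fromString("0000cccc-0000-1000-8000-00805f9b34fb");

    // Returns true if Android accepted the write request (queued, not confirmed).
    static boolean writeScratch(BluetoothGatt gatt, BluetoothGattService beanService, byte[] payload) {
        BluetoothGattCharacteristic scratch = beanService.getCharacteristic(SCRATCH_1);
        if (scratch == null) return false;
        scratch.setValue(payload);
        return gatt.writeCharacteristic(scratch);
    }
}
```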

Iceland Soil


This week I have been mostly working on accessing the Iceland soil in the ancient DNA lab. The soil was shipped back from Iceland after the trip in 2014. It currently lives in the ancient lab because eventually DNA will be extracted and specific mammalian DNA with specific markers (horse and a few others) will be amplified. The goal will be to determine what animals were raised at the ancient human settlement formerly at the site where the samples were taken.

Today I will follow the sterile lab protocols to take out a very small sample of the soil and bring it to the analytical chem lab to be dried out in an oven, removing all living microbes. The remaining sample will still be usable for DNA extraction, and there will be more than enough left for multiple extractions.

Other than soil, I’m working on using openSCAD to design housing for the sensors that we will 3D print. My first priority is a housing for the TI Nano, followed by the field platform. The Nano needs a case so we can change its orientation 180 degrees and take soil spectra. I would like to get this working soon.

I am making progress on all of the sensor platforms. Kristin has been making a lot of progress on BLE so the sensors should be integrated with Field Day and ready for testing soon.

BLE in Arduino and Android


I have accomplished quite a bit within the past week, if I do say so myself. Last week, I said that I had begun working on the BLE part of Field Day. Well, after some struggle, I finished it! I am able to send a request to the Arduino device and, upon receiving that request, the Arduino device sends a message back to the Android device, which writes it to the screen (woo!). But the code is not pretty. Android provides a Bluetooth Low Energy service example in Android Studio, but it uses deprecated code. I researched and was able to modernize it.

Bluetooth Low Energy is pretty complicated as a service. It uses something called GATT (Generic Attribute Profile). The Bluetooth server (the Arduino device) has a GATT profile; within that GATT profile there are services, and services have characteristics. Each service can have multiple characteristics, but there can only be one TX and one RX (read and write) characteristic per service. I learned this the hard way. You can see a diagram of what I’m talking about below.
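To make the hierarchy concrete, here is a minimal sketch that walks it from the Android side once services have been discovered (standard Android BLE calls; the log tag is made up):

```java
import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCharacteristic;
import android.bluetooth.BluetoothGattService;
import android.util.Log;

public class GattDump {
    // Walk profile -> services -> characteristics and log each UUID.
    static void dump(BluetoothGatt gatt) {
        for (BluetoothGattService service : gatt.getServices()) {
            Log.d("GattDump", "service: " + service.getUuid());
            for (BluetoothGattCharacteristic ch : service.getCharacteristics()) {
                Log.d("GattDump", "  characteristic: " + ch.getUuid());
            }
        }
    }
}
```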

For the Arduino side, we use the Red Bear Labs BLE shield. There is a standard BLE library for the Arduino, but it’s really complex. I’ve read over the code multiple times and I’m just now grasping a little bit of it. RBL has constructed a wrapper for that code: they broke it down into 10 or so different functions, which I must say is mighty useful. I used that when constructing the Arduino code.

During my coding of the BLE on Android, I discovered that most of our sensor Fragments are identical except for their names. After talking with Charlie, we’ve decided that we’re going to get rid of the individual fragments for each sensor and just have one that is ‘Bluetooth Sensors.’ I’m working on moving the BLE code to that setup now. Hopefully I will be done with it within the day.


[Diagram: GATT profile structure, showing services and their characteristics]

LIDAR or: how I learned to stop reasoning and love a challenge


So, the availability of LIDAR (light detection and ranging, essentially light-based radar) has gone up and its price range has dropped considerably over the years, to the point of being relatively cheap, even. So we jumped at that!

I spent the last bit of time jumping into working with LIDAR. The lidar unit model is capable of 10 Hz, which translates to 2000 samples per second. I initially got the laser to work using a visualisation program known as rviz, which is part of the Robot Operating System (ROS). ROS is sort of the de facto open-source robot/machine-learning system... that being said, it is brittle, with very little documentation for use beyond initial setup. I initially tried to get ROS running on my native Mac OS X environment but ran into complications on El Capitan, so I’ll put that off for a later date.

Through an Ubuntu virtual machine (with much trial and error) I finally got a laser-point visualisation, which you can see in the video below. What you are looking at is a college student in his natural habitat, practicing for his entrance into the Ministry of Silly Walks.

Some time after getting this working, the LIDAR took a small tumble off a desk and stopped working. For a total of three days, that same college student frantically took the device apart, wiggled a few of the connections, and re-aligned the laser sensor with the laser beam using a third-party laser source. After that it started working correctly again.

The next step is to get the laser vis into a simultaneous localization and mapping (SLAM) program.


Handlers handling handles


After some testing of the Handler implementation in Field Day, I’ve determined that it is actually working! I was finally able to look at the data that I had been writing to the database.

Databases in Android are stored in internal (private) memory. There are Android apps that let you look at the files on your device, but only the ones in external storage; internal storage is private. What I ended up doing was adding a function to copy the database from internal storage to external storage. Once it was on external storage, I used a SQLite viewer Android app to look at the database, and it looks good!
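The copy function is a plain file copy, roughly like this (a sketch: the database filename is a placeholder, and on this era of Android the app also needs the WRITE_EXTERNAL_STORAGE permission):

```java
import android.content.Context;
import android.os.Environment;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.channels.FileChannel;

public class DbExporter {
    // Copy the app's private SQLite file out to external storage so a
    // SQLite viewer app can open it. "fieldday.db" is a placeholder name.
    static void export(Context context) throws IOException {
        File src = context.getDatabasePath("fieldday.db");
        File dst = new File(Environment.getExternalStorageDirectory(), "fieldday.db");
        try (FileChannel in = new FileInputStream(src).getChannel();
             FileChannel out = new FileOutputStream(dst).getChannel()) {
            in.transferTo(0, in.size(), out);
        }
    }
}
```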

I did some research into the application state when the device screen is turned off. In our old application, Seshat, the activities would die when the screen was off. That’s not good: we want to be able to turn off the screen and hang the Android device from our person somehow, since it’s useful to have our hands free. After a couple hours of research, I determined the libraries we want to look at are in the PowerManager library on Android, particularly the PARTIAL_WAKE_LOCK state. I wrap the methods I want to happen when the screen is off in what are essentially ‘start’ and ‘stop’ methods. The methods were not executing with the screen off, so I researched some more. I think what will ultimately happen is switching the SensorSampleActivity to a Service. A Service in Android is one of the top-priority processes: it’s one of the last to die if the device needs more memory, and it can continue running even if the Activity that started it has died. Since the Activity is now working, I’m going to hold off on converting it to a Service.
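The wake-lock wrapping looks roughly like this (a sketch of the standard PowerManager API; the tag string and method names are mine, and the app needs the WAKE_LOCK permission):

```java
import android.content.Context;
import android.os.PowerManager;

public class SamplingWakeLock {
    private final PowerManager.WakeLock wakeLock;

    SamplingWakeLock(Context context) {
        PowerManager pm = (PowerManager) context.getSystemService(Context.POWER_SERVICE);
        // PARTIAL_WAKE_LOCK keeps the CPU running while the screen is off.
        wakeLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "FieldDay:sampling");
    }

    void start() { wakeLock.acquire(); }   // call before sampling begins
    void stop()  { wakeLock.release(); }   // call when sampling ends
}
```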

The last thing I worked on this week was Bluetooth LE in Android. Tara gave me a BLE shield attached to an Arduino to test my code against. Android provides a BLE example in Android Studio, so I’ve been working with that. However, the example they provide has deprecated code, so I’m working on modifying it to the new setup. It’s not working yet, but I’ve only been working on it for a couple of hours!

[Image: sample of the SQLite database in Field Day (sqlite-fieldday)]

Databases and lat/long functions


Since my last post was a very long time ago, this is what has happened with the data clean-up over the past weeks:

Eamon and I each cleaned up all the data we could find individually. Since we’d worked separately, when we compared our cleaned data sets, they turned out to be different. Neither of us had all the data individually, but when our data sets were combined, the list covered all of the Iceland 2014 and Nicaragua 2014 data. Then, with Charlie’s help, we were able to determine which of the data sets we needed to zorch. It turns out that a significant chunk of our data needed to be zorched, because we each had thousands of rows of testing data or data taken in the car while the group was driving.

After much too much time wrestling data in spreadsheets, the readings table is finally in the field science database, where further clean-up relating to sectors and spots can be done. As of right now, most Iceland 2014 data has no sector or spot data. With Charlie’s help, I can now populate the sectors. This should be way quicker to do by date, timestamp, and lat/long coordinates now that we have it all in the database.

Next up, I will be working to create/adapt a function that measures the distance between a pair of lat/long coordinates. This should help with further clean-up and with the bounding-box interface.
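The usual candidate here is the haversine formula; a sketch of what such a function could look like (class and method names are mine):

```java
public class GeoDistance {
    private static final double EARTH_RADIUS_M = 6371000.0;  // mean Earth radius

    // Haversine great-circle distance in meters between two points
    // given as decimal-degree lat/long.
    static double distanceMeters(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return EARTH_RADIUS_M * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    }
}
```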
