Hi Eco-loggers,

Some relevant drone news for ecologists from the past week or so.
***DJI might have just changed how science is done with their drone payload SDK***

DJI Enterprise recently announced an SDK for payloads, allowing for power, control, real-time data visualization, and more on their commercial M200 series of drones. The move is aimed primarily at the commercial drone market and big-ticket verticals like construction, surveying, and infrastructure inspection. Still, I can't help but put on my scientific thinking cap and consider the opportunities this could provide the research community.

I've gone to a lot of academic conferences over the years, both as an ecologist and as a vendor for the drone industry. A common question I get from scientists is what other sensors can be put on drones. Color, thermal, and multispectral are great, but what about other very specific (and sometimes seemingly random) scientific applications? Some of those, like gas monitoring, will certainly benefit from the production and cost savings of the commercial drone market, while others are far more specialized. At most, there might be a dozen universities or government labs using very specific sensors to study ice floes in Antarctica, thermal vents in Icelandic volcanoes, or carbon budgets in the Amazon, for example. This is more the AGU crowd than the ESA one.

Sure, other drone platforms (e.g., ArduPilot) allow for a certain amount of open integration, but none offer the innovation, off-the-shelf capabilities, or distribution of DJI. The new payload SDK could very well make it easier for research programs to collect data from current scientific instruments, as well as to develop new and innovative payloads for addressing questions across the sciences. The limits will come down to DJI delivering on the software side and to the creativity of research teams across the globe.
***MicaSense opens up their image processing to open-source options***

Good news for the research community: MicaSense RedEdge imagery can now be processed with open-source tools. I know some researchers dislike the black box of certain processing options. For more details, check out the MicaSense GitHub page (https://github.com/micasense/imageprocessing).

***Plant mapping the Mima Mounds using Drone Deploy Live Map, Hangar 360, and Pix4D Cloud***

Scholar Farms was up in Washington state last week with a permit to map the mysterious Mima Mounds Natural Area using drones (and to give a talk for the good folks at Evergreen). There are conflicting hypotheses on what actually formed the mounds, but what is certain are the cool patterns of plants growing on and off of them. We used the trip as an opportunity to run through some different cloud tools for visualizing plant data across the growing season, including Drone Deploy's Live Map, Hangar 360, and an updated 3D visualization in the Pix4D Cloud. The DJI Mavic is one low-cost, portable tool for field scientists or crop scouts to capture rapid color imagery.

The results are discussed in this YouTube video (https://www.youtube.com/watch?v=Hio2BZsopRk), or you can check them out directly here:
Drone Deploy: https://bit.ly/2pU87Bj
Hangar 360: https://bit.ly/2GrlC2p
Pix4D Cloud: https://bit.ly/2GlRrJN

Best,
[email protected]
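P.S. For anyone curious what "open-source processing" of multispectral imagery can look like at its simplest, here's a minimal sketch: computing NDVI from two already-aligned, already-calibrated reflectance bands with NumPy. This is a generic illustration, not the MicaSense pipeline (their GitHub tutorials cover the full radiometric workflow); the band values below are made up for the example.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index from aligned NIR and red
    reflectance arrays. eps guards against division by zero over water
    or shadow pixels where both bands are near zero."""
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + eps)

# Tiny synthetic 2x2 "bands": healthy vegetation reflects strongly in NIR
# and absorbs red, so those pixels push NDVI toward 1.
nir_band = np.array([[0.50, 0.60],
                     [0.10, 0.55]])
red_band = np.array([[0.05, 0.08],
                     [0.09, 0.06]])
print(np.round(ndvi(nir_band, red_band), 2))
```

The same band-math pattern extends to other indices (NDRE, GNDVI) once you have co-registered reflectance rasters, which is exactly the step the open-source tooling helps you audit instead of trusting a black box.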
