[Image 1: Angle of prototype. Sensors are to the right and left of the processor.]

Advances in machine learning and remote sensing offer new ways to study life’s diversity and the interactions between organisms and their natural environments. Tim Keitt, professor in the Department of Integrative Biology, and his colleagues are interested in leveraging these technologies to understand environmental changes happening in Central Texas as the area is impacted by human population growth and climate change.

To support this goal, Dr. Keitt and his team recently won a competitive research award from the Stengl Wyer Endowment. This support will allow them to develop an environmental sensing network at Brackenridge Field Lab and Stengl Lost Pines Biological Station while utilizing the computing resources of the Texas Advanced Computing Center (TACC). The sensor network they propose to develop will monitor the environment by recording sonic, photographic, and meteorological data. Sound recording will allow the researchers to continuously sample the vocalizations of animal communities, including bats, birds, amphibians, and insects. Cameras will monitor mammals. Meteorological data will allow the team to test faunal responses to climate.

I had a chance to ask Dr. Keitt about this project and learn how it will work once it is fully up and running.

Nicole Elmer (NE): How did this research focus arise? What factors inspired it?

Tim Keitt (TK): Traditional manual modes of collecting biological and environmental data are expensive and time consuming. We hope to greatly increase the amount of data we can collect and process in order to better understand our changing environment. These data will serve ecologists working in the field by enhancing their research.

NE: How does machine learning actually work when looking at the data you will be generating? 

TK: Machine Learning and Artificial Intelligence are sometimes used interchangeably. The gist of it is to extract the desired information and weed out the noise. As an example, bird songs are specific to species. Just as a trained observer can distinguish the song of a Carolina Wren from a Tufted Titmouse, a suitably trained computer can do the same. In fact, you can download an app for your phone that will identify birds by song. Our goal is to automate this process by sending data from many sensors back to a central processing center where analysts can then use the results.
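The idea Dr. Keitt describes can be illustrated with a toy sketch: turn each recording into a frequency spectrum, then label it by comparing against per-species reference spectra. This is only a minimal stand-in for the trained classifiers the project would actually use; the species templates and synthetic "songs" below are invented for illustration.

```python
import numpy as np

def spectrogram(signal, frame=256, hop=128):
    """Split a waveform into overlapping frames and take FFT magnitudes."""
    frames = [signal[i:i + frame] for i in range(0, len(signal) - frame + 1, hop)]
    return np.array([np.abs(np.fft.rfft(f)) for f in frames])

def mean_spectrum(signal):
    """Average the spectrogram over time into one frequency profile."""
    return spectrogram(signal).mean(axis=0)

def classify(recording, templates):
    """Label a recording by its nearest species template (toy nearest-neighbor)."""
    spec = mean_spectrum(recording)
    return min(templates, key=lambda name: np.linalg.norm(spec - templates[name]))

# Synthetic "songs": two species singing at different dominant frequencies.
rate = 8000
t = np.arange(rate) / rate
wren = np.sin(2 * np.pi * 2000 * t)      # stand-in for a Carolina Wren call
titmouse = np.sin(2 * np.pi * 900 * t)   # stand-in for a Tufted Titmouse call

templates = {"Carolina Wren": mean_spectrum(wren),
             "Tufted Titmouse": mean_spectrum(titmouse)}

# A noisy field recording still lands nearest the right template.
noisy = wren + 0.3 * np.random.default_rng(0).normal(size=wren.size)
print(classify(noisy, templates))  # expected: Carolina Wren
```

Real systems replace the nearest-neighbor step with a trained neural network, but the pipeline shape is the same: extract a spectral signature, weed out the noise, assign a species label.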

[Image 2: Another angle of prototype. Sensors are to the right and left of the processor.]

NE: For how long will your team be looking at the material generated by the sensors?

TK: Ideally, we would like to turn this into a long-term ecological study so that we can observe ongoing environmental change. Many environmental patterns only become apparent when our records extend across decades. So hopefully a long time.

NE: There is a functioning remote prototype you all have already built. What data is it currently collecting?

TK: I believe the first field deployed device is recording humidity and temperature. The important part is that it is already sending data to TACC where they have developed specialized software to handle the incoming streams.
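A device like the prototype Dr. Keitt describes typically packages each reading as a small timestamped record and sends it to the central ingest system. A minimal sketch of that pattern is below; the station name, field names, and fixed sensor values are assumptions for illustration, not details of the actual TACC software.

```python
import json

def read_sensor():
    # Placeholder for an actual hardware driver; fixed values keep the
    # sketch self-contained and runnable.
    return {"temperature_c": 28.4, "humidity_pct": 61.0}

def make_payload(station_id, reading, timestamp):
    """Package one reading as a JSON record for a stream ingester."""
    return json.dumps({"station": station_id, "time": timestamp, **reading})

# One reading from a hypothetical field station.
payload = make_payload("bfl-proto-01", read_sensor(), 1700000000)
print(payload)
```

In deployment, the `print` would be an HTTP POST or MQTT publish on a schedule, and the ingest side would validate and archive each record as it streams in.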

NE: This project will provide some new training opportunities. Can you elaborate a bit?

TK: We plan to involve graduate and undergraduate students who will learn about the technology, assist in checking the accuracy of sound and image classifications, and analyze results.

Congratulations to Dr. Keitt and his team on the award!