3-Minute Thesis

During the spring of 2020 I started preparing for the 3-Minute Thesis competition hosted by the University of New Hampshire. If you're not familiar, 3MT was originally developed by researchers at the University of Queensland who wanted to see their graduate students suffer. The challenge is to take all of the work you've done over the past 3-5 years and condense it into a 3-minute pitch describing what you're doing, why it's important, and of course why others should care. Akin to a TED talk, contestants draft and redraft their talk and their single slide for months before finally presenting to a panel of judges in front of a live audience.

 

This year the practice round was held in front of a live audience at the Dover Public Library, with the final round taking place online via Zoom due to COVID-19. I took first place in both rounds.

 

Below is the slide I used along with the script, and here is the link to the actual video.

Imagine, if you will, sitting at a computer looking at an image, and on it are hundreds of randomly placed points like the red ones you see here. Now I want you to look at every single point and label what it’s on top of within the image. And I want you to do that not just for this image, but for hundreds, maybe thousands, of other images, and to make this your main priority for the next few weeks, or maybe even months.

 

Believe it or not, what I just described is the most common method researchers use to track changes in coral reef habitats. Sounds fun, doesn’t it?

 

My name is Jordan Pierce and I’m here to talk to you about how my research will help improve our ability to track changes in coral reef habitats across space and time.

 

Coral reefs aren’t just colorful underwater rocks; they’re actually a collection of millions of individual organisms called coral polyps, which work together to form colonies that collectively create a habitat housing millions of other animals.


That, and they contribute billions of dollars to the global economy in a number of different ways. What’s really impressive, though, is that coral reefs make up less than 1% of the surface of our planet, yet they provide a home to a quarter of all marine life.


But! Climate change is hitting our reefs with a number of stressors, causing those coral polyps to bleach, which increases their odds of starvation and, eventually, death. Without fortification, the coral structure they built becomes overrun by other organisms and eventually collapses, taking the entire habitat with it.


Now we all want to help coral reefs but it’s important to first be able to track how they are changing from a healthy coral reef to a degraded one. Unfortunately, the current most common method for doing so is both slow and cumbersome.


What my research proposes to do is to automate this process by using deep learning, a popular form of artificial intelligence or A.I., and computer vision, which attempts to give computers the ability to see the world the way that we do.


I’ve developed a workflow that trains a computer on these existing images of coral reefs, teaching it to recognize the objects within them in a way similar to how you would teach a child: by having it learn from its mistakes. Once fully trained, we can use it to label images it has never seen before, both quickly and automatically.
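(As an aside for the technically curious: the shape of that workflow is ordinary supervised learning, trained on points annotators have already labeled and then applied to unseen points. Here's a minimal sketch of that idea; in the real workflow a deep network would extract features from image patches, but for illustration I'm standing in a hypothetical two-number feature vector per point and a simple nearest-prototype classifier. All names and values below are made up.)

```python
import numpy as np

# Hypothetical setup: each annotated point is a small image patch summarized
# by a 2-number feature vector, paired with a label such as "coral" or "sand".
rng = np.random.default_rng(0)
labels = ["coral", "sand", "algae"]
centers = {"coral": [0.8, 0.2], "sand": [0.2, 0.9], "algae": [0.1, 0.1]}

# "Training" data: features from points human annotators already labeled.
X_train = np.vstack([rng.normal(centers[l], 0.05, size=(20, 2)) for l in labels])
y_train = np.repeat(labels, 20)

# Stand-in for training a deep network: learn one prototype per class
# (the mean feature vector of that class's labeled examples).
prototypes = {l: X_train[y_train == l].mean(axis=0) for l in labels}

def predict(x):
    """Label a new point by its nearest class prototype."""
    return min(prototypes, key=lambda l: float(np.linalg.norm(x - prototypes[l])))

# Once "trained", points from images it has never seen are labeled automatically.
new_point = rng.normal(centers["coral"], 0.05)
print(predict(new_point))
```

The key property, which carries over to the deep-learning version, is that human labeling effort is spent once, up front, and every subsequent image is annotated by the model instead of by hand.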


What’s more, we can pair it with other methods like photogrammetry, which can reconstruct a 3D model of an entire underwater scene with millimetre-level accuracy from just a few overlapping images. Combined, for the first time, we can give scientists the ability to track changes in both structure and community composition, in 2D and 3D.


But most importantly, by having the ability to assess these habitats more quickly and precisely than ever before, we can make more informed decisions that may help save our reefs. Thank you! 
 