Underwater Substrate Classification

Marine Applied Research and Exploration (MARE) has collected hundreds of hours of video using their unmanned, remotely operated underwater vehicles (ROVs). To better survey and understand life in California's coastal waters, MARE has annotated each video with species and substrate labels.

As a first step toward creating a Context-Driven Detector for underwater species, we implemented a method using a convolutional neural network (CNN) capable of generating temporal substrate labels for DUSIA. A ResNet-based classification model classifies the video frame by frame.
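As an illustration, the sketch below shows how such a frame classifier might be set up in PyTorch. The ResNet-50 backbone, the class count, and the use of ImageNet pretrained weights are assumptions for illustration; the project's actual architecture and substrate taxonomy are not specified in this summary.

```python
import torch.nn as nn
from torchvision import models

# Minimal sketch of a ResNet-based frame classifier (assumption:
# ResNet-50 backbone with a placeholder number of substrate classes).
NUM_SUBSTRATE_CLASSES = 5  # placeholder, not specified in this summary

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, NUM_SUBSTRATE_CLASSES)
model.eval()  # inference mode: the model scores one frame at a time
```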

Input

.mp4 video

Output

.csv file with a predicted substrate label for each frame of the video
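A minimal sketch of this input/output contract, assuming OpenCV for decoding: read the .mp4 frame by frame, classify each frame, and write one CSV row per frame. The file paths, substrate label names, preprocessing, and checkpoint are hypothetical, not taken from the project.

```python
import csv

import cv2
import torch
from torchvision import models, transforms

# Hypothetical labels and paths, for illustration only.
CLASS_NAMES = ["boulder", "cobble", "mud", "rock", "sand"]
VIDEO_PATH = "dive.mp4"
CSV_PATH = "substrate_predictions.csv"

# Rebuild the classifier; a trained checkpoint would be loaded here.
model = models.resnet50(weights=None)
model.fc = torch.nn.Linear(model.fc.in_features, len(CLASS_NAMES))
# model.load_state_dict(torch.load("substrate_resnet.pt"))  # hypothetical
model.eval()

# Standard ImageNet preprocessing, applied to each decoded frame.
preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

cap = cv2.VideoCapture(VIDEO_PATH)
with open(CSV_PATH, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frame", "predicted_substrate"])
    frame_idx = 0
    while True:
        ok, frame = cap.read()  # BGR frame, or ok=False at end of video
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        x = preprocess(rgb).unsqueeze(0)  # shape: (1, 3, 224, 224)
        with torch.no_grad():
            pred = model(x).argmax(dim=1).item()
        writer.writerow([frame_idx, CLASS_NAMES[pred]])
        frame_idx += 1
cap.release()
```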

Want to know more?

Check out the project here.