Search historical videos to find unique moments and objects

Project Goals:
The Machine Vision Learning (MVL) search tool enables users to search historical video collections to find unique moments or objects. MVL also gives back to the archival community by enriching archival metadata through object tagging. The Media Ecology Project (MEP) and the Visual Learning Group at Dartmouth are building a system to provide access to primary moving image materials and motivate new forms of scholarly research. The MVL tool combines machine learning methods with Google search to find specific moments and objects within an archived film, such as a handshake, a phone call, or a martini. Users can also tag metadata easily, which makes the system smarter over time. When the project came to us, the MVL search tool was mostly built but split across two separate elements, and the team needed a way to explain the system's goals and usability to funders and scholars.
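For readers who want a concrete picture of what "finding a handshake or a martini in a film" involves, the sketch below shows one common pattern: sample frames from a video, run a detector over each frame, and build an index from labels to timestamps that a search interface can query. The detect_objects() stub, the sampling rate, and the use of OpenCV are illustrative assumptions for this sketch, not the project's actual pipeline.

```python
# Minimal sketch: tag sampled frames of a film with detected labels and
# index them so a query like "handshake" or "martini" returns timestamps.
from collections import defaultdict
import cv2  # OpenCV, used here only for frame extraction


def detect_objects(frame):
    """Placeholder for whatever neural network the project uses; it should
    return a list of labels found in the frame, e.g. ["handshake", "phone"]."""
    raise NotImplementedError("plug in a pre-trained detection model here")


def index_film(video_path, seconds_between_samples=1.0):
    """Map each detected label to the timestamps (in seconds) where it appears."""
    index = defaultdict(list)
    capture = cv2.VideoCapture(video_path)
    fps = capture.get(cv2.CAP_PROP_FPS) or 30.0
    step = max(1, int(fps * seconds_between_samples))
    frame_number = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if frame_number % step == 0:
            timestamp = frame_number / fps
            for label in detect_objects(frame):
                index[label].append(timestamp)
        frame_number += 1
    capture.release()
    return index


def search(index, query):
    """Return the timestamps where the queried object or moment was tagged."""
    return index.get(query.lower(), [])
```

In this framing, the user-tagging feature amounts to writing additional label-and-timestamp pairs into the same index the search engine reads.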


Our Solution:
When we began, there were essentially two tools: a film search engine and a tagging system. We merged these into a single prototype that tells the story of the tool's potential. The new MVL website establishes search and annotation interfaces and includes a user tutorial and a robust "About" page that explains the project and its neural network technology.
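As a rough illustration of what merging the two tools into one prototype can look like, here is a minimal sketch of a single service that exposes both a search endpoint and an annotation endpoint, so user-supplied tags feed the same index that search reads. The Flask framework, the endpoint names, and the in-memory index are assumptions made for the sketch, not the MVL website's actual implementation.

```python
# Hypothetical combined search + annotation service.
from flask import Flask, jsonify, request

app = Flask(__name__)

# label -> list of {"film": ..., "time": ...} entries; a real system would
# persist this and seed it from the machine-vision pipeline.
index = {}


@app.route("/search")
def search():
    """Return every tagged occurrence of the queried label."""
    label = request.args.get("q", "").lower()
    return jsonify(index.get(label, []))


@app.route("/annotate", methods=["POST"])
def annotate():
    """Add a user-supplied tag to the same index that search queries,
    so annotation immediately improves search results."""
    tag = request.get_json()
    index.setdefault(tag["label"].lower(), []).append(
        {"film": tag["film"], "time": tag["time"]}
    )
    return jsonify({"status": "ok"})


if __name__ == "__main__":
    app.run(debug=True)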

DALI Staff
