{"id":126179,"date":"2019-08-16T11:02:58","date_gmt":"2019-08-16T15:02:58","guid":{"rendered":"https:\/\/www.bates.edu\/news\/?p=126179"},"modified":"2021-02-10T09:07:35","modified_gmt":"2021-02-10T14:07:35","slug":"bates-announces-3-97-million-national-science-foundation-grant-for-visual-database-project","status":"publish","type":"post","link":"https:\/\/www.bates.edu\/news\/2019\/08\/16\/bates-announces-3-97-million-national-science-foundation-grant-for-visual-database-project\/","title":{"rendered":"Bates announces $3.97 million National Science Foundation grant for visual database project"},"content":{"rendered":"<p>Bates College has received a National Science Foundation grant of $3.97 million to create a groundbreaking Visual Experience Database to support research in fields that rely on the analysis and recognition of images, such as neuroscience, cognitive science, and artificial intelligence.<\/p>\n<div id=\"attachment_109762\" style=\"width: 310px\" class=\"wp-caption alignright\"><a href=\"https:\/\/www.bates.edu\/news\/files\/2017\/09\/170829_Michelle_Greene_0747_LR-e1538745748901.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-109762\" class=\"wp-image-109762 size-medium\" src=\"https:\/\/www.bates.edu\/news\/files\/2017\/09\/170829_Michelle_Greene_0747_LR-e1538745748901-300x300.jpg\" alt=\"Equipped with a new lab in Hathorn Hall, Assistant Professor of Neuroscience Michelle Greene studies visual perception. 
(Theophil Syslo\/Bates College)\" width=\"300\" height=\"300\" srcset=\"https:\/\/www.bates.edu\/news\/files\/2017\/09\/170829_Michelle_Greene_0747_LR-e1538745748901-300x300.jpg 300w, https:\/\/www.bates.edu\/news\/files\/2017\/09\/170829_Michelle_Greene_0747_LR-e1538745748901-150x150.jpg 150w, https:\/\/www.bates.edu\/news\/files\/2017\/09\/170829_Michelle_Greene_0747_LR-e1538745748901-900x900.jpg 900w, https:\/\/www.bates.edu\/news\/files\/2017\/09\/170829_Michelle_Greene_0747_LR-e1538745748901-200x200.jpg 200w, https:\/\/www.bates.edu\/news\/files\/2017\/09\/170829_Michelle_Greene_0747_LR-e1538745748901.jpg 1081w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/a><p id=\"caption-attachment-109762\" class=\"wp-caption-text\">Assistant Professor of Neuroscience Michelle Greene is the principal investigator for the project to create a Visual Experience Database (Theophil Syslo\/Bates College)<\/p><\/div>\n<p>The largest-ever federal grant awarded to Bates, the four-year award will fuel the creation of a vast gallery of videos that depict what, and how, people see as they go about daily activities. Bates developed the grant proposal collaboratively with researchers at North Dakota State University and the University of Nevada, Reno.<\/p>\n<p>Michelle R. Greene, an assistant professor of neuroscience at Bates who studies how the brain makes sense of what we see, is the principal investigator for the project.<\/p>\n<p>\u201cI\u2019m delighted and overwhelmed,\u201d said Greene, \u201cand intensely excited. 
I have a terrific team of co-principal investigators, so fostering and furthering those connections will make these next four years really fun.\u201d<\/p>\n<p>The co-principal investigators are Benjamin Balas, a neuroscientist and associate professor of psychology at North Dakota, and Paul MacNeilage and Mark Lescroart, neuroscientists and assistant professors of psychology at Nevada.<\/p>\n<p><span style=\"font-weight: 400;\">\u201cWe are honored to be the lead partner in this multi-state collaboration,\u201d said Bates President Clayton Spencer. \u201cThis grant is important for Maine, Nevada, and North Dakota, and it also has the potential for significant impact on the future of vision research, neuroscience, and artificial intelligence.\u201d<\/span><\/p>\n<blockquote><p>\u201cIt\u2019s meaningful that the National Science Foundation has chosen a national liberal arts college like Bates to take the lead on this project.&#8221;<\/p><\/blockquote>\n<p>\u201cThis grant brings deserved recognition to Professor Greene, an exemplary member of the Bates faculty who has contributed significantly to the body of published work on visual perception and who engages students extensively in her research,\u201d said Malcolm Hill, vice president for academic affairs and dean of the faculty.<\/p>\n<p>\u201cIt\u2019s meaningful that the National Science Foundation has chosen a national liberal arts college like Bates to take the lead on this project,&#8221; Hill added. 
&#8220;Indeed, Bates is well-positioned to collaborate on the work to create a Visual Experience Database to support researchers around the world as they grapple with, and bring greater understanding to, the social and ethical consequences of computer vision and artificial intelligence.\u201d<\/p>\n<div id=\"attachment_114725\" style=\"width: 910px\" class=\"wp-caption alignright\"><a href=\"https:\/\/www.bates.edu\/news\/files\/2018\/04\/180312_Sarah_EEG_0023.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-114725\" class=\"wp-image-114725 size-large\" src=\"https:\/\/www.bates.edu\/news\/files\/2018\/04\/180312_Sarah_EEG_0023-900x600.jpg\" alt=\"Sarah Rothmann '19 of Andover, Mass., participates as a subject in an EEG neuroscience thesis experiment for a first-person story she is writing for the Bates Communications Office. Hanna De Bruyn \u201918, Old Lyme, Conn., is the thesis student who is working under the supervision of Michelle Greene, assistant professor of neuroscience in the Bates Computational Vision Lab (Hathorn 108). \u201cWe are piloting the experiment for these students\u2019 thesis experiments. They were piloting Hanna\u2019s experiment. She\u2019s interested in looking at the extent to which visual masking actually inhibits perception. So when you take a visual mask, you take an image followed by another image, you\u2019re impaired at understanding the first image. The question is why. So what we\u2019re going to do is take the neural activity that we\u2019re measuring. And the nice thing about EEG is that it measures millisecond-by-millisecond electrical potentials that are generated in the brain; we measure them from the scalp. And we can see over time what the brain is processing, and we use machine learning: we put these signals into a computer system that reads out the extent to which there is information about what the picture is. 
We\u2019re wondering, does that information persist when you change the image? Does that persist over time? Hanna\u2019s made the experiment, and we are going to try it out to make sure everything\u2019s ready for participants.\u201d -- Michelle Greene, assistant professor of neuroscience, says of three thesis students in neuroscience: \u201cThey\u2019re all terrific, I might add.\u201d Hanna De Bruyn \u201918, Old Lyme, Conn.; Katherine \u201cKatie\u201d Hartnett \u201918 of St. Paul, Minn.; and Julie Self \u201918 of Redwood City, Calif. Hanna is the only student to appear in this set of pictures.\" width=\"900\" height=\"600\" srcset=\"https:\/\/www.bates.edu\/news\/files\/2018\/04\/180312_Sarah_EEG_0023-900x600.jpg 900w, https:\/\/www.bates.edu\/news\/files\/2018\/04\/180312_Sarah_EEG_0023-400x267.jpg 400w, https:\/\/www.bates.edu\/news\/files\/2018\/04\/180312_Sarah_EEG_0023-200x133.jpg 200w, https:\/\/www.bates.edu\/news\/files\/2018\/04\/180312_Sarah_EEG_0023.jpg 1919w\" sizes=\"(max-width: 900px) 100vw, 900px\" \/><\/a><p id=\"caption-attachment-114725\" class=\"wp-caption-text\">Student researcher Hanna De Bruyn \u201918 (left) works with Assistant Professor of Neuroscience Michelle Greene to prepare an EEG test in March 2018. (Phyllis Graber Jensen\/Bates College)<\/p><\/div>\n<p>\u201cMaine\u2019s colleges and universities are consistently at the forefront of groundbreaking research that improves people\u2019s lives and enhances our understanding of the world around us,\u201d said U.S. Sens. Susan Collins and Angus King of Maine in a joint statement.<\/p>\n<p>They added, \u201cThrough this funding, Bates College will partner with two other universities to build a database to study human behavior and development through first-person experiences. 
We applaud the NSF\u2019s investment in Bates\u2019 project, which will help advance the field of vision science.\u201d<\/p>\n<p>The VED will comprise more than 240 hours of video created specifically for this project and findable through a publicly accessible database. Wearing cameras that simulate human vision, as well as devices to track head and eye movements, observers will undertake routine activities such as walking, shopping, or touring a museum.<\/p>\n<blockquote><p>Because they were not intended for research purposes, existing still and moving images are compromised by the many biases their creators bring to them.<\/p><\/blockquote>\n<p>By enlisting diverse observers local to each of the three participating institutions, the project will record how changes in environment, age, and task affect the act of looking.<\/p>\n<p>Much of the data used in such fields as visual neuroscience, psychology, computer vision (a branch of artificial intelligence), and computational sociology consists of vast collections of still and moving images. 
These are curated largely from public online resources such as YouTube and Google \u2014 but because they were not intended for research purposes, they are compromised by the many biases their creators bring to them.<\/p>\n<hr \/>\n<p><span style=\"color: #009779;\"><em>Artificial intelligence systems have biases because they&#8217;re not being fed enough solid data, says Michelle Greene.<\/em><\/span><\/p>\n<p><iframe loading=\"lazy\" title=\"Michelle Greene: Why &#039;AI&#039; is biased\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/uTu3bqqfDK0?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/p>\n<p>The reasons someone may choose a particular photo subject, frame an image a certain way, or upload one image and not another are all biases that diminish the material\u2019s value as data. Such \u201cbiases exist at every level,\u201d said Greene, \u201cand all of the databases that we\u2019ve been using for years are subject to them.\u201d The VED assets, in contrast, will be created specifically to represent ordinary scenes and will be subject to experimental controls.<\/p>\n<p>Undergraduates at all three schools, including 28 at Bates over the four-year grant period, will take part in the research. 
Among other roles, students will serve as videographers, creating assets for the VED, Greene said.<\/p>\n<p>\u201cI can imagine that next summer there\u2019s going to be a small army of folks going out into various parts of the world, seeing what the world looks like when we&#8217;re hiking, when we&#8217;re at the beach, when we&#8217;re grocery shopping, and all kinds of more mundane things.\u201d<\/p>\n<hr \/>\n<p><span style=\"color: #009779;\"><em>The project to create a Visual Experience Database will benefit Bates students &#8220;at all levels,&#8221; says Michelle Greene, an assistant professor of neuroscience at Bates who is the principal investigator for the project.<\/em><\/span><\/p>\n<p><iframe loading=\"lazy\" title=\"Michelle Greene: How Bates students benefit from NSF grant\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/jLndrmcFH6U?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/p>\n<p>Students will benefit from the many research questions that the VED will engender. \u201cA database like this essentially means there will be thesis projects for decades to come,\u201d said Greene. \u201cThere are many basic questions that we haven\u2019t been able to answer because we haven&#8217;t had the data.\u201d<\/p>\n<p>She said that Bates&#8217; being a liberal arts college will enrich the VED project in distinctive ways. \u201cOne is our focus on equity and inclusion. 
We\u2019re trying to get a diverse set of visual experiences to catalog in the VED, and I think that holding that in the forefront is something a liberal arts college can do that might be a somewhat harder sell\u201d at other types of institutions.<\/p>\n<blockquote><p>Innovation in artificial intelligence, in particular, stands to benefit from the VED.<\/p><\/blockquote>\n<p>Bates\u2019 intimate scale, coupled with the liberal arts approach to education, \u201callows us to engage in some multidisciplinary and interdisciplinary thinking,\u201d she added. \u201cAnd one of the things I particularly love about Bates is that I engage in conversations with faculty members across the college \u2014 if I were at a larger institution, we probably wouldn&#8217;t touch paths and learn from one another.<\/p>\n<p>\u201cSo I&#8217;m particularly excited about the ways in which this type of data can be used across disciplines.\u201d<\/p>\n<p>Innovation in artificial intelligence, in particular, stands to benefit from the VED. Model systems in computer vision \u201care very data-hungry,\u201d said Greene. \u201cThey tend to require tens of millions of images, and have been downloading these tens of millions of images from the internet. We will now give them tens of millions of images that are more representative of daily-life experience.\u201d<\/p>\n<p>The VED will be a public resource. \u201cThis is taxpayer-supported,\u201d Greene said, which means that it should be publicly accessible, both for the sake of transparency and simply because it should be a public good.<\/p>\n<p>She added, \u201cWe see throughout digital life that when a resource is available, people appropriate it in really interesting ways. I&#8217;m hoping that there may be artists, digital historians, computational sociologists that might want to use this database. 
And, as such, it should be available to everybody.\u201d<\/p>\n<hr \/>\n<p><span style=\"color: #009779;\"><em>Michelle Greene explains why it is important and valuable for society that the Visual Experience Database will be publicly available.<\/em><\/span><\/p>\n<p><iframe loading=\"lazy\" title=\"Michelle Greene: Why the Visual Experience Database should be a public resource\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/-7b6zx1PcTA?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/p>\n<p>The VED grant was made through the NSF\u2019s Research Infrastructure Improvement Program, part of the Experimental Program to Stimulate Competitive Research. These initiatives are designed to build research capabilities in underserved regions of the country and thereby make those regions more competitive in seeking other federal R&amp;D funding.<\/p>\n<p>The project team will release a suite of software tools for using the database. 
The team will also establish a program of \u201cBig Data Skills Summer Workshops\u201d to give students basic programming and computational literacy skills that will not only support their contributions to this project, but help prepare them for a variety of STEM occupations.<\/p>\n<p>\u201cIf we can take the next generation of students and get them the best skills, the kind of experience I wish I\u2019d had as a student early on, that is a key part of the workforce development component of the grant,\u201d Greene said.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The largest-ever federal grant awarded to Bates, the award will fuel creation of a vast video gallery to support research in various fields, including artificial intelligence.<\/p>\n","protected":false},"author":105,"featured_media":119197,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_hide_ai_chatbot":false,"_ai_chatbot_style":"","associated_faculty":[],"_Page_Specific_Css":"","_bates_restrict_mod":false,"_table_of_contents_display":false,"_table_of_contents_location":"","_table_of_contents_disableSticky":false,"_is_featured":false,"footnotes":"","_bates_seo_meta_description":"","_bates_seo_block_robots":false,"_bates_seo_sharing_image_id":0,"_bates_seo_sharing_image_twitter_id":0,"_bates_seo_share_title":"","_bates_seo_canonical_overwrite":"","_bates_seo_twitter_template":""},"categories":[4,11011,130,217,224,11009],"tags":[11556,6283,193],"class_list":["post-126179","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-academic-life","category-awards","category-collaboration","category-science-technology","category-society-culture","category-the-college","tag-michelle-greene","tag-national-science-foundation","tag-neuroscience"],"_links":{"self":[{"href":"https:\/\/www.bates.edu\/news\/wp-json\/wp\/v2\/posts\/126179","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.bates.edu\/
news\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.bates.edu\/news\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.bates.edu\/news\/wp-json\/wp\/v2\/users\/105"}],"replies":[{"embeddable":true,"href":"https:\/\/www.bates.edu\/news\/wp-json\/wp\/v2\/comments?post=126179"}],"version-history":[{"count":10,"href":"https:\/\/www.bates.edu\/news\/wp-json\/wp\/v2\/posts\/126179\/revisions"}],"predecessor-version":[{"id":126235,"href":"https:\/\/www.bates.edu\/news\/wp-json\/wp\/v2\/posts\/126179\/revisions\/126235"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.bates.edu\/news\/wp-json\/wp\/v2\/media\/119197"}],"wp:attachment":[{"href":"https:\/\/www.bates.edu\/news\/wp-json\/wp\/v2\/media?parent=126179"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.bates.edu\/news\/wp-json\/wp\/v2\/categories?post=126179"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.bates.edu\/news\/wp-json\/wp\/v2\/tags?post=126179"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}