Programming for progress

UMaine researchers advance AI for understanding the world, solving problems
The AI developed by UMaine researchers will use object recognition and image segmentation to determine the number of birds, their species and behaviors in aerial photos captured on Maine’s offshore islands and over inland rookeries during the spring and summer months. This image shows double-crested cormorants nesting on one of Maine’s coastal islands. Photo by Meredith Lewis


Sensors, satellites, scanners, other devices and the internet provide a deluge of data for researchers in multiple disciplines. Artificial intelligence is key to helping navigate it all. This evolving technology, in essence, helps connect the dots.

At the University of Maine, AI research is providing the innovation, leadership and inroads needed to efficiently and effectively use the constant stream of data, and to help scholars solve long-standing and emerging problems.

Penny Rheingans, director of the UMaine School of Computing and Information Sciences, says the growth of AI development has responded to an explosion of data, and coincides with the overall expansion of the data science field. Conducting research and serving as engaged citizens now require copious amounts of data and tools to harness it. Stakeholders in medicine, manufacturing, finance, environmental stewardship, public health, commerce, education and other areas also need to access extensive datasets.

“One of the core game changers for machine learning was that there was so much data that it was overwhelming and almost became a barrier to understanding,” says Rheingans, a professor of computer science. “It’s no longer really possible to understand what’s going on through manual approaches.”

UMaine computer scientists and engineers code software on their computers and build hardware at their benches to create AI that will perceive, reason, communicate and predict more like humans. With greater cognition, coupled with enhanced efficiency and accuracy, these technological neural networks will be able to collate copious amounts of historic and new data and use it in novel ways. 

Rheingans says the abundance, interconnection and diversity of data have prompted researchers to not only develop new AI, but also find novel applications for existing software and hardware.

“It’s hard to think of an application in the world right now that doesn’t require a large amount of information,” Rheingans says. “There is also this greater potential to do what we could never do before.”

To help Maine capitalize on the social and economic benefits of this emerging field of technology, UMaine launched an initiative to make the state a hub for AI research, education and use. The endeavor, known as the University of Maine Artificial Intelligence Initiative (UMaine AI), seeks to achieve this goal by uniting experts in academia, government, industry and the community. 

UMaine possesses the expertise and resources needed to develop new applications for AI that improve how researchers conduct studies and tackle problems affecting the quality of life for Mainers, Rheingans says. 

The university also is poised to foster strong partnerships needed to advance AI development, and train students to excel and secure employment in this burgeoning field.

“We have the best potential to work on problems that are important to us here,” Rheingans says. “That’s why it’s important to have the expertise, to build that in-house expertise and to have partnerships.” 

Research led by UMaine computer scientists and engineers is underway. Studies are creating AI technology and applications to combat disease, protect natural resources, defend against natural disasters and find new solutions for energizing communities. 

Highlights of AI research at UMaine follow.

Car driving down a flooded coastal road in Maine
A team of scientists, including one from UMaine, plans to develop a program governed by AI that will identify what homes and neighborhoods would be inundated, what roads would be inaccessible, what systems would be inoperable and what areas would need evacuation in a flood.

Defending cities against flooding

When a flood strikes a city, the damage can ripple through its many interconnected systems of infrastructure and services. Despite the interwoven nature of infrastructure, no existing tool can holistically predict or track the ramifications of a flood in a metropolitan area.

A team of scientists, including one from UMaine, plans to develop a program governed by AI that will identify what homes and neighborhoods would be inundated, what roads would be inaccessible, what systems would be inoperable and what areas would need evacuation during a flood. 

Torsten Hahmann, an associate professor of spatial informatics, teamed up with researchers from across the country to create the Urban Flooding Open Knowledge Network (UF–OKN), funded by the National Science Foundation (NSF) with over $6 million. The University of Cincinnati is leading development on the project. Hahmann has been structuring the UF–OKN knowledge graph with colleagues from the project team, particularly focusing on the semantics that will govern it.

UF–OKN will not only provide hydrological projections for a city facing a flood, but also identify which neighborhoods, businesses, roads, dams and public health, water, sewer and power systems would be threatened. The network will forecast probable courses of a flood event up to 48 hours before it reaches a city, with predictions updated every half hour. It also will provide real-time, high-resolution data that allows emergency managers and other stakeholders to track problems. 

“At its core, the UF–OKN uses a knowledge graph that brings together ontologies and data,” says Hahmann, who directs the Spatial Knowledge and Artificial Intelligence Lab at UMaine. “It provides computers with the human knowledge necessary to correctly interpret and connect vast amounts of disconnected information.”
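The core idea of a knowledge graph, facts stored as linked subject-predicate-object triples that software can traverse to connect otherwise disconnected information, can be sketched in a few lines. The entities and relations below are invented for illustration and are not drawn from the actual UF–OKN schema:

```python
# Minimal sketch of a knowledge graph as subject-predicate-object triples.
# All entity and relation names are invented, not the real UF-OKN schema.
triples = [
    ("RoadSegment_12", "crosses", "FloodZone_A"),
    ("Hospital_3", "locatedIn", "FloodZone_A"),
    ("PumpStation_7", "powers", "Hospital_3"),
    ("FloodZone_A", "inundatedBy", "Forecast_t0"),
]

def query(predicate, obj):
    """Find every subject linked to obj by the given predicate."""
    return [s for s, p, o in triples if p == predicate and o == obj]

def affected_by_flood(zone):
    """Connect separate facts: which assets sit in or cross a flooded zone?"""
    return query("crosses", zone) + query("locatedIn", zone)

print(affected_by_flood("FloodZone_A"))  # → ['RoadSegment_12', 'Hospital_3']
```

Chaining such queries is how a knowledge graph can trace cascading effects, for example from a flooded zone to the hospital inside it to the pump station that powers that hospital.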

Forest with fall leaves
Researchers from UMaine, the University of New Hampshire and the University of Vermont are collaborating to create a digital framework capable of gathering near real-time data about the forests spanning the northern portions of their respective states and New York.

Collecting data to protect, preserve forests 

New technology to enhance scientists’ understanding of the complex yet highly dynamic Northern New England forests began its first trial in a flowerpot. In April 2020, Ali Abedi, a professor of electrical and computer engineering at UMaine, tasked his new wireless sensor with gathering soil moisture data and sending it to a computer in his lab, one of multiple tests for the prototype. The device, built from rubber-coated wires connecting a red converter to two metal prongs, and its continued development will lay the groundwork for a multi-institutional effort to assess and forecast changes in 26 million acres of New England forestland.

Researchers from UMaine, the University of New Hampshire and the University of Vermont are collaborating to create a digital framework capable of gathering near real-time data about the forests spanning the northern portions of their respective states and New York — the Northern Forest Region. 

The digital framework will consist of many small networks of wireless sensors spread across the region. Governed by AI, the sensors will collect data about soil moisture, soil temperature, ambient temperature, carbon dioxide, sunlight exposure and other characteristics; and communicate with each other to create a cohesive, regulated and self-monitoring network.
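The self-monitoring behavior described above can be illustrated with a toy sketch: nodes report readings, and the network flags any node whose soil moisture deviates sharply from the group, a possible sensor fault. The field names, values and threshold below are invented for illustration, not the project's actual data schema:

```python
from dataclasses import dataclass
from statistics import median

# Hypothetical reading format; field names and threshold are illustrative.
@dataclass
class Reading:
    node_id: str
    soil_moisture: float  # volumetric water content, %
    soil_temp_c: float

def flag_outliers(readings, tolerance=10.0):
    """A self-monitoring network can flag nodes whose soil moisture deviates
    sharply from the group median, a possible sensor fault."""
    mid = median(r.soil_moisture for r in readings)
    return [r.node_id for r in readings if abs(r.soil_moisture - mid) > tolerance]

readings = [
    Reading("n1", 32.0, 11.5),
    Reading("n2", 30.5, 11.2),
    Reading("n3", 95.0, 11.4),  # suspicious spike, likely a faulty probe
]
print(flag_outliers(readings))  # → ['n3']
```

Using the median rather than the mean keeps one bad probe from skewing the baseline that all the other nodes are judged against.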

Abedi, who also serves as associate vice president for research and director of UMaine’s Center for Undergraduate Research, leads development of the new sensor technology, the data collection component of the first phase of the multiyear interuniversity project Leveraging Intelligent Informatics and Smart Data for Improved Understanding of Northern Forest Ecosystem Resilience (INSPIRES), which was awarded a $6 million NSF grant in 2019. The collected data will be combined with remote sensing data in a new AI tool, the “Digital Forest,” being developed with INSPIRES funding by another team of researchers led by Kate Beard-Tisdale, UMaine professor of spatial informatics, and Hahmann.

“I think it’s exciting because it’s close to home here in Maine,” Abedi says. “We live in the forests. It’s important to understand what we have.” 

Hands holding a smartphone
A research group led by the VEMI Lab is developing an Autonomous Vehicle Assistant smartphone app to improve accessibility for people with disabilities and seniors.

Helping those with disabilities access the latest in automotive technology

Self-driving vehicles can offer the freedom of the open road to everyone, including people with visual impairments and seniors. UMaine researchers are developing new tools to ensure the latest in automotive technology can accommodate all users.

The Virtual Environments and Multimodal Interaction (VEMI) Lab, led by UMaine computer scientists Richard Corey and Nicholas Giudice, is spearheading projects and supporting others in this area. The lab’s Autonomous Vehicle Research Group (AVRG), which also includes researchers at Northeastern University and Colby College, is developing a smartphone app to aid in autonomous vehicle ride-sharing and ride-hailing. The app, known as the Autonomous Vehicle Assistant (AVA), will help users request, find and enter vehicles using a multisensory interface that provides guidance through audio and haptic feedback, and high-contrast visual cues. The U.S. Department of Transportation awarded $300,000 to AVRG for the AVA project through its Inclusive Design Challenge.

UMaine assistant professor of computer science Salimeh Yasaei Sekeh has teamed with VEMI Lab scientists and other researchers from UMaine and Northeastern University to develop algorithms that will make traveling in self-driving cars safer. The algorithms will improve the ability of autonomous vehicles to detect and avoid deceptive objects, such as adversarial traffic signs, that could misguide them and place users in danger. Sekeh’s team earned a UMaine AI Initiative seed grant for the project.

VEMI Lab also is working with AVRG member Shelley Lin, an assistant professor of electrical and computer engineering at Northeastern, and her colleagues to create algorithms that will improve how AI identifies, tracks and communicates with passengers. The project, funded by a seed grant from UMaine and Northeastern and its Roux Institute, will help self-driving car AI better assist people with visual impairments and seniors, and inform them of its actions.

Great blue herons nesting
Great blue herons nesting. Photo courtesy of Cynthia Loftin

Improving efficiency, accuracy of wildlife surveys 

Biologists count and identify birds in thousands of aerial photos when conducting wildlife surveys, a laborious task that consumes many hours. To reduce time spent analyzing images and the margin for error, UMaine researchers endeavor to create artificial intelligence that will perform the task. 

Faculty and graduate students from several UMaine departments will develop machine learning technology that can pinpoint colonial nesting birds in photos captured by cameras mounted in unmanned aerial vehicles (UAVs) or planes. 

The AI developed by UMaine researchers will use object recognition and image segmentation to determine the number of birds, their species and behaviors in aerial photos captured on Maine’s offshore islands and over inland rookeries during the spring and summer months. Their program will use a convolutional neural network (CNN), a deep learning algorithm typically used for visual analysis, to find and classify the birds in an image by analyzing the pixels that form them.
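A full CNN is beyond a short sketch, but the counting step that segmentation enables, tallying distinct blobs of “bird” pixels in a binary mask, can be illustrated with a simple flood fill. The tiny mask below is a made-up toy example, not project data:

```python
# Toy illustration of counting segmented objects: once a network has labeled
# each pixel as bird (1) or background (0), counting birds reduces to counting
# connected regions of 1s in the resulting binary mask.
def count_blobs(mask):
    """Count 4-connected regions of 1s in a binary segmentation mask."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1
                stack = [(r, c)]
                while stack:  # iterative flood fill over the new blob
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and mask[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

mask = [
    [1, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 1, 0, 1],
]
print(count_blobs(mask))  # → 3
```

The hard part, producing an accurate mask from a cluttered aerial photo, is exactly what the CNN is trained to do; the counting itself is mechanical.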

The project received $43,000 from the UMaine AI Initiative seed grant funding program, and builds on previously funded grants and partnerships involving UMaine faculty and state and federal agency partners. 

“Humans are prone to fatigue, error,” says project lead Roy Turner, an associate professor of computer science and director of the Maine Software Agents/Artificial Intelligence Laboratory (MaineSAIL). “It takes forever to do this by hand. Graduate students can take several hours identifying birds in one image.” 

Federal land
A team of conservation researchers tested the use of a machine learning algorithm to quantify public sentiment toward decisions involving federal land.

Assisting policymaking, public oversight 

Public comments can help government officials evaluate potential policy decisions that affect national monuments and other federal land. The introduction of online comments, however, has brought staggering amounts of feedback that can be difficult to summarize, and bury concerns agencies should consider. 

Caitlin McDonough MacKenzie, a postdoctoral research fellow with the UMaine Climate Change Institute, led a team of postdoctoral conservation researchers in testing the use of a machine-learning algorithm to quantify public sentiment toward decisions involving federal land.

The group tasked a deep recurrent neural network with analyzing more than 750,000 remarks submitted during the 2017 public comment period for the Department of the Interior’s executive review of 27 national monuments. The review resulted in the federal government reducing the footprints of the Bears Ears and Grand Staircase-Escalante national monuments in Utah.

The Interior Department dismissed comments that were critical of the review as “a well‐orchestrated national campaign organized by multiple groups.” Using machine learning, McDonough MacKenzie’s team found that among comments submitted by individuals, rather than by organizations or the bots typically used in such campaigns, 97.4% expressed opposition to the review.

Their network found that of all comments submitted during the review, 20% came from individual humans, 11% were form letters, or “individual comment(s) drafted by nongovernmental organizations and customized for submission by humans,” and 69% originated from bots.
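Once a classifier has labeled each comment, computing those shares is simple bookkeeping. The sketch below tallies invented labels over a toy sample of 100 comments, not the study’s data; only the category names mirror the study:

```python
from collections import Counter

# Hypothetical labels a trained classifier might assign to each comment;
# the toy sample's proportions echo the study's reported shares.
labels = ["individual"] * 20 + ["form_letter"] * 11 + ["bot"] * 69

def share_by_source(labels):
    """Turn per-comment labels into percentage shares by source."""
    counts = Counter(labels)
    total = len(labels)
    return {label: round(100 * n / total, 1) for label, n in counts.items()}

print(share_by_source(labels))
# → {'individual': 20.0, 'form_letter': 11.0, 'bot': 69.0}
```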

“Through machine learning, we discovered that it’s not form letter campaigns that are overshadowing individual public comments, but bots,” says McDonough MacKenzie, who also is a visiting assistant professor at Colby College. 

Developing novel materials for energy storage 

Two UMaine researchers will use AI-aided design to develop new materials for improved batteries and supercapacitors.

The research initiative led by Liping Yu, assistant professor of physics, and Yingchao Yang, assistant professor of mechanical engineering, aims to predict, synthesize and characterize a new class of 2D materials for active electrodes in batteries and supercapacitors. These 2D materials will be composed of four or more chemical elements in nearly equal concentrations, distinct from both traditional 2D materials, which consist of only two or three elements, and conventional alloys, which contain relatively small amounts of secondary elements added to a primary element.

The U.S. Department of Energy awarded the project $750,000 through the Established Program to Stimulate Competitive Research (EPSCoR). 

Yu’s research focuses on the theoretical and computational prediction of new materials with properties suitable for sustainable clean energy and electronic applications, such as solar cells, supercapacitors and catalysts. Yang’s research encompasses the fabrication, properties and applications of novel materials, including synthesizing 1D and 2D nanomaterials through chemical vapor deposition, hydrothermal reaction and other means; investigating the mechanics of nanomaterials in situ and ex situ with micromechanical devices; and applying nanomaterials in energy harvesting, energy storage and water treatment.

Existing energy storage devices experience limitations such as inadequate power, capacity, efficiency, life span and cost effectiveness, Yu says. To overcome such limits, new electrode materials are critically needed.

Explaining the findings reached 

AI helps scientists make discoveries, but not everyone can understand how it reaches its conclusions. UMaine computer scientist Chaofan Chen is developing deep neural networks that explain their findings in ways users can comprehend, and applying his work to biology, medicine and other fields. 

Interpretable machine learning, or AI that creates explanations for the findings it reaches, defines the focus of Chen’s research. The assistant professor of computer science says interpretable machine learning also allows AI to make comparisons among images and predictions from data and, at the same time, elaborate on its reasoning. 

Scientists can use interpretable machine learning for a variety of applications, from identifying birds in images for wildlife surveys to analyzing mammograms.

Before joining UMaine, Chen and research colleagues at Duke University developed a machine learning architecture known as a prototypical part network (ProtoPNet) to pinpoint and categorize birds in photos, then explain its findings. The ProtoPNet would explain both why the object it identified was a bird and why it belonged to a particular type of bird.
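The prototype idea can be sketched with a toy nearest-prototype classifier: each class keeps reference feature vectors, and a prediction is explained by the prototype it most resembles. The classes and feature values below are invented; the real ProtoPNet learns its prototypes as image patches inside a deep network rather than using hand-picked numbers:

```python
import math

# Toy nearest-prototype classifier in the spirit of interpretable models:
# the answer comes with the prototype that justified it.
# Feature vectors are invented (e.g. redness, wing-bar strength).
prototypes = {
    "cardinal": [0.9, 0.1],
    "blue_jay": [0.1, 0.8],
}

def classify(features):
    """Return (predicted class, distance) so the decision is explainable:
    'this looks like a cardinal because it is closest to this prototype'."""
    best = min(prototypes, key=lambda c: math.dist(features, prototypes[c]))
    return best, math.dist(features, prototypes[best])

label, d = classify([0.85, 0.2])
print(f"classified as {label}: closest prototype at distance {d:.2f}")
```

The explanation falls out of the mechanism itself: instead of a bare label, the model can show which stored prototype the input resembled, which is the kind of transparency interpretable machine learning aims for.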

Chen has begun another AI study with colleagues and students from Duke exploring how they can apply ProtoPNet to review mammograms for signs of breast cancer. He also is investigating the possibility of integrating interpretable machine learning with environmental DNA (eDNA) applications in the hope of uncovering the connections between eDNA and environmental signals.

“I want to enhance the transparency for deep learning, and I want a deep neural network to explain why something is the way it thinks it is,” Chen says. 
