Want to speak with someone?

Call (734) 763-0818, stop by Chrysler 117, or

email engin-mdp@umich.edu with questions.

This start-up faculty research team will develop software-based tools to facilitate diagnosis and interpretation of cancer image data in low-resource global settings.
Honda Research and Development is leading advancements in the areas of mobility, power units, energy, and robotics. The students on the Honda team will develop simultaneous localization and mapping (SLAM) algorithms for a small Parallax robot, enabling it to identify and navigate 2D planes using a time-of-flight camera.
Whirlpool is investigating advanced control technologies, including machine learning, neural networks, and model predictive controllers, that will more efficiently cool complex refrigerator systems. Students on this team will evaluate current capabilities in a Python framework, rapidly prototyping controllers and applying them in a simulation environment with a conceptual plant model.
This team will make Korean Art Song (Gagok) more accessible to English-speaking students by finding Korean-composed song scores; creating English translations, phoneticizations, and spoken recordings of song texts; and organizing these materials into an accessible database.
The team will explore how pervasive technologies are mediating the way people interact with their cities. The project seeks to make visible and transparent the complex yet critical issues around the use of computer vision and artificial intelligence (as in controversial programs like Detroit’s Project Greenlight and New York’s LinkNYC systems) in public and urban spaces as we build citizen-engaged, physical installations and interventions.
The goal of this project is to explore methods of incorporating visual communication of effort, gesture, and movement into telematic performance without video transmission. Practical experiments with different sensing techniques, including infrared motion capture, inertial measurement, electromyography, and force sensing will be coupled with novel digitally fabricated mechatronic displays.
Following the inspiration of the meteorology community and Weather Underground, which connected backyard weather stations into a global weather network, this student team will deploy magnetometers and other sensors widely to create a dense distributed array, enabling new science and understanding of the Earth's space environment.
This HAPLAB project aims to understand the relationship between the quality of breathing and exceptional performance. We will use data visualization, sonification, and/or visceralization to communicate breathing data back to musical performers.
This project will enable a team of students to learn about environmental sensors and data, specifically around water and watersheds, and create tools and technologies with that data that inform and empower community stakeholders.
This project team will print, patent, and market a trio of 3-D printed polymer objects: a lung/diaphragm simulator, a polymer tongue, and a voice box/vocal folds simulator, made available in a "toolbox" for artists, academics, and physicians.
Critical Improvisation Studies investigates processes related to problem solving, innovation, decision making, interaction, organization, and artistry in fields and projects such as self-driving cars, the Mars Rover, farming, machine learning, comedy, video game design, artistic installation and performance, management, design, architecture, and urban planning. This team will develop new ideas about improvisation by collaborating across these and other disciplines.
Collaborators and conspirators on this team will play with the structure, philosophy and dance of multiple forms of language, define language and its use in multiple ways, and discover how it can be activated, (de)constructed and deciphered in relationship to effort, shape, time and space.
This team will enable architecture students to translate and test spatial ideas in the design process through immersive technologies, using point clouds generated from photogrammetry and LiDAR. In addition to scanning and photogrammetry, this team will test design methodologies (experimenting with VFX and VR), create templates for workflow documentation, and establish a database for site scans and student projects.
The project is called LuCelegans (luce: Latin for light; Light-up C. elegans), or the Interactive Worm Project. A student research team will build the first interactive, physical, three-dimensional prototype of the C. elegans nervous system.