Knowledge Distillation For Reduction of User Input Burden in Medical Image Segmentation

Semester: Spring 2025


Presentation description

Interactive medical image segmentation typically requires multiple user prompts, such as points, bounding boxes, and scribbles. This can be very time consuming, especially as the amount of user input needed grows. Knowledge distillation, a technique that transfers knowledge from a larger, more complex model (the teacher) to a smaller, more efficient model (the student), can help by reducing the amount of user input required for accurate segmentation.
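The abstract does not include code, but a minimal sketch of the kind of distillation objective it describes might look like the following (PyTorch; the function name, temperature, and weighting are illustrative assumptions, with the teacher taken to be a heavily prompted segmentation model and the student a lightly prompted one):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature=2.0, alpha=0.5):
    """Hinton-style knowledge distillation: a weighted sum of the
    hard-label segmentation loss and a soft-target KL term that
    matches the student's per-pixel predictions to the teacher's."""
    # Hard-label term: ordinary cross-entropy against ground-truth masks.
    # student_logits: (N, C, H, W), targets: (N, H, W) with class indices.
    hard = F.cross_entropy(student_logits, targets)

    # Soft-target term: KL divergence between temperature-softened
    # teacher and student class distributions at every pixel.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)

    return alpha * hard + (1.0 - alpha) * soft
```

In this kind of setup, the student learns to reproduce the teacher's prompt-informed predictions, so at inference time it can produce comparable segmentations from fewer user prompts.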

Presenter Name: Dhruv Rachakonda
Presentation Type: Poster
Presentation Format: In Person
Presentation #15C
College: Engineering
School / Department: School of Computing
Research Mentor: Shireen Elhabian
Time: 1:00 PM
Physical Location or Zoom link: Union Ballroom