Faculty mentor: Rajeev Balasubramonian
The Transformer is a popular deep neural network model specialized for natural language processing. Like many deep neural networks, the Transformer contains hundreds of millions of parameters, which makes it a strong candidate for Deep Compression techniques so that it can be deployed on mobile devices. My research applies quantization and pruning to reduce both the precision of the parameters and their number. I plan to continue this research through April.
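As a rough illustration of the two techniques mentioned above, the sketch below applies magnitude-based pruning (zeroing the smallest weights) and uniform symmetric quantization (rounding to a low-precision grid) to a toy weight matrix. The function names, the 50% sparsity level, and the 8-bit width are illustrative assumptions, not details from the research itself:

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of weights (illustrative)."""
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) < threshold, 0.0, weights)

def uniform_quantize(weights, bits=8):
    """Round weights to a symmetric uniform grid with 2**(bits-1)-1 positive levels."""
    scale = np.max(np.abs(weights)) / (2 ** (bits - 1) - 1)
    codes = np.round(weights / scale)   # low-precision integer codes
    return codes * scale                # dequantized approximation

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4)).astype(np.float32)  # toy "layer" weights
w_pruned = magnitude_prune(w, sparsity=0.5)     # fewer parameters
w_compressed = uniform_quantize(w_pruned, bits=8)  # lower precision
```

In a real Deep Compression pipeline the model is typically fine-tuned after pruning to recover accuracy, and the quantized codes (rather than the dequantized floats) are what get stored.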