<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=koi8-r">
<meta name="Generator" content="Microsoft Word 15 (filtered medium)">
<style><!--
/* Font Definitions */
@font-face
{font-family:"Cambria Math";
panose-1:2 4 5 3 5 4 6 3 2 4;}
@font-face
{font-family:Aptos;}
/* Style Definitions */
a:link, span.MsoHyperlink
{mso-style-priority:99;
color:#467886;
text-decoration:underline;}
span.EmailStyle20
{mso-style-type:personal-compose;}
.MsoChpDefault
{mso-style-type:export-only;
font-size:10.0pt;
mso-ligatures:none;}
@page WordSection1
{size:8.5in 11.0in;
margin:1.0in 1.0in 1.0in 1.0in;}
div.WordSection1
{page:WordSection1;}
--></style>
</head>
<body lang="EN-US" link="#467886" vlink="#96607D" style="word-wrap:break-word">
<div class="WordSection1">
<p>CV attached. <o:p></o:p></p>
<p><strong><span style="font-family:&quot;Aptos&quot;,sans-serif">Title: Building Predictable and Efficient Autonomous Systems: A Cross-Layer Approach</span></strong><o:p></o:p></p>
<p><strong><span style="font-family:&quot;Aptos&quot;,sans-serif">Abstract:</span></strong> Autonomous systems increasingly rely on DNN-based perception, but real-world deployment is constrained by unpredictable inference-time variability and inefficient end-to-end use
of compute and communication resources. In this talk, I present a <strong><span style="font-family:&quot;Aptos&quot;,sans-serif">cross-layer</span></strong> approach—spanning model design, runtime scheduling, and sensing/communication co-design—to build autonomy that
is both <strong><span style="font-family:&quot;Aptos&quot;,sans-serif">predictable</span></strong> and
<strong><span style="font-family:&quot;Aptos&quot;,sans-serif">efficient</span></strong>. I will introduce
<strong><span style="font-family:&quot;Aptos&quot;,sans-serif">Prophet</span></strong>, which diagnoses root causes of latency variance and bounds tail latency through deadline-aware early exits and multi-task coordination. I will then present
<strong><span style="font-family:&quot;Aptos&quot;,sans-serif">RT-BEV</span></strong>, which co-designs communication and BEV detection to accelerate perception while preserving accuracy under real-time constraints. I will conclude with actionable design takeaways and
a research roadmap for future AI-enabled autonomous systems.<o:p></o:p></p>
<p><strong><span style="font-family:&quot;Aptos&quot;,sans-serif">Short bio:</span></strong> Liangkai Liu is a Research Fellow in the Department of Computer Science and Engineering at the University of Michigan, working with Prof. Kang G. Shin. He received his Ph.D.
in Computer Science from Wayne State University, advised by Prof. Weisong Shi. His research focuses on AI-enabled computing systems for autonomous driving, robotics, cyber-physical systems, and edge computing, with publications in venues including RTSS, RTAS,
ICCAD, DAC, ICRA, SEC, HotEdge, RA-L, IEEE T-ITS, IoTJ, and IEEE Wireless Communications. He has studied inference-time variability and predictability in AV perception systems, contributing to an NSF award on predictable multi-tenant DNN inference where he
serves as a Co-PI. He has also developed autonomous robotics testbeds (e.g., HydraOne and Donkey) and investigated system-level efficiency in real-world autonomy deployments. More information is available at
<a href="https://liangkai.org/" target="_blank">
https://liangkai.org</a>.<o:p></o:p></p>
<p>----------<br>
Menda Eulenfeld is inviting you to a scheduled Zoom meeting.<br>
Join Zoom Meeting<br>
<a href="https://nam12.safelinks.protection.outlook.com/?url=https%3A%2F%2Ftamucc.zoom.us%2Fj%2F97494655554%3Fpwd%3DcwK6YJs2o0HufXcx9Hse21aLHNx9KI.1&data=05%7C02%7Ccosc-grad-students-list%40listserv.tamucc.edu%7Ceea0157743894032584808de6372fae1%7C34cbfaf167a64781a9ca514eb2550b66%7C0%7C0%7C639057542816078422%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&sdata=XaOw1nD%2BtlIoRYQ5%2FyuU7p4v5WM5Rjl1UagNwfncOys%3D&reserved=0" originalsrc="https://tamucc.zoom.us/j/97494655554?pwd=cwK6YJs2o0HufXcx9Hse21aLHNx9KI.1">https://tamucc.zoom.us/j/97494655554?pwd=cwK6YJs2o0HufXcx9Hse21aLHNx9KI.1</a><br>
<br>
<br>
Meeting ID: 974 9465 5554<br>
Passcode: 896048<br>
<br>
---<br>
<br>
One tap mobile<br>
+13462487799,,97494655554#,,,,*896048# US (Houston)<br>
+12532158782,,97494655554#,,,,*896048# US (Tacoma)<br>
<br>
---<br>
<br>
Join by SIP<br>
• <a href="mailto:97494655554@zoomcrc.com">97494655554@zoomcrc.com</a><br>
Passcode: 896048<br>
<br>
Join instructions<br>
<a href="https://nam12.safelinks.protection.outlook.com/?url=https%3A%2F%2Ftamucc.zoom.us%2Fmeetings%2F97494655554%2Finvitations%3Fsignature%3DQyP0ViEOoclmT5_srsa72yVoUCvrLHGRITAq97qktA0&data=05%7C02%7Ccosc-grad-students-list%40listserv.tamucc.edu%7Ceea0157743894032584808de6372fae1%7C34cbfaf167a64781a9ca514eb2550b66%7C0%7C0%7C639057542816093258%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&sdata=GpJTHB5LzylNeKbBx1PjlHGh6xFXKM%2BZKe130TzTX70%3D&reserved=0" originalsrc="https://tamucc.zoom.us/meetings/97494655554/invitations?signature=QyP0ViEOoclmT5_srsa72yVoUCvrLHGRITAq97qktA0">https://tamucc.zoom.us/meetings/97494655554/invitations?signature=QyP0ViEOoclmT5_srsa72yVoUCvrLHGRITAq97qktA0</a><br>
<br>
<br>
----------<o:p></o:p></p>
</div>
</body>
</html>