Oct 13, 2024 | The code of our MobiCom 2024 paper “FlexNN: Efficient and Adaptive DNN Inference on Memory-Constrained Edge Devices” is now available on GitHub. [code] |
Jun 02, 2024 | Our MobiCom 2024 paper “FlexNN: Efficient and Adaptive DNN Inference on Memory-Constrained Edge Devices” is now available in the ACM Digital Library. [pdf] |
Apr 28, 2024 | Our MobiCom 2024 paper “FlexNN: Efficient and Adaptive DNN Inference on Memory-Constrained Edge Devices” has received all four badges in the MobiCom 2024 Artifact Evaluation: “Artifacts Available”, “Artifacts Evaluated - Functional”, “Artifacts Evaluated - Reusable”, and “Results Replicated”! |
Jan 10, 2024 | Our position & survey paper on mobile LLM agents, “Personal LLM Agents: Insights and Survey about the Capability, Efficiency and Security”, has been released. [arXiv] [GitHub] [机器之心] |
Nov 22, 2023 | Our paper “FlexNN: Efficient and Adaptive DNN Inference on Memory-Constrained Edge Devices” has been conditionally accepted by MobiCom 2024. Thanks to all the coauthors: Yuanchun Li, Yuanzhe Li, Ting Cao, and Yunxin Liu! |