NDSS 2025 – ASGARD
2026-01-19 20:00:00 · Author: securityboulevard.com (view original)

Session 9B: DNN Attack Surfaces

Authors, Creators & Presenters: Myungsuk Moon (Yonsei University), Minhee Kim (Yonsei University), Joonkyo Jung (Yonsei University), Dokyung Song (Yonsei University)
PAPER
ASGARD: Protecting On-Device Deep Neural Networks with Virtualization-Based Trusted Execution Environments
On-device deep learning, increasingly popular for enhancing user privacy, now poses a serious risk to the privacy of deep neural network (DNN) models. Researchers have proposed to leverage Arm TrustZone’s trusted execution environment (TEE) to protect models from attacks originating in the rich execution environment (REE). Existing solutions, however, fall short: (i) those that fully contain DNN inference within a TEE either support inference on CPUs only, or require substantial modifications to closed-source proprietary software for incorporating accelerators; (ii) those that offload part of DNN inference to the REE either leave a portion of DNNs unprotected, or incur large run-time overheads due to frequent model (de)obfuscation and TEE-to-REE exits. We present ASGARD, the first virtualization-based TEE solution designed to protect on-device DNNs on legacy Armv8-A SoCs. Unlike prior work that uses TrustZone-based TEEs for model protection, ASGARD’s TEEs remain compatible with existing proprietary software, keep the trusted computing base (TCB) minimal, and incur near-zero run-time overhead. To this end, ASGARD (i) securely extends the boundaries of an existing TEE to incorporate an SoC-integrated accelerator via secure I/O passthrough, (ii) tightly controls the size of the TCB via our aggressive yet security-preserving platform- and application-level TCB debloating techniques, and (iii) mitigates the number of costly TEE-to-REE exits via our exit-coalescing DNN execution planning. We implemented ASGARD on RK3588S, an Armv8.2-A-based commodity Android platform equipped with a Rockchip NPU, without modifying Rockchip’s or Arm’s proprietary software. Our evaluation demonstrates that ASGARD effectively protects on-device DNNs in legacy SoCs with a minimal TCB size and negligible inference latency overhead.
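The exit-coalescing idea from the abstract can be illustrated with a small sketch. This is not ASGARD's actual implementation; the layer names, the `"TEE"`/`"REE"` world labels, and the `coalesce_exits` helper are hypothetical, illustrating only the general principle that scheduling consecutive same-world layers together reduces the number of costly world transitions.

```python
# Hypothetical sketch of exit-coalescing execution planning: group
# consecutive DNN layers that run in the same world so the plan crosses
# the TEE/REE boundary as few times as possible.

def coalesce_exits(layers):
    """Group consecutive layers by execution world.

    `layers` is a list of (name, world) pairs, where world is "TEE"
    (e.g., run on the NPU via secure I/O passthrough) or "REE".
    Returns a list of (world, [names]) groups; the number of world
    transitions in the resulting plan is len(groups) - 1.
    """
    groups = []
    for name, world in layers:
        if groups and groups[-1][0] == world:
            # Same world as the previous layer: extend the current group,
            # avoiding an unnecessary TEE-to-REE (or REE-to-TEE) exit.
            groups[-1][1].append(name)
        else:
            groups.append((world, [name]))
    return groups

plan = coalesce_exits([
    ("conv1", "TEE"), ("relu1", "TEE"),
    ("softmax", "REE"),
    ("conv2", "TEE"), ("relu2", "TEE"),
])
# 3 groups, hence 2 world transitions, versus 4 with a naive
# per-layer schedule that exits after every layer.
```

A real planner would also have to account for data placement and accelerator constraints, but the grouping step above captures why coalescing shrinks transition counts.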
ABOUT NDSS
The Network and Distributed System Security Symposium (NDSS) fosters information exchange among researchers and practitioners of network and distributed system security. The target audience includes those interested in practical aspects of network and distributed system security, with a focus on actual system design and implementation. A major goal is to encourage and enable the Internet community to apply, deploy, and advance the state of available security technologies.

Our thanks to the Network and Distributed System Security (NDSS) Symposium for publishing its creators', authors', and presenters' superb NDSS Symposium 2025 conference content on the organization's YouTube channel.


*** This is a Security Bloggers Network syndicated blog from Infosecurity.US authored by Marc Handelman. Read the original post at: https://www.youtube-nocookie.com/embed/6W0z1Gp7VMY?si=FSIAQMbOY2zY8UvW


Source: https://securityboulevard.com/2026/01/ndss-2025-asgard/