Intel Software Guard Extensions applications: A survey

NC Will, CA Maziero - ACM Computing Surveys, 2023 - dl.acm.org
Data confidentiality is a central concern in modern computer systems and services, as
sensitive data from users and companies are being increasingly delegated to such systems …

No privacy left outside: On the (in-)security of TEE-shielded DNN partition for on-device ML

Z Zhang, C Gong, Y Cai, Y Yuan, B Liu… - … IEEE Symposium on …, 2024 - ieeexplore.ieee.org
On-device ML introduces new security challenges: DNN models become white-box
accessible to device users. Based on white-box information, adversaries can conduct …

Guardnn: secure accelerator architecture for privacy-preserving deep learning

W Hua, M Umar, Z Zhang, GE Suh - Proceedings of the 59th ACM/IEEE …, 2022 - dl.acm.org
This paper proposes GuardNN, a secure DNN accelerator that provides hardware-based
protection for user data and model parameters even in an untrusted environment. GuardNN …

Hypertheft: Thieving model weights from tee-shielded neural networks via ciphertext side channels

Y Yuan, Z Liu, S Deng, Y Chen, S Wang… - Proceedings of the …, 2024 - dl.acm.org
Trusted execution environments (TEEs) are widely employed to protect deep neural
networks (DNNs) from untrusted hosts (e.g., hypervisors). By shielding DNNs as fully black …

Seesaw: Compensating for nonlinear reduction with linear computations for private inference

F Li, Y Zhai, S Cai, M Gao - Forty-first International Conference on …, 2024 - openreview.net
With increasingly serious data privacy concerns and strict regulations, privacy-preserving
machine learning (PPML) has emerged to securely execute machine learning tasks without …

Stealthy backdoors as compression artifacts

Y Tian, F Suya, F Xu, D Evans - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Model compression is a widely-used approach for reducing the size of deep learning
models without much accuracy loss, enabling resource-hungry models to be compressed for …

Fuzzing sgx enclaves via host program mutations

A Khan, M Zou, K Kim, D Xu, A Bianchi… - 2023 IEEE 8th …, 2023 - ieeexplore.ieee.org
Intel Software Guard eXtension (SGX) is the cornerstone of Confidential Computing,
enabling runtime code and data integrity and confidentiality via enclaves. Unfortunately …

CipherSteal: Stealing Input Data from TEE-Shielded Neural Networks with Ciphertext Side Channels

Y Yuan, Z Liu, S Deng, Y Chen… - … on Security and …, 2024 - yuanyuan-yuan.github.io
Shielding neural networks (NNs) from untrusted hosts with Trusted Execution Environments
(TEEs) has been increasingly adopted. Nevertheless, this paper shows that the …

TEE-based decentralized recommender systems: The raw data sharing redemption

A Dhasade, N Dresevic… - 2022 IEEE …, 2022 - ieeexplore.ieee.org
Recommenders are central in many applications today. The most effective recommendation
schemes, such as those based on collaborative filtering (CF), exploit similarities between …
