Intel Software Guard Extensions applications: A survey
Data confidentiality is a central concern in modern computer systems and services, as
sensitive data from users and companies are being increasingly delegated to such systems …
No privacy left outside: On the (in-)security of TEE-shielded DNN partition for on-device ML
On-device ML introduces new security challenges: DNN models become white-box
accessible to device users. Based on white-box information, adversaries can conduct …
Fuzzing SGX enclaves via host program mutations
Intel Software Guard eXtension (SGX) is the cornerstone of Confidential Computing,
enabling runtime code and data integrity and confidentiality via enclaves. Unfortunately …
Does federated learning really need backpropagation?
Federated learning (FL) is a general principle for decentralized clients to train a server
model collectively without sharing local data. FL is a promising framework with practical …
GuardNN: Secure Accelerator Architecture for Privacy-Preserving Deep Learning
This paper proposes GuardNN, a secure DNN accelerator that provides hardware-based
protection for user data and model parameters even in an untrusted environment. GuardNN …
HyperTheft: Thieving Model Weights from TEE-Shielded Neural Networks via Ciphertext Side Channels
Trusted execution environments (TEEs) are widely employed to protect deep neural
networks (DNNs) from untrusted hosts (e.g., hypervisors). By shielding DNNs as fully black …
CipherSteal: Stealing Input Data from TEE-Shielded Neural Networks with Ciphertext Side Channels
Y Yuan, Z Liu, S Deng, Y Chen… - … on Security and …, 2024 - yuanyuan-yuan.github.io
Shielding neural networks (NNs) from untrusted hosts with Trusted Execution Environments
(TEEs) has been increasingly adopted. Nevertheless, this paper shows that the …
Stealthy backdoors as compression artifacts
Model compression is a widely-used approach for reducing the size of deep learning
models without much accuracy loss, enabling resource-hungry models to be compressed for …
TEE-based decentralized recommender systems: The raw data sharing redemption
Recommenders are central in many applications today. The most effective recommendation
schemes, such as those based on collaborative filtering (CF), exploit similarities between …