Title: On the Adversarial Robustness of Linear Regression
Abstract: In this paper, we study the adversarial robustness of linear regression problems. Specifically, we investigate the robustness of the regression coefficients against adversarial data samples. In the considered model, there exists an adversary who is able to add one carefully designed adversarial data sample to the dataset. Using this poisoned data sample, the adversary tries to boost or depress the magnitude of one targeted regression coefficient, subject to an energy constraint on the adversarial data sample. We characterize the exact expression of the optimal adversarial data sample in terms of the targeted regression coefficient, the original dataset, and the energy budget. Our experiments with synthetic and real datasets show the efficiency and optimality of our proposed adversarial strategy.
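The threat model described in the abstract can be sketched numerically: one extra sample, constrained to an energy budget, is chosen to boost a targeted OLS coefficient. The paper derives the optimal sample in closed form; the projected gradient ascent below is only a hypothetical stand-in to illustrate the setup, and the synthetic data, function names, and budget `eta` are all invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic dataset y = X @ beta_true + noise (illustrative, not the paper's data)
n, d = 50, 3
beta_true = np.array([1.0, -0.5, 2.0])
X = rng.normal(size=(n, d))
y = X @ beta_true + 0.1 * rng.normal(size=n)

def ols(X, y):
    """Ordinary least-squares coefficient estimate."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def poisoned_coef(z, X, y, j):
    """Targeted coefficient j after appending one sample z = (x, y)."""
    Xp = np.vstack([X, z[:-1]])
    yp = np.append(y, z[-1])
    return ols(Xp, yp)[j]

def attack(X, y, j, eta, steps=500, lr=0.1):
    """Hypothetical attack: projected gradient ascent on coefficient j,
    keeping the poisoned sample inside the energy ball ||z|| <= eta."""
    z = rng.normal(size=X.shape[1] + 1)
    z *= eta / np.linalg.norm(z)
    h = 1e-5
    for _ in range(steps):
        # Finite-difference gradient of the target coefficient w.r.t. z
        g = np.zeros_like(z)
        for k in range(z.size):
            e = np.zeros_like(z)
            e[k] = h
            g[k] = (poisoned_coef(z + e, X, y, j)
                    - poisoned_coef(z - e, X, y, j)) / (2 * h)
        z += lr * g                      # ascend to boost the coefficient
        norm = np.linalg.norm(z)
        if norm > eta:                   # project back onto the energy ball
            z *= eta / norm
    return z

j = 0                                    # coefficient targeted by the adversary
eta = 5.0                                # assumed energy budget
z = attack(X, y, j, eta)
print("clean:", ols(X, y)[j], "poisoned:", poisoned_coef(z, X, y, j))
```

Replacing the numeric search with the paper's closed-form optimal sample would give the same qualitative effect without the iteration cost.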
Year: 2020
DOI: 10.1109/MLSP49062.2020.9231839
Venue: 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing (MLSP)
Keywords: Adversarial robustness, linear regression, poisoning attack, non-convex optimization
DocType: Conference
ISSN: 1551-2541
ISBN: 978-1-7281-6663-6
Citations: 0
PageRank: 0.34
References: 3
Authors: 3
Name            Order   Citations   PageRank
Fuwei Li        1       24          3.14
Lifeng Lai      2       2289        167.78
Shuguang Cui    3       521         54.46