Selective Untargeted Evasion Attack: An Adversarial Example That Will Not Be Classified as Certain Avoided Classes
Deep neural networks (DNNs) have useful applications in machine learning tasks involving recognition and pattern analysis. Despite the favorable applications of DNNs, these systems can be exploited by adversarial examples. An adversarial example, which is created by adding a small amount of noise to...
Saved in:
Main Authors: Hyun Kwon, Yongchul Kim, Hyunsoo Yoon, Daeseon Choi
Format: article
Language: EN
Published: IEEE, 2019
Subjects: Machine learning; adversarial example; deep neural network (DNN); avoided classes; Electrical engineering. Electronics. Nuclear engineering; TK1-9971
Online Access: https://doaj.org/article/945f11f2fb4b42cb9e89bca0ee090bf1
id |
oai:doaj.org-article:945f11f2fb4b42cb9e89bca0ee090bf1 |
record_format |
dspace |
spelling |
oai:doaj.org-article:945f11f2fb4b42cb9e89bca0ee090bf1 (2021-11-19T00:02:58Z). Selective Untargeted Evasion Attack: An Adversarial Example That Will Not Be Classified as Certain Avoided Classes. ISSN: 2169-3536. DOI: 10.1109/ACCESS.2019.2920410. Published: 2019-01-01. Authors: Hyun Kwon, Yongchul Kim, Hyunsoo Yoon, Daeseon Choi. Publisher: IEEE. Format: article. Language: EN. Subjects: Machine learning; adversarial example; deep neural network (DNN); avoided classes; Electrical engineering. Electronics. Nuclear engineering; TK1-9971. Source: IEEE Access, Vol 7, Pp 73493-73503 (2019). Links: https://doaj.org/article/945f11f2fb4b42cb9e89bca0ee090bf1 ; https://ieeexplore.ieee.org/document/8727886/ ; https://doaj.org/toc/2169-3536 |
institution |
DOAJ |
collection |
DOAJ |
language |
EN |
topic |
Machine learning; adversarial example; deep neural network (DNN); avoided classes; Electrical engineering. Electronics. Nuclear engineering; TK1-9971 |
spellingShingle |
Machine learning; adversarial example; deep neural network (DNN); avoided classes; Electrical engineering. Electronics. Nuclear engineering; TK1-9971; Hyun Kwon; Yongchul Kim; Hyunsoo Yoon; Daeseon Choi; Selective Untargeted Evasion Attack: An Adversarial Example That Will Not Be Classified as Certain Avoided Classes |
description |
Deep neural networks (DNNs) have useful applications in machine learning tasks involving recognition and pattern analysis. Despite the favorable applications of DNNs, these systems can be exploited by adversarial examples. An adversarial example, which is created by adding a small amount of noise to an original sample, can cause misclassification by the DNN. Under specific circumstances, it may be necessary to create a selective untargeted adversarial example that will not be classified as certain avoided classes. For example, a modified tank cover may cause misclassification by a DNN, but the enemy system equipped with that DNN must misclassify the modified tank as a class other than certain avoided classes, such as a tank, armored vehicle, or self-propelled gun. In other words, selective untargeted adversarial examples are needed that will not be recognized as certain classes, such as tanks, armored vehicles, or self-propelled guns. In this study, we propose a scheme that generates selective untargeted adversarial examples achieving 100% attack success with minimal distortion. The proposed scheme creates a selective untargeted adversarial example that will not be classified as any of the avoided classes while minimizing the distortion of the original sample. To generate such examples, a transformation is performed that jointly minimizes the probability assigned to the avoided classes and the distortion of the original sample. We used the MNIST and CIFAR-10 datasets in our experiments, with implementations based on the TensorFlow library. The experimental results demonstrate that the proposed scheme creates selective untargeted adversarial examples with 100% attack success and minimal distortion (1.325 for MNIST and 34.762 for CIFAR-10). |
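The description above frames the attack as an optimization that jointly minimizes the probability assigned to the avoided classes and the distortion of the original sample. Below is a minimal illustrative sketch of that idea in TensorFlow 2 (the library named in the abstract); the function name `selective_untargeted_attack`, the balance weight `c`, and the optimizer settings are assumptions for illustration, not the authors' released code or their exact transformation.

```python
import tensorflow as tf

def selective_untargeted_attack(model, x, avoided_classes,
                                steps=500, lr=0.01, c=1.0):
    # model: callable mapping an input batch to class logits.
    # x: original samples, shape [batch, H, W, C], values in [0, 1].
    # avoided_classes: class indices the adversarial example must NOT fall into.
    # steps, lr, c: assumed hyperparameters, not taken from the paper.
    x = tf.convert_to_tensor(x, dtype=tf.float32)
    delta = tf.Variable(tf.zeros_like(x))            # perturbation to optimize
    opt = tf.keras.optimizers.Adam(learning_rate=lr)
    avoided = tf.constant(avoided_classes, dtype=tf.int32)

    for _ in range(steps):
        with tf.GradientTape() as tape:
            x_adv = tf.clip_by_value(x + delta, 0.0, 1.0)
            probs = tf.nn.softmax(model(x_adv), axis=-1)
            # Total probability mass on the avoided classes (to be minimized).
            p_avoid = tf.reduce_sum(tf.gather(probs, avoided, axis=-1), axis=-1)
            # L2 distortion relative to the original sample (to be minimized).
            distortion = tf.reduce_sum(tf.square(x_adv - x), axis=[1, 2, 3])
            loss = tf.reduce_mean(c * p_avoid + distortion)
        grads = tape.gradient(loss, [delta])
        opt.apply_gradients(zip(grads, [delta]))

    return tf.clip_by_value(x + delta, 0.0, 1.0)
```

A complete experiment along the lines described would additionally verify, after optimization, that the predicted class of the result lies outside `avoided_classes`, and would report the resulting L2 distortion, mirroring the attack-success and distortion metrics quoted in the abstract.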
format |
article |
author |
Hyun Kwon; Yongchul Kim; Hyunsoo Yoon; Daeseon Choi |
author_facet |
Hyun Kwon; Yongchul Kim; Hyunsoo Yoon; Daeseon Choi |
author_sort |
Hyun Kwon |
title |
Selective Untargeted Evasion Attack: An Adversarial Example That Will Not Be Classified as Certain Avoided Classes |
title_short |
Selective Untargeted Evasion Attack: An Adversarial Example That Will Not Be Classified as Certain Avoided Classes |
title_full |
Selective Untargeted Evasion Attack: An Adversarial Example That Will Not Be Classified as Certain Avoided Classes |
title_fullStr |
Selective Untargeted Evasion Attack: An Adversarial Example That Will Not Be Classified as Certain Avoided Classes |
title_full_unstemmed |
Selective Untargeted Evasion Attack: An Adversarial Example That Will Not Be Classified as Certain Avoided Classes |
title_sort |
selective untargeted evasion attack: an adversarial example that will not be classified as certain avoided classes |
publisher |
IEEE |
publishDate |
2019 |
url |
https://doaj.org/article/945f11f2fb4b42cb9e89bca0ee090bf1 |
work_keys_str_mv |
AT hyunkwon selectiveuntargetedevasionattackanadversarialexamplethatwillnotbeclassifiedascertainavoidedclasses AT yongchulkim selectiveuntargetedevasionattackanadversarialexamplethatwillnotbeclassifiedascertainavoidedclasses AT hyunsooyoon selectiveuntargetedevasionattackanadversarialexamplethatwillnotbeclassifiedascertainavoidedclasses AT daeseonchoi selectiveuntargetedevasionattackanadversarialexamplethatwillnotbeclassifiedascertainavoidedclasses |
_version_ |
1718420680303706112 |