Infoscience
EPFL, École polytechnique fédérale de Lausanne
book part or chapter

Adversarial Evasion on LLMs

Guerraoui, Rachid • Pinot, Rafael
January 1, 2024
Large Language Models in Cybersecurity: Threats, Exposure and Mitigation

While Machine Learning (ML) applications have shown impressive achievements in tasks such as computer vision, NLP, and control problems, these achievements were obtained, first and foremost, in best-case settings. Unfortunately, settings in which ML applications fail unexpectedly abound, and malicious users or data contributors can deliberately trigger such failures. This problem became known as adversarial (example) robustness. While the field is developing rapidly, some fundamental results have been established, offering insight into how to make ML methods resilient to adversarial inputs and data poisoning; such methods are termed adversarially robust. Although the current generation of LLMs is not adversarially robust, results obtained in other branches of ML can suggest how to make them so. Such insight would complement and augment ongoing empirical efforts in the same direction (red-teaming).
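To make the notion of adversarial evasion concrete, the following is a minimal, self-contained sketch (not from the chapter itself) of a gradient-sign perturbation in the style of FGSM, applied to a toy logistic-regression classifier; all names and numeric values here are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm_perturb(x, w, b, y, eps):
    """Shift x by eps in the sign of the loss gradient w.r.t. x.

    For logistic regression with cross-entropy loss, that gradient
    is (sigmoid(w.x + b) - y) * w.
    """
    grad_x = (sigmoid(w @ x + b) - y) * w
    return x + eps * np.sign(grad_x)

# Toy classifier: predicts the positive class when w.x + b > 0.
w = np.array([1.0, -2.0])
b = 0.0

x = np.array([0.5, 0.1])                 # clean input, correctly classified positive
x_adv = fgsm_perturb(x, w, b, y=1.0, eps=0.3)

print(sigmoid(w @ x + b) > 0.5)          # clean input: positive
print(sigmoid(w @ x_adv + b) > 0.5)      # perturbed input: label flips
```

The point of the sketch is the failure mode the abstract describes: a small, targeted change to the input, invisible as "noise" to a casual observer, is enough to flip the model's decision, which is what adversarial robustness aims to prevent.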

Type: book part or chapter
DOI: 10.1007/978-3-031-54827-7_20
Scopus ID: 2-s2.0-85207221382
Author(s): Guerraoui, Rachid (École Polytechnique Fédérale de Lausanne); Pinot, Rafael (Sorbonne Université)
Date Issued: 2024-01-01
Publisher: Springer Nature
Published in: Large Language Models in Cybersecurity: Threats, Exposure and Mitigation
DOI of the book: 10.1007/978-3-031-54827-7
ISBN of the book: 9783031548277, 9783031548260
Start page: 181
End page: 188
Editorial or Peer reviewed: REVIEWED
Written at: EPFL
EPFL units: DCL
Available on Infoscience: January 27, 2025
Use this identifier to reference this record: https://infoscience.epfl.ch/handle/20.500.14299/245325
Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.