
Conference paper

Can DSA be improved? Complexity trade-offs with the Digital Signature Standard

The Digital Signature Algorithm (DSA) was proposed in 1991 by the US National Institute of Standards and Technology to provide an appropriate core for applications requiring digital signatures. Many applications will undoubtedly adopt this standard, and the foreseen dominance of DSA as a legal certification tool makes it important to focus research efforts on the suitability of this scheme to various situations. In this paper, we present six new DSA-based protocols for:

1. Performing a quick batch verification of n signatures. The proposed scheme saves ≈ 450n modular multiplications.
2. Avoiding the cumbersome calculation of 1/k mod q by the signer.
3. Compressing sets of DSA transactions into shorter archive signatures.
4. Generating signatures from pre-calculated "use & throw" 224-bit signature coupons.
5. Self-certifying the moduli.
6. Bit-patterning q directly on p (a gain of 60.4% in key size).

All our schemes combine in a natural way full DSA compatibility and flexible trade-offs between computational complexity, transmission overheads and key sizes.
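To make the trade-offs above concrete, the following is a minimal Python sketch of textbook DSA as standardised in FIPS 186, using insecure toy parameters chosen for readability. It shows the signer-side 1/k mod q inversion that protocol 2 avoids, an offline coupon-style precomputation in the spirit of protocol 4 (the paper's actual coupons are compressed to 224 bits), and the two per-signature exponentiations whose cost protocol 1's batch verification amortises. The parameters p, q, g and all function names are illustrative assumptions, not taken from the paper.

    # Toy-parameter sketch of textbook DSA (FIPS 186). Demo values only;
    # nothing here reproduces the paper's six protocols.
    import hashlib
    import secrets

    # Toy domain parameters: q divides p - 1, and g has order q modulo p.
    p, q, g = 23, 11, 4

    def h(msg: bytes) -> int:
        """Hash a message to an integer modulo q."""
        return int.from_bytes(hashlib.sha256(msg).digest(), "big") % q

    def keygen():
        x = secrets.randbelow(q - 1) + 1      # private key in [1, q-1]
        y = pow(g, x, p)                      # public key
        return x, y

    def make_coupon():
        """Offline phase: everything message-independent, incl. 1/k mod q."""
        while True:
            k = secrets.randbelow(q - 1) + 1
            r = pow(g, k, p) % q
            if r != 0:
                return r, pow(k, -1, q)       # "use & throw" pair (r, k^-1)

    def sign(x: int, msg: bytes, coupon=None):
        """Online phase: with a coupon, just two multiplications mod q."""
        while True:
            r, kinv = coupon if coupon else make_coupon()
            s = kinv * (h(msg) + x * r) % q
            if s != 0:
                return r, s
            coupon = None                     # bad coupon, draw a fresh one

    def verify(y: int, msg: bytes, sig) -> bool:
        r, s = sig
        if not (0 < r < q and 0 < s < q):
            return False
        w = pow(s, -1, q)
        u1, u2 = h(msg) * w % q, r * w % q
        # Two full modular exponentiations per signature: the verification
        # work that the paper's batch scheme amortises over n signatures.
        return pow(g, u1, p) * pow(y, u2, p) % p % q == r

    x, y = keygen()
    coupon = make_coupon()                    # prepared offline / in idle time
    sig = sign(x, b"hello", coupon)           # fast online step
    assert verify(y, b"hello", sig)

With coupons prepared in idle time, the online signing step reduces to one hash and two multiplications modulo q; this offline/online split is the kind of complexity trade-off the paper formalises.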

    Keywords: computational complexity; cryptography; data compression; digital arithmetic; protocols; telecommunication standards

    Reference

    • LASEC-CONF-1995-002

    Record created on 2007-01-18, modified on 2016-08-08

Fulltext

  • There is no available fulltext. Please contact the lab or the authors.
