Cryptology is the science of writing in secret, using a variety of methods to keep messages confidential; it encompasses both communications security and communications intelligence. The cryptologic (code making and code breaking) and intelligence services provide information to both tactical forces and Navy commanders. Shore-based intelligence and cryptologic operations involve the collection, processing, analysis, and reporting of information from many sources, from communications intelligence to human intelligence. This information is used to assess threats to the Navy and to the security of the United States. Tactical intelligence, most often provided by ships, submarines, and aircraft, gives combat commanders indications and warning of impending enemy activity and assessments of ongoing hostile activity and capabilities.

The start of the 21st century is a golden age for applications of mathematics in cryptology. The early stages of this age can be traced to the work of Rejewski, Rozycki, and Zygalski on breaking Enigma. Their work was a breakthrough in several ways. It made a tremendous practical contribution to the conduct of World War II. At the same time, it represented a major increase in the sophistication of the mathematical tools that were used. Ever since, mathematics has been playing an increasingly important role in cryptology.

This has been the result of the close relationship of mathematics, cryptology, and technology, a relationship that has been developing for a long time. While codes and ciphers go back thousands of years, systematic study of them dates back only to the Renaissance. Such study was stimulated by the rapid growth of written communications and the associated postal systems, as well as by the political fragmentation in Europe. In the 19th century, the electric telegraph provided an additional spur to the development of cryptology.

The major impetus, however, appears to have come with the appearance of radio communication at the beginning of the 20th century. This technical development led to growth of military, diplomatic, and commercial traffic that was open to non-intrusive interception by friend and foe alike. The need to protect such traffic from interception was obvious, and led to the search for improved codes and ciphers. These, in turn, stimulated the development of cryptanalytic methods, which then led to the development of better cryptosystems, in an endless cycle. What systems were built has always depended on what was known about their security, and also on the technology that was available.

Between the two world wars, the need to encrypt and decrypt ever-greater volumes of information reliably and securely, combined with the available electromechanical technology, led many cryptosystem designers towards rotor machines. Yet, as Rejewski, Rozycki, and Zygalski showed, the operations of rotor machines created enough regularities to enable effective cryptanalysis through mathematical techniques. This was yet another instance of what Eugene Wigner has called the “unreasonable effectiveness of mathematics,” in which techniques developed for abstract purposes turn out to be surprisingly well-suited for real applications.

The sophistication of mathematical techniques in cryptography continued increasing after World War II, when attention shifted to cryptosystems based on shift register sequences. A quantum jump occurred in the 1970s, with the invention of public key cryptography. This invention was itself stimulated by technological developments, primarily the growth in information processing and transmission. This growth was leading to explosive increases in the volume of electronic transactions, increases that show no signs of tapering off even today, a quarter century later.

The large and diverse populations of users that were foreseen in developing civilian settings led to problems, such as key management and digital signatures, that previously had not been as severe in smaller and more tightly controlled military and political communications. At the same time, developments in technology were offering unprecedented possibilities for implementing complicated algorithms. Mathematics again turned out to provide the tools that were used to meet the challenge.

The public key schemes that were invented in the 1970s used primarily tools from classical number theory. Yet as time went on, the range of applicable mathematics grew. Technology continued improving, but in uneven ways. For example, while the general computing power of personal computers grew explosively, there was also a proliferation of small, especially wireless, devices that continued to have stringent power and bandwidth limitations. This put renewed emphasis on finding cryptosystems that were thrifty with both computation and transmission.

At the same time, there was growth in theoretical knowledge, which led to the breaking of numerous systems, and required increases in the key sizes of even well-trusted schemes such as RSA. The outcome of the developments in technology and science is that today we are witnessing explosive growth in applications of sophisticated mathematics in cryptology. This volume is a collection of both surveys and original research papers that illustrate well the interactions of public key cryptography and computational number theory.

Some of the systems discussed here are based on algebra, others on lattices, yet others on combinatorial concepts. There are also some number theoretic results that have not been applied to cryptography yet, but may be in the future. The diversity of techniques and results in this volume does show that mathematics, even mathematics that was developed for its own sake, is helping solve important problems of our modern society. At the same time, mathematics is drawing valuable inspiration from the practical problems that cryptology poses.

The recent breakthrough discovery of public key cryptography has been one (but not the only) contributor to a dramatic increase in the sophistication and elegance of the mathematics used in cryptology. Coding theory enables the reliable transmission and storage of data. Thanks to coding theory, despite dramatic increases in the rates and volumes of bits transmitted and the number of bits stored in computers or household appliances, we are able to operate confidently under the assumption that every one of these bits is exactly what it is supposed to be. Often they are not, of course, and the errors would be catastrophic were it not for the superbly efficient detection and correction algorithms clever coding theorists have created.
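As a minimal illustration of the error-correcting idea (a toy, not a code anyone would deploy), a three-fold repetition code corrects any single flipped bit per block by majority vote; the function names here are purely illustrative:

```python
def encode(bits):
    """3-fold repetition code: the simplest error-correcting code."""
    return [b for b in bits for _ in range(3)]

def decode(coded):
    """Majority vote per 3-bit block corrects any single flipped bit per block."""
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

msg = [1, 0, 1, 1]
tx = encode(msg)
tx[4] ^= 1                 # the channel flips one bit in transit
assert decode(tx) == msg   # majority vote recovers the original message
```

Real codes (Hamming, Reed-Solomon, LDPC) achieve the same correction with vastly less redundancy; the repetition code triples the transmitted length to fix one error per block.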

Although some continuous mathematics has been employed (notably probability theory), the bulk of the mathematics involved is discrete mathematics. Nevertheless, in spite of the strong demonstration that cryptology and coding theory provide, there is little understanding or recognition in the mainstream mathematics community of the importance of discrete mathematics to the information society. The core problems in applied mathematics after World War II (e.g., understanding shock waves) involved continuous mathematics, and the composition of most applied mathematics departments today reflects that legacy.

The increasing role of discrete mathematics has affected even the bastions of the “old” applied mathematics, such as the aircraft manufacturers, where information systems that allow design engineers to work on a common electronic blueprint have had a dramatic effect on design cycles. In the meantime, mathematics departments seem insulated from the need to evolve their research programs as they continue to provide service teaching of calculus to captive populations of engineering students.

However, the needs of these students are changing. As mathematicians continue to work in narrow areas of specialization, they may be unaware of these trends and the appealing mathematical research topics that are most strongly connected to current needs arising from the explosion in information technology. Indeed, a great deal of important and interesting mathematics research is being done outside of mathematics departments. (This applies even to traditional applied mathematics, PDE’s and the like, where, as just one example, modeling has been neglected.)

In the history of cryptology and coding theory, mathematicians as well as mathematics have played an important role. Sometimes they have employed their considerable problem-solving skills in direct assaults on the problems, working so closely with engineers and computer scientists that it would be difficult to tell the subject matter origins apart. Sometimes mathematicians have formalized parts of the problem being worked, introducing new or classical mathematical frameworks to help understand and solve the problem.

Sophisticated theoretical treatments of these subjects (e.g., complexity theory in cryptology) have been very helpful in solving concrete problems. The potential for theory to have bottom-line impact seems even greater today. One panelist opined, “This is a time that cries out for top academicians to join us in developing the theoretical foundations of the subject. We have lots of little results that seem to be part of a bigger pattern, and we need to understand the bigger picture in order to move forward.” Unfortunately, however, the present period is not one in which research mathematicians are breaking down doors to work on these problems.

Mathematicians are without a doubt needed to generate mathematics. It is less clear that they are indispensable to its application. One panelist pointed out that there are many brilliant engineers and computer scientists who understand thoroughly not only the problems but also the mathematics and the mathematical analysis needed to solve them. “It’s up to the mathematics community,” he continued, “to choose whether it is going to try to play or whether it is going to exist on the scientific margins.

“The situation is similar to the boundary where physics and mathematics meet, and mathematicians are scrambling to follow where Witten and Seiberg have led.” Another panelist disagreed, believing it highly desirable, if not necessary, to interest research mathematicians in application problems. “When we bring in (academic research) mathematicians as consultants to work on our problems, we don’t expect them to have the same bottom-line impact as our permanent staff, because they will not have adequate knowledge of system issues.

However, in their effort to understand our problems and apply to them the mathematics with which they are familiar, they often make some unusual attack on the problem or propose some use of a mathematical construct we had never considered. After several years and lots of honing of the mathematical construct by our ‘applied mathematicians,’ we find ourselves in possession of a powerful and effective mathematical tool.”

During the late 1970s, a small group of brilliant academic cryptographers proposed a series of elegant schemes through which secret messages could be sent without relying on secret variables (keys) shared by the encipherer and the decipherer, secrets whose maintenance depended upon physical security, which in the past had often been compromised. Instead, in these “public key” schemes, the message recipient published for all to see a set of (public) variables to be used by the message sender in such a way that messages sent could be read only by the intended recipient. (At least, the public key cryptographers hoped that was the case!)

It is no exaggeration to say that public key cryptography was a breakthrough “of monumental proportions,” as big a surprise to those who had relied on conventional cryptography in the sixties as television was to the public in the fifties. Breaking these “public key” ciphers requires, or seems to require, solutions to well-formulated mathematical problems believed to be difficult to solve. One of the earliest popular schemes depended on the solution of a certain “knapsack” problem (given a set of integers and a value, find a subset whose constituents sum to that value).

This general problem was thought to be hard (it is known to be NP-complete), but a flurry of cryptanalytic activity discovered ways to bypass the NP-complete problem, take advantage of the special conditions of the cryptographic implementation, and break the scheme: first by using H. Lenstra’s integer programming algorithm, next by using continued fractions, and later and more effectively by utilizing the lattice basis reduction algorithm due to Lenstra, Lenstra, and Lovász.

Although many instantiations of public key cryptosystems have been proposed since the original discovery, current cryptographic implementers seem to be placing most of their eggs in two baskets: one scheme (Rivest-Shamir-Adleman, RSA), whose security is related to the conjectured difficulty of factoring integers, and a second (Diffie-Hellman, DH), which is related to the conjectured difficulty of solving the discrete logarithm problem (DLP) in a group. The discrete logarithm problem in a group G, analogous to the calculation of real logarithms, requires determination of n, given g and h in G, so that g^n = h.

Each of the past three decades has seen momentous improvements in attacking these schemes, although there has not yet been the massive breakthrough (as predicted in the movie “Sneakers”) that would send cryptographers back to the drawing boards. The nature of these attacks leads some to suspect that we may have most of our eggs in one basket, as most improvements against RSA seem to correspond to an analogous idea that works against the most common instantiations of DH (when the group is the multiplicative group of a finite field or a large subgroup of prime order of the multiplicative group), and vice versa.

Asymptotic costs to attack each scheme, although each has declined as a consequence of new algorithms, continue to be comparable. These new algorithms, along with improvements in computational power, have forced the use of larger and larger key sizes (with the credit for the increase split about equally between mathematics and technology). As a result, the computations needed to implement RSA or DH securely have been steadily increasing. Recently, there has been interest in utilizing the elliptic curve group in schemes based on the DLP, with the hope that the (index calculus) weaknesses that have been uncovered in the use of more traditional groups will not be found there.

It is believed, and widely marketed, that the DLP in the group of points of non-supersingular elliptic curves of genus one over finite fields does not allow a sub-exponential time solution. If this is true, DH in the elliptic curve group would provide security comparable to other schemes at a lower computational and communication overhead. It may be true, but it certainly has not yet been proven. There are connections between elliptic curve groups and class groups, with consequences for the higher genus case and extension fields. In particular, Menezes, Okamoto and Vanstone showed how the Weil pairing gives a better method for solving the DLP for a particular class of elliptic curves, the supersingular ones.

These are curves of order p+1, and the DLP is reduced to a similar problem in GF(p^2), where it can be solved more effectively. Work continues in an effort to extend these results to the general curve group. A related problem in elliptic curve cryptography focuses attention on another possible exciting interplay between theoretical mathematics, computer science (algorithms), and practical implementation. Calculation of the order of the elliptic curve group is not straightforward. Knowing the order of their group is very important to DH cryptographers, since short-cut attacks exist if the order of the group factors into small primes.
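The order-p+1 claim can be checked directly on a toy supersingular curve: for a prime p ≡ 2 (mod 3), cubing is a bijection on GF(p), so y² = x³ + 1 has exactly one x for each y, giving p affine points plus the point at infinity. A naive O(p²) point count (Schoof's algorithm does this in time polynomial in log p, which is what makes cryptographic-size curves countable at all) confirms it; the function name is illustrative:

```python
def curve_order(a, b, p):
    """Count points on y^2 = x^3 + a*x + b over GF(p) by brute force,
    including the point at infinity. Naive O(p^2): fine for toy primes,
    hopeless for the ~256-bit primes used in practice."""
    count = 1  # the point at infinity
    for x in range(p):
        rhs = (x * x * x + a * x + b) % p
        for y in range(p):
            if (y * y) % p == rhs:
                count += 1
    return count

# Supersingular example: p = 101 satisfies p ≡ 2 (mod 3), so the curve
# y^2 = x^3 + 1 over GF(101) has exactly p + 1 = 102 points.
assert curve_order(0, 1, 101) == 102
```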

Current elliptic curve cryptosystem proposals often employ a small class of curves to circumvent the counting problem. Even less progress has been made on the more general problem of whether there exist any groups whose DLP is exponential and, if so, characterizing such groups. Another interesting problem is whether solving DLP is necessary as well as sufficient for breaking DH. There are some groups for which this is known to be true, but determining whether this is true for all groups, or characterizing those groups for which it is true, remains to be done. A third interesting general DH problem is “diagnosis” of the DH group (when one has intercepted both ends of DH exchanges and does not know the group employed).
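The short-cut attack available when the group order factors into small primes is the Pohlig-Hellman reduction: solve the DLP separately in each small prime-order subgroup, then recombine the answers with the Chinese Remainder Theorem. A sketch for the special case of squarefree group order, with illustrative names and toy parameters:

```python
def dlog_bruteforce(g, h, p, order):
    """Exhaustive search for n < order with g**n % p == h."""
    x = 1
    for n in range(order):
        if x == h:
            return n
        x = x * g % p
    return None

def pohlig_hellman(g, h, p, prime_factors):
    """Pohlig-Hellman for a group of squarefree order (the product of
    `prime_factors`): the cost is driven by the largest prime factor,
    not the group order, which is why smooth orders are dangerous."""
    order = 1
    for q in prime_factors:
        order *= q
    x = 0
    for q in prime_factors:
        e = order // q
        # Raising to the power e projects into the subgroup of order q,
        # where brute force recovers n mod q.
        r = dlog_bruteforce(pow(g, e, p), pow(h, e, p), p, q)
        x += r * e * pow(e, -1, q)   # Chinese Remainder Theorem term
    return x % order

# 2 generates the multiplicative group mod 211, of order 210 = 2*3*5*7.
secret = 123
h = pow(2, secret, 211)
assert pohlig_hellman(2, h, 211, [2, 3, 5, 7]) == secret
```

This is why practical DH groups are chosen so that the order has a large prime factor, and why cryptographers insist on knowing the order of the elliptic curve group before trusting it.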

Cryptology, then, is a traditional subject that conventionally guaranteed (or sought to undo the guarantee of) confidentiality and integrity of messages, but the information era has expanded the range of applications to include authentication, integrity, and protocols for providing other information attributes, including timeliness, availability of service, and protection of intellectual property. Cryptology has always been a fascinating and exciting study, enjoyed by mathematicians and non-mathematicians alike.