Algorithms and Theory Group

Department of Computer Science, University of Maryland, College Park.

Welcome to the wiki for our group! You can also follow us on Twitter @theorycsumd.

Theoretical Computer Science (TCS), broadly speaking, is concerned with understanding the very nature of computation:

What problems can be solved by computers? And how efficiently can such problems be solved? Can "hard" problems be used to our advantage in any way? TCS encompasses research in such diverse areas as complexity theory, algorithms, cryptography, distributed computing, machine learning, and more; the common thread is a focus on precise models and rigorous mathematical analysis of particular problems within those models.

We have a strong group of faculty actively working in this area. We also have a large number of students, who are encouraged to get involved early by attending the weekly TCS reading groups. Brief descriptions of faculty research interests follow.

Main Research Areas:

  • Applied Algorithmics: Joseph Ja'Ja', Mohammad Hajiaghayi, Samir Khuller, David Mount, Mihai Pop, Aravind Srinivasan, Uzi Vishkin
    Applying rigorously developed theory to practical problems arising in, e.g., networks, graphics, image processing, architecture, and social networks and epidemiology. Research at UMD in this area is often undertaken in collaboration with faculty in application areas. A sample of issues covered includes: how to schedule data upload in high-volume Web servers, nearest-neighbor searching, fast interpolation for ray-tracing, scheduling multimedia delivery from replicated sources, and exploiting the massive parallelism in upcoming parallel architectures.

  • Algorithmic Game Theory: Mohammad Hajiaghayi, Aravind Srinivasan
    Algorithmic mechanism design, and algorithmic game theory more broadly, can be viewed as "incentive-aware algorithm design" and has become an increasingly important part of (theoretical) computer science in recent years. Recent results show a strong relation between computer science (esp. networking) and economics (esp. game theory), and techniques from each seem well poised to help with key problems of the other. Our main goal in this area is to study these connections, which produce powerful mechanisms for adaptive and networked environments and several other applied areas, and to improve the experience of users of the Web and the Internet. To this end, we do research on several topics, such as algorithmic mechanism design; auctions (efficient, revenue-maximizing, sponsored search, etc.); congestion and potential games; cost sharing; existence, computation, and learning of equilibria; game theory in the Internet; network games; price of anarchy; and selfish routing.
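As a small, self-contained illustration of incentive-aware design (not a description of any specific project above), here is a sketch of a single-item second-price (Vickrey) auction, the textbook mechanism under which bidding one's true value is a dominant strategy. The function name and input format are our own illustrative choices.

```python
def vickrey_auction(bids):
    """Single-item second-price (Vickrey) auction: the highest bidder
    wins but pays only the second-highest bid.  Because the price does
    not depend on the winner's own bid, truthful bidding is a dominant
    strategy.  `bids` maps bidder name -> bid value."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else 0
    return winner, price

# alice wins but pays bob's bid of 7, not her own bid of 10
winner, price = vickrey_auction({"alice": 10, "bob": 7, "carol": 4})
```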

  • Approximation Algorithms: Samir Khuller, Mohammad Hajiaghayi, Aravind Srinivasan, Uzi Vishkin
    Approximation algorithms is the branch of algorithmics concerned with designing polynomial-time algorithms that produce solutions provably close to optimal. This is of special interest for intractable problems, for which no polynomial-time algorithms producing optimal solutions are known. Techniques are drawn from a variety of areas: greedy algorithms, linear programming, randomized algorithms, non-linear programming, dynamic programming, etc.
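A classic example of this design style is the matching-based 2-approximation for minimum vertex cover. The sketch below (names and edge-list representation are illustrative) pairs a greedy strategy with a simple proof handle: the selected edges form a matching, and any cover must contain at least one endpoint of each matched edge.

```python
def vertex_cover_2approx(edges):
    """2-approximation for minimum vertex cover: repeatedly pick an
    edge with both endpoints uncovered and add both endpoints.  The
    picked edges form a matching of size |cover|/2, and any optimal
    cover hits each matched edge, so |cover| <= 2 * OPT."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover
```

Despite its simplicity, no polynomial-time algorithm with approximation ratio better than 2 is known for this problem.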

  • Bioinformatics and Computational Biology: Mihai Pop
    Bioinformatics and computational biology involve the application of algorithmic theory to solving biological problems, in particular those arising from the modern shift of biological research toward high-throughput experimental techniques. As an example, determining the DNA sequence of organisms would not be possible without sophisticated algorithms that assemble the many small fragments generated by sequencing machines. Computational biology at the University of Maryland is conducted in close collaboration with biologists from the life-science departments and from other institutions.

  • Coding Theory: Alexander Barg
    Coding theory is motivated by the problem of information transmission over noisy communication channels. It has developed into an area of applied mathematics with strong ties to theoretical computer science. Among the areas with links to CS are: list decoding of algebraic codes, local testability and local decodability, proofs of the different versions of the PCP theorem, codes and algorithms on graphs, almost orthogonal codes and weakly biased arrays, universal hash functions, quantum information processing, and others.
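As a minimal illustration of error correction (far simpler than the research topics above), here is the classic [7,4] Hamming code, which protects 4 data bits with 3 parity bits and corrects any single bit flip. The bit layout and function names below are our own choices.

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit [7,4] Hamming codeword.
    Positions are 1..7; parity bits sit at positions 1, 2, 4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Recompute the parity checks; read together, they spell out the
    1-based position of a single flipped bit (0 means no error)."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    if pos:
        c[pos - 1] ^= 1  # flip the erroneous bit back
    return c
```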

  • Complexity Theory: William Gasarch, Jonathan Katz
    Complexity theory is concerned with how complex various functions are to compute. The usual measures of complexity are time and space, though there are others, such as communication and descriptive complexity. The most important problem in complexity theory is P=?NP. One line of research in complexity theory at the University of Maryland examines analogs of P=?NP in communication complexity and descriptive complexity theory, asking whether the problem is solvable there and what insights it may offer for the original problem. This approach is also being applied to other open problems in complexity theory.

  • Computational Geometry: Samir Khuller, David Mount
    Computational geometry is the study of algorithms and data structures for problems whose inputs are geometric objects, for example points, lines, spheres, polygons, and polyhedra. The field has applications to areas such as computer graphics and vision, robotics, computer-aided design, computational statistics, and computational biology. Research on computational geometry at Maryland includes computing closest points, nearest-neighbor searching, point location, clustering of point sets, point pattern matching, and interpolation.
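To fix ideas, the nearest-neighbor query that much of this research accelerates can be stated as a brute-force linear scan; the point of structures such as kd-trees is to answer the same query far faster after preprocessing. The function below is an illustrative sketch, not any particular library's API.

```python
def nearest_neighbor(points, q):
    """Brute-force nearest neighbor: scan every point and keep the one
    with the smallest squared Euclidean distance to the query q.  This
    takes O(n) time per query; spatial data structures (e.g. kd-trees)
    trade preprocessing for much faster queries in low dimensions."""
    return min(points, key=lambda p: sum((a - b) ** 2 for a, b in zip(p, q)))
```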

  • Cryptography: Jonathan Katz, Alexander Barg
    Cryptography began as a rigorous formalization of issues arising in data and computer security (encryption, authentication, etc.), but has since expanded to touch upon many other areas of theoretical computer science (the most striking example being the concept of "zero-knowledge proofs"). Research at UMD explores both the practical and the theoretical aspects of this field. A primary motivation is to understand the efficiency of cryptographic solutions by (1) improving the computational/communication complexity of existing protocols, (2) using new number-theoretic assumptions to speed up current constructions, (3) designing improved algorithms for faster computation, and (4) proving lower bounds on the best possible efficiency that can be achieved.

  • Fixed-Parameter Algorithms: Mohammad Hajiaghayi
    Fixed-parameter algorithms (a.k.a. parameterized complexity) are another approach to coping with the hardness of NP-complete problems. These algorithms allow the running time to be exponential, but only with respect to a parameter other than the problem size, while the result must be optimal (the parameter is often small in practice). Parameterized complexity uses techniques such as kernelization, bounded search trees, bounded treewidth of graphs, and extremal combinatorics, among others. Although approximation algorithms and fixed-parameter algorithms were long considered two different approaches with little in common methodologically, several connections between the two fields have recently been identified. We are currently investigating these connections, along with traditional parameterized algorithms for NP-complete problems.
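The bounded-search-tree technique mentioned above can be illustrated on vertex cover: deciding whether a graph has a cover of size at most k can be done in time exponential in k alone. The sketch below uses an edge-list representation of our own choosing.

```python
def vertex_cover_fpt(edges, k):
    """Bounded search tree for vertex cover: is there a cover of
    size <= k?  Any cover must contain an endpoint of every edge, so
    we branch on one uncovered edge (u, v), trying u and then v.  The
    recursion has depth at most k and branching factor 2, giving at
    most 2^k leaves -- exponential in the parameter k only."""
    if not edges:
        return True          # every edge is covered
    if k == 0:
        return False         # edges remain but no budget left
    u, v = edges[0]
    for w in (u, v):
        remaining = [(a, b) for a, b in edges if w not in (a, b)]
        if vertex_cover_fpt(remaining, k - 1):
            return True
    return False
```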

  • Parallel Algorithms: Clyde Kruskal, Joseph Ja'Ja', Uzi Vishkin
    Thinking algorithmically in parallel provides an interesting "alien intellectual culture" relative to previous algorithmic theories. UMD researchers have had a strong presence in this field for over three decades, as evidenced by the recognition of two parallel algorithms researchers, Clyde Kruskal and Uzi Vishkin, as Highly Cited Researchers. The PRAM-On-Chip project at UMD is a direct outgrowth of our theoretical work. The project offers a concrete agenda for challenging the 1946 von Neumann architecture by streamlining the massive knowledge base developed by the parallel algorithms community with the roadmap for CMOS VLSI. Using our hardware and software prototypes, we recently demonstrated parallel speedups that beat all current platforms by orders of magnitude on the most advanced parallel algorithms in the literature, including max-flow, graph biconnectivity, and graph triconnectivity. This stress test validates our 30-year-old hypothesis that the PRAM algorithmic theory can save parallel computing from its programming woes, countering an architecture narrative that has for decades dismissed the PRAM theory as too simplistic for any existing or future parallel machine. The demonstrated advantage is especially important for the so-called irregular, fine-grained programs that all commercial platforms to date fail to support effectively.

  • Pattern Matching: Uzi Vishkin
    Strings of characters (or bits) are an elementary form of representing information. Pattern matching problems on strings have provided fertile ground for idea-driven algorithmic research.
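One well-known example of such idea-driven work is Rabin-Karp pattern matching, which replaces repeated character-by-character comparison with a rolling hash. The sketch below is illustrative; the base and modulus are arbitrary choices.

```python
def rabin_karp(text, pattern, base=256, mod=1_000_003):
    """Rabin-Karp: hash the pattern once, then slide a window over the
    text, updating its hash in O(1) per step.  Hash hits are verified
    by direct comparison to rule out collisions.  Returns all match
    positions; expected time O(n + m) for random-looking inputs."""
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return []
    high = pow(base, m - 1, mod)          # weight of the leading char
    ph = wh = 0
    for i in range(m):                    # initial hashes
        ph = (ph * base + ord(pattern[i])) % mod
        wh = (wh * base + ord(text[i])) % mod
    hits = []
    for i in range(n - m + 1):
        if wh == ph and text[i:i + m] == pattern:
            hits.append(i)
        if i < n - m:                     # roll the window forward
            wh = ((wh - ord(text[i]) * high) * base + ord(text[i + m])) % mod
    return hits
```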

  • Quantum Algorithms: Andrew Childs
    A quantum mechanical representation of information allows one to efficiently perform certain tasks that are intractable within a classical framework. Some of the main topics of research include quantum walks, quantum simulation, quantum algorithms for algebraic problems, and quantum query complexity.

  • Randomized Algorithms: Mohammad Hajiaghayi, Jonathan Katz, Aravind Srinivasan
    Randomized algorithms are algorithms that make random choices as they proceed; they have had a fundamental impact on many areas of computer science, including distributed computing, cryptology, and networking. Research at Maryland covers all aspects of this field, including applications to network design and routing, approximation algorithms, scheduling, and distributed algorithms. On the more theoretical side, we are also interested in studying the limits to which randomization can be useful in computation.
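A compact example of the power of random choices is Freivalds' algorithm, which checks a claimed matrix product A·B = C in O(n²) time per trial, faster than any known deterministic verification. The sketch below uses plain Python lists for clarity; names are our own.

```python
import random

def freivalds(A, B, C, trials=20):
    """Freivalds' algorithm: instead of recomputing A @ B, test
    A(Bx) == Cx on random 0/1 vectors x.  Each trial costs O(n^2)
    (three matrix-vector products); if A @ B != C, a single trial
    detects it with probability >= 1/2, so `trials` independent
    repetitions fail with probability at most 2^(-trials)."""
    n = len(A)

    def matvec(M, x):
        return [sum(M[i][j] * x[j] for j in range(n)) for i in range(n)]

    for _ in range(trials):
        x = [random.randint(0, 1) for _ in range(n)]
        if matvec(A, matvec(B, x)) != matvec(C, x):
            return False   # definitely not the product
    return True            # correct with high probability
```

The one-sided error is typical of randomized verification: a "no" answer is always right, while a "yes" is right with overwhelming probability.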