This article is about the branch of computer science and mathematics. For the journal, see Theoretical Computer Science (journal).

Theoretical computer science is a subfield of general computer science and mathematics that focuses on the more abstract and mathematical aspects of computing, including the theory of computation.
Scope
It is not easy to circumscribe the theory areas precisely. The ACM's Special Interest Group on Algorithms and Computation Theory (SIGACT) describes its mission as the promotion of theoretical computer science and notes:
The field of theoretical computer science is interpreted broadly so as to include algorithms, data structures, computational complexity theory, distributed computation, parallel computation, VLSI, machine learning, computational biology, computational geometry, information theory, cryptography, quantum computation, computational number theory and algebra, program semantics and verification, automata theory, and the study of randomness. Work in this field is often distinguished by its emphasis on mathematical technique and rigor.
To this list, the ACM's journal Transactions on Computation Theory adds coding theory, computational learning theory, and theoretical computer science aspects of areas such as databases, information retrieval, economic models, and networks. Despite this broad scope, the "theory people" in computer science self-identify as different from the "applied people." Some characterize themselves as doing the "(more fundamental) 'science(s)' underlying the field of computing." Other "theory-applied people" suggest that it is impossible to separate theory and application. In practice, the so-called "theory people" regularly draw on experimental work done in less theoretical areas such as software systems research, and there is more cooperation than mutually exclusive competition between theory and application.
History
Main article: History of computer science

While formal algorithms have existed for millennia (Euclid's algorithm for determining the greatest common divisor of two numbers is still used in computation), it was not until 1936 that Alan Turing, Alonzo Church and Stephen Kleene formalized the definition of an algorithm in terms of computation. Binary and logical systems of mathematics had existed before 1703, when Gottfried Leibniz formalized logic with binary values for true and false. While logical inference and mathematical proof had existed in ancient times, in 1931 Kurt Gödel proved with his incompleteness theorem that there were fundamental limitations on what statements could be proved or disproved.
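Euclid's algorithm, mentioned above as one of the oldest formal algorithms still in everyday use, can be sketched in a few lines (a minimal illustration, not tied to any particular source implementation):

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
    until the remainder is zero; the last nonzero value is the GCD."""
    while b:
        a, b = b, a % b
    return a

print(gcd(252, 105))  # -> 21, since 252 = 2^2 * 3^2 * 7 and 105 = 3 * 5 * 7
```

The same procedure appears in Euclid's Elements (c. 300 BC), stated there in terms of repeated subtraction rather than the remainder operation.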
These developments have led to the modern study of logic and computability, and indeed the field of theoretical computer science as a whole. Information theory was added to the field with a 1948 mathematical theory of communication by Claude Shannon. In the same decade, Donald Hebb introduced a mathematical model of learning in the brain. As biological data accumulated in support of this hypothesis, with some modification, the fields of neural networks and parallel distributed processing were established. In 1971, Stephen Cook and, working independently, Leonid Levin, proved that there exist practically relevant problems that are NP-complete – a landmark result in computational complexity theory.
With the development of quantum mechanics at the beginning of the 20th century came the concept that mathematical operations could be performed on an entire particle wavefunction; in other words, one could compute functions on multiple states simultaneously. This led to the concept of a quantum computer in the latter half of the 20th century, a field that took off in the 1990s when Peter Shor showed that such methods could be used to factor large numbers in polynomial time, which, if implemented, would render most modern public key cryptography systems insecure.
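Why fast factoring breaks public key cryptography can be seen in a toy RSA-style example (hypothetical, deliberately tiny parameters chosen only for demonstration; real keys use primes hundreds of digits long, far beyond trial division):

```python
# Key generation from two secret primes (toy sizes).
p, q = 61, 53
n = p * q                 # public modulus, 3233
e = 17                    # public exponent
phi = (p - 1) * (q - 1)   # 3120
d = pow(e, -1, phi)       # private exponent (Python 3.8+ modular inverse)

msg = 42
cipher = pow(msg, e, n)   # encryption with the public key

# An attacker who can factor n recovers the private key outright.
f = next(k for k in range(2, n) if n % k == 0)  # trial-division factoring
g = n // f
d_attacker = pow(e, -1, (f - 1) * (g - 1))
print(pow(cipher, d_attacker, n))  # -> 42, the original message
```

Trial division takes time exponential in the number of digits of n, which is why large moduli are safe against it; Shor's result is that a quantum computer could do the factoring step in polynomial time.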
Modern theoretical computer science research builds on these foundational developments, but includes many other mathematical and interdisciplinary problems that have been posed.
Organizations
Journals and newsletters
- Information and Computation
- Theory of Computing (open access journal)
- Formal Aspects of Computing
- Journal of the ACM
- SIAM Journal on Computing (SICOMP)
- SIGACT News
- Theoretical Computer Science
- Theory of Computing Systems
- International Journal of Foundations of Computer Science
- Chicago Journal of Theoretical Computer Science (open access journal)
- Foundations and Trends in Theoretical Computer Science
- Journal of Automata, Languages and Combinatorics
- Acta Informatica
- Fundamenta Informaticae
- ACM Transactions on Computation Theory
- Computational Complexity
- ACM Transactions on Algorithms
- Information Processing Letters
Conferences
- Annual ACM Symposium on Theory of Computing (STOC)
- Annual IEEE Symposium on Foundations of Computer Science (FOCS)
- ACM–SIAM Symposium on Discrete Algorithms (SODA)
- Annual Symposium on Computational Geometry (SoCG)
- International Colloquium on Automata, Languages and Programming (ICALP)
- Symposium on Theoretical Aspects of Computer Science (STACS)
- International Conference on Theory and Applications of Models of Computation (TAMC)
- European Symposium on Algorithms (ESA)
- IEEE Symposium on Logic in Computer Science (LICS)
- International Symposium on Algorithms and Computation (ISAAC)
- Workshop on Approximation Algorithms for Combinatorial Optimization Problems (APPROX)
- Workshop on Randomization and Computation (RANDOM)
- Computational Complexity Conference (CCC)
- ACM Symposium on Parallelism in Algorithms and Architectures (SPAA)
- ACM Symposium on Principles of Distributed Computing (PODC)
- International Symposium on Fundamentals of Computation Theory (FCT)
See also
- Formal science
- Unsolved problems in computer science
- List of important publications in theoretical computer science
Notes
- "SIGACT". Retrieved 2009-03-29.
- "ToCT". Retrieved 2010-06-09.
- "Challenges for Theoretical Computer Science: Theory as the Scientific Foundation of Computing". Retrieved 2009-03-29.
- The 2007 Australian Ranking of ICT Conferences: tier A+.
- The 2007 Australian Ranking of ICT Conferences: tier A.
- FCT 2011 (retrieved 2013-06-03)
Further reading
- Martin Davis, Ron Sigal, Elaine J. Weyuker, Computability, complexity, and languages: fundamentals of theoretical computer science, 2nd ed., Academic Press, 1994, ISBN 0-12-206382-1. Covers theory of computation, but also program semantics and quantification theory. Aimed at graduate students.
External links
- SIGACT directory of additional theory links
- Theory Matters Wiki, a Theoretical Computer Science (TCS) advocacy wiki
- Usenet comp.theory
- List of academic conferences in the area of theoretical computer science at confsearch
- Theoretical Computer Science - StackExchange, a Question and Answer site for researchers in theoretical computer science
- Computer Science Animated
- http://theory.csail.mit.edu/ at the Massachusetts Institute of Technology