Images: a capital lambda; a plot of a quicksort algorithm; the Utah teapot, representing computer graphics; a Microsoft Tastenmaus mouse, representing human–computer interaction.
Computer science deals with the theoretical foundations of computation and practical techniques for their application.

Computer science (also called computing science, and sometimes computation science, though not to be confused with computational science or software engineering) is the study of the theoretical foundations of information and computation and of their implementation and application in computer systems. It is the study of processes that interact with data and that can be represented as data in the form of programs, which enables the use of algorithms to manipulate, store, and communicate digital information. A computer scientist studies the theory of computation and the practice of designing software systems.[1]
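As a concrete illustration of how an algorithm can be expressed as a program (the lead image plots a quicksort), here is a minimal quicksort sketch in Python; the function name and sample input are illustrative only, not taken from the article.

    # Minimal illustrative sketch: quicksort, an algorithm for sorting a list.
    def quicksort(items):
        """Sort a list by recursively partitioning around a pivot element."""
        if len(items) <= 1:
            return items
        pivot = items[0]
        smaller = [x for x in items[1:] if x < pivot]
        larger = [x for x in items[1:] if x >= pivot]
        return quicksort(smaller) + [pivot] + quicksort(larger)

    print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # prints [1, 1, 2, 3, 4, 5, 6, 9]

The same procedure can be analysed mathematically (its average running time grows as n log n for n items) and can itself be stored and transmitted as data, which is the sense in which programs are data.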

Its fields can be divided into theoretical and practical disciplines. Computational complexity theory is highly abstract, while computer graphics emphasizes real-world applications. Programming language theory considers approaches to the description of computational processes, while computer programming itself involves the use of programming languages and complex systems. Human–computer interaction considers the challenges in making computers useful, usable, and accessible.

Massey University provides an overview of the field on its website.[2]

Sub-fields

Computer science has many sub-fields, including:

  • computer graphics
  • computational problems
  • computer programming
  • human–computer interaction

History

The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks such as the abacus have existed since antiquity, aiding in computations such as multiplication and division. Algorithms for performing computations have existed since antiquity, even before the development of sophisticated computing equipment.

Programmable machines were also invented by Muslim engineers, such as the automatic flute player by the Banū Mūsā brothers (9th century),[3] and Al-Jazari's programmable humanoid automata and castle clock (1206), which is considered to be the first programmable analog computer.[4]

Charles Babbage started the design of the first automatic mechanical calculator, his Difference Engine, in 1822, which eventually gave him the idea of the first programmable mechanical calculator, his Analytical Engine.[5] He started developing this machine in 1834, and "in less than two years, he had sketched out many of the salient features of the modern computer".[6] "A crucial step was the adoption of a punched card system derived from the Jacquard loom",[6] making it infinitely programmable.[note 1] In 1843, during the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm to compute the Bernoulli numbers, which is considered to be the first published algorithm ever specifically tailored for implementation on a computer.[7] Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information; eventually his company became part of IBM. Following Babbage, although unaware of his earlier work, Percy Ludgate in 1909 published[8] the second of the only two designs for mechanical analytical engines in history. In 1937, one hundred years after Babbage's impossible dream, Howard Aiken convinced IBM, which was making all kinds of punched card equipment and was also in the calculator business,[9] to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine, which itself used cards and a central computing unit. When the machine was finished, some hailed it as "Babbage's dream come true".[10]
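Lovelace's note described how the Analytical Engine could compute the Bernoulli numbers. As a modern illustration only (not her original program, which was written for Babbage's engine), a short Python sketch of the same computation using the standard recurrence is shown below.

    # Modern illustrative sketch, not Lovelace's original note: compute the
    # Bernoulli numbers from the recurrence
    #   B_m = -1/(m+1) * sum_{k=0}^{m-1} C(m+1, k) * B_k,  with B_0 = 1.
    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        """Return the Bernoulli numbers B_0 .. B_n as exact fractions."""
        B = [Fraction(1)]
        for m in range(1, n + 1):
            acc = sum(Fraction(comb(m + 1, k)) * B[k] for k in range(m))
            B.append(-acc / (m + 1))
        return B

    print([str(b) for b in bernoulli(6)])  # ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']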

During the 1940s, as new and more powerful computing machines such as the Atanasoff–Berry computer and ENIAC were developed, the term computer came to refer to the machines rather than their human predecessors.[11] As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. In 1945, IBM founded the Watson Scientific Computing Laboratory at Columbia University in New York City. The renovated fraternity house on Manhattan's West Side was IBM's first laboratory devoted to pure science. The lab is the forerunner of IBM's Research Division, which today operates research facilities around the world.[12] Ultimately, the close relationship between IBM and the university was instrumental in the emergence of a new scientific discipline, with Columbia offering one of the first academic-credit courses in computer science in 1946.[13] Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s.[14][15] The world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science department in the United States was formed at Purdue University in 1962.[16] Since practical computers became available, many applications of computing have become distinct areas of study in their own right.

Although many initially believed it was impossible that computers themselves could actually be a scientific field of study, in the late fifties it gradually became accepted among the greater academic population.[17][18] It is the now well-known IBM brand that formed part of the computer science revolution during this time. IBM (short for International Business Machines) released the IBM 704[19] and later the IBM 709[20] computers, which were widely used during the exploration period of such devices. "Still, working with the IBM [computer] was frustrating […] if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again".[17] During the late 1950s, the computer science discipline was very much in its developmental stages, and such issues were commonplace.[18]

The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the point-contact transistor, in 1947.[21][22] In 1953, the University of Manchester built the first transistorized computer, called the Transistor Computer.[23] However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialised applications.[24] The metal–oxide–semiconductor field-effect transistor (MOSFET, or MOS transistor) was invented by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959.[25][26] It was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses.[24] The MOSFET made it possible to build high-density integrated circuit chips,[27][28] leading to what is known as the computer revolution[29] or microcomputer revolution.[30]

Over time, the usability and effectiveness of computing technology have improved significantly.[31] Modern society has seen a marked shift in who uses computer technology, from usage only by experts and professionals to a near-ubiquitous user base. Initially, computers were quite costly, and some degree of human assistance was needed for efficient use, in part from professional computer operators. As computer adoption became more widespread and affordable, less human assistance was needed for common usage.

See also: History of computing and History of informatics

Contributions

Despite its short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society—in fact, along with electronics, it is a founding science of the current epoch of human history called the Information Age and a driver of the information revolution, seen as the third major leap in human technological progress after the Industrial Revolution (1750–1850 CE) and the Agricultural Revolution (8000–5000 BCE).

These contributions include:

  • The start of the "Digital Revolution", which includes the current Information Age and the Internet.[32]
  • A formal definition of computation and computability, and proof that there are computationally unsolvable and intractable problems.[33] (A sketch of the classic undecidability argument appears after this list.)
  • The concept of a programming language, a tool for the precise expression of methodological information at various levels of abstraction.[34]
  • In cryptography, breaking the Enigma code was an important factor contributing to the Allied victory in World War II.[35]
  • Scientific computing enabled practical evaluation of processes and situations of great complexity, as well as experimentation entirely by software. It also enabled advanced study of the mind, and mapping of the human genome became possible with the Human Genome Project.[32] Distributed computing projects such as Folding@home explore protein folding.
  • Algorithmic trading has increased the efficiency and liquidity of financial markets by using artificial intelligence, machine learning, and other statistical and numerical techniques on a large scale.[36] High frequency algorithmic trading can also exacerbate volatility.[37]
  • Computer graphics and computer-generated imagery have become ubiquitous in modern entertainment, particularly in television, cinema, advertising, animation and video games. Even films that feature no explicit CGI are usually "filmed" now on digital cameras, or edited or post-processed using a digital video editor.[38][39]
  • Simulation of various processes, including computational fluid dynamics, physical, electrical, and electronic systems and circuits, as well as societies and social situations (notably war games) along with their habitats, among many others. Modern computers enable optimization of such designs as complete aircraft. Notable in electrical and electronic circuit design are SPICE,[40] as well as software for physical realization of new (or modified) designs. The latter includes essential design software for integrated circuits.
  • Artificial intelligence is becoming increasingly important as it gets more efficient and complex. There are many applications of AI, some of which can be seen at home, such as robotic vacuum cleaners. It is also present in video games and on the modern battlefield in drones, anti-missile systems, and squad support robots.[41]
  • Human–computer interaction combines novel algorithms with design strategies that enable rapid human performance, low error rates, ease in learning, and high satisfaction. Researchers use ethnographic observation and automated data collection to understand user needs, then conduct usability tests to refine designs. Key innovations include direct manipulation, selectable web links, touchscreen designs, mobile applications, and virtual reality.
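The undecidability result mentioned in the computability item above is usually shown by a diagonal argument. The following Python sketch is a hypothetical illustration of that argument, not a reproduction of any proof in the cited sources: suppose a total decider halts(program, argument) existed; a program built on top of it would then contradict its own prediction.

    # Hypothetical sketch of the halting-problem argument; `halts` cannot
    # actually be implemented, which is exactly the point.
    def halts(program, argument):
        """Assumed oracle: returns True iff program(argument) terminates."""
        raise NotImplementedError("no total halting decider can exist")

    def paradox(program):
        # Do the opposite of whatever the oracle predicts about program(program).
        if halts(program, program):
            while True:       # loop forever if predicted to halt
                pass
        return "halted"       # halt if predicted to loop forever

    # paradox(paradox) would halt exactly when halts() says it does not,
    # a contradiction, so no correct, always-terminating halts() can exist.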

See also

  • Information technology
  • List of computer scientists
  • List of important publications in computer science
  • List of pioneers in computer science
  • List of unsolved problems in computer science
  • List of terms relating to algorithms and data structures
  • Software engineering

Notes

  1. "The introduction of punched cards into the new engine was important not only as a more convenient form of control than the drums, or because programs could now be of unlimited extent, and could be stored and repeated without the danger of introducing errors in setting the machine by hand; it was important also because it served to crystallize Babbage's feeling that he had invented something really new, something much more than a sophisticated calculating machine." Bruce Collier, 1970

References

  1. "WordNet Search—3.1". Wordnetweb.princeton.edu. Retrieved 14 May 2012.
  2. Massey University
  3. Koetsier, Teun (2001), "On the prehistory of programmable machines: musical automata, looms, calculators", Mechanism and Machine Theory, 36 (5): 589–603, doi:10.1016/S0094-114X(01)00005-2..
  4. Ancient Discoveries, Episode 11: Ancient Robots, History Channel, archived from the original on March 1, 2014, retrieved 2008-09-06
  5. "Science Museum—Introduction to Babbage". Archived from the original on September 8, 2006. Retrieved 24 September 2006.
  6. Anthony Hyman (1982). Charles Babbage, Pioneer of the Computer.
  7. "A Selection and Adaptation From Ada's Notes found in Ada, The Enchantress of Numbers," by Betty Alexandra Toole Ed.D. Strawberry Press, Mill Valley, CA". Archived from the original on February 10, 2006. Retrieved 4 May 2006.
  8. The John Gabriel Byrne Computer Science Collection
  9. "In this sense Aiken needed IBM, whose technology included the use of punched cards, the accumulation of numerical data, and the transfer of numerical data from one register to another", Bernard Cohen, p.44 (2000)
  10. Brian Randell, p. 187, 1975
  11. The Association for Computing Machinery (ACM) was founded in 1947.
  12. "IBM Archives: 1945". Ibm.com. Retrieved 2019-03-19.
  13. "IBM100 – The Origins of Computer Science". Ibm.com. 1995-09-15. Retrieved 2019-03-19.
  14. Denning, Peter J. (2000). "Computer Science: The Discipline" (PDF). Encyclopedia of Computer Science. Archived from the original (PDF) on May 25, 2006.
  15. "Some EDSAC statistics". University of Cambridge. Retrieved 19 November 2011.
  16. "Computer science pioneer Samuel D. Conte dies at 85". Purdue Computer Science. July 1, 2002. Retrieved December 12, 2014.
  17. Levy, Steven (1984). Hackers: Heroes of the Computer Revolution. Doubleday. ISBN 978-0-385-19195-1.
  18. Tedre, Matti (2014). The Science of Computing: Shaping a Discipline. Taylor and Francis / CRC Press.
  19. "IBM 704 Electronic Data Processing System—CHM Revolution". Computerhistory.org. Retrieved 7 July 2013.
  20. "IBM 709: a powerful new data processing system" (PDF). Computer History Museum. Archived from the original (PDF) on March 4, 2016. Retrieved December 12, 2014.
  21. Lee, Thomas H. (2003). The Design of CMOS Radio-Frequency Integrated Circuits (PDF). Cambridge University Press. ISBN 9781139643771.
  22. Puers, Robert; Baldi, Livio; Voorde, Marcel Van de; Nooten, Sebastiaan E. van (2017). Nanoelectronics: Materials, Devices, Applications, 2 Volumes. John Wiley & Sons. p. 14. ISBN 9783527340538.
  23. Lavington, Simon (1998), A History of Manchester Computers (2 ed.), Swindon: The British Computer Society, pp. 34–35
  24. Moskowitz, Sanford L. (2016). Advanced Materials Innovation: Managing Global Technology in the 21st century. John Wiley & Sons. pp. 165–167. ISBN 9780470508923.
  25. "1960 - Metal Oxide Semiconductor (MOS) Transistor Demonstrated". The Silicon Engine. Computer History Museum.
  26. Lojek, Bo (2007). History of Semiconductor Engineering. Springer Science & Business Media. pp. 321–3. ISBN 9783540342588.
  27. "Who Invented the Transistor?". Computer History Museum. 4 December 2013. Retrieved 20 July 2019.
  28. Hittinger, William C. (1973). "METAL-OXIDE-SEMICONDUCTOR TECHNOLOGY". Scientific American. 229 (2): 48–59. Bibcode:1973SciAm.229b..48H. doi:10.1038/scientificamerican0873-48. ISSN 0036-8733. JSTOR 24923169.
  29. Fossum, Jerry G.; Trivedi, Vishal P. (2013). Fundamentals of Ultra-Thin-Body MOSFETs and FinFETs. Cambridge University Press. p. vii. ISBN 9781107434493.
  30. Malmstadt, Howard V.; Enke, Christie G.; Crouch, Stanley R. (1994). Making the Right Connections: Microcomputers and Electronic Instrumentation. American Chemical Society. p. 389. ISBN 9780841228610. The relative simplicity and low power requirements of MOSFETs have fostered today's microcomputer revolution.
  31. "Timeline of Computer History". Computer History Museum. Retrieved November 24, 2015.
  32. 32.0 32.1 "Computer Science : Achievements and Challenges circa 2000" (PDF). Archived from the original (PDF) on September 11, 2006. Retrieved January 11, 2007.
  33. Constable, R.L. (March 2000). "Computer Science: Achievements and Challenges circa 2000" (PDF).
  34. Abelson, H.; G.J. Sussman with J. Sussman (1996). Structure and Interpretation of Computer Programs (2nd ed.). MIT Press. ISBN 978-0-262-01153-2. The computer revolution is a revolution in the way we think and in the way we express what we think. The essence of this change is the emergence of what might best be called procedural epistemology – the study of the structure of knowledge from an imperative point of view, as opposed to the more declarative point of view taken by classical mathematical subjects.
  35. David Kahn, The Codebreakers, 1967, ISBN 0-684-83130-9.
  36. "Black box traders are on the march". The Telegraph. August 26, 2006. Archived from the original on June 21, 2008.
  37. Kirilenko, Andrei A.; Kyle, Albert S.; Samadi, Mehrdad; Tuzun, Tugkan (2017-01-06). "The Impact of High Frequency Trading on an Electronic Market" (PDF). Papers.ssrn.com. doi:10.2139/ssrn.1686004. SSRN 1686004.
  38. Maly, Timy (2013-01-30). "How Digital Filmmakers Produced a Gorgeous Sci-Fi Movie on a Kickstarter Budget". Wired. Retrieved November 24, 2015.
  39. Matthau, Charles (2015-01-08). "How Tech Has Shaped Film Making: The Film vs. Digital Debate Is Put to Rest". Wired. Retrieved November 24, 2015.
  40. Muhammad H. Rashid, 2016. SPICE for Power Electronics and Electric Power. CRC Press. p. 6. ISBN 978-1-4398-6047-2.
  41. Marko B. Popovic, 2019. Biomechatronics. Elsevier Science. p. 501. ISBN 978-0-12-813041-4.

Further reading

Overview

  • Tucker, Allen B. (2004). Computer Science Handbook (2nd ed.). Chapman and Hall/CRC. ISBN 978-1-58488-360-9.
    • "Within more than 70 chapters, every one new or significantly revised, one can find any kind of information and references about computer science one can imagine. […] all in all, there is absolute nothing about Computer Science that can not be found in the 2.5 kilogram-encyclopaedia with its 110 survey articles […]." (Christoph Meinel, Zentralblatt MATH)
  • van Leeuwen, Jan (1994). Handbook of Theoretical Computer Science. The MIT Press. ISBN 978-0-262-72020-5.
    • "[…] this set is the most unique and possibly the most useful to the [theoretical computer science] community, in support both of teaching and research […]. The books can be used by anyone wanting simply to gain an understanding of one of these areas, or by someone desiring to be in research in a topic, or by instructors wishing to find timely information on a subject they are teaching outside their major areas of expertise." (Rocky Ross, SIGACT News)
  • Ralston, Anthony; Reilly, Edwin D.; Hemmendinger, David (2000). Encyclopedia of Computer Science (4th ed.). Grove's Dictionaries. ISBN 978-1-56159-248-7.
    • "Since 1976, this has been the definitive reference work on computer, computing, and computer science. […] Alphabetically arranged and classified into broad subject areas, the entries cover hardware, computer systems, information and data, software, the mathematics of computing, theory of computation, methodologies, applications, and computing milieu. The editors have done a commendable job of blending historical perspective and practical reference information. The encyclopedia remains essential for most public and academic library reference collections." (Joe Accardin, Northeastern Illinois Univ., Chicago)
  • Edwin D. Reilly (2003). Milestones in Computer Science and Information Technology. Greenwood Publishing Group. ISBN 978-1-57356-521-9.

Selected literature

  • Knuth, Donald E. (1996). Selected Papers on Computer Science. CSLI Publications, Cambridge University Press.
  • Collier, Bruce (1990). The little engine that could've: The calculating machines of Charles Babbage. Garland Publishing Inc. ISBN 978-0-8240-0043-1.
  • Cohen, Bernard (2000). Howard Aiken, Portrait of a computer pioneer. The MIT Press. ISBN 978-0-262-53179-5.
  • Tedre, Matti (2014). The Science of Computing: Shaping a Discipline. CRC Press, Taylor & Francis.
  • Randell, Brian (1973). The Origins of Digital Computers, Selected Papers. Springer-Verlag. ISBN 978-3-540-06169-4.
    • "Covering a period from 1966 to 1993, its interest lies not only in the content of each of these papers – still timely today – but also in their being put together so that ideas expressed at different times complement each other nicely." (N. Bernard, Zentralblatt MATH)

Articles

  • Peter J. Denning. Is computer science science?, Communications of the ACM, April 2005.
  • Peter J. Denning, Great principles in computing curricula, Technical Symposium on Computer Science Education, 2004.
  • Research evaluation for computer science, Informatics Europe report. Shorter journal version: Bertrand Meyer, Christine Choppy, Jan van Leeuwen and Jorgen Staunstrup, Research evaluation for computer science, in Communications of the ACM, vol. 52, no. 4, pp. 31–34, April 2009.

Curriculum and classification

  • Association for Computing Machinery. 1998 ACM Computing Classification System. 1998.
  • Joint Task Force of Association for Computing Machinery (ACM), Association for Information Systems (AIS) and IEEE Computer Society (IEEE CS). Computing Curricula 2005: The Overview Report. September 30, 2005.
  • Norman Gibbs, Allen Tucker. "A model curriculum for a liberal arts degree in computer science". Communications of the ACM, Volume 29 Issue 3, March 1986.

This page uses Creative Commons licensed content from Wikipedia (view authors).