Stack Overflow: Significant events in Computer Science
[+15] [30] Brabster
[2010-03-16 22:56:25]
[ computer-science history ]
[ http://stackoverflow.com/questions/2458751] [DELETED]

What were the most significant events or milestones in the history of computer science?

I haven't been able to find a potted history, so I thought I'd see what views the SO community had on the question. I'm studying for a Masters in CS at the moment, so I'm hoping for some stuff to go take a look at that I've not come across before.

Related:
Computer science advances in past 5 years [1]
Significant new inventions in computing since 1980 [2]

(1) This seems like a good first pass at a timeline: warbaby.com/FG_test/Timeline.html. I like it because of the utilitarian nature. - vkraemer
(5) A lot of these answers seem to be about computing rather than computer science. Remember Dijkstra's remark: "Computer science is no more about computers than astronomy is about telescopes" - Simon Nickerson
I don't have an answer, but I strongly recommend reading "Hackers" by Steven Levy en.wikipedia.org/wiki/Hackers_Heroes_of_the_Computer_Revolution amazon.com/Hackers-Computer-Revolution-Steven-Levy/dp/… - Sonny
(1) erm - the possible duplicate seems to clearly state that it's only concerned with the last 5 years...? - Brabster
(2) very, very bad "possible duplicate". voted for reopen - Roman
Videos about the history of computers: youtube.com/user/ComputerHistory - Kimmo Puputti
[+24] [2010-03-16 22:59:22] Roman

Invention of Turing Machine [1]

[1] http://en.wikipedia.org/wiki/Turing_machine

(7) + Turing's proof of the undecidability of the halting problem ... - mjy
I have to agree with this.. - shake
I had a professor who claimed to have known Alan Turing. Apparently, he had a VW bus at one point, which he called his Turing Machine. Pretty punny for a genius. - mcliedtk
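As an aside for anyone exploring the Turing machine reference above, the core idea fits in a few lines of code. Here is a minimal simulator sketch (not from the thread; the transition-table format and helper names are made up for illustration):

```python
# A minimal Turing machine simulator (illustrative sketch).
# transitions maps (state, symbol) -> (symbol_to_write, move, next_state).

def run_turing_machine(tape, transitions, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in transitions:
            break  # halt: no rule applies to the current (state, symbol)
        write, move, state = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1))

# Example machine: invert every bit, halting at the first blank cell.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
}

print(run_turing_machine("1011", flip))  # -> 0100
```

Despite its simplicity, this state-plus-tape model is the formal definition of computability that the answer refers to.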
1
[+22] [2010-03-16 23:01:53] Henk Holterman

Invention of the zero (0) [1], ca 2000 BC

[1] http://en.wikipedia.org/wiki/Zero

2
[+11] [2010-03-16 23:04:44] JustJeff

The stored program. Before that, you didn't program so much as re-wire.


(2) Von Neumann, en.wikipedia.org/wiki/Von_Neumann_architecture - Henk Holterman
3
[+9] [2010-03-16 23:07:41] Roman

Invention of Lambda Calculus [1] and functional languages as a result.

[1] http://en.wikipedia.org/wiki/Lambda_calculus
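For readers following the lambda calculus link, Church numerals show how arithmetic can be built from pure functions alone. A minimal sketch (illustrative; the helper names are made up):

```python
# Church numerals: a number n is a function that applies f to x, n times.

zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Convert a Church numeral to a Python int by counting applications.
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
print(to_int(add(two)(two)))  # -> 4
```

Lisp, ML, and Haskell all descend from exactly this idea of computation as function application.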

4
[+8] [2010-03-16 23:07:57] Igby Largeman

I would think the invention of solid-state electronics/semiconductors/integrated circuits had a HUGE impact.

Also, publication of volumes 1-3 of The Art of Computer Programming [1]. Maybe not "most" important, but it is a seminal work.

[1] http://en.wikipedia.org/wiki/Art_of_Computer_Programming

5
[+8] [2010-03-16 23:05:24] Ben Griswold

Garbage collection


Amen to that one. - wheaties
6
[+7] [2010-03-16 23:01:30] JaredPar

Invention of the C programming language


(10) I'd say LISP was actually more valuable. C might be great from an engineering perspective but for actual science ... not that much I think. - Joey
(4) while undoubtedly used by many computer scientists, it's an engineering feat, C wasn't even particularly innovative as a language... - mjy
7
[+6] [2010-03-16 23:06:08] Simon Nickerson

The first compiler (A-0 programming language, Grace Hopper, 1952).

[Source: Wikipedia [1]]

[1] http://en.wikipedia.org/wiki/Compiler

8
[+5] [2010-03-16 23:02:28] vkraemer

First Bug [1]

[1] http://www.jamesshuggins.com/h/tek1/first_computer_bug.htm

9
[+5] [2010-03-16 23:09:11] Leniel Macaferi

Invention of Relational Databases [1] based on Relational Algebra [2].

Search Algorithms [3]

Artificial Intelligence [4]

[1] http://en.wikipedia.org/wiki/Relational_database
[2] http://en.wikipedia.org/wiki/Relational_algebra
[3] http://en.wikipedia.org/wiki/Search_algorithm
[4] http://en.wikipedia.org/wiki/Artificial_intelligence

10
[+5] [2010-03-16 23:04:03] Roman

Internet and first high-quality search engine


+1 for the internet, .1 for search - DaveDev
11
[+3] [2010-03-16 23:46:56] James McLeod

A significant, although not positive, event was the first time a computation algorithm was patented.


12
[+3] [2010-03-17 00:12:53] Developer Art

The idea of sticking to just two element states.

This allowed electronic components to be used for reliable calculations. Before that, attempts to work with more element states (there was an early attempt with three states) were not entirely successful: different voltage ranges for different types of elements, jitter, and other effects disrupted reliable state recognition.

So someone came up with the idea of reducing the number of states to the minimum that was still useful. That was two.

This is how the binary system was born.


"...and it had notable advantages over the binary computers which eventually replaced it (such as lower electricity consumption and lower production cost)." en.wikipedia.org/wiki/Ternary_computer - community_owned
Duplicate of stackoverflow.com/questions/2458751/… - community_owned
13
[+3] [2010-03-16 23:02:34] Brabster

Invention of the Williams Tube [1] - the first RAM

[1] http://en.wikipedia.org/wiki/Williams_tube

(4) A series of tubes? - Kevin Panko
14
[+3] [2010-03-16 23:03:45] Andriyev

Binary Number System - though this predates computers.


(6) Binary digits are just a concession to a weakness of electronics. - Henk Holterman
In electronics, lower bases are often easier to implement and less error-prone. The binary number system is more efficient in other places than just electronics. For example: normally we can only count up to 10 using our fingers. But if we count in binary using them, we can reach 1023. - Wallacoloo
@wallacoloo You say that base 2 is more efficient than base 1? Tell me more. - Josh Lee
@jleedev Did you already see my example of counting in base 2 vs 1? Now if you're talking electronics, how would you implement base 1? - Wallacoloo
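Wallacoloo's arithmetic checks out: treating ten fingers as ten bits gives 2^10 - 1 = 1023 as the largest representable count, versus 10 when each raised finger counts as one. A quick check (illustrative):

```python
# Counting on 10 fingers: one-count-per-finger vs. one-bit-per-finger.
FINGERS = 10

max_unary = FINGERS           # each raised finger adds one
max_binary = 2**FINGERS - 1   # each finger is a bit; all ten set

print(max_unary, max_binary)  # -> 10 1023
```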
15
[+2] [2010-03-16 23:10:06] N 1.1

Assembly language,

Compilers


16
[+2] [2010-03-16 23:30:28] Simon Nickerson

Invention of the algorithm (usually credited to Al-Khwārizmī in 9th century AD)


I thought Al Gore invented them. Or was that something else? - kibibu
@kibibu HAHAHAHA! THAT JOKE IS STILL FUNNY IN 2010! HAHAHAHAHAHA /me dies - Andrew Heath
I've never heard anyone else tell that joke. Although I've done so. I think it's a pun that so obvious that everybody comes up with it on their own. - Wallacoloo
17
[+2] [2010-03-16 23:40:59] Brabster

Hash Function - allowing indexes in databases, etc, first reference in 1953, according to Wikipedia [1]

[1] http://en.wikipedia.org/wiki/Hash_function
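For readers following the hash function link, the database-index use mentioned above boils down to mapping a key to a bucket so a lookup touches one bucket instead of scanning every row. A toy sketch (the hash and bucket count here are made up for illustration, not how any particular database does it):

```python
# A toy hash index: map each key to one of a fixed number of buckets.

NUM_BUCKETS = 8

def bucket_for(key: str) -> int:
    # A simple polynomial rolling hash, reduced modulo the bucket count.
    h = 0
    for ch in key:
        h = (h * 31 + ord(ch)) % (2**32)
    return h % NUM_BUCKETS

# Build the index: bucket number -> list of row ids.
index = {}
for row_id, name in enumerate(["ada", "alan", "grace"]):
    index.setdefault(bucket_for(name), []).append(row_id)

# Lookup inspects only the one bucket the key hashes to.
print(0 in index.get(bucket_for("ada"), []))  # -> True
```

Real database hash indexes add collision handling and dynamic resizing, but the bucket-per-key idea is the same.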

18
[+2] [2010-03-17 02:32:10] Jojo Sardez

Invention of Calculator [1].

[1] http://en.wikipedia.org/wiki/Calculator

19
[+2] [2010-03-17 06:11:37] uncle brad

Indirect addressing


If you ever had to write self modifying code you would vote me up. - uncle brad
20
[+2] [2010-03-17 06:19:56] wolfsnipes

Charles Babbage and the first real computer - it's hard to have computer science history without a first computer!


It's hard to call Babbage's machine the first real computer, because it was never built. It may have been the first real computer design, but it was never a real computer. - Gabe
(3) The fatal flaw with this design is that if one puts wrong figures in, the right answer does not come out. - Kevin Panko
21
[+2] [2010-03-17 20:22:01] Bratch

After recently reading about Charles P. Thacker, the 2009 Turing Award recipient, one idea would be to take a look at the list of Turing Award recipients [1] and what some of their accomplishments were. This will take you back to 1966 and contains many advancements in computer science.

[1] http://en.wikipedia.org/wiki/Turing_Award

22
[+1] [2010-03-20 19:21:17] steve
+1 Here are some crayons to go with that, try not to mess things up. - uncle brad
23
[+1] [2010-03-17 06:24:32] Gabe

The first computer program, written sometime circa 1842-1843 by Ada Lovelace, the first programmer [1].

[1] http://en.wikipedia.org/wiki/Ada_Lovelace

24
[+1] [2010-03-17 02:36:22] mikerobi

Y2K, wait, never mind.


No, this was still significant. It proved that software must be designed with consideration to the future. Sure computers didn't randomly blow up, but there were still quite a few changes made. - Wallacoloo
25
[+1] [2010-03-17 02:25:05] DaveDev

I'd say Wiki & Web 2.0 has its significance, particularly in the context of a site like StackOverflow. This is a knowledge centre that anyone can contribute to, and that is constantly vetted to a great degree by its participants. Our species has never had anything like this before for collaboration & information sharing.


But how is it a significant event in Computer Science? - community_owned
26
[+1] [2010-03-16 23:07:39] Rickard von Essen

Linux [1] and other free software [2].

EDIT: Motivation: these tools give most CS people the means to do their research and development and to learn more. This has greatly increased the pace of, and interest in, this subject. That is why it is important.

[1] http://kernel.org
[2] http://www.fsf.org/

-1, and not because of any feelings towards Linux (which I'm using to write this now). You could say that pens and paper have given "most CS people the tools to realize and do there research and development and to learn more. This have increased the peace and interest a lot in this subject. That is why it is important." - community_owned
@Roger Pate But free software has specifically impacted CS; pen and paper have taken humanity to where we are now in most (all) fields of knowledge. - Rickard von Essen
27
[0] [2010-03-16 23:08:51] JustJeff

The qubit .. if they can be entangled in sufficient quantity, a LOT of things will need to be reconsidered.


28
[0] [2010-03-17 06:10:13] Mathias

Pascal's calculator [1]. 1645!

[1] http://en.wikipedia.org/wiki/Pascal%27s_calculator

29
[0] [2010-03-24 22:05:20] John Doe

Creation of ARPANET [1] - the predecessor of Internet.

[1] http://en.wikipedia.org/wiki/ARPANET

30