Significant new inventions in computing since 1980
[+561] [129] Alan Kay
[2009-01-11 13:27:42]
[ computer-science ]
[ https://stackoverflow.com/questions/432922/significant-new-inventions-in-computing-since-1980 ]

This question arose from comments [1] about different kinds of progress in computing over the last 50 years or so.

I was asked by some of the other participants to raise it as a question to the whole forum.

The basic idea here is not to bash the current state of things but to try to understand something about the progress of coming up with fundamental new ideas and principles.

I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask "Why?" and "What should we be doing?"

(77) Jeff Atwood confirmed that the user "Alan Kay" is THE Alan Kay. You know, the guy who worked for that copier machine company... ;-) en.wikipedia.org/wiki/Alan_Kay - splattne
(1) I watched this video: video.google.com/videoplay?docid=-533537336174204822 - A historical Video (1979) about the development of the Dynabook, Children and Computers and a lot more presented by Alan Kay. AMAZING things done before 1970 - especially the "Sketchpad" part in 1962. - splattne
(1) Answering this question is hard, because a more essential, historical, and philosophical question should be answered first: how do you define "a really new idea"? - Emile Vrijdags
(2) Depending on your own definition, the answer could be anything from "none" up to an enumeration of every possible technology. And all those answers would be either correct or incorrect, depending on the definition of "a new idea" the reader/observer uses... - Emile Vrijdags
(1) If this thread has taught me something, it is how many of the things we take for granted have existed for more than 3 decades in the academic and research sector. - Esteban Küber
(3) After looking at all the answers here: Good grief! Have we done nothing in the past 30 years?? - Jeremy Powell
(1) Does Differential Cryptanalysis count? - BlueRaja - Danny Pflughoeft
(2) @Will: Oddly enough I believe I have recently learned of an interesting answer to this question: fast clustering algorithms. DBSCAN is the state of the art for a lot of this (O(n log n) in the number of points in the data set), and it dates to 1996. Alas, with the question closed I will not take the time to read the many answers to find out if someone has beaten me to it. - dmckee --- ex-moderator kitten
Eh, @dmckee - it does happen to be open at the moment... If you still wanted to throw one on the pile, I don't mind leaving it that way. - Shog9
[+311] [2009-01-11 15:11:57] splattne

The Internet itself pre-dates 1980, but the World Wide Web ("distributed hypertext via simple mechanisms") as proposed and implemented by Tim Berners-Lee started in 1989/90.

While the idea of hypertext had existed before (Nelson’s Xanadu [1] had tried to implement a distributed scheme), the WWW was a new approach for implementing a distributed hypertext system. Berners-Lee combined a simple client-server protocol, markup language, and addressing scheme in a way that was powerful and easy to implement.

I think most innovations are created in re-combining existing pieces in an original way. Each of the pieces of the WWW had existed in some form before, but the combination was obvious only in hindsight.

And I know for sure that you are using it right now.

[1] http://www.nytimes.com/2009/01/11/business/11stream.html
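
As a rough illustration of how simple that combination really is, here is a toy Python sketch that uses the addressing scheme (host plus path), speaks the protocol (a one-line HTTP GET), and receives the markup (HTML) back. The host name is just a placeholder; this is a sketch of the idea, not of any production client.

    import socket

    # The "simple mechanisms": an address (host + path), a one-line protocol
    # request (HTTP GET), and a markup document (HTML) coming back.
    host, path = "example.com", "/"

    with socket.create_connection((host, 80)) as s:
        s.sendall(f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n".encode())
        response = b""
        while chunk := s.recv(4096):
            response += chunk

    print(response.decode(errors="replace")[:200])   # status line, headers, then HTML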

(26) +1 for the most obvious but also the most easily forgotten because we all take it for granted :) - PolyThinker
(20) I'm not using the World Wide Web right now. I'm using a series of tubes known as the internets, achieved via the google. - Robert S.
@le dorfier, the World Wide Web is a system of interlinked hypertext documents accessed via the Internet; it's not TCP/IP networking. The World Wide Web began in 1989. - Roberto Russo
WWW is an implementation of hypertext. Hypertext was invented in the 60's. - bruceatk
@bruceatk, I guess every invention is a combination of existing parts - and hypertext is a very important component, but not the only one which made WWW such a success. - splattne
(13) @bruceatk: Hypertext is an implementation of text. Text was invented in 3500 BC. - Portman
@splattne/Portman - I see WWW as the logical progression of hypertext that took off when the environment was ready for it. I can admit that it is enough of a leap to be considered an invention. The invention part was actually written about in 1980 by Tim Berners-Lee, so it probably predates 1980. - bruceatk
(1) @bruceatk: I don't believe he wrote about the WWW until 1989. w3.org/People/Berners-Lee - Portman
+1 for mentioning xanadu, which in my opinion was much better concept (in terms of scalability and wiki-style versioning) than how WWW got implemented. - dusoft
(2) @splattne: And think has become search - u0b34a0f6ae
(1) For those too young to remember, WWW before Mozilla was nothing special (I tried the telnet-to-text-browser). Mozilla started a wildfire. To give an idea of the difference think CounterStrike without networking. - Thorbjørn Ravn Andersen
[+235] [2009-01-11 14:01:10] Oddthinking

Free Software Foundation [1] (Established 1985)

Even if you aren't a wholehearted supporter of their philosophy, the ideas they have been pushing (free software, open source) have had an amazing influence on the software industry and on content in general (e.g. Wikipedia).

[1] http://en.wikipedia.org/wiki/Free_Software_Foundation

(2) unix itself was born as a collaborative freely distributed project of Bell Labs and collaborators (and subsequently UC Berkeley and other sources with variants and contributions.) It has taken some ugly detours, but it's effectively Open Source now because the cat was out of the bag from birth. - dkretz
(4) Most database technology was born the same way. In many senses, FSF and its ilk are simply restoring what had previously been provided by educational and corporate basic research facilities. - dkretz
(9) Agree that FSF has been very influential, but there is a tendency among its advocates to espouse "group think". So many FSF advocates cannot accept that Apple OSX and MS Windows are much better than any open source OS for the average user. No one wants to admit that. - RussellH
(32) The entire purpose of the FSF is to promote software that can be freely used, modified, and redistributed by all. OSX and Windows are not "better" at this by any definition. - Adam Lassek
(5) @RussellH: you're confusing "Open Source" and "Free (as in Freedom) Software". Your comment, in fact, illustrates precisely why the distinction is important. But anyway, Firefox is better than Internet Explorer and Safari, and it's more important to users than Windows vs MacOS vs Linux. - niXar
(2) niXar: how is RussellH confusing Open Source and Free Software? Can you please point to their definitions and tell me any difference that has to do with RussellH's post? Can you point to any license that's one and not the other? - Jonas Kölker
I will never embrace communist principles. How did this get upvoted so much? - Janie
(8) Janie, you don't have to be a supporter to see that the principles that they are pushing have had a major effect on the industry. I have no interest in getting dragged into a discussion as to whether the FSF is communistic, or whether you should embrace some communist principles. - Oddthinking
(3) @Jonas: It's a bit like the difference between the People's Front of Judea and the Judean People's Front, but only a bit. There's a definite difference between OS and FS: gnu.org/philosophy/open-source-misses-the-point.html - outis
(9) Legal invention, not computing invention. - Charles Stewart
[+149] [2009-01-11 14:14:08] Dylan Beattie

I think it's fair to say that in 1980, if you were using a computer, you were either getting paid for it or you were a geek... so what's changed?

  • Printers and consumer-level desktop publishing. Meant you didn't need a printing press to make high-volume, high-quality printed material. That was big - of course, nowadays we completely take it for granted, and mostly we don't even bother with the printing part because everyone's online anyway.

  • Colour. Seriously. Colour screens made a huge difference to non-geeks' perception of games & applications. Suddenly games seemed less like hard work and more like watching TV, which opened the doors for Sega, Nintendo, Atari et al to bring consumer gaming into the home.

  • Media compression (MP3s and video files). And a whole bunch of things - like TiVO and iPods - that we don't really think of as computers any more because they're so ubiquitous and so user-friendly. But they are.

The common thread here, I think, is stuff that was once impossible (making printed documents; reproducing colour images accurately; sending messages around the world in real time; distributing audio and video material), and was then expensive because of the equipment and logistics involved, and is now consumer-level. So - what are big corporates doing now that used to be impossible but might be cool if we can work out how to do it small & cheap?

Anything that still involves physical transportation is interesting to look at. Video conferencing hasn't replaced real meetings (yet) - but with the right technology, it still might. Some recreational travel could be eliminated by a full-sensory immersive environment - home cinema is a trivial example; another is the "virtual golf course" in an office building in Soho, where you play 18 holes of real golf on a simulated course.

For me, though, the next really big thing is going to be fabrication. Making things. Spoons and guitars and chairs and clothing and cars and tiles and stuff. Things that still rely on a manufacturing and distribution infrastructure. I don't have to go to a store to buy a movie or an album any more - how long until I don't have to go to the store for clothing and kitchenware?

Sure, there are interesting developments going on with OLED displays and GPS and mobile broadband and IoC containers and scripting and "the cloud" - but it's all still just new-fangled ways of putting pictures on a screen. I can print my own photos and write my own web pages, but I want to be able to fabricate a linen basket that fits exactly into that nook beside my desk, and a mounting bracket for sticking my guitar FX unit to my desk, and something for clipping my cellphone to my bike handlebars.

Not programming related? No... but in 1980, neither was sound production. Or video distribution. Or sending messages to your relatives in Zambia. Think big, people... :)


(1) I think media compression is not a new concept (it goes back to Shannon's work in 50s), it's just become feasible with improved hardware (fast enough, able to play the media). - Kornel
I would have to agree with fabrication being something I think may be one of the next big things. When object "printers" become mainstream (printers that can replicate simple physical items that are durable) I think we will be there. - Andy Webb
It would also be great to scan existing items so replacements can be made. I have on many occasions had to shop for an odd screw or part to replace one that broke around the house or on my bike. With such a system I could scan the old part, repair it in software, and then create the replacement. - Andy Webb
And if you see piracy purely as a problem, you will hate that future. :-) - Jens Ayton
@Ahruman - have you read "Printcrime" by Cory Doctorow? Short story dealing with exactly that subject... craphound.com/?p=573 - Dylan Beattie
I agree the invention of MP3 and other non-lossless compression methods was very significant - kohlerm
Interesting story Dylan and interesting thought on how piracy fits into all of this. - Andy Webb
(44) Desktop publishing and high quality printing was invented at Xerox PARC in the 70s, some of the Altos back then also had high quality color screens. The Internet predated 1980. Media compression predated 1980. The question is about what fundamental new technologies have been invented since 1980 - Alan Kay
Agree with Alan there - None of these are new inventions, all the items mentioned above are simply advancements in technology of older concepts. - saschabeaumont
Didn't the Apple II have colour before 1980? - Tom Hawtin - tackline
(3) You sir, are a visionary. Do not let the man get you down. 'Printing' printers is the next big revolution. - Waylon Flinn
Fabricating objects at home is already well on its way. Check out 3D printing: en.wikipedia.org/wiki/3D_printing - Peter Di Cecco
I'm not familiar with the idea of 3D printing, but what you talk of sounds like a nanofactory (en.wikipedia.org/wiki/Molecular_assembler#Nanofactories) - tshepang
@Dylan Isn't this what replicators do in Star Trek? - greatwolf
[+137] [2009-01-11 14:54:13] merriam

Package management and distributed revision control.

These patterns in the way software is developed and distributed are quite recent, and are still just beginning to make an impact.

Ian Murdock has called package management [1] "the single biggest advancement Linux has brought to the industry". Well, he would, but he has a point. The way software is installed has changed significantly since 1980, but most computer users still haven't experienced this change.

Joel and Jeff have been talking about revision control (or version control, or source control) with Eric Sink [2] in Podcast #36 [3]. It seems most developers haven't yet caught up with centralized systems, and DVCS is widely seen as mysterious and unnecessary.

From the Podcast 36 transcript [4]:

0:06:37

Atwood: ... If you assume -- and this is a big assumption -- that most developers have kinda sorta mastered fundamental source control -- which I find not to be true, frankly...

Spolsky: No. Most of them, even if they have, it's the check-in, check-out that they understand, but branching and merging -- that confuses the heck out of them.

[1] http://en.wikipedia.org/wiki/Package_management_system#Impact
[2] http://www.ericsink.com/entries/stack_overflow_podcast_36.html
[3] https://blog.stackoverflow.com/2009/01/podcast-36/
[4] https://stackoverflow.fogbugz.com/default.asp?W29018
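
To make the distributed part a little less mysterious, here is a minimal Python sketch of the content-addressed snapshot idea that distributed version control systems are built on: every snapshot and commit is stored under the hash of its own content, so any copy of the store can verify and exchange history independently. This illustrates the concept only; it is not any real tool's storage format.

    import hashlib

    # Toy content-addressed object store: identical content is stored once,
    # and a commit is just another object pointing at a snapshot and a parent.
    objects = {}

    def put(data):
        key = hashlib.sha1(data).hexdigest()
        objects[key] = data
        return key

    def commit(files, parent, message):
        snapshot = put(repr(sorted(files.items())).encode())
        return put(f"snapshot {snapshot}\nparent {parent}\n\n{message}".encode())

    def log(commit_id):
        while commit_id:
            header, _, message = objects[commit_id].decode().partition("\n\n")
            print(commit_id[:8], message)
            parent = header.splitlines()[1].split(" ", 1)[1]
            commit_id = None if parent == "None" else parent

    c1 = commit({"readme.txt": "v1"}, None, "first commit")
    c2 = commit({"readme.txt": "v2"}, c1, "second commit")
    log(c2)   # walks the history from the newest commit back to the root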

(1) If one should count as a significant new invention, it's git. - hasen
(8) hasen j: git is a fantastic DVCS; however, there were several others implemented before it. git is a significant new -implementation- of an idea. - Arafangion
+1 for Package Management. Still one of the major things that Linux/BSD has to hold over everybody elses' heads, although the rest are getting there (just really slowly). - new123456
Even server-based revision control systems are largely post-1980 developments, and going from just having the current state to having the history of the state as well… that's a colossal and subtle change. - Donal Fellows
Distributed revision control is the wrong name. Nobody cares if your system is centralized or not. What is important is whether you track change-sets or versions. But most of the time, they come together (Git, Mercurial), which confuses everybody. Joel Spolsky said it himself in a blog post: With distributed version control, the distributed part is actually not the most interesting part. - Benjamin Crouzier
@pinouchon: Joel's comment (that the interesting part is that DVCSes think in terms of diffs rather than snapshots) is actually false in the case of git. git works fundamentally with snapshots and computes diffs on the fly. - kini
[+122] [2009-01-22 07:16:07] Kief

BitTorrent [1]. It completely turns what previously seemed like an obviously immutable rule on its head - the time it takes for a single person to download a file over the Internet grows in proportion to the number of people downloading it. It also addresses the flaws of previous peer-to-peer solutions, particularly around 'leeching', in a way that is organic to the solution itself.

BitTorrent elegantly turns what is normally a disadvantage - many users trying to download a single file simultaneously - into an advantage, distributing the file geographically as a natural part of the download process. Its strategy for optimizing the use of bandwidth between two peers discourages leeching as a side-effect - it is in the best interest of all participants to enforce throttling.

It is one of those ideas which, once someone else invents it, seems simple, if not obvious.

[1] http://en.wikipedia.org/wiki/BitTorrent_%28protocol%29
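
A toy round-based simulation makes the point concrete: because every downloader re-uploads the pieces it already has, adding peers adds capacity rather than load. This sketch leaves out trackers, choking, tit-for-tat and rarest-first selection; the piece counts and peer counts are arbitrary.

    import random

    # One seed starts with all the pieces; every other peer uploads whatever
    # pieces it has to someone who still lacks them, a few pieces per round.
    def simulate(num_peers, num_pieces=50, uploads_per_round=4):
        random.seed(1)
        peers = [set(range(num_pieces))] + [set() for _ in range(num_peers)]
        rounds = 0
        while any(len(p) < num_pieces for p in peers[1:]):
            rounds += 1
            for uploader in peers:
                have = random.sample(sorted(uploader), min(uploads_per_round, len(uploader)))
                for piece in have:
                    wanting = [p for p in peers if piece not in p]
                    if wanting:
                        random.choice(wanting).add(piece)
        return rounds

    print("rounds with 20 peers: ", simulate(20))
    print("rounds with 200 peers:", simulate(200))   # note how slowly this grows with peer count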

True, although BitTorrent may be somewhat different/improved, the significant new invention really should be P2P distribution, rather than any specific implementation like BitTorrent. - Ilari Kajaste
(10) I disagree. P2P is not at all new, it's older than USENET. Pre-bitTorrent "P2P" apps for the desktop (Kazaa and the like) are simply repacking of the client-server concept, adding a dynamic central directory of servers. Each "peer" client connects to a single other "peer" server to transfer a file. The fact that a single node does both is old hat (at least for pre-Windows systems). The bitTorrent protocol is (AFAIK) a completely new way to transfer files, which leverages multiple systems to transfer a file between one another in a truly distributed manner. - Kief
-1. In reality torrents are much slower than direct download, so the practical applications just don't support the theory. In reality you'll always have more leechers than seeders. Most ISPs throttle torrent traffic lately, and do heavy data shaping to detect torrents (encrypted or not). - JL.
(7) @JL: In theory, direct download is faster, but not in practice. With one seeder and one leecher, there shouldn't be any difference. As soon as you add another leecher, that leecher can start taking pieces from whoever has a faster connection (even if the client with the faster connection doesn't have the complete file). With a direct download, to take advantage of the faster connection, you would first have to wait for the client to finish the download before you could start. - Peter Di Cecco
(1) I think the better question becomes how much bandwidth do you save by hosting a torrent and seeding it with what would have been a direct download box. Only companies like Blizzard know that now, and I haven't seen them talk numbers. Without a 'super seed', torrents will rely on users to seed, which just doesn't work with asymmetric connections and people not wanting to leave their computer on and upstream saturated. - semi
(6) @JL: torrents are slower than direct download? My "practical" experience says different; try going to download Eclipse both ways. - Dean J
[+120] [2009-01-12 03:04:04] Norman Ramsey

Damas-Milner type inference (often called Hindley-Milner type inference) was published in 1983 and has been the basis of every sophisticated static type system since. It was a genuinely new idea in programming languages (admittedly based on ideas published in the 1970s, but not made practical until after 1980). In terms of importance I put it up there with Self and the techniques used to implement Self; in terms of influence it has no peer. (The rest of the OO world is still doing variations on Smalltalk or Simula.)

Variations on type inference are still playing out; the variation I would single out the most is Wadler and Blott's type class mechanism for resolving overloading, which was later discovered to offer very powerful mechanisms for programming at the type level. The end to this story is still being written.
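
For readers who have never seen it, the core of Damas-Milner inference is a small unification procedure over type terms; generalization, instantiation and the typing rules are layered on top. Here is a deliberately tiny Python sketch of just that unification step, with types encoded as tuples; it is an illustration, not full Algorithm W, and it omits the occurs check.

    # Types as tuples: ('var', name), ('int',), ('fun', argument_type, result_type).
    def resolve(t, subst):
        """Follow the substitution until we hit a non-variable or a free variable."""
        while t[0] == 'var' and t[1] in subst:
            t = subst[t[1]]
        return t

    def unify(a, b, subst):
        a, b = resolve(a, subst), resolve(b, subst)
        if a == b:
            return subst
        if a[0] == 'var':
            return {**subst, a[1]: b}          # (a real checker adds an occurs check here)
        if b[0] == 'var':
            return {**subst, b[1]: a}
        if a[0] == b[0] == 'fun':
            subst = unify(a[1], b[1], subst)
            return unify(a[2], b[2], subst)
        raise TypeError(f"cannot unify {a} with {b}")

    # Applying f to an int forces f to have type (int -> r) for some fresh variable r.
    f_type, result = ('var', 'f'), ('var', 'r')
    subst = unify(f_type, ('fun', ('int',), result), {})
    print(resolve(f_type, subst))    # ('fun', ('int',), ('var', 'r'))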


(3) +1 Static type systems are a huge huge step in software development. I couldn't agree with this answer more. - Jeremy Powell
[+104] [2009-01-12 03:07:59] Norman Ramsey

Here's a plug for Google map-reduce, not just for itself, but as a proxy for Google's achievement of running fast, reliable services on top of farms of unreliable, commodity machines. Definitely an important invention and totally different from the big-iron mainframe approaches to heavyweight computation that ruled the roost in 1980.
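
The programming model itself fits in a few lines; what Google added was the fault-tolerant runtime that executes it across thousands of flaky machines. Here is a toy single-machine word count in the map/shuffle/reduce shape, just to show the pattern that gets distributed:

    from collections import defaultdict

    def map_phase(document):
        return [(word, 1) for word in document.split()]

    def shuffle(pairs):
        groups = defaultdict(list)          # group intermediate pairs by key
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(key, values):
        return key, sum(values)

    documents = ["the cat sat", "the cat ran"]
    pairs = [pair for doc in documents for pair in map_phase(doc)]
    counts = dict(reduce_phase(k, vs) for k, vs in shuffle(pairs).items())
    print(counts)   # {'the': 2, 'cat': 2, 'sat': 1, 'ran': 1}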


(10) map-reduce isn't an invention of Google at all. - akappa
(20) I'm a functional programmer. My first language was APL. Your point, exactly? - Norman Ramsey
(15) So (mapcar f l) and (reduce f l) in Lisp automatically run on arbitrary numbers of commodity machines, handling all intercommunication, failures, and restarts? - Jared Updike
(16) The Google map-reduce doesn't have much at all to do with functional map-reduce. - aehlke
[+91] [2009-01-12 11:57:38] Greg Dan

Tagging, the way information is categorized. Yes, the little boxes of text under each question.

It is amazing that it took about 30 years to invent tagging. We used lists and tables of contents; we used things which are optimized for printed books.

However, 30 years is much shorter than the time people needed to realize that printed books could be made in a smaller format, one that people could hold in their hands.

I think that the tagging concept is underestimated among core CS guys. All research is focused on natural language processing (a top-down approach). But tagging is the first language that both computers and people can understand well. It is a bottom-up approach to making computers use natural language.
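
Mechanically, a tag system is little more than an inverted index (tag to set of items), and a query is just a set intersection, which is why tags compose so cheaply compared with a single fixed hierarchy. A minimal sketch, with made-up item IDs:

    from collections import defaultdict

    index = defaultdict(set)                 # tag -> set of tagged items

    def tag(item, *tags):
        for t in tags:
            index[t].add(item)

    def query(*tags):
        sets = [index[t] for t in tags]
        return set.intersection(*sets) if sets else set()

    tag("Q432922", "computer-science", "history")
    tag("Q12345", "computer-science", "python")
    print(query("computer-science"))             # both items
    print(query("computer-science", "history"))  # {'Q432922'}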


(1) Agreed - this correlates with my submission that the only new thing I can think of is syntactic markup to query among many domains - but you stated it better. - dkretz
(40) Check out Engelbart ca 1962-72 - Alan Kay
For me tagging is very much like early search engines that used meta=keywords tag (that's post-80's too, I'm just making argument that tagging isn't worth mentioning). - Kornel
(1) While tagging in computing is relatively new approach, tagging is also a concept inherited from books; in books, it's called indexing. - Domchi
(6) libraries have been using "tags" since... well I don't know, but for a long time. Think about the book cards (sorry, I'm not sure what they're called in English) tagged "books about xxx". - nico
@nico: tags? library search (digital or card catalog) with "tags" is a nasty mess. Go to a university library (even great institutions like Caltech, UCLA) and sit down at their library search computer and type "Music theory" into the box. Try 'keyword' or 'subject' or 'title'. You get 3000 results and first two pages, only half of them are relevant (with Google Search on the same term setting the standard). What would be truly revolutionary would be to fire all the Heads of Library Tech and give Google free rein to set up Google Books search (entire corpus of all the univ's books) instead. - Jared Updike
@Jared: Some university libraries already index everything (or at least all abstracts and papers, which are actually what's most valuable in a uni library when dealing with a fast-moving subject) through Google, but you only get to see the full results if you're on campus. - Donal Fellows
@Donal: sounds like those places are doing things right; wasn't my experience at university and certainly not at LA Public Library - Jared Updike
[+80] [2009-03-13 10:27:06] Sylver

I think we are looking at this the wrong way and drawing the wrong conclusions. If I get this right, the cycle goes:

Idea -> first implementation -> minority adoption -> critical mass -> commodity product

From the very first idea to the commodity, you often have centuries, assuming the idea ever makes it to that stage. Da Vinci may have drawn some kind of helicopter in 1493 but it took about 400 years to get an actual machine capable of lifting itself off the ground.

From William Bourne's first description of a submarine in 1580 to the first implementation in 1800, you have 220 years, and current submarines are still in their infancy: we know almost nothing about underwater travel (with 2/3rds of the planet under the sea, think of the potential real estate ;).

And there is no telling that there weren't earlier, much earlier, ideas that we just never heard of. Based on some legends, it looks like Alexander the Great used some kind of diving bell in 332 BC (which is the basic idea of a submarine: a device to carry people and an air supply below the sea). Counting that, we are looking at 2000 years from idea (even with a basic prototype) to product.

What I am saying is that looking today for implementations, let alone products, that were not even ideas prior to 1980 is ... I betcha the "quick sort" algorithm was used by some no name file clerk in ancient China. So what?

There were networked computers 40 years ago, sure, but that didn't compare with today's Internet. The basic idea/technology was there, but regardless you couldn't play a game of Warcraft online.

I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask "Why?" and "What should we be doing?"

Historically, we have never been able to "find them" that close from the idea, that fast. I think the cycle is getting faster, but computing is still darn young.

Currently, I am trying to figure out how to make a hologram (the Star Wars kind, without any physical support). I think I know how to make it work. I haven't even gathered the tools, materials, or funding, and yet even if I were to succeed to any degree, the actual idea would already be several decades old, at the very least, and related implementations/technologies have been used for just as long.

As soon as you start listing actual products, you can be pretty sure that concepts and first implementations existed a while ago. Doesn't matter.

You could argue with some reason that nothing is new, ever, or that everything is new, always. That's philosophy and both viewpoints can be defended.

From a practical viewpoint, truth lies somewhere in between. Truth is not a binary concept, boolean logic be damned.

The Chinese may have come up with the printing press a while back, but it's only been about 10 years that most people can print decent color photos at home for a reasonable price.

Invention is nowhere and everywhere, depending on your criteria and frame of reference.


(1) +1. Take a look for instance at the iPad ;) See stackoverflow.com/questions/432922/… - VonC
(4) If only there was a fav. answer tag... if only there was an option to give 2 upvotes... - tshepang
Great answer. Maybe we should be asking then, what new ideas have there been in the past 30 years (not new products/inventions). And since it's too hard to say whether or not they'll be "significant" or revolutionary before they're even built.... maybe we can speculate and then decide where to spend more energy. - mpen
(3) There have been countless amazing new ideas in the last 30 years, but there hasn't necessarily been time to see which ones matter. Pick any field of computing and just flick through the research released in the last year, and you'll find no shortage of new ideas, from small improvements to complete overhauls. However, the 1980s and before seem so revolutionary and packed because those ideas have now come to fruition and are ubiquitous, so they seem significant. We'll be having this same discussion in 30 years, when the ideas from now have boiled down into wonderful inventions. - Perrako
@Mark: What qualifies as a "new idea"? Every idea, piece of code, biological organism has a context, which in one view would make nothing truly new. The problem with Prof. Kay's question is that the philosophy behind the fire that he and his colleagues at Xerox Parc (and Engelbart 10 years before him) lit under the tech/computer industry has been burning like an uncontrolled fire and changed the world, the context. Truly new ideas out there have no impact so none of us have heard of them -- OSes written with proofs of their correctness and kernel security, non-ARM, non-x86 architectures, etc. - Jared Updike
[+68] [2009-01-12 15:27:21] Bill the Lizard

Google's Page Rank [1] algorithm. While it could be seen as just a refinement of web crawling search engines, I would point out that they too were developed post-1980.

[1] http://en.wikipedia.org/wiki/PageRank
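
The core of PageRank is a simple fixed-point (power) iteration over the link graph; the hard parts Google solved were scale, dangling pages, and spam, none of which appear in this deliberately tiny sketch with a made-up three-page graph:

    def pagerank(links, damping=0.85, iterations=50):
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
            for page, outgoing in links.items():
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
            rank = new_rank                     # repeat until the scores settle
        return rank

    links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
    for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
        print(page, round(score, 3))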

"Just a refinement" is often an oxymoron. In this case, the refinement is the technology. The internet was a much scarier place before google brought ought that page rank algorithm (and delivered the results quickly and without page clutter, and all the other dredge that we use to have to suffer through to use other search engines in the past). - David Berger
(19) i don't think you know what an oxymoron is. - Jason
(1) Do you remember altavista and that little unknown company: yahoo? - Esteban Küber
@voyager: Hotbot and Lycos weren't bad, either. - Dean J
(2) @martin it's a non-oxymoron oxymoron. contradiction is in the definition: ninjawords.com/oxymoron - Jason
[+66] [2009-03-05 16:27:31] Andrew Dalke

DNS, 1983, and dependent advances like email host resolution via MX records instead of bang-paths. *shudder*

Zeroconf working on top of DNS, 2000. I plug my printer into the network and my laptop sees it. I start a web server on the network and my browser sees it. (Assuming they broadcast their availability.)

NTP (1985) based on Marzullo's algorithm (1984). Accurate time over jittery networks.

The mouse scroll wheel, 1995. Using mice without it feels so primitive. And no, it's not something that Engelbart's team thought of and forgot to mention. At least not when I asked someone who was on the team at the time. (It was at some Engelbart event in 1998 or so. I got to handle one of the first mice.)

Unicode, 1987, and its dependent advances for different types of encoding, normalization, bidirectional text, etc.

Yes, it's pretty common for people to use all 5 of these every day.

Are these "really new ideas?" After all, there were mice, there were character encodings, there was network timekeeping. Tell me how I can distinguish between "new" and "really new" and I'll answer that one for you. My intuition says that these are new enough.

In smaller domains there are easily more recent advances. In bioinformatics, for example, Smith-Waterman (1981) and more especially BLAST (1990) effectively make the field possible. But it sounds like you're asking for ideas which are very broad across the entire field of computing, and the low-hanging fruit gets picked first. Thus is it always with a new field.
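
As one concrete example from the list above, the interval-intersection idea behind Marzullo's algorithm (which NTP builds on) fits in a short sketch: each time source reports an interval, and we look for the range agreed on by the most sources, so a wildly wrong clock simply gets outvoted. This is an illustration of the idea, not NTP's actual selection code.

    def marzullo(intervals):
        events = []
        for lo, hi in intervals:
            events.append((lo, 0))     # interval opens (sorts before a close at the same point)
            events.append((hi, 1))     # interval closes
        events.sort()
        best = count = 0
        best_lo = best_hi = None
        for point, kind in events:
            if kind == 0:
                count += 1
                if count > best:
                    best, best_lo, best_hi = count, point, None
            else:
                if count == best and best_hi is None:
                    best_hi = point
                count -= 1
        return best, (best_lo, best_hi)

    # Three sources roughly agree, one is way off: the outlier gets ignored.
    print(marzullo([(9, 12), (10, 13), (11, 14), (20, 21)]))  # (3, (11, 12))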


[+63] [2009-01-11 21:42:47] Domchi

What about digital cameras?

According to Wikipedia, the first true digital camera [1] appeared in 1988, with mass market digital cameras becoming affordable in the late 1990s.

[1] http://en.wikipedia.org/wiki/History_of_the_camera#The_arrival_of_true_digital_cameras

But the idea, the invention and the patents were there in the early 70's (See the section on "Early Development") - saschabeaumont
(10) Digital camera? One wonders, judging from up votes, what people understand today by the term "computing". - MaD70
(1) Pictures are what modern consumer computing is based around. Without a webcam, a point-and-shoot or expensive SLR (for newspapers), modern consumers wouldn't really need computers. - Marius
(14) @MaD70: I guess you're not so much into photography, are you? Just to name a few: automatic face recognition, autofocus, "panoramic mode", automatic white balance ... it definitely falls into computing. - nico
@nico: one doesn't need to be an expert in digital photography to appreciate the algorithmic sophistication of software on digital cameras. What I meant was: programmable electronics is pervasive these days; if you refer to all these applications as "computing" then nearly everything is computing. For me, "computing" is in such algorithms, and they are certainly not bound to digital camera hardware. - MaD70
@MaD70: the development of those algorithms has been strongly pushed by the new camera hardware (and vice versa), so I would say that even if they're not strictly bound, they are definitely strongly related. - nico
@nico: I'm not an expert in that field (the last thing I read was "Principles of pictorial information systems design", by Shi-Kuo Chang, which is a 1989 book) but I seriously doubt that some (many?) of those algorithms are really new, rather than adaptations of old algorithms from other image processing fields (satellite image processing, for example). Of course, you may be an expert in digital image processing, up to date with current research and knowledgeable about its history. In that case I'll shut up: ubi maior minor cessat. - MaD70
@MaD70: Oh, I wouldn't consider myself a super expert in that. Anyway I guess you will agree with me that it is pretty normal for algorithms, as it is for hardware, to evolve and adapt from previous ones. Anyway you should check out some of the new algorithms used in modern microscopy... that's absolutely stunning new stuff (ok, it's probably not the type of digital cameras he was talking about but still...) - nico
(6) Sorry, the first prototype digital camera was made by Kodak in 1975 apparently. pluggedin.kodak.com/post/?ID=687843 - Mark Ransom
[+50] [2009-01-16 22:24:10] Jared Updike

Modern shading languages and the prevalence of modern GPUs.

The GPU is also a low cost parallel supercomputer with tools like CUDA and OpenCL for blazing fast high level parallel code. Thank you to all those gamers out there driving down the prices of these increasingly impressive hardware marvels. In the next five years I hope every new computer sold (and iPhones too) will have the ability to run massively parallel code as a basic assumption, much like 24 bit color or 32 bit protected mode.


Try it. You won't like it. Multi-core systems are much faster for most real-world problems. YMMV. Good for graphics, and not much else. - xcramps
There's a reason they're called GPUs and not PPUs... (Parallel processing units). Most people don't have the patience and/or skills to write good code for them. Though there is an increasing amount of research projects that are exploring using GPUS for non graphics purposes. - RCIX
(3) I tried it. I liked it. I can run all of my Matlab code on the GPU, with no source code modifications apart from a few typecast changes which you can do with a search'n'replace. Google "Matlab GPU computing". - Contango
(3) I agree with the OP. The programmable pipeline, while something we now might take for granted, completely changed the world of graphics, and it looks like it might continue changing other parts of the programming world. @xcramps: I think I'm missing something; last I checked, GPUs were multi-core systems. Just with a lot more cores. Kind of like... supercomputers. But I guess those aren't really being used for anything in the real-world... - Perrako
Two years later (not 5 as I said) and mobile devices shipping with OpenCL are on the horizon: macrumors.com/2011/01/14/… - Jared Updike
[+43] [2009-01-11 15:58:21] Jasper Bekkers

JIT compilation was invented in the late 1980s.
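
As a toy illustration of the idea only (not of how Self, HotSpot, or V8 actually work): instead of interpreting an expression tree on every call, generate specialized host code the first time the expression is seen and cache it. Real JITs do this at the machine-code level, driven by profiling.

    _cache = {}

    def interpret(expr, x):                      # walk the tree on every call
        op = expr[0]
        if op == 'const': return expr[1]
        if op == 'x':     return x
        a, b = interpret(expr[1], x), interpret(expr[2], x)
        return a + b if op == 'add' else a * b

    def jit(expr):                               # translate the tree once, then reuse
        if expr not in _cache:
            def gen(e):
                if e[0] == 'const': return repr(e[1])
                if e[0] == 'x':     return 'x'
                sym = '+' if e[0] == 'add' else '*'
                return f"({gen(e[1])} {sym} {gen(e[2])})"
            _cache[expr] = eval(f"lambda x: {gen(expr)}")
        return _cache[expr]

    expr = ('add', ('mul', ('x',), ('const', 3)), ('const', 1))   # x*3 + 1
    print(interpret(expr, 10), jit(expr)(10))                     # 31 31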


Well, the whole work on the implementation of the Self language (which was completely JIT-compiled) was amazing, and its usefulness can be seen today for Javascript inside Google V8. And that's from the late '80s and early '90s. - Blaisorblade
(7) I first saw this idea in the last chapter of John Allen's book Anatomy of Lisp, published in the 70s. He gave a ref to a 70s PhD thesis as the originator. - Darius Bacon
Maybe we should refine it to "profile-based adaptive JIT compilation" such as the Self JIT or Sun's Java HotSpot - kohlerm
(34) One of the PhD theses in the early 1970s which had JIT was Jim Mitchell's at CMU -- he later went to PARC - Alan Kay
If JIT is defined as what Self did, like Wikipedia defines it, then it really seems to be an 80s concept. But, if you do define it like that, what is the really important concept: the bytecode compilation that goes way back, or the optimization that JIT represents? - Daniel C. Sobral
(2) Nori, K.V.; Ammann, U.; Jensen; Nageli, H. (1975). The Pascal P Compiler Implementation Notes. Zurich: Eidgen. Tech. Hochschule. (Thanks wikipedia) - Arafangion
[+42] [2009-01-11 22:24:15] dkretz

To address the two questions about "Why the death of new ideas", and "what to do about it"?

I suspect a lot of the lack of progress is due to the massive influx of capital and entrenched wealth in the industry. Sounds counterintuitive, but I think it's become conventional wisdom that any new idea gets one shot; if it doesn't make it at the first try, it can't come back. It gets bought by someone with entrenched interests, or just FAILs, and the energy is gone. A couple examples are tablet computers, and integrated office software. The Newton and several others had real potential, but ended up (through competitive attrition and bad judgment) squandering their birthrights, killing whole categories. (I was especially fond of Ashton Tate's Framework; but I'm still stuck with Word and Excel).

What to do? The first thing that comes to mind is Wm. Shakespeare's advice: "Let's kill all the lawyers." But now they're too well armed, I'm afraid. I actually think the best alternative is to find an Open Source initiative of some kind. They seem to maintain accessibility and incremental improvement better than the alternatives. But the industry has gotten big enough so that some kind of organic collaborative mechanism is necessary to get traction.

I also think that there's a dynamic that says that the entrenched interests (especially platforms) require a substantial amount of change - churn - to justify continuing revenue streams; and this absorbs a lot of creative energy that could have been spent in better ways. Look how much time we spend treading water with the newest iteration from Microsoft or Sun or Linux or Firefox, making changes to systems that for the most part work fine already. It's not because they are evil, it's just built into the industry. There's no such thing as Stable Equilibrium; all the feedback mechanisms are positive, favoring change over stability. (Did you ever see a feature withdrawn, or a change retracted?)

The other clue that has been discussed on SO is the Skunkworks Syndrome (ref: Geoffrey Moore): real innovation in large organizations almost always (90%+) shows up in unauthorized projects that emerge spontaneously, fueled exclusively by individual or small group initiative (and more often than not opposed by formal management hierarchies). So: Question Authority, Buck the System.


I loved Framework, and you can still buy it, but it's expensive. - Norman Ramsey
(7) It's always easier to have new ideas in a new area of knowledge, so a very large number of the important ideas came about in the 1950s and 1960s. We just can do most of them a whole lot better now. - David Thornley
(6) I think this reply and the comments are very well put. - Alan Kay
(5) @David: "whole lot better now". And cheaper. And smaller. Which enables new ways of doing other things better. E.g. 10 songs -> 1,000 songs -> 1,000 albums in my pocket, sure it is a matter of degree but it changes everything, even if someone back before 1980 showed it could be done, in theory, on a giant mainframe. The pieces may have been there but some inventions, like the iPod, are more than the sum of the parts. - Jared Updike
@Alan Kay, @le dorfier: it seems to me that one partial counter-example to that entrenched attitude is Donald Knuth's decision to asymptotically increment TeX's version number toward pi. But he is an institution, not a corporation. I am appalled by Mozilla's and Google's race to version number 100 of their browsers while intelligent and creative standardization, as well as innovation in data access and transformation, is lagging. - ogerard
[+36] [2009-01-11 14:00:47] Daniel Paull

One thing that astounds me is the humble spreadsheet. Non-programmer folk build wild and wonderful solutions to real world problems with a simple grid of formulas. Replicating their efforts in a desktop application often takes 10 to 100 times longer than it took to write the spreadsheet, and the resulting application is often harder to use and full of bugs!

I believe the key to the success of the spreadsheet is automatic dependency analysis. If the user of the spreadsheet was forced to use the observer pattern, they'd have no chance of getting it right.

So, the big advance is automatic dependency analysis. Now why hasn't any modern platform (Java, .Net, Web Services) built this into the core of the system? Especially in a day and age of scaling through parallelization - a graph of dependencies leads to parallel recomputation trivially.

Edit: Dang - just checked. VisiCalc was released in 1979 - let's pretend it's a post-1980 invention.

Edit2: Seems that the spreadsheet is already noted by Alan anyway - if the question that brought him to this forum [1] is correct!

[1] https://stackoverflow.com/questions/357813/help-me-remember-a-quote-from-alan-kay
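
A minimal sketch of what automatic dependency analysis buys the spreadsheet user: cells declare which cells they read, and changing an input recomputes everything downstream without the user ever wiring up observers. For brevity the sketch assumes formulas are defined after their inputs; a real engine would topologically sort the dependency graph instead.

    class Sheet:
        def __init__(self):
            self.values = {}
            self.formulas = []           # (name, inputs, function), in definition order

        def set(self, name, value):
            self.values[name] = value
            self.recalculate()

        def formula(self, name, inputs, fn):
            self.formulas.append((name, inputs, fn))
            self.recalculate()

        def recalculate(self):
            for name, inputs, fn in self.formulas:
                if all(i in self.values for i in inputs):
                    self.values[name] = fn(*(self.values[i] for i in inputs))

    sheet = Sheet()
    sheet.set("price", 10)
    sheet.set("quantity", 4)
    sheet.formula("subtotal", ["price", "quantity"], lambda p, q: p * q)
    sheet.formula("total", ["subtotal"], lambda s: s + 5)   # e.g. flat shipping
    sheet.set("quantity", 5)                                # everything downstream updates
    print(sheet.values["total"])                            # 55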

(5) I had thought of this answer, but Visicalc was released just a smidgin before the 1980 deadline. (en.wikipedia.org/wiki/VisiCalc) - Oddthinking
but this reveals an interesting point: just presenting a simple way to display and manipulate data created a incredibly useful class of tools. is there some other 'enabling' idea like this? do we need one? i think so. - Javier
I agree wholeheartedly. Automatic dependency analysis could be and should be a part of modern programming languages. - Jesse Pepper
I don't understand what spreadsheets are and why/how people use them! - hasen
(1) @hasen j: Excel is a spreadsheet. By the way, there are modern platforms that keep dependencies between calculations - for example Haskell (Excel and functional languages have much in common - for example pure functions and lazy evaluation). Excel is just much more intuitive than Haskell :) - ajuc
[+36] [2009-01-11 16:58:29] mjy

Software:

  • Virtualization and emulation

  • P2P data transfers

  • community-driven projects like Wikipedia, SETI@home ...

  • web crawling and web search engines, i.e. indexing information that is spread out all over the world

Hardware:

  • the modular PC

  • E-paper


(6) Virtualization was implemented on VM/CMS in 1972. What do you mean by "the modular PC"? - Hudson
I think that by "the modular PC" he means that anyone can buy almost interchangeable components and build their own computer. - NGittlen
(14) P2P was invented at Xerox PARC in the 70s -- the Altos were all P2P and the file resources and printers and "routers" were all P2P Altos - Alan Kay
(1) I saw "E-paper" and thought, what? How does that affect me day to day? I'm glad it exists, but e-readers are not very important technologies on a widespread basis, compared to, say, the cellphone or iPod. - Jared Updike
I'm still looking for a decent, cheap eReader.. and I still didn't find one that can totally replace true dead tree edition. - kostiak
(3) I'd like to point out that about 40-50 years ago everyone was still doing math on paper mainly and saying the same about computers... - RCIX
The web shouldn't be considered innovative, it's insecure, which means it's badly designed. @Jared: You seem to be confusing the field of computing with marketing (ipod). Have you ever even seen e-paper? It's like DVCS vs VCS, they just need to get the refresh rate better (which they already have in prototype) and it will be better than LCD in every way. Unfortunately though, it was invented at Xerox PARC in the 70's. - L̲̳o̲̳̳n̲̳̳g̲̳̳p̲̳o̲̳̳k̲̳̳e̲̳̳
[+36] [2009-01-12 00:24:22] solidsnack

The rediscovery of the monad by functional programming researchers. The monad was instrumental in allowing a pure, lazy language (Haskell) to become a practical tool; it has also influenced the design of combinator libraries (monadic parser combinators have even found their way into Python).

Moggi's "A category-theoretic account of program modules" (1989) is generally credited with bringing monads into view for effectful computation; Wadler's work (for example, "Imperative functional programming" (1993)) presented monads as a practical tool.
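
To make the remark about monadic parser combinators concrete, here is a tiny sketch (not taken from any library): a parser is a function from an input string to a list of (result, remaining input) pairs, unit and bind are the monad operations, and sequencing parsers is just nested bind.

    def unit(value):                      # 'return' of the parser monad
        return lambda s: [(value, s)]

    def bind(parser, f):                  # monadic bind: sequence two parsers
        return lambda s: [pair
                          for (value, rest) in parser(s)
                          for pair in f(value)(rest)]

    def item(s):                          # consume one character, fail on empty input
        return [(s[0], s[1:])] if s else []

    def char(c):                          # match a specific character
        return bind(item, lambda x: unit(x) if x == c else lambda s: [])

    # Parse "a" then "b", combining the results.
    ab = bind(char('a'), lambda a: bind(char('b'), lambda b: unit(a + b)))

    print(ab("abc"))   # [('ab', 'c')]
    print(ab("xyz"))   # []  (failure is just an empty result list)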


[+35] [2009-01-12 15:21:56] David Thornley

Shrinkwrap software

Before 1980, software was mostly specially written. If you ran a business, and wanted to computerize, you'd typically get a computer and compiler and database, and get your own stuff written. Business software was typically written to adapt to business practices. This is not to say there was no canned software (I worked with SPSS before 1980), but it wasn't the norm, and what I saw tended to be infrastructure and research software.

Nowadays, you can go to a computer store and find, on the shelf, everything you need to run a small business. It isn't designed to fit seamlessly into whatever practices you used to have, but it will work well once you learn to work more or less according to its workflow. Large businesses are a lot closer to shrinkwrap than they used to be, with things like SAP and PeopleSoft.

It isn't a clean break, but after 1980 there was a very definite shift from expensive custom software to low-cost off-the-shelf software, and flexibility shifted from software to business procedures.

It also affected the economics of software. Custom software solutions can be profitable, but it doesn't scale. You can only charge one client so much, and you can't sell the same thing to multiple clients. With shrinkwrap software, you can sell lots and lots of the same thing, amortizing development costs over a very large sales base. (You do have to provide support, but that scales. Just consider it a marginal cost of selling the software.)

Theoretically, where there are big winners from a change, there are going to be losers. So far, the business of software has kept expanding, so that as areas become commoditized other areas open up. This is likely to come to an end sometime, and moderately talented developers will find themselves in a real crunch, unable to work for the big boys and crowded out of the market. (This presumably happens for other fields; I suspect the demand for accountants is much smaller than it would be without QuickBooks and the like.)


Turbo Pascal & C at $100 on an MS-DOS system provoked a $100 price tag on a C compiler for CP/M from others. - CW Holeman II
Sorry, pretty sure Microsoft was selling shrink-wrap software before 1980. Not that they were the only ones. - Mark Ransom
[+34] [2009-01-11 13:44:45] frankodwyer

Outside of hardware innovations, I tend to find that there is little or nothing new under the sun. Most of the really big ideas date back to people like von Neumann and Alan Turing.

A lot of things that are labelled 'technology' these days are really just a program or library somebody wrote, or a retread of an old idea with a new metaphor, acronym, or brand name.


(3) You can't see the forest since all the trees are in the way... The building blocks are much the same, but the result has changed/evolved. - Johan
(8) ...That's the definition of technology ;) "the practical application of knowledge..." - steamer25
(1) I agree it's time for the next big thing. I'm tired of all the re-packing of things forgotten from the past as something new. Like Javascript = AJAX. - James
[+32] [2009-01-11 16:46:36] splattne

Computer Worms were researched in the early eighties of the last century in the Xerox Palo Alto Research Center.

From John Shoch and Jon Hupp's "The 'Worm' Programs - Early Experience with a Distributed Computation" [1] (Communications of the ACM, Volume 25, Number 3, March 1982, pp. 172-180):

In The Shockwave Rider [2], J. Brunner [3] developed the notion of an omnipotent "tapeworm" program running loose through a network of computers - an idea which may seem rather disturbing, but which is also quite beyond our current capabilities. The basic model, however, remains a very provocative one: a program or a computation that can move from machine to machine, harnessing resources as needed, and replicating itself when necessary.

In a similar vein, we once described a computational model based upon the classic science-fiction film, The Blob [4]: a program that started out running in one machine, but as its appetite for computing cycles grew, it could reach out, find unused machines, and grow to encompass those resources. In the middle of the night, such a program could mobilize hundreds of machines in one building; in the morning, as users reclaimed their machines, the "blob" would have to retreat in an orderly manner, gathering up the intermediate results of its computation. Holed up in one or two machines during the day, the program could emerge again later as resources became available, again expanding the computation. (This affinity for nighttime exploration led one researcher to describe these as "vampire programs.")

Quoting Alan Kay: "The best way to predict the future is to invent it."

[1] http://vx.netlux.org/lib/ajm01.html
[2] https://rads.stackoverflow.com/amzn/click/com/0345467175
[3] http://en.wikipedia.org/wiki/John_Brunner_(novelist)
[4] http://www.imdb.com/title/tt0051418/

@Bobby: According to Computer security basics, 2006, Lehtinen, Russell & Gangemi, this work began "around 1980". So if you disregard the sci-fi precursors, this counts. - Charles Stewart
[+31] [2009-01-11 23:10:17] Konrad Rudolph

Better user interfaces.

Today’s user interfaces still suck. And I don't mean in small ways but in large, fundamental ways. I can't help but notice that even the best programs still have interfaces that are either extremely complex or that require a lot of abstract thinking in other ways, and that just don't approach the ease of conventional, non-software tools.

Granted, this is due to the fact that software allows you to do so much more than conventional tools. That's no reason to accept the status quo though. Additionally, most software is simply not well done.

In general, applications still lack a certain “just works” feeling and are too much oriented toward what can be done, rather than what should be done. One point that has been raised time and again, and that is still not solved, is the point of saving. Applications crash, destroying hours of work. I have the habit of pressing Ctrl+S every few seconds (of course, this no longer works in web applications). Why do I have to do this? It's mind-numbingly stupid. This is clearly a task for automation. Of course, the application also has to save a diff for every modification I make (basically an infinite undo list) in case I make an error.

Solving this problem isn't even actually hard. It would just be hard to implement in every application, since there is no good API to do this. Programming tools and libraries have to improve significantly before allowing an effortless implementation of such features across all platforms and programs, for all file formats, with arbitrary backup storage and no required user interaction. But it is a necessary step before we finally start writing “good” applications instead of merely adequate ones.
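
A rough sketch of the journal-of-edits idea (the names and the autosave hook are made up for illustration): every modification is recorded before it is applied, which gives both crash recovery and an effectively infinite undo list.

    class Document:
        """Keeps an append-only journal of edits so every change can be undone."""
        def __init__(self, text=""):
            self.text = text
            self.journal = []          # effectively an infinite undo list

        def replace(self, start, end, new):
            self.journal.append((start, new, self.text[start:end]))  # record how to undo
            self.text = self.text[:start] + new + self.text[end:]
            self.autosave()

        def undo(self):
            start, new, old = self.journal.pop()
            self.text = self.text[:start] + old + self.text[start + len(new):]

        def autosave(self):
            pass  # placeholder: e.g. append the latest journal entry to a recovery file

    doc = Document("hello world")
    doc.replace(0, 5, "goodbye")
    print(doc.text)   # "goodbye world"
    doc.undo()
    print(doc.text)   # "hello world"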

I believe that Apple currently approximates the “just works” feeling best in some regards. Take, for example, their newest version of iPhoto, which features face recognition that automatically groups photos by the people appearing in them. That is a classic task that the user does not want to do manually and doesn't understand why the computer doesn't do automatically. And even iPhoto is still a very long way from a good UI, since said feature still requires ultimate confirmation by the user (for each photo!), because the face recognition engine isn't perfect.


(3) Google's Picasa has had that for a while. In fact, picasa has so many other features that are slowly crawling into iPhoto. - akshaykarthik
[+30] [2009-01-11 14:25:18] sharkin

HTM systems (Hierarchical Temporal Memory [1]).

A new approach to Artificial Intelligence, initiated by Jeff Hawkins through the book "On Intelligence" [2].

Now active as a company called Numenta [3] where these ideas are put to the test through development of "true" AI, with an invitation to the community to participate by using the system through SDKs.

It's more about building machine intelligence from the ground up, rather than trying to emulate human reasoning.

[1] http://en.wikipedia.org/wiki/Hierarchical_temporal_memory
[2] http://www.onintelligence.org/
[3] http://www.numenta.com/

(11) When they do something interesting, I will be the first and loudest leader of the applause - Alan Kay
@AlanKay Well, it seems that HTM is already used in real products. For example: vitamind inc allows you to recognize objects or people in CCTV footage. Vitamindinc is entirely powered by HTM. In this paper, you can see that HTM actually beats the SVM approach for handwritten recognition on datasets such as USPS. The fact that it is at the same time biologically inspired and of high practical value blows my mind. I think you can start applauding right now. - Benjamin Crouzier
[+26] [2009-01-11 19:15:12] Steve Steiner

The use of physics in human-computer interaction to provide an alternative, understandable metaphor. This, combined with gestures and haptics, will likely result in a replacement for the current common GUI metaphor invented in the 70's and in common use since the mid to late 80's.

The computing power wasn't present in 1980 to make that possible. I believe Games [1] likely led the way here. An example can easily be seen in the interaction of list scrolling on the iPod Touch/iPhone. The interaction mechanism relies on the intuition of how momentum and friction work in the real world to provide a simple way to scroll a list of items, and the usability relies on the physical gestures that cause the scroll.

[1] http://www.gamasutra.com/view/feature/2798/physics_in_games_a_new_gameplay_.php
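
The physics involved is tiny, which is part of why the metaphor works so well: a flick just gives the list a velocity, and friction decays it every frame. A minimal sketch with made-up constants:

    def kinetic_scroll(position, velocity, friction=0.95, dt=1/60, min_speed=1.0):
        """Simulate flick scrolling: the list keeps moving and friction slows it down."""
        frames = [position]
        while abs(velocity) > min_speed:
            position += velocity * dt
            velocity *= friction          # momentum decays like a real object sliding
            frames.append(position)
        return frames

    frames = kinetic_scroll(position=0.0, velocity=2000.0)   # pixels, pixels/second
    print(len(frames), round(frames[-1], 1))   # how long the flick lasts, where it stops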

The earliest example I can think of was Randy Smith's Alternate Reality Kit, built in Smalltalk-80 at PARC in '86 or '87. You could implement new objects with a physical metaphor. Every object had location, mass, momentum, and a pop-up menu for interacting with it via its message interface. - PanCrit
[+25] [2009-01-11 13:33:23] krosenvold

I believe Unit Testing, TDD and Continuous Integration are significant inventions after 1980.
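
For anyone who hasn't seen the style, the essence of test-first development is that the test states the expected behaviour before (or alongside) the code that provides it. A tiny sketch with a made-up slugify function:

    import unittest

    def slugify(title):
        # The implementation written to satisfy the tests below.
        return "-".join(title.lower().split())

    class TestSlugify(unittest.TestCase):
        # These tests express the requirement first, in the test-first spirit.
        def test_spaces_become_dashes(self):
            self.assertEqual(slugify("Hello World"), "hello-world")

        def test_already_lowercase(self):
            self.assertEqual(slugify("stack overflow"), "stack-overflow")

    if __name__ == "__main__":
        unittest.main()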


(2) Testing first is a very old method that has been resurrected, I believe. - Johnno Nolan
That's a software engineering thing, not a "computing" thing - SquareCog
@Dmitriy I find that a bit reductionistic - krosenvold
Yea I would find it hard to believe that nobody had done all that stuff before. Especially unit tests. - Quibblesome
(7) I'd agree with John, for instance Brooks describes a test-first approach in The Mythical Man-Month (1975). - Fabian Steeg
Testing as such is undoubtedly older than 1980, and I'm sure someone also thought it'd be best to test up-front. Lacking a better distinction I'd say all the significant advances in this area are post-1980. I'm sure Leonardo da Vinci was planning to test his helicopter. - krosenvold
(28) Continuous integration was first done seriously in BBN Lisp 1.85 in the late 60s, which became Interlisp at PARC. Smalltalk at PARC in the 70s was also a continuous integration system. - Alan Kay
(3) TDD only became generally useful when computers got fast enough to run small tests so quickly that you are willing to run them over & over. - Jay Bazuzi
@Jay I would argue that given proper code separation, we have reached that point already. EDIT: Jay, I see, you are saying that this became useful recently. - mklauber
[+25] [2009-01-11 21:32:34] Domchi

Mobile phones.

While the first "wireless phone" patent was in 1908, and they were cooking for a long time (0G in 1945, 1G launched in Japan in 1979), modern 2G digital cell phones didn't appear until 1991. SMS didn't exist until 1993, and Internet access appeared in 1999.


(4) Japan in 1979, that's pre 1980. We're looking for new inventions - think research labs, universities, practical demonstrations of patent applications... all which will predate the mass-market availability by a number of years. - saschabeaumont
(1) The difference between 1G and 2G is about as big as difference between analog and digital computer. I think 2G (1991) deserves the status of "new" invention. - Domchi
And is dependent on powersave technologies and good batteries. - Johan
[+23] [2009-01-11 15:13:40] bruceatk

I started programming Jan 2nd 1980. I've tried to think about significant new inventions over my career. I struggle to think of any. Most of what I consider significant were actually invented prior to 1980 but then weren't widely adopted or improved until after.

  1. Graphical User Interface.
  2. Fast processing.
  3. Large memory (I paid $200.00 for 16k in 1980).
  4. Small sizes - cell phones, pocket pc's, iPhones, Netbooks.
  5. Large storage capacities. (I've gone from carrying a large 90k floppy to an 8 gig USB thumb drive.)
  6. Multiple processors. (Almost all my computers have more than one now, software struggles to keep them busy).
  7. Standard interfaces (like USB) to easily attach hardware peripherals.
  8. Multiple Touch displays.
  9. Network connectivity - leading to the mid 90's internet explosion.
  10. IDE's with Intellisense and incremental compiling.

While the hardware has improved tremendously the software industry has struggled to keep up. We are light years ahead of 1980, but most improvements have been refinements rather than inventions. Since 1980 we have been too busy applying what the advancements let us do rather than inventing. By themselves most of these incremental inventions are not important or powerful, but when you look back over the last 29 years they are quite powerful.

We probably need to embrace the incremental improvements and steer them. I believe that truly original ideas will probably come from people with little exposure to computers and they are becoming harder to find.


"original ideas will probably come from people with little exposure to computers" so true. and even more sad since most of that 'numbing' exposure is windows/office. - Javier
(1) Some dates for earlier inventions: Engelbart's GUI was demoed in 1968 and the Xerox PARC Alto was developed in 1973. Multiple CPUs are new on the desktop, but not in the machine room -- the VAX cluster was first available in 1978. - Hudson
You were programming before I was born. Dang I have a long way to go. - Kieran Senior
Ouch. I didn't start until I was 26, now I really feel old. :) - bruceatk
Did you factor in inflation for that $200 16k memory chip? - Tim Tonnesen
@Tim Tonnesen - I paid $200 1980 dollars for that 16k. I don't know what it would be now. It was for an Atari 800 that I paid $750.00 for with 24k. $444.00 for a 90k floppy drive. I just looked it up. $200.00 is $497.17 in 2007, $750 is $1874 and $444 is $1103. - bruceatk
I note that 8 of the 10 are hardware improvements. The remaining are GUIs and IDE technology. GUIs are from the 60's or 70's (en.wikipedia.org/wiki/History_of_the_graphical_user_interface). So in 30 years, all that's new, software-wise, is IDE autocompletion? That makes me a sad panda :( - Jonas Kölker
27
[+22] [2009-01-11 19:08:02] Edward Basena

Nothing.

I think it's because people have changed their attitudes. People used to believe that if they would just find that "big idea", then they would strike it rich. Today, people believe that it is the execution and not the discovery that pays out the most. You have mantras such as "ideas are a dime a dozen" and "the second mouse gets the cheese". So people are focused on exploiting existing ideas rather than coming up with new ones.


(3) So many of the existing ideas just haven't been implemented yet. - Breton
(3) There are always a few lunatics that will come up with new ideas, they just can't help it ;-) - Johan
But they're lunatics, so they can't sell their ideas because nobody will listen to them. - Adam Jaskiewicz
Ideas are more the province of artists. Practical implementation is what we guys do. Looking at engineers for brand new ideas is kind of fishing in the wrong pond. For bright new ideas, read SF and figure out how this stuff could be done (I figure a lot of it could be done). However, implementing a wild idea can take years. Artists can get away with selling ideas and dreams, but engineers are expected to come up with products... and they have to eat too. - Sylver
28
[+16] [2009-01-11 14:13:21] sharkin

Open Source community development.


(2) Actually, the SIG/M user group disks kind of pre-date what we now call open source. It contained hundreds of disks (of the floppy variety) full of CP/M software, much of it open source (although the term "open source" didn't exist then). - Mike Thompson
(2) In the sense of open cooperation and development among people who had access to a computer, it's much like the IBM user groups in the 1960s. It's just that more people can afford computers now. - David Thornley
(2) Agree with david, it's only become more prominent now as computers have moved from the education and scientific areas into the business world, this gave rise to "closed source" software, confusing licenses. It was always there, it just didn't need a name until the lawyers got involved. - saschabeaumont
(1) Yes, I must also agree with David here. Open Source is way earlier than 1980. Predates it by at least 20 years. I thought it was the 1950s not the 1960s though. - Brendan Enrick
29
[+16] [2010-04-11 20:56:35] VonC

The iPad [1] (released April 2010): surely such a concept is absolutely revolutionary!

(image: Apple iPad [2])

No way Alan Kay saw that coming from the 1970's!
Imagine such a "personal, portable information manipulator"...


...

Wait? What!? The Dynabook [3] you say?

(image: the Dynabook)

Thought out by Alan Kay as early as 1968, and described in great detail in this 1972 paper [4]??

NOOOoooooooo....

Oh well... never mind.

[1] http://en.wikipedia.org/wiki/IPad
[2] http://www.ubergizmo.com/photos/2010/1/apple-ipad//apple-ipad-05.JPG
[3] http://en.wikipedia.org/wiki/Dynabook
[4] http://www.mprove.de/diplom/gui/Kay72a.pdf

See stackoverflow.com/questions/432922/… for a larger context illustrated by this answer. - VonC
Well, surely the idea was around before (for example the Apple Newton); however, the technology has now progressed far enough that it's possible to build a cheap (and great) consumer device. - Nils
30
[+15] [2009-01-11 18:41:07] Steve Steiner

Ideas around Social Computing have had advances since 1980. The Well [1] started in 1985. While I'm sure there were online communities before, I believe some of the true insights in the area have happened post-1980. The adverse dynamics of social communities, and the way they interact with a software system, are much like the disaster of the Tacoma Narrows Bridge [2].

I think Clay Shirky's [3] work in the area illuminates those effects and how to mitigate them. I'd say interesting real world examples of social software insights include things like reCAPTCHA [4] and Wikipedia [5], where significant valuable work is done by the participants mediated by the software.

[1] http://en.wikipedia.org/wiki/The_WELL
[2] http://en.wikipedia.org/wiki/Tacoma_Bridge
[3] http://www.shirky.com/
[4] http://en.wikipedia.org/wiki/ReCAPTCHA
[5] http://en.wikipedia.org

(6) Check out what Engelbart was really about, starting in 1962 - Alan Kay
Also, Luis Von Ahn did a great GoogleTalk (video.google.com/…) on his research resulting in ESP game (espgame.org/gwap/gamesPreview/espgame) - Mike Tunnicliffe
(1) One could also go back to Vannevar Bush and Memex. Vannevar's work doesn't negate Engelbart's. I doubt anything can be truly said to be without precedent. - Steve Steiner
(1) Also consider Control Data's PLATO CAI system, which had substantial social interactions - circa 1965-72. - Eric Brown
31
[+14] [2009-01-12 00:29:13] Breton

I think the best ideas invented since the 1980's will be the ones that we're not aware of. Either because they are so small and ubiquitous as to be unnoticable, or because their popularity hasn't really taken off.

One example of the former is Clicking and Dragging to select a portion of text. I believe this first appeared on the Macintosh in 1984. Before that you had separate buttons for picking the beginning of a selection and the end of a selection. Quite onerous.

An example of the latter is (maybe) Visual Programming languages. I'm not talking about something like HyperCard; I mean Max/MSP, Prograph, Quartz Composer, Yahoo Pipes, etc. At the moment they are really niche, but the way I see it, there's really nothing stopping them from being just as expressive and powerful as a standard programming language, except for mindshare.

Visual programming languages effectively enforce the functional programming property of referential transparency. This is a really useful property for code to have. The way they enforce this isn't artificial either; it's simply by virtue of the metaphor they use.
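
A small Python sketch of what referential transparency means in practice (both functions are made-up examples): the pure one can always be replaced by its result, which is the property the dataflow metaphor of a VPL tends to enforce.

    # Referentially transparent: the result depends only on the inputs,
    # so gain(2.0, 3.0) can always be replaced by 6.0.
    def gain(signal, factor):
        return signal * factor

    # Not referentially transparent: the result depends on hidden mutable
    # state, so two identical-looking calls can return different values.
    _accumulator = 0.0
    def accumulate(signal):
        global _accumulator
        _accumulator += signal
        return _accumulator

    print(gain(2.0, 3.0), gain(2.0, 3.0))        # 6.0 6.0 -- always the same
    print(accumulate(2.0), accumulate(2.0))      # 2.0 4.0 -- depends on history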

VPLs make programming accessible to people who would not otherwise be able to program, such as people with language difficulties, like dyslexia, or even just laymen who need to whip up a simple time-saver. Professional programmers may scoff at this, but personally, I think it would be great if programming became a really ubiquitous skill, like literacy.

As it stands though, VPLs are really a niche interest, and haven't gone particularly mainstream.

What we should do differently

All computer science majors should be required to double major, coupling the CS major with one of the humanities: painting, literature, design, psychology, history, English, whatever. A lot of the problem is that the industry is populated with people who have a really narrow and unimaginative understanding of the world, and therefore can't begin to imagine a computer working any significantly differently than it already does. (If it helps, you can imagine that I'm talking about someone other than you, the person reading this.) Mathematics is great, but in the end it's just a tool for achieving. We need experts who understand the nature of creativity, who also understand technology.

But even if we have them, there needs to be an environment where there's a possibility that doing something new would be worth the risk. It's 100 times more likely that anything truly new gets rejected out of hand, rather viciously (the Newton is an example of this). So we need a much higher tolerance for failure. We should not be afraid to try an idea which has failed in the past. We should not fully reject our own failures, and we should learn to recognize when we have failed. We should not see failure as a bad thing, and so we shouldn't lie to ourselves or to others about it. We should just get used to it, because it is just about the only constant in this ever-changing industry. Post mortems are useful in this regard.

One of the more interesting things about Smalltalk, I think, was not the language itself, but the process that was used to arrive at its design. The iterative design process, going through many, many revisions, but also very carefully and critically identifying the flaws of the existing system and finding solutions in the next one. The more perspectives, and the broader the perspectives we have on the situation, the better we can judge where the mistakes and problems are. So don't just study computer science. Study as many other academic subjects as you can get yourself to be interested in.


(2) As usual there's always a counter-example! The MITSyn stream processing language is a pipe-oriented visual programming language from the early 1970s and is still available. - RobS
Really? Could you cite some documentation about this system please? I'd like to find out more. - Breton
(1) Hrmn, just out of curiosity, has it ever struck anyone here how inadequate the metaphor of "language" is for representing a computation, or a program? Imagine the programs we could make if we had a more suitable metaphor. One where getting a semicolon in the wrong place didn't matter. - Breton
Not saying the VPLs of today are "it", but it demonstrates that metaphors other than "language" are possible for representing a computer program, and they each have different advantages and disadvantages. - Breton
(6) Clicking and dragging through text: invented at Xerox PARC in the 70s. GRAIL at RAND in the 60s was both a visual language and tablet driven. - Alan Kay
(1) Damn, this whole thread has just been "Owned" by Alan Kay. But this all kind of proves my point. If there's a significant new idea or invention, none of us would be aware of it. It'll be fiercely guarded by whoever owns it, or entirely unrecognized by most everyone as a "good" idea. - Breton
(2) Alan Kay asking us about significant new ideas is a bit like asking flatlanders to imagine the third dimension. - Breton
32
[+14] [2009-01-20 12:17:18] Jens Roland

The pre-1980 days were, of course, the glory days of Xerox PARC. Back when the GUI, the mouse, the laser printer, the internet, and the personal computer were all being created. (Seeing as I'm too young to have been alive back then, and you were pretty much working on inventing all of those, I can't tell you anything about 1980 that you don't already know, so let's move on.)

The thing is, though, that the pre-1980 days were a lot more vibrant in terms of truly disruptive new technologies. That's the way it is with any new field -- how many game-changing technology advances have you seen in railroads in the past 100 years? How many have you seen in lightbulbs? In the printing press? Once something ignites a hype in the right circles, there is an explosive period of invention, followed by a long period of maturing. After that, you're not going to see the same kind of completely radical changes again UNLESS the basic circumstances change.

Luckily, that might be happening in a number of fields, and it has already happened in a few others:

  • Mobility - smart phones bring computing to a truly portable platform, which will soon include location-based services and proximity-based ad-hoc networks. It's a completely new paradigm that's potentially as game-changing as the GUI has been

  • The WWW (HTTP, HTML and DNS) has already been mentioned and is an obvious addition to the list, since it is enabling global, inexpensive, mainstream rich communication across the globe - all thanks to a computing platform

  • On the interface side, both touch, multitouch (Jeff Han comes to mind) and the Wiimote need mentioning. Currently, they are basically curiosities, but so were the early GUIs.

  • OOP design patterns -- higher level solutions as best practices to hard problems. Depending on your definition of 'computing', it may or may not belong on the list, but if you count OOP as a significant advance pre-1980 (I certainly do), I think design patterns and the GoF deserve a mention too

  • Google's PageRank and MapReduce algorithms - I am pleased to notice I wasn't the first to mention them, and seriously, where would the world be without the principles of both of them? I vividly remember what the world looked like before them, and suffice it to say Google really IS my friend. (A toy PageRank sketch follows this list.)

  • Non-volatile memory -- it's on the hardware side, but it is going to play a significant role in the future of computing - making bootup times a thing of the past, for example, and enabling us to use computers in entirely new ways

  • Semantic (natural language) search / analysis / classification / translation... We're not quite there yet, but companies like Powerset give the impression that we're on the brink.

  • On that note, intelligent HTMs should be on this list as well. I am yet another believer in Jeff Hawkins' model and approach, and if it works, it will mean a complete redefinition of what computers can do, what it means to be human, and where the world can go from here. Creating a real intelligence in that way (synthetically) would be bigger than anything the human race has accomplished before.

  • GNU + Linux

  • 3D printing / rapid prototyping (and, in time, manufacturing)

  • P2P (which also lead to VoIP etc.)

  • E-ink, once the technologies mature a bit more

  • RFID might belong on the list, but the verdict is still out on that one

  • Quantum Computing is the most obvious element on the list, except we still haven't been able to get enough qubits to play along. However, my friends in the field tell me there's incredible progress going on even as we speak, so I'm holding my breath for that one.

  • And finally, I want to mention a personal favourite: distributed intelligence, or its other name: artificial artificial intelligence. The idea of connecting a huge number of people in a network and allowing them access to the combined minds of everyone else through some form of question answering interface. It's been done a number of times recently, with Yahoo Answers, Askville, Amazon Mechanical Turk, and so on, but in my mind, those are all missing the mark by a LOT... much like the many implementations of distributed hypertext that came before Tim Berners-Lee's HTML, or the many web crawlers before Google. Seriously -- someone needs to build a search interface into 'the hive mind' to blow everyone else out of the water. IMHO - it is only a matter of time.
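
As a toy illustration of the PageRank item above, here is a power-iteration sketch in Python; the four-page link graph and the 0.85 damping factor are illustrative, nothing like Google's production setup (dangling pages are ignored for simplicity):

    # Toy PageRank by power iteration on a tiny, hand-made link graph.
    links = {
        "a": ["b", "c"],
        "b": ["c"],
        "c": ["a"],
        "d": ["c"],
    }

    def pagerank(links, damping=0.85, iterations=50):
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
            for page, outgoing in links.items():
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share   # each link passes on rank
            rank = new_rank
        return rank

    print(pagerank(links))   # "c" ends up with the highest rank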


(1) Design patterns were around earlier, just not applied to software engineering (Christopher Alexander wrote about them as applied to architecture). Arguably by their nature, they are discovered, not invented. That the GoF was able to write about them meant that they were pre-existing. - Adam Jaskiewicz
I know about architectural design patterns, but my point is still valid. The car was still invented in the late 1800s even if the locomotive existed before then. Software patterns are a different beast. - Jens Roland
(1) Lightbulbs? How about LEDs? They used to be only green and red. The blue LED was the holy grail just 15 years ago, now they are everywhere. I remember seeing white LEDs for the first time in a Hewlett-Packard lab in 1998. The tungsten lightbulb is about to be outlawed due to its power consumption. Right now, we are in the biggest technology change of illumination since neon lights. - Guge
Linux is a significant new invention?! GNU?! Linux is a monolithic kernel ('50s or '60s) that is UNIX-like ('70s). The GNU project as a whole aims to catapult us into the future by providing a free clone of a '70s operating system. There's nothing new or innovative in either (aside from the chutzpah of their proponents claiming innovation). - JUST MY correct OPINION
Wikipedia says MVC was around in 1979, and it's definately a pattern although perhaps not formally described at the time in Alexandrian form. http://en.wikipedia.org/wiki/Model–View–Controller - Dafydd Rees
Non-volatile memory...ever seen "beads on wires" core? Nice to have it back, of course... - dmckee --- ex-moderator kitten
33
[+14] [2009-04-18 18:57:52] Francois

Reorganization is what we need, not reinvention.

We have all the hardware and software components we need right now to do amazing things for years to come.

I believe there is a disease in the Sciences, where every participant is always trying to invent something new to distinguish themselves from others. This is in contrast to doing some of the messy work of cataloging or teaching older works.

People who build 'new' things are generally considered of a higher pedigree than people who reuse existing, and sometimes almost ancient, works. (Ancient to, say, a 20-year-old, to whom something like Lisp (1958) was created more than double their lifetime in the past.)

Good old ideas need to be resurrected and propagated far and wide, and we need to stop trying to build businesses or programmer movements that effectively trample old works and systems in power plays to be the next new thing, when in fact most 'new shiny' things are just aspects of old ideas resurrected.


Yes, it is. The O/S is taken from the iPhone and the concepts of tablet computers and systems designed for consumption (think: set top boxes and java applets) are not new either. - ConcernedOfTunbridgeWells
@Nils: the iPad is an Apple Newton device, 25 years later. en.wikipedia.org/wiki/Newton_%28platform%29 - Dean J
(1) Meh, The iPad is a stone tablet, 25 million years later. - xelco52
34
[+12] [2009-01-11 14:23:40] Cade Roux

Effective Parallelization and Quantum Computing - I think these are two areas where progress has been made and much more progress will be made to make very significant changes to our use of computing power.

Effective Parallelization meaning parallelizing and distributing processing without the need for special programming techniques, but where it is built into the compiler/framework.


Both of them are still promising but not widely used. Especially quantum computing - hey, you could break RSA, but up to now factoring 15 is an amazing achievement. And while the complexity of building classical computers scales linearly, that of quantum computers "scales exponentially". - Blaisorblade
(12) The Burroughs B5000 designed in 1961 and deployed in 1962-3 was shipped with multiple CPUs and a higher level language and automatic hardware support to allow this to be done safely - Alan Kay
35
[+12] [2009-01-12 04:11:27] Kip

Flying cars and hoverboards. Oh wait, those haven't been invented yet. But by 2015, we have to have them. Otherwise Back To The Future 2 will have been a big lie!


Thanks! I guess not as many people liked the humor here though, since i'm still at 0 - Kip
when Doc is about to try the time machine for the first time, and travel "into the future", he's aiming for 25 years ahead of 1985... which is to say, now. I want my Mr. Fusion. - Dean J
36
[+12] [2010-05-30 00:17:45] Barry Brown

One thing that hasn't changed in mainstream computing is the hierarchical filesystem. That's a shame, IMO, since some work was being done in the late 1980s and 1990s to design new kinds of file systems more appropriate for modern, object-oriented operating systems -- ones which are OO from the ground up.

The OO operating systems tended to have flat object stores that were expandable and flexible. I think the EROS Project [1] was one built around that idea; PenPoint OS [2] was a 1990s object-oriented OS; and Amazon S3 [3] of course is a contemporary, flat object store.

There are at least two ideas in OO, flat filesystems that I particularly liked (see the sketch at the end of this answer):

  • The entire disk was essentially swap space. Objects exist in memory, get paged out when they are not needed, and brought back in when they are. There's no need for a hierarchical filesystem that's separate from virtual memory. Programs are "always running," in a sense.

  • A flat file/object store allows content to be indexed and searched, rather than forcing the user to decide -- ahead of time -- where the content will live in relation to other content and what its name shall be. A hierarchical system could be built on top of the flat storage, but it's not required.

As Alan Cooper states in his book, About Face [4], hierarchical filesystems are a kludge, designed for the computers of the 1960s and 1970s with limited memory and disk storage. Sadly, the popularity of Windows and Unix has guaranteed the dominance of the hierarchical filesystem to this day.
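
A minimal sketch, in Python, of the flat, content-indexed object store idea described above; the class and field names are hypothetical, purely to show storing objects by ID and finding them by attribute rather than by path:

    import uuid

    class FlatObjectStore:
        """Objects live in one flat space, keyed by ID, and are found by search."""

        def __init__(self):
            self._objects = {}   # id -> object (no directories, no paths)
            self._index = {}     # (attribute, value) -> set of ids

        def put(self, obj):
            oid = str(uuid.uuid4())
            self._objects[oid] = obj
            for key, value in obj.items():
                self._index.setdefault((key, value), set()).add(oid)
            return oid

        def find(self, **criteria):
            """Search by content instead of remembering where something 'lives'."""
            matches = None
            for key, value in criteria.items():
                ids = self._index.get((key, value), set())
                matches = ids if matches is None else matches & ids
            return [self._objects[i] for i in (matches or set())]

    store = FlatObjectStore()
    store.put({"kind": "note", "title": "groceries"})
    store.put({"kind": "photo", "title": "holiday"})
    print(store.find(kind="note"))   # [{'kind': 'note', 'title': 'groceries'}]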

[1] http://www.eros-os.org/design-notes/DiskFormatting.html
[2] http://en.wikipedia.org/wiki/PenPoint_OS
[3] http://aws.amazon.com/s3/
[4] https://rads.stackoverflow.com/amzn/click/com/1568843224

You should add Plan 9 to that list - luser droog
Plan 9 flips the whole memory/filesystem relationship around. The OSs I described have no filesystem; everything lives in memory. Plan 9, in contrast, has (in a sense) no memory; everything is a file in the filesystem. - Barry Brown
Understood. But it does represent a radical departure from Windows's notion of a hierarchical filesystem. Compared to Unix, of course, the departure is more "evolutionary" than "revolutionary". - luser droog
37
[+10] [2010-07-08 10:32:44] Eric

Pretty much everything important in modern 3D computer graphics. Ray-tracing (in the computer graphics sense) got its jump start from Whitted's 1980 paper. Marching cubes ('87) is the standard way to extract an isosurface from 3D data.
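
As a hedged sketch of the ray-tracing idea (a toy in Python, not Whitted's actual algorithm), here is the ray-sphere intersection test that a recursive Whitted-style tracer is built on:

    import math

    def ray_sphere_intersect(origin, direction, center, radius):
        """Return the distance along the ray to the nearest hit, or None.

        origin, direction, center are 3-tuples; direction is assumed normalized.
        """
        oc = tuple(o - c for o, c in zip(origin, center))
        b = 2.0 * sum(d * o for d, o in zip(direction, oc))
        c = sum(o * o for o in oc) - radius * radius
        discriminant = b * b - 4.0 * c        # a == 1 for a unit direction
        if discriminant < 0:
            return None                       # the ray misses the sphere
        t = (-b - math.sqrt(discriminant)) / 2.0
        return t if t > 0 else None           # hit must be in front of the origin

    # A ray from the origin straight down -z toward a sphere at (0, 0, -5):
    print(ray_sphere_intersect((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0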


38
[+9] [2009-01-19 15:44:29] steffenj

Virtual Worlds in which you are represented by a virtual alter ego (aka Avatar), for socializing and roleplaying.

Most commonly referred to as MMOs - Massive(ly) Multiplayer Online. Some popular examples include World of Warcraft, Everquest, Second Life.

PS: no, they still don't require the heavy headgear as typically depicted in geek movies of the 80s. It's a shame....


39
[+8] [2009-01-19 17:03:36] steffenj

Touchscreens and Motion Sensing interfaces for human computer interaction.

For example:

  • Touchscreens for PDAs, iPhone or Nintendo DS
  • Motion Sensing, Nintendo Wii Controller or (to a lesser degree) SixAxis controller for Playstation 3.

Only question is ... are these technologies really post-80s?


Touchscreens are 1960s era in origin, and part of the PLATO system in 1972. One of the games on PLATO using touchscreen? "squish the bug" - Andrew Dalke
Bear in mind that the PLATO terminal alone cost several tens of thousands of dollars, and needed to be connected to a CDC Cyber mainframe, at $10 million or so. And these were 1960's era dollars, so multiply everything by at least 10, and probably closer to 100. - Eric Brown
40
[+8] [2009-02-17 22:38:03] hasen

BitTorrent [1].

[1] http://en.wikipedia.org/wiki/BitTorrent_%28protocol%29

41
[+7] [2009-01-11 21:59:16] Domchi

As for programming concepts, IoC / Dependency Injection in 1988, with roots in 1983. Fowler has some notes on the history of the concept on his Bliki [1].

[1] http://martinfowler.com/bliki/InversionOfControl.html
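
A minimal sketch of the dependency-injection idea in Python (the class names are invented for illustration): the service receives its collaborator from the outside instead of constructing it itself, which is what makes swapping in a test double trivial.

    class SqlOrders:
        """'Production' dependency (hypothetical)."""
        def fetch(self):
            return ["order-1", "order-2"]    # imagine a real database call here

    class FakeOrders:
        """Test double injected in place of the real data source."""
        def fetch(self):
            return ["test-order"]

    class ReportService:
        def __init__(self, orders):
            # The dependency is injected via the constructor rather than
            # created inside the class (no `orders = SqlOrders()` here).
            self._orders = orders

        def summary(self):
            return f"{len(self._orders.fetch())} orders"

    print(ReportService(SqlOrders()).summary())   # wired for "production"
    print(ReportService(FakeOrders()).summary())  # wired for a unit test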

42
[+7] [2009-01-12 20:18:06] Paul W Homer

Access to massive data.

The sheer size and scale of the data we have available these days is massive compared to what it used to be in the 80s. We've had to make a large number of changes to both our hardware and software to be able to store and display this stuff. One day, we'll actually learn how to qualify and mine it for something useful. Someday.

Paul.


43
[+7] [2009-01-19 16:14:36] Robert K

Premise: virtually no new inventions since 1980.

The first thing to do is define invention [1], or else you'll get off on the wrong track. The second definition of invention from Dictionary.com says:

U.S. Patent Law. a new, useful process, machine, improvement, etc., that did not exist previously and that is recognized as the product of some unique intuition or genius, as distinguished from ordinary mechanical skill or craftsmanship.

Thus, since 1980, there have been very few new inventions in computing. What has there been? Obviously there have been large numbers of new technologies and new things coming about, but what are they?

We aren't inventing any more, we are improving what primarily exists already.

A simple example:

The CD, or compact disc, was first started in 1977, though it wasn't accepted by industry until 1982, by which time the first factory for pressing CDs had just come into readiness. Eventually, by 1985, the CD-ROM (Read-Only Memory) was accepted as a medium. The CD-RW followed 5 years later. (Source: Wikipedia [2])

Now what? Well, given that we have larger hard drives (still just improvement on the paradigm) we need more space to be able to supplant the VHS market and make videos compatible with computers. Thus came about the DVD, though I am cutting out many improvements to the existing CD technology.

The DVD came about, was "invented", during the year of 1995. (Source: Wikipedia [3])

Since then we have had:

  1. Writable, and ReWritable DVDs
  2. Dual-layer DVDs
  3. Triple- and Quad-layer DVDs (unreleased though feasible through a simple driver revision)
  4. HD-DVD
  5. Blu-ray Disc

Obviously this list isn't all inclusive. But spot the new invention, remember the definition I gave above, in that list. You can't! They're all just variations on the concept of an optical disc, all just variations on the same hardware, and all just variations on existing software.

WHY?

Cost. See, it's cheaper economically to make incremental improvements to an existing product. If I can sell you an HD DVD or a Blu-ray Disc because you believe it to be necessary or cool, then I have no need to release my plans for the triple- or quad-layer DVDs. In fact, I can charge you through the nose just to get the new technology, because you are an early adopter and you need my "new and improved!" hardware.

This is called either marketing, or product relations.

But what about software?

What about it? Pre-1980 there was a lot of software inventiveness going on, but since then it has mostly just been improvements on what already exists or reinvention of the wheel. Look at any OS or office package to see this.

Conclusion

As far as I'm concerned, there have been virtually no new inventions in the past 29 years. I could wax long and cross a great many industries, but why should I bother? Once you start thinking about it, and start comparing an "invention" to a prior, similar product... you'll find it is so similar that it isn't even funny. Even the internal combustion engine has been around since the late 1800s with no new inventions in that field since then; many improvements and variations of this "wheel", yes, but no new inventions.

Not even that new weapon America deployed in Iraq--the one that uses microwaves to make a person feel shocked like they touched a lightbulb--is new. The same idea was used in security systems, then classified and taken off the market, with ultrasound to make an intruder feel physically ill. This is a directed form of the weapon with a different wavelength and application, not a new invention.

[1] http://dictionary.reference.com/browse/invention
[2] http://en.wikipedia.org/wiki/Compact_Disk#History
[3] http://en.wikipedia.org/wiki/DVD#History

It's funny you mentioned U.S. Patent Law in your definition, because if you look at patents, especially software patents - you have quite a lot of "inventions" since 1980 ;-), it's a shame that they aren't real inventions just some kind of parodies, just like you said... - inkredibl
Software patents are mostly just "conceptual patents", which cover an umbrella of regions. These sorts of patents are an abuse of the system, in my opinion. They also aren't inventive at all. >_< - Robert K
(4) The first 'computers' where really just an improvement on electronic calculators. The first electronic calculators were really just an improvement on mechanical adding machines. The first mechanical adding machines were really just an improvement on an abacus. The first abacus was really just an improvement on using your fingers. The first fingers were really just an improvement on legs, and the first legs were really just an improvement on wriggling around like a worm. - Kirk Broadhurst
(2) In a similar analogy, the Great Wall of China was never 'built'. Individual bricks were laid that ever-so-slightly improved on what was already there. And every day people would say 'it's only slightly longer than it was yesterday, that's not exciting'. - Kirk Broadhurst
I don't see why, but you seem to have divorced development, effort, and planning from their association; it took great planning to get the Great Wall of China built. And the abacus wasn't an improvement on using fingers, unless you used base-6 notation with both hands and feet! An abacus can represent far larger numbers. Find me some "missing links" that were the surviving transitional fossils between these distinct inventions ... because last I saw people still use fingers, abacuses, calculators, and computers (yet more sophisticated an adding machine). - Robert K
(1) You also forgot the slide-rule, which operates on yet other principles. I suggest you think more critically about this: the abacus originated in Babylon long before the Greeks, as far as we know, and represented a major advancement; the slide rule is based on logarithms and was invented in the 1600s; and the oldest mechanical device (of mathematical & navigational nature) is the Antikythera device from about 150 - 100BC. Considering the near perfection and age of this device there had to be prior work, yet the abacus was/is still used. These in no way form a "fossil record" of the calculator! - Robert K
44
[+7] [2009-01-19 17:12:11] ShuggyCoUk

Electrically Erasable Programmable Read-Only Memory, generalized into non-volatile read/write memory, the most well-known and ubiquitous form currently being Flash. http://en.wikipedia.org/wiki/EEPROM lists this as being invented in 1984.

By giving the storage medium the same general physics, power requirements, size and stability as the processing units we remove this as a limiting factor in designs for where we place processors. This expands the possibilities for how and where we place 'intelligence' to such a plethora of smart devices (and things that would previously never have been candidates for being considered smart at all) that we are still taken up in the surge. Mp3 players are really just a fraction of this.


-1: The EEPROM existed in 1978. The subsequent lowering of the voltage, and messing about with gates, does not constitute a "really new idea" in computing, even though it did make a dramatic difference to the ease of designing circuits and computing devices that could erase memory in situ. - Charles Stewart
@Char Hmm the update to the page for the date does make this tricky to put in the category now. Will update to reflect shortly - ShuggyCoUk
45
[+7] [2009-01-20 22:33:16] nezroy

Optical computing. Seems like it should have been around longer but I can't currently find any references pre-dating 1982 or so (and the relevant piece of technology, the optical transistor, didn't pop up until 1986).


My dad knows a man that patented a holographic computer, which was 100% holographic. I've no idea how it worked, but it was supposedly an extremely fast system. - Robert K
46
[+7] [2009-11-03 22:05:17] Michael Zilbermann

Well, the World Wide Web has already been mentioned, but more basically, I would say "DNS". It seems that it was invented in 1983 (http://en.wikipedia.org/wiki/Domain_Name_System), and IMHO we can consider it the mandatory link between the invention of the internet protocol and the capability to spread all over the world what is now called the web.

Still in the "network" section, I would add WIFI. It was invented in the 90's (but I agree it's not exactly "computing", but more related to hardware).

In a more strictly "algorithmic" section, I think of turbocodes (dated 1993); some say they only close in on the limit defined by Shannon's signal theory, but wouldn't that argument reduce all other answers to "everything was already in seed in the writings of Lovelace, Babbage and Turing"?

In the field of cryptography, I would add the PGP program by P. Zimmermann (dated 1991), which brought a quite robust (at the time) free encryption program to the citizen, and contributed to shaking the government's posture on encryption a little. In fact, I think it was one of the factors in the "liberalization" of cryptography, which was a prerequisite for developing e-commerce.


47
[+6] [2009-01-11 13:52:50] Richard Harrison

The changes to infrastructure to allow accessible internet from home and office.

Documented and accepted standards from W3C through to APIs

Apart from that most of what we'd think of as new dates back a lot longer than you'd think (e.g. GUI, OOP).


48
[+6] [2009-01-11 22:05:47] brabster

I think the laptop was invented around 1980 and I also think that the development of laptops and portable computing changed a lot of people's lives - certainly those of us who work in IT, or who use computers and travel.


You do know that Dr. Kay originated the idea for the laptop, known as the Dynabook back then. Even as late as 1994, when I first read about the Dynabook, I hoped that something as "good" as its design would come to market. And here we are. - Robert S.
The dynabook sketch looks similar to a TRS-80 model 100 (released in 1984 IIRC) - finnw
49
[+6] [2009-01-11 22:11:47] community_owned

I'd say the biggest trend is an ever-increasing lack of location dependence and pervasiveness. An interesting philosophical exercise these days is to count the computers in your immediate area. They're everywhere: desktops, keyboards, microwaves, radios, televisions, cell phones, etc. My grandmother is computer illiterate, yet her life is as infested with small computers as everyone else's. She can make a call to me from the middle of an empty field. I can then answer that call zipping down the highway.


50
[+6] [2009-01-12 01:48:32] Portman

Declarative Programming.

In 1979 "computer programs" were imperative. The programmer was expected to instruct the compiler on both what to do and how to do it. (N1)

Today, ASP.NET WebForms [1] and WPF [2] programmers regularly write code without knowing or caring how it will be implemented. Wikipedia [3] has other, less mainstream examples. Additionally, all of the SGML [4]-derived "markup" languages are declarative, and I doubt many of the programmers of 1979 would have predicted their importance or ubiquity in 30 years.

Although the concept of declarative programming existed before 1980 (see this paper [5] from 1975), its invention took place with the introduction of Caml [6] in 1985 (debatable) or Haskell [7] in 1990 (less debatable). (N2) Since then, declarative programming has increased greatly in popularity. And, when massively multicore processors finally arrive, we'll all be declarative programmers.
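
A small contrast, sketched with Python's standard sqlite3 module, of saying how (an explicit loop) versus what (a declarative SQL query); the table and column names are invented for the example:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [("alice", 30.0), ("bob", 20.0), ("alice", 15.0)])

    # Imperative: spell out *how* to compute the total, step by step.
    total = 0.0
    for customer, amount in conn.execute("SELECT customer, amount FROM orders"):
        if customer == "alice":
            total += amount
    print(total)

    # Declarative: state *what* you want; the engine decides how to get it.
    (total,) = conn.execute(
        "SELECT SUM(amount) FROM orders WHERE customer = 'alice'").fetchone()
    print(total)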

--
Notes:
(N1) I can't vouch for this firsthand, since I was a fetus in 1979.
(N2) From other answers, it seems like people are confusing conception with invention. Da Vinci conceived of a helicopter, but he didn't invent it. The question is specifically on inventions in computing.
(N3) Please don't mention Prolog (rel. 1975) in the comments unless you have actually built an app in it.

[1] http://msdn.microsoft.com/en-us/library/ms973868.aspx
[2] http://msdn.microsoft.com/en-us/netframework/aa663326.aspx
[3] http://en.wikipedia.org/wiki/Declarative_programming
[4] http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=16387
[5] http://portal.acm.org/citation.cfm?doid=953064.811138
[6] http://caml.inria.fr/
[7] http://www.haskell.org/

Oracle and IBM came in 1979 with commercially available SQL databases, so the use of declarative programming is older than 1980. - tuinstoel
(1) Declarative programming is a bit of an overloaded term. Declarative programming according to microsoft is usually a smart way of using XML to configure an application. Functional languages like Lisp, Scheme and Haskell allow for a different form of declarative programming. - Mendelt
(3) Ivan Sutherland's Sketchpad was completely programmed declaratively and had no imperative features. And it wasn't the last declarative system done before 1980. - Alan Kay
In my humble opinion - there's no such thing as declarative programming. Even if you say only WHAT needs to be done you still know HOW it will be done, and if you don't - weird things will happen from time to time and you will have no clue WHY until you know HOW. That's why humans are needed here. - inkredibl
(3) Suppose this were only about solving linear equations. We supply the relationships that we want to have simultaneously hold, and the solver program solves or says there is no solution. We know how the solver program works, but are programming completely declaratively....? - Alan Kay
Oh dear. I don't know what "Declarative programming" is, but I guess Prolog and ML fit the bill ‹check WP› yup it does. - niXar
@niXar: Caml was 1985. I will update the post now. And as for Prolog... I stand by my original footnote. - Portman
51
[+6] [2009-01-12 04:45:20] Jared

Podcasting. It allows for an informative way to distribute information and debate. I find it more interactive than standard interviews, but with less noise than blog comments.


Harder to examine, though, and requires extra equipment (earphones) to avoid bothering cow-orkers. - Adriano Varoli Piazza
52
[+6] [2009-01-12 06:56:52] VonC

Instant Messaging [1] has been around for a long time (mid-to-late '60s), but IRC [2] did not come along before 1988.

Video communication, on top of that, (as in, for instance, Windows Live Messenger [3], or Skype, or ...) really did change the way we are communicating [4] ;) and is much more recent.


<correction>
(see Videoconferencing: 1968 [5], image: Engelbart's 1968 On-Line System videoconferencing demo [6], as Alan Kay himself points out in the comment:

Again, please check out what Engelbart demoed in 1968 [7] (including live video chatting and screen sharing). IOW, guessing really doesn't work as well as looking things up. This is why most people make weak assumptions about when things were invented.)

Take that in my face ;), and rightfully so.

Note: the "webcam" (video setup) of those times were not exactly made for your average living-room ;)

</correction>


[... resuming the answer:]

The generalization of the webcam [8] (image: Logitech QuickCam Pro 4000 [9]) helped too (started in 1991, the first such camera, called the CoffeeCam, was pointed at the Trojan Room coffee pot in the computer science department of Cambridge University).

So: Post-1980: 2 out of 3: IRC and Webcam.

[1] http://en.wikipedia.org/wiki/Instant_messaging
[2] http://en.wikipedia.org/wiki/IRC
[3] http://en.wikipedia.org/wiki/Windows_Live_Messenger
[4] https://stackoverflow.com/questions/287764#287776
[5] http://www.answers.com/topic/videoconferencing#History
[6] http://wpcontent.answers.com/wikipedia/en/thumb/6/64/On_Line_System_Videoconferencing_FJCC_1968.jpg/180px-On_Line_System_Videoconferencing_FJCC_1968.jpg
[7] http://www.bootstrap.org/chronicle/pix/pix.html#2D
[8] http://www.answers.com/topic/web-cam#History
[9] http://wpcontent.answers.com/wikipedia/commons/thumb/c/c5/Logitech_Quickcam_Pro_4000.jpg/180px-Logitech_Quickcam_Pro_4000.jpg

(3) Again, please check out what Engelbart demoed in 1968 (including live video chatting and screen sharing). IOW, guessing really doesn't work as well as looking things up. This is why most people make weak assumptions about when things were invented. - Alan Kay
I got smacked in the face by Alan Kay! I will never wash that cheek again ;) I guess my answer only contains two post-1980 bits of "invention" instead of 3 (even if their concept was around before): IRC (1988) and webcam (1991). - VonC
(1) How does IRC count for anything? It's a real-time chat medium -- an incremental refinement of something I'd been using long before IRC existed. - JUST MY correct OPINION
53
[+6] [2009-02-11 16:47:27] Bahaa Zaid

“Americans have no past and no future, they live in an extended present.” This describes the state of computing. We live in the 80's extended into the 21st century. The only thing that's changed is the size. - Alan Kay

Source: Alan Kay: Is Computer Science an Oxymoron? [1]

[1] http://www.windley.com/archives/2006/02/alan_kay_is_com.shtml

54
[+6] [2009-02-14 18:37:35] Ellery Newcomer

The memristor.

While the idea is not newer than 1980, I believe a working model was not created until 2008. Should it make it past R&D, it will be the most significant advance in computer hardware since the transistor; at the very least, obviating secondary memory.


55
[+5] [2009-01-11 21:47:20] Alex Baranosky

I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask "Why?" and "What should we be doing?"

The way that I see it, we have not had so many new ideas in computing because we largely haven't needed them. We have been milking the old ideas, and getting so much out of them, such as the phenomenal growth of cpu speed.

When we need new ideas because the "well has run dry" so to speak, then we will see that necessity is the mother of invention.


> The phenomenal growth of cpu speed. Well, this is something close to an end. Research on alternatives to silicon is indeed picking up, with examples such as graphene transistors: technologyreview.com/… - Blaisorblade
Yes, I know it is coming to an end, and so I am sure a new chapter of computing will be born from it. Necessity is the mother of invention, right? - Alex Baranosky
I think it's clear at this point that advancement in cpus is coming more from parallelization than speed. - Adam Lassek
56
[+5] [2009-01-11 21:57:29] dkretz

The one activity I can think of that wasn't there in 1980 was Global Searching Across Disjoint Domains, i.e. Google and a (very few) predecessors - all of which were well post-1980. Associated with conventions for syntactic markup, I think it qualifies as a "new idea"; but I think it also has only just begun; there's a lot of overhead space to build up into.

One device that has the potential to accelerate this already lightning-speed vector will soon emerge as the combination camera/GIS/phone/network. It creates the opportunity to automatically collect, classify, and aggregate datapoints in four-dimensional space for the first time. Even tedious manual collections of this type of data are sprouting; imagine when it's done by default.

For better or worse.


57
[+5] [2009-01-14 00:46:49] jeffD

Design Patterns which brought computer science closer to computer engineering. GPS and internet address lookup for location based interactions. Service Oriented Architecture (SOA).


58
[+5] [2009-01-17 01:19:49] Tom A

Open PC design that led to affordable components (except from Apple :-) and competition that drove innovation and lower prices. This caused the big change from the user going to the computer -- where there was a terminal to use -- to the computer coming to the user and appearing at home and even in one's lap.


And keep in mind that because of this Macs are now on the same architecture as everybody else too ;-). - inkredibl
Multiple manufacturers were delivering S100 bus designs to users in 1976. - Dour High Arch
59
[+5] [2009-01-24 19:37:23] Jeff Moser

Games With a Purpose [1] - Collective intelligence tools like the ones Luis von Ahn and his team are developing might have been a dream before 1980, but there wasn't a widely deployed network with millions of people available and a need (e.g. reCAPTCHA [2]) to actually make it happen.

[1] http://www.gwap.com/
[2] http://recaptcha.net/

60
[+5] [2009-02-04 04:13:40] James Cape

IP Multicast (1991) and Van Jacobsen's Dissemination Networking [1] (2006) are the biggest inventions since 1989.

[1] http://video.google.com/videoplay?docid=-6972678839686672840&hl=en
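
A minimal sketch of joining an IP multicast group with Python's standard socket module (the group address and port are arbitrary example values, and error handling is omitted):

    import socket
    import struct

    GROUP = "224.1.1.1"   # example multicast group address
    PORT = 5007           # example port

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Ask the kernel to join the group on the default interface; datagrams
    # sent to GROUP:PORT by any sender are then delivered to every member.
    membership = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)

    data, sender = sock.recvfrom(1024)
    print(sender, data)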

61
[+5] [2009-05-07 16:46:26] Steve Steiner

This is a negative result, which is odd as a "fundamental innovation", but I think it applies since it opened new areas of research, and closed off useless ones.

The impossibility of distributed consensus: PODC Influential Paper Award: 2001 [1]

We assumed that the main value of our impossibility result was to close off unproductive lines of research on trying to find fault-tolerant consensus algorithms. But much to our surprise, it opened up entirely new lines of research. There has been analysis of exactly what assumptions about the distributed system model are needed for the impossibility proof. Many related distributed problems to which the proof also applies have been found, together with seemingly similar problems which do have solutions. Eventually a long line of research developed in which primitives were classified based on their ability to implement wait-free fault-tolerant consensus.

[1] http://www.podc.org/influential/2001.html

62
[+5] [2009-10-16 08:11:52] Callie J

Low cost/home computing. Something that (at least here in Blighty) wasn't really heard of until the early 1980s. Without home computing, how many people posting here would have got into computing as a career? Or even as a hobby 1 [1]?

Myself, had my folks not got Clive Sinclair's humble rubber-keyed ZX Spectrum [2] back in 1982/1983, I probably wouldn't be here now. And it wasn't just the Speccy: the C64 [3], Vic-20 [4], Acorn Electron [5], BBC A/B/Master [6], Oric-1 [7], Dragon-32 [8], etc. all fuelled the home computer market and made programmers out of every 8 year old boy and girl who had access to one.

If that wasn't a revolution in terms of computing and programming, I don't know what was...!

1 [9] curious aside: what is the breakdown of hobbyists vs pro programmers on this site? I realise these stats aren't collated, but could be interesting to know.

[1] http://en.wikipedia.org/wiki/ZX_Spectrum
[2] http://en.wikipedia.org/wiki/ZX_Spectrum
[3] http://en.wikipedia.org/wiki/Commodore_64
[4] http://en.wikipedia.org/wiki/Commodore_VIC-20
[5] http://en.wikipedia.org/wiki/Acorn_Electron
[6] http://en.wikipedia.org/wiki/BBC_Micro
[7] http://en.wikipedia.org/wiki/Tangerine_Computer_Systems#Oric-1
[8] http://en.wikipedia.org/wiki/Dragon_32/64#System_software
[9] http://en.wikipedia.org/wiki/ZX_Spectrum

Low cost/home computing - revolutionary, yes, but it was essentially an economic sea change in computing, not an invention. Were there any particular siginificant inventions that made it possible? - Charles Stewart
@Charles Stewart - Yes, Sir Clive Sinclair realising that you could fit pretty much all of the discrete logic you needed for the ZX-81/Timex 1000 onto a single ULA chip, which meant that you only needed 4 chips to make a computer: CPU, Ram, Rom and ULA. See en.wikipedia.org/wiki/Gate_array - Mark Booth
63
[+5] [2010-03-03 13:49:22] Sam

Augmented Reality. This hasn't really taken off yet, but as ideas go I think it is huge, from being able to paint virtual arrows on the ground to help you find your destination, to decorating everything around you with useful information or aesthetic fancies.

Imagine your phone ringing across the room; you look at it and an information bubble pops up above it to tell you who is calling. How cool would that be? AR will bring massive changes in the way we think about and interact with technology.

Haunted houses would probably get significantly scarier too.

I also wanted to mention Electroencephalography [1] for brain-computer interfacing, but apparently this was first invented in the 1970's.

[1] http://en.wikipedia.org/wiki/Electroencephalography

64
[+5] [2010-06-25 14:00:20] Behrooz

Virtualization?
Applications like VirtualBox OSE or VMware have saved me many hours.


CP-67 predates 1980 by a long shot. - Windows programmer
From kernelthread.com/publications/virtualization: In the mid 1960s, the IBM Watson Research Center was home to the M44/44X Project, the goal being to evaluate the then emerging time sharing system concepts. The architecture was based on virtual machines: the main machine was an IBM 7044 (M44) and each virtual machine was an experimental image of the main machine (44X). - Charles Stewart
65
[+5] [2010-07-14 07:16:24] Ravindra S

USB [1]

[1] http://en.wikipedia.org/wiki/Universal_Serial_Bus

(3) It's a serial bus standard. Serial data transmission is older than the general purpose computer. Does it have any "really new ideas"? It looks like a standardisation effort to me. - Charles Stewart
(2) The new idea of USB focuses on end user ease of use. A tree of devices that can all communicate on the same bus is a huge improvement. This ease of use is why USB won out over all the other bus standards, in my opinion. - Shane Holloway
USB also allows more than 1 peripheral to connect to the computer's USB port. Lots more than 1, in fact. - Windows programmer
66
[+4] [2009-01-11 13:54:21] Oddthinking

Adoption of Object Orientation.

The idea was around earlier (e.g. Simula), but it became mainstream in the 1990s. (IMHO, one of its greatest benefits is providing a common vocabulary amongst developers, so its widespread adoption made it much more valuable.)


"OO was around earlier (e.g. Simula)... What a beautiful answer to a question from Alan Kay. :-) - Jens Ayton
To expand your comment, Alan Kay is the inventor of Smalltalk, the first hugely relevant OOP language (I think Simula died early, in practical use). The first mainstream Smalltalk was Smalltalk-80, actually :-). - Blaisorblade
@[Blaisorblade]: Honored to have Dr. Kay on this humble site - nevertheless, Simula was technically the first OOP language. Smalltalk was the first "pure" OO environment, i.e. where everything was an object. - Steven A. Lowe
Didn't he also come up with the term OO? - bruceatk
Ooops! I didn't look at the name of the question-asker, and if I had there is no way I would have believed it was that Alan Kay! I also would have gushed embarrassingly about how OO changed my (software development) life, so perhaps it was for the best. - Oddthinking
(6) There were several systems that were as "object-oriented" as Simula I, including a file system (early 60s) in USAF, Sketchpad (1962), the B5000 hardware. The stuff that I gave the term "object oriented" to was a somewhat different orientation that was sparked by these earlier systems (and Biology) - Alan Kay
(1) I work mostly in object-oriented languages and I don't see much evidence of the widespread, commercial adoption of object-oriented programming. :-p - Dafydd Rees
67
[+4] [2009-01-11 21:53:46] Domchi

I would also nominate the 3D mouse. There are several variants in existence from the early 1990s. For anyone working with 3D, things like the SpaceNavigator [1] make life much easier. (Disclaimer: I'm not affiliated with 3Dconnexion in any way, just a satisfied and now RSI-free user.)

[1] http://www.3dconnexion.com/3dmouse/spacenavigator.php

68
[+4] [2009-01-11 23:31:47] Quamis

I believe that nothing important was invented... but the perspective on software has changed a lot since the '80s. Back then there were more theoreticians involved in this thing, and now you are asking this question on a programmers' 'forum'.

Most of the ideas back then didn't get implemented, or when implemented they didn't have any real importance, as the software industry did not exist, nor did marketing or HR or development stages, or alpha versions :).

Another reason for this lack of inventions is the fact that most people use Windows :) Don't get me wrong, I do hate M$, but look at it this way: you have a perfectly working interface, with nothing new to add to it, maybe just some new colored buttons. It's also closed enough that you won't be able to do anything with it without breaking it. That's why I prefer open apps; this way you get more "open" people, to whom you can actually talk, ask questions, and propose new ideas that actually get implemented, or at least put on an open to-do list, thus you get some kind of "evolution". You don't really see anything new because you are stuck with the same basic interface "invented" lots of years ago... has anyone actually tried the ION window manager in a production environment? It has a new kind of interface, and actually lets you do things faster, even if it looks quirky.

M$, Adobe... you name it, they hold lots of patents, so you won't be able to base your work on them, or on derivatives (you also won't know what kind of undeveloped technologies they hold). Look at MP3 and GIF as examples (I believe that they are both free formats now, but they are also kinda dead...). MP3 is the 'king' of audio even if there are a few algorithms out there much better than it, but they didn't get enough traction because they weren't pushed on the consumer market. The GIF... come on, 256 colors??? From this point of view I'm curious how many people from this thread are working on something "open" that will get to be reused in some other projects, and how many on "closed", NDA-protected projects?

Even if it sounds kinda like a "free willy" speech, back in the '80s the software was free, you got documentation for everything, and all hardware was simpler and easier to work with... and also more limited, so people didn't actually waste time implementing 3D games or web pages but worked on real algorithms.


Automatic down vote for anyone who writes "M$". That tired old cliche should have been retired from the vicious Slashdot peanut gallery in the late 90s. It's a shame for computer science that website and worn out anti-Microsoft Linux fanboi-ism remains to this day. - Judah Gabriel Himango
69
[+4] [2009-01-12 15:00:34] kohlerm

The Eclipse IDE [1]

Bringing a Smalltalk-like IDE to the masses ;)

[1] http://www.eclipse.org/

Not just Smalltalk-like; VisualAge, which became Eclipse, was written in Smalltalk. - MkV
(1) So, a reimplementation of an Alan Kay/Xerox idea from 1976? - Charles Stewart
70
[+4] [2009-01-13 08:48:09] Guido

Ctrl-C + Ctrl-V + Ctrl-X combo :)


Don't forget Ctrl-X! I love the mnemonic nature of these - V looks like the tip of a glue bottle (glue pastes) and X looks like scissors (scissors cut). And of course Copy starts with C (at least in English). - D'Arcy Rittich
an even better invention is the clip history. It's unfortunate that this is not built into most operating systems. And also unfortunate that the external programs that supply this functionality have such appalling interfaces and poor integration into the OS - Breton
71
[+4] [2009-02-06 18:58:30] Jeff Read

The first true multimedia personal computer, the Amiga: the first 32-bit preemptive multitasking personal computer, the first with hardware graphics acceleration, the first with multichannel sound and in many ways a far more useful and capable machine than the multicore, multigigahertz Windows boxen that proliferate today.


72
[+4] [2009-02-11 16:27:27] Mike Tunnicliffe

The Bizarre style of development (as described in http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral-bazaar/ by Eric S. Raymond). Raymond credits Linus Torvalds' release of the Linux kernel in 1991 as the first use of the Bizarre style of development.


You are correct, but I actually quite enjoyed the typo, so I left it in. - Mike Tunnicliffe
73
[+4] [2009-08-18 02:45:24] projectshave

Sensor networks: very tiny (nano scale) computers form ad-hoc p2p networks and transmit "sensory" information.

3D printing: Star Trek replicator for physical objects (no Earl Grey tea yet).

DNA computing: Massively parallel computing for some types of problems.


74
[+4] [2009-10-14 23:52:40] JL.

Translation software with community support to make manual corrections and recommendations, followed up with an AI bot to form patterns to eventually distinguish and correctly predict ambiguity in different translations and contexts.

While it's true Google Translate [1] might not be that beast, it is the mother, or perhaps the grandmother of a system just waiting to be developed.

If you think about it - textual language is really input to the brain; the eyes see the text and send images to the brain, which then translates this into understanding.

While it's true that communication (especially human communication) is an advanced topic, the basics are input (with context) -> translation -> understanding.

Why do we still have no really good way to send emails to distant co-workers or partners who don't speak our language? This is obviously Phase 1.

Once this is complete, we can move onto stuff like real-time phone call translation.

Instead, month after month, our greatest intellectual assets are involved in other, more crucial projects, like space research and meteor detection, or trying to prove the Bible wrong (yawn).

How about we dedicate more time to basic practical communication?

[1] http://en.wikipedia.org/wiki/Google_Translate

75
[+4] [2010-03-06 12:55:30] Dan

USB Keys/Thumb drives

USB Keys were the effective replacement of the floppy, where the floppy was still superior to the CD or DVD in simple transfer.


76
[+4] [2010-06-12 02:50:46] Luke101

I think a very important invention for computing in the past 50 years was GOOGLE. The internet means nothing without a good tool to search it. The advent of the search engine revolutionized the internet and enabled it to be monetized by the little guy.


But you do know that search has been around a lot longer? Sure Google made it better and more mainstream but they hardly invented it. - Jonas
77
[+4] [2010-08-03 22:56:02] user224291

RAID [1] (1988).

Arguably this is just an application of error correction codes from years gone by, but then arguably everything in computer science can be reduced to basic mathematics which has been around for millennia.

[1] http://www.cs.cmu.edu/~garth/RAIDpaper/Patterson88.pdf
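
To make the error-correction connection concrete, here is a minimal sketch (purely illustrative, not the paper's implementation) of the XOR parity used in RAID levels 4/5: any single lost data block can be rebuilt by XOR-ing the parity block with the surviving blocks. The block contents are made up for the example.

    from functools import reduce

    def xor_blocks(blocks):
        # XOR equal-length byte blocks together (the RAID parity operation).
        return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

    data_blocks = [b"disk0-blk", b"disk1-blk", b"disk2-blk"]  # same-size stripes on three data disks
    parity = xor_blocks(data_blocks)                          # stored on the parity disk

    # Disk 1 dies: rebuild its block from the survivors plus the parity block.
    rebuilt = xor_blocks([data_blocks[0], data_blocks[2], parity])
    assert rebuilt == data_blocks[1]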

78
[+4] [2010-08-06 00:36:50] Dafydd Rees

Augmented Reality

Where a view of the real world is combined with virtual elements in some way.

The term Virtual Reality was coined in 1989, a few years before the term "Augmented Reality" came into existence.

Some early enabling technologies were invented before 1980 but the concept itself dates from the early nineties (at least that's what Wikipedia says.)

http://en.wikipedia.org/wiki/Augmented_reality#History


79
[+4] [2010-10-09 06:24:57] Paul Johnson

Maybe a forum of science fiction authors would give you more interesting answers? ;-)

I suspect there's a bit of a fallacy at work here: you're viewing the history of technology and science as a steady march of progress, as a linear phenomenon. I suspect it is in fact a process of fits and starts, context, economics, serendipity, and plain ole randomness.

You should feel fortunate that you were at the centre of one of the great waves of history, most people will never have that experience.


80
[+4] [2011-03-01 22:18:16] Domchi

A few answers mention quantum computers as if they're still far in the future, but I beg to differ.

There were vague mentions of the possibility of quantum computers in the 1970s and 1980s (see the timeline on Wikipedia [1]); however, the first "working" 3-qubit NMR quantum computer was built in 1998. The field is still in its infancy, and almost all progress is still theoretical and confined to academia, but in 2007 a company called D-Wave Systems presented a prototype of a working 16-qubit, and later that year a 28-qubit, adiabatic quantum computer. Their effort is notable since they claim that their technology is commercially viable and scalable. As of 2010, they have 7 rigs, and the current generation of their chips has 128 qubits. They seem to have partnered with Google to find interesting problems to test their hardware on.

I recommend this short 24-minute video [2] and the Wikipedia article [3] on D-Wave for a quick overview, and there are a lot more resources on this blog [4] written by D-Wave's founder and CFO.

[1] http://en.wikipedia.org/wiki/Timeline_of_quantum_computing
[2] http://www.youtube.com/watch?v=56qR0iX5A4o
[3] http://en.wikipedia.org/wiki/D-Wave_Systems
[4] http://dwave.wordpress.com/

Paul Black from NIST gave a fascinating talk at the 2011 ACCU conference on "Quantum Computing for Programmers": accu.org/content/conf2011/… - Mark Booth
To my knowledge, D Wave have not shown any computer that shows performance on any algorithm that demonstrably exploits quantum effects to speed up computation relative to a classical computer. -1 for linking to vapourware in an otherwise redundant answer - Charles Stewart
@CharlesStewart How is this vapourware? hpcwire.com/hpcwire/2011-05-26/… Also, have you seen their demo? youtube.com/watch?v=pzFTXYJ2J1I - Domchi
@CharlesStewart BTW, the first link I posted in my comment gives exactly the demonstration you require: "HPCwire: Can you prove that quantum computing is actually taking place? Rose: This was the question we set out to prove with the research published in the recent edition of Nature. The answer was a conclusive 'yes.'" nature.com/nature/journal/v473/n7346/full/nature10012.html - Domchi
domchi: I actually didn't vote you down as I intended to... I didn't say "quantum computation", which has a number of interpretations, but "demonstrably exploits quantum effects to speed up computation relative to a classical computer", which is a more precise claim that I believe that paper does not demonstrate. I don't say what they are doing is not interesting. I do say they have no deliverables, hence vapourware. If quantum computation means machinery exploiting quantum effects, then we have had these since 1964: en.wikipedia.org/wiki/SQUID - Charles Stewart
81
[+3] [2009-01-11 13:43:44] duffymo

MPI and PVM for parallelization.
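
As a minimal flavour of the message-passing model, a sketch using the mpi4py bindings (this assumes mpi4py and an MPI runtime such as Open MPI are installed; the script name in the run command is just an example):

    # Run with, e.g.:  mpiexec -n 4 python hello_mpi.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()          # this process's id within the communicator
    size = comm.Get_size()          # total number of processes

    if rank == 0:
        # rank 0 collects a message from every other rank
        for source in range(1, size):
            msg = comm.recv(source=source, tag=0)
            print(msg)
    else:
        comm.send(f"hello from rank {rank} of {size}", dest=0, tag=0)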


(2) No, concurrent and distributed programming has been considered the "next big thing" since at least the 60s/70s. - BobbyShaftoe
(1) MPI really is some ancient technology. It's awesome that you can write fast parallel code in C but, gag, you shouldn't have to do it at such a low level! (cf. shading languages/CUDA/GPGPU). - Jared Updike
I thought there were MPI bindings for more modern languages, like Java. i.cs.hku.hk/~lchen2/javampi.html - duffymo
It amazes me how little modern programmers know about past programming. This is a classic example. What's next? Thin clients? - Stu Thompson
82
[+3] [2009-01-11 14:07:10] sharkin

Utilization of functional programming/languages within OS core development.


Depending on what you consider a functional language to be, LISP was invented in the 1950s, APL was invented in the 1960s and John Backus (of BNF fame) gave us FP in the 1970s. - jason
(1) still wrong, unfortunately. there were LISP machines long ago, i don't think there's anything so 'core' than that nowadays. - Javier
"Can Programming Be Liberated from the von Neumann Style?" is from 1978. The attempt to apply FP to writing an OS is from Turner in 1985, and gave rise to the whole industry of functional I/O. +1 - Charles Stewart
83
[+3] [2009-01-11 14:19:16] sharkin

'Singularity', and all projects like it, i.e. development of operating systems in managed code.


again, LISP machines and APL code were the original ideas... and failures. - Javier
That's not a post-1980 invention (Lisp and Smalltalk). - Jules
84
[+3] [2009-01-11 16:38:35] Mike Dunlavey

Not sure about 1980, but the AI community has been an idea-generator for decades, and they're still at it.


85
[+3] [2009-01-11 17:28:52] codybartfast

To answer a slightly different question. I think we need big ideas in the areas of Privacy, Trust and Reputation. My computer has the ability to capture almost everything about me, where I am, what I say, what I type, what I see,... A huge amount of information with an equally large number of entities (people, shops, sites, services) with whom I might want to share some of that information even if it's just a single piece of data.

My information needs to be mine (not Google's, Facebook's or Apple's). My computer needs to use it on my behalf, and so trust needs to be end-to-end. Then we can dis-intermediate the new information middlemen.


(3) So, your answer is more about 1984, not 1980. - splattne
Our cell phones are now capable of sampling our location geochronologically (i.e. in four dimensions) to a resolution of 1 sec. of time; and automatically submitting it to the phone network. Asynchronously queued, for efficiency. A conceptual technology pattern increasingly discussed among us here. - dkretz
Like I say, it doesn't address the original question, but today reputation, etc., is typically handled through an intermediary. Google, PayPal or Facebook are today's Ma Bell; the comms is end-to-end but trust is too often through a middleman; it needs to be end-to-end too. - codybartfast
86
[+3] [2009-01-11 20:02:42] Mauli

(Widespread) Encryption. Without encryption no financial transaction would ever take place. And this is still an area which could use more innovation and user-friendliness.


(7) When did the trapdoor and public key ideas get invented? Hint: before 1980 - Alan Kay
87
[+3] [2009-01-15 17:44:03] Daniel C. Sobral

Multi-Agent Systems.

You can go back to the roots of distributed artificial intelligence and, I think, still stay safely on this side of the 80s.

There are many components to multi-agent systems, with lots of studies going into speech acts or cooperation, so it's rather difficult to point and say "See, here, this is different, innovative and important!" But I'll try anyway. :-)

I think the Belief-Desire-Intention model is particularly noteworthy. Agents have internally constructed models of the world. They have particular desires, or goals, and formulate plans on how to interact with the world as they know it to achieve those goals, thereby making up intentions.

Or, to use an analogy, the characters in Tron, the movie, had a certain understanding of how the world around them worked. They did not KNOW the whole world, and they could be mistaken about parts of it. But they had desires and goals, and they came up with plans to try to further them. If you saw Tron, I'm sure you'll get the analogy.

It hasn't had much of an impact on computing YET. But, see, things that have an impact on computing seem to take a few decades anyway. See: OOP, GC, bytecode compilation.
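
A toy sketch of that believe-deliberate-act loop (the names and structure here are purely illustrative, not any particular agent framework):

    from dataclasses import dataclass, field

    @dataclass
    class Agent:
        beliefs: dict = field(default_factory=dict)     # the agent's (possibly wrong) world model
        desires: list = field(default_factory=list)     # goals it would like to achieve
        intentions: list = field(default_factory=list)  # plans it has committed to

        def perceive(self, observation: dict):
            self.beliefs.update(observation)            # revise beliefs from new percepts

        def deliberate(self):
            # commit only to desires that look achievable given current beliefs
            self.intentions = [d for d in self.desires
                               if self.beliefs.get(d["requires"], False)]

        def act(self):
            for plan in self.intentions:
                print("executing plan:", plan["name"])

    agent = Agent(desires=[{"name": "open door", "requires": "has_key"}])
    agent.perceive({"has_key": True})
    agent.deliberate()
    agent.act()     # -> executing plan: open door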


88
[+3] [2009-01-16 13:38:08] open-collar

The massive increases in processor speed that have occurred over the last 30 years can't be overlooked. All manner of clever ideas, such as pipelining and pre-emptive branching, as well as improvements on the electronic side of processor design, mean that programmers today can worry more about the design and maintainability of their programs and less about counting clock cycles.


that's not invention, that is evolution (making things bigger, better, badasser) - steffenj
Moore's Law has been in effect since before 1980 - Stu Thompson
89
[+3] [2009-01-16 14:23:35] Bill Martin
  1. The mouse - There have been posts about human interaction. To me, the mouse was the gateway to human interaction. Without it, we'd still be typing and not clicking and dragging, even with our fingers.

  2. GUI - Complemented the mouse perfectly. I work in an environment where an AS/400 is the backend of one of our major apps. Yeah... interesting stuff, but it just reminds me of the screens 'Bill Gates' is working in in the movie 'Pirates of Silicon Valley', even though that's not what it was. To me, 1 and 2 are the reason anybody, including grandpas and grandmas, can use a computer.

  3. Excel / spreadsheets - Someone mentioned this before, but it's worth mentioning again. It's so user friendly and is a great entry point for non-technical users to try their hand at simple programming concepts when performing calculations on cells. Granted it came out before 1980, but the versions post-1980 are when the technology in spreadsheets evolved.

  4. Internet (of course) - Not sure how people wrote code without it! Don't flame me for repeating because this belongs on every list.

  5. INTELLISENSE - LOVE IT LOVE IT LOVE IT!!!!


(4) Mouse: Engelbart, 1968. GUI: was in Sutherland's Sketchpad, 1963. Internet: 1969. - Andrew Dalke
Perhaps strictly speaking they were invented then, but they weren't in extensive use in the 60s. I thought Al Gore invented the internet? ;) - Bill Martin
90
[+3] [2009-02-13 17:08:04] Oliver Mooney

The successful integration of different programming paradigms into single programming environments.

The exemplar of this (for me) is the Mozart/Oz programming system [1], which integrates functional, OO, logic, concurrent and distributed programming mechanisms into a coherent whole. There are other examples though.

[1] http://mozart-oz.org

91
[+3] [2010-01-04 08:12:52] rocknroll

The rise of motion sensors in gaming, which does away with traditional game joysticks and brings the user very close to the game itself. This complements our ever-changing urban landscape and lifestyle, where we have limited physical activity. This advancement in gaming definitely induces at least some physical activity while doing something that one enjoys. It is definitely better than doing the same mundane reps at your gym.


92
[+3] [2010-05-25 17:46:18] naasking

I think most concepts in computing have mostly been undergoing refinement, but there have been some new developments, particularly in distributed computing.

  1. Robustness against failure and defection, and failure recovery, i.e. Paxos, Byzantine fault tolerance, etc.
  2. I know people have mentioned P2P, and that P2P communication was happening in the 70s, but with all due respect I don't think it was of the same nature as is commonplace today, with distributed hash tables, efficient dynamic ad-hoc networks, and most importantly, anonymity (a la Freenet, Tor); see the consistent-hashing sketch below.

The majority of work has been refinement, and while many modern systems are little better than the original concepts first described in the 60s or earlier, some are orders of magnitude better.
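
As promised above, a minimal consistent-hashing sketch of the DHT idea (illustrative only; real systems such as Chord or Kademlia add routing tables, replication and churn handling on top of this; the node names are made up):

    import bisect
    import hashlib

    def ring_position(value: str, ring_bits: int = 32) -> int:
        # Map a string onto a fixed-size hash ring.
        digest = hashlib.sha1(value.encode()).hexdigest()
        return int(digest, 16) % (2 ** ring_bits)

    nodes = ["node-a", "node-b", "node-c"]
    ring = sorted((ring_position(n), n) for n in nodes)

    def lookup(key: str) -> str:
        # The node responsible for a key is the first node clockwise from the key's hash.
        pos = ring_position(key)
        idx = bisect.bisect_left(ring, (pos, ""))
        return ring[idx % len(ring)][1]

    print(lookup("some-file.txt"))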


93
[+3] [2010-08-24 01:54:03] user392139

I would say that CDMA was/is an important and powerful new idea that was created after 1980.


(1) Hashim & Constantinides, Digital Code Division Multiplexing, Proc. Zurich Int. Seminar on Dig. Comm., March 1974. - Charles Stewart
94
[+3] [2010-09-06 09:06:15] plan9assembler

The C++ programming language (1983) and template metaprogramming (1994).


(1) What about C++ is meant to be a significant new invention? C++'s templates (like the C++ STL) are derived from Ada's generics (1977), which in turn were based on the meta-programming facility in Liskov's CLU. - Charles Stewart
95
[+3] [2011-02-10 18:37:35] xelco52

X.500 [1] and the X.500 series of standards (circa 1988). While the X.500 standards were inspired by telco standards [2] dating back decades, they are significant as they paved the way for the widespread use of LDAP/AD and our current incarnation of X.509 certificates, to name a few.

[1] http://en.wikipedia.org/wiki/X.500
[2] http://en.wikipedia.org/wiki/ITU-T

96
[+2] [2009-01-11 16:36:06] Cheery

A really hard question, since, aside from ridiculously improved hardware, there are few things that have been significantly positive inventions since that time. Though there are many significant inventions from before the 1980s that are only affecting people now, because they were infeasible back then.

Heck. Descent


97
[+2] [2009-01-12 01:49:07] Steven A. Lowe

the Enterprise Service Bus [1] would appear to be a fairly recent 'invention', though of course it is based on much older technologies.

[1] http://en.wikipedia.org/wiki/Enterprise_service_bus

98
[+2] [2009-01-12 14:52:45] kohlerm

The Eclipse Memory Analyzer [1]:

and its use of the Lengauer-Tarjan dominator tree algorithm [2] for memory usage analysis.

[1] http://www.eclipse.org/mat/
[2] http://www.cl.cam.ac.uk/~mr10/lengtarj.pdf
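
For readers unfamiliar with the idea, here is a naive fixpoint computation of dominator sets over a toy object-reference graph (this is not the Lengauer-Tarjan algorithm itself, which computes the same information far more efficiently); in a heap dump, an object's retained set is what it dominates:

    def dominators(graph, root):
        # graph: node -> list of successor nodes; all nodes reachable from root.
        nodes = set(graph)
        preds = {n: set() for n in nodes}
        for n, succs in graph.items():
            for s in succs:
                preds[s].add(n)

        dom = {n: set(nodes) for n in nodes}   # start with "everything dominates everything"
        dom[root] = {root}
        changed = True
        while changed:                          # iterate to a fixpoint
            changed = False
            for n in nodes - {root}:
                new = {n} | (set.intersection(*(dom[p] for p in preds[n]))
                             if preds[n] else set())
                if new != dom[n]:
                    dom[n], changed = new, True
        return dom

    # Tiny object graph: root -> a -> c, root -> b -> c
    graph = {"root": ["a", "b"], "a": ["c"], "b": ["c"], "c": []}
    print(dominators(graph, "root")["c"])   # {'c', 'root'}: only the root dominates c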

(1) Sorry :) In 1979. google.com/… - Kornel
99
[+2] [2009-01-12 14:55:04] blabla999

Digital music synthesizers.

I think the whole music scene was affected by the availability of cheap polyphonic synths. The early polyphonic synths were effectively multiple analog synths (discrete or using CEM or SSM chips). They were both expensive and very limited. During the '80s, the first digital systems arrived (I am not sure, but I think Kurzweil was one of the first). Today, almost all are digital - even the analog ones are typically "virtual analog".

regards

EDIT: oops - I just found out that the Fairlight CMI was invented in 1978. So forget the above - sorry.


100
[+2] [2009-01-13 20:51:17] mhw

I'm not qualified to answer this in the general sense, but restricted to computer programming? Not much.

Why? I've been thinking about this for a while and I think we lack two things: a sense of history and a way to objectively judge everything we've produced. This isn't true in all cases but is in the general.

For history, I think it's just something not emphasized enough in popular writing or computer science programs. Take language features, for example. A canonical source might be HOPL, but it's definitely not common knowledge among programmers to be able to mark the point in time or in which language a feature like GC or closures first appeared. And of course after that there's knowledge of progression over time: how has OOP changed since Simula? Compare and contrast our sense of history with that of other fields like maybe political science or philosophy.

As for judgement, this is really a failure on our part to seek objective measures of success. Given foobar, in what measurable way has it improved some aspect in the act of programming where foobar is any of design patterns, agile methodology, TDD, etc etc. Have we even tried to measure this? What do we even want to measure? Correctness, programmer productivity, code legibility, etc? How? Software engineering should really be picking away at these questions, but I've yet to see it.


101
[+2] [2009-01-15 23:58:07] BobbyShaftoe

I think part of the problem with these answers is that they are either not well researched or are pointing to a new implementation of, or significant "improvements" to, some existing technology. That is not a significant invention. For instance, any answer talking about functional programming or object-oriented programming just fails; most of these ideas have been circulating since before most of the participants of SO were born.


102
[+2] [2009-02-16 09:21:53] Walter Mitty

In order to start thinking about this, I need a model for what "innovation" means.

The best model I've seen is The Technology Adoption Life Cycle. You can get an overview at this Wikipedia Article [1].

Using this model, I began to ask myself... at what stage of the life cycle is software itself? We can think of "software" as a distinct technology from machinery going all the way back to Babbage, or perhaps more precisely, to Lady Ada Lovelace.

But it surely remained at the very early pioneering stage at least until about 1951. That's the year programmed computers "went commercial" in terms of selling a model for a computer product, and building lots of units of that model. I'm thinking of the machine that Univac sold to the Census Bureau.

From 1951 to about 1985, software innovations were numerous. They mostly had to do with extending the span of computing to an ever wider field of endeavor. In parallel, mass marketing and mass production kept bringing the cost of entry down till the Apple and IBM-PC made a programmable device a commonplace appliance.

Somewhere between 1980 and 1985, I'd say that software passed from the innovator's domain to the "early majority" domain. Sorry, guys, but that makes all of you who participated in MS-DOS, the Mac, Windows, C++ and Java early majority rather than innovators. That doesn't preclude your having done significant innovation on your own turf and in your own projects. It just means that the field itself had moved on from the earliest stage.

While the Internet's precursor had been around since the 1970s, it wasn't until Al Gore invented the internet (sorry) that everybody hooked up. At that stage, software passed from the early majority to the late majority. This shift was subtle, as the top of the bell curve suggests. Not every shop moved from early majority to late majority at the same time.

I don't think software has quite passed into the "laggard" stage yet, but I think that real innovators are tackling the problem of producing progress on different fronts today.

Two fronts that I can think of are Bioengineering and Information Appliances. Both of these fields require software, but the main thrust is not software innovation. It's applying software to uncharted territory. There are probably lots of other fronts that I'm not even aware of.

[1] http://en.wikipedia.org/wiki/Technology_adoption_lifecycle

103
[+2] [2009-08-18 03:09:00] user158029

I would vote, as a Debian user, for package management. It makes OSX and Windows 7 look like primitive amateurish playthings.

But since package management was already mentioned, I will vote for X. The network transparent window server has made a lot of applications possible. It's wonderful to be able to seamlessly summon programs running on different computers side by side on the same screen.

And that was a tad more impressive in the late 80s.


104
[+2] [2011-07-27 00:06:12] Domchi

Bitcoin [1]'s solution to the double-spending problem. It was used to create a decentralized electronic currency. A variant called Namecoin [2] uses the same technology to build a decentralized naming system (similar to DNS).

There were attempts to create cryptocurrencies in the past (and the idea is certainly not new), but Bitcoin seems to be the first implementation that took off. Its unique P2P algorithm solves the double-spending problem without relying on any trusted authority.

[1] http://en.wikipedia.org/wiki/Bitcoin
[2] http://forum.bitcoin.org/?topic=6017.0
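
A toy illustration of the proof-of-work component of that algorithm (this is not Bitcoin's real block format or difficulty rules; just the core "find a nonce whose hash falls below a target" idea, with made-up block data):

    import hashlib

    def mine(block_data: str, difficulty_bits: int = 20):
        # Accept any nonce whose SHA-256 digest, read as an integer, is below the target.
        target = 2 ** (256 - difficulty_bits)
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
            if int(digest, 16) < target:
                return nonce, digest
            nonce += 1

    nonce, digest = mine("previous-hash + transactions")
    print(f"nonce={nonce} hash={digest}")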

105
[+1] [2009-01-13 04:37:51] James Creasy

Protected memory. Before protected memory, if your program made a mistake, it could start executing code anywhere - virtually always hanging the entire machine. That's right, reboot time!

Low cost of hardware. My first computer cost $500 in 1978- a huge sum at the time. Lowering costs put PCs on every desk.


(2) Protected memory was invented in the 60s, at latest. - Darius Bacon
It amazes me how little modern programmers know about past programming. This is a classic example. What's next? Thin clients? - Stu Thompson
106
[+1] [2009-01-14 00:35:27] D'Arcy Rittich

Natural Language Processing [1]. The first time I encountered this was in the early 1990s with a program from Symantec called Q&A [2] that let you query the database by typing English queries. I am still impressed by it to this day.

[1] http://en.wikipedia.org/wiki/Natural_Language_Processing#Short_history_of_evaluation_in_NLP
[2] http://en.wikipedia.org/wiki/Q_&_A_(software)

There was a lot of invention in natural language processing before 1980. For just one example Try looking up Terry Winograd in Wikipedia. - Walter Mitty
107
[+1] [2009-01-16 13:45:07] Brendan Enrick

StackOverFlow.com


Voted up because it's funny. :) - ibz
(3) Voted down because it's not. :( - sharkin
Didn't do anything because i agree with R.A :| - Ólafur Waage
108
[+1] [2011-07-30 08:30:39] Evgeny Lazin

Paxos protocol. It's difficult to describe how valuable it is in the internet era.
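
A minimal sketch of just the acceptor side of single-decree Paxos (illustrative only; it omits proposers, learners, persistence and all networking):

    class Acceptor:
        def __init__(self):
            self.promised = -1          # highest proposal number promised so far
            self.accepted_n = -1        # proposal number of the accepted value, if any
            self.accepted_value = None

        def prepare(self, n):
            # Phase 1b: promise not to accept proposals numbered below n.
            if n > self.promised:
                self.promised = n
                return ("promise", self.accepted_n, self.accepted_value)
            return ("reject", self.promised)

        def accept(self, n, value):
            # Phase 2b: accept the value unless a higher-numbered promise was made.
            if n >= self.promised:
                self.promised = n
                self.accepted_n = n
                self.accepted_value = value
                return ("accepted", n, value)
            return ("reject", self.promised)

    a = Acceptor()
    print(a.prepare(1))                 # ('promise', -1, None)
    print(a.accept(1, "leader=node3"))  # ('accepted', 1, 'leader=node3')
    print(a.prepare(0))                 # ('reject', 1) -- an older proposal is turned away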


I think Paxos isn't hugely valuable yet, but it's a good answer nonetheless: it's an answer to a problem that is obviously fundamental, that we didn't appreciate until the early 80s, and we didn't have good solutions to until the turn of the 90s. Cf. stackoverflow.com/questions/6223370/… - Charles Stewart
109
[+1] [2011-08-10 04:11:03] Kevin

FPGA [1]s are a major breakthrough invented after 1980.

[1] http://en.wikipedia.org/wiki/Field-programmable_gate_array

Mask-programmable Gate Arrays date from 1969, and are essentially one-shot FPGAs. -1, not new. - Charles Stewart
So the ability to reprogram them isn't a significant improvement? - Kevin
Sure it is. The qn doesn't ask for significant improvements since 1980. The big idea was programmable circuitry. - Charles Stewart
The significant invention of FPGAs is field programmability. - Kevin
Surely if flash memory (a form of solid state storage which previously existed as one-shot ROM) counts as a significant invention (stackoverflow.com/a/458370/472698) so too should FPGAs. - Kevin
I've -1ed and commented on the EEPROM answer. I don't count field programmability as a fundamental advance on mask programmability in the sense Alan Kay means. - Charles Stewart
110
[0] [2009-01-12 04:06:35] Gordon Bell

Computer Graphics, Special Effects, and 3D Animation


(2) All available in the 60s and 70s. Texture mapping, for example, is from 1974. - Andrew Dalke
111
[0] [2009-03-19 14:12:27] Özgür

Top ten software engineering ideas / picture [1]

[1] http://www.yourdonreport.com/wp-content/uploads/2007/11/compaidtoptenalb.png

Most of the books are BS. Where are OOP and the patterns book? - Nils
112
[0] [2009-07-27 08:08:12] Alphaneo

I do not know if somebody has already answered "machine learning" - a significant new development that is developing fast, with intelligent spam filtering, stock market predictions, intelligent machines like robots, ...

Maybe machine intelligence will be the next big thing.


Smith, Mitchell, Chestek & Buchanan, 1977, A model for learning systems contains a nice survey of machine-learning techniques up to 1977. - Charles Stewart
113
[0] [2009-08-18 03:28:16] MkV

Let's see, Connection Machines (Massive Parallelism) for one.

Anyway, this whole question seems like an egoboo for Alan Kay since he invented everything.


114
[0] [2009-08-18 06:28:27] Joe Chung

The mathematics for quantum computing has been around since before 1980, but the hardware isn't here yet and may be physically and economically infeasible for many years to come.


115
[0] [2009-11-03 22:18:00] Dean J

The Personal Computer.

Hands down, the most important part of computing in the last thirty years is that everyone is now part of it. Computers for home use only date to 1977 or so, and widespread adoption took until well into the 80's. Now, kindergartens, senior centers, and every next door neighbor you'll ever have owns one.


116
[0] [2010-03-17 04:04:00] user216441

The Internet.

That's it.


(1) The birth of the internet is often said to occur with Cerf & Kahn's work bringing into existence the first TCP, completed in 1974 with RFC 675. - Charles Stewart
117
[0] [2010-05-25 20:34:32] Eric Brown

I'd have to say that the biggest invention in computing since 1980 is Moore's law. There were tons of really cool, innovative things created in the 1960s and 1970s - but they were insanely expensive one-off projects. And most of these projects are lost in the mists of time.

Today, the cool, innovative project gets a couple rounds of funding and is available on everybody's desktop or web browser in 6 months or so.

If that's not innovative, what is?


Moore's law was actually coined in 1965. - Charles Stewart
True, but it didn't really kick in until after 1980. Z80's (1976) weren't that much cooler than an 8080 (1974); 8086s (in 1978) were nicer, but 68000s (1979/80) and subsequent CPUs were hands down superior. The "Killer Micros" really didn't take over until 1990 or thereabouts. - Eric Brown
118
[0] [2010-05-29 23:45:07] user353799

I would say Linux and the reification of the worse-is-better philosophy, but you can argue that those are older. So I'd say: quantum, chemical, peptide, DNA, and membrane computing; (re)factoring in a non-ad-hoc and automated fashion; aspects; generic programming; some types of type inference; some types of testing.

The reason why we have no new ideas: software patents (these come from the late '60s ...), corporations and education.


119
[0] [2010-08-04 12:52:02] Ilari Kajaste

Personal Broadcast Communication

Facebook, Twitter, Buzz, Qaiku... the implementations are varying, focusing on different aspects - managed audience, conciseness, discussions. The specific services come and go, but the new concept of communication remains. Blogs are of course what started this, but the new services have made the communication socially connected, which is an essential difference.

Not quite sure if this exactly goes under the subject of computing, though, but it's something that's significant, and only made possible by computing and networks.


120
[0] [2010-08-06 00:23:59] Dafydd Rees

Open Croquet http://www.opencroquet.org - a Squeak/Smalltalk-based 3D environment which lets multiple users interact with and program the environment from inside itself. It has its own object replication protocol for sharing environments efficiently and scalably over the internet. It's difficult to describe because there just isn't anything else remotely like it...

1) I'm proposing this because when I try to explain to other people what it is, I find them expecting me to compare it to other things... and I still haven't found anything remotely like it. Although many elements are present from other systems (e.g. Smalltalk, OpenGL, Etoys, virtual worlds, remote collaboration, object-oriented replication architectures), the whole seems to be much more than the parts...

2) Unlike many of the technologies mentioned here it hasn't settled down into a widely exploited commercial niche...

Both points are signs of an early-stage technology.

I suspect that when Alan Kay started work on it, he might have been thinking about the theme of this question in the first place.

http://www.onlisareinsradar.com/archives/001281.php


121
[0] [2012-06-15 15:22:01] dmckee --- ex-moderator kitten

Fast clustering algorithms (O(n log n) in the number of data points) such as DBSCAN (from 1996) [1] seem to all date from after 1980.

These have been part of a general wave of progress in data-mining techniques.

Contrast this with the lack of progress in line-finding, for which poorly scaling techniques like the Hough transform still seem to represent the state of the art.

[1] http://en.wikipedia.org/wiki/DBSCAN
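
A minimal usage sketch with scikit-learn's DBSCAN implementation (this assumes numpy and scikit-learn are installed; the points are made up to show two clusters plus a noise point):

    import numpy as np
    from sklearn.cluster import DBSCAN

    # Two dense blobs plus one obvious outlier.
    points = np.array([[0.0, 0.0], [0.1, 0.1], [0.2, 0.0],
                       [5.0, 5.0], [5.1, 5.1], [5.0, 5.2],
                       [9.0, 0.0]])

    labels = DBSCAN(eps=0.5, min_samples=2).fit_predict(points)
    print(labels)   # e.g. [0 0 0 1 1 1 -1]; -1 marks a noise point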

122
[-1] [2009-01-12 11:56:11] Eldelshell

DOS. I'm not a DOS fan, but thanks to DOS and the IBM-PC computers are what they are today (for better or worse).


123
[-1] [2009-01-15 05:56:18] JeanHuguesRobert

20 years ago: Object oriented programming - To better handle software complexity.

Now: Cloud computing - To better handle hardware complexity.

Future: something Declarative, but it will take another 20 years.


Really? You should look Alan Kay up; object-oriented programming is an older idea than 20 years ago. As for cloud computing, bah, this isn't new. - BobbyShaftoe
Sorry I had to -1 you, but you're so wrong it's not even funny. Have you heard of Wikipedia? You might consider looking stuff up there before posting: en.wikipedia.org/wiki/Object-oriented_programming - niXar
124
[-1] [2009-02-04 16:06:50] David G

If we are serious about answering this question as a group,
I unfortunately believe we need more than a string of random, well-intentioned posts!
I know, it sounds boring; getting things done often is!

We write a list of powerful ideas in the area of computing
Maybe we should define a few categories to separate each one because videoconference somehow does not fit well with object oriented programming.
Seeing ideas by categories makes it easier to generate them without redundancy. It's too easy to get sidetracked into teleportation if quantum computing is not kept away from flying cars.

Try to attribute each of them a date
This will settle the before/after 1980 and restrict debate about each idea to its own. It will be fun to dig for earliest reference, first known implementation, etc.
Plus this will allow people like me who were 2 years old in 1980 to have a better idea of what was common programming knowledge in 1980 (nothing beats being there at the time)

Try to attribute each of them the current state of their implementation
OK, some ideas were sci-fi in 1850, with early development in the 1970s and serious breakthroughs in the 1990s.
Some ideas are just starting to get around. Some are almost forgotten.

Probably the wiki thing is a good idea.
I think this could really get somewhere if slightly organized.
I did not check, but maybe this whole thing already exists somewhere on the net (I usually find that if you think about something, someone has already done it).

What do you think ?

Cheers !


125
[-1] [2009-08-17 22:40:15] dan_the_welder

Perhaps the shift from client-server to peer-to-peer. One of the reasons I hate the whole cloud/SaaS thing is that it is a return to client/server.

I've got a VAX in my pocket and you want me to pretend it's a VT-100?


126
[-3] [2009-07-22 19:11:34] Janie

The teevee tube box


127
[-8] [2009-01-14 01:36:09] Andrew Harry

It's a little thing I like to call the internet


(1) That little thing existed before 1980: en.wikipedia.org/wiki/History_of_the_Internet - some
128
[-8] [2009-02-05 15:55:21] joeforker

Software Patents


Lol, it will have an impact on development for sure... he didn't say that it has a positive impact, so +1 for that :D - Nils
129