In my question Insert Update stored proc on SQL Server [1] I explained an efficient way of doing an insert/update - perhaps THE most efficient. It's nothing amazing, but it's a small algorithm that I came up with in a mini-Eureka moment. Although I had "invented" it by myself and secretly hoped that I was the first to do so, I knew that it had probably been around for years; but after posting on a couple of lists and not getting confirmation, I never found anything definitive written up about it.
So my questions: What software algorithm did you come up with that you thought you'd invented? Or better yet, did you actually invent one?
When I was in 2nd grade, I figured out that if you have a square, like 9 = 3^2... to get to the next square (4^2), you simply add 3 and 4.
I generalized it, so if you had any number squared, you could get the next number's square by adding the first number and the next number to the first number's square.
So, I kind of invented algebra.
(x+1)^2 = x^2 + x + (x+1)
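For the curious, the identity is easy to check in Python (an added illustration, not part of the original answer):

# Check the "next square" identity: (x+1)^2 = x^2 + x + (x+1)
for x in range(100):
    assert (x + 1) ** 2 == x ** 2 + x + (x + 1)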
Once, as I was walking home from the train, I thought to myself "wow, Linked Lists are totally awesome, except for the whole O(n) lookup thing, which makes them useless for a great many purposes. If only there was a way to quickly find things in a linked list and still have the very-fast insertion/removal..."
By the time I got home (almost exactly 20 minutes), I had completely specified every possible nuance of a truly incredible algorithm which made linked list lookups extremely efficient while still largely keeping their advantages. I mean, this thing was flawless; it was going to revolutionise the world of data structures and probably make me famous.
I don't know what terms I put into Google that night which revealed the existence of the Skip List [1], but let me assure you I was crushed.
On the plus side, my algorithms were basically identical to Bill Pugh's, so at least I reinvented it properly. Small mercies.
[1] http://en.wikipedia.org/wiki/Skip%5Flist
I "invented" the infinite loop because of never updating the termination condition.
string crush = "Meg";
string girlfriend = "";
int daysAlone = 1;
while( crush != girlfriend )
{
Output( "Days alone: " + daysAlone );
SeeGirl(crush);
TellSelf("Try to talk to her again tomorrow");
daysAlone++;
}
I invented the Internet. ;)
On a more serious note:
I was setting up my home network one day and thought "I should have a machine that is purposely weak so that hackers go after it instead of my other machines". For about five minutes I thought I had an original idea... Needless to say, I quickly found the term honeypot in Wikipedia and realized I'm just an average joe.
When I was just a kid I "discovered" that when you multiply a two-digit number XY (less than 100) by 11, you just have to put the sum X+Y in the center and X, Y at the ends to get the result (no more calculator).
11*45 = 4 (4+5) 5 = 495
95 * 11 = 9 (9+5) 5 = 9 (14) 5 = (9+1) (4) 5 = 1045
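In Python, the trick (carry included) looks something like this - a sketch for illustration, not the poster's code:

# Multiply a two-digit number by 11: put the digit sum in the middle,
# carrying into the leading digit if the sum overflows.
def times_eleven(n):
    x, y = divmod(n, 10)          # tens digit x, units digit y
    middle = x + y
    return (x + middle // 10) * 100 + (middle % 10) * 10 + y

print(times_eleven(45))  # 495
print(times_eleven(95))  # 1045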
I was about ten and playing with the BASIC interpreter on my very own 386 workstation - QBASIC actually, which was a much nicer editor. Anyway, I knew counters and variables and GOTO, and needed a repetition structure, so I had constructed a bit of code that nicely incremented and checked a value, and jumped out of the loop if it exceeded the limit.
It wasn't until two years later, when I got a book on programming in C, that I discovered I had invented For and While loops!!
Another grade school one: In fifth grade my student teacher in math class asked if anyone knew how to find the area of a triangle. I thought, hey, a triangle's kind of like half a rectangle. So I raised my hand and said, "Maybe multiply height times width and then just divide by two?" I must say she was quite shocked, and I felt very proud. In retrospect that's probably not all that clever for fifth grade...
Recursive descent parsing.
Not quite an algorithm.. but way back in elementary school I saw an episode of Bill Nye the Science Guy where Bill said something along the lines of "..but it would take the force of exploding stars to rearrange matter". I immediately thought: "Man.. What if a giant explosion created all the matter in the universe out of something else?" I thought I was on to something!
I invented bubble sort in 1993. The year prior I had attended a computer camp (for pre-high-school kids) and studied BASIC and Pascal. At '93's summer computer camp we graduated up to C and had to sort an array of numbers. I, along with several kids in the group, arrived at the world's most common bad implementation of sorting.
Our instructor then explained Big-Oh notation (and I believe the implementation of quick-sort) and the rest, as they say, is history!
I invented modern supercomputing/distributed computing. I was only about five or six years old and don't remember ever thinking of it (thankfully Dad still has the paper somewhere).
When in Church I'd doodle on paper, and one day I drew an interesting diagram. Basically there were lots of boxes filled with 1's and 0's that encircled a central, larger, box (computer). When Dad leaned over to ask me what I'd drawn, I explained that the central computer was the boss of all the others. The central computer would delegate pieces of the problem to the other computers, and then assemble the final answer.
I wanted to draw lines with dashes or colors that vary along the length in GWBasic, but it had no facility for these, so I worked out how to generate a line very similar to Bresenham's method - no gaps, and a single width of pixels for a line of any angle.
I was very proud of myself, but my parents and siblings just didn't understand how cool it was.
Then I discovered the real Bresenham years later, and the awesome optimizations for linear memory implementations of it. It didn't dim my happiness - I was very young at the time and there was no such thing as the Internet back then.
Algorithms are so cool...
I knew a girl who at a young age wasn't paying attention and was generally being a hyperactive kid. At one point, she swung a full cup around a vertical circle and nothing spilled out. She thought she'd broken gravity or something. She tried it again and again and it worked. She showed her dad who had an expression of "yeah... and?" She couldn't conceive that other people already knew.
When she got old enough to encounter this taught in the classroom, she was quietly really proud of herself, knowing she'd found it all on her own and before any of her friends knew what it was.
I can honestly claim I never "invented" bubble sort.
Nope, I went and "invented" bucket sort instead.
I'm so ashamed. :)
√i - i√i = √2
(not an algorithm, I know, but I still thought it was the coolest thing when I discovered it in 7th grade algebra... until I got to complex analysis in college. Actually, I still think it's cool, but I'm a math geek)
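One way to see it: √i = (1+i)/√2, so √i - i√i = (1-i)√i = (1-i)(1+i)/√2 = 2/√2 = √2.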
Like many others, I invented bubble sort, binary search, etc in high school.
For a more interesting example, I recently "invented" an algorithm for approximating Fourier integrals, based on applying a specific sequence transformation to partial integrals. Turned out, upon consulting specialized literature, that someone already thought of this in the 1960s.
As a rule of thumb, if you come with a brilliant new algorithm, someone already thought of it in the 1960s.
I "invented" the triangular number [1] formula in 5th grade -- and then proceeded to spend a real lot of time trying to promote the operators using logarithms to compute factorials.
Early in programming, I "invented" the selection sort [2] -- made way more sense to me than bubble sort [3].
Back in the late 90s, I invented double-interleaved firmware-generated PWM to boost apparent regulation resolution. Patent # 6252373 [4].
[1] http://en.wikipedia.org/wiki/Triangular_number
Like everyone else who was primarily self-taught, I "invented" most of the common data structures: lists, trees, queues, etc. I didn't know what they were called until college, several years later.
One day the trie [1] just popped into my head, for no apparent reason - it didn't even solve the problem I was working on.
The major object-oriented elements (objects, messages, inheritance) were invented/derived out of necessity while working on a 2D CAD application (in assembly language).
[1] http://en.wikipedia.org/wiki/Ternary%5Fsearch%5Ftrie
During an interview, I came up with the Knuth shuffle [1] (or Fisher-Yates Shuffle, as it is also known). I was quite proud after I looked it up later, as I'd never really considered the problem of randomizing a list before (sorting, on the other hand...)
[1] http://en.wikipedia.org/wiki/Knuth%5Fshuffle
Not exactly an algorithm, but I "invented" AJAX back in the late 90's to support dynamically loading branches of a navigation tree without a full page refresh. It was some pretty hack-y code that used JS to load data into a hidden I-Frame then read it out into the parent page and manipulate the DOM.
One summer as I was really bored, I started playing around with some trig functions and came up with a way to solve a triangle based on knowing two sides (a and b) and the area (K). In all of the time since then, I've never come across this algorithm elsewhere:
c^2 = a^2 + b^2 ∓ 2√(a^2b^2 - 4K^2)
It's similar to the Law of Cosines, but instead of using an angle to find the third side, it uses the area.
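A quick numerical check (an added sketch; the helper name is made up):

import math

# The area gives sin(C) = 2K/(ab), so cos(C) = ±sqrt(a^2*b^2 - 4K^2)/(ab),
# and the law of cosines yields two candidate third sides in general.
def third_side(a, b, K):
    root = math.sqrt(a * a * b * b - 4 * K * K)
    return (math.sqrt(a * a + b * b - 2 * root),   # acute angle C
            math.sqrt(a * a + b * b + 2 * root))   # obtuse angle C

print(third_side(3, 4, 6))  # (5.0, 5.0): a 3-4-5 right triangle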
How about... ARRAYS!
I was 7 or 8, fiddling with BASIC, trying to make a prime number generator. I invented the exact concept of an array, tried in vain to figure out how it was done in BASIC (anyone used PHP's variable variables? I tried that kind of thing but it didn't work) and in the end used sequential files to simulate arrays. To read a certain element I'd open the file, read n lines, and there was my value.
At exactly the same time I had invented primality testing by trial division! Hehe. I even thought up the "only test primes, and only up to the square root of n" optimisation.
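That tester, sketched in Python rather than the original BASIC:

# Trial division with both optimisations: test only prime divisors,
# and only up to the square root of n.
def primes_up_to(limit):
    primes = []
    for n in range(2, limit + 1):
        if all(p * p > n or n % p for p in primes):
            primes.append(n)
    return primes

print(primes_up_to(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]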
Needless to say I discovered BASIC's arrays a few months later. And as a matter of fact, I'm man enough to admit that I still use BASIC.
Enigma machine (substitution, altering & rotating cipher).
When I was in 9th grade and into number theory, I went to sleep really tired and started thinking about an algorithm to find whether a number is prime or not. I did some head calculations and then woke up a few hours later screaming "I found it!".
Turns out I had discovered the rule that (2^p - 2) % p == 0 when p is prime, also known as the Chinese Hypothesis and a derivative of one of Fermat's formulas. I found out about it two weeks later, and I also found that it fails for pseudoprimes (numbers such as 341) - it was a really bad double disappointment.
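The rule and its famous counterexample, as a quick check (added illustration):

# Fermat's little theorem guarantees (2**p - 2) % p == 0 for prime p,
# but the converse fails: 341 = 11 * 31 passes the test anyway.
for p in [2, 3, 5, 7, 341]:
    print(p, (2 ** p - 2) % p == 0)   # True for all, yet 341 is composite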
Since then I've never done any work on number theory again.
When I was in the second grade of middle school, our teacher gave us homework: write an algorithm in pseudocode that prints the first n Fibonacci numbers. So I made my own iterative procedure:
a = 0 ;
b = 1 ;
n = 10 ;
i = 0 ;
while ( i < n ) {
    b = b + a ;
    a = a + b ;
    println b ;
    i = i + 1 ;
    println a ;
    i = i + 1 ;
}
Much later I learnt that this was one of the most-used examples for recursive algorithms.
I invented a way of turning infix to postfix using just an array in 1989. For many years I thought I had reinvented something else, but lately I'm not so sure. All I can find when I google or run into how-to articles is Dijkstra's shunting yard, which uses a stack.
So I have decided to publish it tonight on my blog [1]. If anybody can point out that it is just a reinvention I'll be a bit disappointed and you can share that with me.
[1] http://www.guge.net/?p=3
I have a few.
Most recently, I was the only programmer on a medium-sized CRUD-type application [1] that incidentally did have some meaningful logic as well. So for the first time in my career (I was still in college at the time) I was in total charge of UI, domain layer, and the DB.
I had this great idea that in order to give data to the UI, I should "flatten" my domain objects into what amounted to a big struct. This way, the UI could focus on mapping field to UI control and have as little non-UI logic as possible. Then I found out that these were actually Data Transfer Objects [2].
I also hand-coded my own strategy to save domain objects into a relational database. Imagine my surprise when I found out that this was called the object-relational impedance mismatch [3], and there was an entire sub-industry devoted to the problem.
Even earlier in college I had to write a smaller tool that would grab spec data from a bunch of servers on our network, and then dump out a suggested plan for how to make sure each server had the minimum amount of some resource, like RAM, in the smallest number of swaps. It was really ugly procedural soup because I wrote it in VBA in an MS Access DB (they forced me to, don't hate me).
I ended up with a heuristic algorithm that was correct most of the time, and it was a feeble attempt at a dynamic programming algorithm [4], which I wouldn't learn about until three years later in grad school.
Fermat's Little Theorem [1]. I only discovered the binary case, so thank goodness Fermat realized it worked with other bases.
[1] http://mathworld.wolfram.com/FermatsLittleTheorem.html
@ Ross [1]
Amusingly enough, I wrote the original version of the word_wrap function [2] in PHP, before it became part of the core PHP function set.
It was originally written because I needed to be able to create quoted text areas for an online messaging system.
Extra amusing - It's listed as an alternative in the comments [3] to PHP's word_wrap.
[1] #25175
@martinatime
Math is generally considered "discovered" as opposed to invented. And programming and related studies are considered applied math.
I've generally seen the opposite. e.g. Newton invented calculus. On the other hand, natural properties are generally "discovered." e.g. Pythagoras discovered the Pythagorean theorem.
I invented a way of picking a random line from a file in a single pass through the file. The comments to this answer in StackOverflow [1] show that it was a known technique long before I figured out my answer.
This is just the latest example of a long history of figuring stuff out. It was a much more valuable skill before you could look up anything you want on the internet.
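For reference, the single-pass technique - commonly known as reservoir sampling - looks roughly like this (a sketch of the known algorithm, not the code from the linked answer):

import random

# Keep the i-th line with probability 1/i; after one pass, every line
# has had an equal 1/n chance of surviving.
def random_line(f):
    choice = None
    for i, line in enumerate(f, start=1):
        if random.randrange(i) == 0:
            choice = line
    return choice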
[1] http://stackoverflow.com/questions/232237/whats-the-best-way-to-return-a-random-line-in-a-text-file-using-c/232248#232248
In 1988 I did a high school science project that implemented parts of the "Sorting Out Sorting" [1] video. It was done in BASIC on a C64. Quick sort posed a challenge, as C64 BASIC subroutines did not accept parameters; only the instruction pointer was saved on the stack.
I solved this problem by using an array as a stack to store the range of the left partition for later processing, and "invented" tail recursion [2] removal to handle the right partition without an additional subroutine call.
I was devastated the following year in my first college CS course to learn that the technique had a name and it was invented well before I was born.
At least I won first place at the science fair.
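The same idea, sketched in Python instead of C64 BASIC (names and structure are illustrative):

# Push the left partition's bounds on an explicit stack, then loop on
# the right partition instead of making a second subroutine call.
def quicksort(a):
    stack = [(0, len(a) - 1)]
    while stack:
        lo, hi = stack.pop()
        while lo < hi:
            pivot = a[hi]
            i = lo
            for j in range(lo, hi):
                if a[j] < pivot:
                    a[i], a[j] = a[j], a[i]
                    i += 1
            a[i], a[hi] = a[hi], a[i]
            stack.append((lo, i - 1))  # save the left partition for later
            lo = i + 1                 # "tail recurse" into the right one
    return a

print(quicksort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]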
[1] http://www.youtube.com/watch?v=F3oKjPT5Khg&feature=related
My friend and I once decided to "invent" a compression algorithm that eventually boiled down to an incomplete version of Huffman code [1].
[1] http://en.wikipedia.org/wiki/Huffman_coding
As a kid I "invented" sine and cosine in order to figure out how to draw a circle point by point using QBasic.
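The idea, sketched in Python rather than QBasic:

import math

# Walk the angle t around the circle and plot each (x, y) point.
def circle_points(cx, cy, r, steps=360):
    return [(cx + r * math.cos(2 * math.pi * t / steps),
             cy + r * math.sin(2 * math.pi * t / steps))
            for t in range(steps)]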
When I was about 10, my father would make Sunday omelette in a square pan for the 7 of us. But it was too difficult to divide the omelette evenly into 7 so he divided into 8.
But wait! Who gets the last piece?
Divide it and distribute it of course! Same problem: it's too difficult to divide it into 7 so divide it into 8.
And in theory, you can continue dividing and distributing until your piece is the size of an atom and the omelette ends up evenly divided into 7.
From this I realised that 1/7 = 1/8 + 1/8^2 + 1/8^3 + 1/8^4 + ...
And I could easily generalise this to 1/(n-1) = 1/n + 1/n^2 + 1/n^3 + 1/n^4 + ...
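A quick numerical check (added illustration):

# The geometric series 1/8 + 1/8^2 + 1/8^3 + ... converges to 1/7.
print(sum(1 / 8 ** k for k in range(1, 30)), 1 / 7)  # both ~0.142857...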
I "invented" Sorting Networks [1] while trying to find ways of sorting that could be vectorized (with SIMD architectures). Unfortunately it turned out that Knuth, Batcher, et al, beat me to it by about 40 years or so. ;-)
[1] http://en.wikipedia.org/wiki/Sorting_network
I thought I had come up with a new sorting algorithm beyond what was being taught in the 2nd semester of my masters. I explained this new sort algorithm to one of my friends; in the end he told me it was quicksort. He advised me to start attending lectures regularly and not waste my time re-inventing the wheel, but I knew I really had found the algorithm myself.
Stopping short.
I have a lot of ideas for an OS I want to make someday (but may never have the expertise or time to), and one of my best ideas was for a memory management algorithm. Basically I had (unknowingly) reinvented Doug Lea's algorithm, with the one exception that, because of the way my idea had developed, I was still thinking you'd want to use a hash table to store the next-bin information, when in reality you don't need one at all.
I've also invented a few sorting routines, which may or may not be useful (or even practical), and some of them are variations or crosses between other, existing sorting algorithms. http://inhahe.blogspot.com/search?q=sorting
I also invented a method for finding primes (when I was young), but it's not as good as the sieve of whoever, and it's probably obvious anyway. (For every odd number, try to divide by each prime already found between 3 and sqrt(n); if none divides, add this number to the list of found primes.)
Oh, and just recently I came up with a way to use SQL to efficiently find substrings within a document. I have no idea if this method is already known. (I can only post one hyperlink, so I'll just tell you that the SQL algorithm is currently on the front page of the aforementioned blog.)
Here's a Python one-liner I made once for returning all permutations of a string, but I don't know if the basic algorithm behind it has already been done... I would guess it has.
perms = lambda a: a[1:] and [c+r for i, c in enumerate(a) for r in perms(a[:i]+a[i+1:])] or a
In the 90s, when I was doing game graphics programming on the Mac, we needed to reduce the size of our images.
I came up with a scheme that I didn't realize was very similar to BMP-RLE (run-length encoding), which I think was pretty commonly-used in Windows.
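A minimal sketch of the idea (not the original Mac code):

# Run-length encoding: store (count, value) pairs instead of raw bytes.
def rle_encode(data):
    out = []
    for b in data:
        if out and out[-1][1] == b and out[-1][0] < 255:
            out[-1][0] += 1
        else:
            out.append([1, b])
    return out

def rle_decode(pairs):
    return bytes(b for count, b in pairs for _ in range(count))

packed = rle_encode(b"aaaabbbcc")
print(packed, rle_decode(packed))  # [[4, 97], [3, 98], [2, 99]] b'aaaabbbcc'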
When I was just learning how to program I needed to store a collection of data in a way that allowed for fast access. I hadn't heard of binary search trees or hash tables at that point; in fact, I just knew about dynamic arrays and linked lists. To store the data, I initially thought about putting it into an array and then scanning it each time, but that seemed really slow. Moreover, if you kept accessing the same entry over and over again, it would always take the same time. I kept thinking - what if you stored everything in a linked list, then moved the element you searched for to the front every time you accessed it? This would make lookups really fast for commonly-accessed elements, and so I actually ended up using it.
I found out less than a month ago that Sleator and Tarjan had invented this structure along with a few other self-balancing structures [1] like splay trees and skew heaps when they were doing their work on self-balancing structures with amortized guarantees. The remarkable part is that this approach is actually very close to optimal - if you knew in advance what the access pattern was going to be, you couldn't do much better than "my" approach!
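A tiny sketch of the move-to-front idea, with a plain Python list standing in for the linked list:

# A successful lookup migrates the element to the head, so frequently
# accessed items stay cheap to find.
class MTFList:
    def __init__(self):
        self.items = []

    def insert(self, x):
        self.items.append(x)

    def find(self, x):
        i = self.items.index(x)                   # O(n) scan
        self.items.insert(0, self.items.pop(i))  # move to front
        return self.items[0]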
[1] http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15451-f01/www/lectures/lect0925.ps
(Granted, none of these were as formally defined when I did them...)
Edit: Derek called me out on a slight miswording. By "none of these were as formally defined when I did them," I meant that my implementations weren't as complete as their formal definitions.
[1] http://en.wikipedia.org/wiki/Temporal_difference_learning
I got a surprise the day before I was scheduled to present my findings on how to combine multiple sources of soft bit decisions. I asked a colleague to review my presentation. He informed me that I had reinvented Maximal Ratio Combining [1].
[1] http://en.wikipedia.org/wiki/Maximal%5Fratio%5Fcombining
Whilst it isn't really an algorithm, I did completely code a lolcat compiler/translator in Python before actually googling it and finding out there were already a couple in existence.
And I was so proud of myself...
Sequential Quadratic Programming with constraints. I "invented" a way to turn a nonlinear optimization problem with constraints into a sequence of linear (or in this case, quadratic) problems. I was not amused when I discovered it had been around for decades! (in my defense, optimization was not my field).
Huffman Coding [1], in 6502 assembly, while trying to improve an existing compression routine to squeeze an extra "cracked" Commodore 64 game onto a floppy disk, in 1991.
Carry Save Adder [2], for a digital amplifier. Turned out this was already in use in some ASICs, but it would have been less efficient on the FPGA I was using (which was optimized for ripple carry.)
Transport Triggered Architecture [3]
[1] http://en.wikipedia.org/wiki/Huffman_coding
I reinvented insertion sort as a kid when sorting out-of-order volumes of the encyclopedia at school.
I "invented" Bubble Sort, floor(), ceil(), and sleep() in C.
Got in an argument with a professor (I can TOO have pointers to functions!), ended up with a propensity for using sparse jump tables, which I didn't hear of until a few years later.
Ugh. One embarrassing project of mine was porting some code written for a processor with floating-point math to one without. Fixed point was out of the question, so I "invented" what I thought was a novel approach: a structure containing a set number of bits for the value and another set of bits for the magnitude of the number. And then I wrote functions for performing mathematical operations on them. Yep, I basically duplicated the floating point number (and probably not that well). I should have taken computer science in school.
I happened to partially re-invent quicksort during my masters. I was a nerd, not attending data structures lectures, always believing there must exist one more way to sort numbers. I spent half a night designing my algorithm for sorting numbers, and the next day my colleague told me it was quicksort and it was already there. Poor me! :(
Hyper operators [1] when I was 13. And some cryptographical and compression algorithms (RLE, Vigenère) ;-)
I actually invented another a little compression algorithm - I don't know 'til today whether it already exists. It's based on eliminating the leading 0-bits in the binary representation of source bits which have been ordered according to their probability.
And Cantor's pairing function [2].
[1] http://en.wikipedia.org/wiki/Hyper%5Foperator
Merge sort, radix sort, bucket sort, see in wiki: sorting algorithms [1].
[1] http://en.wikipedia.org/wiki/Sorting%5Falgorithm
Recently we designed and built a set of PL/SQL packages on an Oracle database, for use as an interface to the database by multiple front ends. Normally things like data validation and error reporting might be implemented in procedural code in the form, but in this case we needed the form to get all its information about each column - including whether it was mandatory, and whether it failed any validation checks - from the database.
We pretty much solved it with what we called "instructions", which encompassed a number of different things that the database packages could "tell" the form, e.g. "item X is mandatory", "item Y has error ZZZ", "hide item M", "make item N disabled", etc.
After we'd implemented it we found we'd just reinvented the Command pattern [1].
[1] http://en.wikipedia.org/wiki/Command%5Fpattern
Not sure if you could call it an "algorithm", but I came up with a generic form-validation mechanism for jQuery that was VERY similar to the 'validate' plugin.
Also, in high school I wrote a program on my TI-89 that was the Sieve of Eratosthenes, all on my own. Of course, what I didn't realize was that there was already a built in method for testing primality.
The ones that have stuck with me:
I invented display lists [1] in 1980 when I was in 8th grade for games that I was writing.
I invented non-recursive flood fill [2] in 1983 when I was a junior in high school as part of a graphics package I was writing.
I invented divide and conquer [3] in 1982 while I was trying to write a line drawing routine that only used addition and shifting (it worked--looked like crap--but it needed fixed point arithmetic).
[1] http://en.wikipedia.org/wiki/Display%5Flist
Spent a few days writing an algorithm to shuffle arrays, only to find several more concisely written methods already in existence!
I invented bucket sort (without realizing what the pigeonhole principle is - I thought quicksort was too slow for a project of mine, and since I was using only integers I could do so much better), and a divide-and-conquer form of a convex hull algorithm (one that works with only integer coordinates), just from sheer repetitive doodling of convex hulls around points :) I just had to find the first and last points of each row, get the top-left-most and bottom-right-most points, join a line between them, then repeatedly add a point in between them to expand the convex hull in a divide-and-conquer manner. Eventually a convex hull is formed.
Upon learning the raycasting algorithm (the one generally thought to be used in Wolfenstein), I invented one that, instead of using a matrix of walls (zero value = non-wall, non-zero value = wall), uses an array of vectors (each vector represents a point, and a wall is made from two such points).
I "invented" linked lists as a junior in high school, armed only with the knowledge of pointers and classes (but not inheritance yet).
When I first started tinkering with assembly language, I figured out how to make a dynamically allocated list by storing a pointer to the next piece of memory in each allocated block. It wasn't till a few years later when I took a data structures class that I learned that my "invention" was called a linked list.
I once wrote a recursive descent parser [1], without knowing the concept beforehand.
Among other - then unnamed to me - Design Patterns I invented the Visitor Pattern [2] and the Facade Pattern [3] (who did not?).
[1] http://en.wikipedia.org/wiki/Recursive_descent
Lazy synchronization with asynchronous method calls, i.e. functional programming.
I was pleasantly surprised to find out some years later that I had independently invented a technique for lossless compression.
I had written a program (Turbo Pascal on a Tandy 1000) for drawing images (basically a keyboard-only version of paint) and was concerned with how much space the saved images were taking up, leading me to a basic lossless compression algorithm that drastically reduced the size of the image files.
I couldn't believe it when I researched IoC that I had 'invented' it 6 months earlier for an object engine in our local metadata repository.
When I was in 'upper school' (high school if you like), I was writing a program in Fortran IV (with BASIC as the prior language) and I discovered that I had created a looping construct that was different from a DO loop, but not supported by Fortran directly. It was actually a WHILE loop (supported in Fortran 77, of course). I discovered that what I'd invented was a WHILE loop a year or two later, when I was reading more books about programming. (That program was also unrolling tail recursion with an array representation, but that took me still longer to realize.)
Canny edge detection in images (like photographs).
I happily invented the "One Time Pad".
My idea was that if the weakness of encryption comes from repeatedly applying the same key (albeit with mathematical manipulations) to a large data file, you could get around this by just having a key of pure entropy (or as close to it as you have on hand :) that is bigger than the data you wish to encrypt. My other idea was that if the key was totally random your algorithm could be as simple as adding for encrypting and subtracting for decrypting. I also predicted this would be unbreakable.
I found out later that this is a one time pad, and it is indeed unbreakable.
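A sketch of the scheme as described - a random key at least as long as the message, addition to encrypt, subtraction to decrypt:

import os

def encrypt(message, key):
    return bytes((m + k) % 256 for m, k in zip(message, key))

def decrypt(cipher, key):
    return bytes((c - k) % 256 for c, k in zip(cipher, key))

key = os.urandom(16)                 # truly random, and never reused
cipher = encrypt(b"attack at dawn", key)
print(decrypt(cipher, key))          # b'attack at dawn'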
(I also invented steam engines as a kid, and a space shuttle with 3 booster rockets because, you know, it's 1 better!)
Anagram Trees [1] - apparently also known as an 'addagram'.
Cryptographically secure IOUs [2] - of course, there is a lot of research in this area.
[1] http://blog.notdot.net/archives/39-Update-on-Anagram-Trees.html
When I was in grad school I "invented" the composite design pattern. I was coding a graphics editor in Java 1.0, with the requirement of being able to group multiple shapes into a single shape. I came up with what I thought was the clever idea of writing a class that implemented the Shape interface but contained a collection of Shape instances. I almost injured myself while patting myself on the back.
Sometime in the next year, I was introduced to the Gang of Four Design Patterns [1] book. That burst my bubble.
[1] http://rads.stackoverflow.com/amzn/click/0201633612
1992: Auto-rebalancing AVL trees, where the re-balance is done on the tail of the recursive insert.
In a university assignment.
I guess I must have "invented" pretty much every array function in PHP :D And maybe some of the string functions too (but who hasn't, in any language?)
Since the library functions written in C didn't seem good enough for me, I had to rewrite them in pure PHP (performance is for sissies ;)
Then I learned to check the docs at php.net more often...
Recursive descent parsing supporting left recursion.
Binary searches.
I had been working on a C program to organize card game tournaments and wanted it to be really fast -- enough so that a large number of players could be handled on as little as a 486. When I started to realize how long looking up players was going to take, I tried to come up with a better solution than repeated linear searches and wound up with a binary search.
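The classic version, sketched in Python rather than the original C:

# Binary search over a sorted list: halve the candidate range each step.
def binary_search(names, target):
    lo, hi = 0, len(names) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if names[mid] == target:
            return mid
        if names[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1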
Not sure it really qualifies as an algorithm, but I "Invented" the technique of disabling a button on an HTML form with javascript so that the user doesn't inadvertently post the form twice.
This wasn't necessarily an algorithm at all, but as a homework for my C/Assembly class we had to write and implement our own malloc [1].
If anyone did the google code jam qualification round this year, they did a load balancing algorithm, a scheduling algorithm, and a surface area algorithm.
[1] http://en.wikipedia.org/wiki/Malloc
While I have the opportunity, I'd like somebody to show that something I came up with independently was invented before. It's so simple, I'm surprised that I haven't heard of it, but maybe one of you has. Namely, if you want to maintain a structure in order by the number of times an element occurs, in the worst case you have to swap every two times. But if you are willing to tolerate the elements being out of order by some small percentage of the overall count, then you only need swaps in the log of the number of arrivals (the log base depends on the percentage). You can prove this with the Master theorem. I did this in ~2005.
I re-created PHP's word_wrap [1] here [2]. Mine's grammatically better.
[1] http://www.php.net/manual/en/function.wordwrap.php
Median-Heaps :(
Mean of Circular Quantities [2]
Language Oriented Programming [3]
Division [4] - back in 2nd grade haha
[1] http://en.wikipedia.org/wiki/Shunting_yard_algorithm
After I was first exposed to selection sort, I immediately saw room for improvement and created shaker sort. This was several years before I learned the name for either one.
@Ryan
- Temporal difference learning [1]
- Several different design patterns ( Observer [2], State [3], etc..)
- Breadth-first search [4]
- Depth-first search [5]
- Binary search [6]
- If statements in PIC assembly. (Subtract one number from another, and then check for over/underflow to see which one is bigger.)
(Granted, none of these were as formally defined when I did them...)
Either the age on your profile page is way off, or you are drastically underestimating the development of computer science. It's a young field, but it's not that young.
Depth-first and breadth-first are both so old that no one seems to know who "invented" them first. A* (of which both are merely special cases) was published in 1968. Binary search is probably as old as sorting. If statements have been around since at least 1978 (K&R C), and probably quite a bit longer. Temporal difference learning was first described by that name in 1988. The Design Patterns book came out in 1994 (when you were presumably about 8 years old).
It's cool that you reinvented these things without being exposed to them first, but I'm pretty sure they were all formally defined before you did so.
[1] http://en.wikipedia.org/wiki/Temporal_difference_learning
I rewrote strcmp in C before I knew it existed (silly, I know - I learned the syntax well before the library). Of course, my strcmp was not as good as the C library's.
Given the strcmp() implementations they ship with, I would not be too sure of that... - Louis Gerbarg
I was playing around with sorting and how to use threaded sorting, and I sort of reinvented quicksort in the process.
I "invented" recursion over 20 years ago in college as a way to solve the Towers of Hanoi problem. Unfortunately, the language we were using (Vax Basic) didn't support recursive calls so I could never get it to work properly.
I didn't learn the term or the concept of recursion until the next year :)
It was in high school, and I was working on the shortest-path problem, when I came up with the flood-fill algorithm. It wasn't recursive or queue-based like the actual algorithm, but an iterative version of it instead.
There was no Wikipedia back then, or I wouldn't have spent those numerous days and infinite revisions in coming up with a solution.
I was very proud to invent a search tree for storing sequences. I then showed it to my tutor and he said "oh, that's a trie" (aka digital search tree, retrieval tree, prefix tree). I'm not sure if I was disappointed or not...
Also, Sir Clive Sinclair reportedly invented binary.
I "invented" the iPhone's scrolling method. A few years before it came out, I had drawn up a complete design for a touch screen interface controlled by dragging content in the direction you wanted it to move.
XOR drawing of a cursor so as to avoid a need to redraw the screen.
Elsewhere in the same program I also developed the other technique for avoiding a redraw--copying off the block containing the cursor.
How were patents ever granted for such things??
Squares of 5
I think I came up with this when I was in 9th grade or something. I was just playing with numbers and doing some divisibility tests when I discovered this peculiar trend in the squares of numbers ending with 5.
Let X be the number ending with 5 and let Y be the part before the 5. For example, if X=25, Y=2; if X=625, Y=62.
Then if X' is the square of X, X' = {Y(Y+1)}25 - that is, Y(Y+1) followed by 25.
For example:
15^2 = {1(1+1)}25 = 225
25^2 = {2(2+1)}25 = 625
35^2 = {3(3+1)}25 = 1225
45^2 = {4(4+1)}25 = 2025
55^2 = {5(5+1)}25 = 3025
65^2 = {6(6+1)}25 = 4225
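In Python, the rule looks like this (added sketch):

# Square a number ending in 5: take y (everything before the final 5),
# multiply y by y+1, and append 25.
def square_ending_in_five(x):
    assert x % 10 == 5
    y = x // 10
    return y * (y + 1) * 100 + 25

for x in [15, 25, 35, 45, 55, 65]:
    print(x, square_ending_in_five(x) == x * x)  # all True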
I was working with QBASIC and was too dumb to know what a SUB was.
So I figured out structured programming to deal with my spaghetti code.
Then, later, I started figuring out how to actually pass parameters internally.
I wrote an in-memory database which can handle multiple concurrent transactions (for a hobby project [2]). When thinking about how to isolate the transactions, I decided to use a revision number system similar to Subversion. I realized that the resulting isolation level was not serializable [3], but anyways quite good. Afterwards I did some digging and found out that I had implemented snapshot isolation [4] and multiversion concurrency control [5].
[1] http://en.wikipedia.org/wiki/Snapshot%5Fisolation
When I first started to try to get my head around the new weirdness before me that was OOP, I "invented" a way of doing stuff which was essentially the strategy pattern. I only knew of encapsulation and inheritance at the time. The descriptions of polymorphism that I found were totally incomprehensible to me, and it would also be almost a year before I discovered design patterns. I thought I'd really invented something ground-breaking!
I "invented" binary search when I was still a teenager and had just started out with the programming language C. That was about two years before I got internet access at home.
Although the Internet took away the 'magic' I associate with learning by 'trial and error' and having little access to relevant literature, I can't say I miss those times either. I envy the youth nowadays.
Once someone asked me a puzzle question: write a program to add two numbers without using arithmetic operators. For this I made an algorithm to add numbers using bitwise operators, and was quite happy with what I did, because until then I had never known what a full adder was. Later, when I studied the full adder and its implementation in Digital Electronics, I realized that it was exactly what I had written code for :)
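The usual bitwise solution, which is a software full adder (a sketch for non-negative integers):

# XOR produces the sum bits, AND produces the carries; repeat until
# there is no carry left.
def add(a, b):
    while b:
        carry = a & b
        a = a ^ b
        b = carry << 1
    return a

print(add(23, 19))  # 42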
Not an algorithm, but I invented higher-order functions (specifically map).
I also invented versioned FUSE (file system in user space) in a shamefully ugly way (stat() everywhere).
I "invented" switch/case. I begun my programming career with a BASIC that did not have switch/case and turned to 68k assembly on the amiga. I didn't like to use multiple conditionals for a set of values and "invented" switch/case via jump lists.
A little later I connected the Amiga and a PC via parallel port with a special self-made cable and wrote a program for each machine (both in assembly!) for sending files back and forth. I "invented" all kinds of error corrections, multilevel handshakes, and discovered the "Two Army Problem". I thought I must be a genius. What a disappointment when I learnt all that a couple of years later in college as pretty basic stuff...
The first time I played with a calculator, I was eleven (Cuba 1981). At that time it was a rare great thing for me and I was fascinated. I calculated many times n^2 for two-digits n. I found out that:
[ab]^2 = [a^2][2ab][b^2] -> where a, b are digits.
For example:
[34]^2 = [9][24][16] -> write 6 and carry over 1
= [9][25][6] -> write 5 and carry over 2
= [11][5][6] = 1156
I think that was the first time, when I experienced an Aha-effect and thought I had invented a formula.
Later on I learned algebra and realized that "my formula" was a special case:
(10a + b)^2 = 100a^2 + 10(2ab) + b^2
I thought my XOR and bit rotating encryption algorithm was awesome until I read the introduction to Schneier's book [1]. I'm not sure I would have kept my kid sister locked out either.
[1] http://www.schneier.com/book-applied.html
When I was a young kid in kindergarten or thereabouts, the teacher asked all us little students how high we could count on our fingers. I sat there for a while and figured out I could count to 2047. It took me years to realise I'd 'invented' binary.
I "invented" Object-Relational Mapping (like hibernate) in the course of implementing my undergrad thesis. I was writing a web spider, based on the GNU wget source, and I was using a database to store my data already. When I found myself needing to conditionally escape from a 1000-line function 500-lines in, with lots of conditional allocations in between, I found it easier to just dump state to a database, and exit with a code that told the parent process to restart it with arguments instructing it to restore state from the database. It was an ugly hack, but it made me way more productive. Once I had invested in the code for that, I ended up using it in a bunch of other places for more sensible things as well.
Also in college, I "invented" the treap while implementing LRU virtual memory in a toy OS. It turned out it was only efficient for highly-randomized memory access patterns, since sequential access would turn large sections of the treap into high-overhead linked lists, but it was still neat that my intuition that such a data structure would have to have a unique representation turned out to be correct.
Since college I've found it a better use of my time to re-use proven algorithms than reinvent the wheel, but reinventing the wheel can be a good educational exercise.
When I was a kid, in the early 1980s, the first bigger program I wrote was a relational database, written in BASIC. Granted, it was for small datasets, as all the data had to fit in RAM, but it consisted of multiple tables (as arrays) and a connection between them (one field contained a number that referenced a row number in another table).
I had never before seen a database in my life.
But if you think of it: with data all in a similar structure, a table becomes a natural choice; and in an attempt to reduce redundancy, moving identical data out to a different location and referencing it from different points, instead of blindly repeating it, just comes naturally.
I once "invented" a Dictionary Coder [1] LZ77-like compression algorithm. I was so proud :-) Then I found out that I would need to store stuff into n-bit numbers, so then I also "invented" Huffman coding [2]. Finally I looked up stuff on the internet and found out that these things have been known for decades.
[1] http://en.wikipedia.org/wiki/Dictionary_coder
I "invented" Gödel numbering [1]
When I was an undergraduate, I worked for the navy as a programmer in an "automatic data processing" center. I wanted to develop a way to efficiently store textual data in databases by enumerating words (hard disk space was at a premium then).
I came up with a way to assign unique natural numbers to words so that I could encode or decode a word in O(1) time without needing a lookup table, just the number. Later in graduate school, I learned that something similar to my enumeration method had already been invented by Kurt Gödel [2] 56 years earlier.
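For reference, Gödel's classic scheme (which may differ in detail from the poster's method) encodes a sequence as a product of prime powers:

# Letters a=1..z=26 become exponents of successive primes, so the
# encoding is unique by the fundamental theorem of arithmetic.
PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]

def godel_number(word):
    n = 1
    for p, ch in zip(PRIMES, word):
        n *= p ** (ord(ch) - ord('a') + 1)
    return n

print(godel_number("cab"))  # 2^3 * 3^1 * 5^2 = 600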
Gödel didn't develop his method until he was about 25. I came up with my method when I was 20. Do I get to claim to be a genius too?
[1] http://en.wikipedia.org/wiki/Godel_number
Abstract syntax trees.
I reinvented a few suffix array algorithms in the literature. Actually kind of surprised when I ran across them because I just thought that was the way everyone did it.
A thread pool. I was screwing around in a totally un-thread-safe environment and decided that I could put a function and its arguments into a table, then call it from another thread when that thread wasn't busy. Now, it was hideous and non-functional, but when I saw the diagram for a real thread pool, I instantly recognized it.
Long back, when I had just started programming, a friend asked me if there was a way to calculate the square root of a number without using a library function.
I spent an evening coming up with a brilliant algorithm that actually worked.
The next semester, in math class, they taught us the Bisection Method [1], which was oddly similar to my ingenious algorithm. I was heartbroken. ='(
I never realized it could have been used for any function (not just sqrt)...
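The bisection square root, sketched in Python (a reconstruction of the idea, not the original code):

# Find the zero of f(x) = x*x - n by halving an interval that brackets it.
def bisect_sqrt(n, eps=1e-10):
    lo, hi = 0.0, max(1.0, n)
    while hi - lo > eps:
        mid = (lo + hi) / 2
        if mid * mid < n:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(bisect_sqrt(2))  # 1.4142135623...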
[1] http://en.wikipedia.org/wiki/Bisection_method
I have combined B-Tree and hashing to solve problem of
I was working on a mapping project using Visual Basic 6.0 when I was in grade 12. The project started out just displaying lists of businesses on a map. (I was actually inspired by shows like "24" at the beginning.) But long story short, I started to wonder how computers would know how to find a path from point A to B, and using primitive tools available in VB 6, like lines, I came up with an algorithm which is very similar to Dijkstra's path-finding algorithm. My next movie-inspired project will be called "Enhance -- Enhance..."
In high school I was asked to do a program for my father's office's TRS-80 Model II. Basically I had to sort a lot of numbers and the algorithm I'd learned for sorting in school (bubble sort) was obviously woefully inadequate. I therefore "invented" the merge sort.
Imagine how crushed I was a few years later in university when it became clear that I'd been scooped on my "secret sort" by slightly under 40 years. :(
I realized that some DP algorithms that provide an exact answer given a set of constraints, like the 0-1 knapsack, could be modified to provide an answer that is provably within x% of the optimal answer when the set of constraints is too big.
For example, if the "weight" can go from 0 to 1 billion, you can't do a normal 0-1 knapsack. But you can get a very speedy answer that, although not perfect, can be proven to be within x% of the optimal answer. This is pretty shiny stuff. I don't know what it's called nor where it's documented, but it's pretty cool :)
(And yes, I know it's not optimal, but for a whole lot of problems you'll get answers that are very close to the optimal answer, and you can know how close you are.)
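This is essentially the classic FPTAS for 0-1 knapsack: scale the values down so the dynamic programming table stays small, in exchange for a bounded relative error. A sketch under that interpretation (names and structure are illustrative):

# Scale values by K = eps*Vmax/n, then run the exact DP over scaled
# values ("minimum weight to reach each value"); the best reachable
# value is within a factor (1 - eps) of optimal.
def knapsack_approx(items, capacity, eps=0.1):
    n = len(items)                       # items: list of (value, weight)
    vmax = max(v for v, w in items)
    K = eps * vmax / n
    scaled = [(max(1, int(v / K)), w) for v, w in items]
    top = sum(v for v, w in scaled)
    INF = float("inf")
    min_w = [0] + [INF] * top            # min_w[t] = lightest way to value t
    for v, w in scaled:
        for t in range(top, v - 1, -1):  # iterate downward: 0-1 semantics
            if min_w[t - v] + w < min_w[t]:
                min_w[t] = min_w[t - v] + w
    best = max(t for t in range(top + 1) if min_w[t] <= capacity)
    return best * K                      # approximate achievable value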