I always liked to ask myself "what's the first principle(s) of this?" after I learned the basic stuff of something (e.g. programming). It's an inspiring question, IMO, that can force you to think about the most important principle(s) behind something, especially a skill such as programming.
So, what do you think is the first principle(s) of programming? I'll give my answer below a little later.
Write code as if you were the one who will have to maintain it.
Be as lazy as possible.
Zen, part I: Programming is only the road, not the way.
Programming is only the technique for teaching a computer what it has to do. Being successful in creating fast, reliable software means knowing your algorithms, best practices, and all the other stuff not necessarily tied to your programming language.
Zen, part II: If you are in a hurry, stroll along slowly. If you really are in a hurry, make a detour.
Sounds silly, but do not let yourself get into compromises that may (really) trouble you afterwards. I have a rule: if you are at the core of a program, be as precise and careful as possible. If you are writing code that uses the core but still sits deep in your software, you can code a bit faster. If you are coding above those two layers, you can even get a little bit sloppier.
Design errors are the hardest to find and/or fix; next come programming errors in parts everyone relies on, then the "real showing-off software parts". If you need to fix a design error at the end of a project, ummm, that's not good... ;-)
Zen, part III: Know your path, Neo.
Know your environment, your tools, and the stuff you rely on daily, and get it sorted so that it works for you. It's best if you use your programming "environment" so naturally that you don't even have to think about it. If you have to get a job done, do not introduce "fancy new stuff"; just do your work. That stuff can be introduced in a new project, when you have time to prepare for it and use it.
KISS (keep it simple, stupid).
It does raise the question "How do you define simple?" and also "When is something too simple for the task at hand?" This is why you cannot become a good programmer just by knowing the first principle of programming.
Premature optimization is the root of all evil. -- Donald Knuth
Understand the problem first!
Do not reinvent the wheel.
YAGNI - You Ain't Gonna Need It [1]. The idea behind YAGNI is to program for your requirements, not for prospective, potential features. The premise is that by keeping to what you need to program, you will (among other things) cut code bloat, reduce complexity, avoid feature creep, and reduce the restrictions on what can be done (and how it can be done) in the future.
I suppose it works in tandem with modular design: Future features can be augmented without redesigning existing code.
[1] http://en.wikipedia.org/wiki/You%5FAin%27t%5FGonna%5FNeed%5FIt
Knowing when not to program.
If it wasn't tested, it is broken.
Coffee in, code out.
Distinguish between cause and effect (working with computers)
Distinguish between fact and opinion (working with people)
As simple as possible, but no simpler (design)
There are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies. The first method is far more difficult.
-- Charles Antony Richard Hoare
Think first, code later.
You're nowhere near as smart as you think you are. Ask questions. Learn to value your peers.
When debugging, the first answer will almost always be wrong.
Code you write with the intention of tossing out tends to become a cornerstone of much larger processes. Never leave anything written haphazardly.
It doesn't work until you've shown it working in a test.
Programming is a means not an end. Or perhaps, "Can does not mean should."
Program with an audience in mind. That is, assume that everything you write will be read and maintained later, by you or by someone else.
A corollary to that: Prove that you understand the problem you are trying to solve by naming your variables and functions and classes well!
Know your tools.
In my opinion, the most important principle is the reduction of complexity by creation of good abstractions.
This includes not only building the right abstractions, but also knowing where to stop abstracting and get down to the fundamental properties of the implementation technologies (e.g. the database system or the programming language), to prevent creating avoidable additional complexity.
Always write code as if the person who will be maintaining it is a psychotic serial killer who knows where you live
Also, never think you know everything about programming, keep learning
Indirection.
It might not be obvious why this is, or even what this means. But indirection is really at the basis of all of programming.
At a more superficial glance, it only seems to touch abstraction (as a concept), or perhaps also pointers (after all, they are the archetype of indirection). But pointers are just one instance (there! indirection!) of the concept, and there are many more, that are effectively equivalent upon closer examination.
First and foremost, variables are indirections because they allow the manipulation of a value indirectly via a symbol (a name). As a direct consequence, functions are an indirection, because they replace one symbol (the formal parameter) with another (the actual parameter, or argument; sometimes the definition is the other way round).
Since classes are historically just functions in disguise, classes are obviously an indirection for the same reasons as functions.
Arrays (or lists, same thing) are another indirection, often exposed as a fundamental type. In fact, there is no difference between an array and a pointer. Both refer to other things, or to nothing (in which case the array is empty and the pointer is null, or a special placeholder: NIL, "not in list").
I recently read a paper where the pseudo-code contained the following function and its use:
function UpdateItem(item, position) do
    P <- { }
    if item.x > position then
        item.count <- 0
    else
        P <- { item }
        item.count <- item.count + 1
        item.x <- position
    return P

Results <- { }
for something or other do
    position <- GetPosition()
    Results <- Results U UpdateItem(current, position)
The point here is that, like all good mathematical pseudo-code, it operates on mathematical sets, and augments a Results set by joining it to another one. Now, how would one implement this? Obviously, we could just use a Set data structure, or an array, or a vector, or any of these. But usually, this is done via pointers, right?
item_t* update_item(item_t* item, int position) {
    if (item->x > position) {
        item->count = 0;
        return NULL;
    }
    ++item->count;
    item->x = position;
    return item;
}

item_t** result = malloc(N * sizeof *result);  /* a "set" of pointers, not of items */
unsigned index = 0;
for (something; or; other) {
    item_t* r = update_item(item, get_position());
    if (r != NULL)
        result[index++] = r;
}
For me, this shows really well that many, many different programming concepts just implement/perform some kind of indirection and that, despite all their differences, most of them can be expressed in terms of other means of indirection trivially.
So yes, I think indirection is really the first principle of programming, since all others are just indirection in disguise. Except recursion. Of course, recursion can be used to describe indirection. ;-)
I would have to say that testing is one of the most important pieces of the puzzle. In my opinion: test early and test often. Whether your design method is highly planned or agile, there is nothing more important than testing to keep you on the right path.
Do no harm :)
Use your head. It is terrifying how many people fail that one.
Paraphrasing Fred Brooks:
Representation is the essence of programming. Much more often, strategic breakthrough will come from redoing the representation of the data. This is where the heart of a program lies. Show me your code and conceal your type definitions and function prototypes, and I shall continue to be mystified. Show me your type definitions and your header files, and I won't usually need the bodies of your functions or methods; they'll be obvious.
And just to add a shred of originality, when you write down your data-structure definitions, document their bloody invariants already!
I got into programming by way of studying digital electronics, so I guess for me the basic logic gates (not, and, or, xor, implies) were the first principles of programming.
It's all about the user.
When refactoring unnecessarily complex code, I often repeat the mantra:
The computer wants to do the right thing, you just need to get out of the way.
Principle: Software is Knowledge Capture.
Consequences: Many techniques for knowledge representation, all founded on Abstraction. Gives us layers, tiers, encapsulation, separation of concerns.
Many techniques for procedure representation, all founded on Sequence, Choice, Repetition.
Write code for the next guy.
DRY, pretty much everything else spawns from it. KISS is the other end of the balancing act to make sure you don't pursue software elegance to levels of insanity.
Think about how the end product will be used at least as much as about how the code looks. You could write the best-commented, most maintainable, most brilliantly logical code ever, but it's essentially a failure if no one wants to use the end product.
Occam's Razor. Reduce the problem/task to its simplest form. Then - and only then - start coding. Don't put the cart before the horse. Requirements first. Sure, they may evolve but the core requirement will be the core of your code.
Garbage in, garbage out. It doesn't matter how nice your user interface is if the data is bad.
In practice, and very unfortunately, good testing turns out to be more important than good programming. Testing increases the value of ugly code [1]. If you can't write beautiful code, you should at least make it testable.
[1] http://www.1729.com/blog/EconomicsOfTestingUglyCode.html
While keeping it simple (KISS) and not duplicating code (DRY):
Never completely believe what you are told about how the program will be used.
This is a good question.
Any problem can be solved with another layer of indirection.
Sequence, Choice, Repetition
What is the simplest thing that could possibly work...
SOC - Separation of concerns
KISS - Keep it simple stupid
DRY - Don't repeat yourself
in that order
There are only three things in the universe: data, containers for data, and tools that either put data in a container, take data out of a container, or change the data in a container, and they overlap.
"Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows where you live." ---- Martin Golding
Besides not reinventing the wheel, you should understand how the wheel was built and what it really does.
If the system won't work on paper then it won't work as a program. The reverse isn't always true, but a good computer system is usually based on a good paper system.
Start with the output and work backward.
20% of the code is for the function;
80% of the code is for the exceptions.
Beneficially relating elements.
This means that there are elements (modules, subroutines, whatever) that relate in order to benefit one another (nothing superfluous). This is part of Kent Beck's responsive design concept. There's a talk on it [1].
[1] http://www.infoq.com/presentations/responsive-design
Understand the problem.
Decomposition. Solve large, complex problems by breaking them into smaller, more manageable pieces.
And - style matters.
Abstraction, Composition
Do one thing, and do it well. It's the UNIX philosophy (http://en.wikipedia.org/wiki/Unix_philosophy). It works at every layer.
BE SMART AND LAZY
Just smart, and you will be engineering your way into bloated frameworks and writing UML until Duke Nukem Forever is released.
Just lazy and you are worthless, eating bon-bons in your sweats with no hope of amounting to anything.
If you are smart and lazy, that's where the money is. Engineering your way to nirvana by being pragmatic and recognizing ways to make your life easier daily.
When in doubt, manipulate the data!
You have to resolve all the problems in the world with "if, for, while".
JFDI - Just @#*&^%$ do it.
A friend recently suggested that agile, waterfall, iterative, etc etc etc are a waste of time and the best way to write software is the JFDI school of thought. Not my mantra, but made me smile.
Knowing WHAT not to program is as (sometimes even more) important as knowing what to program.
Making it bug-free.
"Computers Are Blind, Deaf and Stupid".
I should tell this to that teacher (not a programmer) who thinks that knowing the formula is enough to program an app that does math calculations. You must tell the computer what to do with that formula, doh!! (The same goes for data from a DB.)
Blind and deaf... if you do signal and image processing, you know this.
It doesn't exist unless it's committed.
Computers do ONLY what you tell them to. If it doesn't work right, it's because you haven't "told it" (coded it) right.
2nd favorite: it's usually a problem with you (your code). Interpret this as: first look for bugs in your own code before blaming bugs in the libraries you use.
Loose coupling. [1] High cohesion. [2]
[1] http://en.wikipedia.org/wiki/Coupling%5F%28computer%5Fscience%29
Always code as if the person who will maintain your code is a maniac serial killer who knows where you live.
No idea where that phrase originated from (possibly from some humorous caption), but I think there is some truth in it: Code for maintainability. If other people can maintain it, then that usually means that it's kept simple and well structured for the most part.
Don't be stupid on purpose
Code is written once, and read many times. Optimize for the reader.
I think that one consequence of the Church-Turing thesis [1] is that any algorithm that can be thought of, can be programmed on a machine.
It makes it incredibly hard to tell a manager/a client 'this is impossible' because in theory, if you can describe it, it is possible.
The rest is a matter of resource. The difference between a programmer and a non-programmer is that a non-programmer will ask for features which will range from 5 minutes development to 5 billion years, and they will be equally happy with each one of them. I exaggerate a bit, but that's the idea.
So here's the first rule of programming:
Maximize your 'end users satisfaction'/'resource' ratio.
[1] http://en.wikipedia.org/wiki/Church%E2%80%93Turing%5Fthesis
In other words, always check data for validity first. Bad (or unexpected) data can create havoc.
I'll second DRY and KISS. I'd also add, "Knowing a language is not the same as knowing how to program. Just like knowing how to use the steering wheel is not the same as knowing how to drive." Learn fundamental principles, and then apply those using whatever language or tools you have available. Languages and database engines and the like come and go. Data structures and algorithms are forever.
Ask Questions first.
Don't repeat yourself!
If it (the project) doesn't give you a hard-on, don't do it.
When you start something, finish it!
Use the other principles to achieve this.
Refactor before it's too late.
One important aspect of programming that is often neglected and ignored is "Separation of concerns". Before starting to code, it is crucial to analyze and design your classes to ensure they are not tightly coupled. Otherwise you will end up with very dependent objects and code, which makes change very difficult and refactoring a nightmare.
Applications should be layered appropriately, and using design patterns to decouple your classes allows for easy maintenance and ease of testing.
Humbleness.
Think as if you don't know any particular programming language (so that you don't fall into the trap of "thinking in XXX"). Then code to realize that thinking, using the proper language.
Structured Programming
When you assume, you make a YOU-KNOW-WHAT out of U and ME.
The golden rule, that one is. Always verify what you're taking for granted.
Programming is not for the lonely geek.
Do not overuse interfaces.
Sequencing: what do I do, and when do I do it?
I will go with an item that is too often neglected: check your I/O.
When you write a program/function/etc. make sure that the input/output is valid.
0 + 0 = 0
1 + 0 = 1
0 + 1 = 1
1 + 1 = 10
1 * 10 = 10
10 / 10 = 01
~ 0 = 1
~ 1 = 0
That is all there is to computers
In general: Problem solving.
That is what it all boils down to.
No infinite loops.
0 + 1 = 1
1 + 1 = 10
10 + 1 = 11
11 + 1 = 100
100 + 1 = 101
101 + 1 = 110
Get it?