What will be considered harmful next?
[+20] [29] mlvljr
[2010-11-03 20:03:50]
[ best-practices ]


We all remember (or have heard of) the GOTO debate. The practice was pretty normal for a long time, but times have somehow changed: it is now near-prohibited and a sign of bad taste.

Will we see some other common techniques go that way?

In other words, if in the near future a game-changing paper like "GOTO considered harmful" emerges and rocks the programming world, what exactly could it "consider harmful"?

Possible technologies / approaches coming to mind:

[suggested in the answers]

P.S. Crazy / dreadful predictions are welcome too (when supported with at least some arguments).

[NOTE: this is not a question of personal hatred but of possible common practice change.]

This is just going to dredge up a bunch of argumentative topics. - Jeremy
(1) @Jeremy, ain't that is still a fascicle this site! - Peter Turner
@Jeremy Maybe, but it's precisely the arguments I'm interested in ;) - mlvljr
@the police people You guys are... too serious ;) - mlvljr
So the answer obviously is "NOT ANY OF OUR SHINY TOOLS. EVER." That's because we already live in the future (unlike Dijkstra), see? - mlvljr
(18) StackExchange considered harmful to intellectual freedom. - Peter Turner
I will be hated for it, but I would answer: garbage-collected languages. If you reopen I would be happy to argue it ;o) - n1ckp
@n1ck And how can I, btw? - mlvljr
@mlvljr: well you can't, other people with enough rep need to do it for you. - n1ckp
polymorphism - for the same reason as GOTO - Steven A. Lowe
@Steven A. Lowe How so (if not kidding) -- for "inability" to reason about program states easily or (what)? - mlvljr
(1) @mlvljr i'm kidding. "oh my gawd that message could jump anywhere in the code!" - Steven A. Lowe
(6) The funny part of this is that goto was a majority-backed practice for a long time, so the answer with the FEWEST votes here is most likely to be the correct one. "Relational Databases considered harmful" is leading the pack at the moment. - Inaimathi
(3) Too broad. Voting to close. - Jas
(1) @Jas Too close. Vote to broad! - mlvljr
@Jas Again now?? How is it not real??? - mlvljr
@mlvljr > It's difficult to tell what is being asked here. This question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form. - Jas
@Jas All at once? I really doubt. If it is the "prediction" thing, then well, one cannot reason about future with ultimate confidence, but in our case that does not render the opinions useless since a) they are supported by arguments and b) there's an actual interest in them. - mlvljr
(1) @Jas, I know exactly what he's asking. It doesn't require a computer scientist to figure it out. If you were going to write a paper like "GOTO considered harmful" which would rock the world what would you "Consider harmful". You don't need to write the paper, just the abstract. - Peter Turner
@Peter Turner Possibly so. Let that be our motto! - mlvljr
@Inaimathi Thanks for the vote of confidence in my prestidigitation! - Peter Turner
@Peter, @mlvljr - you lost me there. - Jas
(1) @Peter, StackOverflow questions replace actual thinking... - Thorbjørn Ravn Andersen
Interesting link: - Peter Boughton
@Peter Boughton Great one, thanks a lot! :) - mlvljr
(1) HTML5. Because it's the first buzzword I tried for which google didn't turn up results for "X considered harmful" (quoted). - peterchen
spin-locks in thread programming??? - vartec
(2) A speculative question like this cannot be reasonably answered. Specifically, the FAQ states that open-ended hypothetical questions should not be asked. - Anna Lear
Though it should be pointed out that this question pre-dates the FAQ. - ChrisF
[+41] [2010-11-04 10:12:03] bobah


Instead of thinking about how to solve a problem in the most cost-efficient (or most elegant) way, more and more people are working hard on figuring out which pattern to apply in each particular case (I've heard several times already that "programming is all about patterns").

This is not only causing suboptimal decisions but is also significantly decreasing the level of "creative entropy" (similar to what Google's suggest is doing).

"considered harmful" is also a pattern :)

Added 2010-11-12: Cumulative reply to pattern advocates: I am not against patterns as a reflection of the natural human way of learning by generalizing. Patterns are also good when used to compress the human language (programmers' professional jargon, as a friend of mine noted). I am against this [1].


I doubt, these are too shiny at the moment ;) - mlvljr
(2) @mlvljr - but when you're on the top, the only further way is down :) - bobah
Well, it's not like the GoF were put on trial - Conrad Frix
(1) Patterns (as well as "methodologies") are an excuse for being non-creative, so no, they're going to be always around I'm afraid ;) - mojuba
(3) Programming patterns are like grammar for a natural language. If you know a language fluently you never think about its grammar, you just speak. The same goes for software development: you write efficient and manageable code without even thinking about the fact that the structure has a name. - Ernelli
(1) @Ernelli - programming is not like grammar, but like a language itself. You can become a good copywriter if you learn the grammar by heart, but you never become the author of a bestseller this way; you need to learn much, much more for that. Analogously, some people, instead of going deep, just buy "N enterprise design patterns" and write "enterprise architect" in their CV - bobah
(3) Our lecturer on CPU architectures, when describing the Z80, said "if you don't know how a particular machine command works, just think how you'd like it to work and you'll most probably guess the right answer" -- there is almost always a clear winning solution in any particular case, but the more complex the problem domain, the less probable it is that the solution to one problem can be applied to another as-is. - bobah
(1) Patterns are a simple way to rapidly solve a generic problem, and may be modified easily to fit the problem at hand. Yes, it does make the job of programming much easier at the trade-off of...? What? Making work easier? Okay. I don't want to re-invent opening a CSV file every time I need to open a CSV. Does it allow some programmers to be closer to clerks than full-out problem-solving masters? Yeah, it does. Does this mean it's going to kill the world? No. Do I honestly think that taking a ready-made solution for opening a CSV is going to be considered like a goto? Heck no. - Incognito
@user1525 - there is no problem to use CSV parser to parse a file. The problem is to, say, use a DOM parser when you need a single record from XML file. - bobah
(2) Aren't patterns merely a concept of common practices and solutions to common problems? Aren't patterns the logical progression after a certain group of people start knowing how to use a tool or language correctly? - Jonn
@Jonn - they are indeed, I do myself employ Intel's "copy exactly", which is an extreme degree of a pattern concept. But even the best tool can be abused and with patterns it is heavily abused indeed. I have updated my answer with the link to the joking story highlighting the type of an abuse I am talking about. - bobah
(2) I always saw the whole 'patterns' thing as just a development of a lexicon of common ways of solving problems to aid developers talking to each other. I don't see them as a goal in and of themselves. - Kaz Dragon
@bobah: So, really your answer was: "mindless programming is bad"? - John Fisher
(1) @John Fisher - no, though mindless programming is bad indeed. The answer is exactly what's written. The way most people interpret the idea of patterns is evil -- it tends to give a false sense of "expertise". You can't be a true expert in any technique(or whatever else) until you once have followed the full path of the synthesis of that technique. - bobah
[+27] [2010-11-04 01:52:16] Berin Loritsch

Hmm the next great evil? Here is something I see abused all the time:

  • Singletons considered harmful

This will particularly bear out as concurrent programming becomes more prevalent. I have seen abuses of the singleton pattern so bad the author should have just written the application in C. The truth is, there are very few circumstances where a singleton is truly the correct solution. I think I can count on one finger the number of times that it might have been the correct answer--oh wait, I was wrong....
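To make the complaint concrete, here is a hypothetical sketch (the `ConfigSingleton` and the fetch functions are invented for illustration, not taken from the answer) contrasting a classic singleton with an explicitly injected dependency:

```python
# A classic singleton: one global access point that every caller is
# invisibly coupled to, versus an explicit, swappable dependency.

class ConfigSingleton:
    _instance = None

    def __init__(self):
        self.timeout = 30

    @classmethod
    def instance(cls):
        # Lazily create the single shared instance on first access.
        if cls._instance is None:
            cls._instance = cls()
        return cls._instance

def fetch_with_singleton():
    # Hidden dependency: testing with a different timeout means
    # mutating global state behind the scenes.
    return ConfigSingleton.instance().timeout

def fetch_with_injection(config):
    # Explicit dependency: a test can pass in any stand-in it likes.
    return config.timeout

class FakeConfig:
    timeout = 1

assert fetch_with_singleton() == 30
assert fetch_with_injection(FakeConfig()) == 1
```

The concurrency concern raised above compounds this: once several threads reach through the global access point, every caller also inherits the singleton's synchronization story, whereas an injected instance can be confined to one thread.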

(11) Singleton is kind of like fancy OO-speak for global variable. People, just because it's a named pattern, doesn't mean it's OK to use it everywhere!! :-) - Mike Clark
"glabal state considered harmful" -- how could I forget this, updating the question :) - mlvljr
(5) Just because it's abused doesn't mean it's wrong - "Free speech does not give you the right to shout 'Fire!' in a crowded theater." - Michael
(8) I thought Singleton was already considered harmful. - Graham Lee
(2) Steve Yegge's Singleton Considered Stupid? - Joe Zoller
(1) Global state (whether Singleton or not) is always harmful. - Matthieu M.
This is going too far. "Global state" is highly useful in many situations -- why else would we have databases? Singletons aren't bad in themselves, but they must be used correctly. (Hopefully, you don't use multiple connection strings to point to your database from your app...) - John Fisher
(2) There is a difference between global constant values (const in C++ and C#) and values that change (maintaining state). Changing values in a global state require much care if you ever want to use multithreading which is also becoming more of a must. And sometimes, you need to use multiple databases in your application (particularly if you are bridging multiple systems). - Berin Loritsch
@Berin: So, you're saying that multiple databases (read "multiple, large, many-valued singletons") are ok, but other forms of singletons are not? - John Fisher
A database is not a singleton. It is a service. Additionally, most db connect code requires a unique connection per thread. Singletons are code constructs. Files and databases live outside of your code. Bottom line, you can't rely on a singleton for your database connection. - Berin Loritsch
+50 goes here ;) - mlvljr
@Berin: I guess you'll need to define a singleton, then. There are many instances of what could be called singletons on desktop operating systems which are required for safe and proper operation, but apparently your definition doesn't include those. So, what does it include? - John Fisher
Let me ask you this: can there only be one file handle to reference a file? Can there only be one database connection to talk to a database? The answer to both of these questions is no. There can be multiple file handles, thread handles, database connections, etc. In fact, web applications rely on multiple connections in a pool so that they can respond efficiently. A singleton is a code construct. Check out the GoF design patterns book for that definition. Just because there is one file does not mean it is a singleton. - Berin Loritsch
@John Fisher It's imposing that something is a singleton in all scenarios (including those important ones where it could be better done the other way i.e. testing) which is a harmful practice (as I get it). If there must be only one instance of something by design, of course let it be, but make it a "locally" global one and you'll get best of both worlds (the design done as desired together with perfect testability / modularity). - mlvljr
@mlvljr: "Locally global" might work in some circumstances (assuming the oxymoronic nature of that phrase can be overcome). However, a singleton application cannot be implemented that way. It must be a global singleton to prevent multiple instances of the app from running simultaneously on the computer. Other singleton scenarious would have similar problems. - John Fisher
@John Fisher I actually cannot imagine a case where turning singletons into "locally global" variables / objects / components will not work (assuming that imports / implicit references to the singleton will be reworked into explicit dependencies and properly set up / injected). That's about software construction / design. Again, restricting oneself to a single instance of an application (while absolutely necessary by design (as in the previous case) in certain settings -- to prevent data corruption, etc) can become very painful in some other (i.e. during maintenance, (re-) deployment, etc). - mlvljr
(1) @mlvljr: How about a thread pool manager? More than one of those will totally contradict the point of having one in the first place. How do you propose to limit yourself to only one without at least one bit of global state? (If you use a variable that only exists once per instance of the app, then you have a singleton...) - John Fisher
(1) A thread pool manager per thread pool can work well. I can think of many instances, particularly in writing domain specific servers, where multiple thread pools and managers with differing parameters will be needed. Even on the client side, when you need to provide isolated state between different sandboxed entities, a global thread pool manager becomes a vulnerability to escape the sandbox. Consider how Google Chrome works. Each tab is run in its own process. Each process manages its own set of threads, etc. That's how it can provide the most protection. - Berin Loritsch
There's a lot fewer examples where the singleton, as defined by the GoF, actually makes sense. In fact, these become vectors of attack for people who want to do bad things. Additionally, they provide choke points if you synchronize access to it where the application will not be able to take advantage of multiple cores as easily. - Berin Loritsch
@John Fisher A single but not global variable is totally ok. I.e. this is bad: Manager mgr; void do_stuff() { mgr.do_smth(); } int main() { do_stuff(); } and this is (in the above sense) better: void do_stuff(Manager & mgr) { mgr.do_smth(); } int main() { Manager mgr; do_stuff(mgr); }. - mlvljr
In other words it's not having a single instance of something during normal operation that is bad but utilizing such technical means to do it that harm some other ways the software can be dealt with (i.e. development, testing, deployment, etc). - mlvljr
(1) @mlvljr: I guess the difference here is one of preference. Do you design your code with future potential uses in mind, or do you design your code to meet the current known needs and refactor later if needed. Based on that choice, you can decide whether the singleton is useful or harmful. - John Fisher
@John Fisher Of course. I just don't perceive singleton as a "nice" thing, though it's curious. - mlvljr
[+25] [2010-11-04 00:53:10] dsimcha

Single Paradigm Languages

The '90s and 2000s will be seen in hindsight as an unenlightened search for "the" programming paradigm. What is becoming clearer every day is that no one paradigm is right for every problem. More importantly, no one paradigm is even right for every subproblem within a small project. To create code that is at the same time DRY, terse and readable, you need to be able to mix and match paradigms at a fine-grained level and pick the right paradigm for the subproblem.
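The point about mixing paradigms at a fine-grained level can be sketched in a few lines. This is a hypothetical example (the `Order` class and `report` function are invented here): a small class where data and behavior belong together, a declarative expression where the transformation reads best, and a plain loop where step-by-step sequencing is clearest, all inside one function.

```python
# Fine-grained paradigm mixing: OO, functional, and imperative styles
# each applied to the sub-problem they fit.

class Order:                       # OO: data with a bit of behavior
    def __init__(self, total):
        self.total = total

    def is_large(self, threshold):
        return self.total >= threshold

def report(orders, threshold):
    # Functional: a declarative expression of "sum of the large orders".
    large_sum = sum(o.total for o in orders if o.is_large(threshold))
    # Imperative: a plain loop where explicit sequencing reads clearest.
    lines = []
    for o in orders:
        lines.append("order: %d" % o.total)
    return large_sum, lines

total, lines = report([Order(5), Order(50), Order(500)], 10)
assert total == 550
assert lines == ["order: 5", "order: 50", "order: 500"]
```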

I'm not sure this applies to the "subproblem" in all projects (it will in some, won't in others) but you are definitely right about one paradigm not fitting every problem. - HedgeMage
(8) You are basically telling us the Java is harmful :) - happy_emi
(4) @happy let's have the guts to state it loudly then ;) - mlvljr
(1) I think this is an argument for writing software in multiple languages, but most "multi-paradigm" languages are mediocre at all paradigms but not excellent in a single one. The better languages tend to be ones that focus on a single paradigm. - mipadi
@mipadi: I've heard this argument several times before, and my main gripe against it is that I like to blend paradigms at a finer grained level than using multiple languages would allow. Using multiple languages only works when you want to use a variety of styles at a very coarse-grained level. - dsimcha
[+18] [2010-11-03 20:14:21] Jeremy

I think we are past the point where we can again have such a simple and yet meaningful insight as "goto considered harmful".

(5) As simple as it was, the goto insight was backed up by a paper of considerable length, as far as I remember. - mlvljr
Is this being downvoted as a bad answer (to a good question), or because downvoting the question itself does not feel enough? "Yours answer da wrong question, pal! Bang!" - mlvljr
Well, I didn't downvote it. But it's not an answer so it oughta be downvoted. - Peter Turner
(3) @mlvljr: The "Goto Considered Harmful" was a letter, not an academic paper. It sparked more debate, including a paper by Knuth on when goto should be used (and he found several uses, many now done by other means). - David Thornley
@David Thornley Thanks, did not know the first (letter/paper) part. - mlvljr
@David Thornley Turns out initially it was intended to be a paper; even more, the "considered harmful" part of the title comes from Wirth ;) - mlvljr
[+17] [2010-11-04 08:18:58] zvrba

Test-driven development. It is a crutch for half-baked developers.

Why TDD: I think it's hype. It kind of works for toy examples where you have a functional relationship between inputs and outputs. But when it comes to temporal dependencies, such as event-driven systems, it's useless, because the functioning of the program is determined by 1) the events that arrive, and 2) the order they arrive in. You can test that each event handler does the thing it's supposed to, but when there's shared global state (unavoidable in any program that makes money for its creators), you CANNOT test that every ordering of events will produce a correct result.

So what to do: use assertions to ensure that the program state is always consistent and valid. I use this methodology even for algorithmic code, and I spend very little time in the debugger -- most of the bugs are discovered by failed asserts. Assertions force you to think about the state of the program AND about assumptions required for the code you write to work. If you assume something, then assert that it is true! This reduces the number of bugs tremendously.

Why is it a crutch for half-baked developers: Heavily-used and complex systems have been written without TDD, and they have very few bugs. Examples: PostgreSQL database, Solaris or NetBSD kernel and base system (one of the best structured and clean code bases I have seen in my life), TeX typesetting system (I'd dare say, bug-free).
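The assertion style the answer advocates can be sketched in a few lines. This is a minimal, hypothetical example (the `transfer` function is invented for illustration): preconditions and an invariant are stated directly in the code, so a violated assumption fails at the exact point it is broken.

```python
# Assertions as executable documentation: state the assumptions the code
# relies on, and the invariant that must hold when it finishes.

def transfer(accounts, src, dst, amount):
    # Preconditions: the assumptions this code needs, made explicit.
    assert amount > 0, "amount must be positive"
    assert accounts[src] >= amount, "insufficient funds"
    total_before = sum(accounts.values())
    accounts[src] -= amount
    accounts[dst] += amount
    # Invariant: a transfer moves money; it never creates or destroys it.
    assert sum(accounts.values()) == total_before

accounts = {"alice": 100, "bob": 0}
transfer(accounts, "alice", "bob", 40)
assert accounts == {"alice": 60, "bob": 40}
```

In a production build the asserts can be compiled out (Python's `-O` flag, or `NDEBUG` in C), so the documentation value costs nothing at runtime.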

wish this better not happen, at least for the "test-" part (the "driven" one seems very insightful to me though, while simultaneously doubtful) - mlvljr
(12) You appear to have gone from "I've found a case where TDD doesn't work" to "TDD is considered harmful". Compare with goto, where the situation is reversed and the cases where it's the right tool for the job are limited. - Graham Lee
(7) I think TDD is limited to cases where the results are clearly defined - libraries, etc. - Michael
@Michael And when are they undefined? - mlvljr
(2) Possibly used wrong word - what I was saying was where requirements are clearly defined. Like, not in an internal company app that is constantly getting changed and updated. - Michael
(2) While I see your point, your final argument is plainly wrong; you can find lots of examples of projects using GOTOs succesfully as well. - UncleZeiv
(7) +1. I also think asserts are more important than unit tests (see…). I also hate TDD because it forces you to design heavily for testability, which is a secondary requirement. This leads to all kinds of unnecessary indirection in your code just so you have testing seams available. - dsimcha
I like the idea of using asserts, but they litter the code, in the same way that if compiler directives litter code. Yuck. - Robert Harvey
(2) @Robert: That's kind of the point. Asserts are a form of documentation that make assumptions explicit both to the reader of the code and to the compiler. - dsimcha
(2) @Graham: Heavy reliance on testing is harmful because testing can only show the presence of bugs, not their absence. Maintaining tests requires double work, makes refactoring more difficult, they test only a limited number of use-cases, and you end up with two code bases to debug: the tests and the code that matters. Also, the best code is the unwritten code. TDD goes directly against that principle. - zvrba
(5) Criticizing test driven development without understanding what makes it work is more harmful than using it blindly. What TDD does is force you to think about what needs to be done. Then you prove that the code cannot do the work currently (in the form of a test). Finally, you make the code do that work. That doesn't imply what you wrote doesn't have bugs, particularly if the person hasn't started thinking about the edge cases. However, when a bug is discovered, you write a new test to prove something needs to change. Continued... - Berin Loritsch
... Certain problems are difficult to test with the most common unit test frameworks. But that does not mean its impossible or that a different framework would be better suited. At least one of the principles behind TDD is what caused you to sprinkle assert statements throughout your code. If you have a requirement, use an assertion to prove the requirement is met. Just because your test harness is built into the code instead of a separate test library doesn't mean you aren't testing. - Berin Loritsch
My problem with TDD in general is that it's really dependent on this "middle ground" of programming that I don't think is stable. You need the tests to verify that code works, but in order for tests to be useful, you need smaller, tighter methods. (That's good practice, in general.) But if your functions were small and limited enough (a la strongly-, statically-typed functional programming) then unit tests can be largely automated by static analysis, and the program's correctness is largely guaranteed by the strong type system. Continued... - CodexArcanum
In essence, TDD depends on the using a language that is powerful enough to support clean methodologies that can be tested easily (often needing things like Mock Objects and IoC) but which are not powerful enough to just be fully functional and strong-statically-typed. Now, TDD can be very valuable for dynamic languages where the compiler isn't helping you, but as zvrba indicates, you're probably better off there either using assertions or code contracts, since that puts the tests closer to the code being tested. - CodexArcanum
One final thought: BDD is totally alright. But TDD and BDD have a fuzzy line between them. A lot of professionals are trying to push the distinction of BDD, because verifying that your code actually meets the requirements of the written spec is very valuable, and is something that can't really be automated away. - CodexArcanum
(3) Reflexive rejection of over-hyped but perfectly valid techniques considered harmful. - Frank Shearar
(4) This is a flippant answer from someone who apparently thinks they are an infallible developer. Using asserts throughout the code is like writing tests without the benefit of the more maintainable code that could have been produced as a result of TDD. - Matt H
(3) TDD is only partly about tests. TDD is more about finding a design which fits exactly the know requirements today. No more, no less. - Martin Wickman
(1) @zvrba TDD is about testing features, not event-handlers - Steven A. Lowe
(7) I am a developer of PostgreSQL, and I use TDD. - Peter Eisentraut
(4) -1 for a badly formulated argument against a concept you had a poor understanding of. It shows you leapt to firm judgements about it before you had even understood what it was you were judging. That's quite an unprofessional attitude to hold. That attitude's responsible for such WTFs as a College Head saying "OOP is just a fad, we don't teach it here" - shared by someone in another thread I can't find again; though that was probably also caused by the Head having no decent industry experience in recent years. Learn first, learn even more second, understand third, judge last. - Jonathan Hobbs
@Axidos: I would have had a more open mind toward something that requires me to do double work if I spent a lot of time debugging or fixing my code. But I do not. - zvrba
(2) @Matt: there is no firm evidence that TDD produces more maintainable results. There are few decent studies, and one of them says that the results are inconclusive. This is another reason why I say that TDD is a "fad": their proponents come out with claims about "better quality", but the evidence is nowhere to be found. The discussion on the study in question is here:… - zvrba
@zvrba Have you got the "On the Effectiveness of Test-first Approach to Programming" paper itself? The links are either broken or lead to paid downloads. - mlvljr
(1) @mlvljr: Try this link:… - zvrba
@zvrba Many thanks. - mlvljr
[+17] [2010-11-09 04:09:05] Jeremy

Personally, I think using four-digit years is going to be considered bad practice. Just think about it: in less than 8,000 years we're going to face a major crisis due to our dependence on four-digit years.

(5) I'm already writing code analysis tools to help businesses determine if they are Y10K compliant. ;-) - Jason Whitehorn
(1) @Jason Whitehorn Hope you'll finish it in time ;) - mlvljr
Actually, the Gregorian calendar itself has only existed for about 400 years, and most of the world didn't even accept it until the early 1900s. By this logic, it's even worse for us to be pompous enough to think our already failing calendar will be used when it's 20 times its own age, meaning that we don't need a better solution to deal with the changing calendar paradigms that have plagued the last six thousand years. - Incognito
[+17] [2010-11-10 23:35:37] AShelly

Comments considered Harmful

I see signs of this one popping up in lots of discussions here. People believe that comments are a code smell - that all comments can be replaced by better method and variable names. I can imagine a world where they are considered as distasteful as GOTOs.

Personally, I think this is an extreme position. There are some potential problems with comments - if they get out of date, they do more harm than good. And there are useless comments: i++; // increment i helps no one.

But, used correctly, comments add context to the code, and capture stuff that you just can't do with naming and refactoring. There are plenty of times I have been grateful for a comment that explained WHY something was done a certain way, instead of how it was done.
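That distinction can be shown in a tiny hypothetical example (the function and the proxy-timeout rationale are invented for illustration): the first comment merely restates the code, while the second records a WHY that no rename could capture.

```python
# A useless comment restates the code; a useful one captures intent
# that the code alone cannot express.

def retry_delay(attempt):
    delay = 2 ** attempt  # raise 2 to the power of attempt  <- adds nothing
    # Cap at 30s: our (assumed) upstream proxy drops idle connections
    # after 35s, so waiting any longer would always reconnect cold anyway.
    return min(delay, 30)

assert retry_delay(0) == 1
assert retry_delay(10) == 30
```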

So I hope this prediction does not come to pass.

I used to be a huge fan of comments, and would often write all of them before I'd write any code as a way of organizing my thinking. But now I put a lot of that information in good names and clearer structure. And most of the rest goes in my test cases. I still occasionally write comments, but for me it's very much a last resort these days, because they're the least likely thing to be properly maintained. - William Pietri
(1) @William Pietri: the guys maintaining your code want to tell you otherwise. With a big stick :) - gbjbaanb
@gbjbaanb - Possibly, but I doubt it. I changed my approach based in large part on what I was most happy to get when I deal with other people's code. - William Pietri
[+16] [2010-11-09 10:35:52] mlvljr

Pseudo-OO spaghetti with "dumb" getters / setters everywhere and severely duplicated code.

Rant: see here [1].
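The complaint can be sketched with a hypothetical account example (both classes are invented for illustration): a data bag with "dumb" get/set pairs pushes its rules out to every caller, which is exactly where the duplication comes from, while an object that owns its invariant keeps the rule in one place.

```python
# Pseudo-OO: accessors with no behavior; every caller must re-check the
# "balance never goes negative" rule for itself, duplicating it everywhere.
class AccountStruct:
    def __init__(self):
        self._balance = 0

    def get_balance(self):
        return self._balance

    def set_balance(self, balance):
        self._balance = balance

# OO: the rule lives next to the data it protects.
class Account:
    def __init__(self):
        self._balance = 0

    def withdraw(self, amount):
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount

    @property
    def balance(self):
        return self._balance

a = Account()
try:
    a.withdraw(10)        # rejected by the object itself, not by a caller
except ValueError:
    pass
assert a.balance == 0
```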


[+12] [2010-11-04 11:19:09] mlvljr

Inheritance (at least when used heavily).

see here [1] and here [2] for both C# and Java-based rationale.
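The usual alternative to heavy class inheritance is composition behind a small interface, so behavior is changed by passing a different collaborator rather than by growing a hierarchy. A hypothetical sketch (the classes are invented for illustration):

```python
# Composition over inheritance: Car HAS an engine; it does not IS-A engine.

class Engine:
    def start(self):
        return "petrol engine started"

class ElectricEngine:
    def start(self):
        return "electric motor started"

class Car:
    # The collaborator is passed in, so swapping behavior needs no subclass.
    def __init__(self, engine):
        self.engine = engine

    def drive(self):
        return self.engine.start() + ", driving"

assert Car(Engine()).drive() == "petrol engine started, driving"
assert Car(ElectricEngine()).drive() == "electric motor started, driving"
```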


tada! the (Holub?) haters came by ;) - mlvljr
I'd say class inheritance, yes, interface inheritance, no. - Oliver Weiler
(1) @Oliver Weiler Correct, I meant class inheritance; interface inheritance, being a means of only specification/signature composition, is a bit hard to misuse (I hope) :) - mlvljr
[+9] [2010-11-04 14:11:22] HLGEM

I'm going to go way out on a limb here and say that too much abstraction will be considered harmful. We've been moving to more and more abstraction, and I think this will cycle back to less abstraction when it reaches the point that nobody bothers to understand anything and really horrible stuff is produced. Note I am not saying all abstraction is bad (I'm not interested in doing database work in machine code!), just that I think the trend is going to a dangerous place where lack of understanding will cause major development problems. Thus we will cycle away from abstraction, sort of how in American politics we cycle between political parties.

Then you are welcome to… - mlvljr
so you are saying you can keep track of details at some finite but large number of levels? You want to go back to assembler? Sounds fine - just leave me out of it - I need like the abstraction. - Tim
I didn't say no abstraction, I said too much. I think there is an appropriate level of abstraction somewhere between assembler and "I don't want to have to understand anything". - HLGEM
If writing an application in terms of printf is ok, one has no need to know anything from the levels below it (or better to say inside it -- because the asm-level inner-mechanics is hidden/encapsulated between the high-level routine and its visible behavior (console output)). - mlvljr
[+8] [2010-11-09 06:02:48] rwong

Computation considered harmful

Computation generates a lot of heat. The heat that is generated is proportional to the amount of computation performed, and also to the amount of electricity used. In most parts of the world, electricity is generated from non-renewable sources. Removing the heat from the servers also requires significant energy. All of these will consume resources and emit pollution.

Multinational corporations are building large computation facilities everywhere. A lot of investment goes into making computation as efficient as possible. Often, the most crucial factor is neglected: Why compute? Why do we need to answer a question when nobody is going to use the result? If a person asks a vague question, shouldn't the computer verify the person's intention before going any further? Why waste the person's energy by making him or her browse through thousands of results and still not find the answer he or she wants?

"And those are the Googling taxes of yours." - mlvljr
Reducing this heat uses up a lot of water, which is a scarce resource in many places. Including many places where data centers are located. See here:… - user16764
[+7] [2010-11-09 09:34:45] macgarden

"GOTO considered harmful" is considered harmful.

..together with Knuth and Dijkstra ;o - mlvljr
(4) {"GOTO considered harmful" is considered harmful.} is considered harmful. - bobah
consider_harmful( bobah_s_comment ); - hasen j
(1) consider_harmful( matthew_crumley_s_comment ) - Matthew Crumley
(1) run-time error R6000 -stack overflow - raimesh
def consider_harmful(bool harmful) { harmful?consider_harmful():!consider_harmful() } - mlvljr
[+7] [2010-11-12 17:53:24] Ross Patterson

Aspect Oriented Programming

Guy Steele [1] famously said [2]

... the ability to define your own operator functions [in C++] means that a simple statement such as x = a + b; in an inner loop might involve the sending of e-mail to Afghanistan.

Aspect Oriented Programming takes this to extremes, such that in some languages it is nearly impossible to determine what code has been inserted where inside an otherwise-understandable block of code.


Aspect-oriented programming has been (in good humor) described as an operation symmetrical to goto: It's called comefrom. - Macneil
(4) In a language without operator overloading such as Java, what guarantees does the language give me that a.add(b) won't send an e-mail to Afghanistan? - FredOverflow
Also, I generally find this pointless. Why doesn't Guy Steele come over here and debate it with us? No? Then what he said is worthless. - DeadMG
@DeadMG: George Santayana said "Progress, far from consisting in change, depends on retentiveness. When change is absolute there remains no being to improve and no direction is set for possible improvement: and when experience is not retained, as among savages, infancy is perpetual. Those who cannot remember the past are condemned to repeat it." But don't expect him to come over here and debate it with us, he's been dead a little too long. - Ross Patterson
(1) @FredOverflow: I think that was almost exactly Steele's point. My point was that a reader of x=a+b or a.add(b) in an AOP world has no way of knowing anything about the actual meaning of those statements. - Ross Patterson
[+6] [2010-12-03 10:23:40] missingfaktor

Mutability Considered Harmful

As multiple cores become a norm, mutability will become the new goto. :-)

[+6] [2010-11-09 14:46:21] Incognito

Direct SQL injection.

SQL that doesn't go through something that sanitizes user input, or through something like ActiveRecord.

Actually, this is already getting enough flak that I think it's going to die.
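
The difference between spliced and parameterized SQL is small in code but large in consequence. A minimal sketch using Python's stdlib sqlite3 and a made-up users table (the table and the malicious input are illustrative, not from any real system):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

evil = "nobody' OR '1'='1"

# Vulnerable: user input is spliced directly into the SQL text,
# so the OR '1'='1' clause becomes part of the query and matches every row.
unsafe = conn.execute(
    "SELECT name FROM users WHERE name = '%s'" % evil
).fetchall()

# Safe: the driver passes the value out-of-band as a bound parameter,
# so the whole string is treated as a (nonexistent) user name.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (evil,)
).fetchall()

print(len(unsafe), len(safe))  # 2 0
```

The parameterized form is exactly what layers like ActiveRecord generate under the hood.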

(5) OK, who's gonna be the first to post a link to the little Bobby Tables cartoon ? :-) - Sean Patrick Floyd
[+6] [2010-11-04 12:23:44] Graham Lee

OK, I know this is going to be controversial, but:

Developers considered harmful

Seriously. Many advances over the last fifty years or so have involved reducing the scope for human error at the point where requirements get turned into executable code:

  • compilers stop us having to translate down to the basest hardware level
  • modeling tools and GUI builders automatically generate code for common scenarios
  • structured and OO paradigms both let us re-use existing software instead of writing our own
  • automated testing provides a form of the requirements/spec that can be automatically checked
  • BDD is starting to bridge the gap between the high-level requirements and the low level at which automated testing is usually applied

So within a couple of decades, software will probably be largely written by tools that comprehend the business analyst's requirements, rather than by squishy error-prone coders. Well, they did say that computers would make some people redundant...

Microsoft's "Object Thinking" has its last chapter devoted to depicting a (possible) future where there are no programmers, just "software assembly specialists" (if I remember right). I also remember some (all-in-all favorable) reviewer calling this chapter "bizarre", though ;) - mlvljr
(6) Your argument is basically that computers will eventually program themselves. They've been saying that for decades; it will never happen. The advances you talk about have more to do with managing the increasing size, complexity and sophistication of computer programs than they do with managing computer programmers. - Robert Harvey
(2) @Robert: no it isn't, my argument is that business analysts will write the applications. That's my reason for writing that, and not anything about computers programming themselves. But on that isn't that exactly what YACC does? - Graham Lee
(3) Well, we do have domain-specific languages, but you still need a developer to create the domain-specific languages, and how many different domains are out there? YACC is just another domain-specific language. - Robert Harvey
Unless the tools the analysts use can translate their specifications / requirements into algorithms in a fully automated way, there will still be enough room for a programmer. I.e., to specify the requirements for a sort routine one does not need any CS knowledge, but writing/implementing one is an entirely different matter (and also has little to do with business analysis). So programmers will be employed at least to write library routines. And we also should not forget that automatically generating applications that are resource-efficient enough will probably not be possible either. - mlvljr
.. I mean, the business of combining existing components into an application conforming to the specified (business) requirements, with the goal of achieving some desired resource (memory / CPU / network load / etc) "effectiveness", 1) is clearly not an analyst's job (since it has nothing to do with business requirements), 2) should obviously be done by an engineer with knowledge of the above topics (component / algorithm properties; the means, pitfalls, and effective techniques of combining them; etc) -- i.e. a programmer. - mlvljr
(2) @mlvljr I think one of the problems with software today is that programmers still think they need to write sort algorithms. This problem was actually solved before computers existed. And when you realise that 95+% of programmers are writing in-house ERP in VB, you find that the wasted effort is worse than you originally thought. - Graham Lee
@Graham Agreed. What I meant is that, given a precise specification of the business part of the app, overall requirements on its "resource hungriness", and a set of ready-to-use software components, one will not always get a sufficiently resource-frugal application from a more-or-less direct transformation, even if the business-requirements part is perfectly sane from the business point of view. A good example is a simple numeric relation like factorial (or even addition) defined with mathematical strictness and making perfect sense from the analyst's point of view, which can easily be translated into... - mlvljr
..executable code, but totally inefficient code. Thus there will (at least in more-or-less involved cases) be a need to transform the formal requirements into one of the appropriate equivalent forms first, and only then can automatic transformation into code (with the more trivial optimizations) be done. Again, the analyst's job is to provide strict specifications from the business point of view (and it is probably a bad analyst who obfuscates the spec with algorithmic optimizations), and the programmer's job is / will always be to translate them into code (which cannot be fully automated, due to the above). - mlvljr
There will always be inherent complexity in software that requires actual software developers. Plus: Programmers will always invent new things to do in software, and new ways to present it to the user. For example, a few years ago, clever designers and software developers invented touch gestures that replaced many gui elements on touchscreen devices. No business-analyst-driven process could have done that. - ammoQ
(1) @ammoQ: I'm not denying that there will be a need for operating systems developers, compiler developers and the like. But your touch gesture inventors wouldn't have made good payroll software if they spent all their time reinventing the mouse. Payroll software ought to be a solved problem by now, but there are still thousands of developers working on it. It doesn't need development, it needs deployment and customisation. - Graham Lee
Agreed. Many already-solved problems mostly just require deployment and customisation. But I do not think that we are close to a state where almost everything can be considered already-solved. I even doubt that such a state exists. - ammoQ
(1) @ammoQ: the payroll problem was solved around a century before programmers came and delivered buggy payroll software. - Graham Lee
@Graham: good point ;-) - ammoQ
(1) I disagree utterly. The main function of developers isn't typing the right number of parentheses. It's translating real-world problems into solutions precise enough that a computer can perform them. Until we solve the strong AI problem and make humans obsolete, there will be a need for developers. - William Pietri
(1) The more computers are able to do, the more we will want them to do. So until we hit the point that people can't think of anything new for a computer to do there will always be a need for programmers. - Dunk
@Graham - I think (and I hope) there will always be new things for developers to work on, but it would be great to think that the 'solved' problems could just be code blocks that only needed updating as new security issues were found, and that are transferable so devs don't have to reinvent the wheel and rediscover all the same issues everyone before them had to get through! - Rory Alsop
@Rory: one of the early goals of OOP was an 'object marketplace' where application developers would be integrators, plugging working modules together to support their users' workflows. The Eclipse RCP world almost works like that. - Graham Lee
[+4] [2010-11-12 00:44:25] Nick Hodges

Multiple Inheritance Considered Harmful

I can't imagine a quicker way to get seriously convoluted than multiple inheritance. I'm sure the comments will be along the lines of "sometimes it's really useful", but those are the comments that come when you say "GOTO Considered Harmful" and they are no more compelling today. :-)
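
For concreteness, the classic failure mode is the diamond: two parents override the same method inherited from a common base. A hedged sketch in Python, whose C3 method-resolution order picks a winner deterministically (C++ without virtual inheritance would reject the equivalent call as ambiguous); all class names here are made up for illustration:

```python
class Storage:                # common grandparent
    def save(self):
        return "generic save"

class DiskStorage(Storage):   # one parent overrides save()...
    def save(self):
        return "saved to disk"

class CloudStorage(Storage):  # ...and so does the other
    def save(self):
        return "saved to cloud"

class Document(DiskStorage, CloudStorage):  # the diamond
    pass

doc = Document()
# Which save() runs? The reader has to know the MRO to answer:
print(doc.save())                            # saved to disk
print([c.__name__ for c in Document.__mro__])
# ['Document', 'DiskStorage', 'CloudStorage', 'Storage', 'object']
```

The language resolves the ambiguity, but the reader still has to reconstruct the linearization to know which implementation a call site actually reaches, which is the convolution the answer is complaining about.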

Yeah, multiple implementation (implementing multiple interfaces) via aggregation is often the way to go instead. - mlvljr
Multiple inheritance seems to be catching up again. (e.g. traits in Scala, PHP 5.4; mixins in Ruby, D etc) - missingfaktor
(2) MI isn't a big problem - except for the 'diamond pattern' (which is somewhat analogous to the "implement 2 interfaces that contain the same name" problem) - what is a bigger problem is everyone that has never used MI complaining about how broken it is. - gbjbaanb
[+4] [2010-11-05 04:07:30] Tim Post

The subject of preprocessor abuse [1] is quite hot. It never really cooled down, but various development methodologies such as TDD have spawned some new and interesting 'tools' that continue to stoke an age-old argument.

Additionally, I've seen the argument cropping up even more in the development of other languages, such as PHP, that are themselves written in C.

Usually, a mile-long cryptic macro is rejected as being overly clever, ambiguous, or worse. When something like a test framework meets those criteria, but is intended to proliferate throughout an entire code base, it could be referred to as harmful.


[+3] [2010-11-12 01:41:33] Scott Whitlock

Model-View-ViewModel Considered Harmful

Don't get me wrong, I love the separation of concerns I get from MVVM, but has anyone noticed the large amount of duplication you get at the ViewModel level? Essentially any property on a domain object that you want to edit has to have at least one (probably more) corresponding properties on the ViewModel objects to allow for binding (with INotifyPropertyChanged and all that).
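
The duplication is easiest to see in code. The answer concerns WPF/C# and INotifyPropertyChanged, but the shape survives translation; here is a minimal Python sketch with a hypothetical Person model, where a plain callback list stands in for the PropertyChanged event:

```python
class Person:                       # domain object: plain data
    def __init__(self, name):
        self.name = name

class PersonViewModel:
    """Wraps a Person for binding. Note that every bindable field on the
    model must be re-declared here just to raise change notifications."""
    def __init__(self, person):
        self._person = person
        self._listeners = []

    def on_property_changed(self, callback):
        self._listeners.append(callback)

    @property
    def name(self):                 # duplicated property #1...
        return self._person.name

    @name.setter
    def name(self, value):          # ...plus its notification plumbing
        self._person.name = value
        for cb in self._listeners:
            cb("name")

events = []
vm = PersonViewModel(Person("Ada"))
vm.on_property_changed(events.append)
vm.name = "Grace"
print(vm.name, events)  # Grace ['name']
```

Every additional domain property repeats the getter/setter/notify triplet, which is the boilerplate the answer is objecting to.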

For many cases I think you may have a point about the duplication. It depends how closely your view model matches your domain model. I would say lack of native support for INotifyPropertyChanged in the language is more harmful at the moment. Things would be much easier if properties could provide a matching changed event, although I haven't thought about how this would work in practise. - Andy Lowry
@Andy Lowry: what you're describing is Dependency Properties, but it's frowned upon to use them in the ViewModel layer. What I've been working towards is a standard bucket of very general ViewModel objects, and then assembling them together at presentation time with a Presenter, and using the Presenter to pump data back and forth. You get the benefits of MVVM (testability of the ViewModel) but without the duplication of business-y logic. - Scott Whitlock
[+3] [2011-04-14 16:13:04] gbjbaanb

Frameworks considered harmful.

Of course there are loads of frameworks already, and they are all designed to help you do what the framework designer expects you to do, in the way they assume is best... but I think one day people will start to reject this idea in favour of more flexibility and less learning curve.

They say you call a library, but a framework calls you. I think frameworks will go away and be replaced with reusable libraries.

[+3] [2010-11-11 13:16:31] Larry Coleman

Functional Programming
I predict a backlash against functional programming, mostly because its integration into mainstream languages like C# will allow for its abuse by programmers who don't really understand it.

(1) +1, An interesting point. - mlvljr
[+3] [2010-11-09 04:27:15] Macneil

Non-software transactional memory considered harmful.

Currently, implementations of threading and concurrency (using locks) feel like low-level assembly programming. Software transactional memory [1] is an alternative synchronization mechanism that leads to code that is much easier to write and understand. Several new languages have it (like Fortress and Clojure).

I think this also has a nice parallel to the origins of your question: clear blocks of code that add structure on top of low-level mechanisms that are too powerful. "Locks considered harmful" is not quite as slam-dunk a case as goto was, but it's the closest I've seen.
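
The "low-level assembly" feel of lock-based code is easy to show. In this hedged Python sketch (stdlib threading only; the counts are arbitrary), every shared update must be manually bracketed with the lock, and forgetting the bracket anywhere silently corrupts the total; that bookkeeping is precisely what STM promises to absorb into an atomic block:

```python
import threading

counter = 0
lock = threading.Lock()

def deposit(times):
    global counter
    for _ in range(times):
        with lock:          # forget this *anywhere* and updates can race
            counter += 1

threads = [threading.Thread(target=deposit, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000
```

In an STM language the body would just be an atomic block; the runtime detects conflicting transactions and retries them, instead of the programmer threading a lock discipline through every call site.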


Thanks, that is something I hoped to get. - mlvljr
[+2] [2010-11-04 19:56:28] mojuba
  • Underscore naming, despite being the most readable naming convention in programming according to Bjarne Stroustrup. I'll miss it.

  • Tricks like connect() or die(), allowed in many dynamic languages, will be considered bad form

  • Assignment as expression while (a = get())

  • I wish this thing were considered harmful one day: if (0 == i); it's so anti-human and plain stupid despite all the justifications (which are just as stupid)
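
Two items on this list can be sketched in a few lines of Python; connect() and die() are made-up stand-ins for a real connection attempt and a fatal-exit helper, and the walrus operator := (Python 3.8+) is the closest Python analogue of assignment-as-expression:

```python
def connect(ok):
    return ok                 # stand-in for a real connection attempt

def die():
    raise SystemExit("could not connect")

# The `connect() or die()` trick: `or` short-circuits, so die() only
# runs when connect() returns a falsy value.
connect(True) or die()        # fine, die() is never evaluated
try:
    connect(False) or die()
except SystemExit as e:
    print(e)                  # could not connect

# Assignment as expression, the `while (a = get())` pattern:
it = iter([1, 2, 3, 0])
while (x := next(it)):        # loops until a falsy value appears
    print(x)                  # 1, then 2, then 3
```

Both idioms trade a line of explicit control flow for brevity, which is exactly why some consider them clever and others consider them bad form.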

"which are as stupid" -- thanks for just double (not triple) "s" here ;). And what's this connect() / die() trickery? - mlvljr
@mlvljr: it's a shorter way of saying if (!connect()) then die(), which works in languages like JavaScript, PHP, Python. Short-circuit boolean evaluation allows you to skip the die() part if connect() was successful, plus dynamic languages usually don't care about the return type of die(), which is going to die anyway :) P.S. also in Perl, the UNIX shell, etc - mojuba
decease() or die() -- isn't it funny? ;) - mlvljr
@mlvljr: code() or die() ? ;) - mojuba
I liked underscore naming until I started using camel-casing and realized how underscore naming is a huge distraction where camel-casing feels natural to read. When you get used to it your mind automatically puts a space before the uppercase character without your having to think about it. Even though I used underscore naming for years, my mind always locked in on those underscores instead of reading past them. - Dunk
[+2] [2010-11-04 01:35:35] n1ckp

Shared ownership

I think it is bad because you lose information that is pertinent to keep: namely, the owner of the object. Here is an attempt at an example:

Say you have a class with some members, say a list

class MyClass {
    std::list<SomeObject> list;  // MyClass owns the objects in this list
};

Now you pass a pointer to one of those objects to another class, say a GUI component that is updated for the user. Now, if you have to delete the list, say to recreate the objects because they are polymorphic or something, there will be a reference in the GUI to an object that is no longer valid. But, since you have shared ownership, that object will still look valid to the GUI object and be shown to the user, who will probably not notice that the change did not take place correctly.

Now if we look at the other extreme, raw pointers, you don't have that problem: when the list gets deleted, the program will crash when trying to show the object to the user, which is bad in production but, in my view, better than showing incorrect data.

Now, sometimes you might not care, say in the example of a function that does a calculation on the reference and returns the result: you might care only about the result of the calculation, and if someone deletes the pointer during the calculation it will crash, while the shared-ownership strategy will work, since you wanted the calculation done on the old data anyway.

The solution, I think, is to have a way to differentiate the usage (encode it in the type, if you will) depending on the situation. So, in the case of the GUI, you would maybe want the pointer in the GUI to be updated automatically with the new reference. In the case of the calculation, you would want a sort of copy-on-delete (well, keep-on-delete, that is, shared ownership). Now you could also have both situations with the same SomeObject reference, and that would complicate things a little, but I think it can be done.
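
The distinction being drawn (a reference that admits its target is gone versus one that silently serves stale data) can be sketched with Python's stdlib weakref. The class and values are made up, and the immediate collection relies on CPython's reference counting, so treat this as illustrative only:

```python
import weakref

class SomeObject:
    def __init__(self, value):
        self.value = value

# Shared ownership: the GUI's strong reference keeps the old object
# alive even after the owner replaces it; stale data, and no error.
owner = SomeObject("old")
gui_strong = owner            # GUI shares ownership
owner = SomeObject("new")     # owner moves on
print(gui_strong.value)       # old  <- silently wrong on screen

# Non-owning reference: the weakref reports that its target is gone,
# the moral equivalent of the raw pointer crashing instead of lying.
owner2 = SomeObject("old")
gui_weak = weakref.ref(owner2)
owner2 = SomeObject("new")    # last strong ref dropped, object collected
print(gui_weak())             # None -> the GUI can detect the deletion
```

Encoding the intended usage in the reference type, as the answer suggests, is exactly the choice between these two lines: share ownership when stale data is acceptable, hold a non-owning reference when it is not.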

so, if it's about some instances not being notified properly about changes in others, changing the latter ones' contract / interface would probably suffice (and would not require language changes)? - mlvljr
@mlvljr: it's about having a valid reference to invalid data. The point is not that it can't be fixed by changing the design; the point is that it will be hard to catch and possibly not caught by testing. Having "valid" invalid data really scares me, if you want my opinion. - n1ckp
(2) Basically, of course the design can be changed, but you need to catch the bug first. The problem is not shared ownership per se; like I said, it can be what you want sometimes. The problem is not caring how memory is managed. You don't need to delete it yourself, but you do need to tell the compiler how exactly it will be used, and that's what's missing from garbage-collected languages. With my solution you don't need to delete the memory yourself. The promise of garbage-collected languages is that you don't have to care, and that's what I think is harmful. - n1ckp
[+1] [2010-11-04 10:04:08] mlvljr

Not (fully) specifying exact code semantics / contract via

  • code itself implicitly (i.e. naming conventions with precisely documented meaning)
  • formal enough comments
  • (stand-alone) documentation

(and nobody speaks of useless / excessively lengthy documentation here, of course)

[+1] [2010-11-11 19:40:44] haylem

I actually don't think of any, but it might be because none of the stuff listed in other answers or even previous papers really seem "harmful" to me.

A gun looks harmful. A MOMD looks pretty harmful. Their intent is to be. A development process, tool, technique or language does not look harmful, because its intent has never been to be. They're only artifacts that morons use the wrong way or in the wrong situations.

I don't use goto in day-to-day programming, and I would indeed be breathing down the neck of any coder who used one in a project I work on, to force him to tell me how in hell that thing could have been useful there (and in some cases, it can be). And I will encourage the use of static checkers and linters to oppose the use of some constructs and patterns, but if a bypass is needed, I'll grant the exception with grace.

However, when an idiot manages to kill himself driving a fast car because he cannot drive, it's not the car that is harmful. It's the moron, and the people who did not regulate and enforce the regulations on the conditions of its use. Basically, if my company's projects get crippled by "harmful" constructs, it's not the constructs' fault: it's mine. They still have their uses in some situations.

Same with people who manage to burn themselves pouring coffee. I won't blame the kettle, the hot water, the coffee, the coffee maker, or Mother Nature. The only element of failure here that I can see is pretty obvious, and we cannot remove him or her either.

In the end I don't see any harmful constructs, tools, patterns, techniques, languages and processes. I see situations, and iterations of these items which tend to improve them.

Dismissing stuff as harmful, and contriving your mind into thinking about things in terms of what can be damaging: that's the harmful thing. It's a lack of open-mindedness.

[+1] [2010-11-11 01:08:20] Dave

Any current or past programming paradigm carried to the extreme.
I once worked on a program where the initial programmer refused to use GOTO, didn't comment his code, didn't limit his functions / subroutines to one functionality (or anything close to one functionality), and moreover created new code by copying old code instead of writing a function to handle the logic. The result was 3 million lines of code that were almost impossible to maintain.

+1, truly horrific, but what should that paradigm be called? A ball-of-mud one? ;) - mlvljr
it's called moderation in everything. I still use GOTO at times. - Dave
Heretic!! - mlvljr
[0] [2010-11-11 02:50:54] RCProgramming

I may sound crazy, but: C variants, lol. Back to BASIC, everyone :)

[-2] [2010-11-03 20:11:07] Peter Turner

Relational Databases considered harmful

Relational databases are too easy to abuse, screw up, and leave unmaintained, and they will drag your software into the grave. Programmers who interface with databases should eschew any notion that having a relational database packaged with or connecting to their software will somehow make it seem more modular.

A bucket of snakes is portable, a handful of snakes is not portable - a handful of snakes may kill. A handful of snakes in the wrong hands in a 10 story building will kill all the occupants of that 10 story building. A relational database is a handful of snakes, waiting. Watching and waiting.


Well, I just figured I'd go out a limb with that answer as a crazy/dreadful prediction.

You may not find this to be applicable to your situation, but it's clearly the truth regardless. If you have a 'relational' database but keep a plethora of multivalued attributes, and every field in your tables is indexed and/or a primary key, then you're only using the database as a stand-in for a big honking text file. Like a handful of snakes, this will wiggle out of your hands and bite you; not a bad analogy, if I do say so myself. An RDBMS, like a GOTO, is always one step away from being terribly abused. Simpler methods of storing and sorting data just plain old can't be abused - or at the very least can't be mistaken for what they are not.

And I never advocated NoSQL, I don't even really know what that means.


This may be of interest [1], as a better, snakeless argument against RDBMS for things they're not meant to be used for.


(12) That snake analogy is not only horrible, but makes no sense in context. - Malfist
(1) @Malfist, What kind of a programmer are you? - Peter Turner
(11) Since when were databases, let alone relational databases, watching and waiting to kill everyone in a 10 story building? - Malfist
(3) Also, "A handful of snakes in the wrong hands" wouldn't that be more than one handful if they were in hands? - Malfist
It's a metaphor, I'm not being judgemental, which kind of programmer are you, systems, windows, web, iphone app etc? I'd like to know what kind of programmers don't get my metaphors. - Peter Turner
(6) DROP snakes ON floor ;) - mlvljr
(1) Tried a No-SQL DB once. Hated it. Never did like that all my data is floating in space - TheLQ
(2) If the building only has three stories, is one snake enough or do I need two? Does the number of snakes required increase if there's a group of DBAs having a meeting? If I stick a snake in each ear, will it make me a prophet? - Peter Boughton
I'm a fan of NoSQL DBs and even I think you ran that one right off the rails at amazing speed. :o I have no idea from what you typed whether you have a good point or not. Perhaps work on how you communicate your ideas and try again? - HedgeMage
(6) -1 from me. The day I find out my bank has converted all their databases to NoSQL is the day I withdraw all my money from them. - Dean Harding
@Peter Boughton One large and hungry snake possibly will be enough.. - mlvljr
@Peter Turner: I have done too much programming around RDBMS and I hope they go the way of the dodo bird, but I still don't get your snake metaphor. - Matt Ellen
(3) @Peter Turner: Are you saying that technology is bad because people misuse it? - Kramii
@Malfist The Umbrella thing sure had some form of a DB inside it. And it was watching.. - mlvljr
(1) This is just creepy. - HLGEM
(2) I'm am so SICK of these muthaf*#&in snakes in this muthaf*#&in thread! - Matt DiTrolio
(4) "If you have a 'relational' database but, keep a plethora of multivariable attributes and every field in your tables is indexed and/or a primary key." - then you don't have a relational database. (You don't have a handful of snakes either, which is a pity because there's no pet shops nearby and I'm on the lookout for some.) If you try using a hammer as a saw, you'll probably get splinters and/or bruises, but that's your fault not the hammer's fault. - Peter Boughton
@Kramii, kind of. I'm saying it's bad because people use it and because it doesn't crash, they think it's OK to misuse it. - Peter Turner
(1) @Peter Boughton, I'd define a relational database as a thing that holds related tables which you can run Joins on. The problem is, you don't get bruised right away. You do, however contract leprosy and pass it on to your code's inheritors. - Peter Turner
(2) I prefer to use the standard definition of "relational database" instead of making up my own description of the term. Although I don't want to turn into a leopard so I guess I better stop doing that and start using XML for everything instead? - Peter Boughton
This is just mad, and forget about the snakes! - Reallyethical
@Peter, :-) At least with XML you get to watch yourself shooting yourself in the foot in a human-readable format (unless you embed XML in your XML in base64 like some dailyWTF'er did recently) - Peter Turner
local people seem to hate snakes ;) .. - mlvljr
Could have been worse... um... they could have used auto-incrementing numbers for tag names (saves space, see), and stored the real names inside a separate series of XML tags which were all encoded in Base80. :) Ah, and speaking of craziness... I see you've added a reference to ORM - certainly wont find me disagreeing there! - Peter Boughton
Dean Harding You should probably have left a small amount to leave the account alive just in case a million-buck UPDATE erroneously occurs ;) - mlvljr
"People don't know how to handle snakes so rather than train snake handlers KILL THE SNAKES" - Incognito
(3) +1. The snake panic is a little over the top. But in principle I agree. - back2dos
@user1525 (new SnakeHandler(this._snake_events)).start_handling() ? - mlvljr
(1) @back2dos "over the roof" you mean? (remember, it's a 10 story building ;) ) - mlvljr
Good thing you have a lot of rep 'cause you took a beaten on this one! - kirk.burleson
(1) @kirk actually I am making out pretty well: -17x2 + 10*10 = +66. - Peter Turner
So that's a valid strategy then ;) - mlvljr
+1 relational databases are an awful concept. SQL is a terrible language. - Sean Patrick Floyd
And now someone just compares non-SQL db's to a "handful of spiders"... - mlvljr
Why the negative votes? He's not saying RDBMSs actually are harmful, just that those idiot programmers who have a copy of NoSQL, or who consider an ORM to be 'the DB', will start to think this. I already see lots of people who consider SQL totally obsolete now that they have XML via LINQ. - gbjbaanb