For someone to be a good programmer, should they have a basic understanding of hardware and operating systems, or is it enough that they understand what they are doing within the specifics of their own application?
Should they be able to achieve general tasks in their own operating environment without having to rely upon someone else?
Should they show an interest in computers and computing in general?
Or is it purely good enough that they produce working code that solves the problems that they are looking to overcome?
I'm looking for language agnostic answers.
My PERSONAL opinion is that in general all programmers should have a basic understanding of what a computer is and how it works as well as being familiar with the operating system(s) that they are writing software for (if any).
As a bare minimum this shows that the person is interested in their field of work and is interested in reading around the subject rather than limiting themselves to a single area.
Obviously, a programmer that can't "achieve general tasks in their own operating environment" is about as useful as a carpenter who doesn't know how to use a hammer.
You can grow a tomato by following the directions on a seed packet. You'll grow a better tomato if you understand how soil, water, sun, and insects work together. Programming is no different.
This kind of knowledge is most useful not when making a program work, but when your program stops working.
Take a CRUD database admin page. The page does not run on the database machine (and you should know why that's a good idea), and last night the sysadmins moved the database machine to a new spot in the rack, and now the page doesn't work. If you don't have a clue about hardware or how networks work, where would you even start to diagnose the problem?
You would be right in saying it is not your program's fault, but in the real world, you will need to demonstrate that, and you will need to help get it working again if you are a great programmer.
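A first pass at that diagnosis can even be scripted. The sketch below (Python; the hostname and port are placeholders, not real infrastructure) separates "the name no longer resolves" from "the machine is unreachable" – exactly the distinction a network-aware programmer reaches for first:

```python
import socket

def check_database_reachable(host, port, timeout=3.0):
    """Return a human-readable diagnosis of basic TCP connectivity."""
    try:
        # Does the name still resolve after the machine was moved?
        socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP)
    except socket.gaierror:
        return "DNS lookup failed: does the hostname still point anywhere?"
    try:
        # Can we complete a TCP handshake at all?
        with socket.create_connection((host, port), timeout=timeout):
            return "TCP connect OK: the problem is above the network layer"
    except OSError:
        return "TCP connect failed: check IP address, cabling, firewall rules"

# "db.example.com" and 5432 are hypothetical; substitute your own host/port.
print(check_database_reachable("db.example.com", 5432))
```

Ten lines of sockets won't fix the rack move, but they tell you which team to call – which is the point of the story above.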
Yes, yes, and yes. Here is the reason why: Joel Spolsky's Law of Leaky Abstractions.
Of course, one can create applications without a good understanding of the underlying architecture. It is even good, generally, to have that knowledge abstracted away; sometimes, ignorance is helpful. But no matter how fancy the framework you are working on top of, the underlying architecture always finds a way to have an impact on your application. This goes both for making good decisions and for solving problems down the road. As time goes by and the further we get from the machine, the less true this becomes, but the framework that isolates you totally from the machine's inner workings has yet to be written. http://www.joelonsoftware.com/articles/LeakyAbstractions.html
I don't think it's possible to write efficient code without understanding what's going on under the hood.
I think it's also necessary to have an understanding of the OS and related things if they are going to be able to solve all the problems that they face.
You say you want language-agnostic answers, but I really think it depends on which language you use. For example, if you're working in very low-level languages, understanding the computer hardware is very important, if not required. However, the higher-level your language is, the less you HAVE to know. Indeed, you can get away with knowing little to nothing about computer architecture, provided that your language is of a high enough level.
This may get a lot of disagreement, but I believe that (depending upon the language you use) you can be fairly ignorant about the majority of the guts of a system and still be a good programmer (although it is not recommended in the least). With languages like C you have to be very mindful of what is going on in the background to be successful and it requires a lot of management on your side.
On the other hand, languages like VB have always been designed from the point of view of allowing Joe Blow to write his/her own programs. Whether that's a good thing or not shouldn't be up for debate here. With things like more natural-language syntax, garbage collection (also found in many other languages), and other "helping hands" features, it is entirely possible for someone with a limited amount of under-the-hood experience to develop complex and powerful programs.
Heck, as another example, look at Microsoft Access. I can't tell you the number of times I've seen a client's secretary show off their current system, which generally revolves around a giant Access database. Although these aren't what a "great" programmer would consider top notch, they are big enough and complex enough to have handled most of the company's needs up to that point.
a basic understanding of hardware and the operating systems
Let's use the classic reductio ad absurdum argument. Think of a programmer who uses a very high-level language and lacks basic knowledge of the OS and hardware. For example, take a mathematician who uses a functional language to perform a long numerical calculation recursively. Another fine example would be a physicist trying to do some heavy number-crunching using MATLAB.
For the sake of the argument, let's assume that the OS and language abstractions actually work: no crashes, no blue screens of death, and no kernel panics. Moreover, the programmer never runs into array-index errors, stack overflows, or other pitfalls; those are matters of good programming, not of the OS.
Such a programmer can't choose the right architecture for running his program. Can the compiler exploit dual- and quad-core processors? Can the OS support parallel execution? If the program is slow, what would be more efficient: buying more memory, using a faster CPU, or rewriting the program to run across several computers in parallel (exchanging information over TCP sockets rather than on flash drives)?
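To make the parallel-execution point concrete, here is a minimal Python sketch (the sum-of-squares workload is arbitrary, chosen purely for illustration) of splitting a computation across one worker process per core – the kind of restructuring a programmer can only weigh sensibly if they know how many cores exist and what the OS does with them:

```python
import multiprocessing as mp
import os

def partial_sum(bounds):
    """Sum of squares over a half-open range, run in a worker process."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n):
    """Split [0, n) into one chunk per CPU core and sum the pieces."""
    cores = os.cpu_count() or 1
    step = n // cores + 1
    chunks = [(lo, min(lo + step, n)) for lo in range(0, n, step)]
    with mp.Pool(cores) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    n = 200_000
    assert parallel_sum_of_squares(n) == sum(i * i for i in range(n))
```

Whether the parallel version actually wins depends on core count, process start-up cost, and chunk size – precisely the hardware-and-OS questions raised above.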
There are more trivial examples: if writing to a log file terminates with an error, we might be using the wrong file system. Trying to split the result into many small files causes another bizarre crash, and some files just can't be opened.
To conclude: a programmer without basic OS and hardware knowledge can produce some good code in an extremely sterile environment. But the program is very unlikely to use the computer's resources efficiently, and as it grows more and more complex, unexplained crashes become inevitable. http://en.wikipedia.org/wiki/Reductio%5Fad%5Fabsurdum
To quote Reverend Lovejoy:
...ooooh short answer yes with an if, long answer no with a but...
And that's a pretty big 'but' (no pun intended).
Yes you need to understand how computers work (at least at a high level), as well as operating systems, compilers, boolean algebra and a bunch of other things. There's only so far you can go with "monkey see, monkey do".
There's part of the answer that doesn't depend on requirements.
A good programmer is curious about things, and wants to learn. You can become adequate for many purposes without a drive to greater understanding, but I don't think you can become good.
Since a programmer deals with computers a lot, a good programmer will be curious as to how all of this works. A good programmer will have the desire and ability to learn more or less what's going on under the hood.
The idea that this makes one a better programmer may be partly rationalization, but a good programmer will get the deeper understanding anyway.
There's a Turing-like test here. Say you have a programming problem and post it on here somewhere. A day later, someone posts a complete program as a reply that solves your problem correctly and efficiently.
You could ask, "how can I tell, based on this blind situation where I know nothing about the author, whether they're a good programmer or not?" But really the test is implicitly telling you, "there is effectively no difference between a 'good programmer' and 'a person that produces good programs'." The distinction starts to be something subjective that by its very nature can be whatever you want it to be.
Personally I think it's unlikely that someone would become a good programmer without a fairly decent all-round knowledge, simply because producing enough 'good programs' would require wider and wider domain knowledge. If I was trying to advise someone on how to become a good programmer, I would be telling them to seek out such information. But that's not the same as saying that a person without that knowledge is not and cannot be 'a good programmer'. The loose definition of 'good' and 'programmer' are wide enough to cover a range of competencies from the incredibly specialised to the incredibly versatile.
For someone to be a good programmer should they have a basic understanding of hardware and the operating systems or is it enough that they understand what they are doing inside the specifics of their own application?
In short, yes. You can be a programmer and not know what the actual machine is doing, but you will be limited in your ability to write highly efficient code. "Good programmer" is a subjective term: you can write code that works and is commented properly, etc., but you won't be able to pull a rabbit out of a hat when it is needed. For about 90% of the time, I would say, you won't really need to know what is going on and will get by fine; but it is that other 10% that separates the best from the rest.
Abstraction is great, but let's not forget that computation needs to be done on computers. I liked the carpenter analogy (I don't have the rep to comment, which is why I didn't reply directly to that answer), but I think that hardware is more like the wood.
Your software is the tool that shapes the wood into something useful. A carpenter who doesn't know how to use his tools is not a carpenter at all. However, a carpenter can get by without understanding the details of wood. You can still build things. They just won't look nearly as good as things built by a carpenter who does understand wood. You will be more likely to break stuff and less likely to tackle projects that require an intimate understanding of the properties of wood.
Likewise... a programmer can get by without understanding hardware. Will you be a good programmer? Maybe. You can compensate with other skills. Are you going to be building systems that push the limits of what a computer can do? No.
To put it simply: It is absolutely essential.
Just spend a few minutes around a programmer who doesn't know basic hardware architecture and principles (CPU, memory management, OS's, etc.), and you'll find out very quickly just how essential it is.
I very much doubt that "producing working code that solves the problems they are looking to overcome" could hold true for a programmer for very long without that knowledge – at least when speaking of more advanced techniques that go beyond "Hello World".
Short answer is no.
In my opinion, a good programmer is someone who has the skill to fire up the development environment (not necessarily set up by themselves) and write good enough code to solve the problem at hand in a short enough time. In the real world, success is measured by how much bread you bring home, not by how many people follow you on Twitter. Nobody really cares about your specific skills as much as you want to believe, as long as you give off the warm and fuzzy feeling of success. Considering the level of "mediocre" in programming as a job, "being good" here is what would be considered mediocre in other fields.
On the other hand, being great is a wholly different thing, and is most often what is meant by "good". It requires near-complete understanding of your platform, so you can create a complete solution which is also efficient and maintainable. It requires constant learning and experimentation. It requires being aware of your own shortcomings and remedying them in a prioritized fashion. For example: if you mainly work on web applications, you will eventually have to learn how your runtime interacts with the web server on each platform you need to deploy to, now or in the near future. You won't immediately need to know how to land patches on your platform, but a great programmer, faced with seemingly correct code that misbehaves, can recognize a platform bug and may even fix it themselves.
As good as abstraction, garbage collection, and the like are these days, IMO you still need to know what is going on under the hood for any program that is non-trivial.
Using .NET as an example... while it does a great job of hiding complexity from the programmer, the programmer can still get into trouble by not knowing what the abstractions are doing underneath.
Even if you only use your "under the hood" knowledge 5% of the time; that knowledge will usually be what actually makes your application run correctly and with acceptable performance.
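A language-agnostic illustration of that 5% (sketched here in Python rather than .NET): inserting at the front of an array-backed list looks like a harmless one-liner, but under the hood it shifts every existing element, while a deque was built for exactly this. Knowing which data structure the runtime actually gives you is precisely the under-the-hood knowledge in question:

```python
from collections import deque
import timeit

def queue_with_list(n):
    q = []
    for i in range(n):
        q.insert(0, i)   # shifts every existing element: O(n) per insert
    return q

def queue_with_deque(n):
    q = deque()
    for i in range(n):
        q.appendleft(i)  # O(1): deque stores blocks linked at both ends
    return q

n = 20_000
# Both versions produce identical contents; only the hidden cost differs.
assert queue_with_list(n) == list(queue_with_deque(n))
slow = timeit.timeit(lambda: queue_with_list(n), number=1)
fast = timeit.timeit(lambda: queue_with_deque(n), number=1)
print(f"list front-insert: {slow:.4f}s, deque appendleft: {fast:.4f}s")
```

The results are indistinguishable from the outside; the application that is mysteriously slow at scale is where the abstraction leaks.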
I think you also need to start looking beyond the computer as well: there are too many developers with a deep understanding of the hardware and their chosen programming language(s) who are nonetheless unable to grasp the needs of the end user/business.
For me, to really be a 'good' programmer, you have to also be able to look at and sensibly question the requirements you are given.
Now I'm biased, not coming from formal computer science discipline (I was mechanical engineering), but I've learnt what I need to know, when I need to know it. I agree with some of the comments above that an ability to go out and learn the inner workings as you need to is important. With what I do day to day, I don't need to know this stuff and I largely forget what I learn in between those rare times that I use it.
I would call myself a Good PHP programmer and there are still things that baffle me about the inner workings of PHP. (after following the PHP internals mailing list)
But I don't need to know. I push button (code) and receive bacon (get results).
Would knowing the inner guts of PHP make me a better programmer? Yes.
Do I need to know? No
As someone who has taught myself one language (Python) and is learning another (C++) in a classroom environment, I have found that the answer to that question depends on a few things.
So no, you don't need any of the above-mentioned skills to be a programmer. But if you want to be a software developer or engineer, you'll need everything mentioned above.
I think it helps to understand the fundamentals. You should learn how to do something the hard way so that you appreciate doing it the easy way and understand exactly how the easy way is helping you. So, it's good to learn C/C++ so that you know how Java helps in abstracting away from the OS. It's good to learn how compilers work so that you know why programming in a higher-level language is helpful. It's good to learn AND/OR gates so you have a basic understanding of transistors.
That being said, I never learned what exactly the process was to go from hardware to software (hmm, analogous to abiogenesis?). I understand logic gates, but don't understand how you tell the hardware to send electrons to the right spot. I also think that networking is this big black magical mystery. I know how to open sockets, but I don't know what goes on beneath the covers. Has this hindered me in being a good/great software engineer? Not that I can tell.
I think you can be a good programmer in certain scopes without in-depth knowledge of operating systems. For example, I believe you can become a good (not great) SAP or JDEdwards programmer without knowing the operating-system details. These systems are set up within a particular framework that shields you from them. I've personally worked with highly paid SAP developers who didn't understand very basic programming concepts. I don't agree with it, but they made a good living.
However, if you ever have dreams of designing applications/modules from the ground up, it is important to know the details. I would throw in server technologies as equally important, if not more so.
To a point it helps, but you can be an effective programmer without this in some domains. Understanding the x86 memory model might not help me develop a standard .NET web application, though it might help me optimize it. As we have brought more abstraction layers into programming, the need to understand what goes on under the hood has become significantly less.
How many of you can write in pure machine code? Not many, I would guess. With that said, you should know the OS you're working on and the platform you're building against, but more important than knowing it is knowing where to learn about it when you need it.
Having a basic understanding shows that a person is interested in computers; if they have no such knowledge, why would they be interested in programming them? Does a person need detailed knowledge of the low levels? Well, that depends on what they are doing.
How good of a programmer?
If you're just putting together API puzzle pieces in VB.NET calling an API here or importing a library there and writing a CRUD database admin page, no, you don't need to know about computers. There is certainly a lot of value in constructing queries and writing data driven apps and quickly deploying these types of solutions.
If you want to be a great programmer, you must understand memory, the stack (I suppose the new processors take care of buffer overruns now...), communication protocols, or whatever else might be related to your project. The better your understanding of what is under the hood, the better performing and more secure your high-level applications will be. These considerations are coming into play yet again as processors go multicore and devices become extremely portable with small amounts of available memory. Of course, helpful concurrency APIs and garbage collectors will come out in time (and already have) for mobile devices, but the fact remains that the fundamentals don't disappear--they just get abstracted.
That's not even mentioning computing and algorithms--why one sort/search algorithm is better than another--or at what point further optimization isn't possible.
I'd say that it depends on what you're doing, and how much damage you are able to do to your project/coworkers/company by doing something you're not qualified for.
As an example, a coworker told me this morning of a project where the database architect was a very skilled SQL-hacker, who was known to be able to drum out very clever queries. The problem was that that was all she knew, and she'd had no education in information theory or database design, so the end result was a database so complex and spaghettied apart that there was barely even a hope of ever bringing it to any reasonable level of normalization. Her not knowing how databases actually function screwed up that project (amongst other things).
I personally appreciate working with programmers who have a formal education, because a lot of abstract (and not so abstract) concepts regarding performance, memory, and design require an inherent understanding of the computer as a giant calculator. Otherwise you'll end up with programs full of architectural mistakes and misused data structures and algorithms.
I think of it in terms of levels: how much you really need to know depends upon the type of programming you do (cf. Joel's categorization in Five Worlds).
If you're writing enterprise software, you probably don't need to know hardware minutiae, as that's all abstracted away for you by an application framework. All you need to know is how to write SQL queries and code in the language your app provides; you don't have to worry about registers, spinning off multiple threads, or reading/writing to I/O ports. Instead, you need to care about load balancing, data organization, and other such things.
If you're writing embedded software for a microcontroller that would go in, say, the ECU of a car, you surely must know about registers, I/O ports, memory management, etc. But you won't care about GUI design, making pretty pictures, or outputting data to a printer. http://www.joelonsoftware.com/articles/FiveWorlds.html
Though I agree with the statement that you can still be a decent high-level programmer with limited low-level knowledge, you are still at a loss. While Java and .NET don't exactly require great knowledge of what's under the hood, these languages and IDEs were designed by people who have it, and their syntax is an evolution of languages (like C) where you do need to know. So I would argue that even with high-level languages, knowing the history of computing and what's under the hood is always valuable.
It's kinda like how you can be a great rock guitarist without ever studying classical music, but having a year or two of classical music theory will still give you a leg up - even though rock and classical are distinct in many ways.
Summary: Know your context.
I think that it is essential to get a feel for what's going on.
One of the best ways to start is to read Charles Petzold's excellent book Code (sanitised Amazon link).
From the Amazon review:
The real value of Code is in its explanation of technologies that have been obscured for years behind fancy user interfaces and programming environments, which, in the name of rapid application development, insulate the programmer from the machine.
There are plenty of programmers out there who have no idea how to do things with operating systems or the internet, yet they're hired as programmers and work with languages like C and C++. You can also find SEO people who know HTML/XML without knowing much about operating systems or other computer-related matters. Sometimes people just want to get the job done, so they learn copy-paste fixes, apply them like band-aids, and earn their living. In such cases, all the rules, concepts, and techniques of a good programmer fade away.
So, in short, it is possible to see people programming in some language without knowing about the OS or networking (it depends on the projects they have had in life so far and the way they educated themselves to learn programming).
I don't know if it is essential to being a good programmer, but it is definitely essential to being a very good one.
The body of knowledge related to programming is vast; you don't have to master every aspect. Just know that knowing something will probably make you better, even if only by a little.
For example, consider the knowledge that some CPUs can compute a CRC32 with a single instruction (the CRC32 instruction on SSE4.2 CPUs).
You can write fast programs without knowing that, but probably not as fast as it can be... http://en.wikipedia.org/wiki/SSE4
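As a hedged sketch of that gap (in Python; note that the SSE4.2 instruction actually computes CRC-32C with the Castagnoli polynomial, while zlib uses the classic IEEE polynomial, so this only illustrates the software side of the spectrum): a bit-at-a-time CRC-32 in pure Python versus the optimized C implementation behind zlib. Both produce the same checksum; they just get there at wildly different speeds.

```python
import zlib

# IEEE 802.3 polynomial, bit-reflected, as used by zlib/gzip/PNG.
POLY = 0xEDB88320

def crc32_pure(data: bytes) -> int:
    """Bit-at-a-time CRC-32, purely in Python, for illustration only."""
    crc = 0xFFFFFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            # If the low bit is set, shift and XOR in the polynomial.
            crc = (crc >> 1) ^ (POLY & -(crc & 1))
    return crc ^ 0xFFFFFFFF

data = b"hello, world"
assert crc32_pure(data) == zlib.crc32(data)
```

The person who knows the table-driven and hardware-assisted variants exist can pick the right one; the person who doesn't ships the slow loop.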
I feel it really depends on what you're doing.
For example, some guy who writes WebForms for financial software really doesn't need to know much about how a hard drive works, other than how to write to it. He doesn't need to understand networking to the core, but rather how HTTP works.
We can't be experts at everything and that's why people build frameworks and tools for other programmers to use. They took the time to learn the computer on a deeper level in a specific area so you don't have to.
(Personally, I hardly know anything about networking - blue cord goes in the back and magic begins.)
Actually, it depends on what you define as the job of a programmer... Technically, a programmer is just someone who reads some specifications and converts it manually to source code. (Why? Because we haven't automated this process yet!)
However, many programmers aren't just converting a design into code, they are also creating the design. And although a good knowledge of computers is helpful while designing an application, it is not a requirement! But such a programmer would be more like a software designer, not programmer.
And as a software designer, you shouldn't be limited by the limitations of computers, because those limitations might change fast compared to the time it takes to turn an idea into a working application. If everyone stayed stuck within the limitations of current computers, there would be almost no new developments.
Basically, the more you know about computers, the more you know about their limitations. And this knowledge will unconsciously influence you when designing something new, leading you to think something is impossible simply because you know the current limitations won't allow it. Programmers without this knowledge might fail to write "Hello, World." in code, but they won't be limited by their knowledge either, and thus could come up with some grand ideas that open up a lot of new techniques. (Like the Internet, which was just some wild idea for connecting several computers together in a single network...)
I'd have to say the canonical example of a programmer who doesn't understand computers is the one who stores currency amounts in binary floating point.
We see that here all the time: people wondering why their currency-handling code is incorrect.
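A minimal Python sketch of that canonical mistake – nothing here is specific to any one language, since binary floating point behaves this way everywhere:

```python
from decimal import Decimal

# Binary floating point cannot represent 0.1 exactly, so pennies leak:
assert 0.1 + 0.2 != 0.3   # the float sum is actually slightly above 0.3

# A programmer who knows how floats are stored reaches for decimals...
assert Decimal("0.10") * 3 == Decimal("0.30")

# ...or for integer cents, which are exact by construction:
assert 10 + 10 + 10 == 30
```

The fix is trivial once you know why it happens; without that knowledge, the bug looks like black magic.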
"Good" in this case is a relative term. Generally, someone who has a thorough understanding of a given problem and the tools available will come up with a "better" solution than someone who doesn't.
This is not to say that the solution offered by the one with less experience/knowledge would inherently be without value. In fact, a good deal of the time, the lesser solution is quite frankly "good enough". And if the seasoned professional was not bothered with such a "trivial" problem, then everybody wins.
The best-case scenario for everyone is the one in which you find a field where you, personally, have the most motivation and wherewithal to solve the most pressing problems. These problems will be non-trivial by definition--if they're easily solvable by anyone, they don't remain problems.
It would depend upon your definition of a good programmer.
If you are looking for a good programmer for device drivers, low-level programming, assembly, or C/C++, then yes, absolutely.
If you are looking for a good php / python / html / css / etc. high level language programmer, then it is not as important.
Some languages and problems absolutely require more in-depth knowledge of the hardware and software you are deployed upon, and others couldn't care less.
Of course it's important!
You should also have that kind of knowledge just based on the fact that you're interested in programming, not just because it's important.
The more you know, the better equipped you'll be to tackle the more challenging problems when they come up.
As they say: "To a man with a hammer every problem looks like a nail."
Check out this site if you need to brush up on your basics:
Another view on this is the difference between "good" and "great".
To go from good to great requires significant experience and mastery of one's craft, and at least some understanding of ALL aspects of computer programming, including the physical hardware aspects (all computers have some sort of physical representation, at least at the time of this writing). Thus, I believe that to become a great programmer, it is essential to understand computers.
But to simply be good, to be effective in your own software-programming domain and make good, useful programs, no. And it's not a bad thing. My hope is that everyone would strive to be great, though. :)
I would say that as long as a developer has a thorough understanding of the framework that they develop under, fairly decent code can be written.
I did a lot of coding in C back in the day, but I feel that the abstractions provided by .NET are pretty good. For C, I have written a ton of "#pragma inline". For C#, bring on the easy-to-use combo boxes and the abstracted data-access layer.
A top notch application is one that fulfills all the needs of the consumer, not one that has all the bells and whistles or is the fastest of them all. I knew how the 286 and the 386 worked (they forced that down our collective throats in Engg). But now, to program in C#, I do not know the architecture of a dual-core processor. Unless I run out of interesting things to read on the backs of cereal boxes, I'm not going to pick up a book on hardware architecture.
Same thing in SQL - I can write decent code without knowing exactly how the pages work. But when I want exceptional performance, I have to know how to use my hints, joins, and tuning.
So I guess, I will still read up on the software architecture. That's as far deep as I need to go.