With all the new "modern" languages out today, how is it that C is still heralded as the fastest and "closest to the machine"? I don't really believe in there ever being only one correct way to do things, and C has been around for a really long time (since the early 70's!). Have we really not come up with anything better than something designed nearly 40 years ago?
I am aware that modern languages are higher-level and take care of certain tasks like garbage collection and memory allocation and utilize libraries and such. I'm just asking why there has never been a true second option to C.
Can it be that C is so perfect that no other way of operating a computer could be possible (developer-adoption aside)?
EDIT Look, I'm not trying to knock C or whatever your favorite language is. I'm wondering why C has become the standard and why other alternatives never emerged and C was just "accepted".
C is a very simple language, and it's because of this, along with its longevity, that it's fast and well optimized. It's also extraordinarily widely supported, particularly in embedded environments, on microprocessors, etc.
It's hard to beat a really simple and fast language. The only thing to improve upon a language like that is usability: decrease the time it takes to make similar, generic code, and make it easier to model with abstractions.
This is where C++ comes in. C++ can be just as fast as C. The catch is that C++ is a much more complex language, which can greatly increase productivity, as long as people know how to use it. C++ and C are hardly the same language anymore.
Now, D was another step up. Same ability for fast code, optional garbage collection, etc., but it never caught on. Hopefully that changes, because it drops what plagues C++: backwards compatibility with C.
So to answer your question, "better" is a hard thing to judge. In terms of simplicity and speed, C is probably close to the best we could do. In terms of productivity versus simplicity, C++ is probably the best we could do, though that opinion varies much more. Lastly, in terms of a fleshed-out and cleaned-up language with the speed and simplicity of C, D wins in this category.
There are faster languages than C.
For example, Fortran, as already mentioned, does very well because it has much stricter aliasing rules.
There are also experimental assembly-like languages which attack C on the front where it is used as a high-level assembler, for example in compiler construction. Ever heard of C-- or Janus? But those two were killed by the LLVM project.
I would bet that APL or other mathematical languages will blow C out of the water in their special application domains, as they have built-in support for vector processing units. This is something that is not possible in C (and guys: NO! Specially optimized libraries with C linkage have nothing to do with C as a language).
Also, CPU producers have removed all the features that helped compiler writers for other languages - remember the tagged arithmetic instructions for making LISP implementations on SPARC fast? Gone with the wind.
And if you step away from micro benchmarks to application development, then there are faster languages for application development. My personal example here is always SmartEiffel. It targets C but uses global system optimization, which makes it faster than C in real-world application development.
In this domain even a single wrong or low-level abstraction can kill the whole language's performance. Because C does not offer high abstractions, most people say it is a programming problem, but it is not. For example, look at the lack of generics. In C you end up with slow implementations like the "qsort" library function, which could be written an order of magnitude faster with generics (where the function call for key comparisons is eliminated).
Just compare a qsort call on a megabyte array of ints with a good hand-written implementation that uses direct array access and the built-in '<' operator.
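As a rough sketch of that comparison (the array size, pivot choice and timing harness below are only illustrative, not a rigorous benchmark):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Generic interface: qsort calls back through a function pointer
       for every single comparison. */
    static int cmp_int(const void *a, const void *b)
    {
        int x = *(const int *)a, y = *(const int *)b;
        return (x > y) - (x < y);
    }

    /* Type-specific quicksort over a[lo..hi] (inclusive): the comparison is
       the built-in '<', which the compiler can inline -- no call per element. */
    static void sort_ints(int *a, size_t lo, size_t hi)
    {
        if (lo >= hi)
            return;
        int pivot = a[lo + (hi - lo) / 2];
        size_t i = lo, j = hi;
        while (i <= j) {
            while (a[i] < pivot) i++;
            while (a[j] > pivot) j--;
            if (i <= j) {
                int t = a[i]; a[i] = a[j]; a[j] = t;
                i++;
                if (j == 0) break;   /* guard against size_t underflow */
                j--;
            }
        }
        if (j > lo) sort_ints(a, lo, j);
        if (i < hi) sort_ints(a, i, hi);
    }

    #define N (1u << 18)   /* 256K ints, about a megabyte on typical platforms */

    int main(void)
    {
        static int a[N], b[N];
        srand(42);
        for (size_t i = 0; i < N; i++)
            a[i] = b[i] = rand();

        clock_t t0 = clock();
        qsort(a, N, sizeof a[0], cmp_int);
        clock_t t1 = clock();
        sort_ints(b, 0, N - 1);
        clock_t t2 = clock();

        printf("qsort:     %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);
        printf("sort_ints: %.3f s\n", (double)(t2 - t1) / CLOCKS_PER_SEC);
        return 0;
    }

The point is not the particular sort, but that the generic version pays an indirect function call per comparison while the specialized one lets the compiler inline the '<'.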
C has had restrict since C99 (10 years ago!). - Pavel Minaev
Good question. I think languages succeed by finding a niche. It's important to note that there are plenty of newer languages that are better than C in their niches.
C was once widely used as an application language, and in that domain it has steadily lost ground to C++, Java, and recently all sorts of other languages (notably the dynamic languages).
C used to be a language for writing server code. The Web pushed an amazing variety of languages into that space--Perl, Java, Python, VBScript, VB.NET, Ruby, C#--and cases where C makes any kind of sense for server code are now few and far between.
C has been used for scientific computing, but it faces competition from domain-specific languages like Matlab and Mathematica, as well as libraries like SciPy [1]. A lot of people who write code in this niche are not coders by trade and C is not a great fit for them.
But C's niche is system code. Operating system kernels. Drivers. Run-time libraries. It is so established in that space that even C++ is displacing it rather slowly.
C won back in the 1970s because of UNIX, because the competing languages were either too restrictive or too slow, and because C code was considered reasonably portable (lies, even then). But its biggest advantages today are unrelated, and stem mainly from decades of dominating its niche. There are good tools for C: optimizing compilers, kernel debuggers, effective static analysis to find bugs in driver code, etc. Almost every major platform defines a C ABI, and often it's the lingua franca for libraries. There's a pool of programmers who know how to code C--and who know what C's problems and pitfalls are.
Long-term, this niche isn't going away; and C has some problems. But it would still be extremely hard for any newcomer to compete.
[1] http://www.scipy.org/

Fortran is faster than C for numerical tasks because of the way it handles memory references (C pointers are more difficult to optimize). The heavyweight numeric libraries at the base of things like Matlab and Numpy are still written in Fortran.
On the other hand, C++ can be just as fast as C, but has many more advanced programming features. It's a much newer language, from the mid-'80s.
Doesn't a restrict pointer have the same aliasing semantics as Fortran? And what else is there that makes any difference? - Pavel Minaev
Fortran also effectively gets you -ffast-math semantics, but it's not obvious that that's a good thing. - Stephen Canon
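To make the aliasing point concrete, here is a minimal sketch; whether the compiler actually keeps values in registers or vectorizes either loop depends on the compiler and flags:

    #include <stddef.h>

    /* Without restrict, the compiler must assume that writing through 'out'
       could modify what 'a' or 'b' point to, so it has to be conservative
       about reloading values and about vectorizing the loop. */
    void add_may_alias(float *out, const float *a, const float *b, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            out[i] = a[i] + b[i];
    }

    /* With C99 'restrict', the caller promises the three arrays do not
       overlap, which gives the compiler roughly the freedom Fortran has
       by default for its array arguments. */
    void add_no_alias(float *restrict out, const float *restrict a,
                      const float *restrict b, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            out[i] = a[i] + b[i];
    }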
In my opinion, C is a basic language. It can manipulate bits, like assembler! On the other hand, C has good structure, such as "if...else...", "switch", "for" and so on, and you can even create your own functions.
If a language is primitive (close to the hardware) and is also easy to use and not complicated (no gaudy packages), then it becomes popular.
There isn't a best language, only a fitting one.
Paraphrasing a very good comment: There are not many different ways to make a language fast and "close to the machine" - C did it well, and there is hardly any room to improve upon that.
Original answer:
Fast to execute or fast to write stuff in?
Languages are not fast or slow to execute; specific implementations are. A language can only be considered faster than others when it somehow makes it easier to produce a fast implementation. Invariably, that means "close to the machine". But with machines getting faster exponentially, that has become progressively less interesting over time. Instead, ease and speed of development and portability have become much more important, so "better" has come to mean "away from the machine". Pretty much all effort in language design has gone in that direction for the last five decades.
So there you are: closer to the machine and faster languages than C exist; they're those that came before C: Assembler, Fortran. Probably some forgotten ones.
Check out Go [1]: I think Google is thinking the same way you are.
C has been around for a long time because it is a decent language and it just so happens it is used on a lot of projects, therefore it lives as long as those projects do.
[1] http://golang.org/

What the heck, I'll chime in with my $0.02.
In many instances there is a real or perceived difference between "systems" languages and higher-level languages. I'll ignore most "higher-level" languages, since nobody (or at least not many people) will dispute that, for many tasks, languages like Python, Ruby, etc. are simpler to work in.
C was designed to be a systems language, meaning it was designed as the language in which the Unix operating system was written. As such, it was designed to be simple, powerful, and fast. A simple language gains power by means that non-systems-programmers often consider dangerous: pointers, manual memory management, etc. As has already been mentioned, C is quite simple. K&R is the smallest book on my programming shelf by far (not counting O'Reilly Pocket References) and it's only marginally "bigger" than my Ruby Pocket Reference. C is quite powerful. If you need to talk to hardware, or manually check and twiddle memory, C has the capability.
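For instance, a tiny sketch of what "talking to hardware" looks like in C - the register address and bit layout below are made up purely for illustration; real values come from a chip's datasheet:

    #include <stdint.h>

    /* Hypothetical memory-mapped GPIO register, for illustration only. */
    #define GPIO_OUT_REG  ((volatile uint32_t *)0x40020014u)
    #define LED_PIN       (1u << 5)

    void led_on(void)  { *GPIO_OUT_REG |=  LED_PIN; }  /* set bit 5   */
    void led_off(void) { *GPIO_OUT_REG &= ~LED_PIN; }  /* clear bit 5 */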
From a programmer's perspective, however, C is not so simple. Speed and power come at the price of manual memory management and not much OOP support built into the language. C++ (not my favorite language) is much simpler from a programmer's perspective, but much less simple from a compiler's perspective. Objective-C (possibly my favorite language) has the same tradeoff, with a slight lean in the direction of keeping the language simple (garbage collection is a newcomer to Objective-C, for example). But since the computing world as many of us know it was written in C, it's difficult for newer, more complicated but "easier" languages to gain widespread adoption.
In some cases, especially when the current "standard" is as "good enough" as C is, there's simply not a lot of incentive for something "better" (C++, Objective-C, D, etc.) to gain traction, even when there is enough incentive to create something "better" in the first place.
As others have asked: have you completely forgotten about C++? Also, lately I have been experimenting with D. Have a look at it. [1]
[1] http://en.wikipedia.org/wiki/D%5F%28programming%5Flanguage%29

C intentionally models a CPU very closely. Hence you can create very tight code which maps very well to the target CPU.
Just because things were thought of a long time ago does not mean the thoughts are outdated, if the basic premises are the same. So do not try to build a better mousetrap.
Instructions like fcos are rarely used, since C library functions often override their behavior. In this case, for performance. - Pestilence
Computers have gotten faster and programmers have gotten lazier. So not many people bother making something faster than C, or coding in C.
Your choice of tool is going to depend on the task at hand. I would not write a kernel in C++, just as I would shy away from writing a Monte Carlo simulation in C.
I think it would be extremely difficult to justify a single 'uni-language' that boasted the features of C/C++/Fortran/CAML/OCaml and others selectively, depending upon compilation. It's more common sense than 'consensus thinking' to just use what exists and best fits the need.
There is also the issue of the language meaning the 'same' thing to vastly different groups of people, which is one of the reasons you only see C and Assembly in the Linux kernel.
Usually, the 'right and correct' tool for the job also implies accepted and immutable versions of a single specification and standard. While new languages like Go do promise awesome flexibility with great performance, it's going to be a while before you see them used on Wall Street or in a Boeing passenger jet.
Pascal is about as fast as C. Missing libraries aside, there is hardly anything you can do in C but not in Pascal.
I'd say that C mirrors how the CPU works pretty well. Not that each and every machine instruction is mirrored, but the basic features are there: you have functions and a stack. There is also the possibility of adding machine instructions with the asm keyword in C. Together these things make it a good starting point for writing an operating system, and once that has been done you have a base set of functions in libraries to use when creating programs to run on the system. From these base libraries one can create standardized libraries such as POSIX.
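As a small illustration of that asm escape hatch - note that inline assembly is a compiler extension rather than part of standard C; this uses the GCC/Clang "extended asm" syntax and an x86-64-specific instruction:

    #include <stdint.h>

    /* Reads the CPU's time-stamp counter; rdtsc leaves the result in EDX:EAX. */
    static inline uint64_t read_tsc(void)
    {
        uint32_t lo, hi;
        __asm__ volatile ("rdtsc" : "=a"(lo), "=d"(hi));
        return ((uint64_t)hi << 32) | lo;
    }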
As an example one can look at Linux. Linux is the kernel, the operating system that makes a system tick. On top of this there are many layers of utility libraries making use of the features available in the kernel. On a Linux system this often comes from the GNU project.
As a result of all this, all other languages still need to be interfaced to the system. Using C, you are closer to these features.
As an example, if you really want a fast HTTP server, there is at least one built into the kernel (http://www.fenrus.demon.nl). I can't imagine trying to do that in any language other than C.
As an example of another system not suited to the C language, I can mention the old Xerox Lisp machines, which even had the basic Lisp features implemented in the CPU. See the Wikipedia info [1]. But that looks more like an experiment [2].
[1] http://en.wikipedia.org/wiki/Lisp%5Fmachine

The fundamental reason here is that C is generally good enough, and is well established.
There are a whole lot of things I'd change about C if I were to do it over. The switch statement is awkward and very error-prone. It's very difficult to parse, compared to modern languages. The string libraries are ill-designed. The operator precedence is badly done. On the other hand, people and compilers have adapted to these, and therefore C is generally very usable.
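A tiny example of the switch pitfall: forget a break and execution silently falls through into the next case.

    #include <stdio.h>

    void describe(int n)
    {
        switch (n) {
        case 0:
            printf("zero\n");
            /* no break here: control falls through into the next case */
        case 1:
            printf("one\n");
            break;
        default:
            printf("something else\n");
            break;
        }
    }

    /* describe(0) prints both "zero" and "one" -- rarely what was intended. */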
There are also millions, perhaps billions, of lines of basic system code written in C. Changing to another language would mean moving away from all that. There are real advantages to maintaining a codebase in one language, and real disadvantages to rewriting immense amounts of real-world code.
For that reason, I don't expect to see a real replacement for a long time.
I don't think C would have caught on so well, had it not been for UNIX. C is not the most efficient thing to parse & compile. It's awful in that respect, actually.
Not everyone jumped on the C bandwagon. Apple, for instance, started Macintosh OS in Pascal. In fact, there was a bit of Pascal in early versions of Windows, IIRC.
I really think the battle between the two could have gone either way. Both derive from ALGOL, which is the source of the whole structured programming revolution (loops, if statements, etc.)
Up until the mid 90's, Pascal still had a strong following. Lots of commercial apps continue to be written in it, but more for legacy reasons than practicality. Now that universities have stopped teaching it, it's destined for obscurity.
A program written in C is only as fast as the compiler makes it. One of the reasons that C is still around is a historic one - there is a lot of momentum in the brainpower invested in C compiler design.
But then again just about the only thing you could do to make C better is to add garbage collection. This is certainly not acceptable when working close to the metal and needing full control over your memory lifetime and your threads.
On some level you are right; there could be things that would make C suck less - e.g. not having to write typedef struct, and better variable scoping - but those are pretty much in the syntactic sugar category.
You don't have to typedef struct anymore. - drhirsch
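For reference, a small sketch of the boilerplate being discussed:

    /* Without a typedef, every use of the type needs the 'struct' keyword: */
    struct point { int x, y; };
    struct point origin1 = { 0, 0 };

    /* The usual C idiom wraps the declaration in a typedef to avoid that: */
    typedef struct { int x, y; } Point;
    Point origin2 = { 0, 0 };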
C++ is pretty much a superset of C - you can write a C program and pass it through a C++ compiler to get the same code. However, if you use all the features of C++ then it does become a much more complex language and you move further away from the metal, with a corresponding loss of performance, but it is possible to use a selective subset of C++ to make it an enhanced C. Part of the attraction of C++ is that it gives that choice to the programmer.
Others have mentioned D. There is also Google Go [1] which was announced a few weeks ago, which they are touting as a modern alternative to C. I have not tried it so cannot comment on it, but the feature set looks interesting especially for concurrent programs - this is an area where both C and C++ are lacking.
[1] http://golang.org/

There is a lighter (even closer to the CPU) implementation of C, called "Sphinx C--" (C minus minus).
Here's some text quoted from the Sphinx C-- site:
Sphinx C-- is a programming language that combines C and assembly language. It runs at the DOS commandline and can generate applications, library files and drivers for 16-bit and 32-bit DOS, 32-bit MS Windows, and MenuetOS, and probably more. In this page I introduce C-- and how it can be used for MenuetOS and Windows.
Why write apps in C--? In a nutshell, C-- has low-level features, placing it somewhere between C and assembly. Sometimes a high-level language such as C can be very frustrating when you want to write code that accesses the underlying hardware architecture, or utilises the architecture for optimum efficiency. For example, I have used a C compiler called MACC to write applications for MenuetOS, and to call a software-interrupt routine, passing parameters via registers, involves first pushing the parameters onto the stack, calling an inline asm routine which transfers them to registers then calls the interrupt -- which is very inefficient. Often a C compiler places frustrating restrictions on in-line assembly code -- for example, MACC will only allow asm code outside a C function. C-- releases you from the straight-jacket, into a world of unfettered coding.
See http://www.goosee.com/cmm/
Everyone's answers (for the most part) are correct, so I'll just try expanding on them.
The closer you get to the machine, the more room for optimization you have. If you are programming in C (vs C++), you are using what would be called a lower-level language and therefore would be closer to machine language. In the same manner, you could program everything in assembler and make micro-optimizations while you develop, with small (very very small) enhancements to already simple functions.
Then again, if you build the machine that uses the assembly language, you have even more room for improvements, which all higher level languages (at this point everything) can benefit from by some transitive property.
I have programmed in assembly languages and could easily argue that they are "better" than C at their functions, even though creating simple programs with them could take forever and then some. The advantage that each new iteration of language brings about is reduced development time, and perhaps some execution speed. If you want a faster program, look toward the hardware.
Without repeating what has already been said about C's advantages...
As long as there are still millions of lines of legacy code out there that need to be used, C will remain. I do believe many languages can perform with satisfactory speed - and face it, probably not even half of us really need to squeeze out every bit of speed we can - especially with today's technology, where even memory is not much of a concern for most of us.
C just happened to be available (and good) back then.
One reason that C is still popular is library development. Many programming languages allow you to link to C libraries and call C functions. It's easy to add this feature because C's object file format is pretty simple and C data types and functions are pretty easy to model.
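As a sketch, a trivial C "library" like the one below can be loaded from Python (ctypes), Ruby, Lua, Java and many other languages through their foreign-function interfaces, because they all speak the platform's C ABI. The file name and build command here are just examples and vary by platform:

    /* mylib.c -- a trivial library.
       Typical (platform-dependent) build: cc -shared -fPIC -o libmylib.so mylib.c
       The exported symbol and its simple int arguments are easy for any
       language's FFI to describe and call. */
    int add(int a, int b)
    {
        return a + b;
    }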
It's a chicken-and-egg situation. There are many C libraries, so new languages support calling C functions. Because many languages do this, new libraries are more likely to be written in C so that they can be used from many languages.
C can do that. Other languages are for abstraction. They are better at it. But you can do everything using C. Use it if you like. I use it when I need it, and I don't ask why.
C has a long history and is widely supported everywhere.
But I think you could try a concurrent programming language like Erlang [1], which also runs fast for parallel computing.
[1] http://www.erlang.org/about.html

Closest is microcode. You may want to research VHDL [1], since supervisors won't like microcode. The most interesting new thing I have tried is vvvv [2].
[1] http://www.vhdl-online.de

Assembler is the fastest language. There is no language except asm that could be faster than C, because it is not needed. What would you do with another C? A language that is exactly like C but with a different syntax? If you want speed at all costs, you use asm. If you want slightly more comfortable programming, you take C. If you need complex constructs, classes, OOP and fast development, you take C++, C#, Delphi, etc. There is no other language as fast as C, simply because there is C.
The languages you use depend on what you want to do, but also on their acceptance. Java is way simpler for programming some things, but it's way slower because it uses the JVM and, especially, because it's higher level, so you don't have the same control over the way your hardware is used. Assembly is faster than C, but C is way easier to use :-P. The amount of control you have is pretty important.
In AI, Lisp is very popular, but it is still struggling to be as fast as other languages. However, its syntax is well suited to AI work. C++ became popular in games because it is object-oriented, is (for now) faster than Java and C#, and the most popular game engines are written in it.
With the advent of OO programming languages, I think people forgot about, or stopped trying to create, competition for the lower-level languages. That's also because there's nothing really wrong with C. Of course C has some disadvantages, especially its use of pointers, which can cause memory faults that are hard to test for and debug. Some recent projects aim at solving C's problems, but I think they're not popular because C doesn't have that many problems :-P and also because there's already a large number of applications that were developed in C.
C has the power and flexibility of assembly language, while still offering the clarity and maintainability of assembly language.
This is from memory... The history of C came about during the early 1970s, when Ken Thompson and Dennis Ritchie were bored playing a game (was it Lander? for the PDP-10). At the time, Bell Labs wanted to bring the timesharing system across to the newer PDP system, called MULTICS, which was written in machine code. Dennis and Ken played about with different language concepts and coined the first C program by borrowing semantic bits from the BCPL language (incidentally, that is where the notion of *p++ and *--p came from: the actual machine instruction to do the equivalent of the pointer increment...).
So they used the new language (it underwent a few iterations: BCPL -> B -> C) to develop the system that became Unix - hence the close ties and history between the two. This was in 1970; the K&R C language became official around 1972, before it got ANSI'd and ISO'd in the late 1980s.
Hence it is said that C is close to the metal: the compiler took care of incrementing and decrementing pointers (byte = 1, int = 2, and so on), and the job of porting and embellishing MULTICS into UNIX took less time because it was written in a language that was close to assembler.
Furthermore, this would explain why there is a Unix epoch time, 1/1/1970: date/time manipulations were based on adding a positive offset (in seconds, if my memory serves me correctly) to the epoch time to get the relevant date/time, passing that number as a parameter.
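A minimal sketch of what that looks like with the standard library - the current time is carried around as seconds since the 1970 epoch and expanded into a calendar date on demand:

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t now = time(NULL);        /* seconds since 1970-01-01 00:00:00 UTC */
        struct tm *utc = gmtime(&now);  /* expand the offset into a calendar date */

        printf("%lld seconds since the epoch,\n", (long long)now);
        printf("which is %04d-%02d-%02d\n",
               utc->tm_year + 1900, utc->tm_mon + 1, utc->tm_mday);
        return 0;
    }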
Because it got ANSI'd and ISO'd, the popularity of the language took off and sky-rocketed: it was portable and could run on many platforms.
DOS, OS/2, IBM's DB2, Borland's Sidekick, Lotus 123, QEMM, Duke Nukum, Commander Keen, GW-BASIC to name but a few, were all written in C.
Hope this helps, Best regards, Tom.
Wow, am I going to be the one sticking up for C? Really? Look, I'm a Pascal nut from way back, so I don't like C: it lets you do too many dangerous things and too many obscure things, and you can't clear your throat without a pointer. But even I know that all of that means it is perfectly situated as a high-level assembler. Nothing else comes close. Period. So your problem isn't that you want something better than C, it's that you don't want C.