I Simply Do Not Understand C Supremacist Assholes.

- -

I really don't. Not really talking about anyone in particular here, but I find the crowd who seems to think you're a pansy if you don't code in C, C++, or assembly language to be goddamned annoying. What is with these people? Seriously. I wanna know.

There's nothing particularly superior about C. I don't really feel like I have any significant advantage when I have to double-check all my shit just to make sure that I don't have any memory leaks. I don't consider having to stay away from recursion because it's too expensive to be an advantage, either, not when my compiled Haskell code actually runs faster than C code that is, conceptually, identical, minus the optimizations Haskell's syntax makes possible. For example, I wrote a 20-or-so-line program in Haskell to get a solution to Project Euler problem 204. I know very few of my, shall we say, selective readership can read Haskell, but trust me that the program does something absolutely crazy absurd: it enumerates the set of Hamming integers, in order, by merging together an infinite list of infinite lists. It does all this, accurately, in about four seconds. In the process, it barely even sips the tiniest bit of RAM.

Here's why C sucks and will always suck no matter how much assholes like that try to spit-shine it into non-shittiness. It may have required learning a bizarre language like Haskell to do it, but that program was, conceptually, very easy for me to create. I am by my nature more mathematically than engineering-inclined, so imagining how I would weave together a list of lists of Hamming integers (each list built by multiplying the list itself, as it's being generated, by a different number) was about as hard, and took about as much skill, as writing a very complicated C program. But all the working out of bugs that I did, I could do on a whiteboard, in pure math. When it compiled successfully and my program looked like the equations I'd laid out on my whiteboard, I knew that when I ran it I would get the right answer. Being able to conceptualize programming tasks as manipulations of pure mathematical constructs is extremely powerful.
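If you're curious, the core of the technique looks something like this. To be clear, this is a rough sketch of the approach I'm describing, not my actual solution, and the little prime sieve at the top is just the dumbest thing that works:

```haskell
-- Sketch only: generalized Hamming numbers over a fixed set of primes,
-- generated in ascending order by merging one infinite list per prime
-- back into the very list being built.

primesUpTo :: Integer -> [Integer]
primesUpTo n = sieve [2 .. n]
  where
    sieve []     = []
    sieve (p:xs) = p : sieve [x | x <- xs, x `mod` p /= 0]

hamming :: [Integer] -> [Integer]
hamming primes = hs
  where
    -- hs is defined in terms of itself: every Hamming number times every
    -- prime is another Hamming number, and merging keeps the output sorted
    -- with duplicates collapsed.
    hs = 1 : foldr1 merge [map (p *) hs | p <- primes]
    merge xs@(x:xt) ys@(y:yt)
      | x < y     = x : merge xt ys
      | x > y     = y : merge xs yt
      | otherwise = x : merge xt yt
    merge xs ys   = xs ++ ys  -- unreachable for infinite lists; keeps GHC quiet

-- Project Euler 204 wants the count of Hamming numbers of type 100
-- (no prime factor over 100) that don't exceed 10^9.
main :: IO ()
main = print (length (takeWhile (<= 10 ^ 9) (hamming (primesUpTo 100))))
```

The whole trick is that `hs` appears on both sides of its own definition; laziness sorts out the evaluation order, and `takeWhile` means only as much of the infinite structure as we actually need ever gets built.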

In Ruby, my other love as far as languages go, the clarity and expressivity of my code are the attraction. In a pure object-oriented environment, data are things with life and purpose. I love it because once you get over the nuances of actually-pure object-oriented programming, as opposed to the filthy kludge of C++, you realize that it too is an extremely powerful thing. The ability to manipulate the syntax and use things like metaclasses makes it easy to create easily-understood domain-specific languages. Additionally, since everything is an object, functions are objects too, and it's possible to have things like functors. Thinking in Ruby is just as much of a challenge as thinking in C is. But my point here is that I don't think having intuitively understandable syntax, combined with a runtime system that does all the grunt-work for you rather than expecting you to handle it yourself, is a disadvantage. I think not having those things makes C only really useful for low-level systems programming, where that kind of hand-holding can become a limitation. In ordinary applications programming, and in particular the programming of dynamic, resource-driven web sites, the particulars of memory management aren't nearly as important as making sure the thing works.

And that right there is the weird part. The only argument seems to be, "In C you get to allocate your own memory and use pointers!" Who fucking cares? Pointers annoyed me in college even after I understood them, and I never appreciated having to pre-allocate the space for arrays and stuff like that. I'd much rather have arrays, tuples, and lists be separate things; I'll tell the compiler what I want done with them, and the compiler or the runtime system or both will figure out which memory should be allocated when. My code is not worth more to me or to my clients if I spend half the time I work on it chasing down memory leaks or writing code that handles memory. Now that I'm no longer in college, I'm all grown up, and I've been a professional programmer for over five years, I don't need to dirty my hands with memory management unless something goes wrong with it; it's okay if I want to abstract myself away from the bare metal a bit and spend more time on my algorithms and on optimization than on debugging. You, C programmers: think real hard, right now, about how many times a day your little finger hits that semicolon key. How many semicolons has your little finger had to type in the past month? The past year? (For those of y'all who are not C programmers, the answer is likely to be about the same as the number of lines of code they've typed.)

Really, this guy just seems bummed out that the younger crowd isn't into the same things he was (and still is), but why should we be? Was his generation into punch cards and COBOL or ALGOL 68 like their parents were? Probably not; they learned C instead. I agree that good programmers are the ones who understand what happens at the CPU level, and that C and C++ are great systems languages; they're great for learning all the little lessons you've gotta learn on the way to becoming a great programmer. Sometime in the future, it will be important for you to know why a switch statement is preferable to a long series of if statements. But that certainly does not mean C is the best language ever, and it definitely doesn't mean it's appropriate, or even remotely advisable, for any given task. For example: I challenge anyone out there, right here, right now, who thinks they can write a C program that accomplishes what my little Haskell program up there does, with the same algorithm, in the same amount of time, memory, and hard drive space. My program depends on laziness, but it could theoretically be written in a strict, applicative-order style, and since both languages are Turing complete it must be possible in both. I've seen how long a Y-combinator written in C is. I don't think anyone's gonna be able to do it unless their C code is the exact same C code that GHC generated from mine, and that would be so cheap. I know the program doesn't look useful (all it does is count an obscure kind of integer), but being able to do such a thing is a tremendous advantage for a programmer such as myself.
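(And just to illustrate the laziness point, and to contrast with those page-long C Y-combinators: here's a throwaway example, nothing to do with the Hamming program, of the fixed-point trick in Haskell, where `fix` is basically one line.)

```haskell
import Data.Function (fix)

-- Throwaway sketch: factorial with no explicit recursion, using fix as the
-- fixed-point combinator, where fix f = f (fix f). The knot-tying that a
-- C version has to spell out at length is a single lazy definition here.
factorial :: Integer -> Integer
factorial = fix (\recur n -> if n <= 1 then 1 else n * recur (n - 1))

main :: IO ()
main = print (factorial 20)  -- 2432902008176640000
```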

I dunno. I shouldn't get so bent out of shape about it, but the article just rubbed me the wrong way, and the only way to deal with a blog post that bums you out is to write one that bums other people out. The dude complains at length about, I guess, some kind of a problem he has with Mac users who like to use pretty text editors with nice typography. I'm really not sure I understand what's so wrong about that... like I should strain my eyes to appease him or something? It's pretty weird. Anyways, space cadets, I'm calling this one: the dude just isn't a skilled enough programmer to learn Haskell or Ruby, so he tries to act like C is somehow not a terrible language. BOOM.