C Is The New Assembly

February 14th, 2007

John Gruber was recently featured for a second time on Dan Benjamin’s instantly-awesome Hivelogic Podcast.

I’m glad that Jesper heard and was intrigued by some of the same comments I was, because he reminded me that I wanted to comment on a subtle but worth-repeating impression that I took away.

The comments are those pertaining to the introduction of scripting language bridges as standard developer resources in Leopard OS X 10.5. Gruber points out that for developers, the presence of these scripting language bindings will both open up the platform to a wider audience of developers, and enable selective use of scripting languages where the highest performance is not necessary.

He suggests that a typical developer will write everything in Ruby or Python, and then do performance testing. Anything that needs a speed-up can be redone in Objective-C. You know, that “slow” dynamic variant of C :)
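
The workflow he’s describing can be made concrete with a quick sketch. This is plain Python with invented function names, nothing Leopard-specific: write it all at the high level, profile, and let the numbers tell you which one routine earns a C (or Objective-C) rewrite.

```python
# Hypothetical app code: most of it is perfectly fine in a scripting language.
import cProfile
import pstats

def render_thumbnails(paths):
    # The bulk of a typical app: rarely hot, pleasant to write in Python.
    return [p.upper() for p in paths]

def checksum(data):
    # A tight inner loop: the kind of routine the profiler will finger,
    # and the only part worth redoing in C later.
    total = 0
    for byte in data:
        total = (total + byte) % 65521
    return total

profiler = cProfile.Profile()
profiler.enable()
checksum(bytes(range(256)) * 200)
render_thumbnails(["a.png", "b.png"])
profiler.disable()

# Rank by cumulative time; the hot spot floats to the top.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```

If checksum() dominates the listing, then that one function, not the whole app, is the candidate for the Objective-C treatment.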

This analysis is foreboding, because it’s exactly what programmers have always done when they switched to a higher-level language. 10 years ago a programmer would have been more likely to “switch to assembly” for a much-needed performance boost. Has it come to this? Are we moving to a higher plane? If you’re like me, you’ve probably become quite comfortable in your Objective-C universe, and somewhat dismissive of the scripting languages as they begin to encroach on our holy ground. But we run the risk of being like those losers who still insist on programming everything in assembly, when the higher-level code would be just as fast and easier to maintain.

Is C the new assembly?

84 Responses to “C Is The New Assembly”

  1. Jesper Says:

    I am reminded of the old chestnut that C is “a language that combines all the elegance and power of assembly language with all the readability and maintainability of assembly language”.

    The trend is out there and staring us in the face – C# and Java and scripting languages are taking over, step by step, but there’s a big difference when compared to Assembler.

    C has libraries.

  2. Chris Says:

    C was the new assembly in 1985, or whenever it was that Stroustrup first released CFront (the original C++ front-end, which compiled to C rather than assembly language).

    If you look at Objective-C 2.0, I think it’s pretty clear that the features were largely motivated by the greater expressiveness/simpler syntax possible in higher-level languages. You can draw direct lines, in particular, to Ruby’s accessor syntax and to Python’s iteration syntax. (These features can be seen in the public Apple branch of the GCC SVN repository — I’m not revealing NDA secrets.) So you’re gonna get higher-level syntax whether you like it or not. :)

  3. Ryan Ballantyne Says:

    I’m not like you, having never learned Objective-C or Cocoa but always wanting to.

    For me, this is great news. I’ve been meaning to learn Ruby for web development, and if I could also write Mac programs in Ruby, it would mean killing two birds with one stone.

    I hate working in low-level languages. In my book, the less low-level code I need to write, the happier I’ll be.

  4. Ben Says:

    C/C++ went the way of assembly years ago for app development. Unless you’re writing an operating system, OS components, or some other embedded system, you’d be foolish to start writing code for a new project today in C/C++ on any major platform (Win, Mac). Obj-C has some features (dynamic typing) that elevate it above C/C++ and bring it closer to C#/Java, but it’s still no Ruby in terms of sheer expressiveness. (And frankly, I’m sick of Java – its ship never really came in, and C# is now a better language.)

    The future of application development is in ever-higher levels of expressiveness, generally egged on by user interface paradigms and enabled by ever-faster machines.

  5. Jonathan Wight Says:

    I’m using pure C in anger for the first time in about 8 years. I’m designing a cross-platform library that will be used from C++, Objective-C and possibly other languages. C’s real problems are quickly becoming evident (again): the standard libraries suck. It is 2007 – where is the support for UTF-8 and standard containers/algorithms? C99 added a lot to the language but very little to the standard libraries. Can you imagine how sucky Objective-C would be if NSStrings were ASCII-only and you didn’t have your NSDictionaries and NSArrays…

    Of course there are Open Source libraries out there, but then suddenly your simple library has a whole host of dependencies that you need to track, and you’re struggling to find the right library with the right license that works on all the required platforms. I’m not asking for a framework as rich as Cocoa to be added to C – heaven forbid. But the addition of some standard containers/algorithms would be great. Our idea of what a language needs to be useful has changed a lot since K&R’s day.

    C++ was improved dramatically by the addition of the STL. I’d love to see some kind of similar improvement made to the C standard libraries. I doubt it will ever happen, but without it C will always be glorified assembly.

  6. Rick Says:

    The fundamental difference between C and assembly code is portability. You’ve got to rewrite your assembler speedups for every architecture. C needs to be recompiled for each, but the code ports.

  7. Chris Ryland Says:

    As a newly-restarting developer in Obj-C (been using C for several decades now ;-), I’ve been very tempted by Python w/PyObjC (which, rumor has it, will fully integrate with Obj-C object management in Leopard, and not be just auto-bridged).

    But, in the end, Obj-C is “poor man’s Smalltalk”, and has its own logic and rhythm. Trying to use a lot of Cocoa from Python becomes painful, because you give up the arg/value interleaving in method invocation that makes Obj-C code read like prose.

  8. Ben Says:

    @Jonathan: I absolutely agree with your assessment of the situation, but I’m not sure why it’s interesting to improve straight C. That’s what C++ with STL and Obj-C/Cocoa etc are for…

    Although you seem to be one of them, the number of cases where you’re developing a library in C that doesn’t *already* have a large environment of support functionality built for it is vanishingly small. An example is a device driver for some platform X: it’s in straight C, but the kernel will expose a bunch of helper utilities for dealing with data structures you’re likely to want to use, because the kernel has the same problem all over the place, so it uses some version of libc plus other environment-specific utility routines that have accreted over time.

    No one will ever develop, say, UTF8 additions to the portable C library, because there’s really just no demand for that these days in a cross-platform-portable way.

  9. Jonathan Wight Says:

    Ben,

    You’re right, C won’t ever be extended in the ways discussed.

    And I might well end up using C++ (w/STL) inside the library – but hiding the C++ inside a C interface (for various reasons). But ideally I shouldn’t have to. C is a good enough language for my purposes but I am let down by a lack of usefulness in the standard library.

    My original comment was aimed at Jesper I think, who pointed out that one difference between C and asm is that at least C has stdlibs. Unfortunately the C stdlibs are kind of threadbare. Which is why (as you point out) we have C++, ObjC, Python, etc.

  10. Stephan Cleaves Says:

    Personally I don’t see this as the case. Higher-level languages may be making some things possible, but they’re not invading the application space yet. And I wonder how likely people would be to sign up to use a Ruby-Cocoa bridge given what Apple has done with the Java-Cocoa support. What I do see on Mac OS X is a move away from Carbon on Apple’s part. I believe they are going to continue to push developers to embrace Cocoa and that new APIs will generally be more Cocoa-like.

    However, I will say that if we see things shift toward higher-level languages it will likely be because people are less concerned about a consistent user experience. I think Apple is at least partly to blame because of their own practice of creating appealing UI elements but not opening them to their developer base. If you’re going to have to write your own to provide an experience people are used to, then why not go beyond that? I think this can be (and has been) taken too far, and I still have hope that Leopard will bring us a new refined Aqua or an outright Aqua replacement that will be appealing enough to developers so that we converge on look and feel again.

    I’m quite happy with Objective-C and if I were going to move to a higher level language I’d rather it be a language like Io, but I have no experience with Ruby or some of the other high level languages, unless you count Java and we know where that’s going on OS X :)

  11. Jesper Says:

    Jonathan: When I said libraries, I didn’t mean the standard libraries. I meant two things: a) that C has libraries *at all* (Assembler is a bunch of fixed instructions and that’s it), AND b) that because of a), there’s a ton of libraries everywhere. There’s a library for everything. If you have the necessary mental bent, you can keep writing almost anything in C because of libraries. Many of these libraries form foundations of the current implementation of most programming languages.

    Libraries and reusable code not only made it easier to switch from Assembler to C once upon a time, but they’re going to make it harder to switch from C to whatever else now. C is just the lingua franca of programming, and moving away from it – which, trust me, I still think we’re doing – is going to take an awful lot longer because of it.

  12. Martin Pilkington Says:

    I doubt that the majority of Cocoa apps will end up being written in Ruby or Python. I feel that it may end up being similar to Java. Officially supporting the bridges will attract programmers who use those languages, allowing them to write Cocoa apps without Obj-C. But they’ll gradually look more and more at Obj-C and switch. If anything, Obj-C 2.0 will be what gets more people to look at Cocoa. I’ve got a friend who’s mad about Ruby, yet he’s slowly learning Obj-C because Obj-C 2.0 interests him. The addition of garbage collection has allowed those who are used to languages like Java, Ruby, Python etc. to take Obj-C “seriously”.

  13. Scott Stevenson Says:

    Cocoa with Ruby isn’t new, and PyObjC has been around even longer. It’s great that these are standard now, but I don’t think they’ll replace Objective-C as much as augment it. In fact, with garbage collection and properties, the gap has narrowed quite a bit.

    Personally, I think Objective-C is much easier to read and write. Ruby code may be shorter, but my overriding goal is clarity more than brevity. The Ruby culture is to write really, really short (often ambiguous) variable and method names. This makes sense given Ruby’s heritage, but I just don’t think it fits the Cocoa landscape.

    Cocoa is built for Objective-C, and Rails is built for Ruby. In both cases, it’s apparent that the relationship between the language and the framework makes the overall environment better. Although I love Ruby as a language, I don’t think you can just superimpose it on top of Cocoa and forget about Objective-C.

    I also think the compiler has some major advantages beyond just speed:
    http://theocacao.com/document.page/387

    Objective-C also has the major advantage of being able to reach into C, C++ and practically anything else very easily. Could you (enjoyably) write Xcode, Delicious Library, or TextMate in Ruby? I don’t think so, at least not yet.

    I could be wrong, though. We’ll see what happens. How do the bridges work for the frameworks outside of AppKit, Foundation and Core Data? Core Image, for example?

  14. Mark Grimes Says:

    Asm ftw! How else are you gonna exploit and crack applications? J/K… or not. From an implementation perspective, I think Moore’s law has proven you don’t need to write 200x more asm against 400x more clock time to generate what you can in C in a few minutes, and in less time with a scripting language. Systems get faster, hard drives and memory get more capacity – there’s no sense in not standing on the shoulders of giants to curb a few clock cycles when the interface you are playing with probably has asm optimizations under the hood written by people smarter than me.

    The script bridges also aren’t uniform. E.g. RubyCocoa and RubyObjC have different performance metrics when crossing the bridge. The former requires implementation constraints that dictate not crossing the bridge too often. But alas, I agree with Scott: the popularity of these languages, along with the abandonment of the Java bridge, will get people new to OS X on board until they discover how much more versatile ObjC is.

    Then what? You package the app and the interpreter as a part of an application bundle? Ewww.

    You also rely on Apple (in terms of their in-house adoption) to maintain binary compatibility, which has been a notable issue with the Python interpreter (backwards compatibility).

    Then tack on the version skews at the end of an OS X release cycle:

    Apple: ruby 1.8.2 (2004-12-25) [universal-darwin8.0]
    MacPorts: ruby 1.8.5 (2006-12-25 patchlevel 12) [i686-darwin8.8.2]
    Apple: Python 2.3.5
    MacPorts: Python 2.4.3

    So yes, while they may pledge to support their shipping version of the interpreter, your 3rd party modules may very well require a newer release as time marches on. So even if they figure out a way to not require your app to embed the interpreter, you’re STILL stuck embedding it if the above case applies, no?

    It’s certainly nice that the capability is there, but I think it’s a long way off from being a bandaid for opening the door to more than simply ObjC.

  15. Jon Hendry Says:

    Jonathan wrote: “Can you imagine how sucky Objective-C would be if NSStrings were ASCII only and you didn’t have your NSDictionaries and NSArrays…”

    Why yes, I remember those days…

    During NeXT’s first bloom of popularity, in the black hardware days, there were no NSStrings or NSNumbers, no NSDictionary or NSArray. (There were List and HashTable, but they weren’t as nice.) Nor were there autorelease pools.

    That’s when it was adopted by big investment banks, and most of the commercial NeXTStep software came out (if I recall correctly).

    It wasn’t that bad. And it was vastly better than anything else. And ASCII-only (plus RTF) wasn’t so bad, because it was before Unicode was released, or contemporary to it, before it was widely adopted.

    The NSStrings, NSDictionaries, autorelease pools, and Unicode didn’t get in there until around 1995.

  16. Jonathan Wight Says:

    Jon, you forgot to prefix your posting with “When I were a lad…” ;-)

  17. Joe Goh Says:

    Back when I was still prototyping my app (FunkeeStory), I did the first prototype in a mixture of PyObjC and plain C. After a month or so, I quickly realised that many of the features that I’d come to depend on in Python are also sorta available in Cocoa using Objective-C.

    For me, the primary reason why Python is more productive than many other languages that I’ve worked with is due to its awesome libraries and its language support for dynamic data structures.

    However, after a month of hacking away in Cocoa using Objective-C, I found that the Cocoa libraries are just as awesome, and although Objective-C doesn’t have native language-level support for dynamic data structures like Python, I don’t find that it’s a major hindrance at all. I do miss slicing from time to time, but it’s not a deal-breaker.

    Here are the reasons why I (re)wrote my app in Objective-C rather than Python.

    1) It’s much easier to get help when you’re stuck on the edge-case scenarios in Cocoa. Most mailing lists for Cocoa cater to the Objective-C crowd, and it’s much easier to communicate, paste code, etc. if you’re writing in Objective-C. Same for websites, official documentation, sample code, etc.

    2) Application launch times are faster. And it’s perceptible by end-users. In my case, as half of my app is a Palm conduit, this launch-time difference will be noticed by users every time they sync – which can mean several times a day for some.

    3) I’m not gaining all that much productivity if I use Python, and setting up all the tools for proper PyObjC development is still a pain.

  18. Greg Says:

    C is not the new assembly. Making that comparison with Ruby is ridiculous. The reason that programmers would “switch to assembly” was that back in those times compilers were inefficient. If they were still that inefficient today, more people would be writing assembly code. As it stands, practically nobody writes in assembly. This is not because C has more features than assembly; it’s because optimized C code generated by compilers is as fast as or faster than anything you’d be able to hand-write in assembly.

    Ruby and Python are a long way from catching up with C in terms of speed, and until that happens C will still be around.
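
    The gap Greg describes is easy to observe from inside Python itself: compare an interpreted loop against the C-implemented sum() builtin doing the same work. A quick, unscientific sketch:

```python
# Same reduction, two implementations: interpreted bytecode vs. the
# builtin sum(), which runs as compiled C inside the interpreter.
import timeit

data = list(range(100000))

def py_sum(xs):
    # The reduction written as plain interpreted Python.
    total = 0
    for x in xs:
        total += x
    return total

assert py_sum(data) == sum(data)  # identical results...

interpreted = timeit.timeit(lambda: py_sum(data), number=50)
compiled = timeit.timeit(lambda: sum(data), number=50)
# ...but the C-backed version is typically several times faster.
print("loop: %.3fs  sum(): %.3fs" % (interpreted, compiled))
```

    Which is exactly the resolution the post proposes: keep the program in the scripting language and push the hot loops down into C.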

  19. Gary Robinson Says:

    Our application, Goombah (http://www.goombah.com), is written in Python. The Mac UI uses Cocoa via PyObjC and the Windows UI is wxWidgets.

    We’ve been very happy with this approach.

  20. David Says:

    You’re just now figuring this out?

    C has been the “new assembly language” for at least 20 years. The problem is too many developers hadn’t noticed this.

    I think I noticed it first when I read an article about using a limited subset of C that mapped one-to-one onto 68K assembler. The registers were named d0-d7, a0-a7; you could assign to registers, add to them, etc., e.g.:
    a0=some calculation;
    a0+=sizeof(int);
    *a0++=d0;

    It was eye-opening not because I realized that C could map to assembler, but because I realized that C was so close to the original PDP-11 instruction set, which has echoes in 68K.

    So C has ALWAYS been “portable assembly language”, where “portable” depends more on the libraries than on anything inherent in the C language itself.

    And I’m referring to original K&R C, not that sissy standardized thing.

    And yeah, that was when I were a lad.

  21. Daniel Jalkut Says:

    I think it’s really interesting (and frustrating) that people are taking my commentary so literally. I don’t mean “C is like assembly language” … I mean it’s serving the practical purposes of assembly language.

    It’s a distraction that C happens to be fairly close to the machine level. The point is, people used to resort to programming in assembly language when optimizing critical code passages. Well hell, they still do. But in a dynamic scripting world, they might very well resort to programming in C to address comparable performance problems.

    All ya’ll who insist on arguing that C is “inherently like assembly language” have been perplexingly overlooking the fact that people have been writing assembly passages to replace C code for decades. Why would they do that if C was already everything that assembly is?

  22. Jens Alfke Says:

    I haven’t tried to develop a GUI app in a scripting language (Ruby or Python) yet, though I’ve used both for web apps, and have done tons of GUI development in Obj-C, Java and C++.

    One problem I suspect will bite people using scripting-language bridges is the lack of type checking. Yes, I’ve read all the propaganda about “duck typing” and how the lack of type checking makes you more Agile. For small quickie bits of code, I tend to agree. However, in a full app, type errors will bite you in the ass. A lot.
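
    A minimal illustration of the failure mode, in plain Python (the class and method names here are invented, nothing Cocoa-specific):

```python
# Duck typing: window_title() accepts any object at all, so passing the
# wrong kind of thing only fails when the offending line actually runs.
class Document:
    def display_name(self):
        return "Untitled"

def window_title(doc):
    # No declared type: the mistake below sails past every tool
    # except the code path that finally executes it.
    return doc.display_name()

print(window_title(Document()))  # works: prints "Untitled"

try:
    window_title(42)             # a compiler would have flagged this
except AttributeError as err:
    print("caught only at runtime:", err)
```

    In Objective-C the equivalent mistake on a typed variable at least draws a compiler warning before the app ever reaches a tester.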

    Now, if it’s a web app, being obsessive about testing can save you. You use tests to drive development, you use all the cool Rails features to write test cases that drive your whole app to test out every single feature, and so on. Fine. So you make zillions of type errors, and the compiler won’t catch them, but your test cases will, and you run the test cases every five minutes.

    But in my experience, GUI apps are orders of magnitude harder to test than something that’s linear and stateless and driven by a single command pipe. GUI apps have huge amounts of state, and all kinds of weird timing issues, and simulating real-world interaction in a repeatable way is very hard to do. I have never known of a real-world GUI app that got most of its testing from some kind of automated process. Rather, the vast majority of bugs get shaken out by actual users doing actual stuff with the app, in unpredictable ways, and filing bug reports.

    Of course that slows the testing cycle down a lot, because you can’t make one change, press a button, and have a dozen users instantly do work with the new build of the app and file bug reports on it. So if you make as many mistakes as you do in Ruby or Python, I suspect you’ll be dead. Instead, you need a language and a compiler that catch [many of] those errors for you before the code gets to your testers.

    (One objection I’ve heard is that “even a GUI app is mostly back-end code, which you can write automated tests for, plus maybe 5% that deals with the user interface.” I suspect this refers to the clunky “Enterprise” app whose UI consists of a table-view and a couple of buttons and text-fields slapped into a window. It sure isn’t true for the kind of polished app that Mac users expect.)

    Sorry, I didn’t mean this to be a screed against using Ruby or Python for GUI apps. I really want to try it, because I like those apps a lot. I’m just worried about the reliability of the code.

  23. Daniel Jalkut Says:

    Jens: that’s a good point. And although I generally consider the compile-time checking of ObjC to be minimal, I guess in comparison to an uncompiled language, it would come out ahead.

    I wonder whether, if the dynamic languages take off, we might see the rise of some kind of “fake compiler” … a lint-like processor that would parse your script code only to tell you whether or not things looked effed-up.
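
    The seed of that “fake compiler” exists already: a dynamic language’s own parser will happily check a script without running it. A toy sketch in Python (the function name is invented; real lint tools such as PyChecker go further, catching misspelled names and bad calls too):

```python
# A toy "fake compiler": parse the script without executing it and
# report anything that doesn't even build. Purely syntactic -- it won't
# catch calls to nonexistent methods, just code that can't parse.
def looks_effed_up(source, filename="<script>"):
    """Return an error description if the script won't parse, else None."""
    try:
        compile(source, filename, "exec")
    except SyntaxError as err:
        return "%s:%s: %s" % (filename, err.lineno, err.msg)
    return None

print(looks_effed_up("x = 1\ny = x + 2\n"))        # None: parses fine
print(looks_effed_up("def broken(:\n    pass\n"))  # reports the error
```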

  24. Jens Alfke Says:

    Daniel: Agreed, Obj-C is lax in that its collection classes use the unchecked “id” type. But even so, the compiler catches a lot of stuff, like when you call a nonexistent method on a (typed) object.

    Strong type checking can have its downside too. My current project is, for complicated reasons, mostly in C++, and early on we went kind of a bender with the template kool-aid. So we have a lot of collections and other classes that are 100% type-safe, but were a nightmare to implement and can often be a pain in the butt to use. (Some of my smart friends keep saying how functional languages like OCaml and Haskell solve these problems, but I haven’t succeeded in being able to figure them out yet.)

  25. Bret Says:

    Uncompiled languages will never take off for commercial app development. Here’s why: you are shipping your source code. OSS, server stuff, yeah – but not desktop apps the way they are defined now.

    However, you could do like java does, and have a compiler (either to bytecode or direct to CPU-specific object code). IMHO, this is the better solution, anyway, as it will be faster (which translates to more CPU cycles available for other stuff, or better power efficiency).

    But, the best solution I think is to extend Obj-C to provide the expressiveness of those other languages (which as discussed further up in this thread, Obj-C 2.0 starts to do).

  26. Arpan Says:

    @Scott

    The most interesting parts of Textmate, namely the bundles, are written in Ruby script, shell script, etc. Because of this, it is so easy to extend, since you only need to know a scripting language.

    Only the core is written in Cocoa. So Textmate is actually a very good example of what is possible when using scripting languages.

  27. LKM Says:

    Yeah, application programming is moving towards scripting languages. But I don’t think it’s going to be Ruby or Python or Lua. I think it’s going to be ECMAScript. Think about it: it’s a language that a huge number of “non-programmers” are familiar with. It runs on anything (in fact, I think apps like NetNewsWire already use it for some of their interface components). Pretty much all web-based applications use it (even those written in Flash – ActionScript is basically ECMAScript). It’s got Tamarin, which is kind of a JRE for ECMAScript. It’s a very modern language (yeah, lots of people don’t like its class-less object orientation, but I think this is about to change). And it’s rather easy to learn for C/Java/C#/Objective-C programmers (again, the object-oriented part takes some getting used to, but that is changing).

    Ruby, Python, Lua and all those other languages are cool, but I think ECMAScript is going to be the next big thing for “real” programmers. It already is for web developers.

  28. Joe Goh Says:

    Bret:
    For Java’s case, like you mentioned, there have been EXCELLENT decompilers available for a long time now: http://members.fortunecity.com/neshkov/dj.html

    If you have an existing Java project, go download that and try it out somewhere, you’ll be AMAZED at how good the source code output is.

    Python, which most people think of as a scripting language, can be compiled too, so it’s not exactly shipping source code either. But good decompilers will always be available for such languages.

    In any case, I disagree that using “scripting” languages for commercial app development will fail due to it being easily decompiled (or reverse engineered in other ways for that matter).

    Decompilers for apps written in C are available ( http://boomerang.sourceforge.net/cando.php?hidemenu ) and when these tools get more sophisticated, is anyone going to say that we can’t ship apps in C anymore because it can be decompiled?

  29. Joao Says:

    Well, Objective-C is the new Ruby on Rails: it follows the MVC approach, you have Core Data, and you can pretty much do an application without coding, just drawing diagrams.

    have you seen how easy it is to play with core animation, no ajax library is that easy…

  30. Martin Pilkington Says:

    Joao: I think it would be more accurate to say that Ruby on Rails is the new Cocoa. Rails hasn’t been around that long but Cocoa and Obj-C have been around since the 80s.

    Arpan: I think what Scott was trying to say is that, while parts of an application could be written in a scripting language, writing a full application would likely not be as easy as in Obj-C.

  31. Arpan Says:

    Martin:
    In reference to Textmate, which parts do you think would not be possible in Ruby/Python/shell?

    Of course the main text editor is written in Cocoa, but most of the functionality that makes Textmate unique and so useful is in the bundles.

  32. Arpan Says:

    Just to note, that I’m not arguing that creating entire desktop applications would be easy using scripting languages.

    I was just responding to the comment about Textmate.

  33. Michael Ströck Says:

    “have you seen how easy it is to play with core animation, no ajax library is that easy…”

    Exactly, playing with it is easy. Doing real work with it not so much :-)

  34. Scott Stevenson Says:

    @Arpan: I think TextMate is a fantastic model for how the scripting bridges could be used. Allowing plug-ins to reach into Cocoa for useful things makes a lot of sense to me.

    In fact, I think some apps do this sort of thing today with the JavaScript/Objective-C bridge which is part of WebKit. John Fox specifically called this out during his presentation on MemoryMiner at CocoaHeads.

  35. Patrick Taylor Says:

    I think hybrid apps are going to happen more and more. For example, Lua has been used quite a bit alongside C, including in World of Warcraft and Photoshop ::koff:: Lightroom (about 40% of which was done in Lua, according to Wikipedia).

  36. Chris Ryland Says:

    @Scott: I thought Textmate doesn’t actually bridge to anything, but simply calls a content-massaging script in any language through a subshell.

    Also see Steve Yegge’s recent rant about “the next great language.” I’m pretty sure he means (and he’s probably right) ECMAScript 4 (the Tamarin project).

  37. Martin Pilkington Says:

    Arpan: I’m not saying it wouldn’t be possible, I’m simply saying it would likely not be as easy. I don’t know enough Ruby or Python to be able to write anything significant yet, but from what I’ve seen it won’t offer a huge advantage over Obj-C 2.0 beyond familiarity to current users. As Scott says, writing plug-ins in scripting languages is really the way this will mostly be used. It especially makes sense to use Ruby for something like TextMate, which is the Ruby text editor of choice for most people.

  38. Gary Robinson Says:

    “One problem I suspect will bite people using scripting-language bridges is the lack of type checking. Yes, I’ve read all the propaganda about “duck typing” and how the lack of type checking makes you more Agile. For small quickie bits of code, I tend to agree. However, in a full app, type errors will bite you in the ass. A lot.”

    We have a sizeable app written in Python. We unit test all the internals. I do not recall ONE time there being a problem due to type. Based on about 10 years’ experience with Python, I know for an absolute fact that when you really write a lot of Python code and do the unit testing that’s required to check the LOGIC, type problems are not an issue. It’s a myth perpetuated by people without the actual experience.

  39. Arpan Says:

    Regarding writing applications with Ruby / Python, I’m not sure that I would want to use them. I get frustrated when an app does not respond immediately. If an app used dynamic languages for substantial portions, it would definitely be slower and more resource intensive.

    Plus, from what I’ve seen, Cocoa as a framework has resulted in so many superb and creative apps, and I don’t necessarily think that it would be much faster to develop using Ruby / Python (for desktop apps).

    I think that a Ruby-Cocoa bridge would be useful for me personally for making small utilities, and for extending existing apps or easily accessing console apps and libraries, since as a non-developer, I don’t have the time to learn a proper programming language, and I have already learnt Ruby for working with web applications.

    Basically, what I’m saying is I have absolutely no idea what I’m talking about, so I’m going to keep quiet and let those who know more than me talk.

  40. Sam Penrose Says:

    What Gary said WRT testing. I’m a Python programmer who does a lot of GUI work on Linux, and I’m interested in Objective-C and Cocoa development. When I read snippets such as these from an alpha Objective-C geek:

    http://wilshipley.com/blog/2005/12/pimp-my-code-part-6-pimp-before.html

    Objective-C looks a LOT more verbose than Python and Ruby. On any given stretch of screen less is being expressed. That’s not a criticism, just an observation. So when Scott says, for example, “Personally, I think Objective-C is much easier to read and write. Ruby code may be shorter, but my overriding goal is clarity more than brevity,” what I think is: “there’s a guy who has spent years reading Obj-C until his brain is Obj-C shaped.” (That’s not a criticism either.) I have no opinion on Ruby, but Python is famously both clear and concise, and all things being equal, more concise *is* more clear.

    The real contest IMO is not Python (or Ruby or ECMAScript) vs. Obj-C. It’s concise but environment-neutral dynamic languages vs. strongly typed languages plus first-class IDEs plus (in the case of Obj-C and C#) powerful OS integration. First-class IDEs do a lot to close both the readability and writeability gaps that verbose statically typed languages have with concise dynamically typed languages. And as others have noted in this thread, the dynamic languages all have extra important “Leaky Abstraction” layers between what the programmer writes and the OS/device-level environment, relative to platform-specific languages such as Obj-C and C# (yeah, Mono, whatever).

    If I were on the Xcode team I’d be watching Microsoft’s IronPython carefully to look for clues about how to get the best of both worlds.

  41. Chris Ryland Says:

    @Sam: My instinct is exactly the same as yours when it comes to Python vs. Obj-C. But note that it’s not so much Obj-C as the Cocoa libraries that are verbose, by their choice of almost didactic method names.

    To a Smalltalker, Cocoa would look very familiar. To a typical Python-head, it looks verbose.

    But look at an actual method from your example (forgive the periods, but I can’t seem to get indentation otherwise, even using code/pre formatting):


    - (void)createBackgroundImage
    {
    .. [bgImage release];
    .. NSRect bounds = [self bounds];
    .. bgImage = [[NSImage alloc] initWithSize:bounds.size];
    .. [bgImage lockFocus];
    .. [self doGradient:bounds];
    .. [bgImage unlockFocus];
    }

    In Python, that’d be something like:


    def createBackgroundImage(self):
    .. bgImage.release()
    .. bounds = self.bounds()
    .. bgImage = NSImage.alloc().initWithSize_(bounds.size)
    .. bgImage.lockFocus()
    .. self.doGradient_(bounds)
    .. bgImage.unlockFocus()

    where the dominant text is pretty much the same. The only place where Python wins is with simple dictionary/list access, but even the Obj-C [d setValue:v forKey:k] isn’t all that much more verbose than d[k] = v, particularly if you start using descriptive names as is Cocoa’s wont, e.g., voices[currentVoice] = newVoice isn’t all that much more terse than [voices setValue:newVoice forKey:currentVoice].

    So I think it’s more a matter of a verbose Cocoa naming culture vs. Python’s terser naming. The big question is: does this kind of verbosity help the readability/writeability of code? It certainly slows down the latter (though code completion can sweeten writing considerably, and does), but I suspect (being no expert in either Cocoa or Python) that the improved readability is a huge win.

  42. Kevin Walzer Says:

    Daniel,

    Your comment about “C being the new assembler”–whether meant literally or not–pretty much sums up my feelings about it. I program almost exclusively in scripting languages–Tcl, Python, and AppleScript–and have delved into C only to extend a particular scripting language’s functionality. (I wrote a small Tcl wrapper for the Carbon Help API–two Tcl commands, fifty lines of C, and the most painful programming experience of my life.) I admire the skills of those who program in C and other compiled languages; they surpass my own. But in nearly every case, I simply don’t need to work at such a low level to do what I want to do, which is desktop GUI applications.

    I’m amused at those who say scripting languages can’t replace system-level languages for many tasks. In some domains this may be true, but in most cases, it’s not. My favorite example of this is the Alpha text editor. Alpha is a powerful, Emacs-like text editor that dates back to the old Mac OS; it’s been updated for OS X, and is now a pretty nice Carbon app whose core is written in C and whose scripting interface is Tcl. A single developer, Vince Darley, wanted to get Alpha running on other platforms, so he rewrote the entire program in pure Tcl/Tk (AlphaTk)–it’s about 70,000 lines of Tcl code. It implements every feature of the C-based original and also implements the same GUI, giving up only a little in polish and gaining portability to Windows and Linux.

    A better perspective on this can be found here:

    http://www.activestate.com/company/NewsRoom/whitepapers_ADL.plex

    –Kevin

  43. LKM Says:

    By the way, my personal issue with dynamically typed languages is that it’s often not possible to infer a variable’s type, which means that intellisense-style guides don’t work. I’m just lazy that way :-)

  44. Sam Penrose Says:

    @Chris: fair points, although I note you have chosen a stretch that only includes 2 inline type declarations, no loops, only 2 braces, and no data structure creations. We could find Java or C++ that looked similar. Also “simple dictionary/list access” is a large fraction of my Python code. I’d be surprised if real Obj-C doesn’t take up at least twice as much room on the screen as real Python. What we’d need to settle this is actual data from someone who has done significant real translation between Python and Obj-C, although as I indicated I don’t think the readability of code as such is the most important criterion: the read/write-ability of code in a particular development environment is what matters.

  45. Chris Ryland Says:

    @Sam: Yep, but note that Obj-C 2 will solve the loop verbosity problem, and properties will save a lot of method invocations. But, yes, type declarations and type casts, and data structure creations are a lot more verbose in Obj-C.

  46. Damien Sorresso Says:

    In Java’s case, as you mentioned, EXCELLENT decompilers have been available for a long time now: http://members.fortunecity.com/neshkov/dj.html

    If you have an existing Java project, go download that and try it out; you’ll be AMAZED at how good the source code output is.

    I was just going to mention this before I saw your comment. Java doesn’t have a myriad of compilers with a ton of different optimization and scheduling options available. It has one compiler, and most code is compiled with the same options. So there are far, far fewer “weird” cases you have to consider when looking at the byte code, unlike something like gcc, where a de-compiler might have to worry about what optimization level the code was compiled with, what kind of processor it’s scheduled for, etc …

    So if scripting languages go that route, I’d expect that de-compilers of similar quality will be released in short order.
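
    The point about a single, predictable compiler generalizes to bytecode formats: names and constants survive compilation almost verbatim, which is what makes these decompilers so effective. A quick sketch with Python’s standard `dis` module (Python here just for illustration; the same property holds for Java class files):

```python
import dis
import io

def greet(name):
    # Names and constants survive compilation into bytecode verbatim.
    return "Hello, " + name

# Capture the disassembly as text.
buf = io.StringIO()
dis.dis(greet, file=buf)
listing = buf.getvalue()

# The argument is accessed by name, and the string literal is sitting
# right there in the constant table -- a decompiler's dream.
print("LOAD_FAST" in listing)  # True
print("Hello" in listing)      # True
```

    Compare that with optimized machine code, where local names are gone and values live in registers: the bytecode keeps nearly everything a decompiler needs.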

  47. Pete Yandell Says:

    “I haven’t tried to develop a GUI app in a scripting language (Ruby or Python)…One problem I suspect will bite people using scripting-language bridges is the lack of type checking.”

    This argument gets trotted out a lot, and it’s invariably by people who haven’t built anything major in loosely-typed languages.

    I’ve built substantial apps in Perl, Ruby, C, C++, and Objective-C and here’s the conclusion I’ve reached about strict vs. loose typing: they’re just different ways of working.

    In loosely typed languages, have I been bitten by type errors that only came out in obscure runtime circumstances? Sure, a few times, but very rarely.

    In strictly typed languages, have I been restricted in implementing something neatly by the nature of the type system? Sure, a few times, but very rarely.

    But neither problem has ever really been a major issue in developer productivity or code reliability.
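
    A minimal sketch of the trade-off Pete describes, in Python (a hypothetical function, just for illustration): the kind of type error a strict compiler would catch at build time surfaces only when the rare code path actually runs.

```python
def total_length(items):
    # Duck typing: accepts any iterable of things that support len().
    return sum(len(x) for x in items)

print(total_length(["ab", "cde"]))   # 5
print(total_length([(1, 2), [3]]))   # 3: mixed element types, still fine

# The latent bug only surfaces when some rare code path passes a number:
try:
    total_length(["ab", 42])
except TypeError as exc:
    print("caught only at runtime:", exc)
```

    A statically typed language would reject the bad call before the program ever ran, at the cost of also rejecting the flexible mixed-type call above.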

  48. Joao Says:

    Martin Pilkington: you are right; the idea is that Objective-C is becoming very flexible and easy, just like these scripting languages.

  49. Martin Pilkington Says:

    @Sam: I think it all depends on what you are used to. Yes, Obj-C seems verbose, but to me more compact languages seem much harder to understand. I’ve been given many pieces of Java sample code recently and many of the argument names are 2 or 3 letter abbreviations, and I just think “why couldn’t you spend the extra 2 seconds to give the variable a proper name and make your code much more readable”. Verbosity comes at the cost of speed of typing, but I find that Cocoa does so much for me that it helps make up the time. Plus the extra verbosity means you rarely have to look up a method to find what it does.

    It all comes down to a matter of preference. I prefer Obj-C code; I find it infinitely more readable than any other language I’ve seen.

  50. Jim Thompson Says:

    You youngsters will eventually “discover” Common Lisp.

  51. Daniel Jalkut Says:

    Jim: Haha! The bad news is I haven’t really looked at lisp since I hobbled my way through a Scheme class in college. The good news is, “learn lisp” has been near the top of my programming self-improvement list for a few years now. And I think it’s on a lot of our lists :)

  52. Bret Says:

    >> you’ll be AMAZED at how good the source code output is.

    No, actually, I wouldn’t – theoretically, if the CPU can do it, somebody can figure out what’s going on. My point wasn’t that “it’s not possible to reverse engineer bytecode” – because it most certainly is. It’s also possible to do similar things with compiler-generated machine code.

    No – the point is, “Will the PHB veto this?” — and for uncompiled languages, I think the answer is “Yes!”. The perception of vulnerability is just too high (regardless of the facts).

    But, like I said above, throw in a compiler, and I think the perception issue can be surmounted, as it now looks to the PHB just like any other toolchain.

    >> In any case, I disagree that using “scripting” languages for commercial app development will fail due to it being easily decompiled (or reverse engineered in other ways for that matter).

    As far as the PHB is concerned, reverse engineering != trivial copy and paste.

  53. Bret Says:

    >> I prefer Obj-C code, I find it infinitely more readable than any other language I’ve seen.

    Me too. I like verbosity. And with code-completion, I’m still not doing all that much typing. Sticking to patterns helps also (and there’s also the issue of ergonomics – if your keyboard is driving you nuts, anything that makes you use it is also going to drive you nuts. I have a pile of different ones that I’ve tried, just to find a good one – I’ve probably spent maybe 500 bucks over the years on input devices.)

  54. Jake Says:

    Dear Deity,
    Please forgive these folk that jump on bandwagons and talk utter rubbish for they know not what they do.
    When You created PHP and Ruby they had a Purpose, and that Purpose was not to be trumpeted as THE NEXT BIG THING, but as an addition/improvement to the tools already available.

    Too many people call themselves “programmers” because they have dabbled – you know the ones, people who use Dreamweaver or something similar and then mess around with PHP for a bit and then think they are l337.

    There are a million reasons for not using Ruby, Python or scripting languages – they will not always be the best tools for the job. IMHO they aren’t often the best tools for the job, but then we all have our own opinions. If you’re writing something in Ruby and having to use C to “hack” it to make it faster, then you’re going wrong from step 1.

  55. Michael Says:

    Worth noting, if just as an aside, that there is a world full of systems developers out there for whom C is high level, and Assembly is still a day-to-day language. Our company produces network switches and host adapters. In such an environment, you need a real-time language (read: no GC threads interrupting deterministic behavior), and often need the ability to drop to Assembly for controlling various types of CPUs and microcontrollers. Of course, there’s always the OS kernel as an ever popular example, too. C++ is a systems language, but sys devs aren’t particularly aware of OOP principles, and much driver and embedded code doesn’t map well to objects anyway (interrupts and kernel hooks and such). (keep an eye on D though… )

    Point being that in some circles, Assembly is still very much alive and C is still thought of as high level. Let’s all take a moment and thank the embedded programmers ’round the world for allowing app devs to think of C as Assembly 2.0.

  56. Joe Goh Says:

    Michael: Hear hear! I totally agree with you, in the midst of all this talk of C as assembly, there’s a whole world of embedded programmers being forgotten.

    I remember helping a firmware programmer debug a bug in some custom firmware at my previous job, staring at the ASM listings and thinking how incredibly high-level C seemed at the time. After that experience, I learned never to take the reliability of hardware or firmware for granted.

    But then again, there are even those who will see ASM as high level, like CPU designers.

    So I’m with Michael: let’s forget about squabbling over which language is on a higher level than the others; it’s just going to go nowhere. Let’s just use whatever gets the job done for us and thank all the fellow engineers and developers out there, sweating over the things that we depend on for our work daily.

    BTW, never take your compiler for granted (I’m feeling the pain now, as I’m currently struggling with a bug in my app that’s possibly due to a compiler bug). What a coincidence to run into this sort of thing just as this is becoming a hot topic, eh?

  57. Reinier Zwitserloot Says:

    C? Sure. Has been for some time now.

    Objective C with the Cocoa libraries and Core*… not really.

  58. Paul Says:

    Where C shines as the “new assembly” is as an intermediate language. Most exotic (particularly functional) languages that offer native code actually compile to C and then call gcc internally to produce a binary.

    This has lots of advantages: it’s much easier to generate C than any machine language, and it gives instant, optimized support for just about every CPU there is (yes, ghc does a heck of a lot more of its own.) Such generated programs tend to be larger than programmer-written C, but often faster as well.

    One huge time saver in C is the ability to inline assembly code: my ideal system would allow inlining C/C++ directly into a high level language, to be copied more or less directly into the generated code, with special macros to provide direct access to data in the “real” language. This would allow direct calls to external libraries (avoiding the need to create/learn a GUI bridge at all, for those so inclined) and allow small bottlenecks to be addressed with snippets rather than the agony and misery and overhead of interfacing a separate external library of complete functions. In principle there’s no reason why any number of straight C files couldn’t be linked into such a program and called directly (the run time libraries usually are.) This, for me, is the sole interesting idea behind MS’s .NET and CLR, though implemented differently.

    I understand Ruby has this ability to some extent. It seems like a very highly desirable way to go.
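
    Short of true inlining, a weaker form of what Paul describes already exists: Python’s `ctypes`, for instance, can call straight into an external C library with no wrapper code at all. A sketch (assuming a POSIX-style system where the C library is loadable by name):

```python
import ctypes
import ctypes.util

# Locate the C runtime by name; on POSIX systems ctypes.CDLL(None)
# would also expose the libc symbols of the running process.
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Declare the signature so ctypes marshals arguments correctly.
libc.labs.restype = ctypes.c_long
libc.labs.argtypes = [ctypes.c_long]

print(libc.labs(-42))  # 42, computed by the C library, no wrapper needed
```

    It isn’t inlined C – every call still pays a marshalling toll at the boundary – but it does let small bottlenecks lean on existing C libraries without building a full bridge.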

  59. Mike Medvinsky Says:

    Although I share the author’s view, I am not sure the problem lies in a language’s representational form so much as in its runtime. The question is not whether the language is C or Python or Ruby, but how the semantics of the language yield themselves to optimization as the code is compiled from an AST to a machine-executable state machine.
    Ruby, for example, because of its dynamic structure and flexibility, will be very difficult to automatically compile into an optimal state machine that the native hardware can efficiently execute. Java is a bit more structured, but the general memory management solution and the inability to deal with pointer arithmetic directly put a cap on the performance of its runtime as well.
    Generally I would say that the performance of a language depends on three major things: one, how well its structural semantics coexist with optimization algorithms; two, the level of isolation from hardware; and three, how memory management is dealt with.
    It should be perfectly possible to convert a Ruby program into C code and even generate general-purpose assembly out of it, but will the dynamic nature of Ruby permit real optimizations to take place? Think about how you would implement closures in assembly… and will it be fast?

    Mike
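
    Mike’s closure question is easy to make concrete. In Python, the captured variable has to outlive the call that created it, so the compiler allocates it in a heap cell rather than on the stack – exactly the bookkeeping a translation to assembly (or C) would have to reproduce:

```python
def make_counter(start):
    count = start          # captured: must outlive this call's stack frame
    def increment():
        nonlocal count     # refer to the enclosing cell, not a new local
        count += 1
        return count
    return increment

counter = make_counter(10)
print(counter())  # 11
print(counter())  # 12

# The captured environment is reified on the function object itself;
# an assembly version would need an equivalent heap-allocated record:
print(counter.__closure__[0].cell_contents)  # 12
```

    The extra indirection through the cell is part of why naively compiled dynamic code stays slower than hand-written C, even when translated all the way down.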

  60. Anders Says:

    Jim Thompson:

    Have to agree with that statement. C has long been my language of choice, but Lisp has captured that position now. And that’s despite having used Ruby and Python a fair deal.

  61. bogdan77 Says:

    Um… what about AppleScript? As a dabbler in AS, I have to say I like its syntax a lot, but I wonder why Apple isn’t making it more powerful, with all kinds of operations on lists (remove duplicates, for example), and regular expressions included in AS. Why not, Apple?

    I’m working to extend TextEdit for my needs, to grep styled text with AppleScript, but I have to resort to do shell script constructions, which even for one page of text slow things down terribly.

    —–Sorry if this is off-topic for you real programmers. I just feel there is big potential in AppleScript if it would be made even more easy to work with :^)

  62. Paul Says:

    > Um… what about AppleScript? As a dabbler in AS, I have to say
    >I like its syntax a lot, but I wonder why Apple isn’t making it more
    >powerful

    AS is a better language than people give it credit for. A high percentage of users (I suspect) aren’t aware of all it can do besides automating programs. Automation is actually not its greatest strength: historically, most developers don’t make the (enormous, even with PowerPlant) effort to make programs usefully scriptable. AS is heavily used with specific programs whose makers invested in the effort and documented how to do it (Quark’s scriptability has long helped Apple hold onto desktop publishing.)

    I once wrote a quicksort in AS to alphabetize icons in a Finder window while preserving layout: getting the notoriously buggy, unpredictable, and undocumented Finder to cooperate was by far the hardest part. VBS, while a royal pain in its own right, is at least reproducible most of the time.

    Extending AS is hideously difficult (at least as of a few years ago), largely due to the complex interface and near total absence of documentation. It’s no accident that most third party AS tools come from companies started by ex-Apple engineers.

    Apple’s own efforts (AS Studio) notwithstanding, I’m very leery of their commitment to any technology, including AS, if anything comes along that looks like a viable replacement (and there are some, for Cocoa programs) and can be called a “standard” in ad copy. I was lucky enough not to get burned too badly by QuickDraw3D or OpenDoc, mostly because I trusted my instincts and not the press releases.

  63. Jim Thompson Says:

    http://xkcd.com/c224.html

  64. init.six Says:

    C is to Assembly as Perl is to Python

  65. Sid Millspaugh Says:

    I use C/C++ on a daily basis. What do I do? Game development. And while I use C# as a scripting language for many things (mostly gameplay code), lower-level languages just can’t be beat for the core technology.

  66. Juan Sebastian Says:

    C, OS X, SLACKWARE and EMACS FTW….!!!!!:D:D:D:D:D:D

  67. Rudi Cilibrasi Says:

    For my scientific software (linked), CompLearn, I use a combination of C and Ruby just the way you hypothesize. As a scientist I have been using this pair of languages together to good effect for rapid prototyping and easy experiments. You can see the results (and download the C and Ruby) on the website. It is also nice to be able to profile and replace the innermost loops with C as needed. I recommend it wholeheartedly as a strategy moving forward.
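
    The profile-first workflow Rudi describes can be sketched in a few lines of Python (cProfile standing in for whatever profiler your language provides; `slow_py_sum` is a hypothetical hot spot, and the C-implemented builtin `sum` stands in for a hand-written C extension):

```python
import cProfile
import io
import pstats

def slow_py_sum(n):
    # A pure-Python inner loop: the kind of hot spot you would
    # consider rewriting in C after profiling.
    total = 0
    for i in range(n):
        total += i
    return total

def fast_sum(n):
    # sum() over range() runs its loop in C inside the interpreter,
    # standing in here for the rewritten C version.
    return sum(range(n))

profiler = cProfile.Profile()
profiler.enable()
slow_py_sum(200_000)
fast_sum(200_000)
profiler.disable()

out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
print(out.getvalue())  # slow_py_sum should dominate the report
```

    Only the functions the profiler actually flags get dropped to C, which is precisely the “C as the new assembly” pattern from the post.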

  68. Ned Baldessin Says:

    About spam filtering on the iPhone : this is when server-based filtering becomes necessary. I’ve switched my email hosting over to Google, and the spam filtering is great, I get less than one spam per week, and best of all, I don’t have to download all the spam messages.

    Maybe this will push the trend of hosted email infrastructure.

  69. wikepedist Says:

    “while the higher-level code would be just as fast and easier to maintain.”
    I find that false in both ways:
    1. Higher-level code (Python, Perl) isn’t anywhere near the speed C can bring.
    2. Higher-level code would be much easier to maintain.

  70. John Says:

    Oh hmm, my first attempt was eaten :(

    So again, there’s not only C or Objective-C; there’s also Object Pascal (32/64-bit), and the compiler is also multiplatform. More info at http://www.freepascal.org

    Oh, and a small example program (Photoshop? hint! hint!): http://www.kanzelsberger.com

  71. Patrick Coston Says:

    What you need is a language that combines the power and high performance of C and C++ with the programmer productivity of modern languages like Ruby and Python.

    Oh wait … it already exists! It’s called D!

    http://www.digitalmars.com/d/

    Java, C#, Python, Ruby and D all derive from C, so it’s very easy to jump from C to these other C-like languages, unlike trying to learn Lisp.

  72. William Woody Says:

    C, the new assembly?

    Sure. But C was always the new assembly. The C language was designed so that each statement translated into a few machine language instructions–often just one–and it has been referred to for years as a ‘low-level language’ because in many ways it was as much a portable macro assembler as a ‘high-level language.’

    A good C programmer familiar with the ABI of the platform he’s writing code for should be able to intuitively know what the assembly output will be for his C code. And while C had the advantage of being a portable language, that “portability” is limited by a few dozen rules that all have to do with the target architecture of the system he is writing code for. (For example, is ‘-3 % 4’ a positive or negative number? Well, that depends on the target architecture.)

    The only reason why we can now talk about C as the new assembly language is because computers have become powerful enough that we can talk about writing performance sensitive code in a high level language such as Ruby. And it’s also one of the reasons why I hate C++ templates: because it violates the rule that C (and C++ without templates or exceptions) should translate everything you write into a few (and intuitively easy to divine) machine language instructions.
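
    The ‘-3 % 4’ example above is easy to make concrete. C89 left the sign of the remainder implementation-defined (C99 later pinned integer division to truncation toward zero), while Python chose flooring division; a small sketch of the two conventions, using `math.fmod` as a stand-in for the C99 behavior:

```python
import math

# Python's % floors toward negative infinity, so the result
# takes the sign of the divisor:
print(-3 % 4)    # 1
print(-3 // 4)   # -1

# C99 truncates integer division toward zero, so the remainder
# takes the sign of the dividend; math.fmod mirrors that:
print(math.fmod(-3, 4))    # -3.0
print(math.trunc(-3 / 4))  # 0
```

    On a C89 compiler, which convention you got depended on the target’s divide instruction – exactly the kind of architecture leakage that makes C feel like portable assembly.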

  73. Belig Donner Says:

    “C has libraries.”

    Barely.

  74. guitarm Says:

    I’m an old fart trying to cope with the new technologies, and I must say… I’m all for trying to abstract away a lot of the “old” stuff.
    But… and this is (my) big butt…
    Who will want to understand the underlying things of it all? Misunderstand me the right way here: I’m not saying everyone should understand kernel stuff in assembler. What I’m saying is, the more experience you have, be it high- or low-abstraction stuff, it’s always a benefit if you get “the whole picture”.

  75. kokorozashi Says:

    You want the big picture? Try this on for size: all languages in common use today are done. Clock speeds have stopped increasing and the future is about parallelism. None of today’s languages is up to the task of expressing parallel notions, which, for the sake of clarity, are more difficult than threads, which are already difficult enough. It’s going to be an interesting ride. Eventually, the C of parallelism will appear. It isn’t going to be any of these dynamically typed scripting languages, which scale terribly and are really just ways to aggregate extensions written in C. Most people won’t have expected it, but suddenly it will be here, after Pixar reveals it divided the rendering time of their latest feature by ten. We now return you to your regularly scheduled complacency. :-)

  76. Kevin Kitching Says:

    Even when I learned programming in ’89, assembly was something you learned because you might still need it, but avoided if you could…mostly because even then tightly written C could do most things nearly as fast, and with 1/10th the coding time.

    That’s not to say that one of my class projects, a programmer’s text editor, didn’t contain a lot of assembly … we were required to use only modules we developed ourselves. And some of the base functionality … pop-up menus and the like … had to be written in assembly. But the top level stuff, linked lists and file system stuff…no way. I’d still be banging away at the keyboard of that XT today trying to get it right.

    Assembly’s still a good club to have in your bag, if for nothing other than background.

  77. Paul Says:

    >And it’s also one of the reasons why I hate C++ templates:
    >because it violates the rule that C (and C++ without
    >templates or exceptions) should translate everything
    >you write into a few (and intuitively easy to divine)
    >machine language instructions.

    I’ve been saying for years that C++ has lost its way: the standards committee is besotted with trying to make C++ something it was never meant to be, by bolting on one kludge after another. The best example of that is boost.org, which does things in C++ that seem impossible, at the expense of long compiles, unreadable, unmaintainable, and especially undebuggable code that’s a few lines shorter but takes longer to write.

    (I do respect the inventors of template metaprogramming, Design Patterns, and Processes, though: they’re the ones who saw the future early and created new territory to be highly paid gurus of, or tenured teachers of.)

    What comes out of this discussion that’s hopeful is that a lot of people seem to realize that, and are looking for something better.

  78. Robert Says:

    You still need Assembly when you work with microcontrollers with 1 KB of memory or so :)

  79. Jon Says:

    Looking for something better (than C, C++, …) could start here http://www.ffconsultancy.com/free/ocaml/

  80. Scott Andrew Says:

    We use C++ and C for commercial Windows UI development. On the Mac we have Objective-C and C++. We have looked at things like XAML/C# for Windows, and the problem with those languages is twofold, at least in our programs. Our engine is all cross-platform C++ for video encoding and such. In a higher level language you have to write a wrapper, which adds overhead and performance cost if you are moving over that boundary repeatedly.

    These languages are also inefficient with memory. C# will keep eating memory until it can’t, then garbage collect. There is something about having control over my footprint and being able to clear memory as needed that is nice. XAML has huge overhead on the Windows side and is fairly slow. Quite a few things were hacked on XP, from what I understand. We’ll take our C++ WTL based framework right now over C# 3.0 and XAML for high end apps.

    As to whether C++ is the new assembly: I think yes, to a point. It also has its place when doing basic applications if you are looking for cross-platform support. Our engine is all cross-platform C++. I can take any of our components in the engine and compile it for windows/mac/linux/bsd… C++ won’t go the way of the dinosaur; interpretive languages have their place, but they aren’t efficient enough for large scale commercial applications yet, I don’t think.

  81. L Says:

    To the guy who praised AppleScript’s syntax:

    lol k

  82. Coders2020 Says:

    Some amazing informative comments here. Post bookmarked, and I am glad that I came here. And as already pointed out, C can be an alternative, not a replacement. You will still need assembly when you work at the microcontroller level.

  83. Aakash Sharma Says:

    Can anybody tell me in which microprocessor, the 8086 or the 8088, a far procedure call will execute faster, and why?

    Send me reply @ akkuf117@gmail.com

  84. agnosticj Says:

    The 8086 has a 16-bit bus, twice as wide as the 8088’s 2-nibble bus. All things being equal (same code), the 8086 will do it faster.

Comments are closed.
