Development Phase Code Signing

June 29th, 2008

Code signing is a technology for associating a cryptographically secure signature with your application’s executable code. This signature makes it possible for the operating system or other services to make confident judgments about authenticity, based on the unique signature you’ve supplied.

Thus, code signing is a technology which is rather useless in itself, but which can be utilized by other services to achieve specific goals. Starting with Mac OS X 10.5, Apple built some features into the operating system which take advantage of code signing, including some features which make accessing secure items from the keychain less of a nuisance.

In particular, consider the scenario in which an application asks permission from the user to access items in the keychain. Once the user gives permission to “always allow,” the application can thereafter obtain a secret value from the keychain without user intervention. This is very handy for internet passwords, etc. But once the application changes, e.g. a new version comes out, the user must be asked again to permit the access. This is to ensure that malicious code hasn’t been snuck into the application.

But with code signing, the user’s permission can be expanded to cover any release of the application that was signed by the same certificate. The idea is that if users approve one version of an application, they’ll likely approve the next version too, as long as they’re guaranteed it came from the same developer.

I have so far not signed any of my publicly released applications. I may do so soon, because these conveniences would remove one more tedious task for users who are updating to the newest versions of my applications.

But the user who has it worst of all is me, since I’m constantly revising the application, leading the system to request new approvals every time my changed application accesses the keychain. To get around these constant requests, I finally decided to take advantage of code signing during the development phase. What does that mean? It means I’m committing to signing my own code for my own internal purposes, but not yet committing to signing publicly shipped products.

From Zero To Code Signed In 300 Seconds

With Xcode 3.0, code signing is easy, but takes a little work to set up. Since I just went through the somewhat tedious task myself, I thought I’d share with you how it’s done easily and predictably:

  1. Establish a self-signed certificate. Apple offers simple instructions for doing this with the bundled Keychain Access utility. When you create the certificate, you’ll be asked to give it a name. Name it whatever you like, for instance “My Cert”. The certificate is stored in your keychain, where it can be easily referenced by name.
  2. Add a custom build phase to Xcode. Xcode will not automatically sign your code, but it’s easy enough to add a build phase that does. Select your target in Xcode and choose “Project -> New Build Phase -> New Run Script Build Phase” from the main menu. Paste in the following script:
    if [[ ${CONFIGURATION} == "Debug" ]]
    then
    	# -f means force to re-sign even if already signed
    	codesign -f -s "My Cert" "${BUILT_PRODUCTS_DIR}/${WRAPPER_NAME}/"
    fi
    

    Observe that “My Cert” should be changed to whatever you ended up naming your certificate. Also notice that, because I’m not yet ready to ship a code-signed application, I restrict the action to the Debug build configuration only. If you’re ready to go public, just remove the test and the signing will happen for every build configuration. (A quick way to verify the result appears just after this list.)

  3. There’s no step 3! Enjoy.
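
If you want to confirm that the signature actually took, codesign can also verify and describe what it just signed. Here’s a quick sketch from Terminal, with a placeholder path standing in for your built application:

    # Verify the signature; verbose output explains any failure
    codesign -vvv /path/to/MyApp.app
    # Display details about the signature, including the identity that signed it
    codesign -dvv /path/to/MyApp.app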

Constantly clicking through those keychain approval dialogs is one of those things I had simply fallen into the habit of doing. Thanks to Apple’s code signing efforts in 10.5, there’s no longer any reason for me to do that. Hopefully, if you use the keychain in your own products, you’ll find this productivity upgrade useful during development as well.

37 Responses to “Development Phase Code Signing”

  1. Bruce Adcock Says:

    Interestingly, you state the benefits of code signing, but don’t give any downsides. Yet you seem on the fence about signing the released version. Is there anything holding you back from code signing MarsEdit, for example?

  2. bg Says:

    Presumably this is how the code signing will work for iPhone apps? – with the certificate being supplied by Apple, I assume.

  3. Jeremy Knope Says:

    It looks like it’ll be easier once Xcode 3.1 is final too, as the beta lets you code sign by specifying the name of the certificate to use. It was super easy to get started, and it’s really nice not to have the annoyance of approving the application upon every keychain access while developing :)

  4. Daniel Jalkut Says:

    Bruce – my reason for hesitating to ship a version with code signing in place is mainly just that I don’t completely understand the implications of all the options you can specify with the codesign utility. I generally prefer to be safe rather than sorry :) Though often I end up being neither or both!

    One downside I can think of is that if you sign the application with a self-generated certificate, and later find some reason to switch to an authoritative certificate from some trusted source, you’ll have to change the certificate, and that will cause a warning to reappear for users. Of course, this might just match the existing warning that users get whenever they run an updated app.

    Long story short: I don’t think there are many downsides to code signing, but I just don’t want to do it until I have a better complete understanding of all the implications. The fact that I can get some value from code signing during development appears completely risk free and therefore I have no hesitation to do it.

  5. leeg Says:

    With my security hat on, the weakness of signed development builds is the gap between your development and deployment environments. If you ever accidentally released a development build and it wasn’t signed, people would kick up a stink (after all, your app hasn’t requested access to the keychain since you started signing, and now it does… fish-like odours abound) and you would notice and replace it with a production build. Of course, if you ever accidentally released a _signed_ development build, there’d be no way to tell until people notice the _objc_class_RSFixForThatNastyWhiningCustomer symbols ;-).

    That points to either using different identities for development and deployment builds or signing only in the deployment process. It definitely points to checking as part of the deployment process that the app is signed with the expected identity:
    codesign -v -R="identifier com.example.organisation" NewApplication.app

  6. Martin Pilkington Says:

    I started doing code signing on my new app earlier this week. It accesses the keychain every launch, so being able to get rid of the warnings has sped up my testing so much. I mean, it’s only a few seconds, but those few seconds start to really add up, especially when the dialogue takes a little while to show up.

  7. alexr Says:

    Real certs for this purpose cost _way_ too much. I always expected that someday Apple would provide a reasonable signing authority for their OS X Server customers, but code signing is an even more obvious place for them to provide this service.

  8. Histrionic Says:

    I’m really encouraged to see this. I hope more Mac developers start considering signing their applications.

    If developers don’t, the OS will perform some signing steps for users. This makes it trickier to deploy unsigned apps in larger organizations, so it’s preferable to have them signed by the original developer.

    One comment suggested using two different identities to sign development vs. deployed code. If those are intermediate identities that are both chained to the same root CA, you should be able to get the same benefits. Only use the master to create intermediates. If needed, you can revoke one of your intermediate code signing CAs without revoking the master.

  9. Matthew Schinckel Says:

    I guess the other reason not to sign deployment code is that you are signing it with a self-signed certificate – which a user then has to say is “Trusted”. This is more of an issue than just allowing an updated application to access the keychain (I think), as it might allow for other issues to appear – possibly.

    It seems conceivable that without choosing “Always Trust”, which requires Administrator privileges, the keychain would still whinge each time it tried to use the untrusted certificate to access the keychain item.

    It all depends on how the trust stuff works, and I don’t really know that much (other than it seems impossible to get the WSMethodInvocation to send requests to an SSL server that is using a self-signed certificate, even if I set it to trust this…)

  10. Ben Says:

    I think Histrionic alluded to this in his comment, but there is another benefit to signing code – it helps system administrators in managed client environments. Workgroup Manager in 10.5 now uses the application’s signature instead of its (insecure) bundle ID to identify it when determining whether a user is allowed to launch it or not.

    If you do not sign your apps, Workgroup Manager has the ability to create a signature, but of course it must be updated every time your application is updated. It can also lead to difficulty when not every machine on the network has the same version of your app.

    Make sysadmins happy and sign your apps, folks :)

  11. Julil Says:

    I hope there will never be a signed public build of MarsEdit. Not to be political, but I’d much rather click “Change All” every now and then.

  12. leeg Says:

    +1 to what Ben said. Also the built-in application firewall on 10.5 uses the signature to identify ‘upgraded’ vs. ‘replaced’ applications.

    Julil: why? :)

  13. lee harvey osmond Says:

    Omni Development have been releasing OmniWeb 5.x with a self-signed certificate for some time.

    CA-signed code-signing certificates can cost as little as $200 per annum. That’s more than a domain-validated SSL cert usually costs, and more than Apple wants for an iPhone development cert. A wisely-chosen cert can be used across an organisation for signing other code, notably J2ME, J2SE, J2EE, and Win32.

  14. Julik Says:

    Sorry, mistyped my name on a jailbroken iPhone (speaking of code signing).
    Signed code is evil. Too long to explain but I would like my software to stay unsigned for as long as possible.

    Firewall/Keychain confirmation is not a problem for me; I’m all in the mood to clickety-click for as long as necessary. But I find the whole idea of “signed code” a danger to the ecosystem. An app on my computer is free to come from absolutely anywhere, and if it’s a trojan, it’s my responsibility that it ever got there. I see no protection in some Authority saying that a particular app is “legit”, and I see a great hurdle in having some Authority confirm that an application is “legit” because someone paid for it.

  15. Martin Pilkington Says:

    @Julik: You can create a self-signed certificate using Certificate Assistant (in Keychain Access), so there’s no cost involved. Of course, this just means the app came from the same source. Code signing isn’t mandatory, isn’t costly, and all it really does is get rid of the “allow access to my keychain” messages. Some people think ‘code signing’ and see “I’m not allowed to modify this app in any way whatsoever,” which, while in some cases it can be true, isn’t the case on the Mac.

  16. Paul D. Waite Says:

    > Signed code is evil. Too long to explain

    Don’t worry, we’ll just take your word for it.

  17. Julik Says:

    Well, don’t take my word for it. But think about this: after an OS introduces code signing, at some point it becomes mandatory to be signed to get access to some API or functionality.

    You want to write an input driver for Symbian (like a new handwriting recognition app)? You gotta be signed. No cert? No access to input managers. I had to buy a cellphone which was 3 years old because there was no way for me to install a Russian keyboard driver on it, simply because the developers needed to pay just short of a thousand bucks for a nothingness of thin air to “sign it.” Reason? The only way to get an input driver in there is to reflash the phone with a specific version of the firmware.

    Substitute “input driver for Symbian” with “driver for my obsolete Wacom that I got from Sourceforge – it needs direct USB”, or “A preference pane to start the open-source webserver that I got – it talks on a privileged port”… and you will see why even the sheer technical possibility of this existing makes me worried. Look at other platforms at hand: you want to distribute on iPhone? Nice, you get there only if you are signed by the Authority. Want to develop with AIR? Nice, you get there only with a cert from Adobe (and should they come to not like you in any way your cert might be revoked any moment).

    Sorry, but this fits my definition of “evil” very well.

  18. Ben Says:

    Sorry, but I don’t think your reasoning is sound. In 10.5, the infrastructure is already in place for code signing. If Apple wanted to flip the kill switch on unsigned apps, do you think a few third-party apps that refused to sign would dissuade them?

    Since signed code is here to stay on the Mac, there’s no reason not to take advantage of the benefits. If you are really dead set against any and all forms of code signing, you should downgrade to 10.4, where no one will ever be able to force you to sign anything.

  19. Jean-Daniel Says:

    @Julik: The issue you describe applies only if Apple chooses to load only code signed by Apple (as on the iPhone).
    But on Mac OS X, that’s not the approach they’re likely to take (according to Apple engineers); it’s more: if the signing authority is trusted, allow the driver to load.
    So, to install your SourceForge driver, you would just have to install the developer’s root certificate in the trusted root list in Keychain.
    It also allows administrators to choose, for example, to deny all drivers that weren’t signed by themselves, which is a very powerful feature.

    This article is very interesting, but you should note that unlike app signing, framework signing is broken in OS X 10.5.3. If you run the codesign utility on a framework, instead of signing the binary, it will replace the symlink in the root folder with your signed version (and leave the original binary untouched).

    I’m not going to ship a signed product until Apple fixes this.

  20. Justin M. Says:

    I’m curious what will happen to all of the apps that currently allow a “run dockless” mode by modifying the Info.plist in the app bundle. I wrote about this a while ago and it’s one of the reasons I’m not using this common method to implement this feature.

  21. Justin M. Says:

    Sorry, meant to say, “wonder what will happen when/if code signing becomes more prevalent or is made a requirement”.

  22. leeg Says:

    @Justin M.: they’ll use a preferences domain to store their preferences instead?

  23. Justin M. Says:

    @leeg: I should clarify, or summarize my blog post: it doesn’t matter where you store the preference; the Info.plist has to be changed to make the application either use or not use a dock icon. Combine this with code signing (to keep this relevant) and the app is considered tampered with if/when you change the Info.plist. There are some other approaches to dockless mode (which I think I cover comprehensively in my post), one of which I am using in a shipping application, but with the result that AppleScript is hard to implement.
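
    For example (the app path here is hypothetical, and the exact error message varies):

    defaults write "/Applications/MyApp.app/Contents/Info" LSUIElement -bool YES  # flip the dockless flag
    codesign -vv "/Applications/MyApp.app"  # verification should now fail, since the bundle no longer matches its signature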

    Anyway, that conversation should probably continue there, so I’ll wrap up by saying thanks to Daniel for this post — it’s something I’ve been meaning to check out and I’ll definitely be implementing it for my Debug builds.

  24. Peter Maurer Says:

    Justin’s point is the only downside I can think of. And that very issue affects two of my apps. Let me put it this way: every time you code-sign an application, a little Scrubber dies. ;-)

  25. dave Says:

    One thing I personally think is a negative for code-signing is that it prevents the end-user from modifying the app. For example, one app I use a lot has a lot of open windows, and using keyboard shortcuts, I would accidentally quit the app (cmd-Q) instead of closing the window (cmd-W), and it would take a bunch of time to get back to the same ‘state’. If the app was signed, I couldn’t edit the nib file to remove cmd-Q from the Quit menu item…

  26. Nathan Duran Says:

    The biggest downside to Apple’s implementation of code signing for me has been the dependence on the utterly bug-ridden Keychain Access application itself. Why can’t I just give the codesign utility the path to an OpenSSL-generated CA and whatever else it needs, rather than the nebulous name of a loosely-defined construct known only as an “identity?” Better yet, why can’t Xcode have a simple “Sign build results with…” button somewhere amongst its labyrinthine options panes?

    Combine these issues with the documentation, which runs the gamut from “incomplete” to “completely inaccurate,” and you’ve got a wonderful recipe for my complete lack of interest. I am personally of the opinion that “Perry” and the other security folks with little to no usability experience should simply never be put in charge of anything that any user would ever have to interact with directly.

  27. Darren Says:

    @dave,

    I don’t think user-editable nibs is a use case most developers are keen to support.

    Also, couldn’t you use the Keyboard Preferences to change the Quit command shortcut for that particular app?

  28. Damien Sorresso Says:

    @Julik: Your arguments might be valid if Mac OS X (a) would only allow you to trust certificates signed by a CA such as Thawte or VeriSign, (b) didn’t allow you to create your own signing authority or (c) didn’t allow you to trust whichever CA or self-signed root certificate you wanted.

    Since none of those things is true, you’re just being paranoid. Yes, Apple engineers have stated that Mac OS X will move to an all-signed environment eventually, but Mac OS X and iPhone are not the same thing. In the Brave New World, having a CA-signed certificate may confer certain advantages (such as the user not being prompted to explicitly trust your code), but I doubt the user will be prevented from running code that’s not signed by a trusted anchor.

    As to users editing their nibs, after making the edits, they can re-sign the app themselves, with their own root certificate, if they really want. For anyone mucking around with nib files, this shouldn’t be too much of a problem.

  29. Martin Pilkington Says:

    There is a key point to be made here. When a code signature fails to validate, all that happens is that OS X reverts to the old way of doing things. An app doesn’t break if you modify something; it will still run. To quote Apple:

    “It is not a digital rights management (DRM) or copy protection technology. Although the system could determine that a copy of your program had not been properly signed by you, or that its copy protection had been hacked, thus making the signature invalid, there is nothing to prevent the user from running the program anyway.”

  30. Rich Says:

    Will deleting unused languages result in an app being considered tampered with?

  31. Smokey Ardisson Says:

    I’ve seen very conflicting reports about the case Martin Pilkington describes just above (when code suddenly fails to match the signature).

    On the one hand, there’s that quote from Apple (which really addresses only whether the application runs, not the application’s access to code-signature-gated services in the OS), under which a failure to match leaves everything well, or at least as it was pre-code-signing.

    On the other hand, there were the reports of applications (e.g. Safari) being completely unable to access data in the Keychain after they had done something to the application to make it fail to match the signature.

    I’ve not tried any such modifications myself to see what happens, but given the number of reports, and the utility applications which changed their behavior after 10.5 shipped, I tend to believe the latter scenario. Further, if the former scenario really is the case, then the Keychain and Firewall’s use of signatures doesn’t offer much user protection from maliciously tampered code; until the day when all code is signed and granting Keychain access only happens once per app, many users will simply think a maliciously tampered app is just prompting for access “again,” “as usual.” There go those passwords.

    Ideally these types of modifications (plist tweaks, nib changes, delocalizations, etc.) are discouraged, but up to now they’ve generally been harmless and users still expect the modifications to 1) be functional and 2) not screw up apps’ access to the Keychain. There’s tension to be resolved here, and I think a certain segment of the Mac user community will be unhappy with the side that will emerge on top.

  32. Perry The Cynic Says:

    Daniel:
    Thanks for giving it a try. Code Signing really is pretty painless.
    Apple does not recommend that you publish with a self-signed identity. You can, but it removes a lot of flexibility from the system. At least, make yourself a CA and issue one signing identity from it. (Five minutes in Certificate Assistant.) That preserves all the flexibility of the X.509 certificate system and lets you expand your CA later without invalidating existing signatures. For testing, of course, a self-signed identity works just fine.

    Note that for keychain use, there is no advantage to getting a paid-for identity from a commercial CA. The keychain does not care whether your signing identity is system-trusted or not; it only cares whether it remains stable. A home-made CA works just as well for this. (That is also true for Parental Controls.)

    Don’t let all the options in codesign(1) scare you – most of them are for special cases and unusual situations.

    A frequent question is how you can separate test-signing from “real” (release) signing. Commonly, people use two different CAs (or a CA and a self-signed cert). That means, of course, that the system doesn’t believe that a test-signed program is the same as a release-signed one. Depending on your situation, that may or may not be a problem. If it is, there are keychain ACL tricks to make it work (at the price of a bit of complication, of course).

    @Alexr:
    Right now, Apple provides developer code signing identities only for the iPhone (and Touch). Those happen to also work for signing Mac OS X code, though they’re not really meant for that purpose.
    If you expect Apple to run a free CA for its developers, don’t hold your breath. It costs quite a bit to run the infrastructure (and keep the lawyers happy). If you want free, make your own CA. It’s not that hard (using Certificate Assistant).

    @Matthew:
    This is a very common misconception. You do not have to “trust” any certificate for the system to recognize a signature made with it, unless the verifying agent explicitly requires that. Neither the keychain nor Parental Controls do this. Whether the CA (anchor) is trusted on the end-user system is completely irrelevant to these subsystems, and neither you nor your users can tell the difference.

    @Julik:
    You see no protection in a developer being able to say, “I really made this program, and it hasn’t been hacked since it left my computer”? This isn’t about Apple saying anything about code. It’s about giving developers a reliable way of asserting the identity of their code, and giving you a reliable way of asserting policy on your system about who you want to trust. The software just connects the two of you and referees your interactions.

    @Jean-Daniel:
    Frameworks are “versioned bundles”. You need to sign (and verify) the particular version (usually …/Foo.framework/Versions/A). Yes, that could be clearer, and more fool-proof.
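    For example, something along these lines (the framework name and location are just illustrative, reusing the “My Cert” identity from the post):
    codesign -f -s "My Cert" MyApp.app/Contents/Frameworks/Foo.framework/Versions/A
    codesign -v MyApp.app/Contents/Frameworks/Foo.framework/Versions/A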

    @Justin M.:
    Self-modifying programs were always broken by Apple’s standards. The golden rule has always been that if it writes into your bundle, it’s a bad idea. (What if you’re running off a read-only file server, or (shudder) a CD-ROM? Yes, schools still do that.) Code signing just doesn’t let programmers get away with it anymore.

    @Peter Maurer:
    Sorry about the little Scrubber. I can’t think of a way to keep it going without adding a feature to the Mac OS that would make it redundant…

    @dave:
    Code signing does not prevent the end-user from modifying code. It only prevents the end-user from changing code and then pretending that it hasn’t changed. If you want to take responsibility for your changes, make them and then re-sign the code with your own identity. (Yes, you can.) Of course, then you’re the one vouching for the code with your own identity and taking responsibility for tracking updates and all that jazz. But that’s as it should be.
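    A rough sketch (identity name and app path assumed): after making your edits, run
    codesign -f -s "My Own Identity" /Applications/SomeApp.app
    and the -f flag replaces the developer’s original signature with yours.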

    @Smokey Ardisson:
    When an application’s signature breaks (or it invalidates for a dynamic reason), it stops having an identity at all. That doesn’t (by default) kill it, but it keeps it from trading on that identity to get stuff, including keychain items. That means it needs to fall back on other means, such as user dialogs.
    Leopard as delivered has a switch thrown “on” that suppresses keychain dialogs for signed applications whose signature is broken. That’s a UI decision that was made on “user confusion” grounds, and it’s certainly debatable. If you disagree with it, remove the -i flag from /System/Library/LaunchDaemons/com.apple.securityd.plist (and reboot), and you’ll get a nice dialog telling you that the application has a broken signature, and do you want to allow access anyway (this time)? You can’t “Always Allow”, of course, since there’s no reliable identity to record for the future.

    Cheers
    — perry

  33. Felix Schwarz Says:

    One unfortunate side effect of the use of code signing under Leopard is that, if

    1) you write an application with an embedded server

    2) the user needs to add an exception rule for it in Leopard’s firewall in System Preferences

    3) the application bundle then gets modified by the user (by any of various third-party apps that promise to “save disk space”, make an application “dockless”, or otherwise modify your application’s bundle)

    the user will end up with this (at least up to 10.5.3 – I haven’t tried with 10.5.4, yet):

    1) The firewall rule is still displayed in System Preferences

    2) But as the code signing of your app’s bundle is now broken, Leopard ignores that rule. All incoming connections will be blocked by the firewall.

    3) Removing and re-adding that rule will not change that. Your application’s copy is now permanently unable to receive incoming connections until the user either reinstalls your application (so that code signing is intact again) or deactivates the firewall altogether.

    I hope Apple will handle such cases more gracefully in the future, e.g. by indicating broken code signing in the preferences, so users have a chance to identify the issue rather than being left guessing.

    Cheers,
    Felix

  34. Jean-Daniel Says:

    @dave:
    Modern nibs are no longer editable, and code signing has nothing to do with it. Removing the information required to edit the nib significantly reduces the distributed file size. Try compiling a xib file and then opening it, for example.

    @Perry The Cynic
    Thank you for this very interesting comment and for your answers. I understand the need to specify which version you want to sign in a framework, but it prevents using one single script to sign all bundles :-(
    I hope the Xcode 3.1 integration will solve these issues.

  35. dc Says:

    @dave, @jean-daniel:
    “I don’t think user-editable nibs is a use case most developers are keen to support.”
    “Try to compile a xib file and then to open it for example.”

    This is the thing that pisses me off most about .xibs. I can understand the utility of space saving on the iPhone, but on the desktop making .nibs write-only is super annoying. Nib hacking is one of my favorite OS X pastimes. (Especially since there’s no good technical reason why IB shouldn’t be able to open compiled .xibs.)

  36. Daniel Jalkut Says:

    Regarding compiled xibs and their alleged “unopenability,” this is a default setting in Xcode for compiling nibs, but can be overridden. I am using some xibs in my projects, and have them set to generate normal, openable nibs.

  37. Ilgaz Says:

    I think if your application has server capabilities (so Leopard prompts about it), you should tell users not to “clean languages” (you could contact the Monolingual team to add it to their blacklist too, or even edit the source, since it’s OSS), change its icon, remove architectures, etc.
    As a person who always cleaned up languages, I resist doing it on Leopard. I like the scheme, and the fact that it is not a commercial trap like the Windows one (a self-signed certificate is treated equally), and unless the application modifies its own .app directory, it really works flawlessly.
    Another approach to prevent people from cleaning(!) languages is not to include them. The current version of 1Password (Agile) says “We removed languages from the download,” but they promise a way to let users get their languages back in future updates.
    Or there’s Omni’s approach: their stable releases always come in “English only” and “International” versions. Note that they have had that scheme for years, long before Leopard.
    Another thing I noticed as an end user: Leopard’s HFS+ handles lots of directories and files in a way that older OS X releases never could. The B-tree fragmentation and so on is way lower, which is another reason not to bother with “language cleaning”. My reason for cleaning languages was to keep the boot disk as simple as possible; since Apple now does much better directory handling, I really don’t care.
