Security infrastructure proposal

James Tucker jftucker at
Sat Feb 2 17:06:00 UTC 2013


I'm waiting for my hiking partner to arrive in the next few minutes, so I can't review this immediately, but I have a chance to offer you my gratitude for putting something together so swiftly and presenting it properly. I look forward to reading through it later.



On Feb 2, 2013, at 8:32 AM, Chris Heald <cheald at> wrote:

> Hey folks --
> I've been noodling on the state of the security infrastructure, and have
> congealed my thoughts on it into a proposal. I'd like to disclose up front
> that I am not a security expert per se, and don't seek to claim any kind of
> authority in that regard, but I am an experienced systems architect and
> hope that counts for something! :)
> I'm cross-posting this to the group as well as the
> rubygems-developers list as it would require changes to both the rubygems
> installable and distribution platforms like
> The primary goals are:
> 1. Principle of least responsibility - the infrastructure consists of
> multiple independent systems, with the intent that a breach of any single
> system is covered by the other systems in place.
> 2. Verification of continuity - gem consumers should be confident that they
> are consuming authorized gems every time they do an install or upgrade of a
> gem.
> 3. Provision for multiple signers, so that group-maintained projects work
> easily in the system.
> 4. Provision for certificate revocation, in order to revoke certificates at
> any point in the chain of trust.
> This proposal does not attempt to verify identity beyond email address
> ownership, nor does it intend to provide a vetting platform for code that
> would restrict publication of gems. It would still be possible to publish
> nasty gems under this system, but it would protect against backdoored gems
> and illicit uploads, as well as providing a revocation mechanism for end
> users to use to discover and remove compromised gems in the event of a
> breach.
> Additionally, I should mention that I based the proposal on X509, as Evan
> previously indicated that any signing mechanism would need to work with
> only the Ruby stdlib, which means OpenSSL. There has been some talk of GPG
> and Web-of-trust systems for certification, and while I think that such a
> system could be used with this setup, x509 gets the ball rolling faster as
> it's already in place in Rubygems, and has no external dependencies like
> GPG.
> I'm sure there are things I haven't considered in this system, and would
> very much like feedback, criticism, and flames.
> ------------------------------------------------------------------------------------------
> # Root Key & Signing System
> A single X509 key is generated per distribution platform (rubygems.org,
> Gemfury, etc). This key is used to sign gem author requests.
> A gem author may generate a certificate and request that the platform sign
> it. Alice generates her x509 keypair with her email address encoded as the
> x509 name field, stashes the private key somewhere safe, and submits the
> pubkey to the signing system.
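> Alice's side of this step might look roughly like the following with the
> Ruby stdlib's OpenSSL bindings (a sketch only: the email address, validity
> period, and serial handling are illustrative, not a prescribed format):

```ruby
require "openssl"

# Illustrative email; the signing system parses it back out of the name field.
email = "alice@example.com"

# Generate the keypair; the private key never leaves Alice's machine.
key = OpenSSL::PKey::RSA.new(2048)

# Build a certificate whose subject (name field) encodes the email address.
cert = OpenSSL::X509::Certificate.new
cert.version    = 2                                  # X509v3
cert.serial     = 1
cert.subject    = OpenSSL::X509::Name.parse("/CN=#{email}")
cert.issuer     = cert.subject                       # self-signed until the CA re-signs it
cert.public_key = key.public_key
cert.not_before = Time.now
cert.not_after  = Time.now + 365 * 24 * 60 * 60
cert.sign(key, OpenSSL::Digest.new("SHA256"))

# Alice would stash `key` somewhere safe and submit `cert.to_pem` to the
# signing system.
```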
> The signing system consists of two parts:
> 1. [Machine A] A web UI (or email inbox) responsible for accepting public
> keys and sending emails
> 2. [Machine B] A signing machine with a shared data store (shared NFS
> mount, redis store, whatever - it must simply be a data store to act as a
> dead drop)
> The UI accepts pubkeys, ensures their validity, parses the certificate for
> the name field, and sends a verification email to the email specified in
> the name field. The email contains a link with a cryptographic signature
> (something like an HMAC of the pubkey). The email owner clicks this link
> (or replies to the email) which causes Machine A to validate the response
> and put the affiliated pubkey into the dead-drop inbox.
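> The verification token could be as simple as an HMAC over the submitted
> key, for example (a sketch: the secret value, digest choice, and helper
> names are assumptions, not part of the proposal):

```ruby
require "openssl"

# Server-side secret; would live only on Machine A (illustrative value).
SECRET = "machine-a-only-secret"

# Token embedded in the verification link sent to the email in the name field.
def verification_token(pubkey_pem)
  OpenSSL::HMAC.hexdigest(OpenSSL::Digest.new("SHA256"), SECRET, pubkey_pem)
end

# Machine A recomputes the HMAC when the link is clicked; only a matching
# token moves the pubkey into the dead-drop inbox. A production check should
# use a constant-time comparison rather than plain ==.
def valid_response?(pubkey_pem, token)
  verification_token(pubkey_pem) == token
end
```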
> Machine B is monitoring the inbox for pubkeys. Once a key is received, it
> is signed, and placed in the dead-drop outbox.
> Machine A monitors the outbox for signed keys. It parses the signed key for
> the name field again, encrypts the signed key to the public key itself (so
> only the holder of the matching private key can read it), and emails it to
> the address in the name field.
> Alice retrieves the key from her email inbox, decrypts it with her private
> key, and then may use it to sign her gems.
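> Machine B's signing step could look roughly like this with the stdlib
> OpenSSL bindings (a sketch: the helper name, serial handling, and validity
> period are illustrative, and extension/constraint handling is omitted):

```ruby
require "openssl"

# Re-issue the author's self-signed certificate under the platform CA.
# `ca_cert` / `ca_key` are the CA's certificate and private key on Machine B.
def ca_sign(author_pem, ca_cert, ca_key)
  author_cert = OpenSSL::X509::Certificate.new(author_pem)

  signed = OpenSSL::X509::Certificate.new
  signed.version    = 2
  signed.serial     = Time.now.to_i        # a real CA would track serials
  signed.subject    = author_cert.subject  # keep the email-bearing name field
  signed.issuer     = ca_cert.subject      # now issued by the platform CA
  signed.public_key = author_cert.public_key
  signed.not_before = Time.now
  signed.not_after  = Time.now + 365 * 24 * 60 * 60
  signed.sign(ca_key, OpenSSL::Digest.new("SHA256"))
  signed
end
```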
> This system could have an exceptionally small attack surface, consisting of
> only a minimal mailserver (A) and a local-only daemon (B) which operate on
> shared storage (on either A or B, or on a third server, C).
> # Gem certificate chain history server
> A separate server ["Chain of trust history server"] maintains and validates
> cert chain history for all gems on rubygems.org, but which is queryable by
> Rubygems-bin, allowing Rubygems-bin to obtain the last known and verified
> certificate chain for a given gem when installing in the event that no
> local history is known.
> It must be separate from the rubygems.org platform in order to avoid
> allowing a compromise of rubygems.org to be pivoted into a compromise of
> the certificate history system, allowing an attacker to upload fraudulent
> certificates.
> This system would naturally serve as an automated IDS, as well, and could
> raise an alarm if it ever discovered that rubygems.org had accepted a gem
> without a valid certificate chain, indicating a breach of the system's
> certificate verification mechanisms.
> # Gem Signing
> The gem is signed with something like:
>    s.signing_key = File.expand_path("~/.gem/trust/.gem-private_cert.pem")
>    s.cert_chain  = ['rubygems-public_cert.pem', 'alice-public_cert.pem']
> Alice may then upload her gem to rubygems.org. Upon receipt of the gem,
> rubygems.org ensures that the gem has been signed with a cert chain
> terminating in a certificate that it knows about and trusts. Additionally,
> it will ensure that the gem is signed with a certificate containing an
> email that matches the email on the uploader's rubygems.org account.
> Rubygems(-bin) will maintain a local history of certificate chains for a
> gem. If a certificate is *removed* (without a signed authorization), then
> it will refuse to install the gem, suggest review, and require a user
> override to proceed. rubygems.org will additionally maintain this
> certificate chain, and refuse to accept a gem that does not include the
> owning account's email as a part of the chain of trust. This ensures:
>  * If an individual rubygems.org account is compromised (but not the
> legitimate owner's private key), then a malicious entity cannot upload a
> modified gem into the account.
>  * If an individual rubygems.org account is compromised, and the attacker
> has been able to forge a key with the account's email, then the attacker
> can upload new gems into the account, but cannot publish new versions of
> existing gems, as they will fail to validate the chain of trust history.
>  * If Rubygems as a whole is compromised, then the attacker may be able to
> upload a malicious gem. However, Rubygems-bin will refuse to install any
> newer version of it.
> Rubygems will allow certificates to be *added* to the certificate chain, so
> long as they are signed by a non-root certificate in the chain. This
> permits for transfer of project ownership and multiple signing keys. For
> example:
> ### Project transfer
>  Alice starts a project, Foobar, signs it with her key. The chain now
> looks like:
>    [rubygems, alice]
>  Alice then later abandons the project, and Charlie takes over as
> maintainer with Alice's blessing. Alice would generate a key re-issue
> signature on the project, authorizing the removal of her key, and the
> addition of Charlie's. The chain now looks like:
>    [rubygems, charlie] (alice removed with authorization)
>  As Alice signed Charlie's key and authorized her own key's removal, she
> remains part of the chain of trust. The chain history therefore permits the
> change, and the system permits installation, with the implicit
> understanding that Alice has blessed Charlie's key. Future releases will
> not need Alice's blessing.
> ### Multi-user projects
> Alice starts a popular project, which she then wants to add publishing
> members to while retaining publication ability herself. Initially, the
> trust chain is:
>    [rubygems, alice]
> Upon wanting to add a new member, Alice generates a project master key, and
> authorizes a key reissuance of the project using the new project master key:
>    [rubygems, project-master] (alice removed with authorization)
> Then Alice uses the project-master key to sign Charlie's key (and perhaps
> her own personal key):
>    [rubygems, project-master, charlie]
> Alice may continue to publish to the project while allowing Charlie to
> publish to the project, without extending her personal key's trust to
> Charlie.
> ### Malicious cert chain modification
>  If Dave, a malicious actor, managed to wrest control of the project, he
> would be able to sign the gem, but its trust chain would look like:
>    [rubygems, dave]
>  Thus, both rubygems.org and Rubygems-bin would reject the gem based on
> the gem's known certificate history, and Alice's unauthorized exclusion
> from the certificate chain.
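> The history check running through the three scenarios above can be
> sketched as a comparison of the last known chain against the newly
> presented one (a sketch only: subject names stand in for certificates, and
> verification of the signed removal authorizations is elided to simple
> membership in `authorized_removals`):

```ruby
# A chain change is acceptable only if every certificate removed from the
# last known chain carries a signed removal authorization; additions signed
# into the chain are allowed, per the proposal's rules.
def chain_change_allowed?(known_chain, new_chain, authorized_removals: [])
  removed = known_chain - new_chain
  (removed - authorized_removals).empty?
end
```

> Under this check, Charlie's authorized takeover ([rubygems, alice] to
> [rubygems, charlie] with Alice's authorization) passes, while Dave's
> unauthorized chain ([rubygems, dave]) is rejected.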
> # Gem installation and verification
> Bob, a Ruby developer, wants to use Alice's gem. Bob would install the
> rubygems.org public cert as a trusted certificate:
>    gem cert --add rubygems-public_cert.pem
> Bob may then download and install Alice's gem, and Rubygems(-bin)'s
> HighSecurity policy will validate the signature and permit the install:
>    gem install foobar -P HighSecurity
> # Certificate revocation
> Before fetching a gem, Rubygems would need to fetch any certificate
> revocation lists. It would then check the trusted certificate list for
> revocations, and remove any that appear on the list. This is the primary
> mechanism by which a compromised CA key would be removed. Users would be
> required to manually install the new key in this event.
> This necessitates that the Rubygems public key must be published in a
> location that is not connected to the CA, as a compromise of the CA could
> allow an attacker to revoke the otherwise-legitimate root key and publish
> his own for consumption.
> Each time Rubygems runs a network operation, it should
>  1. Check if the revocation list has changed since the last time it
> validated certificates for known gems.
>  2. If the list has changed, validate the certificate chains for all
> installed gems. Prompt to remove any with invalid certificates.
>  3. If step 2 was run, write a hash of the revocation list and the list of
> gems that passed muster.
>  4. Remove any entries from the local chain of trust history that contain
> revoked certificates.
>  5. Check for a new revocation list.
>  6. Run step 2 if the revocation list has changed.
> This allows for certificate verifications and revocations for multiple gem
> installs (RVM gemsets, bundler local installs) in a given system.
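> Steps 1-3 might be sketched like this with the stdlib (a sketch: the cache
> structure and function name are assumptions, and a real implementation
> would also verify the CRL's own signature against the published root key
> before trusting it):

```ruby
require "openssl"
require "digest"

# Re-validate installed gem certificates only when the revocation list has
# changed since the last pass. `cache` persists between runs; in real use it
# would be written to disk alongside the gem home.
def revocation_pass(crl_pem, installed_certs, cache)
  digest = Digest::SHA256.hexdigest(crl_pem)
  return cache[:ok] if cache[:crl_digest] == digest  # step 1: list unchanged

  crl = OpenSSL::X509::CRL.new(crl_pem)              # NB: verify the CRL's
  revoked = crl.revoked.map(&:serial)                # signature first!
  ok, bad = installed_certs.partition { |c| !revoked.include?(c.serial) }
  # step 2: `bad` would be reported to the user here, with a prompt to remove.
  cache[:crl_digest] = digest                        # step 3: record the pass
  cache[:ok] = ok
  ok
end
```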
> # Attack Surfaces
> * Installation of a malicious certificate as a trusted root certificate on
> a local machine would render signatures unreliable. However, given that
> this would require some level of ownership of the machine, unreliable gem
> signatures would likely be a small problem in such an event.
> * Compromise of rubygems.org's distribution platform may result in the
> upload of malicious gems. Such gems would be distributed to gem installers,
> which would then reject the gems due to either a local failed chain of
> trust, or a failed chain of trust from the chain history server.
> * Compromise of the chain history server would not be exploitable to
> install malicious software, as the attacker must also have control of the
> distribution platform. MITM attacks would be viable, but if you can MITM
> rubygems.org, you can MITM chain history server queries.
> * Compromise of the chain history server AND rubygems.org would allow
> attackers to upload compromised gems to rubygems.org and distribute them to
> pristine installs. Upgrades would still fail due to the local chain of
> trust history.
> * Compromise of the Rubygems' pubkey publication platform could result in
> an attacker publishing his own public key, which would affect people
> installing the certificate for the first time. However, legitimate gems
> from rubygems.org would fail to install, as they were not signed with the
> attacker's keypair.
> * Compromise of the pubkey platform AND the rubygems.org platform would
> result in failure to install due to local or queried chain of trust
> histories.
> * Compromise of the CA's "Machine A" would result in people being able to
> obtain signed keys for emails without validation. It would not expose the
> private key for Machine B. This would permit uploading of new gems to a
> compromised user account, but new versions of existing gems would fail to
> upload, as the key provided would not be a part of the gem's existing chain
> of trust history.
> * Compromise of the CA's "Machine B" would result in disclosure of the
> private key, requiring that the root key be revoked and reissued. This
> would invalidate all current gem signatures. Illicit replacement of the
> private key on CA's "Machine B" would result in people being issued
> certificates that would fail to upload to rubygems.org, due to failure to
> validate the cert chain against the private key.
> * Compromise of the CA and rubygems.org would result in pristine installs
> being served malicious software. Upgrades would still fail due to local
> chain of trust history.
> * An author's stolen private key may be used to fraudulently sign requests.
> This may be defended against by following proper key protection measures
> and password-protecting the key.
> * Most of the MITM attacks raised above can be avoided by performing
> rubygems.org and chain history server queries via SSL.
> -- Chris
> _______________________________________________
> RubyGems-Developers mailing list
> RubyGems-Developers at
