views: 140
answers: 1
I'm trying to make an elliptic curve digital signature with Ruby's OpenSSL::PKey::EC class, but my Objective-C validation code (built with libcrypto's ECDSA_verify()) fails to recognize it.

#!/usr/bin/env ruby

require "rubygems"
require "base32"
require "openssl"

msgHash = OpenSSL::Digest::SHA1.digest("Message text")
key = OpenSSL::PKey::EC.new("secp160r1")
key.private_key = OpenSSL::BN.new("My private key in hex form", 16)  # key string interpreted as hex
signature = key.dsa_sign_asn1(msgHash)                               # DER-encoded ECDSA signature
signatureInB32 = Base32.encode(signature)
puts signatureInB32

The Base32 signature made with Ruby seems to be longer than the signature made with the raw OpenSSL C interface in the Obj-C code.

Signature from Ruby: GAWQEFIAQ2ZUBV3VLIRRA6W63VRF4DHH5R6PNR5VAIKERGOODBEOXQJSCCB6KCL6PQAXADMFJHMA====

Expected signature (with the correct length): XJOOQ52CDRAQ4FKSQLKJFLJYQQV25AMOQTSWZMHX6ZQDTOFR7OWCAFIBUE6HEKZP

I used the same curve :/
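
One quick way to see the size difference is to Base32-decode both strings and compare byte counts (a small sketch using the same base32 gem as above; the strings are the two signatures pasted here):

require "base32"

ruby_sig     = "GAWQEFIAQ2ZUBV3VLIRRA6W63VRF4DHH5R6PNR5VAIKERGOODBEOXQJSCCB6KCL6PQAXADMFJHMA===="
expected_sig = "XJOOQ52CDRAQ4FKSQLKJFLJYQQV25AMOQTSWZMHX6ZQDTOFR7OWCAFIBUE6HEKZP"

puts Base32.decode(ruby_sig).bytesize      # bytes in the Ruby-produced signature
puts Base32.decode(expected_sig).bytesize  # bytes in the expected signature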

A: 

"My private key in hex form" is not a hexadecimal string, you should use binary-to-bignum conversion:

key.private_key = OpenSSL::BN.new("My private key in hex form", 2)

On the Obj-C side, use BN_bin2bn() instead of BN_hex2bn().
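
To illustrate what that means in Ruby, here is a minimal sketch (the 20-byte key below is made up): the second argument to OpenSSL::BN.new selects how the string is read, 2 for raw binary (the counterpart of BN_bin2bn()) and 16 for a hexadecimal string (the counterpart of BN_hex2bn()), and both forms end up as the same bignum:

require "openssl"

hex_key = "00112233445566778899aabbccddeeff00112233"  # hypothetical 20-byte key, hex-encoded
raw_key = [hex_key].pack("H*")                        # the same key as 20 raw bytes

bn_from_bin = OpenSSL::BN.new(raw_key, 2)   # binary -> bignum, like BN_bin2bn()
bn_from_hex = OpenSSL::BN.new(hex_key, 16)  # hex string -> bignum, like BN_hex2bn()

puts bn_from_bin == bn_from_hex             # => true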

Another option is to use a real hexadecimal string:

key.private_key = OpenSSL::BN.new("BABE15BABE", 16)
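
Putting the pieces together, a corrected signing script could look like the sketch below. The hex key is made up, and the public key is derived from it only so the signature can be sanity-checked locally with dsa_verify_asn1 before handing it to the Obj-C side (note that recent versions of the openssl gem no longer allow assigning private_key= on an existing EC object):

#!/usr/bin/env ruby

require "rubygems"
require "base32"
require "openssl"

msgHash = OpenSSL::Digest::SHA1.digest("Message text")

key = OpenSSL::PKey::EC.new("secp160r1")
key.private_key = OpenSSL::BN.new("00112233445566778899aabbccddeeff00112233", 16)  # hypothetical hex key
key.public_key  = key.group.generator.mul(key.private_key)                         # matching public key

signature = key.dsa_sign_asn1(msgHash)
puts key.dsa_verify_asn1(msgHash, signature)  # => true if the key pair is consistent
puts Base32.encode(signature)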