Verifying signature created using OpenSSL with BearSSL

Solution 1:

The signature file contents shown as

$ hexdump signature
0000000 4530 2002 ac54 51af 8ac0 cee8 dc74 4120
...

are displayed as 16-bit words, because plain hexdump groups the file's bytes into 16-bit units by default.

The signature in the C program is defined as an array of 8-bit values:

uint8_t signature[] = {
  0x45, 0x30, 0x20, 0x02, 0xac, 0x54, 0x51, 0xaf, 0x8a, 0xc0, 0xce, 0xe8, 0xdc, 0x74, 0x41, 0x20,
...
};

Depending on the byte order, this may or may not be correct: does the word 4530 correspond to the bytes 0x45, 0x30 or to 0x30, 0x45?

With little-endian byte order, the hex dump

4530 2002 ...

would correspond to (*)

uint8_t signature[] = {
  0x30, 0x45, 0x02, 0x20, ...
};

I suggest displaying the hex dump as 8-bit values, e.g. by using

od -t x1 signature

and, if necessary, fix the array initialization in the C code.
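As a minimal demonstration of the difference between the two dump formats (sig_demo is a throwaway two-byte file created for the example):

```shell
# Write the two bytes 0x30 0x45 (octal 060 105) to a demo file.
printf '\060\105' > sig_demo

# Plain hexdump groups bytes into 16-bit words; on a little-endian
# host the bytes 0x30 0x45 appear as the word 4530.
hexdump sig_demo

# od -t x1 prints one byte per unit, in file order: 30 45
od -t x1 sig_demo
```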

According to dave_thompson_085's comment, the correct byte order is 0x30, 0x45, so the proposed fix (*) is the solution.

And an ECDSA signature on a 256-bit group definitely starts with the tag SEQUENCE + constructed (always 0x30), followed by a body length that is usually 68 to 70 bytes (0x44 to 0x46).