I'm busy porting a C++ app to C#.
The app contains code that hashes data into a 256-bit integer.
uint256 res = data->GetHash();
The GetHash function looks like this:
SHA256((unsigned char*)&data, sizeof(data), (unsigned char*)&hash);
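As I understand it, that call hashes the raw in-memory bytes of data (sizeof(data) bytes starting at &data), so whatever byte array I hash on the C# side has to match that layout exactly. A rough sketch of what I mean (not my actual GetByteArray, and the struct layout here is just assumed):

using System;
using System.Runtime.InteropServices;

// Hypothetical equivalent of (unsigned char*)&data, sizeof(data):
// copy the raw in-memory bytes of a struct into a managed byte array.
static byte[] RawBytesOf<T>(T value) where T : struct
{
    int size = Marshal.SizeOf<T>();
    byte[] buffer = new byte[size];
    IntPtr ptr = Marshal.AllocHGlobal(size);
    try
    {
        Marshal.StructureToPtr(value, ptr, false);
        Marshal.Copy(ptr, buffer, 0, size);
    }
    finally
    {
        Marshal.FreeHGlobal(ptr);
    }
    return buffer;
}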
In C# I use the BigMath library but get a different value from the one the C++ app produces.
My C# code looks like this:
SHA256Managed sha256 = new SHA256Managed();
byte[] bytes = GetByteArray(data);
Int256 hash256 = BigMath.Utils.ExtendedBitConverter.ToInt256(sha256.ComputeHash(bytes), 0, true);
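For debugging I can at least dump the raw digest as hex and compare it against the C++ side before any Int256 conversion comes into play; something like this (HashHex is just a throwaway helper name I'm using here):

using System;
using System.Security.Cryptography;

// Hash a byte array and return the digest as a hex string, so the raw
// 32 bytes can be compared with the C++ output before any Int256 conversion.
static string HashHex(byte[] bytes)
{
    using (var sha256 = SHA256.Create())
    {
        byte[] digest = sha256.ComputeHash(bytes);
        return BitConverter.ToString(digest);
    }
}

// usage: Console.WriteLine(HashHex(GetByteArray(data)));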
The C++ app's uint256 type uses 8 x 32-bit ints, while the C# BigMath library's Int256 uses 4 x 64-bit ints.
This shouldn't cause a problem, right? Surely the output should be the same from both?
Am I right to expect that both SHA-256 and a 256-bit integer representation will be the same across languages?
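To be concrete about the byte-order question: here's a minimal sketch (using System.Numerics.BigInteger rather than BigMath, purely for illustration) of how the same 32-byte digest turns into two different integers depending on whether the bytes are read little-endian or big-endian:

using System;
using System.Linq;
using System.Numerics;

// Interpret the same digest both ways; the two values are generally different,
// which is the kind of mismatch I suspect between the C++ uint256 and Int256.
static void ShowBothInterpretations(byte[] digest)
{
    // BigInteger(byte[]) reads the array as little-endian two's complement,
    // so a trailing zero byte is appended to keep the value non-negative.
    var littleEndian = new BigInteger(digest.Concat(new byte[] { 0 }).ToArray());

    // Reversing the bytes first gives the big-endian reading of the same digest.
    var bigEndian = new BigInteger(digest.Reverse().Concat(new byte[] { 0 }).ToArray());

    Console.WriteLine(littleEndian);
    Console.WriteLine(bigEndian);
}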