Given the ReSy-float number (exam WS17)
with base B = 4, 3 exponent digits and 2 mantissa digits (no hidden bit?), how can we convert it into a decimal value?
I tried the following approach, but it yields incorrect results:
Edit: added the original screenshot of the exam.
This is the ReSy-float format, which has no hidden bit.
Bias: β = (4^3 − 1) div 2 = 31
Exponent: E = 2·4^2 + 1·4^1 + 3·4^0 − 31 = 39 − 31 = 8
Mantissa: M = (2·4^1 + 3·4^0)·4^(1−2) = 11·4^(−1)
The represented number is x = +M·B^E = (11·4^(−1))·4^8 = +11·4^7 = 180224
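The steps above can be sketched in Python. This is a minimal sketch, assuming the digit values read off in the calculation (sign +, exponent digits 2, 1, 3, mantissa digits 2, 3, most significant first) and the bias formula β = (B^e − 1) div 2; the function name is made up for illustration:

```python
def resy_to_decimal(sign, exp_digits, mant_digits, base=4):
    """Convert a ReSy-style float with no hidden bit to a decimal value.

    exp_digits and mant_digits are lists of base-`base` digits,
    most significant first (assumed layout, not confirmed by the exam sheet).
    """
    # Bias: beta = (base^e - 1) div 2, e.g. (4^3 - 1) // 2 = 31 here.
    bias = (base ** len(exp_digits) - 1) // 2

    # Exponent: read the digits as a base-`base` integer, then subtract the bias.
    e = 0
    for d in exp_digits:
        e = e * base + d
    e -= bias  # 39 - 31 = 8 for digits [2, 1, 3]

    # Mantissa: no hidden bit, so the first digit sits before the radix point,
    # i.e. M = d0.d1 in base `base` -> 2 + 3/4 = 2.75 for digits [2, 3].
    m = sum(d * base ** (-i) for i, d in enumerate(mant_digits))

    return sign * m * base ** e

print(resy_to_decimal(+1, [2, 1, 3], [2, 3]))  # 180224.0
```

This reproduces the hand calculation: 2.75 · 4^8 = 11 · 4^7 = 180224.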
You used 2^(3−1) − 1 for the offset β. Since the base is 4, there should be a 4 in the formula, not a 2.
Also note that β = (B^e − 1) div 2 = (4^3 − 1) div 2 = 31.
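To make the difference between the two bias formulas concrete, here is a quick check (assuming e = 3 exponent digits, as in the exam):

```python
base, e = 4, 3  # base-4 format with 3 exponent digits

# Binary-style bias 2^(e-1) - 1 does not match this format:
binary_style_bias = 2 ** (e - 1) - 1       # 3

# The ReSy bias uses the actual base: (B^e - 1) div 2.
resy_bias = (base ** e - 1) // 2           # 31

print(binary_style_bias, resy_bias)  # 3 31
```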