[Mono-dev] Is the `sizeof` opcode doing the right thing?
earlz at lastyearswishes.com
Wed Nov 14 03:04:52 UTC 2012
Hi, I've been messing with building a rather low-level data structure
to avoid the Large Object Heap.
One thing this required of me was to write a very small library
directly in IL. The function I implemented is this:
.method public static hidebysig
        default int32 SizeOf<T> () cil managed
{
    .maxstack 1
    sizeof !!T  // apply the sizeof opcode to the generic parameter
    ret         // (body reconstructed; the archived message is truncated here)
}
And I then test this using this:
public static void Main (string[] args)
{
    Console.WriteLine(BareMetal.SizeOf<Bar>()); //this is of concern
    int a, b;
}
I got the expected results with Microsoft's implementation, with
reference types being reported as either 4 or 8 bytes depending on the
platform. However, with Mono I get very surprising results: when
getting the sizeof `Bar`, it reports 32 on a 64-bit platform, when I
would expect it to be either 4 or 8.
In the ECMA spec, on page 423 of Partition III, it says: "For a
reference type, the size returned is the size of a reference value to
the corresponding type, not the size of the data stored in objects
referred to by a reference value."
I think that language is a tiny bit ambiguous. Basically, does this
mean Mono is correct in that it's storing a "reference value" as the
entire class (an optimization?), or is Mono doing the wrong thing by
not returning just the size of the pointer? I'm not sure about the
Mono internals or anything, so I suspect that it *might* be conforming
to the spec, but I'd really like it if someone could verify this for me.