[Mono-devel-list] Prevalence of pointer-integral-pointer casting in mono?
pcolson at connexus.net.au
Wed Aug 4 01:36:12 EDT 2004
On Tue, 2004-08-03 at 22:04, Jonathan Pryor wrote:
> On Mon, 2004-08-02 at 22:23, Peter Colson wrote:
> > Where does that leave us with respect to the IntPtr and UIntPtr types in
> > C# and the requirement for these types to be able to hold a pointer and
> > allow those contents to be treated as a pointer?
> > If the platform concerned has 128-bit pointers and no native integral
> > type to hold a pointer, is the ability to support (U)IntPtrs prevented?
> IntPtr and UIntPtr are supposed to be large enough to hold a pointer
> value. That's their entire purpose. So if you're targeting a 128-bit
> platform, then IntPtr and UIntPtr should be resized appropriately.
> I'm not sure how easy this would be to do from the runtime perspective, but
> such a change should be transparent to managed code.
First of all thanks for the response.
To clarify, the platform concerned is the AS/400. It's not a full-blown
128-bit platform as such; however, pointers are represented as 128-bit values.
At this point I am very much in research mode re. the feasibility of
porting a .Net runtime system to it, and am coming to grips with the
idiosyncrasies of the 400 platform. So these posts are sometimes a bit
of thinking aloud and not necessarily as concise as they could be.
This thread started with me questioning issues of
pointer-integral-pointer conversion; the question of IntPtr
representation only occurred to me subsequently, hence its absence from
the original post.
Re. pointer-int-pointer conversion, ILE C (the 400's native C compiler)
allows casting of one 128-bit pointer type to another, but once a
pointer has been cast to an integral type (int/long, 32 bits, or long long,
64 bits), information is lost, leaving an invalid pointer when the
integral value is cast back. End result: an exception on the
first attempt to de-ref the pointer. There doesn't appear to be an
integral type large enough to hold a complete pointer value.
This problem can be worked around using the PASE subsystem, which presents a
Unix-like address space, but at this point I'm looking at the
possibilities of a native port.
Part of my research has led me to code that uses otherwise-unused bits in
pointers to carry type information. In most cases this can be dealt
with by rewriting the expressions that get and set this information. But I
was curious whether this is a widespread practice in Mono, because if
it's not, that makes Mono a more attractive proposition for porting.
At this level the problem is confined to the internal C code
constituting the .Net runtime.
> > Furthermore, am I right in saying that any .Net-style runtime operating
> > on a platform is going to have recourse to using unsafe calls (at least
> > internally) requiring the use of (U)IntPtr's, even if C# code written on
> > that platform makes no use of unsafe code?
> Could you clarify that?
This question arose from feedback elsewhere indicating that the 128-bit
pointer problem manifests itself at the C# level in code that uses IntPtrs.
As I understand it, this should only occur in unsafe code. But the question in
my mind is whether, even if the author of a C# program doesn't use unsafe code,
the compiler would make use of unsafe code internally as a matter of course,
with this same IntPtr-related unsafe code winding up in the IL of any executable
produced by the compiler.
I think I'm also coming to the realisation that any C# application of substance
is going to be using unsafe code anyway to get things done...
> Let's put it this way: as currently implemented, Mono requires the use
> of IntPtrs. Just look at the System.IO source code (IIRC) -- IntPtrs
> are used as part of the internal call declarations into the Mono
> runtime. This is pretty much true for any part of the runtime that uses
> internal calls (which is a substantial fraction). If an IntPtr can't
> actually hold a pointer, you're screwed. Period.
This confirms my suspicion above, I think.
> Furthermore, the typical times pointer->integer conversions break down
> isn't when the integer isn't large enough to hold a pointer -- you can
> just use a larger integer and you're fine. The *real* problems show up
> when you have a segmented memory architecture, and there is no integer
> large enough to hold a pointer. In such a case, you may not have an
> integer large enough to hold a full pointer (example: the 8086 had a
> 20-bit address space but only 16-bit integers).
This is the situation faced on the AS/400.
> .NET isn't targeted at such platforms. I doubt Java could handle them
> either. Most C code isn't targeted at such platforms, either. C
> assumes a flat memory address space, and hacking C onto the 80x86 required
> introducing several pointer modifiers (near, far, huge, etc.). It is
> not fun (unless you're a masochist), and most people avoid such
> platforms (given a choice).
Agreed. The AS/400 does support Java, though. But it is
probably done at a level not available to the typical ILE C
developer, methinks. As mentioned above PASE provides a flat address
space, but natively it becomes more complicated.
> As a fallback, you could use a structure to hold the full pointer value
> and add some pack/unpack logic whenever IntPtr is used. This would
> suck, performance-wise, but it could work.
Agree with you here too. Were I to do this I'd be looking for a codebase to
start with that allows it to be done as easily as possible (though here I'm
really thinking of the pointer-int-pointer casting problem in C again).
> Furthermore, here's a better follow-up question: what do you do if you
> want to run .NET code on platforms that don't have 8-bit bytes? ;-)
> Byte, Int16, Int32, etc., are no longer appropriately sized, and this
> could cause compatibility problems. This limits portability.
> (Granted, this isn't a *major* problem today, but it was ~20 years ago.)
Yes. The situation isn't quite this dire on the 400; at least it thinks
in terms of 8-bit bytes...
> - Jon