[Mono-devel-list] Prevalence of pointer-integral-pointer casting in mono?

Peter Colson pcolson at connexus.net.au
Mon Aug 9 01:24:52 EDT 2004


On 04/08/2004, at 9:47 PM, Jonathan Pryor wrote:

> I need to seriously start trimming down this message...but I'll fail
> dismally.
>
> On Wed, 2004-08-04 at 01:36, Peter Colson wrote:
> <snip/>
>>> IntPtr and UIntPtr are supposed to be large enough to hold a pointer
>>> value.  That's their entire purpose.  So if you're targeting a 128-bit
>>> platform, then IntPtr and UIntPtr should be resized appropriately.
> <snip/>
>>
>> First of all thanks for the response.
>>
>> To clarify, the platform concerned is the AS/400. It's not a full-blown
>> 128-bit platform as such; however, pointers are represented as 128-bit
>> entities.
>
> Can you clarify "128-bit entities"?  Is there a *single* 128-bit
> register that is used for pointer addressing, or is a combination of
> registers used (such as the 80286 segment:offset).

The underlying architecture is PPC. I'm not sure of all the details, but
it doesn't use 128-bit registers. The 128-bit pointers are an OS/compiler
construct that provides a single address space (memory and disk together).

> If it's a single register, this should be reasonably straightforward:
> when JITting code, place all IntPtrs into the 128-bit registers.
> Things should Just Work (I hope).

JIT'ing in this environment is a whole separate issue. To this point,
I've been thinking primarily of interpretation. The 400 in native mode
only presents an MI (Machine Interface) layer that doesn't allow
register access.

>
> If it's a combination of registers, things are probably more hairy.
> You'd need to intercept every load/store of IntPtrs in the JIT and
> marshal them into a structure so you don't lose any pointer 
> information.
>
> <snip/>
>
>> Re. pointer-int-pointer conversion, ILE C (the 400's native C compiler)
>> allows casting of 128-bit pointers to another pointer, but once a
>> pointer has been cast to an integral (int/long, 32 bits, or long long,
>> 64 bits), information is lost that results in an invalid pointer when
>> the integral value is cast back to a pointer. End result: exception on
>> the first attempt to de-ref the pointer. There doesn't appear to be an
>> integral type large enough to hold a complete pointer value.
>
> The real question is how your 128-bit pointers are represented.  If
> they're a single register, there *should* be an integral type that can
> hold a complete pointer value (it may just have a special keyword, such
> as __int128 or something).  If pointer->integer->pointer conversions are
> a problem, you just need to make sure that the intermediate integer is
> of the appropriate type.

There is no 128-bit integral type; however (as referred to in Paolo's
note), there is the ability to use 48-bit pointers (via
'teraspace'-related compiler options). This hasn't been an option
previously because I'm on an earlier version of the OS, but it seems
that we have to target the later version, and then we can store
pointers in 'long long's.

<snip>

>> [T]he question in my mind is, even if the author of a C# program
>> doesn't use unsafe code, whether the compiler would make use of unsafe
>> code internally as a matter of course, with this same IntPtr-related
>> unsafe code winding up in the IL of any executable produced by the
>> compiler.
>
<snip>

> There is one slight complication: IntPtr isn't "unsafe"; it can be used
> by normal verifiable code.  This isn't a major complication from the
> security standpoint, though, as IntPtr is normally used for P/Invoke,
> and P/Invoke is outside of the sandbox (so if you're using DllImport
> the .NET security infrastructure won't let you run within a sandbox,
> disabling code downloading, etc.).
>

OK, this clarifies a few things re. IntPtr's and unsafe code. My concern
is less whether code is safe/unsafe than whether IntPtr's are generally
required - which it seems they are, simply to make things work.

<snip>

> To revisit an earlier point, if IntPtr is actually represented as a
> pointer on the hardware, you're fine.  You might need to see what ILE C
> generates when pointers are used, to make sure that IntPtr is
> represented the same way.
>
> <snip/>
>
> One final point: I wouldn't be surprised if Mono could run in your
> environment (though I wouldn't be surprised if you can't; I'm
> ambivalent), but you might have a problem with portability of user
> code.  Some existing programs/libraries (Gtk#) assume that pointers are
> either 32-bit or 64-bit, as (1) they need to perform pointer arithmetic,
> and (2) IntPtr doesn't provide the required operators to do pointer
> arithmetic.  Consequently existing code will cast an IntPtr to an
> appropriately-sized integer, perform the pointer arithmetic, and cast
> back -- a managed pointer->integer->pointer conversion.
>
> Obviously such managed code will break with 128-bit pointers.

Yes, I've seen C code elsewhere that says if a pointer's size is not 32
bits it MUST be 64 bits, and Paolo has pointed out the requirement to
port libgc for garbage collection. It makes me wonder whether 48-bit
pointers are still going to give problems, but at least it should be
more manageable. We'll see...

> The solution is to get operator+/operator- added to IntPtr so that
> pointer arithmetic can be performed without needing an intermediate
> integer type (which can lose pointer information).  I'm not sure how
> easy this would be to get into the standard, though.

Feels like I've got a few more hurdles to jump before worrying about
changing standards, tho'.

>
>  - Jon
>

__________________________
Regards,
Peter Colson,
Carringbush Software,
petercolson at mac.com
__________________________



