Hi,

I've got a fairly specific problem I've been struggling with for a couple of days.

I'm using a native C++ library; one of its methods takes a pointer to a struct containing fixed-size char arrays.

e.g.

struct userData {
    char data1[10];
    char data2[10];
};

method:

short AddItem(long id, userData* data);

I'm trying to call this from Managed VC++, but I need an instance of userData that I can keep hold of in my managed class.

Can anyone help with how to achieve this?

Thanks

A: 

Store a pointer to the data in the managed class and delete it in the destructor.

ref class MyManagedClass
{
    userData* myUserData;   // fields of a ref class start out zero-initialized

public:
    ~MyManagedClass()
    {
        delete myUserData;      // deleting a null pointer is harmless
        myUserData = nullptr;
    }

    short AddItem(long id, userData* data)
    {
        delete myUserData;
        myUserData = new userData(*data);   // keep a private copy on the native heap
        return 0;   // presumably forward to the native AddItem here
    }
};
nikie
+1  A: 

I use one of the following two containers when friendly interop with garbage collection is preferred:

template<typename T> ref class GcPlainPtr sealed {
    T*  ptr;
public:
    GcPlainPtr(T*ptr): ptr(ptr) { GC::AddMemoryPressure(sizeof(T)); }

    !GcPlainPtr() { 
        GC::RemoveMemoryPressure(sizeof(T)); 
        delete ptr; ptr = nullptr; 
    }

    ~GcPlainPtr() { this->!GcPlainPtr(); } //mostly just to avoid C4461

    T* get() {return ptr;}

    static T* operator->(GcPlainPtr<T>% gcPtr) { return gcPtr.ptr;}
    static operator T*(GcPlainPtr<T>% gcPtr) { return gcPtr.ptr; }
};

The previous container looks sufficient for your needs. You can use it as follows:

ref class MyManagedClass {
    GcPlainPtr<userData> myUserData;

public:
    MyManagedClass(...bla...)
        : myUserData(new userData(...))
        , ...
    {...}

    void AnotherMethod() {
        std::cout << myUserData->data1 << '\n';
        AddItem(1, myUserData.get());
    }
};

The advantage of this approach is that even if you forget to dispose of the objects, the reported memory pressure stays reasonably accurate, so garbage collection still runs at an appropriate frequency.

If you know roughly how much memory the element you're allocating takes, but it's not simply the size of the type itself (i.e. the native struct or class allocates memory internally), the following variant may be more appropriate:

template<typename T> ref class GcAutoPtr sealed {
    T*  ptr;
    size_t  size;
public:
    GcAutoPtr(T*ptr,size_t size) : ptr(ptr), size(size) { 
        GC::AddMemoryPressure(size);
    }

    !GcAutoPtr() {
        GC::RemoveMemoryPressure(size);
        size=0;
        delete ptr;
        ptr = nullptr;
    }

    ~GcAutoPtr() { this->!GcAutoPtr();} //mostly just to avoid C4461

    T* get() {return ptr;}

    static T* operator->(GcAutoPtr<T>% gcPtr) { return gcPtr.ptr;}
    static operator T*(GcAutoPtr<T>% gcPtr) { return gcPtr.ptr; }
};
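
Usage is analogous to GcPlainPtr; a minimal sketch, assuming purely for illustration that the type allocates around 64 extra bytes internally (the class name and the size estimate are hypothetical):

ref class MyOtherClass {
    GcAutoPtr<userData> myUserData;

public:
    MyOtherClass()
        : myUserData(new userData(), sizeof(userData) + 64)   // assumed internal allocation
    {}

    void UseIt() {
        AddItem(1, myUserData.get());
    }
};
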
Eamon Nerbonne
+1  A: 

You cannot store the instance in the managed class object and easily generate a pointer to it. That's fundamentally incompatible with the garbage collector: it moves managed objects around when it compacts the heap, which can happen at unpredictable times. Trying to pass a pointer to a userData member will generate a compiler error.

Some workarounds: allocate the userData instance on the native heap with new and store the pointer in the managed object. That is, however, fairly inefficient: you'll need to implement a destructor and a finalizer to release it (see the sketch below). Do this only if you expect to have a limited number of instances of the managed class.
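
A minimal sketch of that destructor-plus-finalizer pattern; the class and field names are placeholders:

ref class ItemHolder {
    userData* data;   // native instance on the C++ heap

public:
    ItemHolder() : data(new userData()) {}

    ~ItemHolder() { this->!ItemHolder(); }           // destructor (Dispose): deterministic cleanup
    !ItemHolder() { delete data; data = nullptr; }   // finalizer: safety net if Dispose is skipped
};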

The next solution is to generate the pointer at the time of the call, using pin_ptr<>. That pins the managed object in memory, preventing the garbage collector from moving it for as long as the pin_ptr is in scope. Not terribly efficient, of course.
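
One way that can look, as a sketch: since a native userData cannot be embedded in a ref class directly, the data is assumed here to live in a managed byte array of the right size, which is pinned just for the call (class and member names are hypothetical):

ref class PinnedCaller {
    array<unsigned char>^ buffer;   // holds the raw bytes of a userData

public:
    PinnedCaller() : buffer(gcnew array<unsigned char>(sizeof(userData))) {}

    short Add(long id) {
        pin_ptr<unsigned char> pinned = &buffer[0];   // pinned until 'pinned' goes out of scope
        unsigned char* raw = pinned;                  // pin_ptr converts to a plain native pointer
        return AddItem(id, reinterpret_cast<userData*>(raw));
    }
};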

Finally, you could declare an instance of userData as a local variable in the method that makes the call and copy the data held in the managed object into it. There's no problem generating pointers to stack variables; they cannot move. You are now also free to declare the data in your managed class whichever way you want. Assuming this struct isn't too large, that's what I would pick.
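
A sketch of that last option, assuming (hypothetically) that the managed class keeps the two fields as System::String^ and converts them with marshal_as just before the call:

#include <msclr/marshal_cppstd.h>
#include <cstring>

ref class CopyingCaller {
    System::String^ field1;   // managed storage for data1
    System::String^ field2;   // managed storage for data2

public:
    CopyingCaller() : field1(System::String::Empty), field2(System::String::Empty) {}

    short Add(long id) {
        userData ud = {};   // local native struct; a pointer to it is stable for the call
        std::string s1 = msclr::interop::marshal_as<std::string>(field1);
        std::string s2 = msclr::interop::marshal_as<std::string>(field2);
        strncpy_s(ud.data1, s1.c_str(), _TRUNCATE);
        strncpy_s(ud.data2, s2.c_str(), _TRUNCATE);
        return AddItem(id, &ud);   // pointer to a stack variable never moves
    }
};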

Hans Passant