views:

535

answers:

1

I'm learning some DirectX programming by re-implementing code from the DirectX samples in my own projects. I found, however, that the debugger seems to output data differently between the two projects (the sample and mine).

On my project if I do this:

D3DSURFACE_DESC desc;
pTarget->GetLevelDesc(0, &desc);  // pTarget is a texture; level 0 is the top mip
int width = desc.Width;           // desc.Width/Height are UINTs
int height = desc.Height;

When I have the debugger display width and height, my project shows the values in decimal, but the DirectX sample shows them in hexadecimal. Both are actually accurate: when I convert the hexadecimal values to decimal, they match.

Another (slightly) minor issue is the way the debugger gives me information about various DirectX-related pointers. For example, in my project, when I stop execution and hover the mouse over a pointer, I get garbage: Chinese characters, null pointers, and so on (the texture is still valid and works 100%). In the DirectX sample I don't get that; in fact it gives no information on the pointer beyond it being a DirectX base type (something my project doesn't do).

So I'm just wondering: why is this, and are there any debug options I could use to make the two projects behave the same?

+6  A: 

On the Debug toolbar (right-click the toolbar area and make sure Debug is checked) you can toggle the Hexadecimal Display button. This controls whether the debugger shows values in hexadecimal or decimal. You probably just have it set differently in the two projects.

You can also right-click a variable's value in the little inspection tooltip that appears as you hover over it and toggle Hexadecimal Display on or off there.
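If you want to override the global setting for individual expressions, Visual Studio's native debugger also accepts format specifiers appended to a watch expression (these names assume the C++ debugger; exact support varies by version). A sketch of what you would type in the Watch window:

```
width,x    // force hexadecimal display for this one expression
width,d    // force decimal display for this one expression
```

This is handy when you want most values in decimal but, say, flags or addresses in hex.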

jeffamaphone
just ran into this -- was wondering why all my visual studio debug values were displaying in hex -- thanks!!
Jeff Atwood