views:

626

answers:

3

I've noticed that an OpenGL app I've been working on performs significantly differently when run on Linux vs. Windows XP.

Granted, there are a lot of textures and shadow buffers, but I would estimate that the app runs about 10x slower on Windows XP.

Any ideas?

Any suggestions for porting the code to DirectX? Can that be done easily or would a re-write be needed?

It's running on different hardware. I don't have the specs of the Linux box, but my XP box is an Intel Core 2 Duo with an Nvidia Quadro FX 1500. The Linux box's video card was some sort of Nvidia GeForce (it was a university computer).

Some initialization code:

FlyWindow::FlyWindow() :
GlowWindow("fly", 300, 100, // GlowWindow::autoPosition, GlowWindow::autoPosition,
       700, 500,
       Glow::rgbBuffer | Glow::doubleBuffer |
       Glow::depthBuffer | Glow::multisampleBuffer,
       Glow::keyboardEvents | Glow::mouseEvents | Glow::dragEvents |
       /*Glow::menuEvents | */ Glow::motionEvents | Glow::visibilityEvents |
       Glow::focusEvents /* set ::glutEntryFunc */ ),

W(700), H(500),
flock(10),
lastSeconds(myclock.getSecondsSinceStart())
{
    myfps = FPScounter();

    GLdraw<float>::initGL(W,H);

    // Add a bouncing checkerboard
    MovingCB = Point3d<double>(50, 2, 50);

    Glow::RegisterIdle(this);
    bDebug = false;
    m_bLookAtCentroid = true;
    m_bLookAtGoal = false;
}

Thanks

A: 

The Quadro FX 1500 isn't exactly the newest card. Find out what the Linux box has and compare hardware specs. On my projects (display of dense signal data) I've found that OpenGL performance is pretty consistent between Linux and Windows.

basszero
+3  A: 

Comparing the Quadro to a GeForce is a big mistake. They may both be "graphics" cards but that is where the similarity ends.

The Quadro is designed for high end rendering and not games. From the wikipedia article on the Quadro:

Their designers aimed to accelerate CAD (Computer-Aided Design) and DCC (digital content creation), and the cards are usually featured in workstations. (Compared to the NVIDIA GeForce product-line, which specifically targets computer-gaming).

The Quadro is going to perform very differently than the GeForce, regardless of operating system.

James McMahon
GeForce and Quadro aren't as drastically different as they're made out to be. The architecture of the GPUs is very similar. What is different is the QA that goes into the Quadros so they can perform with a much higher MTBF.
codelogic
Ah, I had no idea they were so similar. However the drivers would be optimized for different applications and that can make a huge difference on performance.
James McMahon
Older Quadro cards always used GeForce chips that identified themselves with a different device ID. You could take, for example, a GeForce 2 and flash it with a Quadro VBIOS and it would be indistinguishable. I know more recent NVIDIA hardware diverges more on this front, but the hardware designs are virtually identical.
greyfade
+3  A: 

As DrJokepu mentioned in the comments, it's possible XP is employing software rendering, implying a driver installation issue. You can verify this by querying GL_VENDOR and GL_RENDERER:

printf( "%s\n", (const char*)glGetString( GL_VENDOR ) );
printf( "%s\n", (const char*)glGetString( GL_RENDERER ) );
printf( "%s\n", (const char*)glGetString( GL_VERSION ) );

The vendor should be NVIDIA, not Microsoft, and the renderer should name your GPU; Microsoft's "GDI Generic" renderer means you're on the unaccelerated software path. GL_VERSION tells you which OpenGL version the driver actually exposes.

codelogic
I did that; it's not rendering in software, yet it's still slow. I'm not really sure why... Anyway, I'm going to try to post some code somewhere in the future. Thanks!
cbrulak