views: 1715
answers: 3

I hope this is still considered programming-related; it may look like a pure hardware question, but it actually matters when doing graphics programming.

I am trying to get into game development, which naturally includes graphics programming (shaders etc.). For my workstation I want something very powerful, and essentially what the customers have: a GeForce or Radeon. But since I am planning to buy a laptop, I wonder whether Intel's X3100 and X4500 are suited for game development.

On paper, both support DirectX 10, OpenGL 2.1, and Shader Model 4.0. But as a player I remember countless incompatibilities in the past, and since I have no experience as a graphics programmer, I do not know the exact reasons for them.

Long story short: as a graphics programmer, would you refuse these chips outright because of the workarounds they require, or would you say they are good enough (that is, I would not need to provide different shaders or workarounds) and just not as fast as "proper" graphics cards?

+2  A: 

As a game programmer, I'd say:

  • Performance is not that important. If you have a faster video card than your potential customers, you won't know whether your game runs well on their machines. A reasonably slow graphics card is fine, and is actually useful for making sure a lot of people can play your game at a good framerate.
  • Compatibility is important. That said, you want compatibility with what is on the market, not necessarily with the graphics API. I don't know whether that is the case for these Intel cards.

I think it's nice to have two systems: one high-end with a good NVIDIA/AMD card, and one low-end with an Intel or cheap NVIDIA/AMD card. By testing on both you'll sort out most compatibility problems.
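
If you do find card-specific problems during that testing, one common (if crude) way to handle them is to branch on what the driver reports about itself. A minimal OpenGL sketch; it assumes a current GL context already exists, and the helper name is just an example:

    #include <GL/gl.h>
    #include <cstring>

    // Crude but common: inspect the vendor string the driver reports and
    // enable card-specific workarounds accordingly. The substring test is
    // deliberately simple.
    bool isIntelGraphics()
    {
        const char* vendor = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
        return vendor != 0 && std::strstr(vendor, "Intel") != 0;
    }

Usage would be something like if (isIntelGraphics()) { /* pick fallback shaders */ }.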

ckarmann
+1  A: 

It depends on what your target hardware is. If you're doing casual/small/medium-size games, you just can't ignore Intel graphics cards. Similarly, you can't ignore low-end GeForce/Radeon cards, which can often be as slow as Intel ones. For some statistics, see the Unity Web Player hardware stats.

So as a programmer, you most likely want to have that card somewhere so you can test on it. Whether it is good enough for your development machine again depends on what kind of games you're making (i.e., on the content complexity). I'd say it's good enough for casual/small/medium games, and definitely not enough if you're trying to make anything "next gen".

My experience with the X3100 is that the drivers are quite okay for Direct3D 9. The Windows OpenGL drivers are pretty bad, but then OpenGL drivers are generally worse on Windows for any card. On OS X a major part of the OpenGL driver is written by Apple, and nowadays it's quite okay; half a year ago (OS X 10.5.0-10.5.4) the X3100 drivers were quite broken there.
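
Because of that, on OpenGL it pays to check for each extension explicitly before relying on it, rather than trusting the advertised GL version. A rough sketch (again assumes a current context; the extension named in the usage comment is just an example):

    #include <GL/gl.h>
    #include <cstring>

    // With flaky drivers, test for each extension explicitly instead of
    // assuming the advertised GL version implies it works. Note this crude
    // substring match can also hit prefixes of longer extension names.
    bool hasExtension(const char* name)
    {
        const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        return exts != 0 && std::strstr(exts, name) != 0;
    }

    // e.g. if (!hasExtension("GL_ARB_texture_non_power_of_two")) { /* pad textures */ }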

That said, the X3100 was rumored to be DX10 capable, but as far as I know DX10 drivers were never released for it. So for practical purposes it is a Shader Model 3.0 card.
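
You can verify this on a given machine yourself. A minimal Direct3D 9 sketch (Windows only, link against d3d9.lib) that prints the shader model the driver actually exposes, rather than what the spec sheet claims:

    #include <d3d9.h>
    #include <cstdio>

    // Query the HAL device caps and print the vertex/pixel shader versions
    // the installed driver actually exposes.
    void printShaderModel()
    {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d)
            return;

        D3DCAPS9 caps;
        if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        {
            printf("VS %lu.%lu / PS %lu.%lu\n",
                   D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
                   D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion),
                   D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                   D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
        }
        d3d->Release();
    }

On a card like this you would expect it to print VS 3.0 / PS 3.0 under the shipping drivers.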

NeARAZ
A: 

It's not going to be the most performant option for a games developer, but it's certainly the level of graphics performance that a large part of the computing public has access to. In fairness to the X3100, it's a more welcome integrated part than what was previously shipped in notebooks: it actually supports 3D, offers a reasonable level of shader support, and performs much better than its predecessors. It's never going to deliver the best visuals available, but it gets you into a ballpark that is not at all unreasonable.

The biggest question to ask yourself is probably: what is your target market? If you're making casual games, you really need to treat the X3100 as your bread and butter. On the other hand, if you want to make the next AAA FPS, you might be able to ignore the X3100.

It's also supported quite well on multiple operating systems: Windows, Mac, Linux, and the BSDs. Let's just say it's where I wished the bottom of the market was for years; you can work with it, it's competent, but it's not going to win any contests.

Montdidier