I want to know what sorts of financial applications can be implemented using a GPGPU. I'm aware of option pricing and stock price estimation using Monte Carlo simulation on a GPGPU with CUDA. Can someone enumerate the various possibilities of utilizing a GPGPU for applications in the finance domain?

+1  A: 

F# is used a lot in finance, so you might check out these links

http://blogs.msdn.com/satnam_singh/archive/2009/12/15/gpgpu-and-x64-multicore-programming-with-accelerator-from-f.aspx

http://tomasp.net/blog/accelerator-intro.aspx

Brian
How can it be used a lot, given that the language is only 1-2 years old?
Adal
@Adal: F# is partially based on OCaml which is used and well-known in the finance business. Plus if you look, the first release of F# for the community is about 5 years old, so lots of people have delved into it already.
Stringer Bell
A: 

Answering the complement of your question: anything that involves accounting can't be done on a GPGPU (or with binary floating point, for that matter).
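A minimal illustration of the binary floating point problem that rules out accounting use (plain Python on the CPU; nothing GPU-specific here):

```python
from decimal import Decimal

# Binary floating point cannot represent 0.10 exactly, so repeatedly
# adding it drifts away from the exact total -- fatal when the books
# must balance to the cent.
float_total = sum(0.10 for _ in range(100))
print(float_total == 10.0)  # False

# Decimal arithmetic keeps exact base-10 values, as accounting requires.
exact_total = sum(Decimal("0.10") for _ in range(100))
print(exact_total == Decimal("10.00"))  # True
```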

Michiel Buddingh'
Yeah, so I want to know what the typical financial apps are that can be ported to a GPGPU platform
CUDA-dev
+1  A: 

High-end GPUs are starting to offer ECC memory (a serious consideration for financial and, eh, military applications) and high-precision types.

But it really is all about Monte Carlo at the moment.

You can go to workshops on it, and see from their descriptions that they'll focus on Monte Carlo.

Will
It is "Monte" not "Monty" - the famous casino.
Ross
+1  A: 

Basically, anything that requires a lot of parallel mathematics to run. As you originally stated, Monte Carlo simulation of options that cannot be priced with closed-form solutions is an excellent candidate. Anything that involves large matrices and operations upon them will be ideal; after all, 3D graphics uses a lot of matrix mathematics.
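As a sketch of the kind of embarrassingly parallel workload this means (every simulated path is independent of every other, which is exactly what maps onto GPU threads), here is a minimal CPU-side Monte Carlo pricer for a European call under geometric Brownian motion. It's plain Python for illustration, and the parameters are made up; a CUDA version would assign one path per thread.

```python
import math
import random

def mc_european_call(s0, k, r, sigma, t, n_paths, seed=42):
    """Monte Carlo price of a European call under geometric Brownian motion.

    Every path is independent of every other path, so on a GPU each
    path would be simulated by its own thread; here we loop on the CPU.
    """
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma ** 2) * t
    vol = sigma * math.sqrt(t)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)                # one standard normal per path
        s_t = s0 * math.exp(drift + vol * z)   # terminal stock price
        payoff_sum += max(s_t - k, 0.0)        # call payoff at maturity
    return math.exp(-r * t) * payoff_sum / n_paths  # discounted average

price = mc_european_call(s0=100.0, k=100.0, r=0.05, sigma=0.2,
                         t=1.0, n_paths=100_000)
print(price)  # close to the Black-Scholes value of about 10.45
```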

Given that many trader desktops have 'workstation'-class GPUs in order to drive several monitors, possibly with video feeds and limited 3D graphics (volatility surfaces, etc.), it would make sense to run some of the pricing analytics on the GPU rather than pushing the responsibility onto a compute grid; in my experience the compute grids are frequently struggling under the weight of EVERYONE in the bank trying to use them, and some of the grid computing products leave a lot to be desired.

Outside of this particular problem, there's not a great deal more that can be easily achieved with GPUs, because the instruction set and pipelines are more limited in their functional scope compared to a regular CISC CPU.

The problem with adoption has been one of standardisation; NVidia had CUDA, ATI had Stream. Most banks have enough vendor lock-in to deal with without hooking their derivative analytics (which many regard as extremely sensitive IP) into a gfx card vendor's acceleration technology. I suppose with the availability of OpenCL as an open standard this may change.

James Webster
+1  A: 

A good start would probably be to check NVIDIA's website:

Stringer Bell
+1  A: 

There are many financial applications that can be run on the GPU in various fields, including pricing and risk. There are some links on NVIDIA's Computational Finance page.

It's true that Monte Carlo is the most obvious starting point for many people. Monte Carlo is a very broad class of applications, many of which are amenable to the GPU. Many lattice-based problems can also be run on the GPU. Explicit finite difference methods run well and are simple to implement; there are many examples on NVIDIA's site as well as in the SDK, and they're also used a lot in Oil & Gas codes, so there's plenty of material. Implicit finite difference methods can also work well, depending on the exact nature of the problem; Mike Giles has a 3D ADI solver on his site, which also has other useful finance material.
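To make the explicit finite difference case concrete, here is a minimal CPU sketch of an explicit scheme for the Black-Scholes PDE, pricing a European call on a grid of stock prices. It's plain Python with made-up grid sizes; the point is that each grid node's update depends only on its neighbours at the previous time level, so on a GPU every node could be updated by its own thread.

```python
import math

def bs_call_explicit_fd(s0=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0,
                        m=150, n=2000, s_max=300.0):
    """Price a European call with an explicit finite difference scheme.

    Each node's update uses only its neighbours at the previous time
    level, so all interior nodes could be updated in parallel (one GPU
    thread per node). Here we just loop on the CPU.
    """
    ds = s_max / m
    dt = t / n  # n must be large enough for explicit-scheme stability
    # Terminal condition: the option payoff at maturity, on the price grid
    v = [max(j * ds - k, 0.0) for j in range(m + 1)]
    for step in range(n):  # march backward in time from maturity
        tau = (step + 1) * dt  # elapsed time back from maturity
        new = [0.0] * (m + 1)
        for j in range(1, m):
            a = 0.5 * dt * (sigma ** 2 * j ** 2 - r * j)
            b = 1.0 - dt * (sigma ** 2 * j ** 2 + r)
            c = 0.5 * dt * (sigma ** 2 * j ** 2 + r * j)
            new[j] = a * v[j - 1] + b * v[j] + c * v[j + 1]
        new[0] = 0.0                              # call worthless at S = 0
        new[m] = s_max - k * math.exp(-r * tau)   # deep in-the-money boundary
        v = new
    # Linearly interpolate the grid to the spot price we actually want
    j = int(s0 / ds)
    w = (s0 - j * ds) / ds
    return (1.0 - w) * v[j] + w * v[j + 1]

print(bs_call_explicit_fd())  # close to the Black-Scholes value of about 10.45
```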

GPUs are also good for linear algebra type problems, especially where you can leave the data on the GPU long enough to do a reasonable amount of work. NVIDIA provides cuBLAS with the CUDA Toolkit, and you can get cuLAPACK too.

Tom