I'm looking for interesting algorithms for image magnification that can be implemented on a GPU for real-time scaling of video. Linear and bicubic interpolation algorithms are not good enough.

Suggestions?

Here are some papers I've found, though I'm unsure about their suitability for GPU implementation.

Adaptive Interpolation

Level Set

I've seen some demos of scaling on the Cell processor used in TVs, which had some impressive results; no link, unfortunately.

+3  A: 

Lanczos3 is a very nice interpolation algorithm (you can test it in GIMP or VirtualDub). It generally performs better than cubic interpolation and can be parallelized.
A GPU-based version is implemented in Chromium:
http://code.google.com/p/chromium/issues/detail?id=47447
Check out the Chromium source code.

It may still be too slow for real-time video processing, but it could be worth trying if you don't use too high a resolution.
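For reference, a minimal CUDA sketch of the idea (single-channel float image, 6-tap window, clamped borders) might look like the following. This is an illustrative sketch, not Chromium's actual implementation; the kernel name and signature are made up.

```cuda
#include <cuda_runtime.h>
#include <math.h>

// Lanczos3 weight: sinc(x) * sinc(x/3) for |x| < 3, else 0.
__device__ float lanczos3(float x)
{
    x = fabsf(x);
    if (x < 1e-5f) return 1.0f;   // avoid 0/0 at the centre tap
    if (x >= 3.0f) return 0.0f;
    float px = 3.14159265f * x;
    return 3.0f * sinf(px) * sinf(px / 3.0f) / (px * px);
}

// Magnify from (srcW, srcH) to (dstW, dstH) by gathering a 6x6 tap
// window around each output sample. Hypothetical name and signature.
__global__ void lanczos3Upscale(const float* src, int srcW, int srcH,
                                float* dst, int dstW, int dstH)
{
    int ox = blockIdx.x * blockDim.x + threadIdx.x;
    int oy = blockIdx.y * blockDim.y + threadIdx.y;
    if (ox >= dstW || oy >= dstH) return;

    // Map the output pixel centre back into source coordinates.
    float sx = (ox + 0.5f) * srcW / dstW - 0.5f;
    float sy = (oy + 0.5f) * srcH / dstH - 0.5f;
    int ix = (int)floorf(sx);
    int iy = (int)floorf(sy);

    float sum = 0.0f, wsum = 0.0f;
    for (int j = -2; j <= 3; ++j) {
        int y = min(max(iy + j, 0), srcH - 1);   // clamp at borders
        float wy = lanczos3(sy - (float)(iy + j));
        for (int i = -2; i <= 3; ++i) {
            int x = min(max(ix + i, 0), srcW - 1);
            float w = wy * lanczos3(sx - (float)(ix + i));
            sum  += w * src[y * srcW + x];
            wsum += w;
        }
    }
    dst[oy * dstW + ox] = sum / wsum;  // normalise so weights sum to 1
}
```

A production version would typically split the filter into separable horizontal and vertical passes and read the source through a texture sampler, which is much cheaper than this direct 6x6 gather.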

Ugo
+1  A: 

Still slightly 'work in progress', but GpuCV is a drop-in replacement for the OpenCV image processing functions, implemented in OpenCL on the GPU.

Martin Beckett
A: 

You may want to have a look at super-resolution algorithms. Starting point on CiteSeerX.
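To make the multi-frame idea concrete, here is a minimal, hypothetical CUDA sketch of the classic shift-and-add fusion step. It assumes the per-frame sub-pixel offsets have already been estimated (e.g. from feature correspondences, as discussed in the comments below); all names and the frame layout are illustrative assumptions, not any specific paper's method.

```cuda
#include <cuda_runtime.h>

// One registered low-resolution frame with its sub-pixel offset
// (estimated beforehand, e.g. from SIFT matches + an affine fit).
struct Frame { const float* pixels; float dx, dy; };

// Naive "shift-and-add" super-resolution: scatter every low-res pixel
// of every aligned frame onto a 'scale'-times finer grid, then divide
// by the hit count. Float atomicAdd needs compute capability >= 2.0.
__global__ void shiftAndAdd(const Frame* frames, int nFrames,
                            int lrW, int lrH, int scale,
                            float* accum, float* weight)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= lrW || y >= lrH) return;

    int hrW = lrW * scale, hrH = lrH * scale;
    for (int f = 0; f < nFrames; ++f) {
        // Position of this sample on the high-res grid, shifted by
        // the frame's estimated sub-pixel displacement.
        int hx = (int)((x + frames[f].dx) * scale + 0.5f);
        int hy = (int)((y + frames[f].dy) * scale + 0.5f);
        if (hx < 0 || hx >= hrW || hy < 0 || hy >= hrH) continue;
        atomicAdd(&accum[hy * hrW + hx], frames[f].pixels[y * lrW + x]);
        atomicAdd(&weight[hy * hrW + hx], 1.0f);
    }
}
// A second pass would compute accum/weight per pixel and fill holes
// (e.g. with ordinary interpolation) where no frame landed.
```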

bjoernz
@ronag I would be interested in your feedback on SR if you eventually use it for real-time video magnification. For me, SR is only interesting if you have multiple, perfectly aligned images of the same scene.
Ugo
@Ugo I have seen a demo of an SR implementation. They were able to align the images using correspondences (SIFT, etc.) and affine transformations. The results were quite impressive, but I doubt that it was done in real time; they only showed the input images and the result.
bjoernz
@bjoernz I saw that too. The results are crazy! Like in CSI ;)
Ugo
+2  A: 

You may also want to try out CUVI Lib, which offers a good set of GPU-accelerated image processing algorithms. Find out more at: http://www.cuvilib.com

Disclosure: I am part of the team that developed CUVI.

Salman Ul Haq