I'm looking for interesting algorithms for image magnification that can be implemented on a GPU for real-time scaling of video. Linear and bicubic interpolation algorithms are not good enough.
Suggestions?
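For context, this is roughly the kind of bicubic pass I'm comparing against. It's only a minimal CUDA sketch for a single-channel float image; the buffer layout, Catmull-Rom weights, and destination-to-source mapping are my own illustrative choices, and a real video path would obviously work on texture objects and packed RGBA/NV12 data instead.

```cuda
#include <cuda_runtime.h>

__device__ float catmullRom(float p0, float p1, float p2, float p3, float t) {
    // Catmull-Rom cubic: interpolates between p1 and p2 for t in [0,1].
    return 0.5f * ((2.0f * p1) +
                   (-p0 + p2) * t +
                   (2.0f * p0 - 5.0f * p1 + 4.0f * p2 - p3) * t * t +
                   (-p0 + 3.0f * p1 - 3.0f * p2 + p3) * t * t * t);
}

__device__ float fetch(const float* src, int w, int h, int x, int y) {
    // Clamp-to-edge addressing.
    x = min(max(x, 0), w - 1);
    y = min(max(y, 0), h - 1);
    return src[y * w + x];
}

__global__ void upscaleBicubic(const float* src, int srcW, int srcH,
                               float* dst, int dstW, int dstH) {
    int dx = blockIdx.x * blockDim.x + threadIdx.x;
    int dy = blockIdx.y * blockDim.y + threadIdx.y;
    if (dx >= dstW || dy >= dstH) return;

    // Map the destination pixel centre back into source coordinates.
    float sx = (dx + 0.5f) * srcW / dstW - 0.5f;
    float sy = (dy + 0.5f) * srcH / dstH - 0.5f;
    int ix = (int)floorf(sx), iy = (int)floorf(sy);
    float fx = sx - ix,      fy = sy - iy;

    // Separable filter: cubic along x for four neighbouring rows,
    // then one cubic along y over those row results.
    float rows[4];
    for (int r = -1; r <= 2; ++r) {
        rows[r + 1] = catmullRom(fetch(src, srcW, srcH, ix - 1, iy + r),
                                 fetch(src, srcW, srcH, ix,     iy + r),
                                 fetch(src, srcW, srcH, ix + 1, iy + r),
                                 fetch(src, srcW, srcH, ix + 2, iy + r),
                                 fx);
    }
    dst[dy * dstW + dx] = catmullRom(rows[0], rows[1], rows[2], rows[3], fy);
}
```

This runs fast enough for real-time video, but the results are still too soft on edges, which is why I'm after something better.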
Here are some papers I've found, though I'm unsure about their suitability for GPU implementation.
I've also seen some demos of scaling on the Cell processor used in TVs, which had some impressive results, but unfortunately I have no link.