I am planning to implement a feature like "Look up in Dictionary": when the mouse hovers over some text/words on the screen on the Mac OS X platform, I want to act on that text.
How can I get the text/words displayed near the mouse on the screen, even if they are not in my own application?
What I can do is:
- Use NSWorkspace to find out which applications are running.
- Use the Accessibility API to get the topmost UIElement under the mouse on the screen.
- Use the Accessibility API to get the selected string of a UIElement if the application uses something like NSTextView (I can get it by checking the "selected text" attribute of the UIElement).
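For context, the three steps above can be sketched with NSWorkspace and the AXUIElement C API roughly as follows. This is only a minimal sketch, assuming the process has been granted Accessibility access ("Enable access for assistive devices"); which attributes an element actually supports varies per application.

```swift
import AppKit
import ApplicationServices

// 1. List the running applications via NSWorkspace.
for app in NSWorkspace.shared.runningApplications {
    print(app.localizedName ?? "?", app.processIdentifier)
}

// 2. Find the UI element under the current mouse location.
let systemWide = AXUIElementCreateSystemWide()
let mouse = NSEvent.mouseLocation
// AX coordinates are top-left based; Cocoa's are bottom-left, so flip y.
let screenHeight = NSScreen.screens.first?.frame.height ?? 0
var element: AXUIElement?
let err = AXUIElementCopyElementAtPosition(systemWide,
                                           Float(mouse.x),
                                           Float(screenHeight - mouse.y),
                                           &element)

// 3. Read the "selected text" attribute, if the element supports it.
if err == .success, let element = element {
    var value: CFTypeRef?
    if AXUIElementCopyAttributeValue(element,
                                     kAXSelectedTextAttribute as CFString,
                                     &value) == .success,
       let text = value as? String {
        print("Selected text:", text)
    }
}
```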
What I can't do is:
- For some applications, such as Safari, which use the WebKit framework, the only things I can get from the Accessibility API are the "value" attribute, which holds the current HTML content, and some attributes with names like "AXTextMarker". These attributes cannot be found on Google or in any documentation.
- Some applications do not support the Accessibility API at all; all I can get from them is a screenshot.
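To show what I have been probing with: the sketch below (untested, and assuming Accessibility access is granted) is roughly how I (a) enumerate every attribute an element exposes, which is where the undocumented "AXTextMarker"-style names show up for WebKit, and (b) fall back to capturing a screenshot of the region around the mouse when an application exposes nothing.

```swift
import AppKit
import ApplicationServices
import CoreGraphics

// (a) Dump every attribute name a UI element exposes, to see what
//     WebKit-based applications actually publish.
func dumpAttributes(of element: AXUIElement) {
    var names: CFArray?
    if AXUIElementCopyAttributeNames(element, &names) == .success,
       let names = names as? [String] {
        print("Exposed attributes:", names)
    }
}

// (b) Fallback: capture a small on-screen region around the mouse.
//     The 200x40 region size is an arbitrary choice for illustration.
func captureAroundMouse() -> CGImage? {
    let mouse = NSEvent.mouseLocation
    let screenHeight = NSScreen.screens.first?.frame.height ?? 0
    // CGWindowList coordinates are top-left based; Cocoa's are bottom-left.
    let rect = CGRect(x: mouse.x - 100,
                      y: (screenHeight - mouse.y) - 20,
                      width: 200,
                      height: 40)
    return CGWindowListCreateImage(rect,
                                   .optionOnScreenOnly,
                                   kCGNullWindowID,
                                   .bestResolution)
}
```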
Is there any way to call a system API to recognize the text in an image? Snow Leopard has a handwriting-recognition feature for Chinese and Japanese input, so there should be some OCR capability, but I can't find any public API for it.