Here's the deal: is there a way to tokenize a line of text based on multiple regexes?
One example:
I have to extract all href attributes, their corresponding link text, and some other text based on a different regex. So I have three expressions, and I would like to tokenize the line and extract the tokens matching each expression.
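To make it concrete, here's the kind of multi-pattern scan I'm imagining, sketched in Python. The three patterns are simplified placeholders for my real expressions, and the named-group trick is just one way I can think of to tell the matches apart:

```python
import re

# Placeholder patterns -- my real three expressions are more involved.
HREF  = r'href="(?P<href>[^"]*)"'    # href attribute values
TEXT  = r'>(?P<text>[^<]+)</a>'      # the text inside the <a> element
OTHER = r'(?P<word>\b[A-Z]\w+\b)'    # stand-in for "some other text"

# Combine the three expressions into one alternation and scan the line.
pattern = re.compile('|'.join([HREF, TEXT, OTHER]))

line = '<a href="http://example.com">Example</a> and Some other Text'
for m in pattern.finditer(line):
    kind = m.lastgroup               # name of the group that matched
    print(kind, '=>', m.group(kind))
```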
I have actually done this using flex (not to be confused with the Adobe product), which is an implementation of the good old lex. lex provides an elegant way to do this by executing "actions" when expressions match. You can also control how lex reads the input (block- or line-based reads).
The problem is that flex produces C/C++ code that does the actual tokenizing, and I have a makefile that wraps all of this up. I was wondering whether Perl or Python can do the same thing in some way. I would just like to do everything in a single programming language.
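For instance, this is roughly how I would translate the lex rule-plus-action idiom into plain Python: a table of (pattern, action) pairs tried in order at the current position. The patterns and handlers here are made up, and unlike lex this picks the first matching rule rather than the longest match:

```python
import re

def handle_href(m):  print('HREF :', m.group(1))
def handle_text(m):  print('TEXT :', m.group(1))
def handle_word(m):  print('WORD :', m.group(0))

# lex-style rule table: (compiled pattern, action to run on a match).
rules = [
    (re.compile(r'href="([^"]*)"'), handle_href),
    (re.compile(r'>([^<]+)</a>'),   handle_text),
    (re.compile(r'[A-Za-z]{2,}'),   handle_word),  # catch-all for bare words
]

def tokenize(line):
    pos = 0
    while pos < len(line):
        for pattern, action in rules:
            m = pattern.match(line, pos)  # anchored at the current position
            if m:
                action(m)
                pos = m.end()
                break
        else:
            pos += 1  # nothing matched here; skip one character

tokenize('<a href="http://example.com">Example</a> and some other text')
```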
Tokenizing is just one of the things that I want to do as part of my application.
Apart from Perl and Python, can any other language (functional ones included) do this?
I did read about PLY and ANTLR here (http://stackoverflow.com/questions/34081/parsing-where-can-i-learn-about-it#34085).
But is there a way to do this naturally in Python itself? Pardon my ignorance, but are these tools used in any popular products / services?
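While digging around, I did see mentions of an undocumented re.Scanner class in CPython's re module that looks very close to the lex pattern/action model. Since it isn't in the official docs I can't vouch for it, but a minimal sketch would look like this:

```python
import re

# Undocumented re.Scanner: a list of (pattern, action) pairs, lex-style.
# An action of None means "match and discard"; otherwise it is called
# with (scanner, matched_text) and its return value is collected.
scanner = re.Scanner([
    (r'href="[^"]*"',  lambda s, tok: ('HREF', tok)),
    (r'>[^<]+</a>',    lambda s, tok: ('TEXT', tok)),
    (r'[A-Za-z]{2,}',  lambda s, tok: ('WORD', tok)),
    (r'\s+',           None),   # skip whitespace
    (r'.',             None),   # skip any other single character
])

tokens, remainder = scanner.scan('<a href="http://example.com">link text</a>')
print(tokens)     # [('HREF', ...), ('TEXT', ...)]
print(remainder)  # '' -- everything was consumed
```

If anyone knows whether relying on re.Scanner is safe in practice, that would be good to hear too.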
Thank you.