I'm trying to write a parser with Happy (the Haskell tool), but I'm getting an error message: "unused rules: 11 and unused terminals: 10" and I don't know what it means. On the other hand, I'm really not sure about the use of the $i parameters in the actions of the rules; I think my error comes from that. If anyone can help me...
Unused rules and terminals are parts of your grammar that cannot be reached from the top-level parse rules, if I recall correctly. To see how to use the $$ parameters, read the Happy user guide.
The $$ symbol is a placeholder that represents the value of this token. Normally the value of a token is the token itself, but by using the $$ symbol you can specify some component of the token object to be the value.
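For example, assuming a hypothetical token type data Token = TokenInt Int | TokenPlus, a %token declaration can use $$ to select the Int field as the token's value:

%token
int { TokenInt $$ } -- the value of `int` is the wrapped Int, not the whole token
'+' { TokenPlus }   -- no $$: the value is the token itself

With that declaration, $1 in a rule whose first symbol is int gives you the Int directly.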
It's not an error if you get these messages; it just means that part of your grammar is unused because it is not reachable from the start symbol. To see more information about how Happy understands your grammar, pass the --info flag to Happy:

happy --info MyParser.y

which generates a file MyParser.info in addition to the usual MyParser.hs.
Unused rules and terminals mean you have described rules that can't be reached during parsing (much like "if true then 1 else 2", where the 2 branch will never be reached). Check the output of --info for more details.
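As an illustration, here is a minimal, hypothetical grammar file that should trigger both warnings: orphan is never referenced from the start symbol expr, and '-' appears in no rule at all (the module, token names, and rules are made up for this sketch):

{
module Unused where
}

%name parse expr
%tokentype { Token }
%error { parseError }

%token
num { TokenNum $$ }
'+' { TokenPlus }
'-' { TokenMinus } -- declared but used in no rule: an unused terminal

%%

expr : num '+' num { $1 + $3 }

orphan : num { $1 } -- nothing reaches orphan from expr: an unused rule

{
parseError :: [Token] -> a
parseError _ = error "parse error"

data Token = TokenNum Int | TokenPlus | TokenMinus
}

Running happy on this file should report one unused rule and one unused terminal, while still generating a working parser for expr.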
For the $$ thing: it is a data extractor. Let's say you have a lexer that produces tokens of the following type:
data TokenType = INT | SYM
data TokenLex = L TokenType String
where TokenType is there to distinguish useful data from keywords.
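For instance, such tokens could come from a hand-written lexer along these lines (a hypothetical sketch; a real project would more likely use Alex):

import Data.Char (isDigit, isSpace)

-- Group runs of digits into INT tokens; any other run of
-- non-space characters becomes a SYM token.
lexer :: String -> [TokenLex]
lexer [] = []
lexer (c:cs)
  | isSpace c = lexer cs
  | isDigit c = let (ds, rest) = span isDigit (c:cs)
                in L INT ds : lexer rest
  | otherwise = let (s, rest) = break (\x -> isDigit x || isSpace x) (c:cs)
                in L SYM s : lexer rest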
In the %token declarations of your parser, you can tell Happy to extract the String part by using $$:
%token
INTEGER { L INT $$ }
OTHER   { L _ $$ }
foo : INTEGER bar INTEGER { read $1 + read $3 }
| ...
In this rule, $1 means "give me the content of the first INTEGER" and $3 "give me the content of the second INTEGER". $2 means "give me the content of bar" (which may be another complex rule).
Thanks to $$, $1 and $3 are genuine Haskell Strings, because we told Happy that the content of an INTEGER is the String part of the TokenLex, not the whole token.
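To see it end to end, here is a hypothetical usage sketch: assuming the grammar declares %name parseFoo foo and has a rule bar : OTHER { $1 }, Happy generates a function that consumes a list of TokenLex values:

-- parseFoo :: [TokenLex] -> Integer (hypothetical; the result type
-- comes from the rule's action, read $1 + read $3).
tokens :: [TokenLex]
tokens = [L INT "2", L SYM "+", L INT "3"]

main :: IO ()
main = print (parseFoo tokens) -- read "2" + read "3" == 5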