I'm trying to count the number of success cases inside a recursive function in C#, but I'm astonished that my local variable appears to be shared between all the recursive calls!
[update 2]
Stranger still this time: doing

i = i + validTreesFun(tree.Nodes, newWords.ToList());

resets i to 0, while doing

i = validTreesFun(tree.Nodes, newWords.ToList()) + i;

gives some results (I'm not sure whether they are correct).
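To make the pattern clearer, here is a stripped-down sketch of what I mean (hypothetical names, not my real code):

static int countFun(int n)
{
    int i = 0;
    if (n == 0)
        return 1;
    i = i + countFun(n - 1); // stepping into this call, the debugger shows i == 0 again
    return i;
}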
[update: the full code]
using System.Collections.Generic;
using System.Linq;

public static int validTreesFun(List<Tree<char>> nodes, List<string> words)
{
    int i = 0;

    // a leaf reached with no letters left to match counts as one valid tree
    if (nodes == null && (words == null || words.Count == 0 || (words.Count == 1 && words.First() == "")))
        return 1;
    else if (nodes == null)
        return 0;

    foreach (Tree<char> tree in nodes)
    {
        // keep only the words whose first character matches this subtree's root
        var validWords = words.Where(w => w.ToCharArray()[0] == tree.Root);

        if (validWords.Count() == 0)
            return 0;
        else
        {
            // drop the matched first character and recurse into the subtree
            var newWords = validWords.Select(w => join(w.ToCharArray().Skip(1).ToArray()));
            i += validTreesFun(tree.Nodes, newWords.ToList());
        }
    }
    return i;
}
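For reference, the code above assumes a Tree<T> type with a Root value and child Nodes, plus a small join helper that turns a char[] back into a string. The real definitions aren't shown here; a minimal version consistent with the calls above would be something like:

using System.Collections.Generic;

public class Tree<T>
{
    public T Root;              // value stored at this node
    public List<Tree<T>> Nodes; // child subtrees; null for a leaf
}

static string join(char[] chars)
{
    return new string(chars);   // rebuild a string from the remaining characters
}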
When debugging, the variable i takes the value 1 but resets to 0 on the next iteration, despite the use of

i = i + ...

What is the problem with this piece of code?
Thank you