A mission-critical production system has n stages that have to be performed sequentially; stage i is performed by machine M_i. Each machine M_i has a probability f_i of failing and a probability 1 - f_i of functioning reliably (and the failures are independent). Therefore, if we implement each stage with a single machine, the probability that the whole system works is (1 - f_1)*(1 - f_2)...(1 - f_n). To improve this probability we add redundancy, by having m_i copies of the machine M_i that performs stage i. The probability that all m_i copies fail simultaneously is only (f_i)^(m_i), so the probability that stage i is completed correctly is 1 - (f_i)^(m_i) and the probability that the whole system works is (1 - (f_1)^(m_1))*(1 - (f_2)^(m_2))...(1 - (f_n)^(m_n)). Each machine M_i has a cost c_i, and there is a total budget B to buy machines. (Assume that B and c_i are positive integers.) Given the probabilities f_1,...,f_n, the costs c_1,...,c_n, and the budget B, find the redundancies m_1,...,m_n that are within the available budget and that maximize the probability that the system works correctly.
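Just to fix notation for myself, here is how I am reading that last formula as code (the function name and the example numbers are my own, not from the problem):

```python
def system_success_probability(f, m):
    """Product over all stages of (1 - f_i^(m_i)): the chance that every
    stage has at least one working copy of its machine."""
    p = 1.0
    for f_i, m_i in zip(f, m):
        p *= 1.0 - f_i ** m_i
    return p

# Two stages, each failing with probability 0.1, two copies each:
# (1 - 0.01) * (1 - 0.01), i.e. about 0.9801.
print(system_success_probability([0.1, 0.1], [2, 2]))
```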
I understand the probabilities given and how adding redundancy improves the probability that the system works. My confusion comes from:
What is the relationship between the budget B and the costs c_1,...,c_n? I think that the total cost of the system would be Sum(1, n) of m_i*c_i. Is this correct? And how does this relate to B? (I've tried to spell out my guess in the sketch after these two points.)
How do you maximize the probability that the system works correctly? My first thought was to look for local minima/maxima, but that just leads me to no redundancy at all, since that costs nothing extra, and that doesn't seem like a reasonable answer. (The end of the sketch below is my attempt to sanity-check this.)
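To make my current reading concrete, here is a small sketch. I am assuming the budget enters as the constraint Sum(1, n) of m_i*c_i <= B, which is exactly the part I am unsure about, and the numbers are made up:

```python
def total_cost(m, c):
    """Sum over stages of m_i * c_i: what the chosen redundancies cost."""
    return sum(m_i * c_i for m_i, c_i in zip(m, c))

def within_budget(m, c, B):
    """Assumed constraint: the machines bought must not exceed the budget B."""
    return total_cost(m, c) <= B

# Made-up instance: per-copy costs 4 and 5, budget 25.
print(within_budget([2, 3], [4, 5], 25))  # 2*4 + 3*5 = 23 <= 25 -> True
print(within_budget([3, 3], [4, 5], 25))  # 3*4 + 3*5 = 27 >  25 -> False

# Why "no extra redundancy" doesn't look like a maximum to me: for a fixed
# stage, 1 - f_i^(m_i) only goes up as m_i grows, so the budget seems to be
# the only thing stopping us from buying more copies.
f_i = 0.3
for m_i in range(1, 5):
    print(m_i, 1 - f_i ** m_i)  # roughly 0.7, 0.91, 0.973, 0.9919 -- strictly increasing
```

If that reading of B is right, then I think the problem is asking to maximize the product from the first sketch subject to this budget check, which is where my two confusion points meet.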
Anyway, this is a homework question, and I'm not looking for the answer per se. I would just like some insight into how the variables relate, because right now I don't see it and the question doesn't quite make sense to me.