As others have pointed out, there is no solution that works 100% of the time; you have to resort to rejection sampling.
In general, I second Cletus's answer, but note that his algorithm yields one result from two dice with probability 5/6 (two dice give 36 equally likely outcomes, of which 30 are kept, so 30/36 = 5/6), meaning that the expected "number of results per die" is (5/6)/2 = 5/12 ~= 0.417. Multiplying the latter by the entropy of one of your random results,
which is
-(0.1*log2(0.1) + 0.2*log2(0.2) + 0.3*log2(0.3) + 0.4*log2(0.4)) ~= 1.846
we obtain 0.770. In other words, we are extracting, on average, only 0.770 bits of information from each die. We can do better than this.
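Here is a minimal Python sketch of that two-dice scheme, assuming (to match the digit mapping given later in this answer) that results 1, 2, 3, 4 have probabilities 0.4, 0.3, 0.2, 0.1; the function names are mine, not Cletus's:

```python
import random

def roll_die():
    return random.randint(1, 6)  # stand-in for a physical die

def result_from_two_dice():
    """Return one result in {1, 2, 3, 4}, or None if the pair is rejected."""
    n = 6 * (roll_die() - 1) + (roll_die() - 1)  # uniform on 0..35
    if n >= 30:
        return None  # rejected: probability 6/36 = 1/6
    # The 30 kept values split 12/9/6/3, i.e. probabilities 0.4/0.3/0.2/0.1
    if n < 12:
        return 1
    elif n < 21:
        return 2
    elif n < 27:
        return 3
    else:
        return 4
```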
For example, throwing 9 dice you have 6^9 = 10077696 possible outcomes. Following Cletus, read the nine faces as the digits of a base-6 number from 0 to 10077695, and keep it only if it falls between 0 and 9999999 (this happens with probability 10^7/6^9 ~= 0.992). When this is the case, the number gives you 7 decimal digits with uniform distribution, and from each of these you can extract a random result as in your problem (a Python sketch follows below):
0,1,2,3 --> 1
4,5,6 --> 2
7,8 --> 3
9 --> 4
This way we have 7 random results every 9 dice with probability 0.992, or an average "number of results per die" of 0.992*7/9 ~= 0.772. Multiplying this by the entropy of a result, we have 1.846*0.772 ~= 1.425. Thus, in this way we are using on average 1.425 bits from every die.
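A sketch of this nine-dice variant, under the same assumed target distribution as above:

```python
import random

# Digit-to-result mapping from the table above:
DIGIT_TO_RESULT = {0: 1, 1: 1, 2: 1, 3: 1,  # probability 0.4
                   4: 2, 5: 2, 6: 2,        # probability 0.3
                   7: 3, 8: 3,              # probability 0.2
                   9: 4}                    # probability 0.1

def results_from_nine_dice():
    """Return a list of 7 results, or None if the batch is rejected."""
    n = 0
    for _ in range(9):
        n = 6 * n + (random.randint(1, 6) - 1)  # uniform on 0..6**9 - 1
    if n >= 10**7:
        return None  # rejected: probability 1 - 10**7/6**9 ~= 0.008
    # n is now uniform on 0..9999999, so its 7 decimal digits
    # (least-significant first; order is irrelevant) are i.i.d. uniform.
    digits = [(n // 10**k) % 10 for k in range(7)]
    return [DIGIT_TO_RESULT[d] for d in digits]
```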
We can probably do better by throwing more dice or by adopting another technique. Of course, an upper bound is the entropy of a die, which is log2(6) ~= 2.585 bits per die.
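For reference, a few lines of Python reproduce all the numbers quoted above (again assuming the probabilities 0.1, 0.2, 0.3, 0.4):

```python
from math import log2

p = [0.1, 0.2, 0.3, 0.4]
H = -sum(q * log2(q) for q in p)        # entropy per result, ~= 1.846 bits

two_dice  = (5/6) * (1/2) * H           # ~= 0.770 bits per die
nine_dice = (10**7 / 6**9) * (7/9) * H  # ~= 1.425 bits per die
bound     = log2(6)                     # upper bound, ~= 2.585 bits per die

print(H, two_dice, nine_dice, bound)
```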