You definitely have to accept debugging (i.e. finding out why it doesn't work) as one of the primary challenges of development. If you only focus on "making it work", and consider finding/fixing/preventing bugs a frustrating waste of time, you will be frustrated most of the time.
If you look at developer books, forums, and education, you will notice that 90% of the discussions are not concerned with "making it work". Approaches that work perfectly fine are often derided as bad and ugly. When developers say "ugly", they really mean "hard to maintain", which basically means "easy to break and hard to debug". Experienced developers know that their primary focus is to write code that is hard to break and easy to debug, and that this is the main challenge of the programming profession.
When you accept debugging (and its flip side - preventing bugs in the first place) as the primary challenge, you will start incorporating methods to make debugging easier and less painful. For example, when I have found a bug that was hard to locate, I ask myself how I could make it easier to locate this class of bugs in the future. If a program terminates with an obscure error message, I always try to improve the error handling to be more helpful before I fix the actual bug.
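As a rough sketch of what that can look like (the config file name and functions here are made up for illustration), compare a lookup that fails with an obscure message to one that tells you exactly what to fix:

```python
# Before: the caller only sees "KeyError: 'port'" with no hint where it came from.
def load_port(config):
    return config["port"]

# After: the error says what is missing, where, and how to fix it.
def load_port_checked(config, path="config.json"):
    try:
        return int(config["port"])
    except KeyError:
        raise KeyError(
            f"'port' is missing from {path}; add e.g. \"port\": 8080"
        ) from None
    except (TypeError, ValueError):
        raise ValueError(
            f"'port' in {path} must be an integer, got {config['port']!r}"
        ) from None
```

The next time that class of bug shows up, the error message does most of the locating for you.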
Some development approaches, like unit testing, make it easier to locate and fix bugs because you test small parts of the program in isolation. If a unit test indicates a bug, you know the bug is in the unit under test (or in the test itself :-)) - you don't have to investigate the whole program to locate it.
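For instance, a minimal sketch using Python's built-in unittest module (the slugify function is just an invented example):

```python
import unittest

def slugify(title):
    """Turn a title into a URL slug, e.g. 'Hello World' -> 'hello-world'."""
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    def test_basic_title(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_surrounding_whitespace(self):
        # If this fails, the bug is inside slugify (or in this test),
        # not somewhere else in the program.
        self.assertEqual(slugify("  Hello World  "), "hello-world")

if __name__ == "__main__":
    unittest.main()
```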
"Test driven development" may be an antidote to your frustrations. In TDD, you write a test first, and watch it fail, before you implement the feature under test. If you discover a bug, you have to write a test to expose it, before you are allowed to fix the bug. This approach may be helpful for you, because it turns the failure of the test into a small victory.