This is a really complex question. While NASA certainly delivers high-quality code for life-critical systems and for robotic systems that must operate at great distances (think of the recent software fix on Voyager 2, thirteen light-hours out from Earth), that quality doesn't come cheap or quickly. Line for line, it's probably the most expensive software in the industry.
Your basic business reporting application doesn't need that kind of quality, nor would it be cost-effective to pursue. There are many ways to improve quality, and they vary in cost from simple and cheap (coding standards) to resource-intensive, difficult, and immensely time-consuming (written, formal mathematical proof of correctness for every method).
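To make the cheap end of that spectrum concrete, here is a minimal sketch of what an automated coding-standard check might look like: a short script, runnable from a commit hook, that flags overly long lines and tab characters. The specific rules and names are hypothetical examples, not a recommended standard.

```python
# Hypothetical coding-standard checker: flags lines over a length limit and
# any use of tab characters. Exit status is non-zero if violations are found,
# so it can gate a commit or a CI step.
import sys

MAX_LINE_LENGTH = 100


def check_file(path):
    """Return (line number, message) pairs for coding-standard violations."""
    violations = []
    with open(path, encoding="utf-8") as handle:
        for number, line in enumerate(handle, start=1):
            text = line.rstrip("\n")
            if len(text) > MAX_LINE_LENGTH:
                violations.append((number, f"line exceeds {MAX_LINE_LENGTH} characters"))
            if "\t" in text:
                violations.append((number, "tab character found (standard calls for spaces)"))
    return violations


if __name__ == "__main__":
    failed = False
    for path in sys.argv[1:]:
        for number, message in check_file(path):
            print(f"{path}:{number}: {message}")
            failed = True
    sys.exit(1 if failed else 0)
```

The point is the cost profile: an hour or two of effort, and it pays for itself on every commit. The other end of the spectrum, machine-checked proof of every method, costs orders of magnitude more per line.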
Project management practices such as risk assessment, project post-mortems, and continuous improvement can help an organization arrive at a suitable set of quality practices.
Without pointing a finger at a specific industry, I will say that the most destructive practice, in terms of quality, is time pressure. Nothing induces a programmer to write sloppy code as much as tight, artificial deadlines.
What are the requirements for better quality?
First, communication is crucial. Every developer on the team should know what every other developer is working on, at least in broad scope.
Second, an understanding that quality begins the day the project is accepted. Requirements need to be understood and validated: they should identify the problem to be solved rather than use a proposed solution as a substitute for defining it; they should be measurable and specific enough that both developer and customer can recognize when a solution meets them; and they should be communicated clearly to everyone who consumes them, including developers, testers, tech writers, support personnel, and managers. Quality is best measured against clearly stated requirements; if your requirements are poorly defined, then quality is at best haphazard.
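As a sketch of what "measurable and specific" can look like in practice, here is one hypothetical requirement expressed as an executable acceptance check. The requirement, the two-second threshold, and every name in the code are invented for illustration; the point is only that developer and customer alike can see, unambiguously, whether it's met.

```python
# Hypothetical requirement: "A daily report over 10,000 records must render
# in under 2 seconds." Vague wording like "the report should be fast" can't
# be checked; this can.
import time
import unittest


def generate_daily_report(records):
    """Stand-in for the real reporting code: format each record as a CSV line."""
    return "\n".join(f"{record['id']},{record['total']:.2f}" for record in records)


class DailyReportRequirement(unittest.TestCase):
    def test_renders_10k_records_within_two_seconds(self):
        records = [{"id": i, "total": i * 0.1} for i in range(10_000)]
        start = time.perf_counter()
        generate_daily_report(records)
        elapsed = time.perf_counter() - start
        # Pass or fail is visible to everyone, with no room for argument.
        self.assertLess(elapsed, 2.0)


if __name__ == "__main__":
    unittest.main()
```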
Reviews are crucial: not just code reviews, but reviews of requirements, designs, and, perhaps most importantly, test plans. The role of testing is to verify that requirements have been met, and you can't test against bad or non-existent requirements.
And that leads to understanding the role of testing. You can't test quality into a product. Testing can verify quality. Testing can find defects and verify that they've been fixed. But if quality practices haven't been followed up to the point that testing begins, testing can't fix that.
While I'm not a fan of the waterfall development model, I think that agile development can swing too far in the other direction and can easily be abused in ways that hurt quality. Scrum helps alleviate some of those problems: it promotes communication within the team, and it recognizes that estimates are nothing more than educated guesses to be refined as knowledge improves.