Absolutely. But mostly because we are taught such bad habits at university.
Nearly all of these answers come from senior programmers. As a junior programmer, I might be able to offer a different insight into this question.
At university, I was taught interfaces, patterns and software design, and I thought it was all rubbish. The interfaces were wasted, repeated code; the patterns were nonsensical; and the design felt like time that could have been spent programming.
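To show what I mean, here's a made-up Java example of the sort of assignment where an interface really does look like wasted, repeated code (the GradeBook names are invented for illustration):

```java
import java.util.ArrayList;
import java.util.List;

// A typical toy assignment: the interface has exactly one implementation,
// so to a student it reads as the same method signatures written twice.
interface GradeBook {
    void addGrade(double grade);   // declared here...
    double average();
}

class SimpleGradeBook implements GradeBook {
    private final List<Double> grades = new ArrayList<>();

    @Override
    public void addGrade(double grade) {  // ...and "repeated" here
        grades.add(grade);
    }

    @Override
    public double average() {
        if (grades.isEmpty()) {
            return 0.0;
        }
        double sum = 0.0;
        for (double g : grades) {
            sum += g;
        }
        return sum / grades.size();
    }
}

public class Assignment {
    public static void main(String[] args) {
        GradeBook book = new SimpleGradeBook();
        book.addGrade(72.5);
        book.addGrade(88.0);
        System.out.println("Average: " + book.average());
        // In a 200-line project that is thrown away after submission, the
        // GradeBook interface buys you nothing. In a large codebase with a
        // second implementation (a database-backed one, say), it is exactly
        // what lets you swap parts out safely.
    }
}
```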
This was in large part because it was taught by computer science academics who did very little programming themselves and often didn't understand the practicalities of coding.
They taught us what the material was, but not why to use it or how. Worse, I left unconvinced it was even a good idea.
This is, in my understanding, very common among university graduates.
So when we get into an actual programming job and get handed that sharp knife, we wonder why you didn't just give us that bread knife, and we'll probably end up using the blunt side to do all the work.
For those of us who do use the sharp side, it's a very steep learning curve. After 3-4 years of programming badly, we have to unlearn all those bad habits as well as learn this crazy world of designing software, not hacking code.
Most university assignments are small projects, usually focused on implementing a specific algorithm or getting a specific data structure to work. The code is thrown out after submission and never reused.
In employment, a large portion of my work is changing very large existing codebases. Code is only thrown out if it doesn't do the job; a poorly coded but robust solution is still a robust solution.
The moment you first have to troubleshoot legacy code is when you see the value in the sharp side of the knife. Seeing a class with 6k+ LOC for the first time is very much an "OH $#!^" moment.
Maintainable code is not taught at university...