I have been using garbage collected languages professionally for over 15 years (and programming for 30 years). My industrial projects have ranged from software collating data from 8,000 transducers across an oil field to (soft) real-time visualization and low-latency algo trading software.
I found garbage collection to be useful in all cases. I had reservations about the latency of garbage collection in two major projects: the visualization software (in OCaml) and the algo trading software (in F#). However, my concerns proved to be unjustified: in both cases the garbage collected solutions actually exhibited better latency characteristics than the non-garbage collected solutions. In particular, translating the visualization software from C++ to OCaml improved the worst-case stalls by a factor of 5. The stalls in the C++ code were caused by collections falling out of scope: when a reference count hit zero, avalanches of destructors calling destructors would follow. We spent considerable effort trying to solve this problem by writing custom allocators that would make destruction incremental, but never succeeded. Additionally, we found that purely functional data structures often exhibit excellent latency characteristics, and they are basically intractable without a garbage collector.
The only notable exceptions to my "garbage collectors are good" moral are poor man's garbage collectors like reference counting and conservative garbage collectors like Boehm's GC. I would not touch them with a barge pole in a professional context.