Cycles aren't inherently bad, but they're often avoided because they make it tricky to ensure you haven't got memory leaks. Leaks are a particular risk when objects are 'reference counted'. In a language or system that uses reference counting, an object keeps track of the number of references pointing at it. Every time a reference is added the count goes up, and every time one is removed the count goes down; when the count reaches zero, nothing refers to the object, so it can be deleted.
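To make that concrete, here's a minimal sketch in Python (CPython reference-counts its objects; the `Node` class is just a made-up illustration). `sys.getrefcount` reports the count, though it always reads one higher than you might expect because the function's own argument is a temporary reference:

```python
import sys

class Node:
    pass

a = Node()
print(sys.getrefcount(a))  # 2: the variable 'a' plus the call's temporary reference
b = a                      # a second reference to the same object
print(sys.getrefcount(a))  # 3: 'a', 'b', and the temporary
del b                      # removing a reference drops the count again
print(sys.getrefcount(a))  # 2
```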
This usually takes care of itself without any careful thinking. If you've got a group of objects with no cycles and you drop your reference to the root object, the root will be deleted. Deleting it drops the references it holds to the objects it owns, so their reference counts fall to zero and they are deleted in turn; the cascade continues until the whole group is gone.
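Here's a rough sketch of that cascade, again with a hypothetical `Node` class. Each object prints a message from `__del__` when its count hits zero, so dropping the root lets you watch the chain collapse (this relies on CPython's immediate, refcount-driven deallocation):

```python
class Node:
    def __init__(self, name, child=None):
        self.name = name
        self.child = child          # the single reference this node owns

    def __del__(self):
        print(f"deleting {self.name}")

root = Node("root", Node("middle", Node("leaf")))
del root  # prints "deleting root", then "middle", then "leaf": the cascade
```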
But... if you have a cycle, this cascade doesn't work. You may have a group of objects you no longer want, so you drop the only reference you hold to them, but because of the cycle the objects still reference each other. Their reference counts never reach zero, so they never get deleted. That's a memory leak.
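A sketch of the leak, assuming CPython: its cycle collector would normally rescue us, so the example switches automatic collection off with `gc.disable()` to leave pure reference counting, and uses a weak reference (which doesn't affect the count) to probe whether the object is still alive:

```python
import gc
import weakref

class Node:
    def __init__(self):
        self.partner = None

gc.disable()            # leave only reference counting in play

a = Node()
b = Node()
a.partner = b
b.partner = a           # cycle: a -> b -> a

probe = weakref.ref(a)  # observes 'a' without keeping it alive
del a
del b                   # drop our only external references to the pair
print(probe() is None)  # False: the cycle keeps both counts at 1, so they leak
```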
Clearly, you can do some careful management and break the cycles before you drop your last reference to a group of objects you no longer need. But, as I just said, this takes careful management, and it's very easy to get wrong. It's one of the main reasons memory leaks occur.
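Repeating the sketch above, but with the careful management added: if you remember to break one link in the cycle by hand before dropping your references, the cascade works again. The catch, of course, is that nothing forces you to remember:

```python
import gc
import weakref

class Node:
    def __init__(self):
        self.partner = None

gc.disable()            # again, reference counting only

a = Node()
b = Node()
a.partner = b
b.partner = a

probe = weakref.ref(a)
a.partner = None        # careful management: break the cycle by hand first
del a
del b
print(probe() is None)  # True: with the cycle broken, the cascade freed both
```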
To avoid the risk of leaks and the tricky job of breaking cycles correctly when a group of objects is no longer needed, programmers usually try to avoid cycles altogether. This matters even more on big projects with many programmers, where no one person understands the whole system; if cycles were allowed, everyone would have to watch out for them and spend a long time studying each other's code to avoid introducing leaks.
Some languages with tracing garbage collectors (e.g. C#) can delete a group of objects that is no longer reachable even if the group contains cycles.
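CPython does something similar by pairing reference counting with a cycle-detecting collector, so we can demonstrate the idea in the same Python sketch (the `Node` class and weakref probe are as above). Because the collector traces reachability rather than counting references, an explicit `gc.collect()` reclaims the cycle that reference counting alone could not:

```python
import gc
import weakref

class Node:
    def __init__(self):
        self.partner = None

a = Node()
b = Node()
a.partner = b
b.partner = a           # the same cycle as before

probe = weakref.ref(a)
del a
del b                   # the cycle is now unreachable from the program
gc.collect()            # the tracing collector finds and frees the cycle
print(probe() is None)  # True: the garbage collector reclaimed the group
```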