Two of us made the exact same mistake back in the bad old days when we were allowed to make changes to production on the fly. We both missed highlighting the WHERE clause in an UPDATE statement and set every user's login to the same value. Oops.
The difference, it turns out, was the aftermath. I made the mistake first, noticed it when it happened, and immediately sent an email to all the dev managers explaining what had happened and that it would be fixed in five minutes or less. I grabbed the old data from the audit tables (I love audit tables) and fixed the table immediately. Because I acted quickly and told everyone to reassure anyone who complained that the fix was already underway, I had no problems from the error at all.
My co-worker wasn't so lucky. Several months later, she did the same thing to the same table. The difference was that she didn't notice all the records had been updated instead of one. She went on her merry way, and the first anyone knew about her mistake was when complaints started pouring in from every client. The problem was quickly escalated to the IT VP, who tracked down from the audit tables who had made the change and had her fix it. She wasn't as familiar with the audit tables as I was, so it took her a while to get it fixed. End result: all our sites were down for several hours (since only one person could log in), clients were angry, and the dev got a formal warning from HR, was put on probation (she would have been fired the next time she made an error), and got no pay raise due to a bad evaluation.
What I learned is that it isn't so much the mistake as how you handle the aftermath. Owning up and initiating the fix right away works a lot better than waiting until someone else notices, at least in most organizations (and at the ones where it doesn't, you really, really don't want to work).