Given the following list that contains some duplicate and some unique dictionaries, what is the best method to remove the unique dictionaries first, then reduce the duplicate dictionaries to single instances? I gotta say I only recently started getting into Python, but it's making this project so much easier. I'm just a bit stumped on this kind of problem.
So my list looks like this:
[{  'file': u'/file.txt',
    'line': u'line 666',
    'rule': u'A DUPLICATE RULE'},
 {  'file': u'/file.txt',
    'line': u'line 666',
    'rule': u'A DUPLICATE RULE'},
 {  'file': u'/uniquefile.txt',
    'line': u'line 999',
    'rule': u'A UNIQUE RULE'}]
What I'm going for is in the end, the list should look like:
[{  'file': u'/file.txt',
    'line': u'line 666',
    'rule': u'A DUPLICATE RULE'}]
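
One possible approach (a sketch, not necessarily the best method): since dictionaries aren't hashable, convert each one to a sorted tuple of its items, count those with `collections.Counter`, and keep one dict for each key that was seen more than once:

```python
from collections import Counter

data = [
    {'file': u'/file.txt', 'line': u'line 666', 'rule': u'A DUPLICATE RULE'},
    {'file': u'/file.txt', 'line': u'line 666', 'rule': u'A DUPLICATE RULE'},
    {'file': u'/uniquefile.txt', 'line': u'line 999', 'rule': u'A UNIQUE RULE'},
]

# Dicts can't go in a Counter directly, so use a hashable,
# order-independent representation: a sorted tuple of (key, value) pairs.
counts = Counter(tuple(sorted(d.items())) for d in data)

# Rebuild one dict per key that occurred more than once,
# dropping the unique entries entirely.
result = [dict(key) for key, n in counts.items() if n > 1]
```

With the sample data above, `result` holds a single copy of the `'A DUPLICATE RULE'` dict. This assumes the dict values are themselves hashable (strings here), which is true for the data shown.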