I've got a transactional method that saves or updates a set of objects in a database. This is the code:
@Transactional
public void updatePDBEntry(Set<PDBEntry> pdbEntrySet) {
    for (PDBEntry pdbEntry : pdbEntrySet) {
        PDBEntry existingEntry = findByAccessionCode(pdbEntry.getAccessionCode());
        if (existingEntry != null) {
            log.debug("Remove previous version of PDBEntry {}", existingEntry);
            makeTransient(existingEntry);
        }
        makePersistent(pdbEntry);
    }
}
And in the genericDAO:
public void makePersistent(I entity) {
    getCurrentSession().saveOrUpdate(entity);
}

public void makeTransient(I entity) {
    getCurrentSession().delete(entity);
}
This doesn't work: the insert fails with a duplicate-key error, even though the logs show that makeTransient() is reached. I suspect this is because everything happens within one transaction, so the delete issued by makeTransient() has not actually been executed by the time makePersistent() runs. I could work around it by copying all the data from pdbEntry into existingEntry and then calling saveOrUpdate(existingEntry), but that feels like a dirty hack. Is there another way to make the makeTransient() visible to makePersistent(), while still keeping it all within a transaction?
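One option I've come across: as far as I understand, Hibernate executes queued inserts before queued deletes at flush time, so within one session the DELETE really does run after the INSERT unless it is forced out earlier. A minimal sketch of flushing after each delete (assuming the service can reach the same current Session the DAO uses):

@Transactional
public void updatePDBEntry(Set<PDBEntry> pdbEntrySet) {
    for (PDBEntry pdbEntry : pdbEntrySet) {
        PDBEntry existingEntry = findByAccessionCode(pdbEntry.getAccessionCode());
        if (existingEntry != null) {
            makeTransient(existingEntry);
            // Execute the pending DELETE immediately; by default Hibernate
            // flushes inserts before deletes, which is what produces the
            // duplicate-key error on the natural id.
            getCurrentSession().flush();
        }
        makePersistent(pdbEntry);
    }
}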
EDIT: This is my PDBEntry domain model:
@Entity
@Data
@NoArgsConstructor(access = AccessLevel.PROTECTED)
@EqualsAndHashCode(callSuper = false, of = { "accessionCode", "date" })
@SuppressWarnings("PMD.UnusedPrivateField")
public class PDBEntry extends DomainObject implements Serializable {
    @NaturalId
    @NotEmpty
    @Length(max = 4)
    private String accessionCode;

    @NaturalId
    @NotNull
    @Temporal(TemporalType.DATE)
    private Date date;

    private String header;

    private Boolean isValidDssp;

    @Temporal(TemporalType.TIMESTAMP)
    private Date lastUpdated = new Date(System.currentTimeMillis());

    @OneToOne(mappedBy = "pdbEntry", cascade = CascadeType.ALL)
    private ExpMethod expMethod;

    @OneToMany(mappedBy = "pdbEntry", cascade = CascadeType.ALL, fetch = FetchType.LAZY)
    private Set<Refinement> refinementSet = new HashSet<Refinement>();

    @OneToMany(mappedBy = "pdbEntry", cascade = CascadeType.ALL, fetch = FetchType.LAZY)
    private Set<HetGroup> hetGroupSet = new HashSet<HetGroup>();

    @OneToMany(mappedBy = "pdbEntry", cascade = CascadeType.ALL, fetch = FetchType.LAZY)
    private Set<Chain> chainSet = new HashSet<Chain>();

    @OneToMany(mappedBy = "pdbEntry", cascade = CascadeType.ALL, fetch = FetchType.LAZY)
    private Set<Chain> residueSet = new HashSet<Chain>();

    public Date getLastUpdated() {
        return new Date(lastUpdated.getTime());
    }

    public void setLastUpdated() throws InvocationTargetException {
        throw new InvocationTargetException(new Throwable());
    }

    public void touch() {
        lastUpdated = new Date(System.currentTimeMillis());
    }

    @Override
    public String toString() {
        return accessionCode;
    }

    public PDBEntry(String accessionCode, Date date) throws NullPointerException {
        if (accessionCode != null && date != null) {
            this.accessionCode = accessionCode;
            this.date = date;
        } else {
            throw new NullPointerException();
        }
    }
}
@MappedSuperclass
public abstract class DomainObject implements Serializable {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    public Long getId() {
        return id;
    }

    @Override
    public abstract boolean equals(Object obj);

    @Override
    public abstract int hashCode();

    @Override
    public abstract String toString();
}
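(For reference, findByAccessionCode() is not shown above; a minimal sketch of such a lookup in the DAO, assuming the classic Criteria API — illustrative rather than the exact implementation:)

public PDBEntry findByAccessionCode(String accessionCode) {
    return (PDBEntry) getCurrentSession()
            .createCriteria(PDBEntry.class)
            .add(Restrictions.eq("accessionCode", accessionCode))
            .uniqueResult();
}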
EDIT: New problem. I've now split the update into a method that first deletes all the existing objects from the database and then saves the new ones:
@Override
@Transactional
public void updatePDBEntries(Set<PDBEntry> pdbEntrySet) {
    findAndRemoveExistingPDBEntries(pdbEntrySet);
    savePDBEntries(pdbEntrySet);
}

@Override
@Transactional
public void findAndRemoveExistingPDBEntries(Set<PDBEntry> pdbEntrySet) {
    for (PDBEntry pdbEntry : pdbEntrySet) {
        PDBEntry existingPDBEntry = findByAccessionCode(pdbEntry.getAccessionCode());
        if (existingPDBEntry != null) {
            log.info("Delete: {}", existingPDBEntry);
            makeTransient(existingPDBEntry);
        }
    }
}

@Override
@Transactional
public void savePDBEntries(Set<PDBEntry> pdbEntrySet) {
    for (PDBEntry pdbEntry : pdbEntrySet) {
        log.info("Save: {}", pdbEntry);
        makePersistent(pdbEntry);
    }
}
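(The same flush idea would apply here as well, between the two phases; a sketch, again assuming access to the current Session:)

@Override
@Transactional
public void updatePDBEntries(Set<PDBEntry> pdbEntrySet) {
    findAndRemoveExistingPDBEntries(pdbEntrySet);
    // Push the queued DELETEs to the database before any INSERTs are queued.
    getCurrentSession().flush();
    savePDBEntries(pdbEntrySet);
}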
It deletes the first 73 entries it encounters, but then fails with this error:
WARN  2010-10-25 14:28:49,406 main JDBCExceptionReporter:100 - SQL Error: 0, SQLState: 23503
ERROR 2010-10-25 14:28:49,406 main JDBCExceptionReporter:101 - Batch entry 0 /* delete nl.ru.cmbi.pdbeter.core.model.domain.PDBEntry */ delete from PDBEntry where id='74' was aborted. Call getNextException to see the cause.
WARN  2010-10-25 14:28:49,406 main JDBCExceptionReporter:100 - SQL Error: 0, SQLState: 23503
ERROR 2010-10-25 14:28:49,406 main JDBCExceptionReporter:101 - ERROR: update or delete on table "pdbentry" violates foreign key constraint "fke03a2dc84d44e296" on table "hetgroup"
  Detail: Key (id)=(74) is still referenced from table "hetgroup".
ERROR 2010-10-25 14:28:49,408 main AbstractFlushingEventListener:324 - Could not synchronize database state with session
org.hibernate.exception.ConstraintViolationException: Could not execute JDBC batch update
Any ideas how this error might arise?
EDIT: I found the problem: the last error was caused by the HetGroup model, which looks like this:
@Entity
@Data
@NoArgsConstructor(access = AccessLevel.PROTECTED)
@EqualsAndHashCode(callSuper = false, of = { "pdbEntry", "hetId" })
@SuppressWarnings("PMD.UnusedPrivateField")
// Extends DomainObject, which contains the Id; a NaturalId is not enough in this
// case, since duplicate chains still exist. In fact this is an error in the
// PDBFinder and will be fixed in the future.
public class HetGroup extends DomainObject implements Serializable {
    // @NaturalId
    @NotNull
    @ManyToOne
    private PDBEntry pdbEntry;

    // @NaturalId
    @NotEmpty
    private String hetId;

    private Integer nAtom;

    @Length(max = 8192)
    private String name;

    public HetGroup(PDBEntry pdbEntry, String hetId) {
        this.pdbEntry = pdbEntry;
        pdbEntry.getHetGroupSet().add(this);
        this.hetId = hetId;
    }
}
Pay particular attention to the commented-out @NaturalId annotations. I commented them out because an error in my data was still producing duplicate HetGroups, so I wanted to drop the UNIQUE constraint on those columns for the time being. What I forgot is that Lombok was still generating equals() and hashCode() from exactly those fields, as this line shows:

@EqualsAndHashCode(callSuper = false, of = { "pdbEntry", "hetId" })

So duplicate HetGroups compared equal and collapsed into a single element of the HashSet, which meant the cascading delete never reached the rows that were dropped from the set, and those rows kept referencing the old PDBEntry. I fixed the data and reinstated the @NaturalId annotations, and now everything works fine.
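To illustrate the collapse with hypothetical values ("1abc" and "HEM" are just example ids):

PDBEntry entry = new PDBEntry("1abc", new Date());
new HetGroup(entry, "HEM"); // the constructor adds each HetGroup to entry's set
new HetGroup(entry, "HEM"); // equal to the first by (pdbEntry, hetId)...
// ...so entry.getHetGroupSet().size() == 1: the duplicate is silently dropped
// from the HashSet, which is why cascaded deletes on the collection missed the
// corresponding database rows.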
Thanks all!