Hello there,

I'm fairly new to Hibernate and PostgreSQL, and so far it's been going well, but now I'm running into a problem I can't solve. I get an error while filling the database, on the very first operation (a single transaction that inserts or updates 1000 rows). The error is:

SQL Error: 0, SQLState: 53300
FATAL: sorry, too many clients already
Exception in thread "main" org.hibernate.exception.GenericJDBCException: Cannot open connection

This is the important code:

@Repository
public class PDBFinderDAO extends GenericDAO<PDBEntry> implements IPDBFinderDAO {
    @Override
    @Transactional
    public void updatePDBEntry(Set<PDBEntry> pdbEntrySet) {
        for (PDBEntry pdbEntry : pdbEntrySet) {
            getCurrentSession().saveOrUpdate(pdbEntry);
        }
    }
}

getCurrentSession() is extended from GenericDAO and calls sessionFactory.getCurrentSession().
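For reference, here is a minimal sketch of what such a GenericDAO base class typically looks like; the actual class isn't shown in this post, so the details below are assumptions:

import org.hibernate.Criteria;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.criterion.Criterion;
import org.springframework.beans.factory.annotation.Autowired;

// Hypothetical sketch of the GenericDAO base class; the real one may differ.
public abstract class GenericDAO<T> {

    @Autowired
    private SessionFactory sessionFactory;

    // Returns the session bound to the current Spring-managed transaction.
    protected Session getCurrentSession() {
        return sessionFactory.getCurrentSession();
    }

    // Builds a Criteria query for the entity type with the given restrictions.
    protected Criteria createCriteria(Criterion... restrictions) {
        Criteria criteria = getCurrentSession().createCriteria(getEntityClass());
        for (Criterion restriction : restrictions) {
            criteria.add(restriction);
        }
        return criteria;
    }

    // Subclasses supply their entity type (often derived via reflection instead).
    protected abstract Class<T> getEntityClass();
}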

This is my Hibernate configuration:

<!DOCTYPE hibernate-configuration PUBLIC "-//Hibernate/Hibernate Configuration DTD 3.0//EN" "http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
    <session-factory>
        <!-- Database connection settings -->
        <property name="hibernate.connection.driver_class">org.postgresql.Driver</property>
        <property name="hibernate.dialect">org.hibernate.dialect.PostgreSQLDialect</property>
        <property name="hibernate.connection.url">jdbc:postgresql://localhost/PDBeter</property>
        <property name="hibernate.connection.username">xxxx</property>
        <property name="hibernate.connection.password">xxxx</property>

        <!-- Re-create the database schema on startup (drops any existing schema) -->
        <property name="hbm2ddl.auto">create</property>

        <!-- Use the C3P0 connection pool provider -->
        <property name="hibernate.c3p0.min_size">5</property>
        <property name="hibernate.c3p0.max_size">20</property>
        <property name="hibernate.c3p0.timeout">300</property>
        <property name="hibernate.c3p0.max_statements">50</property>
        <property name="hibernate.c3p0.idle_test_period">300</property>

        <!-- Disable the second-level cache  -->
        <property name="cache.provider_class">org.hibernate.cache.NoCacheProvider</property>

        <!-- Batch size -->
        <property name="hibernate.jdbc.batch_size">50</property>

        <!-- this makes sure the more efficient new id generators are being used,
        though these are not backwards compatible with some older databases -->
        <property name="hibernate.id.new_generator_mappings">true</property>

        <!-- Echo all executed SQL to stdout -->
        <!--
        <property name="hibernate.show_sql">true</property>
        -->
        <property name="format_sql">true</property>
        <property name="use_sql_comments">true</property>
    </session-factory>
</hibernate-configuration>

This is my Spring configuration:

<beans xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns="http://www.springframework.org/schema/beans" 
    xmlns:aop="http://www.springframework.org/schema/aop"
    xmlns:context="http://www.springframework.org/schema/context" 
    xmlns:tx="http://www.springframework.org/schema/tx"
    xmlns:task="http://www.springframework.org/schema/task"
    xsi:schemaLocation="
        http://www.springframework.org/schema/aop http://www.springframework.org/schema/aop/spring-aop-3.0.xsd
        http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
        http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd
        http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx-3.0.xsd
        http://www.springframework.org/schema/task http://www.springframework.org/schema/task/spring-task-3.0.xsd">


    <context:component-scan base-package="nl.ru.cmbi.pdbeter" />

    <!-- Transaction Manager -->
    <bean id="transactionManager"
        class="org.springframework.orm.hibernate3.HibernateTransactionManager">
        <property name="sessionFactory" ref="sessionFactory" />
    </bean>

    <tx:annotation-driven />

    <!-- Session Factory -->
    <bean id="sessionFactory"
        class="org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean">
        <property name="configLocation" value="hibernate.cfg.xml" />
        <property name="packagesToScan" value="nl.ru.cmbi.pdbeter.core.model.domain" />
    </bean>

    <!-- Task Executor -->
    <task:annotation-driven />
</beans>

I'm not really sure what is going wrong. This should open only one connection each time and close it afterwards; isn't that what @Transactional is supposed to do? Also, is there an easy way to check how many connections are open at a given moment, so that I can compare the counts before and after the error?
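As a note on that last question: SQLState 53300 is PostgreSQL's too_many_connections condition, raised when the server's max_connections limit (100 by default) is exhausted. PostgreSQL tracks live connections in the pg_stat_activity system view, so a plain-JDBC sketch for watching the count (outside Hibernate; credentials are placeholders) could look like this:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ConnectionCount {
    public static void main(String[] args) throws Exception {
        Class.forName("org.postgresql.Driver"); // needed on pre-JDBC-4 drivers
        // Connects to the same database as the Hibernate config above.
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/PDBeter", "xxxx", "xxxx");
        try {
            Statement stmt = conn.createStatement();
            // pg_stat_activity holds one row per open server connection.
            ResultSet rs = stmt.executeQuery(
                    "SELECT count(*) FROM pg_stat_activity WHERE datname = 'PDBeter'");
            if (rs.next()) {
                System.out.println("Open connections: " + rs.getInt(1));
            }
        } finally {
            conn.close();
        }
    }
}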

EDIT: When I check the database, nothing has been added, so it can't even make a single connection. What is going wrong here?

EDIT: I'm sorry, but I already solved it myself. It was a very stupid mistake: there was a very small criteria query that was also executed 1000 times, but before the transaction, causing it to run in 1000 separate transactions/sessions/connections (I think; correct me if I'm wrong!).

EDIT: OK, it turns out that didn't solve it at all, because I need that small query to see whether something is already in the database, and if so, to fetch that object from the database so I can update its fields/columns/whatever you want to call it.

This is the method in the GenericDAO:

@Override
public PDBEntry findByAccessionCode(String accessionCode) {
    return (PDBEntry) createCriteria(Restrictions.eq("accessionCode", accessionCode)).uniqueResult();
}

There is a function that builds the mapped objects which isn't in the DAO, since it converts a raw data file into the database objects; I wanted to keep that out of the database operations and put only the saveOrUpdate() within the database module. The problem I have now is that findByAccessionCode() is called 1000 times during the conversion of the raw data file to the database objects, because I need to check whether a certain piece of data is already present in the database, and if so, fetch the object from the database instead of creating a new one.

Now how would I execute that query 1000 times inside one connection in this context? I tried making the conversion method that converts the 1000 files @Transactional, but that didn't work (see the note after the method below).

Here's the conversion method:

private void updatePDBSet(Set<RawPDBEntry> rawPDBEntrySet) {
    Set<PDBEntry> pdbEntrySet = new LinkedHashSet<PDBEntry>();

    for (RawPDBEntry pdb : rawPDBEntrySet) {
        PDBEntry pdbEntry = pdbEntryDAO.findByAccessionCode(pdb.id);
        if (pdbEntry == null) {
            pdbEntry = new PDBEntry(pdb.id, pdb.header.date);
        }

        pdbEntry.setHeader(pdb.header.header);

        ExpMethod expMethod = new ExpMethod.Builder(pdbEntry, pdb.expMethod.expMethod.toString()).build();
        if (pdb.expMethod.resolution != null) {
            expMethod.setResolution(pdb.expMethod.resolution);
        }
        if (pdb.expMethod.rFactor != null) {
            expMethod.setRFactor(pdb.expMethod.rFactor.rFactor);

            if (pdb.expMethod.rFactor.freeR != null) {
                expMethod.setFreeR(pdb.expMethod.rFactor.freeR);
            }
        }

        if (pdb.hetGroups != null) {
            for (PFHetId hetId : pdb.hetGroups.hetIdList) {
                HetGroup hetGroup = new HetGroup(pdbEntry, hetId.hetId);

                if (hetId.nAtom != null) {
                    hetGroup.setNAtom(hetId.nAtom);
                }

                if (hetId.name != null) {
                    hetGroup.setName(hetId.name);
                }
            }
        }

        for (PFChain chain : pdb.chainList) {
            new Chain(pdbEntry, chain.chain);
        }

        pdbEntrySet.add(pdbEntry);
    }

    pdbFinderDAO.updatePDBEntry(pdbEntrySet);
}

(The pdbFinderDAO.updatePDBEntry(pdbEntrySet) call is where I originally thought the problem originated.)
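A note on why annotating updatePDBSet() may have had no effect: with Spring's default proxy-based AOP, @Transactional is only honored on public methods of a Spring-managed bean invoked through the proxy; it is silently ignored on private methods and on calls a bean makes to itself. A sketch of a restructure that puts the whole conversion, lookups included, inside one transaction (the class and bean names here are assumptions):

import java.util.Set;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// Hypothetical service bean; the class that currently owns updatePDBSet()
// is not shown in the post.
@Service
public class PDBUpdateService {

    @Autowired
    private IPDBFinderDAO pdbFinderDAO;

    // Public and called through the Spring proxy from another bean, so a
    // single session/connection spans the 1000 lookups and the final save.
    @Transactional
    public void updatePDBSet(Set<RawPDBEntry> rawPDBEntrySet) {
        // ... the conversion loop from above, including the
        // findByAccessionCode() lookups ...
        // pdbFinderDAO.updatePDBEntry(pdbEntrySet);
    }
}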

EDIT: First of all, sorry that I created this as a new post; I really thought I had found the answer. I'll just continue in this post with further edits.

OK, I've now moved all 1000 findByAccessionCode criteria lookups inside the DAO: I send a Set of the raw data files to the DAO, which reads the IDs there, looks them up in the database, and puts each database object it finds into a HashMap, keyed by the reference to its raw data file (so I know which raw data belongs to which database entry). I made this function @Transactional, like so:

@Override
@Transactional
public Map<RawPDBEntry, PDBEntry> getRawPDBEntryToPDBEntryMap(Set<RawPDBEntry> rawPDBEntrySet) {
    Map<RawPDBEntry, PDBEntry> rawPDBEntryToPDBEntryMap = new HashMap<RawPDBEntry, PDBEntry>();

    for (RawPDBEntry pdb : rawPDBEntrySet) {
        rawPDBEntryToPDBEntryMap.put(pdb, (PDBEntry) createCriteria(
                Restrictions.eq("accessionCode", pdb.id)).uniqueResult());
    }

    return rawPDBEntryToPDBEntryMap;
}

Still no success... I get the exact same error, but now it tells me it's the criteria query that causes it. Why can't I execute all these 1000 queries within the same connection?
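As an aside, even once everything runs in one transaction, 1000 individual lookups mean 1000 round trips; Hibernate's Criteria API can fetch all existing entries in a single query with an IN restriction. A sketch, assuming the same DAO context as above (imports for List, Map, HashMap and Restrictions assumed) and a getAccessionCode() getter on PDBEntry:

@Transactional
@SuppressWarnings("unchecked")
public Map<String, PDBEntry> findByAccessionCodes(Set<String> accessionCodes) {
    // One query instead of one per accession code; very large sets may
    // need to be chunked to keep the IN list manageable.
    List<PDBEntry> entries = createCriteria(
            Restrictions.in("accessionCode", accessionCodes)).list();

    Map<String, PDBEntry> byCode = new HashMap<String, PDBEntry>();
    for (PDBEntry entry : entries) {
        byCode.put(entry.getAccessionCode(), entry);
    }
    return byCode;
}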

EDIT: Yet another update: I tried adding the queries one by one, and that worked; slowly, but it worked. I did this on an empty database. Next I tried the same thing, but with the database already containing the entries from the first run, and I got the following error:

Exception in thread "main" org.hibernate.HibernateException: Illegal attempt to associate a collection with two open sessions

I'm guessing this has something to do with the fact that I'm fetching the objects (the ones already in the database, which therefore have to be updated) within the DAO, then sending references back to the conversion method, changing their fields there, and then sending them back to the DAO to be made persistent. However, after googling a bit I found people who had problems with collections in their POJOs annotated with:

@OneToMany(mappedBy = "pdbEntry", cascade = CascadeType.ALL, fetch = FetchType.LAZY)

where the cascading caused the problem. Do I have to remove all of these cascades and hand-code the saveOrUpdate() operations for all the different mapped objects? Or does this have nothing to do with that error?
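(For what it's worth, that exception usually indicates that an entity whose collection was loaded in one still-open session is handed to saveOrUpdate() in a second session, and the cascade then tries to associate the collection with both. So it tends to be a symptom of the lookups and the saves running in different sessions, as above, rather than of the cascade settings themselves; removing the cascades shouldn't be necessary once everything shares one session.)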

And finally: I'm still no closer to figuring out how to do this for 1000 objects at a time.

A: 

Solved the problem: it was caused by a bad Spring setup, which meant the @Transactional annotation wasn't being recognized. I fixed that, and the error went away.

FinalArt2005