I am running a system written in C++ that continuously inserts large amounts of data into a PostgreSQL database while simultaneously querying it for updated results. My problem is that the postgres.exe processes serving these connections use more and more memory over time. I need to know how to correct this. The following is a much simpler program that demonstrates the issue.

#include <iostream>
#include <sstream>

#include <windows.h>        // Sleep()
#include <tbb/tbb_thread.h> // Intel TBB portable thread class

#include "libpq-fe.h"
#include "libpq/libpq-fs.h"

class Inserter {
public:
    void operator()() {
        PGconn* conn = PQconnectdb("user=postgres password=1234");
        if (PQstatus(conn) != CONNECTION_OK) {
            std::cout << "Connection failed: " << PQerrorMessage(conn);
            PQfinish(conn);
            return;
        }
        int i = 0;
        while (1) {
            std::stringstream insert;
            insert << "INSERT INTO tmp (value) VALUES (" << i % 250 << ");";
            PGresult* res = PQexec(conn, insert.str().c_str());
            if (PQresultStatus(res) == PGRES_FATAL_ERROR) {
                std::cout << "Error in inserting data:\nError code: "
                          << PQresStatus(PQresultStatus(res))
                          << "\nError Message: " << PQerrorMessage(conn);
                PQclear(res);
                PQfinish(conn);
                return;
            }
            PQclear(res);
            i++;
        }
    }
};
class Queryer {
public:
    void operator()() {
        PGconn* conn = PQconnectdb("user=postgres password=1234");
        if (PQstatus(conn) != CONNECTION_OK) {
            std::cout << "Connection failed: " << PQerrorMessage(conn);
            PQfinish(conn);
            return;
        }
        int j = 0;
        while (1) {
            std::stringstream query;
            query << "SELECT * FROM tmp WHERE id>" << j % 1000 << ";";
            PGresult* res = PQexec(conn, query.str().c_str());
            if (PQresultStatus(res) == PGRES_FATAL_ERROR) {
                std::cout << "Error in searching data:\nError code: "
                          << PQresStatus(PQresultStatus(res))
                          << "\nError Message: " << PQerrorMessage(conn);
                PQclear(res);
                PQfinish(conn);
                return;
            }
            PQclear(res);
            Sleep(10);
            j++;
        }
    }
};

int main() {
    //connect to Database
    PGconn* conn = PQconnectdb("user=postgres password=1234");

    //create table (a fatal error here usually just means the table already exists)
    std::cout << "Creating table...\n";
    PGresult* res = PQexec(conn, "CREATE TABLE tmp (id SERIAL8 PRIMARY KEY, value INT);");
    if (PQresultStatus(res) == PGRES_FATAL_ERROR) {
        std::cout << "Error in Creating table:\nError code: "
                  << PQresStatus(PQresultStatus(res))
                  << "\nError Message: " << PQerrorMessage(conn);
    }
    PQclear(res);
    PQfinish(conn);

    std::cout << "Starting table filling thread...\n";
    //fill table with some data
    Inserter ins;
    tbb::tbb_thread filling(ins);
    Sleep(1000);
    //searching table ... here is where the memory leak is
    std::cout << "Starting table searching thread...\n";
    Queryer que;
    tbb::tbb_thread searching(que);

    while (true) {
        tbb::tick_count::interval_t t(1.0);
        tbb::this_tbb_thread::sleep(t);
    }
}
A: 

Perhaps you need to close your connections in some way?

erikkallen
He is closing them with PQfinish(conn). The problem is that this is not exception safe.
fnieto
The operator()() functions only close it in case of an error.
erikkallen
If we closed the connection every time we do a select or insert, the memory growth in postgres.exe would go away. However, in a real-time application the latency incurred by re-opening a connection for every query is too great, so we have to keep the connection open.
ShaChris23
In that case you probably shouldn't open the connections in every call. Rule of thumb: every time you do Something *ptr = SomeFunction(), you need to do something to free the memory before ptr goes out of scope.
erikkallen
A: 

ShaChris, first of all, hi. Have you been able to solve your problem? I'm having a similar one: postgres.exe processes keep eating more and more memory as long as the connections are not closed and reopened. Reconnecting would indeed cure the 'memory leak', but it would replace it with an unacceptable problem: reconnecting to the database for every request, with all the processing, time, CPU, and network overhead that implies, besides being plain ugly.

drakorg