Hi, I'm hitting a segmentation fault after inserting about 8 million records into a TC (Tokyo Cabinet) hash database. After everything is inserted I close the DB, but I get a segmentation fault at this part of the code (tchdb.c):

static void tchdbsetflag(TCHDB *hdb, int flag, bool sign){
  assert(hdb);
  char *fp = (char *)hdb->map + HDBFLAGSOFF;
  if(sign){
    *fp |= (uint8_t)flag;  //SEGFAULT HERE!
  } else {
    *fp &= ~(uint8_t)flag;
  }
  hdb->flags = *fp;
}

More specifically, at the commented line.

The DB was opened like this:

tchdbopen(hdb, db_file, HDBOWRITER | HDBOCREAT)

The DB is tuned with:

tchdbtune(hdb, 25000000, -1, -1, HDBTLARGE);
tchdbsetcache(hdb, 100000);
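
For completeness, here is roughly how the whole setup looks in my code (the file name, the key/value strings, and the insert loop are simplified placeholders; the real records are parsed from text files). The tune and cache calls go before tchdbopen, as the Tokyo Cabinet docs require, errors are reported through tchdbecode/tchdberrmsg, and it links with -ltokyocabinet:

#include <tcutil.h>
#include <tchdb.h>
#include <stdio.h>
#include <stdbool.h>

int main(void){
  TCHDB *hdb = tchdbnew();

  /* tuning and caching parameters must be set before the DB is opened */
  tchdbtune(hdb, 25000000, -1, -1, HDBTLARGE);
  tchdbsetcache(hdb, 100000);

  if(!tchdbopen(hdb, "data.tch", HDBOWRITER | HDBOCREAT)){
    fprintf(stderr, "open error: %s\n", tchdberrmsg(tchdbecode(hdb)));
    tchdbdel(hdb);
    return 1;
  }

  /* placeholder insert loop; the real data comes from text files */
  char key[32], val[32];
  for(long i = 0; i < 8000000; i++){
    snprintf(key, sizeof(key), "key%ld", i);
    snprintf(val, sizeof(val), "value%ld", i);
    if(!tchdbput2(hdb, key, val)){
      fprintf(stderr, "put error at record %ld: %s\n", i, tchdberrmsg(tchdbecode(hdb)));
      break;
    }
  }

  /* the segfault happens here, while closing */
  if(!tchdbclose(hdb)){
    fprintf(stderr, "close error: %s\n", tchdberrmsg(tchdbecode(hdb)));
  }
  tchdbdel(hdb);
  return 0;
}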

The .tch file is about 2GB (2147483647 bytes). The interesting thing is that it only happens when I insert around 8 million records; with 2 or 3 million the DB closes fine. Inserting 8 million records takes around 3 hours because I read the data from text files.

Any ideas?

Thanks

+3  A: 

Just solved the problem. I'm on a 32-bit system, and on such systems TC can only handle databases up to 2GB. The solution is building TC with the "--enable-off64" option. Something like this:

./configure --enable-off64
make
make install
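
After rebuilding (and re-linking your application against the new library), a rough way to confirm the limit is gone is to log the record count and file size during the load: tchdbrnum and tchdbfsiz return 64-bit values, so you can watch the file grow past 2GB instead of waiting for a crash. The helper below is just a sketch, not part of TC:

#include <tcutil.h>
#include <tchdb.h>
#include <stdio.h>
#include <inttypes.h>

/* hypothetical helper: call every few hundred thousand inserts */
static void report_progress(TCHDB *hdb){
  uint64_t rnum = tchdbrnum(hdb);  /* number of records stored      */
  uint64_t fsiz = tchdbfsiz(hdb);  /* current size of the .tch file */
  fprintf(stderr, "records=%" PRIu64 "  file=%" PRIu64 " bytes\n", rnum, fsiz);
}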
Felipe Hummel