I'm writing something to handle concurrent read/write requests to a database file.
ReentrantReadWriteLock looks like a good match. If all threads access a shared RandomAccessFile object, do I need to worry about the file pointer with concurrent readers? Consider this example:
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.concurrent.locks.ReentrantReadWriteLock;

public class Database {

    private static final int RECORD_SIZE = 50;

    private static Database instance = null;

    private ReentrantReadWriteLock lock;
    private RandomAccessFile database;

    private Database() {
        lock = new ReentrantReadWriteLock();
        try {
            database = new RandomAccessFile("foo.db", "rwd");
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
    }

    public static synchronized Database getInstance() {
        if (instance == null) {
            instance = new Database();
        }
        return instance;
    }
    public byte[] getRecord(int n) {
        byte[] data = new byte[RECORD_SIZE];
        // Begin critical section
        lock.readLock().lock();
        try {
            database.seek(RECORD_SIZE * n);
            database.readFully(data);
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            // Release in a finally block so an IOException can't leak the read lock
            lock.readLock().unlock();
        }
        // End critical section
        return data;
    }
}
In the getRecord() method, is the following interleaving possible with multiple concurrent readers?
Thread 1 -> getRecord(0)
Thread 2 -> getRecord(1)
Thread 1 -> acquires shared lock
Thread 2 -> acquires shared lock
Thread 1 -> seeks to record 0
Thread 2 -> seeks to record 1
Thread 1 -> reads at the file pointer (record 1, not record 0 as intended)
Thread 2 -> reads at the file pointer (which has now advanced past record 1)
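If that interleaving is possible, a reproducer along these lines should make it visible: pre-fill foo.db with records whose bytes equal their own record number, then read them back concurrently and flag any mismatch. The class name, record count, and fill pattern below are just placeholders for the test, not part of my actual code:

import java.io.RandomAccessFile;
import java.util.Arrays;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class RaceTest {
    private static final int RECORD_SIZE = 50;
    private static final int RECORDS = 100;

    public static void main(String[] args) throws Exception {
        // Write distinct, self-identifying records so a misread is detectable.
        try (RandomAccessFile raf = new RandomAccessFile("foo.db", "rwd")) {
            for (int i = 0; i < RECORDS; i++) {
                byte[] record = new byte[RECORD_SIZE];
                Arrays.fill(record, (byte) i);
                raf.seek((long) RECORD_SIZE * i);
                raf.write(record);
            }
        }

        // Hammer getRecord() from several threads and report any wrong record.
        ExecutorService pool = Executors.newFixedThreadPool(8);
        for (int i = 0; i < RECORDS; i++) {
            final int n = i;
            pool.execute(() -> {
                byte[] data = Database.getInstance().getRecord(n);
                if (data[0] != (byte) n) {
                    System.out.println("Asked for record " + n + ", got record " + data[0]);
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}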
If there are indeed potential concurrency issues using ReentrantReadWriteLock and RandomAccessFile, what would an alternative be?
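For context, the kind of alternative I've been looking at is FileChannel's positional read, FileChannel.read(ByteBuffer, long), which takes an explicit offset and, per its Javadoc, does not use or modify the channel's own position, so readers wouldn't share a file pointer at all. A minimal sketch of what I mean (the class name and error handling are mine):

import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;

public class ChannelDatabase {
    private static final int RECORD_SIZE = 50;
    private final FileChannel channel;

    public ChannelDatabase(String path) throws IOException {
        // "rwd" matches the original open mode; getChannel() operates on the same file.
        channel = new RandomAccessFile(path, "rwd").getChannel();
    }

    public byte[] getRecord(int n) throws IOException {
        ByteBuffer buffer = ByteBuffer.allocate(RECORD_SIZE);
        long position = (long) RECORD_SIZE * n;
        // A positional read may return fewer bytes than requested, so loop until the buffer is full.
        while (buffer.hasRemaining()) {
            int read = channel.read(buffer, position + buffer.position());
            if (read < 0) {
                throw new IOException("Record " + n + " is past the end of the file");
            }
        }
        return buffer.array();
    }
}

Writers would presumably still need some coordination so readers don't see half-written records, but at least the position bookkeeping itself wouldn't be shared. Is that the right direction, or is there a better-established pattern for this?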