I'm building a small application, and to reduce hosting costs and dependencies, I am planning to store all persistent data in XML files rather than a SQL Server database.
In this case, the site audience is limited to friends and family - no more than a few concurrent users are ever expected, so the site does not need to scale. Is it feasible to literally open and close an XML file from disk on every transaction? At most a page might display data from a couple of XML files, and occasionally a user will perform an action requiring an update of one.
For example, roughly following the repository pattern for getting and saving "Things," some methods would look like:
public IEnumerable<Thing> GetThings() {
    XElement xml = XElement.Load(_xmlRepositoryPath);
    var q = from s in xml.Descendants("Thing")
            select new Thing {
                //set properties...
            };
    return q;
}

public void SaveThing(Thing t) {
    XElement xml = XElement.Load(_xmlRepositoryPath);
    //update xml...
    xml.Save(_xmlRepositoryPath);
}
Any pitfalls or problems with this approach? I'd rather avoid the additional complexity of a caching or in-memory data layer. Extra credit: at what level of user load or transaction volume do you think this would need to be implemented differently?