views: 665

answers: 6

Hello... my question is similar to one another friend posted here... we are trying to develop an application that may need to support terabytes of information, based on a land registry in Paraguay, with images and normal data.

The problem is that we want to reduce the cost of operation as much as possible, because it's like a competition between companies, and for that reason we want to use a free database. I have read a lot of information about it but I am still confused. We also have to keep in mind that the people who are going to use it are government staff, so the DB has to be easy to manage at the same time.

What would you people recommend?

Thank you very much

+2  A: 

PostgreSQL, SQL Server 2008 and any recent version of Oracle all have spatial indexing, table partitioning and BLOBs, and are capable of acting as the back end of a large geographic database. You might also want to check out two open-source GIS applications, GRASS and QGIS, which might support doing what you want with less modification work than writing a bespoke application. Both can use PostgreSQL and other database back ends.
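For illustration only, here is a rough Python sketch of the kind of spatially indexed table this has in mind, assuming PostgreSQL with the PostGIS extension and the psycopg2 driver; the parcels table, column names, connection string and coordinates are all made up.

```python
# Sketch only: assumes PostgreSQL with the PostGIS extension installed and the
# psycopg2 driver; table, columns and connection details are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=landregistry user=registry_app")
cur = conn.cursor()

# Hypothetical schema: parcel geometry plus a pointer to the image store.
cur.execute("""
    CREATE TABLE IF NOT EXISTS parcels (
        id         serial PRIMARY KEY,
        owner_name text,
        boundary   geometry(Polygon, 4326),  -- PostGIS geometry type
        image_path text                      -- image kept outside the DB
    )
""")
# A GiST index provides the spatial indexing mentioned above.
cur.execute(
    "CREATE INDEX IF NOT EXISTS parcels_boundary_idx "
    "ON parcels USING GIST (boundary)"
)
conn.commit()

# Example spatial query: parcels whose boundary contains a given point.
cur.execute(
    "SELECT id, owner_name FROM parcels "
    "WHERE ST_Contains(boundary, ST_SetSRID(ST_MakePoint(%s, %s), 4326))",
    (-57.63, -25.28),  # lon/lat, roughly near Asunción
)
print(cur.fetchall())
```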

As for support, any commercial or open-source database is going to need the attention of a competent DBA if you want it to work well on terabyte-sized databases. I don't think you will get away with a model of pure end-user support - attempts to do this are unlikely to work.

ConcernedOfTunbridgeWells
A: 

Whilst not meeting your criterion of being free, I would strongly recommend you consider using SQL Server 2008, because of two features in this version which could help:

  • FILESTREAM - allows you to store your binary images within the filesystem, rather than within the database itself. This will make your database much more manageable whilst still allowing you to query the data in the usual way.

  • GEOGRAPHY DATA TYPE - support for geospatial (lat/long) data types is likely to be very valuable to your solution (see the sketch after this list).
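As a rough illustration of the geography data type (not part of the original answer), here is a sketch using Python and pyodbc; the parcels table, its centroid column and the connection string are hypothetical, and it assumes SQL Server 2008 or later.

```python
# Sketch only: assumes SQL Server 2008+ and the pyodbc package; the "parcels"
# table, "centroid" geography column and connection string are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=dbserver;"
    "DATABASE=LandRegistry;Trusted_Connection=yes;"
)
cur = conn.cursor()

# The geography type stores lat/long data; this finds parcels whose centroid
# lies within 500 metres of a given point (STDistance returns metres).
point_wkt = "POINT(-57.63 -25.28)"  # WKT uses lon lat order
cur.execute(
    """
    SELECT id, owner_name
    FROM parcels
    WHERE centroid.STDistance(geography::STGeomFromText(?, 4326)) < 500
    """,
    point_wkt,
)
for row in cur.fetchall():
    print(row.id, row.owner_name)
```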

Good luck!

Ian Nelson
MSDN Geo Types (msdn.microsoft.com/en-gb/librarybb964711.aspx) does help with storing the geo location, but you need 2008, which fails part of the request.
jim
+3  A: 

MySQL and even SQLite (via its R*Tree module) already have spatial indexes, so no problem there.
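As a quick illustration (assuming a SQLite build with the optional R*Tree module compiled in, which most standard builds include), a bounding-box spatial index might look like this; the table name and coordinates are made up.

```python
# Minimal sketch of SQLite's R*Tree spatial index; assumes the R*Tree module
# is compiled in (true of most standard builds). Names and data are made up.
import sqlite3

conn = sqlite3.connect("registry.db")

# R*Tree virtual table: record id plus the bounding box of each parcel.
conn.execute("""
    CREATE VIRTUAL TABLE IF NOT EXISTS parcel_index
    USING rtree(id, min_x, max_x, min_y, max_y)
""")
conn.execute(
    "INSERT INTO parcel_index VALUES (?, ?, ?, ?, ?)",
    (1, -57.64, -57.62, -25.29, -25.27),
)
conn.commit()

# Bounding-box query: parcels overlapping a small search window.
rows = conn.execute(
    "SELECT id FROM parcel_index "
    "WHERE max_x >= ? AND min_x <= ? AND max_y >= ? AND min_y <= ?",
    (-57.635, -57.625, -25.285, -25.275),
).fetchall()
print(rows)
```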

To store the data files you could use a BLOB field, but it's usually much better (and easier to optimise) to store them as plain files. To keep the files related to the DB records you can either put the full path (or URL) in a varchar field, or store each image at a path derived from the record's ID.
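A minimal sketch of the "path derived from the record's ID" idea, with a made-up storage root and sharding scheme so millions of images don't land in a single directory:

```python
# Illustrative only: derive a sharded storage path from a record ID.
# The storage root and naming scheme are placeholders.
from pathlib import Path

STORAGE_ROOT = Path("/data/registry/images")

def image_path(record_id: int) -> Path:
    """Return e.g. /data/registry/images/001/234/1234567.jpg for id 1234567."""
    padded = f"{record_id:09d}"
    return STORAGE_ROOT / padded[:3] / padded[3:6] / f"{record_id}.jpg"

print(image_path(1234567))  # /data/registry/images/001/234/1234567.jpg
```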

To scale easily into the multi-terabyte range, plan from the start on using several servers. If the data is read-mostly, an easy way is to store the images on different hosts, each with a static HTTP server, and have the database record where each image lives. Then put a web app front end on the database, where the URL for each image points directly to the appropriate storage server. That way you can keep adding storage without creating a bottleneck on the 'central' server.
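A sketch of the "database records where each image lives" part; the host names, table layout and URL scheme are all invented for illustration.

```python
# Sketch only: a lookup table mapping each record to the static host and
# relative path holding its image. Hosts, paths and URLs are invented.
import sqlite3

conn = sqlite3.connect("registry.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS images (
        record_id    INTEGER PRIMARY KEY,
        storage_host TEXT NOT NULL,   -- e.g. img3.registry.example
        rel_path     TEXT NOT NULL    -- e.g. 001/234/1234567.jpg
    )
""")
conn.execute(
    "INSERT OR REPLACE INTO images VALUES (?, ?, ?)",
    (1234567, "img3.registry.example", "001/234/1234567.jpg"),
)
conn.commit()

def image_url(record_id: int) -> str:
    """Build a URL pointing straight at the static server holding the file."""
    host, rel_path = conn.execute(
        "SELECT storage_host, rel_path FROM images WHERE record_id = ?",
        (record_id,),
    ).fetchone()
    return f"http://{host}/{rel_path}"

print(image_url(1234567))  # http://img3.registry.example/001/234/1234567.jpg
```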

Javier
+2  A: 

It sounds like the image files will account for a considerable amount of your storage. Don't store them in the database; just store the file location details in the database.

(If you want access via the internet, try Amazon Storage. It isn't free, but it is very cheap and they handle the scalability for you.)
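As a rough sketch, uploading a scan to Amazon S3 with the boto3 package might look like this; the bucket name, key and local filename are placeholders, and it assumes AWS credentials are already configured.

```python
# Sketch only: assumes the boto3 package, configured AWS credentials, and a
# hypothetical, already-created bucket. Filenames and keys are placeholders.
import boto3

s3 = boto3.client("s3")
bucket = "py-land-registry-images"
key = "001/234/1234567.jpg"

# Upload the scanned image, then keep only its location in the database.
s3.upload_file("scans/1234567.jpg", bucket, key)
image_url = f"https://{bucket}.s3.amazonaws.com/{key}"
print(image_url)  # this URL (not the image bytes) is what goes in the DB
```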

Simmo
+1  A: 

Another cautionary note on using BLOBs/CLOBs: I've been bitten by exponential DB growth from storing them internally within the DB.

What about storing the GIS maps on a separate server and just storing the lat/long "shape" of each area within the DB? The GIS data can then be updated separately, without the cost of storing the images in the main database.
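A small illustration of that split, with made-up names: only the shape (as WKT text) and a pointer to the external GIS server would live in the main DB.

```python
# Illustration only: the main DB keeps the parcel shape as WKT text plus a
# pointer to the separate GIS server; names and the URL are made up.
from dataclasses import dataclass

@dataclass
class ParcelRecord:
    parcel_id: int
    shape_wkt: str   # boundary as well-known text, a few KB at most
    map_url: str     # rendered map imagery stays on the separate GIS server

record = ParcelRecord(
    parcel_id=42,
    shape_wkt="POLYGON((-57.64 -25.29, -57.62 -25.29, -57.62 -25.27, "
              "-57.64 -25.27, -57.64 -25.29))",
    map_url="http://gis.registry.example/maps/42.png",
)
print(record.map_url)
```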

Smaller to administer. Less cost to back up.

jim
A: 

Use ESRI's Image Server. You won't need a database to serve the images. It's very easy to use. It also works off files, and it's fast and handles many image formats. Plus it does image processing on the fly and supports many clients: AutoCAD, MicroStation, ArcMap, ArcIMS, ArcServer... etc.

Image Server

Donny V.