I am using PostgreSQL.
I realize PostgreSQL has an Array data type:
http://www.postgresql.org/docs/8.1/interactive/arrays.html
Currently, I need a database to store measurement results from a semiconductor factory.
The factory produces semiconductor units, and every unit can have a variable number of measurement parameters.
I plan to design the table in the following way.
SemicondutorComponent
=====================
ID |
Measurement
=================
ID | Name | Value | SemicondutorComponent_ID
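A minimal DDL sketch of the two-table design above; the table and column names follow my layout, but the column types and the index are assumptions:

```sql
-- Hypothetical DDL for the two-table design; types are assumptions.
CREATE TABLE SemicondutorComponent (
    ID bigserial PRIMARY KEY
);

CREATE TABLE Measurement (
    ID    bigserial PRIMARY KEY,
    Name  text NOT NULL,
    Value double precision NOT NULL,
    SemicondutorComponent_ID bigint NOT NULL
        REFERENCES SemicondutorComponent (ID)
);

-- An index on the foreign key keeps per-unit reads fast.
CREATE INDEX measurement_component_idx
    ON Measurement (SemicondutorComponent_ID);
```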
Example of data :
SemicondutorComponent
=====================
1 |
2 |
Measurement
=================
1 | Width | 0.001 | 1
2 | Height | 0.021 | 1
3 | Thickness | 0.022 | 1
4 | Pad0_Length | 0.031 | 1
5 | Pad1_Width | 0.041 | 1
6 | Width | 0.001 | 2
7 | Height | 0.021 | 2
8 | Thickness | 0.022 | 2
9 | Pad0_Length | 0.031 | 2
10| Pad1_Width | 0.041 | 2
11| Pad2_Width | 0.041 | 2
12| Lead0_Width | 0.041 | 2
Assume the factory produces 24 million units in 1 day:
the SemicondutorComponent table will gain 24 million rows in 1 day.
Assume each SemicondutorComponent unit has about 50 measurement parameters (it can be more or fewer, depending on the SemicondutorComponent type):
the Measurement table will gain 24 million × 50 = 1.2 billion rows in 1 day.
Is it efficient to design it that way?
I want very fast write speed and reasonably fast read speed from the database.
Or should I make use of PostgreSQL's Array facility instead?
SemicondutorComponent
=====================
ID | Array_of_measurement_name | Array_of_measurement_value
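A sketch of the array-based alternative, again with assumed types; the two arrays would have to stay aligned by position so that the Nth name corresponds to the Nth value:

```sql
-- Hypothetical array-based variant; column types are assumptions.
CREATE TABLE SemicondutorComponent (
    ID bigserial PRIMARY KEY,
    Array_of_measurement_name  text[],
    Array_of_measurement_value double precision[]
);

-- One row per unit instead of ~50 rows in a child table.
INSERT INTO SemicondutorComponent
       (Array_of_measurement_name, Array_of_measurement_value)
VALUES (ARRAY['Width', 'Height', 'Thickness'],
        ARRAY[0.001, 0.021, 0.022]);
```

This trades the per-measurement rows for one wide row per unit, but looking up a single named measurement then requires searching the name array rather than an indexed column.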