I just want to give you a little background:

  1. I need to write PL/SQL that will extract 6 million records by joining several tables and write them out to a file.
  2. I need suggestions, specifically on how to fetch that many records. Fetching millions of records in a single pass can be highly resource intensive, so the question is: how should these records be fetched?

Any PL/SQL example will be highly appreciated.

A: 

Do you need to extract the contents of a single table, or a JOIN result? Does the result have to be ordered? If so, you should first optimize your extraction query.

Once you have an optimal query, you can either SELECT directly (e.g. spool the output via sqlplus), write the result to an external (flat file) table, or use ODI.
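
For the sqlplus route, a minimal spool script is usually enough to produce a flat file. The sketch below uses made-up table, column, and file names (orders, customers, /tmp/extract.csv), so substitute your own optimized join:

    SET HEADING OFF
    SET FEEDBACK OFF
    SET PAGESIZE 0
    SET LINESIZE 32767
    SET TRIMSPOOL ON
    SET TERMOUT OFF
    SPOOL /tmp/extract.csv
    -- Placeholder join; replace with the optimized extraction query.
    SELECT o.order_id || ',' || c.customer_name || ',' || o.order_total
      FROM orders o
      JOIN customers c ON c.customer_id = o.customer_id;
    SPOOL OFF
    EXIT

sqlplus writes rows to the spool file as it fetches them, so the whole result set never has to sit in memory on the client.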

vladr
A: 

6 million rows is not all that much. Just write the query and let the optimizer do its thing. If your statistics are up to date, which Oracle also gathers automatically, it should run just fine. Trying to tweak the query will probably make it run slower, unless you understand query optimization better than the team of engineers at Oracle.
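
If you want to be sure the statistics really are current before the big extract, you can also gather them manually. This is just a sketch; the schema and table names are placeholders:

    BEGIN
      -- Refresh optimizer statistics on the tables involved in the join.
      DBMS_STATS.GATHER_TABLE_STATS(ownname => 'APP_OWNER', tabname => 'ORDERS');
      DBMS_STATS.GATHER_TABLE_STATS(ownname => 'APP_OWNER', tabname => 'CUSTOMERS');
    END;
    /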

Using only PL/SQL is fine if all your rows are small enough. The UTL_FILE package restricts you to 32767 characters per line with the PUT_LINE procedure. There is a way around this, but it requires copying the data into a BLOB.
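
If your rows do fit under that limit, one way to do it purely in PL/SQL is to combine UTL_FILE with BULK COLLECT ... LIMIT, so the 6 million rows are fetched in batches instead of all at once. The sketch below assumes a writable directory object called EXTRACT_DIR and made-up table and column names:

    DECLARE
      CURSOR c_extract IS
        -- Placeholder join; replace with the real extraction query.
        SELECT o.order_id || ',' || c.customer_name || ',' || o.order_total AS line
          FROM orders o
          JOIN customers c ON c.customer_id = o.customer_id;
      TYPE t_lines IS TABLE OF VARCHAR2(32767);
      l_lines t_lines;
      l_file  UTL_FILE.FILE_TYPE;
    BEGIN
      l_file := UTL_FILE.FOPEN('EXTRACT_DIR', 'extract.csv', 'w', 32767);
      OPEN c_extract;
      LOOP
        -- Fetch a batch at a time; LIMIT trades memory use against round trips.
        FETCH c_extract BULK COLLECT INTO l_lines LIMIT 10000;
        EXIT WHEN l_lines.COUNT = 0;
        FOR i IN 1 .. l_lines.COUNT LOOP
          UTL_FILE.PUT_LINE(l_file, l_lines(i));
        END LOOP;
      END LOOP;
      CLOSE c_extract;
      UTL_FILE.FCLOSE(l_file);
    EXCEPTION
      WHEN OTHERS THEN
        IF UTL_FILE.IS_OPEN(l_file) THEN
          UTL_FILE.FCLOSE(l_file);
        END IF;
        RAISE;
    END;
    /

A LIMIT of a few thousand rows is usually a reasonable starting point; much larger values mostly just use more PGA memory.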

I'd suggest installing either Python and cx_Oracle or Perl and DBD::Oracle. They will happily churn through 6 million rows with no problem. The defaults are set to pull a good size batch of rows to keep network transmissions down.

If you let me know how you want to proceed, I can post some sample code, but a quick search of the cx_Oracle or DBD::Oracle documentation will turn up the basics of how to fetch records.

ben