I'm using rails with the oracleenhanced adaptor to create a new interface for a legacy application.

Database migrations work successfully, but take an incredibly long time before rake finishes. The database changes themselves happen quickly (1 or 2 seconds), but the db/schema.rb dump takes over an hour to complete. (See example migration below.)

It's a relatively large schema (about 150 tables), but I'm sure it shouldn't be taking this long to dump out each table description.

Is there any way to speed this up by taking the last schema.rb and applying just the change specified in the migration to it? Or can I skip this schema dump altogether?

I understand schema.rb is used to create the test database from scratch each time, but in this case a large chunk of the database logic lives in table triggers, which aren't included in schema.rb anyway, so the rake tests are no good to us regardless. (That's a whole different issue that I need to sort out at some other point.)
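One workaround I'm considering for skipping the dump (just a sketch, assuming a standard Rake setup; the .rake file name is made up) is to clear the db:schema:dump task and replace it with a no-op in a local rakefile:

```ruby
# lib/tasks/skip_schema_dump.rake (hypothetical file name).
# In a real .rake file the require/extend lines are unnecessary;
# they are only here to make the snippet runnable standalone.
require 'rake'
extend Rake::DSL

# Remove the original actions from db:schema:dump, if it exists,
# so "rake db:migrate" no longer triggers the slow dump.
Rake::Task["db:schema:dump"].clear if Rake::Task.task_defined?("db:schema:dump")

namespace :db do
  namespace :schema do
    task :dump do
      # intentionally empty: skip the schema dump
    end
  end
end
```

The obvious trade-off is that schema.rb goes stale, so it would need to be regenerated by hand before rebuilding the test database.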

Thanks

Dave Smylie

dgs@dgs-laptop:~/rails/voyager$ time rake db:migrate
(in /home/dgs/rails/voyager)
== 20090227012452 AddModuleActionAndControllerNames: migrating ================
-- add_column(:modules, :action_name, :text)
   -> 0.9619s
   -> 0 rows
-- add_column(:modules, :controller_name, :text)
   -> 0.1680s
   -> 0 rows
== 20090227012452 AddModuleActionAndControllerNames: migrated (1.1304s) =======


real    87m12.961s
user    0m12.949s
sys 0m2.128s
+3  A: 

After all migrations are applied to the database, rake db:migrate calls the db:schema:dump task to generate the schema.rb file from the current database schema.

db:schema:dump calls the adapter's "tables" method to get the list of all tables, then for each table calls the "indexes" method and the "columns" method. You can find the SQL SELECT statements used in these methods in the activerecord-oracle_enhanced-adapter gem's oracle_enhanced_adapter.rb file. Basically it selects from ALL% or USER% data dictionary views to find all the information.
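In other words, the dump loop looks roughly like this (a simplified sketch, not the gem's or ActiveRecord's actual code); the cost is one or two data-dictionary queries per table, so 150 tables means several hundred round trips:

```ruby
# Simplified sketch of what the schema dumper asks the adapter for.
# The real dumper lives in ActiveRecord::SchemaDumper; the point is
# that "columns" and "indexes" each run data-dictionary queries once
# per table, so any slow dictionary view multiplies quickly.
def dump_schema(adapter)
  adapter.tables.map do |table|
    {
      table:   table,
      columns: adapter.columns(table),  # dictionary query per table
      indexes: adapter.indexes(table),  # another query per table
    }
  end
end
```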

Initially I had issues with the original Oracle adapter when I used it with databases containing a lot of different schemas (performance can be affected by the total number of tables in the database, not just in your schema), and therefore I did some optimizations in the Oracle enhanced adapter. It would be good to find out which methods are slow in your case (I suspect it could be either the "indexes" or the "columns" method, which is executed for each table).

One way to debug this issue would be to put some debug messages in the oracle_enhanced_adapter.rb file so that you can identify which method calls are taking so long.
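For example, a timing wrapper could look like this (a sketch: the adapter class name is taken from the gem, the method names are the ones mentioned above, and Module#prepend requires a modern Ruby; on old Rubies you would use alias_method chaining instead):

```ruby
require 'benchmark'

# Sketch: wrap the schema-dump helpers ("tables", "indexes",
# "columns") with timing output to see which one is slow.
module SchemaDumpTiming
  %w[tables indexes columns].each do |meth|
    define_method(meth) do |*args|
      result = nil
      seconds = Benchmark.realtime { result = super(*args) }
      warn format("%s(%p) took %.2fs", meth, args.first, seconds)
      result
    end
  end
end

# In an initializer (class name assumed from the gem):
# ActiveRecord::ConnectionAdapters::OracleEnhancedAdapter.prepend(SchemaDumpTiming)
```

Each schema-dump call then logs its elapsed time to stderr, so the slow method stands out immediately in the rake output.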

Raimonds Simanovskis
Cool. I'll have a play around in oracle_enhanced_adapter.rb and see what I can find. I suspect it is tied to the schemas, as there are a number of identical schemas with different names in the database. Cheers
Dave Smylie
You can also try the new oracle_enhanced adapter version 1.2.0; it includes some improvements to schema dump performance.
Raimonds Simanovskis
+1  A: 

Use a profiler to figure out what's taking so long.

rogerdpack
+2  A: 

Problem mostly solved after some digging round in oracle_enhanced_adapter.rb.

The problem came down to way too many tables in the local schema (many EBA_%, EVT_%, EMP_%, SMP_% tables had been created there coincidentally at some point), archive tables being included in the dump, and a select from the data dictionaries taking 14 seconds to execute.

To fix the speed, I did three things:

  1. Dropped all unneeded tables (about 250 out of 500)
  2. Excluded archive tables from the schema dump
  3. Cached the result of the long running query

This improved the time from the migration/schema dump for the remaining 350 tables from about 90 minutes to about 15 seconds. More than fast enough.

My code is as follows (for inspiration, not copying and pasting: this code is fairly specific to my database, but you should be able to get the idea). You need to create the temp table manually. It takes about 2 or 3 minutes to build, which is still too long to regenerate with each migration, and it's fairly static anyway. =)

module ActiveRecord
  module ConnectionAdapters
    class OracleEnhancedAdapter
      # Only list tables owned by the session user, and skip archive
      # tables (names starting with "A_").
      def tables(name = nil)
        select_all("select lower(table_name) from all_tables where owner = sys_context('userenv','session_user') and table_name not like 'A!_%' escape '!'").inject([]) do |tabs, t|
          tabs << t.to_a.first.last
        end
      end

      # TODO: think of some way to automatically create the
      # rails_temp_index_table.
      #
      # Table created by:
      #   create table rails_temp_index_table as
      #   SELECT lower(i.index_name) as index_name, i.uniqueness,
      #          lower(c.column_name) as column_name, i.table_name
      #     FROM all_indexes i, user_ind_columns c
      #    WHERE c.index_name = i.index_name
      #      AND i.owner = sys_context('userenv','session_user')
      #      AND NOT EXISTS (SELECT uc.index_name FROM user_constraints uc
      #                       WHERE uc.constraint_type = 'P'
      #                         AND uc.index_name = i.index_name);
      def indexes(table_name, name = nil) #:nodoc:
        result = select_all(<<-SQL, name)
          SELECT index_name, uniqueness, column_name
            FROM rails_temp_index_table
           WHERE table_name = '#{table_name.to_s.upcase}'
           ORDER BY index_name
        SQL

        current_index = nil
        indexes = []

        result.each do |row|
          if current_index != row['index_name']
            indexes << IndexDefinition.new(table_name, row['index_name'], row['uniqueness'] == "UNIQUE", [])
            current_index = row['index_name']
          end
          indexes.last.columns << row['column_name']
        end

        indexes
      end
    end
  end
end
Dave Smylie