Hi,
In my Rails application, I have a file in lib that, among other things, sets up a filter that runs on all controllers.
When running in the development environment everything works fine. However, in production the filter goes missing. The funny thing is that, by inspecting the filter_chain, I noticed the other filters remain, e.g. those defined in plugins or later in the specific controller class.
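For the record, this is roughly how I inspected the chain from script/console (a sketch; filter_chain and the filters' method reader are the Rails 2.x API, and the sample output is illustrative):
$ script/console production
>> FooController.filter_chain.map(&:method)
# development lists :foobar among the filter methods; production doesn't,
# while the plugin- and controller-defined filters show up in both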
I've tested this with both rails edge and v2.3.0.
Testing update:
I've now tested with older Rails versions and found the issue to be present back to v2.1.0, but not in v2.0.5. I bisected between them and found Rails commit 986aec5 to be the culprit.
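Roughly, the bisect went like this (a sketch; the test app was pointed at the Rails checkout, e.g. via vendor/rails, and the curl test below was re-run at each step):
$ git bisect start
$ git bisect bad v2.1.0
$ git bisect good v2.0.5
# run the test app against the suggested commit, then mark it:
$ git bisect good   # or: git bisect bad
# ...repeat until git converges on 986aec5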
I've isolated the behavior to the following tiny test case:
# app/controllers/foo_controller.rb
class FooController < ApplicationController
  def index
    render :text => 'not filtered'
  end
end

# lib/foobar.rb
ActionController::Base.class_eval do
  before_filter :foobar

  def foobar
    render :text => 'hi from foobar filter'
  end
end
# config/environment.rb (at end of file)
require 'foobar'
Here's the output I get when running under the development environment:
$ script/server &
$ curl localhost:3000/foo
> hi from foobar filter
And here's the output for the production environment:
$ script/server -e production &
$ curl localhost:3000/foo
> not filtered
As alluded to above, it works fine in every environment when I do the same thing from a plugin: all I need to do is move the contents of lib/foobar.rb into the plugin's init.rb file.
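For reference, the working plugin variant is literally the same snippet (the plugin name foobar is my own choice here):
# vendor/plugins/foobar/init.rb
ActionController::Base.class_eval do
  before_filter :foobar

  def foobar
    render :text => 'hi from foobar filter'
  end
end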
So in a way I already have a workaround, but I'd like to understand what's going on and what causes the filter to go missing in production.
I suspect it's something in the different ways Rails handles class loading across environments, but I need to dig deeper.
Update:
Indeed, I've now narrowed it down to the following config line:
config.cache_classes = false
If, in production.rb, config.cache_classes is changed from true to false, the test application works properly.
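In other words, in an otherwise stock production environment file:
# config/environments/production.rb
# stock setting, filter goes missing:
#   config.cache_classes = true
# changed setting, filter fires as expected:
config.cache_classes = false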
I still wonder why the class caching/reloading machinery causes the filter to go missing.