Hypothetical situation: I'm writing a sound library that I want to run on multiple platforms. I'll try to make as much of the code as possible platform-independent, but some of it will certainly need to change for Windows versus OS X versus Linux.
So I write all these different implementations, but I don't want the end user to have to make their program depend on Linux or Windows or whatever. I also don't want to maintain four different interfaces to my API. (Note that these are just some of the reasons you might create a factory; there are certainly other situations.)
So I define a nice generic `SoundObject` base class that declares all the methods the client gets to use. Then I make my `LinuxSoundObject`, `WindowsSoundObject`, and five others derive from `SoundObject`. But I'm going to hide all these concrete implementations from the user and only ever hand them a `SoundObject`. Instead of instantiating one yourself, you have to call my `SoundObjectFactory` to grab what appears to you to be a plain old `SoundObject`, but really I've chosen the correct runtime type for you and instantiated it myself.
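Here's a minimal sketch of how that might look in C++. All the names are hypothetical (this isn't a real library), and the platform selection is done with preprocessor checks just to keep the example short:

```cpp
#include <memory>

// The only type the client ever sees.
class SoundObject {
public:
    virtual ~SoundObject() = default;
    virtual void play() = 0;
    virtual void stop() = 0;
};

// One concrete implementation per platform, hidden in the library's source.
class LinuxSoundObject : public SoundObject {
public:
    void play() override { /* e.g., ALSA/PulseAudio calls would go here */ }
    void stop() override { /* ... */ }
};

class WindowsSoundObject : public SoundObject {
public:
    void play() override { /* e.g., WASAPI calls would go here */ }
    void stop() override { /* ... */ }
};

// The factory picks the correct runtime type; the client never names it.
class SoundObjectFactory {
public:
    static std::unique_ptr<SoundObject> create() {
#if defined(_WIN32)
        return std::make_unique<WindowsSoundObject>();
#elif defined(__linux__)
        return std::make_unique<LinuxSoundObject>();
#else
#       error "Unsupported platform"
#endif
    }
};
```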
Two years later, a new OS comes along and displaces Windows. Instead of forcing you to rewrite your software, I just update my library to support the new platform, and you never see a change in the interface.
This is all pretty contrived, but hopefully you get the idea.
Factories isolate the consumers of an interface from the runtime type (i.e., the implementation) actually being used.
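For illustration, client code using the hypothetical factory above might look like this; notice that it names only `SoundObject` and `SoundObjectFactory`, never a platform-specific class:

```cpp
int main() {
    // The client depends only on the abstract interface;
    // the factory decides which concrete type backs it.
    auto sound = SoundObjectFactory::create();
    sound->play();
    sound->stop();
}
```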