I am trying to load a SQL table from a flat file. The flat file I am talking about is comma-separated: it contains all the data required to populate the table, with each column separated by a comma (","). I need some way to load this content into the table quickly.
If you are using SQL Server, use BULK INSERT
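A minimal sketch of what that looks like (the table name, file path, and terminators here are assumptions; adjust them to your data):

```sql
-- Load a comma-separated file straight into an existing table.
-- dbo.Customers and the path are hypothetical examples.
BULK INSERT dbo.Customers
FROM 'C:\data\Customers.csv'
WITH (
    FIELDTERMINATOR = ',',   -- columns are separated by commas
    ROWTERMINATOR = '\n'     -- one row per line
);
```

If your file has a header row, add `FIRSTROW = 2` to skip it.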
If you are using Oracle, see my answer here
Regardless of which database management system you are using, you could use a scripting language (such as Perl or PHP) to set up a connection to your database, parse the file, and insert the data. Of course, you would have to know a scripting language...
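As a sketch of that approach, here is a short Python version using only the standard library; sqlite3 stands in for whatever DBMS driver you actually use, and the table layout and sample data are made up for illustration:

```python
import csv
import os
import sqlite3
import tempfile

# Create a small sample CSV file (hypothetical data).
fd, path = tempfile.mkstemp(suffix=".csv")
with os.fdopen(fd, "w", newline="") as f:
    f.write("1,Alice,Boston\n2,Bob,Denver\n")

# Connect and create the target table; swap sqlite3 for your own driver.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (id TEXT, name TEXT, city TEXT)")

# Parse the file with the csv module (handles quoting/escaping for you).
with open(path, newline="") as f:
    rows = list(csv.reader(f))

# Parameterized placeholders avoid quoting bugs and SQL injection;
# executemany is much faster than one execute() call per row.
conn.executemany("INSERT INTO people VALUES (?, ?, ?)", rows)
conn.commit()
os.remove(path)

print(conn.execute("SELECT COUNT(*) FROM people").fetchone()[0])  # prints 2
```

For large files, batching rows through `executemany` inside a single transaction is the main thing that keeps this fast.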
use the mysql client? (mysqldump exports a database; to load a file of SQL statements you pipe it into mysql.)

mysql -u username -p database_name < sql_file.sql
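Since the source here is a comma-separated file rather than a dump of SQL statements, MySQL's LOAD DATA INFILE loads it directly without converting each row into an INSERT first. A sketch, with a hypothetical path and table name:

```sql
-- Bulk-load a CSV file into an existing MySQL table.
-- '/tmp/data.csv' and my_table are placeholders.
LOAD DATA INFILE '/tmp/data.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
```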
This sounds a little old-fashioned, but for tasks like this I use an editor that can record and replay macros.
I use TextPad (www.textpad.com) for this (yes, I bought a license); you might also use UltraEdit (www.ultraedit.com) or something similar. It's as simple as starting the macro recorder, editing the first line so that it is valid SQL, moving to the next line, and stopping the recorder. Then you let the editor repeat your macro to the end of the file.
The main advantage is that after you have processed the file, you can store it and put it under version control. Done properly, this works for every database (or tool) that can execute files of SQL commands.
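The macro's job is just to wrap each data line into a statement, for example (table and column values here are made up):

```sql
-- before the macro (one line of the CSV file):
--   1,Alice,Boston
-- after the macro has run on that line:
INSERT INTO people VALUES (1, 'Alice', 'Boston');
```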
Take a look at these speed comparisons and decide what suits you best: http://weblogs.sqlteam.com/mladenp/archive/2006/07/22/10742.aspx
For SQL Server 2005, another option would be Integration Services (SSIS). With SSIS you can do a lot more work on the data during the import process (for example, looking up values in other tables, filtering out rows, or importing multiple tables).