For the last couple of months, my packages have been taking more and more time to complete as scheduled jobs. In all my SSIS packages I truncate the tables in SQL Server and then bulk insert data into them.
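Roughly, each package does the equivalent of the following (the table name and file path here are just placeholders for illustration):

    -- Empty the target table, then reload it from the source file
    TRUNCATE TABLE dbo.SalesStaging;

    BULK INSERT dbo.SalesStaging
    FROM 'D:\feeds\sales.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        TABLOCK   -- table lock, which allows minimal logging under the right recovery model
    );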
I assume that, over time, the amount of data you need to bulk insert keeps growing with this design? That would explain why the job is becoming progressively slower.
I also have a lot of non-clustered indexes on those tables. Do I need to drop all the indexes on those tables before executing the bulk insert SSIS packages, and then re-create the non-clustered indexes once the package succeeds, so that the performance and run-time problems are solved?
The indexes will definitely slow down the insert of the data. However, it could be a lot more than just your non-clustered indexes. Depending on the values of the columns that make up the clustered index (i.e. the primary key), it could be the clustered index that is slowing down the inserts. This may depend on the ordering of the data you are inserting, which is something you may be able to control.
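For example, if your source file happens to be sorted by the clustered key already, you can tell BULK INSERT that, and SQL Server can skip an extra sort during the load. A minimal sketch, assuming a table dbo.Sales whose clustered index is on SaleID and a file pre-sorted on that column (both are assumptions):

    -- Declare that the file is already ordered by the clustered key,
    -- so the engine does not need to sort the rows before inserting.
    BULK INSERT dbo.Sales
    FROM 'D:\feeds\sales.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        ORDER (SaleID ASC),
        TABLOCK
    );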
As you have already highlighted, you could try dropping and re-creating the indexes after the insert. However, you have to factor in the time taken to re-create the indexes and see whether that additional time still leaves you with enough of an overall performance improvement.
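If you do try it, note that you do not have to literally drop the indexes and keep their definitions in a script somewhere; disabling and then rebuilding them gives the same effect with less to maintain. A rough sketch with placeholder index and table names:

    -- Disable the non-clustered indexes before the load; disabling keeps
    -- the index definition but throws away its data.
    ALTER INDEX IX_Sales_CustomerID ON dbo.Sales DISABLE;
    ALTER INDEX IX_Sales_OrderDate  ON dbo.Sales DISABLE;

    -- ... truncate and bulk insert here ...

    -- Rebuild all indexes on the table once the load has succeeded.
    ALTER INDEX ALL ON dbo.Sales REBUILD;

Just be careful not to disable the clustered index, as that makes the whole table inaccessible until it is rebuilt.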
In general, I would suggest that you first log the amount of time taken in each step of your SSIS package, to pinpoint which step is really the slowest, and then decide how it can be improved. Without that data, you are shooting in the dark.
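A simple way to collect that data is a small log table that each step writes to from Execute SQL Tasks; the table and step names below are only illustrative:

    -- One row per step execution, written by Execute SQL Tasks
    CREATE TABLE dbo.PackageStepLog (
        LogID       int IDENTITY(1,1) PRIMARY KEY,
        PackageName sysname   NOT NULL,
        StepName    sysname   NOT NULL,
        StartTime   datetime2 NOT NULL,
        EndTime     datetime2 NULL
    );

    -- At the start of a step:
    INSERT dbo.PackageStepLog (PackageName, StepName, StartTime)
    VALUES ('LoadSales', 'BulkInsertSales', SYSDATETIME());

    -- At the end of the step, update the row's EndTime with SYSDATETIME();
    -- comparing StartTime/EndTime across runs shows where the time goes.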