It all depends on the performance requirements and the general practices you use. Rune's answer can be perfectly fine. If you are inserting 100,000 rows, look at a bulk inserter such as SqlBulkCopy.
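A minimal sketch of SqlBulkCopy, assuming a filled DataSet and the Tag table from the question (the column names here are made up for illustration):

using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
    {
        // destination is the Tag table; column names are placeholders
        bulk.DestinationTableName = "dbo.Tag";
        bulk.ColumnMappings.Add("ItemID", "ItemID");
        bulk.ColumnMappings.Add("TagName", "TagName");
        bulk.WriteToServer(ds.Tables[0]);
    }
}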
If you are used to writing stored procs, and you are lucky enough to be running SQL Server 2008, you can make use of table-valued parameters (TVPs).
This allows you to do stuff like this:
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("usp_ins_Portfolio", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    // add the DataTable from the DataSet as a table-valued parameter
    SqlParameter sp = cmd.Parameters.AddWithValue("@Portfolio", ds.Tables[0]);
    // notice Structured -- this is what marks the parameter as a TVP
    sp.SqlDbType = SqlDbType.Structured;
    conn.Open();
    cmd.ExecuteNonQuery();
}
Then a single call to a stored proc can insert all the rows required into the Tag table.
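For that to work, the database needs a user-defined table type and the proc has to take it as a READONLY parameter. A rough sketch of the T-SQL side, assuming usp_ins_Portfolio feeds the Tag table (the type name and columns are assumptions, match them to your schema):

CREATE TYPE PortfolioTableType AS TABLE
(
    ItemID int,
    TagName nvarchar(50)
);
GO
CREATE PROCEDURE usp_ins_Portfolio
    @Portfolio PortfolioTableType READONLY
AS
BEGIN
    INSERT INTO dbo.Tag (ItemID, TagName)
    SELECT ItemID, TagName FROM @Portfolio;
END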
For SQL 2005 and below, I will usually use a single comma-separated param for all the values and split it in TSQL inside a stored proc. This tends to perform quite well and avoids mucking around with temp tables. It is also secure, but you have to ensure you use a text input param for the proc, or have some sort of limit or batching mechanism in code, so you do not truncate long lists.
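On the client side that just means joining the values into one string and being explicit about the parameter type so long lists are not cut off. A sketch, where usp_ins_TagList and @TagList are hypothetical names:

// Build the list client-side; tagIds is whatever collection you have.
string csv = string.Join(",", tagIds);
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("usp_ins_TagList", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    // size -1 maps to nvarchar(max), so the list is not truncated
    cmd.Parameters.Add("@TagList", SqlDbType.NVarChar, -1).Value = csv;
    conn.Open();
    cmd.ExecuteNonQuery();
}

The proc then splits @TagList apart and inserts the pieces, which is exactly what the article below covers.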
For ideas on how to split up lists in TSQL, have a look at Erland Sommarskog's excellent article on arrays and lists in SQL Server. There is a separate SQL 2000 version of the article as well.