I am working on a page with several dynamically generated tables where the original developers used approximately the following logic:
<table>
<tr>...</tr>
<asp:PlaceHolder ID="ph1" runat="server" />
</table>
In the code-behind:
// Do lots of stuff
HtmlTableRow row = new HtmlTableRow();
HtmlTableCell cell = new HtmlTableCell();
// set cell text, formatting, etc.
row.Cells.Add(cell);
// ... repeat for every cell in the row ...
ph1.Controls.Add(row);
// repeat the above logic for every row in the table
On one page, methods like this add up to 10-15 rows of 10-15 columns each, spread over 4-5 tables. The table structures are fixed, so the same output could be produced by hard-coding the tables in the markup and placing Literals or Labels in each cell (a sketch of that alternative is below). The page itself has some performance issues, and this dynamic control creation is one factor that could be contributing to the poor performance.
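For reference, a minimal sketch of the hard-coded alternative I have in mind. The control IDs (litCell1, litCell2) and the GetCellValue helper are just illustrative names standing in for the existing data lookup, not actual code from the page:

<table>
    <tr>
        <td><asp:Literal ID="litCell1" runat="server" /></td>
        <td><asp:Literal ID="litCell2" runat="server" /></td>
        <!-- ... one Literal per cell, in the fixed table structure ... -->
    </tr>
</table>

And in the code-behind, the data would simply be assigned to the pre-declared controls:

// Populate the pre-declared Literals instead of constructing controls at runtime
// (GetCellValue is a hypothetical stand-in for however the data is currently retrieved)
litCell1.Text = GetCellValue(0, 0);
litCell2.Text = GetCellValue(0, 1);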
The question is:
- Is dynamically creating this many table cells and building the tables in the code-behind, compared with hard-coding the tables and using Literals/Labels to populate the data, likely to introduce significant performance problems (i.e., is it worth switching over)?
- Or is it more accurate to say that, in a situation like this, the difference would not be significant (or at least not significant enough to justify the hours it would take to rewrite the dynamic table generation as hard-coded tables plus Literals)?