views: 438
answers: 6

My friend uses Visual Studio to develop websites in ASP.NET. She uses only the Master Page facility; other than that, it's 100% plain HTML and CSS.

Is there a way to export the website to HTML pages based upon their master pages?

If not, the options are either loading each page manually and saving the HTML, or writing a little app that does it.

Alternatively, does anyone know of a tool to achieve something similar?

+1  A: 

I do not really know how to export an entire site to a local copy.

There are, however, tools for this - website downloaders. I know of one, TeleportPro; there should be others. Check them out if it sounds like an option to you.

User
A: 

Visual Studio doesn't have this ability out of the box. However, it should be possible to write a tool that walks through a site map, captures the rendered HTML from the response object, and then writes it to a file.
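A minimal sketch of such a tool. The base URL and the page list are placeholders for illustration - substitute the actual site and its pages (or drive the loop from a real site map):

```vbnet
' Sketch only: baseUrl and pages are placeholder values, not a real site.
Imports System.IO
Imports System.Net

Module SiteExporter
    Sub Main()
        Dim baseUrl As String = "http://localhost/mysite/"
        Dim pages As String() = {"Default.aspx", "About.aspx", "Contact.aspx"}

        Using client As New WebClient()
            For Each page As String In pages
                ' The server merges the MasterPage before responding,
                ' so what comes back is the final rendered HTML.
                Dim html As String = client.DownloadString(baseUrl & page)

                ' Save each page with an .html extension instead of .aspx.
                Dim fileName As String = Path.ChangeExtension(page, ".html")
                File.WriteAllText(fileName, html)
            Next
        End Using
    End Sub
End Module
```

Run it against the site while it is hosted locally and you end up with a folder of static .html files ready to upload anywhere.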

Soviut
That was exactly what I had in mind as a "last resort"!
joshcomley
+1  A: 

You could give Macromedia Dreamweaver a shot if you feel like experimenting. It caters for both client-side and server-side page development.

+1 I used to use the Dreamweaver templating system on projects where the client had no dynamic backend. It is essentially the master page system in ASP.NET, but Dreamweaver generates the pages for you.
Soviut
+1  A: 

When using MasterPages, the content of the MasterPage is merged with the content page on the server side (either at precompilation or on the page's first request). So the content pages and MasterPage need to be compiled via aspnet_compiler at some point. See the "Runtime Behavior" section of this MSDN article.

Your friend may want to use old-fashioned server-side includes (which are essentially what a MasterPage is doing for you anyway):

<!--#include virtual="/includes/header.html" -->
<!--#include virtual="/includes/nav.html" -->

<p> content </p>

<!--#include virtual="/includes/footer.html" -->

If this is blocked by your web server/host of choice (some disable it for security reasons), then I would create a main index page and use an Ajax call to fill a content DIV. Of course, if JavaScript is disabled, your visitors will not see any content.

Rob Allen
+1  A: 

I think you're going to need to roll your own for this one. This function fetches a URL and returns its contents:

  ' Requires Imports System.IO, System.Net and System.Text.
  Public Shared Function GetHTTPContent(ByVal url As String) As String
      Dim req As WebRequest = WebRequest.Create(url)
      Dim encode As Encoding = Encoding.GetEncoding("utf-8")

      ' Using blocks guarantee the response and reader are disposed,
      ' even if reading fails partway through.
      Using resp As WebResponse = req.GetResponse()
          Using sr As New StreamReader(resp.GetResponseStream(), encode)
              Return sr.ReadToEnd()
          End Using
      End Using
  End Function
Corey Downie
A: 

Lots of these tools exist; here is one:

HTTrack Website Copier

This is also called spidering, because it's the same thing search engine crawlers do.

MatthewMartin