Can the Stack Overflow community recommend tools, techniques, and processes for checking the structure and content of a moderately large, collaboratively edited, evolving website? Doing this manually is time-consuming, and I would like to automate as much of it as possible with software.
I am interested in checks on the structure of the website: are there pages without links, pages with dead links, or pages with missing resources? Can the structure be visualised?
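For example, the dead-link check I have in mind looks roughly like the sketch below (Python with requests and BeautifulSoup is simply what I know; the start URL is a placeholder):

```python
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://example.com/"  # placeholder for the site root

def crawl(start_url):
    """Breadth-first crawl of one host, reporting dead or unreachable links."""
    host = urlparse(start_url).netloc
    seen, queue = {start_url}, [start_url]
    while queue:
        url = queue.pop(0)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            print(f"UNREACHABLE {url}: {exc}")
            continue
        if resp.status_code >= 400:
            print(f"DEAD {resp.status_code} {url}")
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue  # do not parse images, PDFs, etc.
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            # Stay on the original host and skip pages already queued.
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)

crawl(START_URL)
```

Presumably the same crawl could also record the link graph (say, as edges for Graphviz) to cover the visualisation part of the question, but I would rather use an existing tool than maintain this myself.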
I am interested in checks on the content of the pages: are there images without alt text, images above a certain file size, or images inappropriately scaled down from full size in the page? What bad practices can be detected programmatically, and with what tools? Are there tools that can warn of incompatibilities with major browsers?
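To make the image checks concrete, this is roughly what I would want automated, again as a Python sketch (requests, BeautifulSoup, and Pillow are assumptions, and the page URL and 200 kB threshold are arbitrary examples):

```python
from io import BytesIO
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup
from PIL import Image

PAGE_URL = "https://example.com/page.html"  # placeholder
MAX_BYTES = 200 * 1024  # arbitrary file-size threshold

soup = BeautifulSoup(requests.get(PAGE_URL, timeout=10).text, "html.parser")
for img in soup.find_all("img"):
    src = img.get("src")
    if not src:
        continue
    src = urljoin(PAGE_URL, src)
    if img.get("alt") is None:
        print(f"NO ALT TEXT  {src}")  # alt="" may be intentional for decorative images
    data = requests.get(src, timeout=10).content
    if len(data) > MAX_BYTES:
        print(f"LARGE FILE   {len(data)} bytes  {src}")
    # Flag images served much larger than the width the page displays them at.
    width_attr = img.get("width")
    if width_attr and width_attr.isdigit():
        intrinsic = Image.open(BytesIO(data)).width
        if intrinsic > 2 * int(width_attr):
            print(f"SCALED DOWN  {intrinsic}px shown at {width_attr}px  {src}")
```

A sketch like this only catches scaling declared through width attributes, not CSS, which is one reason I am hoping a proper tool exists.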
Can the content be checked easily for spelling and grammar, for example? What about code examples?
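For spelling, the sort of automated pass I imagine is sketched below, assuming the pyspellchecker package; a real pipeline would also need a custom dictionary for the site's own jargon:

```python
import re

import requests
from bs4 import BeautifulSoup
from spellchecker import SpellChecker

PAGE_URL = "https://example.com/page.html"  # placeholder

html = requests.get(PAGE_URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
# Drop script, style, and code blocks so code is not spell-checked as prose.
for tag in soup(["script", "style", "code", "pre"]):
    tag.decompose()
words = re.findall(r"[A-Za-z']+", soup.get_text())
spell = SpellChecker()
for word in sorted(spell.unknown(words)):
    print(f"SUSPECT: {word}")
```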
Are there tools for tracking which pages have actually received a manual review?
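Failing a dedicated tool, I could imagine tracking reviews with something as crude as the following sketch, where the site root and a reviews.csv file (columns path,reviewed_on in YYYY-MM-DD form) are made-up details; it flags pages never reviewed or edited since their last recorded review:

```python
import csv
import os
from datetime import datetime

SITE_ROOT = "/var/www/site"  # placeholder
REVIEW_LOG = "reviews.csv"   # columns: path,reviewed_on (YYYY-MM-DD)

# Load the last recorded review date for each page.
reviewed = {}
with open(REVIEW_LOG, newline="") as fh:
    for row in csv.DictReader(fh):
        reviewed[row["path"]] = datetime.fromisoformat(row["reviewed_on"])

for dirpath, _, filenames in os.walk(SITE_ROOT):
    for name in filenames:
        if not name.endswith(".html"):
            continue
        full = os.path.join(dirpath, name)
        path = os.path.relpath(full, SITE_ROOT)
        modified = datetime.fromtimestamp(os.path.getmtime(full))
        last_review = reviewed.get(path)
        if last_review is None:
            print(f"NEVER REVIEWED  {path}")
        elif modified > last_review:
            print(f"STALE REVIEW    {path} (edited {modified:%Y-%m-%d})")
```

But maintaining the log by hand seems error-prone, so pointers to tools that do this properly would be appreciated.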