It seems that this is not possible. Under Status and Reports > Crawl Diagnostics there are two styles of report available: the directory drill-down 'Tree View' and the 'List View', which shows 100 URLs at a time. Some people have written programs to page through the List View, but these seem to fail after a few thousand URLs.
My advice is to use your server logs instead. Make sure that 404 and referrer logging are enabled on your web server: the referrer tells you which page contains the broken link, which is the page you will probably want to correct. You could then use a log file analyser to generate a broken link report.
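If you are running Apache, for example, the standard 'combined' log format already records both the status code and the referrer. A minimal sketch, assuming Apache with logs under /var/log/apache2:

```
# The combined format records the referrer (%{Referer}i), which identifies
# the page containing the broken link. The log path below is an assumption.
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
CustomLog /var/log/apache2/access.log combined
```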
To monitor your broken links effectively over the long term, you may want to set up a cron job to do the following:
- Use `grep` to extract the lines containing 404 responses from the server log file.
- Use `sed` to strip each line down to just the requested URL and the referrer URL.
- Use the `sort` and `uniq` commands to remove duplicates from the list.
- Output the result to a new, dated file each time so that you can monitor changes over time (a combined sketch follows this list).
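Putting those steps together, here is a minimal sketch of such a script. The log path, output directory, and the assumption of an Apache combined-format log are all mine; adjust them to match your server:

```sh
#!/bin/sh
# Sketch: report (requested URL, referrer) pairs for 404 responses
# from an Apache combined-format access log. Paths are assumptions.
LOG=/var/log/apache2/access.log
OUT=/var/reports/broken-links-$(date +%F).txt
mkdir -p /var/reports

# Combined format:
#   host ident user [date] "METHOD /url HTTP/x.x" status bytes "referrer" "agent"
# grep keeps only 404 lines; sed reduces each to "requested-url referrer".
grep '" 404 ' "$LOG" \
  | sed -E 's/^[^"]*"[A-Z]+ ([^ ]+) [^"]*" 404 [0-9-]+ "([^"]*)".*/\1 \2/' \
  | sort | uniq > "$OUT"
```

You could then schedule it from cron, for example nightly (the script path here is hypothetical):

```
30 2 * * * /usr/local/bin/broken-links.sh
```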