Anyone have any great ideas besides storing a list of all TLDs?
No, because each TLD differs in what counts as a subdomain, a second-level domain, and so on.
Keep in mind that there are top-level domains, second-level domains, and subdomains. Technically speaking, everything except the TLD is a subdomain.
In the domain.co.uk example, domain is a subdomain, co is a second-level domain, and uk is the TLD.
So the problem is more complex than it appears at first blush, and it depends on how each TLD is managed. You'll need a database of all the TLDs that includes their particular partitioning: what counts as a second-level domain and what counts as a subdomain. There aren't too many TLDs, so the list is reasonably manageable, but collecting all that information isn't trivial. There may already be such a list available.
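To make the distinction concrete, here is a minimal sketch (in Python, with illustrative hostnames) of why a one-size-fits-all rule such as "take the last two labels" breaks down:

```python
def naive_registered_domain(hostname):
    # Naively assume the registered domain is always the last two labels.
    return ".".join(hostname.split(".")[-2:])

print(naive_registered_domain("www.example.com"))    # example.com  - correct
print(naive_registered_domain("www.example.co.uk"))  # co.uk        - wrong, should be example.co.uk
```

The second result is wrong because co.uk is itself the registration boundary, which is exactly the per-TLD partitioning such a database would have to capture.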
Looks like http://publicsuffix.org/ is one such list: all the common suffixes (.com, .co.uk, etc.) in a form suitable for searching. It still won't be trivial to parse, but at least you don't have to maintain the list yourself.
A "public suffix" is one under which
Internet users can directly register
names. Some examples of public
suffixes are ".com", ".co.uk" and
"pvt.k12.wy.us". The Public Suffix
List is a list of all known public
suffixes.
The Public Suffix List is an
initiative of the Mozilla Foundation.
It is available for use in any
software, but was originally created
to meet the needs of browser
manufacturers. It allows browsers to,
for example:
- Avoid privacy-damaging "supercookies" being set for
high-level domain name suffixes
- Highlight the most important part of a domain name in the user
interface
- Accurately sort history entries by site
Looking through the list, you can see it's not a trivial problem. I think a list is the only correct way to accomplish this...
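For the parsing side, here is a rough sketch of matching a hostname against a local copy of the list. The file name public_suffix_list.dat is just the default download name, the helper names are purely illustrative, and exception rules (lines starting with "!") are ignored to keep the sketch short, so a handful of entries would be handled incorrectly:

```python
def load_rules(path):
    """Read the Public Suffix List, skipping comments and blank lines."""
    rules = set()
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("//"):
                rules.add(line)
    return rules

def public_suffix(hostname, rules):
    """Return the longest public suffix that matches hostname."""
    labels = hostname.lower().rstrip(".").split(".")
    suffix = labels[-1]  # fall back to the last label if nothing matches
    for i in range(len(labels)):
        candidate = ".".join(labels[i:])
        wildcard = ".".join(["*"] + labels[i + 1:])
        if candidate in rules or wildcard in rules:
            suffix = candidate
            break  # candidates are tried longest-first, so the first hit wins
    return suffix

def registered_domain(hostname, rules):
    """The public suffix plus one more label - the part someone registered."""
    labels = hostname.lower().rstrip(".").split(".")
    cut = len(labels) - len(public_suffix(hostname, rules).split(".")) - 1
    return ".".join(labels[cut:]) if cut >= 0 else hostname

rules = load_rules("public_suffix_list.dat")
print(registered_domain("forums.example.co.uk", rules))  # example.co.uk
print(registered_domain("www.example.com", rules))       # example.com
```

A real implementation would also need the exception rules, internationalized (punycode) names, and the ICANN vs. private sections of the file; existing parsers built on the list already handle those details, so in practice you'd reach for one of them rather than rolling this by hand.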