tags:
views: 320
answers: 6

I was reading O'Reilly's Learning XML Book and read the following

HTML was in some ways a step backward. To achieve the simplicity necessary to be truly useful, some principles of generic coding had to be sacrificed. ... To return to the ideals of generic coding, some people tried to adapt SGML for the web ... This proved too difficult.

This reminded me of a StackOverflow Podcast where they discussed the poorly formed HTML that works on browsers.

My question is, would the Internet still be as successful if the standards were as strict as developers would want them to be now?

+3  A: 

Most of the ambiguity and inconsistency on the web today isn't from things like unclosed tags - it's from CSS semantics being inconsistent from one browser to the next. Even if all web pages were miraculously well-formed XML, it wouldn't help much.

+4  A: 

The fact that HTML simply "marks up" text, and is not a language with operators, loops, functions, and other common programming-language constructs, is what allows it to be loosely interpreted.

One could argue that this loose interpretation made the markup language more accessible and easier to use, allowing more "uneducated" people to work with it.

My personal opinion is that this has little to do with the success of the Internet. Instead, it's the ability to communicate and share information that makes the Internet "successful."

+4  A: 

It hurt the Internet big time.

I recall listening to a podcast interview with someone who worked on the HTML 2.0 spec and IIRC there was a big debate at the time surrounding the strictness of parsers adhering to the standard.

The winners of the argument favored the "a well-implemented system should be liberal in what it accepts and strict in what it outputs" approach, which was popular at the time.

AFAICT many people now regard this approach as overly simplistic - it sounds good in principle, but actually rarely works in practice.

IMO, even if HTML had been super strict from the outset, it would still have been simple enough for most people to grasp. Uptake might have been marginally slower at first, but a huge amount of time and money (billions of dollars) would have been saved in the medium-to-long term.

Ben Aston
It also meant that it became quite difficult for new web browsers to enter the market.
Arafangion
Yep good point, and hence stifled innovation.
Ben Aston
I can't disagree more. One of the reasons the Internet revolution occurred was that my mom was able to write an HTML webpage. Sure, HTML could have been stricter, but then something easier to write would have benefited from the network effect instead.
Jon Ericson
+10  A: 

Lack of standard enforcement didn't hurt the adoption of the web in the slightest. If anything, it helped it. The web was originally designed for scientists (who generally have little patience for programming) to post research results. So liberal parsers allowed them to not care about the markup - good enough was good enough.

If it hadn't been successful with scientists, it never would have migrated to the rest of academia, nor from there to the wider world, and it would still today be an academic exercise.

But now that it's out in the wider world, should we clamp down? I see no incentive for anyone to do so. Browser makers want market share, and they don't get it by being pissy about which pages they display properly. Content sites want to reach people, and they don't do that by only appearing correctly in Opera. The developer lobby, such as it is, is not enough.

Besides, one of the reasons front-end developers can charge a lot of money (vs. visual designers) is because they know the ins and outs of the various browsers. If there's only one right way, then it can be done automatically, and there's no longer a need for those folks - well, not at programmer salaries, anyway.

Sarah Mei
As much as I hate your answer, it is unfortunately true. :(
Arafangion
+1: I'm reminded of the parallel between the Internet and television. In the 30s, scientists pioneered the invention of television, and they had a dream that TV would act as a medium to spread Shakespeare, science, and philosophy to the masses. What they got was A Shot at Love with Tila Tequila.
Juliet
+1  A: 

There is a principle that describes how HTML and web browsers are able to work and interoperate with any success at all:

Be liberal in what you accept, and conservative in what you output.

There needs to be some latitude between what is "correct" and "acceptable" HTML. Because HTML was designed to be "human +rw", we shouldn't be surprised that there are so many flavours of tag soup. Flexibility is HTML's strength wherever humans need to be involved.

However, that flexibility adds processing overhead which can be hard to justify when you need to create something for machine consumption. This is the reason for XHTML and XML: it takes away some of that flexibility in exchange for predictable input.
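The contrast between lenient and strict parsing is easy to demonstrate with Python's standard library. As a minimal sketch (the `TagCollector` class and the sample markup are illustrative, not from the original discussion): `html.parser` happily consumes tag soup with an unclosed `<b>`, while the XML parser rejects the very same input.

```python
from html.parser import HTMLParser
import xml.etree.ElementTree as ET

# Tag soup: the <b> element is never closed, as in much hand-written HTML.
soup = "<p>Hello <b>world</p>"

class TagCollector(HTMLParser):
    """Lenient parser: records start tags, never complains about errors."""
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

collector = TagCollector()
collector.feed(soup)          # accepted without protest
print(collector.tags)         # ['p', 'b']

# The strict XML parser refuses the same input with a ParseError.
try:
    ET.fromstring(soup)
except ET.ParseError as e:
    print("XML parser refused it:", e)
```

The lenient parser trades error reporting for robustness, which is exactly the "be liberal in what you accept" stance; the XML parser trades robustness for predictable, machine-friendly input.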

Andrew Vit
A: 

If HTML had been more strict, something easier would have generated the needed network effect for the internet to become mainstream.

Jon Ericson