For all the wiki tools I have used, each time I've had to learn yet another markup language. Why doesn't wiki markup get standardized the way HTML, XSLT, SVG, and other web languages are?
Because the beauty of standards is that there are so many to choose from (Tanenbaum), so it would not make a difference anyway.
There's a more or less de facto standard, which is MediaWiki's. Other systems roll their own syntax, either to simplify parsing or to provide additional features that would not work with another syntax.
The main problem with this lack of standards is portability. If you want to move from, say, MediaWiki to MoinMoin or WikkaWiki, not only do you have to convert the database, you also have to convert its content. That's painful, but I think a stable standard will eventually evolve through natural selection. As I said, MediaWiki is more or less the standard, as it's very popular, and other solutions will eventually become obsolete. I mean... check WikiMatrix: there are so many engines that it triggers the paradox of choice.
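To see why converting content is non-trivial, compare the same fragment hand-written in three common dialects (sketches from memory, so double-check each engine's docs):

```
MediaWiki:  == Heading ==   '''bold'''  ''italic''  [[PageName|link text]]
Creole:     == Heading ==   **bold**    //italic//  [[PageName|link text]]
Markdown:   ## Heading      **bold**    *italic*    [link text](PageName)
```

Same document, three incompatible encodings of it; a converter has to handle every such construct for every pair of engines.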
I suspect it's because any particular piece of wiki content doesn't need to interoperate with multiple wiki systems, in the way that an HTML file needs to be processed by multiple browsers, so there hasn't been an impetus to form a standards committee etc. etc.
- No standard libraries, unlike, say, regex, which is baked into various platforms; that baking-in tends to stabilize de facto standards. There are standard implementations, such as MediaWiki, but not everyone wants to use MediaWiki, and there's little incentive to copy its rather robust markup language.
- Little need to exchange data between wiki platforms.
- Few common users interact with multiple wikis, so they learn the ones they are exposed to.
- Wiki markup is essentially a hack to overcome poor rich text browser control implementations, but rich text controls are getting better. (Templating via wiki markup is, of course, a whole other topic.)
- Wikis are often domain-specific, so the available formatting options, suitable characters for markup, etc., differ between implementations.
- There are existing competing "standards" such as BBCode, which further confuse the tag-vs-character-markup decision (see the comparison after this list).
- If an entity such as the W3C came up with a standard, it would take 3 years to develop, an extra associate's degree to use, and no one would rewrite their wikis to support it.
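To make the tag-vs-character split concrete, here is BBCode's HTML-like tag style next to MediaWiki-style character markup (hand-written samples, not tied to any particular parser version):

```
BBCode (tags):       [b]bold[/b]  [i]italic[/i]  [url=http://example.com]link[/url]
Wiki (characters):   '''bold'''   ''italic''     [http://example.com link]
```

Tag markup is easier to parse and nest; character markup is faster to type. That is exactly the trade-off each competing "standard" keeps re-deciding.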
I'd say the people who keep coming up with new variants think the existing ones blow... and they may be correct.
Some well-defined systems, like TeX, could be drawn from, but nobody seems to think that's important for wiki markup - though markup ought to be one of the best ways to separate content from presentation.
Personally, I'm not crazy about Markdown, for instance, and for probably over two years I've been pondering whether to use it anyway (lately because of Showdown) or to go through the trouble of creating a client-side version of one of my favorite markup variants instead. SO has shown that Markdown might be neat enough, at least after their contributions to wmd made it easier to use and configure. It would be neat for a wiki markup editor to offer some IntelliSense-style typing/editing for linking to existing pages and resources ^^
Wikis are new. It takes time to shake down a lot of ideas and learn what works, by trial and error - the only way that really works when people are involved.
There is one - it's called Creole. Most wikis support it, either as an extension or as a patch.
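For reference, the core of Creole 1.0 looks like this (from memory; the full spec is at wikicreole.org):

```
= Heading =
**bold** and //italic//
* bullet item
# numbered item
[[PageName|link text]]
{{Image.png|alt text}}
```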
Another reason for the differences is that wiki markup was designed to be easier to use than HTML, but different wikis prefer different spots on the power-vs-ease curve. It's just like there being so many Linux distributions because some people want more cutting-edge packages while others want stability. There's no right answer.