I'm using eXist for a project at work, and I've run into a problem that I can't seem to figure out a solution for.

I have an XQuery script that updates an existing document in the database. Part of the data that needs to be updated contains HTML, specifically <p> and </p> tags. I cannot get eXist/XQuery to stop escaping the HTML; it needs to be preserved in its original form. Here's a very simplified version of what I'm doing:

<pre>
declare variable $raw-content := request:get-parameter('content', '');
declare variable $content := local:clean($raw-content);
</pre>

local:clean is the following function:

<pre>
declare function local:clean($text) {
    (: parse the raw string back into XML nodes :)
    let $text := util:parse($text)
    return $text
};
</pre>

Later on in the code, I update a specific XML element:

<pre>
{update replace $n/sports-content/article/nitf/body/body.content with <body.content>{$content}</body.content>}
</pre>

Now, this works perfectly if I only pass in data wrapped in one set of tags (i.e. <p>foo</p>). If I pass in <p>foo</p><p>bar</p>, I get a null value placed in $text.

I've been banging my head against the desk for a day and a half now trying to figure out why this doesn't work. Any help in solving this problem would be greatly appreciated.

A: 

It seems like a problem with util:parse specifically, not with the rest of your code (and I don't see an alternative way to do what you want here). Have you tried running util:parse on its own against the sample input from the question (i.e. <p>foo</p><p>bar</p>) to verify that it really does fail on a multi-node fragment?
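For example, a minimal test (a sketch only; util:parse is normally available out of the box in eXist) would be:

<pre>
(: a single root element: should parse fine :)
let $single := util:parse('<p>foo</p>')
(: two sibling roots: not a well-formed document, so this may fail :)
let $double := util:parse('<p>foo</p><p>bar</p>')
return ($single, $double)
</pre>

If the second call errors out or yields an empty sequence, the problem is isolated to util:parse.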

Pavel Minaev
A: 

A guess: <p>foo</p><p>bar</p> is not well-formed (there is no single root element), while <p>foo</p> is well-formed (one root element).

So try util:parse-html instead of util:parse.
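Both fixes amount to handing the parser something well-formed. A sketch of each (the <wrapper> element name is arbitrary; depending on your eXist version, util:parse may return a document node, hence the extra /wrapper step, and util:parse-html may nest its result inside html/body elements):

<pre>
declare function local:clean($text) {
    (: Option 1: wrap the fragment in a temporary root, then unwrap it :)
    util:parse(concat('<wrapper>', $text, '</wrapper>'))/wrapper/node()

    (: Option 2: parse it as HTML instead:
    util:parse-html($text)
    :)
};
</pre>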