I'm writing a Chrome extension that works with a website that uses ISO-8859-1. For context, what my extension does is make posting in the site's forums quicker by adding a more convenient post form. The value of the textarea where the message is written is then sent through an AJAX call (using jQuery).
If the message contains characters like á, those characters appear as Ã¡ in the posted message. Forcing the browser to display the page as UTF-8 instead of ISO-8859-1 makes the á appear correctly.
It is my understanding that JavaScript uses UTF-8 for its strings, so my theory is that if I transcode the string to ISO-8859-1 before sending it, that should solve my problem. However, there seems to be no direct way to do this transcoding in JavaScript, and I can't touch the server-side code. Any advice?
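To make the question more concrete, this is roughly the kind of transcoding I have in mind (just a sketch; encodeLatin1 is a name I made up, and it assumes every character in the message actually exists in ISO-8859-1):

function encodeLatin1(str) {
    var out = "";
    for (var i = 0; i < str.length; i++) {
        var ch = str.charAt(i);
        var code = str.charCodeAt(i);
        if (/[A-Za-z0-9]/.test(ch)) {
            out += ch; // plain ASCII letters and digits pass through
        } else if (code <= 255) {
            // percent-encode the character as its single ISO-8859-1 byte
            out += "%" + ("0" + code.toString(16).toUpperCase()).slice(-2);
        } else {
            out += "%3F"; // no ISO-8859-1 equivalent; substitute "?" for now
        }
    }
    return out;
}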
I've tried setting the created form to use ISO-8859-1 like this:
var form = document.createElement("form");
form.enctype = "application/x-www-form-urlencoded; charset=ISO-8859-1";
and also
var form = document.createElement("form");
form.encoding = "ISO-8859-1";
but neither seems to work.
EDIT:
The problem actually lay in how jQuery was URL-encoding the message (or something along the way). I fixed it by telling jQuery not to process the data and doing the encoding myself, as shown in the following snippet:
function cfaqs_post_message(msg) {
    var url = cfaqs_build_post_url();
    // escape() encodes ISO-8859-1 characters as %XX (e.g. á becomes %E1),
    // but it leaves "+" untouched, so encode that by hand.
    msg = escape(msg).replace(/\+/g, "%2B");
    $.ajax({
        type: "POST",
        url: url,
        // The data is already encoded above; keep jQuery from re-encoding it.
        processData: false,
        data: "message=" + msg + "&post=Preview Message",
        success: function(html) {
            // ...
        },
        dataType: "html",
        contentType: "application/x-www-form-urlencoded"
    });
}
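For what it's worth, my understanding of why this works (a quick check in the console, nothing authoritative):

console.log(encodeURIComponent("á")); // "%C3%A1" - UTF-8 bytes, which the ISO-8859-1 page renders as Ã¡
console.log(escape("á"));             // "%E1"    - the single ISO-8859-1 byte, which renders correctly

jQuery's default processing runs the data through encodeURIComponent, so the message was going out as UTF-8 percent-escapes, and those bytes are what showed up as Ã¡ on the ISO-8859-1 page. escape() keeps characters in the 0-255 range as their single ISO-8859-1 byte. The catch is that escape() turns characters above 255 into %uXXXX sequences, which the server won't understand, so this only holds as long as the message sticks to ISO-8859-1 characters.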