In ASP.NET/C#:
<%@ Page Language="C#" %>
<%@ Import Namespace="System.Web.Script.Serialization" %>
<%
JavaScriptSerializer jss = new JavaScriptSerializer();
string[] fruits = new string[3] {"apple","banana","crunchberries"};
string output = jss.Serialize(fruits);
%>
<html>
fruits=<%=output%>
</html>
returns fruits=["apple","banana","crunchberries"]
In ASP.NET/VB.NET:
<%@ Page Language="VB" %>
<%@ Import Namespace="System.Web.Script.Serialization" %>
<%
Dim jss As New JavaScriptSerializer()
Dim fruits = New String(2) {"apple","banana","crunchberries"}
Dim output As String = jss.Serialize(fruits)
%>
<html>
fruits=<%=output%>
</html>
returns fruits=["apple","banana","crunchberries"]
But in ASP.NET/JScript.NET:
<%@ Page Language="JScript" %>
<%@ Import Namespace="System.Web.Script.Serialization" %>
<%
var jss:JavaScriptSerializer = new JavaScriptSerializer();
var fruits = ["apple","banana","crunchberries"];
var output = jss.Serialize(fruits);
%>
<html>
fruits=<%=output%>
</html>
returns fruits=["0","1","2"]
This seems completely broken. It can be fixed by explicitly declaring the data type of fruits, which makes this into a "native array":
<%@ Page Language="JScript" %>
<%@ Import Namespace="System.Web.Script.Serialization" %>
<%
var jss:JavaScriptSerializer = new JavaScriptSerializer();
var fruits:String[] = ["apple","banana","crunchberries"];
var output = jss.Serialize(fruits);
%>
<html>
fruits=<%=output%>
</html>
returns fruits=["apple","banana","crunchberries"]
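One plausible explanation (unverified): a JScript.NET Array is a dynamic object whose indices are really string property names, so a serializer that enumerates it as a generic collection may walk the keys instead of the values, while a native array serializes element by element. A rough Python analogy of that hypothesis:

```python
import json

# Hypothetical model: a JScript array as a property bag whose
# indices are string keys on an object.
jscript_style = {"0": "apple", "1": "banana", "2": "crunchberries"}

# Enumerating such an object as a plain collection yields its keys,
# matching the shape of the broken output above.
print(json.dumps(list(jscript_style)))

# A native (list-like) array serializes element by element as expected.
print(json.dumps(["apple", "banana", "crunchberries"]))
```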
I don't see why the type annotation should be mandatory to get the correct serialization. (var fruits = ["apple","banana","crunchberries",5] might be bad code, but it's legal in both JavaScript and JScript.) JScript arrays are supposed to be slower than native arrays, but they're still supposed to work, right?