Has anyone successfully got Spark working in a .NET 4.0 console application for compiling templates to HTML? Unfortunately I am getting the following error:

Unhandled Exception: Spark.Compiler.CompilerException: Dynamic view compilation failed.
(0,0): error CS1703: An assembly with the same identity 'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' has already been imported. Try removing one of the duplicate references.

When I target .NET 3.5 everything works fine, but I specifically want to target 4.0. Has anyone solved this problem? Some old threads on the Spark mailing list suggest I may just have to edit a line in the source and recompile, but I hope that is a last resort.

EDIT:

    static void Main(string[] args)
    {
        // All three arguments are used below, so require all three up front.
        if (args.Length >= 3)
        {
            var templatePath = Path.Combine(Environment.CurrentDirectory, args[0]);
            var templateName = Path.GetFileName(templatePath);
            var templateDirPath = Path.GetDirectoryName(templatePath);
            var viewFolder = new FileSystemViewFolder(templateDirPath);

            // Resolve views from the template's folder, plus its "Shared" subfolder.
            var sparkEngine = new SparkViewEngine
            {
                DefaultPageBaseType = typeof(SparkView).FullName,
                ViewFolder = viewFolder.Append(new SubViewFolder(viewFolder, "Shared")),
            };

            var descriptor = new SparkViewDescriptor().AddTemplate(templateName);

            // Direct cast: fail loudly here rather than with a null reference later.
            var view = (SparkView)sparkEngine.CreateInstance(descriptor);

            view.Model = args[1];

            using (var writer = new StreamWriter(new FileStream(args[2], FileMode.Create), Encoding.UTF8))
            {
                view.RenderView(writer);
            }
        }
        else
        {
            Console.WriteLine(">>> error - missing arguments:\n\tSparkCompiler.exe [templatepath] [modelstring] [outputname]");
        }
    }
+1  A: 

I didn't consider it a last resort. I changed Line #60 of src\Spark\Compiler\BatchCompiler.cs to

var providerOptions = new Dictionary&lt;string, string&gt; { { "CompilerVersion", "v4.0" } };

It was originally

var providerOptions = new Dictionary&lt;string, string&gt; { { "CompilerVersion", "v3.5" } };

After recompiling and referencing the new Spark.dll everything worked like a charm. Er, um, I was able to proceed to the next exception.
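For context, here is a minimal sketch of what that provider-options dictionary controls. This is not Spark's actual BatchCompiler code, just an illustration of the underlying CodeDOM mechanism: `CSharpCodeProvider` accepts a provider-options dictionary, and the `CompilerVersion` key selects which C# compiler it invokes, which is why bumping it to `v4.0` stops the old compiler from pulling in a second, conflicting mscorlib.

```csharp
using System.Collections.Generic;
using Microsoft.CSharp;

// "CompilerVersion" selects the C# compiler used for dynamic compilation;
// "v3.5" here would target the older toolchain and trigger the CS1703
// duplicate-mscorlib error when the host process runs on .NET 4.0.
var providerOptions = new Dictionary<string, string>
{
    { "CompilerVersion", "v4.0" }
};

var codeProvider = new CSharpCodeProvider(providerOptions);
```
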

devSolo
And then, when you get errors about a missing HTML method on your SparkView, you can add an HTML() method to your SparkView.cs: public object HTML(object value) { return value.ToString(); }
devSolo
And remember, you are not *forced* to pass an XDocument to your view. You can change it to any object as long as you cast it appropriately in the view.
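Putting those two comments together, a SparkView base class compatible with the question's code might look like the sketch below. This class is not shown in the original post, so its exact shape is an assumption; it only needs to expose the `Model` property the Main method sets, plus the `HTML()` helper devSolo describes, with rendering inherited from Spark's `AbstractSparkView`.

```csharp
using Spark;

// Hypothetical base class matching DefaultPageBaseType in the question's code.
// AbstractSparkView supplies RenderView(TextWriter); we only add what the
// templates and the host program reference.
public abstract class SparkView : AbstractSparkView
{
    // Loosely typed on purpose: pass any object and cast it in the view.
    public object Model { get; set; }

    // Added per devSolo's comment, for templates that call HTML(...).
    public object HTML(object value)
    {
        return value.ToString();
    }
}
```
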
devSolo
Ever get the unit tests working for Spark after those changes? I cannot, though it seems it may be only an assembly loading issue, where NUnit is bringing in v2 core stuff and Spark then fails bringing in v4 core stuff when it tries to render.
qstarin
+1  A: 

A fix for this has now been added to the main Spark master branch. You can either download the source and compile the latest binaries yourself, or use NuPack/NuGet to add a reference to your solution in VS2010, as the binaries there will be kept up to date from now on.

Hope that helps...

RobertTheGrey