I'm trying to use the Spark view engine with ASP.NET MVC running on a Linux machine, but there seem to be known problems resolving view paths, e.g.:

System.InvalidOperationException: The view 'Index' or its master could not be found. The following locations were searched:
~/Views/Home/Index.aspx
~/Views/Home/Index.ascx
~/Views/Shared/Index.aspx
~/Views/Shared/Index.ascx
Home\Index.spark
Shared\Index.spark

The problem seems to be the fact that it's looking for Home\Index.spark instead of ~/Views/Home/Index.spark.

I've googled around for a solution and found some old comments saying it doesn't work, but I'm wondering if anyone has figured this out. I'm using MVC 1.0.

+1  A: 

It sounds like Spark has been written in a non-portable way, hardcoding '\' as the path separator instead of using System.IO.Path.DirectorySeparatorChar or System.IO.Path.Combine(p1, p2).

It appears to have been a known issue for some time: http://sparkviewengine.codeplex.com/WorkItem/View.aspx?WorkItemId=3516

There are two options:

  • Fix Spark
  • Use MONO_IOMAP (though this will slow down the app)
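To illustrate the portability point above, here is a minimal standalone sketch (not Spark's actual code) contrasting a hardcoded '\' with the portable Path.Combine approach:

```csharp
using System;
using System.IO;

class PathPortabilityDemo
{
    static void Main()
    {
        // Portable: Path.Combine inserts the platform's own separator,
        // so this yields "Home\Index.spark" on Windows but
        // "Home/Index.spark" on Linux and Mac.
        string portable = Path.Combine("Home", "Index.spark");

        // Non-portable: a hardcoded backslash. On Unix-like systems '\'
        // is an ordinary filename character, not a separator, so this
        // string never matches a real file path there.
        string hardcoded = "Home" + "\\" + "Index.spark";

        Console.WriteLine("Separator: " + Path.DirectorySeparatorChar);
        Console.WriteLine("Portable:  " + portable);
        Console.WriteLine("Hardcoded: " + hardcoded);
    }
}
```

On Windows the two strings happen to coincide, which is why this class of bug only surfaces once the code runs under Mono on Linux or OS X.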
mhutch
There are more problems: MONO_IOMAP doesn't get Spark running (there is already a fix for the paths in the code). Because it seems impossible to get debugging to work with Mono, it would be very time-consuming to fix this issue. Besides the paths, there also seem to be issues with the chunks being generated during parsing.
olle
What do you mean? Debugging works OOTB with Mono 2.6 and MonoDevelop 2.2 on Linux, Windows and Mac.
mhutch
I agree with olle. Debugging Spark under MonoDevelop 2.2.1/Ubuntu 9.10 is a painful experience: random hangs, stepping that often goes backwards, and test cases that don't seem to be runnable under the debugger. I set out with good intentions to fix some of the Spark issues under Mono but gave up after two days of trying to get debugging working. (And by the way, a large number of the Spark unit tests fail.)
cantabilesoftware
Ubuntu ships an old version of Mono that does not include the new debugger. If you want always up-to-date packages, I recommend openSUSE. Or you could build Mono and MonoDevelop from source... but if you do, *please* use a parallel environment (http://is.gd/9uIl5).
mhutch
It goes deeper than just a path problem. For some reason, on OS X, Spark truncates some of the tokens it finds in .spark files, so System.Collections.Generic becomes just Generic. I've been trying to come up with a fix but haven't quite gotten there yet. I'm using the latest MonoDevelop, running the tests with its NUnit test runner.
blue_fenix