I am trying to define an algebraic data type:
data MyType t = MyType t
And make it an instance of Show:
instance Show (MyType t) where
    show (MyType x) = "MyType: " ++ (show x)
GHC complains because it cannot deduce that the type 't' in 'Show (MyType t)' is itself an instance of Show, which is required for the call to (show x).
I have no idea where or how to declare 't' to be an instance of Show.
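For context, here is a minimal sketch of the usual fix: add a constraint to the instance head, so the instance only exists when 't' itself has a Show instance.

```haskell
data MyType t = MyType t

-- The "Show t =>" constraint tells GHC this instance
-- is available only when t is itself showable.
instance Show t => Show (MyType t) where
  show (MyType x) = "MyType: " ++ show x

main :: IO ()
main = putStrLn (show (MyType (42 :: Int)))  -- prints "MyType: 42"
```

Alternatively, a plain 'deriving Show' on the data declaration would generate a similarly constrained instance automatically, though with the default "MyType 42" rendering rather than the custom string above.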