views:

35

answers:

1

Hi, I want to use Unicode in my code. My Unicode value is 0100, and I am building my string by prefixing the value with \u. When I use string myVal = "\u0100"; it works, but when I use the form below it does not work; the value ends up looking like "\\u0100". How do I resolve this? I want to use the form below because the Unicode value may vary: string uStr = @"\u" + "0100";

+3  A: 

There are a couple of problems here. One is that @"\u" is actually the literal two-character string \u (which can also be written as "\\u").

The other issue is that you cannot construct a string in the way you describe, because "\u" is not a valid string literal by itself: the compiler expects hex digits to follow \u (as in "\u0100") to determine what the encoded character is supposed to be.

You need to keep in mind that strings in .NET are immutable, which means that when you look at what is going on behind the scenes with your concatenated example (@"\u" + "0100"), this is what is actually happening:

  1. Create the string "\u"
  2. Create the string "0100"
  3. Create the string "\u0100"

So you have three strings in memory. In order for that to happen all of the strings must be valid.
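A minimal sketch (a console program; the variable names are mine) illustrating the point above: the verbatim string @"\u" keeps the backslash as an ordinary character, so concatenating "0100" onto it produces six literal characters, not the single character U+0100.

```csharp
using System;

class EscapeDemo
{
    static void Main()
    {
        // Escape processing happens at compile time, inside one literal:
        string literal = "\u0100";            // one character: U+0100 (Ā)

        // A verbatim string disables escape processing, so this is
        // backslash + 'u', and concatenation is just character-joining:
        string concatenated = @"\u" + "0100"; // six characters: \ u 0 1 0 0

        Console.WriteLine(literal.Length);      // 1
        Console.WriteLine(concatenated.Length); // 6
    }
}
```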

Edit: OK, based on your comment I think I understand what you're trying to do. The first option that comes to mind for handling those values is to parse them as integers and then convert them to characters. Something like:

var unicodeValue = (char)int.Parse("0100", System.Globalization.NumberStyles.AllowHexSpecifier);

This will give you the single Unicode character. From there you can append it to a string, or convert it to a string using ToString().

There may be a more straightforward way, but I can't think of one off the top of my head.
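Putting the pieces together, here is a small sketch of the suggested approach (the method and variable names are mine): the stored hex text such as "0100" is parsed as a hex integer and cast to char, which yields the same character a "\u0100" literal would.

```csharp
using System;
using System.Globalization;

class UnicodeFromHex
{
    // Convert a 4-digit hex code point stored as text (e.g. "0100")
    // into a one-character string.
    static string FromHex(string hex)
    {
        char c = (char)int.Parse(hex, NumberStyles.AllowHexSpecifier);
        return c.ToString();
    }

    static void Main()
    {
        // Same character as the compile-time literal "\u0100":
        Console.WriteLine(FromHex("0100") == "\u0100"); // True
        Console.WriteLine(FromHex("0200") == "\u0200"); // True
    }
}
```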

ckramer
@ckramer thanks, I just want to assign the value \u0100 to my string variable, that's it. But my Unicode values are stored as strings like 0100, 0200; I just want to turn them into Unicode by adding \u and assign the result to my string variable as \u0100. How can I do that?
deep
Great, thanks, it's working. But can you explain this: without specifying \u, how does it take the value 0100 as Unicode?
deep
@deep see http://msdn.microsoft.com/en-us/library/x9h8tsay(v=VS.71).aspx
Robert Paulson
Thank you @Robert. @deep it's all numbers under the covers... it just so happens that specifying a Unicode character is done by giving the hexadecimal value of that character's code point.
ckramer