Issue 83: String doesn't match haXe interface in UTF8 mode
What steps will reproduce the problem?
Try the following code:
var str:String = "ABéé";
str = str.substr( 0, 3 ) + "A" + str.substr( 3 );
What is the expected output? What do you see instead?
The expected result is "ABéAé". Compiling for the Flash target, for example, the result is correct. But in hxcpp, we get "ABÃA©Ã©". 'é' is an extended character represented in UTF-8 by the two bytes 0xC3 0xA9, so we should get (in characters) "ABéAé".
What version of the product are you using? On what operating system?
v2.6.0, Windows Xp/Vista/7, UTF8 compile mode
Please provide any additional information below.
After some investigation, I found that the 'length' member actually represents the character count and not the byte count as expected, but a number of methods (such as substr(), and also others such as charCodeAt(), operator+(), and operator+=() at least) work on byte indexes instead of character indexes, as they should (or am I missing something?).
I tried to fix it, but at some point we would sometimes also need byte-level size and access to the string on the application side, so what do you think would be the ideal way to manage it? Match the Flash behaviour (for example) and add specific methods for byte access? Or, conversely, add specific methods for character access?
Thanks in advance
Seb
Comment #1 by gameh...@gmail.com (Project Member), Aug 21, 2011
Status: Fixed