encoding a var versus UTF8
Posted: Mon Mar 24, 2014 3:02 pm
Hi to all, me again
I don't understand how LC encodes data...
What I do is load an HTML page, keep it in a var called <webData>, and read the source code. No problem.
The problem occurs like this:
I isolate a special part of <webData> in a var called <leMot>.
<leMot> displayed in a field as UTF-8 gives this result: <sa.ʁa.bɑ̃d> (sarabande). OK.
But if I ask LC to show me <leMot> in the message box during debugging, I get: <sa.Êa.bɑ̃d>.
Maybe you have already guessed what this means...
If I read the char at (the length of leMot - 1), I get <ƒ> where I expect <ɑ̃>, and the number of chars is also different!
Any idea how to solve this?
Thanks in advance.
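For what it's worth, the symptoms look like classic mojibake: the variable holds raw UTF-8 bytes, but the message box and `char`/`length` treat each byte as one character in the platform's 8-bit encoding (Latin-1 / Windows-1252), which is how older LiveCode engines behaved. This is a sketch of that diagnosis in Python, not LiveCode, just to show where the stray <Ê> and <ƒ> come from:

```python
# Hypothetical reconstruction of the symptom described above (an
# assumption about pre-Unicode LiveCode behavior, not the LC engine itself).
leMot = "sa.\u0281a.b\u0251\u0303d"   # "sa.ʁa.bɑ̃d", IPA for "sarabande"

raw = leMot.encode("utf-8")           # what the variable actually holds: bytes

# 10 real characters, but 13 UTF-8 bytes; counting bytes as chars
# explains why "the number of chars is also different".
print(len(leMot), len(raw))           # 10 13

# Byte 0xCA, the first byte of the ʁ (U+0281) sequence, rendered as
# Latin-1 is "Ê", hence <sa.Êa...> in the message box.
print(raw.decode("latin-1", errors="replace"))

# "char (length - 1)" in byte terms lands on 0x83, the second byte of the
# combining tilde (U+0303); Windows-1252 maps 0x83 to "ƒ".
print(raw[-2:-1].decode("cp1252"))    # ƒ
```

If that diagnosis is right, the usual LiveCode remedy is to convert the bytes to text once, right after loading: `textDecode(webData, "UTF-8")` in LiveCode 7+, or `uniDecode` with the "UTF8" charset on older engines (worth double-checking against the LC dictionary for your version).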
