Why does the length of a string show a different count from the number of characters it contains?


Given this text string, applying the length property:

var texto = "Hola soy una arroba \u0040 , un placer";
console.log(texto.length);

Returns: 33.

If we count the characters in the source ourselves, we get 38.

  

Why does it give a different count?

    
asked by Victor Alvarado 29.04.2017 at 15:51

1 answer


It gives the correct count; the string is equivalent to

"Hola soy una arroba @ , un placer"

When we write a literal string in JavaScript source code (and in C, Java, and many other languages), the \ character is not interpreted literally but as part of an "escape sequence". In this case, \u indicates that what follows is a Unicode character code, and \u0040 means "the character at position 0040 (in hex) in Unicode", which corresponds to @.
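For example (a quick check you can run in any browser console or Node REPL), the escape sequence resolves to a single character in the resulting string:

var arroba = "\u0040";
console.log(arroba);          // "@"
console.log(arroba.length);   // 1, even though the source literal has 6 characters
console.log(arroba === "@");  // true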

So that string, which in your source code occupies 38 characters, in "reality" (at execution time, and in compiled languages in the compiled binary) occupies 33.

A widely used escape sequence is the line break. When you write

var s = "Hola\nChau";
alert(s);
alert("largo: " + s.length);

Although in the source code we see (and type) the characters \ and n, in "reality" those characters are not part of the string; they represent the "line break" character (equivalent to \u000A). Therefore, the length of that string is 9, not 10.
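By contrast, if you actually wanted the two characters \ and n in the string, you would escape the backslash itself (a short sketch to contrast with the example above):

var literal = "Hola\\nChau";   // backslash escaped: here \n is NOT a line break
console.log(literal);          // Hola\nChau
console.log(literal.length);   // 10 characters, not 9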

More details here.

    
answered 29.04.2017 / 15:56