JS: Char to UTF-16 Encoding 📜
Here is a function that returns the hexadecimal string of the UTF-16 encoding of a Unicode character, given its codepoint.
```javascript
/*
xah_codepoint_to_utf16_hexStr(zcodepoint)
return a string of hexadecimal that's the UTF-16 encoding of the char of codepoint zcodepoint (integer).
The result string is not padded, but one space is inserted to separate the first 2 bytes and last 2 bytes.
Digits are in CAPITAL case.
URL http://xahlee.info/js/js_utf-16_encoding.html
Version: 2026-01-29
*/
const xah_codepoint_to_utf16_hexStr = (zcodepoint) =>
  ((xstr) =>
    Array.from(Array(xstr.length).keys(), (x) =>
      xstr.charCodeAt(x).toString(16).toUpperCase(),
    ).join(" "))(String.fromCodePoint(zcodepoint));

// s------------------------------
// test

// 🦋
console.assert(xah_codepoint_to_utf16_hexStr(129419) === "D83E DD8B");
/* 🦋 BUTTERFLY
ID 129419
UTF16 D83E DD8B
*/

// A, codepoint 65, UTF16 41
console.assert(xah_codepoint_to_utf16_hexStr(65) === "41");
```
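For codepoints beyond the Basic Multilingual Plane (above U+FFFF), UTF-16 encodes the character as a surrogate pair. As a cross-check of the function above, the surrogate-pair arithmetic can be sketched directly; this helper (the name codepoint_to_utf16_hexStr_manual is mine, not part of the original) computes the code units without going through a string:

```javascript
/*
codepoint_to_utf16_hexStr_manual(cp)
Hypothetical helper, a sketch of the UTF-16 surrogate-pair arithmetic.
BMP codepoints (cp ≤ 0xFFFF) encode as a single code unit.
Otherwise: subtract 0x10000, then the high 10 bits go into
a high surrogate (0xD800..0xDBFF) and the low 10 bits into
a low surrogate (0xDC00..0xDFFF).
Output format matches the function above: uppercase hex,
code units separated by a space, no padding.
*/
const codepoint_to_utf16_hexStr_manual = (cp) => {
  const units =
    cp <= 0xffff
      ? [cp]
      : [
          0xd800 + ((cp - 0x10000) >> 10), // high surrogate
          0xdc00 + ((cp - 0x10000) & 0x3ff), // low surrogate
        ];
  return units.map((u) => u.toString(16).toUpperCase()).join(" ");
};

// 🦋 BUTTERFLY, codepoint 129419
console.assert(codepoint_to_utf16_hexStr_manual(129419) === "D83E DD8B");

// A, codepoint 65, single BMP code unit
console.assert(codepoint_to_utf16_hexStr_manual(65) === "41");
```

This is the same mapping JavaScript performs internally, since JS strings are sequences of UTF-16 code units.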
JavaScript. String, Char, Encoding, Hexadecimal
- JS: String Index Code Unit
- JS: Convert Decimal, Hexadecimal
- JS: String.prototype.codePointAt (Char to Char ID) ❌
- JS: String.fromCodePoint (Char ID to Char)
- JS: Char to UTF-8 Encoding 📜
- JS: Char to UTF-16 Encoding 📜
- JS: String.prototype.charCodeAt (Char to Char ID) ❌
- JS: String.prototype.charAt (Extract Char at Index) ❌
- JS: String.prototype.at (Extract Char at Index)
- JS: String.fromCharCode (Char ID to Char) ❌