Previously, ELinks silently discarded the Alt modifier from Alt-ö
keystrokes when UTF-8 I/O was enabled. Now, separate actions can be
bound to ö and Alt-ö.
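The effect is roughly the following; the struct and names are only
illustrative, not the real ELinks identifiers:

    #include <stdint.h>

    /* Illustrative sketch, not the actual ELinks code. */
    struct decoded_key {
            uint32_t ucs;   /* decoded UCS-4 value, e.g. U+00F6 for ö */
            int alt;        /* nonzero for Alt-ö */
    };

    static void
    set_decoded_key(struct decoded_key *key, uint32_t ucs, int esc_prefix_seen)
    {
            key->ucs = ucs;
            /* Previously the Alt flag (an ESC prefix from the terminal) was
             * dropped for non-ASCII values under UTF-8 I/O, so ö and Alt-ö
             * looked identical to keybindings. */
            key->alt = esc_prefix_seen;
    }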
However, if CONFIG_UTF_8 is defined, then actions cannot be bound to
non-ASCII characters, regardless of modifiers. This is because the
code that handles names of keystrokes assumes a character can only be
a single byte. This commit does not change that.
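The assumption looks roughly like this (a simplified sketch, not the
actual keystroke-name code):

    /* Simplified sketch; illustrative only. */
    static int
    parse_key_name(const unsigned char *name)
    {
            /* A one-character name names the key itself.  The result is a
             * single byte, so a multi-byte UTF-8 name such as "ö"
             * (0xC3 0xB6) cannot be represented here. */
            if (name[0] && !name[1])
                    return name[0];

            return -1;      /* fall back to the named-key table (F1, Enter, ...) */
    }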
Form fields and BFU text-input widgets now convert the key from UCS-4
to UTF-8 themselves. If not all of the UTF-8 bytes fit within the
length limit, nothing is inserted. Thus it is no longer possible to
get invalid UTF-8 by hitting the length limit.
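The insertion is all-or-nothing, roughly as sketched below;
utf8_encode() and insert_ucs() are stand-ins, not the helpers the
widgets really call:

    #include <stdint.h>
    #include <string.h>

    /* Illustrative sketch, not the real ELinks helpers. */
    static size_t
    utf8_encode(uint32_t u, unsigned char out[4])
    {
            if (u < 0x80) {
                    out[0] = u;
                    return 1;
            }
            if (u < 0x800) {
                    out[0] = 0xC0 | (u >> 6);
                    out[1] = 0x80 | (u & 0x3F);
                    return 2;
            }
            if (u < 0x10000) {
                    out[0] = 0xE0 | (u >> 12);
                    out[1] = 0x80 | ((u >> 6) & 0x3F);
                    out[2] = 0x80 | (u & 0x3F);
                    return 3;
            }
            out[0] = 0xF0 | (u >> 18);
            out[1] = 0x80 | ((u >> 12) & 0x3F);
            out[2] = 0x80 | ((u >> 6) & 0x3F);
            out[3] = 0x80 | (u & 0x3F);
            return 4;
    }

    static int
    insert_ucs(unsigned char *buf, size_t used, size_t maxlen, uint32_t u)
    {
            unsigned char enc[4];
            size_t len = utf8_encode(u, enc);

            /* Either the whole sequence fits or nothing is inserted, so the
             * field can never end in a truncated, invalid UTF-8 sequence. */
            if (used + len > maxlen)
                    return 0;

            memcpy(buf + used, enc, len);
            return 1;
    }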
It is unclear to me which charset is supposed to be used for strings
in internal buffers. I made BFU insert UTF-8 whenever CONFIG_UTF_8 is
defined, but form fields use the charset of the terminal; that may
have to be changed.
As a side effect, this change should solve bug 782, because
term_send_ucs no longer encodes in UTF-8 if CONFIG_UTF_8 is defined.
I think the UTF-8 and codepage encoding calls I added are safe, too.
A similar bug may still surface somewhere else, but 782 could be
closed for now.
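In other words, term_send_ucs now behaves roughly like this sketch;
the helper names and the signature are made up for illustration and do
not match the real function:

    #include <stdint.h>

    struct terminal;                                                 /* opaque here */
    void send_ucs_event(struct terminal *term, uint32_t u);          /* assumed */
    void send_bytes(struct terminal *term, const unsigned char *s);  /* assumed */
    const unsigned char *encode_for_terminal(struct terminal *term, uint32_t u); /* assumed */

    void
    term_send_ucs_sketch(struct terminal *term, uint32_t u)
    {
    #ifdef CONFIG_UTF_8
            /* Pass the UCS-4 value on unchanged; the consumer converts it
             * exactly once, so term_send_ucs itself no longer encodes it
             * in UTF-8. */
            send_ucs_event(term, u);
    #else
            /* Without CONFIG_UTF_8, encode here as before (terminal codepage,
             * or UTF-8 when UTF-8 I/O is enabled). */
            send_bytes(term, encode_for_terminal(term, u));
    #endif
    }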
This change also lays the foundation for binding actions to non-ASCII
keys, but the keystroke name parser doesn't yet support that.
The CONFIG_UTF_8 mode does not currently support non-ASCII characters
in hot keys, either.
For instance, if Ctrl-F1 were pressed and src/terminal/kbd.c supported
that combination, toupper(KBD_F1) would be called; because KBD_F1 is
not a value toupper() accepts (EOF or a value representable as
unsigned char), the behaviour would be undefined. src/terminal/kbd.c
does not support such combinations yet, but it is safest to fix the
bug already.
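The guard amounts to something like the sketch below; the real code in
src/terminal/kbd.c is shaped differently:

    #include <ctype.h>

    /* Sketch only: upcase plain ASCII keys, and leave special key codes
     * such as KBD_F1 alone, since they are outside the range toupper()
     * accepts. */
    static int
    upcase_if_ascii(int key)
    {
            if (key >= 0 && key <= 0x7F)
                    return toupper(key);

            return key;
    }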