Clarify escaping of ASCII control characters

We use "c > 0" but we actually mean "c != 0". The former suggests that the
other code path handles negative c. Yet if c were negative, our code would
print a single escaped byte (\xXY), which is wrong because a negative value
occupies "sizeof wchar_t" bytes, which is at least 2.

I think on platforms with 16-bit wchar_t it's possible that we actually
get a negative value, but I haven't checked.
Johannes Altmanninger 2022-07-25 23:54:00 +02:00
parent f1b4366222
commit 2e8ecfdb44


@@ -988,10 +988,10 @@ static void escape_string_script(const wchar_t *orig_in, size_t in_len, wcstring
             }
             default: {
-                if (*in < 32) {
+                if (*in >= 0 && *in < 32) {
                     need_escape = need_complex_escape = true;
-                    if (*in < 27 && *in > 0) {
+                    if (*in < 27 && *in != 0) {
                         out += L'\\';
                         out += L'c';
                         out += L'a' + *in - 1;