This adds support for the `--function` option of abbreviations, so that the
expansion of an abbreviation may be generated dynamically via a fish
function.
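A hedged sketch of what such a dynamic abbreviation could look like (the abbreviation and function names here are illustrative, not taken from this change):
```fish
# The function's output becomes the expansion when the abbreviation triggers.
function __expand_today
    date +%F
end
abbr --add today --function __expand_today
```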
Prior to this change, abbreviations were stored as fish variables, often
universal. However, we intend to add additional features to abbreviations
which would be very awkward to shoehorn into variables.
Re-implement abbreviations using a builtin, managing them internally.
Existing abbreviations stored in universal variables are still imported,
for compatibility. However new abbreviations will need to be added to a
function. A follow-up commit will add it.
Now that abbr is a built-in, remove the abbr function; but leave the
abbr.fish file so that stale files from past installs do not override
the abbr builtin.
We have had multiple crashes for relative CDPATH entries. Commit 5e274066e
(Always return absolute path in path_get_cdpath, 2019-10-17) tried to fix
all of them but it failed to do justice to its title. Let's fix this to
actually return absolute paths, always. Take care to normalize the path
because it is used for autosuggestions. The normalization is mostly relevant
for CDPATH=. (the default) but it doesn't hurt others.
Closes #9407
Inside a comment we offer plain file completions (or command completions if
the comment is in command position). However these completions are broken
because they don't consider any of the surrounding characters. For example
with a command line
echo # comment
          ^ cursor
we suggest file completions and insert them as
echo # comsomefile ment
Providing completions inside comments does not seem useful and it can be
misleading. Let's remove the completions; this should communicate better that
we are in a free-form comment that's not subject to fish syntax.
Closes #9320
This fixes #9321
IEEE Std 1003.1-2017 Issue 6 added an optional [EINVAL] error condition
for the case where no conversion could be performed.
Switch back to wcstoimax/wcstoumax: do not work around the old FreeBSD
8 issue.
Add a test for printf '%d %d' 1 2 3
Like the pexpect-based pager completions test `complete-group-order.py`, but for
the `complete` builtin. Verifies the same sort/dedup rules that apply to the
pager are also applied to the output of `complete` and asserts the sort behavior
for multiple `complete -k` calls for the same command and with the same (or with
both passing) preconditions.
This addresses a long-standing TODO where `complete -C` output isn't
deduplicated.
With this patch, the same deduplication and sort procedure that is run on actual
pager completions is also executed for `complete -C` completions (with a `-C`
payload specified).
This makes it possible to use `complete -C` to test what completions will
actually be generated by the completions pager instead of it displaying
something completely divorced from reality, improving the productivity of fish
completions developers.
Note that completions that wouldn't be shown in the pager are also omitted from
the results, e.g. `test/buildroot/` and `test/fish_expand_test/` are omitted
from the check matches in `checks/complete_directories.fish` because even if
they were generated, the pager wouldn't have shown them. This again makes
reasoning about and debugging completions much easier and more sane.
Fixes omitted newline char shown after complete -n'(foo)'
Also axes the 'contains syntax errors' line before the error.
Update tests
before
> complete -n'(foo)'
complete: Condition '(foo)' contained a syntax error
complete: Command substitutions not allowed⏎
after
> complete -n'(foo)'
complete: -n '(foo)': command substitutions not allowed here
Up to now, in normal locales \x was essentially the same as \X, except
that it errored if given a value > 0x7f.
That's kind of annoying and useless.
A subtle change is that `\xHH` now represents the character (if any)
encoded by the byte value "HH", so even for values <= 0x7f we would
diverge if that byte value is not the same as the ASCII value.
I do not believe anyone has ever run fish on a system where that
distinction matters. It isn't a thing for UTF-8, it isn't a thing for
ASCII, it isn't a thing for UTF-16, it isn't a thing for any extended
ASCII scheme - ISO8859-X, it isn't a thing for SHIFT-JIS.
I am reasonably certain we are making that same assumption in other
places.
Fixes #1352
We forgot to decode (i.e. turn into nice wchar_t codepoints)
"byte_literal" escape sequences. This meant that e.g.
```fish
string match ö \Xc3\Xb6
math 5 \X2b 5
```
didn't work, but `math 5 \x2b 5` did, and would print the wonderful
error:
```
math: Error: Missing operator
'5 + 5'
^
```
So, instead, we decode eagerly.
Prior to 1811a2d, the return value for negative return codes was UB and I'd
witnessed both expected cases like -256 mapping to a $status of 0 and unexpected
cases like a return value of -1 mapping to a $status of 0. As such, this doesn't
test just one fixed return value but the entire range from negative multiples of
256 all the way down (rather, up!) to -1.
...for improved cross-platform support.
Following up on the work in c90ac7b. There was one more test that had mktemp in
the littlecheck "shebang" and this also removes a now-unnecessary `env` prefix.
All usages of `mktemp` must go through the (fish-only) `mktemp` test function
that abstracts over the differences across multiple platforms/flavors.
Tests can be easily run individually via `ninja -C build test_xxx` and there
isn't a good reason to randomly manually override $HOME and $XDG_CONFIG_HOME for
a test here and a test there.
If it's absolutely necessary, littlecheck.py should be extended to support a
`%temp` variable initialized to a temporary directory and that can be used
instead of calling out to the platform-provided `mktemp` via a subshell.
For unknown reasons, the i686 launchpad builders fail on this date,
but apparently not the others.
Let's just remove it, we've tested dates older than the epoch, this is
slightly redundant.
As discussed in #9221, a bug in the autocomplete that was fixed in 66391922
caused completions to be incorrectly suppressed. The dropped test/check was
inadvertently relying on the buggy behavior and expected a git invocation to
generate no completions but there are, in fact, completions now that the bug has
been resolved.
cc @faho: I'm not sure if you want to replace this with a different check that
actually doesn't yield any completions or if you're happy with it just being
dropped.
This fails on old Ubuntu with:
> touch: invalid date format ‘190112112040.39’
Because we don't actually need the seconds here, we just use minute
resolution. It's fine.
Also use `path mtime`, because that's a portable way to get the mtime.
I forgot `stat` is non-portable. There's no great way to portably get a
machine-readable representation of stat(2) for a file. I don't want to ship our
own lstat(2) wrapper executable just for this test and don't want to fork out to
python or perl for this either - I just wanted to get the tests to pass under
WSL :'(
Anyway, just give up and make it skip just for WSL. If another OS fails this
test in the future, the comments and existing workaround will make it easy to
figure out what the problem is and what needs to be done. We'll cross that
bridge when we get there.
It turns out that not all systems print an unsigned integer as the output of
`stat -c %Y xxx` and the leading `-` can be misinterpreted as a parameter to
`string match`.
There's no guarantee (nor requirement) that the filesystem support pre-epoch
modification dates. If it doesn't, the `test` tests were failing to get the
expected results.
Skip the test if it seems the fs doesn't support pre-epoch timestamps
(determined by the pre-epoch mtime of `oldest` evaluating to 0 or the unix epoch).
* Replace ";" with "\n" in alias-generated functions
This lets us add a "#" in our aliases to make
them ignore additional arguments (see the sketch after this list).
* Update changelog about aliases that ignore arguments
* Update test for alias.fish
This is now compliant with the aliases that can
ignore arguments.
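A small illustration of the "#" trick enabled by the newline separator (the alias and command here are only examples):
```fish
# The generated function body now puts "end" on its own line, so the "#" only
# comments out the trailing $argv instead of breaking the function.
alias gloga 'git log --oneline --graph #'
gloga these words are ignored
```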
This errored out *later* because the result was infinite or NaN, but
it didn't actually stop evaluation.
I'm not sure if there is a way to get floating point math to turn an
infinity back into something that doesn't depend on a literal
infinity, but division by zero conceptually isn't a thing we can
support.
There's entire branches of maths dedicated to figuring out what
dividing by "basically zero" means and we don't have to get into it.
This is essentially the inverse of `string pad`.
Where that adds characters to get up to the specified width,
this adds an ellipsis to a string if it goes over a specific maximum width.
The ellipsis string can be given explicitly, but defaults to our standard ellipsis.
("…" if the locale can handle it and "..." otherwise)
If the ellipsis string is empty, it just truncates.
For arguments given via argv, it goes line-by-line,
because otherwise length makes no sense.
If "--no-newline" is given, it adds an ellipsis instead and removes all subsequent lines.
Like pad and `length --visible`, it goes by visible width,
skipping recognized escape sequences, as those have no influence on width.
The default target width is the shortest of the given widths that is non-zero.
If the ellipsis is already wider than the target width,
we truncate instead. This is safer overall, so we don't e.g. move into a new line.
This is especially important given our default ellipsis might be width 3.
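A hedged example, assuming this lands as a `string` subcommand named `shorten` with `--max` and `--char` options (treat the names as illustrative):
```fish
string shorten --max 10 "hello world, this is quite long"
# => hello wor…   (9 visible characters plus the 1-wide ellipsis)
string shorten --char '' --max 10 "hello world, this is quite long"
# => hello worl   (empty ellipsis string: plain truncation)
```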
When selecting items in the pager, only the latest of those items is kept
in the edit history, as so-called transient edit. Each new transient edit
evicts any old transient edit (via undo).
If the pager is closed by a command that performs another transient edit
(like history-token-search-backward) we thus inadvertently undo (= remove)
the token inserted by the pager. Fix this by closing a transient edit
session when closing the pager. Token search will start its own session.
Fixes #9160
This checked specifically for "| and" and "a=b" and then just gave the
error without a caret at all.
E.g. for a /tmp/broken.fish that contains
```fish
echo foo
echo foo | and cat
```
This would print:
```
/tmp/broken.fish (line 3): The 'and' command can not be used in a pipeline
warning: Error while reading file /tmp/broken.fish
```
without any indication other than the line number as to the location
of the error.
Now we do
```
/tmp/broken.fish (line 3): The 'and' command can not be used in a pipeline
echo foo | and cat
^~^
warning: Error while reading file /tmp/broken.fish
```
Another nice one:
```
fish --no-config -c 'echo notprinted; echo foo; a=b'
```
failed to give the error message!
(Note: Is it really a "warning" if we failed to read the one file we
were told to?)
We should check if we should either centralize these error messages
completely, or always pass them and remove this "code" system, because
it's only used in some cases.
This skipped printing a "^" line if the start or length of the error
was longer than the source.
That seems like the correct thing at first glance; however, it means
that the caret line isn't skipped *if the file goes on*.
So, for example
```fish
echo "$abc["
```
by itself, in a file or via `fish -c`, would not print an error, but
```fish
echo "$abc["
true
```
would. That's not a great way to print errors.
So instead we just.. imagine the start was at most at the end.
The underlying issue why `echo "$abc["` causes this is that `wcstol`
didn't move the end pointer for the index value (because there is no
number there). I'd fix this, but apparently some of
our recursive variable calls absolutely rely on this position value.
This stops us from loading the completions for e.g. `./foo` if there
is no `foo` in path.
This is because the completion scripts will call an unqualified `foo`,
and then error out.
This of course means if the script would work because it never calls
the command, we still don't load it.
Pathed completions via `complete --path` should be unaffected because
they aren't autoloaded anyway.
Workaround for #3117
Fixes #9133
This was misguidedly "fixed" in
9e08609f85, which made printf error out
with any "-"-prefixed words as the first argument.
Note: This means currently `printf --help` doesn't print the help.
This also matches `echo`, and we currently don't have anything to make
a literal `--help` execute a builtin help except for keywords. Oh well.
Fixes #9132
This used to be kept, so e.g. testing it with
fish_read_limit=5 echo (string repeat -n 10 a)
would cause the prompt and such to error as well.
Also there was no good way to get back to the default value
afterwards.
* string repeat: Don't allocate repeated string all at once
This used to allocate one string and fill it with the necessary
repetitions, which could be a very very large string.
Now, it instead uses one buffer and fills it to a chunk size,
and then writes that.
This fixes:
1. We no longer crash with too large max/count values. Before they
caused a bad_alloc because we tried to fill all RAM.
2. We no longer fill all RAM if given a big-but-not-too-big value. You
could've caused fish to eat *most* of your RAM here.
3. It can start writing almost immediately, instead of waiting
potentially minutes to start.
Performance is about the same to slightly faster overall.
The previous fix was reverted because it broke another scenario. Add tests
for both scenarios.
The first test exposes another problem: autosuggestions are sometimes not
recomputed after selecting the first completion with Tab Tab. Fix that too.
Since the fix for #3892, this escaping style escapes
\n to \\n
as well as
\\ to \\\\
\' to \\'
I believe these two are the only printable characters that are escaped with
ESCAPE_NO_PRINTABLES.
The rationale is probably to keep the encoding unambiguous and reversible.
However that doesn't justify escaping the single quote. Probably this was
an accident, so let's revert that part.
This has the nice effect that single quotes will no longer be escaped
when rendered in the completion pager (which is consistent with other
special characters). Try it:
complete : -a "aaa\'\; aaaa\'\;" -f
Also this makes the error output of builtin bind consistent:
$ bind -e --preset \;
$ bind -e --preset \'
$ bind \;
bind: No binding found for sequence “;”
$ bind \'
bind: No binding found for sequence “'”
the last line is clearly better than the old version:
bind: No binding found for sequence “\'”
In general, the fact that ESCAPE_NO_PRINTABLES escapes the (printable)
backslash is weird but I guess it's fine because it looks more consistent to
users, even though the result is an undocumented subset of the fish language.
command_line_has_transient_edit tracks the actual command line, not the
pager search field. We accidentally reset it after modifying the search field
which causes unexpected behavior - the commandline added by the completion
pager remains even after I press Escape.
This was an inadvertent change from
cc632d6ae9.
Because we used wgetcwd directly before, we always got the "physical"
resolved $PWD.
There's an argument to be made to use the logical $PWD here as well
but I prefer not to make changes like that in a random commit without
good reason.
This can be used to print the modification time, like `stat` with some
options.
The reason is that `stat` has caused us a number of portability
headaches:
1. It's not available everywhere by default
2. The versions are quite different
For instance, with GNU stat it's `stat -c '%Y'`, with macOS it's `stat
-f %m`.
So now checking a cache file can be done just with builtins.
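For instance, a cache check might look like this (a minimal sketch; the file names are illustrative and both files are assumed to exist):
```fish
if test (path mtime ./source.fish) -gt (path mtime ./cache.fish)
    echo "cache is stale, rebuilding"
end
```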
This adds a line to `set --show`s output like
```
$PATH: originally inherited as |/home/alfa/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/bin:/usr/bin/site_perl:/usr/bin/vendor_perl:/usr/bin/core_perl:/var/lib/flatpak/exports/bin|
```
to help with debugging.
Note that this means keeping an additional copy of the original
environment around. At most this would be one ARG_MAX's worth, which
is about 2M.
This is sort of slow because it's called hundreds of times.
We used to have a cache, introduced in ad9b4290e, but it was removed
in fee5a9125a because it had
false-positives.
So, because the issue is that this is called hundreds of times
per commandline, we cache it keyed on the commandline.
This speeds up `complete -C'git sta'` by a factor of 2.3x.
It's still useful without optspecs, for instance to implement a command that
takes no options, or to check min-args or max-args.
(technically no optspecs, no min/max args and --ignore-unknown does
nothing, but that's a very specific error that we don't need to forbid)
Fixes #9006
Commit ad9b4290e optimized git completions by adding a completion that would
run on every completion request, which allows to precompute data used by
other completion entries. Unfortunately, the completion entry is not run
when the commandline contains a flag like `git -C`. If we didn't
already load git.fish, we'd error. Additionally, we got false positive
completions for `git diff -c`.
So this hack was a very bad idea. We should optimize in another way.
This is simply an error in test setup. There's a limit to how far we
can isolate them from the system.
(it's possible new cmake versions close fds automatically since I
can't reproduce the original issue via `ninja test` or `make test`)
Fixes #9017
This lacks the tmux-256color terminfo entry, leading to spurious
warnings like
warning: Could not set up terminal. <= no check matches
warning: TERM environment variable set to \'tmux-256color\'. <= no check matches
warning: Check that this terminal type is supported on this system. <= no check matches
warning: Using fallback terminal type \'ansi\'. <= no check matches
Git's pathspec system is kind of annoying:
> A pathspec that begins with a colon : has special meaning. In the short form, the leading colon : is followed by zero or more "magic signature" letters (which optionally is terminated by another colon :), and the remainder is the pattern to match against the path. The "magic signature" consists of ASCII symbols that are neither alphanumeric, glob, regex special characters nor colon. The optional colon that terminates the "magic signature" can be omitted if the pattern begins with a character that does not belong to "magic signature" symbol set and is not a colon.
So if we complete `:/foo`, that "works" because "f" is alphanumeric
and so the "/" is the only magic character here.
If, however the filename starts with a magic character, that's used as
a magic signature.
So we do what the docs say and terminate the magic signature after the
"/" (which means "from the repo root").
Fixes #9004
This makes it so
1. The informative status can work without showing untracked
files (previously it was disabled if bash.showUntrackedFiles was
false)
2. If untrackedfiles isn't explicitly enabled, we use -uno, so git
doesn't have to scan all the files.
In a large repository (like the FreeBSD ports repo), this can improve
performance by a factor of 5 or more.
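For reference, a hedged sketch of the relevant prompt configuration (these are the long-standing fish git-prompt variables, shown only as an illustration):
```fish
# Informative status no longer requires untracked-file scanning:
set -g __fish_git_prompt_show_informative_status 1
# Untracked files are only counted if explicitly requested:
set -g __fish_git_prompt_showuntrackedfiles 1
```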
In b0084c3fc4, we refactored how event handlers get removed. But this
also caused us to remove "one-shot" handlers even if they have not yet
been fired. Fix this.
When switching this to use `git status`, I neglected to use the
correct definition of what a "dirty" and a "staged" change is.
So this now showed already staged files still as "dirty".
Fixes #8986
This makes it so `complete -c foo -n test1 -n test2` registers *both*
conditions, and when it comes time to check the candidate, tries both,
in that order. If any fails it stops, if all succeed the completion is offered.
The reason for this is that it helps with caching - we have a
condition cache, but conditions like
```fish
test (count (commandline -opc)) -ge 2; and contains -- (commandline -opc)[2] length
test (count (commandline -opc)) -ge 2; and contains -- (commandline -opc)[2] sub
```
defeats it pretty easily, because the cache only looks at the entire
script as a string - it can't tell that the first `test` is the same
in both.
So this means we separate it into
```fish
# before
complete -f -c string -n "test (count (commandline -opc)) -ge 2; and contains -- (commandline -opc)[2] length" -s V -l visible -d "Use the visible width, excluding escape sequences"
# after
complete -f -c string -n "test (count (commandline -opc)) -ge 2" -n "contains -- (commandline -opc)[2] length" -s V -l visible -d "Use the visible width, excluding escape sequences"
```
which allows the `test` to be cached.
In tests, this improves performance for the string completions by 30%
by reducing all the redundant `test` calls.
The `git` completions can also greatly benefit from this.
This adds a path builtin to deal with paths.
It offers the following subcommands:
- filter to go through a list of paths and only print the ones that pass some filter - exist, are a directory, have read permission, ...
- is as a shortcut for filter -q to only return true if one of the paths passed the filter
- basename, dirname and extension to print certain parts of the path
- change-extension to change the extension to a different one (as a string operation)
- normalize and resolve to canonicalize the paths in various flavors
- sort to sort paths, also only using the basename or dirname as a key
The definition of "extension" here was carefully considered and should line up with how extensions are actually used - ~/.bashrc doesn't have an extension, but ~/.conf.d does (".d").
These subcommands all compose well - they can read from arguments or stdin (like string), they can use null-delimited input or output (input is autodetected - if a NULL happens in the first PATH_MAX bytes it switches automatically).
It is both a failglob exception (so like set if a glob passed to it fails it just doesn't get any arguments for it instead of triggering an error), and passes output to command substitution buffers explicitly split (like string split0) so newlines are easy to handle.
This would still remove non-existent paths, which isn't a strict
inversion and contradicts the docs.
Currently, to only allow paths that exist but don't pass a type check,
you'd have to filter twice:
path filter -Z foo bar | path filter -vfz
If a shortcut for this becomes necessary we can add it later.
This is now added to the two commands that definitely deal with
relative paths.
It doesn't work for e.g. `path basename`, because after removing the
dirname prepending a "./" doesn't refer to the same file, and the
basename is also expected to not contain any slashes.
Because we now count the extension including the ".", we print an
empty entry.
This makes e.g.
```fish
set -l base (path change-extension '' $somefile)
set -l ext (path extension $somefile)
echo $base$ext
```
reconstruct the filename, and makes it easier to deal with files with
no extension.
This means "../" components are cancelled out even after non-existent
paths or files.
(the alternative is to error out, but being able to say `path resolve
/path/to/file/../../` over `path resolve (path dirname
/path/to/file)/../../` seems worth it?)
Yeah, the macOS tests fail because it's started in /private/var... with a
$PWD of /var.... So resolve canonicalizes the path, which makes it no
longer match $PWD.
Simply use pwd -P
This just goes back until it finds an existent path, resolves that,
and adds the normalized rest on top.
So if you try
/bin/foo/bar////../baz
and /bin exists as a symlink to /usr/bin, it would resolve that, and
normalize the rest, giving
/usr/bin/foo/baz
(note: We might want to add this to realpath as well?)
This includes the "." in what `path extension` prints.
This allows distinguishing between an empty extension (just `.`) and a
non-existent extension (no `.` at all).
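A quick illustration of the distinction (hedged, the file names are made up):
```fish
path extension foo.tar.gz   # prints ".gz"
path extension foo.         # prints "." - the extension exists but is empty
path extension foo          # prints nothing - there is no extension at all
```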
This adds a "path" builtin that can handle paths.
Implemented so far:
- "path filter PATHS", filters paths according to existence and optionally type and permissions
- "path base" and "path dir", run basename and dirname, respectively
- "path extension PATHS", prints the extension, if any
- "path strip-extension", prints the path without the extension
- "path normalize PATHS", normalizes paths - removing "/./" components
- and such.
- "path real", does realpath - i.e. normalizing *and* link resolution.
Some of these - base, dir, {strip-,}extension and normalize operate on the paths only as strings, so they handle nonexistent paths. filter and real ignore any nonexistent paths.
All output is split explicitly, so paths with newlines in them are
handled correctly. Alternatively, all subcommands have a "--null-input"/"-z" and "--null-output"/"-Z" option to handle null-terminated input and create null-terminated output. So
find . -print0 | path base -z
prints the basename of all files in the current directory,
recursively.
With "-Z" it also prints it null-separated.
(if stdout is going to a command substitution, we probably want to
skip this)
All subcommands also have a "-q"/"--quiet" flag that tells them to skip output. They return true "when something happened". For match/filter that's when a file passed, for "base"/"dir"/"extension"/"strip-extension" that's when something about the path *changed*.
Filtering
---------
`filter` supports all the file*types* `test` has - "dir", "file", "link", "block"..., as well as the permissions - "read", "write", "exec" and things like "suid".
It is missing the tty check and the check for the file being non-empty. The former is best done via `isatty`, the latter I don't think I've ever seen used.
There currently is no way to only get "real" files, i.e. ignore links pointing to files.
Examples
--------
> path real /bin///sh
/usr/bin/bash
> path extension foo.mp4
mp4
> path extension ~/.config
(nothing, because ".config" isn't an extension.)
This teaches `--on-signal SIGINT` (and by extension `trap cmd SIGINT`)
to work properly in scripts, not just interactively. Note any such
function will suppress the default behavior of exiting. Do this for
SIGTERM as well.
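A minimal sketch of what this enables in a script (the function name is illustrative):
```fish
# With this change, the handler also fires in non-interactive scripts.
function __on_interrupt --on-signal SIGINT
    echo "interrupted, cleaning up"
    exit 1   # the handler suppresses the default exit, so exit explicitly if desired
end
```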
Like `set` and `read` before it, `eval` can be used to set variables,
and so it can't be shadowed by a function without loss of
functionality.
So this forbids it.
Incidentally, this means we will no longer try to autoload an
`eval.fish` file that's left over from an old version, which would
have helped with #8963.
Previously, running `fish_add_path /foo /foo` would result in /foo
being added to $PATH twice.
Now we check that it hasn't already been given, so we skip the
second (and any further) occurrence.
This *might* be a bit faster running under TSAN, otherwise it takes >
400 seconds on Github Actions.
If this doesn't work we need to disable it for TSAN.
To recap, this means `&` in the middle of a word no longer
backgrounds.
So:
```fish
echo foo&bar # prints foo&bar
echo foo& bar # backgrounds an echo that prints "foo" and runs "bar"
```
This can no longer be changed. If "no-stderr-nocaret" is in
$fish_features it will simply be ignored.
The "^" redirection that was deprecated in fish 3.0 is now gone for good.
Note: For testing reasons, it can still be set _internally_ by running
"feature_flags_t::set". We simply shouldn't do that.
When you do
```fish
set foo-bar baz
```
"foo-baz" isn't usable as a variable *name*. When you just say the
"variable" is invalid that could also be interpreted to be a special
type of variable or something.
String tokens are subdivided by command substitutions. Some syntax errors
can occur in the gap between two command substitutions. Make the caret point
to the start of that gap, instead of the token start.
When expanding command substitutions, we use a naïve way of detecting whether
the cmdsub has the optional leading dollar. We check if the last character was
a dollar, which breaks if it's an escaped dollar. We wrongly expand
\$(echo "") to the empty string. Fix this by checking if the dollar was escaped.
The parse_util_* functions have a bunch of output parameters. We should
return a parameter bag instead (I think I tried once and failed).
Given
set var a
echo "$var$(echo b)"
the double-quoted string is expanded right-to-left, so we construct an
intermediate "$varb". Since the variable "varb" is undefined, this wrongly
expands to the empty string (should be "ab"). Fix this by isolating the
expanded command substitution internally. We do the same when handling
unquoted command substitutions.
Fixes #8849
The read test is now failing on GitHub actions even though it passes on
my Mac. It may be due to differences in dd between these two
environments. Stop using dd and just use head.
The read.fish check has a test where it limits the amount of data passed to
`read` to 8192 bytes, and verifies that fish reads exactly that amount.
This check occasionally fails on the OBS builds; it's very hard to repro a
failure locally, but I finally did it.
The amount of data written is limited via `yes` and `dd`:
yes $line | dd bs=1024 count=(math "$fish_read_limit / 1024")
The bug is that `dd` outputs a fixed number of "blocks" where a block
corresponds to a single read. As `yes` and `dd` are running concurrently,
it may happen that `dd` performs a short read; this then counts as a single
block. So `dd` may output less than the desired amount of data.
This can be verified by removing the 2>/dev/null redirection; on a
successful run dd reports `8+0 records out`, on a failed run it reports
`7+1 records out` because one of the records was short.
Fix this by using `fullblock` so that dd will no longer count a short read
as a single block. `head` would probably be a simpler tool to use but we'll
do this for now.
Happily it's not a fish bug. No need to relnote it.
This was already apparently supposed to work, but didn't because we
just overrode errno again.
This now means that, if a correctly named candidate exists, we don't
start the command-not-found handler.
See #8804
This used to call exec_subshell, which has two issues:
1. It creates a command substitution block which shows up in a stack
trace
2. It does much more work than necessary
This removes a useless "in command substitution" from an error message
in an autoloaded file, and it speeds up autoloading a bit (not
measurable in actual benchmarks, but microbenchmarks are 2x).
* Implement fish_wcstod_underscores
* Add fish_wcstod_underscores unit tests
* Switch to using fish_wcstod_underscores in tinyexpr
* Add tests for math builtin underscore separator functionality
* Add documentation for underscore separators for math builtin
* Add a changelog entry for underscore numeric separators
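A quick usage example of the new separator (hedged):
```fish
math 1_000_000 + 2_500   # prints 1002500
```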
We can't always read in chunks because we often can't bear to
overread:
```fish
echo foo\nbar | begin
read -l foo
read -l bar
end
```
needs to have the first read read `foo` and the second read `bar`. So
here we can only read one byte at a time.
However, when we are directly redirected:
```fish
echo foo | read foo
```
we can, because the data is only for us anyway. The stream will be
closed after, so anything not read just goes away. Nobody else is
there to read.
This dramatically speeds up `read` of long lines through a pipe. How
much depends on the length of the line.
With lines of 5000 characters it's about 15x, with lines of 50
characters about 2x, lines of 5 characters about 1.07x.
See #8542.
This is the simple fix - if we have no valid digit, we have nothing to
return. So instead of returning a NULL, we return an error.
This is already the case for invalid octal escapes (like `\777`).
Fixes #8545
This reverts commits:
2d9e51b43ed1d9f147ec346ce8081b
The box drawing because it's entangled with the rest and we don't
currently use this anywhere I know of. Nor was it gated on terminfo,
so it could have broken things, for subjectively little gain.
Fixes #8727.
A history search ends when you move the cursor, but the commandline inserted by
history search is still marked as transient. This means that the next history
search will clear the transient commandline. This means we are dropping an undo
point, for example:
echo 11
echo 1
echo autosuggestion
echo^P # commandline is "echo 1"
^A # stop history search
^P # commandline is "echo 11"
^Z # Bug: commandline goes back to "echo", but it should be "echo 1"
In the worst case, we are switching from line-search to token-search (see
the attached test case). Clearing the transient edit means the line is gone
and only the token is left on the command line.
Today, a command like "var=val status " has custom completions
because we skip over the var=val variable override when detecting
the command token.
However if the custom completions read the commandline state (via
"commandline -opc") they do see they variable override, which breaks
them, most likely. Try "a=b git ".
For completions of wrapped commands, we already set a transient
commandline. Do the same for commands with leading variable overrides;
then git completions for "a=b git " will think the commandline is
"git ".
Previously, when we got an unknown option with --ignore-unknown, we
would increment woptind but still try to read the same contents.
This means in e.g.
```
argparse -i h -- -ooo -h
```
The `-h` would also be skipped as an option, because after the first
`-o` getopt reads the other two `-o` and skips that many options.
This could be handled more extensively in wgetopt, but the simpler fix
is to just skip to the next argv entry once we have an unknown option
- there's nothing more we can do with it anyway!
Additionally, document this and clearly explain that we currently
don't transform the option.
Fixes #8637
fish_git_prompt may run certain git commands which may invoke certain
external programs as specified in `.git/config`. Prevent this by suppressing
certain git config options.
This affects the caret position. In an expression like
123 456
we previously reported:
123 456
^ missing operator
Now we do:
123 456
   ^ missing operator
We do it on the first space, which should be acceptable.
(no need for a changelog entry, we have already ignored #8511)
Only show the shebang warning for .fish commands.
Use the phrase "interpreter directive" as the formal name for the
shebang.
Switch from windows to Windows for the operating system.
"not not return 34" exits with 34, not 1. This behavior is pretty
surprising but benign. I think it's very unlikely that anyone relies
on the opposite behavior, because using two "not" decorators in one
job is weird, and code that compares not's raw exit code is rare.
The behavior doesn't match our docs, but it's not worth changing the
docs because that would confuse newcomers. Add a test to cement the
behavior and a comment to explain this is intentional.
I considered adding the comment at
parse_execution_context_t::populate_not_process where this behavior
is implemented but the field definition seems even better, because I
expect programmers to read that first.
Closes #8377
Commit e40eba358 (Treat text following quoted command substitution
as quoted) made parse_util_locate_cmdsubst_range() aware of quoted
command substitutions, by skipping surrounding text via quote_end().
However, it was not quite right. We fail to properly parse
two consecutive command substitutions in the same string,
because we don't maintain the quoting context across calls to
parse_util_locate_cmdsubst_range(). Let's track that bit in a
parameter. This allows us to get rid of the quote_end() hack.
Also apply this to the other place where we call
parse_util_locate_cmdsubst_range() in a loop (highlighting).
Fixes #8500
This fixes a regression about where we report errors:
echo error(here
old: ^
fixed:    ^
Commit 0c22f67bd (Remove the old parser bits, 2020-07-02) removed
uses of "error_offset_within_token" so we always report errors at
token start. Add it back, hopefully restoring the 3.1.2 behavior.
Note that for cases like
echo "$("
we report "unbalanced quotes" because we treat the $( as double
quote. Giving a better error seems hard because of the ambiguity -
we don't know if quote is meant to be inside or outside the command
substitution.
If you make a script called `foo` somewhere in $PATH, and did not give
it a shebang, this would end up calling
sh foo
instead of
sh /usr/bin/foo
which might not match up.
Especially if the path is e.g. `--version` or `-` that would end up
being misinterpreted *by sh*.
So instead we simply pass the actual_cmd to sh, because we need it
anyway to get it to fail to execute before.
For some reason, the window dimension parameters are ignored by tmux.
Not even an extra "resize-pane -x 80 -y 10" helps. So let's just drop
that assumption from our tests.
When the completion pager fills up all lines of the screen, we subtract
from the pager size the number of lines occupied by the prompt +
command line buffer (typically 1), so the command line is always
visible. However, we only subtract the number of lines *before* the
cursor, so on some multiline commandlines we draw a pager that is
too large for our screen, clobbering the commandline rendering.
Fix this by counting all lines.
Fixes #8509
Possibly fixes #8405
A command like "printf nonewline | sed s/x/y/" does not print a
concluding newline, whereas "printf nnl | string replace x y" does.
This is an edge case -- usually the user input does have a newline at
the end -- but it seems still better for this command to just forward
the user's data.
Teach most string subcommands to check if stdin is missing the trailing
newline, and stop adding one in that case.
This does not apply when input is read from commandline arguments.
* Most subcommands stop adding the final newline, because they don't
really care about newlines, so besides their normal processing,
they just want to preserve user input. They are:
* string collect
* string escape/unescape
* string join¹
* string lower/upper
* string pad
* string replace
* string repeat
* string sub
* string trim
* string match keeps adding the newline, following "grep". Additionally,
for string match --regex, it's important to output capture groups
separated by newlines, resulting in multiple output lines for an
input line. So it is not obvious where to leave out the newline.
* string split/split0 keep adding the newline for the same reason --
they are meant to output multiple elements for a single input line.
¹) string join0 is not changed because it already printed a trailing
zero byte instead of the trailing newline. This is consistent
with other tools like "find -print0".
Closes #3847
A «complete -C '~/fish-shell/build/fish '» fails to load custom
completions because we do not expand the ~, so
complete_param_for_command() thinks that this command is invalid.
Expand command tokens before loading custom completions.
Fixes #8442
Currently,
set -q --unpath PATH
simply ignores the "--unpath" bit (and same for "--path").
This changes it, so just like exportedness you can check pathness.
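A hedged sketch of the new behavior (the variable name is illustrative):
```fish
set --path P a b
set -q --path P; and echo "P is a path variable"   # now succeeds
set -q --unpath P                                  # now fails instead of ignoring --unpath
```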
This finds the first broken component, to help people figure out where
they misspelt something.
E.g.
```
echo foo >/usr/lob/systemd/system/machines.target.wants/var-lib-machines.mount
```
will now show:
```
warning: Path '/usr/lob' does not exist
```
which would help with seeing that it should be "/usr/lib".
On a commandline like "ls arg" (cursor at end) we do not expand
abbreviations on enter. OTOH, on "ls " we do expand. This can be
frustrating because it means that the two obvious ways to suppress
abbreviation expansion (C-Space or post-expansion C-Z) cannot be used to
suppress expansion of a command without arguments. (One workaround is
"ls #".)
Only expand-on-execute if the cursor is at the command name (no space
in between).
This is a strict improvement for realistic scenarios, because if there
is a space, the user has already expressed the intent to not expand
the abbreviation. (I hope no one is using recursive abbreviations.)
Closes #8423
This was supposed to act like `type -q` or `command -q`, in that it
returns 0 if at least 1 exists.
But because it used the wrong variable it didn't.
Fixes #8431.
This fixes printing octal and hex values that are negative or larger
than UINT_MAX.
Negative values get a leading -, like:
> math --base hex -10
-0xa
Fixes #8417.
Commit ec3d3a481 (Support "$(cmd)" command substitution without line
splitting, 2021-07-02) started treating an input string like
"a$()b" as if it were "a"$()"b". Yet, we do not actually insert the
virtual quotes. Instead we just adapted the definition of when quotes
are closed - hence the changes to quote_end().
parse_util_locate_cmdsubst_range() is aware
of the changes to quote_end() but some of its
callers like parse_util_detect_errors_in_argument() and
highlighter_t::color_as_argument() are not. They split strings at
command substitution boundaries without handling the special quoting
rules. (Only the expansion logic did it right.)
Fix this by handling the special quoting rules inside
parse_util_locate_cmdsubst_range(). This is a bit hacky since it
makes it harder for callers to process some substrings in between
command substitutions, but that's okay because current callers only
care about what's inside the command substitutions.
Fixes #8394
Since #4376, for-loops would set the loop variable outside, so it
stays valid.
They did this by doing the equivalent of
```fish
set -l foo $foo
for foo in 1 2 3
```
And that first imaginary `set -l` would also fire a set-event.
Since there's no use for it and the variable isn't actually set, we
remove it.
Fixes #8384.
widechar_width no longer classifies U+1F41F as widened-in-9, so the
width no longer changes.
Since we're interested in testing the change here, we need a different
emoji.
Just use 🥁, which was introduced in 9 as wide, and therefore widened
in 9.
Like the $status commit, this would add the offset to already existing
errors, so
```fish
(foo)
(bar)
something
```
would see the "(foo)" error, store the correct error location, then
see the "(bar)" error, and *add the offset of (bar)* to the "(foo)"
error location.
Solve this by making a new error list and appending it to the existing
ones.
There's a few other ways to solve this, including:
- Stopping after the first error (we only display the first anyway, I
think?)
- Making it so the source location has an "absolute" flag that shows
the offset has already been added (but do we ever need to add two offsets?)
I went with the simpler fix.
This would break the location of any prior errors without doing
anything of value.
E.g.
```fish
echo foo | exec grep # this exec is not allowed!
$status
somethingelse # The error might be found here!
```
Would apply the offset of `$status` to the offset of `exec`, locating
the error for `exec` somewhere after $status!
Prior to this change, tmux based tests would call 'isolated-tmux' which would
initialize tmux on first call, an admitted "evil hack." Switch to requiring
an explicit call to 'isolated-tmux-start' which then defines 'isolated-tmux'
and other functions. Add some loop-until-prompt logic into
'isolated-tmux-start'. This improves reliability of the tmux tests on systems
under load; at least it makes the tests pass in the background on my Mac.
Remove the '$sleep' variable, to be replaced with 'tmux-sleep'.
This makes it so we treat backspaces as width -1, but never go below a
0 total width when talking about *lines*, like in screen or string
length --visible.
Fixes #8277.
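Rough illustration (assuming \b produces a literal backspace in these arguments):
```fish
string length --visible ab\b    # 1: width 2 minus one backspace
string length --visible \b\bx   # 1: the running width is clamped at 0
```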
When cd is passed a broken symlink, this changes the error message from
"no such directory" to "broken symbolic link". This scenario probably
won't happen very often since completion won't suggest broken symlinks
but it can't hurt to give a good error.
Fish used to do this until 7ac5932. This logic used to be in
path_get_cdpath, however, that is only used for highlighting, so we
don't need error messages there. Changing cd is enough.
Reword from "rotten" to "broken" since that's what file(1) uses.
Clean-up leftovers from old "rotten" code (nomen est omen).
See #8264
This currently changes builtin realpath with the "-s" option:
builtin realpath -s ///tmp
previously would print "///tmp", now it prints "/tmp".
The only thing "allow_leading_double_slashes" does is allow *two*
slashes.
This is important for `path match`, to be introduced in #8265.
The tmux-prompt test would sometimes fail because the first call was:
isolated-tmux capture-pane -p
this would run a capture-pane which would race with starting fish
itself; occasionally the pane would be empty since fish has not yet
drawn a prompt. Add a loop to give fish time to draw the prompt.
This used the *logical* $PWD, but realpath would operate on the
physical $PWD if given ".", even with -s. This makes this test fail if the $PWD is
logically different from physical.
This was long overdue since the setup logic is much more complex than
the actual tests.
tmux-prompt.fish had extra logic to protect against XDG_CONFIG_HOME
with leading double double-dot. I believe this is no longer necessary
with the new test driver.
We still use our own temp dir because we want to be able to run this
independently of the test driver, This can be useful for debugging
tests. For example we can insert a "$tmux attach" command in a test,
and then run
build/fish -C 'source tests/test_functions/isolated-tmux.fish' tests/checks/tmux-bind.fish
This allows to inspect the state of the test and debug interactively.
Attaching to the terminal doesn't work when running inside littlecheck
because littlecheck consumes our output and doesn't give us a terminal.
(Maybe there's an easy way to fix that?)
On request of a team member, this patches `basic.fish` to no longer
depend on being invoked by the test driver and started up in a $PWD that
points to a clean temporary directory.
This was requested by a team member who would like for some tests to
remain invokable (in their own $HOME) directly via littlecheck without
relying on the test driver to prep the environment.
A comment explaining the rationale is also added so this doesn't get
passed down as folklore "you need to include this for tests to run" even
though no one understands why.
Tests are now executed in a test-specific temporary directory, so test
output on failure should be reproducible/reusable as-is without needing
to have TMPDIR defined (as it only exists by default under macOS).
Instead of trying to assert that there are no zombies when the test
starts (which often fails) and to prevent conflating existing or
irrelevant zombies with the ones we are interested in checking for,
have `ps` also emit the parent process id and filter its output to
include only children of the current fish instance.
Aside from the fact that the shared state could cause problems, tests
were randomly assuming it would be created where that wasn't the case.
In particular, `redirect.fish` and `basic.fish` were failing on only
macOS because `../test/temp` didn't exist yet - it would be created by
other tests later.
This disables job control inside command substitutions. Prior to this
change, a cmdsub might get its own process group. This caused it to fail
to cancel loops properly. For example:
while true ; echo (sleep 5) ; end
could not be control-C cancelled, because the signal would go to sleep,
and so the loop would continue on. The simplest way to fix this is to
match other shells and not use job control in cmdsubs.
Related is #1362
* commandline: Add --is-valid option to query whether it's syntactically complete
This means querying when the commandline is in a state that it could
be executed. Because our `execute` bind function also inserts a
newline if it isn't.
One case that's not handled right now: `execute` also expands
abbreviations, those can technically make the commandline invalid
again.
Unfortunately we have no real way to *check* without doing the
replacement.
Also, since abbreviations are only available in command position, when
you _execute_ them the commandline will most likely be valid.
This is enough to make transient prompts work:
```fish
function reset-transient --on-event fish_postexec
set -g TRANSIENT 0
end
function maybe_execute
if commandline --is-valid
set -g TRANSIENT 1
commandline -f repaint
else
set -g TRANSIENT 0
end
commandline -f execute
end
bind \r maybe_execute
```
and then in `fish_prompt` react to $TRANSIENT being set to 1.
Because we are, ultimately, interested in how many cells a string
occupies, we *have* to handle carriage return (`\r`) and line
feed (`\n`).
A carriage return sets the current tally to 0, and only the longest
tally is kept. The idea here is that the last position is the same as
the last position of the longest string. So:
abcdef\r123
ends up looking like
123def
which is the same width as abcdef, 6.
A line feed meanwhile means we flush the current tally and start a new
one. Every line is printed separately, even if it's given as one.
That's because, well, counting the width over multiple lines
doesn't *help*.
As a sidenote: This is necessarily imperfect, because, while we may
know the width of the terminal ($COLUMNS), we don't know the current
cursor position. So we can only give the width, and the user can then
figure something out on their own.
But for the common case of figuring out how wide the prompt is, this
should do.
* Add `set --function`
This makes the function's scope available, even inside of blocks. Outside of blocks it's the toplevel local scope.
This removes the need to declare variables locally before use, and will probably end up being the main way variables get set.
E.g.:
```fish
set -l thing
if condition
set thing one
else
set thing two
end
```
could be written as
```fish
if condition
set -f thing one
else
set -f thing two
end
```
Note: Many scripts shipped with fish use workarounds like `and`/`or`
instead of `if`, so it isn't easy to find good examples.
Also, if there isn't an else-branch, as in the above but just with
```fish
if condition
set -f thing one
end
```
that means something different from setting it before! Now, if
`condition` isn't true, it would use a global (or universal) variable of
the same name!
Some more interesting parts:
Because it *is* a local scope, setting a variable `-f` and
`-l` in the toplevel of a function ends up the same:
```fish
function foo2
set -l foo bar
set -f foo baz # modifies the *same* variable!
end
```
but setting it locally inside a block creates a new local variable
that shadows the function-scoped variable:
```fish
function foo3
set -f foo bar
begin
set -l foo banana
# $foo is banana
end
# $foo is bar again
end
```
This is how local variables already work. "Local" is actually "block-scoped".
Also `set --show` will only show the closest local scope, so it won't
show a shadowed function-level variable. Again, this is how local
variables already work, and could be done as a separate change.
As a fun tidbit, functions with --no-scope-shadowing can now use this to set variables in the calling function. That's probably okay given that it's already an escape hatch (but to be clear: if it turns out to problematic I reserve the right to remove it).
Fixes #565
Fixes some regressions from 35ca42413 ("Simplify some parse_util functions").
The tmux tests are not beautiful but I find them easy to write.
Probably a pexpect test would also be enough here?
for PWD in foo; true; end
prints:
>..src/parse_execution.cpp:461: end_execution_reason_t parse_execution_context_t::run_for_statement(const ast::for_header_t&, const ast::job_list_t&): Assertion `retval == ENV_OK' failed.
because this used the wrong way to see if something is read-only.
This allows us to test that `test` takes numbers with decimal point even in comma-using locales,
to stop those pesky americans from breaking everything again.
(and yes, we use french to keep myself honest)
Through a mechanism I don't entirely understand, $PWD is sometimes
writable (so that `cd` can change it) and sometimes not.
In this case we ended up with it writable, which is wrong.
See #8179.
This didn't do all the syntax checks, so something like
fish -c 'echo foo; and $status'
complained of a missing command `0` (i.e. $status), and
fish -c 'echo foo | exec grep'
hit an assert!
So we do what read_ni does, parse each command into an ast, run
parse_util_detect_errors on it if it worked and then eval the ast.
It is possible to do this neater by modifying parser::eval, but I
can't find where.
This is slightly unclean. Even tho it would otherwise be syntactically
valid, using $status as a command is very very very likely to be an
error, like
if not $status
We have reports of this surprisingly regularly, including #2773.
Because $status can only ever be a value from 0 to 255, it is also
very unlikely to be an actual command, and that command is very
unlikely to do what you want.
So we simply point the user towards the "conditions" help section,
that should explain things.
This is opt-in through a new feature flag "ampersand-nobg-in-token".
When this flag and "qmark-noglob" are enabled, this command no longer
needs quoting:
curl https://example.com/thing?foo=bar&duran=duran
Compared to the previous approach e1570a4 ("Let '&' only separate as
the first char of a word"), this has some advantages:
1. "&&" and "&>" are no longer affected. They are still special, even
if used between tokens without spaces, like "echo bar&>foo".
Maybe this is not really *better*, but it avoids risking to annoy
users by breaking the old variant.
2. "&" is still special if at the end of a token, like in "sleep 1&".
Word movement is not affected by the semantics change, so Alt-F and
friends still stop at every "&".
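To opt in (hedged; feature flags are read from the universal variable at startup, so this takes effect in new sessions):
```fish
set -Ua fish_features ampersand-nobg-in-token qmark-noglob
```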
Currently, if a "return" is given outside of a function, we'd just
throw an error.
That always struck me as a bit weird, given that scripts can also
return a value.
So simply let "return" outside also exit the script, kinda like "exit"
does.
However, unlike "exit" it doesn't quit an interactive shell - it seems
weird to have "return" do that as well. It sets $status, so it can be
used to quickly set that, in case you want to test something.
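A tiny sketch of the new behavior (the file name is hypothetical):
```fish
# script.fish
echo before
return 3
echo "never reached"
```
Running `fish script.fish; echo $status` would print "before" and then 3.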
In the variable handler, we just go through the entire thing and keep
every element once.
If there's a duplicate, we set it again, which calls the handler
again.
This takes a bit of time, to be paid on each startup. On my system,
with 100 already deduplicated elements, that's about 4ms (compared to
~17ms for adding them to $PATH).
It's also semantically more complicated - now this variable
specifically is deduplicated? Do we just want "unique" variables that
can't have duplicates?
However: This entirely removes the pathological case of appending to
$fish_user_paths in config.fish (which should be an FAQ entry!), and the implementation is quite simple.
This adds a hack to the parser. Given a command
echo "x$()y z"
we virtually insert double quotes before and after the command
substitution, so the command internally looks like
echo "x"$()"y z"
This hack allows to reuse the existing logic for handling (recursive)
command substitutions.
This makes the quoting syntax more complex; external highlighters
should consider adding this if possible.
The upside (more Bash compatibility) seems worth it.
Closes #159
In some setups (e.g. MacPorts), $tmpdir can expand to more than
100 characters, and tests fail with 'socket file name too long'
errors.
Using relative path to socket file fixes the issue.
* string: Allow `collect --no-empty` to avoid empty elision
Currently we still have that issue where
test -n (thing | string collect)
can return true if `thing` doesn't print anything, because the
collected argument will still be removed.
So, what we do is allow `--no-empty` to be used, in which case we
print one empty argument.
This means
test -n (thing | string collect -n)
can now be safely used.
"no-empty" isn't the best name for this flag, but string's design
really incentivizes reusing names, and it's not *terrible*.
* Switch to `--allow-empty`
`--no-empty` does the exact opposite for `string split` and split0.
Since `-a`/`--allow-empty` already exists, use it.
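With the final flag spelling, the earlier example reads (hedged):
```fish
test -n (thing | string collect --allow-empty)
```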
The tmux-prompt test was failing when run more than once, because
XDG_DATA_HOME has a leading double-dot, causing the uvars file to
leak across sessions. Descend more deeply into our tmpdir to isolate
our XDG_DATA_HOME.