Commit Graph

19 Commits

Fabian Boehm
dad8c527e0 read: Error on read-only variables
Fixes #9346
2023-01-13 17:56:28 +01:00
ridiculousfish
448dd18685 Use head instead of dd in the read test
The read test is now failing on GitHub Actions even though it passes on
my Mac. It may be due to differences in dd between these two
environments. Stop using dd and just use head.
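
A sketch of what the replacement looks like, assuming head's -c byte-count option (the exact test invocation may differ):

```fish
# head -c stops after exactly $fish_read_limit bytes, regardless of
# how the pipe delivers them, so short reads can't skew the count.
yes $line | head -c $fish_read_limit
```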
2022-04-02 13:44:58 -07:00
ridiculousfish
108fe574a0 Finally track down that cursed read test failure
The read.fish check has a test where it limits the amount of data passed to
`read` to 8192 bytes, and verifies that fish reads exactly that amount.
This check occasionally fails on the OBS builds; it's very hard to repro a
failure locally, but I finally did it.

The amount of data written is limited via `yes` and `dd`:

    yes $line | dd bs=1024 count=(math "$fish_read_limit / 1024")

The bug is that `dd` outputs a fixed number of "blocks" where a block
corresponds to a single read. As `yes` and `dd` are running concurrently,
it may happen that `dd` performs a short read; this then counts as a single
block. So `dd` may output less than the desired amount of data.

This can be verified by removing the 2>/dev/null redirection: on a
successful run dd reports `8+0 records out`, while on a failed run it
reports `7+1 records out` because one of the records was short.

Fix this by using `fullblock` so that dd will no longer count a short read
as a single block. `head` would probably be a simpler tool to use but we'll
do this for now.
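
A sketch of the fix, assuming GNU dd, where the flag is spelled iflag=fullblock:

```fish
# iflag=fullblock makes dd keep reading until each 1024-byte block is
# full, so a short read from the pipe no longer counts as a whole block.
yes $line | dd bs=1024 count=(math "$fish_read_limit / 1024") iflag=fullblock 2>/dev/null
```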

Happily it's not a fish bug. No need to relnote it.
2022-04-02 11:33:07 -07:00
Fabian Homborg
cd62771d12 read: Don't use chunking read with --line
Fixes a regression from #8552.
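
For context, a minimal sketch of --line's documented behavior (one whole line of input per variable; the example names are ours):

```fish
printf '%s\n' alpha beta | read --line first second
# $first is 'alpha', $second is 'beta'
```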
2022-03-14 08:04:35 +01:00
Fabian Homborg
9ada7d9aad read: Also read in chunks when directly redirected
We can't always read in chunks because we often can't bear to
overread:

```fish
echo foo\nbar | begin
    read -l foo
    read -l bar
end
```

needs to have the first read read `foo` and the second read `bar`. So
here we can only read one byte at a time.

However, when we are directly redirected:

```fish
echo foo | read foo
```

we can, because the data is only for us anyway. The stream will be
closed afterwards, so anything not read just goes away; nobody else is
there to read it.

This dramatically speeds up `read` of long lines through a pipe. How
much depends on the length of the line.

With lines of 5000 characters it's about 15x, with lines of 50
characters about 2x, and with lines of 5 characters about 1.07x.
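
A hypothetical micro-benchmark (not from the commit) to observe the effect, using fish's time and string repeat:

```fish
set -l line (string repeat -n 5000 a)
time begin
    for i in (seq 100)
        # Directly redirected: read may now consume the pipe in chunks.
        echo $line | read -l tmp
    end
end
```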

See #8542.
2022-03-13 11:22:48 +01:00
ridiculousfish
954d0fb042 Output more information in read --nchars test
To try to track down a test failure, improve the error message.
2021-11-27 11:02:03 -08:00
Aaron Gyes
fefb913857 Update tests for changed error output
2021-11-03 22:54:55 -07:00
Fabian Homborg
45714eb29d Add function scope to read as well
Fixes #8295.
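
A sketch of the new scope, presumably mirroring set --function (hypothetical example):

```fish
function demo
    echo value | read --function var
    # $var is function-scoped: it stays visible for the rest of the
    # function rather than only inside the enclosing block.
    echo $var
end
```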
2021-09-23 17:12:37 +02:00
Fabian Homborg
aa7316b6c6 checks/read: Print maximum if we fail to read it
Debugging here is a bit difficult.
2020-06-13 19:53:21 +02:00
Fabian Homborg
6990c44443 Shorten set --show output
Changes it from

```
$fish_color_user: not set in local scope
$fish_color_user: set in global scope, unexported, with 1 elements
$fish_color_user[1]: length=3 value=|080|
$fish_color_user: set in universal scope, unexported, with 1 elements
$fish_color_user[1]: length=7 value=|brgreen|

```

(with the trailing empty line, not just a newline)

to

```
$fish_color_user: set in global scope, unexported, with 1 elements
$fish_color_user[1]: |080|
$fish_color_user: set in universal scope, unexported, with 1 elements
$fish_color_user[1]: |brgreen|
```
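
For reference, output like the above comes from a call such as:

```fish
set --show fish_color_user
```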
2020-04-26 08:49:01 +02:00
Fabian Homborg
9367d4ff71 Reindent functions to remove useless quotes
This does not include checks/function.fish because that currently
includes a "; end" in a message that indent would remove, breaking the test.
2020-03-09 19:46:43 +01:00
Johannes Altmanninger
706c1a838e Fix tests for 91fcb8c42c
2020-02-29 10:48:19 +01:00
Johannes Altmanninger
91fcb8c42c Revert "read: discard IFS delimiters before the last token"
See #6650.

This reverts commit 1410f938aa.
2020-02-29 09:53:53 +01:00
Fabian Homborg
6cccfa7cf4 tests/read: Make an error more useful
It would be nice to know what the length *is* if it's not the max.
2020-02-12 22:02:32 +01:00
Fabian Homborg
e9b4f5f0ab Replace references to ".../test/root/bin/fish" in the checks
2020-02-08 11:06:36 +01:00
Fabian Homborg
e3ccc310e2 Port read test to littlecheck
This was a tad annoying because of all the messing with variables, and
because I insisted on getting it all into the existing read.fish.
2020-02-08 10:38:43 +01:00
Fabian Homborg
69b464bc37 Run fish_indent on all our fish scripts
It's now good enough to do so.

We don't allow grid-alignment:

```fish
complete -c foo -s b -l barnanana -a '(something)'
complete -c foo -s z              -a '(something)'
```

becomes

```fish
complete -c foo -s b -l barnanana -a '(something)'
complete -c foo -s z -a '(something)'
```

It's just more trouble than it is worth.

The one part I'd change:

We align `and`/`or` parts of an if-condition with the in-block code:

```fish
if true
   and false
    dosomething
end
```

becomes

```fish
if true
    and false
    dosomething
end
```

but it's not used terribly much, and if we ever fix it we can just
reindent.
2020-01-13 20:34:22 +01:00
Johannes Altmanninger
1410f938aa read: discard IFS delimiters before the last token
Do this only when splitting on IFS characters (IFS usually contains
whitespace); `read --delimiter` is unchanged and still consumes no
more than one delimiter per variable. This seems better because it
allows arbitrary delimiters in the last field.
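
A sketch of the behavior this introduced (later reverted by 91fcb8c42c above); the exact splitting shown is inferred from the description:

```fish
echo 'a   b' | read -l first second
# Before: $second kept the run of IFS whitespace preceding 'b'.
# After this change: the extra delimiters before the last token are
# discarded, so $second is just 'b'.
```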

Fixes #6406
2019-12-19 23:44:58 +01:00
Fabian Homborg
86133b0a2b Add read --tokenize
This splits a string into variables according to the shell's
tokenization rules, taking quoting, escaping, etc. into account.

This runs an automatic `unescape` on the string, so it's presented as
it would be passed to the command. E.g.

    printf '%s\n' a\ b

returns the tokens

    printf
    %s\n
    a b
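
A hedged usage sketch, assuming read's --list flag to collect every token into one variable:

```fish
echo "printf '%s\n' a\ b" | read --tokenize --list tokens
printf '%s\n' $tokens
# printf
# %s\n
# a b
```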

It might be useful to add another mode "--tokenize-raw" that doesn't
do that, but this seems to be the more useful of the two.

Fixes #3823.
2019-12-01 18:14:26 +01:00