Commit Graph

9 Commits

Author SHA1 Message Date
Nick Craig-Wood
542677d807 s3: fix --s3-versions on individual objects
Before this fix, attempting to access an S3 versioned object by name in
a subdirectory of the root would not find the object.

This fixes the problem and introduces an integration test.

See: https://forum.rclone.org/t/s3-versions-cant-retrieve-old-version/36900
2023-03-21 12:44:45 +00:00
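
At the S3 API level, fetching a particular version of an object is an
ordinary GetObject call carrying a VersionId, regardless of how deep the
key sits in the bucket. The sketch below is not rclone's code, just a
minimal aws-sdk-go illustration of that call; the region, bucket, key
and version ID are hypothetical.

```go
package main

import (
	"fmt"
	"io"
	"os"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3"
)

func main() {
	// Hypothetical region, bucket, key and version ID.
	sess := session.Must(session.NewSession(aws.NewConfig().WithRegion("us-east-1")))
	svc := s3.New(sess)

	// A versioned read is a plain GetObject with VersionId set. The key
	// here lives in a subdirectory of the bucket root, which is the case
	// the commit above fixes for --s3-versions.
	out, err := svc.GetObject(&s3.GetObjectInput{
		Bucket:    aws.String("example-bucket"),
		Key:       aws.String("subdir/file.txt"),
		VersionId: aws.String("3HL4kqCxf3vjVBH40Nrjfkd"),
	})
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer out.Body.Close()
	io.Copy(os.Stdout, out.Body)
}
```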
Josh Soref
ce3b65e6dc all: fix spelling across the project
* abcdefghijklmnopqrstuvwxyz
* accounting
* additional
* allowed
* almost
* already
* appropriately
* arise
* bandwidth
* behave
* bidirectional
* brackets
* cached
* characters
* cloud
* committing
* concatenating
* configured
* constructs
* current
* cutoff
* deferred
* different
* directory
* disposition
* dropbox
* either way
* error
* excess
* experiments
* explicitly
* externally
* files
* github
* gzipped
* hierarchies
* huffman
* hyphen
* implicitly
* independent
* insensitive
* integrity
* libraries
* literally
* metadata
* mimics
* missing
* modification
* multipart
* multiple
* nightmare
* nonexistent
* number
* obscure
* ourselves
* overridden
* potatoes
* preexisting
* priority
* received
* remote
* replacement
* represents
* reproducibility
* response
* satisfies
* sensitive
* separately
* separator
* specifying
* string
* successful
* synchronization
* syncing
* šenfeld
* take
* temporarily
* testcontents
* that
* the
* themselves
* throttling
* timeout
* transaction
* transferred
* unnecessary
* using
* webbrowser
* which
* with
* workspace

Signed-off-by: Josh Soref <2119212+jsoref@users.noreply.github.com>
2022-08-30 11:16:26 +02:00
Nick Craig-Wood
ebe86c6cec s3: add --s3-decompress flag to download gzip-encoded files
Before this change, if an object compressed with "Content-Encoding:
gzip" was downloaded, a length and hash mismatch would occur since the
Go runtime automatically decompressed the object on download.

If --s3-decompress is set, this change erases the length and hash on
compressed objects so they can be downloaded successfully, at the cost
of not being able to check the length or the hash of the downloaded
object.

If --s3-decompress is not set, the compressed files will be downloaded
as-is, providing compressed objects with intact size and hash
information.

See #2658
2022-08-05 16:45:23 +01:00
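
What --s3-decompress trades away is the ability to verify the stored
length and hash, since those describe the compressed bytes while the
client ends up reading decompressed ones. The sketch below makes that
trade-off explicit outside rclone, assuming a hypothetical region,
bucket and a key holding gzipped data: the object is fetched and
decompressed with compress/gzip, and the ContentLength and ETag that S3
reports refer to the gzipped payload rather than to the stream actually
read.

```go
package main

import (
	"compress/gzip"
	"fmt"
	"io"
	"os"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3"
)

func main() {
	// Hypothetical region, bucket and key.
	sess := session.Must(session.NewSession(aws.NewConfig().WithRegion("us-east-1")))
	svc := s3.New(sess)

	out, err := svc.GetObject(&s3.GetObjectInput{
		Bucket: aws.String("example-bucket"),
		Key:    aws.String("logs/app.log.gz"),
	})
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer out.Body.Close()

	// ContentLength and ETag describe the stored (compressed) object, so
	// they cannot be checked against the decompressed stream below; this
	// is the same trade-off --s3-decompress makes when it erases the
	// length and hash.
	fmt.Fprintf(os.Stderr, "stored size: %d\n", aws.Int64Value(out.ContentLength))

	gz, err := gzip.NewReader(out.Body)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer gz.Close()
	io.Copy(os.Stdout, gz)
}
```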
Nick Craig-Wood
4344a3e2ea s3: implement --s3-version-at flag - Fixes #1776 2022-08-05 16:45:23 +01:00
Nick Craig-Wood
81d242473a s3: implement Purge to purge versions and backend cleanup-hidden 2022-08-05 16:45:23 +01:00
Nick Craig-Wood
0ae171416f s3: implement --s3-versions flag - See #1776 2022-08-05 16:45:23 +01:00
Nick Craig-Wood
440d0cd179 s3: fix --s3-no-head panic: reflect: Elem of invalid type s3.PutObjectInput
In

22abd785eb s3: implement reading and writing of metadata #111

the reading of object information was refactored to use the
s3.HeadObjectOutput structure.

Unfortunately the code branch with `--s3-no-head` was not tested,
otherwise this panic would have been discovered.

This shows that this path is not integration tested, so this adds a
new integration test.

Fixes #6322
2022-07-18 23:38:50 +01:00
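
The panic in this commit's title is the generic error reflect produces
when Type.Elem is called on a non-pointer kind such as a bare struct.
The following is not rclone's code, just a minimal, self-contained
reproduction of that failure mode; the PutObjectInput type is a local
stand-in so the example compiles without the AWS SDK.

```go
package main

import (
	"fmt"
	"reflect"
)

// PutObjectInput stands in for the real s3.PutObjectInput struct, purely
// so this example compiles without the AWS SDK.
type PutObjectInput struct {
	Bucket *string
	Key    *string
}

// elemOf mirrors the kind of reflection helper that expects a pointer to
// a struct. Calling Elem on a struct type rather than a pointer to one
// panics with "reflect: Elem of invalid type ...", the error named in
// the commit title, so the kind is checked first.
func elemOf(v interface{}) reflect.Type {
	t := reflect.TypeOf(v)
	if t.Kind() != reflect.Ptr {
		return t // guard: already a non-pointer type, return it as-is
	}
	return t.Elem()
}

func main() {
	fmt.Println(elemOf(&PutObjectInput{})) // pointer: Elem yields main.PutObjectInput
	fmt.Println(elemOf(PutObjectInput{}))  // bare struct: guarded, no panic

	// Without the guard, the next line would panic with:
	//   reflect: Elem of invalid type main.PutObjectInput
	// reflect.TypeOf(PutObjectInput{}).Elem()
}
```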
Nick Craig-Wood
06182a3443 s3: actually compress the payload for content-type gzip test 2022-07-04 09:42:49 +01:00
Nick Craig-Wood
22abd785eb s3: implement reading and writing of metadata #111 2022-06-29 14:29:36 +01:00