FEATURE: Initial implementation of direct S3 uploads with uppy and stubs (#13787)

This adds several pieces needed for direct S3 uploads using uppy. **These changes are still not the default.** The hidden `enable_experimental_image_uploader` and `enable_direct_s3_uploads` settings must both be turned on for any of this code to be used, and even then only the User Card Background on the user profile actually uses the `uppy-image-uploader` component.

This pull request introduces a new `ExternalUploadStub` model and database table. It keeps track of files uploaded to a temporary location in S3 by the direct-to-S3 code; stubs are eventually deleted either a) when the direct upload is completed or b) after a certain period of not being used.

### Starting a direct S3 upload

When a direct S3 upload is initiated with uppy, we first request a presigned PUT URL from the new `generate-presigned-put` endpoint in `UploadsController`. This generates an S3 key in the `temp` folder inside the correct bucket path, along with any metadata passed from the client side (e.g. the SHA1 checksum described below). It also creates an `ExternalUploadStub` recording the temp object key and the details of the file being uploaded.

Once the client has this URL, uppy uploads the file directly to S3 using the presigned URL. When that upload finishes, we move on to the next stage.
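Condensed, this first stage looks roughly like the sketch below. It is illustrative only: it bypasses uppy, assumes `fetch`, and omits Discourse's CSRF and session handling. The endpoint, parameters, and SHA1 computation mirror the code in this commit, while the `startDirectS3Upload` helper is just a name for the example.

```javascript
// Illustrative sketch, not part of this commit: request a presigned PUT URL,
// then upload the file bytes directly to S3 with it.
// Assumes CSRF/session headers are handled elsewhere.
async function startDirectS3Upload(file, type) {
  // Optional browser-side SHA1, computed the same way as the UppyChecksum plugin.
  let metadata;
  if (window.isSecureContext && window.crypto?.subtle) {
    const digest = await crypto.subtle.digest("SHA-1", await file.arrayBuffer());
    metadata = {
      "sha1-checksum": Array.from(new Uint8Array(digest))
        .map((b) => b.toString(16).padStart(2, "0"))
        .join(""),
    };
  }

  // 1. Ask Discourse for a presigned PUT URL; this also creates the
  //    ExternalUploadStub recording the temp S3 object key.
  const presign = await fetch("/uploads/generate-presigned-put.json", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ file_name: file.name, type, metadata }),
  }).then((r) => r.json());

  // 2. PUT the file straight to S3 using the presigned URL.
  await fetch(presign.url, {
    method: "PUT",
    headers: { "Content-Type": file.type },
    body: file,
  });

  // Needed later to complete the upload.
  return presign.unique_identifier;
}
```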

### Completing a direct S3 upload

Once the upload to S3 is done, we call the new `complete-external-upload` route with the unique identifier of the `ExternalUploadStub` created earlier. Only the user who created the stub can complete the external upload. The `ExternalUploadManager` then follows one of two paths:

1. If the object in S3 is too large (currently 100 MB, defined by `ExternalUploadManager::DOWNLOAD_LIMIT`), we do not download it or generate a SHA1 checksum for it. Instead we create the `Upload` record via `UploadCreator`, simply copy the object to its final destination on S3, and delete the initial temp file. Several modifications to `UploadCreator` have been made to accommodate this.

2. If the object in S3 is small enough, we download it. Once the temporary S3 file is downloaded, we compare the SHA1 checksum generated by the browser with the actual SHA1 checksum of the file generated by Ruby. The browser checksum is stored as metadata on the S3 object and is generated by the `UppyChecksum` plugin; keep in mind that some browsers will not generate it due to compatibility or other issues.

    We then follow the normal `UploadCreator` path with one exception: to avoid re-uploading the file, if `UploadCreator` makes no changes to it (such as resizing), we follow the same copy-and-delete-temp path used for files that are too large.

3. Finally, we return the serialized upload record to the client (a condensed sketch of this completion call follows the list).
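The completion call is a single authenticated POST. The sketch below is illustrative only, again assuming `fetch` and omitting Discourse's CSRF and session handling; the endpoint, request field, and error shape come from the controller code in this commit, while `completeExternalUpload` is just a name for the example.

```javascript
// Illustrative sketch, not part of this commit: completes a direct S3 upload
// by promoting the ExternalUploadStub created at the start of the flow.
// Assumes CSRF/session headers are handled elsewhere.
async function completeExternalUpload(uniqueIdentifier) {
  const response = await fetch("/uploads/complete-external-upload.json", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ unique_identifier: uniqueIdentifier }),
  });

  if (!response.ok) {
    // The controller responds 422 with { errors: [...] } for checksum
    // mismatches, download failures, and promotion failures.
    const body = await response.json();
    throw new Error((body.errors || []).join("\n"));
  }

  // On success, the serialized Upload record is returned.
  return response.json();
}
```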

Several errors can occur along the way; these are handled by `UploadsController` as well.

This PR also refactors `displayErrorForUpload` to handle both uppy and jQuery file uploader errors.
Author: Martin Brennan (committed by GitHub)
Date: 2021-07-28 08:42:25 +10:00
Commit: b500949ef6 (parent 4a37612fd5)
27 changed files with 1167 additions and 80 deletions


@@ -845,7 +845,7 @@ export default Component.extend({
         this._xhr = null;

         if (!userCancelled) {
-          displayErrorForUpload(data, this.siteSettings);
+          displayErrorForUpload(data, this.siteSettings, data.files[0].name);
         }
       });
     });


@@ -75,7 +75,7 @@ export default Component.extend({
     $upload.on("fileuploadfail", (e, data) => {
       if (data.errorThrown !== "abort") {
-        displayErrorForUpload(data, this.siteSettings);
+        displayErrorForUpload(data, this.siteSettings, data.files[0].name);
       }

       this.reset();
     });


@@ -1,4 +1,5 @@
 import Component from "@ember/component";
+import { or } from "@ember/object/computed";
 import UppyUploadMixin from "discourse/mixins/uppy-upload";
 import { ajax } from "discourse/lib/ajax";
 import discourseComputed from "discourse-common/utils/decorators";
@@ -25,6 +26,8 @@ export default Component.extend(UppyUploadMixin, {
     }
   },

+  uploadingOrProcessing: or("uploading", "processing"),
+
   @discourseComputed("imageUrl", "placeholderUrl")
   showingPlaceholder(imageUrl, placeholderUrl) {
     return !imageUrl && placeholderUrl;


@@ -1,4 +1,5 @@
 import I18n from "I18n";
+import deprecated from "discourse-common/lib/deprecated";
 import bootbox from "bootbox";
 import { isAppleDevice } from "discourse/lib/utilities";
@@ -279,12 +280,29 @@ export function getUploadMarkdown(upload) {
   }
 }

-export function displayErrorForUpload(data, siteSettings) {
+export function displayErrorForUpload(data, siteSettings, fileName) {
+  if (!fileName) {
+    deprecated(
+      "Calling displayErrorForUpload without a fileName is deprecated and will be removed in a future version."
+    );
+    fileName = data.files[0].name;
+  }
   if (data.jqXHR) {
     const didError = displayErrorByResponseStatus(
       data.jqXHR.status,
       data.jqXHR.responseJSON,
-      data.files[0].name,
+      fileName,
+      siteSettings
+    );
+    if (didError) {
+      return;
+    }
+  } else if (data.body && data.status) {
+    const didError = displayErrorByResponseStatus(
+      data.status,
+      data.body,
+      fileName,
       siteSettings
     );
     if (didError) {
@@ -294,25 +312,7 @@ export function displayErrorForUpload(data, siteSettings) {
     bootbox.alert(data.errors.join("\n"));
     return;
   }
-  // otherwise, display a generic error message
-  bootbox.alert(I18n.t("post.errors.upload"));
-}
-
-export function displayErrorForUppyUpload(response, fileName, siteSettings) {
-  if (response.body.errors && response.body.errors.length > 0) {
-    bootbox.alert(response.body.errors.join("\n"));
-    return;
-  } else {
-    const didError = displayErrorByResponseStatus(
-      response.status,
-      response.body,
-      fileName,
-      siteSettings
-    );
-    if (didError) {
-      return;
-    }
-  }
   // otherwise, display a generic error message
   bootbox.alert(I18n.t("post.errors.upload"));
 }


@@ -0,0 +1,94 @@
+import { Plugin } from "@uppy/core";
+import { warn } from "@ember/debug";
+import { Promise } from "rsvp";
+
+export default class UppyChecksum extends Plugin {
+  constructor(uppy, opts) {
+    super(uppy, opts);
+    this.id = opts.id || "uppy-checksum";
+    this.capabilities = opts.capabilities;
+    this.type = "preprocessor";
+  }
+
+  canUseSubtleCrypto() {
+    if (!window.isSecureContext) {
+      this.warnPrefixed(
+        "Cannot generate cryptographic digests in an insecure context (not HTTPS)."
+      );
+      return false;
+    }
+    if (this.capabilities.isIE11) {
+      this.warnPrefixed(
+        "The required cipher suite is unavailable in Internet Explorer 11."
+      );
+      return false;
+    }
+    if (
+      !(window.crypto && window.crypto.subtle && window.crypto.subtle.digest)
+    ) {
+      this.warnPrefixed(
+        "The required cipher suite is unavailable in this browser."
+      );
+      return false;
+    }
+
+    return true;
+  }
+
+  generateChecksum(fileIds) {
+    if (!this.canUseSubtleCrypto()) {
+      return Promise.resolve();
+    }
+
+    let promises = fileIds.map((fileId) => {
+      let file = this.uppy.getFile(fileId);
+
+      this.uppy.emit("preprocess-progress", file, {
+        mode: "indeterminate",
+        message: "generating checksum",
+      });
+
+      return file.data.arrayBuffer().then((arrayBuffer) => {
+        return window.crypto.subtle
+          .digest("SHA-1", arrayBuffer)
+          .then((hash) => {
+            const hashArray = Array.from(new Uint8Array(hash));
+            const hashHex = hashArray
+              .map((b) => b.toString(16).padStart(2, "0"))
+              .join("");
+            this.uppy.setFileMeta(fileId, { sha1_checksum: hashHex });
+          })
+          .catch((err) => {
+            if (
+              err.message.toString().includes("Algorithm: Unrecognized name")
+            ) {
+              this.warnPrefixed(
+                "SHA-1 algorithm is unsupported in this browser."
+              );
+            }
+          });
+      });
+    });
+
+    const emitPreprocessCompleteForAll = () => {
+      fileIds.forEach((fileId) => {
+        const file = this.uppy.getFile(fileId);
+        this.uppy.emit("preprocess-complete", file);
+      });
+    };
+
+    return Promise.all(promises).then(emitPreprocessCompleteForAll);
+  }
+
+  warnPrefixed(message) {
+    warn(`[uppy-checksum-plugin] ${message}`);
+  }
+
+  install() {
+    this.uppy.addPreProcessor(this.generateChecksum.bind(this));
+  }
+
+  uninstall() {
+    this.uppy.removePreProcessor(this.generateChecksum.bind(this));
+  }
+}


@@ -108,7 +108,7 @@ export default Mixin.create({
     });

     $upload.on("fileuploadfail", (e, data) => {
-      displayErrorForUpload(data, this.siteSettings);
+      displayErrorForUpload(data, this.siteSettings, data.files[0].name);
       reset();
     });
   }),


@@ -1,6 +1,7 @@
 import Mixin from "@ember/object/mixin";
+import { ajax } from "discourse/lib/ajax";
 import {
-  displayErrorForUppyUpload,
+  displayErrorForUpload,
   validateUploadedFile,
 } from "discourse/lib/uploads";
 import { deepMerge } from "discourse-common/lib/object";
@@ -9,6 +10,8 @@ import I18n from "I18n";
 import Uppy from "@uppy/core";
 import DropTarget from "@uppy/drop-target";
 import XHRUpload from "@uppy/xhr-upload";
+import AwsS3 from "@uppy/aws-s3";
+import UppyChecksum from "discourse/lib/uppy-checksum-plugin";
 import { on } from "discourse-common/utils/decorators";
 import { warn } from "@ember/debug";
@@ -42,13 +45,17 @@ export default Mixin.create({
   @on("willDestroyElement")
   _destroy() {
-    this.messageBus && this.messageBus.unsubscribe("/uploads/" + this.type);
+    if (this.messageBus) {
+      this.messageBus.unsubscribe(`/uploads/${this.type}`);
+    }
     this.uppyInstance && this.uppyInstance.close();
   },

   @on("didInsertElement")
   _initialize() {
-    this.set("fileInputEl", this.element.querySelector(".hidden-upload-field"));
+    this.setProperties({
+      fileInputEl: this.element.querySelector(".hidden-upload-field"),
+    });
     this.set("allowMultipleFiles", this.fileInputEl.multiple);

     this._bindFileInputChangeListener();
@@ -78,6 +85,7 @@ export default Mixin.create({
         bypassNewUserRestriction: true,
         user: this.currentUser,
         siteSettings: this.siteSettings,
+        validateSize: true,
       },
       this.validateUploadedFilesOptions()
     );
@@ -114,24 +122,43 @@ export default Mixin.create({
     );

     this.uppyInstance.use(DropTarget, { target: this.element });
+    this.uppyInstance.use(UppyChecksum, { capabilities: this.capabilities });

     this.uppyInstance.on("progress", (progress) => {
       this.set("uploadProgress", progress);
     });

-    this.uppyInstance.on("upload-success", (_file, response) => {
-      this.uploadDone(response.body);
-      this._reset();
+    this.uppyInstance.on("upload-success", (file, response) => {
+      if (this.usingS3Uploads) {
+        this.setProperties({ uploading: false, processing: true });
+        this._completeExternalUpload(file)
+          .then((completeResponse) => {
+            this.uploadDone(completeResponse);
+            this._reset();
+          })
+          .catch((errResponse) => {
+            displayErrorForUpload(errResponse, this.siteSettings, file.name);
+            this._reset();
+          });
+      } else {
+        this.uploadDone(response.body);
+        this._reset();
+      }
     });

     this.uppyInstance.on("upload-error", (file, error, response) => {
-      displayErrorForUppyUpload(response, file.name, this.siteSettings);
+      displayErrorForUpload(response, this.siteSettings, file.name);
       this._reset();
     });

-    // later we will use the uppy direct s3 uploader based on enable_s3_uploads,
-    // for now we always just use XHR uploads
-    this._useXHRUploads();
+    if (
+      this.siteSettings.enable_s3_uploads &&
+      this.siteSettings.enable_direct_s3_uploads // hidden setting like enable_experimental_image_uploader
+    ) {
+      this._useS3Uploads();
+    } else {
+      this._useXHRUploads();
+    }
   },

   _useXHRUploads() {
@@ -143,6 +170,45 @@ export default Mixin.create({
     });
   },

+  _useS3Uploads() {
+    this.set("usingS3Uploads", true);
+    this.uppyInstance.use(AwsS3, {
+      getUploadParameters: (file) => {
+        const data = { file_name: file.name, type: this.type };
+
+        // the sha1 checksum is set by the UppyChecksum plugin, except
+        // for in cases where the browser does not support the required
+        // crypto mechanisms or an error occurs. it is an additional layer
+        // of security, and not required.
+        if (file.meta.sha1_checksum) {
+          data.metadata = { "sha1-checksum": file.meta.sha1_checksum };
+        }
+
+        return ajax(getUrl("/uploads/generate-presigned-put"), {
+          type: "POST",
+          data,
+        })
+          .then((response) => {
+            this.uppyInstance.setFileMeta(file.id, {
+              uniqueUploadIdentifier: response.unique_identifier,
+            });
+
+            return {
+              method: "put",
+              url: response.url,
+              headers: {
+                "Content-Type": file.type,
+              },
+            };
+          })
+          .catch((errResponse) => {
+            displayErrorForUpload(errResponse, this.siteSettings, file.name);
+            this._reset();
+          });
+      },
+    });
+  },
+
   _xhrUploadUrl() {
     return (
       getUrl(this.getWithDefault("uploadUrl", "/uploads")) +
@@ -172,8 +238,21 @@ export default Mixin.create({
     });
   },

+  _completeExternalUpload(file) {
+    return ajax(getUrl("/uploads/complete-external-upload"), {
+      type: "POST",
+      data: {
+        unique_identifier: file.meta.uniqueUploadIdentifier,
+      },
+    });
+  },
+
   _reset() {
     this.uppyInstance && this.uppyInstance.reset();
-    this.setProperties({ uploading: false, uploadProgress: 0 });
+    this.setProperties({
+      uploading: false,
+      processing: false,
+      uploadProgress: 0,
+    });
   },
 });


@@ -3,9 +3,9 @@
   <div class="placeholder-overlay" style={{placeholderStyle}}></div>
 {{/if}}
 <div class="image-upload-controls">
-  <label class="btn btn-default pad-left no-text {{if uploading "disabled"}}">
+  <label class="btn btn-default pad-left no-text {{if uploadingOrProcessing "disabled"}}">
     {{d-icon "far-image"}}
-    <input class="hidden-upload-field" disabled={{uploading}} type="file" accept="image/*">
+    <input class="hidden-upload-field" disabled={{uploadingOrProcessing}} type="file" accept="image/*">
   </label>
   {{#if imageUrl}}
@@ -27,6 +27,7 @@
   {{/if}}

   <span class="btn {{unless uploading "hidden"}}">{{i18n "upload_selector.uploading"}} {{uploadProgress}}%</span>
+  <span class="btn {{unless processing "hidden"}}">{{i18n "upload_selector.processing"}}</span>
 </div>

 {{#if imageUrl}}


@@ -3,6 +3,7 @@ import {
   allowsAttachments,
   allowsImages,
   authorizedExtensions,
+  displayErrorForUpload,
   getUploadMarkdown,
   isImage,
   validateUploadedFiles,
@@ -312,4 +313,57 @@ discourseModule("Unit | Utility | uploads", function () {
       "![image|100x200](/uploads/123/abcdef.ext)"
     );
   });
+
+  test("displayErrorForUpload - jquery file upload - jqXHR present", function (assert) {
+    sinon.stub(bootbox, "alert");
+    displayErrorForUpload(
+      {
+        jqXHR: { status: 422, responseJSON: { message: "upload failed" } },
+      },
+      { max_attachment_size_kb: 1024, max_image_size_kb: 1024 },
+      "test.png"
+    );
+    assert.ok(bootbox.alert.calledWith("upload failed"), "the alert is called");
+  });
+
+  test("displayErrorForUpload - jquery file upload - jqXHR missing, errors present", function (assert) {
+    sinon.stub(bootbox, "alert");
+    displayErrorForUpload(
+      {
+        errors: ["upload failed"],
+      },
+      { max_attachment_size_kb: 1024, max_image_size_kb: 1024 },
+      "test.png"
+    );
+    assert.ok(bootbox.alert.calledWith("upload failed"), "the alert is called");
+  });
+
+  test("displayErrorForUpload - jquery file upload - no errors", function (assert) {
+    sinon.stub(bootbox, "alert");
+    displayErrorForUpload(
+      {},
+      {
+        max_attachment_size_kb: 1024,
+        max_image_size_kb: 1024,
+      },
+      "test.png"
+    );
+    assert.ok(
+      bootbox.alert.calledWith(I18n.t("post.errors.upload")),
+      "the alert is called"
+    );
+  });
+
+  test("displayErrorForUpload - uppy - with response status and body", function (assert) {
+    sinon.stub(bootbox, "alert");
+    displayErrorForUpload(
+      {
+        status: 422,
+        body: { message: "upload failed" },
+      },
+      { max_attachment_size_kb: 1024, max_image_size_kb: 1024 },
+      "test.png"
+    );
+    assert.ok(bootbox.alert.calledWith("upload failed"), "the alert is called");
+  });
 });


@@ -9,8 +9,14 @@ class UploadsController < ApplicationController
   protect_from_forgery except: :show

   before_action :is_asset_path, :apply_cdn_headers, only: [:show, :show_short, :show_secure]
+  before_action :external_store_check, only: [:show_secure, :generate_presigned_put, :complete_external_upload]

   SECURE_REDIRECT_GRACE_SECONDS = 5
+  PRESIGNED_PUT_RATE_LIMIT_PER_MINUTE = 5
+
+  def external_store_check
+    return render_404 if !Discourse.store.external?
+  end

   def create
     # capture current user for block later on
@@ -125,7 +131,6 @@ class UploadsController < ApplicationController
   def show_secure
     # do not serve uploads requested via XHR to prevent XSS
     return xhr_not_allowed if request.xhr?
-    return render_404 if !Discourse.store.external?

     path_with_ext = "#{params[:path]}.#{params[:extension]}"
@@ -187,6 +192,84 @@ class UploadsController < ApplicationController
     }
   end

+  def generate_presigned_put
+    return render_404 if !SiteSetting.enable_direct_s3_uploads
+
+    RateLimiter.new(
+      current_user, "generate-presigned-put-upload-stub", PRESIGNED_PUT_RATE_LIMIT_PER_MINUTE, 1.minute
+    ).performed!
+
+    file_name = params.require(:file_name)
+    type = params.require(:type)
+
+    # don't want people posting arbitrary S3 metadata so we just take the
+    # one we need. all of these will be converted to x-amz-meta- metadata
+    # fields in S3 so it's best to use dashes in the names for consistency
+    #
+    # this metadata is baked into the presigned url and is not altered when
+    # sending the PUT from the clientside to the presigned url
+    metadata = if params[:metadata].present?
+      meta = {}
+      if params[:metadata]["sha1-checksum"].present?
+        meta["sha1-checksum"] = params[:metadata]["sha1-checksum"]
+      end
+      meta
+    end
+
+    url = Discourse.store.signed_url_for_temporary_upload(
+      file_name, metadata: metadata
+    )
+    key = Discourse.store.path_from_url(url)
+
+    upload_stub = ExternalUploadStub.create!(
+      key: key,
+      created_by: current_user,
+      original_filename: file_name,
+      upload_type: type
+    )
+
+    render json: { url: url, key: key, unique_identifier: upload_stub.unique_identifier }
+  end
+
+  def complete_external_upload
+    return render_404 if !SiteSetting.enable_direct_s3_uploads
+
+    unique_identifier = params.require(:unique_identifier)
+    external_upload_stub = ExternalUploadStub.find_by(
+      unique_identifier: unique_identifier, created_by: current_user
+    )
+    return render_404 if external_upload_stub.blank?
+    raise Discourse::InvalidAccess if external_upload_stub.created_by_id != current_user.id
+
+    external_upload_manager = ExternalUploadManager.new(external_upload_stub)
+
+    hijack do
+      begin
+        upload = external_upload_manager.promote_to_upload!
+        if upload.errors.empty?
+          external_upload_manager.destroy!
+          render json: UploadsController.serialize_upload(upload), status: 200
+        else
+          render_json_error(upload.errors.to_hash.values.flatten, status: 422)
+        end
+      rescue ExternalUploadManager::ChecksumMismatchError => err
+        debug_upload_error(err, "upload.checksum_mismatch_failure")
+        render_json_error(I18n.t("upload.failed"), status: 422)
+      rescue ExternalUploadManager::CannotPromoteError => err
+        debug_upload_error(err, "upload.cannot_promote_failure")
+        render_json_error(I18n.t("upload.failed"), status: 422)
+      rescue ExternalUploadManager::DownloadFailedError, Aws::S3::Errors::NotFound => err
+        debug_upload_error(err, "upload.download_failure")
+        render_json_error(I18n.t("upload.failed"), status: 422)
+      rescue => err
+        Discourse.warn_exception(
+          err, message: "Complete external upload failed unexpectedly for user #{current_user.id}"
+        )
+        render_json_error(I18n.t("upload.failed"), status: 422)
+      end
+    end
+  end
+
   protected

   def force_download?
@@ -274,4 +357,8 @@ class UploadsController < ApplicationController
     send_file(file_path, opts)
   end

+  def debug_upload_error(err, translation_key)
+    return if !SiteSetting.enable_upload_debug_mode
+    Discourse.warn_exception(err, message: I18n.t(translation_key))
+  end
 end


@@ -45,6 +45,8 @@ module Jobs
         end
       end

+      ExternalUploadStub.cleanup!
+
       self.last_cleanup = Time.zone.now.to_i
     end


@@ -0,0 +1,68 @@
+# frozen_string_literal: true
+
+require "digest/sha1"
+
+class ExternalUploadStub < ActiveRecord::Base
+  CREATED_EXPIRY_HOURS = 1
+  UPLOADED_EXPIRY_HOURS = 24
+
+  belongs_to :created_by, class_name: 'User'
+
+  scope :expired_created, -> {
+    where(
+      "status = ? AND created_at <= ?",
+      ExternalUploadStub.statuses[:created],
+      CREATED_EXPIRY_HOURS.hours.ago
+    )
+  }
+
+  scope :expired_uploaded, -> {
+    where(
+      "status = ? AND created_at <= ?",
+      ExternalUploadStub.statuses[:uploaded],
+      UPLOADED_EXPIRY_HOURS.hours.ago
+    )
+  }
+
+  before_create do
+    self.unique_identifier = SecureRandom.uuid
+    self.status = ExternalUploadStub.statuses[:created] if self.status.blank?
+  end
+
+  def self.statuses
+    @statuses ||= Enum.new(
+      created: 1,
+      uploaded: 2,
+      failed: 3
+    )
+  end
+
+  # TODO (martin): Lifecycle rule would be best to clean stuff up in the external
+  # systems, I don't think we really want to be calling out to the external systems
+  # here right?
+  def self.cleanup!
+    expired_created.delete_all
+    expired_uploaded.delete_all
+  end
+end
+
+# == Schema Information
+#
+# Table name: external_upload_stubs
+#
+#  id                :bigint           not null, primary key
+#  key               :string           not null
+#  original_filename :string           not null
+#  status            :integer          default(1), not null
+#  unique_identifier :uuid             not null
+#  created_by_id     :integer          not null
+#  upload_type       :string           not null
+#  created_at        :datetime         not null
+#  updated_at        :datetime         not null
+#
+# Indexes
+#
+#  index_external_upload_stubs_on_created_by_id  (created_by_id)
+#  index_external_upload_stubs_on_key            (key) UNIQUE
+#  index_external_upload_stubs_on_status         (status)
+#


@@ -0,0 +1,84 @@
+# frozen_string_literal: true
+
+class ExternalUploadManager
+  DOWNLOAD_LIMIT = 100.megabytes
+
+  class ChecksumMismatchError < StandardError; end
+  class DownloadFailedError < StandardError; end
+  class CannotPromoteError < StandardError; end
+
+  attr_reader :external_upload_stub
+
+  def initialize(external_upload_stub)
+    @external_upload_stub = external_upload_stub
+  end
+
+  def can_promote?
+    external_upload_stub.status == ExternalUploadStub.statuses[:created]
+  end
+
+  def promote_to_upload!
+    raise CannotPromoteError if !can_promote?
+    external_upload_stub.update!(status: ExternalUploadStub.statuses[:uploaded])
+
+    external_stub_object = Discourse.store.object_from_path(external_upload_stub.key)
+    external_etag = external_stub_object.etag
+    external_size = external_stub_object.size
+    external_sha1 = external_stub_object.metadata["sha1-checksum"]
+
+    # This could be legitimately nil, if it's too big to download on the
+    # server, or it could have failed. To this end we set a should_download
+    # variable as well to check.
+    tempfile = nil
+    should_download = external_size < DOWNLOAD_LIMIT
+    if should_download
+      tempfile = download(external_upload_stub.key, external_upload_stub.upload_type)
+
+      raise DownloadFailedError if tempfile.blank?
+
+      actual_sha1 = Upload.generate_digest(tempfile)
+      if external_sha1 && external_sha1 != actual_sha1
+        raise ChecksumMismatchError
+      end
+    end
+
+    # TODO (martin): See if these additional opts will be needed
+    #
+    # for_private_message: for_private_message,
+    # for_site_setting: for_site_setting,
+    # pasted: pasted,
+    #
+    # also check if retain_hours is needed
+    opts = {
+      type: external_upload_stub.upload_type,
+      existing_external_upload_key: external_upload_stub.key,
+      external_upload_too_big: external_size > DOWNLOAD_LIMIT,
+      filesize: external_size
+    }
+
+    UploadCreator.new(tempfile, external_upload_stub.original_filename, opts).create_for(
+      external_upload_stub.created_by_id
+    )
+  rescue
+    external_upload_stub.update!(status: ExternalUploadStub.statuses[:failed])
+    raise
+  ensure
+    tempfile&.close!
+  end
+
+  def destroy!
+    external_upload_stub.destroy!
+  end
+
+  private
+
+  def download(key, type)
+    url = Discourse.store.signed_url_for_path(external_upload_stub.key)
+    FileHelper.download(
+      url,
+      max_file_size: DOWNLOAD_LIMIT,
+      tmp_file_name: "discourse-upload-#{type}",
+      follow_redirect: true
+    )
+  end
+end


@@ -4019,8 +4019,12 @@ en:
       store_failure: "Failed to store upload #%{upload_id} for user #%{user_id}."
       file_missing: "Sorry, you must provide a file to upload."
       empty: "Sorry, but the file you provided is empty."
+      failed: "Sorry, but your upload failed. Please try again."
       png_to_jpg_conversion_failure_message: "An error happened when converting from PNG to JPG."
       optimize_failure_message: "An error occurred while optimizing the uploaded image."
+      download_failure: "Downloading the file from the external provider failed."
+      checksum_mismatch_failure: "The checksum of the file you uploaded does not match. The file contents may have changed on upload. Please try again."
+      cannot_promote_failure: "The upload cannot be completed, it may have already completed or previously failed."
       attachments:
         too_large: "Sorry, the file you are trying to upload is too big (maximum size is %{max_size_kb}KB)."
       images:


@@ -537,6 +537,9 @@ Discourse::Application.routes.draw do
     post "uploads" => "uploads#create"
     post "uploads/lookup-urls" => "uploads#lookup_urls"

+    post "uploads/generate-presigned-put" => "uploads#generate_presigned_put"
+    post "uploads/complete-external-upload" => "uploads#complete_external_upload"
+
     # used to download original images
     get "uploads/:site/:sha(.:extension)" => "uploads#show", constraints: { site: /\w+/, sha: /\h{40}/, extension: /[a-z0-9\._]+/i }
     get "uploads/short-url/:base62(.:extension)" => "uploads#show_short", constraints: { site: /\w+/, base62: /[a-zA-Z0-9]+/, extension: /[a-zA-Z0-9\._-]+/i }, as: :upload_short


@@ -271,6 +271,13 @@ basic:
     client: true
     default: false
     hidden: true
+  enable_direct_s3_uploads:
+    client: true
+    default: false
+    hidden: true
+  enable_upload_debug_mode:
+    default: false
+    hidden: true
   default_theme_id:
     default: -1
     hidden: true


@@ -0,0 +1,18 @@
+# frozen_string_literal: true
+
+class CreateExternalUploadStubs < ActiveRecord::Migration[6.1]
+  def change
+    create_table :external_upload_stubs do |t|
+      t.string :key, null: false
+      t.string :original_filename, null: false
+      t.integer :status, default: 1, null: false, index: true
+      t.uuid :unique_identifier, null: false
+      t.integer :created_by_id, null: false, index: true
+      t.string :upload_type, null: false
+
+      t.timestamps
+    end
+
+    add_index :external_upload_stubs, [:key], unique: true
+  end
+end


@@ -5,6 +5,7 @@ module FileStore
   class BaseStore
     UPLOAD_PATH_REGEX = %r|/(original/\d+X/.*)|
    OPTIMIZED_IMAGE_PATH_REGEX = %r|/(optimized/\d+X/.*)|
+    TEMPORARY_UPLOAD_PREFIX ||= "temp/"

     def store_upload(file, upload, content_type = nil)
       upload.url = nil
@@ -40,6 +41,15 @@ module FileStore
       File.join(path, "test_#{ENV['TEST_ENV_NUMBER'].presence || '0'}")
     end

+    def temporary_upload_path(file_name)
+      File.join(
+        upload_path,
+        TEMPORARY_UPLOAD_PREFIX,
+        SecureRandom.hex,
+        file_name
+      )
+    end
+
     def has_been_uploaded?(url)
       not_implemented
     end


@@ -35,6 +35,22 @@ module FileStore
       url
     end

+    def move_existing_stored_upload(existing_external_upload_key, upload, content_type = nil)
+      upload.url = nil
+      path = get_path_for_upload(upload)
+      url, upload.etag = store_file(
+        nil,
+        path,
+        filename: upload.original_filename,
+        content_type: content_type,
+        cache_locally: false,
+        private_acl: upload.secure?,
+        move_existing: true,
+        existing_external_upload_key: existing_external_upload_key
+      )
+      url
+    end
+
     def store_optimized_image(file, optimized_image, content_type = nil, secure: false)
       optimized_image.url = nil
       path = get_path_for_optimized_image(optimized_image)
@@ -42,10 +58,18 @@ module FileStore
       url
     end

+    # File is an actual Tempfile on disk
+    #
+    # An existing_external_upload_key is given for cases where move_existing is specified.
+    # This is an object already uploaded directly to S3 that we are now moving
+    # to its final resting place with the correct sha and key.
+    #
     # options
     #   - filename
     #   - content_type
     #   - cache_locally
+    #   - move_existing
+    #   - existing_external_upload_key
     def store_file(file, path, opts = {})
       path = path.dup
@@ -72,7 +96,16 @@ module FileStore
       path.prepend(File.join(upload_path, "/")) if Rails.configuration.multisite

       # if this fails, it will throw an exception
-      path, etag = s3_helper.upload(file, path, options)
+      if opts[:move_existing] && opts[:existing_external_upload_key]
+        path, etag = s3_helper.copy(
+          opts[:existing_external_upload_key],
+          path,
+          options: options
+        )
+        s3_helper.delete_object(opts[:existing_external_upload_key])
+      else
+        path, etag = s3_helper.upload(file, path, options)
+      end

       # return the upload url and etag
       [File.join(absolute_base_url, path), etag]
@@ -162,10 +195,14 @@ module FileStore
     def url_for(upload, force_download: false)
       upload.secure? || force_download ?
-        presigned_url(get_upload_key(upload), force_download: force_download, filename: upload.original_filename) :
+        presigned_get_url(get_upload_key(upload), force_download: force_download, filename: upload.original_filename) :
         upload.url
     end

+    def path_from_url(url)
+      URI.parse(url).path.delete_prefix("/")
+    end
+
     def cdn_url(url)
       return url if SiteSetting.Upload.s3_cdn_url.blank?
       schema = url[/^(https?:)?\/\//, 1]
@@ -175,7 +212,21 @@ module FileStore
     def signed_url_for_path(path, expires_in: S3Helper::DOWNLOAD_URL_EXPIRES_AFTER_SECONDS, force_download: false)
       key = path.sub(absolute_base_url + "/", "")
-      presigned_url(key, expires_in: expires_in, force_download: force_download)
+      presigned_get_url(key, expires_in: expires_in, force_download: force_download)
+    end
+
+    def signed_url_for_temporary_upload(file_name, expires_in: S3Helper::UPLOAD_URL_EXPIRES_AFTER_SECONDS, metadata: {})
+      key = temporary_upload_path(file_name)
+      presigned_put_url(key, expires_in: expires_in, metadata: metadata)
+    end
+
+    def temporary_upload_path(file_name)
+      path = super(file_name)
+      s3_bucket_folder_path.nil? ? path : File.join(s3_bucket_folder_path, path)
+    end
+
+    def object_from_path(path)
+      s3_helper.object(path)
     end

     def cache_avatar(avatar, user_id)
@@ -248,7 +299,19 @@ module FileStore
     private

-    def presigned_url(
+    def presigned_put_url(key, expires_in: S3Helper::UPLOAD_URL_EXPIRES_AFTER_SECONDS, metadata: {})
+      signer = Aws::S3::Presigner.new(client: s3_helper.s3_client)
+      signer.presigned_url(
+        :put_object,
+        bucket: s3_bucket_name,
+        key: key,
+        acl: "private",
+        expires_in: expires_in,
+        metadata: metadata
+      )
+    end
+
+    def presigned_get_url(
       url,
       force_download: false,
       filename: false,
@@ -262,7 +325,7 @@ module FileStore
         )
       end

-      obj = s3_helper.object(url)
+      obj = object_from_path(url)
       obj.presigned_url(:get, opts)
     end
@@ -276,7 +339,7 @@ module FileStore
     def update_ACL(key, secure)
       begin
-        s3_helper.object(key).acl.put(acl: secure ? "private" : "public-read")
+        object_from_path(key).acl.put(acl: secure ? "private" : "public-read")
       rescue Aws::S3::Errors::NoSuchKey
         Rails.logger.warn("Could not update ACL on upload with key: '#{key}'. Upload is missing.")
       end


@@ -15,7 +15,13 @@ class S3Helper
   # * cache time for secure-media URLs
   # * expiry time for S3 presigned URLs, which include backup downloads and
   #   any upload that has a private ACL (e.g. secure uploads)
-  DOWNLOAD_URL_EXPIRES_AFTER_SECONDS ||= 300
+  DOWNLOAD_URL_EXPIRES_AFTER_SECONDS ||= 5.minutes.to_i
+
+  ##
+  # Controls the following:
+  #
+  # * presigned put_object URLs for direct S3 uploads
+  UPLOAD_URL_EXPIRES_AFTER_SECONDS ||= 10.minutes.to_i

   def initialize(s3_bucket_name, tombstone_prefix = '', options = {})
     @s3_client = options.delete(:client)
@@ -80,6 +86,7 @@ class S3Helper
   end

   def copy(source, destination, options: {})
+    destination = get_path_for_s3_upload(destination)
     if !Rails.configuration.multisite
       options[:copy_source] = File.join(@s3_bucket_name, source)
     else
@@ -94,9 +101,9 @@ class S3Helper
         options[:copy_source] = File.join(@s3_bucket_name, multisite_upload_path, source)
       end
    end

-    s3_bucket
-      .object(destination)
-      .copy_from(options)
+    response = s3_bucket.object(destination).copy_from(options)
+
+    [destination, response.copy_object_result.etag.gsub('"', '')]
   end
# make sure we have a cors config for assets # make sure we have a cors config for assets


@@ -30,6 +30,7 @@ class UploadCreator
     @filename = (filename || "").gsub(/[^[:print:]]/, "")
     @upload = Upload.new(original_filename: @filename, filesize: 0)
     @opts = opts
+    @filesize = @opts[:filesize] if @opts[:external_upload_too_big]
     @opts[:validate] = opts[:skip_validations].present? ? !ActiveRecord::Type::Boolean.new.cast(opts[:skip_validations]) : true
   end
@@ -44,14 +45,22 @@ class UploadCreator
     is_image ||= @image_info && FileHelper.is_supported_image?("test.#{@image_info.type}")
     is_image = false if @opts[:for_theme]

+    # if this is present then it means we are creating an upload record from
+    # an external_upload_stub and the file is > ExternalUploadManager::DOWNLOAD_LIMIT,
+    # so we have not downloaded it to a tempfile. no modifications can be made to the
+    # file in this case because it does not exist; we simply move it to its new location
+    # in S3
+    external_upload_too_big = @opts[:external_upload_too_big]
+    sha1_before_changes = Upload.generate_digest(@file) if @file
+
     DistributedMutex.synchronize("upload_#{user_id}_#{@filename}") do
       # We need to convert HEIFs early because FastImage does not consider them as images
-      if convert_heif_to_jpeg?
+      if convert_heif_to_jpeg? && !external_upload_too_big
         convert_heif!
         is_image = FileHelper.is_supported_image?("test.#{@image_info.type}")
       end

-      if is_image
+      if is_image && !external_upload_too_big
         extract_image_info!
         return @upload if @upload.errors.present?
@@ -72,14 +81,18 @@ class UploadCreator
       # compute the sha of the file and generate a unique hash
       # which is only used for secure uploads
-      sha1 = Upload.generate_digest(@file)
-      unique_hash = SecureRandom.hex(20) if SiteSetting.secure_media
+      if !external_upload_too_big
+        sha1 = Upload.generate_digest(@file)
+      end
+      if SiteSetting.secure_media || external_upload_too_big
+        unique_hash = generate_fake_sha1_hash
+      end

       # we do not check for duplicate uploads if secure media is
       # enabled because we use a unique access hash to differentiate
       # between uploads instead of the sha1, and to get around various
       # access/permission issues for uploads
-      if !SiteSetting.secure_media
+      if !SiteSetting.secure_media && !external_upload_too_big
         # do we already have that upload?
         @upload = Upload.find_by(sha1: sha1)
@@ -99,7 +112,7 @@ class UploadCreator
       fixed_original_filename = nil

-      if is_image
+      if is_image && !external_upload_too_big
         current_extension = File.extname(@filename).downcase.sub("jpeg", "jpg")
         expected_extension = ".#{image_type}".downcase.sub("jpeg", "jpg")
@@ -117,13 +130,13 @@ class UploadCreator
       @upload.user_id           = user_id
       @upload.original_filename = fixed_original_filename || @filename
       @upload.filesize          = filesize
-      @upload.sha1              = SiteSetting.secure_media? ? unique_hash : sha1
+      @upload.sha1              = (SiteSetting.secure_media? || external_upload_too_big) ? unique_hash : sha1
       @upload.original_sha1     = SiteSetting.secure_media? ? sha1 : nil
       @upload.url               = ""
       @upload.origin            = @opts[:origin][0...1000] if @opts[:origin]
       @upload.extension         = image_type || File.extname(@filename)[1..10]

-      if is_image
+      if is_image && !external_upload_too_big
         if @image_info.type.to_s == 'svg'
           w, h = [0, 0]
@@ -157,12 +170,35 @@ class UploadCreator
       # Callbacks using this event to validate the upload or the file must add errors to the
       # upload errors object.
-      DiscourseEvent.trigger(:before_upload_creation, @file, is_image, @upload, @opts[:validate])
+      #
+      # Can't do any validation of the file if external_upload_too_big because we don't have
+      # the actual file.
+      if !external_upload_too_big
+        DiscourseEvent.trigger(:before_upload_creation, @file, is_image, @upload, @opts[:validate])
+      end
       return @upload unless @upload.errors.empty? && @upload.save(validate: @opts[:validate])

-      # store the file and update its url
-      File.open(@file.path) do |f|
-        url = Discourse.store.store_upload(f, @upload)
+      should_move = false
+      upload_changed = \
+        if external_upload_too_big
+          false
+        else
+          Upload.generate_digest(@file) != sha1_before_changes
+        end
+
+      if @opts[:existing_external_upload_key] && Discourse.store.external?
+        should_move = external_upload_too_big || !upload_changed
+      end
+
+      if should_move
+        # move the file in the store instead of reuploading
+        url = Discourse.store.move_existing_stored_upload(@opts[:existing_external_upload_key], @upload)
+      else
+        # store the file and update its url
+        File.open(@file.path) do |f|
+          url = Discourse.store.store_upload(f, @upload)
+        end
+      end

       if url.present?
         @upload.url = url
@@ -170,7 +206,6 @@ class UploadCreator
       else
         @upload.errors.add(:url, I18n.t("upload.store_failure", upload_id: @upload.id, user_id: user_id))
       end
-      end

       if @upload.errors.empty? && is_image && @opts[:type] == "avatar" && @upload.extension != "svg"
         Jobs.enqueue(:create_avatar_thumbnails, upload_id: @upload.id)
@@ -430,7 +465,7 @@ class UploadCreator
   end

   def filesize
-    File.size?(@file.path).to_i
+    @filesize || File.size?(@file.path).to_i
   end

   def max_image_size
@@ -484,4 +519,7 @@ class UploadCreator
     end
   end

+  def generate_fake_sha1_hash
+    SecureRandom.hex(20)
+  end
 end


@@ -154,9 +154,7 @@ describe FileStore::S3Store do
         s3_bucket.expects(:object).with(destination).returns(s3_object)

-        s3_object.expects(:copy_from).with(
-          copy_source: "s3-upload-bucket/#{source}"
-        )
+        expect_copy_from(s3_object, "s3-upload-bucket/#{source}")

         store.copy_file(upload.url, source, destination)
       end
@@ -171,7 +169,7 @@ describe FileStore::S3Store do
        s3_object = stub

         s3_bucket.expects(:object).with("tombstone/original/1X/#{upload.sha1}.png").returns(s3_object)
-        s3_object.expects(:copy_from).with(copy_source: "s3-upload-bucket/original/1X/#{upload.sha1}.png")
+        expect_copy_from(s3_object, "s3-upload-bucket/original/1X/#{upload.sha1}.png")
         s3_bucket.expects(:object).with("original/1X/#{upload.sha1}.png").returns(s3_object)
         s3_object.expects(:delete)
@@ -188,7 +186,7 @@ describe FileStore::S3Store do
         s3_object = stub

         s3_bucket.expects(:object).with("tombstone/#{path}").returns(s3_object)
-        s3_object.expects(:copy_from).with(copy_source: "s3-upload-bucket/#{path}")
+        expect_copy_from(s3_object, "s3-upload-bucket/#{path}")
         s3_bucket.expects(:object).with(path).returns(s3_object)
         s3_object.expects(:delete)
@@ -206,7 +204,7 @@ describe FileStore::S3Store do
         s3_object = stub

         s3_bucket.expects(:object).with("discourse-uploads/tombstone/original/1X/#{upload.sha1}.png").returns(s3_object)
-        s3_object.expects(:copy_from).with(copy_source: "s3-upload-bucket/discourse-uploads/original/1X/#{upload.sha1}.png")
+        expect_copy_from(s3_object, "s3-upload-bucket/discourse-uploads/original/1X/#{upload.sha1}.png")
         s3_bucket.expects(:object).with("discourse-uploads/original/1X/#{upload.sha1}.png").returns(s3_object)
         s3_object.expects(:delete)
@@ -231,7 +229,7 @@ describe FileStore::S3Store do
         s3_object = stub

         s3_bucket.expects(:object).with("tombstone/#{image_path}").returns(s3_object)
-        s3_object.expects(:copy_from).with(copy_source: "s3-upload-bucket/#{image_path}")
+        expect_copy_from(s3_object, "s3-upload-bucket/#{image_path}")
         s3_bucket.expects(:object).with("#{image_path}").returns(s3_object)
         s3_object.expects(:delete)
@@ -257,9 +255,7 @@ describe FileStore::S3Store do
           .with("discourse-uploads/tombstone/#{image_path}")
           .returns(s3_object)
-        s3_object.expects(:copy_from).with(
-          copy_source: "s3-upload-bucket/discourse-uploads/#{image_path}"
-        )
+        expect_copy_from(s3_object, "s3-upload-bucket/discourse-uploads/#{image_path}")

         s3_bucket.expects(:object).with(
           "discourse-uploads/#{image_path}"
@@ -425,4 +421,12 @@ describe FileStore::S3Store do
       expect(store.signed_url_for_path("special/optimized/file.png")).not_to eq(upload.url)
     end
   end
+
+  def expect_copy_from(s3_object, source)
+    s3_object.expects(:copy_from).with(
+      copy_source: source
+    ).returns(
+      stub(copy_object_result: stub(etag: '"etagtest"'))
+    )
+  end
 end


@@ -0,0 +1,19 @@
+# frozen_string_literal: true
+
+Fabricator(:external_upload_stub) do
+  created_by { Fabricate(:user) }
+  original_filename "test.txt"
+  key { Discourse.store.temporary_upload_path("test.txt") }
+  upload_type "card_background"
+  status 1
+end
+
+Fabricator(:image_external_upload_stub, from: :external_upload_stub) do
+  original_filename "logo.png"
+  key { Discourse.store.temporary_upload_path("logo.png") }
+end
+
+Fabricator(:attachment_external_upload_stub, from: :external_upload_stub) do
+  original_filename "file.pdf"
+  key { Discourse.store.temporary_upload_path("file.pdf") }
+end


@@ -305,4 +305,13 @@ describe Jobs::CleanUpUploads do
     expect(Upload.exists?(id: expired_upload.id)).to eq(false)
     expect(Upload.exists?(id: badge_image.id)).to eq(true)
   end
+
+  it "deletes external upload stubs that have expired" do
+    external_stub1 = Fabricate(:external_upload_stub, status: ExternalUploadStub.statuses[:created], created_at: 10.minutes.ago)
+    external_stub2 = Fabricate(:external_upload_stub, status: ExternalUploadStub.statuses[:created], created_at: (ExternalUploadStub::CREATED_EXPIRY_HOURS.hours + 10.minutes).ago)
+    external_stub3 = Fabricate(:external_upload_stub, status: ExternalUploadStub.statuses[:uploaded], created_at: 10.minutes.ago)
+    external_stub4 = Fabricate(:external_upload_stub, status: ExternalUploadStub.statuses[:uploaded], created_at: (ExternalUploadStub::UPLOADED_EXPIRY_HOURS.hours + 10.minutes).ago)
+
+    Jobs::CleanUpUploads.new.execute(nil)
+
+    expect(ExternalUploadStub.pluck(:id)).to contain_exactly(external_stub1.id, external_stub3.id)
+  end
 end
end end


@@ -113,7 +113,7 @@ RSpec.describe 'Multisite s3 uploads', type: :multisite do
         s3_object = stub

         s3_bucket.expects(:object).with("uploads/tombstone/default/original/1X/#{upload.sha1}.png").returns(s3_object)
-        s3_object.expects(:copy_from).with(copy_source: "s3-upload-bucket/#{upload_path}/original/1X/#{upload.sha1}.png")
+        expect_copy_from(s3_object, "s3-upload-bucket/#{upload_path}/original/1X/#{upload.sha1}.png")
         s3_bucket.expects(:object).with("#{upload_path}/original/1X/#{upload.sha1}.png").returns(s3_object)
         s3_object.expects(:delete)
@@ -129,7 +129,7 @@ RSpec.describe 'Multisite s3 uploads', type: :multisite do
         s3_object = stub

         s3_bucket.expects(:object).with("uploads/tombstone/second/original/1X/#{upload.sha1}.png").returns(s3_object)
-        s3_object.expects(:copy_from).with(copy_source: "s3-upload-bucket/#{upload_path}/original/1X/#{upload.sha1}.png")
+        expect_copy_from(s3_object, "s3-upload-bucket/#{upload_path}/original/1X/#{upload.sha1}.png")
         s3_bucket.expects(:object).with("#{upload_path}/original/1X/#{upload.sha1}.png").returns(s3_object)
         s3_object.expects(:delete)
@@ -150,7 +150,7 @@ RSpec.describe 'Multisite s3 uploads', type: :multisite do
         s3_object = stub

         s3_bucket.expects(:object).with("discourse-uploads/uploads/tombstone/default/original/1X/#{upload.sha1}.png").returns(s3_object)
-        s3_object.expects(:copy_from).with(copy_source: "s3-upload-bucket/discourse-uploads/#{upload_path}/original/1X/#{upload.sha1}.png")
+        expect_copy_from(s3_object, "s3-upload-bucket/discourse-uploads/#{upload_path}/original/1X/#{upload.sha1}.png")
         s3_bucket.expects(:object).with("discourse-uploads/#{upload_path}/original/1X/#{upload.sha1}.png").returns(s3_object)
         s3_object.expects(:delete)
@@ -298,4 +298,60 @@ RSpec.describe 'Multisite s3 uploads', type: :multisite do
       expect(store.has_been_uploaded?(link)).to eq(false)
     end
   end
+
+  describe "#signed_url_for_temporary_upload" do
+    before do
+      setup_s3
+    end
+
+    let(:store) { FileStore::S3Store.new }
+
+    context "for a bucket with no folder path" do
+      before { SiteSetting.s3_upload_bucket = "s3-upload-bucket" }
+
+      it "returns a presigned url with the correct params and the key for the temporary file" do
+        url = store.signed_url_for_temporary_upload("test.png")
+        key = store.path_from_url(url)
+        expect(url).to match(/Amz-Expires/)
+        expect(key).to match(/uploads\/default\/test_[0-9]\/temp\/[a-zA-z0-9]{0,32}\/test.png/)
+      end
+
+      it "presigned url contains the metadata when provided" do
+        url = store.signed_url_for_temporary_upload("test.png", metadata: { "test-meta": "testing" })
+        expect(url).to include("&x-amz-meta-test-meta=testing")
+      end
+    end
+
+    context "for a bucket with a folder path" do
+      before { SiteSetting.s3_upload_bucket = "s3-upload-bucket/site" }
+
+      it "returns a presigned url with the correct params and the key for the temporary file" do
+        url = store.signed_url_for_temporary_upload("test.png")
+        key = store.path_from_url(url)
+        expect(url).to match(/Amz-Expires/)
+        expect(key).to match(/site\/uploads\/default\/test_[0-9]\/temp\/[a-zA-z0-9]{0,32}\/test.png/)
+      end
+    end
+
+    context "for a multisite site" do
+      before { SiteSetting.s3_upload_bucket = "s3-upload-bucket/standard99" }
+
+      it "returns a presigned url with the correct params and the key for the temporary file" do
+        test_multisite_connection('second') do
+          url = store.signed_url_for_temporary_upload("test.png")
+          key = store.path_from_url(url)
+          expect(url).to match(/Amz-Expires/)
+          expect(key).to match(/standard99\/uploads\/second\/test_[0-9]\/temp\/[a-zA-z0-9]{0,32}\/test.png/)
+        end
+      end
+    end
+  end
+
+  def expect_copy_from(s3_object, source)
+    s3_object.expects(:copy_from).with(
+      copy_source: source
+    ).returns(
+      stub(copy_object_result: stub(etag: '"etagtest"'))
+    )
+  end
 end


@ -704,4 +704,159 @@ describe UploadsController do
end end
end end
end end
describe "#generate_presigned_put" do
before do
sign_in(user)
end
context "when the store is external" do
before do
SiteSetting.enable_direct_s3_uploads = true
setup_s3
end
it "errors if the correct params are not provided" do
post "/uploads/generate-presigned-put.json", params: { file_name: "test.png" }
expect(response.status).to eq(400)
post "/uploads/generate-presigned-put.json", params: { type: "card_background" }
expect(response.status).to eq(400)
end
it "generates a presigned URL and creates an external upload stub" do
post "/uploads/generate-presigned-put.json", params: { file_name: "test.png", type: "card_background" }
expect(response.status).to eq(200)
result = response.parsed_body
external_upload_stub = ExternalUploadStub.where(
unique_identifier: result["unique_identifier"],
original_filename: "test.png",
created_by: user,
upload_type: "card_background"
)
expect(external_upload_stub.exists?).to eq(true)
expect(result["key"]).to include(FileStore::S3Store::TEMPORARY_UPLOAD_PREFIX)
expect(result["url"]).to include(FileStore::S3Store::TEMPORARY_UPLOAD_PREFIX)
expect(result["url"]).to include("Amz-Expires")
end
it "includes accepted metadata in the presigned url when provided" do
post "/uploads/generate-presigned-put.json", {
params: {
file_name: "test.png",
type: "card_background",
metadata: {
"sha1-checksum" => "testing",
"blah" => "wontbeincluded"
}
}
}
expect(response.status).to eq(200)
result = response.parsed_body
expect(result['url']).to include("&x-amz-meta-sha1-checksum=testing")
expect(result['url']).not_to include("&x-amz-meta-blah=wontbeincluded")
end
it "rate limits" do
RateLimiter.enable
stub_const(UploadsController, "PRESIGNED_PUT_RATE_LIMIT_PER_MINUTE", 1) do
post "/uploads/generate-presigned-put.json", params: { file_name: "test.png", type: "card_background" }
post "/uploads/generate-presigned-put.json", params: { file_name: "test.png", type: "card_background" }
end
expect(response.status).to eq(429)
end
end
context "when the store is not external" do
it "returns 404" do
post "/uploads/generate-presigned-put.json", params: { file_name: "test.png", type: "card_background" }
expect(response.status).to eq(404)
end
end
end
describe "#complete_external_upload" do
before do
sign_in(user)
end
context "when the store is external" do
fab!(:external_upload_stub) { Fabricate(:external_upload_stub, created_by: user) }
let(:upload) { Fabricate(:upload) }
before do
SiteSetting.enable_direct_s3_uploads = true
SiteSetting.enable_upload_debug_mode = true
setup_s3
end
it "returns 404 when the upload stub does not exist" do
post "/uploads/complete-external-upload.json", params: { unique_identifier: "unknown" }
expect(response.status).to eq(404)
end
it "returns 404 when the upload stub does not belong to the user" do
external_upload_stub.update!(created_by: Fabricate(:user))
post "/uploads/complete-external-upload.json", params: { unique_identifier: external_upload_stub.unique_identifier }
expect(response.status).to eq(404)
end
it "handles ChecksumMismatchError" do
ExternalUploadManager.any_instance.stubs(:promote_to_upload!).raises(ExternalUploadManager::ChecksumMismatchError)
post "/uploads/complete-external-upload.json", params: { unique_identifier: external_upload_stub.unique_identifier }
expect(response.status).to eq(422)
expect(response.parsed_body["errors"].first).to eq(I18n.t("upload.failed"))
end
it "handles CannotPromoteError" do
ExternalUploadManager.any_instance.stubs(:promote_to_upload!).raises(ExternalUploadManager::CannotPromoteError)
post "/uploads/complete-external-upload.json", params: { unique_identifier: external_upload_stub.unique_identifier }
expect(response.status).to eq(422)
expect(response.parsed_body["errors"].first).to eq(I18n.t("upload.failed"))
end
it "handles DownloadFailedError and Aws::S3::Errors::NotFound" do
ExternalUploadManager.any_instance.stubs(:promote_to_upload!).raises(ExternalUploadManager::DownloadFailedError)
post "/uploads/complete-external-upload.json", params: { unique_identifier: external_upload_stub.unique_identifier }
expect(response.status).to eq(422)
expect(response.parsed_body["errors"].first).to eq(I18n.t("upload.failed"))
ExternalUploadManager.any_instance.stubs(:promote_to_upload!).raises(Aws::S3::Errors::NotFound.new("error", "not found"))
post "/uploads/complete-external-upload.json", params: { unique_identifier: external_upload_stub.unique_identifier }
expect(response.status).to eq(422)
expect(response.parsed_body["errors"].first).to eq(I18n.t("upload.failed"))
end
it "handles a generic upload failure" do
ExternalUploadManager.any_instance.stubs(:promote_to_upload!).raises(StandardError)
post "/uploads/complete-external-upload.json", params: { unique_identifier: external_upload_stub.unique_identifier }
expect(response.status).to eq(422)
expect(response.parsed_body["errors"].first).to eq(I18n.t("upload.failed"))
end
it "handles validation errors on the upload" do
upload.errors.add(:base, "test error")
ExternalUploadManager.any_instance.stubs(:promote_to_upload!).returns(upload)
post "/uploads/complete-external-upload.json", params: { unique_identifier: external_upload_stub.unique_identifier }
expect(response.status).to eq(422)
expect(response.parsed_body["errors"]).to eq(["test error"])
end
it "deletes the stub and returns the serialized upload when complete" do
ExternalUploadManager.any_instance.stubs(:promote_to_upload!).returns(upload)
post "/uploads/complete-external-upload.json", params: { unique_identifier: external_upload_stub.unique_identifier }
expect(ExternalUploadStub.exists?(id: external_upload_stub.id)).to eq(false)
expect(response.status).to eq(200)
expect(response.parsed_body).to eq(UploadsController.serialize_upload(upload))
end
end
context "when the store is not external" do
it "returns 404" do
post "/uploads/generate-presigned-put.json", params: { file_name: "test.png", type: "card_background" }
expect(response.status).to eq(404)
end
end
end
end
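And the matching sketch for the completion endpoint: look up the stub scoped to the current user (404 otherwise), delegate to `ExternalUploadManager`, and map every promotion failure onto a 422 with the generic `upload.failed` message. The rescue list mirrors the examples above; the shipped code may group these differently.

```ruby
# Sketch of UploadsController#complete_external_upload matching the examples
# above. Exact rescue granularity and helpers are assumptions.
def complete_external_upload
  raise Discourse::NotFound unless Discourse.store.external?

  external_upload_stub = ExternalUploadStub.find_by(
    unique_identifier: params.require(:unique_identifier),
    created_by: current_user # user scoping yields the 404 for other users' stubs
  )
  raise Discourse::NotFound if external_upload_stub.blank?

  upload = ExternalUploadManager.new(external_upload_stub).promote_to_upload!
  if upload.errors.empty?
    external_upload_stub.destroy!
    render json: UploadsController.serialize_upload(upload)
  else
    render_json_error(upload.errors.to_hash.values.flatten, status: 422)
  end
rescue ExternalUploadManager::ChecksumMismatchError,
       ExternalUploadManager::CannotPromoteError,
       ExternalUploadManager::DownloadFailedError,
       Aws::S3::Errors::NotFound,
       StandardError
  # StandardError alone would cover the named errors; they are listed to
  # mirror the spec's cases.
  render_json_error(I18n.t("upload.failed"), status: 422)
end
```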

View File

@ -0,0 +1,222 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe ExternalUploadManager do
fab!(:user) { Fabricate(:user) }
let(:type) { "card_background" }
let!(:logo_file) { file_from_fixtures("logo.png") }
let!(:pdf_file) { file_from_fixtures("large.pdf", "pdf") }
let(:object_size) { 1.megabyte }
let(:etag) { "e696d20564859cbdf77b0f51cbae999a" }
let(:client_sha1) { Upload.generate_digest(object_file) }
let(:sha1) { Upload.generate_digest(object_file) }
let(:object_file) { logo_file }
let(:metadata_headers) { {} }
let!(:external_upload_stub) { Fabricate(:image_external_upload_stub, created_by: user) }
let(:upload_base_url) { "https://#{SiteSetting.s3_upload_bucket}.s3.#{SiteSetting.s3_region}.amazonaws.com" }
subject do
ExternalUploadManager.new(external_upload_stub)
end
before do
SiteSetting.authorized_extensions += "|pdf"
SiteSetting.max_attachment_size_kb = 210.megabytes / 1000
setup_s3
stub_head_object
stub_download_object_filehelper
stub_copy_object
stub_delete_object
end
describe "#can_promote?" do
it "returns false if the external stub status is not created" do
external_upload_stub.update!(status: ExternalUploadStub.statuses[:uploaded])
expect(subject.can_promote?).to eq(false)
end
end
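The guard being tested here is deliberately narrow; a sketch, assuming the status enum used throughout the spec:

```ruby
# Sketch: a stub can only be promoted while it is still in the "created"
# state; "uploaded" or "failed" means it was already processed.
def can_promote?
  external_upload_stub.status == ExternalUploadStub.statuses[:created]
end
```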
describe "#promote_to_upload!" do
context "when stubbed upload is < DOWNLOAD_LIMIT (small enough to download + generate sha)" do
let!(:external_upload_stub) { Fabricate(:image_external_upload_stub, created_by: user) }
let(:object_size) { 1.megabyte }
let(:object_file) { logo_file }
context "when the download of the s3 file fails" do
before do
FileHelper.stubs(:download).returns(nil)
end
it "raises an error" do
expect { subject.promote_to_upload! }.to raise_error(ExternalUploadManager::DownloadFailedError)
end
end
context "when the upload is not in the created status" do
before do
external_upload_stub.update!(status: ExternalUploadStub.statuses[:uploaded])
end
it "raises an error" do
expect { subject.promote_to_upload! }.to raise_error(ExternalUploadManager::CannotPromoteError)
end
end
context "when the upload does not get changed in UploadCreator (resized etc.)" do
it "copies the stubbed upload on S3 to its new destination and deletes it" do
upload = subject.promote_to_upload!
expect(WebMock).to have_requested(
:put,
"#{upload_base_url}/original/1X/#{upload.sha1}.png",
).with(headers: { 'X-Amz-Copy-Source' => "#{SiteSetting.s3_upload_bucket}/#{external_upload_stub.key}" })
expect(WebMock).to have_requested(
:delete,
"#{upload_base_url}/#{external_upload_stub.key}"
)
end
it "errors if the image upload is too big" do
SiteSetting.max_image_size_kb = 1
upload = subject.promote_to_upload!
expect(upload.errors.full_messages).to include(
"Filesize " + I18n.t("upload.images.too_large", max_size_kb: SiteSetting.max_image_size_kb)
)
end
it "errors if the extension is not supported" do
SiteSetting.authorized_extensions = ""
upload = subject.promote_to_upload!
expect(upload.errors.full_messages).to include(
"Original filename " + I18n.t("upload.unauthorized", authorized_extensions: "")
)
end
end
context "when the upload does get changed by the UploadCreator" do
let(:file) { file_from_fixtures("should_be_jpeg.heic", "images") }
it "creates a new upload in s3 (not copy) and deletes the original stubbed upload" do
upload = subject.promote_to_upload!
expect(WebMock).to have_requested(
:put,
"#{upload_base_url}/original/1X/#{upload.sha1}.png"
)
expect(WebMock).to have_requested(
:delete, "#{upload_base_url}/#{external_upload_stub.key}"
)
end
end
context "when the sha has been set on the s3 object metadata by the clientside JS" do
let(:metadata_headers) { { "x-amz-meta-sha1-checksum" => client_sha1 } }
context "when the downloaded file sha1 does not match the client sha1" do
let(:client_sha1) { "blahblah" }
it "raises an error and marks upload as failed" do
expect { subject.promote_to_upload! }.to raise_error(ExternalUploadManager::ChecksumMismatchError)
expect(external_upload_stub.reload.status).to eq(ExternalUploadStub.statuses[:failed])
end
end
end
end
context "when stubbed upload is > DOWNLOAD_LIMIT (too big to download, generate a fake sha)" do
let(:object_size) { 200.megabytes }
let(:object_file) { pdf_file }
let!(:external_upload_stub) { Fabricate(:attachment_external_upload_stub, created_by: user) }
before do
UploadCreator.any_instance.stubs(:generate_fake_sha1_hash).returns("testbc60eb18e8f974cbfae8bb0f069c3a311024")
end
it "does not try and download the file" do
FileHelper.expects(:download).never
subject.promote_to_upload!
end
it "generates a fake sha for the upload record" do
upload = subject.promote_to_upload!
expect(upload.sha1).not_to eq(sha1)
expect(upload.original_sha1).to eq(nil)
expect(upload.filesize).to eq(object_size)
end
it "marks the stub as uploaded" do
subject.promote_to_upload!
expect(external_upload_stub.reload.status).to eq(ExternalUploadStub.statuses[:uploaded])
end
it "copies the stubbed upload on S3 to its new destination and deletes it" do
upload = subject.promote_to_upload!
expect(WebMock).to have_requested(
:put,
"#{upload_base_url}/original/1X/#{upload.sha1}.pdf"
).with(headers: { 'X-Amz-Copy-Source' => "#{SiteSetting.s3_upload_bucket}/#{external_upload_stub.key}" })
expect(WebMock).to have_requested(
:delete, "#{upload_base_url}/#{external_upload_stub.key}"
)
end
end
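The copy + delete pair asserted by the WebMock expectations in both branches comes down to an S3 server-side copy followed by removal of the temp object; a minimal aws-sdk-s3 sketch, with the bucket wiring assumed:

```ruby
require "aws-sdk-s3"

# Sketch of the server-side copy + delete the assertions above expect.
# The destination key shape follows the spec; credentials/region are assumed
# to come from site settings.
bucket = Aws::S3::Resource.new(region: SiteSetting.s3_region)
  .bucket(SiteSetting.s3_upload_bucket)

destination_key = "original/1X/#{upload.sha1}.pdf"
bucket.object(destination_key).copy_from(
  copy_source: "#{SiteSetting.s3_upload_bucket}/#{external_upload_stub.key}"
)
bucket.object(external_upload_stub.key).delete
```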
end
def stub_head_object
stub_request(
:head,
"#{upload_base_url}/#{external_upload_stub.key}"
).to_return(
status: 200,
headers: {
ETag: etag,
"Content-Length" => object_size,
"Content-Type" => "image/png",
}.merge(metadata_headers)
)
end
def stub_download_object_filehelper
signed_url = Discourse.store.signed_url_for_path(external_upload_stub.key)
uri = URI.parse(signed_url)
signed_url = uri.to_s.gsub(uri.query, "")
stub_request(:get, signed_url).with(query: hash_including({})).to_return(
status: 200,
body: object_file.read
)
end
def stub_copy_object
copy_object_result = <<~BODY
<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n
<CopyObjectResult
xmlns=\"http://s3.amazonaws.com/doc/2006-03-01/\">
<LastModified>2021-07-19T04:10:41.000Z</LastModified>
<ETag>&quot;#{etag}&quot;</ETag>
</CopyObjectResult>
BODY
stub_request(
:put,
"#{upload_base_url}/original/1X/testbc60eb18e8f974cbfae8bb0f069c3a311024.pdf"
).to_return(
status: 200,
headers: { "ETag" => etag },
body: copy_object_result
)
stub_request(
:put,
"#{upload_base_url}/original/1X/bc975735dfc6409c1c2aa5ebf2239949bcbdbd65.png"
).to_return(
status: 200,
headers: { "ETag" => etag },
body: copy_object_result
)
end
def stub_delete_object
stub_request(
:delete, "#{upload_base_url}/#{external_upload_stub.key}"
).to_return(
status: 200
)
end
end
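Condensed, the behaviour this file pins down for `promote_to_upload!` looks roughly like the sketch below: a HEAD request decides between the download-and-verify path and the too-large copy path, the client checksum (when present) is compared against a locally generated SHA1, and a mismatch flips the stub to `failed`. Method names beyond those already used in the spec are assumptions.

```ruby
# Sketch of ExternalUploadManager#promote_to_upload!, reconstructed from the
# expectations above; not the shipped implementation.
def promote_to_upload!
  raise ExternalUploadManager::CannotPromoteError unless can_promote?
  external_upload_stub.update!(status: ExternalUploadStub.statuses[:uploaded])

  s3_object = Discourse.store.object_from_path(external_upload_stub.key) # HEAD: size + metadata

  if s3_object.size > ExternalUploadManager::DOWNLOAD_LIMIT
    # Too large: skip downloading entirely; UploadCreator generates a fake
    # SHA1, and the temp object is copied to its final key, then deleted.
    file = nil
  else
    url = Discourse.store.signed_url_for_path(external_upload_stub.key)
    file = FileHelper.download(
      url,
      max_file_size: ExternalUploadManager::DOWNLOAD_LIMIT,
      tmp_file_name: "discourse-upload",
      follow_redirect: true
    )
    raise ExternalUploadManager::DownloadFailedError if file.blank?

    sha1 = Upload.generate_digest(file)
    client_sha1 = s3_object.metadata["sha1-checksum"] # set by the UppyChecksum plugin
    if client_sha1.present? && sha1 != client_sha1
      external_upload_stub.update!(status: ExternalUploadStub.statuses[:failed])
      raise ExternalUploadManager::ChecksumMismatchError
    end
  end

  # Assumed helper: UploadCreator either copies the temp object into place
  # (unchanged or too-large files) or uploads the transformed file, deleting
  # the temp object afterwards.
  create_upload(file)
end
```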