Mirror of https://github.com/discourse/discourse.git, synced 2024-11-22 04:34:32 +08:00
Commit 930f51e175
* Chinese segmentation will continue to rely on cppjieba
* Japanese segmentation will use our port of TinySegmenter
* Korean currently does not rely on segmentation, which was dropped in c677877e4f
* SiteSetting.search_tokenize_chinese_japanese_korean has been split
  into SiteSetting.search_tokenize_chinese and
  SiteSetting.search_tokenize_japanese respectively (see the sketch below)
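The practical effect of the split is that Chinese and Japanese tokenization become independent toggles. Below is a minimal sketch of how the two settings could be consulted when preparing text for search indexing; the method name and structure are hypothetical illustrations, not identifiers from this commit.

# Sketch only (hypothetical helper, not part of this commit): illustrates that
# the two new settings select at most one CJK segmenter for search indexing.
def cjk_tokenizer_for_search
  if SiteSetting.search_tokenize_chinese
    :cppjieba        # Chinese segmentation continues to rely on cppjieba
  elsif SiteSetting.search_tokenize_japanese
    :tiny_segmenter  # Japanese segmentation uses the port of TinySegmenter
  else
    nil              # Korean: segmentation was dropped in c677877e4f
  end
end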
15 lines · 275 B · Ruby
# frozen_string_literal: true

# Validator for the search_tokenize_japanese site setting: Japanese
# tokenization can only be enabled while search_tokenize_chinese is off,
# since the two segmenters are mutually exclusive.
class SearchTokenizeJapaneseValidator
  def initialize(opts = {})
  end

  def valid_value?(value)
    !SiteSetting.search_tokenize_chinese
  end

  def error_message
    I18n.t("site_settings.errors.search_tokenize_chinese_enabled")
  end
end
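For illustration, a usage sketch of the validator above, e.g. from a Rails console or a spec; the assignments to SiteSetting assume the surrounding Discourse environment and are not part of this file.

# Usage sketch: Japanese tokenization may only be enabled while Chinese
# tokenization is off, so the two segmenters never run together.
validator = SearchTokenizeJapaneseValidator.new

SiteSetting.search_tokenize_chinese = true
validator.valid_value?(true)  # => false
validator.error_message       # => translated "search_tokenize_chinese_enabled" message

SiteSetting.search_tokenize_chinese = false
validator.valid_value?(true)  # => true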