Commit a6c2be7c authored by GitLab Bot

Add latest changes from gitlab-org/gitlab@master

parent 74a2d57b
## Problem Statement
<!-- What is the problem we hope to validate and solve? -->
<!-- What is the problem we hope to validate? Reference how to write a real customer problem statement at https://productcoalition.com/how-to-write-a-good-customer-problem-statement-a815f80189ba for guidance. -->
## Reach
......
Please view this file on the master branch, on stable branches it's out of date.
## 12.8.0
### Removed (1 change)
- Remove confidence labels from security report. !24033
### Fixed (33 changes, 1 of them is from the community)
- Fix UI on Project Audit Events when the feature is not available. !16032 (Takuya Noguchi)
- Group SSO handles locked users gracefully instead of showing 500 error. !20329
- Fix incorrect security status counts. !22650
- Fix include subgroups in security status. !22653
- Make sure type is set properly in Elasticsearch query when doing global search. !22821
- Include users from all sub-projects and shared groups when counting billing seats currently in use. !22967
- Fix vulnerability finding list endpoint query timeout on instance security dashboard. !23232
- Add app validation for any-approver rule uniqueness. !23241
- Fix 500 error in global search for blob, wiki_blob and commit search. !23326
- Fix group hook triggering from subgroup project. !23333
- Change conditions when user uses license seat. !23522
- Accept group path as ID when fetching notes from API. !23535
- Fixes a bug that prevented auto-remediation on the pipeline security dashboard. !23677
- Fix nav link in security submenu. !23775
- Order epic related issues by relative_position. !23776
- Correctly display the number of approvals for a merge request. !23827
- Fix orphan issues that were promoted to epics. !23916
- Fix rendering of design management references. !24001
- Fix 500 error when browsing the roadmap page for a group the user is not authorized to view. !24002
- Use project slug instead of name for Error Tracking Settings Display. !24176
- Display error message in MR License Report if it fails to load. !24201
- Fix display logic of Security Report MR widget. !24204
- Set SSL certificates path env when calling ES indexer. !24213
- Allow submit event to trigger a new search. !24262
- Fix npm package uploads when bundleDependencies is set to false. !24355
- Resolve 500 error after Web IDE terminal use. !24443
- Added commas to current active user count when appropriate. !24549
- Hide duplicate company/individual question on trial selection. !24567
- Update invalid SPDX identifiers in software licenses table. !24829
- Cleanup deprecated package dependency links. !24868
- Fix to display a link to the logs in both embed and dashboard. !25288
- Disable self-approval at the Instance level - Fix approvals filtering. !25385
- Allow user to close sidebar while editing board list and save WIP limit.
### Changed (13 changes)
- Display generic error in codeclimate MR widget when base_path is null. !21666
- Adjust skip trial copy in trial sign up flow for SaaS users who are logged in. !22923
- Use export icon instead of download for the export button in the Dependencies List. !23094
- Apply darker color to column headers and scan names in secure features configuration. !23104
- Redacts quick actions used by support bot. !23353
- Remove Code Review Analytics feature flag. !23418
- Delete description change history - Frontend. !23568
- Support moving the design repository of a project when the project is transferred to a new namespace. !23573
- Display proper error messages on vulnerabilities fetch failure. !23812
- Add date range validation for Cycle Analytics at the backend side. !24254
- Exclude GitLab generated bot users from using a license seat. !24275
- Changes the standalone vulnerability endpoint. !24777
- Move Productivity Analytics page to the group level. !25329
### Performance (1 change)
- Geo - Fix query to retrieve Job Artifacts when selective sync is disabled. !25388
### Added (42 changes, 1 of them is from the community)
- Create DesignAtVersion model, exposing it with GraphQL. !15260
- Add Group-level compliance dashboard MVC. !20844
- Adds sorting to package api. !20963
- Allow to soft delete issuables description history. !21439
- Display warning flash if design upload is skipped. !21615
- Ask if a user is trying GitLab for their company or for individual use. !22280
- Support design tab link references for issues. !22330
- Allow using custom user name for service desk emails. !22478
- Raise exception if any namespace's runner minutes were not reset. !22636
- Adds vulnerability management state dropdown. !22823
- Add additional license information to admin dashboard. !22866
- Add sort by date to audit logs and events. !22887
- Add Group WebHooks API. !22994 (Rajendra Kadam)
- Add API route to confirm a vulnerability. !23099
- Creates the standalone vulnerability list page. !23438
- Show license badge for GitLab.com member overview. !23521
- Create audit log when username changes. !23576
- Resolve Disable self-approval at the Instance level. !23731
- Add time picker to logs page. !23837
- Introduce Credentials Inventory for Groups that enforce Group Managed Accounts. !23944
- Add API for protected environments. !23964
- Prompt users to check their account settings. !23994
- Allow to pick a subgroup to hold the Insights config. !24053
- Add health_status column to issues and epics tables. !24202
- Add a link in dashboard to allow users to go to the logs page. !24240
- Record audit event when user is deleted. !24257
- Reflect the Time Series chart's filtered time range (datazoom) in the View logs link. !24343
- Scope merge request approval rules to protected branches using API search. !24344
- Add application limit for ES indexed field length. !24345
- Add affected projects feature to instance security dashboard. !24644
- Add trial field to namespaces API. !24666
- Make elasticsearch bulk parameters configurable. !24688
- Add feature filter for users. !24765
- Design view: moveable `new comment` pin. !24769
- Record audit event when user is added. !24855
- Add group identification headers to epic emails. !24878
- Record audit event when user is blocked. !24930
- Moveable design note pins. !24934
- Add NuGet Repository. !25157
- Add single-level Epics to EE Premium. !25184
- Show View logs link in embed metrics. !25217
- Add usage ping counter for events. !199874
### Other (1 change)
- Prepare DB structure for GMA forking changes. !22002
## 12.7.5
### Fixed (1 change)
......
......@@ -5,7 +5,25 @@ module Mutations
class Update < Base
graphql_name 'UpdateIssue'
# Add arguments here instead of creating separate mutations
argument :title,
GraphQL::STRING_TYPE,
required: false,
description: copy_field_description(Types::IssueType, :title)
argument :description,
GraphQL::STRING_TYPE,
required: false,
description: copy_field_description(Types::IssueType, :description)
argument :due_date,
Types::TimeType,
required: true,
description: copy_field_description(Types::IssueType, :due_date)
argument :confidential,
GraphQL::BOOLEAN_TYPE,
required: true,
description: copy_field_description(Types::IssueType, :confidential)
def resolve(project_path:, iid:, **args)
issue = authorized_find!(project_path: project_path, iid: iid)
......
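For orientation, a call exercising the new arguments might look like the sketch below: a hedged, hypothetical example using `GitlabSchema.execute` (a wrapper over graphql-ruby's standard `Schema.execute`). The field names come from the `UpdateIssueInput` schema changes later in this diff; the project path, iid, and user lookup are assumptions for illustration only.

```ruby
# Hypothetical usage sketch, not part of the diff.
# Assumptions: an issue exists at "group/project" with iid 1, and `User.first`
# has permission to update it.
current_user = User.first

query = <<~GRAPHQL
  mutation {
    updateIssue(input: {
      projectPath: "group/project",
      iid: "1",
      title: "New title",
      description: "New description",
      confidential: true,
      dueDate: "2020-03-01T00:00:00Z"
    }) {
      issue { title confidential dueDate }
      errors
    }
  }
GRAPHQL

result = GitlabSchema.execute(query, context: { current_user: current_user })
result.to_h.dig('data', 'updateIssue', 'errors') # => [] on success
```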
......@@ -11,6 +11,7 @@ class LfsObject < ApplicationRecord
scope :with_files_stored_locally, -> { where(file_store: LfsObjectUploader::Store::LOCAL) }
scope :with_files_stored_remotely, -> { where(file_store: LfsObjectUploader::Store::REMOTE) }
scope :for_oids, -> (oids) { where(oid: oids) }
validates :oid, presence: true, uniqueness: true
......
......@@ -25,7 +25,6 @@ module Projects
private
# rubocop: disable CodeReuse/ActiveRecord
def link_existing_lfs_objects(oids)
linked_existing_objects = []
iterations = 0
......@@ -33,7 +32,7 @@ module Projects
oids.each_slice(BATCH_SIZE) do |oids_batch|
# Load all existing LFS Objects immediately so we don't issue an extra
# query for the `.any?`
existent_lfs_objects = LfsObject.where(oid: oids_batch).load
existent_lfs_objects = LfsObject.for_oids(oids_batch).load
next unless existent_lfs_objects.any?
rows = existent_lfs_objects
......@@ -49,7 +48,6 @@ module Projects
linked_existing_objects
end
# rubocop: enable CodeReuse/ActiveRecord
def log_lfs_link_results(lfs_objects_linked_count, iterations)
Gitlab::Import::Logger.info(
......
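The comment about loading the relation before calling `.any?` describes standard ActiveRecord laziness; the sketch below is illustrative only (not part of the diff) and shows the extra query that the `.load` avoids.

```ruby
# Illustrative only: why link_existing_lfs_objects calls `.load` before `.any?`.
oids_batch = %w[one two] # sample OIDs, mirroring the service spec further down

relation = LfsObject.for_oids(oids_batch) # lazy relation, no query issued yet
relation.any?                             # would run an extra `SELECT 1 AS one ... LIMIT 1`

loaded = LfsObject.for_oids(oids_batch).load # one SELECT fetching all matching rows
loaded.any?                                  # answered from the already-loaded records
```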
......@@ -27,7 +27,7 @@ module Projects
@status.run!
raise InvalidStateError, 'missing pages artifacts' unless build.artifacts?
raise InvalidStateError, 'pages are outdated' unless latest?
raise InvalidStateError, 'build SHA is outdated for this ref' unless latest?
# Create temporary directory in which we will extract the artifacts
make_secure_tmp_dir(tmp_path) do |archive_path|
......@@ -36,7 +36,7 @@ module Projects
# Check if we did extract public directory
archive_public_path = File.join(archive_path, PUBLIC_DIR)
raise InvalidStateError, 'pages miss the public folder' unless Dir.exist?(archive_public_path)
raise InvalidStateError, 'pages are outdated' unless latest?
raise InvalidStateError, 'build SHA is outdated for this ref' unless latest?
deploy_page!(archive_public_path)
success
......
---
title: Add id and image_v432x230 columns to design_management_designs_versions
merge_request: 22860
author:
type: changed
---
title: 'Elasticsearch: when index is absent warn users and disable index button'
merge_request: 25254
author:
type: fixed
---
title: Add migration to create self monitoring project environment
merge_request: 25289
author:
type: added
---
title: Use clearer error message for pages deploy job when the SHA is outdated
merge_request: 25659
author:
type: other
---
title: Drop forked_project_links table
merge_request: 20771
author: Lee Tickett
type: other
---
title: Add missing arguments to UpdateIssue mutation
merge_request: 25268
author:
type: added
# frozen_string_literal: true
class AddImageToDesignManagementDesignsVersions < ActiveRecord::Migration[6.0]
DOWNTIME = false
def change
add_column :design_management_designs_versions, :image_v432x230, :string, limit: 255
end
end
# frozen_string_literal: true
class AddIdToDesignManagementDesignsVersions < ActiveRecord::Migration[6.0]
DOWNTIME = false
def change
add_column :design_management_designs_versions, :id, :primary_key
end
end
# frozen_string_literal: true
class DropForkedProjectLinksFk < ActiveRecord::Migration[6.0]
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
disable_ddl_transaction!
def up
with_lock_retries do
remove_foreign_key_if_exists :forked_project_links, column: :forked_to_project_id
end
end
def down
unless foreign_key_exists?(:forked_project_links, :projects, column: :forked_to_project_id)
with_lock_retries do
# rubocop: disable Migration/AddConcurrentForeignKey
add_foreign_key :forked_project_links, :projects, column: :forked_to_project_id, on_delete: :cascade, validate: false
end
end
fk_name = concurrent_foreign_key_name(:forked_project_links, :forked_to_project_id, prefix: 'fk_rails_')
validate_foreign_key(:forked_project_links, :forked_to_project_id, name: fk_name)
end
end
# frozen_string_literal: true
class DropForkedProjectLinksTable < ActiveRecord::Migration[6.0]
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
def change
drop_table "forked_project_links", id: :serial do |t|
t.integer "forked_to_project_id", null: false
t.integer "forked_from_project_id", null: false
t.datetime "created_at"
t.datetime "updated_at"
t.index ["forked_to_project_id"], name: "index_forked_project_links_on_forked_to_project_id", unique: true
end
end
end
# frozen_string_literal: true
class CreateEnvironmentForSelfMonitoringProject < ActiveRecord::Migration[6.0]
DOWNTIME = false
def up
execute <<~SQL
INSERT INTO environments (project_id, name, slug, created_at, updated_at)
SELECT instance_administration_project_id, 'production', 'production', CURRENT_TIMESTAMP, CURRENT_TIMESTAMP
FROM application_settings
WHERE instance_administration_project_id IS NOT NULL
AND NOT EXISTS (
SELECT 1
FROM environments
INNER JOIN application_settings
ON application_settings.instance_administration_project_id = environments.project_id
)
SQL
end
def down
# no-op
# This migration cannot be reversed because it cannot be ensured that the environment for the Self Monitoring Project
# did not already exist before the migration ran - in that case, the migration does nothing, and it would be unexpected
# behavior for that environment to be deleted by reversing this migration.
end
end
......@@ -1417,10 +1417,11 @@ ActiveRecord::Schema.define(version: 2020_02_20_180944) do
t.index ["project_id"], name: "index_design_management_designs_on_project_id"
end
create_table "design_management_designs_versions", id: false, force: :cascade do |t|
create_table "design_management_designs_versions", force: :cascade do |t|
t.bigint "design_id", null: false
t.bigint "version_id", null: false
t.integer "event", limit: 2, default: 0, null: false
t.string "image_v432x230", limit: 255
t.index ["design_id", "version_id"], name: "design_management_designs_versions_uniqueness", unique: true
t.index ["design_id"], name: "index_design_management_designs_versions_on_design_id"
t.index ["event"], name: "index_design_management_designs_versions_on_event"
......@@ -1654,14 +1655,6 @@ ActiveRecord::Schema.define(version: 2020_02_20_180944) do
t.index ["root_project_id"], name: "index_fork_networks_on_root_project_id", unique: true
end
create_table "forked_project_links", id: :serial, force: :cascade do |t|
t.integer "forked_to_project_id", null: false
t.integer "forked_from_project_id", null: false
t.datetime "created_at"
t.datetime "updated_at"
t.index ["forked_to_project_id"], name: "index_forked_project_links_on_forked_to_project_id", unique: true
end
create_table "geo_cache_invalidation_events", force: :cascade do |t|
t.string "key", null: false
end
......@@ -4747,7 +4740,6 @@ ActiveRecord::Schema.define(version: 2020_02_20_180944) do
add_foreign_key "fork_network_members", "projects", column: "forked_from_project_id", name: "fk_b01280dae4", on_delete: :nullify
add_foreign_key "fork_network_members", "projects", on_delete: :cascade
add_foreign_key "fork_networks", "projects", column: "root_project_id", name: "fk_e7b436b2b5", on_delete: :nullify
add_foreign_key "forked_project_links", "projects", column: "forked_to_project_id", name: "fk_434510edb0", on_delete: :cascade
add_foreign_key "geo_container_repository_updated_events", "container_repositories", name: "fk_212c89c706", on_delete: :cascade
add_foreign_key "geo_event_log", "geo_cache_invalidation_events", column: "cache_invalidation_event_id", name: "fk_42c3b54bed", on_delete: :cascade
add_foreign_key "geo_event_log", "geo_container_repository_updated_events", column: "container_repository_updated_event_id", name: "fk_6ada82d42a", on_delete: :cascade
......
......@@ -73,6 +73,9 @@ gitlab-rake "gitlab:uploads:migrate[FileUploader, Project]"
gitlab-rake "gitlab:uploads:migrate[PersonalFileUploader, Snippet]"
gitlab-rake "gitlab:uploads:migrate[NamespaceFileUploader, Snippet]"
gitlab-rake "gitlab:uploads:migrate[FileUploader, MergeRequest]"
# Design Management design thumbnails (EE)
gitlab-rake "gitlab:uploads:migrate[DesignManagement::DesignV432x230Uploader, DesignManagement::Action, :image_v432x230]"
```
**Source Installation**
......@@ -102,6 +105,8 @@ sudo -u git -H bundle exec rake "gitlab:uploads:migrate[PersonalFileUploader, Sn
sudo -u git -H bundle exec rake "gitlab:uploads:migrate[NamespaceFileUploader, Snippet]"
sudo -u git -H bundle exec rake "gitlab:uploads:migrate[FileUploader, MergeRequest]"
# Design Management design thumbnails (EE)
sudo -u git -H bundle exec rake "gitlab:uploads:migrate[DesignManagement::DesignV432x230Uploader, DesignManagement::Action, :image_v432x230]"
```
## Migrate legacy uploads out of deprecated paths
......
......@@ -7841,6 +7841,21 @@ input UpdateIssueInput {
"""
clientMutationId: String
"""
Indicates the issue is confidential
"""
confidential: Boolean!
"""
Description of the issue
"""
description: String
"""
Due date of the issue
"""
dueDate: Time!
"""
The desired health status
"""
......@@ -7855,6 +7870,11 @@ input UpdateIssueInput {
The project the issue to mutate is in
"""
projectPath: ID!
"""
Title of the issue
"""
title: String
}
"""
......
......@@ -20696,6 +20696,54 @@
},
"defaultValue": null
},
{
"name": "title",
"description": "Title of the issue",
"type": {
"kind": "SCALAR",
"name": "String",
"ofType": null
},
"defaultValue": null
},
{
"name": "description",
"description": "Description of the issue",
"type": {
"kind": "SCALAR",
"name": "String",
"ofType": null
},
"defaultValue": null
},
{
"name": "dueDate",
"description": "Due date of the issue",
"type": {
"kind": "NON_NULL",
"name": null,
"ofType": {
"kind": "SCALAR",
"name": "Time",
"ofType": null
}
},
"defaultValue": null
},
{
"name": "confidential",
"description": "Indicates the issue is confidential",
"type": {
"kind": "NON_NULL",
"name": null,
"ofType": {
"kind": "SCALAR",
"name": "Boolean",
"ofType": null
}
},
"defaultValue": null
},
{
"name": "healthStatus",
"description": "The desired health status",
......
......@@ -21,6 +21,7 @@ There are many places where file uploading is used, according to contexts:
- CI Artifacts (archive, metadata, trace)
- LFS Objects
- Merge request diffs
- Design Management design thumbnails (EE)
## Disk storage
......@@ -37,6 +38,7 @@ they are still not 100% standardized. You can see them below:
| Project avatars | yes | uploads/-/system/project/avatar/:id/:filename | `AvatarUploader` | Project |
| Issues/MR/Notes Markdown attachments | yes | uploads/:project_path_with_namespace/:random_hex/:filename | `FileUploader` | Project |
| Issues/MR/Notes Legacy Markdown attachments | no | uploads/-/system/note/attachment/:id/:filename | `AttachmentUploader` | Note |
| Design Management design thumbnails (EE) | yes | uploads/-/system/design_management/action/image_v432x230/:id/:filename | `DesignManagement::DesignV432x230Uploader` | DesignManagement::Action |
| CI Artifacts (CE) | yes | `shared/artifacts/:disk_hash[0..1]/:disk_hash[2..3]/:disk_hash/:year_:month_:date/:job_id/:job_artifact_id` (:disk_hash is SHA256 digest of project_id) | `JobArtifactUploader` | Ci::JobArtifact |
| LFS Objects (CE) | yes | shared/lfs-objects/:hex/:hex/:object_hash | `LfsObjectUploader` | LfsObject |
| External merge request diffs | yes | shared/external-diffs/merge_request_diffs/mr-:parent_id/diff-:id | `ExternalDiffUploader` | MergeRequestDiff |
......
......@@ -9,7 +9,6 @@ module Gitlab
@time_left = time_left
end
# rubocop: disable CodeReuse/ActiveRecord
def objects_missing?
return false unless @newrev && @project.lfs_enabled?
......@@ -19,12 +18,11 @@ module Gitlab
return false unless new_lfs_pointers.present?
existing_count = @project.all_lfs_objects
.where(oid: new_lfs_pointers.map(&:lfs_oid))
.for_oids(new_lfs_pointers.map(&:lfs_oid))
.count
existing_count != new_lfs_pointers.count
end
# rubocop: enable CodeReuse/ActiveRecord
end
end
end
......@@ -235,11 +235,17 @@ module Gitlab
# PostgreSQL constraint names have a limit of 63 bytes. The logic used
# here is based on Rails' foreign_key_name() method, which unfortunately
# is private so we can't rely on it directly.
def concurrent_foreign_key_name(table, column)
#
# prefix:
# - The default prefix is `fk_` for backward compatibility with the existing
# concurrent foreign key helpers.
# - For standard rails foreign keys the prefix is `fk_rails_`
#
def concurrent_foreign_key_name(table, column, prefix: 'fk_')
identifier = "#{table}_#{column}_fk"
hashed_identifier = Digest::SHA256.hexdigest(identifier).first(10)
"fk_#{hashed_identifier}"
"#{prefix}#{hashed_identifier}"
end
# Long-running migrations may take more than the timeout allowed by
......
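As a standalone illustration of the naming scheme (plain Ruby mirroring the helper above; `[0, 10]` stands in for ActiveSupport's `String#first`, and the table and column come from the `forked_project_links` migration earlier in this diff):

```ruby
require 'digest'

# Mirrors concurrent_foreign_key_name above, in plain Ruby for illustration.
def concurrent_foreign_key_name(table, column, prefix: 'fk_')
  identifier = "#{table}_#{column}_fk"
  hashed_identifier = Digest::SHA256.hexdigest(identifier)[0, 10]
  "#{prefix}#{hashed_identifier}"
end

concurrent_foreign_key_name(:forked_project_links, :forked_to_project_id)
# => "fk_" plus a stable 10-character digest of "forked_project_links_forked_to_project_id_fk"
concurrent_foreign_key_name(:forked_project_links, :forked_to_project_id, prefix: 'fk_rails_')
# => the same digest with the "fk_rails_" prefix, as used by DropForkedProjectLinksFk#down above
```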
# frozen_string_literal: true
# The method `filename` must be defined in classes that use this module.
# The method `filename` must be defined in classes that mix in this module.
#
# This module is intended to be used as a helper and not a security gate
# to validate that a file is safe, as it identifies files only by the
......@@ -35,6 +35,13 @@ module Gitlab
DANGEROUS_VIDEO_EXT = [].freeze # None, yet
DANGEROUS_AUDIO_EXT = [].freeze # None, yet
def self.extension_match?(filename, extensions)
return false unless filename.present?
extension = File.extname(filename).delete('.')
extensions.include?(extension.downcase)
end
def image?
extension_match?(SAFE_IMAGE_EXT)
end
......@@ -74,10 +81,7 @@ module Gitlab
private
def extension_match?(extensions)
return false unless filename
extension = File.extname(filename).delete('.')
extensions.include?(extension.downcase)
::Gitlab::FileTypeDetection.extension_match?(filename, extensions)
end
end
end
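A standalone sketch of the extracted helper's behavior (simplified plain Ruby, with an explicit nil/empty check standing in for ActiveSupport's `present?`; the expected results match the spec added further down in this diff):

```ruby
# Simplified re-implementation for illustration only.
module FileTypeDetectionSketch
  def self.extension_match?(filename, extensions)
    return false if filename.nil? || filename.empty?

    extension = File.extname(filename).delete('.')
    extensions.include?(extension.downcase)
  end
end

FileTypeDetectionSketch.extension_match?('file.FOO', %w[foo bar])    # => true  (case-insensitive)
FileTypeDetectionSketch.extension_match?('my.file.foo', %w[foo bar]) # => true  (last extension wins)
FileTypeDetectionSketch.extension_match?('my/file.foo', %w[foo bar]) # => true  (directories are fine)
FileTypeDetectionSketch.extension_match?(nil, %w[foo bar])           # => false
```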
......@@ -43,12 +43,11 @@ module Gitlab
relation_name.to_s.constantize
end
def initialize(relation_sym:, relation_hash:, members_mapper:, object_builder:, merge_requests_mapping: nil, user:, importable:, excluded_keys: [])
def initialize(relation_sym:, relation_hash:, members_mapper:, object_builder:, user:, importable:, excluded_keys: [])
@relation_name = self.class.overrides[relation_sym]&.to_sym || relation_sym
@relation_hash = relation_hash.except('noteable_id')
@members_mapper = members_mapper
@object_builder = object_builder
@merge_requests_mapping = merge_requests_mapping
@user = user
@importable = importable
@imported_object_retries = 0
......
......@@ -377,3 +377,6 @@ ee:
- protected_environments:
- :deploy_access_levels
- :service_desk_setting
excluded_attributes:
actions:
- image_v432x230
......@@ -111,28 +111,6 @@ module Gitlab
@relation_hash['group_id'] = @importable.namespace_id
end
# This code is a workaround for broken project exports that don't
# export merge requests with CI pipelines (i.e. exports that were
# generated from
# https://gitlab.com/gitlab-org/gitlab/merge_requests/17844).
# This method can be removed in GitLab 12.6.
def update_merge_request_references
# If a merge request was properly created, we don't need to fix
# up this export.
return if @relation_hash['merge_request']
merge_request_id = @relation_hash['merge_request_id']
return unless merge_request_id
new_merge_request_id = @merge_requests_mapping[merge_request_id]
return unless new_merge_request_id
@relation_hash['merge_request_id'] = new_merge_request_id
parsed_relation_hash['merge_request_id'] = new_merge_request_id
end
def setup_build
@relation_hash.delete('trace') # old export files have trace
@relation_hash.delete('token')
......@@ -147,8 +125,6 @@ module Gitlab
end
def setup_pipeline
update_merge_request_references
@relation_hash.fetch('stages', []).each do |stage|
stage.statuses.each do |status|
status.pipeline = imported_object
......
......@@ -76,8 +76,6 @@ module Gitlab
import_failure_service.with_retry(action: 'relation_object.save!', relation_key: relation_key, relation_index: relation_index) do
relation_object.save!
end
save_id_mapping(relation_key, data_hash, relation_object)
rescue => e
import_failure_service.log_import_failure(
source: 'process_relation_item!',
......@@ -90,17 +88,6 @@ module Gitlab
@import_failure_service ||= ImportFailureService.new(@importable)
end
# Older, serialized CI pipeline exports may only have a
# merge_request_id and not the full hash of the merge request. To
# import these pipelines, we need to preserve the mapping between
# the old and new the merge request ID.
def save_id_mapping(relation_key, data_hash, relation_object)
return unless importable_class == Project
return unless relation_key == 'merge_requests'
merge_requests_mapping[data_hash['id']] = relation_object.id
end
def relations
@relations ||=
@reader
......@@ -219,13 +206,8 @@ module Gitlab
importable_class.to_s.downcase.to_sym
end
# A Hash of the imported merge request ID -> imported ID.
def merge_requests_mapping
@merge_requests_mapping ||= {}
end
def relation_factory_params(relation_key, data_hash)
base_params = {
{
relation_sym: relation_key.to_sym,
relation_hash: data_hash,
importable: @importable,
......@@ -234,9 +216,6 @@ module Gitlab
user: @user,
excluded_keys: excluded_keys_for_relation(relation_key)
}
base_params[:merge_requests_mapping] = merge_requests_mapping if importable_class == Project
base_params
end
end
end
......
......@@ -21,6 +21,10 @@ module Gitlab
prepare_variables(args, logger)
end
def self.categories
CATEGORIES
end
def migrate_to_remote_storage
@to_store = ObjectStorage::Store::REMOTE
......@@ -70,3 +74,5 @@ module Gitlab
end
end
end
Gitlab::Uploads::MigrationHelper.prepend_if_ee('EE::Gitlab::Uploads::MigrationHelper')
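Replacing the `CATEGORIES` constant with a `categories` class method (consumed by the rake tasks below) lets the EE module prepended above extend the list. The sketch that follows is a simplified, hypothetical illustration of that pattern, not the actual EE implementation:

```ruby
# Simplified sketch: a prepended module can override a class method and call
# super, which a bare constant cannot support.
module MigrationHelperSketch
  CATEGORIES = [%w[AvatarUploader], %w[FileUploader]].freeze

  def self.categories
    CATEGORIES
  end
end

module EeCategoriesSketch
  def categories
    super + [%w[DesignManagement::DesignV432x230Uploader DesignManagement::Action image_v432x230]]
  end
end

MigrationHelperSketch.singleton_class.prepend(EeCategoriesSketch)
MigrationHelperSketch.categories.last
# => ["DesignManagement::DesignV432x230Uploader", "DesignManagement::Action", "image_v432x230"]
```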
......@@ -3,7 +3,7 @@ namespace :gitlab do
namespace :migrate do
desc "GitLab | Uploads | Migrate all uploaded files to object storage"
task all: :environment do
Gitlab::Uploads::MigrationHelper::CATEGORIES.each do |args|
Gitlab::Uploads::MigrationHelper.categories.each do |args|
Rake::Task["gitlab:uploads:migrate"].invoke(*args)
Rake::Task["gitlab:uploads:migrate"].reenable
end
......@@ -20,7 +20,7 @@ namespace :gitlab do
namespace :migrate_to_local do
desc "GitLab | Uploads | Migrate all uploaded files to local storage"
task all: :environment do
Gitlab::Uploads::MigrationHelper::CATEGORIES.each do |args|
Gitlab::Uploads::MigrationHelper.categories.each do |args|
Rake::Task["gitlab:uploads:migrate_to_local"].invoke(*args)
Rake::Task["gitlab:uploads:migrate_to_local"].reenable
end
......
......@@ -14075,6 +14075,9 @@ msgstr ""
msgid "Please create a username with only alphanumeric characters."
msgstr ""
msgid "Please create an index before enabling indexing"
msgstr ""
msgid "Please enable and migrate to hashed storage to avoid security issues and ensure data integrity. %{migrate_link}"
msgstr ""
......@@ -21876,9 +21879,15 @@ msgstr ""
msgid "Who will be able to see this group?"
msgstr ""
msgid "Who will be using GitLab?"
msgstr ""
msgid "Who will be using this GitLab subscription?"
msgstr ""
msgid "Who will be using this GitLab trial?"
msgstr ""
msgid "Wiki"
msgstr ""
......
# frozen_string_literal: true
require 'spec_helper'
describe 'User edits Release', :js do
let_it_be(:project) { create(:project, :repository) }
let_it_be(:release) { create(:release, project: project, name: 'The first release' ) }
let_it_be(:user) { create(:user) }
before do
project.add_developer(user)
gitlab_sign_in(user)
visit edit_project_release_path(project, release)
end
def fill_out_form_and_click(button_to_click)
fill_in 'Release title', with: 'Updated Release title'
fill_in 'Release notes', with: 'Updated Release notes'
click_button button_to_click
wait_for_requests
end
it 'renders the breadcrumbs' do
within('.breadcrumbs') do
expect(page).to have_content("#{project.creator.name} #{project.name} Edit Release")
expect(page).to have_link(project.creator.name, href: user_path(project.creator))
expect(page).to have_link(project.name, href: project_path(project))
expect(page).to have_link('Edit Release', href: edit_project_release_path(project, release))
end
end
it 'renders the edit Release form' do
expect(page).to have_content('Releases are based on Git tags. We recommend naming tags that fit within semantic versioning, for example v1.0, v2.0-pre.')
expect(find_field('Tag name', { disabled: true }).value).to eq(release.tag)
expect(find_field('Release title').value).to eq(release.name)
expect(find_field('Release notes').value).to eq(release.description)
expect(page).to have_button('Save changes')
expect(page).to have_button('Cancel')
end
it 'redirects to the main Releases page without updating the Release when "Cancel" is clicked' do
original_name = release.name
original_description = release.description
fill_out_form_and_click 'Cancel'
expect(current_path).to eq(project_releases_path(project))
release.reload
expect(release.name).to eq(original_name)
expect(release.description).to eq(original_description)
end
it 'updates the Release and redirects to the main Releases page when "Save changes" is clicked' do
fill_out_form_and_click 'Save changes'
expect(current_path).to eq(project_releases_path(project))
release.reload
expect(release.name).to eq('Updated Release title')
expect(release.description).to eq('Updated Release notes')
end
end
# frozen_string_literal: true
require 'spec_helper'
describe Mutations::Issues::Update do
let(:issue) { create(:issue) }
let(:user) { create(:user) }
let(:expected_attributes) do
{
title: 'new title',
description: 'new description',
confidential: true,
due_date: Date.tomorrow
}
end
let(:mutation) { described_class.new(object: nil, context: { current_user: user }) }
let(:mutated_issue) { subject[:issue] }
describe '#resolve' do
let(:mutation_params) do
{
project_path: issue.project.full_path,
iid: issue.iid
}.merge(expected_attributes)
end
subject { mutation.resolve(mutation_params) }
it 'raises an error if the resource is not accessible to the user' do
expect { subject }.to raise_error(Gitlab::Graphql::Errors::ResourceNotAvailable)
end
context 'when the user can update the issue' do
before do
issue.project.add_developer(user)
end
it 'updates issue with correct values' do
subject
expect(issue.reload).to have_attributes(expected_attributes)
end
context 'when iid does not exist' do
it 'raises resource not available error' do
mutation_params[:iid] = 99999
expect { subject }.to raise_error(Gitlab::Graphql::Errors::ResourceNotAvailable)
end
end
end
end
end
......@@ -2,6 +2,35 @@
require 'spec_helper'
describe Gitlab::FileTypeDetection do
describe '.extension_match?' do
let(:extensions) { %w[foo bar] }
it 'returns false when filename is blank' do
expect(described_class.extension_match?(nil, extensions)).to eq(false)
expect(described_class.extension_match?('', extensions)).to eq(false)
end
it 'returns true when filename matches extensions' do
expect(described_class.extension_match?('file.foo', extensions)).to eq(true)
expect(described_class.extension_match?('file.bar', extensions)).to eq(true)
end
it 'returns false when filename does not match extensions' do
expect(described_class.extension_match?('file.baz', extensions)).to eq(false)
end
it 'can match case insensitive filenames' do
expect(described_class.extension_match?('file.FOO', extensions)).to eq(true)
end
it 'can match filenames with periods' do
expect(described_class.extension_match?('my.file.foo', extensions)).to eq(true)
end
it 'can match filenames with directories' do
expect(described_class.extension_match?('my/file.foo', extensions)).to eq(true)
end
end
context 'when class is an uploader' do
let(:uploader) do
example_uploader = Class.new(CarrierWave::Uploader::Base) do
......
......@@ -567,6 +567,8 @@ designs: *design
actions:
- design
- version
- uploads
- file_uploads
versions: &version
- author
- issue
......
......@@ -7,7 +7,6 @@ describe Gitlab::ImportExport::BaseRelationFactory do
let(:project) { create(:project) }
let(:members_mapper) { double('members_mapper').as_null_object }
let(:relation_sym) { :project_snippets }
let(:merge_requests_mapping) { {} }
let(:relation_hash) { {} }
let(:excluded_keys) { [] }
......@@ -16,7 +15,6 @@ describe Gitlab::ImportExport::BaseRelationFactory do
relation_hash: relation_hash,
object_builder: Gitlab::ImportExport::GroupProjectObjectBuilder,
members_mapper: members_mapper,
merge_requests_mapping: merge_requests_mapping,
user: user,
importable: project,
excluded_keys: excluded_keys)
......
......@@ -6,7 +6,6 @@ describe Gitlab::ImportExport::ProjectRelationFactory do
let(:group) { create(:group) }
let(:project) { create(:project, :repository, group: group) }
let(:members_mapper) { double('members_mapper').as_null_object }
let(:merge_requests_mapping) { {} }
let(:user) { create(:admin) }
let(:excluded_keys) { [] }
let(:created_object) do
......@@ -14,7 +13,6 @@ describe Gitlab::ImportExport::ProjectRelationFactory do
relation_hash: relation_hash,
object_builder: Gitlab::ImportExport::GroupProjectObjectBuilder,
members_mapper: members_mapper,
merge_requests_mapping: merge_requests_mapping,
user: user,
importable: project,
excluded_keys: excluded_keys)
......
......@@ -769,7 +769,9 @@ DesignManagement::Design:
- project_id
- filename
DesignManagement::Action:
- id
- event
- image_v432x230
DesignManagement::Version:
- id
- created_at
......
# frozen_string_literal: true
require 'spec_helper'
require Rails.root.join('db', 'post_migrate', '20200214214934_create_environment_for_self_monitoring_project')
describe CreateEnvironmentForSelfMonitoringProject, :migration do
let(:application_settings_table) { table(:application_settings) }
let(:environments) { table(:environments) }
let(:instance_administrators_group) do
table(:namespaces).create!(
id: 1,
name: 'GitLab Instance Administrators',
path: 'gitlab-instance-administrators-random',
type: 'Group'
)
end
let(:self_monitoring_project) do
table(:projects).create!(
id: 2,
name: 'Self Monitoring',
path: 'self_monitoring',
namespace_id: instance_administrators_group.id
)
end
context 'when the self monitoring project ID is not set' do
it 'does not make changes' do
expect(environments.find_by(project_id: self_monitoring_project.id)).to be_nil
migrate!
expect(environments.find_by(project_id: self_monitoring_project.id)).to be_nil
end
end
context 'when the self monitoring project ID is set' do
before do
application_settings_table.create!(instance_administration_project_id: self_monitoring_project.id)
end
context 'when the environment already exists' do
let!(:environment) do
environments.create!(project_id: self_monitoring_project.id, name: 'production', slug: 'production')
end
it 'does not make changes' do
expect(environments.find_by(project_id: self_monitoring_project.id)).to eq(environment)
migrate!
expect(environments.find_by(project_id: self_monitoring_project.id)).to eq(environment)
end
end
context 'when the environment does not exist' do
it 'creates the environment' do
expect(environments.find_by(project_id: self_monitoring_project.id)).to be_nil
migrate!
expect(environments.find_by(project_id: self_monitoring_project.id)).to be
end
end
end
end
......@@ -13,6 +13,15 @@ describe LfsObject do
expect(described_class.not_linked_to_project(project)).to contain_exactly(other_lfs_object)
end
end
describe '.for_oids' do
it 'returns the correct LfsObjects' do
lfs_object_1, lfs_object_2 = create_list(:lfs_object, 2)
expect(described_class.for_oids(lfs_object_1.oid)).to contain_exactly(lfs_object_1)
expect(described_class.for_oids([lfs_object_1.oid, lfs_object_2.oid])).to contain_exactly(lfs_object_1, lfs_object_2)
end
end
end
it 'has a distinct has_many :projects relation through lfs_objects_projects' do
......
......@@ -60,8 +60,8 @@ describe Projects::LfsPointers::LfsLinkService do
stub_const("#{described_class}::BATCH_SIZE", 1)
oids = %w(one two)
expect(LfsObject).to receive(:where).with(oid: %w(one)).once.and_call_original
expect(LfsObject).to receive(:where).with(oid: %w(two)).once.and_call_original
expect(LfsObject).to receive(:for_oids).with(%w(one)).once.and_call_original
expect(LfsObject).to receive(:for_oids).with(%w(two)).once.and_call_original
subject.execute(oids)
end
......
......@@ -82,6 +82,9 @@ describe Projects::UpdatePagesService do
expect(execute).not_to eq(:success)
expect(project.pages_metadatum).not_to be_deployed
expect(deploy_status).to be_failed
expect(deploy_status.description).to eq('build SHA is outdated for this ref')
end
context 'when using empty file' do
......
# frozen_string_literal: true
# Expects the calling spec to define:
# - uploader_class
# - model_class
# - mounted_as
RSpec.shared_examples 'enqueue upload migration jobs in batch' do |batch:|
def run(task)
args = [uploader_class.to_s, model_class.to_s, mounted_as].compact
run_rake_task(task, *args)
end
it 'migrates local storage to remote object storage' do
expect(ObjectStorage::MigrateUploadsWorker)
.to receive(:perform_async).exactly(batch).times
.and_return("A fake job.")
run('gitlab:uploads:migrate')
end
it 'migrates remote object storage to local storage' do
expect(Upload).to receive(:where).exactly(batch + 1).times { Upload.all }
expect(ObjectStorage::MigrateUploadsWorker)
.to receive(:perform_async)
.with(anything, model_class.name, mounted_as, ObjectStorage::Store::LOCAL)
.exactly(batch).times
.and_return("A fake job.")
run('gitlab:uploads:migrate_to_local')
end
end
# frozen_string_literal: true
# Expects the calling spec to define:
# - model_class
# - mounted_as
# - to_store
RSpec.shared_examples 'uploads migration worker' do
def perform(uploads, store = nil)
described_class.new.perform(uploads.ids, model_class.to_s, mounted_as, store || to_store)
rescue ObjectStorage::MigrateUploadsWorker::Report::MigrationFailures
# swallow
end
describe '.enqueue!' do
def enqueue!
described_class.enqueue!(uploads, model_class, mounted_as, to_store)
end
it 'is guarded by .sanity_check!' do
expect(described_class).to receive(:perform_async)
expect(described_class).to receive(:sanity_check!)
enqueue!
end
context 'sanity_check! fails' do
include_context 'sanity_check! fails'
it 'does not enqueue a job' do
expect(described_class).not_to receive(:perform_async)
expect { enqueue! }.to raise_error(described_class::SanityCheckError)
end
end
end
describe '.sanity_check!' do
shared_examples 'raises a SanityCheckError' do |expected_message|
let(:mount_point) { nil }
it do
expect { described_class.sanity_check!(uploads, model_class, mount_point) }
.to raise_error(described_class::SanityCheckError).with_message(expected_message)
end
end
context 'uploader types mismatch' do
let!(:outlier) { create(:upload, uploader: 'GitlabUploader') }
include_examples 'raises a SanityCheckError', /Multiple uploaders found/
end
context 'mount point not found' do
include_examples 'raises a SanityCheckError', /Mount point [a-z:]+ not found in/ do
let(:mount_point) { :potato }
end
end
end
describe '#perform' do
shared_examples 'outputs correctly' do |success: 0, failures: 0|
total = success + failures
if success > 0
it 'outputs the reports' do
expect(Rails.logger).to receive(:info).with(%r{Migrated #{success}/#{total} files})
perform(uploads)
end
end
if failures > 0
it 'outputs upload failures' do
expect(Rails.logger).to receive(:warn).with(/Error .* I am a teapot/)
perform(uploads)
end
end
end
it_behaves_like 'outputs correctly', success: 10
it 'migrates files to remote storage' do
perform(uploads)
expect(Upload.where(store: ObjectStorage::Store::LOCAL).count).to eq(0)
end
context 'reversed' do
let(:to_store) { ObjectStorage::Store::LOCAL }
before do
perform(uploads, ObjectStorage::Store::REMOTE)
end
it 'migrates files to local storage' do
expect(Upload.where(store: ObjectStorage::Store::REMOTE).count).to eq(10)
perform(uploads)
expect(Upload.where(store: ObjectStorage::Store::LOCAL).count).to eq(10)
end
end
context 'migration is unsuccessful' do
before do
allow_any_instance_of(ObjectStorage::Concern)
.to receive(:migrate!).and_raise(CarrierWave::UploadError, 'I am a teapot.')
end
it_behaves_like 'outputs correctly', failures: 10
end
end
end
RSpec.shared_context 'sanity_check! fails' do
before do
expect(described_class).to receive(:sanity_check!).and_raise(described_class::SanityCheckError)
end
end
......@@ -16,32 +16,6 @@ describe 'gitlab:uploads:migrate and migrate_to_local rake tasks' do
allow(ObjectStorage::MigrateUploadsWorker).to receive(:perform_async)
end
def run(task)
args = [uploader_class.to_s, model_class.to_s, mounted_as].compact
run_rake_task(task, *args)
end
shared_examples 'enqueue jobs in batch' do |batch:|
it 'migrates local storage to remote object storage' do
expect(ObjectStorage::MigrateUploadsWorker)
.to receive(:perform_async).exactly(batch).times
.and_return("A fake job.")
run('gitlab:uploads:migrate')
end
it 'migrates remote object storage to local storage' do
expect(Upload).to receive(:where).exactly(batch + 1).times { Upload.all }
expect(ObjectStorage::MigrateUploadsWorker)
.to receive(:perform_async)
.with(anything, model_class.name, mounted_as, ObjectStorage::Store::LOCAL)
.exactly(batch).times
.and_return("A fake job.")
run('gitlab:uploads:migrate_to_local')
end
end
context "for AvatarUploader" do
let(:uploader_class) { AvatarUploader }
let(:mounted_as) { :avatar }
......@@ -50,7 +24,7 @@ describe 'gitlab:uploads:migrate and migrate_to_local rake tasks' do
let(:model_class) { Project }
let!(:projects) { create_list(:project, 10, :with_avatar) }
it_behaves_like 'enqueue jobs in batch', batch: 4
it_behaves_like 'enqueue upload migration jobs in batch', batch: 4
end
context "for Group" do
......@@ -60,7 +34,7 @@ describe 'gitlab:uploads:migrate and migrate_to_local rake tasks' do
create_list(:group, 10, :with_avatar)
end
it_behaves_like 'enqueue jobs in batch', batch: 4
it_behaves_like 'enqueue upload migration jobs in batch', batch: 4
end
context "for User" do
......@@ -70,7 +44,7 @@ describe 'gitlab:uploads:migrate and migrate_to_local rake tasks' do
create_list(:user, 10, :with_avatar)
end
it_behaves_like 'enqueue jobs in batch', batch: 4
it_behaves_like 'enqueue upload migration jobs in batch', batch: 4
end
end
......@@ -85,7 +59,7 @@ describe 'gitlab:uploads:migrate and migrate_to_local rake tasks' do
create_list(:note, 10, :with_attachment)
end
it_behaves_like 'enqueue jobs in batch', batch: 4
it_behaves_like 'enqueue upload migration jobs in batch', batch: 4
end
context "for Appearance" do
......@@ -97,7 +71,7 @@ describe 'gitlab:uploads:migrate and migrate_to_local rake tasks' do
end
%i(logo header_logo).each do |mount|
it_behaves_like 'enqueue jobs in batch', batch: 1 do
it_behaves_like 'enqueue upload migration jobs in batch', batch: 1 do
let(:mounted_as) { mount }
end
end
......@@ -115,7 +89,7 @@ describe 'gitlab:uploads:migrate and migrate_to_local rake tasks' do
end
end
it_behaves_like 'enqueue jobs in batch', batch: 4
it_behaves_like 'enqueue upload migration jobs in batch', batch: 4
end
context "for PersonalFileUploader" do
......@@ -129,7 +103,7 @@ describe 'gitlab:uploads:migrate and migrate_to_local rake tasks' do
end
end
it_behaves_like 'enqueue jobs in batch', batch: 4
it_behaves_like 'enqueue upload migration jobs in batch', batch: 4
end
context "for NamespaceFileUploader" do
......@@ -143,6 +117,6 @@ describe 'gitlab:uploads:migrate and migrate_to_local rake tasks' do
end
end
it_behaves_like 'enqueue jobs in batch', batch: 4
it_behaves_like 'enqueue upload migration jobs in batch', batch: 4
end
end
......@@ -3,12 +3,6 @@
require 'spec_helper'
describe ObjectStorage::MigrateUploadsWorker do
shared_context 'sanity_check! fails' do
before do
expect(described_class).to receive(:sanity_check!).and_raise(described_class::SanityCheckError)
end
end
let(:model_class) { Project }
let(:uploads) { Upload.all }
let(:to_store) { ObjectStorage::Store::REMOTE }
......@@ -19,109 +13,6 @@ describe ObjectStorage::MigrateUploadsWorker do
# swallow
end
shared_examples "uploads migration worker" do
describe '.enqueue!' do
def enqueue!
described_class.enqueue!(uploads, Project, mounted_as, to_store)
end
it 'is guarded by .sanity_check!' do
expect(described_class).to receive(:perform_async)
expect(described_class).to receive(:sanity_check!)
enqueue!
end
context 'sanity_check! fails' do
include_context 'sanity_check! fails'
it 'does not enqueue a job' do
expect(described_class).not_to receive(:perform_async)
expect { enqueue! }.to raise_error(described_class::SanityCheckError)
end
end
end
describe '.sanity_check!' do
shared_examples 'raises a SanityCheckError' do |expected_message|
let(:mount_point) { nil }
it do
expect { described_class.sanity_check!(uploads, model_class, mount_point) }
.to raise_error(described_class::SanityCheckError).with_message(expected_message)
end
end
context 'uploader types mismatch' do
let!(:outlier) { create(:upload, uploader: 'GitlabUploader') }
include_examples 'raises a SanityCheckError', /Multiple uploaders found/
end
context 'mount point not found' do
include_examples 'raises a SanityCheckError', /Mount point [a-z:]+ not found in/ do
let(:mount_point) { :potato }
end
end
end
describe '#perform' do
shared_examples 'outputs correctly' do |success: 0, failures: 0|
total = success + failures
if success > 0
it 'outputs the reports' do
expect(Rails.logger).to receive(:info).with(%r{Migrated #{success}/#{total} files})
perform(uploads)
end
end
if failures > 0
it 'outputs upload failures' do
expect(Rails.logger).to receive(:warn).with(/Error .* I am a teapot/)
perform(uploads)
end
end
end
it_behaves_like 'outputs correctly', success: 10
it 'migrates files to remote storage' do
perform(uploads)
expect(Upload.where(store: ObjectStorage::Store::LOCAL).count).to eq(0)
end
context 'reversed' do
let(:to_store) { ObjectStorage::Store::LOCAL }
before do
perform(uploads, ObjectStorage::Store::REMOTE)
end
it 'migrates files to local storage' do
expect(Upload.where(store: ObjectStorage::Store::REMOTE).count).to eq(10)
perform(uploads)
expect(Upload.where(store: ObjectStorage::Store::LOCAL).count).to eq(10)
end
end
context 'migration is unsuccessful' do
before do
allow_any_instance_of(ObjectStorage::Concern)
.to receive(:migrate!).and_raise(CarrierWave::UploadError, "I am a teapot.")
end
it_behaves_like 'outputs correctly', failures: 10
end
end
end
context "for AvatarUploader" do
let!(:projects) { create_list(:project, 10, :with_avatar) }
let(:mounted_as) { :avatar }
......
......@@ -11,6 +11,7 @@ describe 'admin/application_settings/integrations.html.haml' do
before do
assign(:application_setting, app_settings)
allow(Gitlab::Sourcegraph).to receive(:feature_available?).and_return(sourcegraph_flag)
allow(License).to receive(:feature_available?).with(:elastic_search).and_return(false) if defined?(License)
end
context 'when sourcegraph feature is enabled' do
......