Commit fd28ff5c authored by Nur Rony

Merge branch 'master' into 23557-remove-extra-line-for-empty-issue-description

* master: (22 commits)
  Fix status code expectation
  Stop clearing the database cache on rake cache:clear
  Fix error in generating labels
  Fix bug where e-mails were not being sent out via Sidekiq
  Fix documents and comments on Build API `scope`. #23146 #19131
  Re-organize queues to use for Sidekiq
  Fix wrong endpoint in api/users documentation, fix same typo in spec describe blocks
  Update CHANGELOG
  Fix object data to be sent to fetch analytics data
  Fixed compare ellipsis messing with layout
  Change "Group#web_url" to return "/groups/twitter" rather than "/twitter".
  fix font weight of project feature settings
  Add hover to trash icon in notes
  Ensure custom provider tab labels don't break layout.
  Fixed issue when images are loading it would push off the tabs
  Fixed issues with sticky mr tabs & sidebar
  Refactor and add new functionality to CI yaml reference
  Ignore external issues when bulk assigning issues to author of merge request.
  Changed gitlab-shell version to avoid warning when precompiling the assets.
  Grammar fixes in docs
  ...
parents 0ca0697a a98ad03b
...@@ -4,14 +4,20 @@ Please view this file on the master branch, on stable branches it's out of date.
- Adds user project membership expired event to clarify why user was removed (Callum Dryden)
- Trim leading and trailing whitespace on project_path (Linus Thiel)
- Fix HipChat notifications rendering (airatshigapov, eisnerd)
- Add hover to trash icon in notes !7008 (blackst0ne)
- Simpler arguments passed to named_route on toggle_award_url helper method
- Fix: Backup restore doesn't clear cache
- Use MergeRequestsClosingIssues cache data on Issue#closed_by_merge_requests method
- Fix documents and comments on Build API `scope`
## 8.13.1 (unreleased)
- Fix error in generating labels
## 8.13.0 (2016-10-22)
- Removes extra line for empty issue description. (!7045)
- Fix save button on project pipeline settings page. (!6955)
- All Sidekiq workers now use their own queue
- Avoid race condition when asynchronously removing expired artifacts. (!6881)
- Improve Merge When Build Succeeds triggers and execute on pipeline success. (!6675)
- Respond with 404 Not Found for non-existent tags (Linus Thiel)
...@@ -30,6 +36,7 @@ Please view this file on the master branch, on stable branches it's out of date.
- Update duration at the end of pipeline
- ExpireBuildArtifactsWorker query builds table without ordering enqueuing one job per build to cleanup
- Add group level labels. (!6425)
- Fix Cycle analytics not showing correct data when filtering by date. !6906
- Add an example for testing a phoenix application with Gitlab CI in the docs (Manthan Mallikarjun)
- Cancelled pipelines could be retried. !6927
- Updating verbiage on git basics to be more intuitive
......
...@@ -36,7 +36,11 @@
method: 'GET',
dataType: 'json',
contentType: 'application/json',
data: { start_date: options.startDate }
data: {
cycle_analytics: {
start_date: options.startDate
}
}
}).done((data) => {
this.decorateData(data);
this.initDropdown();
......
...@@ -388,28 +388,25 @@
// So we dont affix the tabs on these
if (Breakpoints.get().getBreakpointSize() === 'xs' || !$tabs.length) return;
var tabsWidth = $tabs.outerWidth(),
$diffTabs = $('#diff-notes-app'),
offsetTop = $tabs.offset().top - ($('.navbar-fixed-top').height() + $('.layout-nav').height());
var $diffTabs = $('#diff-notes-app'),
$fixedNav = $('.navbar-fixed-top'),
$layoutNav = $('.layout-nav');
$tabs.off('affix.bs.affix affix-top.bs.affix')
.affix({
offset: {
top: offsetTop
top: function () {
var tabsTop = $diffTabs.offset().top - $tabs.height();
tabsTop = tabsTop - ($fixedNav.height() + $layoutNav.height());
return tabsTop;
}
}
}).on('affix.bs.affix', function () {
$tabs.css({
left: $tabs.offset().left,
width: tabsWidth
});
$diffTabs.css({
marginTop: $tabs.height()
});
}).on('affix-top.bs.affix', function () {
$tabs.css({
left: '',
width: ''
});
$diffTabs.css({
marginTop: ''
});
......
...@@ -185,6 +185,10 @@ header.header-sidebar-pinned {
@media (min-width: $screen-sm-min) {
padding-right: $sidebar_collapsed_width;
.merge-request-tabs-holder.affix {
right: $sidebar_collapsed_width;
}
}
.sidebar-collapsed-icon {
...@@ -207,6 +211,10 @@ header.header-sidebar-pinned {
@media (min-width: $screen-md-min) {
padding-right: $gutter_width;
.merge-request-tabs-holder.affix {
right: $gutter_width;
}
}
&.with-overlay {
......
...@@ -143,6 +143,7 @@
&:not(.active) {
background-color: $gray-light;
border-left: 1px solid $border-color;
}
a {
...@@ -170,6 +171,31 @@
}
}
// Ldap configurations may need more tabs & the tab labels are user generated (arbitrarily long).
// These styles prevent this from breaking the layout, and only applied when providers are configured.
.new-session-tabs.custom-provider-tabs {
flex-wrap: wrap;
li {
min-width: 85px;
flex-basis: auto;
// This styles tab elements that have wrapped to a second line. We cannot easily predict when this will happen.
// We are making somewhat of an assumption about the configuration here: that users do not have more than
// 3 LDAP servers configured (in addition to standard login) and they are not using especially long names for any
// of them. If either condition is false, this will work as expected. If both are true, there may be a missing border
// above one of the bottom row elements. If you know a better way, please implement it!
&:nth-child(n+5) {
border-top: 1px solid $border-color;
}
}
a {
font-size: 16px;
}
}
.form-control {
&:active, &:focus {
...@@ -203,6 +229,7 @@
.login-page {
.col-sm-5.pull-right {
float: none !important;
margin-bottom: 45px;
}
}
}
...@@ -244,7 +271,11 @@
}
.navless-container {
padding: 65px; // height of footer + bottom padding of email confirmation link
padding: 65px 15px; // height of footer + bottom padding of email confirmation link
@media (max-width: $screen-xs-max) {
padding: 0 15px 65px;
}
}
}
...@@ -263,3 +294,4 @@
bottom: 0;
}
}
...@@ -438,11 +438,18 @@
}
}
.merge-request-tabs {
.merge-request-tabs-holder {
background-color: #fff;
&.affix {
top: 100px;
left: 0;
z-index: 9;
transition: right .15s;
}
&:not(.affix) .container-fluid {
padding-left: 0;
padding-right: 0;
}
}
...@@ -13,9 +13,18 @@
.new_project,
.edit-project {
fieldset {
&.features .control-label {
font-weight: normal;
&.features {
.label-light {
margin-bottom: 0;
}
.help-block {
margin-top: 0;
}
}
.form-group {
...@@ -40,6 +49,7 @@
}
.input-group > div {
&:last-child {
padding-right: 0;
}
...@@ -47,6 +57,7 @@
@media (max-width: $screen-xs-max) {
.input-group > div {
margin-bottom: 14px;
&:last-child {
...@@ -60,6 +71,7 @@
}
.input-group-addon {
&.static-namespace {
height: 35px;
border-radius: 3px;
......
...@@ -68,7 +68,7 @@ class Group < Namespace
end
def web_url
Gitlab::Routing.url_helpers.group_url(self)
Gitlab::Routing.url_helpers.group_canonical_url(self)
end
def human_name
......
...@@ -4,7 +4,7 @@ module MergeRequests
@assignable_issues ||= begin
if current_user == merge_request.author
closes_issues.select do |issue|
!issue.assignee_id? && can?(current_user, :admin_issue, issue)
!issue.is_a?(ExternalIssue) && !issue.assignee_id? && can?(current_user, :admin_issue, issue)
end
else
[]
......
...@@ -75,4 +75,4 @@
- @runners.each do |runner|
= render "admin/runners/runner", runner: runner
= paginate @runners
= paginate @runners, theme: "gitlab"
...@@ -67,7 +67,7 @@
= form_for [:admin, project.namespace.becomes(Namespace), project, project.runner_projects.new] do |f|
= f.hidden_field :runner_id, value: @runner.id
= f.submit 'Enable', class: 'btn btn-xs'
= paginate @projects
= paginate @projects, theme: "gitlab"
.col-md-6
%h4 Recent builds served by this Runner
......
...@@ -10,7 +10,7 @@
= form_for(resource, as: resource_name, url: session_path(resource_name), method: :post, html: { class: 'edit_user show-gl-field-errors' }) do |f|
- resource_params = params[resource_name].presence || params
= f.hidden_field :remember_me, value: resource_params.fetch(:remember_me, 0)
.form-group
%div
= f.label 'Two-Factor Authentication code', name: :otp_attempt
= f.text_field :otp_attempt, class: 'form-control', required: true, autofocus: true, autocomplete: 'off', title: 'This field is required.'
%p.help-block.hint Enter the code from the two-factor app on your mobile device. If you've lost your device, you may enter one of your recovery codes.
......
%ul.new-session-tabs.nav-links.nav-tabs
%ul.new-session-tabs.nav-links.nav-tabs{ class: ('custom-provider-tabs' if form_based_providers.any?) }
- if crowd_enabled?
%li.active
= link_to "Crowd", "#crowd", 'data-toggle' => 'tab'
......
...@@ -10,7 +10,7 @@
= button_tag type: 'button', class: "form-control compare-dropdown-toggle js-compare-dropdown", required: true, data: { refs_url: refs_namespace_project_path(@project.namespace, @project), toggle: "dropdown", target: ".js-compare-from-dropdown", selected: params[:from], field_name: :from } do
.dropdown-toggle-text= params[:from] || 'Select branch/tag'
= render "ref_dropdown"
.compare-ellipsis ...
.compare-ellipsis.inline ...
.form-group.dropdown.compare-form-group.to.js-compare-to-dropdown
.input-group.inline-input-group
%span.input-group-addon to
......
...@@ -47,7 +47,9 @@
= link_to "command line", "#modal_merge_info", class: "how_to_merge_link vlink", title: "How To Merge", "data-toggle" => "modal"
- if @commits_count.nonzero?
%ul.merge-request-tabs.nav-links.no-top.no-bottom{ class: ("js-tabs-affix" unless ENV['RAILS_ENV'] == 'test') }
.merge-request-tabs-holder{ class: ("js-tabs-affix" unless ENV['RAILS_ENV'] == 'test') }
%div{ class: container_class }
%ul.merge-request-tabs.nav-links.no-top.no-bottom
%li.notes-tab
= link_to namespace_project_merge_request_path(@project.namespace, @project, @merge_request), data: { target: 'div#notes', action: 'notes', toggle: 'tab' } do
Discussion
......
...@@ -57,7 +57,7 @@
= link_to '#', title: 'Edit comment', class: 'note-action-button js-note-edit' do
= icon('pencil', class: 'link-highlight')
= link_to namespace_project_note_path(note.project.namespace, note.project, note), title: 'Remove comment', method: :delete, data: { confirm: 'Are you sure you want to remove this comment?' }, remote: true, class: 'note-action-button hidden-xs js-note-delete danger' do
= icon('trash-o')
= icon('trash-o', class: 'danger-highlight')
.note-body{class: note_editable ? 'js-task-list-container' : ''}
.note-text.md
= preserve do
......
...@@ -26,4 +26,4 @@
%h4.underlined-title Available specific runners
%ul.bordered-list.available-specific-runners
= render partial: 'runner', collection: @assignable_runners, as: :runner
= paginate @assignable_runners
= paginate @assignable_runners, theme: "gitlab"
class AdminEmailWorker
include Sidekiq::Worker
include CronjobQueue
sidekiq_options retry: false # this job auto-repeats via sidekiq-cron
def perform
repository_check_failed_count = Project.where(last_repository_check_failed: true).count
......
class BuildCoverageWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include BuildQueue
def perform(build_id)
Ci::Build.find_by(id: build_id)
......
class BuildEmailWorker
include Sidekiq::Worker
include BuildQueue
def perform(build_id, recipients, push_data)
recipients.each do |recipient|
......
class BuildFinishedWorker
include Sidekiq::Worker
include BuildQueue
def perform(build_id)
Ci::Build.find_by(id: build_id).try do |build|
......
class BuildHooksWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include BuildQueue
def perform(build_id)
Ci::Build.find_by(id: build_id)
......
class BuildSuccessWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include BuildQueue
def perform(build_id)
Ci::Build.find_by(id: build_id).try do |build|
......
# This worker clears all cache fields in the database, working in batches.
class ClearDatabaseCacheWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
BATCH_SIZE = 1000
......
# Concern for setting Sidekiq settings for the various CI build workers.
module BuildQueue
extend ActiveSupport::Concern
included do
sidekiq_options queue: :build
end
end
# Concern that sets various Sidekiq settings for workers executed using a
# cronjob.
module CronjobQueue
extend ActiveSupport::Concern
included do
sidekiq_options queue: :cronjob, retry: false
end
end
# Concern that sets the queue of a Sidekiq worker based on the worker's class
# name/namespace.
module DedicatedSidekiqQueue
extend ActiveSupport::Concern
included do
sidekiq_options queue: name.sub(/Worker\z/, '').underscore.tr('/', '_')
end
end
# Concern for setting Sidekiq settings for the various CI pipeline workers.
module PipelineQueue
extend ActiveSupport::Concern
included do
sidekiq_options queue: :pipeline
end
end
# Concern for setting Sidekiq settings for the various repository check workers.
module RepositoryCheckQueue
extend ActiveSupport::Concern
included do
sidekiq_options queue: :repository_check, retry: false
end
end
class DeleteUserWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
def perform(current_user_id, delete_user_id, options = {})
delete_user = User.find(delete_user_id)
......
class EmailReceiverWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
sidekiq_options queue: :incoming_email
def perform(raw)
return unless Gitlab::IncomingEmail.enabled?
......
class EmailsOnPushWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
sidekiq_options queue: :mailers
attr_reader :email, :skip_premailer
def perform(project_id, recipients, push_data, options = {})
......
class ExpireBuildArtifactsWorker
include Sidekiq::Worker
include CronjobQueue
def perform
Rails.logger.info 'Scheduling removal of build artifacts'
......
class ExpireBuildInstanceArtifactsWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
def perform(build_id)
build = Ci::Build
......
class GitGarbageCollectWorker
include Sidekiq::Worker
include Gitlab::ShellAdapter
include DedicatedSidekiqQueue
sidekiq_options queue: :gitlab_shell, retry: false
sidekiq_options retry: false
def perform(project_id)
project = Project.find(project_id)
......
class GitlabShellWorker
include Sidekiq::Worker
include Gitlab::ShellAdapter
include DedicatedSidekiqQueue
sidekiq_options queue: :gitlab_shell
def perform(action, *arg)
gitlab_shell.send(action, *arg)
......
class GroupDestroyWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
sidekiq_options queue: :default
def perform(group_id, user_id)
begin
......
class ImportExportProjectCleanupWorker
include Sidekiq::Worker
include CronjobQueue
sidekiq_options queue: :default
def perform
ImportExportCleanUpService.new.execute
......
...@@ -3,6 +3,7 @@ require 'socket'
class IrkerWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
def perform(project_id, chans, colors, push_data, settings)
project = Project.find(project_id)
......
class MergeWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
sidekiq_options queue: :default
def perform(merge_request_id, current_user_id, params)
params = params.with_indifferent_access
......
class NewNoteWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
sidekiq_options queue: :default
def perform(note_id, note_params)
note = Note.find(note_id)
......
class PipelineHooksWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include PipelineQueue
def perform(pipeline_id)
Ci::Pipeline.find_by(id: pipeline_id)
......
class PipelineMetricsWorker
include Sidekiq::Worker
include PipelineQueue
sidekiq_options queue: :default
def perform(pipeline_id)
Ci::Pipeline.find_by(id: pipeline_id).try do |pipeline|
......
class PipelineProcessWorker
include Sidekiq::Worker
include PipelineQueue
sidekiq_options queue: :default
def perform(pipeline_id)
Ci::Pipeline.find_by(id: pipeline_id)
......
class PipelineSuccessWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include PipelineQueue
def perform(pipeline_id)
Ci::Pipeline.find_by(id: pipeline_id).try do |pipeline|
......
class PipelineUpdateWorker
include Sidekiq::Worker
include PipelineQueue
sidekiq_options queue: :default
def perform(pipeline_id)
Ci::Pipeline.find_by(id: pipeline_id)
......
class PostReceive
include Sidekiq::Worker
include DedicatedSidekiqQueue
sidekiq_options queue: :post_receive
def perform(repo_path, identifier, changes)
if path = Gitlab.config.repositories.storages.find { |p| repo_path.start_with?(p[1].to_s) }
......
...@@ -5,8 +5,7 @@
# storage engine as much.
class ProjectCacheWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
sidekiq_options queue: :default
LEASE_TIMEOUT = 15.minutes.to_i
......
class ProjectDestroyWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
sidekiq_options queue: :default
def perform(project_id, user_id, params)
begin
......
class ProjectExportWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
sidekiq_options queue: :gitlab_shell, retry: 3
sidekiq_options retry: 3
def perform(current_user_id, project_id)
current_user = User.find(current_user_id)
......
class ProjectServiceWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
sidekiq_options queue: :project_web_hook
def perform(hook_id, data)
data = data.with_indifferent_access
......
class ProjectWebHookWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
sidekiq_options queue: :project_web_hook
def perform(hook_id, data, hook_name)
data = data.with_indifferent_access
......
class PruneOldEventsWorker
include Sidekiq::Worker
include CronjobQueue
def perform
# Contribution calendar shows maximum 12 months of events.
......
class RemoveExpiredGroupLinksWorker
include Sidekiq::Worker
include CronjobQueue
def perform
ProjectGroupLink.expired.destroy_all
......
class RemoveExpiredMembersWorker
include Sidekiq::Worker
include CronjobQueue
def perform
Member.expired.find_each do |member|
......
class RepositoryArchiveCacheWorker
include Sidekiq::Worker
include CronjobQueue
sidekiq_options queue: :default
def perform
RepositoryArchiveCleanUpService.new.execute
......
module RepositoryCheck
class BatchWorker
include Sidekiq::Worker
include CronjobQueue
RUN_TIME = 3600
sidekiq_options retry: false
def perform
start = Time.now
......
module RepositoryCheck
class ClearWorker
include Sidekiq::Worker
include RepositoryCheckQueue
sidekiq_options retry: false
def perform
# Do small batched updates because these updates will be slow and locking
......
module RepositoryCheck
class SingleRepositoryWorker
include Sidekiq::Worker
include RepositoryCheckQueue
sidekiq_options retry: false
def perform(project_id)
project = Project.find(project_id)
......
class RepositoryForkWorker
include Sidekiq::Worker
include Gitlab::ShellAdapter
include DedicatedSidekiqQueue
sidekiq_options queue: :gitlab_shell
def perform(project_id, forked_from_repository_storage_path, source_path, target_path)
Gitlab::Metrics.add_event(:fork_repository,
......
class RepositoryImportWorker
include Sidekiq::Worker
include Gitlab::ShellAdapter
include DedicatedSidekiqQueue
sidekiq_options queue: :gitlab_shell
attr_accessor :project, :current_user
......
class RequestsProfilesWorker
include Sidekiq::Worker
include CronjobQueue
sidekiq_options queue: :default
def perform
Gitlab::RequestProfiler.remove_all_profiles
......
class StuckCiBuildsWorker
include Sidekiq::Worker
include CronjobQueue
BUILD_STUCK_TIMEOUT = 1.day
......
class SystemHookWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
sidekiq_options queue: :system_hook
def perform(hook_id, data, hook_name)
SystemHook.find(hook_id).execute(data, hook_name)
......
class TrendingProjectsWorker
include Sidekiq::Worker
include CronjobQueue
sidekiq_options queue: :trending_projects
def perform
Rails.logger.info('Refreshing trending projects')
......
class UpdateMergeRequestsWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
def perform(project_id, user_id, oldrev, newrev, ref)
project = Project.find_by(id: project_id)
......
...@@ -4,6 +4,7 @@ cd $(dirname $0)/..
app_root=$(pwd)
sidekiq_pidfile="$app_root/tmp/pids/sidekiq.pid"
sidekiq_logfile="$app_root/log/sidekiq.log"
sidekiq_config="$app_root/config/sidekiq_queues.yml"
gitlab_user=$(ls -l config.ru | awk '{print $3}')
warn()
...@@ -37,7 +38,7 @@ start_no_deamonize()
start_sidekiq()
{
exec bundle exec sidekiq -q post_receive -q mailers -q archive_repo -q system_hook -q project_web_hook -q gitlab_shell -q incoming_email -q runner -q common -q default -e $RAILS_ENV -P $sidekiq_pidfile "$@"
exec bundle exec sidekiq -C "${sidekiq_config}" -e $RAILS_ENV -P $sidekiq_pidfile "$@"
}
load_ok()
......
...@@ -24,7 +24,8 @@ module Gitlab
#{config.root}/app/models/ci
#{config.root}/app/models/hooks
#{config.root}/app/models/members
#{config.root}/app/models/project_services))
#{config.root}/app/models/project_services
#{config.root}/app/workers/concerns))
config.generators.templates.push("#{config.root}/generator_templates")
......
...@@ -12,7 +12,8 @@ constraints(GroupUrlConstrainer.new) do
end
end
resources :groups, constraints: { id: /[a-zA-Z.0-9_\-]+(?<!\.atom)/ } do
scope constraints: { id: /[a-zA-Z.0-9_\-]+(?<!\.atom)/ } do
resources :groups, except: [:show] do
member do
get :issues
get :merge_requests
...@@ -31,4 +32,6 @@ resources :groups, constraints: { id: /[a-zA-Z.0-9_\-]+(?<!\.atom)/ } do
resources :labels, except: [:show], constraints: { id: /\d+/ }
end
end
get 'groups/:id' => 'groups#show', as: :group_canonical
end
# This configuration file should be exclusively used to set queue settings for
# Sidekiq. Any other setting should be specified using the Sidekiq CLI or the
# Sidekiq Ruby API (see config/initializers/sidekiq.rb).
---
# All the queues to process and their weights. Every queue _must_ have a weight
# defined.
#
# The available weights are as follows
#
# 1: low priority
# 2: medium priority
# 3: high priority
# 5: _super_ high priority, this should only be used for _very_ important queues
#
# As per http://stackoverflow.com/a/21241357/290102 the formula for calculating
# the likelihood of a job being popped off a queue (given all queues have work
# to perform) is:
#
# chance = (queue weight / total weight of all queues) * 100
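#
# As a worked example (assuming the weights listed below are left unchanged):
# the total weight is 44, so when every queue has jobs waiting, a job in
# "post_receive" (weight 5) has roughly a 5/44 (~11%) chance of being picked
# next, while a job in "default" (weight 1) has roughly a 1/44 (~2%) chance.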
:queues:
- [post_receive, 5]
- [merge, 5]
- [update_merge_requests, 3]
- [new_note, 2]
- [build, 2]
- [pipeline, 2]
- [gitlab_shell, 2]
- [email_receiver, 2]
- [emails_on_push, 2]
- [mailers, 2]
- [repository_fork, 1]
- [repository_import, 1]
- [project_service, 1]
- [clear_database_cache, 1]
- [delete_user, 1]
- [expire_build_instance_artifacts, 1]
- [group_destroy, 1]
- [irker, 1]
- [project_cache, 1]
- [project_destroy, 1]
- [project_export, 1]
- [project_web_hook, 1]
- [repository_check, 1]
- [system_hook, 1]
- [git_garbage_collect, 1]
- [cronjob, 1]
- [default, 1]
require 'json'
# See http://doc.gitlab.com/ce/development/migration_style_guide.html
# for more information on how to write migrations for GitLab.
class MigrateSidekiqQueuesFromDefault < ActiveRecord::Migration
include Gitlab::Database::MigrationHelpers
DOWNTIME = true
DOWNTIME_REASON = <<-EOF
Moving Sidekiq jobs from queues requires Sidekiq to be stopped. Not stopping
Sidekiq will result in the loss of jobs that are scheduled after this
migration completes.
EOF
disable_ddl_transaction!
# Jobs for which the queue names have been changed (e.g. multiple workers
# using the same non-default queue).
#
# The keys are the old queue names, the values the jobs to move and their new
# queue names.
RENAMED_QUEUES = {
gitlab_shell: {
'GitGarbageCollectWorker' => :git_garbage_collect,
'ProjectExportWorker' => :project_export,
'RepositoryForkWorker' => :repository_fork,
'RepositoryImportWorker' => :repository_import
},
project_web_hook: {
'ProjectServiceWorker' => :project_service
},
incoming_email: {
'EmailReceiverWorker' => :email_receiver
},
mailers: {
'EmailsOnPushWorker' => :emails_on_push
},
default: {
'AdminEmailWorker' => :cronjob,
'BuildCoverageWorker' => :build,
'BuildEmailWorker' => :build,
'BuildFinishedWorker' => :build,
'BuildHooksWorker' => :build,
'BuildSuccessWorker' => :build,
'ClearDatabaseCacheWorker' => :clear_database_cache,
'DeleteUserWorker' => :delete_user,
'ExpireBuildArtifactsWorker' => :cronjob,
'ExpireBuildInstanceArtifactsWorker' => :expire_build_instance_artifacts,
'GroupDestroyWorker' => :group_destroy,
'ImportExportProjectCleanupWorker' => :cronjob,
'IrkerWorker' => :irker,
'MergeWorker' => :merge,
'NewNoteWorker' => :new_note,
'PipelineHooksWorker' => :pipeline,
'PipelineMetricsWorker' => :pipeline,
'PipelineProcessWorker' => :pipeline,
'PipelineSuccessWorker' => :pipeline,
'PipelineUpdateWorker' => :pipeline,
'ProjectCacheWorker' => :project_cache,
'ProjectDestroyWorker' => :project_destroy,
'PruneOldEventsWorker' => :cronjob,
'RemoveExpiredGroupLinksWorker' => :cronjob,
'RemoveExpiredMembersWorker' => :cronjob,
'RepositoryArchiveCacheWorker' => :cronjob,
'RepositoryCheck::BatchWorker' => :cronjob,
'RepositoryCheck::ClearWorker' => :repository_check,
'RepositoryCheck::SingleRepositoryWorker' => :repository_check,
'RequestsProfilesWorker' => :cronjob,
'StuckCiBuildsWorker' => :cronjob,
'UpdateMergeRequestsWorker' => :update_merge_requests
}
}
def up
Sidekiq.redis do |redis|
RENAMED_QUEUES.each do |queue, jobs|
migrate_from_queue(redis, queue, jobs)
end
end
end
def down
Sidekiq.redis do |redis|
RENAMED_QUEUES.each do |dest_queue, jobs|
jobs.each do |worker, from_queue|
migrate_from_queue(redis, from_queue, worker => dest_queue)
end
end
end
end
def migrate_from_queue(redis, queue, job_mapping)
while job = redis.lpop("queue:#{queue}")
payload = JSON.load(job)
new_queue = job_mapping[payload['class']]
# If we have no target queue to migrate to we're probably dealing with
# some ancient job for which the worker no longer exists. In that case
# there's no sane option we can take, other than just dropping the job.
next unless new_queue
payload['queue'] = new_queue
redis.lpush("queue:#{new_queue}", JSON.dump(payload))
end
end
end
...@@ -11,10 +11,10 @@ GET /projects/:id/builds
| Attribute | Type | Required | Description |
|-----------|---------|----------|---------------------|
| `id` | integer | yes | The ID of a project |
| `scope` | string **or** array of strings | no | The scope of builds to show, one or array of: `pending`, `running`, `failed`, `success`, `canceled`; showing all builds if none provided |
| `scope` | string **or** array of strings | no | The scope of builds to show, one or array of: `created`, `pending`, `running`, `failed`, `success`, `canceled`, `skipped`; showing all builds if none provided |
```
curl --header "PRIVATE-TOKEN: 9koXpg98eAheJpvBs5tK" "https://gitlab.example.com/api/v3/projects/1/builds"
curl --header "PRIVATE-TOKEN: 9koXpg98eAheJpvBs5tK" 'https://gitlab.example.com/api/v3/projects/1/builds?scope%5B0%5D=pending&scope%5B1%5D=running'
```
Example of response
...@@ -132,10 +132,10 @@ GET /projects/:id/repository/commits/:sha/builds
|-----------|---------|----------|---------------------|
| `id` | integer | yes | The ID of a project |
| `sha` | string | yes | The SHA id of a commit |
| `scope` | string **or** array of strings | no | The scope of builds to show, one or array of: `pending`, `running`, `failed`, `success`, `canceled`; showing all builds if none provided |
| `scope` | string **or** array of strings | no | The scope of builds to show, one or array of: `created`, `pending`, `running`, `failed`, `success`, `canceled`, `skipped`; showing all builds if none provided |
```
curl --header "PRIVATE-TOKEN: 9koXpg98eAheJpvBs5tK" "https://gitlab.example.com/api/v3/projects/1/repository/commits/0ff3ae198f8601a285adcf5c0fff204ee6fba5fd/builds"
curl --header "PRIVATE-TOKEN: 9koXpg98eAheJpvBs5tK" 'https://gitlab.example.com/api/v3/projects/1/repository/commits/0ff3ae198f8601a285adcf5c0fff204ee6fba5fd/builds?scope%5B0%5D=pending&scope%5B1%5D=running'
```
Example of response
......
...@@ -643,7 +643,7 @@ Parameters:
| `id` | integer | yes | The ID of the user |
```bash
curl --header "PRIVATE-TOKEN: 9koXpg98eAheJpvBs5tK" https://gitlab.example.com/api/v3/user/:id/events
curl --header "PRIVATE-TOKEN: 9koXpg98eAheJpvBs5tK" https://gitlab.example.com/api/v3/users/:id/events
```
Example response:
......
...@@ -188,7 +188,7 @@ In order to do that, follow the steps:
image = "docker:latest"
privileged = false
disable_cache = false
volumes = ["/var/run/docker.sock", "/cache"]
volumes = ["/var/run/docker.sock:/var/run/docker.sock", "/cache"]
[runners.cache]
Insecure = false
```
......
...@@ -37,7 +37,7 @@ The registered runner will use the `ruby:2.1` docker image and will run two
services, `postgres:latest` and `mysql:latest`, both of which will be
accessible during the build process.
## What is image
## What is an image
The `image` keyword is the name of the docker image that is present in the
local Docker Engine (list all images with `docker images`) or any image that
...@@ -47,7 +47,7 @@ Hub please read the [Docker Fundamentals][] documentation.
In short, with `image` we refer to the docker image, which will be used to
create a container on which your build will run.
## What is service
## What is a service
The `services` keyword defines just another docker image that is run during
your build and is linked to the docker image that the `image` keyword defines.
...@@ -61,7 +61,7 @@ time the project is built.
You can see some widely used services examples in the relevant documentation of
[CI services examples](../services/README.md).
### How is service linked to the build
### How services are linked to the build
To better understand how the container linking works, read
[Linking containers together][linking-containers].
......
...@@ -146,13 +146,17 @@ variables:
```
These variables can be later used in all executed commands and scripts.
The YAML-defined variables are also set to all created service containers,
thus allowing to fine tune them.
Variables can be also defined on [job level](#job-variables).
[Learn more about variables.](../variables/README.md)
thus allowing to fine tune them. Variables can be also defined on a
[job level](#job-variables).
Except for the user defined variables, there are also the ones set up by the
Runner itself. One example would be `CI_BUILD_REF_NAME` which has the value of
the branch or tag name for which project is built. Apart from the variables
you can set in `.gitlab-ci.yml`, there are also the so called secret variables
which can be set in GitLab's UI.
[Learn more about variables.][variables]
### cache
...@@ -541,20 +545,29 @@ An example usage of manual actions is deployment to production.
> Introduced in GitLab 8.9.
`environment` is used to define that a job deploys to a specific [environment].
This allows easy tracking of all deployments to your environments straight from
GitLab.
> You can read more about environments and find more examples in the
[documentation about environments][environment].
`environment` is used to define that a job deploys to a specific environment.
If `environment` is specified and no environment under that name exists, a new
one will be created automatically.
The `environment` name must contain only letters, digits, '-', '_', '/', '$', '{', '}' and spaces. Common
names are `qa`, `staging`, and `production`, but you can use whatever name works
with your workflow.
---
**Example configurations**
The `environment` name can contain:
- letters
- digits
- spaces
- `-`
- `_`
- `/`
- `$`
- `{`
- `}`
Common names are `qa`, `staging`, and `production`, but you can use whatever
name works with your workflow.
In its simplest form, the `environment` keyword can be defined like:
```
deploy to production:
...@@ -563,39 +576,134 @@ deploy to production:
environment: production
```
The `deploy to production` job will be marked as doing deployment to
`production` environment.
In the above example, the `deploy to production` job will be marked as doing a
deployment to the `production` environment.
#### environment:name
> Introduced in GitLab 8.11.
>**Note:**
Before GitLab 8.11, the name of an environment could be defined as a string like
`environment: production`. The recommended way now is to define it under the
`name` keyword.
Instead of defining the name of the environment right after the `environment`
keyword, it is also possible to define it as a separate value. For that, use
the `name` keyword under `environment`:
```
deploy to production:
stage: deploy
script: git push production HEAD:master
environment:
name: production
```
#### environment:url
> Introduced in GitLab 8.11.
>**Note:**
Before GitLab 8.11, the URL could be added only in GitLab's UI. The
recommended way now is to define it in `.gitlab-ci.yml`.
This is an optional value that, when set, exposes buttons in various places
in GitLab which, when clicked, take you to the defined URL.
In the example below, if the job finishes successfully, it will create buttons
in the merge requests and in the environments/deployments pages which will point
to `https://prod.example.com`.
```
deploy to production:
stage: deploy
script: git push production HEAD:master
environment:
name: production
url: https://prod.example.com
```
#### environment:on_stop
> [Introduced][ce-6669] in GitLab 8.13.
Closing (stopping) environments can be achieved with the `on_stop` keyword defined under
`environment`. It declares a different job that runs in order to close
the environment.
Read the `environment:action` section for an example.
#### environment:action
> [Introduced][ce-6669] in GitLab 8.13.
The `action` keyword is to be used in conjunction with `on_stop` and is defined
in the job that is called to close the environment.
Take for instance:
```yaml
review_app:
stage: deploy
script: make deploy-app
environment:
name: review
on_stop: stop_review_app
stop_review_app:
stage: deploy
script: make delete-app
when: manual
environment:
name: review
action: stop
```
In the above example we set up the `review_app` job to deploy to the `review`
environment, and we also defined a new `stop_review_app` job under `on_stop`.
Once the `review_app` job is successfully finished, it will trigger the
`stop_review_app` job based on what is defined under `when`. In this case we
set it up to `manual` so it will need a [manual action](#manual-actions) via
GitLab's web interface in order to run.
The `stop_review_app` job is **required** to have the following keywords defined:
- `when` - [reference](#when)
- `environment:name`
- `environment:action`
#### dynamic environments
> [Introduced][ce-6323] in GitLab 8.12 and GitLab Runner 1.6.
`environment` can also represent a configuration hash with `name` and `url`.
These parameters can use any of the defined CI [variables](#variables)
These parameters can use any of the defined [CI variables](#variables)
(including predefined, secure variables and `.gitlab-ci.yml` variables).
The common use case is to create dynamic environments for branches and use them
as review apps.
---
**Example configurations**
For example:
```
deploy as review app:
stage: deploy
script: ...
script: make deploy
environment:
name: review-apps/$CI_BUILD_REF_NAME
url: https://$CI_BUILD_REF_NAME.review.example.com/
```
The `deploy as review app` job will be marked as deployment to dynamically
create the `review-apps/branch-name` environment.
This environment should be accessible under `https://branch-name.review.example.com/`.
You can see a simple example at https://gitlab.com/gitlab-examples/review-apps-nginx/.
create the `review-apps/$CI_BUILD_REF_NAME` environment, where `$CI_BUILD_REF_NAME`
is an [environment variable][variables] set by the Runner. If, for example, the
`deploy as review app` job was run in a branch named `pow`, this environment
should be accessible under `https://pow.review.example.com/`.
This of course implies that the underlying server which hosts the application
is properly configured.
The common use case is to create dynamic environments for branches and use them
as Review Apps. You can see a simple example using Review Apps at
https://gitlab.com/gitlab-examples/review-apps-nginx/.
### artifacts
...@@ -1105,3 +1213,5 @@ CI with various languages.
[examples]: ../examples/README.md
[ce-6323]: https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/6323
[environment]: ../environments.md
[ce-6669]: https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/6669
[variables]: ../variables/README.md
...@@ -14,7 +14,8 @@
- [Testing standards and style guidelines](testing.md)
- [UI guide](ui_guide.md) for building GitLab with existing CSS styles and elements
- [Frontend guidelines](frontend.md)
- [SQL guidelines](sql.md) for SQL guidelines
- [SQL guidelines](sql.md) for working with SQL queries
- [Sidekiq guidelines](sidekiq_style_guide.md) for working with Sidekiq workers
## Process
......
# Sidekiq Style Guide
This document outlines various guidelines that should be followed when adding or
modifying Sidekiq workers.
## Default Queue
Use of the "default" queue is not allowed. Every worker should use a queue that
most closely matches the worker's purpose. For example, workers that are
executed periodically should use the "cronjob" queue.
A list of all available queues can be found in `config/sidekiq_queues.yml`.
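For example, a periodically executed worker could be declared as follows (a
minimal sketch; the worker name and body are hypothetical):

```ruby
# Hypothetical periodic worker; CronjobQueue routes it to the "cronjob" queue
# and disables retries, as defined by the concern added in this commit.
class ExampleCleanupWorker
  include Sidekiq::Worker
  include CronjobQueue

  def perform
    # Periodic housekeeping would go here.
  end
end
```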
## Dedicated Queues
Most workers should use their own queue. To ease this process a worker can
include the `DedicatedSidekiqQueue` concern as follows:
```ruby
class ProcessSomethingWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
end
```
This will set the queue name based on the class' name, minus the `Worker`
suffix. In the above example this would lead to the queue being
`process_something`.
In some cases multiple workers do use the same queue. For example, the various
workers for updating CI pipelines all use the `pipeline` queue. Adding workers
to existing queues should be done with care, as adding more workers can lead to
slow jobs blocking work (even for different jobs) on the shared queue.
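As a sketch, reusing an existing shared queue only means including the matching
concern (the worker name here is hypothetical; `PipelineQueue` is the concern
introduced by this commit):

```ruby
# Hypothetical worker reusing the shared "pipeline" queue.
class ExamplePipelineWorker
  include Sidekiq::Worker
  include PipelineQueue

  def perform(pipeline_id)
    # Work on the pipeline identified by pipeline_id.
  end
end
```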
## Tests
Each Sidekiq worker must be tested using RSpec, just like any other class. These
tests should be placed in `spec/workers`.
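A minimal spec for the hypothetical worker above might look like the following,
mirroring the queue assertions used in the specs added by this commit:

```ruby
require 'spec_helper'

describe ExampleCleanupWorker do
  it 'uses the cronjob queue' do
    expect(described_class.sidekiq_options['queue'].to_s).to eq('cronjob')
  end
end
```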
...@@ -72,7 +72,7 @@ sudo -u git -H git checkout 8-12-stable-ee
```bash
cd /home/git/gitlab-shell
sudo -u git -H git fetch --all --tags
sudo -u git -H git checkout v3.6.0
sudo -u git -H git checkout v3.6.1
```
### 6. Update gitlab-workhorse
......
...@@ -8,7 +8,7 @@ module API
#
# Parameters:
# id (required) - The ID of a project
# scope (optional) - The scope of builds to show (one or array of: pending, running, failed, success, canceled;
# scope (optional) - The scope of builds to show (one or array of: created, pending, running, failed, success, canceled, skipped;
# if none provided showing all builds)
# Example Request:
# GET /projects/:id/builds
...@@ -25,7 +25,7 @@
# Parameters:
# id (required) - The ID of a project
# sha (required) - The SHA id of a commit
# scope (optional) - The scope of builds to show (one or array of: pending, running, failed, success, canceled;
# scope (optional) - The scope of builds to show (one or array of: created, pending, running, failed, success, canceled, skipped;
# if none provided showing all builds)
# Example Request:
# GET /projects/:id/repository/commits/:sha/builds
......
...@@ -19,7 +19,7 @@ module Gitlab
]
labels.each do |params|
::Labels::FindOrCreateService.new(project.owner, project).execute(params)
::Labels::FindOrCreateService.new(project.owner, project, params).execute
end
end
end
......
...@@ -29,5 +29,5 @@ namespace :cache do
task all: [:db, :redis]
end
task clear: 'cache:clear:all'
task clear: 'cache:clear:redis'
end
...@@ -70,4 +70,19 @@ describe Projects::LabelsController do
get :index, namespace_id: project.namespace.to_param, project_id: project.to_param
end
end
describe 'POST #generate' do
let(:admin) { create(:admin) }
let(:project) { create(:empty_project) }
before do
sign_in(admin)
end
it 'creates labels' do
post :generate, namespace_id: project.namespace.to_param, project_id: project.to_param
expect(response).to have_http_status(302)
end
end
end
...@@ -265,4 +265,10 @@ describe Group, models: true do
members
end
describe '#web_url' do
it 'returns the canonical URL' do
expect(group.web_url).to include("groups/#{group.name}")
end
end
end
...@@ -846,7 +846,7 @@ describe API::API, api: true do
end
end
describe 'PUT /user/:id/block' do
describe 'PUT /users/:id/block' do
before { admin }
it 'blocks existing user' do
put api("/users/#{user.id}/block", admin)
...@@ -873,7 +873,7 @@
end
end
describe 'PUT /user/:id/unblock' do
describe 'PUT /users/:id/unblock' do
let(:blocked_user) { create(:user, state: 'blocked') }
before { admin }
...@@ -914,7 +914,7 @@
end
end
describe 'GET /user/:id/events' do
describe 'GET /users/:id/events' do
let(:user) { create(:user) }
let(:project) { create(:empty_project) }
let(:note) { create(:note_on_issue, note: 'What an awesome day!', project: project) }
......
...@@ -46,4 +46,16 @@ describe MergeRequests::AssignIssuesService, services: true do
it 'assigns these to the merge request owner' do
expect { service.execute }.to change { issue.reload.assignee }.to(user)
end
it 'ignores external issues' do
external_issue = ExternalIssue.new('JIRA-123', project)
service = described_class.new(
project,
user,
merge_request: merge_request,
closes_issues: [external_issue]
)
expect(service.assignable_issues.count).to eq 0
end
end
require 'spec_helper'
describe BuildQueue do
let(:worker) do
Class.new do
include Sidekiq::Worker
include BuildQueue
end
end
it 'sets the queue name of a worker' do
expect(worker.sidekiq_options['queue'].to_s).to eq('build')
end
end
require 'spec_helper'
describe CronjobQueue do
let(:worker) do
Class.new do
include Sidekiq::Worker
include CronjobQueue
end
end
it 'sets the queue name of a worker' do
expect(worker.sidekiq_options['queue'].to_s).to eq('cronjob')
end
it 'disables retrying of failed jobs' do
expect(worker.sidekiq_options['retry']).to eq(false)
end
end
require 'spec_helper'
describe DedicatedSidekiqQueue do
let(:worker) do
Class.new do
def self.name
'Foo::Bar::DummyWorker'
end
include Sidekiq::Worker
include DedicatedSidekiqQueue
end
end
describe 'queue names' do
it 'sets the queue name based on the class name' do
expect(worker.sidekiq_options['queue']).to eq('foo_bar_dummy')
end
end
end
require 'spec_helper'
describe PipelineQueue do
let(:worker) do
Class.new do
include Sidekiq::Worker
include PipelineQueue
end
end
it 'sets the queue name of a worker' do
expect(worker.sidekiq_options['queue'].to_s).to eq('pipeline')
end
end
require 'spec_helper'
describe RepositoryCheckQueue do
let(:worker) do
Class.new do
include Sidekiq::Worker
include RepositoryCheckQueue
end
end
it 'sets the queue name of a worker' do
expect(worker.sidekiq_options['queue'].to_s).to eq('repository_check')
end
it 'disables retrying of failed jobs' do
expect(worker.sidekiq_options['retry']).to eq(false)
end
end
require 'spec_helper'
describe 'Every Sidekiq worker' do
let(:workers) do
root = Rails.root.join('app', 'workers')
concerns = root.join('concerns').to_s
workers = Dir[root.join('**', '*.rb')].
reject { |path| path.start_with?(concerns) }
workers.map do |path|
ns = Pathname.new(path).relative_path_from(root).to_s.gsub('.rb', '')
ns.camelize.constantize
end
end
it 'does not use the default queue' do
workers.each do |worker|
expect(worker.sidekiq_options['queue'].to_s).not_to eq('default')
end
end
it 'uses the cronjob queue when the worker runs as a cronjob' do
cron_workers = Settings.cron_jobs.
map { |job_name, options| options['job_class'].constantize }.
to_set
workers.each do |worker|
next unless cron_workers.include?(worker)
expect(worker.sidekiq_options['queue'].to_s).to eq('cronjob')
end
end
it 'defines the queue in the Sidekiq configuration file' do
config = YAML.load_file(Rails.root.join('config', 'sidekiq_queues.yml').to_s)
queue_names = config[:queues].map { |(queue, _)| queue }.to_set
workers.each do |worker|
expect(queue_names).to include(worker.sidekiq_options['queue'].to_s)
end
end
end