Commit 7e9b4189 authored by Rémy Coutable's avatar Rémy Coutable

Merge branch 'refactor-builds-creation-service' into 'master'

Refactor pipeline creation service

## What does this MR do?
This refactors GitLab CI build processing: all builds for a pipeline are pre-created when the pipeline object is created.
The builds are created with a newly introduced status, `created`.
The builds are then automatically promoted to `pending` when the previous stage succeeds.
This significantly simplifies the pipeline processing code and solves many problems caused by the lazy initialisation used in the previous approach (builds were created on demand).
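
For illustration, a minimal sketch of the promotion idea (class and method names here are made up for the example; the real logic lives in `Ci::ProcessPipelineService`, shown in the diff below):

```ruby
# Sketch only: promote pre-created builds stage by stage.
# Not the exact implementation merged in this MR.
class StagePromoter
  def initialize(pipeline)
    @pipeline = pipeline
  end

  # Move all builds of the next unstarted stage from 'created' to
  # 'pending' once every build in the prior stages has succeeded.
  def promote_next_stage!
    next_index = @pipeline.builds.created.minimum(:stage_idx)
    return if next_index.nil?

    prior = @pipeline.builds.where('stage_idx < ?', next_index)
    return unless prior.all? { |build| build.status == 'success' }

    # `queue` is the state-machine event that transitions created -> pending.
    @pipeline.builds.created.where(stage_idx: next_index).each(&:queue)
  end
end
```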

## Why was this MR needed?
The previous mechanism had a lot of flaws (shown in the related issues), both in how it worked and in its code design. This MR also removes cross model-service-library dependencies.

The current approach moves build creation to a single place, `CreatePipelineService`, and removes the dynamic dependency on `config_processor`, significantly simplifying build creation and pipeline processing. Pipeline processing is implemented in `ProcessPipelineService`.
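
A hedged usage sketch of the new entry point (the keyword arguments come from the diff; `project`, `current_user` and `push_params` stand in for whatever the caller already has, see `GitPushService` and `PipelinesController` below):

```ruby
# Illustrative only: how callers invoke the service after this MR.
pipeline = Ci::CreatePipelineService.
  new(project, current_user, push_params).
  execute(ignore_skip_ci: false, save_on_errors: true)

if pipeline.persisted?
  # All builds now exist in the 'created' state; the service already
  # calls pipeline.process!, which queues the first stage through
  # Ci::ProcessPipelineService.
  puts "Created pipeline ##{pipeline.id} with #{pipeline.builds.count} builds"
else
  puts pipeline.errors.full_messages.join(', ')
end
```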

This also makes it easy to extend GitLab with Manual Actions, which are part of the 8.10 direction issue.

## Migration problem
~~This MR removes the on-demand creation of builds in pipelines.
Pipelines that are running and are mid-stage (some stages started, but not all) will not be fully evaluated after an application restart.
This happens because the code responsible for on-demand creation is removed.
There's no easy way to migrate existing pipelines, other than doing an offline migration and putting pipeline processing in the migration code (which seems to be a really bad idea).~~

To support old pipelines, I added lazy initialization of builds if none are found.
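
Concretely, the processing service first falls back to creating any missing builds; a condensed sketch of the logic visible in `Ci::ProcessPipelineService` and `Ci::CreatePipelineBuildsService` in the diff below (slightly paraphrased, not verbatim):

```ruby
# Condensed sketch: when a pre-refactor pipeline is processed and has
# no builds in the 'created' state, the missing builds are created on
# the spot before any stage is promoted.
def execute(pipeline)
  @pipeline = pipeline

  if @pipeline.builds.created.empty?
    # Lazy initialization for old pipelines: only builds whose names
    # do not already exist are added.
    Ci::CreatePipelineBuildsService.
      new(@pipeline.project, @pipeline.user).
      execute(@pipeline)
  end

  # ... promote the created builds stage by stage as usual ...
end
```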

## What are the relevant issue numbers?
Fixes: https://gitlab.com/gitlab-org/gitlab-ce/issues/12839
Solves: https://gitlab.com/gitlab-org/gitlab-ce/issues/18644 https://gitlab.com/gitlab-org/gitlab-ci-multi-runner/issues/289
Makes it easy to implement: https://gitlab.com/gitlab-org/gitlab-ce/issues/17010

## Does this MR meet the acceptance criteria?

- [x] [CHANGELOG](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/CHANGELOG) entry added
- [x] [Documentation created/updated](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/doc/development/doc_styleguide.md)
- [x] API support added
- Tests
  - [x] Added for this feature/bug
  - [ ] All builds are passing
- [x] Conforms to the [style guides](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/CONTRIBUTING.md#style-guides)
- [x] Branch has no merge conflicts with `master` (if it does, please rebase)
- [ ] [Squashed related commits together](https://git-scm.com/book/en/Git-Tools-Rewriting-History#Squashing-Commits)


See merge request !5295
parents a081b842 39203f1a
...@@ -20,6 +20,7 @@ v 8.11.0 (unreleased) ...@@ -20,6 +20,7 @@ v 8.11.0 (unreleased)
- Cache the commit author in RequestStore to avoid extra lookups in PostReceive - Cache the commit author in RequestStore to avoid extra lookups in PostReceive
- Expand commit message width in repo view (ClemMakesApps) - Expand commit message width in repo view (ClemMakesApps)
- Cache highlighted diff lines for merge requests - Cache highlighted diff lines for merge requests
- Pre-create all builds for a Pipeline when the new Pipeline is created !5295
- Fix of 'Commits being passed to custom hooks are already reachable when using the UI' - Fix of 'Commits being passed to custom hooks are already reachable when using the UI'
- Fix awardable button mutuality loading spinners (ClemMakesApps) - Fix awardable button mutuality loading spinners (ClemMakesApps)
- Add support for using RequestStore within Sidekiq tasks via SIDEKIQ_REQUEST_STORE env variable - Add support for using RequestStore within Sidekiq tasks via SIDEKIQ_REQUEST_STORE env variable
......
...@@ -6,7 +6,7 @@ class Projects::BuildsController < Projects::ApplicationController ...@@ -6,7 +6,7 @@ class Projects::BuildsController < Projects::ApplicationController
def index def index
@scope = params[:scope] @scope = params[:scope]
@all_builds = project.builds @all_builds = project.builds.relevant
@builds = @all_builds.order('created_at DESC') @builds = @all_builds.order('created_at DESC')
@builds = @builds =
case @scope case @scope
......
...@@ -134,8 +134,8 @@ class Projects::CommitController < Projects::ApplicationController ...@@ -134,8 +134,8 @@ class Projects::CommitController < Projects::ApplicationController
end end
def define_status_vars def define_status_vars
@statuses = CommitStatus.where(pipeline: pipelines) @statuses = CommitStatus.where(pipeline: pipelines).relevant
@builds = Ci::Build.where(pipeline: pipelines) @builds = Ci::Build.where(pipeline: pipelines).relevant
end end
def assign_change_commit_vars(mr_source_branch) def assign_change_commit_vars(mr_source_branch)
......
...@@ -160,7 +160,7 @@ class Projects::MergeRequestsController < Projects::ApplicationController ...@@ -160,7 +160,7 @@ class Projects::MergeRequestsController < Projects::ApplicationController
@diff_notes_disabled = true @diff_notes_disabled = true
@pipeline = @merge_request.pipeline @pipeline = @merge_request.pipeline
@statuses = @pipeline.statuses if @pipeline @statuses = @pipeline.statuses.relevant if @pipeline
@note_counts = Note.where(commit_id: @commits.map(&:id)). @note_counts = Note.where(commit_id: @commits.map(&:id)).
group(:commit_id).count group(:commit_id).count
...@@ -362,7 +362,7 @@ class Projects::MergeRequestsController < Projects::ApplicationController ...@@ -362,7 +362,7 @@ class Projects::MergeRequestsController < Projects::ApplicationController
@commits_count = @merge_request.commits.count @commits_count = @merge_request.commits.count
@pipeline = @merge_request.pipeline @pipeline = @merge_request.pipeline
@statuses = @pipeline.statuses if @pipeline @statuses = @pipeline.statuses.relevant if @pipeline
if @merge_request.locked_long_ago? if @merge_request.locked_long_ago?
@merge_request.unlock_mr @merge_request.unlock_mr
......
...@@ -19,7 +19,7 @@ class Projects::PipelinesController < Projects::ApplicationController ...@@ -19,7 +19,7 @@ class Projects::PipelinesController < Projects::ApplicationController
end end
def create def create
@pipeline = Ci::CreatePipelineService.new(project, current_user, create_params).execute @pipeline = Ci::CreatePipelineService.new(project, current_user, create_params).execute(ignore_skip_ci: true, save_on_errors: false)
unless @pipeline.persisted? unless @pipeline.persisted?
render 'new' render 'new'
return return
......
...@@ -16,7 +16,7 @@ module Ci ...@@ -16,7 +16,7 @@ module Ci
scope :with_artifacts_not_expired, ->() { with_artifacts.where('artifacts_expire_at IS NULL OR artifacts_expire_at > ?', Time.now) } scope :with_artifacts_not_expired, ->() { with_artifacts.where('artifacts_expire_at IS NULL OR artifacts_expire_at > ?', Time.now) }
scope :with_expired_artifacts, ->() { with_artifacts.where('artifacts_expire_at < ?', Time.now) } scope :with_expired_artifacts, ->() { with_artifacts.where('artifacts_expire_at < ?', Time.now) }
scope :last_month, ->() { where('created_at > ?', Date.today - 1.month) } scope :last_month, ->() { where('created_at > ?', Date.today - 1.month) }
scope :manual_actions, ->() { where(when: :manual) } scope :manual_actions, ->() { where(when: :manual).relevant }
mount_uploader :artifacts_file, ArtifactUploader mount_uploader :artifacts_file, ArtifactUploader
mount_uploader :artifacts_metadata, ArtifactUploader mount_uploader :artifacts_metadata, ArtifactUploader
...@@ -65,17 +65,11 @@ module Ci ...@@ -65,17 +65,11 @@ module Ci
end end
end end
state_machine :status, initial: :pending do state_machine :status do
after_transition pending: :running do |build| after_transition pending: :running do |build|
build.execute_hooks build.execute_hooks
end end
# We use around_transition to create builds for next stage as soon as possible, before the `after_*` is executed
around_transition any => [:success, :failed, :canceled] do |build, block|
block.call
build.pipeline.create_next_builds(build) if build.pipeline
end
after_transition any => [:success, :failed, :canceled] do |build| after_transition any => [:success, :failed, :canceled] do |build|
build.update_coverage build.update_coverage
build.execute_hooks build.execute_hooks
......
...@@ -13,11 +13,10 @@ module Ci ...@@ -13,11 +13,10 @@ module Ci
has_many :trigger_requests, dependent: :destroy, class_name: 'Ci::TriggerRequest', foreign_key: :commit_id has_many :trigger_requests, dependent: :destroy, class_name: 'Ci::TriggerRequest', foreign_key: :commit_id
validates_presence_of :sha validates_presence_of :sha
validates_presence_of :ref
validates_presence_of :status validates_presence_of :status
validate :valid_commit_sha validate :valid_commit_sha
# Invalidate object and save if when touched
after_touch :update_state
after_save :keep_around_commits after_save :keep_around_commits
# ref can't be HEAD or SHA, can only be branch/tag name # ref can't be HEAD or SHA, can only be branch/tag name
...@@ -90,12 +89,16 @@ module Ci ...@@ -90,12 +89,16 @@ module Ci
def cancel_running def cancel_running
builds.running_or_pending.each(&:cancel) builds.running_or_pending.each(&:cancel)
reload_status!
end end
def retry_failed(user) def retry_failed(user)
builds.latest.failed.select(&:retryable?).each do |build| builds.latest.failed.select(&:retryable?).each do |build|
Ci::Build.retry(build, user) Ci::Build.retry(build, user)
end end
reload_status!
end end
def latest? def latest?
...@@ -109,37 +112,6 @@ module Ci ...@@ -109,37 +112,6 @@ module Ci
trigger_requests.any? trigger_requests.any?
end end
def create_builds(user, trigger_request = nil)
##
# We persist pipeline only if there are builds available
#
return unless config_processor
build_builds_for_stages(config_processor.stages, user,
'success', trigger_request) && save
end
def create_next_builds(build)
return unless config_processor
# don't create other builds if this one is retried
latest_builds = builds.latest
return unless latest_builds.exists?(build.id)
# get list of stages after this build
next_stages = config_processor.stages.drop_while { |stage| stage != build.stage }
next_stages.delete(build.stage)
# get status for all prior builds
prior_builds = latest_builds.where.not(stage: next_stages)
prior_status = prior_builds.status
# build builds for next stage that has builds available
# and save pipeline if we have builds
build_builds_for_stages(next_stages, build.user, prior_status,
build.trigger_request) && save
end
def retried def retried
@retried ||= (statuses.order(id: :desc) - statuses.latest) @retried ||= (statuses.order(id: :desc) - statuses.latest)
end end
...@@ -151,6 +123,14 @@ module Ci ...@@ -151,6 +123,14 @@ module Ci
end end
end end
def config_builds_attributes
return [] unless config_processor
config_processor.
builds_for_ref(ref, tag?, trigger_requests.first).
sort_by { |build| build[:stage_idx] }
end
def has_warnings? def has_warnings?
builds.latest.ignored.any? builds.latest.ignored.any?
end end
...@@ -182,10 +162,6 @@ module Ci ...@@ -182,10 +162,6 @@ module Ci
end end
end end
def skip_ci?
git_commit_message =~ /\[(ci skip|skip ci)\]/i if git_commit_message
end
def environments def environments
builds.where.not(environment: nil).success.pluck(:environment).uniq builds.where.not(environment: nil).success.pluck(:environment).uniq
end end
...@@ -207,29 +183,21 @@ module Ci ...@@ -207,29 +183,21 @@ module Ci
Note.for_commit_id(sha) Note.for_commit_id(sha)
end end
def process!
Ci::ProcessPipelineService.new(project, user).execute(self)
reload_status!
end
def predefined_variables def predefined_variables
[ [
{ key: 'CI_PIPELINE_ID', value: id.to_s, public: true } { key: 'CI_PIPELINE_ID', value: id.to_s, public: true }
] ]
end end
private def reload_status!
def build_builds_for_stages(stages, user, status, trigger_request)
##
# Note that `Array#any?` implements a short circuit evaluation, so we
# build builds only for the first stage that has builds available.
#
stages.any? do |stage|
CreateBuildsService.new(self).
execute(stage, user, status, trigger_request).
any?(&:active?)
end
end
def update_state
statuses.reload statuses.reload
self.status = if yaml_errors.blank? self.status =
if yaml_errors.blank?
statuses.latest.status || 'skipped' statuses.latest.status || 'skipped'
else else
'failed' 'failed'
...@@ -240,6 +208,8 @@ module Ci ...@@ -240,6 +208,8 @@ module Ci
save save
end end
private
def keep_around_commits def keep_around_commits
return unless project return unless project
......
...@@ -5,7 +5,7 @@ class CommitStatus < ActiveRecord::Base ...@@ -5,7 +5,7 @@ class CommitStatus < ActiveRecord::Base
self.table_name = 'ci_builds' self.table_name = 'ci_builds'
belongs_to :project, class_name: '::Project', foreign_key: :gl_project_id belongs_to :project, class_name: '::Project', foreign_key: :gl_project_id
belongs_to :pipeline, class_name: 'Ci::Pipeline', foreign_key: :commit_id, touch: true belongs_to :pipeline, class_name: 'Ci::Pipeline', foreign_key: :commit_id
belongs_to :user belongs_to :user
delegate :commit, to: :pipeline delegate :commit, to: :pipeline
...@@ -25,28 +25,36 @@ class CommitStatus < ActiveRecord::Base ...@@ -25,28 +25,36 @@ class CommitStatus < ActiveRecord::Base
scope :ordered, -> { order(:name) } scope :ordered, -> { order(:name) }
scope :ignored, -> { where(allow_failure: true, status: [:failed, :canceled]) } scope :ignored, -> { where(allow_failure: true, status: [:failed, :canceled]) }
state_machine :status, initial: :pending do state_machine :status do
event :queue do event :queue do
transition skipped: :pending transition [:created, :skipped] => :pending
end end
event :run do event :run do
transition pending: :running transition pending: :running
end end
event :skip do
transition [:created, :pending] => :skipped
end
event :drop do event :drop do
transition [:pending, :running] => :failed transition [:created, :pending, :running] => :failed
end end
event :success do event :success do
transition [:pending, :running] => :success transition [:created, :pending, :running] => :success
end end
event :cancel do event :cancel do
transition [:pending, :running] => :canceled transition [:created, :pending, :running] => :canceled
end
after_transition created: [:pending, :running] do |commit_status|
commit_status.update_attributes queued_at: Time.now
end end
after_transition pending: :running do |commit_status| after_transition [:created, :pending] => :running do |commit_status|
commit_status.update_attributes started_at: Time.now commit_status.update_attributes started_at: Time.now
end end
...@@ -54,13 +62,20 @@ class CommitStatus < ActiveRecord::Base ...@@ -54,13 +62,20 @@ class CommitStatus < ActiveRecord::Base
commit_status.update_attributes finished_at: Time.now commit_status.update_attributes finished_at: Time.now
end end
after_transition [:pending, :running] => :success do |commit_status| after_transition [:created, :pending, :running] => :success do |commit_status|
MergeRequests::MergeWhenBuildSucceedsService.new(commit_status.pipeline.project, nil).trigger(commit_status) MergeRequests::MergeWhenBuildSucceedsService.new(commit_status.pipeline.project, nil).trigger(commit_status)
end end
after_transition any => :failed do |commit_status| after_transition any => :failed do |commit_status|
MergeRequests::AddTodoWhenBuildFailsService.new(commit_status.pipeline.project, nil).execute(commit_status) MergeRequests::AddTodoWhenBuildFailsService.new(commit_status.pipeline.project, nil).execute(commit_status)
end end
# We use around_transition to process pipeline on next stages as soon as possible, before the `after_*` is executed
around_transition any => [:success, :failed, :canceled] do |commit_status, block|
block.call
commit_status.pipeline.process! if commit_status.pipeline
end
end end
delegate :sha, :short_sha, to: :pipeline delegate :sha, :short_sha, to: :pipeline
......
module Statuseable module Statuseable
extend ActiveSupport::Concern extend ActiveSupport::Concern
AVAILABLE_STATUSES = %w(pending running success failed canceled skipped) AVAILABLE_STATUSES = %w[created pending running success failed canceled skipped]
STARTED_STATUSES = %w[running success failed skipped]
ACTIVE_STATUSES = %w[pending running]
COMPLETED_STATUSES = %w[success failed canceled]
class_methods do class_methods do
def status_sql def status_sql
builds = all.select('count(*)').to_sql scope = all.relevant
success = all.success.select('count(*)').to_sql builds = scope.select('count(*)').to_sql
ignored = all.ignored.select('count(*)').to_sql if all.respond_to?(:ignored) success = scope.success.select('count(*)').to_sql
ignored = scope.ignored.select('count(*)').to_sql if scope.respond_to?(:ignored)
ignored ||= '0' ignored ||= '0'
pending = all.pending.select('count(*)').to_sql pending = scope.pending.select('count(*)').to_sql
running = all.running.select('count(*)').to_sql running = scope.running.select('count(*)').to_sql
canceled = all.canceled.select('count(*)').to_sql canceled = scope.canceled.select('count(*)').to_sql
skipped = all.skipped.select('count(*)').to_sql skipped = scope.skipped.select('count(*)').to_sql
deduce_status = "(CASE deduce_status = "(CASE
WHEN (#{builds})=0 THEN NULL WHEN (#{builds})=0 THEN NULL
...@@ -48,7 +52,8 @@ module Statuseable ...@@ -48,7 +52,8 @@ module Statuseable
included do included do
validates :status, inclusion: { in: AVAILABLE_STATUSES } validates :status, inclusion: { in: AVAILABLE_STATUSES }
state_machine :status, initial: :pending do state_machine :status, initial: :created do
state :created, value: 'created'
state :pending, value: 'pending' state :pending, value: 'pending'
state :running, value: 'running' state :running, value: 'running'
state :failed, value: 'failed' state :failed, value: 'failed'
...@@ -57,6 +62,8 @@ module Statuseable ...@@ -57,6 +62,8 @@ module Statuseable
state :skipped, value: 'skipped' state :skipped, value: 'skipped'
end end
scope :created, -> { where(status: 'created') }
scope :relevant, -> { where.not(status: 'created') }
scope :running, -> { where(status: 'running') } scope :running, -> { where(status: 'running') }
scope :pending, -> { where(status: 'pending') } scope :pending, -> { where(status: 'pending') }
scope :success, -> { where(status: 'success') } scope :success, -> { where(status: 'success') }
...@@ -68,14 +75,14 @@ module Statuseable ...@@ -68,14 +75,14 @@ module Statuseable
end end
def started? def started?
!pending? && !canceled? && started_at STARTED_STATUSES.include?(status) && started_at
end end
def active? def active?
running? || pending? ACTIVE_STATUSES.include?(status)
end end
def complete? def complete?
canceled? || success? || failed? COMPLETED_STATUSES.include?(status)
end end
end end
module Ci
class CreateBuildsService
def initialize(pipeline)
@pipeline = pipeline
@config = pipeline.config_processor
end
def execute(stage, user, status, trigger_request = nil)
builds_attrs = @config.builds_for_stage_and_ref(stage, @pipeline.ref, @pipeline.tag, trigger_request)
# check when to create next build
builds_attrs = builds_attrs.select do |build_attrs|
case build_attrs[:when]
when 'on_success'
status == 'success'
when 'on_failure'
status == 'failed'
when 'always', 'manual'
%w(success failed).include?(status)
end
end
# don't create the same build twice
builds_attrs.reject! do |build_attrs|
@pipeline.builds.find_by(ref: @pipeline.ref,
tag: @pipeline.tag,
trigger_request: trigger_request,
name: build_attrs[:name])
end
builds_attrs.map do |build_attrs|
build_attrs.slice!(:name,
:commands,
:tag_list,
:options,
:allow_failure,
:stage,
:stage_idx,
:environment,
:when,
:yaml_variables)
build_attrs.merge!(pipeline: @pipeline,
ref: @pipeline.ref,
tag: @pipeline.tag,
trigger_request: trigger_request,
user: user,
project: @pipeline.project)
# TODO: The proper implementation for this is in
# https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/5295
build_attrs[:status] = 'skipped' if build_attrs[:when] == 'manual'
##
# We do not persist new builds here.
# Those will be persisted when @pipeline is saved.
#
@pipeline.builds.new(build_attrs)
end
end
end
end
module Ci
class CreatePipelineBuildsService < BaseService
attr_reader :pipeline
def execute(pipeline)
@pipeline = pipeline
new_builds.map do |build_attributes|
create_build(build_attributes)
end
end
private
def create_build(build_attributes)
build_attributes = build_attributes.merge(
pipeline: pipeline,
project: pipeline.project,
ref: pipeline.ref,
tag: pipeline.tag,
user: current_user,
trigger_request: trigger_request
)
pipeline.builds.create(build_attributes)
end
def new_builds
@new_builds ||= pipeline.config_builds_attributes.
reject { |build| existing_build_names.include?(build[:name]) }
end
def existing_build_names
@existing_build_names ||= pipeline.builds.pluck(:name)
end
def trigger_request
return @trigger_request if defined?(@trigger_request)
@trigger_request ||= pipeline.trigger_requests.first
end
end
end
module Ci module Ci
class CreatePipelineService < BaseService class CreatePipelineService < BaseService
def execute attr_reader :pipeline
pipeline = project.pipelines.new(params)
pipeline.user = current_user
unless ref_names.include?(params[:ref]) def execute(ignore_skip_ci: false, save_on_errors: true, trigger_request: nil)
pipeline.errors.add(:base, 'Reference not found') @pipeline = Ci::Pipeline.new(
return pipeline project: project,
ref: ref,
sha: sha,
before_sha: before_sha,
tag: tag?,
trigger_requests: Array(trigger_request),
user: current_user
)
unless project.builds_enabled?
return error('Pipeline is disabled')
end
unless trigger_request || can?(current_user, :create_pipeline, project)
return error('Insufficient permissions to create a new pipeline')
end end
if commit unless branch? || tag?
pipeline.sha = commit.id return error('Reference not found')
else
pipeline.errors.add(:base, 'Commit not found')
return pipeline
end end
unless can?(current_user, :create_pipeline, project) unless commit
pipeline.errors.add(:base, 'Insufficient permissions to create a new pipeline') return error('Commit not found')
return pipeline
end end
unless pipeline.config_processor unless pipeline.config_processor
pipeline.errors.add(:base, pipeline.yaml_errors || 'Missing .gitlab-ci.yml file') unless pipeline.ci_yaml_file
return pipeline return error('Missing .gitlab-ci.yml file')
end
return error(pipeline.yaml_errors, save: save_on_errors)
end end
pipeline.save! if !ignore_skip_ci && skip_ci?
return error('Creation of pipeline is skipped', save: save_on_errors)
end
unless pipeline.create_builds(current_user) unless pipeline.config_builds_attributes.present?
pipeline.errors.add(:base, 'No builds for this pipeline.') return error('No builds for this pipeline.')
end end
pipeline.save pipeline.save
pipeline.process!
pipeline pipeline
end end
private private
def ref_names def skip_ci?
@ref_names ||= project.repository.ref_names pipeline.git_commit_message =~ /\[(ci skip|skip ci)\]/i if pipeline.git_commit_message
end end
def commit def commit
@commit ||= project.commit(params[:ref]) @commit ||= project.commit(origin_sha || origin_ref)
end
def sha
commit.try(:id)
end
def before_sha
params[:checkout_sha] || params[:before] || Gitlab::Git::BLANK_SHA
end
def origin_sha
params[:checkout_sha] || params[:after]
end
def origin_ref
params[:ref]
end
def branch?
project.repository.ref_exists?(Gitlab::Git::BRANCH_REF_PREFIX + ref)
end
def tag?
project.repository.ref_exists?(Gitlab::Git::TAG_REF_PREFIX + ref)
end
def ref
Gitlab::Git.ref_name(origin_ref)
end
def valid_sha?
origin_sha && origin_sha != Gitlab::Git::BLANK_SHA
end
def error(message, save: false)
pipeline.errors.add(:base, message)
pipeline.reload_status! if save
pipeline
end end
end end
end end
module Ci module Ci
class CreateTriggerRequestService class CreateTriggerRequestService
def execute(project, trigger, ref, variables = nil) def execute(project, trigger, ref, variables = nil)
commit = project.commit(ref) trigger_request = trigger.trigger_requests.create(variables: variables)
return unless commit
# check if ref is tag pipeline = Ci::CreatePipelineService.new(project, nil, ref: ref).
tag = project.repository.find_tag(ref).present? execute(ignore_skip_ci: true, trigger_request: trigger_request)
if pipeline.persisted?
pipeline = project.pipelines.create(sha: commit.sha, ref: ref, tag: tag)
trigger_request = trigger.trigger_requests.create!(
variables: variables,
pipeline: pipeline,
)
if pipeline.create_builds(nil, trigger_request)
trigger_request trigger_request
end end
end end
......
module Ci
class ProcessPipelineService < BaseService
attr_reader :pipeline
def execute(pipeline)
@pipeline = pipeline
# This method will ensure that our pipeline does have all builds for all stages created
if created_builds.empty?
create_builds!
end
new_builds =
stage_indexes_of_created_builds.map do |index|
process_stage(index)
end
# Return a flag if a when builds got enqueued
new_builds.flatten.any?
end
private
def create_builds!
Ci::CreatePipelineBuildsService.new(project, current_user).execute(pipeline)
end
def process_stage(index)
current_status = status_for_prior_stages(index)
created_builds_in_stage(index).select do |build|
process_build(build, current_status)
end
end
def process_build(build, current_status)
return false unless Statuseable::COMPLETED_STATUSES.include?(current_status)
if valid_statuses_for_when(build.when).include?(current_status)
build.queue
true
else
build.skip
false
end
end
def valid_statuses_for_when(value)
case value
when 'on_success'
%w[success]
when 'on_failure'
%w[failed]
when 'always'
%w[success failed]
else
[]
end
end
def status_for_prior_stages(index)
pipeline.builds.where('stage_idx < ?', index).latest.status || 'success'
end
def stage_indexes_of_created_builds
created_builds.order(:stage_idx).pluck('distinct stage_idx')
end
def created_builds_in_stage(index)
created_builds.where(stage_idx: index)
end
def created_builds
pipeline.builds.created
end
end
end
class CreateCommitBuildsService
def execute(project, user, params)
return unless project.builds_enabled?
before_sha = params[:checkout_sha] || params[:before]
sha = params[:checkout_sha] || params[:after]
origin_ref = params[:ref]
ref = Gitlab::Git.ref_name(origin_ref)
tag = Gitlab::Git.tag_ref?(origin_ref)
# Skip branch removal
if sha == Gitlab::Git::BLANK_SHA
return false
end
@pipeline = Ci::Pipeline.new(
project: project,
sha: sha,
ref: ref,
before_sha: before_sha,
tag: tag,
user: user)
##
# Skip creating pipeline if no gitlab-ci.yml is found
#
unless @pipeline.ci_yaml_file
return false
end
##
# Skip creating builds for commits that have [ci skip]
# but save pipeline object
#
if @pipeline.skip_ci?
return save_pipeline!
end
##
# Skip creating builds when CI config is invalid
# but save pipeline object
#
unless @pipeline.config_processor
return save_pipeline!
end
##
# Skip creating pipeline object if there are no builds for it.
#
unless @pipeline.create_builds(user)
@pipeline.errors.add(:base, 'No builds created')
return false
end
save_pipeline!
end
private
##
# Create a new pipeline and touch object to calculate status
#
def save_pipeline!
@pipeline.save!
@pipeline.touch
@pipeline
end
end
...@@ -69,7 +69,7 @@ class GitPushService < BaseService ...@@ -69,7 +69,7 @@ class GitPushService < BaseService
SystemHooksService.new.execute_hooks(build_push_data_system_hook.dup, :push_hooks) SystemHooksService.new.execute_hooks(build_push_data_system_hook.dup, :push_hooks)
@project.execute_hooks(build_push_data.dup, :push_hooks) @project.execute_hooks(build_push_data.dup, :push_hooks)
@project.execute_services(build_push_data.dup, :push_hooks) @project.execute_services(build_push_data.dup, :push_hooks)
CreateCommitBuildsService.new.execute(@project, current_user, build_push_data) Ci::CreatePipelineService.new(project, current_user, build_push_data).execute
ProjectCacheWorker.perform_async(@project.id) ProjectCacheWorker.perform_async(@project.id)
end end
......
...@@ -11,7 +11,7 @@ class GitTagPushService < BaseService ...@@ -11,7 +11,7 @@ class GitTagPushService < BaseService
SystemHooksService.new.execute_hooks(build_system_push_data.dup, :tag_push_hooks) SystemHooksService.new.execute_hooks(build_system_push_data.dup, :tag_push_hooks)
project.execute_hooks(@push_data.dup, :tag_push_hooks) project.execute_hooks(@push_data.dup, :tag_push_hooks)
project.execute_services(@push_data.dup, :tag_push_hooks) project.execute_services(@push_data.dup, :tag_push_hooks)
CreateCommitBuildsService.new.execute(project, current_user, @push_data) Ci::CreatePipelineService.new(project, current_user, @push_data).execute
ProjectCacheWorker.perform_async(project.id) ProjectCacheWorker.perform_async(project.id)
true true
......
...@@ -33,7 +33,7 @@ ...@@ -33,7 +33,7 @@
Cant find HEAD commit for this branch Cant find HEAD commit for this branch
- stages_status = pipeline.statuses.latest.stages_status - stages_status = pipeline.statuses.relevant.latest.stages_status
- stages.each do |stage| - stages.each do |stage|
%td.stage-cell %td.stage-cell
- status = stages_status[stage] - status = stages_status[stage]
......
...@@ -46,5 +46,5 @@ ...@@ -46,5 +46,5 @@
- if pipeline.project.build_coverage_enabled? - if pipeline.project.build_coverage_enabled?
%th Coverage %th Coverage
%th %th
- pipeline.statuses.stages.each do |stage| - pipeline.statuses.relevant.stages.each do |stage|
= render 'projects/commit/ci_stage', stage: stage, statuses: pipeline.statuses.where(stage: stage) = render 'projects/commit/ci_stage', stage: stage, statuses: pipeline.statuses.relevant.where(stage: stage)
class AddQueuedAtToCiBuilds < ActiveRecord::Migration
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
def change
add_column :ci_builds, :queued_at, :timestamp
end
end
...@@ -170,6 +170,7 @@ ActiveRecord::Schema.define(version: 20160810142633) do ...@@ -170,6 +170,7 @@ ActiveRecord::Schema.define(version: 20160810142633) do
t.integer "artifacts_size" t.integer "artifacts_size"
t.string "when" t.string "when"
t.text "yaml_variables" t.text "yaml_variables"
t.datetime "queued_at"
end end
add_index "ci_builds", ["commit_id", "stage_idx", "created_at"], name: "index_ci_builds_on_commit_id_and_stage_idx_and_created_at", using: :btree add_index "ci_builds", ["commit_id", "stage_idx", "created_at"], name: "index_ci_builds_on_commit_id_and_stage_idx_and_created_at", using: :btree
......
...@@ -10,20 +10,22 @@ module SharedBuilds ...@@ -10,20 +10,22 @@ module SharedBuilds
end end
step 'project has a recent build' do step 'project has a recent build' do
@pipeline = create(:ci_pipeline, project: @project, sha: @project.commit.sha, ref: 'master') @pipeline = create(:ci_empty_pipeline, project: @project, sha: @project.commit.sha, ref: 'master')
@build = create(:ci_build_with_coverage, pipeline: @pipeline) @build = create(:ci_build_with_coverage, pipeline: @pipeline)
@pipeline.reload_status!
end end
step 'recent build is successful' do step 'recent build is successful' do
@build.update(status: 'success') @build.success
end end
step 'recent build failed' do step 'recent build failed' do
@build.update(status: 'failed') @build.drop
end end
step 'project has another build that is running' do step 'project has another build that is running' do
create(:ci_build, pipeline: @pipeline, name: 'second build', status: 'running') create(:ci_build, pipeline: @pipeline, name: 'second build', status: 'running')
@pipeline.reload_status!
end end
step 'I visit recent build details page' do step 'I visit recent build details page' do
......
...@@ -62,7 +62,7 @@ module Ci ...@@ -62,7 +62,7 @@ module Ci
# - before script should be a concatenated command # - before script should be a concatenated command
commands: [job[:before_script] || @before_script, job[:script]].flatten.compact.join("\n"), commands: [job[:before_script] || @before_script, job[:script]].flatten.compact.join("\n"),
tag_list: job[:tags] || [], tag_list: job[:tags] || [],
name: job[:name], name: job[:name].to_s,
allow_failure: job[:allow_failure] || false, allow_failure: job[:allow_failure] || false,
when: job[:when] || 'on_success', when: job[:when] || 'on_success',
environment: job[:environment], environment: job[:environment],
......
...@@ -7,6 +7,7 @@ FactoryGirl.define do ...@@ -7,6 +7,7 @@ FactoryGirl.define do
stage_idx 0 stage_idx 0
ref 'master' ref 'master'
tag false tag false
status 'pending'
created_at 'Di 29. Okt 09:50:00 CET 2013' created_at 'Di 29. Okt 09:50:00 CET 2013'
started_at 'Di 29. Okt 09:51:28 CET 2013' started_at 'Di 29. Okt 09:51:28 CET 2013'
finished_at 'Di 29. Okt 09:53:28 CET 2013' finished_at 'Di 29. Okt 09:53:28 CET 2013'
...@@ -45,6 +46,10 @@ FactoryGirl.define do ...@@ -45,6 +46,10 @@ FactoryGirl.define do
status 'pending' status 'pending'
end end
trait :created do
status 'created'
end
trait :manual do trait :manual do
status 'skipped' status 'skipped'
self.when 'manual' self.when 'manual'
......
...@@ -18,7 +18,9 @@ ...@@ -18,7 +18,9 @@
FactoryGirl.define do FactoryGirl.define do
factory :ci_empty_pipeline, class: Ci::Pipeline do factory :ci_empty_pipeline, class: Ci::Pipeline do
ref 'master'
sha '97de212e80737a608d939f648d959671fb0a0142' sha '97de212e80737a608d939f648d959671fb0a0142'
status 'pending'
project factory: :empty_project project factory: :empty_project
......
...@@ -7,6 +7,30 @@ FactoryGirl.define do ...@@ -7,6 +7,30 @@ FactoryGirl.define do
started_at 'Tue, 26 Jan 2016 08:21:42 +0100' started_at 'Tue, 26 Jan 2016 08:21:42 +0100'
finished_at 'Tue, 26 Jan 2016 08:23:42 +0100' finished_at 'Tue, 26 Jan 2016 08:23:42 +0100'
trait :success do
status 'success'
end
trait :failed do
status 'failed'
end
trait :canceled do
status 'canceled'
end
trait :running do
status 'running'
end
trait :pending do
status 'pending'
end
trait :created do
status 'created'
end
after(:build) do |build, evaluator| after(:build) do |build, evaluator|
build.project = build.pipeline.project build.project = build.pipeline.project
end end
......
...@@ -29,12 +29,16 @@ feature 'Merge request created from fork' do ...@@ -29,12 +29,16 @@ feature 'Merge request created from fork' do
include WaitForAjax include WaitForAjax
given(:pipeline) do given(:pipeline) do
create(:ci_pipeline_with_two_job, project: fork_project, create(:ci_pipeline,
project: fork_project,
sha: merge_request.diff_head_sha, sha: merge_request.diff_head_sha,
ref: merge_request.source_branch) ref: merge_request.source_branch)
end end
background { pipeline.create_builds(user) } background do
create(:ci_build, pipeline: pipeline, name: 'rspec')
create(:ci_build, pipeline: pipeline, name: 'spinach')
end
scenario 'user visits a pipelines page', js: true do scenario 'user visits a pipelines page', js: true do
visit_merge_request(merge_request) visit_merge_request(merge_request)
......
...@@ -33,7 +33,10 @@ describe "Pipelines" do ...@@ -33,7 +33,10 @@ describe "Pipelines" do
context 'cancelable pipeline' do context 'cancelable pipeline' do
let!(:running) { create(:ci_build, :running, pipeline: pipeline, stage: 'test', commands: 'test') } let!(:running) { create(:ci_build, :running, pipeline: pipeline, stage: 'test', commands: 'test') }
before { visit namespace_project_pipelines_path(project.namespace, project) } before do
pipeline.reload_status!
visit namespace_project_pipelines_path(project.namespace, project)
end
it { expect(page).to have_link('Cancel') } it { expect(page).to have_link('Cancel') }
it { expect(page).to have_selector('.ci-running') } it { expect(page).to have_selector('.ci-running') }
...@@ -49,7 +52,10 @@ describe "Pipelines" do ...@@ -49,7 +52,10 @@ describe "Pipelines" do
context 'retryable pipelines' do context 'retryable pipelines' do
let!(:failed) { create(:ci_build, :failed, pipeline: pipeline, stage: 'test', commands: 'test') } let!(:failed) { create(:ci_build, :failed, pipeline: pipeline, stage: 'test', commands: 'test') }
before { visit namespace_project_pipelines_path(project.namespace, project) } before do
pipeline.reload_status!
visit namespace_project_pipelines_path(project.namespace, project)
end
it { expect(page).to have_link('Retry') } it { expect(page).to have_link('Retry') }
it { expect(page).to have_selector('.ci-failed') } it { expect(page).to have_selector('.ci-failed') }
...@@ -80,7 +86,10 @@ describe "Pipelines" do ...@@ -80,7 +86,10 @@ describe "Pipelines" do
context 'when running' do context 'when running' do
let!(:running) { create(:generic_commit_status, status: 'running', pipeline: pipeline, stage: 'test') } let!(:running) { create(:generic_commit_status, status: 'running', pipeline: pipeline, stage: 'test') }
before { visit namespace_project_pipelines_path(project.namespace, project) } before do
pipeline.reload_status!
visit namespace_project_pipelines_path(project.namespace, project)
end
it 'is not cancelable' do it 'is not cancelable' do
expect(page).not_to have_link('Cancel') expect(page).not_to have_link('Cancel')
...@@ -92,9 +101,12 @@ describe "Pipelines" do ...@@ -92,9 +101,12 @@ describe "Pipelines" do
end end
context 'when failed' do context 'when failed' do
let!(:running) { create(:generic_commit_status, status: 'failed', pipeline: pipeline, stage: 'test') } let!(:failed) { create(:generic_commit_status, status: 'failed', pipeline: pipeline, stage: 'test') }
before { visit namespace_project_pipelines_path(project.namespace, project) } before do
pipeline.reload_status!
visit namespace_project_pipelines_path(project.namespace, project)
end
it 'is not retryable' do it 'is not retryable' do
expect(page).not_to have_link('Retry') expect(page).not_to have_link('Retry')
...@@ -211,7 +223,7 @@ describe "Pipelines" do ...@@ -211,7 +223,7 @@ describe "Pipelines" do
context 'for invalid commit' do context 'for invalid commit' do
before do before do
fill_in('Create for', with: 'invalid reference') fill_in('Create for', with: 'invalid-reference')
click_on 'Create pipeline' click_on 'Create pipeline'
end end
......
...@@ -5,6 +5,7 @@ describe Ci::Charts, lib: true do ...@@ -5,6 +5,7 @@ describe Ci::Charts, lib: true do
before do before do
@pipeline = FactoryGirl.create(:ci_pipeline) @pipeline = FactoryGirl.create(:ci_pipeline)
FactoryGirl.create(:ci_build, pipeline: @pipeline) FactoryGirl.create(:ci_build, pipeline: @pipeline)
@pipeline.reload_status!
end end
it 'returns build times in minutes' do it 'returns build times in minutes' do
......
...@@ -19,7 +19,7 @@ module Ci ...@@ -19,7 +19,7 @@ module Ci
expect(config_processor.builds_for_stage_and_ref(type, "master").first).to eq({ expect(config_processor.builds_for_stage_and_ref(type, "master").first).to eq({
stage: "test", stage: "test",
stage_idx: 1, stage_idx: 1,
name: :rspec, name: "rspec",
commands: "pwd\nrspec", commands: "pwd\nrspec",
tag_list: [], tag_list: [],
options: {}, options: {},
...@@ -433,7 +433,7 @@ module Ci ...@@ -433,7 +433,7 @@ module Ci
expect(config_processor.builds_for_stage_and_ref("test", "master").first).to eq({ expect(config_processor.builds_for_stage_and_ref("test", "master").first).to eq({
stage: "test", stage: "test",
stage_idx: 1, stage_idx: 1,
name: :rspec, name: "rspec",
commands: "pwd\nrspec", commands: "pwd\nrspec",
tag_list: [], tag_list: [],
options: { options: {
...@@ -461,7 +461,7 @@ module Ci ...@@ -461,7 +461,7 @@ module Ci
expect(config_processor.builds_for_stage_and_ref("test", "master").first).to eq({ expect(config_processor.builds_for_stage_and_ref("test", "master").first).to eq({
stage: "test", stage: "test",
stage_idx: 1, stage_idx: 1,
name: :rspec, name: "rspec",
commands: "pwd\nrspec", commands: "pwd\nrspec",
tag_list: [], tag_list: [],
options: { options: {
...@@ -700,7 +700,7 @@ module Ci ...@@ -700,7 +700,7 @@ module Ci
expect(config_processor.builds_for_stage_and_ref("test", "master").first).to eq({ expect(config_processor.builds_for_stage_and_ref("test", "master").first).to eq({
stage: "test", stage: "test",
stage_idx: 1, stage_idx: 1,
name: :rspec, name: "rspec",
commands: "pwd\nrspec", commands: "pwd\nrspec",
tag_list: [], tag_list: [],
options: { options: {
...@@ -837,7 +837,7 @@ module Ci ...@@ -837,7 +837,7 @@ module Ci
expect(subject.first).to eq({ expect(subject.first).to eq({
stage: "test", stage: "test",
stage_idx: 1, stage_idx: 1,
name: :normal_job, name: "normal_job",
commands: "test", commands: "test",
tag_list: [], tag_list: [],
options: {}, options: {},
...@@ -882,7 +882,7 @@ module Ci ...@@ -882,7 +882,7 @@ module Ci
expect(subject.first).to eq({ expect(subject.first).to eq({
stage: "build", stage: "build",
stage_idx: 0, stage_idx: 0,
name: :job1, name: "job1",
commands: "execute-script-for-job", commands: "execute-script-for-job",
tag_list: [], tag_list: [],
options: {}, options: {},
...@@ -894,7 +894,7 @@ module Ci ...@@ -894,7 +894,7 @@ module Ci
expect(subject.second).to eq({ expect(subject.second).to eq({
stage: "build", stage: "build",
stage_idx: 0, stage_idx: 0,
name: :job2, name: "job2",
commands: "execute-script-for-job", commands: "execute-script-for-job",
tag_list: [], tag_list: [],
options: {}, options: {},
......
...@@ -96,7 +96,8 @@ describe Gitlab::Badge::Build do ...@@ -96,7 +96,8 @@ describe Gitlab::Badge::Build do
end end
def create_build(project, sha, branch) def create_build(project, sha, branch)
pipeline = create(:ci_pipeline, project: project, pipeline = create(:ci_empty_pipeline,
project: project,
sha: sha, sha: sha,
ref: branch) ref: branch)
......
...@@ -764,6 +764,53 @@ describe Ci::Build, models: true do ...@@ -764,6 +764,53 @@ describe Ci::Build, models: true do
end end
end end
describe '#when' do
subject { build.when }
context 'if is undefined' do
before do
build.when = nil
end
context 'use from gitlab-ci.yml' do
before do
stub_ci_pipeline_yaml_file(config)
end
context 'if config is not found' do
let(:config) { nil }
it { is_expected.to eq('on_success') }
end
context 'if config does not have a questioned job' do
let(:config) do
YAML.dump({
test_other: {
script: 'Hello World'
}
})
end
it { is_expected.to eq('on_success') }
end
context 'if config has when' do
let(:config) do
YAML.dump({
test: {
script: 'Hello World',
when: 'always'
}
})
end
it { is_expected.to eq('always') }
end
end
end
end
describe '#retryable?' do describe '#retryable?' do
context 'when build is running' do context 'when build is running' do
before do before do
......
...@@ -38,9 +38,6 @@ describe Ci::Pipeline, models: true do ...@@ -38,9 +38,6 @@ describe Ci::Pipeline, models: true do
it { expect(pipeline.sha).to start_with(subject) } it { expect(pipeline.sha).to start_with(subject) }
end end
describe '#create_next_builds' do
end
describe '#retried' do describe '#retried' do
subject { pipeline.retried } subject { pipeline.retried }
...@@ -54,304 +51,20 @@ describe Ci::Pipeline, models: true do ...@@ -54,304 +51,20 @@ describe Ci::Pipeline, models: true do
end end
end end
describe '#create_builds' do
let!(:pipeline) { FactoryGirl.create :ci_pipeline, project: project, ref: 'master', tag: false }
def create_builds(trigger_request = nil)
pipeline.create_builds(nil, trigger_request)
end
def create_next_builds
pipeline.create_next_builds(pipeline.builds.order(:id).last)
end
it 'creates builds' do
expect(create_builds).to be_truthy
pipeline.builds.update_all(status: "success")
expect(pipeline.builds.count(:all)).to eq(2)
expect(create_next_builds).to be_truthy
pipeline.builds.update_all(status: "success")
expect(pipeline.builds.count(:all)).to eq(4)
expect(create_next_builds).to be_truthy
pipeline.builds.update_all(status: "success")
expect(pipeline.builds.count(:all)).to eq(5)
expect(create_next_builds).to be_falsey
end
context 'custom stage with first job allowed to fail' do
let(:yaml) do
{
stages: ['clean', 'test'],
clean_job: {
stage: 'clean',
allow_failure: true,
script: 'BUILD',
},
test_job: {
stage: 'test',
script: 'TEST',
},
}
end
before do
stub_ci_pipeline_yaml_file(YAML.dump(yaml))
create_builds
end
it 'properly schedules builds' do
expect(pipeline.builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(pipeline.builds.pluck(:status)).to contain_exactly('pending', 'failed')
end
end
context 'properly creates builds when "when" is defined' do
let(:yaml) do
{
stages: ["build", "test", "test_failure", "deploy", "cleanup"],
build: {
stage: "build",
script: "BUILD",
},
test: {
stage: "test",
script: "TEST",
},
test_failure: {
stage: "test_failure",
script: "ON test failure",
when: "on_failure",
},
deploy: {
stage: "deploy",
script: "PUBLISH",
},
cleanup: {
stage: "cleanup",
script: "TIDY UP",
when: "always",
}
}
end
before do
stub_ci_pipeline_yaml_file(YAML.dump(yaml))
end
context 'when builds are successful' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(pipeline.builds.pluck(:name)).to contain_exactly('build')
expect(pipeline.builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy', 'cleanup')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'success', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'success', 'success', 'success')
pipeline.reload
expect(pipeline.status).to eq('success')
end
end
context 'when test job fails' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(pipeline.builds.pluck(:name)).to contain_exactly('build')
expect(pipeline.builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure', 'cleanup')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'failed', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'failed', 'success', 'success')
pipeline.reload
expect(pipeline.status).to eq('failed')
end
end
context 'when test and test_failure jobs fail' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(pipeline.builds.pluck(:name)).to contain_exactly('build')
expect(pipeline.builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure', 'cleanup')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'failed', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure', 'cleanup')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'failed', 'failed', 'success')
pipeline.reload
expect(pipeline.status).to eq('failed')
end
end
context 'when deploy job fails' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(pipeline.builds.pluck(:name)).to contain_exactly('build')
expect(pipeline.builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy', 'cleanup')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'success', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'success', 'failed', 'success')
pipeline.reload
expect(pipeline.status).to eq('failed')
end
end
context 'when build is canceled in the second stage' do
it 'does not schedule builds after build has been canceled' do
expect(create_builds).to be_truthy
expect(pipeline.builds.pluck(:name)).to contain_exactly('build')
expect(pipeline.builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.running_or_pending).not_to be_empty
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:cancel)
expect(pipeline.builds.running_or_pending).to be_empty
expect(pipeline.reload.status).to eq('canceled')
end
end
context 'when listing manual actions' do
let(:yaml) do
{
stages: ["build", "test", "staging", "production", "cleanup"],
build: {
stage: "build",
script: "BUILD",
},
test: {
stage: "test",
script: "TEST",
},
staging: {
stage: "staging",
script: "PUBLISH",
},
production: {
stage: "production",
script: "PUBLISH",
when: "manual",
},
cleanup: {
stage: "cleanup",
script: "TIDY UP",
when: "always",
},
clear_cache: {
stage: "cleanup",
script: "CLEAR CACHE",
when: "manual",
}
}
end
it 'returns only for skipped builds' do
# currently all builds are created
expect(create_builds).to be_truthy
expect(manual_actions).to be_empty
# succeed stage build
pipeline.builds.running_or_pending.each(&:success)
expect(manual_actions).to be_empty
# succeed stage test
pipeline.builds.running_or_pending.each(&:success)
expect(manual_actions).to be_empty
# succeed stage staging and skip stage production
pipeline.builds.running_or_pending.each(&:success)
expect(manual_actions).to be_many # production and clear cache
# succeed stage cleanup
pipeline.builds.running_or_pending.each(&:success)
# after processing a pipeline we should have 6 builds, 5 succeeded
expect(pipeline.builds.count).to eq(6)
expect(pipeline.builds.success.count).to eq(4)
end
def manual_actions
pipeline.manual_actions
end
end
end
context 'when no builds created' do
let(:pipeline) { build(:ci_pipeline) }
before do
stub_ci_pipeline_yaml_file(YAML.dump(before_script: ['ls']))
end
it 'returns false' do
expect(pipeline.create_builds(nil)).to be_falsey
expect(pipeline).not_to be_persisted
end
end
end
describe "#finished_at" do describe "#finished_at" do
let(:pipeline) { FactoryGirl.create :ci_pipeline } let(:pipeline) { FactoryGirl.create :ci_pipeline }
it "returns finished_at of latest build" do it "returns finished_at of latest build" do
build = FactoryGirl.create :ci_build, pipeline: pipeline, finished_at: Time.now - 60 build = FactoryGirl.create :ci_build, pipeline: pipeline, finished_at: Time.now - 60
FactoryGirl.create :ci_build, pipeline: pipeline, finished_at: Time.now - 120 FactoryGirl.create :ci_build, pipeline: pipeline, finished_at: Time.now - 120
pipeline.reload_status!
expect(pipeline.finished_at.to_i).to eq(build.finished_at.to_i) expect(pipeline.finished_at.to_i).to eq(build.finished_at.to_i)
end end
it "returns nil if there is no finished build" do it "returns nil if there is no finished build" do
FactoryGirl.create :ci_not_started_build, pipeline: pipeline FactoryGirl.create :ci_not_started_build, pipeline: pipeline
pipeline.reload_status!
expect(pipeline.finished_at).to be_nil expect(pipeline.finished_at).to be_nil
end end
...@@ -359,7 +72,7 @@ describe Ci::Pipeline, models: true do ...@@ -359,7 +72,7 @@ describe Ci::Pipeline, models: true do
describe "coverage" do describe "coverage" do
let(:project) { FactoryGirl.create :empty_project, build_coverage_regex: "/.*/" } let(:project) { FactoryGirl.create :empty_project, build_coverage_regex: "/.*/" }
let(:pipeline) { FactoryGirl.create :ci_pipeline, project: project } let(:pipeline) { FactoryGirl.create :ci_empty_pipeline, project: project }
it "calculates average when there are two builds with coverage" do it "calculates average when there are two builds with coverage" do
FactoryGirl.create :ci_build, name: "rspec", coverage: 30, pipeline: pipeline FactoryGirl.create :ci_build, name: "rspec", coverage: 30, pipeline: pipeline
...@@ -426,31 +139,30 @@ describe Ci::Pipeline, models: true do ...@@ -426,31 +139,30 @@ describe Ci::Pipeline, models: true do
end end
end end
describe '#update_state' do describe '#reload_status!' do
it 'executes update_state after touching object' do let(:pipeline) { create :ci_empty_pipeline, project: project }
expect(pipeline).to receive(:update_state).and_return(true)
pipeline.touch
end
context 'dependent objects' do context 'dependent objects' do
let(:commit_status) { build :commit_status, pipeline: pipeline } let(:commit_status) { create :commit_status, :pending, pipeline: pipeline }
it 'executes reload_status! after succeeding dependent object' do
expect(pipeline).to receive(:reload_status!).and_return(true)
it 'executes update_state after saving dependent object' do commit_status.success
expect(pipeline).to receive(:update_state).and_return(true)
commit_status.save
end end
end end
context 'update state' do context 'updates' do
let(:current) { Time.now.change(usec: 0) } let(:current) { Time.now.change(usec: 0) }
let(:build) { FactoryGirl.create :ci_build, :success, pipeline: pipeline, started_at: current - 120, finished_at: current - 60 } let(:build) { FactoryGirl.create :ci_build, pipeline: pipeline, started_at: current - 120, finished_at: current - 60 }
before do before do
build build
pipeline.reload_status!
end end
[:status, :started_at, :finished_at, :duration].each do |param| [:status, :started_at, :finished_at, :duration].each do |param|
it "update #{param}" do it "#{param}" do
expect(pipeline.send(param)).to eq(build.send(param)) expect(pipeline.send(param)).to eq(build.send(param))
end end
end end
......
...@@ -291,7 +291,8 @@ describe HipchatService, models: true do ...@@ -291,7 +291,8 @@ describe HipchatService, models: true do
end end
context 'build events' do context 'build events' do
let(:build) { create(:ci_build) } let(:pipeline) { create(:ci_empty_pipeline) }
let(:build) { create(:ci_build, pipeline: pipeline) }
let(:data) { Gitlab::BuildDataBuilder.build(build) } let(:data) { Gitlab::BuildDataBuilder.build(build) }
context 'for failed' do context 'for failed' do
......
...@@ -238,6 +238,10 @@ describe API::API, api: true do ...@@ -238,6 +238,10 @@ describe API::API, api: true do
it { expect(response.headers).to include(download_headers) } it { expect(response.headers).to include(download_headers) }
end end
before do
pipeline.reload_status!
end
context 'with regular branch' do context 'with regular branch' do
before do before do
pipeline.update(ref: 'master', pipeline.update(ref: 'master',
......
...@@ -89,16 +89,29 @@ describe API::API, api: true do ...@@ -89,16 +89,29 @@ describe API::API, api: true do
it "returns nil for commit without CI" do it "returns nil for commit without CI" do
get api("/projects/#{project.id}/repository/commits/#{project.repository.commit.id}", user) get api("/projects/#{project.id}/repository/commits/#{project.repository.commit.id}", user)
expect(response).to have_http_status(200) expect(response).to have_http_status(200)
expect(json_response['status']).to be_nil expect(json_response['status']).to be_nil
end end
it "returns status for CI" do it "returns status for CI" do
pipeline = project.ensure_pipeline(project.repository.commit.sha, 'master') pipeline = project.ensure_pipeline(project.repository.commit.sha, 'master')
pipeline.update(status: 'success')
get api("/projects/#{project.id}/repository/commits/#{project.repository.commit.id}", user) get api("/projects/#{project.id}/repository/commits/#{project.repository.commit.id}", user)
expect(response).to have_http_status(200) expect(response).to have_http_status(200)
expect(json_response['status']).to eq(pipeline.status) expect(json_response['status']).to eq(pipeline.status)
end end
it "returns status for CI when pipeline is created" do
project.ensure_pipeline(project.repository.commit.sha, 'master')
get api("/projects/#{project.id}/repository/commits/#{project.repository.commit.id}", user)
expect(response).to have_http_status(200)
expect(json_response['status']).to be_nil
end
end end
context "unauthorized user" do context "unauthorized user" do
......
...@@ -50,7 +50,8 @@ describe API::API do ...@@ -50,7 +50,8 @@ describe API::API do
post api("/projects/#{project.id}/trigger/builds"), options.merge(ref: 'master') post api("/projects/#{project.id}/trigger/builds"), options.merge(ref: 'master')
expect(response).to have_http_status(201) expect(response).to have_http_status(201)
pipeline.builds.reload pipeline.builds.reload
expect(pipeline.builds.size).to eq(2) expect(pipeline.builds.pending.size).to eq(2)
expect(pipeline.builds.size).to eq(5)
end end
it 'returns bad request with no builds created if there\'s no commit for that ref' do it 'returns bad request with no builds created if there\'s no commit for that ref' do
......
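The updated expectation distinguishes between builds that merely exist and builds that are actually queued. In terms of the numbers asserted above, and assuming a `created` scope analogous to the `pending` one used in the spec:

```ruby
# Illustration of the updated trigger expectations (numbers taken from the spec above)
pipeline.builds.size          #=> 5  - builds for every stage exist right away
pipeline.builds.pending.size  #=> 2  - only the first stage has been queued
pipeline.builds.created.size  #=> 3  - later stages wait in the 'created' state (assumed scope)
```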
...@@ -6,90 +6,74 @@ describe Ci::API::API do ...@@ -6,90 +6,74 @@ describe Ci::API::API do
let(:runner) { FactoryGirl.create(:ci_runner, tag_list: ["mysql", "ruby"]) } let(:runner) { FactoryGirl.create(:ci_runner, tag_list: ["mysql", "ruby"]) }
let(:project) { FactoryGirl.create(:empty_project) } let(:project) { FactoryGirl.create(:empty_project) }
before do
stub_ci_pipeline_to_return_yaml_file
end
describe "Builds API for runners" do describe "Builds API for runners" do
let(:shared_runner) { FactoryGirl.create(:ci_runner, token: "SharedRunner") } let(:pipeline) { create(:ci_pipeline_without_jobs, project: project, ref: 'master') }
let(:shared_project) { FactoryGirl.create(:empty_project, name: "SharedProject") }
before do before do
FactoryGirl.create :ci_runner_project, project: project, runner: runner project.runners << runner
end end
describe "POST /builds/register" do describe "POST /builds/register" do
it "starts a build" do let!(:build) { create(:ci_build, pipeline: pipeline, name: 'spinach', stage: 'test', stage_idx: 0) }
pipeline = FactoryGirl.create(:ci_pipeline, project: project, ref: 'master')
pipeline.create_builds(nil)
build = pipeline.builds.first
post ci_api("/builds/register"), token: runner.token, info: { platform: :darwin } it "starts a build" do
register_builds info: { platform: :darwin }
expect(response).to have_http_status(201) expect(response).to have_http_status(201)
expect(json_response['sha']).to eq(build.sha) expect(json_response['sha']).to eq(build.sha)
expect(runner.reload.platform).to eq("darwin") expect(runner.reload.platform).to eq("darwin")
expect(json_response["options"]).to eq({ "image" => "ruby:2.1", "services" => ["postgres"] })
expect(json_response["variables"]).to include(
{ "key" => "CI_BUILD_NAME", "value" => "spinach", "public" => true },
{ "key" => "CI_BUILD_STAGE", "value" => "test", "public" => true },
{ "key" => "DB_NAME", "value" => "postgres", "public" => true }
)
end end
it "returns 404 error if no pending build found" do context 'when builds are finished' do
post ci_api("/builds/register"), token: runner.token before do
build.success
expect(response).to have_http_status(404)
end end
it "returns 404 error if no builds for specific runner" do it "returns 404 error if no builds for specific runner" do
pipeline = FactoryGirl.create(:ci_pipeline, project: shared_project) register_builds
FactoryGirl.create(:ci_build, pipeline: pipeline, status: 'pending')
post ci_api("/builds/register"), token: runner.token
expect(response).to have_http_status(404) expect(response).to have_http_status(404)
end end
end
it "returns 404 error if no builds for shared runner" do context 'for other project with builds' do
pipeline = FactoryGirl.create(:ci_pipeline, project: project) before do
FactoryGirl.create(:ci_build, pipeline: pipeline, status: 'pending') build.success
create(:ci_build, :pending)
end
post ci_api("/builds/register"), token: shared_runner.token it "returns 404 error if no builds for shared runner" do
register_builds
expect(response).to have_http_status(404) expect(response).to have_http_status(404)
end end
end
it "returns options" do context 'for shared runner' do
pipeline = FactoryGirl.create(:ci_pipeline, project: project, ref: 'master') let(:shared_runner) { create(:ci_runner, token: "SharedRunner") }
pipeline.create_builds(nil)
post ci_api("/builds/register"), token: runner.token, info: { platform: :darwin } it "should return 404 error if no builds for shared runner" do
register_builds shared_runner.token
expect(response).to have_http_status(201) expect(response).to have_http_status(404)
expect(json_response["options"]).to eq({ "image" => "ruby:2.1", "services" => ["postgres"] }) end
end end
it "returns variables" do context 'for triggered build' do
pipeline = FactoryGirl.create(:ci_pipeline, project: project, ref: 'master') before do
pipeline.create_builds(nil) trigger = create(:ci_trigger, project: project)
create(:ci_trigger_request_with_variables, pipeline: pipeline, builds: [build], trigger: trigger)
project.variables << Ci::Variable.new(key: "SECRET_KEY", value: "secret_value") project.variables << Ci::Variable.new(key: "SECRET_KEY", value: "secret_value")
post ci_api("/builds/register"), token: runner.token, info: { platform: :darwin }
expect(response).to have_http_status(201)
expect(json_response["variables"]).to include(
{ "key" => "CI_BUILD_NAME", "value" => "spinach", "public" => true },
{ "key" => "CI_BUILD_STAGE", "value" => "test", "public" => true },
{ "key" => "DB_NAME", "value" => "postgres", "public" => true },
{ "key" => "SECRET_KEY", "value" => "secret_value", "public" => false }
)
end end
it "returns variables for triggers" do it "returns variables for triggers" do
trigger = FactoryGirl.create(:ci_trigger, project: project) register_builds info: { platform: :darwin }
pipeline = FactoryGirl.create(:ci_pipeline, project: project, ref: 'master')
trigger_request = FactoryGirl.create(:ci_trigger_request_with_variables, pipeline: pipeline, trigger: trigger)
pipeline.create_builds(nil, trigger_request)
project.variables << Ci::Variable.new(key: "SECRET_KEY", value: "secret_value")
post ci_api("/builds/register"), token: runner.token, info: { platform: :darwin }
expect(response).to have_http_status(201) expect(response).to have_http_status(201)
expect(json_response["variables"]).to include( expect(json_response["variables"]).to include(
...@@ -98,20 +82,26 @@ describe Ci::API::API do ...@@ -98,20 +82,26 @@ describe Ci::API::API do
{ "key" => "CI_BUILD_TRIGGERED", "value" => "true", "public" => true }, { "key" => "CI_BUILD_TRIGGERED", "value" => "true", "public" => true },
{ "key" => "DB_NAME", "value" => "postgres", "public" => true }, { "key" => "DB_NAME", "value" => "postgres", "public" => true },
{ "key" => "SECRET_KEY", "value" => "secret_value", "public" => false }, { "key" => "SECRET_KEY", "value" => "secret_value", "public" => false },
{ "key" => "TRIGGER_KEY_1", "value" => "TRIGGER_VALUE_1", "public" => false } { "key" => "TRIGGER_KEY_1", "value" => "TRIGGER_VALUE_1", "public" => false },
) )
end end
end
it "returns dependent builds" do context 'with multiple builds' do
pipeline = FactoryGirl.create(:ci_pipeline, project: project, ref: 'master') before do
pipeline.create_builds(nil, nil) build.success
pipeline.builds.where(stage: 'test').each(&:success) end
let!(:test_build) { create(:ci_build, pipeline: pipeline, name: 'deploy', stage: 'deploy', stage_idx: 1) }
post ci_api("/builds/register"), token: runner.token, info: { platform: :darwin } it "returns dependent builds" do
register_builds info: { platform: :darwin }
expect(response).to have_http_status(201) expect(response).to have_http_status(201)
expect(json_response["depends_on_builds"].count).to eq(2) expect(json_response["id"]).to eq(test_build.id)
expect(json_response["depends_on_builds"][0]["name"]).to eq("rspec") expect(json_response["depends_on_builds"].count).to eq(1)
expect(json_response["depends_on_builds"][0]).to include('id' => build.id, 'name' => 'spinach')
end
end end
%w(name version revision platform architecture).each do |param| %w(name version revision platform architecture).each do |param|
...@@ -121,8 +111,9 @@ describe Ci::API::API do ...@@ -121,8 +111,9 @@ describe Ci::API::API do
subject { runner.read_attribute(param.to_sym) } subject { runner.read_attribute(param.to_sym) }
it do it do
post ci_api("/builds/register"), token: runner.token, info: { param => value } register_builds info: { param => value }
expect(response).to have_http_status(404)
expect(response).to have_http_status(201)
runner.reload runner.reload
is_expected.to eq(value) is_expected.to eq(value)
end end
...@@ -131,8 +122,7 @@ describe Ci::API::API do ...@@ -131,8 +122,7 @@ describe Ci::API::API do
context 'when build has no tags' do context 'when build has no tags' do
before do before do
pipeline = create(:ci_pipeline, project: project) build.update(tags: [])
create(:ci_build, pipeline: pipeline, tags: [])
end end
context 'when runner is allowed to pick untagged builds' do context 'when runner is allowed to pick untagged builds' do
...@@ -154,17 +144,15 @@ describe Ci::API::API do ...@@ -154,17 +144,15 @@ describe Ci::API::API do
expect(response).to have_http_status 404 expect(response).to have_http_status 404
end end
end end
def register_builds
post ci_api("/builds/register"), token: runner.token,
info: { platform: :darwin }
end end
def register_builds(token = runner.token, **params)
post ci_api("/builds/register"), params.merge(token: token)
end end
end end
describe "PUT /builds/:id" do describe "PUT /builds/:id" do
let(:pipeline) {create(:ci_pipeline, project: project)} let(:build) { create(:ci_build, :pending, :trace, pipeline: pipeline, runner_id: runner.id) }
let(:build) { create(:ci_build, :trace, pipeline: pipeline, runner_id: runner.id) }
before do before do
build.run! build.run!
...@@ -189,7 +177,7 @@ describe Ci::API::API do ...@@ -189,7 +177,7 @@ describe Ci::API::API do
end end
describe 'PATCH /builds/:id/trace.txt' do describe 'PATCH /builds/:id/trace.txt' do
let(:build) { create(:ci_build, :trace, runner_id: runner.id) } let(:build) { create(:ci_build, :pending, :trace, runner_id: runner.id) }
let(:headers) { { Ci::API::Helpers::BUILD_TOKEN_HEADER => build.token, 'Content-Type' => 'text/plain' } } let(:headers) { { Ci::API::Helpers::BUILD_TOKEN_HEADER => build.token, 'Content-Type' => 'text/plain' } }
let(:headers_with_range) { headers.merge({ 'Content-Range' => '11-20' }) } let(:headers_with_range) { headers.merge({ 'Content-Range' => '11-20' }) }
...@@ -237,8 +225,7 @@ describe Ci::API::API do ...@@ -237,8 +225,7 @@ describe Ci::API::API do
context "Artifacts" do context "Artifacts" do
let(:file_upload) { fixture_file_upload(Rails.root + 'spec/fixtures/banana_sample.gif', 'image/gif') } let(:file_upload) { fixture_file_upload(Rails.root + 'spec/fixtures/banana_sample.gif', 'image/gif') }
let(:file_upload2) { fixture_file_upload(Rails.root + 'spec/fixtures/dk.png', 'image/gif') } let(:file_upload2) { fixture_file_upload(Rails.root + 'spec/fixtures/dk.png', 'image/gif') }
let(:pipeline) { create(:ci_pipeline, project: project) } let(:build) { create(:ci_build, :pending, pipeline: pipeline, runner_id: runner.id) }
let(:build) { create(:ci_build, pipeline: pipeline, runner_id: runner.id) }
let(:authorize_url) { ci_api("/builds/#{build.id}/artifacts/authorize") } let(:authorize_url) { ci_api("/builds/#{build.id}/artifacts/authorize") }
let(:post_url) { ci_api("/builds/#{build.id}/artifacts") } let(:post_url) { ci_api("/builds/#{build.id}/artifacts") }
let(:delete_url) { ci_api("/builds/#{build.id}/artifacts") } let(:delete_url) { ci_api("/builds/#{build.id}/artifacts") }
......
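All registration requests in the rewritten spec go through the `register_builds` helper defined above, which keeps the runner token and extra request parameters in one place. Its call sites in this diff reduce to three shapes:

```ruby
# Call shapes of the register_builds helper, as used in the spec above
register_builds                               # default runner token, no extra params
register_builds info: { platform: :darwin }   # pass additional request parameters
register_builds shared_runner.token           # register against a different runner's token
```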
...@@ -42,7 +42,8 @@ describe Ci::API::API do ...@@ -42,7 +42,8 @@ describe Ci::API::API do
post ci_api("/projects/#{project.ci_id}/refs/master/trigger"), options post ci_api("/projects/#{project.ci_id}/refs/master/trigger"), options
expect(response).to have_http_status(201) expect(response).to have_http_status(201)
pipeline.builds.reload pipeline.builds.reload
expect(pipeline.builds.size).to eq(2) expect(pipeline.builds.pending.size).to eq(2)
expect(pipeline.builds.size).to eq(5)
end end
it 'returns bad request with no builds created if there\'s no commit for that ref' do it 'returns bad request with no builds created if there\'s no commit for that ref' do
......
require 'spec_helper'
describe Ci::CreateBuildsService, services: true do
let(:pipeline) { create(:ci_pipeline, ref: 'master') }
let(:user) { create(:user) }
describe '#execute' do
# Using stubbed .gitlab-ci.yml created in commit factory
#
subject do
described_class.new(pipeline).execute('test', user, status, nil)
end
context 'next builds available' do
let(:status) { 'success' }
it { is_expected.to be_an_instance_of Array }
it { is_expected.to all(be_an_instance_of Ci::Build) }
it 'does not persist created builds' do
expect(subject.first).not_to be_persisted
end
end
context 'builds skipped' do
let(:status) { 'skipped' }
it { is_expected.to be_empty }
end
end
end
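As exercised above, `Ci::CreateBuildsService#execute` takes the stage name, the user, the status of the previous stage and a trigger request, and returns unsaved `Ci::Build` objects, or an empty array when the stage is skipped. In the spec's own terms:

```ruby
# Call shape used by the spec above; positional arguments are
# (stage, user, previous-stage status, trigger_request).
builds = Ci::CreateBuildsService.new(pipeline).execute('test', user, 'success', nil)
builds.all? { |b| b.is_a?(Ci::Build) && !b.persisted? }                       #=> true

Ci::CreateBuildsService.new(pipeline).execute('test', user, 'skipped', nil)  #=> []
```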
require 'spec_helper' require 'spec_helper'
describe CreateCommitBuildsService, services: true do describe Ci::CreatePipelineService, services: true do
let(:service) { CreateCommitBuildsService.new } let(:project) { FactoryGirl.create(:project) }
let(:project) { FactoryGirl.create(:empty_project) } let(:user) { create(:admin) }
let(:user) { create(:user) }
before do before do
stub_ci_pipeline_to_return_yaml_file stub_ci_pipeline_to_return_yaml_file
end end
describe '#execute' do describe '#execute' do
def execute(params)
described_class.new(project, user, params).execute
end
context 'valid params' do context 'valid params' do
let(:pipeline) do let(:pipeline) do
service.execute(project, user, execute(ref: 'refs/heads/master',
ref: 'refs/heads/master',
before: '00000000', before: '00000000',
after: '31das312', after: project.commit.id,
commits: [{ message: "Message" }] commits: [{ message: "Message" }])
)
end end
it { expect(pipeline).to be_kind_of(Ci::Pipeline) } it { expect(pipeline).to be_kind_of(Ci::Pipeline) }
...@@ -30,38 +31,33 @@ describe CreateCommitBuildsService, services: true do ...@@ -30,38 +31,33 @@ describe CreateCommitBuildsService, services: true do
context "skip tag if there is no build for it" do context "skip tag if there is no build for it" do
it "creates commit if there is appropriate job" do it "creates commit if there is appropriate job" do
result = service.execute(project, user, result = execute(ref: 'refs/heads/master',
ref: 'refs/tags/0_1',
before: '00000000', before: '00000000',
after: '31das312', after: project.commit.id,
commits: [{ message: "Message" }] commits: [{ message: "Message" }])
)
expect(result).to be_persisted expect(result).to be_persisted
end end
it "creates commit if there is no appropriate job but deploy job has right ref setting" do it "creates commit if there is no appropriate job but deploy job has right ref setting" do
config = YAML.dump({ deploy: { script: "ls", only: ["0_1"] } }) config = YAML.dump({ deploy: { script: "ls", only: ["master"] } })
stub_ci_pipeline_yaml_file(config) stub_ci_pipeline_yaml_file(config)
result = execute(ref: 'refs/heads/master',
result = service.execute(project, user,
ref: 'refs/heads/0_1',
before: '00000000', before: '00000000',
after: '31das312', after: project.commit.id,
commits: [{ message: "Message" }] commits: [{ message: "Message" }])
)
expect(result).to be_persisted expect(result).to be_persisted
end end
end end
it 'skips creating pipeline for refs without .gitlab-ci.yml' do it 'skips creating pipeline for refs without .gitlab-ci.yml' do
stub_ci_pipeline_yaml_file(nil) stub_ci_pipeline_yaml_file(nil)
result = service.execute(project, user, result = execute(ref: 'refs/heads/master',
ref: 'refs/heads/0_1',
before: '00000000', before: '00000000',
after: '31das312', after: project.commit.id,
commits: [{ message: 'Message' }] commits: [{ message: 'Message' }])
)
expect(result).to be_falsey expect(result).not_to be_persisted
expect(Ci::Pipeline.count).to eq(0) expect(Ci::Pipeline.count).to eq(0)
end end
...@@ -70,12 +66,11 @@ describe CreateCommitBuildsService, services: true do ...@@ -70,12 +66,11 @@ describe CreateCommitBuildsService, services: true do
allow_any_instance_of(Ci::Pipeline).to receive(:git_commit_message) { message } allow_any_instance_of(Ci::Pipeline).to receive(:git_commit_message) { message }
stub_ci_pipeline_yaml_file('invalid: file: file') stub_ci_pipeline_yaml_file('invalid: file: file')
commits = [{ message: message }] commits = [{ message: message }]
pipeline = service.execute(project, user, pipeline = execute(ref: 'refs/heads/master',
ref: 'refs/tags/0_1',
before: '00000000', before: '00000000',
after: '31das312', after: project.commit.id,
commits: commits commits: commits)
)
expect(pipeline).to be_persisted expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false expect(pipeline.builds.any?).to be false
expect(pipeline.status).to eq('failed') expect(pipeline.status).to eq('failed')
...@@ -94,12 +89,10 @@ describe CreateCommitBuildsService, services: true do ...@@ -94,12 +89,10 @@ describe CreateCommitBuildsService, services: true do
it "skips builds creation if there is [ci skip] tag in commit message" do it "skips builds creation if there is [ci skip] tag in commit message" do
commits = [{ message: message }] commits = [{ message: message }]
pipeline = service.execute(project, user, pipeline = execute(ref: 'refs/heads/master',
ref: 'refs/tags/0_1',
before: '00000000', before: '00000000',
after: '31das312', after: project.commit.id,
commits: commits commits: commits)
)
expect(pipeline).to be_persisted expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false expect(pipeline.builds.any?).to be false
...@@ -108,12 +101,10 @@ describe CreateCommitBuildsService, services: true do ...@@ -108,12 +101,10 @@ describe CreateCommitBuildsService, services: true do
it "skips builds creation if there is [skip ci] tag in commit message" do it "skips builds creation if there is [skip ci] tag in commit message" do
commits = [{ message: messageFlip }] commits = [{ message: messageFlip }]
pipeline = service.execute(project, user, pipeline = execute(ref: 'refs/heads/master',
ref: 'refs/tags/0_1',
before: '00000000', before: '00000000',
after: '31das312', after: project.commit.id,
commits: commits commits: commits)
)
expect(pipeline).to be_persisted expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false expect(pipeline.builds.any?).to be false
...@@ -122,12 +113,10 @@ describe CreateCommitBuildsService, services: true do ...@@ -122,12 +113,10 @@ describe CreateCommitBuildsService, services: true do
it "skips builds creation if there is [CI SKIP] tag in commit message" do it "skips builds creation if there is [CI SKIP] tag in commit message" do
commits = [{ message: capMessage }] commits = [{ message: capMessage }]
pipeline = service.execute(project, user, pipeline = execute(ref: 'refs/heads/master',
ref: 'refs/tags/0_1',
before: '00000000', before: '00000000',
after: '31das312', after: project.commit.id,
commits: commits commits: commits)
)
expect(pipeline).to be_persisted expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false expect(pipeline.builds.any?).to be false
...@@ -136,12 +125,10 @@ describe CreateCommitBuildsService, services: true do ...@@ -136,12 +125,10 @@ describe CreateCommitBuildsService, services: true do
it "skips builds creation if there is [SKIP CI] tag in commit message" do it "skips builds creation if there is [SKIP CI] tag in commit message" do
commits = [{ message: capMessageFlip }] commits = [{ message: capMessageFlip }]
pipeline = service.execute(project, user, pipeline = execute(ref: 'refs/heads/master',
ref: 'refs/tags/0_1',
before: '00000000', before: '00000000',
after: '31das312', after: project.commit.id,
commits: commits commits: commits)
)
expect(pipeline).to be_persisted expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false expect(pipeline.builds.any?).to be false
...@@ -152,67 +139,37 @@ describe CreateCommitBuildsService, services: true do ...@@ -152,67 +139,37 @@ describe CreateCommitBuildsService, services: true do
allow_any_instance_of(Ci::Pipeline).to receive(:git_commit_message) { "some message" } allow_any_instance_of(Ci::Pipeline).to receive(:git_commit_message) { "some message" }
commits = [{ message: "some message" }] commits = [{ message: "some message" }]
pipeline = service.execute(project, user, pipeline = execute(ref: 'refs/heads/master',
ref: 'refs/tags/0_1',
before: '00000000', before: '00000000',
after: '31das312', after: project.commit.id,
commits: commits commits: commits)
)
expect(pipeline).to be_persisted expect(pipeline).to be_persisted
expect(pipeline.builds.first.name).to eq("staging") expect(pipeline.builds.first.name).to eq("rspec")
end end
it "skips builds creation if there is [ci skip] tag in commit message and yaml is invalid" do it "fails builds creation if there is [ci skip] tag in commit message and yaml is invalid" do
stub_ci_pipeline_yaml_file('invalid: file: fiile') stub_ci_pipeline_yaml_file('invalid: file: fiile')
commits = [{ message: message }] commits = [{ message: message }]
pipeline = service.execute(project, user, pipeline = execute(ref: 'refs/heads/master',
ref: 'refs/tags/0_1',
before: '00000000', before: '00000000',
after: '31das312', after: project.commit.id,
commits: commits commits: commits)
)
expect(pipeline).to be_persisted expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false expect(pipeline.builds.any?).to be false
expect(pipeline.status).to eq("skipped") expect(pipeline.status).to eq("failed")
expect(pipeline.yaml_errors).to be_nil expect(pipeline.yaml_errors).not_to be_nil
end
end end
it "skips build creation if there are already builds" do
allow_any_instance_of(Ci::Pipeline).to receive(:ci_yaml_file) { gitlab_ci_yaml }
commits = [{ message: "message" }]
pipeline = service.execute(project, user,
ref: 'refs/heads/master',
before: '00000000',
after: '31das312',
commits: commits
)
expect(pipeline).to be_persisted
expect(pipeline.builds.count(:all)).to eq(2)
pipeline = service.execute(project, user,
ref: 'refs/heads/master',
before: '00000000',
after: '31das312',
commits: commits
)
expect(pipeline).to be_persisted
expect(pipeline.builds.count(:all)).to eq(2)
end end
it "creates commit with failed status if yaml is invalid" do it "creates commit with failed status if yaml is invalid" do
stub_ci_pipeline_yaml_file('invalid: file') stub_ci_pipeline_yaml_file('invalid: file')
commits = [{ message: "some message" }] commits = [{ message: "some message" }]
pipeline = execute(ref: 'refs/heads/master',
pipeline = service.execute(project, user,
ref: 'refs/tags/0_1',
before: '00000000', before: '00000000',
after: '31das312', after: project.commit.id,
commits: commits commits: commits)
)
expect(pipeline).to be_persisted expect(pipeline).to be_persisted
expect(pipeline.status).to eq("failed") expect(pipeline.status).to eq("failed")
...@@ -226,16 +183,32 @@ describe CreateCommitBuildsService, services: true do ...@@ -226,16 +183,32 @@ describe CreateCommitBuildsService, services: true do
end end
it 'does not create a new pipeline' do it 'does not create a new pipeline' do
result = service.execute(project, user, result = execute(ref: 'refs/heads/master',
ref: 'refs/heads/master',
before: '00000000', before: '00000000',
after: '31das312', after: project.commit.id,
commits: [{ message: 'some msg' }]) commits: [{ message: 'some msg' }])
expect(result).to be_falsey expect(result).not_to be_persisted
expect(Ci::Build.all).to be_empty expect(Ci::Build.all).to be_empty
expect(Ci::Pipeline.count).to eq(0) expect(Ci::Pipeline.count).to eq(0)
end end
end end
context 'with manual actions' do
before do
config = YAML.dump({ deploy: { script: 'ls', when: 'manual' } })
stub_ci_pipeline_yaml_file(config)
end
it 'creates a pipeline with a manual action' do
result = execute(ref: 'refs/heads/master',
before: '00000000',
after: project.commit.id,
commits: [{ message: 'some msg' }])
expect(result).to be_persisted
expect(result.manual_actions).not_to be_empty
end
end
end end
end end
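Every example in the `Ci::CreatePipelineService` spec drives the service through the same `execute` helper, so the interface exercised here (not necessarily the full public API) looks roughly like this:

```ruby
# Shape of the call as used by the spec above; parameter names come from the spec itself.
service = Ci::CreatePipelineService.new(project, user,
                                        ref: 'refs/heads/master',
                                        before: '00000000',
                                        after: project.commit.id,
                                        commits: [{ message: 'Message' }])

pipeline = service.execute
pipeline.persisted?  # true when the ref has a usable .gitlab-ci.yml, false otherwise
```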
require 'spec_helper' require 'spec_helper'
describe Ci::CreateTriggerRequestService, services: true do describe Ci::CreateTriggerRequestService, services: true do
let(:service) { Ci::CreateTriggerRequestService.new } let(:service) { described_class.new }
let(:project) { create(:project) } let(:project) { create(:project) }
let(:trigger) { create(:ci_trigger, project: project) } let(:trigger) { create(:ci_trigger, project: project) }
...@@ -27,8 +27,7 @@ describe Ci::CreateTriggerRequestService, services: true do ...@@ -27,8 +27,7 @@ describe Ci::CreateTriggerRequestService, services: true do
subject { service.execute(project, trigger, 'master') } subject { service.execute(project, trigger, 'master') }
before do before do
stub_ci_pipeline_yaml_file('{}') stub_ci_pipeline_yaml_file('script: { only: [develop], script: hello World }')
FactoryGirl.create :ci_pipeline, project: project
end end
it { expect(subject).to be_nil } it { expect(subject).to be_nil }
......
...@@ -5,8 +5,8 @@ module Ci ...@@ -5,8 +5,8 @@ module Ci
let(:service) { ImageForBuildService.new } let(:service) { ImageForBuildService.new }
let(:project) { FactoryGirl.create(:empty_project) } let(:project) { FactoryGirl.create(:empty_project) }
let(:commit_sha) { '01234567890123456789' } let(:commit_sha) { '01234567890123456789' }
let(:commit) { project.ensure_pipeline(commit_sha, 'master') } let(:pipeline) { project.ensure_pipeline(commit_sha, 'master') }
let(:build) { FactoryGirl.create(:ci_build, pipeline: commit) } let(:build) { FactoryGirl.create(:ci_build, pipeline: pipeline) }
describe '#execute' do describe '#execute' do
before { build } before { build }
...@@ -14,6 +14,7 @@ module Ci ...@@ -14,6 +14,7 @@ module Ci
context 'branch name' do context 'branch name' do
before { allow(project).to receive(:commit).and_return(OpenStruct.new(sha: commit_sha)) } before { allow(project).to receive(:commit).and_return(OpenStruct.new(sha: commit_sha)) }
before { build.run! } before { build.run! }
before { pipeline.reload_status! }
let(:image) { service.execute(project, ref: 'master') } let(:image) { service.execute(project, ref: 'master') }
it { expect(image).to be_kind_of(OpenStruct) } it { expect(image).to be_kind_of(OpenStruct) }
...@@ -31,6 +32,7 @@ module Ci ...@@ -31,6 +32,7 @@ module Ci
context 'commit sha' do context 'commit sha' do
before { build.run! } before { build.run! }
before { pipeline.reload_status! }
let(:image) { service.execute(project, sha: build.sha) } let(:image) { service.execute(project, sha: build.sha) }
it { expect(image).to be_kind_of(OpenStruct) } it { expect(image).to be_kind_of(OpenStruct) }
......
require 'spec_helper'
describe Ci::ProcessPipelineService, services: true do
let(:pipeline) { create(:ci_pipeline, ref: 'master') }
let(:user) { create(:user) }
let(:all_builds) { pipeline.builds }
let(:builds) { all_builds.where.not(status: [:created, :skipped]) }
let(:config) { nil }
before do
allow(pipeline).to receive(:ci_yaml_file).and_return(config)
end
describe '#execute' do
def create_builds
described_class.new(pipeline.project, user).execute(pipeline)
end
def succeed_pending
builds.pending.update_all(status: 'success')
end
context 'start queuing next builds' do
before do
create(:ci_build, :created, pipeline: pipeline, name: 'linux', stage_idx: 0)
create(:ci_build, :created, pipeline: pipeline, name: 'mac', stage_idx: 0)
create(:ci_build, :created, pipeline: pipeline, name: 'rspec', stage_idx: 1)
create(:ci_build, :created, pipeline: pipeline, name: 'rubocop', stage_idx: 1)
create(:ci_build, :created, pipeline: pipeline, name: 'deploy', stage_idx: 2)
end
it 'processes a pipeline' do
expect(create_builds).to be_truthy
succeed_pending
expect(builds.success.count).to eq(2)
expect(create_builds).to be_truthy
succeed_pending
expect(builds.success.count).to eq(4)
expect(create_builds).to be_truthy
succeed_pending
expect(builds.success.count).to eq(5)
expect(create_builds).to be_falsey
end
it 'does not process pipeline if existing stage is running' do
expect(create_builds).to be_truthy
expect(builds.pending.count).to eq(2)
expect(create_builds).to be_falsey
expect(builds.pending.count).to eq(2)
end
end
context 'custom stage with first job allowed to fail' do
before do
create(:ci_build, :created, pipeline: pipeline, name: 'clean_job', stage_idx: 0, allow_failure: true)
create(:ci_build, :created, pipeline: pipeline, name: 'test_job', stage_idx: 1, allow_failure: true)
end
it 'automatically triggers a next stage when build finishes' do
expect(create_builds).to be_truthy
expect(builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(builds.pluck(:status)).to contain_exactly('failed', 'pending')
end
end
context 'properly creates builds when "when" is defined' do
before do
create(:ci_build, :created, pipeline: pipeline, name: 'build', stage_idx: 0)
create(:ci_build, :created, pipeline: pipeline, name: 'test', stage_idx: 1)
create(:ci_build, :created, pipeline: pipeline, name: 'test_failure', stage_idx: 2, when: 'on_failure')
create(:ci_build, :created, pipeline: pipeline, name: 'deploy', stage_idx: 3)
create(:ci_build, :created, pipeline: pipeline, name: 'production', stage_idx: 3, when: 'manual')
create(:ci_build, :created, pipeline: pipeline, name: 'cleanup', stage_idx: 4, when: 'always')
create(:ci_build, :created, pipeline: pipeline, name: 'clear cache', stage_idx: 4, when: 'manual')
end
context 'when builds are successful' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(builds.pluck(:name)).to contain_exactly('build')
expect(builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test')
expect(builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy')
expect(builds.pluck(:status)).to contain_exactly('success', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy', 'cleanup')
expect(builds.pluck(:status)).to contain_exactly('success', 'success', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:status)).to contain_exactly('success', 'success', 'success', 'success')
pipeline.reload
expect(pipeline.status).to eq('success')
end
end
context 'when test job fails' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(builds.pluck(:name)).to contain_exactly('build')
expect(builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test')
expect(builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure')
expect(builds.pluck(:status)).to contain_exactly('success', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure', 'cleanup')
expect(builds.pluck(:status)).to contain_exactly('success', 'failed', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:status)).to contain_exactly('success', 'failed', 'success', 'success')
pipeline.reload
expect(pipeline.status).to eq('failed')
end
end
context 'when test and test_failure jobs fail' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(builds.pluck(:name)).to contain_exactly('build')
expect(builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test')
expect(builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure')
expect(builds.pluck(:status)).to contain_exactly('success', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure', 'cleanup')
expect(builds.pluck(:status)).to contain_exactly('success', 'failed', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure', 'cleanup')
expect(builds.pluck(:status)).to contain_exactly('success', 'failed', 'failed', 'success')
pipeline.reload
expect(pipeline.status).to eq('failed')
end
end
context 'when deploy job fails' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(builds.pluck(:name)).to contain_exactly('build')
expect(builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test')
expect(builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy')
expect(builds.pluck(:status)).to contain_exactly('success', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy', 'cleanup')
expect(builds.pluck(:status)).to contain_exactly('success', 'success', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:status)).to contain_exactly('success', 'success', 'failed', 'success')
pipeline.reload
expect(pipeline.status).to eq('failed')
end
end
context 'when build is canceled in the second stage' do
it 'does not schedule builds after build has been canceled' do
expect(create_builds).to be_truthy
expect(builds.pluck(:name)).to contain_exactly('build')
expect(builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.running_or_pending).not_to be_empty
expect(builds.pluck(:name)).to contain_exactly('build', 'test')
expect(builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:cancel)
expect(builds.running_or_pending).to be_empty
expect(pipeline.reload.status).to eq('canceled')
end
end
context 'when listing manual actions' do
it 'returns only for skipped builds' do
# currently all builds are created
expect(create_builds).to be_truthy
expect(manual_actions).to be_empty
# succeed stage build
pipeline.builds.running_or_pending.each(&:success)
expect(manual_actions).to be_empty
# succeed stage test
pipeline.builds.running_or_pending.each(&:success)
expect(manual_actions).to be_one # production
# succeed stage deploy
pipeline.builds.running_or_pending.each(&:success)
expect(manual_actions).to be_many # production and clear cache
end
def manual_actions
pipeline.manual_actions
end
end
end
context 'creates builds from .gitlab-ci.yml' do
let(:config) do
YAML.dump({
rspec: {
stage: 'test',
script: 'rspec'
},
rubocop: {
stage: 'test',
script: 'rubocop'
},
deploy: {
stage: 'deploy',
script: 'deploy'
}
})
end
# Using stubbed .gitlab-ci.yml created in commit factory
#
before do
stub_ci_pipeline_yaml_file(config)
create(:ci_build, :created, pipeline: pipeline, name: 'linux', stage: 'build', stage_idx: 0)
create(:ci_build, :created, pipeline: pipeline, name: 'mac', stage: 'build', stage_idx: 0)
end
it 'processes the pipeline stage by stage' do
# Currently we have two builds with state created
expect(builds.count).to eq(0)
expect(all_builds.count).to eq(2)
# Calling create_builds will mark the created builds as pending
expect(create_builds).to be_truthy
expect(builds.count).to eq(2)
expect(all_builds.count).to eq(2)
# When the builds succeed, we will create the rest of the pipeline from .gitlab-ci.yml
# We will have 2 succeeded, 2 pending (from stage test), total 5 (one more build from deploy)
succeed_pending
expect(create_builds).to be_truthy
expect(builds.success.count).to eq(2)
expect(builds.pending.count).to eq(2)
expect(all_builds.count).to eq(5)
# When we succeed the 2 pending builds from the test stage,
# we will queue the deploy stage; no new builds will be created
succeed_pending
expect(create_builds).to be_truthy
expect(builds.pending.count).to eq(1)
expect(builds.success.count).to eq(4)
expect(all_builds.count).to eq(5)
# When we succeed the last pending build, we will have a total of 5 succeeded builds; no new builds will be created
succeed_pending
expect(create_builds).to be_falsey
expect(builds.success.count).to eq(5)
expect(all_builds.count).to eq(5)
end
end
end
end
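Taken together, the `Ci::ProcessPipelineService` examples describe one promotion pass: builds start out in the `created` state, and the next stage is queued only once every earlier stage has finished. A compressed, hypothetical outline of that pass (it ignores `when:`/manual handling for brevity, and the `enqueue` event name is an assumption):

```ruby
# Hypothetical outline of a single processing pass, inferred from the spec above.
def process(pipeline)
  promoted = false

  pipeline.builds.order(:stage_idx).group_by(&:stage_idx).each do |_idx, stage_builds|
    # stop at the first stage that is still queued or running
    break if stage_builds.any? { |b| %w(pending running).include?(b.status) }

    created = stage_builds.select { |b| b.status == 'created' }
    next if created.empty?        # stage already handled, look at the next one

    created.each(&:enqueue)       # promote 'created' builds to 'pending' (assumed event)
    promoted = true
    break                         # queue at most one stage per pass
  end

  promoted
end
```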
...@@ -110,19 +110,15 @@ describe MergeRequests::MergeWhenBuildSucceedsService do ...@@ -110,19 +110,15 @@ describe MergeRequests::MergeWhenBuildSucceedsService do
context 'properly handles multiple stages' do context 'properly handles multiple stages' do
let(:ref) { mr_merge_if_green_enabled.source_branch } let(:ref) { mr_merge_if_green_enabled.source_branch }
let(:build) { create(:ci_build, pipeline: pipeline, ref: ref, name: 'build', stage: 'build') } let!(:build) { create(:ci_build, :created, pipeline: pipeline, ref: ref, name: 'build', stage: 'build') }
let(:test) { create(:ci_build, pipeline: pipeline, ref: ref, name: 'test', stage: 'test') } let!(:test) { create(:ci_build, :created, pipeline: pipeline, ref: ref, name: 'test', stage: 'test') }
let(:pipeline) { create(:ci_empty_pipeline, ref: mr_merge_if_green_enabled.source_branch, project: project) }
before do before do
# This behavior of MergeRequest: we instantiate a new object # This behavior of MergeRequest: we instantiate a new object
allow_any_instance_of(MergeRequest).to receive(:pipeline).and_wrap_original do allow_any_instance_of(MergeRequest).to receive(:pipeline).and_wrap_original do
Ci::Pipeline.find(pipeline.id) Ci::Pipeline.find(pipeline.id)
end end
# We create test after the build
allow(pipeline).to receive(:create_next_builds).and_wrap_original do
test
end
end end
it "doesn't merge if some stages failed" do it "doesn't merge if some stages failed" do
......
...@@ -53,7 +53,13 @@ describe PostReceive do ...@@ -53,7 +53,13 @@ describe PostReceive do
subject { PostReceive.new.perform(pwd(project), key_id, base64_changes) } subject { PostReceive.new.perform(pwd(project), key_id, base64_changes) }
context "creates a Ci::Pipeline for every change" do context "creates a Ci::Pipeline for every change" do
before { stub_ci_pipeline_to_return_yaml_file } before do
allow_any_instance_of(Ci::CreatePipelineService).to receive(:commit) do
OpenStruct.new(id: '123456')
end
allow_any_instance_of(Ci::CreatePipelineService).to receive(:branch?).and_return(true)
stub_ci_pipeline_to_return_yaml_file
end
it { expect{ subject }.to change{ Ci::Pipeline.count }.by(2) } it { expect{ subject }.to change{ Ci::Pipeline.count }.by(2) }
end end
......