Project: Boxiang Sun / gitlab-ce

Commit d28f1a7f: Split PagesWorker
Authored Dec 16, 2015 by Kamil Trzcinski
Committed Jan 31, 2017 by James Edwards-Jones
Parent: adc1a9ab

Showing 1 changed file with 61 additions and 40 deletions:

app/workers/pages_worker.rb (+61, -40)
```diff
@@ -12,62 +12,83 @@ class PagesWorker
     return unless valid?
 
     # Create status notifying the deployment of pages
-    @status = GenericCommitStatus.new(
-      project: project,
-      commit: build.commit,
-      user: build.user,
-      ref: build.ref,
-      stage: 'deploy',
-      name: 'pages:deploy'
-    )
+    @status = create_status
     @status.run!
-
-    FileUtils.mkdir_p(tmp_path)
-
-    # Calculate dd parameters: we limit the size of pages
-    max_size = current_application_settings.max_pages_size.megabytes
-    max_size ||= MAX_SIZE
-    blocks = 1 + max_size / BLOCK_SIZE
+    raise 'pages are outdated' unless latest?
 
     # Create temporary directory in which we will extract the artifacts
-    Dir.mktmpdir(nil, tmp_path) do |temp_path|
-      # We manually extract the archive and limit the archive size with dd
-      results = Open3.pipeline(%W(gunzip -c #{artifacts}),
-                               %W(dd bs=#{BLOCK_SIZE} count=#{blocks}),
-                               %W(tar -x -C #{temp_path} public/),
-                               err: '/dev/null')
-      return unless results.compact.all?(&:success?)
+    Dir.mktmpdir(nil, tmp_path) do |archive_path|
+      results = extract_archive(archive_path)
+      raise 'pages failed to extract' unless results.all?(&:success?)
 
       # Check if we did extract public directory
-      temp_public_path = File.join(temp_path, 'public')
-      return unless Dir.exists?(temp_public_path)
-
-      # Ignore deployment if the HEAD changed when we were extracting the archive
-      return unless valid?
-
-      FileUtils.mkdir_p(pages_path)
-
-      # Do atomic move of pages
-      # Move and removal may not be atomic, but they are significantly faster then extracting and removal
-      # 1. We move deployed public to previous public path (file removal is slow)
-      # 2. We move temporary public to be deployed public
-      # 3. We remove previous public path
-      FileUtils.move(public_path, previous_public_path, force: true)
-      FileUtils.move(temp_public_path, public_path)
-      FileUtils.rm_r(previous_public_path, force: true)
+      archive_public_path = File.join(archive_path, 'public')
+      raise 'pages miss the public folder' unless Dir.exists?(archive_public_path)
+      raise 'pages are outdated' unless latest?
+
+      deploy_page!(archive_public_path)
 
       @status.success
     end
-  ensure
-    @status.drop if @status && @status.active?
+  rescue => e
+    fail(e.message, !latest?)
   end
 
   private
 
+  def create_status
+    GenericCommitStatus.new(
+      project: project,
+      commit: build.commit,
+      user: build.user,
+      ref: build.ref,
+      stage: 'deploy',
+      name: 'pages:deploy'
+    )
+  end
+
+  def extract_archive(temp_path)
+    results = Open3.pipeline(%W(gunzip -c #{artifacts}),
+                             %W(dd bs=#{BLOCK_SIZE} count=#{blocks}),
+                             %W(tar -x -C #{temp_path} public/),
+                             err: '/dev/null')
+    results.compact
+  end
+
+  def deploy_page!(archive_public_path)
+    # Do atomic move of pages
+    # Move and removal may not be atomic, but they are significantly faster then extracting and removal
+    # 1. We move deployed public to previous public path (file removal is slow)
+    # 2. We move temporary public to be deployed public
+    # 3. We remove previous public path
+    FileUtils.mkdir_p(pages_path)
+    FileUtils.move(public_path, previous_public_path, force: true)
+    FileUtils.move(archive_public_path, public_path)
+  ensure
+    FileUtils.rm_r(previous_public_path, force: true)
+  end
+
+  def fail(message, allow_failure = true)
+    @status.allow_failure = allow_failure
+    @status.description = message
+    @status.drop
+  end
+
   def valid?
+    build && build.artifacts_file?
+  end
+
+  def latest?
     # check if sha for the ref is still the most recent one
     # this helps in case when multiple deployments happens
-    build && build.artifacts_file? && sha == latest_sha
+    sha == latest_sha
+  end
+
+  def blocks
+    # Calculate dd parameters: we limit the size of pages
+    max_size = current_application_settings.max_pages_size.megabytes
+    max_size ||= MAX_SIZE
+    blocks = 1 + max_size / BLOCK_SIZE
+    blocks
+  end
 
   def build
...
```
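The least obvious piece of the new `extract_archive` helper is how the size limit is enforced: gunzip streams the compressed artifacts into dd, whose `count=` option stops the copy after a fixed number of `BLOCK_SIZE` blocks, and tar unpacks only `public/` from whatever comes through. Below is a minimal, self-contained sketch of the same pipeline; `BLOCK_SIZE`, `MAX_SIZE`, `extract_limited` and the paths are hypothetical stand-ins, not the worker's actual values or helpers.

```ruby
require 'open3'
require 'fileutils'

# Hypothetical stand-ins for the worker's constants and settings; the real
# values come from its BLOCK_SIZE / MAX_SIZE and current_application_settings.
BLOCK_SIZE = 32 * 1024            # 32 KB per dd block
MAX_SIZE   = 100 * 1024 * 1024    # 100 MB cap on the decompressed archive

# Stream-extract an artifacts archive while capping its decompressed size.
# With the values above: blocks = 1 + (100 MB / 32 KB) = 3201, so at most
# ~100 MB (plus one block) can ever reach tar.
# Paths are assumed to contain no whitespace, as in the worker's %W arrays.
def extract_limited(artifacts_path, dest_path)
  blocks = 1 + MAX_SIZE / BLOCK_SIZE

  FileUtils.mkdir_p(dest_path)
  statuses = Open3.pipeline(%W(gunzip -c #{artifacts_path}),
                            %W(dd bs=#{BLOCK_SIZE} count=#{blocks}),
                            %W(tar -x -C #{dest_path} public/),
                            err: '/dev/null')

  # Open3.pipeline returns one Process::Status per command. If dd cuts the
  # stream short, gunzip and/or tar typically report failure, which is what
  # actually rejects an oversized archive.
  statuses.all?(&:success?)
end
```

Calling `extract_limited('artifacts.tar.gz', 'tmp/extract')` mirrors what the worker does inside its `Dir.mktmpdir` block; the worker additionally re-checks `latest?` after extraction so a deployment that raced with a newer build is discarded.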
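The new `deploy_page!` helper keeps the swap strategy the old inline code used: rename the live public directory aside, rename the freshly extracted one into place, and only then delete the old copy, since the two renames are fast while recursive removal is slow. Here is a minimal sketch of that pattern with plain FileUtils; `swap_public`, the `public.old` name and the arguments are illustrative, not names from the worker.

```ruby
require 'fileutils'

# Illustrative paths; the worker derives public_path / previous_public_path
# from the project's pages_path instead of taking them as arguments.
def swap_public(pages_path, new_public_path)
  public_path   = File.join(pages_path, 'public')
  previous_path = File.join(pages_path, 'public.old')   # hypothetical name

  FileUtils.mkdir_p(pages_path)
  # 1. Park the currently deployed directory (a rename, so it is fast)
  FileUtils.move(public_path, previous_path, force: true)
  # 2. Promote the freshly extracted content to the live path
  FileUtils.move(new_public_path, public_path)
ensure
  # 3. Remove the old content last; the slow recursive delete no longer
  #    stands between the new pages and being served.
  FileUtils.rm_r(previous_path, force: true)
end
```

As in the worker's own calls, `force: true` on the first move and on the cleanup keeps a first-ever deployment, where no previous public directory exists, from raising.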