Commit b6a0a0d5 authored by Stan Hu

Merge branch 'cache-last-usage-data' into 'master'

Cache the last usage data to avoid unicorn timeouts

On GitLab.com, the usage data cache was never populated because it takes longer than 60 seconds to generate.

This MR also improves usage data performance: the Event default_scope caused the pushes query to use the wrong index, so it scanned all rows unnecessarily.
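The caching change relies on the `force:` option of `Rails.cache.fetch`: cached reads normally hit the store, but the ping worker forces a refresh. A minimal plain-Ruby sketch of those semantics (`SimpleCache` is a hypothetical stand-in for the Rails cache, not GitLab code):

```ruby
# Hypothetical stand-in for Rails.cache: fetch returns the cached value
# for a key, or runs the block and caches its result. Passing force: true
# re-runs the block even on a cache hit, mirroring Rails.cache.fetch(key, force:).
class SimpleCache
  def initialize
    @store = {}
  end

  def fetch(key, force: false)
    return @store[key] if @store.key?(key) && !force

    @store[key] = yield
  end
end

cache = SimpleCache.new
computations = 0

# First call computes; the next two hit the cache.
3.times { cache.fetch('usage_data') { computations += 1 } }

# The worker's path: force a refresh so the block runs again.
cache.fetch('usage_data', force: true) { computations += 1 }

puts computations # => 2
```

With a two-week expiry, ordinary requests never block on regeneration; only the background worker pays the cost of recomputing the data.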

Closes #1044

See merge request !779
parents aad78d68 118d4e51
@@ -2,6 +2,7 @@ Please view this file on the master branch, on stable branches it's out of date.
 ## 8.13.0 (2016-10-22)
+- Cache the last usage data to avoid unicorn timeouts
 - Add user activity table and service to query for active users
 - Fix 500 error updating mirror URLs for projects
 - Restrict protected branch access to specific groups !645
...
@@ -15,7 +15,7 @@ class GitlabUsagePingWorker
     begin
       HTTParty.post(url,
-                    body: Gitlab::UsageData.to_json,
+                    body: Gitlab::UsageData.to_json(force_refresh: true),
                     headers: { 'Content-type' => 'application/json' }
       )
     rescue HTTParty::Error => e
...
 module Gitlab
   class UsageData
     class << self
-      def data
-        Rails.cache.fetch('usage_data', expires_in: 1.hour) { uncached_data }
+      def data(force_refresh: false)
+        Rails.cache.fetch('usage_data', force: force_refresh, expires_in: 2.weeks) { uncached_data }
       end

       def uncached_data
         license_usage_data.merge(system_usage_data)
       end

-      def to_json
-        data.to_json
+      def to_json(force_refresh: false)
+        data(force_refresh: force_refresh).to_json
       end

       def system_usage_data
@@ -36,7 +36,8 @@ module Gitlab
         merge_requests: MergeRequest.count,
         milestones: Milestone.count,
         notes: Note.count,
-        pushes: Event.code_push.count,
+        # Default scope causes this query to run for a long time
+        pushes: Event.unscoped.code_push.count,
         pages_domains: PagesDomain.count,
         projects: Project.count,
         protected_branches: ProtectedBranch.count,
...
@@ -8,6 +8,7 @@ describe GitlabUsagePingWorker do
     stub_request(:post, "https://version.gitlab.com/usage_data").
       to_return(status: 200, body: '', headers: {})

+    expect(Gitlab::UsageData).to receive(:to_json).with({ force_refresh: true }).and_call_original
     expect(subject).to receive(:try_obtain_lease).and_return(true)
     expect(subject.perform.response.code.to_i).to eq(200)
...