Commit e59dd19e authored by Amy Qualls

Merge branch 'docs-please-remove' into 'master'

Remove several instances of the word "please" from docs

See merge request gitlab-org/gitlab!47240
parents dc497024 4d81fca1
@@ -13,10 +13,10 @@ info: To determine the technical writer assigned to the Stage/Group associated w
#### Connection refused
If you're getting `Connection Refused` error messages when attempting to
connect to the LDAP server, review the LDAP `port` and `encryption` settings
used by GitLab. Common combinations are `encryption: 'plain'` and `port: 389`,
or `encryption: 'simple_tls'` and `port: 636`.
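As a quick first check outside of GitLab, you can probe both common ports directly from the GitLab server. This is only a hedged sketch; `ldap.example.com` is a placeholder for your LDAP host:

```shell
# Test whether the plain LDAP port is reachable (placeholder hostname).
nc -zv ldap.example.com 389

# Test the LDAPS port and inspect the TLS handshake used with simple_tls.
openssl s_client -connect ldap.example.com:636 </dev/null
```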
#### Connection times out
@@ -649,8 +649,7 @@ ldapsearch -D "cn=admin,dc=ldap-testing,dc=example,dc=com" \
Note that the `bind_dn`, `password`, `port`, `host`, and `base` are all
identical to what's configured in the `gitlab.rb`.
For more information, see the [official `ldapsearch` documentation](https://linux.die.net/man/1/ldapsearch).
### Using **AdFind** (Windows)
@@ -675,15 +674,15 @@ adfind -h ad.example.org:636 -ssl -u "CN=GitLabSRV,CN=Users,DC=GitLab,DC=org" -u
### Rails console
CAUTION: **Caution:**
It is very easy to create, read, modify, and destroy data with the rails
console. Be sure to run commands exactly as listed.
The rails console is a valuable tool to help debug LDAP problems. It allows you to
directly interact with the application by running commands and seeing how GitLab
responds to them.
For instructions about how to use the rails console, refer to this
[guide](../../operations/rails_console.md#starting-a-rails-console-session).
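For reference, on an Omnibus installation the console is usually started as follows (a minimal sketch; adjust for source installations):

```shell
# Start a Rails console session on an Omnibus GitLab node.
sudo gitlab-rails console
```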
#### Enable debug output
@@ -134,7 +134,10 @@ Note the following when promoting a secondary:
1. Promote the **secondary** node to the **primary** node.
DANGER: **Warning:**
In GitLab 13.2 and 13.3, promoting a secondary node to a primary while the
secondary is paused fails. Do not pause replication before promoting a
secondary. If the node is paused, be sure to resume before promoting. This
issue has been fixed in GitLab 13.4 and later.
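If replication is currently paused on the secondary, resume it before starting the promotion. A minimal sketch, assuming the `gitlab-ctl` replication commands available in GitLab 13.2 and later (verify the exact command for your version):

```shell
# Run on the paused secondary node to resume replication before promoting.
sudo gitlab-ctl geo-replication-resume
```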
CAUTION: **Caution:**
If the secondary node [has been paused](../../geo/index.md#pausing-and-resuming-replication), this performs
@@ -171,7 +174,10 @@ perform changes on a **secondary** with only a single machine. Instead, you must
do this manually.
DANGER: **Warning:**
In GitLab 13.2 and 13.3, promoting a secondary node to a primary while the
secondary is paused fails. Do not pause replication before promoting a
secondary. If the node is paused, be sure to resume before promoting. This
issue has been fixed in GitLab 13.4 and later.
CAUTION: **Caution:**
If the secondary node [has been paused](../../geo/index.md#pausing-and-resuming-replication), this performs
@@ -228,7 +228,10 @@ perform changes on a **secondary** with only a single machine. Instead, you must
do this manually.
DANGER: **Warning:**
In GitLab 13.2 and 13.3, promoting a secondary node to a primary while the
secondary is paused fails. Do not pause replication before promoting a
secondary. If the node is paused, be sure to resume before promoting. This
issue has been fixed in GitLab 13.4 and later.
CAUTION: **Caution:**
If the secondary node [has been paused](../../../geo/index.md#pausing-and-resuming-replication), this performs
@@ -197,7 +197,10 @@ For information on how to update your Geo nodes to the latest GitLab version, se
> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/35913) in [GitLab Premium](https://about.gitlab.com/pricing/) 13.2.
DANGER: **Warning:**
In GitLab 13.2 and 13.3, promoting a secondary node to a primary while the
secondary is paused fails. Do not pause replication before promoting a
secondary. If the node is paused, be sure to resume before promoting. This
issue has been fixed in GitLab 13.4 and later.
CAUTION: **Caution:**
Pausing and resuming of replication is currently only supported for Geo installations using an
@@ -164,7 +164,7 @@ replicating data from those features will cause the data to be **lost**.
If you wish to use those features on a **secondary** node, or to execute a failover
successfully, you must replicate their data using some other means.
| Feature | Replicated (added in GitLab version) | Verified (added in GitLab version) | Object Storage replication (see [Geo with Object Storage](object_storage.md)) | Notes |
|:---------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------|:-------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| [Application data in PostgreSQL](../../postgresql/index.md) | **Yes** (10.2) | **Yes** (10.2) | No | |
| [Project repository](../../..//user/project/repository/) | **Yes** (10.2) | **Yes** (10.7) | No | |
@@ -43,8 +43,8 @@ In any case, you require:
- A working GitLab **secondary** node.
- A Route53 Hosted Zone managing your domain.
If you haven't yet set up a Geo _primary_ node and _secondary_ node, see the
[Geo setup instructions](../index.md#setup-instructions).
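A hedged sketch of confirming the hosted zone from the command line, assuming the AWS CLI is installed and configured, with `example.com` standing in for your domain:

```shell
# List hosted zones matching the domain to confirm Route53 manages it.
aws route53 list-hosted-zones-by-name --dns-name example.com
```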
## Create a traffic policy
@@ -72,7 +72,7 @@ If you are using Google Cloud Storage, consider using
Or you can use the [Storage Transfer Service](https://cloud.google.com/storage-transfer/docs/),
although this only supports daily synchronization.
For manual synchronization, or scheduled by `cron`, see the following tools
(a sample `cron` entry is sketched after this list):
- [`s3cmd sync`](https://s3tools.org/s3cmd-sync)
- [`gsutil rsync`](https://cloud.google.com/storage/docs/gsutil/commands/rsync)
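For example, a hedged crontab sketch that mirrors a local uploads directory to an S3 bucket nightly; the path and bucket name are placeholders:

```shell
# Hypothetical crontab entry: mirror uploads to S3 every night at 02:00.
0 2 * * * s3cmd sync --delete-removed /var/opt/gitlab/gitlab-rails/uploads/ s3://example-geo-bucket/uploads/
```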
@@ -720,7 +720,8 @@ GitLab cannot find or doesn't have permission to access the `database_geo.yml` c
In an Omnibus GitLab installation, the file should be in `/var/opt/gitlab/gitlab-rails/etc`.
If it doesn't exist or inadvertent changes have been made to it, run `sudo gitlab-ctl reconfigure` to restore it to its correct state.
If this path is mounted on a remote volume, ensure your volume configuration
has the correct permissions.
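A quick way to check, sketched for an Omnibus installation (verify the path and ownership expected by your setup):

```shell
# Confirm the file exists and is readable by the GitLab (git) user.
ls -l /var/opt/gitlab/gitlab-rails/etc/database_geo.yml

# Regenerate it from /etc/gitlab/gitlab.rb if it is missing or was changed.
sudo gitlab-ctl reconfigure
```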
### An existing tracking database cannot be reused
@@ -8,7 +8,9 @@ type: howto
# Updating the Geo nodes **(PREMIUM ONLY)**
CAUTION: **Warning:**
Read these sections carefully before updating your Geo nodes. Not following
version-specific update steps may result in unexpected downtime. If you have
any specific questions, [contact Support](https://about.gitlab.com/support/#contact-support).
Updating Geo nodes involves performing:
@@ -47,4 +49,4 @@ everything is working correctly:
1. Test the data replication by pushing code to the **primary** node and verifying it
   is received by **secondary** nodes.
If you encounter any issues, see the [Geo troubleshooting guide](troubleshooting.md).
@@ -25,40 +25,43 @@ DROP EXTENSION IF EXISTS postgres_fdw;
```
DANGER: **Warning:**
In GitLab 13.3, promoting a secondary node to a primary while the secondary is
paused fails. Do not pause replication before promoting a secondary. If the
node is paused, be sure to resume before promoting. To avoid this issue,
upgrade to GitLab 13.4 or later.
## Updating to GitLab 13.2
In GitLab 13.2, promoting a secondary node to a primary while the secondary is
paused fails. Do not pause replication before promoting a secondary. If the
node is paused, be sure to resume before promoting. To avoid this issue,
upgrade to GitLab 13.4 or later.
## Updating to GitLab 13.0
Upgrading to GitLab 13.0 requires GitLab 12.10 to already be using PostgreSQL
version 11. For the recommended procedure, see the
[Omnibus GitLab documentation](https://docs.gitlab.com/omnibus/settings/database.html#upgrading-a-geo-instance).
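To check where you stand before the upgrade, a hedged Omnibus sketch (the Geo-specific procedure linked above still takes precedence):

```shell
# Show the PostgreSQL server version currently used by the Omnibus installation.
sudo gitlab-psql -c 'SELECT version();'

# Upgrade the bundled PostgreSQL; on Geo, follow the linked procedure first.
sudo gitlab-ctl pg-upgrade
```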
## Updating to GitLab 12.10
GitLab 12.10 doesn't attempt to update the embedded PostgreSQL server when
using Geo, because the PostgreSQL upgrade requires downtime for secondaries
while reinitializing streaming replication. It must be upgraded manually. For
the recommended procedure, see the
[Omnibus GitLab documentation](https://docs.gitlab.com/omnibus/settings/database.html#upgrading-a-geo-instance).
## Updating to GitLab 12.9
CAUTION: **Warning:**
GitLab 12.9.0 through GitLab 12.9.3 are affected by [a bug that stops
repository verification](https://gitlab.com/gitlab-org/gitlab/-/issues/213523).
The issue is fixed in GitLab 12.9.4. Upgrade to GitLab 12.9.4 or later.
By default, GitLab 12.9 will attempt to automatically update the embedded
PostgreSQL server to 10.12 from 9.6, which requires downtime on secondaries
while reinitializing streaming replication. For the recommended procedure, see the
[Omnibus GitLab documentation](https://docs.gitlab.com/omnibus/settings/database.html#upgrading-a-geo-instance).
This can be temporarily disabled by running the following before updating:
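As shown in the surrounding context, the opt-out is a marker file; the same command applies to the later 12.x sections below:

```shell
# Skip the automatic PostgreSQL upgrade for this update.
sudo touch /etc/gitlab/disable-postgresql-upgrade
```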
@@ -70,9 +73,8 @@ sudo touch /etc/gitlab/disable-postgresql-upgrade
By default, GitLab 12.8 will attempt to automatically update the embedded
PostgreSQL server to 10.12 from 9.6, which requires downtime on secondaries
while reinitializing streaming replication. For the recommended procedure, see
the [Omnibus GitLab documentation](https://docs.gitlab.com/omnibus/settings/database.html#upgrading-a-geo-instance).
This can be temporarily disabled by running the following before updating:
@@ -91,10 +93,9 @@ fix](https://gitlab.com/gitlab-org/gitlab/-/merge_requests/24021) was
shipped in 12.7.5.
By default, GitLab 12.7 will attempt to automatically update the embedded
PostgreSQL server to 10.9 from 9.6, which requires downtime on secondaries
while reinitializing streaming replication. For the recommended procedure, see
the [Omnibus GitLab documentation](https://docs.gitlab.com/omnibus/settings/database.html#upgrading-a-geo-instance).
This can be temporarily disabled by running the following before updating:
@@ -105,10 +106,9 @@ sudo touch /etc/gitlab/disable-postgresql-upgrade
## Updating to GitLab 12.6
By default, GitLab 12.6 will attempt to automatically update the embedded
PostgreSQL server to 10.9 from 9.6, which requires downtime on secondaries
while reinitializing streaming replication. For the recommended procedure, see
the [Omnibus GitLab documentation](https://docs.gitlab.com/omnibus/settings/database.html#upgrading-a-geo-instance).
This can be temporarily disabled by running the following before updating:
@@ -119,10 +119,9 @@ sudo touch /etc/gitlab/disable-postgresql-upgrade
## Updating to GitLab 12.5
By default, GitLab 12.5 will attempt to automatically update the embedded
PostgreSQL server to 10.9 from 9.6, which requires downtime on secondaries
while reinitializing streaming replication. For the recommended procedure, see
the [Omnibus GitLab documentation](https://docs.gitlab.com/omnibus/settings/database.html#upgrading-a-geo-instance).
This can be temporarily disabled by running the following before updating:
@@ -133,10 +132,9 @@ sudo touch /etc/gitlab/disable-postgresql-upgrade
## Updating to GitLab 12.4
By default, GitLab 12.4 will attempt to automatically update the embedded
PostgreSQL server to 10.9 from 9.6, which requires downtime on secondaries
while reinitializing streaming replication. For the recommended procedure, see
the [Omnibus GitLab documentation](https://docs.gitlab.com/omnibus/settings/database.html#upgrading-a-geo-instance).
This can be temporarily disabled by running the following before updating:
@@ -148,33 +146,29 @@ sudo touch /etc/gitlab/disable-postgresql-upgrade
DANGER: **Warning:**
If the existing PostgreSQL server version is 9.6.x, it is recommended to
upgrade to GitLab 12.4 or later. By default, GitLab 12.3 attempts to update the
embedded PostgreSQL server from 9.6 to 10.9. In certain circumstances, it will
fail. For more information, see the
[Omnibus GitLab documentation](https://docs.gitlab.com/omnibus/settings/database.html#upgrading-a-geo-instance).
Additionally, if the PostgreSQL upgrade doesn't fail, a successful upgrade
requires downtime for secondaries while reinitializing streaming replication.
For the recommended procedure, see the
[Omnibus GitLab documentation](https://docs.gitlab.com/omnibus/settings/database.html#upgrading-a-geo-instance).
## Updating to GitLab 12.2
DANGER: **Warning:**
If the existing PostgreSQL server version is 9.6.x, it is recommended to
upgrade to GitLab 12.4 or later. By default, GitLab 12.2 attempts to update the
embedded PostgreSQL server from 9.6 to 10.9. In certain circumstances, it will
fail. For more information, see the
[Omnibus GitLab documentation](https://docs.gitlab.com/omnibus/settings/database.html#upgrading-a-geo-instance).
Additionally, if the PostgreSQL upgrade does not fail, a successful upgrade
requires downtime for secondaries while reinitializing streaming replication.
For the recommended procedure, see the
[Omnibus GitLab documentation](https://docs.gitlab.com/omnibus/settings/database.html#upgrading-a-geo-instance).
GitLab 12.2 includes the following minor PostgreSQL updates:
@@ -196,31 +190,29 @@ The restart avoids a version mismatch when PostgreSQL tries to load the FDW exte
DANGER: **Warning:**
If the existing PostgreSQL server version is 9.6.x, it is recommended to
upgrade to GitLab 12.4 or later. By default, GitLab 12.1 attempts to update the
embedded PostgreSQL server from 9.6 to 10.9. In certain circumstances, it will
fail. For more information, see the
[Omnibus GitLab documentation](https://docs.gitlab.com/omnibus/settings/database.html#upgrading-a-geo-instance).
Additionally, if the PostgreSQL upgrade doesn't fail, a successful upgrade
requires downtime for secondaries while reinitializing streaming replication.
For the recommended procedure, see the
[Omnibus GitLab documentation](https://docs.gitlab.com/omnibus/settings/database.html#upgrading-a-geo-instance).
## Updating to GitLab 12.0
CAUTION: **Warning:**
This version is affected by a [bug that results in new LFS objects not being
replicated to Geo secondary nodes](https://gitlab.com/gitlab-org/gitlab/-/issues/32696).
The issue is fixed in GitLab 12.1; be sure to upgrade to GitLab 12.1 or later.
## Updating to GitLab 11.11
CAUTION: **Warning:**
This version is affected by a [bug that results in new LFS objects not being
replicated to Geo secondary nodes](https://gitlab.com/gitlab-org/gitlab/-/issues/32696).
The issue is fixed in GitLab 12.1; be sure to upgrade to GitLab 12.1 or later.
## Updating to GitLab 10.8
@@ -440,7 +432,7 @@ required because replication slots are used by default. However, if you
started with GitLab 9.3 and updated later, you should still follow the
instructions below.
When in doubt, it doesn't hurt to do a resync. The easiest way to do this in
Omnibus is the following:
1. Make sure you have Omnibus GitLab on the **primary** server.
@@ -249,9 +249,9 @@ gitlab_rails['shared_path'] = '/gitlab-nfs/gitlab-data/shared'
gitlab_ci['builds_directory'] = '/gitlab-nfs/gitlab-data/builds'
```
Run `sudo gitlab-ctl reconfigure` to start using the central location. Be aware
that if you had existing data, you'll need to manually copy or rsync it to
these new locations, and then restart GitLab.
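A hedged sketch of that copy for an Omnibus installation; the source and destination paths are illustrative, and the instance should be stopped while data is moved:

```shell
# Stop GitLab, copy existing repository data to the NFS-backed location,
# then reconfigure and start GitLab again. Paths are placeholders.
sudo gitlab-ctl stop
sudo rsync -av /var/opt/gitlab/git-data/ /gitlab-nfs/gitlab-data/git-data/
sudo gitlab-ctl reconfigure
sudo gitlab-ctl start
```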
### Bind mounts
@@ -399,8 +399,8 @@ Additionally, this configuration is specifically warned against in the
>system semantics, this can cause reliability problems. Specifically, delayed (asynchronous) writes
>to the NFS server can cause data corruption problems.
For supported database architecture, see our documentation about
[configuring a database for replication and failover](postgresql/replication_and_failover.md).
<!-- ## Troubleshooting
@@ -46,4 +46,4 @@ If it finds a reply key, it will be able to leave your reply as a comment on
the entity the notification was about (issue, merge request, commit...).
For more details about the `Message-ID`, `In-Reply-To`, and `References` headers,
see [RFC 5322](https://tools.ietf.org/html/rfc5322#section-3.6.4).
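If you want to inspect these headers on a notification you received, a quick hedged check against a saved raw message (the filename is a placeholder):

```shell
# Print the threading-related headers from a saved notification email.
grep -iE '^(Message-ID|In-Reply-To|References):' notification.eml
```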