Mergify Enterprise
Datadog Integration
Enabling metrics
Mergify can send metrics to Datadog via the statsd
protocol. You can enable metrics by adding these environment variables:
DD_TRACE_ENABLED=1
DD_DOGSTATSD_DISABLE=0
DD_AGENT_HOST=<my-datadog-agent-host>
DD_DOGSTATSD_PORT=8125
GUNICORN_CMD_ARGS="--statsd-host=<my-datadog-agent-host>:8125"
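For example, if you run the engine with Docker, you can pass these variables on the command line. A minimal sketch, assuming a placeholder image name and agent host (adjust both to your deployment):
# Placeholder image and host names; adjust to your deployment.
docker run -d \
  -e DD_TRACE_ENABLED=1 \
  -e DD_DOGSTATSD_DISABLE=0 \
  -e DD_AGENT_HOST=<my-datadog-agent-host> \
  -e DD_DOGSTATSD_PORT=8125 \
  -e GUNICORN_CMD_ARGS="--statsd-host=<my-datadog-agent-host>:8125" \
  <mergify-engine-image>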
Enabling logging
Mergify can send logs to the Datadog Agent's UDP log receiver.
On your Datadog Agent host, create the configuration file /etc/datadog-agent/conf.d/mergify-engine.d/conf.yaml
with the following content to activate the log receiver:
init_config:

instances:

logs:
  - type: udp
    port: 10518
    source: python
    service: mergify-engine
    sourcecategory: sourcecode
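Once the file is in place, restart the Datadog Agent so it picks up the new configuration. The exact command depends on how the agent is installed; on a systemd-based host it is typically:
sudo systemctl restart datadog-agent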
Then you can send the Mergify logs to Datadog by adding this environment variable to the engine container:
MERGIFYENGINE_LOG_DATADOG=udp://<my-datadog-agent-host>:10518
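As with metrics, this variable can be passed to the container at startup. A minimal Docker sketch, again with placeholder image and host names:
# Placeholder image and host names; adjust to your deployment.
docker run -d \
  -e MERGIFYENGINE_LOG_DATADOG=udp://<my-datadog-agent-host>:10518 \
  <mergify-engine-image>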
CI Insights Object Storage
CI test results can be sent to Mergify CI Insights. To enable this feature, you need to configure an object storage backend.
Supported storage backends:
- Google Cloud Storage
- Amazon S3
Google Cloud Storage
Requirements:
- Two Google Cloud Storage buckets:
  - <mycompany>-mergify-ci-traces-incoming
  - <mycompany>-mergify-ci-traces-done
- A Service Account with read/write/delete permissions on these buckets (we recommend using the Storage Object User role limited to these buckets)
- A Service Account Key in JSON format for the above Service Account (see the gcloud sketch below)
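If you provision these resources with the gcloud CLI, a sketch along these lines creates the buckets, the Service Account, the role bindings, and the JSON key. The project ID, Service Account name, and bucket names are placeholders:
# Placeholders: <mycompany>, my-project, mergify-ci-traces; adjust to your environment.
gcloud storage buckets create gs://<mycompany>-mergify-ci-traces-incoming gs://<mycompany>-mergify-ci-traces-done --project=my-project
gcloud iam service-accounts create mergify-ci-traces --project=my-project
for suffix in incoming done; do
  gcloud storage buckets add-iam-policy-binding gs://<mycompany>-mergify-ci-traces-$suffix \
    --member="serviceAccount:mergify-ci-traces@my-project.iam.gserviceaccount.com" \
    --role="roles/storage.objectUser"
done
gcloud iam service-accounts keys create credentials.json \
  --iam-account=mergify-ci-traces@my-project.iam.gserviceaccount.com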
Convert your JSON key file into base64 format:
cat credentials.json | base64 -w0
Add the following environment variables to the Docker container:
MERGIFYENGINE_CI_TRACES_BACKEND="gcs"
MERGIFYENGINE_CI_TRACES_INCOMING_BUCKET="<mycompany>-mergify-ci-traces-incoming"
MERGIFYENGINE_CI_TRACES_DONE_BUCKET="<mycompany>-mergify-ci-traces-done"
MERGIFYENGINE_GCS_CREDENTIALS="base64-encoded-credentials-json"
Amazon S3
Requirements:
- Two Amazon S3 buckets:
  - <mycompany>-mergify-ci-traces-incoming
  - <mycompany>-mergify-ci-traces-done
- A Service Account (IAM user) with read/write/delete permissions on these buckets
- An access key for that Service Account (see the aws CLI sketch below)
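If you manage these resources with the aws CLI, a sketch along these lines creates the buckets, an IAM user, a bucket-scoped policy, and an access key. The bucket names, user name, region, and policy.json file are placeholders:
# Placeholders: <mycompany>, mergify-ci-traces, us-west-2, policy.json; adjust to your environment.
aws s3api create-bucket --bucket <mycompany>-mergify-ci-traces-incoming \
  --region us-west-2 --create-bucket-configuration LocationConstraint=us-west-2
aws s3api create-bucket --bucket <mycompany>-mergify-ci-traces-done \
  --region us-west-2 --create-bucket-configuration LocationConstraint=us-west-2
aws iam create-user --user-name mergify-ci-traces
# policy.json should allow s3:ListBucket on both buckets and
# s3:GetObject, s3:PutObject, s3:DeleteObject on their objects.
aws iam put-user-policy --user-name mergify-ci-traces \
  --policy-name mergify-ci-traces --policy-document file://policy.json
aws iam create-access-key --user-name mergify-ci-traces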
Add the following environment variables to the Docker container:
MERGIFYENGINE_CI_TRACES_BACKEND="s3"
MERGIFYENGINE_CI_TRACES_INCOMING_BUCKET="<mycompany>-mergify-ci-traces-incoming"
MERGIFYENGINE_CI_TRACES_DONE_BUCKET="<mycompany>-mergify-ci-traces-done"
MERGIFYENGINE_AWS_ACCOUNT_ID=1234567
MERGIFYENGINE_AWS_ACCESS_KEY_ID=123567
MERGIFYENGINE_AWS_SECRET_ACCESS_KEY=
# Optional
MERGIFYENGINE_AWS_REGION_NAME=us-west-2
# Only needed if you use an S3-compatible provider other than Amazon
MERGIFYENGINE_AWS_ENDPOINT_URL_S3=s3://my-s3-domain.example.com:1234/