Logs management

Clever Cloud's new logs stack, based on Vector and Apache Pulsar, is available for applications and through the new dedicated Console section, in public beta. This Web Component allows you to check live or past logs. You can target a specific time window, select log lines, and copy them to the clipboard with the keyboard and/or mouse. The settings panel offers many parameters, such as dark/light theme, line wrapping, ANSI code escaping, etc. You can also choose the date/time format (UTC or local time) and whether to show instance names.

New logs interfaces

During the beta, you can send us your feedback through our GitHub Community.

Logs are retained for 7 days, sometimes longer for specific customers/needs. On our old stack, they are flushed daily at midnight.

Get continuous logs from your application

Log management is also available through Clever Tools and our APIv4. Logs are collected and sent through the Vector service enabled in every application deployed on Clever Cloud. To disable collection, set the CC_PREVENT_LOGSCOLLECTION environment variable to true (for example with clever env set CC_PREVENT_LOGSCOLLECTION true). You can then read logs with the command below.

clever logs

You can add --since, followed by a duration or a date, and --until, followed by a date; dates use the ISO 8601 format.

clever logs --since 2h
clever logs --until 2024-04-15T13:37:42Z 
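As a sketch, here is one way to build ISO 8601 UTC timestamps for these flags from the shell (this assumes GNU coreutils date; the -v fallback covers BSD/macOS):

```shell
# Timestamp for --since, two hours in the past (GNU date first, BSD fallback)
SINCE=$(date -u -d '2 hours ago' +%Y-%m-%dT%H:%M:%SZ 2>/dev/null \
  || date -u -v-2H +%Y-%m-%dT%H:%M:%SZ)
# Timestamp for --until: now, in UTC
UNTIL=$(date -u +%Y-%m-%dT%H:%M:%SZ)
echo "clever logs --since $SINCE --until $UNTIL"
```

Both variables expand to values like 2024-04-15T13:37:42Z, which is the format the flags expect.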

You can also get your add-on's logs by using the --addon flag; the value must be the add-on ID, starting with addon_.

clever logs --addon <addon_xxx>
With add-ons, clever logs only returns the last 1000 log lines.

Access logs

Access logs contain all incoming requests to your application. Here is an example:

- - [06/Feb/2020:07:59:22 +0100] "GET /aget/to/your/beautiful/website -" 200 1453

They are available in different formats; the most common is CLF, which stands for Common Log Format.

You can see access logs with the following command:

clever accesslogs

As with the logs command, you can specify the --before and --after flags, as well as --follow to display access logs continuously.

If you need to change the output, you can specify the --format flag with one of these values:

  • simple: 2021-06-25T10:11:35.358Z GET /

  • extended: 2021-06-25T10:11:35.358Z [ - Nantes, FR ] GET www.clever-cloud.com / 200

  • clf: - - [25/Jun/2021:12:11:35 +0200] "GET / -" 200 562

  • json:


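Since CLF lines are whitespace-delimited, standard tools can pick fields out of them. A small sketch extracting the status code and response size from the clf sample above with awk (the field positions assume the default CLF layout, where status and size are the last two fields):

```shell
# Sample access log line in Common Log Format
line='- - [25/Jun/2021:12:11:35 +0200] "GET / -" 200 562'
# Status code is the next-to-last field, response size the last one
echo "$line" | awk '{print "status=" $(NF-1), "bytes=" $NF}'
```

This prints status=200 bytes=562 for the sample line.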
Exporting logs to an external tool

You can use log drains to send your application's logs to an external server with the following command.

clever drain create [--alias <alias>] <DRAIN-TYPE> <DRAIN-URL> [--username <username>] [--password <password>]

Where DRAIN-TYPE is one of:

  • TCPSyslog: for a TCP syslog endpoint;
  • UDPSyslog: for a UDP syslog endpoint;
  • HTTP: for an HTTP endpoint (note that this endpoint has optional username/password parameters as HTTP Basic Authentication);
  • ElasticSearch: for an Elasticsearch endpoint (note that this endpoint requires username/password parameters as HTTP Basic Authentication);
  • DatadogHTTP: for a Datadog endpoint (note that this endpoint needs your Datadog API key);
  • NewRelicHTTP: for a New Relic endpoint (note that this endpoint needs your New Relic API key).

You can list the currently activated drains with this command.

clever drain [--alias <alias>]

And remove one if needed.

clever drain remove [--alias <alias>] <DRAIN-ID>

If the status of your drain is shown as DISABLED without you disabling it, it may be because we have not been able to send your logs to your drain endpoint or because the requests timed out after 25 seconds.

You can also use a log drain to send your add-on's logs by using the --addon flag; the value must be the add-on ID, starting with addon_.


ElasticSearch drains use the Elastic bulk API. To match this endpoint, add /_bulk at the end of your Elasticsearch endpoint URL.

clever drain create ElasticSearch https://xxx-elasticsearch.services.clever-cloud.com/_bulk --username USERNAME --password PASSWORD

Each day, we will create an index logstash-<yyyy-MM-dd> and push logs to it.
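A small sketch to compute the current day's index name from the shell (assuming, on our part, that the UTC date drives the daily rollover):

```shell
# Daily index name following the logstash-<yyyy-MM-dd> pattern, in UTC
INDEX="logstash-$(date -u +%Y-%m-%d)"
echo "$INDEX"
```

This is handy when querying or curating the day's index with curl.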

Index Lifecycle Management

Depending on the amount of logs generated by your application, you might want to manage the lifecycle of your log indexes to prevent your Elasticsearch instance from running out of storage space.

To do so, Elasticsearch provides a feature called Index Lifecycle Management that allows you to create a policy to delete indexes based on their creation date.

With our Elasticsearch add-on, you can choose to create a Kibana application in which you can create the policy and apply it to your indexes with an index template, but you can also create them manually through API requests.

Here is an example that will create a policy to delete indexes older than 30 days:

curl -X PUT "https://username:password@xxx-elasticsearch.services.clever-cloud.com/_ilm/policy/logs_drain?pretty" -H 'Content-Type: application/json' -d'
{
  "policy": {
    "phases": {
      "delete": {
        "min_age": "30d",
        "actions": {
          "delete": {}
        }
      }
    }
  }
}'

An index template example to apply the policy based on an index pattern:

curl -X PUT "https://username:password@xxx-elasticsearch.services.clever-cloud.com/_index_template/logs_drain?pretty" -H 'Content-Type: application/json' -d'
{
  "index_patterns": ["logstash-*"],
  "template": {
    "settings": {
      "index.lifecycle.name": "logs_drain"
    }
  }
}'

For more information, please refer to the official documentation.


To create a Datadog drain, you just need to use:

clever drain create DatadogHTTP "https://http-intake.logs.datadoghq.com/v1/input/<API_KEY>?ddsource=clevercloud&service=<SERVICE>&hostname=<HOST>"

Datadog has two zones, EU and US. An account on one zone is not available on the other, so make sure to target the right intake endpoint (datadoghq.eu or datadoghq.com).


To create a New Relic drain, you just need to use:

clever drain create NewRelicHTTP "https://log-api.eu.newrelic.com/log/v1" --api-key "<API_KEY>"

New Relic has two zones, EU and US. An account on one zone is not available on the other, so make sure to target the right intake endpoint (log-api.eu.newrelic.com or log-api.newrelic.com).

Community software

Community software isn't directly supported by Clever Cloud: it's developed by our community, and we don't guarantee its maintenance or correct functioning. You are better off opening issues on the relevant GitHub repositories than contacting Clever Cloud support.

HTTPS-based solution

Some tools available on GitHub let you create a drain that collects logs through an HTTPS endpoint. This project, for example, is fully compatible with Clever Cloud.

You can host it as an app and an add-on on Clever Cloud. A complete README explains all its features.
