diff --git a/promtail/.DS_Store b/promtail/.DS_Store deleted file mode 100644 index f2c42ff..0000000 Binary files a/promtail/.DS_Store and /dev/null differ diff --git a/promtail/DOCS.md b/promtail/DOCS.md deleted file mode 100644 index 94da072..0000000 --- a/promtail/DOCS.md +++ /dev/null @@ -1,327 +0,0 @@ -# Home Assistant Add-on: Promtail - -## Install - -First add the repository to the add-on store (`https://gitea.bonelle-family.dscloud.biz/francois.bonelle/hassio-repo.git`): - -[![Open your Home Assistant instance and show the add add-on repository dialog -with a specific repository URL pre-filled.][add-repo-shield]][add-repo] - -Then find Promtail in the store and click install: - -[![Open your Home Assistant instance and show the dashboard of a Supervisor add-on.][add-addon-shield]][add-addon] - -## Default Setup - -By default this addon version of Promtail will tail logs from the systemd -journal. This will include all logs from all addons, supervisor, home assistant, -docker, and the host system itself. It will then ship them to the Loki add-on in -this same repository if you have it installed. No additional configuration is -required if this is the setup you want. - -If you adjusted the configuration of the Loki add-on, have a separate Loki -add-on or have other log files you want Promtail to monitor then see below for -the configuration options. - -## Configuration - -**Note**: _Remember to restart the add-on when the configuration is changed._ - -Example add-on configuration: - -```yaml -client: - url: http://39bd2704-loki:3100 - username: loki - password: secret - cafile: /ssl/ca.pem -additional_scrape_configs: /share/promtail/scrape_configs.yaml -log_level: info -``` - -**Note**: _This is just an example, don't copy and paste it! Create your own!_ - -### Option: `client.url` (required) - -The URL of the Loki deployment Promtail should ship logs to. - -If you use the Loki add-on, this will be `http://39bd2704-loki:3100` (unless you -enabled `ssl`, then change it to `https`). If you use Grafana Cloud then the URL -will look like this: `https://:@logs-prod-us-central1.grafana.net/api/prom/push` -([see here for more info][grafana-cloud-docs-promtail]). - -### Option: `client.username` - -The username to use if you require basic auth to connect to your Loki deployment. - -### Option: `client.password` - -The password for the username you choose if you require basic auth to connect to -your Loki deployment. **Note**: This field is required if `client.username` is -provided. - -### Option: `client.cafile` - -The CA certificate used to sign Loki's certificate if Loki is using a self-signed -certificate for SSL. - -**Note**: _The file MUST be stored in `/ssl/`, which is the default_ - -### Option: `client.servername` - -The servername listed on the certificate Loki is using if using SSL to connect -by a different URL then what's on Loki's certificate (usually if the certificate -lists a public URL and you're connecting locally). - -### Option: `client.certfile` - -The absolute path to a certificate for client-authentication if Loki is using -mTLS to authenticate clients. - -### Option: `client.keyfile` - -The absolute path to the key for the client-authentication certificate if Loki -is using mTLS to authenticate clients. **Note**: This field is required if -`client.certfile` is provided - -### Option: `additional_pipeline_stages` - -The absolute path to a YAML file with a list of additional pipeline stages to -apply to the [default journal scrape config][addon-default-config]. 
The primary use of this is to apply additional processing to logs from
-particular add-ons you use if they are noisy or difficult to read.
-
-This file must contain only a YAML list of pipeline stages. They will be added
-to the end of the ones already listed. If you don't like the ones listed, use
-`skip_default_scrape_config` and `additional_scrape_configs` to write your own
-instead. Here's an example of the contents of this file:
-
-```yaml
-- match:
-    selector: '{container_name="addon_cebe7a76_hassio_google_drive_backup"}'
-    stages:
-      - multiline:
-          firstline: '^\x{001b}'
-```
-
-This particular example applies to the [Google Drive Backup add-on][addon-google-drive-backup].
-That add-on uses the same log format as Home Assistant and outputs the escape
-character at the start of each log line for color-coding in terminals. Matching
-on that character in a multiline stage means tracebacks end up in the same log
-entry as the error that caused them, for easier readability.
-
-See the [promtail documentation][promtail-doc-stages] for more information on how
-to configure pipeline stages.
-
-**Note**: This addon has access to `/ssl`, `/share` and `/config/promtail`. Place
-the file in one of these locations; others will not work.
-
-### Option: `skip_default_scrape_config`
-
-Promtail will scrape the systemd journal using a pre-defined config you can
-find [here][addon-default-config]. If you only want it to look at specific log
-files you specify, or you don't like the default config and want to adjust it,
-set this to `true`. Then the only scrape configs used will be the ones you
-specify in the `additional_scrape_configs` file.
-
-**Note**: This addon has access to `/ssl`, `/share` and `/config/promtail`. Place
-the file in one of these locations; others will not work.
-
-### Option: `additional_scrape_configs`
-
-The absolute path to a YAML file with a list of additional scrape configs for
-Promtail to use. The primary use of this is to point Promtail at additional log
-files created by add-ons which don't use `stdout` for all logging. You can also
-change how the system journal is scraped by using this in conjunction with
-`skip_default_scrape_config`. **Note**: If `skip_default_scrape_config` is `true`,
-this field becomes required (otherwise there would be no scrape configs).
-
-The file must contain only a YAML list of scrape configs. Here's an example of
-the contents of this file:
-
-```yaml
-- job_name: zigbee2mqtt_messages
-  pipeline_stages:
-  static_configs:
-    - targets:
-        - localhost
-      labels:
-        job: zigbee2mqtt_messages
-        __path__: /share/zigbee2mqtt/log/**.txt
-```
-
-This particular example would have Promtail scrape the log files that the
-[Zigbee2MQTT add-on][addon-z2m] writes by default.
-
-Promtail provides a lot of options for configuring scrape configs. See the
-documentation on [scrape_configs][promtail-doc-scrape-configs] for more info on
-the options available and how to configure them. The documentation also provides
-[other examples][promtail-doc-examples] you can use.
-
-I would also recommend reading the [Loki best practices][loki-doc-best-practices]
-guide before making custom scrape configs. Pipelines are powerful, but avoid
-creating too many labels; it does more harm than good. Instead, look into what
-you can do with [LogQL][logql] at the other end.
-
-**Note**: This addon has access to `/ssl`, `/share` and `/config/promtail`. Place
-the file in one of these locations; others will not work.
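For illustration, here is a minimal sketch of a replacement journal scrape config that such a file could contain when `skip_default_scrape_config` is `true`. The file path, label choices, and journal path below are examples only, not values shipped with the add-on:

```yaml
# Hypothetical contents of /share/promtail/scrape_configs.yaml: a slimmed-down
# journal job that keeps only the systemd unit and container name as labels.
- job_name: journal
  journal:
    max_age: 12h
    path: /var/log/journal  # use /run/log/journal if that is where your journal lives
    labels:
      job: systemd-journal
  relabel_configs:
    - source_labels:
        - __journal__systemd_unit
      target_label: unit
    - source_labels:
        - __journal_container_name
      target_label: container_name
```

Because the file is just a list, journal jobs and file-based jobs can be mixed in the same file.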
-
-### Port: `9080/tcp`
-
-Promtail exposes an [API][api] on this port. This is primarily used for
-healthchecks by the supervisor watchdog, which does not require exposing it on
-the host. Most users should leave this option disabled unless you have an
-external application doing healthchecks.
-
-For advanced users creating custom scrape configs, the other purpose of this API
-is to expose metrics created by the [metrics][promtail-doc-metrics] pipeline
-stage. Exposing this port would then allow you to read those metrics from another
-system on your network.
-
-### Option: `log_level`
-
-The `log_level` option controls the level of log output by the addon and can
-be changed to be more or less verbose, which might be useful when you are
-dealing with an unknown issue. Possible values are:
-
-- `debug`: Shows detailed debug information.
-- `info`: Normal (usually) interesting events.
-- `warning`: Exceptional occurrences that are not errors.
-- `error`: Runtime errors that do not require immediate action.
-
-Please note that each level automatically includes log messages from a
-more severe level, e.g., `debug` also shows `info` messages. By default,
-the `log_level` is set to `info`, which is the recommended setting unless
-you are troubleshooting.
-
-## PLG Stack (Promtail, Loki and Grafana)
-
-Promtail isn't a standalone application; its job is to find logs, process them,
-and ship them to Loki. Most likely you want the full PLG stack:
-
-- Promtail to process and ship logs
-- Loki to aggregate and index them
-- Grafana to visualize and monitor them
-
-### Loki
-
-The easiest way to get started is to also install the Loki add-on in this same
-repository. By default this add-on is set up to collect all logs from the system
-journal and ship them over to that add-on. If that's what you want, there is no
-additional configuration required for either.
-
-[![Open your Home Assistant instance and show the dashboard of a Supervisor add-on.][add-addon-shield]][add-addon-loki]
-
-Alternatively you can deploy Loki somewhere else. Take a look at the
-[Loki documentation][loki-doc] for info on setting up a Loki deployment and
-connecting Promtail to it.
-
-### Grafana
-
-Once you have Loki and Promtail set up, you will probably want to connect to
-Loki from [Grafana][grafana]. The easiest way to do that is to use the
-[Grafana community add-on][addon-grafana]. From there you can find Loki in the
-list of data sources and configure the connection (see [Loki in Grafana][loki-in-grafana]).
-If you did choose to use the Loki add-on, you can find additional instructions in
-[the add-on's documentation][addon-loki-doc].
-
-[![Open your Home Assistant instance and show the dashboard of a Supervisor add-on.][add-addon-shield]][add-addon-grafana]
-
-### Grafana Cloud
-
-If you prefer, you can also use Grafana's cloud service,
-[see here](https://grafana.com/products/cloud/) for how to get started. This
-service takes the place of both Loki and Grafana in the PLG stack; you just
-point Promtail at it and you're done. To do this, first create an account,
-then [review this guide][grafana-cloud-docs-promtail] to see how to connect
-Promtail to your account. Essentially it's just a different URL, since the
-credential information goes in the URL.
-
-## Changelog & Releases
-
-This repository keeps a change log using [GitHub's releases][releases]
-functionality.
-
-Releases are based on [Semantic Versioning][semver], and use the format
-of `MAJOR.MINOR.PATCH`.
In a nutshell, the version will be incremented -based on the following: - -- `MAJOR`: Incompatible or major changes. -- `MINOR`: Backwards-compatible new features and enhancements. -- `PATCH`: Backwards-compatible bugfixes and package updates. - -## Support - -Got questions? - -You have several ways to get them answered: - -- The Home Assistant [Community Forum][forum]. I am - [CentralCommand][forum-centralcommand] there. -- The Home Assistant [Discord Chat Server][discord-ha]. Use the #add-ons channel, - I am CentralCommand#0913 there. - -You could also [open an issue here][issue] on GitHub. - -## Authors & contributors - -The original setup of this repository is by [Mike Degatano][mdegat01]. - -For a full list of all authors and contributors, -check [the contributor's page][contributors]. - -## License - -MIT License - -Copyright (c) 2021-2022 Mike Degatano - -Permission is hereby granted, free of charge, to any person obtaining a copy -of this software and associated documentation files (the "Software"), to deal -in the Software without restriction, including without limitation the rights -to use, copy, modify, merge, publish, distribute, sublicense, and/or sell -copies of the Software, and to permit persons to whom the Software is -furnished to do so, subject to the following conditions: - -The above copyright notice and this permission notice shall be included in all -copies or substantial portions of the Software. - -THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR -IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, -FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE -AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER -LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, -OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE -SOFTWARE. 
- -[add-addon-shield]: https://my.home-assistant.io/badges/supervisor_addon.svg -[add-addon]: https://my.home-assistant.io/redirect/supervisor_addon/?addon=39bd2704_promtail -[add-addon-grafana]: https://my.home-assistant.io/redirect/supervisor_addon/?addon=a0d7b954_grafana -[add-addon-loki]: https://my.home-assistant.io/redirect/supervisor_addon/?addon=39bd2704_loki -[add-repo-shield]: https://my.home-assistant.io/badges/supervisor_add_addon_repository.svg -[add-repo]: https://my.home-assistant.io/redirect/supervisor_add_addon_repository/?repository_url=https%3A%2F%2Fgithub.com%2Fmdegat01%2Fhassio-addons -[addon-default-config]: https://github.com/mdegat01/addon-promtail/blob/main/promtail/rootfs/etc/promtail/default-scrape-config.yaml -[addon-grafana]: https://github.com/hassio-addons/addon-grafana -[addon-google-drive-backup]: https://github.com/sabeechen/hassio-google-drive-backup -[addon-loki-doc]: https://github.com/mdegat01/addon-loki/blob/main/loki/DOCS.md#grafana -[addon-z2m]: https://github.com/zigbee2mqtt/hassio-zigbee2mqtt -[api]: https://grafana.com/docs/loki/latest/clients/promtail/#api -[contributors]: https://github.com/mdegat01/addon-promtail/graphs/contributors -[discord-ha]: https://discord.gg/c5DvZ4e -[forum-centralcommand]: https://community.home-assistant.io/u/CentralCommand/?u=CentralCommand -[forum]: https://community.home-assistant.io/t/home-assistant-add-on-promtail/293732?u=CentralCommand -[grafana]: https://grafana.com/oss/grafana/ -[grafana-cloud]: https://grafana.com/products/cloud/ -[grafana-cloud-docs-promtail]: https://grafana.com/docs/grafana-cloud/quickstart/logs_promtail_linuxnode/ -[issue]: https://github.com/mdegat01/addon-promtail/issues -[logql]: https://grafana.com/docs/loki/latest/logql/ -[loki-doc]: https://grafana.com/docs/loki/latest/overview/ -[loki-doc-best-practices]: https://grafana.com/docs/loki/latest/best-practices/ -[loki-in-grafana]: https://grafana.com/docs/loki/latest/getting-started/grafana/ -[mdegat01]: https://github.com/mdegat01 -[promtail-doc-examples]: https://grafana.com/docs/loki/latest/clients/promtail/configuration/#example-static-config -[promtil-doc-metrics]: https://grafana.com/docs/loki/latest/clients/promtail/configuration/#metrics -[promtail-doc-scrape-configs]: https://grafana.com/docs/loki/latest/clients/promtail/configuration/#scrape_configs -[promtail-doc-stages]: https://grafana.com/docs/loki/latest/clients/promtail/stages/ -[releases]: https://github.com/mdegat01/addon-promtail/releases -[semver]: http://semver.org/spec/v2.0.0 diff --git a/promtail/Dockerfile b/promtail/Dockerfile deleted file mode 100644 index 9ffbba6..0000000 --- a/promtail/Dockerfile +++ /dev/null @@ -1,67 +0,0 @@ -ARG BUILD_FROM -FROM ${BUILD_FROM} - -# Build arguments -ARG BUILD_ARCH -ARG BUILD_DATE -ARG BUILD_DESCRIPTION -ARG BUILD_NAME -ARG BUILD_REF -ARG BUILD_REPOSITORY -ARG BUILD_VERSION -ARG YQ_VERSION -ARG PROMTAIL_VERSION - -# Add yq and tzdata (required for the timestamp stage) -RUN set -eux; \ - apt-get update; \ - apt-get install -qy --no-install-recommends \ - tar \ - unzip \ - libsystemd-dev \ - ; \ - update-ca-certificates; \ - case "${BUILD_ARCH}" in \ - amd64) BINARCH='amd64' ;; \ - armhf) BINARCH='arm' ;; \ - armv7) BINARCH='arm' ;; \ - aarch64) BINARCH='arm64' ;; \ - *) echo >&2 "error: unsupported architecture (${APKARCH})"; exit 1 ;; \ - esac; \ - curl -s -J -L -o /tmp/yq.tar.gz \ - "https://github.com/mikefarah/yq/releases/download/v${YQ_VERSION}/yq_linux_${BINARCH}.tar.gz"; \ - tar -xf /tmp/yq.tar.gz -C /usr/bin; \ - mv 
/usr/bin/yq_linux_${BINARCH} /usr/bin/yq; \ - chmod a+x /usr/bin/yq; \ - rm /tmp/yq.tar.gz; \ - yq --version; \ - apt-get clean; \ - rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*; \ - curl -s -J -L -o /tmp/promtail.zip \ - "https://github.com/grafana/loki/releases/download/v${PROMTAIL_VERSION}/promtail-linux-${BINARCH}.zip"; \ - unzip /tmp/promtail.zip -d /usr/bin; \ - mv /usr/bin/promtail-linux-${BINARCH} /usr/bin/promtail; \ - chmod +x /usr/bin/promtail; \ - rm /tmp/promtail.zip; \ - mkdir -p /data/promtail; \ - rm -rf /var/lib/apt/lists/* - -COPY rootfs / -WORKDIR /data/promtail - -# Labels -LABEL \ - io.hass.name="${BUILD_NAME}" \ - io.hass.description="${BUILD_DESCRIPTION}" \ - io.hass.arch="${BUILD_ARCH}" \ - io.hass.type="addon" \ - io.hass.version=${BUILD_VERSION} \ - maintainer="fbonelle" \ - org.opencontainers.image.title="${BUILD_NAME}" \ - org.opencontainers.image.description="${BUILD_DESCRIPTION}" \ - org.opencontainers.image.vendor="fbonelle's addons" \ - org.opencontainers.image.authors="fbonelle" \ - org.opencontainers.image.licenses="MIT" \ - org.opencontainers.image.created=${BUILD_DATE} \ - org.opencontainers.image.revision=${BUILD_REF} \ - org.opencontainers.image.version=${BUILD_VERSION} \ No newline at end of file diff --git a/promtail/apparmor.txt b/promtail/apparmor.txt deleted file mode 100644 index 910a878..0000000 --- a/promtail/apparmor.txt +++ /dev/null @@ -1,123 +0,0 @@ -include - -# Docker overlay -@{docker_root}=/docker/ /var/lib/docker/ -@{fs_root}=/ @{docker_root}/overlay2/*/diff/ -@{do_etc}=@{fs_root}/etc/ -@{do_opt}=@{fs_root}/opt/ -@{do_run}=@{fs_root}/{run,var/run}/ -@{do_usr}=@{fs_root}/usr/ -@{do_var}=@{fs_root}/var/ - -# Systemd Journal location -@{journald}=/{run,var}/log/journal/{,**} - -profile hassio_promtail flags=(attach_disconnected,mediate_deleted) { - include - include - - # Send signals to child services - signal (send) peer=@{profile_name}//*, - - # Network access - network tcp, - network udp, - - # S6-Overlay - /init rix, - /bin/** rix, - @{do_usr}/bin/** rix, - @{do_usr}/sbin/** rix, - @{do_usr}/share/{,**} r, - @{do_usr}/lib/locale/{,**} r, - @{do_etc}/* rw, - @{do_etc}/s6*/** r, - @{do_etc}/fix-attrs.d/{,**} r, - @{do_etc}/cont-{init,finish}.d/{,**} rwix, - @{do_etc}/services.d/{,**} rwix, - @{do_etc}/ssl/openssl.cnf r, - @{do_etc}/{group,hosts,passwd} r, - @{do_etc}/{host,nsswitch,resolv}.conf r, - @{do_run}/{s6,s6-rc*,service}/** rix, - @{do_run}/{,**} rwk, - /var/cache/{,**} rw, - /dev/tty rw, - /dev/null k, - /command/** rix, - /package/** rix, - - - # Bashio - @{do_usr}/lib/bashio/** ix, - /tmp/** rw, - - # Options.json & addon data - /data r, - /data/** rw, - - # Files needed for setup - @{do_etc}/promtail/{,**} rw, - /config/promtail/{,**} r, - /{share,ssl}/{,**} r, - @{journald} r, - - # Programs - /usr/bin/promtail cx -> promtail_profile, - /usr/bin/yq Cx -> yq_profile, - /usr/sbin/dpkg-reconfigure Cx -> dpkg_reconfigure_profile, - - profile promtail_profile flags=(attach_disconnected,mediate_deleted) { - include - - # Receive signals from s6 - signal (receive) peer=*_promtail, - - # Network access - network tcp, - network udp, - network netlink raw, - network unix dgram, - - # Temp files - /tmp/.positions.yaml* rw, - - # Addon data - /data/** r, - /data/promtail/** rwk, - - # Config & log data - @{do_etc}/promtail/config.yaml r, - /config/promtail/{,**} r, - /{share,ssl}/** r, - @{journald} r, - - # Runtime usage - /usr/bin/promtail rm, - @{do_etc}/{hosts,passwd} r, - @{do_etc}/{resolv,nsswitch}.conf r, - 
@{PROC}/sys/net/core/somaxconn r, - @{sys}/kernel/mm/transparent_hugepage/hpage_pmd_size r, - /dev/null k, - @{do_etc}/ssl/certs/** r, - } - - profile yq_profile flags=(attach_disconnected,mediate_deleted) { - include - - # Config files - @{do_etc}/promtail/* rw, - /config/promtail/{,**} r, - /share/** r, - - # Runtime usage - /usr/bin/yq rm, - @{sys}/kernel/mm/transparent_hugepage/hpage_pmd_size r, - /dev/null k, - } - - profile dpkg_reconfigure_profile flags=(attach_disconnected,mediate_deleted) { - include - - /** rwlkmix, - } -} \ No newline at end of file diff --git a/promtail/build.yaml b/promtail/build.yaml deleted file mode 100644 index e8c2a29..0000000 --- a/promtail/build.yaml +++ /dev/null @@ -1,10 +0,0 @@ ---- -build_from: - aarch64: ghcr.io/hassio-addons/debian-base:8.1.4 - amd64: ghcr.io/hassio-addons/debian-base:8.1.4 - armhf: ghcr.io/hassio-addons/debian-base:8.1.4 - armv7: ghcr.io/hassio-addons/debian-base:8.1.4 - i386: ghcr.io/hassio-addons/debian-base:8.1.4 -args: - YQ_VERSION: 4.48.1 - PROMTAIL_VERSION: 3.5.7 \ No newline at end of file diff --git a/promtail/config.yaml b/promtail/config.yaml deleted file mode 100644 index 8dd8d9a..0000000 --- a/promtail/config.yaml +++ /dev/null @@ -1,39 +0,0 @@ ---- -name: Promtail -url: https://gitea.bonelle-family.dscloud.biz/francois.bonelle/hassio-repo.git -version: 3.5.7 -slug: hassio_promtail -arch: - - aarch64 - - amd64 - - armv7 - - armhf -description: Promtail for Home Assistant -init: false -journald: true -map: - - config - - share - - ssl -watchdog: http://[HOST]:[PORT:9080]/ready -ports: - 9080/tcp: -ports_description: - 9080/tcp: Promtail web server -options: - client: - url: http://loki:3100/loki/api/v1/push - log_level: info -schema: - client: - url: str - username: str? - password: password? - cafile: str? - servername: str? - certfile: str? - keyfile: str? - additional_pipeline_stages: str? - additional_scrape_configs: str? - skip_default_scrape_config: bool? - log_level: list(trace|debug|info|notice|warning|error|fatal)? 
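As a worked example of the `options`/`schema` in the add-on manifest above, a filled-in user configuration might look like the following. Every value here is a placeholder chosen for illustration, not a shipped default:

```yaml
client:
  url: https://39bd2704-loki:3100  # Loki add-on with SSL enabled
  username: loki
  password: super-secret
  cafile: ca.pem  # the setup script prepends /ssl/ to this value
  servername: loki.example.com  # only needed if the certificate name differs from the URL
additional_pipeline_stages: /share/promtail/pipeline_stages.yaml
additional_scrape_configs: /share/promtail/scrape_configs.yaml
skip_default_scrape_config: false
log_level: info
```

Optional keys such as `client.certfile` and `client.keyfile` can simply be omitted; the schema marks them with `?`.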
diff --git a/promtail/icon.png b/promtail/icon.png deleted file mode 100644 index 8e89cb4..0000000 Binary files a/promtail/icon.png and /dev/null differ diff --git a/promtail/logo.png b/promtail/logo.png deleted file mode 100644 index db5ec01..0000000 Binary files a/promtail/logo.png and /dev/null differ diff --git a/promtail/rootfs/.DS_Store b/promtail/rootfs/.DS_Store deleted file mode 100644 index e412cbf..0000000 Binary files a/promtail/rootfs/.DS_Store and /dev/null differ diff --git a/promtail/rootfs/etc/.DS_Store b/promtail/rootfs/etc/.DS_Store deleted file mode 100644 index c2d418d..0000000 Binary files a/promtail/rootfs/etc/.DS_Store and /dev/null differ diff --git a/promtail/rootfs/etc/cont-init.d/promtail_setup.sh b/promtail/rootfs/etc/cont-init.d/promtail_setup.sh deleted file mode 100755 index f8a95eb..0000000 --- a/promtail/rootfs/etc/cont-init.d/promtail_setup.sh +++ /dev/null @@ -1,129 +0,0 @@ -#!/usr/bin/with-contenv bashio -# shellcheck shell=bash -# ============================================================================== -# Home Assistant Add-on: Promtail -# This file makes the config file from inputs -# ============================================================================== -readonly CONFIG_DIR=/etc/promtail -readonly CONFIG_FILE="${CONFIG_DIR}/config.yaml" -readonly BASE_CONFIG="${CONFIG_DIR}/base_config.yaml" -readonly DEF_SCRAPE_CONFIGS="${CONFIG_DIR}/default-scrape-config.yaml" -readonly CUSTOM_SCRAPE_CONFIGS="${CONFIG_DIR}/custom-scrape-config.yaml" -declare cafile -declare add_stages -declare add_scrape_configs -scrape_configs="${DEF_SCRAPE_CONFIGS}" - -bashio::log.info 'Setting base config for promtail...' -cp "${BASE_CONFIG}" "${CONFIG_FILE}" - -# Set up client section -if ! bashio::config.is_empty 'client.username'; then - bashio::log.info 'Adding basic auth to client config...' - bashio::config.require 'client.password' "'client.username' is specified" - { - echo " basic_auth:" - echo " username: $(bashio::config 'client.username')" - echo " password: $(bashio::config 'client.password')" - } >> "${CONFIG_FILE}" -fi - -if ! bashio::config.is_empty 'client.cafile'; then - bashio::log.info "Adding TLS to client config..." - cafile="/ssl/$(bashio::config 'client.cafile')" - - if ! bashio::fs.file_exists "${cafile}"; then - bashio::log.fatal - bashio::log.fatal "The file specified for 'cafile' does not exist!" - bashio::log.fatal "Ensure the CA certificate file exists and full path is provided" - bashio::log.fatal - bashio::exit.nok - fi - { - echo " tls_config:" - echo " ca_file: ${cafile}" - } >> "${CONFIG_FILE}" - - if ! bashio::config.is_empty 'client.servername'; then - echo " server_name: $(bashio::config 'client.servername')" >> "${CONFIG_FILE}" - fi - - if ! bashio::config.is_empty 'client.certfile'; then - bashio::log.info "Adding mTLS to client config..." - bashio::config.require 'client.keyfile' "'client.certfile' is specified" - if ! bashio::fs.file_exists "$(bashio::config 'client.certfile')"; then - bashio::log.fatal - bashio::log.fatal "The file specified for 'certfile' does not exist!" - bashio::log.fatal "Ensure the certificate file exists and full path is provided" - bashio::log.fatal - bashio::exit.nok - fi - if ! bashio::fs.file_exists "$(bashio::config 'client.keyfile')"; then - bashio::log.fatal - bashio::log.fatal "The file specified for 'keyfile' does not exist!" 
- bashio::log.fatal "Ensure the key file exists and full path is provided" - bashio::log.fatal - bashio::exit.nok - fi - { - echo " cert_file: $(bashio::config 'client.certfile')" - echo " key_file: $(bashio::config 'client.keyfile')" - } >> "${CONFIG_FILE}" - fi -fi - -# Add in scrape configs -{ - echo - echo "scrape_configs:" -} >> "${CONFIG_FILE}" -if bashio::config.true 'skip_default_scrape_config'; then - bashio::log.info 'Skipping default journald scrape config...' - if ! bashio::config.is_empty 'additional_pipeline_stages'; then - bashio::log.warning - bashio::log.warning "'additional_pipeline_stages' ignored since 'skip_default_scrape_config' is true!" - bashio::log.warning 'See documentation for more information.' - bashio::log.warning - fi - bashio::config.require 'additional_scrape_configs' "'skip_default_scrape_config' is true" - -elif ! bashio::config.is_empty 'additional_pipeline_stages'; then - bashio::log.info "Adding additional pipeline stages to default journal scrape config..." - add_stages="$(bashio::config 'additional_pipeline_stages')" - scrape_configs="${CUSTOM_SCRAPE_CONFIGS}" - if ! bashio::fs.file_exists "${add_stages}"; then - bashio::log.fatal - bashio::log.fatal "The file specified for 'additional_pipeline_stages' does not exist!" - bashio::log.fatal "Ensure the file exists at the path specified" - bashio::log.fatal - bashio::exit.nok - fi - - yq -NP eval-all \ - 'select(fi == 0) + [{"add_pipeline_stages": select(fi == 1)}]' \ - "${DEF_SCRAPE_CONFIGS}" "${add_stages}" \ - | yq -NP e \ - '[(.[0] * .[1]) | {"job_name": .job_name, "journal": .journal, "relabel_configs": .relabel_configs, "pipeline_stages": .pipeline_stages + .add_pipeline_stages}]' \ - - > "${scrape_configs}" -fi - -if ! bashio::config.is_empty 'additional_scrape_configs'; then - bashio::log.info "Adding custom scrape configs..." - add_scrape_configs="$(bashio::config 'additional_scrape_configs')" - if ! bashio::fs.file_exists "${add_scrape_configs}"; then - bashio::log.fatal - bashio::log.fatal "The file specified for 'additional_scrape_configs' does not exist!" - bashio::log.fatal "Ensure the file exists at the path specified" - bashio::log.fatal - bashio::exit.nok - fi - - if bashio::config.true 'skip_default_scrape_config'; then - yq -NP e '[] + .' "${add_scrape_configs}" >> "${CONFIG_FILE}" - else - yq -NP eval-all 'select(fi == 0) + select(fi == 1)' \ - "${scrape_configs}" "${add_scrape_configs}" >> "${CONFIG_FILE}" - fi -else - yq -NP e '[] + .' 
"${scrape_configs}" >> "${CONFIG_FILE}" -fi diff --git a/promtail/rootfs/etc/promtail/base_config.yaml b/promtail/rootfs/etc/promtail/base_config.yaml deleted file mode 100644 index 0e6eb25..0000000 --- a/promtail/rootfs/etc/promtail/base_config.yaml +++ /dev/null @@ -1,11 +0,0 @@ ---- -server: - http_listen_port: 9080 - grpc_listen_port: 0 - log_level: "${LOG_LEVEL}" - -positions: - filename: /data/promtail/positions.yaml - -clients: - - url: "${URL}" diff --git a/promtail/rootfs/etc/promtail/default-scrape-config.yaml b/promtail/rootfs/etc/promtail/default-scrape-config.yaml deleted file mode 100644 index 6e3f112..0000000 --- a/promtail/rootfs/etc/promtail/default-scrape-config.yaml +++ /dev/null @@ -1,27 +0,0 @@ ---- -- job_name: journal - journal: - json: false - max_age: 12h - labels: - job: systemd-journal - path: "${JOURNAL_PATH}" - relabel_configs: - - source_labels: - - __journal__systemd_unit - target_label: unit - - source_labels: - - __journal__hostname - target_label: nodename - - source_labels: - - __journal_syslog_identifier - target_label: syslog_identifier - - source_labels: - - __journal_container_name - target_label: container_name - pipeline_stages: - - match: - selector: '{container_name=~"homeassistant|hassio_supervisor"}' - stages: - - multiline: - firstline: '^\x{001b}' diff --git a/promtail/rootfs/etc/services.d/promtail/finish b/promtail/rootfs/etc/services.d/promtail/finish deleted file mode 100755 index 649e4e5..0000000 --- a/promtail/rootfs/etc/services.d/promtail/finish +++ /dev/null @@ -1,15 +0,0 @@ -#!/usr/bin/env bashio -# ============================================================================== -# Take down the S6 supervision tree when Promtail fails -# s6-overlay docs: https://github.com/just-containers/s6-overlay -# ============================================================================== - -declare APP_EXIT_CODE=${1} - -if [[ "${APP_EXIT_CODE}" -ne 0 ]] && [[ "${APP_EXIT_CODE}" -ne 256 ]]; then - bashio::log.warning "Halt add-on with exit code ${APP_EXIT_CODE}" - echo "${APP_EXIT_CODE}" > /run/s6-linux-init-container-results/exitcode - exec /run/s6/basedir/bin/halt -fi - -bashio::log.info "Service restart after closing" diff --git a/promtail/rootfs/etc/services.d/promtail/run b/promtail/rootfs/etc/services.d/promtail/run deleted file mode 100755 index d5f6a71..0000000 --- a/promtail/rootfs/etc/services.d/promtail/run +++ /dev/null @@ -1,40 +0,0 @@ -#!/usr/bin/with-contenv bashio -# shellcheck shell=bash -# ============================================================================== -# Home Assistant Add-on: Promtail -# Runs Promtail -# ============================================================================== -readonly PROMTAIL_CONFIG='/etc/promtail/config.yaml' -declare log_level - -bashio::log.info 'Starting Promtail...' - -journal_path='/var/log/journal' -if ! 
bashio::fs.directory_exists "${journal_path}" || [ -z "$(ls -A ${journal_path})" ]; then - bashio::log.info "No journal at ${journal_path}, looking for journal in /run/log/journal instead" - journal_path='/run/log/journal' -fi - -case "$(bashio::config 'log_level')" in \ - trace) ;& \ - debug) log_level='debug' ;; \ - notice) ;& \ - warning) log_level='warn' ;; \ - error) ;& \ - fatal) log_level='error' ;; \ - *) log_level='info' ;; \ -esac; -bashio::log.info "Promtail log level set to ${log_level}" - -export "URL=$(bashio::config 'client.url')" -export "JOURNAL_PATH=${journal_path}" -export "LOG_LEVEL=${log_level}" - -promtail_args=("-config.expand-env=true" "-config.file=${PROMTAIL_CONFIG}") -if [ "${log_level}" == "debug" ]; then - bashio::log.debug "Logging full config on startup for debugging..." - promtail_args+=("-print-config-stderr=true") -fi - -bashio::log.info "Handing over control to Promtail..." -/usr/bin/promtail "${promtail_args[@]}"
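For reference, with the stock options (no basic auth, TLS, or additional scrape configs) the generated `/etc/promtail/config.yaml` should come out roughly as sketched below once Promtail expands the `URL`, `LOG_LEVEL`, and `JOURNAL_PATH` variables exported by this run script. This is an approximation assembled from `base_config.yaml` and `default-scrape-config.yaml`; exact key ordering and indentation come from `yq` and may differ:

```yaml
server:
  http_listen_port: 9080
  grpc_listen_port: 0
  log_level: info  # from LOG_LEVEL
positions:
  filename: /data/promtail/positions.yaml
clients:
  - url: http://loki:3100/loki/api/v1/push  # from URL (the default client.url option)
scrape_configs:
  - job_name: journal
    journal:
      json: false
      max_age: 12h
      labels:
        job: systemd-journal
      path: /var/log/journal  # from JOURNAL_PATH; may be /run/log/journal instead
    relabel_configs:
      - source_labels: [__journal__systemd_unit]
        target_label: unit
      - source_labels: [__journal__hostname]
        target_label: nodename
      - source_labels: [__journal_syslog_identifier]
        target_label: syslog_identifier
      - source_labels: [__journal_container_name]
        target_label: container_name
    pipeline_stages:
      - match:
          selector: '{container_name=~"homeassistant|hassio_supervisor"}'
          stages:
            - multiline:
                firstline: '^\x{001b}'
```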