(docs): update README.md (#919)

vladislav doster 2021-10-01 10:59:11 -05:00 committed by GitHub
parent fe0294798c
commit f3419fbb85


@@ -3,7 +3,7 @@
 ![UI for Apache Kafka Price Free](images/free-open-source.svg)
-<em>UI for Apache Kafka is a free open-source web UI for monitoring and management of Apache Kafka clusters. </em>
+<em>UI for Apache Kafka is a free, open-source web UI to monitor and manage Apache Kafka clusters. </em>
 UI for Apache Kafka is a simple tool that makes your data flows observable, helps find and troubleshoot issues faster and deliver optimal performance. Its lightweight dashboard makes it easy to track key metrics of your Kafka clusters - Brokers, Topics, Partitions, Production, and Consumption.
@@ -18,7 +18,7 @@ Set up UI for Apache Kafka with just a couple of easy commands to visualize your
 * **View Kafka Brokers** — view topic and partition assignments, controller status
 * **View Kafka Topics** — view partition count, replication status, and custom configuration
 * **View Consumer Groups** — view per-partition parked offsets, combined and per-partition lag
-* **Browse Messages** — browse messages with JSON, plain text and Avro encoding
+* **Browse Messages** — browse messages with JSON, plain text, and Avro encoding
 * **Dynamic Topic Configuration** — create and configure new topics with dynamic configuration
 * **Configurable Authentification** — secure your installation with optional Github/Gitlab/Google OAuth 2.0
@@ -120,7 +120,7 @@ To be continued
 # Configuration
-We have a plenty of docker-compose files as examples. Please check them out in ``docker`` directory.
+We have plenty of docker-compose files as examples. Please check them out in ``docker`` directory.
 ## Configuration File
 Example of how to configure clusters in the [application-local.yml](https://github.com/provectus/kafka-ui/blob/master/kafka-ui-api/src/main/resources/application-local.yml) configuration file:
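A minimal sketch of such a cluster entry, with the YAML structure inferred from the `KAFKA_CLUSTERS_0_*` variable names in the table further down; the cluster name, hosts, and ports are placeholders, not values from the linked file:

```yaml
# Sketch only: structure mirrors the KAFKA_CLUSTERS_0_* environment variables;
# addresses below are placeholders, not defaults.
kafka:
  clusters:
    - name: local
      bootstrapServers: localhost:9092
      zookeeper: localhost:2181
      schemaRegistry: http://localhost:8081
      readOnly: false
```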
@@ -156,7 +156,7 @@ Configure as many clusters as you need by adding their configs below separated w
 ## <a name="env_variables"></a> Environment Variables
-Alternatively, each variable of of the .yml file can be set with an environment variable.
+Alternatively, each variable of the .yml file can be set with an environment variable.
 For example, if you want to use an environment variable to set the `name` parameter, you can write it like this: `KAFKA_CLUSTERS_2_NAME`
 |Name |Description
@@ -164,7 +164,7 @@ For example, if you want to use an environment variable to set the `name` parame
 |`SERVER_SERVLET_CONTEXT_PATH` | URI basePath
 |`KAFKA_CLUSTERS_0_NAME` | Cluster name
 |`KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS` |Address where to connect
-|`KAFKA_CLUSTERS_0_ZOOKEEPER` | Zookeper service address
+|`KAFKA_CLUSTERS_0_ZOOKEEPER` | Zookeeper service address
 |`KAFKA_CLUSTERS_0_KSQLDBSERVER` | KSQL DB server address
 |`KAFKA_CLUSTERS_0_PROPERTIES_SECURITY_PROTOCOL` |Security protocol to connect to the brokers. For SSL connection use "SSL", for plaintext connection don't set this environment variable
 |`KAFKA_CLUSTERS_0_SCHEMAREGISTRY` |SchemaRegistry's address
@@ -172,13 +172,13 @@ For example, if you want to use an environment variable to set the `name` parame
 |`KAFKA_CLUSTERS_0_SCHEMAREGISTRYAUTH_PASSWORD` |SchemaRegistry's basic authentication password
 |`KAFKA_CLUSTERS_0_SCHEMANAMETEMPLATE` |How keys are saved to schemaRegistry
 |`KAFKA_CLUSTERS_0_JMXPORT` |Open jmxPosrts of a broker
-|`KAFKA_CLUSTERS_0_READONLY` |Enable read only mode. Default: false
-|`KAFKA_CLUSTERS_0_DISABLELOGDIRSCOLLECTION` |Disable collecting segments information. Should be true for confluent cloud. Default: false
+|`KAFKA_CLUSTERS_0_READONLY` |Enable read-only mode. Default: false
+|`KAFKA_CLUSTERS_0_DISABLELOGDIRSCOLLECTION` |Disable collecting segments information. It should be true for confluent cloud. Default: false
 |`KAFKA_CLUSTERS_0_KAFKACONNECT_0_NAME` |Given name for the Kafka Connect cluster
 |`KAFKA_CLUSTERS_0_KAFKACONNECT_0_ADDRESS` |Address of the Kafka Connect service endpoint
 |`LOGGING_LEVEL_ROOT` | Setting log level (all, debug, info, warn, error, fatal, off). Default: debug
 |`LOGGING_LEVEL_COM_PROVECTUS` |Setting log level (all, debug, info, warn, error, fatal, off). Default: debug
 |`SERVER_PORT` |Port for the embedded server. Default `8080`
-|`KAFKA_CLUSTERS_0_JMXSSL` |Enable SSL for JMX? `true` or `false`. For advanced setup see `kafka-ui-jmx-secured.yml`
+|`KAFKA_CLUSTERS_0_JMXSSL` |Enable SSL for JMX? `true` or `false`. For advanced setup, see `kafka-ui-jmx-secured.yml`
 |`KAFKA_CLUSTERS_0_JMXUSERNAME` |Username for JMX authentication
 |`KAFKA_CLUSTERS_0_JMXPASSWORD` |Password for JMX authentication
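
The `KAFKA_CLUSTERS_0_*` variables from the table above can be wired straight into a container. A minimal docker-compose sketch, assuming the `provectuslabs/kafka-ui` image and illustrative service addresses:

```yaml
# Sketch only: image tag and all addresses are illustrative assumptions.
version: '2'
services:
  kafka-ui:
    image: provectuslabs/kafka-ui:latest
    ports:
      - "8080:8080"            # SERVER_PORT defaults to 8080
    environment:
      KAFKA_CLUSTERS_0_NAME: local
      KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS: kafka:9092
      KAFKA_CLUSTERS_0_ZOOKEEPER: zookeeper:2181
      KAFKA_CLUSTERS_0_SCHEMAREGISTRY: http://schema-registry:8081
      KAFKA_CLUSTERS_0_READONLY: "false"
```

Additional clusters follow the same pattern with the index bumped, e.g. `KAFKA_CLUSTERS_1_NAME` for a second cluster.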