
GITBOOK-36: Add an AWS quickstart page

Roman Zabaluev, 2 years ago
parent commit c70711d064

+ 1 - 1
README.md

@@ -4,7 +4,7 @@
 
 **Versatile, fast and lightweight web UI for managing Apache Kafka® clusters. Built by developers, for developers.**
 
-****
+
 
 **UI for Apache Kafka is a free, open-source web UI to monitor and manage Apache Kafka clusters.**
 

+ 3 - 3
SUMMARY.md

@@ -23,7 +23,8 @@
 
 ## 👷‍♂️ Configuration
 
-* [Quick Start](configuration/quick-start.md)
+* [Quick Start](configuration/quick-start/README.md)
+  * [via AWS Marketplace](configuration/quick-start/via-aws-marketplace.md)
 * [Configuration wizard](configuration/configuration-wizard.md)
 * [Configuration file](configuration/configuration-file.md)
 * [Compose examples](configuration/compose-examples.md)
@@ -46,8 +47,7 @@
   * [LDAP / Active Directory](configuration/authentication/ldap-active-directory.md)
 * [RBAC (Role based access control)](configuration/rbac-role-based-access-control.md)
 * [Data masking](configuration/data-masking.md)
-* [Serialization / SerDe](configuration/serialization-serde/README.md)
-  * [Protobuf setup](configuration/serialization-serde/protobuf-setup.md)
+* [Serialization / SerDe](configuration/serialization-serde.md)
 * [OpenDataDiscovery Integration](configuration/opendatadiscovery-integration.md)
 
 ## ❓ FAQ

+ 1 - 1
configuration/quick-start.md → configuration/quick-start/README.md

@@ -39,4 +39,4 @@ The app in AWS Marketplace could be found by [this link](https://aws.amazon.com/
 
 ## Helm way
 
-To install the app via Helm please refer to [this page](helm-charts/quick-start.md).
+To install the app via Helm please refer to [this page](../helm-charts/quick-start.md).

+ 53 - 0
configuration/quick-start/via-aws-marketplace.md

@@ -0,0 +1,53 @@
+---
+description: How to Deploy Kafka UI from AWS Marketplace
+---
+
+# via AWS Marketplace
+
+### Step 1: Go to AWS Marketplace
+
+Go to the AWS Marketplace website and sign in to your account.
+
+### Step 2: Find UI for Apache Kafka
+
+Either use the search bar to find "UI for Apache Kafka" or go to the [marketplace product page](https://aws.amazon.com/marketplace/pp/prodview-ogtt5hfhzkq6a).
+
+### Step 3: Subscribe and Configure
+
+Click "Continue to Subscribe" and accept the terms and conditions. Click "Continue to Configuration".
+
+### Step 4: Choose the Software Version and Region
+
+Choose your desired software version and region. Click "Continue to Launch".
+
+### Step 5: Launch the Instance
+
+Choose "Launch from Website" and select your desired EC2 instance type. You can pick a free tier instance or a larger one depending on your needs. We recommend at least 512 MB of RAM for the instance.
+
+Next, select the VPC and subnet where you want the instance to be launched. If you don't have an existing VPC or subnet, you can create one by clicking "Create New VPC" or "Create New Subnet".
+
+Choose your security group. A security group acts as a virtual firewall that controls traffic to and from your instance. If you don't have an existing security group, you can create a new one based on the _seller settings_ by clicking "Create New Based on Seller Settings".
+
+Give your security group a name and description. The _seller settings_ will automatically populate the inbound and outbound rules for the security group based on best practices. You can review and modify the rules if necessary.
+
+Click "Save" to create your new security group.
+
+Select your key pair or create a new one. A key pair is used to securely connect to your instance via SSH. If you choose to create a new key pair, give it a name and click "Create". Your private key will be downloaded to your computer, so make sure to keep it in a safe place.
+
+Finally, click "Launch" to deploy your instance. AWS will create the instance and install the Kafka UI software for you.
+
+### Step 6: Check EC2 Status
+
+To check the instance's state, click "EC2 console".
+
+### Step 7: Access the Kafka UI
+
+After the instance is launched, you can check its status on the EC2 dashboard. Once it's running, access the Kafka UI by taking the public DNS name or IP address provided by AWS and appending port 8080.\
+Example: `ec2-xx-xxx-x-xx.us-west-2.compute.amazonaws.com:8080`
+
+### Step 8: Configure Kafka UI to Communicate with Brokers
+
+If your broker is deployed in AWS, allow incoming traffic from the Kafka UI EC2 instance by adding an ingress rule to the security group used by the broker.\
+If your broker is not in AWS, make sure it can accept requests from the Kafka UI EC2 instance's IP address.
+
+That's it! You've successfully deployed the Kafka UI from AWS Marketplace.
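Once the instance is running, a quick way to confirm the UI endpoint from Step 7 is reachable is a small TCP check. This is a sketch; the `ui_reachable` helper name and the example host are hypothetical, not part of the product:

```python
import socket

def ui_reachable(host: str, port: int = 8080, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, unreachable host, and timeouts.
        return False

# Replace with your instance's public DNS name, e.g.:
# ui_reachable("ec2-xx-xxx-x-xx.us-west-2.compute.amazonaws.com")
```

If this returns `False`, re-check the security group's inbound rule for port 8080 (Step 5) before debugging the application itself.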

+ 13 - 7
configuration/serialization-serde/README.md → configuration/serialization-serde.md

@@ -40,7 +40,7 @@ kafka:
             encoding: "UTF-16"
 ```
 
-#### Protobuf
+#### ProtobufFile
 
 Class name: `com.provectus.kafka.ui.serdes.builtin.ProtobufFileSerde`
 
@@ -54,10 +54,18 @@ kafka:
       serde:
         - name: ProtobufFile
           properties:
-            # path to the protobuf schema files
+            # protobufFilesDir specifies root location for proto files (will be scanned recursively)
+            # NOTE: if 'protobufFilesDir' specified, then 'protobufFile' and 'protobufFiles' settings will be ignored
+            protobufFilesDir: "/path/to/my-protobufs"
+            # (DEPRECATED) protobufFile is the path to a single protobuf schema; please use "protobufFiles" instead
+            protobufFile: path/to/my.proto
+            # (DEPRECATED) protobufFiles is the location of one or more protobuf schemas
             protobufFiles:
-              - path/to/my.proto
-              - path/to/another.proto
+              - /path/to/my-protobufs/my.proto
+              - /path/to/my-protobufs/another.proto
+            # protobufMessageName is the default protobuf type that is used to deserialize
+            # the message's value if the topic is not found in protobufMessageNameByTopic.
+            protobufMessageName: my.DefaultValType
             # default protobuf type that is used for KEY serialization/deserialization
             # optional
             protobufMessageNameForKey: my.Type1
@@ -76,9 +84,7 @@ kafka:
               "topic.2": my.Type2
 ```
 
-Docker-compose sample for Protobuf serialization is here.
 
-Legacy configuration for protobuf is here.
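Since the docker-compose sample pointer was removed, the serde-level settings above can presumably be flattened for docker-compose in the same way the legacy cluster-level keys were. The property paths below are an assumption, derived by applying the `kafka.clusters.0.*` flattening pattern from the removed legacy page to the `serde` list shown above:

```
kafka.clusters.0.serde.0.name: ProtobufFile
kafka.clusters.0.serde.0.properties.protobufFilesDir: /path/to/my-protobufs
kafka.clusters.0.serde.0.properties.protobufMessageName: my.DefaultValType
kafka.clusters.0.serde.0.properties.protobufMessageNameForKey: my.Type1
kafka.clusters.0.serde.0.properties.protobufMessageNameByTopic.topic1: my.Type1
```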
 
 #### SchemaRegistry
 
@@ -160,7 +166,7 @@ You can implement your own serde and register it in kafka-ui application. To do
 2. Implement `com.provectus.kafka.ui.serde.api.Serde` interface. See javadoc for implementation requirements.
 3. Pack your serde into an uber jar, or provide a directory with a no-dependency jar and its dependency jars
 
-Example pluggable serdes : https://github.com/provectus/kafkaui-smile-serde https://github.com/provectus/kafkaui-glue-sr-serde
+Example pluggable serdes: [kafkaui-smile-serde](https://github.com/provectus/kafkaui-smile-serde), [kafkaui-glue-sr-serde](https://github.com/provectus/kafkaui-glue-sr-serde)
 
 Sample configuration:
 

+ 0 - 55
configuration/serialization-serde/protobuf-setup.md

@@ -1,55 +0,0 @@
-# Protobuf setup
-
-## Kafkaui Protobuf Support
-
-#### This document is deprecated, please see examples in Serialization document.
-
-Kafkaui supports deserializing protobuf messages in two ways:
-
-1. Using Confluent Schema Registry's [protobuf support](https://docs.confluent.io/platform/current/schema-registry/serdes-develop/serdes-protobuf.html).
-2. Supplying a protobuf file as well as a configuration that maps topic names to protobuf types.
-
-### Configuring Kafkaui with a Protobuf File
-
-To configure Kafkaui to deserialize protobuf messages using a supplied protobuf schema add the following to the config:
-
-```yaml
-kafka:
-  clusters:
-    - # Cluster configuration omitted.
-      # protobufFile is the path to the protobuf schema. (deprecated: please use "protobufFiles")
-      protobufFile: path/to/my.proto
-      # protobufFiles is the path to one or more protobuf schemas.
-      protobufFiles: 
-        - /path/to/my.proto
-        - /path/to/another.proto
-      # protobufMessageName is the default protobuf type that is used to deserilize
-      # the message's value if the topic is not found in protobufMessageNameByTopic.
-      protobufMessageName: my.DefaultValType
-      # protobufMessageNameByTopic is a mapping of topic names to protobuf types.
-      # This mapping is required and is used to deserialize the Kafka message's value.
-      protobufMessageNameByTopic:
-        topic1: my.Type1
-        topic2: my.Type2
-      # protobufMessageNameForKey is the default protobuf type that is used to deserilize
-      # the message's key if the topic is not found in protobufMessageNameForKeyByTopic.
-      protobufMessageNameForKey: my.DefaultKeyType
-      # protobufMessageNameForKeyByTopic is a mapping of topic names to protobuf types.
-      # This mapping is optional and is used to deserialize the Kafka message's key.
-      # If a protobuf type is not found for a topic's key, the key is deserialized as a string,
-      # unless protobufMessageNameForKey is specified.
-      protobufMessageNameForKeyByTopic:
-        topic1: my.KeyType1
-```
-
-Same config with flattened config (for docker-compose):
-
-```
-kafka.clusters.0.protobufFiles.0: /path/to/my.proto
-kafka.clusters.0.protobufFiles.1: /path/to/another.proto
-kafka.clusters.0.protobufMessageName: my.DefaultValType
-kafka.clusters.0.protobufMessageNameByTopic.topic1: my.Type1
-kafka.clusters.0.protobufMessageNameByTopic.topic2: my.Type2
-kafka.clusters.0.protobufMessageNameForKey: my.DefaultKeyType
-kafka.clusters.0.protobufMessageNameForKeyByTopic.topic1: my.KeyType1
-```