
Role based access control (#2790)

* Role based access control

* Fix build + checkstyle

* Refactoring, some bug fixes, review fixes

* Compile permission value patterns

* Make the resource an enum instead of a string

* Refactoring

* Make clusters required

* Fix formatting

* Convert the switch statement to a smart (enhanced) switch

* Get rid of topic analysis actions

* Rename endpoints, fix an issue

* Return a flag indicating if rbac is on and a username

* Fix yaml indent in editorconfig

* Fix GitHub & Cognito role name fetching

* Fix case matching for actions

* Update readme

* Add an endpoint to determine if a user can create a resource

* Fix tests (I hope so)

* Fix tests

* Use spring configs instead of a separate file, rename endpoints

* Add "ALL" action
Get rid of unnecessary cache, save groups into spring auth
Review fixes

* Make "all" action case-insensitive

* Role based access control / FrontEnd  (#2933)

* Initial modifications and mocking for the RoleAccess

* fix the Suspense issue in the components, comment out the Tests to implement later

* minor test comment

* Roles, configuration and sanitization of data

* initialize RoleCheck hook

* make the App test file visible + minor modification in the permission hook

* Structure the data so the Burger header toggle does not rerender the whole application

* add tests to the NavBar and the PageContainer

* NavBar and PageContainer bug fixes

* Roles Testing code modification

* covering Topics create button Actions, and Schema create button Actions

* minor typescript code modifications for the cluster required parameter in the rolesHelper

* minor typescript code modifications for the cluster required parameter in the rolesHelper

* minor code modification to describe the Permission tests more clearly

* Produce message Permissions with test suites for Topic

* Add Schema Edit Permission with tests

* Minor role changes

* Add ActionButton Component to handle the Button with tooltip

* Add ActionButton Component to handle the Button with tooltip

* Add Action Button to every Button create Action

* ActionButton add test suites

* usePermission code modification to include regular expressions

* Abstract the Actions Component to reduce code repetition, add Configs Edit button Permission + add the test suites to it.

* Schema Remove functionality Permission and Test Suites + creation of the ActionDropdownItem for Actions

* Topic Edit, Clear and Delete Topic Permissions with test suites

* ActionsCell for Topic Message Overview Permissions with test suites

* Connector Delete, Consumer Groups Permission + writing test suites

* Add Permissions to the Topics ActionCell

* Topic Table Permissions test suites

* Headless Logic for the Permission Part

* add documentation for the headless part of the permissions + modify the data (version 2) for efficient algorithmic lookup

* replace the modify-data logic and the isPermitted function to allow faster access to the data

* Add Permission helpers test suites

* usePermission hook test suites

* BatchActionsBar add Permissions + minor modification in TopicTable test suites

* Statistics and Metrics code Permission + add test suites

* Recreate Topic Permissions in the Topic page, add test suites

* Actions for the Connector components

* Messages NavLink View Permission

* Test suites messages code modifications

* Permissions comment code modifications

* Replacing the Mock Data With the actual code

* Add ActionNavLink test suites

* BatchActionsBar code smell modifications

* maximizing the permissions test suites

* maximizing the permissions test suites

* maximizing the permissions test suites

* Tooltip code refactoring and fix the positioning issue

* increase the permissions test coverage

* add user info to the navigation header and test suites

* Add Global Schema Selector Permissions with test suites

* Roles minor code removal

* Change the Action Component from a hook mixin approach to a declarative props approach

* add isPermitted function for multiple Actions, adding test suites for this particular case

* remove redundant Permissions test blocks from the components

* remove redundant Permissions test blocks from the components

* Action Buttons test suites' coverage + generalizing the code of the Actions

* add invalid Permission check in Action Components test suites

* Modularization of Actions Components

* Modularization of Actions Components by adding DropDownAction to it.

* Reflect the BE Changes in the UI, by changing the default behavior of the testing of roles.

* Reflect the BE Changes in the UI, by changing the default behavior of the testing of roles.

* Get rid of unnecessary usePermission mocks

* Modifications in the UserInfo data, to consider the UI without any login functionality

* minor code modifications in the BatchActionBar component

* change the Query key for the user info

* change the default message for the tooltip

* Fix the Create Role Access for Topics and Schemas

* ListPage Connector create permissions

* add headless logic for Create Permission with test suites + add a React hook renderer

* Create Button ActionButton logic implementation

* Remove code smells by removing the duplications

* increase the test suites for isPermittedToCreate logic

* increase the test suites for isPermittedToCreate logic

* Change the UserResourceType Enum with the new value

* Apply New Resource Creation validation, for Topic, Schema, Connector

* Apply New Resource Creation validation, for Topic, Schema, Connector

* minor code refactor modifications

* minor code modification in the topics useCreate hook

* Async Validation for all the Create Pages

* caching test for optimal performance in async validation schemas

* Reverting the Front End Validation

* Reverting the Front End Validation

* Authorization API minor syntax modifications

* fix SmokeTests

Co-authored-by: Roman Zabaluev <rzabaluev@provectus.com>
Co-authored-by: VladSenyuta <vlad.senyuta@gmail.com>

Co-authored-by: Mgrdich <46796009+Mgrdich@users.noreply.github.com>
Co-authored-by: VladSenyuta <vlad.senyuta@gmail.com>
Roman Zabaluev 2 years ago
parent
commit
5c723d9b44
100 files changed with 3688 additions and 1065 deletions
  1. 5 0
      .editorconfig
  2. 1 0
      README.md
  3. 1 0
      kafka-ui-api/pom.xml
  4. 8 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/AuthenticatedUser.java
  5. 0 80
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/CognitoOAuthSecurityConfig.java
  6. 43 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/OAuthProperties.java
  7. 68 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/OAuthPropertiesConverter.java
  8. 101 36
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/OAuthSecurityConfig.java
  9. 30 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/RbacOAuth2User.java
  10. 47 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/RbacOidcUser.java
  11. 10 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/RbacUser.java
  12. 23 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/RoleBasedAccessControlProperties.java
  13. 13 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/condition/CognitoCondition.java
  14. 18 10
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/logout/CognitoLogoutSuccessHandler.java
  15. 15 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/logout/LogoutSuccessHandler.java
  16. 46 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/logout/OAuthLogoutSuccessHandler.java
  17. 0 44
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/props/CognitoProperties.java
  18. 80 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/AccessController.java
  19. 68 27
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/BrokersController.java
  20. 39 11
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/ClustersController.java
  21. 127 69
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/ConsumerGroupsController.java
  22. 132 34
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KafkaConnectController.java
  23. 39 13
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KsqlController.java
  24. 62 19
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/MessagesController.java
  25. 140 40
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/SchemasController.java
  26. 179 65
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/TopicsController.java
  27. 2 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/exception/ErrorCode.java
  28. 134 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/AccessContext.java
  29. 72 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/Permission.java
  30. 21 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/Resource.java
  31. 19 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/Role.java
  32. 24 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/Subject.java
  33. 18 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/permission/ClusterConfigAction.java
  34. 19 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/permission/ConnectAction.java
  35. 20 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/permission/ConsumerGroupAction.java
  36. 15 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/permission/KsqlAction.java
  37. 4 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/permission/PermissibleAction.java
  38. 21 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/permission/SchemaAction.java
  39. 24 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/permission/TopicAction.java
  40. 27 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/provider/Provider.java
  41. 1 1
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/serdes/builtin/sr/JsonSchemaSchemaRegistrySerializer.java
  42. 1 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ClusterService.java
  43. 11 5
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ConsumerGroupService.java
  44. 6 9
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/KafkaConnectService.java
  45. 1 1
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ReactiveAdminClient.java
  46. 31 35
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/SchemaRegistryService.java
  47. 6 2
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/TopicsService.java
  48. 31 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/AbstractProviderCondition.java
  49. 398 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/AccessControlService.java
  50. 70 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/extractor/CognitoAuthorityExtractor.java
  51. 99 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/extractor/GithubAuthorityExtractor.java
  52. 69 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/extractor/GoogleAuthorityExtractor.java
  53. 23 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/extractor/LdapAuthorityExtractor.java
  54. 31 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/extractor/OauthAuthorityExtractor.java
  55. 14 0
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/extractor/ProviderAuthorityExtractor.java
  56. 1 1
      kafka-ui-api/src/main/java/com/provectus/kafka/ui/util/annotation/KafkaClientInternalsDependant.java
  57. 21 1
      kafka-ui-api/src/main/resources/application-local.yml
  58. 38 32
      kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/SchemaRegistryPaginationTest.java
  59. 6 2
      kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/TopicsServicePaginationTest.java
  60. 23 0
      kafka-ui-api/src/test/java/com/provectus/kafka/ui/util/AccessControlServiceMock.java
  61. 86 2
      kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml
  62. 2 2
      kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/TopPanel.java
  63. 1 244
      kafka-ui-react-app/src/components/App.styled.ts
  64. 41 106
      kafka-ui-react-app/src/components/App.tsx
  65. 8 3
      kafka-ui-react-app/src/components/Brokers/Broker/Configs/InputCell.tsx
  66. 65 14
      kafka-ui-react-app/src/components/Connect/Details/Actions/Actions.tsx
  67. 8 4
      kafka-ui-react-app/src/components/Connect/List/ListPage.tsx
  68. 1 1
      kafka-ui-react-app/src/components/Connect/New/New.tsx
  69. 2 1
      kafka-ui-react-app/src/components/Connect/New/__tests__/New.spec.tsx
  70. 23 7
      kafka-ui-react-app/src/components/ConsumerGroups/Details/Details.tsx
  71. 11 6
      kafka-ui-react-app/src/components/KsqlDb/List/List.tsx
  72. 5 4
      kafka-ui-react-app/src/components/KsqlDb/List/__test__/List.spec.tsx
  73. 146 0
      kafka-ui-react-app/src/components/NavBar/NavBar.styled.ts
  74. 60 0
      kafka-ui-react-app/src/components/NavBar/NavBar.tsx
  75. 19 0
      kafka-ui-react-app/src/components/NavBar/UserInfo/UserInfo.styled.ts
  76. 35 0
      kafka-ui-react-app/src/components/NavBar/UserInfo/UserInfo.tsx
  77. 44 0
      kafka-ui-react-app/src/components/NavBar/UserInfo/__tests__/UserInfo.spec.tsx
  78. 28 0
      kafka-ui-react-app/src/components/NavBar/__tests__/NavBar.spec.tsx
  79. 88 0
      kafka-ui-react-app/src/components/PageContainer/PageContainer.styled.ts
  80. 41 0
      kafka-ui-react-app/src/components/PageContainer/PageContainer.tsx
  81. 47 0
      kafka-ui-react-app/src/components/PageContainer/__tests__/PageContainer.spec.tsx
  82. 20 5
      kafka-ui-react-app/src/components/Schemas/Details/Details.tsx
  83. 11 3
      kafka-ui-react-app/src/components/Schemas/List/GlobalSchemaSelector/GlobalSchemaSelector.tsx
  84. 8 4
      kafka-ui-react-app/src/components/Schemas/List/List.tsx
  85. 30 13
      kafka-ui-react-app/src/components/Schemas/New/New.tsx
  86. 17 6
      kafka-ui-react-app/src/components/Topics/List/ActionsCell.tsx
  87. 38 6
      kafka-ui-react-app/src/components/Topics/List/BatchActionsBar.tsx
  88. 8 3
      kafka-ui-react-app/src/components/Topics/List/ListPage.tsx
  89. 16 15
      kafka-ui-react-app/src/components/Topics/List/__tests__/ListPage.spec.tsx
  90. 3 1
      kafka-ui-react-app/src/components/Topics/List/__tests__/TopicTable.spec.tsx
  91. 4 4
      kafka-ui-react-app/src/components/Topics/New/New.tsx
  92. 2 1
      kafka-ui-react-app/src/components/Topics/New/__test__/New.spec.tsx
  93. 13 4
      kafka-ui-react-app/src/components/Topics/Topic/Overview/ActionsCell.tsx
  94. 16 5
      kafka-ui-react-app/src/components/Topics/Topic/Statistics/Metrics.tsx
  95. 9 3
      kafka-ui-react-app/src/components/Topics/Topic/Statistics/Statistics.tsx
  96. 2 6
      kafka-ui-react-app/src/components/Topics/Topic/Statistics/__test__/Statistics.spec.tsx
  97. 56 23
      kafka-ui-react-app/src/components/Topics/Topic/Topic.tsx
  98. 12 32
      kafka-ui-react-app/src/components/__tests__/App.spec.tsx
  99. 18 0
      kafka-ui-react-app/src/components/common/ActionComponent/ActionButton/ActionButton.tsx
  100. 48 0
      kafka-ui-react-app/src/components/common/ActionComponent/ActionButton/ActionCanButton/ActionCanButton.tsx

+ 5 - 0
.editorconfig

@@ -279,3 +279,8 @@ ij_java_wrap_long_lines = false
 insert_final_newline = false
 trim_trailing_whitespace = false
 
+[*.yaml]
+indent_size = 2
+[*.yml]
+indent_size = 2
+

+ 1 - 0
README.md

@@ -31,6 +31,7 @@ the cloud.
 * **Dynamic Topic Configuration** — create and configure new topics with dynamic configuration
 * **Configurable Authentification** — secure your installation with optional Github/Gitlab/Google OAuth 2.0
 * **Custom serialization/deserialization plugins** - use a ready-to-go serde for your data like AWS Glue or Smile, or code your own!
+* **Role based access control** - manage permissions to access the UI with granular precision
 
 # The Interface
 UI for Apache Kafka wraps major functions of Apache Kafka with an intuitive user interface.

+ 1 - 0
kafka-ui-api/pom.xml

@@ -306,6 +306,7 @@
                         </configuration>
                     </execution>
                 </executions>
+
             </plugin>
             <plugin>
                 <groupId>org.antlr</groupId>

+ 8 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/AuthenticatedUser.java

@@ -0,0 +1,8 @@
+package com.provectus.kafka.ui.config.auth;
+
+import java.util.Collection;
+import lombok.Value;
+
+public record AuthenticatedUser(String principal, Collection<String> groups) {
+
+}

+ 0 - 80
kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/CognitoOAuthSecurityConfig.java

@@ -1,80 +0,0 @@
-package com.provectus.kafka.ui.config.auth;
-
-import com.provectus.kafka.ui.config.CognitoOidcLogoutSuccessHandler;
-import com.provectus.kafka.ui.config.auth.props.CognitoProperties;
-import java.util.Optional;
-import lombok.RequiredArgsConstructor;
-import lombok.extern.slf4j.Slf4j;
-import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
-import org.springframework.boot.context.properties.ConfigurationProperties;
-import org.springframework.context.annotation.Bean;
-import org.springframework.context.annotation.Configuration;
-import org.springframework.security.config.annotation.web.reactive.EnableWebFluxSecurity;
-import org.springframework.security.config.web.server.ServerHttpSecurity;
-import org.springframework.security.oauth2.client.registration.ClientRegistration;
-import org.springframework.security.oauth2.client.registration.ClientRegistrations;
-import org.springframework.security.oauth2.client.registration.InMemoryReactiveClientRegistrationRepository;
-import org.springframework.security.web.server.SecurityWebFilterChain;
-import org.springframework.security.web.server.authentication.logout.ServerLogoutSuccessHandler;
-
-@Configuration
-@EnableWebFluxSecurity
-@ConditionalOnProperty(value = "auth.type", havingValue = "OAUTH2_COGNITO")
-@RequiredArgsConstructor
-@Slf4j
-public class CognitoOAuthSecurityConfig extends AbstractAuthSecurityConfig {
-
-  private static final String COGNITO = "cognito";
-
-  @Bean
-  public SecurityWebFilterChain configure(ServerHttpSecurity http, CognitoProperties props) {
-    log.info("Configuring Cognito OAUTH2 authentication.");
-
-    String clientId = props.getClientId();
-    String logoutUrl = props.getLogoutUri();
-
-    final ServerLogoutSuccessHandler logoutHandler = new CognitoOidcLogoutSuccessHandler(logoutUrl, clientId);
-
-    return http.authorizeExchange()
-        .pathMatchers(AUTH_WHITELIST)
-        .permitAll()
-        .anyExchange()
-        .authenticated()
-
-        .and()
-        .oauth2Login()
-
-        .and()
-        .oauth2Client()
-
-        .and()
-        .logout()
-        .logoutSuccessHandler(logoutHandler)
-
-        .and()
-        .csrf().disable()
-        .build();
-  }
-
-  @Bean
-  public InMemoryReactiveClientRegistrationRepository clientRegistrationRepository(CognitoProperties props) {
-    ClientRegistration.Builder builder = ClientRegistrations
-        .fromIssuerLocation(props.getIssuerUri())
-        .registrationId(COGNITO);
-
-    builder.clientId(props.getClientId());
-    builder.clientSecret(props.getClientSecret());
-
-    Optional.ofNullable(props.getScope()).ifPresent(builder::scope);
-    Optional.ofNullable(props.getUserNameAttribute()).ifPresent(builder::userNameAttributeName);
-
-    return new InMemoryReactiveClientRegistrationRepository(builder.build());
-  }
-
-  @Bean
-  @ConfigurationProperties("auth.cognito")
-  public CognitoProperties cognitoProperties() {
-    return new CognitoProperties();
-  }
-
-}

+ 43 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/OAuthProperties.java

@@ -0,0 +1,43 @@
+package com.provectus.kafka.ui.config.auth;
+
+import java.util.HashMap;
+import java.util.Map;
+import java.util.Set;
+import javax.annotation.PostConstruct;
+import lombok.Data;
+import org.springframework.boot.context.properties.ConfigurationProperties;
+import org.springframework.util.Assert;
+
+@ConfigurationProperties("auth.oauth2")
+@Data
+public class OAuthProperties {
+  private Map<String, OAuth2Provider> client = new HashMap<>();
+
+  @PostConstruct
+  public void validate() {
+    getClient().values().forEach(this::validateProvider);
+  }
+
+  private void validateProvider(final OAuth2Provider provider) {
+    Assert.hasText(provider.getClientId(), "Client id must not be empty.");
+    Assert.hasText(provider.getProvider(), "Provider name must not be empty");
+  }
+
+  @Data
+  public static class OAuth2Provider {
+    private String provider;
+    private String clientId;
+    private String clientSecret;
+    private String clientName;
+    private String redirectUri;
+    private String authorizationGrantType;
+    private Set<String> scope;
+    private String issuerUri;
+    private String authorizationUri;
+    private String tokenUri;
+    private String userInfoUri;
+    private String jwkSetUri;
+    private String userNameAttribute;
+    private Map<String, String> customParams;
+  }
+}

+ 68 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/OAuthPropertiesConverter.java

@@ -0,0 +1,68 @@
+package com.provectus.kafka.ui.config.auth;
+
+import static com.provectus.kafka.ui.config.auth.OAuthProperties.OAuth2Provider;
+import static org.springframework.boot.autoconfigure.security.oauth2.client.OAuth2ClientProperties.Provider;
+import static org.springframework.boot.autoconfigure.security.oauth2.client.OAuth2ClientProperties.Registration;
+
+import lombok.AccessLevel;
+import lombok.NoArgsConstructor;
+import org.apache.commons.lang3.StringUtils;
+import org.springframework.boot.autoconfigure.security.oauth2.client.OAuth2ClientProperties;
+
+@NoArgsConstructor(access = AccessLevel.PRIVATE)
+public final class OAuthPropertiesConverter {
+
+  private static final String TYPE = "type";
+  private static final String GOOGLE = "google";
+
+  public static OAuth2ClientProperties convertProperties(final OAuthProperties properties) {
+    final var result = new OAuth2ClientProperties();
+    properties.getClient().forEach((key, provider) -> {
+      var registration = new Registration();
+      registration.setClientId(provider.getClientId());
+      registration.setClientSecret(provider.getClientSecret());
+      registration.setClientName(provider.getClientName());
+      registration.setScope(provider.getScope());
+      registration.setRedirectUri(provider.getRedirectUri());
+      registration.setAuthorizationGrantType(provider.getAuthorizationGrantType());
+
+      result.getRegistration().put(key, registration);
+
+      var clientProvider = new Provider();
+      applyCustomTransformations(provider);
+
+      clientProvider.setAuthorizationUri(provider.getAuthorizationUri());
+      clientProvider.setIssuerUri(provider.getIssuerUri());
+      clientProvider.setJwkSetUri(provider.getJwkSetUri());
+      clientProvider.setTokenUri(provider.getTokenUri());
+      clientProvider.setUserInfoUri(provider.getUserInfoUri());
+      clientProvider.setUserNameAttribute(provider.getUserNameAttribute());
+
+      result.getProvider().put(key, clientProvider);
+    });
+    return result;
+  }
+
+  private static void applyCustomTransformations(OAuth2Provider provider) {
+    applyGoogleTransformations(provider);
+  }
+
+  private static void applyGoogleTransformations(OAuth2Provider provider) {
+    if (!isGoogle(provider)) {
+      return;
+    }
+
+    String allowedDomain = provider.getCustomParams().get("allowedDomain");
+    if (StringUtils.isEmpty(allowedDomain)) {
+      return;
+    }
+
+    final String newUri = provider.getAuthorizationUri() + "?hd=" + allowedDomain;
+    provider.setAuthorizationUri(newUri);
+  }
+
+  private static boolean isGoogle(OAuth2Provider provider) {
+    return provider.getCustomParams().get(TYPE).equalsIgnoreCase(GOOGLE);
+  }
+}
+
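
For reference, a minimal sketch of the application config these properties would bind, assuming Spring's relaxed binding for the key names (the "google" registration id, the client values and the allowedDomain below are placeholders, not part of this PR):

auth:
  type: OAUTH2
  oauth2:
    client:
      google:
        provider: google
        clientId: <client-id>
        clientSecret: <client-secret>
        scope:
          - openid
          - email
        custom-params:
          type: google
          # the converter above appends "?hd=<allowedDomain>" to the authorization URI
          allowedDomain: example.com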

+ 101 - 36
kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/OAuthSecurityConfig.java

@@ -1,66 +1,131 @@
 package com.provectus.kafka.ui.config.auth;
 
-import lombok.AllArgsConstructor;
+import com.provectus.kafka.ui.config.auth.logout.OAuthLogoutSuccessHandler;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
+import com.provectus.kafka.ui.service.rbac.extractor.ProviderAuthorityExtractor;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import lombok.RequiredArgsConstructor;
 import lombok.extern.log4j.Log4j2;
+import org.jetbrains.annotations.Nullable;
 import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
-import org.springframework.context.ApplicationContext;
+import org.springframework.boot.autoconfigure.security.oauth2.client.OAuth2ClientProperties;
+import org.springframework.boot.autoconfigure.security.oauth2.client.OAuth2ClientPropertiesRegistrationAdapter;
+import org.springframework.boot.context.properties.EnableConfigurationProperties;
 import org.springframework.context.annotation.Bean;
 import org.springframework.context.annotation.Configuration;
+import org.springframework.security.config.annotation.method.configuration.EnableReactiveMethodSecurity;
 import org.springframework.security.config.annotation.web.reactive.EnableWebFluxSecurity;
 import org.springframework.security.config.web.server.ServerHttpSecurity;
+import org.springframework.security.oauth2.client.oidc.userinfo.OidcReactiveOAuth2UserService;
+import org.springframework.security.oauth2.client.oidc.userinfo.OidcUserRequest;
+import org.springframework.security.oauth2.client.oidc.web.server.logout.OidcClientInitiatedServerLogoutSuccessHandler;
+import org.springframework.security.oauth2.client.registration.ClientRegistration;
+import org.springframework.security.oauth2.client.registration.InMemoryReactiveClientRegistrationRepository;
+import org.springframework.security.oauth2.client.registration.ReactiveClientRegistrationRepository;
+import org.springframework.security.oauth2.client.userinfo.DefaultReactiveOAuth2UserService;
+import org.springframework.security.oauth2.client.userinfo.OAuth2UserRequest;
+import org.springframework.security.oauth2.client.userinfo.ReactiveOAuth2UserService;
+import org.springframework.security.oauth2.core.oidc.user.OidcUser;
+import org.springframework.security.oauth2.core.user.OAuth2User;
 import org.springframework.security.web.server.SecurityWebFilterChain;
-import org.springframework.util.ClassUtils;
+import org.springframework.security.web.server.authentication.logout.ServerLogoutSuccessHandler;
+import reactor.core.publisher.Mono;
 
 @Configuration
-@EnableWebFluxSecurity
 @ConditionalOnProperty(value = "auth.type", havingValue = "OAUTH2")
-@AllArgsConstructor
+@EnableConfigurationProperties(OAuthProperties.class)
+@EnableWebFluxSecurity
+@EnableReactiveMethodSecurity
+@RequiredArgsConstructor
 @Log4j2
 public class OAuthSecurityConfig extends AbstractAuthSecurityConfig {
 
-  public static final String REACTIVE_CLIENT_REGISTRATION_REPOSITORY_CLASSNAME =
-      "org.springframework.security.oauth2.client.registration."
-          + "ReactiveClientRegistrationRepository";
-
-  private static final boolean IS_OAUTH2_PRESENT = ClassUtils.isPresent(
-      REACTIVE_CLIENT_REGISTRATION_REPOSITORY_CLASSNAME,
-      OAuthSecurityConfig.class.getClassLoader()
-  );
-
-  private final ApplicationContext context;
+  private final OAuthProperties properties;
 
   @Bean
-  public SecurityWebFilterChain configure(ServerHttpSecurity http) {
+  public SecurityWebFilterChain configure(ServerHttpSecurity http, OAuthLogoutSuccessHandler logoutHandler) {
     log.info("Configuring OAUTH2 authentication.");
-    http.authorizeExchange()
+
+    return http.authorizeExchange()
         .pathMatchers(AUTH_WHITELIST)
         .permitAll()
         .anyExchange()
-        .authenticated();
+        .authenticated()
+
+        .and()
+        .oauth2Login()
+
+        .and()
+        .logout()
+        .logoutSuccessHandler(logoutHandler)
+
+        .and()
+        .csrf().disable()
+        .build();
+  }
+
+  @Bean
+  public ReactiveOAuth2UserService<OidcUserRequest, OidcUser> customOidcUserService(AccessControlService acs) {
+    final OidcReactiveOAuth2UserService delegate = new OidcReactiveOAuth2UserService();
+    return request -> delegate.loadUser(request)
+        .flatMap(user -> {
+          String providerId = request.getClientRegistration().getRegistrationId();
+          final var extractor = getExtractor(providerId, acs);
+          if (extractor == null) {
+            return Mono.just(user);
+          }
 
-    if (IS_OAUTH2_PRESENT && OAuth2ClasspathGuard.shouldConfigure(this.context)) {
-      OAuth2ClasspathGuard.configure(http);
-    }
+          return extractor.extract(acs, user, Map.of("request", request))
+              .map(groups -> new RbacOidcUser(user, groups));
+        });
+  }
+
+  @Bean
+  public ReactiveOAuth2UserService<OAuth2UserRequest, OAuth2User> customOauth2UserService(AccessControlService acs) {
+    final DefaultReactiveOAuth2UserService delegate = new DefaultReactiveOAuth2UserService();
+    return request -> delegate.loadUser(request)
+        .flatMap(user -> {
+          String providerId = request.getClientRegistration().getRegistrationId();
+          final var extractor = getExtractor(providerId, acs);
+          if (extractor == null) {
+            return Mono.just(user);
+          }
 
-    return http.csrf().disable().build();
+          return extractor.extract(acs, user, Map.of("request", request))
+              .map(groups -> new RbacOAuth2User(user, groups));
+        });
   }
 
-  private static class OAuth2ClasspathGuard {
-    static void configure(ServerHttpSecurity http) {
-      http
-          .oauth2Login()
-          .and()
-          .oauth2Client();
-    }
-
-    static boolean shouldConfigure(ApplicationContext context) {
-      ClassLoader loader = context.getClassLoader();
-      Class<?> reactiveClientRegistrationRepositoryClass =
-          ClassUtils.resolveClassName(REACTIVE_CLIENT_REGISTRATION_REPOSITORY_CLASSNAME, loader);
-      return context.getBeanNamesForType(reactiveClientRegistrationRepositoryClass).length == 1;
-    }
+  @Bean
+  public InMemoryReactiveClientRegistrationRepository clientRegistrationRepository() {
+    final OAuth2ClientProperties props = OAuthPropertiesConverter.convertProperties(properties);
+    final List<ClientRegistration> registrations =
+        new ArrayList<>(OAuth2ClientPropertiesRegistrationAdapter.getClientRegistrations(props).values());
+    return new InMemoryReactiveClientRegistrationRepository(registrations);
   }
 
+  @Bean
+  public ServerLogoutSuccessHandler defaultOidcLogoutHandler(final ReactiveClientRegistrationRepository repository) {
+    return new OidcClientInitiatedServerLogoutSuccessHandler(repository);
+  }
+
+  @Nullable
+  private ProviderAuthorityExtractor getExtractor(final String providerId, AccessControlService acs) {
+    final String provider = getProviderByProviderId(providerId);
+    Optional<ProviderAuthorityExtractor> extractor = acs.getExtractors()
+        .stream()
+        .filter(e -> e.isApplicable(provider))
+        .findFirst();
+
+    return extractor.orElse(null);
+  }
+
+  private String getProviderByProviderId(final String providerId) {
+    return properties.getClient().get(providerId).getProvider();
+  }
 
 }
 

+ 30 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/RbacOAuth2User.java

@@ -0,0 +1,30 @@
+package com.provectus.kafka.ui.config.auth;
+
+import java.util.Collection;
+import java.util.Map;
+import lombok.Value;
+import org.springframework.security.core.GrantedAuthority;
+import org.springframework.security.oauth2.core.user.OAuth2User;
+
+public record RbacOAuth2User(OAuth2User user, Collection<String> groups) implements RbacUser, OAuth2User {
+
+  @Override
+  public Map<String, Object> getAttributes() {
+    return user.getAttributes();
+  }
+
+  @Override
+  public Collection<? extends GrantedAuthority> getAuthorities() {
+    return user.getAuthorities();
+  }
+
+  @Override
+  public String getName() {
+    return user.getName();
+  }
+
+  @Override
+  public String name() {
+    return user.getName();
+  }
+}

+ 47 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/RbacOidcUser.java

@@ -0,0 +1,47 @@
+package com.provectus.kafka.ui.config.auth;
+
+import java.util.Collection;
+import java.util.Map;
+import lombok.Value;
+import org.springframework.security.core.GrantedAuthority;
+import org.springframework.security.oauth2.core.oidc.OidcIdToken;
+import org.springframework.security.oauth2.core.oidc.OidcUserInfo;
+import org.springframework.security.oauth2.core.oidc.user.OidcUser;
+
+public record RbacOidcUser(OidcUser user, Collection<String> groups) implements RbacUser, OidcUser {
+
+  @Override
+  public Map<String, Object> getClaims() {
+    return user.getClaims();
+  }
+
+  @Override
+  public OidcUserInfo getUserInfo() {
+    return user.getUserInfo();
+  }
+
+  @Override
+  public OidcIdToken getIdToken() {
+    return user.getIdToken();
+  }
+
+  @Override
+  public Map<String, Object> getAttributes() {
+    return user.getAttributes();
+  }
+
+  @Override
+  public Collection<? extends GrantedAuthority> getAuthorities() {
+    return user.getAuthorities();
+  }
+
+  @Override
+  public String getName() {
+    return user.getName();
+  }
+
+  @Override
+  public String name() {
+    return user.getName();
+  }
+}

+ 10 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/RbacUser.java

@@ -0,0 +1,10 @@
+package com.provectus.kafka.ui.config.auth;
+
+import java.util.Collection;
+
+public interface RbacUser {
+  String name();
+
+  Collection<String> groups();
+
+}

+ 23 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/RoleBasedAccessControlProperties.java

@@ -0,0 +1,23 @@
+package com.provectus.kafka.ui.config.auth;
+
+import com.provectus.kafka.ui.model.rbac.Role;
+import java.util.ArrayList;
+import java.util.List;
+import javax.annotation.PostConstruct;
+import org.springframework.boot.context.properties.ConfigurationProperties;
+
+@ConfigurationProperties("rbac")
+public class RoleBasedAccessControlProperties {
+
+  private final List<Role> roles = new ArrayList<>();
+
+  @PostConstruct
+  public void init() {
+    roles.forEach(Role::validate);
+  }
+
+  public List<Role> getRoles() {
+    return roles;
+  }
+
+}
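
A hedged example of the rbac section these properties bind, following the Role → clusters/subjects/permissions shape added in this PR (the role name, cluster name, subject value and the exact resource/action spellings are illustrative assumptions; VIEW/EDIT and the "all" action are taken from the controllers and commit messages above):

rbac:
  roles:
    - name: "admins"
      clusters:
        - local
      subjects:
        - provider: oauth_google
          type: user
          value: "john@example.com"
      permissions:
        - resource: clusterconfig
          actions:
            - VIEW
            - EDIT
        - resource: topic
          value: ".*"
          actions: all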

+ 13 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/condition/CognitoCondition.java

@@ -0,0 +1,13 @@
+package com.provectus.kafka.ui.config.auth.condition;
+
+import com.provectus.kafka.ui.service.rbac.AbstractProviderCondition;
+import org.springframework.context.annotation.Condition;
+import org.springframework.context.annotation.ConditionContext;
+import org.springframework.core.type.AnnotatedTypeMetadata;
+
+public class CognitoCondition extends AbstractProviderCondition implements Condition {
+  @Override
+  public boolean matches(final ConditionContext context, final AnnotatedTypeMetadata metadata) {
+    return getRegisteredProvidersTypes(context.getEnvironment()).stream().anyMatch(a -> a.equalsIgnoreCase("cognito"));
+  }
+}

+ 18 - 10
kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/CognitoOidcLogoutSuccessHandler.java → kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/logout/CognitoLogoutSuccessHandler.java

@@ -1,27 +1,34 @@
-package com.provectus.kafka.ui.config;
+package com.provectus.kafka.ui.config.auth.logout;
 
+import com.provectus.kafka.ui.config.auth.OAuthProperties;
+import com.provectus.kafka.ui.config.auth.condition.CognitoCondition;
+import com.provectus.kafka.ui.model.rbac.provider.Provider;
 import java.net.URI;
 import java.nio.charset.StandardCharsets;
-import lombok.RequiredArgsConstructor;
+import org.springframework.context.annotation.Conditional;
 import org.springframework.http.HttpStatus;
 import org.springframework.http.server.reactive.ServerHttpResponse;
 import org.springframework.security.core.Authentication;
 import org.springframework.security.web.server.WebFilterExchange;
-import org.springframework.security.web.server.authentication.logout.ServerLogoutSuccessHandler;
 import org.springframework.security.web.util.UrlUtils;
+import org.springframework.stereotype.Component;
 import org.springframework.web.server.WebSession;
 import org.springframework.web.util.UriComponents;
 import org.springframework.web.util.UriComponentsBuilder;
 import reactor.core.publisher.Mono;
 
-@RequiredArgsConstructor
-public class CognitoOidcLogoutSuccessHandler implements ServerLogoutSuccessHandler {
+@Component
+@Conditional(CognitoCondition.class)
+public class CognitoLogoutSuccessHandler implements LogoutSuccessHandler {
 
-  private final String logoutUrl;
-  private final String clientId;
+  @Override
+  public boolean isApplicable(String provider) {
+    return Provider.Name.COGNITO.equalsIgnoreCase(provider);
+  }
 
   @Override
-  public Mono<Void> onLogoutSuccess(final WebFilterExchange exchange, final Authentication authentication) {
+  public Mono<Void> handle(WebFilterExchange exchange, Authentication authentication,
+                           OAuthProperties.OAuth2Provider provider) {
     final ServerHttpResponse response = exchange.getExchange().getResponse();
     response.setStatusCode(HttpStatus.FOUND);
 
@@ -39,8 +46,8 @@ public class CognitoOidcLogoutSuccessHandler implements ServerLogoutSuccessHandl
         .build();
 
     final var uri = UriComponentsBuilder
-        .fromUri(URI.create(logoutUrl))
-        .queryParam("client_id", clientId)
+        .fromUri(URI.create(provider.getCustomParams().get("logoutUrl")))
+        .queryParam("client_id", provider.getClientId())
         .queryParam("logout_uri", baseUrl)
         .encode(StandardCharsets.UTF_8)
         .build()
@@ -49,5 +56,6 @@ public class CognitoOidcLogoutSuccessHandler implements ServerLogoutSuccessHandl
     response.getHeaders().setLocation(uri);
     return exchange.getExchange().getSession().flatMap(WebSession::invalidate);
   }
+
 }
 

+ 15 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/logout/LogoutSuccessHandler.java

@@ -0,0 +1,15 @@
+package com.provectus.kafka.ui.config.auth.logout;
+
+import com.provectus.kafka.ui.config.auth.OAuthProperties;
+import org.springframework.security.core.Authentication;
+import org.springframework.security.web.server.WebFilterExchange;
+import reactor.core.publisher.Mono;
+
+public interface LogoutSuccessHandler {
+
+  boolean isApplicable(final String provider);
+
+  Mono<Void> handle(final WebFilterExchange exchange,
+                    final Authentication authentication,
+                    final OAuthProperties.OAuth2Provider provider);
+}

+ 46 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/logout/OAuthLogoutSuccessHandler.java

@@ -0,0 +1,46 @@
+package com.provectus.kafka.ui.config.auth.logout;
+
+import com.provectus.kafka.ui.config.auth.OAuthProperties;
+import java.util.List;
+import java.util.Optional;
+import org.springframework.beans.factory.annotation.Qualifier;
+import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
+import org.springframework.security.core.Authentication;
+import org.springframework.security.oauth2.client.authentication.OAuth2AuthenticationToken;
+import org.springframework.security.web.server.WebFilterExchange;
+import org.springframework.security.web.server.authentication.logout.ServerLogoutSuccessHandler;
+import org.springframework.stereotype.Component;
+import reactor.core.publisher.Mono;
+
+@Component
+@ConditionalOnProperty(value = "auth.type", havingValue = "OAUTH2")
+public class OAuthLogoutSuccessHandler implements ServerLogoutSuccessHandler {
+  private final OAuthProperties properties;
+  private final List<LogoutSuccessHandler> logoutSuccessHandlers;
+  private final ServerLogoutSuccessHandler defaultOidcLogoutHandler;
+
+  public OAuthLogoutSuccessHandler(final OAuthProperties properties,
+                                   final List<LogoutSuccessHandler> logoutSuccessHandlers,
+                                   final @Qualifier("defaultOidcLogoutHandler") ServerLogoutSuccessHandler handler) {
+    this.properties = properties;
+    this.logoutSuccessHandlers = logoutSuccessHandlers;
+    this.defaultOidcLogoutHandler = handler;
+  }
+
+  @Override
+  public Mono<Void> onLogoutSuccess(final WebFilterExchange exchange,
+                                    final Authentication authentication) {
+    final OAuth2AuthenticationToken oauthToken = (OAuth2AuthenticationToken) authentication;
+    final String providerId = oauthToken.getAuthorizedClientRegistrationId();
+    final OAuthProperties.OAuth2Provider oAuth2Provider = properties.getClient().get(providerId);
+    return getLogoutHandler(oAuth2Provider.getProvider())
+        .map(handler -> handler.handle(exchange, authentication, oAuth2Provider))
+        .orElseGet(() -> defaultOidcLogoutHandler.onLogoutSuccess(exchange, authentication));
+  }
+
+  private Optional<LogoutSuccessHandler> getLogoutHandler(final String provider) {
+    return logoutSuccessHandlers.stream()
+        .filter(h -> h.isApplicable(provider))
+        .findFirst();
+  }
+}
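
Putting the two handlers together: CognitoLogoutSuccessHandler now pulls its logout URL and client id from the provider's custom params rather than from the removed CognitoProperties, so a Cognito registration would presumably carry something like the following (the "type" param is an assumption tied to CognitoCondition; URLs and ids are placeholders):

auth:
  type: OAUTH2
  oauth2:
    client:
      cognito:
        provider: cognito
        clientId: <client-id>
        clientSecret: <client-secret>
        custom-params:
          type: cognito
          # read by CognitoLogoutSuccessHandler via provider.getCustomParams().get("logoutUrl")
          logoutUrl: https://<pool-domain>.auth.<region>.amazoncognito.com/logout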

+ 0 - 44
kafka-ui-api/src/main/java/com/provectus/kafka/ui/config/auth/props/CognitoProperties.java

@@ -1,44 +0,0 @@
-package com.provectus.kafka.ui.config.auth.props;
-
-import lombok.Data;
-import lombok.ToString;
-import org.jetbrains.annotations.Nullable;
-
-@Data
-@ToString(exclude = "clientSecret")
-public class CognitoProperties {
-
-  String clientId;
-  String logoutUri;
-  String issuerUri;
-  String clientSecret;
-  @Nullable
-  String scope;
-  @Nullable
-  String userNameAttribute;
-
-  public String getClientId() {
-    return clientId;
-  }
-
-  public String getLogoutUri() {
-    return logoutUri;
-  }
-
-  public String getIssuerUri() {
-    return issuerUri;
-  }
-
-  public String getClientSecret() {
-    return clientSecret;
-  }
-
-  public @Nullable String getScope() {
-    return scope;
-  }
-
-  public @Nullable String getUserNameAttribute() {
-    return userNameAttribute;
-  }
-
-}

+ 80 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/AccessController.java

@@ -0,0 +1,80 @@
+package com.provectus.kafka.ui.controller;
+
+import com.provectus.kafka.ui.api.AuthorizationApi;
+import com.provectus.kafka.ui.model.ActionDTO;
+import com.provectus.kafka.ui.model.AuthenticationInfoDTO;
+import com.provectus.kafka.ui.model.ResourceTypeDTO;
+import com.provectus.kafka.ui.model.UserInfoDTO;
+import com.provectus.kafka.ui.model.UserPermissionDTO;
+import com.provectus.kafka.ui.model.rbac.Permission;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
+import java.security.Principal;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.List;
+import java.util.stream.Collectors;
+import lombok.RequiredArgsConstructor;
+import org.springframework.http.ResponseEntity;
+import org.springframework.security.core.context.ReactiveSecurityContextHolder;
+import org.springframework.security.core.context.SecurityContext;
+import org.springframework.web.bind.annotation.RestController;
+import org.springframework.web.server.ServerWebExchange;
+import reactor.core.publisher.Mono;
+
+@RestController
+@RequiredArgsConstructor
+public class AccessController implements AuthorizationApi {
+
+  private final AccessControlService accessControlService;
+
+  public Mono<ResponseEntity<AuthenticationInfoDTO>> getUserAuthInfo(ServerWebExchange exchange) {
+    AuthenticationInfoDTO dto = new AuthenticationInfoDTO();
+    dto.setRbacEnabled(accessControlService.isRbacEnabled());
+    UserInfoDTO userInfo = new UserInfoDTO();
+
+    Mono<List<UserPermissionDTO>> permissions = accessControlService.getUser()
+        .map(user -> accessControlService.getRoles()
+            .stream()
+            .filter(role -> user.groups().contains(role.getName()))
+            .map(role -> mapPermissions(role.getPermissions(), role.getClusters()))
+            .flatMap(Collection::stream)
+            .collect(Collectors.toList())
+        )
+        .switchIfEmpty(Mono.just(Collections.emptyList()));
+
+    Mono<String> userName = ReactiveSecurityContextHolder.getContext()
+        .map(SecurityContext::getAuthentication)
+        .map(Principal::getName);
+
+    return userName
+        .zipWith(permissions)
+        .map(data -> {
+          userInfo.setUsername(data.getT1());
+          userInfo.setPermissions(data.getT2());
+
+          dto.setUserInfo(userInfo);
+          return dto;
+        })
+        .switchIfEmpty(Mono.just(dto))
+        .map(ResponseEntity::ok);
+  }
+
+  private List<UserPermissionDTO> mapPermissions(List<Permission> permissions, List<String> clusters) {
+    return permissions
+        .stream()
+        .map(permission -> {
+          UserPermissionDTO dto = new UserPermissionDTO();
+          dto.setClusters(clusters);
+          dto.setResource(ResourceTypeDTO.fromValue(permission.getResource().toString().toUpperCase()));
+          dto.setValue(permission.getValue() != null ? permission.getValue().toString() : null);
+          dto.setActions(permission.getActions()
+              .stream()
+              .map(String::toUpperCase)
+              .map(ActionDTO::valueOf)
+              .collect(Collectors.toList()));
+          return dto;
+        })
+        .collect(Collectors.toList());
+  }
+
+}
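
For orientation, a rough sketch of the payload getUserAuthInfo assembles, read off the DTO setters above and shown in YAML form to match the other snippets here (the actual JSON field casing comes from the generated DTOs, and all values are illustrative):

rbacEnabled: true
userInfo:
  username: "john@example.com"
  permissions:
    - clusters:
        - local
      resource: TOPIC
      value: ".*"
      actions:
        - VIEW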

+ 68 - 27
kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/BrokersController.java

@@ -8,7 +8,10 @@ import com.provectus.kafka.ui.model.BrokerDTO;
 import com.provectus.kafka.ui.model.BrokerLogdirUpdateDTO;
 import com.provectus.kafka.ui.model.BrokerMetricsDTO;
 import com.provectus.kafka.ui.model.BrokersLogdirsDTO;
+import com.provectus.kafka.ui.model.rbac.AccessContext;
+import com.provectus.kafka.ui.model.rbac.permission.ClusterConfigAction;
 import com.provectus.kafka.ui.service.BrokerService;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
 import java.util.List;
 import lombok.RequiredArgsConstructor;
 import lombok.extern.slf4j.Slf4j;
@@ -24,47 +27,78 @@ import reactor.core.publisher.Mono;
 public class BrokersController extends AbstractController implements BrokersApi {
   private final BrokerService brokerService;
   private final ClusterMapper clusterMapper;
+  private final AccessControlService accessControlService;
 
   @Override
-  public Mono<ResponseEntity<BrokerMetricsDTO>> getBrokersMetrics(String clusterName, Integer id,
-                                                                  ServerWebExchange exchange) {
-    return brokerService.getBrokerMetrics(getCluster(clusterName), id)
-        .map(clusterMapper::toBrokerMetrics)
-        .map(ResponseEntity::ok)
-        .onErrorReturn(ResponseEntity.notFound().build());
+  public Mono<ResponseEntity<Flux<BrokerDTO>>> getBrokers(String clusterName,
+                                                          ServerWebExchange exchange) {
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .build());
+
+    var job = brokerService.getBrokers(getCluster(clusterName)).map(clusterMapper::toBrokerDto);
+
+    return validateAccess.thenReturn(ResponseEntity.ok(job));
   }
 
   @Override
-  public Mono<ResponseEntity<Flux<BrokerDTO>>> getBrokers(String clusterName,
-                                                          ServerWebExchange exchange) {
-    return Mono.just(ResponseEntity.ok(
-        brokerService.getBrokers(getCluster(clusterName)).map(clusterMapper::toBrokerDto)));
+  public Mono<ResponseEntity<BrokerMetricsDTO>> getBrokersMetrics(String clusterName, Integer id,
+                                                                  ServerWebExchange exchange) {
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .build());
+
+    return validateAccess.then(
+        brokerService.getBrokerMetrics(getCluster(clusterName), id)
+            .map(clusterMapper::toBrokerMetrics)
+            .map(ResponseEntity::ok)
+            .onErrorReturn(ResponseEntity.notFound().build())
+    );
   }
 
   @Override
   public Mono<ResponseEntity<Flux<BrokersLogdirsDTO>>> getAllBrokersLogdirs(String clusterName,
                                                                             List<Integer> brokers,
-                                                                            ServerWebExchange exchange
-  ) {
-    return Mono.just(ResponseEntity.ok(
+                                                                            ServerWebExchange exchange) {
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .build());
+
+    return validateAccess.thenReturn(ResponseEntity.ok(
         brokerService.getAllBrokersLogdirs(getCluster(clusterName), brokers)));
   }
 
   @Override
-  public Mono<ResponseEntity<Flux<BrokerConfigDTO>>> getBrokerConfig(String clusterName, Integer id,
+  public Mono<ResponseEntity<Flux<BrokerConfigDTO>>> getBrokerConfig(String clusterName,
+                                                                     Integer id,
                                                                      ServerWebExchange exchange) {
-    return Mono.just(ResponseEntity.ok(
-        brokerService.getBrokerConfig(getCluster(clusterName), id)
-            .map(clusterMapper::toBrokerConfig)));
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .clusterConfigActions(ClusterConfigAction.VIEW)
+        .build());
+
+    return validateAccess.thenReturn(
+        ResponseEntity.ok(
+            brokerService.getBrokerConfig(getCluster(clusterName), id)
+                .map(clusterMapper::toBrokerConfig))
+    );
   }
 
   @Override
-  public Mono<ResponseEntity<Void>> updateBrokerTopicPartitionLogDir(
-      String clusterName, Integer id, Mono<BrokerLogdirUpdateDTO> brokerLogdir,
-      ServerWebExchange exchange) {
-    return brokerLogdir
-        .flatMap(bld -> brokerService.updateBrokerLogDir(getCluster(clusterName), id, bld))
-        .map(ResponseEntity::ok);
+  public Mono<ResponseEntity<Void>> updateBrokerTopicPartitionLogDir(String clusterName,
+                                                                     Integer id,
+                                                                     Mono<BrokerLogdirUpdateDTO> brokerLogdir,
+                                                                     ServerWebExchange exchange) {
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .clusterConfigActions(ClusterConfigAction.VIEW, ClusterConfigAction.EDIT)
+        .build());
+
+    return validateAccess.then(
+        brokerLogdir
+            .flatMap(bld -> brokerService.updateBrokerLogDir(getCluster(clusterName), id, bld))
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
@@ -73,9 +107,16 @@ public class BrokersController extends AbstractController implements BrokersApi
                                                              String name,
                                                              Mono<BrokerConfigItemDTO> brokerConfig,
                                                              ServerWebExchange exchange) {
-    return brokerConfig
-        .flatMap(bci -> brokerService.updateBrokerConfigByName(
-            getCluster(clusterName), id, name, bci.getValue()))
-        .map(ResponseEntity::ok);
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .clusterConfigActions(ClusterConfigAction.VIEW, ClusterConfigAction.EDIT)
+        .build());
+
+    return validateAccess.then(
+        brokerConfig
+            .flatMap(bci -> brokerService.updateBrokerConfigByName(
+                getCluster(clusterName), id, name, bci.getValue()))
+            .map(ResponseEntity::ok)
+    );
   }
 }

+ 39 - 11
kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/ClustersController.java

@@ -4,7 +4,9 @@ import com.provectus.kafka.ui.api.ClustersApi;
 import com.provectus.kafka.ui.model.ClusterDTO;
 import com.provectus.kafka.ui.model.ClusterMetricsDTO;
 import com.provectus.kafka.ui.model.ClusterStatsDTO;
+import com.provectus.kafka.ui.model.rbac.AccessContext;
 import com.provectus.kafka.ui.service.ClusterService;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
 import lombok.RequiredArgsConstructor;
 import lombok.extern.slf4j.Slf4j;
 import org.springframework.http.ResponseEntity;
@@ -18,31 +20,57 @@ import reactor.core.publisher.Mono;
 @Slf4j
 public class ClustersController extends AbstractController implements ClustersApi {
   private final ClusterService clusterService;
+  private final AccessControlService accessControlService;
+
+  @Override
+  public Mono<ResponseEntity<Flux<ClusterDTO>>> getClusters(ServerWebExchange exchange) {
+    Flux<ClusterDTO> job = Flux.fromIterable(clusterService.getClusters())
+        .filterWhen(accessControlService::isClusterAccessible);
+
+    return Mono.just(ResponseEntity.ok(job));
+  }
 
   @Override
   public Mono<ResponseEntity<ClusterMetricsDTO>> getClusterMetrics(String clusterName,
                                                                    ServerWebExchange exchange) {
-    return clusterService.getClusterMetrics(getCluster(clusterName))
-        .map(ResponseEntity::ok)
-        .onErrorReturn(ResponseEntity.notFound().build());
+    AccessContext context = AccessContext.builder()
+        .cluster(clusterName)
+        .build();
+
+    return accessControlService.validateAccess(context)
+        .then(
+            clusterService.getClusterMetrics(getCluster(clusterName))
+                .map(ResponseEntity::ok)
+                .onErrorReturn(ResponseEntity.notFound().build())
+        );
   }
 
   @Override
   public Mono<ResponseEntity<ClusterStatsDTO>> getClusterStats(String clusterName,
                                                                ServerWebExchange exchange) {
-    return clusterService.getClusterStats(getCluster(clusterName))
-        .map(ResponseEntity::ok)
-        .onErrorReturn(ResponseEntity.notFound().build());
-  }
+    AccessContext context = AccessContext.builder()
+        .cluster(clusterName)
+        .build();
 
-  @Override
-  public Mono<ResponseEntity<Flux<ClusterDTO>>> getClusters(ServerWebExchange exchange) {
-    return Mono.just(ResponseEntity.ok(Flux.fromIterable(clusterService.getClusters())));
+    return accessControlService.validateAccess(context)
+        .then(
+            clusterService.getClusterStats(getCluster(clusterName))
+                .map(ResponseEntity::ok)
+                .onErrorReturn(ResponseEntity.notFound().build())
+        );
   }
 
   @Override
   public Mono<ResponseEntity<ClusterDTO>> updateClusterInfo(String clusterName,
                                                             ServerWebExchange exchange) {
-    return clusterService.updateCluster(getCluster(clusterName)).map(ResponseEntity::ok);
+
+    AccessContext context = AccessContext.builder()
+        .cluster(clusterName)
+        .build();
+
+    return accessControlService.validateAccess(context)
+        .then(
+            clusterService.updateCluster(getCluster(clusterName)).map(ResponseEntity::ok)
+        );
   }
 }

+ 127 - 69
kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/ConsumerGroupsController.java

@@ -1,5 +1,8 @@
 package com.provectus.kafka.ui.controller;
 
+import static com.provectus.kafka.ui.model.rbac.permission.ConsumerGroupAction.DELETE;
+import static com.provectus.kafka.ui.model.rbac.permission.ConsumerGroupAction.RESET_OFFSETS;
+import static com.provectus.kafka.ui.model.rbac.permission.ConsumerGroupAction.VIEW;
 import static java.util.stream.Collectors.toMap;
 
 import com.provectus.kafka.ui.api.ConsumerGroupsApi;
@@ -12,10 +15,14 @@ import com.provectus.kafka.ui.model.ConsumerGroupOrderingDTO;
 import com.provectus.kafka.ui.model.ConsumerGroupsPageResponseDTO;
 import com.provectus.kafka.ui.model.PartitionOffsetDTO;
 import com.provectus.kafka.ui.model.SortOrderDTO;
+import com.provectus.kafka.ui.model.rbac.AccessContext;
+import com.provectus.kafka.ui.model.rbac.permission.TopicAction;
 import com.provectus.kafka.ui.service.ConsumerGroupService;
 import com.provectus.kafka.ui.service.OffsetsResetService;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
 import java.util.Map;
 import java.util.Optional;
+import java.util.function.Supplier;
 import java.util.stream.Collectors;
 import lombok.RequiredArgsConstructor;
 import lombok.extern.slf4j.Slf4j;
@@ -34,33 +41,65 @@ public class ConsumerGroupsController extends AbstractController implements Cons
 
   private final ConsumerGroupService consumerGroupService;
   private final OffsetsResetService offsetsResetService;
+  private final AccessControlService accessControlService;
 
   @Value("${consumer.groups.page.size:25}")
   private int defaultConsumerGroupsPageSize;
 
   @Override
-  public Mono<ResponseEntity<Void>> deleteConsumerGroup(String clusterName, String id,
+  public Mono<ResponseEntity<Void>> deleteConsumerGroup(String clusterName,
+                                                        String id,
                                                         ServerWebExchange exchange) {
-    return consumerGroupService.deleteConsumerGroupById(getCluster(clusterName), id)
-        .thenReturn(ResponseEntity.ok().build());
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .consumerGroup(id)
+        .consumerGroupActions(DELETE)
+        .build());
+
+    return validateAccess.then(
+        consumerGroupService.deleteConsumerGroupById(getCluster(clusterName), id)
+            .thenReturn(ResponseEntity.ok().build())
+    );
   }
 
   @Override
-  public Mono<ResponseEntity<ConsumerGroupDetailsDTO>> getConsumerGroup(
-      String clusterName, String consumerGroupId, ServerWebExchange exchange) {
-    return consumerGroupService.getConsumerGroupDetail(getCluster(clusterName), consumerGroupId)
-        .map(ConsumerGroupMapper::toDetailsDto)
-        .map(ResponseEntity::ok);
+  public Mono<ResponseEntity<ConsumerGroupDetailsDTO>> getConsumerGroup(String clusterName,
+                                                                        String consumerGroupId,
+                                                                        ServerWebExchange exchange) {
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .consumerGroup(consumerGroupId)
+        .consumerGroupActions(VIEW)
+        .build());
+
+    return validateAccess.then(
+        consumerGroupService.getConsumerGroupDetail(getCluster(clusterName), consumerGroupId)
+            .map(ConsumerGroupMapper::toDetailsDto)
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
-  public Mono<ResponseEntity<Flux<ConsumerGroupDTO>>> getTopicConsumerGroups(
-      String clusterName, String topicName, ServerWebExchange exchange) {
-    return consumerGroupService.getConsumerGroupsForTopic(getCluster(clusterName), topicName)
-        .map(Flux::fromIterable)
-        .map(f -> f.map(ConsumerGroupMapper::toDto))
-        .map(ResponseEntity::ok)
-        .switchIfEmpty(Mono.just(ResponseEntity.notFound().build()));
+  public Mono<ResponseEntity<Flux<ConsumerGroupDTO>>> getTopicConsumerGroups(String clusterName,
+                                                                             String topicName,
+                                                                             ServerWebExchange exchange) {
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(TopicAction.VIEW)
+        .build());
+
+    Mono<ResponseEntity<Flux<ConsumerGroupDTO>>> job =
+        consumerGroupService.getConsumerGroupsForTopic(getCluster(clusterName), topicName)
+            .flatMapMany(Flux::fromIterable)
+            .filterWhen(cg -> accessControlService.isConsumerGroupAccessible(cg.getGroupId(), clusterName))
+            .map(ConsumerGroupMapper::toDto)
+            .collectList()
+            .map(Flux::fromIterable)
+            .map(ResponseEntity::ok)
+            .switchIfEmpty(Mono.just(ResponseEntity.notFound().build()));
+
+    return validateAccess.then(job);
   }
 
   @Override
@@ -72,16 +111,79 @@ public class ConsumerGroupsController extends AbstractController implements Cons
       ConsumerGroupOrderingDTO orderBy,
       SortOrderDTO sortOrderDto,
       ServerWebExchange exchange) {
-    return consumerGroupService.getConsumerGroupsPage(
-            getCluster(clusterName),
-            Optional.ofNullable(page).filter(i -> i > 0).orElse(1),
-            Optional.ofNullable(perPage).filter(i -> i > 0).orElse(defaultConsumerGroupsPageSize),
-            search,
-            Optional.ofNullable(orderBy).orElse(ConsumerGroupOrderingDTO.NAME),
-            Optional.ofNullable(sortOrderDto).orElse(SortOrderDTO.ASC)
-        )
-        .map(this::convertPage)
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        // consumer-group-level access is validated inside the service
+        .build());
+
+    return validateAccess.then(
+        consumerGroupService.getConsumerGroupsPage(
+                getCluster(clusterName),
+                Optional.ofNullable(page).filter(i -> i > 0).orElse(1),
+                Optional.ofNullable(perPage).filter(i -> i > 0).orElse(defaultConsumerGroupsPageSize),
+                search,
+                Optional.ofNullable(orderBy).orElse(ConsumerGroupOrderingDTO.NAME),
+                Optional.ofNullable(sortOrderDto).orElse(SortOrderDTO.ASC)
+            )
+            .map(this::convertPage)
+            .map(ResponseEntity::ok)
+    );
+  }
+
+  @Override
+  public Mono<ResponseEntity<Void>> resetConsumerGroupOffsets(String clusterName,
+                                                              String group,
+                                                              Mono<ConsumerGroupOffsetsResetDTO> resetDto,
+                                                              ServerWebExchange exchange) {
+    return resetDto.flatMap(reset -> {
+      Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+          .cluster(clusterName)
+          .topic(reset.getTopic())
+          .topicActions(TopicAction.VIEW)
+          .consumerGroupActions(RESET_OFFSETS)
+          .build());
+
+      Supplier<Mono<Void>> mono = () -> {
+        var cluster = getCluster(clusterName);
+        switch (reset.getResetType()) {
+          case EARLIEST:
+            return offsetsResetService
+                .resetToEarliest(cluster, group, reset.getTopic(), reset.getPartitions());
+          case LATEST:
+            return offsetsResetService
+                .resetToLatest(cluster, group, reset.getTopic(), reset.getPartitions());
+          case TIMESTAMP:
+            if (reset.getResetToTimestamp() == null) {
+              return Mono.error(
+                  new ValidationException(
+                      "resetToTimestamp is required when TIMESTAMP reset type used"
+                  )
+              );
+            }
+            return offsetsResetService
+                .resetToTimestamp(cluster, group, reset.getTopic(), reset.getPartitions(),
+                    reset.getResetToTimestamp());
+          case OFFSET:
+            if (CollectionUtils.isEmpty(reset.getPartitionsOffsets())) {
+              return Mono.error(
+                  new ValidationException(
+                      "partitionsOffsets is required when OFFSET reset type used"
+                  )
+              );
+            }
+            Map<Integer, Long> offsets = reset.getPartitionsOffsets().stream()
+                .collect(toMap(PartitionOffsetDTO::getPartition, PartitionOffsetDTO::getOffset));
+            return offsetsResetService.resetToOffsets(cluster, group, reset.getTopic(), offsets);
+          default:
+            return Mono.error(
+                new ValidationException("Unknown resetType " + reset.getResetType())
+            );
+        }
+      };
+
+      return validateAccess.then(mono.get());
+    }).thenReturn(ResponseEntity.ok().build());
   }
 
   private ConsumerGroupsPageResponseDTO convertPage(ConsumerGroupService.ConsumerGroupsPage
@@ -94,48 +196,4 @@ public class ConsumerGroupsController extends AbstractController implements Cons
             .collect(Collectors.toList()));
   }
 
-  @Override
-  public Mono<ResponseEntity<Void>> resetConsumerGroupOffsets(String clusterName, String group,
-                                                              Mono<ConsumerGroupOffsetsResetDTO>
-                                                                  consumerGroupOffsetsReset,
-                                                              ServerWebExchange exchange) {
-    return consumerGroupOffsetsReset.flatMap(reset -> {
-      var cluster = getCluster(clusterName);
-      switch (reset.getResetType()) {
-        case EARLIEST:
-          return offsetsResetService
-              .resetToEarliest(cluster, group, reset.getTopic(), reset.getPartitions());
-        case LATEST:
-          return offsetsResetService
-              .resetToLatest(cluster, group, reset.getTopic(), reset.getPartitions());
-        case TIMESTAMP:
-          if (reset.getResetToTimestamp() == null) {
-            return Mono.error(
-                new ValidationException(
-                    "resetToTimestamp is required when TIMESTAMP reset type used"
-                )
-            );
-          }
-          return offsetsResetService
-              .resetToTimestamp(cluster, group, reset.getTopic(), reset.getPartitions(),
-                  reset.getResetToTimestamp());
-        case OFFSET:
-          if (CollectionUtils.isEmpty(reset.getPartitionsOffsets())) {
-            return Mono.error(
-                new ValidationException(
-                    "partitionsOffsets is required when OFFSET reset type used"
-                )
-            );
-          }
-          Map<Integer, Long> offsets = reset.getPartitionsOffsets().stream()
-              .collect(toMap(PartitionOffsetDTO::getPartition, PartitionOffsetDTO::getOffset));
-          return offsetsResetService.resetToOffsets(cluster, group, reset.getTopic(), offsets);
-        default:
-          return Mono.error(
-              new ValidationException("Unknown resetType " + reset.getResetType())
-          );
-      }
-    }).thenReturn(ResponseEntity.ok().build());
-  }
-
 }
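
In resetConsumerGroupOffsets above, the reset-type switch is wrapped in a Supplier and composed as validateAccess.then(mono.get()). Assuming the offset-reset service returns cold publishers, as is usual for reactive services, the reset itself is still only subscribed after validation succeeds, even though mono.get() evaluates the switch while the chain is assembled; only work performed eagerly at assembly time (plain method calls rather than publishers) would escape the check. A small sketch of the difference between the two compositions, with doReset standing in for the switch; this is an illustration, not a change the PR makes:

import java.util.function.Supplier;
import reactor.core.publisher.Mono;

class DeferralSketch {

  // The switch (doReset.get()) is evaluated while the chain is assembled, but the
  // resulting Mono is only subscribed after validate completes successfully.
  static Mono<Void> viaSupplier(Mono<Void> validate, Supplier<Mono<Void>> doReset) {
    return validate.then(doReset.get());
  }

  // Mono.defer postpones even the switch evaluation until validate has completed,
  // which also keeps any non-reactive side effects behind the access check.
  static Mono<Void> viaDefer(Mono<Void> validate, Supplier<Mono<Void>> doReset) {
    return validate.then(Mono.defer(doReset));
  }
}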

+ 132 - 34
kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KafkaConnectController.java

@@ -11,7 +11,10 @@ import com.provectus.kafka.ui.model.FullConnectorInfoDTO;
 import com.provectus.kafka.ui.model.NewConnectorDTO;
 import com.provectus.kafka.ui.model.SortOrderDTO;
 import com.provectus.kafka.ui.model.TaskDTO;
+import com.provectus.kafka.ui.model.rbac.AccessContext;
+import com.provectus.kafka.ui.model.rbac.permission.ConnectAction;
 import com.provectus.kafka.ui.service.KafkaConnectService;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
 import java.util.Comparator;
 import java.util.Map;
 import javax.validation.Valid;
@@ -28,42 +31,83 @@ import reactor.core.publisher.Mono;
 @Slf4j
 public class KafkaConnectController extends AbstractController implements KafkaConnectApi {
   private final KafkaConnectService kafkaConnectService;
+  private final AccessControlService accessControlService;
 
   @Override
   public Mono<ResponseEntity<Flux<ConnectDTO>>> getConnects(String clusterName,
                                                             ServerWebExchange exchange) {
-    return kafkaConnectService.getConnects(getCluster(clusterName)).map(ResponseEntity::ok);
+
+    Flux<ConnectDTO> flux = Flux.fromIterable(kafkaConnectService.getConnects(getCluster(clusterName)))
+        .filterWhen(dto -> accessControlService.isConnectAccessible(dto, clusterName));
+
+    return Mono.just(ResponseEntity.ok(flux));
   }
 
   @Override
   public Mono<ResponseEntity<Flux<String>>> getConnectors(String clusterName, String connectName,
                                                           ServerWebExchange exchange) {
-    var connectors = kafkaConnectService.getConnectors(getCluster(clusterName), connectName);
-    return Mono.just(ResponseEntity.ok(connectors));
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .connect(connectName)
+        .connectActions(ConnectAction.VIEW)
+        .build());
+
+    return validateAccess.thenReturn(
+        ResponseEntity.ok(kafkaConnectService.getConnectors(getCluster(clusterName), connectName))
+    );
   }
 
   @Override
   public Mono<ResponseEntity<ConnectorDTO>> createConnector(String clusterName, String connectName,
                                                             @Valid Mono<NewConnectorDTO> connector,
                                                             ServerWebExchange exchange) {
-    return kafkaConnectService.createConnector(getCluster(clusterName), connectName, connector)
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .connect(connectName)
+        .connectActions(ConnectAction.VIEW, ConnectAction.CREATE)
+        .build());
+
+    return validateAccess.then(
+        kafkaConnectService.createConnector(getCluster(clusterName), connectName, connector)
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
   public Mono<ResponseEntity<ConnectorDTO>> getConnector(String clusterName, String connectName,
                                                          String connectorName,
                                                          ServerWebExchange exchange) {
-    return kafkaConnectService.getConnector(getCluster(clusterName), connectName, connectorName)
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .connect(connectName)
+        .connectActions(ConnectAction.VIEW)
+        .connector(connectorName)
+        .build());
+
+    return validateAccess.then(
+        kafkaConnectService.getConnector(getCluster(clusterName), connectName, connectorName)
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
   public Mono<ResponseEntity<Void>> deleteConnector(String clusterName, String connectName,
                                                     String connectorName,
                                                     ServerWebExchange exchange) {
-    return kafkaConnectService.deleteConnector(getCluster(clusterName), connectName, connectorName)
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .connect(connectName)
+        .connectActions(ConnectAction.VIEW, ConnectAction.EDIT)
+        .build());
+
+    return validateAccess.then(
+        kafkaConnectService.deleteConnector(getCluster(clusterName), connectName, connectorName)
+            .map(ResponseEntity::ok)
+    );
   }
 
 
@@ -76,11 +120,13 @@ public class KafkaConnectController extends AbstractController implements KafkaC
       ServerWebExchange exchange
   ) {
     var comparator = sortOrder == null || sortOrder.equals(SortOrderDTO.ASC)
-            ? getConnectorsComparator(orderBy)
-            : getConnectorsComparator(orderBy).reversed();
-    return Mono.just(ResponseEntity.ok(
-        kafkaConnectService.getAllConnectors(getCluster(clusterName), search).sort(comparator))
-    );
+        ? getConnectorsComparator(orderBy)
+        : getConnectorsComparator(orderBy).reversed();
+    Flux<FullConnectorInfoDTO> job = kafkaConnectService.getAllConnectors(getCluster(clusterName), search)
+        .filterWhen(dto -> accessControlService.isConnectAccessible(dto.getConnect(), clusterName))
+        .filterWhen(dto -> accessControlService.isConnectorAccessible(dto.getConnect(), dto.getName(), clusterName));
+
+    return Mono.just(ResponseEntity.ok(job.sort(comparator)));
   }
 
   @Override
@@ -88,9 +134,18 @@ public class KafkaConnectController extends AbstractController implements KafkaC
                                                                       String connectName,
                                                                       String connectorName,
                                                                       ServerWebExchange exchange) {
-    return kafkaConnectService
-        .getConnectorConfig(getCluster(clusterName), connectName, connectorName)
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .connect(connectName)
+        .connectActions(ConnectAction.VIEW)
+        .build());
+
+    return validateAccess.then(
+        kafkaConnectService
+            .getConnectorConfig(getCluster(clusterName), connectName, connectorName)
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
@@ -99,9 +154,18 @@ public class KafkaConnectController extends AbstractController implements KafkaC
                                                                String connectorName,
                                                                @Valid Mono<Object> requestBody,
                                                                ServerWebExchange exchange) {
-    return kafkaConnectService
-        .setConnectorConfig(getCluster(clusterName), connectName, connectorName, requestBody)
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .connect(connectName)
+        .connectActions(ConnectAction.VIEW, ConnectAction.EDIT)
+        .build());
+
+    return validateAccess.then(
+        kafkaConnectService
+            .setConnectorConfig(getCluster(clusterName), connectName, connectorName, requestBody)
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
@@ -109,9 +173,18 @@ public class KafkaConnectController extends AbstractController implements KafkaC
                                                          String connectorName,
                                                          ConnectorActionDTO action,
                                                          ServerWebExchange exchange) {
-    return kafkaConnectService
-        .updateConnectorState(getCluster(clusterName), connectName, connectorName, action)
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .connect(connectName)
+        .connectActions(ConnectAction.VIEW, ConnectAction.EDIT)
+        .build());
+
+    return validateAccess.then(
+        kafkaConnectService
+            .updateConnectorState(getCluster(clusterName), connectName, connectorName, action)
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
@@ -119,31 +192,56 @@ public class KafkaConnectController extends AbstractController implements KafkaC
                                                                String connectName,
                                                                String connectorName,
                                                                ServerWebExchange exchange) {
-    return Mono.just(ResponseEntity
-        .ok(kafkaConnectService
-            .getConnectorTasks(getCluster(clusterName), connectName, connectorName)));
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .connect(connectName)
+        .connectActions(ConnectAction.VIEW)
+        .build());
+
+    return validateAccess.thenReturn(
+        ResponseEntity
+            .ok(kafkaConnectService
+                .getConnectorTasks(getCluster(clusterName), connectName, connectorName))
+    );
   }
 
   @Override
   public Mono<ResponseEntity<Void>> restartConnectorTask(String clusterName, String connectName,
                                                          String connectorName, Integer taskId,
                                                          ServerWebExchange exchange) {
-    return kafkaConnectService
-        .restartConnectorTask(getCluster(clusterName), connectName, connectorName, taskId)
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .connect(connectName)
+        .connectActions(ConnectAction.VIEW, ConnectAction.EDIT)
+        .build());
+
+    return validateAccess.then(
+        kafkaConnectService
+            .restartConnectorTask(getCluster(clusterName), connectName, connectorName, taskId)
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
   public Mono<ResponseEntity<Flux<ConnectorPluginDTO>>> getConnectorPlugins(
       String clusterName, String connectName, ServerWebExchange exchange) {
-    return kafkaConnectService
-        .getConnectorPlugins(getCluster(clusterName), connectName)
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .connect(connectName)
+        .connectActions(ConnectAction.VIEW)
+        .build());
+
+    return validateAccess.then(
+        kafkaConnectService
+            .getConnectorPlugins(getCluster(clusterName), connectName)
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
-  public Mono<ResponseEntity<ConnectorPluginConfigValidationResponseDTO>>
-      validateConnectorPluginConfig(
+  public Mono<ResponseEntity<ConnectorPluginConfigValidationResponseDTO>> validateConnectorPluginConfig(
       String clusterName, String connectName, String pluginName, @Valid Mono<Object> requestBody,
       ServerWebExchange exchange) {
     return kafkaConnectService

+ 39 - 13
kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/KsqlController.java

@@ -7,7 +7,10 @@ import com.provectus.kafka.ui.model.KsqlResponseDTO;
 import com.provectus.kafka.ui.model.KsqlStreamDescriptionDTO;
 import com.provectus.kafka.ui.model.KsqlTableDescriptionDTO;
 import com.provectus.kafka.ui.model.KsqlTableResponseDTO;
+import com.provectus.kafka.ui.model.rbac.AccessContext;
+import com.provectus.kafka.ui.model.rbac.permission.KsqlAction;
 import com.provectus.kafka.ui.service.ksql.KsqlServiceV2;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
 import java.util.List;
 import java.util.Map;
 import java.util.Optional;
@@ -19,51 +22,74 @@ import org.springframework.web.server.ServerWebExchange;
 import reactor.core.publisher.Flux;
 import reactor.core.publisher.Mono;
 
-
 @RestController
 @RequiredArgsConstructor
 @Slf4j
 public class KsqlController extends AbstractController implements KsqlApi {
 
   private final KsqlServiceV2 ksqlServiceV2;
+  private final AccessControlService accessControlService;
 
   @Override
   public Mono<ResponseEntity<KsqlCommandV2ResponseDTO>> executeKsql(String clusterName,
                                                                     Mono<KsqlCommandV2DTO>
                                                                         ksqlCommand2Dto,
                                                                     ServerWebExchange exchange) {
-    return ksqlCommand2Dto.map(dto -> {
-      var id = ksqlServiceV2.registerCommand(
-          getCluster(clusterName),
-          dto.getKsql(),
-          Optional.ofNullable(dto.getStreamsProperties()).orElse(Map.of()));
-      return new KsqlCommandV2ResponseDTO().pipeId(id);
-    }).map(ResponseEntity::ok);
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .ksqlActions(KsqlAction.EXECUTE)
+        .build());
+
+    return validateAccess.then(
+        ksqlCommand2Dto.map(dto -> {
+          var id = ksqlServiceV2.registerCommand(
+              getCluster(clusterName),
+              dto.getKsql(),
+              Optional.ofNullable(dto.getStreamsProperties()).orElse(Map.of()));
+          return new KsqlCommandV2ResponseDTO().pipeId(id);
+        }).map(ResponseEntity::ok)
+    );
   }
 
   @Override
   public Mono<ResponseEntity<Flux<KsqlResponseDTO>>> openKsqlResponsePipe(String clusterName,
                                                                           String pipeId,
                                                                           ServerWebExchange exchange) {
-    return Mono.just(
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .ksqlActions(KsqlAction.EXECUTE)
+        .build());
+
+    return validateAccess.thenReturn(
         ResponseEntity.ok(ksqlServiceV2.execute(pipeId)
             .map(table -> new KsqlResponseDTO()
                 .table(
                     new KsqlTableResponseDTO()
                         .header(table.getHeader())
                         .columnNames(table.getColumnNames())
-                        .values((List<List<Object>>) ((List<?>) (table.getValues())))))));
+                        .values((List<List<Object>>) ((List<?>) (table.getValues()))))))
+    );
   }
 
   @Override
   public Mono<ResponseEntity<Flux<KsqlStreamDescriptionDTO>>> listStreams(String clusterName,
-                                                                         ServerWebExchange exchange) {
-    return Mono.just(ResponseEntity.ok(ksqlServiceV2.listStreams(getCluster(clusterName))));
+                                                                          ServerWebExchange exchange) {
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .ksqlActions(KsqlAction.EXECUTE)
+        .build());
+
+    return validateAccess.thenReturn(ResponseEntity.ok(ksqlServiceV2.listStreams(getCluster(clusterName))));
   }
 
   @Override
   public Mono<ResponseEntity<Flux<KsqlTableDescriptionDTO>>> listTables(String clusterName,
                                                                         ServerWebExchange exchange) {
-    return Mono.just(ResponseEntity.ok(ksqlServiceV2.listTables(getCluster(clusterName))));
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .ksqlActions(KsqlAction.EXECUTE)
+        .build());
+
+    return validateAccess.thenReturn(ResponseEntity.ok(ksqlServiceV2.listTables(getCluster(clusterName))));
   }
 }
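
Every guarded endpoint now consults AccessControlService before doing real work, so controller tests written before this change will likely need the new dependency stubbed. A minimal Mockito sketch of a permissive stub, assuming the return types the usages above imply (Mono<Void> from validateAccess, Mono<Boolean> from the is*Accessible checks used with filterWhen); the helper class is hypothetical and not part of the PR:

import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.anyString;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import com.provectus.kafka.ui.service.rbac.AccessControlService;
import reactor.core.publisher.Mono;

class PermissiveAccessControlStub {

  // Produces a mock that lets every request through, so pre-existing controller
  // tests keep exercising the service layer rather than the RBAC rules.
  static AccessControlService allowEverything() {
    AccessControlService stub = mock(AccessControlService.class);
    when(stub.validateAccess(any())).thenReturn(Mono.empty());          // empty completion = allowed
    when(stub.isClusterAccessible(any())).thenReturn(Mono.just(true));
    when(stub.isTopicAccessible(any(), anyString())).thenReturn(Mono.just(true));
    when(stub.isConsumerGroupAccessible(anyString(), anyString())).thenReturn(Mono.just(true));
    when(stub.isSchemaAccessible(anyString(), anyString())).thenReturn(Mono.just(true));
    return stub;
  }
}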

+ 62 - 19
kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/MessagesController.java

@@ -1,5 +1,8 @@
 package com.provectus.kafka.ui.controller;
 
+import static com.provectus.kafka.ui.model.rbac.permission.TopicAction.MESSAGES_DELETE;
+import static com.provectus.kafka.ui.model.rbac.permission.TopicAction.MESSAGES_PRODUCE;
+import static com.provectus.kafka.ui.model.rbac.permission.TopicAction.MESSAGES_READ;
 import static com.provectus.kafka.ui.serde.api.Serde.Target.KEY;
 import static com.provectus.kafka.ui.serde.api.Serde.Target.VALUE;
 import static java.util.stream.Collectors.toMap;
@@ -14,8 +17,11 @@ import com.provectus.kafka.ui.model.SeekTypeDTO;
 import com.provectus.kafka.ui.model.SerdeUsageDTO;
 import com.provectus.kafka.ui.model.TopicMessageEventDTO;
 import com.provectus.kafka.ui.model.TopicSerdeSuggestionDTO;
+import com.provectus.kafka.ui.model.rbac.AccessContext;
+import com.provectus.kafka.ui.model.rbac.permission.TopicAction;
 import com.provectus.kafka.ui.service.DeserializationService;
 import com.provectus.kafka.ui.service.MessagesService;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
 import java.util.List;
 import java.util.Map;
 import java.util.Optional;
@@ -42,16 +48,26 @@ public class MessagesController extends AbstractController implements MessagesAp
 
   private final MessagesService messagesService;
   private final DeserializationService deserializationService;
+  private final AccessControlService accessControlService;
 
   @Override
   public Mono<ResponseEntity<Void>> deleteTopicMessages(
       String clusterName, String topicName, @Valid List<Integer> partitions,
       ServerWebExchange exchange) {
-    return messagesService.deleteTopicMessages(
-        getCluster(clusterName),
-        topicName,
-        Optional.ofNullable(partitions).orElse(List.of())
-    ).thenReturn(ResponseEntity.ok().build());
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(MESSAGES_DELETE)
+        .build());
+
+    return validateAccess.then(
+        messagesService.deleteTopicMessages(
+            getCluster(clusterName),
+            topicName,
+            Optional.ofNullable(partitions).orElse(List.of())
+        ).thenReturn(ResponseEntity.ok().build())
+    );
   }
 
   @Override
@@ -66,6 +82,12 @@ public class MessagesController extends AbstractController implements MessagesAp
                                                                            String keySerde,
                                                                            String valueSerde,
                                                                            ServerWebExchange exchange) {
+    final Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(MESSAGES_READ)
+        .build());
+
     seekType = seekType != null ? seekType : SeekTypeDTO.BEGINNING;
     seekDirection = seekDirection != null ? seekDirection : SeekDirectionDTO.FORWARD;
     filterQueryType = filterQueryType != null ? filterQueryType : MessageFilterTypeDTO.STRING_CONTAINS;
@@ -77,22 +99,33 @@ public class MessagesController extends AbstractController implements MessagesAp
         topicName,
         parseSeekTo(topicName, seekType, seekTo)
     );
-    return Mono.just(
+    Mono<ResponseEntity<Flux<TopicMessageEventDTO>>> job = Mono.just(
         ResponseEntity.ok(
             messagesService.loadMessages(
                 getCluster(clusterName), topicName, positions, q, filterQueryType,
                 recordsLimit, seekDirection, keySerde, valueSerde)
         )
     );
+
+    return validateAccess.then(job);
   }
 
   @Override
   public Mono<ResponseEntity<Void>> sendTopicMessages(
       String clusterName, String topicName, @Valid Mono<CreateTopicMessageDTO> createTopicMessage,
       ServerWebExchange exchange) {
-    return createTopicMessage.flatMap(msg ->
-        messagesService.sendMessage(getCluster(clusterName), topicName, msg).then()
-    ).map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(MESSAGES_PRODUCE)
+        .build());
+
+    return validateAccess.then(
+        createTopicMessage.flatMap(msg ->
+            messagesService.sendMessage(getCluster(clusterName), topicName, msg).then()
+        ).map(ResponseEntity::ok)
+    );
   }
 
   /**
@@ -128,15 +161,25 @@ public class MessagesController extends AbstractController implements MessagesAp
                                                                  String topicName,
                                                                  SerdeUsageDTO use,
                                                                  ServerWebExchange exchange) {
-    return Mono.just(
-        new TopicSerdeSuggestionDTO()
-            .key(use == SerdeUsageDTO.SERIALIZE
-                ? deserializationService.getSerdesForSerialize(getCluster(clusterName), topicName, KEY)
-                : deserializationService.getSerdesForDeserialize(getCluster(clusterName), topicName, KEY))
-            .value(use == SerdeUsageDTO.SERIALIZE
-                ? deserializationService.getSerdesForSerialize(getCluster(clusterName), topicName, VALUE)
-                : deserializationService.getSerdesForDeserialize(getCluster(clusterName), topicName, VALUE))
-    ).subscribeOn(Schedulers.boundedElastic())
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(TopicAction.VIEW)
+        .build());
+
+    TopicSerdeSuggestionDTO dto = new TopicSerdeSuggestionDTO()
+        .key(use == SerdeUsageDTO.SERIALIZE
+            ? deserializationService.getSerdesForSerialize(getCluster(clusterName), topicName, KEY)
+            : deserializationService.getSerdesForDeserialize(getCluster(clusterName), topicName, KEY))
+        .value(use == SerdeUsageDTO.SERIALIZE
+            ? deserializationService.getSerdesForSerialize(getCluster(clusterName), topicName, VALUE)
+            : deserializationService.getSerdesForDeserialize(getCluster(clusterName), topicName, VALUE));
+
+    return validateAccess.then(
+        Mono.just(dto)
+            .subscribeOn(Schedulers.boundedElastic())
+            .map(ResponseEntity::ok)
+    );
   }
 }

+ 140 - 40
kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/SchemasController.java

@@ -9,8 +9,10 @@ import com.provectus.kafka.ui.model.KafkaCluster;
 import com.provectus.kafka.ui.model.NewSchemaSubjectDTO;
 import com.provectus.kafka.ui.model.SchemaSubjectDTO;
 import com.provectus.kafka.ui.model.SchemaSubjectsResponseDTO;
+import com.provectus.kafka.ui.model.rbac.AccessContext;
+import com.provectus.kafka.ui.model.rbac.permission.SchemaAction;
 import com.provectus.kafka.ui.service.SchemaRegistryService;
-import java.util.Arrays;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
 import java.util.List;
 import java.util.stream.Collectors;
 import javax.validation.Valid;
@@ -33,6 +35,7 @@ public class SchemasController extends AbstractController implements SchemasApi
   private final ClusterMapper mapper;
 
   private final SchemaRegistryService schemaRegistryService;
+  private final AccessControlService accessControlService;
 
   @Override
   protected KafkaCluster getCluster(String clusterName) {
@@ -47,48 +50,105 @@ public class SchemasController extends AbstractController implements SchemasApi
   public Mono<ResponseEntity<CompatibilityCheckResponseDTO>> checkSchemaCompatibility(
       String clusterName, String subject, @Valid Mono<NewSchemaSubjectDTO> newSchemaSubject,
       ServerWebExchange exchange) {
-    return schemaRegistryService.checksSchemaCompatibility(
-            getCluster(clusterName), subject, newSchemaSubject)
-        .map(mapper::toCompatibilityCheckResponse)
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .schema(subject)
+        .schemaActions(SchemaAction.VIEW)
+        .build());
+
+    return validateAccess.then(
+        schemaRegistryService.checksSchemaCompatibility(
+                getCluster(clusterName), subject, newSchemaSubject)
+            .map(mapper::toCompatibilityCheckResponse)
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
   public Mono<ResponseEntity<SchemaSubjectDTO>> createNewSchema(
       String clusterName, @Valid Mono<NewSchemaSubjectDTO> newSchemaSubject,
       ServerWebExchange exchange) {
-    return schemaRegistryService
-        .registerNewSchema(getCluster(clusterName), newSchemaSubject)
-        .map(ResponseEntity::ok);
+
+    return newSchemaSubject.flatMap(dto -> {
+      Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+          .cluster(clusterName)
+          .schemaActions(SchemaAction.CREATE)
+          .build());
+
+      return validateAccess.then(
+          schemaRegistryService
+              .registerNewSchema(getCluster(clusterName), dto)
+              .map(ResponseEntity::ok)
+      );
+    });
   }
 
   @Override
-  public Mono<ResponseEntity<Void>> deleteLatestSchema(
-      String clusterName, String subject, ServerWebExchange exchange) {
-    return schemaRegistryService.deleteLatestSchemaSubject(getCluster(clusterName), subject)
-        .thenReturn(ResponseEntity.ok().build());
+  public Mono<ResponseEntity<Void>> deleteLatestSchema(String clusterName,
+                                                       String subject,
+                                                       ServerWebExchange exchange) {
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .schema(subject)
+        .schemaActions(SchemaAction.DELETE)
+        .build());
+
+    return validateAccess.then(
+        schemaRegistryService.deleteLatestSchemaSubject(getCluster(clusterName), subject)
+            .thenReturn(ResponseEntity.ok().build())
+    );
   }
 
   @Override
-  public Mono<ResponseEntity<Void>> deleteSchema(
-      String clusterName, String subjectName, ServerWebExchange exchange) {
-    return schemaRegistryService.deleteSchemaSubjectEntirely(getCluster(clusterName), subjectName)
-        .thenReturn(ResponseEntity.ok().build());
+  public Mono<ResponseEntity<Void>> deleteSchema(String clusterName,
+                                                 String subject,
+                                                 ServerWebExchange exchange) {
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .schema(subject)
+        .schemaActions(SchemaAction.DELETE)
+        .build());
+
+    return validateAccess.then(
+        schemaRegistryService.deleteSchemaSubjectEntirely(getCluster(clusterName), subject)
+            .thenReturn(ResponseEntity.ok().build())
+    );
   }
 
   @Override
-  public Mono<ResponseEntity<Void>> deleteSchemaByVersion(
-      String clusterName, String subjectName, Integer version, ServerWebExchange exchange) {
-    return schemaRegistryService.deleteSchemaSubjectByVersion(getCluster(clusterName), subjectName, version)
-        .thenReturn(ResponseEntity.ok().build());
+  public Mono<ResponseEntity<Void>> deleteSchemaByVersion(String clusterName,
+                                                          String subject,
+                                                          Integer version,
+                                                          ServerWebExchange exchange) {
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .schema(subject)
+        .schemaActions(SchemaAction.DELETE)
+        .build());
+
+    return validateAccess.then(
+        schemaRegistryService.deleteSchemaSubjectByVersion(getCluster(clusterName), subject, version)
+            .thenReturn(ResponseEntity.ok().build())
+    );
   }
 
   @Override
   public Mono<ResponseEntity<Flux<SchemaSubjectDTO>>> getAllVersionsBySubject(
-      String clusterName, String subjectName, ServerWebExchange exchange) {
+      String clusterName, String subject, ServerWebExchange exchange) {
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .schema(subject)
+        .schemaActions(SchemaAction.VIEW)
+        .build());
+
     Flux<SchemaSubjectDTO> schemas =
-        schemaRegistryService.getAllVersionsBySubject(getCluster(clusterName), subjectName);
-    return Mono.just(ResponseEntity.ok(schemas));
+        schemaRegistryService.getAllVersionsBySubject(getCluster(clusterName), subject);
+
+    return validateAccess.thenReturn(ResponseEntity.ok(schemas));
   }
 
   @Override
@@ -101,18 +161,36 @@ public class SchemasController extends AbstractController implements SchemasApi
   }
 
   @Override
-  public Mono<ResponseEntity<SchemaSubjectDTO>> getLatestSchema(String clusterName, String subject,
+  public Mono<ResponseEntity<SchemaSubjectDTO>> getLatestSchema(String clusterName,
+                                                                String subject,
                                                                 ServerWebExchange exchange) {
-    return schemaRegistryService.getLatestSchemaVersionBySubject(getCluster(clusterName), subject)
-        .map(ResponseEntity::ok);
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .schema(subject)
+        .schemaActions(SchemaAction.VIEW)
+        .build());
+
+    return validateAccess.then(
+        schemaRegistryService.getLatestSchemaVersionBySubject(getCluster(clusterName), subject)
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
   public Mono<ResponseEntity<SchemaSubjectDTO>> getSchemaByVersion(
       String clusterName, String subject, Integer version, ServerWebExchange exchange) {
-    return schemaRegistryService.getSchemaSubjectByVersion(
-            getCluster(clusterName), subject, version)
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .schema(subject)
+        .schemaActions(SchemaAction.VIEW)
+        .build());
+
+    return validateAccess.then(
+        schemaRegistryService.getSchemaSubjectByVersion(
+                getCluster(clusterName), subject, version)
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
@@ -120,16 +198,19 @@ public class SchemasController extends AbstractController implements SchemasApi
                                                                     @Valid Integer pageNum,
                                                                     @Valid Integer perPage,
                                                                     @Valid String search,
-                                                                    ServerWebExchange serverWebExchange) {
+                                                                    ServerWebExchange exchange) {
     return schemaRegistryService
         .getAllSubjectNames(getCluster(clusterName))
+        .flatMapMany(Flux::fromArray)
+        .filterWhen(schema -> accessControlService.isSchemaAccessible(schema, clusterName))
+        .collectList()
         .flatMap(subjects -> {
           int pageSize = perPage != null && perPage > 0 ? perPage : DEFAULT_PAGE_SIZE;
           int subjectToSkip = ((pageNum != null && pageNum > 0 ? pageNum : 1) - 1) * pageSize;
-          List<String> filteredSubjects = Arrays.stream(subjects)
+          List<String> filteredSubjects = subjects
+              .stream()
               .filter(subj -> search == null || StringUtils.containsIgnoreCase(subj, search))
-              .sorted()
-              .collect(Collectors.toList());
+              .sorted().toList();
           var totalPages = (filteredSubjects.size() / pageSize)
               + (filteredSubjects.size() % pageSize == 0 ? 0 : 1);
           List<String> subjectsToRender = filteredSubjects.stream()
@@ -138,26 +219,45 @@ public class SchemasController extends AbstractController implements SchemasApi
               .collect(Collectors.toList());
           return schemaRegistryService.getAllLatestVersionSchemas(getCluster(clusterName), subjectsToRender)
               .map(a -> new SchemaSubjectsResponseDTO().pageCount(totalPages).schemas(a));
-        }).map(ResponseEntity::ok);
+        })
+        .map(ResponseEntity::ok);
   }
 
   @Override
   public Mono<ResponseEntity<Void>> updateGlobalSchemaCompatibilityLevel(
       String clusterName, @Valid Mono<CompatibilityLevelDTO> compatibilityLevel,
       ServerWebExchange exchange) {
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .schemaActions(SchemaAction.MODIFY_GLOBAL_COMPATIBILITY)
+        .build());
+
     log.info("Updating schema compatibility globally");
-    return schemaRegistryService.updateSchemaCompatibility(
-            getCluster(clusterName), compatibilityLevel)
-        .map(ResponseEntity::ok);
+
+    return validateAccess.then(
+        schemaRegistryService.updateSchemaCompatibility(
+                getCluster(clusterName), compatibilityLevel)
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
   public Mono<ResponseEntity<Void>> updateSchemaCompatibilityLevel(
       String clusterName, String subject, @Valid Mono<CompatibilityLevelDTO> compatibilityLevel,
       ServerWebExchange exchange) {
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .schemaActions(SchemaAction.EDIT)
+        .build());
+
     log.info("Updating schema compatibility for subject: {}", subject);
-    return schemaRegistryService.updateSchemaCompatibility(
-            getCluster(clusterName), subject, compatibilityLevel)
-        .map(ResponseEntity::ok);
+
+    return validateAccess.then(
+        schemaRegistryService.updateSchemaCompatibility(
+                getCluster(clusterName), subject, compatibilityLevel)
+            .map(ResponseEntity::ok)
+    );
   }
 }
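
getSchemas filters the subject list through isSchemaAccessible before computing page counts, so pagination only reflects subjects the caller can see; the TopicsController change below filters after the page slice is chosen, which keeps the old page math but can return pages shorter than the requested size. The filter-before-paginate step condenses to a small reusable shape, sketched here as an illustration (filterAccessible is not a helper the PR adds):

import java.util.List;
import java.util.function.Function;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

class AsyncFilterSketch {

  // Applies an asynchronous per-element accessibility check to a reactive list,
  // the same flatMapMany / filterWhen / collectList shape used in getSchemas above.
  static <T> Mono<List<T>> filterAccessible(Mono<List<T>> items,
                                            Function<T, Mono<Boolean>> isAccessible) {
    return items
        .flatMapMany(Flux::fromIterable)
        .filterWhen(isAccessible)
        .collectList();
  }
}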

+ 179 - 65
kafka-ui-api/src/main/java/com/provectus/kafka/ui/controller/TopicsController.java

@@ -1,5 +1,10 @@
 package com.provectus.kafka.ui.controller;
 
+import static com.provectus.kafka.ui.model.rbac.permission.TopicAction.CREATE;
+import static com.provectus.kafka.ui.model.rbac.permission.TopicAction.DELETE;
+import static com.provectus.kafka.ui.model.rbac.permission.TopicAction.EDIT;
+import static com.provectus.kafka.ui.model.rbac.permission.TopicAction.MESSAGES_READ;
+import static com.provectus.kafka.ui.model.rbac.permission.TopicAction.VIEW;
 import static java.util.stream.Collectors.toList;
 
 import com.provectus.kafka.ui.api.TopicsApi;
@@ -19,8 +24,10 @@ import com.provectus.kafka.ui.model.TopicDTO;
 import com.provectus.kafka.ui.model.TopicDetailsDTO;
 import com.provectus.kafka.ui.model.TopicUpdateDTO;
 import com.provectus.kafka.ui.model.TopicsResponseDTO;
+import com.provectus.kafka.ui.model.rbac.AccessContext;
 import com.provectus.kafka.ui.service.TopicsService;
 import com.provectus.kafka.ui.service.analyze.TopicAnalysisService;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
 import java.util.Comparator;
 import java.util.List;
 import javax.validation.Valid;
@@ -44,66 +51,121 @@ public class TopicsController extends AbstractController implements TopicsApi {
   private final TopicsService topicsService;
   private final TopicAnalysisService topicAnalysisService;
   private final ClusterMapper clusterMapper;
+  private final AccessControlService accessControlService;
 
   @Override
   public Mono<ResponseEntity<TopicDTO>> createTopic(
       String clusterName, @Valid Mono<TopicCreationDTO> topicCreation, ServerWebExchange exchange) {
-    return topicsService.createTopic(getCluster(clusterName), topicCreation)
-        .map(clusterMapper::toTopic)
-        .map(s -> new ResponseEntity<>(s, HttpStatus.OK))
-        .switchIfEmpty(Mono.just(ResponseEntity.notFound().build()));
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topicActions(CREATE)
+        .build());
+
+    return validateAccess.then(
+        topicsService.createTopic(getCluster(clusterName), topicCreation)
+            .map(clusterMapper::toTopic)
+            .map(s -> new ResponseEntity<>(s, HttpStatus.OK))
+            .switchIfEmpty(Mono.just(ResponseEntity.notFound().build()))
+    );
   }
 
   @Override
   public Mono<ResponseEntity<TopicDTO>> recreateTopic(String clusterName,
-                                                      String topicName, ServerWebExchange serverWebExchange) {
-    return topicsService.recreateTopic(getCluster(clusterName), topicName)
-        .map(clusterMapper::toTopic)
-        .map(s -> new ResponseEntity<>(s, HttpStatus.CREATED));
+                                                      String topicName, ServerWebExchange exchange) {
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(VIEW, CREATE, DELETE)
+        .build());
+
+    return validateAccess.then(
+        topicsService.recreateTopic(getCluster(clusterName), topicName)
+            .map(clusterMapper::toTopic)
+            .map(s -> new ResponseEntity<>(s, HttpStatus.CREATED))
+    );
   }
 
   @Override
   public Mono<ResponseEntity<TopicDTO>> cloneTopic(
       String clusterName, String topicName, String newTopicName, ServerWebExchange exchange) {
-    return topicsService.cloneTopic(getCluster(clusterName), topicName, newTopicName)
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(VIEW, CREATE)
+        .build());
+
+    return validateAccess.then(topicsService.cloneTopic(getCluster(clusterName), topicName, newTopicName)
         .map(clusterMapper::toTopic)
-        .map(s -> new ResponseEntity<>(s, HttpStatus.CREATED));
+        .map(s -> new ResponseEntity<>(s, HttpStatus.CREATED))
+    );
   }
 
   @Override
   public Mono<ResponseEntity<Void>> deleteTopic(
       String clusterName, String topicName, ServerWebExchange exchange) {
-    return topicsService.deleteTopic(getCluster(clusterName), topicName).map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(DELETE)
+        .build());
+
+    return validateAccess.then(
+        topicsService.deleteTopic(getCluster(clusterName), topicName).map(ResponseEntity::ok)
+    );
   }
 
 
   @Override
   public Mono<ResponseEntity<Flux<TopicConfigDTO>>> getTopicConfigs(
       String clusterName, String topicName, ServerWebExchange exchange) {
-    return topicsService.getTopicConfigs(getCluster(clusterName), topicName)
-        .map(lst -> lst.stream()
-            .map(InternalTopicConfig::from)
-            .map(clusterMapper::toTopicConfig)
-            .collect(toList()))
-        .map(Flux::fromIterable)
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(VIEW)
+        .build());
+
+    return validateAccess.then(
+        topicsService.getTopicConfigs(getCluster(clusterName), topicName)
+            .map(lst -> lst.stream()
+                .map(InternalTopicConfig::from)
+                .map(clusterMapper::toTopicConfig)
+                .collect(toList()))
+            .map(Flux::fromIterable)
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
   public Mono<ResponseEntity<TopicDetailsDTO>> getTopicDetails(
       String clusterName, String topicName, ServerWebExchange exchange) {
-    return topicsService.getTopicDetails(getCluster(clusterName), topicName)
-        .map(clusterMapper::toTopicDetails)
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(VIEW)
+        .build());
+
+    return validateAccess.then(
+        topicsService.getTopicDetails(getCluster(clusterName), topicName)
+            .map(clusterMapper::toTopicDetails)
+            .map(ResponseEntity::ok)
+    );
   }
 
-  public Mono<ResponseEntity<TopicsResponseDTO>> getTopics(String clusterName, @Valid Integer page,
+  @Override
+  public Mono<ResponseEntity<TopicsResponseDTO>> getTopics(String clusterName,
+                                                           @Valid Integer page,
                                                            @Valid Integer perPage,
                                                            @Valid Boolean showInternal,
                                                            @Valid String search,
                                                            @Valid TopicColumnsToSortDTO orderBy,
                                                            @Valid SortOrderDTO sortOrder,
                                                            ServerWebExchange exchange) {
+
     return topicsService.getTopicsForPagination(getCluster(clusterName))
         .flatMap(existingTopics -> {
           int pageSize = perPage != null && perPage > 0 ? perPage : DEFAULT_PAGE_SIZE;
@@ -115,7 +177,7 @@ public class TopicsController extends AbstractController implements TopicsApi {
                   || showInternal != null && showInternal)
               .filter(topic -> search == null || StringUtils.contains(topic.getName(), search))
               .sorted(comparator)
-              .collect(toList());
+              .toList();
           var totalPages = (filtered.size() / pageSize)
               + (filtered.size() % pageSize == 0 ? 0 : 1);
 
@@ -126,42 +188,34 @@ public class TopicsController extends AbstractController implements TopicsApi {
               .collect(toList());
 
           return topicsService.loadTopics(getCluster(clusterName), topicsPage)
+              .flatMapMany(Flux::fromIterable)
+              .filterWhen(dto -> accessControlService.isTopicAccessible(dto, clusterName))
+              .collectList()
               .map(topicsToRender ->
                   new TopicsResponseDTO()
                       .topics(topicsToRender.stream().map(clusterMapper::toTopic).collect(toList()))
                       .pageCount(totalPages));
-        }).map(ResponseEntity::ok);
-  }
-
-  private Comparator<InternalTopic> getComparatorForTopic(
-      TopicColumnsToSortDTO orderBy) {
-    var defaultComparator = Comparator.comparing(InternalTopic::getName);
-    if (orderBy == null) {
-      return defaultComparator;
-    }
-    switch (orderBy) {
-      case TOTAL_PARTITIONS:
-        return Comparator.comparing(InternalTopic::getPartitionCount);
-      case OUT_OF_SYNC_REPLICAS:
-        return Comparator.comparing(t -> t.getReplicas() - t.getInSyncReplicas());
-      case REPLICATION_FACTOR:
-        return Comparator.comparing(InternalTopic::getReplicationFactor);
-      case SIZE:
-        return Comparator.comparing(InternalTopic::getSegmentSize);
-      case NAME:
-      default:
-        return defaultComparator;
-    }
+        })
+        .map(ResponseEntity::ok);
   }
 
   @Override
   public Mono<ResponseEntity<TopicDTO>> updateTopic(
-      String clusterId, String topicName, @Valid Mono<TopicUpdateDTO> topicUpdate,
+      String clusterName, String topicName, @Valid Mono<TopicUpdateDTO> topicUpdate,
       ServerWebExchange exchange) {
-    return topicsService
-        .updateTopic(getCluster(clusterId), topicName, topicUpdate)
-        .map(clusterMapper::toTopic)
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(VIEW, EDIT)
+        .build());
+
+    return validateAccess.then(
+        topicsService
+            .updateTopic(getCluster(clusterName), topicName, topicUpdate)
+            .map(clusterMapper::toTopic)
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
@@ -169,9 +223,18 @@ public class TopicsController extends AbstractController implements TopicsApi {
       String clusterName, String topicName,
       Mono<PartitionsIncreaseDTO> partitionsIncrease,
       ServerWebExchange exchange) {
-    return partitionsIncrease.flatMap(partitions ->
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(VIEW, EDIT)
+        .build());
+
+    return validateAccess.then(
+        partitionsIncrease.flatMap(partitions ->
             topicsService.increaseTopicPartitions(getCluster(clusterName), topicName, partitions)
-        ).map(ResponseEntity::ok);
+        ).map(ResponseEntity::ok)
+    );
   }
 
   @Override
@@ -179,23 +242,48 @@ public class TopicsController extends AbstractController implements TopicsApi {
       String clusterName, String topicName,
       Mono<ReplicationFactorChangeDTO> replicationFactorChange,
       ServerWebExchange exchange) {
-    return replicationFactorChange
-        .flatMap(rfc ->
-            topicsService.changeReplicationFactor(getCluster(clusterName), topicName, rfc))
-        .map(ResponseEntity::ok);
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(VIEW, EDIT)
+        .build());
+
+    return validateAccess.then(
+        replicationFactorChange
+            .flatMap(rfc ->
+                topicsService.changeReplicationFactor(getCluster(clusterName), topicName, rfc))
+            .map(ResponseEntity::ok)
+    );
   }
 
   @Override
   public Mono<ResponseEntity<Void>> analyzeTopic(String clusterName, String topicName, ServerWebExchange exchange) {
-    return topicAnalysisService.analyze(getCluster(clusterName), topicName)
-        .thenReturn(ResponseEntity.ok().build());
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(MESSAGES_READ)
+        .build());
+
+    return validateAccess.then(
+        topicAnalysisService.analyze(getCluster(clusterName), topicName)
+            .thenReturn(ResponseEntity.ok().build())
+    );
   }
 
   @Override
   public Mono<ResponseEntity<Void>> cancelTopicAnalysis(String clusterName, String topicName,
-                                                       ServerWebExchange exchange) {
+                                                        ServerWebExchange exchange) {
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(MESSAGES_READ)
+        .build());
+
     topicAnalysisService.cancelAnalysis(getCluster(clusterName), topicName);
-    return Mono.just(ResponseEntity.ok().build());
+
+    return validateAccess.thenReturn(ResponseEntity.ok().build());
   }
 
 
@@ -203,10 +291,36 @@ public class TopicsController extends AbstractController implements TopicsApi {
   public Mono<ResponseEntity<TopicAnalysisDTO>> getTopicAnalysis(String clusterName,
                                                                  String topicName,
                                                                  ServerWebExchange exchange) {
-    return Mono.just(
-        topicAnalysisService.getTopicAnalysis(getCluster(clusterName), topicName)
-            .map(ResponseEntity::ok)
-            .orElseGet(() -> ResponseEntity.notFound().build())
-    );
+
+    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
+        .cluster(clusterName)
+        .topic(topicName)
+        .topicActions(MESSAGES_READ)
+        .build());
+
+    return validateAccess.thenReturn(topicAnalysisService.getTopicAnalysis(getCluster(clusterName), topicName)
+        .map(ResponseEntity::ok)
+        .orElseGet(() -> ResponseEntity.notFound().build()));
+  }
+
+  private Comparator<InternalTopic> getComparatorForTopic(
+      TopicColumnsToSortDTO orderBy) {
+    var defaultComparator = Comparator.comparing(InternalTopic::getName);
+    if (orderBy == null) {
+      return defaultComparator;
+    }
+    switch (orderBy) {
+      case TOTAL_PARTITIONS:
+        return Comparator.comparing(InternalTopic::getPartitionCount);
+      case OUT_OF_SYNC_REPLICAS:
+        return Comparator.comparing(t -> t.getReplicas() - t.getInSyncReplicas());
+      case REPLICATION_FACTOR:
+        return Comparator.comparing(InternalTopic::getReplicationFactor);
+      case SIZE:
+        return Comparator.comparing(InternalTopic::getSegmentSize);
+      case NAME:
+      default:
+        return defaultComparator;
+    }
   }
 }
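The controller changes above follow two recurring shapes: write endpoints are guarded by validateAccess(...) and only execute once that Mono<Void> completes, while list endpoints drop inaccessible items element by element with filterWhen. A condensed sketch of both, assuming placeholder calls someWriteCall/someListCall that are not part of this PR:

    // Condensed sketch of the two access-control patterns used in TopicsController above.
    // someWriteCall / someListCall are placeholders; only the RBAC calls come from this PR.

    // 1) Write endpoints: validate first, run the actual call lazily afterwards.
    Mono<Void> validateAccess = accessControlService.validateAccess(AccessContext.builder()
        .cluster(clusterName)
        .topic(topicName)
        .topicActions(VIEW, EDIT)
        .build());
    Mono<ResponseEntity<Void>> response = validateAccess
        .then(someWriteCall(clusterName, topicName).thenReturn(ResponseEntity.ok().build()));

    // 2) List endpoints: filter each topic through the reactive accessibility check.
    Flux<InternalTopic> visible = someListCall(clusterName)
        .filterWhen(topic -> accessControlService.isTopicAccessible(topic, clusterName));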

+ 2 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/exception/ErrorCode.java

@@ -7,6 +7,8 @@ import org.springframework.http.HttpStatus;
 
 public enum ErrorCode {
 
+  FORBIDDEN(403, HttpStatus.FORBIDDEN),
+
   UNEXPECTED(5000, HttpStatus.INTERNAL_SERVER_ERROR),
   KSQL_API_ERROR(5001, HttpStatus.INTERNAL_SERVER_ERROR),
   BINDING_FAIL(4001, HttpStatus.BAD_REQUEST),

+ 134 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/AccessContext.java

@@ -0,0 +1,134 @@
+package com.provectus.kafka.ui.model.rbac;
+
+import com.provectus.kafka.ui.model.rbac.permission.ClusterConfigAction;
+import com.provectus.kafka.ui.model.rbac.permission.ConnectAction;
+import com.provectus.kafka.ui.model.rbac.permission.ConsumerGroupAction;
+import com.provectus.kafka.ui.model.rbac.permission.KsqlAction;
+import com.provectus.kafka.ui.model.rbac.permission.SchemaAction;
+import com.provectus.kafka.ui.model.rbac.permission.TopicAction;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.List;
+import lombok.Value;
+import org.springframework.util.Assert;
+
+@Value
+public class AccessContext {
+
+  String cluster;
+  Collection<ClusterConfigAction> clusterConfigActions;
+
+  String topic;
+  Collection<TopicAction> topicActions;
+
+  String consumerGroup;
+  Collection<ConsumerGroupAction> consumerGroupActions;
+
+  String connect;
+  Collection<ConnectAction> connectActions;
+
+  String connector;
+
+  String schema;
+  Collection<SchemaAction> schemaActions;
+
+  Collection<KsqlAction> ksqlActions;
+
+  public static AccessContextBuilder builder() {
+    return new AccessContextBuilder();
+  }
+
+  public static final class AccessContextBuilder {
+    private String cluster;
+    private Collection<ClusterConfigAction> clusterConfigActions = Collections.emptySet();
+    private String topic;
+    private Collection<TopicAction> topicActions = Collections.emptySet();
+    private String consumerGroup;
+    private Collection<ConsumerGroupAction> consumerGroupActions = Collections.emptySet();
+    private String connect;
+    private Collection<ConnectAction> connectActions = Collections.emptySet();
+    private String connector;
+    private String schema;
+    private Collection<SchemaAction> schemaActions = Collections.emptySet();
+    private Collection<KsqlAction> ksqlActions = Collections.emptySet();
+
+    private AccessContextBuilder() {
+    }
+
+    public AccessContextBuilder cluster(String cluster) {
+      this.cluster = cluster;
+      return this;
+    }
+
+    public AccessContextBuilder clusterConfigActions(ClusterConfigAction... actions) {
+      Assert.isTrue(actions.length > 0, "actions not present");
+      this.clusterConfigActions = List.of(actions);
+      return this;
+    }
+
+    public AccessContextBuilder topic(String topic) {
+      this.topic = topic;
+      return this;
+    }
+
+    public AccessContextBuilder topicActions(TopicAction... actions) {
+      Assert.isTrue(actions.length > 0, "actions not present");
+      this.topicActions = List.of(actions);
+      return this;
+    }
+
+    public AccessContextBuilder consumerGroup(String consumerGroup) {
+      this.consumerGroup = consumerGroup;
+      return this;
+    }
+
+    public AccessContextBuilder consumerGroupActions(ConsumerGroupAction... actions) {
+      Assert.isTrue(actions.length > 0, "actions not present");
+      this.consumerGroupActions = List.of(actions);
+      return this;
+    }
+
+    public AccessContextBuilder connect(String connect) {
+      this.connect = connect;
+      return this;
+    }
+
+    public AccessContextBuilder connectActions(ConnectAction... actions) {
+      Assert.isTrue(actions.length > 0, "actions not present");
+      this.connectActions = List.of(actions);
+      return this;
+    }
+
+    public AccessContextBuilder connector(String connector) {
+      this.connector = connector;
+      return this;
+    }
+
+    public AccessContextBuilder schema(String schema) {
+      this.schema = schema;
+      return this;
+    }
+
+    public AccessContextBuilder schemaActions(SchemaAction... actions) {
+      Assert.isTrue(actions.length > 0, "actions not present");
+      this.schemaActions = List.of(actions);
+      return this;
+    }
+
+    public AccessContextBuilder ksqlActions(KsqlAction... actions) {
+      Assert.isTrue(actions.length > 0, "actions not present");
+      this.ksqlActions = List.of(actions);
+      return this;
+    }
+
+    public AccessContext build() {
+      return new AccessContext(cluster, clusterConfigActions,
+          topic, topicActions,
+          consumerGroup, consumerGroupActions,
+          connect, connectActions,
+          connector,
+          schema, schemaActions,
+          ksqlActions);
+    }
+  }
+}
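The builder above covers every RBAC resource, not only topics. As a purely illustrative sketch (the connect and connector names below are invented, not taken from this PR), a context for viewing and editing a connector could be assembled like this:

    // Illustrative only: an access context for a connector inside a specific
    // Kafka Connect instance on cluster "local".
    AccessContext context = AccessContext.builder()
        .cluster("local")
        .connect("first-connect")
        .connectActions(ConnectAction.VIEW, ConnectAction.EDIT)
        .connector("sink-activities")
        .build();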

+ 72 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/Permission.java

@@ -0,0 +1,72 @@
+package com.provectus.kafka.ui.model.rbac;
+
+import static com.provectus.kafka.ui.model.rbac.Resource.CLUSTERCONFIG;
+import static com.provectus.kafka.ui.model.rbac.Resource.KSQL;
+
+import com.provectus.kafka.ui.model.rbac.permission.ClusterConfigAction;
+import com.provectus.kafka.ui.model.rbac.permission.ConnectAction;
+import com.provectus.kafka.ui.model.rbac.permission.ConsumerGroupAction;
+import com.provectus.kafka.ui.model.rbac.permission.KsqlAction;
+import com.provectus.kafka.ui.model.rbac.permission.SchemaAction;
+import com.provectus.kafka.ui.model.rbac.permission.TopicAction;
+import java.util.Arrays;
+import java.util.List;
+import java.util.regex.Pattern;
+import lombok.EqualsAndHashCode;
+import lombok.Getter;
+import lombok.ToString;
+import org.apache.commons.collections.CollectionUtils;
+import org.jetbrains.annotations.Nullable;
+import org.springframework.util.Assert;
+
+@Getter
+@ToString
+@EqualsAndHashCode
+public class Permission {
+
+  Resource resource;
+
+  @Nullable
+  Pattern value;
+  List<String> actions;
+
+  @SuppressWarnings("unused")
+  public void setResource(String resource) {
+    this.resource = Resource.fromString(resource.toUpperCase());
+  }
+
+  public void setValue(String value) {
+    this.value = Pattern.compile(value);
+  }
+
+  @SuppressWarnings("unused")
+  public void setActions(List<String> actions) {
+    this.actions = actions;
+  }
+
+  public void validate() {
+    Assert.notNull(resource, "resource cannot be null");
+    if (!List.of(KSQL, CLUSTERCONFIG).contains(this.resource)) {
+      Assert.notNull(value, "permission value can't be empty for resource " + resource);
+    }
+  }
+
+  public void transform() {
+    if (CollectionUtils.isEmpty(actions) || this.actions.stream().noneMatch("ALL"::equalsIgnoreCase)) {
+      return;
+    }
+    this.actions = getActionValues();
+  }
+
+  private List<String> getActionValues() {
+    return switch (this.resource) {
+      case CLUSTERCONFIG -> Arrays.stream(ClusterConfigAction.values()).map(Enum::toString).toList();
+      case TOPIC -> Arrays.stream(TopicAction.values()).map(Enum::toString).toList();
+      case CONSUMER -> Arrays.stream(ConsumerGroupAction.values()).map(Enum::toString).toList();
+      case SCHEMA -> Arrays.stream(SchemaAction.values()).map(Enum::toString).toList();
+      case CONNECT -> Arrays.stream(ConnectAction.values()).map(Enum::toString).toList();
+      case KSQL -> Arrays.stream(KsqlAction.values()).map(Enum::toString).toList();
+    };
+  }
+
+}
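The ALL shortcut mentioned in the commit list is handled by transform(): if the configured actions contain ALL (case-insensitive), the list is replaced with every action of the resource's enum. A minimal fragment showing the effect for a topic permission, with the setters called the way Spring binding would call them:

    // Minimal fragment: "ALL" expands to the full TopicAction set.
    Permission permission = new Permission();
    permission.setResource("topic");          // resolved to Resource.TOPIC
    permission.setValue("orders-.*");         // compiled into a java.util.regex.Pattern
    permission.setActions(List.of("ALL"));

    permission.transform();   // actions become VIEW, CREATE, EDIT, DELETE,
                              // MESSAGES_READ, MESSAGES_PRODUCE, MESSAGES_DELETE
    permission.validate();    // passes: resource is set and value is present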

+ 21 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/Resource.java

@@ -0,0 +1,21 @@
+package com.provectus.kafka.ui.model.rbac;
+
+import org.apache.commons.lang3.EnumUtils;
+import org.jetbrains.annotations.Nullable;
+
+public enum Resource {
+
+  CLUSTERCONFIG,
+  TOPIC,
+  CONSUMER,
+  SCHEMA,
+  CONNECT,
+  KSQL;
+
+  @Nullable
+  public static Resource fromString(String name) {
+    return EnumUtils.getEnum(Resource.class, name);
+  }
+
+
+}

+ 19 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/Role.java

@@ -0,0 +1,19 @@
+package com.provectus.kafka.ui.model.rbac;
+
+import java.util.List;
+import lombok.Data;
+
+@Data
+public class Role {
+
+  String name;
+  List<String> clusters;
+  List<Subject> subjects;
+  List<Permission> permissions;
+
+  public void validate() {
+    permissions.forEach(Permission::transform);
+    permissions.forEach(Permission::validate);
+  }
+
+}

+ 24 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/Subject.java

@@ -0,0 +1,24 @@
+package com.provectus.kafka.ui.model.rbac;
+
+import com.provectus.kafka.ui.model.rbac.provider.Provider;
+import lombok.Getter;
+
+@Getter
+public class Subject {
+
+  Provider provider;
+  String type;
+  String value;
+
+  public void setProvider(String provider) {
+    this.provider = Provider.fromString(provider.toUpperCase());
+  }
+
+  public void setType(String type) {
+    this.type = type;
+  }
+
+  public void setValue(String value) {
+    this.value = value;
+  }
+}

+ 18 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/permission/ClusterConfigAction.java

@@ -0,0 +1,18 @@
+package com.provectus.kafka.ui.model.rbac.permission;
+
+import org.apache.commons.lang3.EnumUtils;
+import org.jetbrains.annotations.Nullable;
+
+public enum ClusterConfigAction implements PermissibleAction {
+
+  VIEW,
+  EDIT
+
+  ;
+
+  @Nullable
+  public static ClusterConfigAction fromString(String name) {
+    return EnumUtils.getEnum(ClusterConfigAction.class, name);
+  }
+
+}

+ 19 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/permission/ConnectAction.java

@@ -0,0 +1,19 @@
+package com.provectus.kafka.ui.model.rbac.permission;
+
+import org.apache.commons.lang3.EnumUtils;
+import org.jetbrains.annotations.Nullable;
+
+public enum ConnectAction implements PermissibleAction {
+
+  VIEW,
+  EDIT,
+  CREATE
+
+  ;
+
+  @Nullable
+  public static ConnectAction fromString(String name) {
+    return EnumUtils.getEnum(ConnectAction.class, name);
+  }
+
+}

+ 20 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/permission/ConsumerGroupAction.java

@@ -0,0 +1,20 @@
+package com.provectus.kafka.ui.model.rbac.permission;
+
+import org.apache.commons.lang3.EnumUtils;
+import org.jetbrains.annotations.Nullable;
+
+public enum ConsumerGroupAction implements PermissibleAction {
+
+  VIEW,
+  DELETE,
+
+  RESET_OFFSETS
+
+  ;
+
+  @Nullable
+  public static ConsumerGroupAction fromString(String name) {
+    return EnumUtils.getEnum(ConsumerGroupAction.class, name);
+  }
+
+}

+ 15 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/permission/KsqlAction.java

@@ -0,0 +1,15 @@
+package com.provectus.kafka.ui.model.rbac.permission;
+
+import org.apache.commons.lang3.EnumUtils;
+import org.jetbrains.annotations.Nullable;
+
+public enum KsqlAction implements PermissibleAction {
+
+  EXECUTE;
+
+  @Nullable
+  public static KsqlAction fromString(String name) {
+    return EnumUtils.getEnum(KsqlAction.class, name);
+  }
+
+}

+ 4 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/permission/PermissibleAction.java

@@ -0,0 +1,4 @@
+package com.provectus.kafka.ui.model.rbac.permission;
+
+public interface PermissibleAction {
+}

+ 21 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/permission/SchemaAction.java

@@ -0,0 +1,21 @@
+package com.provectus.kafka.ui.model.rbac.permission;
+
+import org.apache.commons.lang3.EnumUtils;
+import org.jetbrains.annotations.Nullable;
+
+public enum SchemaAction implements PermissibleAction {
+
+  VIEW,
+  CREATE,
+  DELETE,
+  EDIT,
+  MODIFY_GLOBAL_COMPATIBILITY
+
+  ;
+
+  @Nullable
+  public static SchemaAction fromString(String name) {
+    return EnumUtils.getEnum(SchemaAction.class, name);
+  }
+
+}

+ 24 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/permission/TopicAction.java

@@ -0,0 +1,24 @@
+package com.provectus.kafka.ui.model.rbac.permission;
+
+import org.apache.commons.lang3.EnumUtils;
+import org.jetbrains.annotations.Nullable;
+
+public enum TopicAction implements PermissibleAction {
+
+  VIEW,
+  CREATE,
+  EDIT,
+  DELETE,
+
+  MESSAGES_READ,
+  MESSAGES_PRODUCE,
+  MESSAGES_DELETE,
+
+  ;
+
+  @Nullable
+  public static TopicAction fromString(String name) {
+    return EnumUtils.getEnum(TopicAction.class, name);
+  }
+
+}

+ 27 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/model/rbac/provider/Provider.java

@@ -0,0 +1,27 @@
+package com.provectus.kafka.ui.model.rbac.provider;
+
+import org.apache.commons.lang3.EnumUtils;
+import org.jetbrains.annotations.Nullable;
+
+public enum Provider {
+
+  OAUTH_GOOGLE,
+  OAUTH_GITHUB,
+
+  OAUTH_COGNITO,
+
+  LDAP,
+  LDAP_AD;
+
+  @Nullable
+  public static Provider fromString(String name) {
+    return EnumUtils.getEnum(Provider.class, name);
+  }
+
+  public static class Name {
+    public static String GOOGLE = "google";
+    public static String GITHUB = "github";
+    public static String COGNITO = "cognito";
+  }
+
+}

+ 1 - 1
kafka-ui-api/src/main/java/com/provectus/kafka/ui/serdes/builtin/sr/JsonSchemaSchemaRegistrySerializer.java

@@ -4,7 +4,7 @@ import com.fasterxml.jackson.core.JsonProcessingException;
 import com.fasterxml.jackson.databind.JsonNode;
 import com.fasterxml.jackson.databind.ObjectMapper;
 import com.provectus.kafka.ui.exception.ValidationException;
-import com.provectus.kafka.ui.util.annotations.KafkaClientInternalsDependant;
+import com.provectus.kafka.ui.util.annotation.KafkaClientInternalsDependant;
 import io.confluent.kafka.schemaregistry.ParsedSchema;
 import io.confluent.kafka.schemaregistry.client.SchemaMetadata;
 import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;

+ 1 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ClusterService.java

@@ -38,6 +38,7 @@ public class ClusterService {
   }
 
   public Mono<ClusterMetricsDTO> getClusterMetrics(KafkaCluster cluster) {
+
     return Mono.just(
         clusterMapper.toClusterMetrics(
             statisticsCache.get(cluster).getMetrics()));

+ 11 - 5
kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ConsumerGroupService.java

@@ -5,6 +5,7 @@ import com.provectus.kafka.ui.model.InternalConsumerGroup;
 import com.provectus.kafka.ui.model.InternalTopicConsumerGroup;
 import com.provectus.kafka.ui.model.KafkaCluster;
 import com.provectus.kafka.ui.model.SortOrderDTO;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
 import java.util.ArrayList;
 import java.util.Comparator;
 import java.util.HashMap;
@@ -35,6 +36,7 @@ import reactor.util.function.Tuples;
 public class ConsumerGroupService {
 
   private final AdminClientService adminClientService;
+  private final AccessControlService accessControlService;
 
   private Mono<List<InternalConsumerGroup>> getConsumerGroups(
       ReactiveAdminClient ac,
@@ -107,8 +109,7 @@ public class ConsumerGroupService {
       int perPage,
       @Nullable String search,
       ConsumerGroupOrderingDTO orderBy,
-      SortOrderDTO sortOrderDto
-  ) {
+      SortOrderDTO sortOrderDto) {
     var comparator = sortOrderDto.equals(SortOrderDTO.ASC)
         ? getPaginationComparator(orderBy)
         : getPaginationComparator(orderBy).reversed();
@@ -121,9 +122,14 @@ public class ConsumerGroupService {
                     .skip((long) (page - 1) * perPage)
                     .limit(perPage)
                     .collect(Collectors.toList())
-            ).map(cgs -> new ConsumerGroupsPage(
-                cgs,
-                (descriptions.size() / perPage) + (descriptions.size() % perPage == 0 ? 0 : 1))))
+            )
+                .flatMapMany(Flux::fromIterable)
+                .filterWhen(
+                    cg -> accessControlService.isConsumerGroupAccessible(cg.getGroupId(), cluster.getName()))
+                .collect(Collectors.toList())
+                .map(cgs -> new ConsumerGroupsPage(
+                    cgs,
+                    (descriptions.size() / perPage) + (descriptions.size() % perPage == 0 ? 0 : 1))))
     );
   }
 

+ 6 - 9
kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/KafkaConnectService.java

@@ -52,19 +52,16 @@ public class KafkaConnectService {
   private final KafkaConfigSanitizer kafkaConfigSanitizer;
   private final KafkaConnectClientsFactory kafkaConnectClientsFactory;
 
-  public Mono<Flux<ConnectDTO>> getConnects(KafkaCluster cluster) {
-    return Mono.just(
-        Flux.fromIterable(
-            cluster.getKafkaConnect().stream()
-                .map(clusterMapper::toKafkaConnect)
-                .collect(Collectors.toList())
-        )
-    );
+  public List<ConnectDTO> getConnects(KafkaCluster cluster) {
+    return cluster.getKafkaConnect().stream()
+        .map(clusterMapper::toKafkaConnect)
+        .collect(Collectors.toList());
   }
 
   public Flux<FullConnectorInfoDTO> getAllConnectors(final KafkaCluster cluster,
                                                      final String search) {
-    return getConnects(cluster)
+    Mono<Flux<ConnectDTO>> clusters = Mono.just(Flux.fromIterable(getConnects(cluster))); // TODO get rid
+    return clusters
         .flatMapMany(Function.identity())
         .flatMap(connect -> getConnectorNames(cluster, connect.getName()))
         .flatMap(pair -> getConnector(cluster, pair.getT1(), pair.getT2()))

+ 1 - 1
kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/ReactiveAdminClient.java

@@ -11,7 +11,7 @@ import com.provectus.kafka.ui.exception.NotFoundException;
 import com.provectus.kafka.ui.exception.ValidationException;
 import com.provectus.kafka.ui.util.MapUtil;
 import com.provectus.kafka.ui.util.NumberUtil;
-import com.provectus.kafka.ui.util.annotations.KafkaClientInternalsDependant;
+import com.provectus.kafka.ui.util.annotation.KafkaClientInternalsDependant;
 import java.io.Closeable;
 import java.util.ArrayList;
 import java.util.Arrays;

+ 31 - 35
kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/SchemaRegistryService.java

@@ -24,6 +24,7 @@ import com.provectus.kafka.ui.model.schemaregistry.SubjectIdResponse;
 import com.provectus.kafka.ui.util.SecuredWebClient;
 import java.io.IOException;
 import java.net.URI;
+import java.util.Arrays;
 import java.util.Collections;
 import java.util.Formatter;
 import java.util.List;
@@ -198,17 +199,12 @@ public class SchemaRegistryService {
    * and then returns the whole content by requesting its latest version.
    */
   public Mono<SchemaSubjectDTO> registerNewSchema(KafkaCluster cluster,
-                                                  Mono<NewSchemaSubjectDTO> newSchemaSubject) {
-    return newSchemaSubject
-        .flatMap(schema -> {
-          SchemaTypeDTO schemaType =
-              SchemaTypeDTO.AVRO == schema.getSchemaType() ? null : schema.getSchemaType();
-          Mono<InternalNewSchema> newSchema =
-              Mono.just(new InternalNewSchema(schema.getSchema(), schemaType));
-          String subject = schema.getSubject();
-          return submitNewSchema(subject, newSchema, cluster)
-              .flatMap(resp -> getLatestSchemaVersionBySubject(cluster, subject));
-        });
+                                                  NewSchemaSubjectDTO dto) {
+    SchemaTypeDTO schemaType = SchemaTypeDTO.AVRO == dto.getSchemaType() ? null : dto.getSchemaType();
+    Mono<InternalNewSchema> newSchema = Mono.just(new InternalNewSchema(dto.getSchema(), schemaType));
+    String subject = dto.getSubject();
+    return submitNewSchema(subject, newSchema, cluster)
+        .flatMap(resp -> getLatestSchemaVersionBySubject(cluster, subject));
   }
 
   @NotNull
@@ -258,18 +254,18 @@ public class SchemaRegistryService {
                                               Mono<CompatibilityLevelDTO> compatibilityLevel) {
     String configEndpoint = Objects.isNull(schemaName) ? "/config" : "/config/{schemaName}";
     return configuredWebClient(
-            cluster,
-            HttpMethod.PUT,
-            configEndpoint,
+        cluster,
+        HttpMethod.PUT,
+        configEndpoint,
         schemaName)
-            .contentType(MediaType.APPLICATION_JSON)
-            .body(BodyInserters.fromPublisher(compatibilityLevel, CompatibilityLevelDTO.class))
-            .retrieve()
-            .onStatus(NOT_FOUND::equals,
-                throwIfNotFoundStatus(formatted(NO_SUCH_SCHEMA, schemaName)))
-            .bodyToMono(Void.class)
-            .as(m -> failoverAble(m, new FailoverMono<>(cluster.getSchemaRegistry(),
-                () -> this.updateSchemaCompatibility(cluster, schemaName, compatibilityLevel))));
+        .contentType(MediaType.APPLICATION_JSON)
+        .body(BodyInserters.fromPublisher(compatibilityLevel, CompatibilityLevelDTO.class))
+        .retrieve()
+        .onStatus(NOT_FOUND::equals,
+            throwIfNotFoundStatus(formatted(NO_SUCH_SCHEMA, schemaName)))
+        .bodyToMono(Void.class)
+        .as(m -> failoverAble(m, new FailoverMono<>(cluster.getSchemaRegistry(),
+            () -> this.updateSchemaCompatibility(cluster, schemaName, compatibilityLevel))));
   }
 
   public Mono<Void> updateSchemaCompatibility(KafkaCluster cluster,
@@ -278,7 +274,7 @@ public class SchemaRegistryService {
   }
 
   public Mono<InternalCompatibilityLevel> getSchemaCompatibilityLevel(KafkaCluster cluster,
-                                                                 String schemaName) {
+                                                                      String schemaName) {
     String globalConfig = Objects.isNull(schemaName) ? "/config" : "/config/{schemaName}";
     final var values = new LinkedMultiValueMap<String, String>();
     values.add("defaultToGlobal", "true");
@@ -298,7 +294,7 @@ public class SchemaRegistryService {
   }
 
   private Mono<InternalCompatibilityLevel> getSchemaCompatibilityInfoOrGlobal(KafkaCluster cluster,
-                                                                         String schemaName) {
+                                                                              String schemaName) {
     return this.getSchemaCompatibilityLevel(cluster, schemaName)
         .switchIfEmpty(this.getGlobalSchemaCompatibilityLevel(cluster));
   }
@@ -306,18 +302,18 @@ public class SchemaRegistryService {
   public Mono<InternalCompatibilityCheck> checksSchemaCompatibility(
       KafkaCluster cluster, String schemaName, Mono<NewSchemaSubjectDTO> newSchemaSubject) {
     return configuredWebClient(
-            cluster,
-            HttpMethod.POST,
-            "/compatibility/subjects/{schemaName}/versions/latest",
+        cluster,
+        HttpMethod.POST,
+        "/compatibility/subjects/{schemaName}/versions/latest",
         schemaName)
-            .contentType(MediaType.APPLICATION_JSON)
-            .body(BodyInserters.fromPublisher(newSchemaSubject, NewSchemaSubjectDTO.class))
-            .retrieve()
-            .onStatus(NOT_FOUND::equals,
-                throwIfNotFoundStatus(formatted(NO_SUCH_SCHEMA, schemaName)))
-            .bodyToMono(InternalCompatibilityCheck.class)
-            .as(m -> failoverAble(m, new FailoverMono<>(cluster.getSchemaRegistry(),
-                () -> this.checksSchemaCompatibility(cluster, schemaName, newSchemaSubject))));
+        .contentType(MediaType.APPLICATION_JSON)
+        .body(BodyInserters.fromPublisher(newSchemaSubject, NewSchemaSubjectDTO.class))
+        .retrieve()
+        .onStatus(NOT_FOUND::equals,
+            throwIfNotFoundStatus(formatted(NO_SUCH_SCHEMA, schemaName)))
+        .bodyToMono(InternalCompatibilityCheck.class)
+        .as(m -> failoverAble(m, new FailoverMono<>(cluster.getSchemaRegistry(),
+            () -> this.checksSchemaCompatibility(cluster, schemaName, newSchemaSubject))));
   }
 
   public String formatted(String str, Object... args) {

+ 6 - 2
kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/TopicsService.java

@@ -460,8 +460,12 @@ public class TopicsService {
   }
 
   private Mono<List<String>> filterExisting(KafkaCluster cluster, Collection<String> topics) {
-    return adminClientService.get(cluster).flatMap(ac -> ac.listTopics(true))
-        .map(existing -> existing.stream().filter(topics::contains).collect(toList()));
+    return adminClientService.get(cluster)
+        .flatMap(ac -> ac.listTopics(true))
+        .map(existing -> existing
+            .stream()
+            .filter(topics::contains)
+            .collect(toList()));
   }
 
 }

+ 31 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/AbstractProviderCondition.java

@@ -0,0 +1,31 @@
+package com.provectus.kafka.ui.service.rbac;
+
+import com.provectus.kafka.ui.config.auth.OAuthProperties;
+import java.util.Map;
+import java.util.Objects;
+import java.util.Set;
+import java.util.function.Predicate;
+import java.util.stream.Collectors;
+import org.apache.commons.lang3.StringUtils;
+import org.springframework.boot.context.properties.bind.Bindable;
+import org.springframework.boot.context.properties.bind.Binder;
+import org.springframework.core.env.Environment;
+
+public abstract class AbstractProviderCondition {
+  private static final Bindable<Map<String, OAuthProperties.OAuth2Provider>> OAUTH2_PROPERTIES = Bindable
+      .mapOf(String.class, OAuthProperties.OAuth2Provider.class);
+
+  protected Set<String> getRegisteredProvidersTypes(final Environment env) {
+    final Map<String, OAuthProperties.OAuth2Provider> properties = Binder.get(env)
+        .bind("auth.oauth2.client", OAUTH2_PROPERTIES)
+        .orElse(Map.of());
+    return properties.values().stream()
+        .map(OAuthProperties.OAuth2Provider::getCustomParams)
+        .filter(Objects::nonNull)
+        .filter(Predicate.not(Map::isEmpty))
+        .map(params -> params.get("type"))
+        .filter(Objects::nonNull)
+        .filter(StringUtils::isNotEmpty)
+        .collect(Collectors.toSet());
+  }
+}

+ 398 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/AccessControlService.java

@@ -0,0 +1,398 @@
+package com.provectus.kafka.ui.service.rbac;
+
+import com.provectus.kafka.ui.config.auth.AuthenticatedUser;
+import com.provectus.kafka.ui.config.auth.RbacUser;
+import com.provectus.kafka.ui.config.auth.RoleBasedAccessControlProperties;
+import com.provectus.kafka.ui.model.ClusterDTO;
+import com.provectus.kafka.ui.model.ConnectDTO;
+import com.provectus.kafka.ui.model.InternalTopic;
+import com.provectus.kafka.ui.model.rbac.AccessContext;
+import com.provectus.kafka.ui.model.rbac.Permission;
+import com.provectus.kafka.ui.model.rbac.Resource;
+import com.provectus.kafka.ui.model.rbac.Role;
+import com.provectus.kafka.ui.model.rbac.permission.ConnectAction;
+import com.provectus.kafka.ui.model.rbac.permission.ConsumerGroupAction;
+import com.provectus.kafka.ui.model.rbac.permission.SchemaAction;
+import com.provectus.kafka.ui.model.rbac.permission.TopicAction;
+import com.provectus.kafka.ui.service.rbac.extractor.CognitoAuthorityExtractor;
+import com.provectus.kafka.ui.service.rbac.extractor.GithubAuthorityExtractor;
+import com.provectus.kafka.ui.service.rbac.extractor.GoogleAuthorityExtractor;
+import com.provectus.kafka.ui.service.rbac.extractor.LdapAuthorityExtractor;
+import com.provectus.kafka.ui.service.rbac.extractor.ProviderAuthorityExtractor;
+import java.util.Collections;
+import java.util.List;
+import java.util.Set;
+import java.util.function.Predicate;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+import javax.annotation.Nullable;
+import javax.annotation.PostConstruct;
+import lombok.RequiredArgsConstructor;
+import lombok.extern.slf4j.Slf4j;
+import org.apache.commons.collections.CollectionUtils;
+import org.apache.commons.lang3.StringUtils;
+import org.springframework.boot.context.properties.EnableConfigurationProperties;
+import org.springframework.security.access.AccessDeniedException;
+import org.springframework.security.core.context.ReactiveSecurityContextHolder;
+import org.springframework.security.core.context.SecurityContext;
+import org.springframework.security.oauth2.client.registration.InMemoryReactiveClientRegistrationRepository;
+import org.springframework.stereotype.Service;
+import org.springframework.util.Assert;
+import reactor.core.publisher.Mono;
+
+@Service
+@RequiredArgsConstructor
+@EnableConfigurationProperties(RoleBasedAccessControlProperties.class)
+@Slf4j
+public class AccessControlService {
+
+  @Nullable
+  private final InMemoryReactiveClientRegistrationRepository clientRegistrationRepository;
+
+  private boolean rbacEnabled = false;
+  private Set<ProviderAuthorityExtractor> extractors = Collections.emptySet();
+  private final RoleBasedAccessControlProperties properties;
+
+  @PostConstruct
+  public void init() {
+    if (properties.getRoles().isEmpty()) {
+      log.trace("No roles provided, disabling RBAC");
+      return;
+    }
+    rbacEnabled = true;
+
+    this.extractors = properties.getRoles()
+        .stream()
+        .map(role -> role.getSubjects()
+            .stream()
+            .map(provider -> switch (provider.getProvider()) {
+              case OAUTH_COGNITO -> new CognitoAuthorityExtractor();
+              case OAUTH_GOOGLE -> new GoogleAuthorityExtractor();
+              case OAUTH_GITHUB -> new GithubAuthorityExtractor();
+              case LDAP, LDAP_AD -> new LdapAuthorityExtractor();
+            }).collect(Collectors.toSet()))
+        .flatMap(Set::stream)
+        .collect(Collectors.toSet());
+
+    if ((clientRegistrationRepository == null || !clientRegistrationRepository.iterator().hasNext())
+        && !properties.getRoles().isEmpty()) {
+      log.error("Roles are configured but no authentication methods are present. Authentication might fail.");
+    }
+  }
+
+  public Mono<Void> validateAccess(AccessContext context) {
+    if (!rbacEnabled) {
+      return Mono.empty();
+    }
+
+    return getUser()
+        .doOnNext(user -> {
+          boolean accessGranted =
+              isClusterAccessible(context, user)
+                  && isClusterConfigAccessible(context, user)
+                  && isTopicAccessible(context, user)
+                  && isConsumerGroupAccessible(context, user)
+                  && isConnectAccessible(context, user)
+                  && isConnectorAccessible(context, user) // TODO connector selectors
+                  && isSchemaAccessible(context, user)
+                  && isKsqlAccessible(context, user);
+
+          if (!accessGranted) {
+            throw new AccessDeniedException("Access denied");
+          }
+        })
+        .then();
+  }
+
+  public Mono<AuthenticatedUser> getUser() {
+    return ReactiveSecurityContextHolder.getContext()
+        .map(SecurityContext::getAuthentication)
+        .filter(authentication -> authentication.getPrincipal() instanceof RbacUser)
+        .map(authentication -> ((RbacUser) authentication.getPrincipal()))
+        .map(user -> new AuthenticatedUser(user.name(), user.groups()));
+  }
+
+  private boolean isClusterAccessible(AccessContext context, AuthenticatedUser user) {
+    if (!rbacEnabled) {
+      return true;
+    }
+
+    Assert.isTrue(StringUtils.isNotEmpty(context.getCluster()), "cluster value is empty");
+
+    return properties.getRoles()
+        .stream()
+        .filter(filterRole(user))
+        .anyMatch(filterCluster(context.getCluster()));
+  }
+
+  public Mono<Boolean> isClusterAccessible(ClusterDTO cluster) {
+    if (!rbacEnabled) {
+      return Mono.just(true);
+    }
+
+    AccessContext accessContext = AccessContext
+        .builder()
+        .cluster(cluster.getName())
+        .build();
+
+    return getUser().map(u -> isClusterAccessible(accessContext, u));
+  }
+
+  public boolean isClusterConfigAccessible(AccessContext context, AuthenticatedUser user) {
+    if (!rbacEnabled) {
+      return true;
+    }
+
+    if (CollectionUtils.isEmpty(context.getClusterConfigActions())) {
+      return true;
+    }
+    Assert.isTrue(StringUtils.isNotEmpty(context.getCluster()), "cluster value is empty");
+
+    Set<String> requiredActions = context.getClusterConfigActions()
+        .stream()
+        .map(a -> a.toString().toUpperCase())
+        .collect(Collectors.toSet());
+
+    return isAccessible(Resource.CLUSTERCONFIG, context.getCluster(), user, context, requiredActions);
+  }
+
+  public boolean isTopicAccessible(AccessContext context, AuthenticatedUser user) {
+    if (!rbacEnabled) {
+      return true;
+    }
+
+    if (context.getTopic() == null && context.getTopicActions().isEmpty()) {
+      return true;
+    }
+    Assert.isTrue(!context.getTopicActions().isEmpty(), "actions are empty");
+
+    Set<String> requiredActions = context.getTopicActions()
+        .stream()
+        .map(a -> a.toString().toUpperCase())
+        .collect(Collectors.toSet());
+
+    return isAccessible(Resource.TOPIC, context.getTopic(), user, context, requiredActions);
+  }
+
+  public Mono<Boolean> isTopicAccessible(InternalTopic dto, String clusterName) {
+    if (!rbacEnabled) {
+      return Mono.just(true);
+    }
+
+    AccessContext accessContext = AccessContext
+        .builder()
+        .cluster(clusterName)
+        .topic(dto.getName())
+        .topicActions(TopicAction.VIEW)
+        .build();
+
+    return getUser().map(u -> isTopicAccessible(accessContext, u));
+  }
+
+  private boolean isConsumerGroupAccessible(AccessContext context, AuthenticatedUser user) {
+    if (!rbacEnabled) {
+      return true;
+    }
+
+    if (context.getConsumerGroup() == null && context.getConsumerGroupActions().isEmpty()) {
+      return true;
+    }
+    Assert.isTrue(!context.getConsumerGroupActions().isEmpty(), "actions are empty");
+
+    Set<String> requiredActions = context.getConsumerGroupActions()
+        .stream()
+        .map(a -> a.toString().toUpperCase())
+        .collect(Collectors.toSet());
+
+    return isAccessible(Resource.CONSUMER, context.getConsumerGroup(), user, context, requiredActions);
+  }
+
+  public Mono<Boolean> isConsumerGroupAccessible(String groupId, String clusterName) {
+    if (!rbacEnabled) {
+      return Mono.just(true);
+    }
+
+    AccessContext accessContext = AccessContext
+        .builder()
+        .cluster(clusterName)
+        .consumerGroup(groupId)
+        .consumerGroupActions(ConsumerGroupAction.VIEW)
+        .build();
+
+    return getUser().map(u -> isConsumerGroupAccessible(accessContext, u));
+  }
+
+  public boolean isSchemaAccessible(AccessContext context, AuthenticatedUser user) {
+    if (!rbacEnabled) {
+      return true;
+    }
+
+    if (context.getSchema() == null && context.getSchemaActions().isEmpty()) {
+      return true;
+    }
+    Assert.isTrue(!context.getSchemaActions().isEmpty(), "actions are empty");
+
+    Set<String> requiredActions = context.getSchemaActions()
+        .stream()
+        .map(a -> a.toString().toUpperCase())
+        .collect(Collectors.toSet());
+
+    return isAccessible(Resource.SCHEMA, context.getSchema(), user, context, requiredActions);
+  }
+
+  public Mono<Boolean> isSchemaAccessible(String schema, String clusterName) {
+    if (!rbacEnabled) {
+      return Mono.just(true);
+    }
+
+    AccessContext accessContext = AccessContext
+        .builder()
+        .cluster(clusterName)
+        .schema(schema)
+        .schemaActions(SchemaAction.VIEW)
+        .build();
+
+    return getUser().map(u -> isSchemaAccessible(accessContext, u));
+  }
+
+  public boolean isConnectAccessible(AccessContext context, AuthenticatedUser user) {
+    if (!rbacEnabled) {
+      return true;
+    }
+
+    if (context.getConnect() == null && context.getConnectActions().isEmpty()) {
+      return true;
+    }
+    Assert.isTrue(!context.getConnectActions().isEmpty(), "actions are empty");
+
+    Set<String> requiredActions = context.getConnectActions()
+        .stream()
+        .map(a -> a.toString().toUpperCase())
+        .collect(Collectors.toSet());
+
+    return isAccessible(Resource.CONNECT, context.getConnect(), user, context, requiredActions);
+  }
+
+  public Mono<Boolean> isConnectAccessible(ConnectDTO dto, String clusterName) {
+    if (!rbacEnabled) {
+      return Mono.just(true);
+    }
+
+    return isConnectAccessible(dto.getName(), clusterName);
+  }
+
+  public Mono<Boolean> isConnectAccessible(String connectName, String clusterName) {
+    if (!rbacEnabled) {
+      return Mono.just(true);
+    }
+
+    AccessContext accessContext = AccessContext
+        .builder()
+        .cluster(clusterName)
+        .connect(connectName)
+        .connectActions(ConnectAction.VIEW)
+        .build();
+
+    return getUser().map(u -> isConnectAccessible(accessContext, u));
+  }
+
+  public boolean isConnectorAccessible(AccessContext context, AuthenticatedUser user) {
+    if (!rbacEnabled) {
+      return true;
+    }
+
+    return isConnectAccessible(context, user);
+  }
+
+  public Mono<Boolean> isConnectorAccessible(String connectName, String connectorName, String clusterName) {
+    if (!rbacEnabled) {
+      return Mono.just(true);
+    }
+
+    AccessContext accessContext = AccessContext
+        .builder()
+        .cluster(clusterName)
+        .connect(connectName)
+        .connectActions(ConnectAction.VIEW)
+        .connector(connectorName)
+        .build();
+
+    return getUser().map(u -> isConnectorAccessible(accessContext, u));
+  }
+
+  private boolean isKsqlAccessible(AccessContext context, AuthenticatedUser user) {
+    if (!rbacEnabled) {
+      return true;
+    }
+
+    if (context.getKsqlActions().isEmpty()) {
+      return true;
+    }
+
+    Set<String> requiredActions = context.getKsqlActions()
+        .stream()
+        .map(a -> a.toString().toUpperCase())
+        .collect(Collectors.toSet());
+
+    return isAccessible(Resource.KSQL, null, user, context, requiredActions);
+  }
+
+  public Set<ProviderAuthorityExtractor> getExtractors() {
+    return extractors;
+  }
+
+  public List<Role> getRoles() {
+    if (!rbacEnabled) {
+      return Collections.emptyList();
+    }
+    return Collections.unmodifiableList(properties.getRoles());
+  }
+
+  private boolean isAccessible(Resource resource, String resourceValue,
+                               AuthenticatedUser user, AccessContext context, Set<String> requiredActions) {
+    Set<String> grantedActions = properties.getRoles()
+        .stream()
+        .filter(filterRole(user))
+        .filter(filterCluster(context.getCluster()))
+        .flatMap(grantedRole -> grantedRole.getPermissions().stream())
+        .filter(filterResource(resource))
+        .filter(filterResourceValue(resourceValue))
+        .flatMap(grantedPermission -> grantedPermission.getActions().stream())
+        .map(String::toUpperCase)
+        .collect(Collectors.toSet());
+
+    return grantedActions.containsAll(requiredActions);
+  }
+
+  private Predicate<Role> filterRole(AuthenticatedUser user) {
+    return role -> user.groups().contains(role.getName());
+  }
+
+  private Predicate<Role> filterCluster(String cluster) {
+    return grantedRole -> grantedRole.getClusters()
+        .stream()
+        .anyMatch(cluster::equalsIgnoreCase);
+  }
+
+  private Predicate<Permission> filterResource(Resource resource) {
+    return grantedPermission -> resource == grantedPermission.getResource();
+  }
+
+  private Predicate<Permission> filterResourceValue(String resourceValue) {
+
+    if (resourceValue == null) {
+      return grantedPermission -> true;
+    }
+    return grantedPermission -> {
+      Pattern value = grantedPermission.getValue();
+      if (value == null) {
+        return true;
+      }
+      return value.matcher(resourceValue).matches();
+    };
+  }
+
+  public boolean isRbacEnabled() {
+    return rbacEnabled;
+  }
+}
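One detail of the matching logic above worth spelling out: filterResourceValue calls Pattern.matcher(...).matches(), so a permission value is a regular expression that must match the entire resource name, not a substring. For example (topic names are hypothetical):

    Pattern value = Pattern.compile("orders-.*");        // as produced by Permission.setValue
    value.matcher("orders-payments").matches();          // true  -> permission applies
    value.matcher("audit-orders-payments").matches();    // false -> matches() requires a full match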

+ 70 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/extractor/CognitoAuthorityExtractor.java

@@ -0,0 +1,70 @@
+package com.provectus.kafka.ui.service.rbac.extractor;
+
+import com.nimbusds.jose.shaded.json.JSONArray;
+import com.provectus.kafka.ui.model.rbac.Role;
+import com.provectus.kafka.ui.model.rbac.provider.Provider;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+import lombok.extern.slf4j.Slf4j;
+import org.springframework.security.oauth2.core.user.DefaultOAuth2User;
+import reactor.core.publisher.Mono;
+
+@Slf4j
+public class CognitoAuthorityExtractor implements ProviderAuthorityExtractor {
+
+  private static final String COGNITO_GROUPS_ATTRIBUTE_NAME = "cognito:groups";
+
+  @Override
+  public boolean isApplicable(String provider) {
+    return Provider.Name.COGNITO.equalsIgnoreCase(provider);
+  }
+
+  @Override
+  public Mono<Set<String>> extract(AccessControlService acs, Object value, Map<String, Object> additionalParams) {
+    log.debug("Extracting cognito user authorities");
+
+    DefaultOAuth2User principal;
+    try {
+      principal = (DefaultOAuth2User) value;
+    } catch (ClassCastException e) {
+      log.error("Can't cast value to DefaultOAuth2User", e);
+      throw new RuntimeException();
+    }
+
+    Set<String> groupsByUsername = acs.getRoles()
+        .stream()
+        .filter(r -> r.getSubjects()
+            .stream()
+            .filter(s -> s.getProvider().equals(Provider.OAUTH_COGNITO))
+            .filter(s -> s.getType().equals("user"))
+            .anyMatch(s -> s.getValue().equals(principal.getName())))
+        .map(Role::getName)
+        .collect(Collectors.toSet());
+
+    JSONArray groups = principal.getAttribute(COGNITO_GROUPS_ATTRIBUTE_NAME);
+    if (groups == null) {
+      log.debug("Cognito groups param is not present");
+      return Mono.just(groupsByUsername);
+    }
+
+    Set<String> groupsByGroups = acs.getRoles()
+        .stream()
+        .filter(role -> role.getSubjects()
+            .stream()
+            .filter(s -> s.getProvider().equals(Provider.OAUTH_COGNITO))
+            .filter(s -> s.getType().equals("group"))
+            .anyMatch(subject -> Stream.of(groups.toArray())
+                .map(Object::toString)
+                .distinct()
+                .anyMatch(cognitoGroup -> cognitoGroup.equals(subject.getValue()))
+            ))
+        .map(Role::getName)
+        .collect(Collectors.toSet());
+
+    return Mono.just(Stream.concat(groupsByUsername.stream(), groupsByGroups.stream()).collect(Collectors.toSet()));
+  }
+
+}

+ 99 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/extractor/GithubAuthorityExtractor.java

@@ -0,0 +1,99 @@
+package com.provectus.kafka.ui.service.rbac.extractor;
+
+import com.provectus.kafka.ui.model.rbac.Role;
+import com.provectus.kafka.ui.model.rbac.provider.Provider;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+import lombok.extern.slf4j.Slf4j;
+import org.springframework.core.ParameterizedTypeReference;
+import org.springframework.http.HttpHeaders;
+import org.springframework.security.oauth2.client.userinfo.OAuth2UserRequest;
+import org.springframework.security.oauth2.core.user.DefaultOAuth2User;
+import org.springframework.web.reactive.function.client.WebClient;
+import reactor.core.publisher.Mono;
+
+@Slf4j
+public class GithubAuthorityExtractor implements ProviderAuthorityExtractor {
+
+  private static final String ORGANIZATION_ATTRIBUTE_NAME = "organizations_url";
+  private static final String USERNAME_ATTRIBUTE_NAME = "login";
+  private static final String ORGANIZATION_NAME = "login";
+  private static final String GITHUB_ACCEPT_HEADER = "application/vnd.github+json";
+
+  private final WebClient webClient = WebClient.create("https://api.github.com");
+
+  @Override
+  public boolean isApplicable(String provider) {
+    return Provider.Name.GITHUB.equalsIgnoreCase(provider);
+  }
+
+  @Override
+  public Mono<Set<String>> extract(AccessControlService acs, Object value, Map<String, Object> additionalParams) {
+    DefaultOAuth2User principal;
+    try {
+      principal = (DefaultOAuth2User) value;
+    } catch (ClassCastException e) {
+      log.error("Can't cast value to DefaultOAuth2User", e);
+      throw new RuntimeException();
+    }
+
+    Set<String> groupsByUsername = new HashSet<>();
+    String username = principal.getAttribute(USERNAME_ATTRIBUTE_NAME);
+    if (username == null) {
+      log.debug("Github username param is not present");
+    } else {
+      acs.getRoles()
+          .stream()
+          .filter(r -> r.getSubjects()
+              .stream()
+              .filter(s -> s.getProvider().equals(Provider.OAUTH_GITHUB))
+              .filter(s -> s.getType().equals("user"))
+              .anyMatch(s -> s.getValue().equals(username)))
+          .map(Role::getName)
+          .forEach(groupsByUsername::add);
+    }
+
+    String organization = principal.getAttribute(ORGANIZATION_ATTRIBUTE_NAME);
+    if (organization == null) {
+      log.debug("Github organization param is not present");
+      return Mono.just(groupsByUsername);
+    }
+
+    final Mono<List<Map<String, Object>>> userOrganizations = webClient
+        .get()
+        .uri("/user/orgs")
+        .headers(headers -> {
+          headers.set(HttpHeaders.ACCEPT, GITHUB_ACCEPT_HEADER);
+          OAuth2UserRequest request = (OAuth2UserRequest) additionalParams.get("request");
+          headers.setBearerAuth(request.getAccessToken().getTokenValue());
+        })
+        .retrieve()
+        //@formatter:off
+        .bodyToMono(new ParameterizedTypeReference<>() {});
+    //@formatter:on
+
+    return userOrganizations
+        .map(orgsMap -> {
+          var groupsByOrg = acs.getRoles()
+              .stream()
+              .filter(role -> role.getSubjects()
+                  .stream()
+                  .filter(s -> s.getProvider().equals(Provider.OAUTH_GITHUB))
+                  .filter(s -> s.getType().equals("organization"))
+                  .anyMatch(subject -> orgsMap.stream()
+                      .map(org -> org.get(ORGANIZATION_NAME).toString())
+                      .distinct()
+                      .anyMatch(orgName -> orgName.equalsIgnoreCase(subject.getValue()))
+                  ))
+              .map(Role::getName);
+
+          return Stream.concat(groupsByOrg, groupsByUsername.stream()).collect(Collectors.toSet());
+        });
+  }
+
+}

+ 69 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/extractor/GoogleAuthorityExtractor.java

@@ -0,0 +1,69 @@
+package com.provectus.kafka.ui.service.rbac.extractor;
+
+import com.provectus.kafka.ui.model.rbac.Role;
+import com.provectus.kafka.ui.model.rbac.provider.Provider;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+import lombok.extern.slf4j.Slf4j;
+import org.springframework.security.oauth2.core.user.DefaultOAuth2User;
+import reactor.core.publisher.Mono;
+
+@Slf4j
+public class GoogleAuthorityExtractor implements ProviderAuthorityExtractor {
+
+  private static final String GOOGLE_DOMAIN_ATTRIBUTE_NAME = "hd";
+  public static final String EMAIL_ATTRIBUTE_NAME = "email";
+
+  @Override
+  public boolean isApplicable(String provider) {
+    return Provider.Name.GOOGLE.equalsIgnoreCase(provider);
+  }
+
+  @Override
+  public Mono<Set<String>> extract(AccessControlService acs, Object value, Map<String, Object> additionalParams) {
+    log.debug("Extracting google user authorities");
+
+    DefaultOAuth2User principal;
+    try {
+      principal = (DefaultOAuth2User) value;
+    } catch (ClassCastException e) {
+      log.error("Can't cast value to DefaultOAuth2User", e);
+      throw new RuntimeException();
+    }
+
+    Set<String> groupsByUsername = acs.getRoles()
+        .stream()
+        .filter(r -> r.getSubjects()
+            .stream()
+            .filter(s -> s.getProvider().equals(Provider.OAUTH_GOOGLE))
+            .filter(s -> s.getType().equals("user"))
+            .anyMatch(s -> s.getValue().equals(principal.getAttribute(EMAIL_ATTRIBUTE_NAME))))
+        .map(Role::getName)
+        .collect(Collectors.toSet());
+
+
+    String domain = principal.getAttribute(GOOGLE_DOMAIN_ATTRIBUTE_NAME);
+    if (domain == null) {
+      log.debug("Google domain param is not present");
+      return Mono.just(groupsByUsername);
+    }
+
+    List<String> groupsByDomain = acs.getRoles()
+        .stream()
+        .filter(r -> r.getSubjects()
+            .stream()
+            .filter(s -> s.getProvider().equals(Provider.OAUTH_GOOGLE))
+            .filter(s -> s.getType().equals("domain"))
+            .anyMatch(s -> s.getValue().equals(domain)))
+        .map(Role::getName)
+        .toList();
+
+    return Mono.just(Stream.concat(groupsByUsername.stream(), groupsByDomain.stream())
+        .collect(Collectors.toSet()));
+  }
+
+}

+ 23 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/extractor/LdapAuthorityExtractor.java

@@ -0,0 +1,23 @@
+package com.provectus.kafka.ui.service.rbac.extractor;
+
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
+import java.util.Collections;
+import java.util.Map;
+import java.util.Set;
+import lombok.extern.slf4j.Slf4j;
+import reactor.core.publisher.Mono;
+
+@Slf4j
+public class LdapAuthorityExtractor implements ProviderAuthorityExtractor {
+
+  @Override
+  public boolean isApplicable(String provider) {
+    return false; // TODO #2752
+  }
+
+  @Override
+  public Mono<Set<String>> extract(AccessControlService acs, Object value, Map<String, Object> additionalParams) {
+    return Mono.just(Collections.emptySet()); // TODO #2752
+  }
+
+}

+ 31 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/extractor/OauthAuthorityExtractor.java

@@ -0,0 +1,31 @@
+package com.provectus.kafka.ui.service.rbac.extractor;
+
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
+import java.util.Map;
+import java.util.Set;
+import lombok.extern.slf4j.Slf4j;
+import org.springframework.security.oauth2.core.user.DefaultOAuth2User;
+import reactor.core.publisher.Mono;
+
+@Slf4j
+public class OauthAuthorityExtractor implements ProviderAuthorityExtractor {
+
+  @Override
+  public boolean isApplicable(String provider) {
+    return false; // TODO #2844
+  }
+
+  @Override
+  public Mono<Set<String>> extract(AccessControlService acs, Object value, Map<String, Object> additionalParams) {
+    DefaultOAuth2User principal;
+    try {
+      principal = (DefaultOAuth2User) value;
+    } catch (ClassCastException e) {
+      log.error("Can't cast value to DefaultOAuth2User", e);
+      throw new RuntimeException();
+    }
+
+    return Mono.just(Set.of(principal.getName())); // TODO #2844
+  }
+
+}

+ 14 - 0
kafka-ui-api/src/main/java/com/provectus/kafka/ui/service/rbac/extractor/ProviderAuthorityExtractor.java

@@ -0,0 +1,14 @@
+package com.provectus.kafka.ui.service.rbac.extractor;
+
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
+import java.util.Map;
+import java.util.Set;
+import reactor.core.publisher.Mono;
+
+public interface ProviderAuthorityExtractor {
+
+  boolean isApplicable(String provider);
+
+  Mono<Set<String>> extract(AccessControlService acs, Object value, Map<String, Object> additionalParams);
+
+}
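How these extractors get invoked is outside this excerpt (presumably during OAuth login handling), so the following is only a hypothetical wiring sketch; oauthUser and userRequest are assumed variables, not code from this PR:

    // Hypothetical sketch: pick the extractor for the current provider and resolve role names.
    Mono<Set<String>> groups = Flux.fromIterable(accessControlService.getExtractors())
        .filter(extractor -> extractor.isApplicable(Provider.Name.GITHUB))
        .next()
        .flatMap(extractor -> extractor.extract(accessControlService, oauthUser,
            Map.of("request", userRequest)))
        .defaultIfEmpty(Set.of());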

+ 1 - 1
kafka-ui-api/src/main/java/com/provectus/kafka/ui/util/annotations/KafkaClientInternalsDependant.java → kafka-ui-api/src/main/java/com/provectus/kafka/ui/util/annotation/KafkaClientInternalsDependant.java

@@ -1,4 +1,4 @@
-package com.provectus.kafka.ui.util.annotations;
+package com.provectus.kafka.ui.util.annotation;
 
 /**
  * All code places that depend on kafka-client's internals or implementation-specific logic

+ 21 - 1
kafka-ui-api/src/main/resources/application-local.yml

@@ -34,7 +34,27 @@ kafka:
 spring:
   jmx:
     enabled: true
+  security:
+    oauth2:
+      client:
+        registration:
+          cognito:
+            clientId: xx
+            clientSecret: yy
+            scope: openid
+            client-name: cognito
+            provider: cognito
+            redirect-uri: http://localhost:8080/login/oauth2/code/cognito
+            authorization-grant-type: authorization_code
+        provider:
+          cognito:
+            issuer-uri: https://cognito-idp.eu-central-1.amazonaws.com/eu-central-1_M7cIUn1nj
+            jwk-set-uri: https://cognito-idp.eu-central-1.amazonaws.com/eu-central-1_M7cIUn1nj/.well-known/jwks.json
+            user-name-attribute: username
 auth:
   type: DISABLED
+
+roles.file: /tmp/roles.yml
+
 #server:
-#  port: 8080 #- Port in which kafka-ui will run.
+#  port: 8080 #- Port in which kafka-ui will run.

+ 38 - 32
kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/SchemaRegistryPaginationTest.java

@@ -11,10 +11,13 @@ import com.provectus.kafka.ui.mapper.ClusterMapper;
 import com.provectus.kafka.ui.model.InternalSchemaRegistry;
 import com.provectus.kafka.ui.model.KafkaCluster;
 import com.provectus.kafka.ui.model.SchemaSubjectDTO;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
+import com.provectus.kafka.ui.util.AccessControlServiceMock;
 import java.util.Comparator;
 import java.util.Optional;
 import java.util.stream.IntStream;
 import org.junit.jupiter.api.Test;
+import org.springframework.test.util.ReflectionTestUtils;
 import reactor.core.publisher.Mono;
 
 public class SchemaRegistryPaginationTest {
@@ -24,55 +27,58 @@ public class SchemaRegistryPaginationTest {
   private final SchemaRegistryService schemaRegistryService = mock(SchemaRegistryService.class);
   private final ClustersStorage clustersStorage = mock(ClustersStorage.class);
   private final ClusterMapper clusterMapper = mock(ClusterMapper.class);
+  private final AccessControlService accessControlService = new AccessControlServiceMock().getMock();
 
-  private final SchemasController controller = new SchemasController(clusterMapper, schemaRegistryService);
+  private final SchemasController controller
+      = new SchemasController(clusterMapper, schemaRegistryService, accessControlService);
 
   private void init(String[] subjects) {
     when(schemaRegistryService.getAllSubjectNames(isA(KafkaCluster.class)))
-                .thenReturn(Mono.just(subjects));
+        .thenReturn(Mono.just(subjects));
     when(schemaRegistryService
-            .getAllLatestVersionSchemas(isA(KafkaCluster.class), anyList())).thenCallRealMethod();
+        .getAllLatestVersionSchemas(isA(KafkaCluster.class), anyList())).thenCallRealMethod();
     when(clustersStorage.getClusterByName(isA(String.class)))
-            .thenReturn(Optional.of(buildKafkaCluster(LOCAL_KAFKA_CLUSTER_NAME)));
+        .thenReturn(Optional.of(buildKafkaCluster(LOCAL_KAFKA_CLUSTER_NAME)));
     when(schemaRegistryService.getLatestSchemaVersionBySubject(isA(KafkaCluster.class), isA(String.class)))
-            .thenAnswer(a -> Mono.just(new SchemaSubjectDTO().subject(a.getArgument(1))));
-    this.controller.setClustersStorage(clustersStorage);
+        .thenAnswer(a -> Mono.just(new SchemaSubjectDTO().subject(a.getArgument(1))));
+
+    ReflectionTestUtils.setField(controller, "clustersStorage", clustersStorage);
   }
 
   @Test
   void shouldListFirst25andThen10Schemas() {
     init(
-            IntStream.rangeClosed(1, 100)
-                    .boxed()
-                    .map(num -> "subject" + num)
-                    .toArray(String[]::new)
+        IntStream.rangeClosed(1, 100)
+            .boxed()
+            .map(num -> "subject" + num)
+            .toArray(String[]::new)
     );
     var schemasFirst25 = controller.getSchemas(LOCAL_KAFKA_CLUSTER_NAME,
-            null, null, null, null).block();
+        null, null, null, null).block();
     assertThat(schemasFirst25.getBody().getPageCount()).isEqualTo(4);
     assertThat(schemasFirst25.getBody().getSchemas()).hasSize(25);
     assertThat(schemasFirst25.getBody().getSchemas())
-            .isSortedAccordingTo(Comparator.comparing(SchemaSubjectDTO::getSubject));
+        .isSortedAccordingTo(Comparator.comparing(SchemaSubjectDTO::getSubject));
 
     var schemasFirst10 = controller.getSchemas(LOCAL_KAFKA_CLUSTER_NAME,
-            null, 10, null, null).block();
+        null, 10, null, null).block();
 
     assertThat(schemasFirst10.getBody().getPageCount()).isEqualTo(10);
     assertThat(schemasFirst10.getBody().getSchemas()).hasSize(10);
     assertThat(schemasFirst10.getBody().getSchemas())
-            .isSortedAccordingTo(Comparator.comparing(SchemaSubjectDTO::getSubject));
+        .isSortedAccordingTo(Comparator.comparing(SchemaSubjectDTO::getSubject));
   }
 
   @Test
   void shouldListSchemasContaining_1() {
     init(
-              IntStream.rangeClosed(1, 100)
-                      .boxed()
-                      .map(num -> "subject" + num)
-                      .toArray(String[]::new)
+        IntStream.rangeClosed(1, 100)
+            .boxed()
+            .map(num -> "subject" + num)
+            .toArray(String[]::new)
     );
     var schemasSearch7 = controller.getSchemas(LOCAL_KAFKA_CLUSTER_NAME,
-            null, null, "1", null).block();
+        null, null, "1", null).block();
     assertThat(schemasSearch7.getBody().getPageCount()).isEqualTo(1);
     assertThat(schemasSearch7.getBody().getSchemas()).hasSize(20);
   }
@@ -80,13 +86,13 @@ public class SchemaRegistryPaginationTest {
   @Test
   void shouldCorrectlyHandleNonPositivePageNumberAndPageSize() {
     init(
-                IntStream.rangeClosed(1, 100)
-                        .boxed()
-                        .map(num -> "subject" + num)
-                        .toArray(String[]::new)
+        IntStream.rangeClosed(1, 100)
+            .boxed()
+            .map(num -> "subject" + num)
+            .toArray(String[]::new)
     );
     var schemas = controller.getSchemas(LOCAL_KAFKA_CLUSTER_NAME,
-            0, -1, null, null).block();
+        0, -1, null, null).block();
 
     assertThat(schemas.getBody().getPageCount()).isEqualTo(4);
     assertThat(schemas.getBody().getSchemas()).hasSize(25);
@@ -96,14 +102,14 @@ public class SchemaRegistryPaginationTest {
   @Test
   void shouldCalculateCorrectPageCountForNonDivisiblePageSize() {
     init(
-                IntStream.rangeClosed(1, 100)
-                        .boxed()
-                        .map(num -> "subject" + num)
-                        .toArray(String[]::new)
+        IntStream.rangeClosed(1, 100)
+            .boxed()
+            .map(num -> "subject" + num)
+            .toArray(String[]::new)
     );
 
     var schemas = controller.getSchemas(LOCAL_KAFKA_CLUSTER_NAME,
-            4, 33, null, null).block();
+        4, 33, null, null).block();
 
     assertThat(schemas.getBody().getPageCount()).isEqualTo(4);
     assertThat(schemas.getBody().getSchemas()).hasSize(1);
@@ -112,8 +118,8 @@ public class SchemaRegistryPaginationTest {
 
   private KafkaCluster buildKafkaCluster(String clusterName) {
     return KafkaCluster.builder()
-            .name(clusterName)
-            .schemaRegistry(InternalSchemaRegistry.builder().build())
-            .build();
+        .name(clusterName)
+        .schemaRegistry(InternalSchemaRegistry.builder().build())
+        .build();
   }
 }

+ 6 - 2
kafka-ui-api/src/test/java/com/provectus/kafka/ui/service/TopicsServicePaginationTest.java

@@ -19,6 +19,8 @@ import com.provectus.kafka.ui.model.SortOrderDTO;
 import com.provectus.kafka.ui.model.TopicColumnsToSortDTO;
 import com.provectus.kafka.ui.model.TopicDTO;
 import com.provectus.kafka.ui.service.analyze.TopicAnalysisService;
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
+import com.provectus.kafka.ui.util.AccessControlServiceMock;
 import java.util.ArrayList;
 import java.util.Comparator;
 import java.util.List;
@@ -32,6 +34,7 @@ import java.util.stream.IntStream;
 import org.apache.kafka.clients.admin.TopicDescription;
 import org.apache.kafka.common.TopicPartitionInfo;
 import org.junit.jupiter.api.Test;
+import org.springframework.test.util.ReflectionTestUtils;
 import reactor.core.publisher.Mono;
 
 class TopicsServicePaginationTest {
@@ -41,9 +44,10 @@ class TopicsServicePaginationTest {
   private final TopicsService topicsService = mock(TopicsService.class);
   private final ClustersStorage clustersStorage = mock(ClustersStorage.class);
   private final ClusterMapper clusterMapper = new ClusterMapperImpl();
+  private final AccessControlService accessControlService = new AccessControlServiceMock().getMock();
 
   private final TopicsController topicsController = new TopicsController(
-      topicsService, mock(TopicAnalysisService.class), clusterMapper);
+      topicsService, mock(TopicAnalysisService.class), clusterMapper, accessControlService);
 
   private void init(Map<String, InternalTopic> topicsInCache) {
 
@@ -56,7 +60,7 @@ class TopicsServicePaginationTest {
           List<String> lst = a.getArgument(1);
           return Mono.just(lst.stream().map(topicsInCache::get).collect(Collectors.toList()));
         });
-    this.topicsController.setClustersStorage(clustersStorage);
+    ReflectionTestUtils.setField(topicsController, "clustersStorage", clustersStorage);
   }
 
   @Test

+ 23 - 0
kafka-ui-api/src/test/java/com/provectus/kafka/ui/util/AccessControlServiceMock.java

@@ -0,0 +1,23 @@
+package com.provectus.kafka.ui.util;
+
+import static org.mockito.ArgumentMatchers.any;
+import static org.mockito.ArgumentMatchers.anyString;
+import static org.mockito.Mockito.when;
+
+import com.provectus.kafka.ui.service.rbac.AccessControlService;
+import org.mockito.Mockito;
+import reactor.core.publisher.Mono;
+
+public class AccessControlServiceMock {
+
+  public AccessControlService getMock() {
+    AccessControlService mock = Mockito.mock(AccessControlService.class);
+
+    when(mock.validateAccess(any())).thenReturn(Mono.empty());
+    when(mock.isSchemaAccessible(anyString(), anyString())).thenReturn(Mono.just(true));
+
+    when(mock.isTopicAccessible(any(), anyString())).thenReturn(Mono.just(true));
+
+    return mock;
+  }
+}

+ 86 - 2
kafka-ui-contract/src/main/resources/swagger/kafka-ui-api.yaml

@@ -7,7 +7,7 @@ info:
   contact: { }
   license:
     name: Apache 2.0
-    url: http://www.apache.org/licenses/LICENSE-2.0
+    url: https://www.apache.org/licenses/LICENSE-2.0
 tags:
   - name: /api/clusters
   - name: /api/clusters/connects
@@ -1757,6 +1757,20 @@ paths:
               schema:
                 $ref: '#/components/schemas/TimeStampFormat'
 
+  /api/authorization:
+    get:
+      tags:
+        - Authorization
+      summary: Get user authentication related info
+      operationId: getUserAuthInfo
+      responses:
+        200:
+          description: OK
+          content:
+            application/json:
+              schema:
+                $ref: '#/components/schemas/AuthenticationInfo'
+
 components:
   schemas:
     TopicSerdeSuggestion:
@@ -2646,7 +2660,7 @@ components:
           type: string
         schemaType:
           $ref: '#/components/schemas/SchemaType'
-          description: upon updating a schema, the type of an existing schema can't be changed
+          # upon updating a schema, the type of an existing schema can't be changed
       required:
         - subject
         - schema
@@ -3154,3 +3168,73 @@ components:
         - COMPACT
         - COMPACT_DELETE
         - UNKNOWN
+
+    AuthenticationInfo:
+      type: object
+      properties:
+        rbacEnabled:
+          type: boolean
+          description: true if role-based access control is enabled and granular permission access is required
+        userInfo:
+          $ref: '#/components/schemas/UserInfo'
+      required:
+        - rbacEnabled
+
+    UserInfo:
+      type: object
+      properties:
+        username:
+          type: string
+        permissions:
+          type: array
+          items:
+            $ref: '#/components/schemas/UserPermission'
+      required:
+        - username
+        - permissions
+
+    UserPermission:
+      type: object
+      properties:
+        clusters:
+          type: array
+          items:
+            type: string
+        resource:
+          $ref: '#/components/schemas/ResourceType'
+        value:
+          type: string
+        actions:
+          type: array
+          items:
+            $ref: '#/components/schemas/Action'
+      required:
+        - clusters
+        - resource
+        - actions
+
+    Action:
+      type: string
+      enum:
+        - VIEW
+        - EDIT
+        - CREATE
+        - DELETE
+        - RESET_OFFSETS
+        - EXECUTE
+        - MODIFY_GLOBAL_COMPATIBILITY
+        - ANALYSIS_VIEW
+        - ANALYSIS_RUN
+        - MESSAGES_READ
+        - MESSAGES_PRODUCE
+        - MESSAGES_DELETE
+
+    ResourceType:
+      type: string
+      enum:
+        - CLUSTERCONFIG
+        - TOPIC
+        - CONSUMER
+        - SCHEMA
+        - CONNECT
+        - KSQL
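
A quick note on how the endpoint above is meant to be consumed: the client calls GET /api/authorization once and keys every permission check off the returned payload. The TypeScript sketch below uses the models this contract generates for the React app (AuthenticationInfo is assumed to be emitted by the generator alongside Action and ResourceType); the fetch wrapper itself is illustrative and not part of this PR.

import { AuthenticationInfo } from 'generated-sources';

// Illustrative only: GET /api/authorization returns whether RBAC is enabled
// and, if so, the current user's name and permissions.
async function fetchAuthInfo(): Promise<AuthenticationInfo> {
  const res = await fetch('/api/authorization');
  if (!res.ok) {
    throw new Error(`Unexpected status ${res.status}`);
  }
  return (await res.json()) as AuthenticationInfo;
}

// Usage sketch:
//   const { rbacEnabled, userInfo } = await fetchAuthInfo();
//   if (rbacEnabled && userInfo) { /* gate the UI on userInfo.permissions */ }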

+ 2 - 2
kafka-ui-e2e-checks/src/main/java/com/provectus/kafka/ui/pages/TopPanel.java

@@ -14,10 +14,10 @@ public class TopPanel extends BasePage{
     protected SelenideElement discordBtn = $x("//a[contains(@href,'https://discord.com/invite')]");
 
     public List<SelenideElement> getAllVisibleElements() {
-        return Arrays.asList(kafkaLogo, kafkaVersion, logOutBtn, gitBtn, discordBtn);
+        return Arrays.asList(kafkaLogo, kafkaVersion, gitBtn, discordBtn);
     }
 
     public List<SelenideElement> getAllEnabledElements() {
-        return Arrays.asList(logOutBtn, gitBtn, discordBtn, kafkaLogo);
+        return Arrays.asList(gitBtn, discordBtn, kafkaLogo);
     }
 }

+ 1 - 244
kafka-ui-react-app/src/components/App.styled.ts

@@ -1,9 +1,4 @@
-import styled, { css } from 'styled-components';
-import { Link } from 'react-router-dom';
-
-import { Button } from './common/Button/Button';
-import GitIcon from './common/Icons/GitIcon';
-import DiscordIcon from './common/Icons/DiscordIcon';
+import styled from 'styled-components';
 
 export const Layout = styled.div`
   min-width: 1200px;
@@ -12,241 +7,3 @@ export const Layout = styled.div`
     min-width: initial;
   }
 `;
-
-export const Container = styled.main(
-  ({ theme }) => css`
-    margin-top: ${theme.layout.navBarHeight};
-    margin-left: ${theme.layout.navBarWidth};
-    position: relative;
-    padding-bottom: 30px;
-    z-index: 20;
-    max-width: calc(100vw - ${theme.layout.navBarWidth});
-    @media screen and (max-width: 1023px) {
-      margin-left: initial;
-      max-width: 100vw;
-    }
-  `
-);
-
-export const Sidebar = styled.div<{ $visible: boolean }>(
-  ({ theme, $visible }) => css`
-    width: ${theme.layout.navBarWidth};
-    display: flex;
-    flex-direction: column;
-    border-right: 1px solid ${theme.layout.stuffBorderColor};
-    position: fixed;
-    top: ${theme.layout.navBarHeight};
-    left: 0;
-    bottom: 0;
-    padding: 8px 16px;
-    overflow-y: scroll;
-    transition: width 0.25s, opacity 0.25s, transform 0.25s,
-      -webkit-transform 0.25s;
-    background: ${theme.menu.backgroundColor.normal};
-    @media screen and (max-width: 1023px) {
-      ${$visible &&
-      css`
-        transform: translate3d(${theme.layout.navBarWidth}, 0, 0);
-      `};
-      left: -${theme.layout.navBarWidth};
-      z-index: 100;
-    }
-
-    &::-webkit-scrollbar {
-      width: 8px;
-    }
-
-    &::-webkit-scrollbar-track {
-      background-color: ${theme.scrollbar.trackColor.normal};
-    }
-
-    &::-webkit-scrollbar-thumb {
-      width: 8px;
-      background-color: ${theme.scrollbar.thumbColor.normal};
-      border-radius: 4px;
-    }
-
-    &:hover::-webkit-scrollbar-thumb {
-      background: ${theme.scrollbar.thumbColor.active};
-    }
-
-    &:hover::-webkit-scrollbar-track {
-      background-color: ${theme.scrollbar.trackColor.active};
-    }
-  `
-);
-
-export const Overlay = styled.div<{ $visible: boolean }>(
-  ({ theme, $visible }) => css`
-    height: calc(100vh - ${theme.layout.navBarHeight});
-    z-index: 99;
-    visibility: hidden;
-    opacity: 0;
-    -webkit-transition: all 0.5s ease;
-    transition: all 0.5s ease;
-    left: 0;
-    position: absolute;
-    top: 0;
-    ${$visible &&
-    css`
-      @media screen and (max-width: 1023px) {
-        bottom: 0;
-        right: 0;
-        visibility: visible;
-        opacity: 0.7;
-        background-color: ${theme.layout.overlay.backgroundColor};
-      }
-    `}
-  `
-);
-
-export const Navbar = styled.nav(
-  ({ theme }) => css`
-    display: flex;
-    align-items: center;
-    justify-content: space-between;
-    border-bottom: 1px solid ${theme.layout.stuffBorderColor};
-    position: fixed;
-    top: 0;
-    left: 0;
-    right: 0;
-    z-index: 30;
-    background-color: ${theme.menu.backgroundColor.normal};
-    min-height: 3.25rem;
-  `
-);
-
-export const NavbarBrand = styled.div`
-  display: flex;
-  justify-content: flex-end;
-  align-items: center !important;
-  flex-shrink: 0;
-  min-height: 3.25rem;
-`;
-
-export const SocialLink = styled.a(
-  ({ theme: { layout, icons } }) => css`
-    display: block;
-    margin-top: 5px;
-    cursor: pointer;
-    fill: ${layout.socialLink.color};
-
-    &:hover {
-      ${DiscordIcon} {
-        fill: ${icons.discord.hover};
-      }
-      ${GitIcon} {
-        fill: ${icons.git.hover};
-      }
-    }
-    &:active {
-      ${DiscordIcon} {
-        fill: ${icons.discord.active};
-      }
-      ${GitIcon} {
-        fill: ${icons.git.active};
-      }
-    }
-  `
-);
-
-export const NavbarSocial = styled.div`
-  display: flex;
-  align-items: center;
-  gap: 10px;
-  margin: 10px;
-`;
-
-export const NavbarItem = styled.div`
-  display: flex;
-  position: relative;
-  flex-grow: 0;
-  flex-shrink: 0;
-  align-items: center;
-  line-height: 1.5;
-  padding: 0.5rem 0.75rem;
-`;
-
-export const NavbarBurger = styled.div(
-  ({ theme }) => css`
-    display: block;
-    position: relative;
-    cursor: pointer;
-    height: 3.25rem;
-    width: 3.25rem;
-    margin: 0;
-    padding: 0;
-
-    &:hover {
-      background-color: ${theme.menu.backgroundColor.hover};
-    }
-
-    @media screen and (min-width: 1024px) {
-      display: none;
-    }
-  `
-);
-
-export const Span = styled.span(
-  ({ theme }) => css`
-    display: block;
-    position: absolute;
-    background: ${theme.menu.color.active};
-    height: 1px;
-    left: calc(50% - 8px);
-    transform-origin: center;
-    transition-duration: 86ms;
-    transition-property: background-color, opacity, transform, -webkit-transform;
-    transition-timing-function: ease-out;
-    width: 16px;
-
-    &:first-child {
-      top: calc(50% - 6px);
-    }
-    &:nth-child(2) {
-      top: calc(50% - 1px);
-    }
-    &:nth-child(3) {
-      top: calc(50% + 4px);
-    }
-  `
-);
-
-export const Hyperlink = styled(Link)(
-  ({ theme }) => css`
-    position: relative;
-
-    display: flex;
-    flex-grow: 0;
-    flex-shrink: 0;
-    align-items: center;
-    gap: 8px;
-
-    margin: 0;
-    padding: 0.5rem 0.75rem;
-
-    font-family: Inter, sans-serif;
-    font-style: normal;
-    font-weight: bold;
-    font-size: 12px;
-    line-height: 16px;
-    color: ${theme.menu.color.active};
-    text-decoration: none;
-    word-break: break-word;
-    cursor: pointer;
-  `
-);
-
-export const LogoutButton = styled(Button)(
-  ({ theme }) => css`
-    color: ${theme.button.primary.invertedColors.normal};
-    background: none !important;
-    padding: 0 8px;
-  `
-);
-
-export const LogoutLink = styled.a(
-  () => css`
-    margin-right: 2px;
-  `
-);

+ 41 - 106
kafka-ui-react-app/src/components/App.tsx

@@ -1,16 +1,14 @@
-import React, { Suspense, useCallback } from 'react';
-import { Routes, Route, useLocation, Navigate } from 'react-router-dom';
+import React, { Suspense } from 'react';
+import { Routes, Route, Navigate } from 'react-router-dom';
 import {
   accessErrorPage,
   clusterPath,
   errorPage,
   getNonExactPath,
 } from 'lib/paths';
-import Nav from 'components/Nav/Nav';
 import PageLoader from 'components/common/PageLoader/PageLoader';
 import Dashboard from 'components/Dashboard/Dashboard';
 import ClusterPage from 'components/Cluster/Cluster';
-import Version from 'components/Version/Version';
 import { ThemeProvider } from 'styled-components';
 import theme from 'theme/theme';
 import { QueryClient, QueryClientProvider } from '@tanstack/react-query';
@@ -18,14 +16,13 @@ import { showServerError } from 'lib/errorHandling';
 import { Toaster } from 'react-hot-toast';
 import GlobalCSS from 'components/global.css';
 import * as S from 'components/App.styled';
-import Logo from 'components/common/Logo/Logo';
-import GitIcon from 'components/common/Icons/GitIcon';
-import DiscordIcon from 'components/common/Icons/DiscordIcon';
 
 import ConfirmationModal from './common/ConfirmationModal/ConfirmationModal';
 import { ConfirmContextProvider } from './contexts/ConfirmContext';
 import { GlobalSettingsProvider } from './contexts/GlobalSettingsContext';
 import ErrorPage from './ErrorPage/ErrorPage';
+import { UserInfoRolesAccessProvider } from './contexts/UserInfoRolesAccessContext';
+import PageContainer from './PageContainer/PageContainer';
 
 const queryClient = new QueryClient({
   defaultOptions: {
@@ -41,109 +38,47 @@ const queryClient = new QueryClient({
 });
 
 const App: React.FC = () => {
-  const [isSidebarVisible, setIsSidebarVisible] = React.useState(false);
-  const onBurgerClick = () => setIsSidebarVisible(!isSidebarVisible);
-  const closeSidebar = useCallback(() => setIsSidebarVisible(false), []);
-  const location = useLocation();
-
-  React.useEffect(() => {
-    closeSidebar();
-  }, [location, closeSidebar]);
-
   return (
     <QueryClientProvider client={queryClient}>
       <GlobalSettingsProvider>
         <ThemeProvider theme={theme}>
-          <ConfirmContextProvider>
-            <GlobalCSS />
-            <S.Layout>
-              <S.Navbar role="navigation" aria-label="Page Header">
-                <S.NavbarBrand>
-                  <S.NavbarBrand>
-                    <S.NavbarBurger
-                      onClick={onBurgerClick}
-                      onKeyDown={onBurgerClick}
-                      role="button"
-                      tabIndex={0}
-                      aria-label="burger"
-                    >
-                      <S.Span role="separator" />
-                      <S.Span role="separator" />
-                      <S.Span role="separator" />
-                    </S.NavbarBurger>
-
-                    <S.Hyperlink to="/">
-                      <Logo />
-                      UI for Apache Kafka
-                    </S.Hyperlink>
-
-                    <S.NavbarItem>
-                      <Version />
-                    </S.NavbarItem>
-                  </S.NavbarBrand>
-                </S.NavbarBrand>
-                <S.NavbarSocial>
-                  <S.LogoutLink href="/logout">
-                    <S.LogoutButton buttonType="primary" buttonSize="M">
-                      Log out
-                    </S.LogoutButton>
-                  </S.LogoutLink>
-                  <S.SocialLink
-                    href="https://github.com/provectus/kafka-ui"
-                    target="_blank"
-                  >
-                    <GitIcon />
-                  </S.SocialLink>
-                  <S.SocialLink
-                    href="https://discord.com/invite/4DWzD7pGE5"
-                    target="_blank"
-                  >
-                    <DiscordIcon />
-                  </S.SocialLink>
-                </S.NavbarSocial>
-              </S.Navbar>
-
-              <S.Container>
-                <S.Sidebar aria-label="Sidebar" $visible={isSidebarVisible}>
-                  <Suspense fallback={<PageLoader />}>
-                    <Nav />
-                  </Suspense>
-                </S.Sidebar>
-                <S.Overlay
-                  $visible={isSidebarVisible}
-                  onClick={closeSidebar}
-                  onKeyDown={closeSidebar}
-                  tabIndex={-1}
-                  aria-hidden="true"
-                  aria-label="Overlay"
-                />
-                <Routes>
-                  {['/', '/ui', '/ui/clusters'].map((path) => (
-                    <Route
-                      key="Home" // optional: avoid full re-renders on route changes
-                      path={path}
-                      element={<Dashboard />}
-                    />
-                  ))}
-                  <Route
-                    path={getNonExactPath(clusterPath())}
-                    element={<ClusterPage />}
-                  />
-                  <Route
-                    path={accessErrorPage}
-                    element={<ErrorPage status={403} text="Access is Denied" />}
-                  />
-                  <Route path={errorPage} element={<ErrorPage />} />
-                  <Route
-                    path="*"
-                    element={<Navigate to={errorPage} replace />}
-                  />
-                </Routes>
-              </S.Container>
-              <Toaster position="bottom-right" />
-            </S.Layout>
-            <ConfirmationModal />
-          </ConfirmContextProvider>
+          <Suspense fallback={<PageLoader />}>
+            <UserInfoRolesAccessProvider>
+              <ConfirmContextProvider>
+                <GlobalCSS />
+                <S.Layout>
+                  <PageContainer>
+                    <Routes>
+                      {['/', '/ui', '/ui/clusters'].map((path) => (
+                        <Route
+                          key="Home" // optional: avoid full re-renders on route changes
+                          path={path}
+                          element={<Dashboard />}
+                        />
+                      ))}
+                      <Route
+                        path={getNonExactPath(clusterPath())}
+                        element={<ClusterPage />}
+                      />
+                      <Route
+                        path={accessErrorPage}
+                        element={
+                          <ErrorPage status={403} text="Access is Denied" />
+                        }
+                      />
+                      <Route path={errorPage} element={<ErrorPage />} />
+                      <Route
+                        path="*"
+                        element={<Navigate to={errorPage} replace />}
+                      />
+                    </Routes>
+                  </PageContainer>
+                  <Toaster position="bottom-right" />
+                </S.Layout>
+                <ConfirmationModal />
+              </ConfirmContextProvider>
+            </UserInfoRolesAccessProvider>
+          </Suspense>
         </ThemeProvider>
       </GlobalSettingsProvider>
     </QueryClientProvider>

+ 8 - 3
kafka-ui-react-app/src/components/Brokers/Broker/Configs/InputCell.tsx

@@ -4,9 +4,10 @@ import CheckmarkIcon from 'components/common/Icons/CheckmarkIcon';
 import EditIcon from 'components/common/Icons/EditIcon';
 import CancelIcon from 'components/common/Icons/CancelIcon';
 import { useConfirm } from 'lib/hooks/useConfirm';
-import { BrokerConfig } from 'generated-sources';
+import { Action, BrokerConfig, ResourceType } from 'generated-sources';
 import { Button } from 'components/common/Button/Button';
 import Input from 'components/common/Input/Input';
+import { ActionButton } from 'components/common/ActionComponent';
 
 import * as S from './Configs.styled';
 
@@ -71,14 +72,18 @@ const InputCell: React.FC<InputCellProps> = ({ row, getValue, onUpdate }) => {
       }
     >
       <S.Value title={initialValue}>{initialValue}</S.Value>
-      <Button
+      <ActionButton
         buttonType="primary"
         buttonSize="S"
         aria-label="editAction"
         onClick={() => setIsEdit(true)}
+        permission={{
+          resource: ResourceType.CLUSTERCONFIG,
+          action: Action.EDIT,
+        }}
       >
         <EditIcon /> Edit
-      </Button>
+      </ActionButton>
     </S.ValueWrapper>
   );
 };
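
For context on the new ActionComponent family used above: ActionButton wraps the plain Button and gates it on the permission object it receives. A minimal sketch of the idea, assuming a usePermission hook with the signature shown here (the actual hook and tooltip behaviour added in this PR may differ):

import React from 'react';
import { Action, ResourceType } from 'generated-sources';
import { Button } from 'components/common/Button/Button';
// Assumed hook name and signature, for illustration only.
import { usePermission } from 'lib/hooks/usePermission';

type ButtonProps = React.ComponentProps<typeof Button>;
type ActionButtonSketchProps = ButtonProps & {
  permission: { resource: ResourceType; action: Action; value?: string };
};

// Minimal sketch: render the usual Button, but disable it (with a hint)
// when the current user lacks the requested permission.
const ActionButtonSketch: React.FC<ActionButtonSketchProps> = ({
  permission,
  ...buttonProps
}) => {
  const allowed = usePermission(
    permission.resource,
    permission.action,
    permission.value
  );
  return (
    <Button
      {...buttonProps}
      disabled={!allowed || buttonProps.disabled}
      title={allowed ? undefined : 'Not permitted'}
    />
  );
};

export default ActionButtonSketch;

Disabling rather than hiding keeps the layout stable and leaves room for a tooltip explaining why the action is unavailable.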

+ 65 - 14
kafka-ui-react-app/src/components/Connect/Details/Actions/Actions.tsx

@@ -2,7 +2,12 @@ import React from 'react';
 import styled from 'styled-components';
 import { useNavigate } from 'react-router-dom';
 import { useIsMutating } from '@tanstack/react-query';
-import { ConnectorState, ConnectorAction } from 'generated-sources';
+import {
+  Action,
+  ConnectorAction,
+  ConnectorState,
+  ResourceType,
+} from 'generated-sources';
 import useAppParams from 'lib/hooks/useAppParams';
 import {
   useConnector,
@@ -14,7 +19,8 @@ import {
   RouterParamsClusterConnectConnector,
 } from 'lib/paths';
 import { useConfirm } from 'lib/hooks/useConfirm';
-import { Dropdown, DropdownItem } from 'components/common/Dropdown';
+import { Dropdown } from 'components/common/Dropdown';
+import { ActionDropdownItem } from 'components/common/ActionComponent';
 
 const ConnectorActionsWrapperStyled = styled.div`
   display: flex;
@@ -65,31 +71,76 @@ const Actions: React.FC = () => {
     <ConnectorActionsWrapperStyled>
       <Dropdown>
         {connector?.status.state === ConnectorState.RUNNING && (
-          <DropdownItem onClick={pauseConnectorHandler} disabled={isMutating}>
+          <ActionDropdownItem
+            onClick={pauseConnectorHandler}
+            disabled={isMutating}
+            permission={{
+              resource: ResourceType.CONNECT,
+              action: Action.EDIT,
+              value: routerProps.connectorName,
+            }}
+          >
             Pause
-          </DropdownItem>
+          </ActionDropdownItem>
         )}
         {connector?.status.state === ConnectorState.PAUSED && (
-          <DropdownItem onClick={resumeConnectorHandler} disabled={isMutating}>
+          <ActionDropdownItem
+            onClick={resumeConnectorHandler}
+            disabled={isMutating}
+            permission={{
+              resource: ResourceType.CONNECT,
+              action: Action.EDIT,
+              value: routerProps.connectorName,
+            }}
+          >
             Resume
-          </DropdownItem>
+          </ActionDropdownItem>
         )}
-        <DropdownItem onClick={restartConnectorHandler} disabled={isMutating}>
+        <ActionDropdownItem
+          onClick={restartConnectorHandler}
+          disabled={isMutating}
+          permission={{
+            resource: ResourceType.CONNECT,
+            action: Action.EDIT,
+            value: routerProps.connectorName,
+          }}
+        >
           Restart Connector
-        </DropdownItem>
-        <DropdownItem onClick={restartAllTasksHandler} disabled={isMutating}>
+        </ActionDropdownItem>
+        <ActionDropdownItem
+          onClick={restartAllTasksHandler}
+          disabled={isMutating}
+          permission={{
+            resource: ResourceType.CONNECT,
+            action: Action.EDIT,
+            value: routerProps.connectorName,
+          }}
+        >
           Restart All Tasks
-        </DropdownItem>
-        <DropdownItem onClick={restartFailedTasksHandler} disabled={isMutating}>
+        </ActionDropdownItem>
+        <ActionDropdownItem
+          onClick={restartFailedTasksHandler}
+          disabled={isMutating}
+          permission={{
+            resource: ResourceType.CONNECT,
+            action: Action.EDIT,
+            value: routerProps.connectorName,
+          }}
+        >
           Restart Failed Tasks
-        </DropdownItem>
-        <DropdownItem
+        </ActionDropdownItem>
+        <ActionDropdownItem
           onClick={deleteConnectorHandler}
           disabled={isMutating}
           danger
+          permission={{
+            resource: ResourceType.CONNECT,
+            action: Action.DELETE,
+            value: routerProps.connectorName,
+          }}
         >
           Delete
-        </DropdownItem>
+        </ActionDropdownItem>
       </Dropdown>
     </ConnectorActionsWrapperStyled>
   );

+ 8 - 4
kafka-ui-react-app/src/components/Connect/List/ListPage.tsx

@@ -5,10 +5,10 @@ import ClusterContext from 'components/contexts/ClusterContext';
 import Search from 'components/common/Search/Search';
 import * as Metrics from 'components/common/Metrics';
 import PageHeading from 'components/common/PageHeading/PageHeading';
-import { Button } from 'components/common/Button/Button';
+import { ActionButton } from 'components/common/ActionComponent';
 import { ControlPanelWrapper } from 'components/common/ControlPanel/ControlPanel.styled';
 import PageLoader from 'components/common/PageLoader/PageLoader';
-import { ConnectorState } from 'generated-sources';
+import { Action, ConnectorState, ResourceType } from 'generated-sources';
 import { useConnectors } from 'lib/hooks/api/kafkaConnect';
 
 import List from './List';
@@ -33,13 +33,17 @@ const ListPage: React.FC = () => {
     <>
       <PageHeading text="Connectors">
         {!isReadOnly && (
-          <Button
+          <ActionButton
             buttonType="primary"
             buttonSize="M"
             to={clusterConnectorNewRelativePath}
+            permission={{
+              resource: ResourceType.CONNECT,
+              action: Action.CREATE,
+            }}
           >
             Create Connector
-          </Button>
+          </ActionButton>
         )}
       </PageHeading>
       <Metrics.Wrapper>

+ 1 - 1
kafka-ui-react-app/src/components/Connect/New/New.tsx

@@ -65,7 +65,7 @@ const New: React.FC = () => {
   }, [connects, getValues, setValue]);
 
   const onSubmit = async (values: FormValues) => {
-    const connector = await mutation.mutateAsync({
+    const connector = await mutation.createResource({
       connectName: values.connectName,
       newConnector: {
         name: values.name,

+ 2 - 1
kafka-ui-react-app/src/components/Connect/New/__tests__/New.spec.tsx

@@ -23,6 +23,7 @@ jest.mock('react-router-dom', () => ({
   ...jest.requireActual('react-router-dom'),
   useNavigate: () => mockHistoryPush,
 }));
+
 jest.mock('lib/hooks/api/kafkaConnect', () => ({
   useConnects: jest.fn(),
   useCreateConnector: jest.fn(),
@@ -67,7 +68,7 @@ describe('New', () => {
       return Promise.resolve(connector);
     });
     (useCreateConnector as jest.Mock).mockImplementation(() => ({
-      mutateAsync: createConnectorMock,
+      createResource: createConnectorMock,
     }));
     renderComponent();
     await simulateFormSubmit();

+ 23 - 7
kafka-ui-react-app/src/components/ConsumerGroups/Details/Details.tsx

@@ -17,15 +17,17 @@ import { Table } from 'components/common/table/Table/Table.styled';
 import TableHeaderCell from 'components/common/table/TableHeaderCell/TableHeaderCell';
 import { useAppDispatch, useAppSelector } from 'lib/hooks/redux';
 import {
-  fetchConsumerGroupDetails,
   deleteConsumerGroup,
-  selectById,
-  getIsConsumerGroupDeleted,
+  fetchConsumerGroupDetails,
   getAreConsumerGroupDetailsFulfilled,
+  getIsConsumerGroupDeleted,
+  selectById,
 } from 'redux/reducers/consumerGroups/consumerGroupsSlice';
 import getTagColor from 'components/common/Tag/getTagColor';
-import { Dropdown, DropdownItem } from 'components/common/Dropdown';
+import { Dropdown } from 'components/common/Dropdown';
 import { ControlPanelWrapper } from 'components/common/ControlPanel/ControlPanel.styled';
+import { Action, ResourceType } from 'generated-sources';
+import { ActionDropdownItem } from 'components/common/ActionComponent';
 
 import ListItem from './ListItem';
 
@@ -84,14 +86,28 @@ const Details: React.FC = () => {
         >
           {!isReadOnly && (
             <Dropdown>
-              <DropdownItem onClick={onResetOffsets}>Reset offset</DropdownItem>
-              <DropdownItem
+              <ActionDropdownItem
+                onClick={onResetOffsets}
+                permission={{
+                  resource: ResourceType.CONSUMER,
+                  action: Action.RESET_OFFSETS,
+                  value: consumerGroupID,
+                }}
+              >
+                Reset offset
+              </ActionDropdownItem>
+              <ActionDropdownItem
                 confirm="Are you sure you want to delete this consumer group?"
                 onClick={onDelete}
                 danger
+                permission={{
+                  resource: ResourceType.CONSUMER,
+                  action: Action.DELETE,
+                  value: consumerGroupID,
+                }}
               >
                 Delete consumer group
-              </DropdownItem>
+              </ActionDropdownItem>
             </Dropdown>
           )}
         </PageHeading>
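
The permission objects here carry a value (the consumer group id), narrowing the grant to a single resource instance. A rough client-side check for such value-scoped permissions might look like the sketch below; treating the permission's value as a regular-expression pattern is an assumption made for illustration, not something this diff states.

import { Action, ResourceType, UserPermission } from 'generated-sources';

// Illustrative only: does any permission grant `action` on this specific
// resource instance (e.g. RESET_OFFSETS on one consumer group)?
function isValuePermitted(
  permissions: UserPermission[],
  cluster: string,
  resource: ResourceType,
  action: Action,
  value: string
): boolean {
  return permissions.some(
    (p) =>
      p.clusters.includes(cluster) &&
      p.resource === resource &&
      p.actions.includes(action) &&
      (p.value === undefined || new RegExp(p.value).test(value))
  );
}

// e.g. isValuePermitted(perms, clusterName, ResourceType.CONSUMER,
//                       Action.RESET_OFFSETS, consumerGroupID)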

+ 11 - 6
kafka-ui-react-app/src/components/KsqlDb/List/List.tsx

@@ -4,18 +4,19 @@ import * as Metrics from 'components/common/Metrics';
 import { getKsqlDbTables } from 'redux/reducers/ksqlDb/selectors';
 import {
   clusterKsqlDbQueryRelativePath,
-  ClusterNameRoute,
   clusterKsqlDbStreamsPath,
-  clusterKsqlDbTablesPath,
   clusterKsqlDbStreamsRelativePath,
+  clusterKsqlDbTablesPath,
   clusterKsqlDbTablesRelativePath,
+  ClusterNameRoute,
 } from 'lib/paths';
 import PageHeading from 'components/common/PageHeading/PageHeading';
-import { Button } from 'components/common/Button/Button';
+import { ActionButton } from 'components/common/ActionComponent';
 import Navbar from 'components/common/Navigation/Navbar.styled';
-import { NavLink, Route, Routes, Navigate } from 'react-router-dom';
+import { Navigate, NavLink, Route, Routes } from 'react-router-dom';
 import { fetchKsqlDbTables } from 'redux/reducers/ksqlDb/ksqlDbSlice';
 import { useAppDispatch, useAppSelector } from 'lib/hooks/redux';
+import { Action, ResourceType } from 'generated-sources';
 
 import KsqlDbItem, { KsqlDbItemType } from './KsqlDbItem/KsqlDbItem';
 
@@ -33,13 +34,17 @@ const List: FC = () => {
   return (
     <>
       <PageHeading text="KSQL DB">
-        <Button
+        <ActionButton
           to={clusterKsqlDbQueryRelativePath}
           buttonType="primary"
           buttonSize="M"
+          permission={{
+            resource: ResourceType.KSQL,
+            action: Action.EXECUTE,
+          }}
         >
           Execute KSQL Request
-        </Button>
+        </ActionButton>
       </PageHeading>
       <Metrics.Wrapper>
         <Metrics.Section>

+ 5 - 4
kafka-ui-react-app/src/components/KsqlDb/List/__test__/List.spec.tsx

@@ -6,15 +6,16 @@ import { screen } from '@testing-library/dom';
 import { act } from '@testing-library/react';
 
 describe('KsqlDb List', () => {
-  afterEach(() => fetchMock.reset());
-  it('renders List component with Tables and Streams tabs', async () => {
+  const renderComponent = async () => {
     await act(() => {
       render(<List />);
     });
-
+  };
+  afterEach(() => fetchMock.reset());
+  it('renders List component with Tables and Streams tabs', async () => {
+    await renderComponent();
     const Tables = screen.getByTitle('Tables');
     const Streams = screen.getByTitle('Streams');
-
     expect(Tables).toBeInTheDocument();
     expect(Streams).toBeInTheDocument();
   });

+ 146 - 0
kafka-ui-react-app/src/components/NavBar/NavBar.styled.ts

@@ -0,0 +1,146 @@
+import styled, { css } from 'styled-components';
+import { Link } from 'react-router-dom';
+import DiscordIcon from 'components/common/Icons/DiscordIcon';
+import GitIcon from 'components/common/Icons/GitIcon';
+
+export const Navbar = styled.nav(
+  ({ theme }) => css`
+    display: flex;
+    align-items: center;
+    justify-content: space-between;
+    border-bottom: 1px solid ${theme.layout.stuffBorderColor};
+    position: fixed;
+    top: 0;
+    left: 0;
+    right: 0;
+    z-index: 30;
+    background-color: ${theme.menu.backgroundColor.normal};
+    min-height: 3.25rem;
+  `
+);
+
+export const NavbarBrand = styled.div`
+  display: flex;
+  justify-content: flex-end;
+  align-items: center !important;
+  flex-shrink: 0;
+  min-height: 3.25rem;
+`;
+
+export const SocialLink = styled.a(
+  ({ theme: { layout, icons } }) => css`
+    display: block;
+    margin-top: 5px;
+    cursor: pointer;
+    fill: ${layout.socialLink.color};
+
+    &:hover {
+      ${DiscordIcon} {
+        fill: ${icons.discord.hover};
+      }
+
+      ${GitIcon} {
+        fill: ${icons.git.hover};
+      }
+    }
+
+    &:active {
+      ${DiscordIcon} {
+        fill: ${icons.discord.active};
+      }
+
+      ${GitIcon} {
+        fill: ${icons.git.active};
+      }
+    }
+  `
+);
+
+export const NavbarSocial = styled.div`
+  display: flex;
+  align-items: center;
+  gap: 10px;
+  margin: 10px;
+`;
+
+export const NavbarItem = styled.div`
+  display: flex;
+  position: relative;
+  flex-grow: 0;
+  flex-shrink: 0;
+  align-items: center;
+  line-height: 1.5;
+  padding: 0.5rem 0.75rem;
+`;
+
+export const NavbarBurger = styled.div(
+  ({ theme }) => css`
+    display: block;
+    position: relative;
+    cursor: pointer;
+    height: 3.25rem;
+    width: 3.25rem;
+    margin: 0;
+    padding: 0;
+
+    &:hover {
+      background-color: ${theme.menu.backgroundColor.hover};
+    }
+
+    @media screen and (min-width: 1024px) {
+      display: none;
+    }
+  `
+);
+
+export const Span = styled.span(
+  ({ theme }) => css`
+    display: block;
+    position: absolute;
+    background: ${theme.menu.color.active};
+    height: 1px;
+    left: calc(50% - 8px);
+    transform-origin: center;
+    transition-duration: 86ms;
+    transition-property: background-color, opacity, transform, -webkit-transform;
+    transition-timing-function: ease-out;
+    width: 16px;
+
+    &:first-child {
+      top: calc(50% - 6px);
+    }
+
+    &:nth-child(2) {
+      top: calc(50% - 1px);
+    }
+
+    &:nth-child(3) {
+      top: calc(50% + 4px);
+    }
+  `
+);
+
+export const Hyperlink = styled(Link)(
+  ({ theme }) => css`
+    position: relative;
+
+    display: flex;
+    flex-grow: 0;
+    flex-shrink: 0;
+    align-items: center;
+    gap: 8px;
+
+    margin: 0;
+    padding: 0.5rem 0.75rem;
+
+    font-family: Inter, sans-serif;
+    font-style: normal;
+    font-weight: bold;
+    font-size: 12px;
+    line-height: 16px;
+    color: ${theme.menu.color.active};
+    text-decoration: none;
+    word-break: break-word;
+    cursor: pointer;
+  `
+);

+ 60 - 0
kafka-ui-react-app/src/components/NavBar/NavBar.tsx

@@ -0,0 +1,60 @@
+import React from 'react';
+import Logo from 'components/common/Logo/Logo';
+import Version from 'components/Version/Version';
+import GitIcon from 'components/common/Icons/GitIcon';
+import DiscordIcon from 'components/common/Icons/DiscordIcon';
+
+import * as S from './NavBar.styled';
+import UserInfo from './UserInfo/UserInfo';
+
+interface Props {
+  onBurgerClick: () => void;
+}
+
+const NavBar: React.FC<Props> = ({ onBurgerClick }) => {
+  return (
+    <S.Navbar role="navigation" aria-label="Page Header">
+      <S.NavbarBrand>
+        <S.NavbarBrand>
+          <S.NavbarBurger
+            onClick={onBurgerClick}
+            onKeyDown={onBurgerClick}
+            role="button"
+            tabIndex={0}
+            aria-label="burger"
+          >
+            <S.Span role="separator" />
+            <S.Span role="separator" />
+            <S.Span role="separator" />
+          </S.NavbarBurger>
+
+          <S.Hyperlink to="/">
+            <Logo />
+            UI for Apache Kafka
+          </S.Hyperlink>
+
+          <S.NavbarItem>
+            <Version />
+          </S.NavbarItem>
+        </S.NavbarBrand>
+      </S.NavbarBrand>
+      <S.NavbarSocial>
+        <S.SocialLink
+          href="https://github.com/provectus/kafka-ui"
+          target="_blank"
+        >
+          <GitIcon />
+        </S.SocialLink>
+        <S.SocialLink
+          href="https://discord.com/invite/4DWzD7pGE5"
+          target="_blank"
+        >
+          <DiscordIcon />
+        </S.SocialLink>
+        <UserInfo />
+      </S.NavbarSocial>
+    </S.Navbar>
+  );
+};
+
+export default NavBar;

+ 19 - 0
kafka-ui-react-app/src/components/NavBar/UserInfo/UserInfo.styled.ts

@@ -0,0 +1,19 @@
+import styled, { css } from 'styled-components';
+
+export const Wrapper = styled.div`
+  display: flex;
+  justify-content: center;
+  align-items: center;
+  gap: 5px;
+  svg {
+    position: relative;
+  }
+`;
+
+export const Text = styled.div(
+  ({ theme }) => css`
+    color: ${theme.button.primary.invertedColors.normal};
+  `
+);
+
+export const LogoutLink = styled.a``;

+ 35 - 0
kafka-ui-react-app/src/components/NavBar/UserInfo/UserInfo.tsx

@@ -0,0 +1,35 @@
+import React from 'react';
+import { Dropdown, DropdownItem } from 'components/common/Dropdown';
+import UserIcon from 'components/common/Icons/UserIcon';
+import DropdownArrowIcon from 'components/common/Icons/DropdownArrowIcon';
+import { useTheme } from 'styled-components';
+import { useUserInfo } from 'lib/hooks/useUserInfo';
+
+import * as S from './UserInfo.styled';
+
+const UserInfo = () => {
+  const { username } = useUserInfo();
+  const theme = useTheme();
+
+  return username ? (
+    <Dropdown
+      label={
+        <S.Wrapper>
+          <UserIcon />
+          <S.Text>{username}</S.Text>
+          <DropdownArrowIcon
+            isOpen={false}
+            style={{}}
+            color={theme.button.primary.invertedColors.normal}
+          />
+        </S.Wrapper>
+      }
+    >
+      <DropdownItem>
+        <S.LogoutLink href="/logout">Log out</S.LogoutLink>
+      </DropdownItem>
+    </Dropdown>
+  ) : null;
+};
+
+export default UserInfo;
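
The useUserInfo hook consumed above is not shown in this diff. Assuming it simply reads from the UserInfoRolesAccessContext wired up in App.tsx, a minimal shape could look like the sketch below (the export name and the context's field names are assumptions):

import { useContext } from 'react';
import { UserPermission } from 'generated-sources';
// Assumed export; the provider itself is added in App.tsx in this PR.
import { UserInfoRolesAccessContext } from 'components/contexts/UserInfoRolesAccessContext';

interface UserInfoValue {
  username?: string;
  permissions: UserPermission[];
}

// Minimal sketch: surface whatever the provider stored after calling
// GET /api/authorization.
export function useUserInfo(): UserInfoValue {
  const { username, permissions } = useContext(UserInfoRolesAccessContext);
  return { username, permissions };
}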

+ 44 - 0
kafka-ui-react-app/src/components/NavBar/UserInfo/__tests__/UserInfo.spec.tsx

@@ -0,0 +1,44 @@
+import React from 'react';
+import { screen } from '@testing-library/react';
+import { render } from 'lib/testHelpers';
+import UserInfo from 'components/NavBar/UserInfo/UserInfo';
+import { useUserInfo } from 'lib/hooks/useUserInfo';
+import userEvent from '@testing-library/user-event';
+
+jest.mock('lib/hooks/useUserInfo', () => ({
+  useUserInfo: jest.fn(),
+}));
+
+describe('UserInfo', () => {
+  const renderComponent = () => render(<UserInfo />);
+
+  it('should render the userInfo with correct data', () => {
+    const username = 'someName';
+    (useUserInfo as jest.Mock).mockImplementation(() => ({ username }));
+
+    renderComponent();
+    expect(screen.getByText(username)).toBeInTheDocument();
+  });
+
+  it('should open the dropdown when the userInfo is clicked', async () => {
+    const username = 'someName';
+    (useUserInfo as jest.Mock).mockImplementation(() => ({ username }));
+
+    renderComponent();
+    const dropdown = screen.getByText(username);
+    await userEvent.click(dropdown);
+
+    const logout = screen.getByText('Log out');
+    expect(logout).toBeInTheDocument();
+    expect(logout).toHaveAttribute('href', '/logout');
+  });
+
+  it('should not render anything if the username does not exist', () => {
+    (useUserInfo as jest.Mock).mockImplementation(() => ({
+      username: undefined,
+    }));
+
+    renderComponent();
+    expect(screen.queryByRole('listbox')).not.toBeInTheDocument();
+  });
+});

+ 28 - 0
kafka-ui-react-app/src/components/NavBar/__tests__/NavBar.spec.tsx

@@ -0,0 +1,28 @@
+import React from 'react';
+import { render } from 'lib/testHelpers';
+import NavBar from 'components/NavBar/NavBar';
+import { screen, within } from '@testing-library/react';
+
+const burgerButtonOptions = { name: 'burger' };
+
+jest.mock('components/Version/Version', () => () => <div>Version</div>);
+jest.mock('components/NavBar/UserInfo/UserInfo', () => () => (
+  <div>UserInfo</div>
+));
+
+describe('NavBar', () => {
+  beforeEach(() => {
+    render(<NavBar onBurgerClick={jest.fn()} />);
+  });
+
+  it('correctly renders header', () => {
+    const header = screen.getByLabelText('Page Header');
+    expect(header).toBeInTheDocument();
+    expect(within(header).getByText('UI for Apache Kafka')).toBeInTheDocument();
+    expect(within(header).getAllByRole('separator').length).toEqual(3);
+    expect(
+      within(header).getByRole('button', burgerButtonOptions)
+    ).toBeInTheDocument();
+    expect(within(header).getByText('UserInfo')).toBeInTheDocument();
+  });
+});

+ 88 - 0
kafka-ui-react-app/src/components/PageContainer/PageContainer.styled.ts

@@ -0,0 +1,88 @@
+import styled, { css } from 'styled-components';
+
+export const Container = styled.main(
+  ({ theme }) => css`
+    margin-top: ${theme.layout.navBarHeight};
+    margin-left: ${theme.layout.navBarWidth};
+    position: relative;
+    padding-bottom: 30px;
+    z-index: 20;
+    max-width: calc(100vw - ${theme.layout.navBarWidth});
+    @media screen and (max-width: 1023px) {
+      margin-left: initial;
+      max-width: 100vw;
+    }
+  `
+);
+
+export const Sidebar = styled.div<{ $visible: boolean }>(
+  ({ theme, $visible }) => css`
+    width: ${theme.layout.navBarWidth};
+    display: flex;
+    flex-direction: column;
+    border-right: 1px solid ${theme.layout.stuffBorderColor};
+    position: fixed;
+    top: ${theme.layout.navBarHeight};
+    left: 0;
+    bottom: 0;
+    padding: 8px 16px;
+    overflow-y: scroll;
+    transition: width 0.25s, opacity 0.25s, transform 0.25s,
+      -webkit-transform 0.25s;
+    background: ${theme.menu.backgroundColor.normal};
+    @media screen and (max-width: 1023px) {
+      ${$visible &&
+      css`
+        transform: translate3d(${theme.layout.navBarWidth}, 0, 0);
+      `};
+      left: -${theme.layout.navBarWidth};
+      z-index: 100;
+    }
+
+    &::-webkit-scrollbar {
+      width: 8px;
+    }
+
+    &::-webkit-scrollbar-track {
+      background-color: ${theme.scrollbar.trackColor.normal};
+    }
+
+    &::-webkit-scrollbar-thumb {
+      width: 8px;
+      background-color: ${theme.scrollbar.thumbColor.normal};
+      border-radius: 4px;
+    }
+
+    &:hover::-webkit-scrollbar-thumb {
+      background: ${theme.scrollbar.thumbColor.active};
+    }
+
+    &:hover::-webkit-scrollbar-track {
+      background-color: ${theme.scrollbar.trackColor.active};
+    }
+  `
+);
+
+export const Overlay = styled.div<{ $visible: boolean }>(
+  ({ theme, $visible }) => css`
+    height: calc(100vh - ${theme.layout.navBarHeight});
+    z-index: 99;
+    visibility: hidden;
+    opacity: 0;
+    -webkit-transition: all 0.5s ease;
+    transition: all 0.5s ease;
+    left: 0;
+    position: absolute;
+    top: 0;
+    ${$visible &&
+    css`
+      @media screen and (max-width: 1023px) {
+        bottom: 0;
+        right: 0;
+        visibility: visible;
+        opacity: 0.7;
+        background-color: ${theme.layout.overlay.backgroundColor};
+      }
+    `}
+  `
+);

+ 41 - 0
kafka-ui-react-app/src/components/PageContainer/PageContainer.tsx

@@ -0,0 +1,41 @@
+import React, { PropsWithChildren, Suspense, useCallback } from 'react';
+import { useLocation } from 'react-router-dom';
+import NavBar from 'components/NavBar/NavBar';
+import * as S from 'components/PageContainer/PageContainer.styled';
+import PageLoader from 'components/common/PageLoader/PageLoader';
+import Nav from 'components/Nav/Nav';
+
+const PageContainer: React.FC<PropsWithChildren<unknown>> = ({ children }) => {
+  const [isSidebarVisible, setIsSidebarVisible] = React.useState(false);
+  const onBurgerClick = () => setIsSidebarVisible(!isSidebarVisible);
+  const closeSidebar = useCallback(() => setIsSidebarVisible(false), []);
+  const location = useLocation();
+
+  React.useEffect(() => {
+    closeSidebar();
+  }, [location, closeSidebar]);
+
+  return (
+    <>
+      <NavBar onBurgerClick={onBurgerClick} />
+      <S.Container>
+        <S.Sidebar aria-label="Sidebar" $visible={isSidebarVisible}>
+          <Suspense fallback={<PageLoader />}>
+            <Nav />
+          </Suspense>
+        </S.Sidebar>
+        <S.Overlay
+          $visible={isSidebarVisible}
+          onClick={closeSidebar}
+          onKeyDown={closeSidebar}
+          tabIndex={-1}
+          aria-hidden="true"
+          aria-label="Overlay"
+        />
+        {children}
+      </S.Container>
+    </>
+  );
+};
+
+export default PageContainer;

+ 47 - 0
kafka-ui-react-app/src/components/PageContainer/__tests__/PageContainer.spec.tsx

@@ -0,0 +1,47 @@
+import React from 'react';
+import { screen, within } from '@testing-library/react';
+import userEvent from '@testing-library/user-event';
+import { render } from 'lib/testHelpers';
+import PageContainer from 'components/PageContainer/PageContainer';
+import { useClusters } from 'lib/hooks/api/clusters';
+
+const burgerButtonOptions = { name: 'burger' };
+
+jest.mock('lib/hooks/api/clusters', () => ({
+  ...jest.requireActual('lib/hooks/api/clusters'),
+  useClusters: jest.fn(),
+}));
+
+jest.mock('components/Version/Version', () => () => <div>Version</div>);
+
+describe('Page Container', () => {
+  beforeEach(() => {
+    (useClusters as jest.Mock).mockImplementation(() => ({
+      isSuccess: false,
+    }));
+
+    render(
+      <PageContainer>
+        <div>child</div>
+      </PageContainer>
+    );
+  });
+
+  it('handles burger click correctly', async () => {
+    const burger = within(screen.getByLabelText('Page Header')).getByRole(
+      'button',
+      burgerButtonOptions
+    );
+    const overlay = screen.getByLabelText('Overlay');
+    expect(screen.getByLabelText('Sidebar')).toBeInTheDocument();
+    expect(overlay).toBeInTheDocument();
+    expect(overlay).toHaveStyleRule('visibility: hidden');
+    expect(burger).toHaveStyleRule('display: none');
+    await userEvent.click(burger);
+    expect(overlay).toHaveStyleRule('visibility: visible');
+  });
+
+  it('renders the inner container', async () => {
+    expect(screen.getByText('child')).toBeInTheDocument();
+  });
+});

+ 20 - 5
kafka-ui-react-app/src/components/Schemas/Details/Details.tsx

@@ -27,8 +27,13 @@ import { resetLoaderById } from 'redux/reducers/loader/loaderSlice';
 import { TableTitle } from 'components/common/table/TableTitle/TableTitle.styled';
 import useAppParams from 'lib/hooks/useAppParams';
 import { schemasApiClient } from 'lib/api';
-import { Dropdown, DropdownItem } from 'components/common/Dropdown';
+import { Dropdown } from 'components/common/Dropdown';
 import Table from 'components/common/NewTable';
+import { Action, ResourceType } from 'generated-sources';
+import {
+  ActionButton,
+  ActionDropdownItem,
+} from 'components/common/ActionComponent';
 
 import LatestVersionItem from './LatestVersion/LatestVersionItem';
 import SchemaVersion from './SchemaVersion/SchemaVersion';
@@ -106,15 +111,20 @@ const Details: React.FC = () => {
             >
               Compare Versions
             </Button>
-            <Button
+            <ActionButton
               buttonSize="M"
               buttonType="primary"
               to={clusterSchemaEditPageRelativePath}
+              permission={{
+                resource: ResourceType.SCHEMA,
+                action: Action.EDIT,
+                value: subject,
+              }}
             >
               Edit Schema
-            </Button>
+            </ActionButton>
             <Dropdown>
-              <DropdownItem
+              <ActionDropdownItem
                 confirm={
                   <>
                     Are you sure want to remove <b>{subject}</b> schema?
@@ -122,9 +132,14 @@ const Details: React.FC = () => {
                 }
                 onClick={deleteHandler}
                 danger
+                permission={{
+                  resource: ResourceType.SCHEMA,
+                  action: Action.DELETE,
+                  value: subject,
+                }}
               >
                 Remove schema
-              </DropdownItem>
+              </ActionDropdownItem>
             </Dropdown>
           </>
         )}

+ 11 - 3
kafka-ui-react-app/src/components/Schemas/List/GlobalSchemaSelector/GlobalSchemaSelector.tsx

@@ -1,6 +1,9 @@
 import React from 'react';
-import Select from 'components/common/Select/Select';
-import { CompatibilityLevelCompatibilityEnum } from 'generated-sources';
+import {
+  Action,
+  CompatibilityLevelCompatibilityEnum,
+  ResourceType,
+} from 'generated-sources';
 import { useAppDispatch } from 'lib/hooks/redux';
 import useAppParams from 'lib/hooks/useAppParams';
 import { fetchSchemas } from 'redux/reducers/schemas/schemasSlice';
@@ -10,6 +13,7 @@ import { showServerError } from 'lib/errorHandling';
 import { useConfirm } from 'lib/hooks/useConfirm';
 import { useSearchParams } from 'react-router-dom';
 import { PER_PAGE } from 'lib/constants';
+import { ActionSelect } from 'components/common/ActionComponent';
 
 import * as S from './GlobalSchemaSelector.styled';
 
@@ -79,7 +83,7 @@ const GlobalSchemaSelector: React.FC = () => {
   return (
     <S.Wrapper>
       <div>Global Compatibility Level: </div>
-      <Select
+      <ActionSelect
         selectSize="M"
         defaultValue={currentCompatibilityLevel}
         minWidth="200px"
@@ -88,6 +92,10 @@ const GlobalSchemaSelector: React.FC = () => {
         options={Object.keys(CompatibilityLevelCompatibilityEnum).map(
           (level) => ({ value: level, label: level })
         )}
+        permission={{
+          resource: ResourceType.SCHEMA,
+          action: Action.MODIFY_GLOBAL_COMPATIBILITY,
+        }}
       />
     </S.Wrapper>
   );

+ 8 - 4
kafka-ui-react-app/src/components/Schemas/List/List.tsx

@@ -5,7 +5,7 @@ import {
   clusterSchemaPath,
 } from 'lib/paths';
 import ClusterContext from 'components/contexts/ClusterContext';
-import { Button } from 'components/common/Button/Button';
+import { ActionButton } from 'components/common/ActionComponent';
 import PageHeading from 'components/common/PageHeading/PageHeading';
 import { useAppDispatch, useAppSelector } from 'lib/hooks/redux';
 import useAppParams from 'lib/hooks/useAppParams';
@@ -22,7 +22,7 @@ import Search from 'components/common/Search/Search';
 import PlusIcon from 'components/common/Icons/PlusIcon';
 import Table, { LinkCell } from 'components/common/NewTable';
 import { ColumnDef } from '@tanstack/react-table';
-import { SchemaSubject } from 'generated-sources';
+import { Action, SchemaSubject, ResourceType } from 'generated-sources';
 import { useNavigate, useSearchParams } from 'react-router-dom';
 import { PER_PAGE } from 'lib/constants';
 
@@ -79,13 +79,17 @@ const List: React.FC = () => {
         {!isReadOnly && (
           <>
             <GlobalSchemaSelector />
-            <Button
+            <ActionButton
               buttonSize="M"
               buttonType="primary"
               to={clusterSchemaNewRelativePath}
+              permission={{
+                resource: ResourceType.SCHEMA,
+                action: Action.CREATE,
+              }}
             >
               <PlusIcon /> Create Schema
-            </Button>
+            </ActionButton>
           </>
         )}
       </PageHeading>

+ 30 - 13
kafka-ui-react-app/src/components/Schemas/New/New.tsx

@@ -1,6 +1,6 @@
 import React from 'react';
 import { NewSchemaSubjectRaw } from 'redux/interfaces';
-import { FormProvider, useForm, Controller } from 'react-hook-form';
+import { Controller, FormProvider, useForm } from 'react-hook-form';
 import { ErrorMessage } from '@hookform/error-message';
 import {
   ClusterNameRoute,
@@ -22,6 +22,8 @@ import { useAppDispatch } from 'lib/hooks/redux';
 import useAppParams from 'lib/hooks/useAppParams';
 import { showServerError } from 'lib/errorHandling';
 import { schemasApiClient } from 'lib/api';
+import yup from 'lib/yupExtended';
+import { yupResolver } from '@hookform/resolvers/yup';
 
 import * as S from './New.styled';
 
@@ -31,6 +33,28 @@ const SchemaTypeOptions: Array<SelectOption> = [
   { value: SchemaType.PROTOBUF, label: 'PROTOBUF' },
 ];
 
+const schemaCreate = async (
+  { subject, schema, schemaType }: NewSchemaSubjectRaw,
+  clusterName: string
+) => {
+  return schemasApiClient.createNewSchema({
+    clusterName,
+    newSchemaSubject: { subject, schema, schemaType },
+  });
+};
+
+const validationSchema = yup.object().shape({
+  subject: yup
+    .string()
+    .required('Subject is required.')
+    .matches(
+      SCHEMA_NAME_VALIDATION_PATTERN,
+      'Only alphanumeric, _, -, and . allowed'
+    ),
+  schema: yup.string().required('Schema is required.'),
+  schemaType: yup.string().required('Schema Type is required.'),
+});
+
 const New: React.FC = () => {
   const { clusterName } = useAppParams<ClusterNameRoute>();
   const navigate = useNavigate();
@@ -40,6 +64,7 @@ const New: React.FC = () => {
     defaultValues: {
       schemaType: SchemaType.AVRO,
     },
+    resolver: yupResolver(validationSchema),
   });
   const {
     register,
@@ -54,10 +79,10 @@ const New: React.FC = () => {
     schemaType,
   }: NewSchemaSubjectRaw) => {
     try {
-      const resp = await schemasApiClient.createNewSchema({
-        clusterName,
-        newSchemaSubject: { subject, schema, schemaType },
-      });
+      const resp = await schemaCreate(
+        { subject, schema, schemaType } as NewSchemaSubjectRaw,
+        clusterName
+      );
       dispatch(schemaAdded(resp));
       navigate(clusterSchemaPath(clusterName, subject));
     } catch (e) {
@@ -79,13 +104,6 @@ const New: React.FC = () => {
             inputSize="M"
             placeholder="Schema Name"
             name="subject"
-            hookFormOptions={{
-              required: 'Schema Name is required.',
-              pattern: {
-                value: SCHEMA_NAME_VALIDATION_PATTERN,
-                message: 'Only alphanumeric, _, -, and . allowed',
-              },
-            }}
             autoComplete="off"
             disabled={isSubmitting}
           />
@@ -111,7 +129,6 @@ const New: React.FC = () => {
           <InputLabel>Schema Type *</InputLabel>
           <Controller
             control={control}
-            rules={{ required: 'Schema Type is required.' }}
             name="schemaType"
             defaultValue={SchemaTypeOptions[0].value as SchemaType}
             render={({ field: { name, onChange, value } }) => (

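The New.tsx hunk above replaces the inline `hookFormOptions` rules with an extracted yup schema wired in through `yupResolver`. The sketch below rebuilds the same validation with plain yup so it can be exercised on its own; `SUBJECT_PATTERN` stands in for `SCHEMA_NAME_VALIDATION_PATTERN`, whose exact regex is not visible in this diff, and plain `yup` is used in place of the project's `lib/yupExtended` wrapper.

```tsx
// Standalone sketch of the extracted validation, using plain yup.
import * as yup from 'yup';

// Assumption: SCHEMA_NAME_VALIDATION_PATTERN allows alphanumerics, '_', '-' and '.'.
const SUBJECT_PATTERN = /^[A-Za-z0-9._-]+$/;

export const schemaFormValidation = yup.object().shape({
  subject: yup
    .string()
    .required('Subject is required.')
    .matches(SUBJECT_PATTERN, 'Only alphanumeric, _, -, and . allowed'),
  schema: yup.string().required('Schema is required.'),
  schemaType: yup.string().required('Schema Type is required.'),
});

// Because the rules are now a standalone object, they can be checked outside the
// form as well, e.g. in a unit test:
schemaFormValidation
  .isValid({ subject: 'user.value', schema: '{"type":"string"}', schemaType: 'AVRO' })
  .then((ok) => console.log(ok)); // true
```
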
+ 17 - 6
kafka-ui-react-app/src/components/Topics/List/ActionsCell.tsx

@@ -1,5 +1,5 @@
 import React from 'react';
-import { CleanUpPolicy, Topic } from 'generated-sources';
+import { Action, CleanUpPolicy, Topic, ResourceType } from 'generated-sources';
 import { CellContext } from '@tanstack/react-table';
 import { useAppDispatch } from 'lib/hooks/redux';
 import ClusterContext from 'components/contexts/ClusterContext';
@@ -17,6 +17,7 @@ import {
   useDeleteTopic,
   useRecreateTopic,
 } from 'lib/hooks/api/topics';
+import { ActionDropdownItem } from 'components/common/ActionComponent';
 
 const ActionsCell: React.FC<CellContext<Topic, unknown>> = ({ row }) => {
   const { name, internal, cleanUpPolicy } = row.original;
@@ -36,18 +37,23 @@ const ActionsCell: React.FC<CellContext<Topic, unknown>> = ({ row }) => {
     await dispatch(
       clearTopicMessages({ clusterName, topicName: name })
     ).unwrap();
-    queryClient.invalidateQueries(topicKeys.all(clusterName));
+    return queryClient.invalidateQueries(topicKeys.all(clusterName));
   };
 
   const isCleanupDisabled = cleanUpPolicy !== CleanUpPolicy.DELETE;
 
   return (
     <Dropdown disabled={disabled}>
-      <DropdownItem
+      <ActionDropdownItem
         disabled={isCleanupDisabled}
         onClick={clearTopicMessagesHandler}
         confirm="Are you sure want to clear topic messages?"
         danger
+        permission={{
+          resource: ResourceType.TOPIC,
+          action: Action.MESSAGES_DELETE,
+          value: name,
+        }}
       >
         Clear Messages
         <DropdownItemHint>
@@ -55,7 +61,7 @@ const ActionsCell: React.FC<CellContext<Topic, unknown>> = ({ row }) => {
           <br />
           with DELETE policy
         </DropdownItemHint>
-      </DropdownItem>
+      </ActionDropdownItem>
       <DropdownItem
         onClick={recreateTopic.mutateAsync}
         confirm={
@@ -67,7 +73,7 @@ const ActionsCell: React.FC<CellContext<Topic, unknown>> = ({ row }) => {
       >
         Recreate Topic
       </DropdownItem>
-      <DropdownItem
+      <ActionDropdownItem
         disabled={!isTopicDeletionAllowed}
         onClick={() => deleteTopic.mutateAsync(name)}
         confirm={
@@ -76,6 +82,11 @@ const ActionsCell: React.FC<CellContext<Topic, unknown>> = ({ row }) => {
           </>
         }
         danger
+        permission={{
+          resource: ResourceType.TOPIC,
+          action: Action.DELETE,
+          value: name,
+        }}
       >
         Remove Topic
         {!isTopicDeletionAllowed && (
@@ -85,7 +96,7 @@ const ActionsCell: React.FC<CellContext<Topic, unknown>> = ({ row }) => {
             configuration level
           </DropdownItemHint>
         )}
-      </DropdownItem>
+      </ActionDropdownItem>
     </Dropdown>
   );
 };

+ 38 - 6
kafka-ui-react-app/src/components/Topics/List/BatchActionsBar.tsx

@@ -1,6 +1,6 @@
-import React from 'react';
+import React, { useMemo } from 'react';
 import { Row } from '@tanstack/react-table';
-import { Topic } from 'generated-sources';
+import { Action, Topic, ResourceType } from 'generated-sources';
 import useAppParams from 'lib/hooks/useAppParams';
 import { ClusterName } from 'redux/interfaces';
 import { topicKeys, useDeleteTopic } from 'lib/hooks/api/topics';
@@ -10,6 +10,9 @@ import { useAppDispatch } from 'lib/hooks/redux';
 import { clearTopicMessages } from 'redux/reducers/topicMessages/topicMessagesSlice';
 import { clusterTopicCopyRelativePath } from 'lib/paths';
 import { useQueryClient } from '@tanstack/react-query';
+import { ActionCanButton } from 'components/common/ActionComponent';
+import { isPermitted } from 'lib/permissions';
+import { useUserInfo } from 'lib/hooks/useUserInfo';
 
 interface BatchActionsbarProps {
   rows: Row<Topic>[];
@@ -85,17 +88,45 @@ const BatchActionsbar: React.FC<BatchActionsbarProps> = ({
       search: new URLSearchParams(search).toString(),
     };
   };
+  const { roles, rbacFlag } = useUserInfo();
+
+  const canDeleteSelectedTopics = useMemo(() => {
+    return selectedTopics.every((value) =>
+      isPermitted({
+        roles,
+        resource: ResourceType.TOPIC,
+        action: Action.DELETE,
+        value,
+        clusterName,
+        rbacFlag,
+      })
+    );
+  }, [selectedTopics, clusterName, roles]);
+
+  const canPurgeSelectedTopics = useMemo(() => {
+    return selectedTopics.every((value) =>
+      isPermitted({
+        roles,
+        resource: ResourceType.TOPIC,
+        action: Action.MESSAGES_DELETE,
+        value,
+        clusterName,
+        rbacFlag,
+      })
+    );
+  }, [selectedTopics, clusterName, roles]);
 
   return (
     <>
-      <Button
+      <ActionCanButton
         buttonSize="M"
         buttonType="secondary"
         onClick={deleteTopicsHandler}
         disabled={!selectedTopics.length}
+        canDoAction={canDeleteSelectedTopics}
       >
         Delete selected topics
-      </Button>
+      </ActionCanButton>
       <Button
         buttonSize="M"
         buttonType="secondary"
@@ -104,14 +135,15 @@ const BatchActionsbar: React.FC<BatchActionsbarProps> = ({
       >
         Copy selected topic
       </Button>
-      <Button
+      <ActionCanButton
         buttonSize="M"
         buttonType="secondary"
         onClick={purgeTopicsHandler}
         disabled={!selectedTopics.length}
+        canDoAction={canPurgeSelectedTopics}
       >
         Purge messages of selected topics
-      </Button>
+      </ActionCanButton>
     </>
   );
 };

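BatchActionsBar above checks permissions imperatively rather than through a wrapper component: `useUserInfo` supplies `roles` and `rbacFlag`, and `isPermitted` from `lib/permissions` is evaluated once per selected topic so a single flag can disable the whole batch button. The call shape implied by that usage is sketched below; the concrete type of `roles` is not visible in this diff, so it is left as a placeholder, and `value` being optional is an assumption based on the cluster-wide checks elsewhere in the change.

```tsx
// Inferred call shape of lib/permissions#isPermitted, based only on the usage above.
import { Action, ResourceType } from 'generated-sources';

type RolesSketch = unknown; // placeholder: whatever useUserInfo() exposes as `roles`

export interface IsPermittedArgsSketch {
  roles: RolesSketch;
  resource: ResourceType;
  action: Action;
  value?: string;        // topic name here; omitted for cluster-wide actions
  clusterName: string;
  rbacFlag: boolean;     // flag indicating whether RBAC is enabled at all
}
```

Precomputing `canDeleteSelectedTopics` and `canPurgeSelectedTopics` with `useMemo` keeps the permission work out of render and lets one check cover every selected topic at once, instead of wrapping each row's button in a permission-aware component.
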
+ 8 - 3
kafka-ui-react-app/src/components/Topics/List/ListPage.tsx

@@ -4,13 +4,14 @@ import { clusterTopicNewRelativePath } from 'lib/paths';
 import { PER_PAGE } from 'lib/constants';
 import ClusterContext from 'components/contexts/ClusterContext';
 import Search from 'components/common/Search/Search';
-import { Button } from 'components/common/Button/Button';
+import { ActionButton } from 'components/common/ActionComponent';
 import PageHeading from 'components/common/PageHeading/PageHeading';
 import { ControlPanelWrapper } from 'components/common/ControlPanel/ControlPanel.styled';
 import Switch from 'components/common/Switch/Switch';
 import PlusIcon from 'components/common/Icons/PlusIcon';
 import PageLoader from 'components/common/PageLoader/PageLoader';
 import TopicTable from 'components/Topics/List/TopicTable';
+import { Action, ResourceType } from 'generated-sources';
 
 const ListPage: React.FC = () => {
   const { isReadOnly } = React.useContext(ClusterContext);
@@ -47,13 +48,17 @@ const ListPage: React.FC = () => {
     <>
       <PageHeading text="Topics">
         {!isReadOnly && (
-          <Button
+          <ActionButton
             buttonType="primary"
             buttonSize="M"
             to={clusterTopicNewRelativePath}
+            permission={{
+              resource: ResourceType.TOPIC,
+              action: Action.CREATE,
+            }}
           >
             <PlusIcon /> Add a Topic
-          </Button>
+          </ActionButton>
         )}
       </PageHeading>
       <ControlPanelWrapper hasInput>

+ 16 - 15
kafka-ui-react-app/src/components/Topics/List/__tests__/ListPage.spec.tsx

@@ -28,22 +28,23 @@ describe('ListPage Component', () => {
       { initialEntries: [clusterTopicsPath(clusterName)] }
     );
 
-  beforeEach(() => {
-    renderComponent();
-  });
-
-  it('handles switch of Internal Topics visibility', async () => {
-    const switchInput = screen.getByLabelText('Show Internal Topics');
-    expect(switchInput).toBeInTheDocument();
+  describe('Component Render', () => {
+    beforeEach(() => {
+      renderComponent();
+    });
+    it('handles switch of Internal Topics visibility', async () => {
+      const switchInput = screen.getByLabelText('Show Internal Topics');
+      expect(switchInput).toBeInTheDocument();
 
-    expect(global.localStorage.getItem('hideInternalTopics')).toBeNull();
-    await userEvent.click(switchInput);
-    expect(global.localStorage.getItem('hideInternalTopics')).toBeTruthy();
-    await userEvent.click(switchInput);
-    expect(global.localStorage.getItem('hideInternalTopics')).toBeNull();
-  });
+      expect(global.localStorage.getItem('hideInternalTopics')).toBeNull();
+      await userEvent.click(switchInput);
+      expect(global.localStorage.getItem('hideInternalTopics')).toBeTruthy();
+      await userEvent.click(switchInput);
+      expect(global.localStorage.getItem('hideInternalTopics')).toBeNull();
+    });
 
-  it('renders the TopicsTable', () => {
-    expect(screen.getByText('TopicTableMock')).toBeInTheDocument();
+    it('renders the TopicsTable', () => {
+      expect(screen.getByText('TopicTableMock')).toBeInTheDocument();
+    });
   });
 });

+ 3 - 1
kafka-ui-react-app/src/components/Topics/List/__tests__/TopicTable.spec.tsx

@@ -66,7 +66,9 @@ describe('TopicTable Components', () => {
           <TopicTable />
         </WithRoute>
       </ClusterContext.Provider>,
-      { initialEntries: [clusterTopicsPath(clusterName)] }
+      {
+        initialEntries: [clusterTopicsPath(clusterName)],
+      }
     );
   };
 

+ 4 - 4
kafka-ui-react-app/src/components/Topics/New/New.tsx

@@ -1,9 +1,9 @@
 import React from 'react';
 import { TopicFormData } from 'redux/interfaces';
-import { useForm, FormProvider } from 'react-hook-form';
+import { FormProvider, useForm } from 'react-hook-form';
 import { ClusterNameRoute, clusterTopicsPath } from 'lib/paths';
 import TopicForm from 'components/Topics/shared/Form/TopicForm';
-import { useNavigate, useLocation } from 'react-router-dom';
+import { useLocation, useNavigate } from 'react-router-dom';
 import { yupResolver } from '@hookform/resolvers/yup';
 import { topicFormValidationSchema } from 'lib/yupExtended';
 import PageHeading from 'components/common/PageHeading/PageHeading';
@@ -19,12 +19,12 @@ enum Filters {
 }
 
 const New: React.FC = () => {
+  const { clusterName } = useAppParams<ClusterNameRoute>();
   const methods = useForm<TopicFormData>({
     mode: 'onChange',
     resolver: yupResolver(topicFormValidationSchema),
   });
 
-  const { clusterName } = useAppParams<ClusterNameRoute>();
   const createTopic = useCreateTopic(clusterName);
 
   const navigate = useNavigate();
@@ -39,7 +39,7 @@ const New: React.FC = () => {
   const cleanUpPolicy = params.get(Filters.CLEANUP_POLICY) || 'Delete';
 
   const onSubmit = async (data: TopicFormData) => {
-    await createTopic.mutateAsync(data);
+    await createTopic.createResource(data);
     navigate(`../${data.name}`);
   };
 

+ 2 - 1
kafka-ui-react-app/src/components/Topics/New/__test__/New.spec.tsx

@@ -23,6 +23,7 @@ jest.mock('react-router-dom', () => ({
 jest.mock('lib/hooks/api/topics', () => ({
   useCreateTopic: jest.fn(),
 }));
+
 const renderComponent = (path: string) => {
   render(
     <Routes>
@@ -38,7 +39,7 @@ const createTopicMock = jest.fn();
 describe('New', () => {
   beforeEach(() => {
     (useCreateTopic as jest.Mock).mockImplementation(() => ({
-      mutateAsync: createTopicMock,
+      createResource: createTopicMock,
     }));
   });
   afterEach(() => {

+ 13 - 4
kafka-ui-react-app/src/components/Topics/Topic/Overview/ActionsCell.tsx

@@ -1,13 +1,14 @@
 import React from 'react';
-import { Partition } from 'generated-sources';
+import { Action, Partition, ResourceType } from 'generated-sources';
 import { CellContext } from '@tanstack/react-table';
 import { useAppDispatch } from 'lib/hooks/redux';
 import ClusterContext from 'components/contexts/ClusterContext';
 import { RouteParamsClusterTopic } from 'lib/paths';
 import useAppParams from 'lib/hooks/useAppParams';
 import { clearTopicMessages } from 'redux/reducers/topicMessages/topicMessagesSlice';
-import { Dropdown, DropdownItem } from 'components/common/Dropdown';
+import { Dropdown } from 'components/common/Dropdown';
 import { useTopicDetails } from 'lib/hooks/api/topics';
+import { ActionDropdownItem } from 'components/common/ActionComponent';
 
 const ActionsCell: React.FC<CellContext<Partition, unknown>> = ({ row }) => {
   const { clusterName, topicName } = useAppParams<RouteParamsClusterTopic>();
@@ -25,9 +26,17 @@ const ActionsCell: React.FC<CellContext<Partition, unknown>> = ({ row }) => {
     data?.internal || isReadOnly || data?.cleanUpPolicy !== 'DELETE';
   return (
     <Dropdown disabled={disabled}>
-      <DropdownItem onClick={clearTopicMessagesHandler} danger>
+      <ActionDropdownItem
+        onClick={clearTopicMessagesHandler}
+        danger
+        permission={{
+          resource: ResourceType.TOPIC,
+          action: Action.MESSAGES_DELETE,
+          value: topicName,
+        }}
+      >
         Clear Messages
-      </DropdownItem>
+      </ActionDropdownItem>
     </Dropdown>
   );
 };

+ 16 - 5
kafka-ui-react-app/src/components/Topics/Topic/Statistics/Metrics.tsx

@@ -6,7 +6,6 @@ import {
 } from 'lib/hooks/api/topics';
 import useAppParams from 'lib/hooks/useAppParams';
 import { RouteParamsClusterTopic } from 'lib/paths';
-import { Button } from 'components/common/Button/Button';
 import * as Informers from 'components/common/Metrics';
 import ProgressBar from 'components/common/ProgressBar/ProgressBar';
 import {
@@ -16,6 +15,8 @@ import {
 import BytesFormatted from 'components/common/BytesFormatted/BytesFormatted';
 import { useTimeFormat } from 'lib/hooks/useTimeFormat';
 import { calculateTimer } from 'lib/dateTimeHelpers';
+import { Action, ResourceType } from 'generated-sources';
+import { ActionButton } from 'components/common/ActionComponent';
 
 import * as S from './Statistics.styles';
 import Total from './Indicators/Total';
@@ -50,16 +51,21 @@ const Metrics: React.FC = () => {
           <ProgressBar completed={data.progress.completenessPercent || 0} />
           <span> {Math.floor(data.progress.completenessPercent || 0)} %</span>
         </S.ProgressBarWrapper>
-        <Button
+        <ActionButton
           onClick={async () => {
             await cancelTopicAnalysis.mutateAsync();
             setIsAnalyzing(true);
           }}
           buttonType="primary"
           buttonSize="M"
+          permission={{
+            resource: ResourceType.TOPIC,
+            action: Action.MESSAGES_READ,
+            value: params.topicName,
+          }}
         >
           Stop Analysis
-        </Button>
+        </ActionButton>
         <List>
           <Label>Started at</Label>
           <span>{formatTimestamp(data.progress.startedAt, 'hh:mm:ss a')}</span>
@@ -87,16 +93,21 @@ const Metrics: React.FC = () => {
     <>
       <S.ActionsBar>
         <S.CreatedAt>{formatTimestamp(data?.result?.finishedAt)}</S.CreatedAt>
-        <Button
+        <ActionButton
           onClick={async () => {
             await analyzeTopic.mutateAsync();
             setIsAnalyzing(true);
           }}
           buttonType="primary"
           buttonSize="S"
+          permission={{
+            resource: ResourceType.TOPIC,
+            action: Action.MESSAGES_READ,
+            value: params.topicName,
+          }}
         >
           Restart Analysis
-        </Button>
+        </ActionButton>
       </S.ActionsBar>
       <Informers.Wrapper>
         <Total {...totalStats} />

+ 9 - 3
kafka-ui-react-app/src/components/Topics/Topic/Statistics/Statistics.tsx

@@ -5,7 +5,8 @@ import useAppParams from 'lib/hooks/useAppParams';
 import { RouteParamsClusterTopic } from 'lib/paths';
 import { QueryErrorResetBoundary } from '@tanstack/react-query';
 import { ErrorBoundary } from 'react-error-boundary';
-import { Button } from 'components/common/Button/Button';
+import { Action, ResourceType } from 'generated-sources';
+import { ActionButton } from 'components/common/ActionComponent';
 
 import * as S from './Statistics.styles';
 import Metrics from './Metrics';
@@ -21,16 +22,21 @@ const Statistics: React.FC = () => {
           onReset={reset}
           fallbackRender={({ resetErrorBoundary }) => (
             <S.ProgressContainer>
-              <Button
+              <ActionButton
                 onClick={async () => {
                   await analyzeTopic.mutateAsync();
                   resetErrorBoundary();
                 }}
                 buttonType="primary"
                 buttonSize="M"
+                permission={{
+                  resource: ResourceType.TOPIC,
+                  action: Action.MESSAGES_READ,
+                  value: params.topicName,
+                }}
               >
                 Start Analysis
-              </Button>
+              </ActionButton>
             </S.ProgressContainer>
           )}
         >

+ 2 - 6
kafka-ui-react-app/src/components/Topics/Topic/Statistics/__test__/Statistics.spec.tsx

@@ -26,29 +26,25 @@ describe('Statistics', () => {
     );
   };
   const startMock = jest.fn();
-  it('renders Metricks component', async () => {
+  it('renders Metrics component', async () => {
     (useTopicAnalysis as jest.Mock).mockImplementation(() => ({
       data: { result: 1 },
     }));
 
     renderComponent();
-
     await expect(screen.getByText('Restart Analysis')).toBeInTheDocument();
     expect(screen.queryByRole('progressbar')).not.toBeInTheDocument();
   });
   it('renders Start Analysis button', async () => {
-    // throwing intentional For error boundaries to work
     jest.spyOn(console, 'error').mockImplementation(() => undefined);
     (useAnalyzeTopic as jest.Mock).mockImplementation(() => ({
       mutateAsync: startMock,
     }));
-    (useTopicAnalysis as jest.Mock).mockImplementation(() => {
-      throw new Error('Error boundary');
-    });
     renderComponent();
     const btn = screen.getByRole('button', { name: 'Start Analysis' });
     expect(btn).toBeInTheDocument();
     await waitFor(() => userEvent.click(btn));
     expect(startMock).toHaveBeenCalled();
+    jest.clearAllMocks();
   });
 });

+ 56 - 23
kafka-ui-react-app/src/components/Topics/Topic/Topic.tsx

@@ -1,25 +1,25 @@
 import React, { Suspense } from 'react';
 import { NavLink, Route, Routes, useNavigate } from 'react-router-dom';
 import {
-  RouteParamsClusterTopic,
-  clusterTopicMessagesRelativePath,
-  clusterTopicSettingsRelativePath,
   clusterTopicConsumerGroupsRelativePath,
   clusterTopicEditRelativePath,
-  clusterTopicStatisticsRelativePath,
+  clusterTopicMessagesRelativePath,
+  clusterTopicSettingsRelativePath,
   clusterTopicsPath,
+  clusterTopicStatisticsRelativePath,
+  RouteParamsClusterTopic,
 } from 'lib/paths';
 import ClusterContext from 'components/contexts/ClusterContext';
 import PageHeading from 'components/common/PageHeading/PageHeading';
-import { Button } from 'components/common/Button/Button';
+import {
+  ActionButton,
+  ActionNavLink,
+  ActionDropdownItem,
+} from 'components/common/ActionComponent';
 import Navbar from 'components/common/Navigation/Navbar.styled';
 import { useAppDispatch } from 'lib/hooks/redux';
 import useAppParams from 'lib/hooks/useAppParams';
-import {
-  Dropdown,
-  DropdownItem,
-  DropdownItemHint,
-} from 'components/common/Dropdown';
+import { Dropdown, DropdownItemHint } from 'components/common/Dropdown';
 import {
   useDeleteTopic,
   useRecreateTopic,
@@ -29,7 +29,7 @@ import {
   clearTopicMessages,
   resetTopicMessages,
 } from 'redux/reducers/topicMessages/topicMessagesSlice';
-import { CleanUpPolicy } from 'generated-sources';
+import { Action, CleanUpPolicy, ResourceType } from 'generated-sources';
 import PageLoader from 'components/common/PageLoader/PageLoader';
 import SlidingSidebar from 'components/common/SlidingSidebar';
 import useBoolean from 'lib/hooks/useBoolean';
@@ -50,6 +50,7 @@ const Topic: React.FC = () => {
     setTrue: openSidebar,
   } = useBoolean(false);
   const { clusterName, topicName } = useAppParams<RouteParamsClusterTopic>();
+
   const navigate = useNavigate();
   const deleteTopic = useDeleteTopic(clusterName);
   const recreateTopic = useRecreateTopic({ clusterName, topicName });
@@ -78,31 +79,48 @@ const Topic: React.FC = () => {
         backText="Topics"
         backTo={clusterTopicsPath(clusterName)}
       >
-        <Button
+        <ActionButton
           buttonSize="M"
           buttonType="primary"
           onClick={openSidebar}
           disabled={isReadOnly}
+          permission={{
+            resource: ResourceType.TOPIC,
+            action: Action.MESSAGES_PRODUCE,
+            value: topicName,
+          }}
         >
           Produce Message
-        </Button>
+        </ActionButton>
         <Dropdown disabled={isReadOnly || data?.internal}>
-          <DropdownItem onClick={() => navigate(clusterTopicEditRelativePath)}>
+          <ActionDropdownItem
+            onClick={() => navigate(clusterTopicEditRelativePath)}
+            permission={{
+              resource: ResourceType.TOPIC,
+              action: Action.EDIT,
+              value: topicName,
+            }}
+          >
             Edit settings
             <DropdownItemHint>
               Pay attention! This operation has
               <br />
               especially important consequences.
             </DropdownItemHint>
-          </DropdownItem>
+          </ActionDropdownItem>
 
-          <DropdownItem
+          <ActionDropdownItem
             onClick={() =>
               dispatch(clearTopicMessages({ clusterName, topicName })).unwrap()
             }
             confirm="Are you sure want to clear topic messages?"
             disabled={!canCleanup}
             danger
+            permission={{
+              resource: ResourceType.TOPIC,
+              action: Action.MESSAGES_DELETE,
+              value: topicName,
+            }}
           >
             Clear messages
             <DropdownItemHint>
@@ -110,9 +128,9 @@ const Topic: React.FC = () => {
               <br />
               with DELETE policy
             </DropdownItemHint>
-          </DropdownItem>
+          </ActionDropdownItem>
 
-          <DropdownItem
+          <ActionDropdownItem
             onClick={recreateTopic.mutateAsync}
             confirm={
               <>
@@ -120,10 +138,15 @@ const Topic: React.FC = () => {
               </>
             }
             danger
+            permission={{
+              resource: ResourceType.TOPIC,
+              action: [Action.MESSAGES_READ, Action.CREATE, Action.DELETE],
+              value: topicName,
+            }}
           >
             Recreate Topic
-          </DropdownItem>
-          <DropdownItem
+          </ActionDropdownItem>
+          <ActionDropdownItem
             onClick={deleteTopicHandler}
             confirm={
               <>
@@ -132,6 +155,11 @@ const Topic: React.FC = () => {
             }
             disabled={!isTopicDeletionAllowed}
             danger
+            permission={{
+              resource: ResourceType.TOPIC,
+              action: Action.DELETE,
+              value: topicName,
+            }}
           >
             Remove Topic
             {!isTopicDeletionAllowed && (
@@ -141,7 +169,7 @@ const Topic: React.FC = () => {
                 configuration level
               </DropdownItemHint>
             )}
-          </DropdownItem>
+          </ActionDropdownItem>
         </Dropdown>
       </PageHeading>
       <Navbar role="navigation">
@@ -152,12 +180,17 @@ const Topic: React.FC = () => {
         >
           Overview
         </NavLink>
-        <NavLink
+        <ActionNavLink
           to={clusterTopicMessagesRelativePath}
           className={({ isActive }) => (isActive ? 'is-active' : '')}
+          permission={{
+            resource: ResourceType.TOPIC,
+            action: Action.MESSAGES_READ,
+            value: topicName,
+          }}
         >
           Messages
-        </NavLink>
+        </ActionNavLink>
         <NavLink
           to={clusterTopicConsumerGroupsRelativePath}
           className={({ isActive }) => (isActive ? 'is-active' : '')}

+ 12 - 32
kafka-ui-react-app/src/components/__tests__/App.spec.tsx

@@ -1,61 +1,41 @@
 import React from 'react';
-import { screen, within } from '@testing-library/react';
+import { screen } from '@testing-library/react';
 import App from 'components/App';
 import { render } from 'lib/testHelpers';
-import userEvent from '@testing-library/user-event';
 import { useTimeFormat } from 'lib/hooks/api/timeFormat';
 import { defaultGlobalSettingsValue } from 'components/contexts/GlobalSettingsContext';
-
-const burgerButtonOptions = { name: 'burger' };
-const logoutButtonOptions = { name: 'Log out' };
+import { useGetUserInfo } from 'lib/hooks/api/roles';
 
 jest.mock('components/Nav/Nav', () => () => <div>Navigation</div>);
 
 jest.mock('components/Version/Version', () => () => <div>Version</div>);
 
+jest.mock('components/NavBar/NavBar', () => () => <div>NavBar</div>);
+
 jest.mock('lib/hooks/api/timeFormat', () => ({
   ...jest.requireActual('lib/hooks/api/timeFormat'),
   useTimeFormat: jest.fn(),
 }));
 
+jest.mock('lib/hooks/api/roles', () => ({
+  useGetUserInfo: jest.fn(),
+}));
+
 describe('App', () => {
   beforeEach(() => {
     (useTimeFormat as jest.Mock).mockImplementation(() => ({
       data: defaultGlobalSettingsValue,
     }));
 
+    (useGetUserInfo as jest.Mock).mockImplementation(() => ({
+      data: {},
+    }));
+
     render(<App />, {
       initialEntries: ['/'],
     });
   });
 
-  it('correctly renders header', () => {
-    const header = screen.getByLabelText('Page Header');
-    expect(header).toBeInTheDocument();
-    expect(within(header).getByText('UI for Apache Kafka')).toBeInTheDocument();
-    expect(within(header).getAllByRole('separator').length).toEqual(3);
-    expect(
-      within(header).getByRole('button', burgerButtonOptions)
-    ).toBeInTheDocument();
-    expect(
-      within(header).getByRole('button', logoutButtonOptions)
-    ).toBeInTheDocument();
-  });
-
-  it('handle burger click correctly', async () => {
-    const burger = within(screen.getByLabelText('Page Header')).getByRole(
-      'button',
-      burgerButtonOptions
-    );
-    const overlay = screen.getByLabelText('Overlay');
-    expect(screen.getByLabelText('Sidebar')).toBeInTheDocument();
-    expect(overlay).toBeInTheDocument();
-    expect(overlay).toHaveStyleRule('visibility: hidden');
-    expect(burger).toHaveStyleRule('display: none');
-    await userEvent.click(burger);
-    expect(overlay).toHaveStyleRule('visibility: visible');
-  });
-
   it('Renders navigation', async () => {
     expect(screen.getByText('Navigation')).toBeInTheDocument();
   });

+ 18 - 0
kafka-ui-react-app/src/components/common/ActionComponent/ActionButton/ActionButton.tsx

@@ -0,0 +1,18 @@
+import React from 'react';
+import { Props as ButtonProps } from 'components/common/Button/Button';
+import { ActionComponentProps } from 'components/common/ActionComponent/ActionComponent';
+import { Action } from 'generated-sources';
+import ActionPermissionButton from 'components/common/ActionComponent/ActionButton/ActionPermissionButton/ActionPermissionButton';
+import ActionCreateButton from 'components/common/ActionComponent/ActionButton/ActionCreateButton/ActionCreateButton';
+
+interface Props extends ActionComponentProps, ButtonProps {}
+
+const ActionButton: React.FC<Props> = ({ permission, ...props }) => {
+  return permission.action === Action.CREATE ? (
+    <ActionCreateButton permission={permission} {...props} />
+  ) : (
+    <ActionPermissionButton permission={permission} {...props} />
+  );
+};
+
+export default ActionButton;

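For reference, a usage sketch of the new ActionButton mirroring the "Add a Topic" call site in ListPage.tsx above; the relative `to` path is illustrative rather than the app's actual `clusterTopicNewRelativePath` constant.

```tsx
import React from 'react';
import { Action, ResourceType } from 'generated-sources';
import { ActionButton } from 'components/common/ActionComponent';

// Cluster-wide CREATE check: no `value` is supplied, and ActionButton routes this
// case to ActionCreateButton internally (see the component above).
const CreateTopicButton: React.FC = () => (
  <ActionButton
    buttonType="primary"
    buttonSize="M"
    to="new" // illustrative; the app passes clusterTopicNewRelativePath
    permission={{
      resource: ResourceType.TOPIC,
      action: Action.CREATE,
    }}
  >
    Add a Topic
  </ActionButton>
);

export default CreateTopicButton;
```
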
+ 48 - 0
kafka-ui-react-app/src/components/common/ActionComponent/ActionButton/ActionCanButton/ActionCanButton.tsx

@@ -0,0 +1,48 @@
+import React from 'react';
+import { Button, Props as ButtonProps } from 'components/common/Button/Button';
+import * as S from 'components/common/ActionComponent/ActionComponent.styled';
+import {
+  ActionComponentProps,
+  getDefaultActionMessage,
+} from 'components/common/ActionComponent/ActionComponent';
+import { useActionTooltip } from 'lib/hooks/useActionTooltip';
+
+interface Props extends Omit<ActionComponentProps, 'permission'>, ButtonProps {
+  canDoAction: boolean;
+}
+
+const ActionButton: React.FC<Props> = ({
+  placement = 'bottom-end',
+  message = getDefaultActionMessage(),
+  disabled,
+  canDoAction,
+  ...props
+}) => {
+  const isDisabled = !canDoAction;
+
+  const { x, y, reference, floating, strategy, open } = useActionTooltip(
+    isDisabled,
+    placement
+  );
+
+  return (
+    <S.Wrapper ref={reference}>
+      <Button {...props} disabled={disabled || isDisabled} />
+      {open && (
+        <S.MessageTooltipLimited
+          ref={floating}
+          style={{
+            position: strategy,
+            top: y ?? 0,
+            left: x ?? 0,
+            width: 'max-content',
+          }}
+        >
+          {message}
+        </S.MessageTooltipLimited>
+      )}
+    </S.Wrapper>
+  );
+};
+
+export default ActionButton;

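ActionCanButton takes a different route from the permission-prop components: the caller computes the boolean itself (as BatchActionsBar does with `isPermitted` over every selected topic) and the button only handles the disabled state and the tooltip. A usage sketch mirroring that call site, with hypothetical prop names:

```tsx
import React from 'react';
import { ActionCanButton } from 'components/common/ActionComponent';

// Props are hypothetical; in BatchActionsBar the flag comes from a useMemo over
// isPermitted for each selected topic.
interface Props {
  canDelete: boolean;
  hasSelection: boolean;
  onDelete: () => void;
}

const DeleteSelectedButton: React.FC<Props> = ({
  canDelete,
  hasSelection,
  onDelete,
}) => (
  <ActionCanButton
    buttonSize="M"
    buttonType="secondary"
    onClick={onDelete}
    disabled={!hasSelection}
    canDoAction={canDelete}
  >
    Delete selected topics
  </ActionCanButton>
);

export default DeleteSelectedButton;
```
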
Some files were not shown because too many files changed in this change