Commit 74fd11f7 authored by globa

Initial commit
* text eol=lf
* text=auto
**/*.h2.db
**/*.iml
**/*.mv.db
**/*.tmp
**/*.trace.db
**/*.ucls
**/.externalToolBuilders/
**/overlays/
**/rebel.xml
**/singulardb.lock.db
**/target/
*.bak
*.bat text eol=crlf
*.crt binary
*.css text
*.dll binary
*.doc binary
*.docx binary
*.dylib binary
*.eot binary
*.exe binary
*.gif binary
*.html text
*.jar binary
*.java text
*.jpg binary
*.js text
*.otf binary
*.pdf binary
*.png binary
*.properties text
*.sh text
*.so binary
*.so.0 binary
*.so.0.0.0 binary
*.so.0.1 binary
*.so.0.1.0 binary
*.so.0.12.4 binary
*.so.1 binary
*.so.1.0.0 binary
*.so.1.1.0 binary
*.so.1.2.7 binary
*.so.1.3.0 binary
*.so.1.7.0 binary
*.so.12 binary
*.so.6 binary
*.so.6.0.0 binary
*.so.6.10.0 binary
*.so.6.3.0 binary
*.so.6.4.0 binary
*.svg binary
*.swp
*.tmp
*.truststore binary
*.ttf binary
*.woff binary
*.woff2 binary
*.xml text
*.zip binary
*~.nib
.DS_Store
.DS_Store?
.Spotlight-V100
.Trashes
._*
.classpath
.idea/
.loadpath
.metadata
.project
.settings/
Thumbs.db
_confHomol/
atlassian-ide-plugin.xml
bin/**
buildall.sh
classes/
ehthumbs.db
flow/test/singulardb.trace.db.old
local.properties
out/**
pom.xml.next
pom.xml.releaseBackup
pom.xml.tag
pom.xml.versionsBackup
release.properties
resources/ui-static-resources/src/main/webapp/resources/comum/* linguist-vendored
tmp/**
tmp/**/*
wkhtmltoimage binary
wkhtmltopdf binary
keycloak
**/target
.DS_Store
.idea
.vscode/
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "{}"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright {yyyy} {name of copyright owner}
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
# singular-keycloak-database-federation
### Compatible with Keycloak 17+ (Quarkus based).
### **Keycloak 19+** KNOWN ISSUE:
#### The new admin theme breaks custom providers. To work around this problem, follow these steps:
- Click "Realm Settings" on the left menu
- Then click the tab "Themes"
- And, for the selection input labeled "Admin console theme", select "keycloak"
- Logoff and login again
- Now, if you try to configure this provider again, Keycloak should render all configuration fields, and everything else should work as expected.
See issue #19 for further information.
**For older versions, see the `older_versions` branch.**
Keycloak User Storage SPI for relational databases (Keycloak user federation; supports PostgreSQL, MySQL, Oracle and SQL Server).
- Keycloak User federation provider with SQL
- Keycloak User federation using existing database
- Keycloak database user provider
- Keycloak MSSQL Database Integration
- Keycloak SQL Server Database Integration
- Keycloak Oracle Database Integration
- Keycloak Postgres Database Integration
- Keycloak blowfish bcrypt support
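The last bullet above refers to checking passwords stored as hashes in the source database. As a rough illustration of the underlying idea only (not this project's actual API; the class, method names and the SHA-256 choice are assumptions for the sketch — the project bundles jbcrypt for the bcrypt/blowfish case), a stored-hash check boils down to recomputing the hash and comparing:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.HexFormat;

public class HashCheck {
    // Hash a plaintext password with SHA-256 and hex-encode it,
    // mirroring what a plain "SHA-256" hashFunction setting might store.
    static String sha256Hex(String plaintext) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            byte[] digest = md.digest(plaintext.getBytes(StandardCharsets.UTF_8));
            return HexFormat.of().formatHex(digest);
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is always available
        }
    }

    // Recompute and compare; MessageDigest.isEqual is constant-time.
    static boolean validate(String plaintext, String storedHash) {
        return MessageDigest.isEqual(
                sha256Hex(plaintext).getBytes(StandardCharsets.UTF_8),
                storedHash.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        String stored = sha256Hex("s3cret");
        System.out.println(validate("s3cret", stored)); // prints "true"
        System.out.println(validate("wrong", stored));  // prints "false"
    }
}
```

Bcrypt differs in that each stored hash embeds its own salt and cost factor, so the comparison is delegated to the library (e.g. jbcrypt's check routine) rather than a plain digest recomputation.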
## Usage
Fully compatible with Singular Studio NOCODE. See https://www.studio.opensingular.com/
## Configuration
Keycloak User Federation Screen Shot
![Sample Screenshot](screen.png)
There is a new configuration option that allows Keycloak to remove a user entry from its local database (this option has no effect on the source database). It can be useful when you need to reload user data.
This option is controlled by the following switch:
![Sample Screenshot](deleteuser.png)
## Limitations
- Does not allow user information updates, including password updates
- Does not support user roles or groups
## Custom attributes
Just add a mapper to the client mappers with the same name as the returned column alias in your queries. Use mapper type "User Attribute". See the example below:
![Sample Screenshot 2](screen2.png)
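For instance, a hypothetical "list all users" query (table and column names below are invented for illustration, not taken from this project) could expose an extra column via an alias; a "User Attribute" mapper named after that alias would then pick it up:

```java
public class ExampleQueries {
    // Hypothetical query: "department" is the column alias that a
    // "User Attribute" mapper of the same name would map. The table
    // (app_user) and source columns are invented for this example.
    static final String LIST_ALL =
            "SELECT u.id, u.login AS username, u.mail AS email, "
          + "u.dept AS department FROM app_user u";

    public static void main(String[] args) {
        // The mapper name must match the alias exactly.
        System.out.println(LIST_ALL.contains("AS department")); // prints "true"
    }
}
```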
## Build
- `mvn clean package`
## Deployment
1) Copy every `.jar` from the `dist/` folder to the `providers` folder under your Keycloak installation root.
   - i.e., on a default Keycloak setup, copy all `.jar` files to `<keycloak_root_dir>/providers`
2) Run:
```sh
$ ./bin/kc.sh start-dev
```
Or, if you are using a production configuration:
```sh
$ ./bin/kc.sh build
$ ./bin/kc.sh start
```
## For further information see:
- https://github.com/keycloak/keycloak/issues/9833
- https://www.keycloak.org/docs/latest/server_development/#packaging-and-deployment
This diff was suppressed by a .gitattributes entry.
#!/usr/bin/env bash
read -p "Enter container name of Keycloak: " containerName
containerId=$(docker ps -aqf "name=$containerName$")
echo "container id= $containerId"
mvn clean package && docker cp ./dist/. "$containerId":/opt/keycloak/providers
#!/usr/bin/env bash
read -p "Enter absolute path of Keycloak folder: " pathKeycloak
mvn clean package && cp ./dist/* "$pathKeycloak"/providers
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>singular-user-storage-provider</artifactId>
<groupId>org.opensingular</groupId>
<version>2.4.6</version>
<dependencies>
<dependency>
<groupId>org.keycloak</groupId>
<artifactId>keycloak-core</artifactId>
<version>${keycloak.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.keycloak</groupId>
<artifactId>keycloak-server-spi</artifactId>
<version>${keycloak.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.keycloak</groupId>
<artifactId>keycloak-model-jpa</artifactId>
<version>${keycloak.version}</version>
<scope>provided</scope>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.jboss.logging</groupId>
<artifactId>jboss-logging</artifactId>
<version>${jboss-logging.version}</version>
<scope>provided</scope>
</dependency>
<!-- demonstrates usage of custom dependencies in an ear -->
<!-- https://repo1.maven.org/maven2/com/google/guava/guava/ -->
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>33.0.0-jre</version>
</dependency>
<dependency>
<groupId>com.google.auto.service</groupId>
<artifactId>auto-service</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>${lombok.version}</version>
<optional>true</optional>
</dependency>
<!-- https://repo1.maven.org/maven2/com/zaxxer/HikariCP/ -->
<dependency>
<groupId>com.zaxxer</groupId>
<artifactId>HikariCP</artifactId>
<version>5.1.0</version>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- https://repo1.maven.org/maven2/org/apache/commons/commons-lang3/ -->
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
<version>3.14.0</version>
</dependency>
<!-- https://repo1.maven.org/maven2/commons-codec/commons-codec/ -->
<dependency>
<groupId>commons-codec</groupId>
<artifactId>commons-codec</artifactId>
<version>1.16.1</version>
</dependency>
<!-- https://repo1.maven.org/maven2/commons-io/commons-io/ -->
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.15.1</version>
</dependency>
<!-- https://repo1.maven.org/maven2/net/sourceforge/jtds/jtds/ -->
<dependency>
<groupId>net.sourceforge.jtds</groupId>
<artifactId>jtds</artifactId>
<version>1.3.1</version>
</dependency>
<!-- https://repo1.maven.org/maven2/com/mysql/mysql-connector-j/ -->
<dependency>
<groupId>com.mysql</groupId>
<artifactId>mysql-connector-j</artifactId>
<version>8.3.0</version>
</dependency>
<!-- https://repo1.maven.org/maven2/org/postgresql/postgresql/ -->
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>42.7.1</version>
</dependency>
<!-- https://repo1.maven.org/maven2/com/oracle/database/jdbc/ojdbc11/ -->
<dependency>
<groupId>com.oracle.database.jdbc</groupId>
<artifactId>ojdbc11</artifactId>
<version>23.3.0.23.09</version>
</dependency>
<!-- https://repo1.maven.org/maven2/org/mindrot/jbcrypt/ -->
<dependency>
<groupId>org.mindrot</groupId>
<artifactId>jbcrypt</artifactId>
<version>0.4</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.13.2</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>6.4.4.Final</version>
<scope>provided</scope>
</dependency>
<!-- https://repo1.maven.org/maven2/de/mkammerer/argon2-jvm/ -->
<dependency>
<groupId>de.mkammerer</groupId>
<artifactId>argon2-jvm</artifactId>
<version>2.11</version>
</dependency>
</dependencies>
<build>
<finalName>singular-user-storage-provider</finalName>
<plugins>
<plugin>
<!-- https://repo1.maven.org/maven2/org/wildfly/plugins/wildfly-maven-plugin/ -->
<groupId>org.wildfly.plugins</groupId>
<artifactId>wildfly-maven-plugin</artifactId>
<version>4.2.0.Final</version>
<configuration>
<skip>false</skip>
</configuration>
</plugin>
<plugin>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<excludeScope>provided</excludeScope>
<outputDirectory>${project.basedir}/dist</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<!-- https://repo1.maven.org/maven2/org/apache/maven/plugins/maven-jar-plugin/ -->
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>3.3.0</version>
<configuration>
<outputDirectory>${project.basedir}/dist</outputDirectory>
</configuration>
</plugin>
</plugins>
</build>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<!-- https://repo1.maven.org/maven2/org/projectlombok/lombok/ -->
<lombok.version>1.18.30</lombok.version>
<!-- https://repo1.maven.org/maven2/org/jboss/logging/jboss-logging/ -->
<jboss-logging.version>3.5.3.Final</jboss-logging.version>
<keycloak.version>24.0.1</keycloak.version>
<!-- https://repo1.maven.org/maven2/com/google/auto/service/auto-service/ -->
<auto-service.version>1.0-rc5</auto-service.version>
<jboss.home>target/keycloak</jboss.home>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.google.auto.service</groupId>
<artifactId>auto-service</artifactId>
<version>${auto-service.version}</version>
<scope>provided</scope>
<optional>true</optional>
</dependency>
</dependencies>
</dependencyManagement>
</project>
package org.opensingular.dbuserprovider;
public class DBUserStorageException extends RuntimeException {
public DBUserStorageException(String message, Throwable cause) {
super(message, cause);
}
public DBUserStorageException(Throwable cause) {
super(cause);
}
public DBUserStorageException(String message, Throwable cause, boolean enableSuppression, boolean writableStackTrace) {
super(message, cause, enableSuppression, writableStackTrace);
}
}
package org.opensingular.dbuserprovider;
import lombok.extern.jbosslog.JBossLog;
import org.keycloak.component.ComponentModel;
import org.keycloak.credential.CredentialInput;
import org.keycloak.credential.CredentialInputUpdater;
import org.keycloak.credential.CredentialInputValidator;
import org.keycloak.models.cache.CachedUserModel;
import org.keycloak.models.*;
import org.keycloak.models.credential.PasswordCredentialModel;
import org.keycloak.storage.StorageId;
import org.keycloak.storage.UserStorageProvider;
import org.keycloak.storage.user.UserLookupProvider;
import org.keycloak.storage.user.UserQueryProvider;
import org.keycloak.storage.user.UserRegistrationProvider;
import org.opensingular.dbuserprovider.model.QueryConfigurations;
import org.opensingular.dbuserprovider.model.UserAdapter;
import org.opensingular.dbuserprovider.persistence.DataSourceProvider;
import org.opensingular.dbuserprovider.persistence.UserRepository;
import org.opensingular.dbuserprovider.util.PagingUtil;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Stream;
@JBossLog
public class DBUserStorageProvider implements UserStorageProvider,
UserLookupProvider, UserQueryProvider, CredentialInputUpdater, CredentialInputValidator, UserRegistrationProvider {
private final KeycloakSession session;
private final ComponentModel model;
private final UserRepository repository;
private final boolean allowDatabaseToOverwriteKeycloak;
DBUserStorageProvider(KeycloakSession session, ComponentModel model, DataSourceProvider dataSourceProvider, QueryConfigurations queryConfigurations) {
this.session = session;
this.model = model;
this.repository = new UserRepository(dataSourceProvider, queryConfigurations);
this.allowDatabaseToOverwriteKeycloak = queryConfigurations.getAllowDatabaseToOverwriteKeycloak();
}
private Stream<UserModel> toUserModel(RealmModel realm, List<Map<String, String>> users) {
return users.stream()
.map(m -> new UserAdapter(session, realm, model, m, allowDatabaseToOverwriteKeycloak));
}
@Override
public boolean supportsCredentialType(String credentialType) {
return PasswordCredentialModel.TYPE.equals(credentialType);
}
@Override
public boolean isConfiguredFor(RealmModel realm, UserModel user, String credentialType) {
return supportsCredentialType(credentialType);
}
@Override
public boolean isValid(RealmModel realm, UserModel user, CredentialInput input) {
log.infov("isValid user credential: userId={0}", user.getId());
if (!supportsCredentialType(input.getType()) || !(input instanceof UserCredentialModel)) {
return false;
}
UserCredentialModel cred = (UserCredentialModel) input;
UserModel dbUser = user;
// If the cache was loaded within the last 500 ms (i.e. probably as part of the current flow), there is no point in reloading the user.
if (allowDatabaseToOverwriteKeycloak && user instanceof CachedUserModel && (System.currentTimeMillis() - ((CachedUserModel) user).getCacheTimestamp()) > 500) {
dbUser = this.getUserById(realm, user.getId());
if (dbUser == null) {
((CachedUserModel) user).invalidate();
return false;
}
// For now, we'll just invalidate the cache if username or email has changed. Eventually we could check all (or a parametered list of) attributes fetched from the DB.
if (!java.util.Objects.equals(user.getUsername(), dbUser.getUsername()) || !java.util.Objects.equals(user.getEmail(), dbUser.getEmail())) {
((CachedUserModel) user).invalidate();
}
}
return repository.validateCredentials(dbUser.getUsername(), cred.getChallengeResponse());
}
@Override
public boolean updateCredential(RealmModel realm, UserModel user, CredentialInput input) {
log.infov("updating credential: realm={0} user={1}", realm.getId(), user.getUsername());
if (!supportsCredentialType(input.getType()) || !(input instanceof UserCredentialModel)) {
return false;
}
UserCredentialModel cred = (UserCredentialModel) input;
return repository.updateCredentials(user.getUsername(), cred.getChallengeResponse());
}
@Override
public void disableCredentialType(RealmModel realm, UserModel user, String credentialType) {
}
@Override
public Stream<String> getDisableableCredentialTypesStream(RealmModel realm, UserModel user)
{
return Stream.empty();
}
@Override
public void preRemove(RealmModel realm) {
log.infov("pre-remove realm");
}
@Override
public void preRemove(RealmModel realm, GroupModel group) {
log.infov("pre-remove group");
}
@Override
public void preRemove(RealmModel realm, RoleModel role) {
log.infov("pre-remove role");
}
@Override
public void close() {
log.debugv("closing");
}
@Override
public UserModel getUserById(RealmModel realm, String id) {
log.infov("lookup user by id: realm={0} userId={1}", realm.getId(), id);
String externalId = StorageId.externalId(id);
Map<String, String> user = repository.findUserById(externalId);
if (user == null) {
log.debugv("findUserById returned null, skipping creation of UserAdapter, expect login error");
return null;
} else {
return new UserAdapter(session, realm, model, user, allowDatabaseToOverwriteKeycloak);
}
}
@Override
public UserModel getUserByUsername(RealmModel realm, String username) {
log.infov("lookup user by username: realm={0} username={1}", realm.getId(), username);
return repository.findUserByUsername(username).map(u -> new UserAdapter(session, realm, model, u, allowDatabaseToOverwriteKeycloak)).orElse(null);
}
@Override
public UserModel getUserByEmail(RealmModel realm, String email) {
log.infov("lookup user by email: realm={0} email={1}", realm.getId(), email);
return repository.findUserByEmail(email).map(u -> new UserAdapter(session, realm, model, u, allowDatabaseToOverwriteKeycloak)).orElse(null);
}
@Override
public int getUsersCount(RealmModel realm) {
return repository.getUsersCount(null);
}
@Override
public int getUsersCount(RealmModel realm, Set<String> groupIds) {
return repository.getUsersCount(null);
}
@Override
public int getUsersCount(RealmModel realm, String search) {
return repository.getUsersCount(search);
}
@Override
public int getUsersCount(RealmModel realm, String search, Set<String> groupIds) {
return repository.getUsersCount(search);
}
@Override
public int getUsersCount(RealmModel realm, Map<String, String> params) {
return repository.getUsersCount(null);
}
@Override
public int getUsersCount(RealmModel realm, Map<String, String> params, Set<String> groupIds) {
return repository.getUsersCount(null);
}
@Override
public int getUsersCount(RealmModel realm, boolean includeServiceAccount) {
return repository.getUsersCount(null);
}
@Override
public Stream<UserModel> searchForUserStream(RealmModel realm, String search, Integer firstResult,
Integer maxResults)
{
log.infov("list users: realm={0} firstResult={1} maxResults={2}", realm.getId(), firstResult, maxResults);
return internalSearchForUser(search, realm, new PagingUtil.Pageable(firstResult, maxResults));
}
@Override
public Stream<UserModel> searchForUserStream(RealmModel realm, Map<String, String> params, Integer firstResult,
Integer maxResults)
{
log.infov("search for users with params: realm={0} params={1}", realm.getId(), params);
return internalSearchForUser(params.values().stream().findFirst().orElse(null), realm, null);
}
@Override
public Stream<UserModel> searchForUserByUserAttributeStream(RealmModel realm, String attrName, String attrValue)
{
log.infov("search for group members: realm={0} attrName={1} attrValue={2}", realm.getId(), attrName, attrValue);
return Stream.empty();
}
@Override
public Stream<UserModel> getGroupMembersStream(RealmModel realm, GroupModel group, Integer firstResult,
Integer maxResults)
{
log.infov("search for group members with params: realm={0} groupId={1} firstResult={2} maxResults={3}", realm.getId(), group.getId(), firstResult, maxResults);
return Stream.empty();
}
private Stream<UserModel> internalSearchForUser(String search, RealmModel realm, PagingUtil.Pageable pageable) {
return toUserModel(realm, repository.findUsers(search, pageable));
}
@Override
public UserModel addUser(RealmModel realm, String username) {
// from documentation: "If your provider has a configuration switch to turn off adding a user, returning null from this method will skip the provider and call the next one."
return null;
}
@Override
public boolean removeUser(RealmModel realm, UserModel user) {
boolean userRemoved = repository.removeUser();
if (userRemoved) {
log.infov("deleted keycloak user: realm={0} userId={1} username={2}", realm.getId(), user.getId(), user.getUsername());
}
return userRemoved;
}
}
package org.opensingular.dbuserprovider;
import com.google.auto.service.AutoService;
import lombok.extern.jbosslog.JBossLog;
import org.keycloak.Config;
import org.keycloak.component.ComponentModel;
import org.keycloak.component.ComponentValidationException;
import org.keycloak.models.KeycloakSession;
import org.keycloak.models.RealmModel;
import org.keycloak.provider.ProviderConfigProperty;
import org.keycloak.provider.ProviderConfigurationBuilder;
import org.keycloak.storage.UserStorageProviderFactory;
import org.opensingular.dbuserprovider.model.QueryConfigurations;
import org.opensingular.dbuserprovider.persistence.DataSourceProvider;
import org.opensingular.dbuserprovider.persistence.RDBMS;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
@JBossLog
@AutoService(UserStorageProviderFactory.class)
public class DBUserStorageProviderFactory implements UserStorageProviderFactory<DBUserStorageProvider> {
private static final String PARAMETER_PLACEHOLDER_HELP = "Use '?' as parameter placeholder character (replaced only once). ";
private static final String DEFAULT_HELP_TEXT = "The query that lists all users must return at least \"id\" and \"username\"; " +
" \"email\" (optional)," +
" \"firstName\" (optional)," +
" \"lastName\" (optional). Any other column can be mapped by its alias to a user attribute";
private static final String PARAMETER_HELP = " The %s is passed as query parameter.";
private Map<String, ProviderConfig> providerConfigPerInstance = new HashMap<>();
@Override
public void init(Config.Scope config) {
}
@Override
public void close() {
for (Map.Entry<String, ProviderConfig> pc : providerConfigPerInstance.entrySet()) {
pc.getValue().dataSourceProvider.close();
}
}
@Override
public DBUserStorageProvider create(KeycloakSession session, ComponentModel model) {
ProviderConfig providerConfig = providerConfigPerInstance.computeIfAbsent(model.getId(), s -> configure(model));
return new DBUserStorageProvider(session, model, providerConfig.dataSourceProvider, providerConfig.queryConfigurations);
}
private synchronized ProviderConfig configure(ComponentModel model) {
log.infov("Creating configuration for model: id={0} name={1}", model.getId(), model.getName());
ProviderConfig providerConfig = new ProviderConfig();
String user = model.get("user");
String password = model.get("password");
String url = model.get("url");
RDBMS rdbms = RDBMS.getByDescription(model.get("rdbms"));
providerConfig.dataSourceProvider.configure(url, rdbms, user, password, model.getName());
providerConfig.queryConfigurations = new QueryConfigurations(
model.get("count"),
model.get("listAll"),
model.get("findById"),
model.get("findByUsername"),
model.get("findByEmail"),
model.get("findBySearchTerm"),
model.get("findPasswordHash"),
model.get("hashFunction"),
rdbms,
model.get("allowKeycloakDelete", false),
model.get("allowDatabaseToOverwriteKeycloak", false)
);
return providerConfig;
}
@Override
public void validateConfiguration(KeycloakSession session, RealmModel realm, ComponentModel model) throws ComponentValidationException {
try {
ProviderConfig old = providerConfigPerInstance.put(model.getId(), configure(model));
if (old != null) {
old.dataSourceProvider.close();
}
} catch (Exception e) {
throw new ComponentValidationException(e.getMessage(), e);
}
}
@Override
public String getId() {
return "singular-db-user-provider";
}
@Override
public List<ProviderConfigProperty> getConfigProperties() {
return ProviderConfigurationBuilder.create()
//DATABASE
.property()
.name("url")
.label("JDBC URL")
.helpText("JDBC Connection String")
.type(ProviderConfigProperty.STRING_TYPE)
.defaultValue("jdbc:jtds:sqlserver://server-name/database_name;instance=instance_name")
.add()
.property()
.name("user")
.label("JDBC Connection User")
.helpText("JDBC Connection User")
.type(ProviderConfigProperty.STRING_TYPE)
.defaultValue("user")
.add()
.property()
.name("password")
.label("JDBC Connection Password")
.helpText("JDBC Connection Password")
.type(ProviderConfigProperty.PASSWORD)
.defaultValue("password")
.add()
.property()
.name("rdbms")
.label("RDBMS")
.helpText("Relational Database Management System")
.type(ProviderConfigProperty.LIST_TYPE)
.options(RDBMS.getAllDescriptions())
.defaultValue(RDBMS.SQL_SERVER.getDesc())
.add()
.property()
.name("allowKeycloakDelete")
.label("Allow Keycloak's User Delete")
.helpText("By default, clicking Delete on a user in Keycloak is not allowed. Activate this option to allow deleting Keycloak's version of the user (the user record in the linked RDBMS is not touched), e.g. to clear syncing issues and allow the user to be synced from scratch from the RDBMS on next use, in production or for testing.")
.type(ProviderConfigProperty.BOOLEAN_TYPE)
.defaultValue("false")
.add()
.property()
.name("allowDatabaseToOverwriteKeycloak")
.label("Allow DB Attributes to Overwrite Keycloak")
// Technical note on the help text below: we aggregate both the existing Keycloak value and the DB value of an attribute in a Set, but since e.g. email is not a multi-valued field on the Keycloak user, the new email is never set on it.
.helpText("By default, once a user is loaded in Keycloak, its attributes (e.g. 'email') stay as they are in Keycloak even if an attribute of the same name now returns a different value through the query. Activate this option to have all attributes set in the SQL query always overwrite the existing user attributes in Keycloak (e.g. if the Keycloak user has email 'test@test.com' but the query fetches a field named 'email' with the value 'example@example.com', the Keycloak user will now have email = 'example@example.com'). This behavior requires the NO_CACHE configuration. If you set this flag under a cached configuration, the user attributes will be reloaded only if: 1) the cached value is older than 500ms and 2) the username or e-mail does not match the cached values.")
.type(ProviderConfigProperty.BOOLEAN_TYPE)
.defaultValue("false")
.add()
//QUERIES
.property()
.name("count")
.label("User count SQL query")
.helpText("SQL query returning the total count of users")
.type(ProviderConfigProperty.STRING_TYPE)
.defaultValue("select count(*) from users")
.add()
.property()
.name("listAll")
.label("List All Users SQL query")
.helpText(DEFAULT_HELP_TEXT)
.type(ProviderConfigProperty.STRING_TYPE)
.defaultValue("select \"id\"," +
" \"username\"," +
" \"email\"," +
" \"firstName\"," +
" \"lastName\"," +
" \"cpf\"," +
" \"fullName\" from users ")
.add()
.property()
.name("findById")
.label("Find user by id SQL query")
.helpText(DEFAULT_HELP_TEXT + String.format(PARAMETER_HELP, "user id") + PARAMETER_PLACEHOLDER_HELP)
.type(ProviderConfigProperty.STRING_TYPE)
.defaultValue("select \"id\"," +
" \"username\"," +
" \"email\"," +
" \"firstName\"," +
" \"lastName\"," +
" \"cpf\"," +
" \"fullName\" from users where \"id\" = ? ")
.add()
.property()
.name("findByUsername")
.label("Find user by username SQL query")
.helpText(DEFAULT_HELP_TEXT + String.format(PARAMETER_HELP, "user username") + PARAMETER_PLACEHOLDER_HELP)
.type(ProviderConfigProperty.STRING_TYPE)
.defaultValue("select \"id\"," +
" \"username\"," +
" \"email\"," +
" \"firstName\"," +
" \"lastName\"," +
" \"cpf\"," +
" \"fullName\" from users where \"username\" = ? ")
.add()
.property()
.name("findByEmail")
.label("Find user by email SQL query")
.helpText(DEFAULT_HELP_TEXT + String.format(PARAMETER_HELP, "user email") + PARAMETER_PLACEHOLDER_HELP)
.type(ProviderConfigProperty.STRING_TYPE)
.defaultValue("select \"id\"," +
" \"username\"," +
" \"email\"," +
" \"firstName\"," +
" \"lastName\"," +
" \"cpf\"," +
" \"fullName\" from users where \"email\" = ? ")
.add()
.property()
.name("findBySearchTerm")
.label("Find user by search term SQL query")
.helpText(DEFAULT_HELP_TEXT + String.format(PARAMETER_HELP, "search term") + PARAMETER_PLACEHOLDER_HELP)
.type(ProviderConfigProperty.STRING_TYPE)
.defaultValue("select \"id\"," +
" \"username\"," +
" \"email\"," +
" \"firstName\"," +
" \"lastName\"," +
" \"cpf\"," +
" \"fullName\" from users where upper(\"username\") like (?) or upper(\"email\") like (?) or upper(\"fullName\") like (?)")
.add()
.property()
.name("findPasswordHash")
.label("Find password hash (blowfish or hash digest hex) SQL query")
.helpText(DEFAULT_HELP_TEXT + String.format(PARAMETER_HELP, "user username") + PARAMETER_PLACEHOLDER_HELP)
.type(ProviderConfigProperty.STRING_TYPE)
.defaultValue("select hash_pwd from users where \"username\" = ? ")
.add()
.property()
.name("hashFunction")
.label("Password hash function")
.helpText("Hash function used to match the password (MD* and SHA* options compare against a hex hash digest)")
.type(ProviderConfigProperty.LIST_TYPE)
.options("Blowfish (bcrypt)", "MD2", "MD5", "SHA-1", "SHA-256", "SHA3-224", "SHA3-256", "SHA3-384", "SHA3-512", "SHA-384", "SHA-512/224", "SHA-512/256", "SHA-512", "PBKDF2-SHA256", "Argon2d", "Argon2i", "Argon2id", "PlainText")
.defaultValue("SHA-1")
.add()
.build();
}
private static class ProviderConfig {
private DataSourceProvider dataSourceProvider = new DataSourceProvider();
private QueryConfigurations queryConfigurations;
}
}
package org.opensingular.dbuserprovider.model;
import org.opensingular.dbuserprovider.persistence.RDBMS;
public class QueryConfigurations {
private final String count;
private final String listAll;
private final String findById;
private final String findByUsername;
private final String findByEmail;
private final String findBySearchTerm;
private final int findBySearchTermParamsCount;
private final String findPasswordHash;
private final String hashFunction;
private final RDBMS RDBMS;
private final boolean allowKeycloakDelete;
private final boolean allowDatabaseToOverwriteKeycloak;
public QueryConfigurations(String count, String listAll, String findById, String findByUsername, String findByEmail, String findBySearchTerm, String findPasswordHash, String hashFunction, RDBMS RDBMS, boolean allowKeycloakDelete, boolean allowDatabaseToOverwriteKeycloak) {
this.count = count;
this.listAll = listAll;
this.findById = findById;
this.findByUsername = findByUsername;
this.findByEmail = findByEmail;
this.findBySearchTerm = findBySearchTerm;
this.findBySearchTermParamsCount = (int)findBySearchTerm.chars().filter(ch -> ch == '?').count();
this.findPasswordHash = findPasswordHash;
this.hashFunction = hashFunction;
this.RDBMS = RDBMS;
this.allowKeycloakDelete = allowKeycloakDelete;
this.allowDatabaseToOverwriteKeycloak = allowDatabaseToOverwriteKeycloak;
}
public RDBMS getRDBMS() {
return RDBMS;
}
public String getCount() {
return count;
}
public String getListAll() {
return listAll;
}
public String getFindById() {
return findById;
}
public String getFindByUsername() {
return findByUsername;
}
public String getFindByEmail() {
return findByEmail;
}
public String getFindBySearchTerm() {
return findBySearchTerm;
}
public int getFindBySearchTermParamsCount() {
return findBySearchTermParamsCount;
}
public String getFindPasswordHash() {
return findPasswordHash;
}
public String getHashFunction() {
return hashFunction;
}
public boolean isArgon2() {
return hashFunction.contains("Argon2");
}
public boolean isBlowfish() {
return hashFunction.toLowerCase().contains("blowfish");
}
public boolean isPlainText() {
return hashFunction.contains("PlainText");
}
public boolean getAllowKeycloakDelete() {
return allowKeycloakDelete;
}
public boolean getAllowDatabaseToOverwriteKeycloak() {
return allowDatabaseToOverwriteKeycloak;
}
}
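The `findBySearchTermParamsCount` field above is simply the number of `?` placeholders in the configured search query; the repository later binds the same search term to each of them. A minimal, self-contained sketch of that counting rule (the class name `PlaceholderCount` is illustrative, not part of the provider):

```java
public class PlaceholderCount {

    // Same rule as the QueryConfigurations constructor:
    // every '?' in the SQL text counts as one positional parameter.
    static int countPlaceholders(String sql) {
        return (int) sql.chars().filter(ch -> ch == '?').count();
    }

    public static void main(String[] args) {
        String findBySearchTerm =
                "select * from users where upper(\"username\") like (?) or upper(\"email\") like (?)";
        System.out.println(countPlaceholders(findBySearchTerm)); // 2
    }
}
```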
package org.opensingular.dbuserprovider.model;
import lombok.extern.jbosslog.JBossLog;
import org.apache.commons.lang3.StringUtils;
import org.keycloak.component.ComponentModel;
import org.keycloak.models.KeycloakSession;
import org.keycloak.models.RealmModel;
import org.keycloak.storage.StorageId;
import org.keycloak.storage.adapter.AbstractUserAdapterFederatedStorage;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Objects;
import java.util.Set;
import java.util.stream.Collectors;
@JBossLog
public class UserAdapter extends AbstractUserAdapterFederatedStorage {
private final String keycloakId;
private String username;
public UserAdapter(KeycloakSession session, RealmModel realm, ComponentModel model, Map<String, String> data, boolean allowDatabaseToOverwriteKeycloak) {
super(session, realm, model);
this.keycloakId = StorageId.keycloakId(model, data.get("id"));
this.username = data.get("username");
try {
Map<String, List<String>> attributes = this.getAttributes();
for (Entry<String, String> e : data.entrySet()) {
Set<String> newValues = new HashSet<>();
if (!allowDatabaseToOverwriteKeycloak) {
List<String> attribute = attributes.get(e.getKey());
if (attribute != null) {
newValues.addAll(attribute);
}
}
newValues.add(StringUtils.trimToNull(e.getValue()));
this.setAttribute(e.getKey(), newValues.stream().filter(Objects::nonNull).collect(Collectors.toList()));
}
} catch(Exception e) {
log.errorv(e, "UserAdapter constructor, username={0}", this.username);
}
}
@Override
public String getId() {
return keycloakId;
}
@Override
public String getUsername() {
return username;
}
@Override
public void setUsername(String username) {
this.username = username;
}
}
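The `UserAdapter` constructor above merges each DB column into the federated attribute set. That merge rule can be sketched with plain collections, assuming the same trim-and-union semantics (the names `AttributeMerge` and `merge` are illustrative, not provider API):

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Objects;
import java.util.Set;
import java.util.stream.Collectors;

public class AttributeMerge {

    // When overwrite is off, the DB value is unioned with the existing Keycloak
    // values; when overwrite is on, only the DB value survives. Blank values
    // are trimmed to null and dropped, mirroring StringUtils.trimToNull.
    static List<String> merge(List<String> existing, String dbValue, boolean allowOverwrite) {
        Set<String> values = new HashSet<>();
        if (!allowOverwrite && existing != null) {
            values.addAll(existing);
        }
        String trimmed = dbValue == null ? null : dbValue.trim();
        if (trimmed != null && !trimmed.isEmpty()) {
            values.add(trimmed);
        }
        return values.stream().filter(Objects::nonNull).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(merge(Arrays.asList("old@x.com"), "new@x.com", false)); // both values kept
        System.out.println(merge(Arrays.asList("old@x.com"), "new@x.com", true));  // only the DB value
    }
}
```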
package org.opensingular.dbuserprovider.persistence;
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import lombok.extern.jbosslog.JBossLog;
import org.apache.commons.lang3.StringUtils;
import javax.sql.DataSource;
import java.io.Closeable;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Optional;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
@JBossLog
public class DataSourceProvider implements Closeable {
private static final SimpleDateFormat SIMPLE_DATE_FORMAT = new SimpleDateFormat("dd-MM-yyyy HH:mm:ss"); // 'yyyy' (calendar year); 'YYYY' would be the week year
private ExecutorService executor = Executors.newFixedThreadPool(1);
private HikariDataSource hikariDataSource;
public DataSourceProvider() {
}
synchronized Optional<DataSource> getDataSource() {
return Optional.ofNullable(hikariDataSource);
}
public void configure(String url, RDBMS rdbms, String user, String pass, String name) {
HikariConfig hikariConfig = new HikariConfig();
hikariConfig.setUsername(user);
hikariConfig.setPassword(pass);
hikariConfig.setPoolName(StringUtils.capitalize("SINGULAR-USER-PROVIDER-" + name + SIMPLE_DATE_FORMAT.format(new Date())));
hikariConfig.setJdbcUrl(url);
hikariConfig.setConnectionTestQuery(rdbms.getTestString());
hikariConfig.setDriverClassName(rdbms.getDriver());
HikariDataSource newDS = new HikariDataSource(hikariConfig);
newDS.validate();
HikariDataSource old = this.hikariDataSource;
this.hikariDataSource = newDS;
disposeOldDataSource(old);
}
private void disposeOldDataSource(HikariDataSource old) {
executor.submit(() -> {
try {
if (old != null) {
old.close();
}
} catch (Exception e) {
log.error(e.getMessage(), e);
}
});
}
@Override
public void close() {
executor.shutdownNow();
if (hikariDataSource != null) {
hikariDataSource.close();
}
}
}
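One subtlety in the pool-name timestamp above: in `SimpleDateFormat`, `YYYY` means the week year, not the calendar year, and the two diverge around New Year, so `yyyy` is the safe choice. A self-contained demonstration (the class name `WeekYearPitfall` is illustrative):

```java
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;
import java.util.GregorianCalendar;

public class WeekYearPitfall {

    // Formats a fixed date (31 Dec 2019) with the given pattern.
    static String format(String pattern) {
        Date d = new GregorianCalendar(2019, Calendar.DECEMBER, 31).getTime();
        return new SimpleDateFormat(pattern).format(d);
    }

    public static void main(String[] args) {
        // 31 Dec 2019 belongs to week 1 of 2020 under the default week rules,
        // so the week-year pattern silently reports the wrong year.
        System.out.println(format("dd-MM-yyyy")); // 31-12-2019 (calendar year)
        System.out.println(format("dd-MM-YYYY")); // 31-12-2020 (week year)
    }
}
```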
package org.opensingular.dbuserprovider.persistence;
import org.hibernate.dialect.Dialect;
import org.hibernate.dialect.MySQLDialect;
import org.hibernate.dialect.OracleDialect;
import org.hibernate.dialect.PostgreSQLDialect;
import org.hibernate.dialect.SQLServerDialect;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
public enum RDBMS {
POSTGRESQL("PostgreSQL 12+", org.postgresql.Driver.class.getName(), "SELECT 1", new PostgreSQLDialect()),
MYSQL("MySQL 8+", com.mysql.cj.jdbc.Driver.class.getName(), "SELECT 1", new MySQLDialect()),
ORACLE("Oracle 19+", oracle.jdbc.OracleDriver.class.getName(), "SELECT 1 FROM DUAL", new OracleDialect()),
SQL_SERVER("MS SQL Server 2012+ (jtds)", net.sourceforge.jtds.jdbc.Driver.class.getName(), "SELECT 1", new SQLServerDialect());
private final String desc;
private final String driver;
private final String testString;
private final Dialect dialect;
RDBMS(String desc, String driver, String testString, Dialect dialect) {
this.desc = desc;
this.driver = driver;
this.testString = testString;
this.dialect = dialect;
}
public static RDBMS getByDescription(String desc) {
for (RDBMS value : values()) {
if (value.desc.equals(desc)) {
return value;
}
}
return null;
}
public Dialect getDialect() {
return dialect;
}
public static List<String> getAllDescriptions() {
return Arrays.stream(values()).map(RDBMS::getDesc).collect(Collectors.toList());
}
public String getDesc() {
return desc;
}
public String getDriver() {
return driver;
}
public String getTestString() {
return testString;
}
}
package org.opensingular.dbuserprovider.persistence;
import java.security.MessageDigest;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.Optional;
import java.util.Set;
import java.util.function.Function;
import javax.sql.DataSource;
import org.apache.commons.codec.binary.Hex;
import org.apache.commons.codec.binary.StringUtils;
import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.lang3.NotImplementedException;
import org.mindrot.jbcrypt.BCrypt;
import org.opensingular.dbuserprovider.DBUserStorageException;
import org.opensingular.dbuserprovider.model.QueryConfigurations;
import org.opensingular.dbuserprovider.util.PBKDF2SHA256HashingUtil;
import org.opensingular.dbuserprovider.util.PagingUtil;
import org.opensingular.dbuserprovider.util.PagingUtil.Pageable;
import com.google.common.collect.ImmutableMap;
import de.mkammerer.argon2.Argon2;
import de.mkammerer.argon2.Argon2Factory;
import de.mkammerer.argon2.Argon2Factory.Argon2Types;
import lombok.extern.jbosslog.JBossLog;
@JBossLog
public class UserRepository {
private static final Map<String, Argon2Types> ARGON2TYPES = ImmutableMap.of(
"Argon2d", Argon2Types.ARGON2d,
"Argon2i", Argon2Types.ARGON2i,
"Argon2id", Argon2Types.ARGON2id
);
private static final Map<Argon2Types, Argon2> ARGON2 = ImmutableMap.of(
Argon2Types.ARGON2d, Argon2Factory.create(Argon2Types.ARGON2d),
Argon2Types.ARGON2i, Argon2Factory.create(Argon2Types.ARGON2i),
Argon2Types.ARGON2id, Argon2Factory.create(Argon2Types.ARGON2id)
);
private DataSourceProvider dataSourceProvider;
private QueryConfigurations queryConfigurations;
public UserRepository(DataSourceProvider dataSourceProvider, QueryConfigurations queryConfigurations) {
this.dataSourceProvider = dataSourceProvider;
this.queryConfigurations = queryConfigurations;
}
private <T> T doQuery(String query, Pageable pageable, Function<ResultSet, T> resultTransformer, Object... params) {
Optional<DataSource> dataSourceOpt = dataSourceProvider.getDataSource();
if (dataSourceOpt.isPresent()) {
DataSource dataSource = dataSourceOpt.get();
try (Connection c = dataSource.getConnection()) {
if (pageable != null) {
query = PagingUtil.formatScriptWithPageable(query, pageable, queryConfigurations.getRDBMS());
}
log.infov("Query: {0} params: {1} ", query, Arrays.toString(params));
try (PreparedStatement statement = c.prepareStatement(query)) {
if (params != null) {
for (int i = 1; i <= params.length; i++) {
statement.setObject(i, params[i - 1]);
}
}
try (ResultSet rs = statement.executeQuery()) {
return resultTransformer.apply(rs);
}
}
} catch (SQLException e) {
log.error(e.getMessage(), e);
}
return null;
}
return null;
}
private List<Map<String, String>> readMap(ResultSet rs) {
try {
List<Map<String, String>> data = new ArrayList<>();
Set<String> columnsFound = new HashSet<>();
for (int i = 1; i <= rs.getMetaData().getColumnCount(); i++) {
String columnLabel = rs.getMetaData().getColumnLabel(i);
columnsFound.add(columnLabel);
}
while (rs.next()) {
Map<String, String> result = new HashMap<>();
for (String col : columnsFound) {
result.put(col, rs.getString(col));
}
data.add(result);
}
return data;
} catch (Exception e) {
throw new DBUserStorageException(e.getMessage(), e);
}
}
private Integer readInt(ResultSet rs) {
try {
return rs.next() ? rs.getInt(1) : null;
} catch (Exception e) {
throw new DBUserStorageException(e.getMessage(), e);
}
}
private Boolean readBoolean(ResultSet rs) {
try {
return rs.next() ? rs.getBoolean(1) : null;
} catch (Exception e) {
throw new DBUserStorageException(e.getMessage(), e);
}
}
private String readString(ResultSet rs) {
try {
return rs.next() ? rs.getString(1) : null;
} catch (Exception e) {
throw new DBUserStorageException(e.getMessage(), e);
}
}
public List<Map<String, String>> getAllUsers() {
return doQuery(queryConfigurations.getListAll(), null, this::readMap);
}
public int getUsersCount(String search) {
if (search == null || search.isEmpty()) {
return Optional.ofNullable(doQuery(queryConfigurations.getCount(), null, this::readInt)).orElse(0);
} else {
String query = String.format("select count(*) from (%s) count", queryConfigurations.getFindBySearchTerm());
return Optional.ofNullable(doQuery(query, null, this::readInt, searchTermParams(search))).orElse(0);
}
}
private Object[] searchTermParams(String search) {
if (queryConfigurations.getFindBySearchTermParamsCount() == 1)
return new String[] {search};
String[] terms = new String[queryConfigurations.getFindBySearchTermParamsCount()];
Arrays.fill(terms, search);
return terms;
}
public Map<String, String> findUserById(String id) {
return Optional.ofNullable(doQuery(queryConfigurations.getFindById(), null, this::readMap, id))
.orElse(Collections.emptyList())
.stream().findFirst().orElse(null);
}
public Optional<Map<String, String>> findUserByUsername(String username) {
return Optional.ofNullable(doQuery(queryConfigurations.getFindByUsername(), null, this::readMap, username))
.orElse(Collections.emptyList())
.stream().findFirst();
}
public Optional<Map<String, String>> findUserByEmail(String email) {
return Optional.ofNullable(doQuery(queryConfigurations.getFindByEmail(), null, this::readMap, email))
.orElse(Collections.emptyList())
.stream().findFirst();
}
public List<Map<String, String>> findUsers(String search, PagingUtil.Pageable pageable) {
if (search == null || search.isEmpty()) {
return doQuery(queryConfigurations.getListAll(), pageable, this::readMap);
}
return doQuery(queryConfigurations.getFindBySearchTerm(), pageable, this::readMap, searchTermParams(search));
}
public boolean validateCredentials(String username, String password) {
String hash = Optional.ofNullable(doQuery(queryConfigurations.getFindPasswordHash(), null, this::readString, username)).orElse("");
if (queryConfigurations.isBlowfish()) {
return !hash.isEmpty() && BCrypt.checkpw(password, hash);
} else if (queryConfigurations.isArgon2()) {
return !hash.isEmpty() && ARGON2.get(ARGON2TYPES.get(queryConfigurations.getHashFunction())).verify(hash, password.toCharArray());
} else if (queryConfigurations.isPlainText()) {
return !hash.isEmpty() && hash.equals(password);
} else {
String hashFunction = queryConfigurations.getHashFunction();
if(hashFunction.equals("PBKDF2-SHA256")){
String[] components = hash.split("\\$");
return new PBKDF2SHA256HashingUtil(password, components[2], Integer.valueOf(components[1])).validatePassword(components[3]);
}
MessageDigest digest = DigestUtils.getDigest(hashFunction);
byte[] pwdBytes = StringUtils.getBytesUtf8(password);
return Objects.equals(Hex.encodeHexString(digest.digest(pwdBytes)), hash);
}
}
public boolean updateCredentials(String username, String password) {
throw new NotImplementedException("Password update not supported");
}
public boolean removeUser() {
return queryConfigurations.getAllowKeycloakDelete();
}
}
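For the plain digest branch of `validateCredentials` (the final `else`), the stored value is expected to be the lowercase hex digest of the password. A dependency-free sketch of that comparison using only `java.security.MessageDigest` (commons-codec's `DigestUtils`/`Hex` do the equivalent; the class name `HexDigestCheck` is illustrative):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class HexDigestCheck {

    // Digest the UTF-8 bytes of the password and render them as lowercase hex,
    // matching Hex.encodeHexString(digest.digest(pwdBytes)) in the repository.
    static String sha256Hex(String password) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        byte[] digest = md.digest(password.getBytes(StandardCharsets.UTF_8));
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        String stored = sha256Hex("s3cret"); // what the findPasswordHash query would return
        System.out.println(stored.equals(sha256Hex("s3cret"))); // true
        System.out.println(stored.equals(sha256Hex("wrong")));  // false
    }
}
```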
package org.opensingular.dbuserprovider.util;
import java.util.Base64;
import java.util.Objects;
import javax.crypto.SecretKey;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
public class PBKDF2SHA256HashingUtil {
private char[] password;
private byte[] salt;
private int iterations;
private static final int keyLength = 256;
/**
 * @param password   the plain-text password to hash
 * @param salt       the salt, taken verbatim from the stored hash string
 * @param iterations the PBKDF2 iteration count
 */
public PBKDF2SHA256HashingUtil(String password, String salt, int iterations){
this.password = password.toCharArray();
this.salt = salt.getBytes();
this.iterations = iterations;
}
public boolean validatePassword(String passwordHash){
return Objects.equals(passwordHash, hashPassword());
}
private String hashPassword(){
try {
SecretKeyFactory skf = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256");
PBEKeySpec spec = new PBEKeySpec(this.password, this.salt, this.iterations, keyLength);
SecretKey key = skf.generateSecret(spec);
return Base64.getEncoder().encodeToString(key.getEncoded());
} catch (Exception e) {
return "";
}
}
}
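`validateCredentials` feeds this class from a stored hash split on `$`: `components[1]` is the iteration count, `components[2]` the salt, and `components[3]` the Base64 digest (a Django-style `algorithm$iterations$salt$hash` layout). A round-trip sketch using the same `PBKDF2WithHmacSHA256` key factory (the class name `Pbkdf2RoundTrip` and all sample values are illustrative):

```java
import java.util.Base64;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;

public class Pbkdf2RoundTrip {

    // Mirrors PBKDF2SHA256HashingUtil: PBKDF2-HMAC-SHA256, 256-bit key, Base64 output.
    static String derive(String password, String salt, int iterations) throws Exception {
        PBEKeySpec spec = new PBEKeySpec(password.toCharArray(), salt.getBytes(), iterations, 256);
        byte[] key = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256")
                .generateSecret(spec).getEncoded();
        return Base64.getEncoder().encodeToString(key);
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical stored value in the "algorithm$iterations$salt$hash" layout:
        String stored = "pbkdf2_sha256$1000$somesalt$" + derive("s3cret", "somesalt", 1000);
        String[] c = stored.split("\\$");
        boolean valid = derive("s3cret", c[2], Integer.parseInt(c[1])).equals(c[3]);
        System.out.println(valid); // true
    }
}
```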
package org.opensingular.dbuserprovider.util;
import org.hibernate.dialect.Dialect;
import org.hibernate.dialect.pagination.LimitHandler;
import org.hibernate.query.spi.Limit;
import org.opensingular.dbuserprovider.DBUserStorageException;
import org.opensingular.dbuserprovider.persistence.RDBMS;
import java.sql.SQLException;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
public class PagingUtil {
@SuppressWarnings("RegExpRedundantEscape")
private static final Pattern SINGLE_QUESTION_MARK_REGEX = Pattern.compile("(^|[^\\?])(\\?)([^\\?]|$)");
public static class Pageable {
private final int firstResult;
private final int maxResults;
public Pageable(int firstResult, int maxResults) {
this.firstResult = firstResult;
this.maxResults = maxResults;
}
}
public static String formatScriptWithPageable(String query, Pageable pageable, RDBMS RDBMS) {
final Dialect dialect = RDBMS.getDialect();
Limit rowSelection = new Limit();
rowSelection.setFirstRow(pageable.firstResult);
rowSelection.setMaxRows(pageable.maxResults);
String escapedSQL = escapeQuestionMarks(query);
StringBuilder processedSQL;
try {
LimitHandler limitHandler = dialect.getLimitHandler();
processedSQL = new StringBuilder(limitHandler.processSql(escapedSQL, rowSelection));
int col = 1;
PreparedStatementParameterCollector collector = new PreparedStatementParameterCollector();
col += limitHandler.bindLimitParametersAtStartOfQuery(rowSelection, collector, col);
limitHandler.bindLimitParametersAtEndOfQuery(rowSelection, collector, col);
Map<Integer, Object> parameters = collector.getParameters();
for (int i = 1; i <= parameters.keySet().size(); i++) {
Matcher matcher = SINGLE_QUESTION_MARK_REGEX.matcher(processedSQL);
if (matcher.find()) {
String str = String.valueOf(parameters.get(i));
processedSQL.replace(matcher.start(2), matcher.end(2), str);
}
}
return unescapeQuestionMarks(processedSQL.toString());
} catch (SQLException e) {
throw new DBUserStorageException(e.getMessage(), e);
}
}
private static String unescapeQuestionMarks(String sql) {
return sql.replaceAll("\\?\\?", "?");
}
private static String escapeQuestionMarks(String sql) {
return sql.replaceAll("\\?", "??");
}
}
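The escape/unescape pair above exists so the dialect's `LimitHandler` can append its own `?` placeholders without colliding with the user's: user placeholders are doubled to `??` first, the limit placeholders (which stay single) are substituted inline, and the doubling is then undone. The trick in isolation (the class name `QuestionMarkEscape` is illustrative):

```java
public class QuestionMarkEscape {

    // Double every '?' so user placeholders can be told apart from
    // limit placeholders added later by the dialect.
    static String escape(String sql) {
        return sql.replaceAll("\\?", "??");
    }

    // Restore the user placeholders once limit parameters are inlined.
    static String unescape(String sql) {
        return sql.replaceAll("\\?\\?", "?");
    }

    public static void main(String[] args) {
        String sql = "select * from users where \"id\" = ?";
        String escaped = escape(sql);
        System.out.println(escaped);                       // ... where "id" = ??
        System.out.println(unescape(escaped).equals(sql)); // true
    }
}
```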
package org.opensingular.dbuserprovider.util;
import java.io.InputStream;
import java.io.Reader;
import java.math.BigDecimal;
import java.net.URL;
import java.sql.Array;
import java.sql.Blob;
import java.sql.Clob;
import java.sql.Connection;
import java.sql.Date;
import java.sql.NClob;
import java.sql.ParameterMetaData;
import java.sql.PreparedStatement;
import java.sql.Ref;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.RowId;
import java.sql.SQLWarning;
import java.sql.SQLXML;
import java.sql.Time;
import java.sql.Timestamp;
import java.util.Calendar;
import java.util.HashMap;
import java.util.Map;
public class PreparedStatementParameterCollector implements PreparedStatement {
private Map<Integer, Object> parameters = new HashMap<>();
Map<Integer, Object> getParameters() {
return parameters;
}
@Override
public ResultSet executeQuery() {
return null;
}
@Override
public int executeUpdate() {
return 0;
}
@Override
public void setNull(int parameterIndex, int sqlType) {
parameters.put(parameterIndex, null);
}
@Override
public void setBoolean(int parameterIndex, boolean x) {
parameters.put(parameterIndex, x);
}
@Override
public void setByte(int parameterIndex, byte x) {
parameters.put(parameterIndex, x);
}
@Override
public void setShort(int parameterIndex, short x) {
parameters.put(parameterIndex, x);
}
@Override
public void setInt(int parameterIndex, int x) {
parameters.put(parameterIndex, x);
}
@Override
public void setLong(int parameterIndex, long x) {
parameters.put(parameterIndex, x);
}
@Override
public void setFloat(int parameterIndex, float x) {
parameters.put(parameterIndex, x);
}
@Override
public void setDouble(int parameterIndex, double x) {
parameters.put(parameterIndex, x);
}
@Override
public void setBigDecimal(int parameterIndex, BigDecimal x) {
parameters.put(parameterIndex, x);
}
@Override
public void setString(int parameterIndex, String x) {
parameters.put(parameterIndex, x);
}
@Override
public void setBytes(int parameterIndex, byte[] x) {
parameters.put(parameterIndex, x);
}
@Override
public void setDate(int parameterIndex, Date x) {
parameters.put(parameterIndex, x);
}
@Override
public void setTime(int parameterIndex, Time x) {
parameters.put(parameterIndex, x);
}
@Override
public void setTimestamp(int parameterIndex, Timestamp x) {
parameters.put(parameterIndex, x);
}
@Override
public void setAsciiStream(int parameterIndex, InputStream x, int length) {
parameters.put(parameterIndex, x);
}
@Override
public void setUnicodeStream(int parameterIndex, InputStream x, int length) {
parameters.put(parameterIndex, x);
}
@Override
public void setBinaryStream(int parameterIndex, InputStream x, int length) {
parameters.put(parameterIndex, x);
}
@Override
public void clearParameters() {
parameters.clear();
}
@Override
public void setObject(int parameterIndex, Object x, int targetSqlType) {
parameters.put(parameterIndex, x);
}
@Override
public void setObject(int parameterIndex, Object x) {
parameters.put(parameterIndex, x);
}
@Override
public boolean execute() {
return false;
}
@Override
public void addBatch() {
}
@Override
public void setCharacterStream(int parameterIndex, Reader reader, int length) {
parameters.put(parameterIndex, reader);
}
@Override
public void setRef(int parameterIndex, Ref x) {
parameters.put(parameterIndex, x);
}
@Override
public void setBlob(int parameterIndex, Blob x) {
parameters.put(parameterIndex, x);
}
@Override
public void setClob(int parameterIndex, Clob x) {
parameters.put(parameterIndex, x);
}
@Override
public void setArray(int parameterIndex, Array x) {
parameters.put(parameterIndex, x);
}
@Override
public ResultSetMetaData getMetaData() {
return null;
}
@Override
public void setDate(int parameterIndex, Date x, Calendar cal) {
parameters.put(parameterIndex, x);
}
@Override
public void setTime(int parameterIndex, Time x, Calendar cal) {
parameters.put(parameterIndex, x);
}
@Override
public void setTimestamp(int parameterIndex, Timestamp x, Calendar cal) {
parameters.put(parameterIndex, x);
}
@Override
public void setNull(int parameterIndex, int sqlType, String typeName) {
parameters.put(parameterIndex, null);
}
@Override
public void setURL(int parameterIndex, URL x) {
parameters.put(parameterIndex, x);
}
@Override
public ParameterMetaData getParameterMetaData() {
return null;
}
@Override
public void setRowId(int parameterIndex, RowId x) {
parameters.put(parameterIndex, x);
}
@Override
public void setNString(int parameterIndex, String value) {
parameters.put(parameterIndex, value);
}
@Override
public void setNCharacterStream(int parameterIndex, Reader value, long length) {
parameters.put(parameterIndex, value);
}
@Override
public void setNClob(int parameterIndex, NClob value) {
parameters.put(parameterIndex, value);
}
@Override
public void setClob(int parameterIndex, Reader reader, long length) {
parameters.put(parameterIndex, reader);
}
@Override
public void setBlob(int parameterIndex, InputStream inputStream, long length) {
parameters.put(parameterIndex, inputStream);
}
@Override
public void setNClob(int parameterIndex, Reader reader, long length) {
parameters.put(parameterIndex, reader);
}
@Override
public void setSQLXML(int parameterIndex, SQLXML xmlObject) {
parameters.put(parameterIndex, xmlObject);
}
@Override
public void setObject(int parameterIndex, Object x, int targetSqlType, int scaleOrLength) {
parameters.put(parameterIndex, x);
}
@Override
public void setAsciiStream(int parameterIndex, InputStream x, long length) {
parameters.put(parameterIndex, x);
}
@Override
public void setBinaryStream(int parameterIndex, InputStream x, long length) {
parameters.put(parameterIndex, x);
}
@Override
public void setCharacterStream(int parameterIndex, Reader reader, long length) {
parameters.put(parameterIndex, reader);
}
@Override
public void setAsciiStream(int parameterIndex, InputStream x) {
parameters.put(parameterIndex, x);
}
@Override
public void setBinaryStream(int parameterIndex, InputStream x) {
parameters.put(parameterIndex, x);
}
@Override
public void setCharacterStream(int parameterIndex, Reader reader) {
parameters.put(parameterIndex, reader);
}
@Override
public void setNCharacterStream(int parameterIndex, Reader value) {
parameters.put(parameterIndex, value);
}
@Override
public void setClob(int parameterIndex, Reader reader) {
parameters.put(parameterIndex, reader);
}
@Override
public void setBlob(int parameterIndex, InputStream inputStream) {
parameters.put(parameterIndex, inputStream);
}
@Override
public void setNClob(int parameterIndex, Reader reader) {
parameters.put(parameterIndex, reader);
}
// The remaining Statement methods are intentionally no-op stubs: this
// implementation only records bind parameters and never executes SQL itself.
@Override
public ResultSet executeQuery(String sql) {
return null;
}
@Override
public int executeUpdate(String sql) {
return 0;
}
@Override
public void close() {
}
@Override
public int getMaxFieldSize() {
return 0;
}
@Override
public void setMaxFieldSize(int max) {
}
@Override
public int getMaxRows() {
return 0;
}
@Override
public void setMaxRows(int max) {
}
@Override
public void setEscapeProcessing(boolean enable) {
}
@Override
public int getQueryTimeout() {
return 0;
}
@Override
public void setQueryTimeout(int seconds) {
}
@Override
public void cancel() {
}
@Override
public SQLWarning getWarnings() {
return null;
}
@Override
public void clearWarnings() {
}
@Override
public void setCursorName(String name) {
}
@Override
public boolean execute(String sql) {
return false;
}
@Override
public ResultSet getResultSet() {
return null;
}
@Override
public int getUpdateCount() {
return 0;
}
@Override
public boolean getMoreResults() {
return false;
}
@Override
public void setFetchDirection(int direction) {
}
@Override
public int getFetchDirection() {
//noinspection MagicConstant
return 0;
}
@Override
public void setFetchSize(int rows) {
}
@Override
public int getFetchSize() {
return 0;
}
@Override
public int getResultSetConcurrency() {
//noinspection MagicConstant
return 0;
}
@Override
public int getResultSetType() {
//noinspection MagicConstant
return 0;
}
@Override
public void addBatch(String sql) {
}
@Override
public void clearBatch() {
}
@Override
public int[] executeBatch() {
return new int[0];
}
@Override
public Connection getConnection() {
return null;
}
@Override
public boolean getMoreResults(int current) {
return false;
}
@Override
public ResultSet getGeneratedKeys() {
return null;
}
@Override
public int executeUpdate(String sql, int autoGeneratedKeys) {
return 0;
}
@Override
public int executeUpdate(String sql, int[] columnIndexes) {
return 0;
}
@Override
public int executeUpdate(String sql, String[] columnNames) {
return 0;
}
@Override
public boolean execute(String sql, int autoGeneratedKeys) {
return false;
}
@Override
public boolean execute(String sql, int[] columnIndexes) {
return false;
}
@Override
public boolean execute(String sql, String[] columnNames) {
return false;
}
@Override
public int getResultSetHoldability() {
return 0;
}
@Override
public boolean isClosed() {
return false;
}
@Override
public void setPoolable(boolean poolable) {
}
@Override
public boolean isPoolable() {
return false;
}
@Override
public void closeOnCompletion() {
}
@Override
public boolean isCloseOnCompletion() {
return false;
}
@Override
public <T> T unwrap(Class<T> iface) {
return null;
}
@Override
public boolean isWrapperFor(Class<?> iface) {
return false;
}
}
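The class above works as a parameter-capturing statement: every `setXxx` override stores the value in a `parameters` map keyed by index, and the execution-related methods are no-ops. A minimal, self-contained sketch of that pattern is shown below; the class and field names here (`CapturingStatement`, `parameters`) are illustrative stand-ins, not the repo's actual API beyond what is visible above.

```java
import java.util.Map;
import java.util.TreeMap;

// Sketch of the parameter-capturing pattern used by the stub above:
// setter calls record values by index; nothing is ever sent to a database.
public class CapturingStatementDemo {

    static class CapturingStatement {
        // TreeMap keeps parameters ordered by their 1-based index.
        final Map<Integer, Object> parameters = new TreeMap<>();

        void setString(int parameterIndex, String value) {
            parameters.put(parameterIndex, value);
        }

        void setInt(int parameterIndex, int value) {
            parameters.put(parameterIndex, value);
        }
    }

    public static void main(String[] args) {
        CapturingStatement stmt = new CapturingStatement();
        stmt.setString(1, "alice");
        stmt.setInt(2, 42);
        // The recorded values can later be interpolated into a query template.
        System.out.println(stmt.parameters); // {1=alice, 2=42}
    }
}
```

This lets caller code that expects a JDBC-style setter API run unchanged while the provider inspects the bound values itself.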
org.opensingular.dbuserprovider.DBUserStorageProviderFactory