Toward the end of 2021, a new buzzword emerged: "log4j". Even people who had no idea what Log4j was started talking about it. Jokes aside, Log4j tore the world apart because of a security vulnerability (Log4Shell, CVE-2021-44228) that was actively exploited by hackers. This vulnerability allowed attackers to execute malicious code remotely on a target computer, which means hackers could easily steal data, plant malware, or take control of the target computer via the Internet.
- Update to the latest released version of Log4j, in which the Apache team has fixed the "known" vulnerabilities.
- Switch to a different logger, e.g. Logback.
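For the first option (staying on Log4j), one thing to watch out for is transitive dependencies still dragging in a vulnerable version. A minimal Gradle sketch that pins every Log4j artifact build-wide (2.17.1 reflects the patched release available at the time of writing; use the latest available):

```groovy
// Force every org.apache.logging.log4j artifact to a patched version,
// including ones pulled in transitively by other libraries.
configurations.all {
    resolutionStrategy.eachDependency { details ->
        if (details.requested.group == 'org.apache.logging.log4j') {
            details.useVersion '2.17.1'
        }
    }
}
```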
Logback is a logging framework for Java-based applications and a successor to the popular Log4j project, with many improvements over it. For context, Logback feels very much like Log4j because both projects were founded by the same developer, and the two are very similar when it comes to day-to-day usage.
Maven:
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>${slf4j-version}</version>
</dependency>
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-core</artifactId>
    <version>${logback-version}</version>
</dependency>
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>${logback-version}</version>
</dependency>
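The `${slf4j-version}` and `${logback-version}` placeholders above assume matching Maven properties in the `pom.xml`; a sketch with illustrative version numbers (check for the latest releases):

```xml
<properties>
    <!-- Illustrative versions only; always check for the latest releases -->
    <slf4j-version>1.7.36</slf4j-version>
    <logback-version>1.2.11</logback-version>
</properties>
```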
Gradle:
implementation("org.slf4j:slf4j-api:${slf4jVersion}")
implementation("ch.qos.logback:logback-core:${logbackVersion}")
implementation("ch.qos.logback:logback-classic:${logbackVersion}")
If the JAR files are needed locally, download them from the Logback download page.
If the application is based on Spring Boot, no additional dependencies are required, as Spring Boot ships with Logback support. Place a logback.xml file (logback-spring.xml in case of Spring Boot) in src/main/resources.
Sample logback.xml
For more information about Logback configuration, see the official Logback documentation.

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <layout class="ch.qos.logback.classic.PatternLayout">
            <Pattern>
                %date{"yyyy-MM-dd'T'HH:mm:ss,SSSXXX", UTC} - %yellow([tid:%t])[sid:%X{httpSessionId}][reqid:%X{reqId}] - %green(%level) %cyan([%c]) - %m%n
            </Pattern>
        </layout>
    </appender>
    <appender name="appServerRollingFile" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>applogs/shpi-api.log</file>
        <!-- Rolls daily, and additionally within the day once a file exceeds maxFileSize -->
        <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
            <fileNamePattern>applogs/%d{yyyy-MMM,aux}/shpi-api-%d{yyyy-MMM-dd}-%i.log.gz</fileNamePattern>
            <maxFileSize>200MB</maxFileSize>
        </rollingPolicy>
        <encoder>
            <pattern>%date{"yyyy-MM-dd'T'HH:mm:ss,SSSXXX", UTC} - [sid:%X{httpSessionId}][actor:%X{userId}][reqid:%X{reqId}] - %p [%c] - %m%n</pattern>
        </encoder>
    </appender>
    <root level="info">
        <appender-ref ref="STDOUT"/>
        <appender-ref ref="appServerRollingFile"/>
    </root>
</configuration>
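For reference, the %date conversion used in the patterns above, combined with the UTC option, renders ISO-8601-style timestamps. A quick stdlib sketch (the class name is hypothetical) showing what that format produces:

```java
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

// Hypothetical helper: renders timestamps the same way as the
// %date{"yyyy-MM-dd'T'HH:mm:ss,SSSXXX", UTC} conversion above.
public class TimestampDemo {

    static final DateTimeFormatter LOG_TS =
            DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss,SSSXXX");

    public static void main(String[] args) {
        // A zero offset renders as "Z" with the XXX token,
        // e.g. 2022-01-05T10:15:30,123Z
        System.out.println(LOG_TS.format(ZonedDateTime.now(ZoneOffset.UTC)));
    }
}
```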
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class ClassName {

    private static final Logger LOG = LoggerFactory.getLogger(ClassName.class);

    public void doWork() {
        LOG.warn("Warn Test");
    }
}
- If migrating from Log4j to Logback, use the properties translator tool provided by the Logback developers.
The idea behind using Logback is that the recent Log4j issues gave everyone a reality check: there is now a definite need for a Log4j alternative. Maybe now is the time to migrate!
Software engineers from Generation X and the Millennial generation must have worked with Jenkins (formerly Hudson) at least once in their lives. Java-stack CI/CD evolved around Jenkins, and most of us have used it at least once, with Artifactory, and later Nexus, complementing it perfectly.
Jenkins is mostly used for CI/CD, and in some cases as a batch processor. To satisfy this requirement, self-hosted Jenkins instances running on AWS, or self-hosted pods on Kubernetes (K8s), mainly in a master -> agent configuration, are a reasonable choice.
A Nexus instance can likewise be self-hosted on AWS or on K8s.
This setup, however, comes with problems of its own. Maintaining a Jenkinsfile without much documentation is hard. And last but not least, Jenkins will fail exactly when it is needed the most. Since most teams rely on Jenkins for CI/CD, these problems can grow exponentially, with maintenance eating up precious DevOps time.
To make everyone's life easier, moving to GitHub Actions and GitHub Packages could be a solution!
The advantage of migrating to GitHub Actions is using YAML instead of Jenkins' own way of defining jobs. The YAML files are maintained under the .github folder of the respective project, which gives ownership of the workflows to the project/module owner. Below are some sample workflows that can be defined.
- Build Develop/Main - for manually building develop or main.
- Build PR - executed whenever a PR is raised for a merge into develop or main; it runs on every commit to the branch.
- Publish - after a successful build of develop or main, the built JAR is pushed to GitHub Packages.
- Release Manual - manually releasing & tagging a fixed version.
- Auto Deploy API to DEV - after a successful build of develop or main, changes are deployed to the DEVELOPMENT environment.
- Deploy API to ENV - manual deployment of a fixed version of the API to different environments.

Migrating to GitHub Actions and Packages could save a lot of the frustration that comes with the Jenkins-Nexus combo, and it gives you a chance to try out the latest tools rather than sticking with old and outdated technologies. The process may not be ideal for everyone, but it solves most of the issues with Jenkins.
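As an illustration, a minimal Build PR workflow might look like the sketch below (the file path `.github/workflows/build-pr.yml`, branch names, and Java version are assumptions):

```yaml
name: Build PR
# Runs on every commit of a PR targeting develop or main
on:
  pull_request:
    branches: [ develop, main ]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-java@v2
        with:
          distribution: 'temurin'
          java-version: '11'
      - name: Build with Gradle
        run: ./gradlew build
```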
As I said earlier, Stable CI/CD is not a Myth!
Testcontainers is a JVM library that allows users to run and manage Docker containers and control them from Java code. The integration test then runs external components as real Docker containers.
package com.test;
import org.springframework.boot.test.autoconfigure.jdbc.AutoConfigureTestDatabase;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.context.ApplicationContextInitializer;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.annotation.PropertySource;
import org.springframework.test.context.ActiveProfiles;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.support.TestPropertySourceUtils;
import org.springframework.test.web.servlet.request.MockHttpServletRequestBuilder;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.ext.ScriptUtils;
import org.testcontainers.jdbc.JdbcDatabaseDelegate;
import org.testcontainers.junit.jupiter.Testcontainers;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import java.util.Optional;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.put;
@Testcontainers
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT, classes = {com.test.Application.class})
@ActiveProfiles(AbstractBaseIntegrationTestConfiguration.ACTIVE_PROFILE_NAME_TEST)
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
@ContextConfiguration(initializers = AbstractBaseIntegrationTestConfiguration.DockerPostgreDataSourceInitializer.class)
public abstract class AbstractBaseIntegrationTestConfiguration {
protected static final String JDBC_URL = "jdbc.url=";
protected static final String JDBC_USERNAME = "jdbc.username=";
protected static final String JDBC_PASSWORD = "jdbc.password=";
protected static final String JDBC_DRIVER_CLASS_NAME_ORG_POSTGRESQL_DRIVER = "jdbc.driverClassName=org.postgresql.Driver";
protected static final String ACTIVE_PROFILE_NAME_TEST = "TestContainerTests";
//--
public static PostgreSQLContainer<?> postgreDBContainer;
protected ObjectMapper objectMapper = new ObjectMapper().disable(SerializationFeature.FAIL_ON_EMPTY_BEANS);
static {
// Init DB Script here
postgreDBContainer = new PostgreSQLContainer<>(IntegrationTestConstants.POSTGRESQL_IMAGE);
postgreDBContainer
.withInitScript(IntegrationTestConstants.INIT_DB_SCRIPT)
.withDatabaseName(IntegrationTestConstants.DB_NAME)
.withUsername(IntegrationTestConstants.DB_USERNAME)
.withPassword(IntegrationTestConstants.DB_PASSWORD);
postgreDBContainer.start();
var containerDelegate = new JdbcDatabaseDelegate(postgreDBContainer, "");
// Adding Database scripts here
ScriptUtils.runInitScript(containerDelegate, IntegrationTestConstants.MISSING_TABLES_SQL);
ScriptUtils.runInitScript(containerDelegate, IntegrationTestConstants.SAMPLE_DATA_SQL);
}
// This class adds the DB properties to Testcontainers.
public static class DockerPostgreDataSourceInitializer implements ApplicationContextInitializer<ConfigurableApplicationContext> {
@Override
public void initialize(ConfigurableApplicationContext applicationContext) {
TestPropertySourceUtils.addInlinedPropertiesToEnvironment(
applicationContext,
JDBC_DRIVER_CLASS_NAME_ORG_POSTGRESQL_DRIVER,
JDBC_URL + postgreDBContainer.getJdbcUrl(),
JDBC_USERNAME + postgreDBContainer.getUsername(),
JDBC_PASSWORD + postgreDBContainer.getPassword()
);
}
}
}
@Test
void checkIfUserExistInIdealCase() throws Exception {
final JSONObject request = new JSONObject();
request.put("email", "abc@test.com");
final MockHttpServletRequestBuilder postObject = getPostRequestExecutorBuilder("http://localhost:8080/v1/checkemail/", Optional.empty());
final MvcResult result = mockMvc.perform(postObject.content(request.toString())).andExpect(status().isOk()).andReturn();
final String content = result.getResponse().getContentAsString();
final SyncResponseDto responseDto = objectMapper.readValue(content, SyncResponseDto.class);
assertThat(responseDto.getResponseReturnCode()).isEqualTo(ResponseReturnCode.USER_EXIST);
}
Pull Requests (PRs), when raised, are often not properly documented. The best way to document them is to use a consistent template. The template helps the team document the PR in a concise way, and the PR reviewer gets an idea of what to expect.
Create a file named pull_request_template.md inside the .github folder at the root of the repository:
.github/pull_request_template.md
This file is nothing but a template that will be shown on the GitHub PR page whenever a PR is raised.
## Description
<!-- Please write a brief information about PR, what it contains, its purpose -->
## Link to Jira
<!-- If there is a ticket for this -->
## Screenshots
<!-- Please add screenshots -->
## Testing
<!-- How to test PR -->
Click to Download a sample PR Template.
Dependabot provides a way to keep your dependencies up to date. Depending on the configuration, it checks your dependency files for outdated dependencies and opens individual PRs, which can then be reviewed and merged as required.
Dependabot has limited support for Gradle. It looks for a build.gradle or a settings.gradle in your repo, then scans for outdated dependencies and creates PRs based on the available updates.
The issue arises when dependencies are maintained outside of these two files: Dependabot scans only build.gradle and settings.gradle. Most projects follow the standard of keeping versions in these files, but for the remaining ones Dependabot won't work at all.
There is a workaround to this issue. Follow the steps explained below.
Create a dependencies.gradle file and extract all the dependencies into it. The file name HAS TO BE dependencies.gradle, otherwise the solution will not work (version.gradle is also not supported!).
ext {
// -- PLUGINS
springBootVersion = "2.5.5"
springDependencyManagementVersion = "1.0.11.RELEASE"
....
//-- DEPENDENCIES
....
springFoxBootVersion = "3.0.0"
hibernateVersion = "5.4.31.Final"
c3p0Version = "0.9.5.5"
postgresVersion = "42.2.10"
....
supportDependencies = [
springfox_boot_starter : "io.springfox:springfox-boot-starter:$springFoxBootVersion",
hibernate_entitymanager : "org.hibernate:hibernate-entitymanager:$hibernateVersion",
hibernate_core : "org.hibernate:hibernate-core:$hibernateVersion",
c3p0 : "com.mchange:c3p0:$c3p0Version",
hibernate_java8 : "org.hibernate:hibernate-java8:$hibernateVersion",
postgresql : "org.postgresql:postgresql:$postgresVersion",
....
]
}
Update build.gradle to use dependencies.gradle:
buildscript {
apply from: 'dependencies.gradle'
}
plugins {
id 'org.springframework.boot' version "${springBootVersion}"
id 'io.spring.dependency-management' version "${springDependencyManagementVersion}"
....
}
dependencies {
....
implementation supportDependencies.springfox_boot_starter
implementation supportDependencies.hibernate_entitymanager
implementation supportDependencies.hibernate_core
implementation supportDependencies.c3p0
....
}
....
Add a .github/dependabot.yml file to the project:

version: 2
updates:
  - package-ecosystem: "gradle"
    directory: "/"
    schedule:
      interval: "daily"
Dependabot is an amazing tool for making sure your project gets the latest dependencies. But its Gradle support, compared to Maven, is limited when dependencies are not maintained in build.gradle or settings.gradle.
If you don't want to maintain the versions in these two files, you can tweak your Gradle files as described above so that Dependabot can scan the project and pick up the outdated dependencies.
Special Thanks to Sumedh.