Sunday, September 10, 2023

Using .env with maven

Dotenv (.env) files are quite popular in the Ruby and JS ecosystems.
They are not popular in the Java world, however, as we can use Maven (still not 0xDEAD ;) and pass configuration directly from the command line, or create a script for that. But why not take some standards from the other universes?
Even with the other ways to configure an app provided by e.g. Spring Boot, this may still be useful for Java developers.

You can use a .env file to inject variables into your Maven build; they can then get picked up by Spring Boot and passed to the configuration, e.g. while running tests.
The use case is to avoid putting secrets into version control while still letting local integration tests run - without any extra script files, and using a tool known by other developers.

There is of course a maven plugin for that:
<groupId>io.github.mjourard</groupId>
<artifactId>env-file-maven-plugin</artifactId>
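A minimal sketch of how the plugin could be wired into the build - the version and the goal name below are assumptions on my side, so check the plugin's README for the exact values:

```xml
<plugin>
  <groupId>io.github.mjourard</groupId>
  <artifactId>env-file-maven-plugin</artifactId>
  <!-- use the latest released version -->
  <version>1.0.0</version>
  <executions>
    <execution>
      <!-- bind early so the variables are visible to the rest of the build -->
      <phase>initialize</phase>
      <goals>
        <!-- goal name is an assumption; consult the plugin documentation -->
        <goal>load-env-files</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```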
I have prepared a small example of usage with groovy and spring boot.
You can find it here: https://github.com/konopski/using-dotenv-with-maven.

It is the simplest Spring Boot application, containing one bean (TheBean) which uses two configuration values. Very standard stuff to obtain a DB connection.
@Component
class TheBean {

    @Value('${spring.datasource.username}')
    String username

    @Value('${spring.datasource.password}')
    String pass
//...
Now Spring lets us deliver the actual values using the externalized configuration mechanism. Typically we use properties files and Spring profiles. In our example there is a 'dev' profile defined that provides a clear-text password. It is good enough if you remember to keep the dedicated profile properties file out of your version control.
The file (application-dev.properties) can look like this:
spring.datasource.password=secret
In the main application.properties file we do not define this entry - so in case it is missing (e.g. when omitting the profile) the app will not start.

You can also have another mechanism controlling your password. Spring will take a corresponding definition from the system environment.
In our case it is delivered by our .env file.
SPRING_DATASOURCE_PASSWORD=t3st_s3cr3t
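Spring Boot's relaxed binding is what makes this work: the upper-case, underscore-separated environment variable name is matched against the dotted property key, roughly:

```
# environment variable           property key
SPRING_DATASOURCE_PASSWORD  ->   spring.datasource.password
```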
That is verified in ApplicationTest class:

@SpringBootTest
class ApplicationTests {

    @Autowired TheBean bean

//...

    @Test
    void shouldReadFromDotEnv() {
        assert bean.pass == "t3st_s3cr3t"
    }
}
We can also use different values by using spring's properties overriding, adding to our .env file:
OVERRIDE_DB_PASS=password_override
And test for that:

@SpringBootTest(
    properties = ['spring.datasource.password=${OVERRIDE_DB_PASS}'] )
class PropsOverrideTest {

    @Autowired TheBean bean

//...

    @Test
    void shouldOverride() {
        assert bean.pass == "password_override"
    }

}
For me this approach is quite clean, allows easy sharing within the team, and composes quite well with the existing environment.
And remember to keep your passwords away from git.

Friday, September 27, 2019

Map or any other option?

I must admit that sometimes we can go too far with using Optionals as a silver bullet. Let's take a look at a simple mapper class, defined like this:
class Mapper {
  Optional<Dto> map(Optional<Entity> input) {
    //... implementation
  } 
}
At first passing an Optional instance looks like a good idea - we want to be safe from null values, we express our defensive intent in types. What can go wrong? Let's now take a look at how this mapper may be used.
Entity entity = //...
Dto dto = mapper.map(Optional.of(entity)).get();
Well, the typical code involving the mapper packs the input entity into an instance of Optional, just to get to the output value. Usage with a collection could look even worse.
List<Entity> ents = //...
List<Dto> dtos = ents.stream()
  .map(Optional::of)
  .map( x -> mapper.map(x) )
  .map(Optional::get)
  .collect(toList());
Instead of using the type system and compiler to our benefit, we lie about our input value - which we actually know is never null. Not only do we fail to use the knowledge we already have, we also put the extra burden of reading through the wrapping on the reader's eyes. Not to mention creating extra instances of Optional and making a habit of calling .get() - which is not a best practice I want to get familiar with.
What I also do not like in this code is forcing the user of our mapper to wrap the object into one particular implementation of Optional. Actually, the mapper should not care which implementation of Optional I use. There are more than the standard one, and they may have properties that are more useful in the context of my implementation. The last thing we may observe here is that the API actually mimics what Optionals already provide - the map method itself.
What we need is just:
Dto map(Entity in) {
//....
}
That simple.
The implementation of the mapping itself is what the mapper should provide directly, without requiring a wrapped value. We should require that the value be non-null. Optionality should be handled somewhere else, in client code - we will not tolerate null values anyhow, right? Such a mapper can also be easily applied to streams or collections. We save the machine's memory from wrapping non-null values into Optionals, and our precious eyes from reading the code that would cause the latter.
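A minimal sketch in Java of this idea (Entity and Dto here are hypothetical stand-ins for the types from the text): the mapper takes and returns plain values, and optionality, where it genuinely exists, stays at the call site.

```java
import java.util.List;
import java.util.Optional;

// Entity and Dto are hypothetical stand-ins for the types from the text.
record Entity(String name) {}
record Dto(String name) {}

class Mapper {
    // The mapper takes a plain, required-non-null value and returns a plain value.
    Dto map(Entity in) {
        return new Dto(in.name());
    }
}

class Demo {
    public static void main(String[] args) {
        Mapper mapper = new Mapper();

        // Plain value: no wrapping, no .get().
        Dto dto = mapper.map(new Entity("users_db"));

        // Collections compose naturally with the unwrapped signature.
        List<Dto> dtos = List.of(new Entity("a"), new Entity("b")).stream()
                .map(mapper::map)
                .toList();

        // Optionality stays in client code, where it actually exists.
        Entity maybeNull = null;
        Optional<Dto> maybeDto = Optional.ofNullable(maybeNull).map(mapper::map);

        System.out.println(dto.name() + " " + dtos.size() + " " + maybeDto.isPresent());
        // prints: users_db 2 false
    }
}
```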

Friday, July 19, 2019

Woobie Doobie - switching database access library in Scala

Doobie is getting more and more popular as a database access tool in modern applications. This year in the Scalar conference voting, Doobie won over its competition - particularly Slick. Not without reason. Recently my project also dropped Slick.
Doobie gave us nearly immediately:
  • easier learning curve - it is about plain, old SQL;
  • more compile-time verification - now you do not need to run your query to find out your type mapping is broken;
  • referential transparency and better integration with the effect system used in the project;
  • no hacking needed when you need to select more columns than the infamous 22.

One particular issue that came out quite early was a bit tricky to resolve, given this message:
    [error] Cannot construct a parameter vector of the following type:
    [error]
    [error]   Boolean(true) :: shapeless.HNil
    [error]
    [error] Because one or more types therein (disregarding HNil) does not have a Put
    [error] instance in scope. Try them one by one in the REPL or in your code:
    [error]
    [error]   scala> Put[Foo]
    [error]
    [error] and find the one that has no instance, then construct one as needed. Refer to
    [error] Chapter 12 of the book of doobie for more information.
    
Studying chapter 12 did not help. Here is the query that caused the issue:
sql"select one_col, two_col from TABLE where flag = ${true} "

It turns out that Doobie (actually Shapeless) does not handle the inlined true literal properly. What is needed is just to extract it to a separate value:
val flag = true
sql"select one_col, two_col from TABLE where flag = ${flag} "

That resolves the problem. It would also work if we added a type annotation:
sql"select one_col, two_col from TABLE where flag = ${true: Boolean} "

It's up to you which way you prefer.
    Wednesday, October 31, 2018

    Remove one file from git commit to a remote branch

    Ooops, I did it again. Committed and pushed one file too many.
    $ git push --set-upstream origin bugfix/JRASERVER-65811
    
    What now? How to remove the file from a public commit?

    Let's first go back to the base branch.
    $ git checkout -
    
    Switched to branch 'develop'
    Your branch is up to date with 'origin/develop'.
    
    I assume I may have some changes in my local copy, so I first need to clean it up. I am going to delete my local branch.
    $ git branch -D bugfix/JRASERVER-65811
    Deleted branch bugfix/JRASERVER-65811 (was 5e17f72f).
    
    Next step is to synchronize with remote repository and fetch the branch again.
    $ git pull
    remote: Enumerating objects: 119, done.
    remote: Counting objects: 100% (119/119), done.
    remote: Compressing objects: 100% (119/119), done.
    remote: Total 119 (delta 41), reused 0 (delta 0)
    Receiving objects: 100% (119/119), 43.63 KiB | 1.82 MiB/s, done.
    Resolving deltas: 100% (41/41), done.
    From gitlab:project/frontend
       736ac14a..ea0ba57e  bugfix/JRASERVER-65811-allopenissues -> origin/bugfix/JRASERVER-65811-allopenissues
    Already up to date.
    
    I can now switch back to the branch.
    $ git checkout bugfix/JRASERVER-65811
    Switched to a new branch 'bugfix/JRASERVER-65811'
    Branch 'bugfix/JRASERVER-65811' set up to track remote branch 'bugfix/JRASERVER-65811' from 'origin'.
    
    Time to bring the last commit into staging.
    $ git reset --soft HEAD^
    
    I can verify this was done by showing the status.
    $ git status
    On branch bugfix/JRASERVER-65811
    Your branch is behind 'origin/bugfix/JRASERVER-65811' by 1 commit, and can be fast-forwarded.
      (use "git pull" to update your local branch)
    
    Changes to be committed:
      (use "git reset HEAD <file>..." to unstage)
    
            new file:   src/app/lktree/lktree.component.spec.ts
            modified:   src/app/modules/spaces/data/space-datasource.ts
            modified:   src/app/modules/spaces/document-details/document-details.component.html
            modified:   src/app/modules/spaces/spaces-tree/spaces-tree.component.ts
            modified:   src/app/modules/user-context/data/user-context-datasource.ts
    
    The first of these files is the one I want to extract and unstage back to the working copy. This simply undoes git add.
    $ git reset src/app/lktree/lktree.component.spec.ts
    
    Yeah! My changes are now in the state I wanted!
    $ git status
    On branch bugfix/JRASERVER-65811
    Your branch is behind 'origin/bugfix/JRASERVER-65811' by 1 commit, and can be fast-forwarded.
      (use "git pull" to update your local branch)
    
    Changes to be committed:
      (use "git reset HEAD <file>..." to unstage)
    
            modified:   src/app/modules/spaces/data/space-datasource.ts
            modified:   src/app/modules/spaces/document-details/document-details.component.html
            modified:   src/app/modules/spaces/spaces-tree/spaces-tree.component.ts
            modified:   src/app/modules/user-context/data/user-context-datasource.ts
    
    Untracked files:
      (use "git add <file>..." to include in what will be committed)
    
            src/app/lktree/
    
    I can commit again - this time the right files.
    $ git commit src/app/modules/spaces/ src/app/modules/user-context/data/user-context-datasource.ts -m "JRASERVER-65811: allopenissues"
    [bugfix/JRASERVER-65811 80002b42] JRASERVER-65811: allopenissues
     4 files changed, 60 insertions(+), 25 deletions(-)
    
    The last step is to overwrite the remote branch with a force push. Do not try this at home ;)
    $ git push --force
    

    Friday, June 22, 2018

    Authenticating with deploy keys in Jenkins pipelines

    While using M$ github you may use deploy keys dedicated to a specific repository instead of giving your personal private key to Jenkins. And yes, it is possible to use a deploy key in Jenkins pipelines.

    To be able to manage your ssh identity, you first need to install the sshagent plugin.




    BTW, if you are running your Jenkins instance on an M$ windows machine, remember to add ssh-agent (e.g. from your git distribution) to your %PATH%.



    Generate a key pair.
    ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
    


    Go to Credentials in Jenkins' left-side main menu. Add credentials of type 'SSH Username with private key'. You can paste the generated private key into the text area.



    In the M$ github repository settings you can now add the corresponding public key.



    In your pipeline code you can use the credentials by surrounding e.g. git calls with an sshagent block.

                    sshagent(credentials: ['throw-me-away-key']) {
                        bat """git pull origin master"""
                    }
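    For context, such a block typically sits inside a pipeline stage. A sketch under assumptions: a Windows agent (hence bat) and a credential whose ID is 'deploy-key-id' - both names are mine, not from the original setup.

```groovy
pipeline {
    agent any
    stages {
        stage('Update') {
            steps {
                // 'deploy-key-id' is the credential ID in Jenkins, not its friendly name
                sshagent(credentials: ['deploy-key-id']) {
                    bat 'git pull origin master'
                }
            }
        }
    }
}
```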
    



    If you get errors, make sure that you are using the credential's ID in Jenkins, not its friendly name.


    10:09:04 FATAL: [ssh-agent] Could not find specified credentials
    10:09:04 [ssh-agent] Looking for ssh-agent implementation...
    10:09:04 [ssh-agent]   Exec ssh-agent (binary ssh-agent on a remote machine)
    10:09:04 $ ssh-agent
    10:09:04 SSH_AUTH_SOCK=/tmp/ssh-vCKYmwW5gfvP/agent.5592
    10:09:04 SSH_AGENT_PID=5572
    10:09:04 [ssh-agent] Started.
    10:09:04 [original] Running batch script
    10:09:04 
    10:09:04 C:\Program Files (x86)\Jenkins\workspace\lk-pipeline-0\original>git pull 
    10:09:06 $ ssh-agent -k
    10:09:06 git@github.com: Permission denied (publickey).
    10:09:06 fatal: Could not read from remote repository.
    10:09:06 
    10:09:06 Please make sure you have the correct access rights
    10:09:06 and the repository exists.
    10:09:06 unset SSH_AUTH_SOCK;
    10:09:06 unset SSH_AGENT_PID;
    10:09:06 echo Agent pid 5572 killed;
    10:09:06 [ssh-agent] Stopped.
    

    Friday, January 12, 2018

    OOP is not dead: Elegant Objects

    Recently I have observed a growing wave of what I would describe as a kind of anti-design movement. Experienced people are finding out they have been doing it wrong. One says: TDD is dead; another: we do not need interfaces. Script kiddies - pardon, dynamic-language programmers - claim that strongly typed languages are broken. It even comes to the very latest discovery that short methods are pure evil!
    I've just spoken to a programmer with nearly 20 years of experience in Java, who tried to convince me that interfaces are useless, just because in all that time he had never written an alternative implementation of a method. In the same conversation, however, he praised ORMs for the possibility of painlessly exchanging the DB engine!
    The leitmotiv of this new heresy is that we can easily cut off the effort introduced by OOP, design patterns, TDD, DDD, or anything else until now considered good practice, and focus on delivering just what matters - the `business value` itself. So that we can successfully bill the client and move on to the next project. This sounds good. It may even make sense. If your project is not going to be maintained for years, please feel free to skip paradigms. Use spaghetti architecture. Mix javascript, php, and perl, and have fun testing in production. Building a prototype is a kind of development: write once and dispose of the code soon. But in a project planned for more than three months, this approach is just as wrong as murdering your client with an axe and taking their money. Just like the Vikings did.

    Vikings (Wolin, April 2018)

    It is all about maintainability. If you use dynamic typing but do not cover your app with tests, expect your work to get broken by the colleagues sitting next to you and following your discipline. If you do not use interfaces, I expect I already know what your tests look like. Like `null`. If you prefer not to extract methods, I know how lightweight your code is. I have seen a Java class whose source code was over 1 megabyte! Imagine how fast you can deliver business value in such a project. If you think design is a waste - yeah... I have been there. I have seen an important application presenting different results on each subpage, thanks to an `if` pyramid of doom, copied, pasted, and happily living its own life. It does cost. Believe me. Technical debt is not something that will disappear just by forgetting about the matter.


    Elegant Objects by Yegor Bugayenko gives you a kind of framework. By following the author's 23 pieces of advice, you can learn how to use the object-oriented paradigm in a real language like Java. Yegor guides a personified object (Mr. Object) through the journey of life - in a very similar way to how we personify actors using e.g. Akka and follow some important rules not enforced by the language itself. This concept seems to be the core of the author's approach to improving the maintainability of the software we create. When we interact with a person, we stick to a polite way of communicating, assuming we are not talking to an idiot. The same holds for our objects.

    There are some other helpful tricks and lots of strong opinions on certain code constructs and patterns/antipatterns. The book gives a number of easy-to-remember code examples. It explains (or reminds you) why constructor injection is the only right way and why `static` is not welcome (even if we now do FP in Java). If you are more interested in the content of the book, I can recommend Tomek's review and Yegor Bugayenko's page.

    Some of the author's rules may, I think, be a bit controversial and exaggerated - at least in the context of my last project. The only point I really didn't like was Yegor's approach to C++, which is a bit unfair, as that language can be just as object-oriented as Java.

    I am not claiming that there is only one way to create maintainable software. Whether you use header files to separate contract from implementation, or Java interfaces, or maybe free monads that separate a program from its interpretation - it is up to you. But please, pick some set of rules and stick to them consistently. That is what your client pays you for.

    Friday, December 8, 2017

    Always say "Junk fix", never "quick fix"

    In the software development area, names are almost the only context that can be added to a computation. By using a common vocabulary we build a shared understanding of how things work and what they mean. When we communicate with the outside world using our language, we also lay a cornerstone under how e.g. business people will be able to visualize the system we are responsible for.

    One of the greatest examples of a metaphor used to explain the nature of projects to stakeholders is the term coined by Ward Cunningham: technical debt. In a very illustrative way, it mimics the well-known figure of financial debt to explain the costs and possible impact that unmaintained software brings to its owners. At the opposite pole, I see a term used very often while bringing much confusion and leading to misunderstandings and even conflicts in many organisations: the "quick fix". Quick and dirty - and we tend to forget how dirty it was, leaving it in our software just to let it rot. Please keep in mind that each time you say "quick fix", what you really communicate is "cheap fix". Is that what you meant? Will it be that quick in the final picture? Or is it just a half-done, ad hoc patch that will not constitute a solution, but may even have a counterproductive impact on the changed process?



    About the false velocity of quick fixes
    Recently I watched Netflix's "Cooked", a series of four visually elaborate stories about food and eating. It was about real food, which brings value to our life, as opposed to what is often offered to us: a mass product of industry, optimized for price only, washed of nearly everything our body needs. Junk food.

    I find this a perfect metaphor for what is often offered as a "quick fix". The word junk explains exactly the nature of the fix, without giving the false impression that anything cheap is being discussed. Let's be clear about it.
    It is a "junk fix", not a "quick fix".