
Davinder Pal
Senior Software Engineer III ( R&D )

This is the story of building 190+ Ansible modules, which were later combined into one package called community.missing_collection.


Aug, 2018

I was working on a Big Data platform called "MapR", inspired by Apache Hadoop. It had a hell of a lot of components, which were earlier managed by the MapR CLI tool, and it was hard to support and get things done with. I was tasked with upgrading the platform and creating/updating the automation for all of this, because we had 10–15 clusters and I didn't want to do it manually in every environment. I did manage to build the automation to do the upgrade on…

How I turned things around and made a success of it for myself.


As I mentioned in the second article of this series, my PR was stuck and the reviewer wasn't ready to merge it because of one line of code.

The situation is still the same, so let's not waste any more time on that PR; it's just too much for me.

Let's talk about how I learned something new from it and built something useful for the Ansible community.

Another bold thing that happened in Ansible was the decision to split modules into separate sets of collections; some modules will be maintained…

I hope you have read part 1 of this series; if not, I would definitely recommend reading it first to get the full context.

Let's start the next part of the story. This part will be quite interesting if you are an Ansible community developer.

Last PR:

Finally, a reviewer was interested in reviewing the code, after 2 years. He started reviewing my code in Sep 2020 and continued until Dec 2020, but in December the Ansible bot blocked my PR, saying it should be rebased against ansible again so it can be…

It all started when I was setting up a new AWS region deployment for a project. It took me a week to understand the process and implement it, but that's not important for this article; let's move on to the interesting topic. My project stored all its credentials in AWS Parameter Store, simply because it works with most tools and makes storing sensitive data easy.

As far as I remember, I had never thought about DR for AWS Parameter Store; it was a very subtle topic and had never crossed my mind.

AWS Parameter Store does have versioning, but that's not…

Honestly, there aren't many good backup tools for Apache Kafka, but why do we need a backup tool in the first place? Apache Kafka doesn't ship any offsite backup or DR solution. There are tricks that can be used, but none are 100% reliable.

Example trick:

Create an MM1 (MirrorMaker 1) cluster; it will sync data one-way to another Apache Kafka cluster only.


My project/utility creates offsite backups in local and cloud storage. Currently, it supports any locally mounted storage and AWS S3.

Project Source Code: 116davinder/apache-kafka-backup-and-restore

Here is a sample input configuration for AWS S3:

{
  "BOOTSTRAP_SERVERS": "kafka01:9092,kafka02:9092,kafka03:9092",
  "TOPIC_NAMES"…
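For context, here is a minimal sketch of what the local-storage half of such a utility can look like. The function names and config shape below are my assumptions, loosely modeled on the snippet above; they are not the project's actual API:

```python
import json
import os
import shutil
import time


def load_config(path):
    """Read the JSON configuration (shaped like the snippet above)."""
    with open(path) as fh:
        return json.load(fh)


def backup_to_local(src_file, backup_dir):
    """Copy a file into a date-stamped folder on locally mounted storage."""
    dest_dir = os.path.join(backup_dir, time.strftime("%Y-%m-%d"))
    os.makedirs(dest_dir, exist_ok=True)
    return shutil.copy2(src_file, dest_dir)
```

The real project also streams topic data and uploads to AWS S3; see the repository linked above for the full implementation.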

It's about the painful and unsuccessful PR to the open-source Ansible project.

May, 2018

I was working on the CI/CD part for the company, and I had to integrate a monitoring product called #NewRelic. After a little research, I found out that NewRelic has a Record Deployment feature: on each deployment/re-deployment, we can hit the NewRelic Record Deployment API.

Ansible has a default module called newrelic_deployment, developed by someone in 2016 or earlier, which worked fine for my use case. …

Let's talk a bit about Ansible filters.

An Ansible filter is a simple function (or set of functions) that transforms a given input into a desired output.

Ansible Docs:

Create a folder called filter_plugins.

Create a .py file with any name, containing the content below.

import os

class FilterModule(object):
    """Ansible looks for a class with exactly this name in filter_plugins/*.py."""

    def filters(self):
        # Map the filter name (as used in playbooks) to its implementation.
        return {'lastFolder': self.lastFolder}

    def lastFolder(self, path):
        # For a directory, return its own name; for a file, return its parent folder.
        if os.path.isdir(path):
            return path.split('/')[-1]
        return path.split('/')[-2]

Please keep in mind that the class name and the filters method must be exactly as shown above; otherwise it won't work.
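To sanity-check the filter without running Ansible at all, the class can be exercised directly in plain Python. The class body is repeated here so the snippet runs standalone:

```python
import os

class FilterModule(object):
    """Same plugin class as above, repeated so this snippet is self-contained."""

    def filters(self):
        return {'lastFolder': self.lastFolder}

    def lastFolder(self, path):
        if os.path.isdir(path):
            return path.split('/')[-1]
        return path.split('/')[-2]

# Resolve the filter the same way Ansible would: by name, via filters()
last_folder = FilterModule().filters()['lastFolder']
print(last_folder('/tmp'))         # '/tmp' is a directory, so its own name: 'tmp'
print(last_folder('/tmp/t1.txt'))  # a file path, so the parent folder: 'tmp'
```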

Create a sample playbook to test it: filter.yml

---
- hosts: localhost
  become: false
  gather_facts: false
  vars:
    - p1: /tmp/t1.txt…
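The playbook above is cut off. Under the same layout, a complete version might look roughly like this; this is only a sketch, and everything beyond the p1 variable is my assumption:

```yaml
---
- hosts: localhost
  become: false
  gather_facts: false
  vars:
    - p1: /tmp/t1.txt
  tasks:
    - name: Print the last folder of p1
      debug:
        msg: "{{ p1 | lastFolder }}"
```

Run it from the directory that contains filter_plugins/, so Ansible can auto-discover the filter: ansible-playbook filter.yml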

August, 2020 ( Initial Support for Docker Containers )


In this article, I will share things about running #CMAK in #Containers.

I am using the Docker runtime as a starting point, but the same things can be extended to CoreOS / LXC / etc.

As CMAK is a standalone application and doesn't store its state on the local machine, it's an ideal candidate for containers.

There are two ways to build a CMAK image:

  1. Build CMAK from scratch.
  2. Use a pre-built CMAK jar.

I am going to share the details of building CMAK from scratch.

Let's check the Dockerfile:

FROM openjdk:11
LABEL AUTHOR="Davinder Pal"
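The Dockerfile is cut off above. A build-from-scratch image might continue roughly like this; this is a sketch, not the article's actual file. The clone URL and the ./sbt clean dist step come from the upstream CMAK README, and the rest is my assumption:

```dockerfile
FROM openjdk:11
LABEL AUTHOR="Davinder Pal"

# Build prerequisites (illustrative)
RUN apt-get update && apt-get install -y git unzip

# Fetch CMAK sources and build a distributable zip with the bundled sbt wrapper
RUN git clone https://github.com/yahoo/CMAK.git /opt/CMAK
WORKDIR /opt/CMAK
RUN ./sbt clean dist
```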

July, 2020 ( Kafka Consumer Group Monitoring )

Consumer group monitoring is very important because it provides stats about consumer applications and how far behind (aka the lag) each application is from the actual stream of data.
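In other words, per-partition lag is just the log end offset minus the group's last committed offset. A toy illustration, with made-up offset numbers:

```python
def consumer_lag(end_offsets, committed_offsets):
    """Per-partition lag: how far the consumer group trails the end of the log."""
    return {
        p: end_offsets[p] - committed_offsets.get(p, 0)
        for p in end_offsets
    }

# Made-up offsets for a 3-partition topic
end = {0: 1500, 1: 980, 2: 2100}
committed = {0: 1500, 1: 950, 2: 1700}
print(consumer_lag(end, committed))  # {0: 0, 1: 30, 2: 400}
```

Tools like CMAK surface exactly this number per partition and per group, so you can spot a struggling consumer at a glance.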

First: Yahoo Kafka Manager aka CMAK

Trick 3: Ansible Hacks to Boost Execution Performance

Why do we need to do this?

  1. To increase the speed of Ansible execution.
  2. To work on more nodes in parallel.
  3. Faster re-runs of Ansible.
  4. Less clutter on the Ansible server.

Let me share my default Ansible configuration, aka ansible.cfg; you can also take it from my GitHub profile.

[defaults]
host_key_checking = False
command_warnings = False
forks = 100
timeout = 30
retry_files_enabled = False

[ssh_connection]
ssh_args = -C -o ControlMaster=auto -o ControlPersist=1200s -o BatchMode=yes
control_path = /tmp/ansible-%%h-%%p-%%r

Let's talk about how the above configuration works.

host_key_checking = False

With this, Ansible ignores the SSH host key verification step. …
