After four months of preparation, I passed the CKAD exam earlier this week. The aim of this post is not to be another "how I passed" or "exam tips" write-up - there are plenty of those available. Rather, I'll share my thoughts on the exam format and on how to bring one's individual competency to the next level.
I was intrigued by an update to JFrog Xray's Helm installation documentation around mid-February, stating that "JFrog products cannot be joined together if one of them is in a cluster." Upon seeing this, there was only one logical thing to do - invest free time and effort into understanding the underlying reason.
The casual observer might wonder: if there are existing RDS offerings for both MySQL and PostgreSQL, why introduce Amazon Aurora at all? To understand its unique selling point, and its claims of scalability and cost-effectiveness, we need to look at how traditional relational databases handle scaling out.
Planning capacity for a Hadoop cluster is not easy, as there are many factors to consider across the software, hardware, and data aspects. Planning a cluster with too little data capacity and/or processing power may limit the operations and analytics that can be run on it, while planning for every possible scenario may be … Continue reading Apache Hadoop Data Capacity Planning
As part of my part-time Specialist Diploma in Big Data Management, we were asked to write a blog post sharing our thoughts on big data trends and how they can help companies increase their business value. I thought I'd share it 🙂 Question 1: How does big data help a company's performance? Data … Continue reading Big Data Business Value