APACHE CASSANDRA (9 articles)
Working with tombstones can be one of the more frustrating aspects of Apache Cassandra®. Problems with tombstones can happen when a large number of deletes occur within a short period of time, resulting in slower queries and disk bloat. This …
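As a hedged illustration of the mechanism the article covers (keyspace, table, and column names here are hypothetical): a delete in CQL writes a tombstone rather than removing data immediately, and tombstones are only purged during compaction after the table's `gc_grace_seconds` window has passed.

```sql
-- Deleting rows writes tombstones; the underlying data is not removed immediately.
-- A range delete like this produces a range tombstone.
DELETE FROM metrics.events
WHERE sensor_id = 42 AND event_time < '2021-01-01';

-- Tombstones become eligible for purging during compaction only after
-- gc_grace_seconds has elapsed (default: 864000 seconds, i.e. 10 days).
ALTER TABLE metrics.events WITH gc_grace_seconds = 86400;
```

Lowering `gc_grace_seconds` purges tombstones sooner, but it must stay longer than your repair interval so that deletes propagate to all replicas first.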
Building Reactive Java Applications with Spring Framework
Pretty much all Java developers are familiar with Spring Pet Clinic. This article walks through a reactive implementation of it using Apache Cassandra.
Kubernetes Data Simplicity: Getting Started with K8ssandra
K8ssandra is an open-source distribution of Apache Cassandra for Kubernetes, with a rich set of trusted open-source services and tooling.
Build a Status Dashboard Using Spring Boot and Astra DB
Learn how to leverage the Astra Document API on top of Cassandra to build a status dashboard, with Spring Boot handling the rendering.
Backing up K8ssandra with MinIO
K8ssandra includes Medusa for Apache Cassandra® to handle backup and restore for your Cassandra nodes. Medusa was recently upgraded to support all S3-compatible backends, including MinIO, the popular Kubernetes-native object storage suite. Let’s see how to set up …
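As a rough sketch of what this configuration looks like (the key names below are assumptions based on the K8ssandra v1 Helm chart and may differ in your chart version — verify against the chart's values reference before use), pointing Medusa at an in-cluster MinIO bucket via Helm values might look like:

```yaml
# Hypothetical K8ssandra Helm values fragment -- key names are assumptions,
# check them against your chart version.
medusa:
  enabled: true
  # s3_compatible covers any backend speaking the S3 API, including MinIO
  storage: s3_compatible
  storage_properties:
    host: minio.minio.svc.cluster.local   # in-cluster MinIO service DNS name
    port: 9000
    secure: "False"                       # plain HTTP inside the cluster
  bucketName: k8ssandra-medusa
  storageSecret: medusa-bucket-key        # Secret holding the MinIO credentials
```

The bucket and the credentials Secret must exist before Medusa starts, since Medusa validates access to the backend on startup.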
Developing an Enterprise-Level Apache Cassandra Sink Connector for Apache Pulsar
Streaming with Apache Pulsar: connecting existing enterprise data sources to Apache Cassandra using Pulsar.
K8ssandra: Production-Ready Platform for Running Apache Cassandra on Kubernetes
K8ssandra aims to provide a “production-ready platform”, and this includes automation for operational tasks such as repairs, backups, and monitoring.
Connect Stateful Workloads Across Kubernetes Clusters
Learn how, with a bit of creativity, we were able to utilize off-the-shelf functionality and components to set up lightweight cross-cluster communication for apps in a multitenant environment.
While this solution is applicable to a somewhat narrow use case, it does come with several important advantages, including a shared-nothing architecture, low complexity, low operational burden, and a high degree of isolation between tenants.
Best Practices for Data Pipeline Error Handling in Apache NiFi
We have tools such as Apache NiFi, which allow us to design and manage our data pipelines, reducing the amount of custom programming and increasing overall efficiency.
Yet when building data pipelines, a key and often neglected aspect is minimizing potential errors.
Although many factors influence a NiFi data pipeline, three important ones are understanding the nature of your data sources and targets, minimizing maintenance, and following best practices.