Monday 15 January 2024

How to send Microservice (Application) logs from Azure Kubernetes Service (AKS) to Azure Blob Storage

Problem: One of my projects runs more than 30 microservices, all of which push their logs to Azure Log Analytics. This caused a huge bill, even after reducing the retention period to the minimum supported 30 days and trying other options such as purging and deleting logs. I contacted Microsoft and got the response that the product has been designed that way: purging does not reduce the cost, and there is no direct option to delete logs from an Azure Log Analytics workspace.
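For reference, the retention reduction mentioned above can be done with the Azure CLI; the resource group and workspace names below are placeholders for your own values:

# set the Log Analytics workspace retention to the minimum 30 days
az monitor log-analytics workspace update --resource-group <resource-group> --workspace-name <workspace-name> --retention-time 30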

Solution: I put a solution in place using Fluent Bit to collect all logs from AKS and forward them to Azure Blob Storage, where they are stored for further processing and analysis. This drastically reduced the cost of the project. Fluent Bit runs as a DaemonSet in AKS, collects the logs of every microservice running in the cluster, formats them, and sends them to Azure Blob Storage. As an addition to this solution, Grafana and Prometheus can be integrated for better analysis and visualisation of the logs.
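Once the DaemonSet is deployed (the manifest is shown further below), a quick way to confirm that one Fluent Bit pod is running on every AKS node is:

# the DaemonSet schedules one pod per node; the label matches the manifest below
kubectl get daemonset fluentbit -n default
kubectl get pods -n default -l name=fluentbit -o wide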

What is Fluent Bit: Fluent Bit is an open-source telemetry agent specifically designed to efficiently handle the challenges of collecting and processing telemetry data across a wide range of environments, from constrained systems to complex cloud infrastructures. Managing telemetry data from various sources and formats can be a constant challenge, particularly when performance is a critical factor.

What is Azure Blob Storage: Azure Blob Storage is Microsoft's object storage solution for the cloud. Blob Storage is optimised for storing massive amounts of unstructured data. Unstructured data is data that does not adhere to a particular data model or definition, such as text or binary data.

Solution Diagram: (Microservices on AKS → Fluent Bit DaemonSet → Azure Blob Storage)



How to Achieve It:
1. Create a Blob Storage account in your Azure subscription (Storage Account): To create an Azure Blob Storage account, follow the official Azure documentation; it is a very straightforward process.
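If you prefer the Azure CLI over the portal, a minimal sketch looks like this; the account, container, and resource group names are placeholders you should replace with your own:

# create the storage account and the blob container for the logs
az storage account create --name <storage-account-name> --resource-group <resource-group> --location <region> --sku Standard_LRS
az storage container create --account-name <storage-account-name> --name <container-name>

# fetch the shared key that the Fluent Bit azure_blob output expects
az storage account keys list --account-name <storage-account-name> --resource-group <resource-group> --query "[0].value" -o tsv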

2. Create a deployment YAML for the Fluent Bit DaemonSet with the configuration details of the Azure Blob Storage account where the application logs are to be forwarded and saved: Below is the deployment file I created and deployed. I have removed the sensitive information from it; you need to provide those values for your own environment.

Deployment File:
apiVersion: v1
kind: ConfigMap
metadata:
  name: fluentbit-config
  namespace: default
data:
  fluent-bit.conf: |
    [SERVICE]
        Flush        5
        Log_Level    info
        Parsers_File parsers.conf

    [INPUT]
        name             tail
        path             /var/log/containers/*.log
        multiline.parser docker
        tag              *

    [OUTPUT]
        name                  azure_blob
        match                 *
        account_name          <storage-account-name>
        shared_key            <storage-account-shared-key>
        path                  akslog
        container_name        <container-name>
        auto_create_container on
        tls                   on
---
apiVersion: v1
kind: ConfigMap
metadata:
  name: fluentbit-parsers
  namespace: default
data:
  parsers.conf: |
    [PARSER]
        Name        docker
        Format      json
        Time_Key    time
        Time_Format %Y-%m-%dT%H:%M:%S.%L
        Time_Keep   On
---
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: fluentbit
  namespace: default
  labels:
    app: fluentbit-logging
spec:
  selector:
    matchLabels:
      name: fluentbit
  template:
    metadata:
      labels:
        name: fluentbit
    spec:
      volumes:
        - name: varlog
          hostPath:
            path: /var/log
            type: ''
        - name: varlibdockercontainers
          hostPath:
            path: /var/lib/docker/containers
            type: ''
        - name: fluentbit-config
          configMap:
            name: fluentbit-config
            defaultMode: 420
        - name: fluentbit-parsers
          configMap:
            name: fluentbit-parsers
            defaultMode: 420
      containers:
        - name: fluentbit
          image: fluent/fluent-bit:latest
          resources: {}
          volumeMounts:
            - name: varlog
              mountPath: /var/log
            - name: varlibdockercontainers
              mountPath: /var/lib/docker/containers
            - name: fluentbit-config
              mountPath: /fluent-bit/etc/
            - name: fluentbit-parsers
              mountPath: /fluent-bit/etc/parsers.conf
              subPath: parsers.conf # mount the single file, not a directory, so Parsers_File can resolve it
          terminationMessagePath: /dev/termination-log
          terminationMessagePolicy: File
          imagePullPolicy: Always
      restartPolicy: Always
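To roll this out and confirm that logs are reaching the storage account, something along these lines can be used; the file, account, and container names are placeholders:

# deploy the ConfigMaps and the DaemonSet
kubectl apply -f fluentbit-daemonset.yaml

# wait for the pods and check the agent logs for azure_blob upload activity
kubectl rollout status daemonset/fluentbit -n default
kubectl logs daemonset/fluentbit -n default --tail=50

# verify that blobs are appearing under the configured path (akslog)
az storage blob list --account-name <storage-account-name> --container-name <container-name> --prefix akslog --output table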