We hope you're enjoying your time at IBM Think 2019 - thank you for dropping by to chat with our team (at booth 598) and for checking out our blog. As promised, setting up modern logging for your Kubernetes clusters on IBM Cloud is really easy. In this article, we'll take a closer look at IBM Log Analysis with LogDNA and how to use it to log your cloud Kubernetes clusters.
With just a few clicks, IBM Cloud users can send logs from their Kubernetes clusters, Linux servers, and other cloud resources to a fully featured and secure LogDNA instance. Let's get started!
IBM Log Analysis with LogDNA is an IBM Cloud service that provides hosted log management using LogDNA. It lets you collect, analyze, and manage logs in a central location without having to provision or maintain your own logging solution. You can forward logs from your IBM Cloud Kubernetes clusters, servers, and applications in as little as three steps. In addition, you can leverage the IBM Cloud to manage the service, set access controls via IAM, and even archive older logs to IBM Cloud Object Storage.

When you provision an IBM Log Analysis with LogDNA instance, you get access to a LogDNA endpoint and web UI hosted on the IBM Cloud. Your logs are stored on the IBM Cloud itself, allowing you to colocate your logging service and applications for greater throughput and control. You get the full benefits of LogDNA—including fast log ingestion and searching, over 30 integrations and ingestion sources, and support for dozens of log formats—with the security and convenience of the IBM Cloud.
Before creating an IBM Log Analysis with LogDNA instance, you should meet the following prerequisites.

First, create an IBM Cloud Kubernetes Service cluster running Kubernetes version 1.10 or later. When choosing a location for the cluster, check the IBM Cloud Docs to see which regions LogDNA is available in. While you can log resources located in a different region, selecting the same region will simplify the process of managing your resources.

Next, make sure your IBM Cloud user account has the necessary IAM permissions. This includes Viewer access to the resource group that the Kubernetes cluster and logging service are in, Editor access to the Kubernetes cluster, and Manager access to the logging service.

Finally, install the IBM Cloud CLI on your workstation. This lets you manage your IBM Cloud resources and Kubernetes cluster remotely. Once your command line tools are configured and connected to your Kubernetes cluster, continue to the next step.
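If you want to check the version prerequisite from a script, a small helper like the following works. This is a minimal sketch, not an official tool: the version_ge function name is our own, it assumes GNU sort is available, and the version string would come from your cluster details (for example, the IBM Cloud console or kubectl version).

```shell
# version_ge A B: succeed when dotted version A is at least version B.
# Uses sort -V (GNU coreutils) for natural version ordering.
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# Example: a 1.14.6 cluster satisfies the 1.10 minimum.
version_ge "1.14.6" "1.10" && echo "Kubernetes version OK"
```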
LogDNA uses a Kubernetes DaemonSet to ship logs from Kubernetes clusters to a LogDNA instance. The DaemonSet deploys a copy of the LogDNA agent to each node in the cluster, giving it access not only to container logs but also to log files stored on the node itself. Deploying the agent as a DaemonSet ensures that each node in the cluster runs one (and only one) instance of the agent, preventing duplicate log entries while guaranteeing complete coverage of all log-generating components.

For IBM Log Analysis with LogDNA, we created a DaemonSet that redirects logs to your private LogDNA instance. Like our original DaemonSet, it only requires your ingestion key and the execution of two commands. It uses LogDNA's ingestion servers for authentication but stores the logs themselves in the IBM Cloud.

Start by logging into the IBM Cloud using the IBM Cloud CLI, then change the context to your Kubernetes cluster. Replace CLUSTER-NAME-OR-ID with the actual name or ID of your cluster:

$ ibmcloud login -a https://api.ng.bluemix.net
$ ibmcloud ks cluster-config CLUSTER-NAME-OR-ID

When the second command completes, it displays a new command that you can use to configure your Kubernetes command line tool (kubectl). The command should look similar to the one below:

export KUBECONFIG=/Users/$USER/.bluemix/plugins/container-service/clusters/logdna-cluster/kube-config-hou02-logdna-cluster.yml

Copy this to your command line and run it. This lets you use kubectl to run commands on your IBM Cloud Kubernetes cluster.

Next, run the following command to store your LogDNA ingestion key in Kubernetes. Replace YOUR-INGESTION-KEY with your actual ingestion key:

$ kubectl create secret generic logdna-agent-key --from-literal=logdna-agent-key=YOUR-INGESTION-KEY

Finally, deploy the LogDNA DaemonSet using the following command:

$ kubectl create -f https://repo.logdna.com/ibm/prod/logdna-agent-ds-us-south.yaml

This deploys LogDNA agent Pods to each node in your cluster.
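As an aside, if you prefer to manage configuration declaratively, the ingestion key Secret created above can also be expressed as a manifest and applied with kubectl apply -f. This is a sketch under the assumption that the DaemonSet looks up the Secret name and key shown in the kubectl create secret command; stringData lets you supply the value unencoded, and Kubernetes base64-encodes it for you:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: logdna-agent-key
type: Opaque
stringData:
  # Replace with your actual ingestion key.
  logdna-agent-key: YOUR-INGESTION-KEY
```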
Once these Pods are running, logs from other Pods, nodes, and internal Kubernetes services will begin appearing in your IBM Log Analysis with LogDNA instance. You can confirm the health of the LogDNA DaemonSet by running kubectl get ds and looking for logdna-agent, or by accessing your Kubernetes Dashboard via the IBM Cloud web UI.

$ kubectl get ds
NAME           DESIRED   CURRENT   READY   UP-TO-DATE   AVAILABLE   NODE SELECTOR   AGE
logdna-agent   1         1         1       1            1           <none>          3d
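In a healthy rollout, the DESIRED and READY columns above should match. If you want to check this programmatically (say, in a monitoring script), a small helper can parse the output row; this is a hedged sketch, with the ds_healthy name being our own and the column positions (2 and 4) assuming the default kubectl column order shown above:

```shell
# ds_healthy LINE: succeed when a `kubectl get ds` row reports DESIRED == READY,
# i.e. every scheduled agent Pod is up and running.
ds_healthy() {
  echo "$1" | awk '{ exit ($2 == $4) ? 0 : 1 }'
}

# Example: a fully healthy three-node agent rollout.
ds_healthy "logdna-agent 3 3 3 3 3 <none> 3d" && echo "agent healthy"
```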
To view your logs, log into the IBM Cloud console, navigate to Observability, and select Logging. Here, you can view your LogDNA ingestion key, change your plan, and learn how to add log sources. You can also manage user access to the service or remove the service from your account. Click View LogDNA to open the LogDNA web UI.

The web UI displays all of your Kubernetes logs in real time as they are generated. You can access all of the features available in LogDNA, including log parsing, unlimited views, comprehensive searching, graphing, and more. LogDNA automatically extracts key fields from each event, including its origin Pod, container, node, and Kubernetes namespace.
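For example, because LogDNA parses the Kubernetes metadata described above into searchable fields, a search like the following would narrow the live stream to error-level events from a single namespace (the exact field names depend on how your logs are parsed, so treat this as an illustration rather than a guaranteed query):

namespace:kube-system level:error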
IBM Log Analysis with LogDNA is a fast, secure, and convenient way to manage your IBM Cloud logs. To learn more about creating your own IBM Log Analysis with LogDNA instance, visit the IBM Cloud Docs. If you're interested in learning more about logging Kubernetes, check out our Kubernetes Logging 101 guide. And if you have any questions, please don't hesitate to contact us.