
CloudHub Log Externalization to ELK Cloud using HTTP Appenders

Introduction

Logging is an essential part of monitoring, troubleshooting issues, identifying production errors, and visualizing operational data. Because logs must remain consistent and reliable, teams depend on them to uncover meaningful insights. However, CloudHub (except under the Titanium subscription) limits log retention to 100 MB or 30 days. A robust logging strategy therefore requires an external log analytics tool to effectively monitor applications. This blog focuses on CloudHub log externalization, explaining how MuleSoft applications can send logs to Elastic Cloud for reliable monitoring and analysis.

In this guide, we use Elastic Cloud as the external logging platform and integrate it with MuleSoft using the Log4j2 HTTP appender to forward Mule application logs. This approach supports logging for both CloudHub and on-premises deployments, improving visibility.

What is the ELK Stack?

The ELK stack is offered by Elastic and is named after three open-source components for ingesting, parsing, storing, and visualizing logs:

  • Elasticsearch – The storage layer of the stack, responsible for storing and searching logs. It runs as a cluster, so it scales horizontally with ease, and users interact with it through a REST API that provides fast search capabilities.
  • Kibana – Used to navigate and visualize log data stored in Elasticsearch, as well as create charts, metrics, and dashboards.
  • Logstash – This is the tool that takes in logs, parses them and sends them into Elasticsearch. Just like an integration tool, it receives input, performs transformations/filters, and then sends the output to another system.

What is the Elastic Cloud?

Elastic Cloud is the recommended way to consume Elastic products across cloud environments. It enables easy deployment in public or multi-cloud setups and extends Elastic's value with cloud-native features. Its benefits include:

  • On-demand computing
  • Pay only for what you use
  • Failover and fault tolerance
  • Common coding
  • Ease of implementation

Prerequisites and Important Considerations

Enable Logging for CloudHub Applications

Before enabling custom logging for a CloudHub application, you must first disable the default CloudHub logs. By default, this option is not visible; you must raise a support ticket with MuleSoft to enable it.
This step is critical for CloudHub log externalization because external tools rely on custom appenders instead of the default CloudHub logs.


Important Considerations When Disabling CloudHub Logs

Once you disable CloudHub logs, MuleSoft is not responsible for the following:

  • Logging data lost due to misconfiguration of your Log4j appender; teams must validate their configurations carefully.
  • Misconfigurations that result in performance degradation, running out of disk space, or other side effects.
  • When the default CloudHub application logs are disabled, only system logs remain available in CloudHub. For application worker logs, consult your own logging system; downloading logs is no longer an option in this scenario.
  • Only asynchronous log appenders can be used; synchronous appenders must not be used.
  • The reason to use asynchronous loggers rather than synchronous ones is to avoid threading issues: synchronous loggers can block threads while waiting for responses.
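The asynchronous requirement can be illustrated in miniature with Python's standard-library queue-based logging. The sketch below is only an analogy for how an async appender behaves, not MuleSoft code: the application thread enqueues records and returns immediately, while a separate listener thread performs the slow delivery.

```python
import logging
import logging.handlers
import queue

# Records delivered by the listener thread are collected here so we can
# inspect them after shutdown.
records = []

class ListHandler(logging.Handler):
    """Stand-in for a slow destination (e.g. a remote HTTP endpoint)."""
    def emit(self, record):
        records.append(record.getMessage())

# The application thread only enqueues records; the QueueListener's own
# thread performs the (potentially slow) delivery, so worker threads are
# never blocked waiting on the log destination.
log_queue = queue.Queue(-1)                       # unbounded in-memory queue
listener = logging.handlers.QueueListener(log_queue, ListHandler())
listener.start()

logger = logging.getLogger("elk-sample")
logger.setLevel(logging.INFO)
logger.addHandler(logging.handlers.QueueHandler(log_queue))

logger.info("worker thread returns immediately")  # enqueue and return
listener.stop()                                   # drain the queue and join
```

A synchronous handler in the same position would hold the worker thread for the full round-trip of every log call, which is exactly what CloudHub's async-only rule prevents.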

CloudHub Logging to ELK Using HTTP Appenders

There are two ways you can send logs to ELK:

  • HTTP Appender 
  • Socket Appender

Here, we use the HTTP appender for externalizing logs from CloudHub to Elastic Cloud, which allows logs to be streamed reliably and at scale. Through this approach, CloudHub log externalization becomes more flexible and scalable for enterprise monitoring needs.
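Conceptually, the HTTP appender serializes each log event as JSON and POSTs it to an Elasticsearch document endpoint. The Python sketch below illustrates roughly what such a payload looks like; the field names mirror Log4j2 JsonLayout defaults, but the real layout emits additional fields, so treat this as an approximation.

```python
import json
import time

# Rough sketch of what the Log4j2 HTTP appender does for every log event:
# serialize the event as JSON (the JsonLayout step) so it can be POSTed to
# the Elasticsearch document endpoint. Field names approximate JsonLayout's
# defaults (timeMillis, thread, level, loggerName, message).
def to_json_event(level: str, logger_name: str, message: str) -> str:
    return json.dumps({
        "timeMillis": int(time.time() * 1000),
        "thread": "main",
        "level": level,
        "loggerName": logger_name,
        "message": message,
    })

# One HTTP POST of this body to https://<es-endpoint>/vlogs/_doc
# creates one document in the "vlogs" index.
body = to_json_event("INFO", "org.mule.demo", "payload received")
```

Each Mule log event thus becomes one indexed document, which is what makes the logs searchable in Kibana later.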

Setting Up Elastic Cloud

First, we need to register at https://www.elastic.co/cloud/

  • Figure 1
Cloud

Then we have to log in to the registered account. Note that the free trial lasts only 14 days; after that, we have to pay for the services we use.

  • Figure 2


When you log in for the first time, it will ask you to create a default ELK deployment, after which Elasticsearch and Kibana are provisioned automatically.

  • Figure 3
Deployment

In Figure 3 we can see the endpoints of our applications, such as Elasticsearch and Kibana. These endpoints are required for logging. Elastic Cloud also creates a username and password for Elasticsearch; note them down or download the credentials file.

  • Figure 4
Instances

Figure 4 shows the configuration of the instances on which the ELK stack has been provisioned.

Configuring Log4j2 HTTP Appender in Mule Application

Next, create a sample Mule application. Here I have created an application that receives a JSON body through a POST method, stores the payload in a variable, and simply prints the JSON payload.

  • Figure 5
Anypoint Studio
  • Now go to Project Explorer and open src/main/resources/log4j2.xml
  • Then add the HTTP appender tag shown below to the configuration
<Http name="ELK"
    url="https://elk-deployment-30cf9a.es.us-central1.gcp.cloud.es.io:9243/vlogs/_doc">
    <JsonLayout compact="true" eventEol="true" properties="true" />
    <Property name="Content-Type" value="application/json" />
    <Property name="Authorization" value="Basic ZWxhc3RpYzpESUFsZTZUUndKSld2Sm5FQ3lUUmtHemg=" />
</Http>

In the XML above, the url is the Elasticsearch endpoint, which you can obtain from the Manage deployment section shown in Figure 3. The vlogs path segment after the host specifies the index you want to use, and _doc (or _create) instructs Elasticsearch to create the document, which in turn creates the index that Kibana can discover.

The Authorization property value comes from the credentials we noted earlier when setting up Elastic Cloud: it is Basic followed by base64(username + ":" + password). You can also generate it with any Basic Authentication header generator. We also need to add the following tag under Configuration/Loggers:

<AsyncRoot level="INFO">
    <AppenderRef ref="ELK" />
</AsyncRoot>
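The Basic header value can be reproduced in a few lines of Python. This sketch uses hypothetical credentials ("elastic" / "changeme") purely for illustration; substitute the ones Elastic Cloud generated for your deployment.

```python
import base64

def basic_auth_header(username: str, password: str) -> str:
    """Return the HTTP Basic auth header value:
    'Basic ' + base64(username + ':' + password)."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8"))
    return "Basic " + token.decode("ascii")

# Hypothetical credentials for illustration only.
print(basic_auth_header("elastic", "changeme"))
# → Basic ZWxhc3RpYzpjaGFuZ2VtZQ==
```

The resulting string is exactly what goes into the Authorization Property of the HTTP appender.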

Now run the Mule application in Anypoint Studio and generate some logs by hitting the endpoint. If logs flow without any HTTP appender errors, we are good to go. Next, we need to verify that the index has been created.
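Before switching to the Kibana UI, a quick API call can also confirm the index exists. The sketch below builds (but deliberately does not send) a request against the same deployment endpoint and Basic auth header used in the appender, so it stays runnable offline; uncomment the last lines to actually query your cluster.

```python
import urllib.request

# Endpoint and header values match the appender configuration above;
# substitute your own deployment's Elasticsearch URL and credentials.
ES_URL = "https://elk-deployment-30cf9a.es.us-central1.gcp.cloud.es.io:9243"
AUTH = "Basic ZWxhc3RpYzpESUFsZTZUUndKSld2Sm5FQ3lUUmtHemg="

# GET /<index>/_count returns the number of documents in the index.
req = urllib.request.Request(
    url=ES_URL + "/vlogs/_count",
    headers={"Authorization": AUTH},
    method="GET",
)

# Uncomment to actually call the cluster; a 200 response with a body
# like {"count": ..., "_shards": {...}} confirms log events are indexed.
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```

If the call returns a 404, the index has not been created yet, which usually means no log event has reached Elasticsearch.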

Verifying Logs in Kibana

  • We have to go to the Elastic Cloud account, then click on Management
  • Figure 6
Homepage
  • Then we will go to this page
  • Figure 7
Stack Management
  • Then go to Data views under Kibana, which opens the Data Views page
  • Figure 8
Data views
  • Now click on Create data view and enter the index name that we appended to the URL; here we used “vlogs”. The index is created automatically when the deployed application sends its first log event. Then click Create data view.
  • Figure 9
Data view
  • Now click on the Discover option under Analytics, so that you can view indexed log entries.
  • Figure 10
Data view
  • Now we can see the vlogs index, and on the right side we can see the logs
  • Figure 11
Elastic code
  • So far we have achieved our main goal of externalizing the CloudHub logs of our Mule application. To also keep logging to the CloudHub log console of your application, add the Log4J2CloudhubLogAppender to your log4j2.xml.

After adding all the necessary XML tags, the log4j2.xml file will look like this:

<?xml version="1.0" encoding="utf-8"?>
<Configuration>

    <!--These are some of the loggers you can enable. 
        There are several more you can find in the documentation. 
        Besides this log4j configuration, you can also use Java VM environment variables
        to enable other logs like network (-Djavax.net.debug=ssl or all) and 
        Garbage Collector (-XX:+PrintGC). These will be appended to the console, so you will 
        see them in the mule_ee.log file. -->

    <Appenders>
        <RollingFile name="file" fileName="${sys:mule.home}${sys:file.separator}logs${sys:file.separator}elk-sample-api.log"
                 filePattern="${sys:mule.home}${sys:file.separator}logs${sys:file.separator}elk-sample-api-%i.log">
            <PatternLayout pattern="%-5p %d [%t] [processor: %X{processorPath}; event: %X{correlationId}] %c: %m%n"/>
            <SizeBasedTriggeringPolicy size="10 MB"/>
            <DefaultRolloverStrategy max="10"/>
        </RollingFile>
<Http name="ELK"
    url="https://elk-deployment-30cf9a.es.us-central1.gcp.cloud.es.io:9243/vlogs/_doc">
    <JsonLayout compact="true" eventEol="true" properties="true" />
    <Property name="Content-Type" value="application/json" />
    <Property name="Authorization" value="Basic ZWxhc3RpYzpESUFsZTZUUndKSld2Sm5FQ3lUUmtHemg=" />
</Http>
<Log4J2CloudhubLogAppender name="CLOUDHUB"
    addressProvider="com.mulesoft.ch.logging.DefaultAggregatorAddressProvider"
    applicationContext="com.mulesoft.ch.logging.DefaultApplicationContext"
    appendRetryIntervalMs="${sys:logging.appendRetryInterval}"
    appendMaxAttempts="${sys:logging.appendMaxAttempts}"
    batchSendIntervalMs="${sys:logging.batchSendInterval}"
    batchMaxRecords="${sys:logging.batchMaxRecords}"
    memBufferMaxSize="${sys:logging.memBufferMaxSize}"
    journalMaxWriteBatchSize="${sys:logging.journalMaxBatchSize}"
    journalMaxFileSize="${sys:logging.journalMaxFileSize}"
    clientMaxPacketSize="${sys:logging.clientMaxPacketSize}"
    clientConnectTimeoutMs="${sys:logging.clientConnectTimeout}"
    clientSocketTimeoutMs="${sys:logging.clientSocketTimeout}"
    serverAddressPollIntervalMs="${sys:logging.serverAddressPollInterval}"
    serverHeartbeatSendIntervalMs="${sys:logging.serverHeartbeatSendIntervalMs}"
    statisticsPrintIntervalMs="${sys:logging.statisticsPrintIntervalMs}">
<PatternLayout pattern="[%d{MM-dd HH:mm:ss}] %-5p %c{1} [%t]: %m%n"/>
</Log4J2CloudhubLogAppender>
    </Appenders>

    <Loggers>
        <!-- Http Logger shows wire traffic on DEBUG -->
        <!--AsyncLogger name="org.mule.service.http.impl.service.HttpMessageLogger" level="DEBUG"/-->
        <AsyncLogger name="org.mule.service.http" level="WARN"/>
        <AsyncLogger name="org.mule.extension.http" level="WARN"/>

        <!-- Mule logger -->
        <AsyncLogger name="org.mule.runtime.core.internal.processor.LoggerMessageProcessor" level="INFO"/>

        <AsyncRoot level="INFO">
            <AppenderRef ref="file" />
            <AppenderRef ref="ELK" />
            <AppenderRef ref="CLOUDHUB" />
        </AsyncRoot>
    </Loggers>

</Configuration>
  • Now export the application as a JAR file and save it; here the file name is elk-sample-api

  • Figure 12
Studio

Deploying the Mule Application to CloudHub

  • Now we have to deploy the Mule application to CloudHub
  • Figure 13
  • Here, click on Disable CloudHub logs
  • Figure 14
  • Select both checkboxes and apply the changes
  • Figure 15
  • Now deploy the application
  • Figure 16
  • Wait for the application to get deployed, and once it is available, use the application URL to generate logs.
  • Figure 17
  • Now go to the Elastic Cloud account and check the logs as we did above in Figure 11

The CloudHub console logs are shown below:

Conclusion

This is how CloudHub log externalization can be implemented for MuleSoft applications using Elastic Cloud and the Log4j2 HTTP appender, ultimately improving observability, monitoring, and troubleshooting.

References

https://blogs.mulesoft.com/dev-guides/how-to-tutorials/externalize-logs-to-the-elastic-stack/

https://www.mulesoft.com/anypoint-pricing