We often receive questions like: Where is my message? Are you sure my result was routed to the right destination? How many orders have we received from our partner?
Every monitoring solution answers only part of these questions. We have experimented with the wireTap EIP in our Camel routes, but we then had to design a module that consumes those messages and stores them in a database.
The challenge for us was: how can we design our flows without thinking about monitoring? How can we audit integration flows that are already in production without any impact?
I think the answer is: Flow Activity Monitoring.
1. Each Camel flow deployed in JBoss Fuse is audited by Fuse BAI [REF-1]. You have to develop a BAI backend flow that consumes from the vm:audit endpoint and logs the content of each event using Camel MDC logging.
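For example, a minimal sketch of such a backend route using the Java DSL (the audit.flow log category and the log component options are illustrative assumptions, not something mandated by Fuse BAI):

import org.apache.camel.builder.RouteBuilder;

public class BaiBackendRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // Assumption: MDC logging is enabled so that camel.contextId, camel.routeId
        // and camel.exchangeId are populated for the sifting appender of step 2.
        getContext().setUseMDCLogging(true);

        // Consume the audit events published by Fuse BAI on vm:audit and log them;
        // "audit.flow" is an arbitrary log category chosen for this sketch.
        from("vm:audit")
            .to("log:audit.flow?level=INFO&showHeaders=true&showBody=true");
    }
}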
2. You have to set up a custom logging appender in JBoss Fuse in order to create a specific log file for each deployed camelContext, e.g.:
# Appender App
log4j.appender.app = org.apache.log4j.sift.MDCSiftingAppender
log4j.appender.app.key = camel.contextId
log4j.appender.app.default = unknown
log4j.appender.app.appender = org.apache.log4j.RollingFileAppender
log4j.appender.app.appender.layout = org.apache.log4j.PatternLayout
log4j.appender.app.appender.layout.ConversionPattern = %d{ISO8601} | %-5.5p | %X{camel.contextId} | %X{camel.routeId} | %X{camel.exchangeId} | %m%n
log4j.appender.app.appender.file = ${karaf.home}/log/apps/mediation-$\\{camel.contextId\\}.log
log4j.appender.app.appender.append = true
log4j.appender.app.appender.maxFileSize = 1MB
log4j.appender.app.appender.maxBackupIndex = 10
3. logstash [REF-2] listens to all log files in the directory ${karaf.home}/log/apps/, parses and filters each message into header fields, properties fields, body, breadcrumb, etc., and then sends the output to Elasticsearch [REF-3].
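For example, a minimal logstash configuration sketch along those lines (assuming logstash 1.x syntax; the absolute log path and the Elasticsearch host are placeholders for your environment):

input {
  file {
    # Assumption: the Fuse installation lives under /opt/fuse, i.e. ${karaf.home}/log/apps/
    path => "/opt/fuse/log/apps/*.log"
  }
}

filter {
  # Split the pipe-delimited pattern produced by the appender of step 2 into fields
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \| %{LOGLEVEL:level}%{SPACE}\| %{DATA:camelContextId} \| %{DATA:camelRouteId} \| %{DATA:camelExchangeId} \| %{GREEDYDATA:event}" ]
  }
}

output {
  elasticsearch {
    # Assumption: Elasticsearch runs on the same host
    host => "localhost"
  }
}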
4. Kibana for a nice graphical representation ;)
Here is the diagram of the solution:
Conclusion:
- Your mission-critical projects need management and monitoring. Today, this is possible using only open source products: logstash, Elasticsearch and Kibana.
- The upcoming version of JBoss Fuse, 6.1, will be able to send all message information to Elasticsearch automatically by enabling insight-camel, though the downside is that this assumes Fabric and 6.1.
Best regards,
Abdellatif BOUCHAMA (@a_bouchama)
References
[REF-1] - https://github.com/jboss-fuse/fuse/tree/master/bai
[REF-2] - http://logstash.net/
[REF-3] - http://www.elasticsearch.org/