logging

ElasticSearch CLI Tools - Part 1

11 minute read Published: 2019-05-18

While working at Booking.com, I was looking for a logging solution that matched the ease of use and power that Graphite provided for metrics. Reluctant to bring a new technology into production, I talked to co-workers, and one mentioned that they were using ElasticSearch in some front-end systems for search and disambiguation. He also mentioned hearing of a few projects using ElasticSearch for storing log data.

This began my love-hate-love relationship with ElasticSearch. I've spent the past 8 years working with ElasticSearch professionally and in my spare time. Graphite and ElasticSearch are two projects that change the game in terms of exploring your data. The countless insights I've gained into system performance, application performance, and system and network security with these tools are unparalleled. Tools like Grafana and Kibana allow you to visualize your data quickly and beautifully. As a system and security engineer, though, sometimes this isn't enough. I spend most of my day in a terminal and needed something to explore and pivot through the data there.

This is the first part in a multi-part series about a tool I created to make ElasticSearch's powerful search interface more accessible from the terminal. This tool has been essential to nearly every incident I've investigated. It was developed with the help, patience, and amazing ideas of co-workers both at Booking.com and now at Craigslist.

In which he authors a book on OSSEC

7 minute read Published: 2013-08-04

In 2004, when I was starting a new job at the National Institute on Aging's Intramural Research Program, I began evaluating products to meet FISMA requirements for file integrity monitoring. We had already purchased a copy of Tripwire, but I was being driven mad by the volume of alerting from the system. I wanted something open source. I wanted something that would save me time, rather than waste 2 hours a day clicking through a GUI confirming file changes caused by system updates and daily operations.

At the time, I found two projects: Samhain and OSSEC-HIDS. Samhain is a great project that does one thing and does that one thing very well. However, I was buried in a mountain of FISMA compliance requirements and OSSEC offered more than file integrity monitoring; OSSEC offered a framework for distributed analysis of logs, file changes, and other anomalous events in the same open source project.

I now work at Booking.com and manage one of the world's largest deployments of OSSEC-HIDS. My team and I are active contributors to the OSSEC community. After nearly a decade of experience deploying, managing, and extracting value from OSSEC, I was approached to write a book introducing new users to OSSEC. After 6 months of work, the book has been published!

Instant OSSEC Host-based Intrusion Detection

ElasticSearch for Logging

14 minute read Published: 2012-12-26

We use ElasticSearch at my job for web front-end searches. Performance is critical, and for our purposes, the data is mostly static. We update the search indexes daily, but have no problems running on old indexes for weeks. The majority of the traffic to this cluster is search; it is a "read heavy" cluster. We had some performance hiccups at the beginning, but we worked closely with Shay Banon of ElasticSearch to eliminate those problems. Now our front-end clusters are very reliable, resilient, and fast.

I am now working to implement a centralized logging infrastructure that meets compliance requirements, but is also useful. The goal of the logging infrastructure is to emulate as much of the Splunk functionality as possible. My previous write-up on logging explains why we decided against Splunk.

After evaluating a number of options, I've decided to utilize ElasticSearch as the storage back-end for that system. This type of cluster is very different from the cluster we've implemented for heavy search loads.
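
To give a rough sense of that contrast before the full post, here is a minimal sketch of how the two workloads might be tuned differently at the index level. The endpoint, index names, shard counts, and setting values are illustrative assumptions, not the configuration described in the post.

```python
# Illustrative only: contrasting index settings for a read-heavy search index
# versus a write-heavy daily logging index. Values are assumptions.
import requests

ES = "http://localhost:9200"  # assumed local ElasticSearch endpoint

# Read-heavy search index: writes are infrequent, so a short refresh interval
# and extra replicas to spread query load are reasonable.
requests.put(f"{ES}/products", json={
    "settings": {
        "number_of_shards": 5,
        "number_of_replicas": 2,
        "refresh_interval": "1s",
    }
})

# Write-heavy logging index (one per day): relax the refresh interval and keep
# fewer replicas so the cluster can keep up with constant indexing.
requests.put(f"{ES}/logs-2012.12.26", json={
    "settings": {
        "number_of_shards": 5,
        "number_of_replicas": 1,
        "refresh_interval": "30s",
    }
})
```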

Follow-up Central Logging

4 minute read Published: 2012-06-18

The reaction to my Central Logging post has been significantly greater and more positive than I could have expected, so I wanted to recap some of the conversation that came out of it. I was pleasantly surprised by most of the comments on the Hacker News thread. So, here's a quick recap of the responses I've received. I will continue this series this weekend with more technical details.

Central Logging with Open Source Software

16 minute read Published: 2012-06-17

I have worn many hats over the past few years: System Administrator, PostgreSQL and MySQL DBA, Perl Programmer, PHP Programmer, Network Administrator, and Security Engineer/Officer. The common thread is having the data I need available, searchable, and visible.

So what data am I talking about? Honestly, everything. System logs, application logs, events, system performance data, and network traffic data are essential to making any tough infrastructure decision, if not to the trivial infrastructure and implementation decisions we have to make every day.

I'm in the midst of implementing a comprehensive solution, and this post is a brain dump and road map for how I went about it, and why.
