Logstash: Data pipeline that helps you process logs and other event data from a variety of systems

Regular desktop PC users don’t care much where the various logs their PC generates end up. They just want to get things done: type their reports, browse the web, play games. IT departments, however, can’t afford that indifference. With logs scattered across many machines and folders, they need a pipeline that streams the log data into a central location. That’s what Logstash does.

What’s more, Logstash also handles another common problem in log management: format diversity. Log data arrives in many different formats, so you need a tool that can make sense of them all, and that’s something Logstash does well too.

KEY FEATURES

  • Central data processing: Logstash gives you a data pipeline for processing logs from a variety of systems. With more than 200 plugins available, it can connect to a wide range of sources and stream the data at scale to your central analytics system.
  • Converting different schemas: Data isn’t just scattered across different systems; each system often uses its own format. Logstash lets you deconstruct the data and transform it into a common format before it’s added to your analytics data store. Some applications and infrastructure write logs in their own custom formats, and Logstash lets you add custom parsing logic to handle these logs at scale.
  • Plugins for custom sources: Logstash was built with extensibility in mind, providing an API that lets the Logstash community develop new plugins rapidly.
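
The features above come together in a single pipeline configuration file. The sketch below is a minimal, hypothetical example: the log path, index name, and Elasticsearch host are assumptions for illustration, while the `file` input, `grok` and `date` filters, and `elasticsearch` output are standard Logstash plugins.

```conf
# Hypothetical pipeline: tail an Apache access log, parse each line
# into structured fields, and ship the events to a central store.
input {
  file {
    path => "/var/log/apache2/access.log"   # assumed log location
    start_position => "beginning"
  }
}

filter {
  grok {
    # Parse the raw line into named fields (client IP, verb, status, ...)
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    # Use the timestamp from the log line rather than ingestion time
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]      # assumed central analytics store
    index => "apache-logs-%{+YYYY.MM.dd}"   # assumed index naming scheme
  }
}
```

Swapping the `input` block for another plugin (e.g. a syslog or message-queue input) is all it takes to bring a different source into the same central pipeline.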
