The tool in this package (`go run . -h`) scrapes log lines produced by the `github.com/zrepl/zrepl/zfs/zfscmd` package into a stream of JSON objects.
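
To illustrate the general idea of scraping log lines into JSON, here is a minimal sketch. The actual log format and field names are defined by the `zfscmd` package and the scraper's flags (`go run . -h`); the regular expression and the `cmd` / `total_time_s` fields below are illustrative assumptions, not the real format.

```go
package main

import (
	"encoding/json"
	"fmt"
	"regexp"
)

// logLineRE is an assumed, simplified pattern for zfscmd-style log lines;
// the real scraper's parsing rules may differ.
var logLineRE = regexp.MustCompile(`cmd="([^"]+)" total_time_s="([^"]+)"`)

// scrapedLine is one JSON object in the output stream.
type scrapedLine struct {
	Cmd        string `json:"cmd"`
	TotalTimeS string `json:"total_time_s"`
}

// scrapeLine parses one log line; ok is false for lines that do not match.
func scrapeLine(line string) (s scrapedLine, ok bool) {
	m := logLineRE.FindStringSubmatch(line)
	if m == nil {
		return scrapedLine{}, false
	}
	return scrapedLine{Cmd: m[1], TotalTimeS: m[2]}, true
}

func main() {
	// Example input line (format assumed for illustration).
	s, ok := scrapeLine(`level="info" cmd="zfs list -H" total_time_s="0.02"`)
	if !ok {
		panic("line did not match")
	}
	b, _ := json.Marshal(s)
	fmt.Println(string(b)) // {"cmd":"zfs list -H","total_time_s":"0.02"}
}
```

A real scraper would read lines from stdin in a loop and emit one JSON object per matching line, skipping the rest.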

`analysis.ipynb` then runs some basic analysis on the collected log output.

Dependencies for the `scrape_graylog_csv.bash` script:

```bash
pip install --upgrade git+https://github.com/lk-jeffpeck/csvfilter.git@ec433f14330fbbf5d41f56febfeedac22868a949
```