The tool in this package (`go run . -h`) scrapes log lines produced by the `github.com/zrepl/zrepl/zfs/zfscmd` package into a stream of JSON objects.
The `analysis.ipynb` notebook then runs some basic analysis on the collected log output.
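To give a rough idea of what "scraping log lines into a stream of JSON objects" means, here is a minimal sketch in Go. It is not the actual parser in `zfscmd_logging_scraper.go`: the log line format, the field names (`cmd`, `total_time_s`), and the `scrapedLine` struct are simplified assumptions chosen for illustration only.

```go
// Minimal sketch of the scrape step (hypothetical, not the real parser):
// it assumes a simplified log line of the shape
//
//   2020-04-19T11:29:52+02:00 [zfs.cmd]: command exited cmd="zfs list" total_time_s="0.03"
//
// and emits one JSON object per matched line on stdout.
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
	"regexp"
)

// scrapedLine is the hypothetical shape of one emitted JSON object.
type scrapedLine struct {
	Time       string `json:"time"`
	Cmd        string `json:"cmd"`
	TotalTimeS string `json:"total_time_s"`
}

var lineRE = regexp.MustCompile(
	`^(\S+) \[zfs\.cmd\]: .*\bcmd="([^"]*)".*\btotal_time_s="([^"]*)"`)

func main() {
	in := bufio.NewScanner(os.Stdin)
	in.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // allow long command lines
	enc := json.NewEncoder(os.Stdout)
	for in.Scan() {
		m := lineRE.FindStringSubmatch(in.Text())
		if m == nil {
			continue // not a zfscmd log line, skip it
		}
		if err := enc.Encode(scrapedLine{Time: m[1], Cmd: m[2], TotalTimeS: m[3]}); err != nil {
			fmt.Fprintln(os.Stderr, err)
			os.Exit(1)
		}
	}
	if err := in.Err(); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```

Fed with log output on stdin, a program of this shape prints one JSON object per recognized line; that kind of newline-delimited JSON stream is what the notebook can then load for analysis.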
Deps for the `scrape_graylog_csv.bash` script:

    pip install --upgrade git+https://github.com/lk-jeffpeck/csvfilter.git@ec433f14330fbbf5d41f56febfeedac22868a949