The tool in this package (`go run . -h`) scrapes log lines produced by the `github.com/zrepl/zrepl/zfs/zfscmd` package into a stream of JSON objects.
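A downstream consumer can process the scraper's output one JSON object at a time. The following is a minimal sketch, assuming the tool emits one object per line (the field name `Cmd` shown in the comment is hypothetical, not taken from the scraper's actual schema):

```python
import json
import sys


def read_json_stream(lines):
    """Parse a stream of JSON objects, one object per line, skipping blanks."""
    for line in lines:
        line = line.strip()
        if line:
            yield json.loads(line)


if __name__ == "__main__":
    # e.g.: go run . < logfile | python consume.py
    for event in read_json_stream(sys.stdin):
        # Access whatever fields the scraper emits, e.g. event.get("Cmd")
        print(event)
```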
The `analysis.ipynb` notebook then runs some basic analysis on the collected log output.
Deps for the `scrape_graylog_csv.bash` script:

```bash
pip install --upgrade git+https://github.com/lk-jeffpeck/csvfilter.git@ec433f14330fbbf5d41f56febfeedac22868a949
```