appender: add option to automatically delete old log files (#2323)

## Motivation

`tracing-appender` does not support size-based `Rotation` yet. It also
doesn't have a way to keep only the most recent `N` log files.

I believe the second feature is easier to implement, and it also
partially addresses the size-based `Rotation` problem: people can
choose `hourly` or `daily` rotation based on their needs and add an
extra bound such as "keep the last 5 files". Of course this won't
handle every edge case of size-based `Rotation`, but it covers most
scenarios. And it is a good feature to have on its own :)
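
As a rough illustration of that combination, here is how such a setup
could look with the builder API this PR extends (the prefix and the
directory path below are just placeholders):

```rust
use tracing_appender::rolling::{RollingFileAppender, Rotation};

// Rotate to a new file every day, but never keep more than the
// 5 most recent files on disk.
let appender = RollingFileAppender::builder()
    .rotation(Rotation::DAILY)
    .filename_prefix("myapp")
    .max_log_files(5)
    .build("/var/log/myapp")
    .expect("failed to initialize rolling file appender");
```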

## Solution

Introduce a new field, `max_files: Option<usize>`, on the `Inner`
struct of `RollingFileAppender`. I managed to avoid touching any of the
existing public functions, so it **WON'T BE A BREAKING CHANGE**. Yay :)

Whenever a rotation should happen, `refresh_writer()` is called, so I
embedded the following logic into that function:

1. Check the log folder and detect the existing log files.
2. If there are more log files than the `max_files` limit allows:
3. Store the filenames in a vector and sort them by their dates (the
   dates are already part of the filenames).
4. Keep deleting the oldest ones until the desired number of log files
   remains in the log folder.
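
A rough, self-contained sketch of that pruning step follows (it is
simplified relative to the merged `Inner::prune_old_logs` in the diff
below, which also filters candidates by the configured filename
prefix/suffix; note that the final implementation sorts files by their
creation timestamps rather than parsing dates back out of the
filenames):

```rust
use std::{fs, io, path::Path};

// List the files in the log directory, sort them oldest-first, and
// delete files until only `max_files - 1` remain, since the new log
// file about to be created brings the total back up to `max_files`.
fn prune_old_logs(log_directory: &Path, max_files: usize) -> io::Result<()> {
    let mut files: Vec<_> = fs::read_dir(log_directory)?
        .filter_map(|entry| {
            let entry = entry.ok()?;
            let metadata = entry.metadata().ok()?;
            // only plain files are candidates; never delete dirs or symlinks.
            if !metadata.is_file() {
                return None;
            }
            let created = metadata.created().ok()?;
            Some((entry.path(), created))
        })
        .collect();

    if files.len() < max_files {
        return Ok(());
    }

    // oldest files first.
    files.sort_by_key(|(_, created)| *created);
    for (path, _) in files.iter().take(files.len() - (max_files - 1)) {
        if let Err(error) = fs::remove_file(path) {
            eprintln!("failed to remove old log file {}: {}", path.display(), error);
        }
    }
    Ok(())
}
```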

P.S. This PR was opened before but got closed, since it is easier for
the maintainers to target the `master` branch instead of `v0.1.x`.
Also, @CBenoit contributed to this PR; it would be great to give credit
to him :)

Co-authored-by: Benoît Cortier <bcortier@proton.me>
Co-authored-by: Eliza Weisman <eliza@buoyant.io>

View File

@ -22,7 +22,7 @@ rust-version = "1.53.0"
[dependencies]
crossbeam-channel = "0.5.6"
time = { version = "0.3.2", default-features = false, features = ["formatting"] }
time = { version = "0.3.2", default-features = false, features = ["formatting", "parsing"] }
parking_lot = { optional = true, version = "0.12.1" }
thiserror = "1"
@ -33,10 +33,8 @@ default-features = false
features = ["fmt", "std"]
[dev-dependencies]
criterion = { version = "0.3.6", default-features = false }
tracing = { path = "../tracing", version = "0.1.35" }
time = { version = "0.3.2", default-features = false, features = ["formatting", "parsing"] }
tempfile = "3"
[[bench]]

View File

@ -34,7 +34,7 @@ use std::{
path::{Path, PathBuf},
sync::atomic::{AtomicUsize, Ordering},
};
use time::{format_description, Duration, OffsetDateTime, Time};
use time::{format_description, Date, Duration, OffsetDateTime, Time};
mod builder;
pub use builder::{Builder, InitError};
@ -104,8 +104,10 @@ struct Inner {
log_directory: PathBuf,
log_filename_prefix: Option<String>,
log_filename_suffix: Option<String>,
date_format: Vec<format_description::FormatItem<'static>>,
rotation: Rotation,
next_date: AtomicUsize,
max_files: Option<usize>,
}
// === impl RollingFileAppender ===
@ -187,6 +189,7 @@ impl RollingFileAppender {
ref rotation,
ref prefix,
ref suffix,
ref max_files,
} = builder;
let directory = directory.as_ref().to_path_buf();
let now = OffsetDateTime::now_utc();
@ -196,6 +199,7 @@ impl RollingFileAppender {
directory,
prefix.clone(),
suffix.clone(),
*max_files,
)?;
Ok(Self {
state,
@ -242,7 +246,7 @@ impl<'a> tracing_subscriber::fmt::writer::MakeWriter<'a> for RollingFileAppender
// Did we get the right to lock the file? If not, another thread
// did it and we can just make a writer.
if self.state.advance_date(now, current_time) {
self.state.refresh_writer(now, &mut *self.writer.write());
self.state.refresh_writer(now, &mut self.writer.write());
}
}
RollingWriter(self.writer.read())
@ -488,32 +492,14 @@ impl Rotation {
}
}
pub(crate) fn join_date(
&self,
filename: Option<&str>,
date: &OffsetDateTime,
suffix: Option<&str>,
) -> String {
let format = match *self {
fn date_format(&self) -> Vec<format_description::FormatItem<'static>> {
match *self {
Rotation::MINUTELY => format_description::parse("[year]-[month]-[day]-[hour]-[minute]"),
Rotation::HOURLY => format_description::parse("[year]-[month]-[day]-[hour]"),
Rotation::DAILY => format_description::parse("[year]-[month]-[day]"),
Rotation::NEVER => format_description::parse("[year]-[month]-[day]"),
}
.expect("Unable to create a formatter; this is a bug in tracing-appender");
let date = date
.format(&format)
.expect("Unable to format OffsetDateTime; this is a bug in tracing-appender");
match (self, filename, suffix) {
(&Rotation::NEVER, Some(filename), None) => filename.to_string(),
(&Rotation::NEVER, Some(filename), Some(suffix)) => format!("{}.{}", filename, suffix),
(&Rotation::NEVER, None, Some(suffix)) => suffix.to_string(),
(_, Some(filename), Some(suffix)) => format!("{}.{}.{}", filename, date, suffix),
(_, Some(filename), None) => format!("{}.{}", filename, date),
(_, None, Some(suffix)) => format!("{}.{}", date, suffix),
(_, None, None) => date,
}
.expect("Unable to create a formatter; this is a bug in tracing-appender")
}
}
@ -538,36 +524,122 @@ impl Inner {
directory: impl AsRef<Path>,
log_filename_prefix: Option<String>,
log_filename_suffix: Option<String>,
max_files: Option<usize>,
) -> Result<(Self, RwLock<File>), builder::InitError> {
let log_directory = directory.as_ref().to_path_buf();
let filename = rotation.join_date(
log_filename_prefix.as_deref(),
&now,
log_filename_suffix.as_deref(),
);
let date_format = rotation.date_format();
let next_date = rotation.next_date(&now);
let writer = RwLock::new(create_writer(log_directory.as_ref(), &filename)?);
let inner = Inner {
log_directory,
log_filename_prefix,
log_filename_suffix,
date_format,
next_date: AtomicUsize::new(
next_date
.map(|date| date.unix_timestamp() as usize)
.unwrap_or(0),
),
rotation,
max_files,
};
let filename = inner.join_date(&now);
let writer = RwLock::new(create_writer(inner.log_directory.as_ref(), &filename)?);
Ok((inner, writer))
}
pub(crate) fn join_date(&self, date: &OffsetDateTime) -> String {
let date = date
.format(&self.date_format)
.expect("Unable to format OffsetDateTime; this is a bug in tracing-appender");
match (
&self.rotation,
&self.log_filename_prefix,
&self.log_filename_suffix,
) {
(&Rotation::NEVER, Some(filename), None) => filename.to_string(),
(&Rotation::NEVER, Some(filename), Some(suffix)) => format!("{}.{}", filename, suffix),
(&Rotation::NEVER, None, Some(suffix)) => suffix.to_string(),
(_, Some(filename), Some(suffix)) => format!("{}.{}.{}", filename, date, suffix),
(_, Some(filename), None) => format!("{}.{}", filename, date),
(_, None, Some(suffix)) => format!("{}.{}", date, suffix),
(_, None, None) => date,
}
}
fn prune_old_logs(&self, max_files: usize) {
let files = fs::read_dir(&self.log_directory).map(|dir| {
dir.filter_map(|entry| {
let entry = entry.ok()?;
let metadata = entry.metadata().ok()?;
// the appender only creates files, not directories or symlinks,
// so we should never delete a dir or symlink.
if !metadata.is_file() {
return None;
}
let filename = entry.file_name();
// if the filename is not a UTF-8 string, skip it.
let filename = filename.to_str()?;
if let Some(prefix) = &self.log_filename_prefix {
if !filename.starts_with(prefix) {
return None;
}
}
if let Some(suffix) = &self.log_filename_suffix {
if !filename.ends_with(suffix) {
return None;
}
}
if self.log_filename_prefix.is_none()
&& self.log_filename_suffix.is_none()
&& Date::parse(filename, &self.date_format).is_err()
{
return None;
}
let created = metadata.created().ok()?;
Some((entry, created))
})
.collect::<Vec<_>>()
});
let mut files = match files {
Ok(files) => files,
Err(error) => {
eprintln!("Error reading the log directory/files: {}", error);
return;
}
};
if files.len() < max_files {
return;
}
// sort the files by their creation timestamps.
files.sort_by_key(|(_, created_at)| *created_at);
// delete files, so that (n-1) files remain, because we will create another log file
for (file, _) in files.iter().take(files.len() - (max_files - 1)) {
if let Err(error) = fs::remove_file(file.path()) {
eprintln!(
"Failed to remove old log file {}: {}",
file.path().display(),
error
);
}
}
}
fn refresh_writer(&self, now: OffsetDateTime, file: &mut File) {
let filename = self.rotation.join_date(
self.log_filename_prefix.as_deref(),
&now,
self.log_filename_suffix.as_deref(),
);
let filename = self.join_date(&now);
if let Some(max_files) = self.max_files {
self.prune_old_logs(max_files);
}
match create_writer(&self.log_directory, &filename) {
Ok(new_file) => {
@ -727,62 +799,126 @@ mod test {
}
#[test]
fn test_path_concatination() {
fn test_path_concatenation() {
let format = format_description::parse(
"[year]-[month]-[day] [hour]:[minute]:[second] [offset_hour \
sign:mandatory]:[offset_minute]:[offset_second]",
)
.unwrap();
let directory = tempfile::tempdir().expect("failed to create tempdir");
let now = OffsetDateTime::parse("2020-02-01 10:01:00 +00:00:00", &format).unwrap();
// per-minute
let path = Rotation::MINUTELY.join_date(Some("app.log"), &now, None);
assert_eq!("app.log.2020-02-01-10-01", path);
struct TestCase {
expected: &'static str,
rotation: Rotation,
prefix: Option<&'static str>,
suffix: Option<&'static str>,
}
// per-hour
let path = Rotation::HOURLY.join_date(Some("app.log"), &now, None);
assert_eq!("app.log.2020-02-01-10", path);
let test = |TestCase {
expected,
rotation,
prefix,
suffix,
}| {
let (inner, _) = Inner::new(
now,
rotation.clone(),
directory.path(),
prefix.map(ToString::to_string),
suffix.map(ToString::to_string),
None,
)
.unwrap();
let path = inner.join_date(&now);
assert_eq!(
expected, path,
"rotation = {:?}, prefix = {:?}, suffix = {:?}",
rotation, prefix, suffix
);
};
// per-day
let path = Rotation::DAILY.join_date(Some("app.log"), &now, None);
assert_eq!("app.log.2020-02-01", path);
// never
let path = Rotation::NEVER.join_date(Some("app.log"), &now, None);
assert_eq!("app.log", path);
// per-minute with suffix
let path = Rotation::MINUTELY.join_date(Some("app"), &now, Some("log"));
assert_eq!("app.2020-02-01-10-01.log", path);
// per-hour with suffix
let path = Rotation::HOURLY.join_date(Some("app"), &now, Some("log"));
assert_eq!("app.2020-02-01-10.log", path);
// per-day with suffix
let path = Rotation::DAILY.join_date(Some("app"), &now, Some("log"));
assert_eq!("app.2020-02-01.log", path);
// never with suffix
let path = Rotation::NEVER.join_date(Some("app"), &now, Some("log"));
assert_eq!("app.log", path);
// per-minute without prefix
let path = Rotation::MINUTELY.join_date(None, &now, Some("app.log"));
assert_eq!("2020-02-01-10-01.app.log", path);
// per-hour without prefix
let path = Rotation::HOURLY.join_date(None, &now, Some("app.log"));
assert_eq!("2020-02-01-10.app.log", path);
// per-day without prefix
let path = Rotation::DAILY.join_date(None, &now, Some("app.log"));
assert_eq!("2020-02-01.app.log", path);
// never without prefix
let path = Rotation::NEVER.join_date(None, &now, Some("app.log"));
assert_eq!("app.log", path);
let test_cases = vec![
// prefix only
TestCase {
expected: "app.log.2020-02-01-10-01",
rotation: Rotation::MINUTELY,
prefix: Some("app.log"),
suffix: None,
},
TestCase {
expected: "app.log.2020-02-01-10",
rotation: Rotation::HOURLY,
prefix: Some("app.log"),
suffix: None,
},
TestCase {
expected: "app.log.2020-02-01",
rotation: Rotation::DAILY,
prefix: Some("app.log"),
suffix: None,
},
TestCase {
expected: "app.log",
rotation: Rotation::NEVER,
prefix: Some("app.log"),
suffix: None,
},
// prefix and suffix
TestCase {
expected: "app.2020-02-01-10-01.log",
rotation: Rotation::MINUTELY,
prefix: Some("app"),
suffix: Some("log"),
},
TestCase {
expected: "app.2020-02-01-10.log",
rotation: Rotation::HOURLY,
prefix: Some("app"),
suffix: Some("log"),
},
TestCase {
expected: "app.2020-02-01.log",
rotation: Rotation::DAILY,
prefix: Some("app"),
suffix: Some("log"),
},
TestCase {
expected: "app.log",
rotation: Rotation::NEVER,
prefix: Some("app"),
suffix: Some("log"),
},
// suffix only
TestCase {
expected: "2020-02-01-10-01.log",
rotation: Rotation::MINUTELY,
prefix: None,
suffix: Some("log"),
},
TestCase {
expected: "2020-02-01-10.log",
rotation: Rotation::HOURLY,
prefix: None,
suffix: Some("log"),
},
TestCase {
expected: "2020-02-01.log",
rotation: Rotation::DAILY,
prefix: None,
suffix: Some("log"),
},
TestCase {
expected: "log",
rotation: Rotation::NEVER,
prefix: None,
suffix: Some("log"),
},
];
for test_case in test_cases {
test(test_case)
}
}
#[test]
@ -804,6 +940,7 @@ mod test {
directory.path(),
Some("test_make_writer".to_string()),
None,
None,
)
.unwrap();
@ -865,4 +1002,108 @@ mod test {
}
}
}
#[test]
fn test_max_log_files() {
use std::sync::{Arc, Mutex};
use tracing_subscriber::prelude::*;
let format = format_description::parse(
"[year]-[month]-[day] [hour]:[minute]:[second] [offset_hour \
sign:mandatory]:[offset_minute]:[offset_second]",
)
.unwrap();
let now = OffsetDateTime::parse("2020-02-01 10:01:00 +00:00:00", &format).unwrap();
let directory = tempfile::tempdir().expect("failed to create tempdir");
let (state, writer) = Inner::new(
now,
Rotation::HOURLY,
directory.path(),
Some("test_max_log_files".to_string()),
None,
Some(2),
)
.unwrap();
let clock = Arc::new(Mutex::new(now));
let now = {
let clock = clock.clone();
Box::new(move || *clock.lock().unwrap())
};
let appender = RollingFileAppender { state, writer, now };
let default = tracing_subscriber::fmt()
.without_time()
.with_level(false)
.with_target(false)
.with_max_level(tracing_subscriber::filter::LevelFilter::TRACE)
.with_writer(appender)
.finish()
.set_default();
tracing::info!("file 1");
// advance time by one second
(*clock.lock().unwrap()) += Duration::seconds(1);
tracing::info!("file 1");
// advance time by one hour
(*clock.lock().unwrap()) += Duration::hours(1);
// depending on the filesystem, the creation timestamp's resolution may
// be as coarse as one second, so we need to wait a bit here to ensure
// that the next file actually is newer than the old one.
std::thread::sleep(std::time::Duration::from_secs(1));
tracing::info!("file 2");
// advance time by one second
(*clock.lock().unwrap()) += Duration::seconds(1);
tracing::info!("file 2");
// advance time by one hour
(*clock.lock().unwrap()) += Duration::hours(1);
// again, sleep to ensure that the creation timestamps actually differ.
std::thread::sleep(std::time::Duration::from_secs(1));
tracing::info!("file 3");
// advance time by one second
(*clock.lock().unwrap()) += Duration::seconds(1);
tracing::info!("file 3");
drop(default);
let dir_contents = fs::read_dir(directory.path()).expect("Failed to read directory");
println!("dir={:?}", dir_contents);
for entry in dir_contents {
println!("entry={:?}", entry);
let path = entry.expect("Expected dir entry").path();
let file = fs::read_to_string(&path).expect("Failed to read file");
println!("path={}\nfile={:?}", path.display(), file);
match path
.extension()
.expect("found a file without a date!")
.to_str()
.expect("extension should be UTF8")
{
"2020-02-01-10" => {
panic!("this file should have been pruned already!");
}
"2020-02-01-11" => {
assert_eq!("file 2\nfile 2\n", file);
}
"2020-02-01-12" => {
assert_eq!("file 3\nfile 3\n", file);
}
x => panic!("unexpected date {}", x),
}
}
}
}

View File

@ -10,6 +10,7 @@ pub struct Builder {
pub(super) rotation: Rotation,
pub(super) prefix: Option<String>,
pub(super) suffix: Option<String>,
pub(super) max_files: Option<usize>,
}
/// Errors returned by [`Builder::build`].
@ -39,15 +40,20 @@ impl Builder {
/// | :-------- | :------------ | :---- |
/// | [`rotation`] | [`Rotation::NEVER`] | By default, log files will never be rotated. |
/// | [`filename_prefix`] | `""` | By default, log file names will not have a prefix. |
/// | [`filename_suffix`] | `""` | By default, log file names will not have a suffix. |
/// | [`max_log_files`] | `None` | By default, there is no limit for maximum log file count. |
///
/// [`rotation`]: Self::rotation
/// [`filename_prefix`]: Self::filename_prefix
/// [`filename_suffix`]: Self::filename_suffix
/// [`max_log_files`]: Self::max_log_files
#[must_use]
pub const fn new() -> Self {
Self {
rotation: Rotation::NEVER,
prefix: None,
suffix: None,
max_files: None,
}
}
@ -181,6 +187,52 @@ impl Builder {
Self { suffix, ..self }
}
/// Keeps the last `n` log files on disk.
///
/// When a new log file is created, if there are `n` or more
/// existing log files in the directory, the oldest will be deleted.
/// If no value is supplied, the `RollingAppender` will not remove any files.
///
/// Files are considered candidates for deletion based on the following
/// criteria:
///
/// * The file must not be a directory or symbolic link.
/// * If the appender is configured with a [`filename_prefix`], the file
/// name must start with that prefix.
/// * If the appender is configured with a [`filename_suffix`], the file
/// name must end with that suffix.
/// * If the appender has neither a filename prefix nor a suffix, then the
/// file name must parse as a valid date based on the appender's date
/// format.
///
/// Files matching these criteria may be deleted if the maximum number of
/// log files in the directory has been reached.
///
/// [`filename_prefix`]: Self::filename_prefix
/// [`filename_suffix`]: Self::filename_suffix
///
/// # Examples
///
/// ```
/// use tracing_appender::rolling::RollingFileAppender;
///
/// # fn docs() {
/// let appender = RollingFileAppender::builder()
/// .max_log_files(5) // only the most recent 5 log files will be kept
/// // ...
/// .build("/var/log")
/// .expect("failed to initialize rolling file appender");
/// # drop(appender)
/// # }
/// ```
#[must_use]
pub fn max_log_files(self, n: usize) -> Self {
Self {
max_files: Some(n),
..self
}
}
/// Builds a new [`RollingFileAppender`] with the configured parameters,
/// emitting log files to the provided directory.
///