Auto merge of #8210 - ehuss:rework-rustc-output, r=alexcrichton

Rework rustc output file tracking.

This cleans up how Cargo computes the outputs from rustc and changes how `cargo clean -p` works. It also includes some minor observable differences, detailed below.

**clean -p changes**

Previously `cargo clean -p` would build the unit graph and attempt to guess what all the filename hashes are. This has several drawbacks. It incorrectly guesses the hashes in many cases (such as different features). It also tends to cause bugs because it passes every permutation of every setting into Cargo's internals, which may not be prepared to accept unsupported combinations like "test a build-script".

The new implementation instead queries rustc for the filename prefix/suffix of each crate type and then deletes matching files using globs.
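
As a rough sketch of that idea (a hypothetical helper, not the exact code in this PR, and assuming the `glob` crate): for each crate type, take the `(prefix, suffix)` pair reported by rustc, build a pattern of the form `{prefix}{crate_name}-*{suffix}`, and remove whatever matches, regardless of which metadata hash the files carry.

```rust
use std::fs;
use std::path::Path;

/// Illustrative sketch only: delete every artifact in `dir` matching
/// `{prefix}{crate_name}-*{suffix}`, whatever the metadata hash is.
/// Error handling is simplified.
fn clean_matching(
    dir: &Path,
    prefix: &str,
    crate_name: &str,
    suffix: &str,
) -> Result<(), Box<dyn std::error::Error>> {
    // For example: "target/debug/deps/libfoo-*.rlib"
    let pattern = dir.join(format!("{}{}-*{}", prefix, crate_name, suffix));
    let pattern = pattern.to_str().ok_or("expected utf-8 path")?;
    for entry in glob::glob(pattern)? {
        let path = entry?;
        if path.is_dir() {
            // Some artifacts (e.g. `.dSYM` bundles) are directories.
            fs::remove_dir_all(&path)?;
        } else {
            fs::remove_file(&path)?;
        }
    }
    Ok(())
}
```

The actual change is more involved (it walks several directories per requested package, per crate type, per requested target), but this pattern-per-(prefix, suffix) lookup is the core of it.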

`cargo clean -p` should now be more complete in deleting a package's artifacts, for example:
- Now deletes incremental files.
- Deletes dep-info files (both rustc and cargo).
- Handles changes in profiles, features, and anything else that goes into the hash.
- Covers a few more files (for example, dSYM files in some cases). It should now delete almost all files for most targets.

**Internal changes**

There are a bunch of internal changes to make Cargo do a better job of tracking the outputs from rustc, and to make the code easier to understand:

- The list of output files is now computed solely in `TargetInfo`, and the files to uplift are computed solely in `CompilationFiles::uplift_to`. Previously both responsibilities were awkwardly spread across both locations (see the naming sketch after this list).
- Whether or not a file should have a hyphen or underscore is now determined in one place (`FileType::should_replace_hyphens`).
- Added `CrateType` to replace `LibKind`, to avoid usage of strings, and to use the same structure for all of the target kinds.
- Added `FileFlavor::Rmeta` to be explicit about which output files are `.rmeta`. (Previously the `Linkable { rmeta }` flag was specific to pipelining, and `rmeta` was `false` for things like `cargo check`, which was a bit awkward to deal with.)
- Removed hyphen/underscore renaming in `rustc`. This was mostly unused, because it did not consider hashes in the filename, so it only applied to binaries without hashes, which is essentially just wasm32 and msvc. This hyphen/underscore translation still happens during "uplift".
- Changed it so that `Metadata` is always computed for every unit. The logic of whether or not something should use it is moved to `should_use_metadata`. I didn't realize that multiple units were sharing the same fingerprint directory (when they don't have a hash), which caused some bugs (like bad output caching).
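
To make the filename split concrete, here is a minimal standalone sketch. It mirrors the rules described above rather than calling Cargo's internal `FileType` API (whose fields are private), and the hash value is made up: rustc's output in `deps/` uses the underscore form of the crate name plus the metadata hash, while the uplifted copy drops the hash and, for binaries, keeps the original hyphenated name.

```rust
/// Standalone illustration of the two naming rules described above.
/// `prefix`, `suffix`, and `replace_hyphens` stand in for the
/// corresponding (private) `FileType` fields.
fn output_filename(prefix: &str, name: &str, meta: Option<&str>, suffix: &str) -> String {
    // rustc always emits the underscore form of the crate name,
    // plus the metadata hash when one is used.
    let crate_name = name.replace('-', "_");
    match meta {
        Some(meta) => format!("{}{}-{}{}", prefix, crate_name, meta, suffix),
        None => format!("{}{}{}", prefix, crate_name, suffix),
    }
}

fn uplift_filename(prefix: &str, name: &str, replace_hyphens: bool, suffix: &str) -> String {
    // Libraries keep the underscore form; binaries keep the original name.
    let name = if replace_hyphens {
        name.replace('-', "_")
    } else {
        name.to_string()
    };
    format!("{}{}{}", prefix, name, suffix)
}

fn main() {
    // An rlib target named `foo-bar` with a (made-up) metadata hash:
    assert_eq!(
        output_filename("lib", "foo-bar", Some("0f2e3b90c98bbc0c"), ".rlib"),
        "libfoo_bar-0f2e3b90c98bbc0c.rlib" // written to target/debug/deps/
    );
    assert_eq!(
        uplift_filename("lib", "foo-bar", true, ".rlib"),
        "libfoo_bar.rlib" // uplifted to target/debug/
    );
    // A binary keeps its hyphen when uplifted:
    assert_eq!(uplift_filename("", "foo-bar", false, ".exe"), "foo-bar.exe");
}
```

When a target has no metadata hash, the two names can differ only by the hyphen, which is why the MSVC example binary mentioned under the behavioral changes below shows up as both `foo_bar.exe` and `foo-bar.exe`.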

**Behavioral changes**

Cargo now does more complete tracking of the files generated by rustc (and the linker). This means that some files will now be uplifted that previously weren't. It also means they will show up in the artifact JSON notifications. The newly tracked files are:

- `.exp` export files for Windows MSVC dynamic libraries. I don't know if these are actually useful (nobody has asked for them AFAIK); presumably the linker creates them for some reason. Note that lld *doesn't* generate these files; only link.exe does.
- Proc-macros on Windows track import/export files.
- All targets (like tests, etc.) that generate separate debug files (pdb/dSYM) are tracked.
- Added `.map` files for wasm32-unknown-emscripten.

Some other subtle changes:

- A binary example with a hyphen on Windows MSVC will now show up as `examples/foo_bar.exe` and `examples/foo-bar.exe`. Previously Cargo would just rename it to contain the hyphen. This is a consequence of simplifying the code, and I was reluctant to add a special case for this very narrow situation.
- Example libs now follow the same rules for hyphen/underscore translation as normal libs (they will now use underscores).
- Fingerprint changes:
    - Fingerprint filenames no longer include the metadata hash; the parent directory name already contains it.
    - Fingerprint filenames now keep hyphens in target names instead of converting them to underscores.
    - The fingerprint directory is now separated even if the unit doesn't use Metadata, to fix a caching bug.
- macOS: dSYM is uplifted for all dynamic libraries (dylib/cdylib/proc-macro) and for build-script-build (in case someone wants to debug a build script?).

**Notes**

- I suspect that the implementation of `clean -p` may change again in the future. If and when Cargo implements some kind of on-disk database that tracks artifacts (for the purpose of garbage collection), then `cargo clean -p` can be rewritten to use that mechanism if appropriate.
- The `build_script_build` incremental directory isn't deleted because Cargo doesn't know which one belongs to which package. I'm uncertain if that's reasonably fixable. The only option I've thought of is to place each package's incremental output in a separate directory.
- Should Cargo use `globset` to handle non-utf-8 filenames? I suspect that Cargo's support for non-utf-8 names is pretty poor, so I'm uncertain how important that is.

Closes #8149
Closes #6937
Closes #5788
Closes #5375
Closes #3530
bors 2020-05-07 15:35:34 +00:00
commit c719a05a81
28 changed files with 921 additions and 653 deletions

View File

@ -10,7 +10,7 @@ use std::collections::HashMap;
use std::path::PathBuf;
mod target_info;
pub use self::target_info::{FileFlavor, RustcTargetData, TargetInfo};
pub use self::target_info::{FileFlavor, FileType, RustcTargetData, TargetInfo};
/// The build context, containing all information about a build task.
///

View File

@ -1,5 +1,5 @@
use crate::core::compiler::{BuildOutput, CompileKind, CompileTarget};
use crate::core::{Dependency, TargetKind, Workspace};
use crate::core::compiler::{BuildOutput, CompileKind, CompileMode, CompileTarget, CrateType};
use crate::core::{Dependency, Target, TargetKind, Workspace};
use crate::util::config::{Config, StringList, TargetConfig};
use crate::util::{CargoResult, CargoResultExt, ProcessBuilder, Rustc};
use cargo_platform::{Cfg, CfgExpr};
@ -25,7 +25,7 @@ pub struct TargetInfo {
/// `Some((prefix, suffix))`, for example `libcargo.so` would be
`Some(("lib", ".so"))`. The value is `None` if the crate type is not
/// supported.
crate_types: RefCell<HashMap<String, Option<(String, String)>>>,
crate_types: RefCell<HashMap<CrateType, Option<(String, String)>>>,
/// `cfg` information extracted from `rustc --print=cfg`.
cfg: Vec<Cfg>,
/// Path to the sysroot.
@ -49,41 +49,73 @@ pub struct TargetInfo {
pub enum FileFlavor {
/// Not a special file type.
Normal,
/// Like `Normal`, but not directly executable
/// Like `Normal`, but not directly executable.
/// For example, a `.wasm` file paired with the "normal" `.js` file.
Auxiliary,
/// Something you can link against (e.g., a library).
Linkable { rmeta: bool },
Linkable,
/// An `.rmeta` Rust metadata file.
Rmeta,
/// Piece of external debug information (e.g., `.dSYM`/`.pdb` file).
DebugInfo,
}
/// Type of each file generated by a Unit.
#[derive(Debug)]
pub struct FileType {
/// The kind of file.
pub flavor: FileFlavor,
/// The crate-type that generates this file.
///
/// `None` for things that aren't associated with a specific crate type,
/// for example `rmeta` files.
pub crate_type: Option<CrateType>,
/// The suffix for the file (for example, `.rlib`).
/// This is an empty string for executables on Unix-like platforms.
suffix: String,
/// The prefix for the file (for example, `lib`).
/// This is an empty string for things like executables.
prefix: String,
/// Flag to convert hyphen to underscore.
///
/// wasm bin targets will generate two files in deps such as
/// "web-stuff.js" and "web_stuff.wasm". Note the different usages of "-"
/// and "_". This flag indicates that the stem "web-stuff" should be
/// converted to "web_stuff".
/// Flag to convert hyphen to underscore when uplifting.
should_replace_hyphens: bool,
}
impl FileType {
pub fn filename(&self, stem: &str) -> String {
let stem = if self.should_replace_hyphens {
stem.replace("-", "_")
/// The filename for this FileType created by rustc.
pub fn output_filename(&self, target: &Target, metadata: Option<&str>) -> String {
match metadata {
Some(metadata) => format!(
"{}{}-{}{}",
self.prefix,
target.crate_name(),
metadata,
self.suffix
),
None => format!("{}{}{}", self.prefix, target.crate_name(), self.suffix),
}
}
/// The filename for this FileType that Cargo should use when "uplifting"
/// it to the destination directory.
pub fn uplift_filename(&self, target: &Target) -> String {
let name = if self.should_replace_hyphens {
target.crate_name()
} else {
stem.to_string()
target.name().to_string()
};
format!("{}{}{}", self.prefix, stem, self.suffix)
format!("{}{}{}", self.prefix, name, self.suffix)
}
/// Creates a new instance representing a `.rmeta` file.
pub fn new_rmeta() -> FileType {
// Note that even binaries use the `lib` prefix.
FileType {
flavor: FileFlavor::Rmeta,
crate_type: None,
suffix: ".rmeta".to_string(),
prefix: "lib".to_string(),
should_replace_hyphens: true,
}
}
}
@ -123,10 +155,16 @@ impl TargetInfo {
}
let crate_type_process = process.clone();
const KNOWN_CRATE_TYPES: &[&str] =
&["bin", "rlib", "dylib", "cdylib", "staticlib", "proc-macro"];
const KNOWN_CRATE_TYPES: &[CrateType] = &[
CrateType::Bin,
CrateType::Rlib,
CrateType::Dylib,
CrateType::Cdylib,
CrateType::Staticlib,
CrateType::ProcMacro,
];
for crate_type in KNOWN_CRATE_TYPES.iter() {
process.arg("--crate-type").arg(crate_type);
process.arg("--crate-type").arg(crate_type.as_str());
}
process.arg("--print=sysroot");
@ -140,7 +178,7 @@ impl TargetInfo {
let mut map = HashMap::new();
for crate_type in KNOWN_CRATE_TYPES {
let out = parse_crate_type(crate_type, &process, &output, &error, &mut lines)?;
map.insert(crate_type.to_string(), out);
map.insert(crate_type.clone(), out);
}
let line = match lines.next() {
@ -226,15 +264,20 @@ impl TargetInfo {
/// Returns the list of file types generated by the given crate type.
///
/// Returns `None` if the target does not support the given crate type.
pub fn file_types(
fn file_types(
&self,
crate_type: &str,
crate_type: &CrateType,
flavor: FileFlavor,
kind: &TargetKind,
target_triple: &str,
) -> CargoResult<Option<Vec<FileType>>> {
let crate_type = if *crate_type == CrateType::Lib {
CrateType::Rlib
} else {
crate_type.clone()
};
let mut crate_types = self.crate_types.borrow_mut();
let entry = crate_types.entry(crate_type.to_string());
let entry = crate_types.entry(crate_type.clone());
let crate_type_info = match entry {
Entry::Occupied(o) => &*o.into_mut(),
Entry::Vacant(v) => {
@ -250,58 +293,95 @@ impl TargetInfo {
suffix: suffix.clone(),
prefix: prefix.clone(),
flavor,
should_replace_hyphens: false,
crate_type: Some(crate_type.clone()),
should_replace_hyphens: crate_type != CrateType::Bin,
}];
// See rust-lang/cargo#4500.
if target_triple.ends_with("-windows-msvc")
&& crate_type.ends_with("dylib")
&& suffix == ".dll"
{
ret.push(FileType {
suffix: ".dll.lib".to_string(),
prefix: prefix.clone(),
flavor: FileFlavor::Normal,
should_replace_hyphens: false,
})
} else if target_triple.ends_with("windows-gnu")
&& crate_type.ends_with("dylib")
&& suffix == ".dll"
{
// LD can link DLL directly, but LLD requires the import library.
ret.push(FileType {
suffix: ".dll.a".to_string(),
prefix: "lib".to_string(),
flavor: FileFlavor::Normal,
should_replace_hyphens: false,
})
// Windows shared library import/export files.
if crate_type.is_dynamic() {
if target_triple.ends_with("-windows-msvc") {
assert!(suffix == ".dll");
// See https://docs.microsoft.com/en-us/cpp/build/reference/working-with-import-libraries-and-export-files
// for more information about DLL import/export files.
ret.push(FileType {
suffix: ".dll.lib".to_string(),
prefix: prefix.clone(),
flavor: FileFlavor::Auxiliary,
crate_type: Some(crate_type.clone()),
should_replace_hyphens: true,
});
// NOTE: lld does not produce these
ret.push(FileType {
suffix: ".dll.exp".to_string(),
prefix: prefix.clone(),
flavor: FileFlavor::Auxiliary,
crate_type: Some(crate_type.clone()),
should_replace_hyphens: true,
});
} else if target_triple.ends_with("windows-gnu") {
assert!(suffix == ".dll");
// See https://cygwin.com/cygwin-ug-net/dll.html for more
// information about GNU import libraries.
// LD can link DLL directly, but LLD requires the import library.
ret.push(FileType {
suffix: ".dll.a".to_string(),
prefix: "lib".to_string(),
flavor: FileFlavor::Auxiliary,
crate_type: Some(crate_type.clone()),
should_replace_hyphens: true,
})
}
}
// See rust-lang/cargo#4535.
if target_triple.starts_with("wasm32-") && crate_type == "bin" && suffix == ".js" {
if target_triple.starts_with("wasm32-") && crate_type == CrateType::Bin && suffix == ".js" {
// emscripten binaries generate a .js file, which loads a .wasm
// file.
ret.push(FileType {
suffix: ".wasm".to_string(),
prefix: prefix.clone(),
flavor: FileFlavor::Auxiliary,
crate_type: Some(crate_type.clone()),
// Name `foo-bar` will generate a `foo_bar.js` and
// `foo_bar.wasm`. Cargo will translate the underscore and
// copy `foo_bar.js` to `foo-bar.js`. However, the wasm
// filename is embedded in the .js file with an underscore, so
// it should not contain hyphens.
should_replace_hyphens: true,
})
});
// And a map file for debugging. This is only emitted with debug=2
// (-g4 for emcc).
ret.push(FileType {
suffix: ".wasm.map".to_string(),
prefix: prefix.clone(),
flavor: FileFlavor::DebugInfo,
crate_type: Some(crate_type.clone()),
should_replace_hyphens: true,
});
}
// See rust-lang/cargo#4490, rust-lang/cargo#4960.
// Only uplift debuginfo for binaries.
// - Tests are run directly from `target/debug/deps/` with the
// metadata hash still in the filename.
// - Examples are only uplifted for apple because the symbol file
// needs to match the executable file name to be found (i.e., it
// needs to remove the hash in the filename). On Windows, the path
// to the .pdb with the hash is embedded in the executable.
// Handle separate debug files.
let is_apple = target_triple.contains("-apple-");
if *kind == TargetKind::Bin || (*kind == TargetKind::ExampleBin && is_apple) {
if matches!(
crate_type,
CrateType::Bin | CrateType::Dylib | CrateType::Cdylib | CrateType::ProcMacro
) {
if is_apple {
let suffix = if crate_type == CrateType::Bin {
".dSYM".to_string()
} else {
".dylib.dSYM".to_string()
};
ret.push(FileType {
suffix: ".dSYM".to_string(),
suffix,
prefix: prefix.clone(),
flavor: FileFlavor::DebugInfo,
crate_type: Some(crate_type.clone()),
// macOS tools like lldb use all sorts of magic to locate
// dSYM files. See https://lldb.llvm.org/use/symbols.html
// for some details. It seems like a `.dSYM` located next
// to the executable with the same name is one method. The
// dSYM should have the same hyphens as the executable for
// the names to match.
should_replace_hyphens: false,
})
} else if target_triple.ends_with("-msvc") {
@ -309,8 +389,13 @@ impl TargetInfo {
suffix: ".pdb".to_string(),
prefix: prefix.clone(),
flavor: FileFlavor::DebugInfo,
// rustc calls the linker with underscores, and the
// filename is embedded in the executable.
crate_type: Some(crate_type.clone()),
// The absolute path to the pdb file is embedded in the
// executable. If the exe/pdb pair is moved to another
// machine, then debuggers will look in the same directory
// of the exe with the original pdb filename. Since the
// original name contains underscores, they need to be
// preserved.
should_replace_hyphens: true,
})
}
@ -319,10 +404,10 @@ impl TargetInfo {
Ok(Some(ret))
}
fn discover_crate_type(&self, crate_type: &str) -> CargoResult<Option<(String, String)>> {
fn discover_crate_type(&self, crate_type: &CrateType) -> CargoResult<Option<(String, String)>> {
let mut process = self.crate_type_process.clone();
process.arg("--crate-type").arg(crate_type);
process.arg("--crate-type").arg(crate_type.as_str());
let output = process.exec_with_output().chain_err(|| {
format!(
@ -341,6 +426,62 @@ impl TargetInfo {
&mut output.lines(),
)?)
}
/// Returns all the file types generated by rustc for the given mode/target_kind.
///
/// The first value is a Vec of file types generated, the second value is
/// a list of CrateTypes that are not supported by the given target.
pub fn rustc_outputs(
&self,
mode: CompileMode,
target_kind: &TargetKind,
target_triple: &str,
) -> CargoResult<(Vec<FileType>, Vec<CrateType>)> {
match mode {
CompileMode::Build => self.calc_rustc_outputs(target_kind, target_triple),
CompileMode::Test | CompileMode::Bench => {
match self.file_types(&CrateType::Bin, FileFlavor::Normal, target_triple)? {
Some(fts) => Ok((fts, Vec::new())),
None => Ok((Vec::new(), vec![CrateType::Bin])),
}
}
CompileMode::Check { .. } => Ok((vec![FileType::new_rmeta()], Vec::new())),
CompileMode::Doc { .. } | CompileMode::Doctest | CompileMode::RunCustomBuild => {
panic!("asked for rustc output for non-rustc mode")
}
}
}
fn calc_rustc_outputs(
&self,
target_kind: &TargetKind,
target_triple: &str,
) -> CargoResult<(Vec<FileType>, Vec<CrateType>)> {
let mut unsupported = Vec::new();
let mut result = Vec::new();
let crate_types = target_kind.rustc_crate_types();
for crate_type in &crate_types {
let flavor = if crate_type.is_linkable() {
FileFlavor::Linkable
} else {
FileFlavor::Normal
};
let file_types = self.file_types(&crate_type, flavor, target_triple)?;
match file_types {
Some(types) => {
result.extend(types);
}
None => {
unsupported.push(crate_type.clone());
}
}
}
if !result.is_empty() && !crate_types.iter().any(|ct| ct.requires_upstream_objects()) {
// Only add rmeta if pipelining.
result.push(FileType::new_rmeta());
}
Ok((result, unsupported))
}
}
/// Takes rustc output (using specialized command line args), and calculates the file prefix and
@ -353,7 +494,7 @@ impl TargetInfo {
/// This function can not handle more than one file per type (with wasm32-unknown-emscripten, there
/// are two files for bin (`.wasm` and `.js`)).
fn parse_crate_type(
crate_type: &str,
crate_type: &CrateType,
cmd: &ProcessBuilder,
output: &str,
error: &str,
@ -525,9 +666,7 @@ pub struct RustcTargetData {
host_info: TargetInfo,
/// Build information for targets that we're building for. This will be
/// empty if the `--target` flag is not passed, and currently also only ever
/// has at most one entry, but eventually we'd like to support multi-target
/// builds with Cargo.
/// empty if the `--target` flag is not passed.
target_config: HashMap<CompileTarget, TargetConfig>,
target_info: HashMap<CompileTarget, TargetInfo>,
}

View File

@ -167,7 +167,7 @@ impl<'cfg> Compilation<'cfg> {
}
for crate_type in unit.target.rustc_crate_types() {
p.arg("--crate-type").arg(crate_type);
p.arg("--crate-type").arg(crate_type.as_str());
}
Ok(p)

View File

@ -9,7 +9,7 @@ use lazycell::LazyCell;
use log::info;
use super::{BuildContext, CompileKind, Context, FileFlavor, Layout};
use crate::core::compiler::{CompileMode, CompileTarget, Unit};
use crate::core::compiler::{CompileMode, CompileTarget, CrateType, FileType, Unit};
use crate::core::{Target, TargetKind, Workspace};
use crate::util::{self, CargoResult};
@ -85,9 +85,6 @@ pub struct CompilationFiles<'a, 'cfg> {
roots: Vec<Unit>,
ws: &'a Workspace<'cfg>,
/// Metadata hash to use for each unit.
///
/// `None` if the unit should not use a metadata data hash (like rustdoc,
/// or some dylibs).
metas: HashMap<Unit, Option<Metadata>>,
/// For each Unit, a list all files produced.
outputs: HashMap<Unit, LazyCell<Arc<Vec<OutputFile>>>>,
@ -151,12 +148,11 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
}
}
/// Gets the metadata for a target in a specific profile.
/// We build to the path `"{filename}-{target_metadata}"`.
/// We use a linking step to link/copy to a predictable filename
/// like `target/debug/libfoo.{a,so,rlib}` and such.
/// Gets the metadata for the given unit.
///
/// Returns `None` if the unit should not use a metadata data hash (like
/// See module docs for more details.
///
/// Returns `None` if the unit should not use a metadata hash (like
/// rustdoc, or some dylibs).
pub fn metadata(&self, unit: &Unit) -> Option<Metadata> {
self.metas[unit]
@ -169,8 +165,8 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
util::short_hash(&hashable)
}
/// Returns the appropriate output directory for the specified package and
/// target.
/// Returns the directory where the artifacts for the given unit are
/// initially created.
pub fn out_dir(&self, unit: &Unit) -> PathBuf {
if unit.mode.is_doc() {
self.layout(unit.kind).doc().to_path_buf()
@ -191,7 +187,10 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
}
/// Directory name to use for a package in the form `NAME-HASH`.
pub fn pkg_dir(&self, unit: &Unit) -> String {
///
/// Note that some units may share the same directory, so care should be
/// taken in those cases!
fn pkg_dir(&self, unit: &Unit) -> String {
let name = unit.pkg.package_id().name();
match self.metas[unit] {
Some(ref meta) => format!("{}-{}", name, meta),
@ -221,9 +220,29 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
self.layout(unit.kind).fingerprint().join(dir)
}
/// Returns the path for a file in the fingerprint directory.
///
/// The "prefix" should be something to distinguish the file from other
/// files in the fingerprint directory.
pub fn fingerprint_file_path(&self, unit: &Unit, prefix: &str) -> PathBuf {
// Different targets need to be distinguished in the fingerprint directory.
let kind = unit.target.kind().description();
let flavor = if unit.mode.is_any_test() {
"test-"
} else if unit.mode.is_doc() {
"doc-"
} else if unit.mode.is_run_custom_build() {
"run-"
} else {
""
};
let name = format!("{}{}{}-{}", prefix, flavor, kind, unit.target.name());
self.fingerprint_dir(unit).join(name)
}
/// Path where compiler output is cached.
pub fn message_cache_path(&self, unit: &Unit) -> PathBuf {
self.fingerprint_dir(unit).join("output")
self.fingerprint_file_path(unit, "output-")
}
/// Returns the directory where a compiled build script is stored.
@ -252,14 +271,6 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
self.build_script_run_dir(unit).join("out")
}
/// Returns the file stem for a given target/profile combo (with metadata).
pub fn file_stem(&self, unit: &Unit) -> String {
match self.metas[unit] {
Some(ref metadata) => format!("{}-{}", unit.target.crate_name(), metadata),
None => self.bin_stem(unit),
}
}
/// Returns the path to the executable binary for the given bin target.
///
/// This should only to be used when a `Unit` is not available.
@ -272,13 +283,12 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
assert!(target.is_bin());
let dest = self.layout(kind).dest();
let info = bcx.target_data.info(kind);
let file_types = info
.file_types(
"bin",
FileFlavor::Normal,
let (file_types, _) = info
.rustc_outputs(
CompileMode::Build,
&TargetKind::Bin,
bcx.target_data.short_name(&kind),
)?
)
.expect("target must support `bin`");
let file_type = file_types
@ -286,10 +296,12 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
.find(|file_type| file_type.flavor == FileFlavor::Normal)
.expect("target must support `bin`");
Ok(dest.join(file_type.filename(target.name())))
Ok(dest.join(file_type.uplift_filename(target)))
}
/// Returns the filenames that the given unit will generate.
///
/// Note: It is not guaranteed that all of the files will be generated.
pub(super) fn outputs(
&self,
unit: &Unit,
@ -300,57 +312,50 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
.map(Arc::clone)
}
/// Returns the bin filename for a given target, without extension and metadata.
fn bin_stem(&self, unit: &Unit) -> String {
if unit.target.allows_dashes() {
unit.target.name().to_string()
} else {
unit.target.crate_name()
/// Returns the path where the output for the given unit and FileType
/// should be uplifted to.
///
/// Returns `None` if the unit shouldn't be uplifted (for example, a
/// dependent rlib).
fn uplift_to(&self, unit: &Unit, file_type: &FileType, from_path: &Path) -> Option<PathBuf> {
// Tests, check, doc, etc. should not be uplifted.
if unit.mode != CompileMode::Build || file_type.flavor == FileFlavor::Rmeta {
return None;
}
}
/// Returns a tuple `(hard_link_dir, filename_stem)` for the primary
/// output file for the given unit.
///
/// `hard_link_dir` is the directory where the file should be hard-linked
/// ("uplifted") to. For example, `/path/to/project/target`.
///
/// `filename_stem` is the base filename without an extension.
///
/// This function returns it in two parts so the caller can add
/// prefix/suffix to filename separately.
///
/// Returns an `Option` because in some cases we don't want to link
/// (eg a dependent lib).
fn link_stem(&self, unit: &Unit) -> Option<(PathBuf, String)> {
let out_dir = self.out_dir(unit);
let bin_stem = self.bin_stem(unit); // Stem without metadata.
let file_stem = self.file_stem(unit); // Stem with metadata.
// We currently only lift files up from the `deps` directory. If
// it was compiled into something like `example/` or `doc/` then
// we don't want to link it up.
if out_dir.ends_with("deps") {
// Don't lift up library dependencies.
if unit.target.is_bin() || self.roots.contains(unit) || unit.target.is_dylib() {
Some((
out_dir.parent().unwrap().to_owned(),
if unit.mode.is_any_test() {
file_stem
} else {
bin_stem
},
))
} else {
None
}
} else if bin_stem == file_stem {
None
} else if out_dir.ends_with("examples") || out_dir.parent().unwrap().ends_with("build") {
Some((out_dir, bin_stem))
} else {
None
// Only uplift:
// - Binaries: The user always wants to see these, even if they are
// implicitly built (for example for integration tests).
// - dylibs: This ensures that the dynamic linker pulls in all the
// latest copies (even if the dylib was built from a previous cargo
// build). There are complex reasons for this, see #8139, #6167, #6162.
// - Things directly requested from the command-line (the "roots").
// This one is a little questionable for rlibs (see #6131), but is
// historically how Cargo has operated. This is primarily useful to
// give the user access to staticlibs and cdylibs.
if !unit.target.is_bin()
&& !unit.target.is_custom_build()
&& file_type.crate_type != Some(CrateType::Dylib)
&& !self.roots.contains(unit)
{
return None;
}
let filename = file_type.uplift_filename(&unit.target);
let uplift_path = if unit.target.is_example() {
// Examples live in their own little world.
self.layout(unit.kind).examples().join(filename)
} else if unit.target.is_custom_build() {
self.build_script_dir(unit).join(filename)
} else {
self.layout(unit.kind).dest().join(filename)
};
if from_path == uplift_path {
// This can happen with things like examples that reside in the
// same directory, do not have a metadata hash (like on Windows),
// and do not have hyphens.
return None;
}
Some(uplift_path)
}
fn calc_outputs(
@ -359,18 +364,6 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
bcx: &BuildContext<'a, 'cfg>,
) -> CargoResult<Arc<Vec<OutputFile>>> {
let ret = match unit.mode {
CompileMode::Check { .. } => {
// This may be confusing. rustc outputs a file named `lib*.rmeta`
// for both libraries and binaries.
let file_stem = self.file_stem(unit);
let path = self.out_dir(unit).join(format!("lib{}.rmeta", file_stem));
vec![OutputFile {
path,
hardlink: None,
export_path: None,
flavor: FileFlavor::Linkable { rmeta: false },
}]
}
CompileMode::Doc { .. } => {
let path = self
.out_dir(unit)
@ -394,128 +387,73 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
// but Cargo does not know about that.
vec![]
}
CompileMode::Test | CompileMode::Build | CompileMode::Bench => {
self.calc_outputs_rustc(unit, bcx)?
}
CompileMode::Test
| CompileMode::Build
| CompileMode::Bench
| CompileMode::Check { .. } => self.calc_outputs_rustc(unit, bcx)?,
};
info!("Target filenames: {:?}", ret);
Ok(Arc::new(ret))
}
/// Computes the actual, full pathnames for all the files generated by rustc.
///
/// The `OutputFile` also contains the paths where those files should be
/// "uplifted" to.
fn calc_outputs_rustc(
&self,
unit: &Unit,
bcx: &BuildContext<'a, 'cfg>,
) -> CargoResult<Vec<OutputFile>> {
let mut ret = Vec::new();
let mut unsupported = Vec::new();
let out_dir = self.out_dir(unit);
let link_stem = self.link_stem(unit);
let info = bcx.target_data.info(unit.kind);
let file_stem = self.file_stem(unit);
let mut add = |crate_type: &str, flavor: FileFlavor| -> CargoResult<()> {
let crate_type = if crate_type == "lib" {
"rlib"
} else {
crate_type
};
let file_types = info.file_types(
crate_type,
flavor,
unit.target.kind(),
bcx.target_data.short_name(&unit.kind),
)?;
match file_types {
Some(types) => {
for file_type in types {
let path = out_dir.join(file_type.filename(&file_stem));
// Don't create hardlink for tests
let hardlink = if unit.mode.is_any_test() {
None
} else {
link_stem
.as_ref()
.map(|&(ref ld, ref ls)| ld.join(file_type.filename(ls)))
};
let export_path = if unit.target.is_custom_build() {
None
} else {
self.export_dir.as_ref().and_then(|export_dir| {
hardlink
.as_ref()
.map(|hardlink| export_dir.join(hardlink.file_name().unwrap()))
})
};
ret.push(OutputFile {
path,
hardlink,
export_path,
flavor: file_type.flavor,
});
}
}
// Not supported; don't worry about it.
None => {
unsupported.push(crate_type.to_string());
}
}
Ok(())
};
match *unit.target.kind() {
TargetKind::Bin
| TargetKind::CustomBuild
| TargetKind::ExampleBin
| TargetKind::Bench
| TargetKind::Test => {
add("bin", FileFlavor::Normal)?;
}
TargetKind::Lib(..) | TargetKind::ExampleLib(..) if unit.mode.is_any_test() => {
add("bin", FileFlavor::Normal)?;
}
TargetKind::ExampleLib(ref kinds) | TargetKind::Lib(ref kinds) => {
for kind in kinds {
add(
kind.crate_type(),
if kind.linkable() {
FileFlavor::Linkable { rmeta: false }
} else {
FileFlavor::Normal
},
)?;
}
let path = out_dir.join(format!("lib{}.rmeta", file_stem));
if !unit.requires_upstream_objects() {
ret.push(OutputFile {
path,
hardlink: None,
export_path: None,
flavor: FileFlavor::Linkable { rmeta: true },
});
}
}
}
if ret.is_empty() {
let triple = bcx.target_data.short_name(&unit.kind);
let (file_types, unsupported) =
info.rustc_outputs(unit.mode, unit.target.kind(), triple)?;
if file_types.is_empty() {
if !unsupported.is_empty() {
let unsupported_strs: Vec<_> = unsupported.iter().map(|ct| ct.as_str()).collect();
anyhow::bail!(
"cannot produce {} for `{}` as the target `{}` \
does not support these crate types",
unsupported.join(", "),
unsupported_strs.join(", "),
unit.pkg,
bcx.target_data.short_name(&unit.kind),
triple,
)
}
anyhow::bail!(
"cannot compile `{}` as the target `{}` does not \
support any of the output crate types",
unit.pkg,
bcx.target_data.short_name(&unit.kind),
triple,
);
}
Ok(ret)
// Convert FileType to OutputFile.
let mut outputs = Vec::new();
for file_type in file_types {
let meta = self.metadata(unit).map(|m| m.to_string());
let path = out_dir.join(file_type.output_filename(&unit.target, meta.as_deref()));
let hardlink = self.uplift_to(unit, &file_type, &path);
let export_path = if unit.target.is_custom_build() {
None
} else {
self.export_dir.as_ref().and_then(|export_dir| {
hardlink
.as_ref()
.map(|hardlink| export_dir.join(hardlink.file_name().unwrap()))
})
};
outputs.push(OutputFile {
path,
hardlink,
export_path,
flavor: file_type.flavor,
});
}
Ok(outputs)
}
}
@ -539,46 +477,10 @@ fn compute_metadata(
cx: &Context<'_, '_>,
metas: &mut HashMap<Unit, Option<Metadata>>,
) -> Option<Metadata> {
if unit.mode.is_doc_test() {
// Doc tests do not have metadata.
return None;
}
// No metadata for dylibs because of a couple issues:
// - macOS encodes the dylib name in the executable,
// - Windows rustc multiple files of which we can't easily link all of them.
//
// No metadata for bin because of an issue:
// - wasm32 rustc/emcc encodes the `.wasm` name in the `.js` (rust-lang/cargo#4535).
// - msvc: The path to the PDB is embedded in the executable, and we don't
// want the PDB path to include the hash in it.
//
// Two exceptions:
// 1) Upstream dependencies (we aren't exporting + need to resolve name conflict),
// 2) `__CARGO_DEFAULT_LIB_METADATA` env var.
//
// Note, however, that the compiler's build system at least wants
// path dependencies (eg libstd) to have hashes in filenames. To account for
// that we have an extra hack here which reads the
// `__CARGO_DEFAULT_LIB_METADATA` environment variable and creates a
// hash in the filename if that's present.
//
// This environment variable should not be relied on! It's
// just here for rustbuild. We need a more principled method
// doing this eventually.
let bcx = &cx.bcx;
let __cargo_default_lib_metadata = env::var("__CARGO_DEFAULT_LIB_METADATA");
let short_name = bcx.target_data.short_name(&unit.kind);
if !(unit.mode.is_any_test() || unit.mode.is_check())
&& (unit.target.is_dylib()
|| unit.target.is_cdylib()
|| (unit.target.is_executable() && short_name.starts_with("wasm32-"))
|| (unit.target.is_executable() && short_name.contains("msvc")))
&& unit.pkg.package_id().source_id().is_path()
&& __cargo_default_lib_metadata.is_err()
{
if !should_use_metadata(bcx, unit) {
return None;
}
let mut hasher = SipHasher::new();
// This is a generic version number that can be changed to make
@ -638,7 +540,7 @@ fn compute_metadata(
// Seed the contents of `__CARGO_DEFAULT_LIB_METADATA` to the hasher if present.
// This should be the release channel, to get a different hash for each channel.
if let Ok(ref channel) = __cargo_default_lib_metadata {
if let Ok(ref channel) = env::var("__CARGO_DEFAULT_LIB_METADATA") {
channel.hash(&mut hasher);
}
@ -685,3 +587,46 @@ fn hash_rustc_version(bcx: &BuildContext<'_, '_>, hasher: &mut SipHasher) {
// the future when cranelift sees more use, and people want to switch
// between different backends without recompiling.
}
/// Returns whether or not this unit should use a metadata hash.
fn should_use_metadata(bcx: &BuildContext<'_, '_>, unit: &Unit) -> bool {
if unit.mode.is_doc_test() {
// Doc tests do not have metadata.
return false;
}
if unit.mode.is_any_test() || unit.mode.is_check() {
// These always use metadata.
return true;
}
// No metadata in these cases:
//
// - dylibs:
// - macOS encodes the dylib name in the executable, so it can't be renamed.
// - TODO: Are there other good reasons? If not, maybe this should be macos specific?
// - Windows MSVC executables: The path to the PDB is embedded in the
// executable, and we don't want the PDB path to include the hash in it.
// - wasm32 executables: When using emscripten, the path to the .wasm file
// is embedded in the .js file, so we don't want the hash in there.
// TODO: Is this necessary for wasm32-unknown-unknown?
//
// This is only done for local packages, as we don't expect to export
// dependencies.
//
// The __CARGO_DEFAULT_LIB_METADATA env var is used to override this to
// force metadata in the hash. This is only used for building libstd. For
// example, if libstd is placed in a common location, we don't want a file
// named /usr/lib/libstd.so which could conflict with other rustc
// installs. TODO: Is this still a realistic concern?
// See https://github.com/rust-lang/cargo/issues/3005
let short_name = bcx.target_data.short_name(&unit.kind);
if (unit.target.is_dylib()
|| unit.target.is_cdylib()
|| (unit.target.is_executable() && short_name.starts_with("wasm32-"))
|| (unit.target.is_executable() && short_name.contains("msvc")))
&& unit.pkg.package_id().source_id().is_path()
&& env::var("__CARGO_DEFAULT_LIB_METADATA").is_err()
{
return false;
}
true
}

View File

@ -262,7 +262,7 @@ impl<'a, 'cfg> Context<'a, 'cfg> {
/// Returns the executable for the specified unit (if any).
pub fn get_executable(&mut self, unit: &Unit) -> CargoResult<Option<PathBuf>> {
for output in self.outputs(unit)?.iter() {
if output.flavor == FileFlavor::DebugInfo {
if output.flavor != FileFlavor::Normal {
continue;
}

View File

@ -0,0 +1,97 @@
use std::fmt;
#[derive(Clone, PartialEq, Eq, Hash, PartialOrd, Ord)]
pub enum CrateType {
Bin,
Lib,
Rlib,
Dylib,
Cdylib,
Staticlib,
ProcMacro,
Other(String),
}
impl CrateType {
pub fn as_str(&self) -> &str {
match self {
CrateType::Bin => "bin",
CrateType::Lib => "lib",
CrateType::Rlib => "rlib",
CrateType::Dylib => "dylib",
CrateType::Cdylib => "cdylib",
CrateType::Staticlib => "staticlib",
CrateType::ProcMacro => "proc-macro",
CrateType::Other(s) => s,
}
}
pub fn is_linkable(&self) -> bool {
match self {
CrateType::Lib | CrateType::Rlib | CrateType::Dylib | CrateType::ProcMacro => true,
CrateType::Bin | CrateType::Cdylib | CrateType::Staticlib | CrateType::Other(..) => {
false
}
}
}
pub fn is_dynamic(&self) -> bool {
match self {
CrateType::Dylib | CrateType::Cdylib | CrateType::ProcMacro => true,
CrateType::Lib
| CrateType::Rlib
| CrateType::Bin
| CrateType::Staticlib
| CrateType::Other(..) => false,
}
}
pub fn requires_upstream_objects(&self) -> bool {
match self {
// "lib" == "rlib" and is a compilation that doesn't actually
// require upstream object files to exist, only upstream metadata
// files. As a result, it doesn't require upstream artifacts
CrateType::Lib | CrateType::Rlib => false,
// Everything else, however, is some form of "linkable output" or
// something that requires upstream object files.
_ => true,
}
}
}
impl fmt::Display for CrateType {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
self.as_str().fmt(f)
}
}
impl<'a> From<&'a String> for CrateType {
fn from(s: &'a String) -> Self {
match s.as_str() {
"bin" => CrateType::Bin,
"lib" => CrateType::Lib,
"rlib" => CrateType::Rlib,
"dylib" => CrateType::Dylib,
"cdylib" => CrateType::Cdylib,
"staticlib" => CrateType::Staticlib,
"procmacro" => CrateType::ProcMacro,
_ => CrateType::Other(s.clone()),
}
}
}
impl fmt::Debug for CrateType {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
self.to_string().fmt(f)
}
}
impl serde::Serialize for CrateType {
fn serialize<S>(&self, s: S) -> Result<S::Ok, S::Error>
where
S: serde::ser::Serializer,
{
self.to_string().serialize(s)
}
}

View File

@ -738,7 +738,7 @@ pub fn build_map(cx: &mut Context<'_, '_>) -> CargoResult<()> {
if dep_unit.target.for_host() {
ret.plugins.extend(dep_scripts.to_link.iter().cloned());
} else if dep_unit.target.linkable() {
} else if dep_unit.target.is_linkable() {
for &(pkg, metadata) in dep_scripts.to_link.iter() {
add_to_link(&mut ret, pkg, metadata);
}

View File

@ -43,8 +43,9 @@
//! The `Metadata` hash is a hash added to the output filenames to isolate
//! each unit. See the documentation in the `compilation_files` module for
//! more details. NOTE: Not all output files are isolated via filename hashes
//! (like dylibs), but the fingerprint directory always has the `Metadata`
//! hash in its directory name.
//! (like dylibs). The fingerprint directory uses a hash, but sometimes units
//! share the same fingerprint directory (when they don't have Metadata) so
//! care should be taken to handle this!
//!
//! Fingerprints and Metadata are similar, and track some of the same things.
//! The Metadata contains information that is required to keep Units separate.
@ -104,8 +105,9 @@
//! - A "dep-info" file which contains a list of source filenames for the
//! target. See below for details.
//! - An `invoked.timestamp` file whose filesystem mtime is updated every time
//! the Unit is built. This is an experimental feature used for cleaning
//! unused artifacts.
//! the Unit is built. This is used for capturing the time when the build
//! starts, to detect if files are changed in the middle of the build. See
//! below for more details.
//!
//! Note that some units are a little different. A Unit for *running* a build
//! script or for `rustdoc` does not have a dep-info file (it's not
@ -351,8 +353,7 @@ pub fn prepare_target(cx: &mut Context<'_, '_>, unit: &Unit, force: bool) -> Car
unit.target.name()
));
let bcx = cx.bcx;
let new = cx.files().fingerprint_dir(unit);
let loc = new.join(&filename(cx, unit));
let loc = cx.files().fingerprint_file_path(unit, "");
debug!("fingerprint at: {}", loc.display());
@ -1521,9 +1522,7 @@ pub fn prepare_init(cx: &mut Context<'_, '_>, unit: &Unit) -> CargoResult<()> {
/// Returns the location that the dep-info file will show up at for the `unit`
/// specified.
pub fn dep_info_loc(cx: &mut Context<'_, '_>, unit: &Unit) -> PathBuf {
cx.files()
.fingerprint_dir(unit)
.join(&format!("dep-{}", filename(cx, unit)))
cx.files().fingerprint_file_path(unit, "dep-")
}
/// Returns an absolute path to the target directory.
@ -1679,24 +1678,6 @@ where
None
}
fn filename(cx: &mut Context<'_, '_>, unit: &Unit) -> String {
// file_stem includes metadata hash. Thus we have a different
// fingerprint for every metadata hash version. This works because
// even if the package is fresh, we'll still link the fresh target
let file_stem = cx.files().file_stem(unit);
let kind = unit.target.kind().description();
let flavor = if unit.mode.is_any_test() {
"test-"
} else if unit.mode.is_doc() {
"doc-"
} else if unit.mode.is_run_custom_build() {
"run-"
} else {
""
};
format!("{}{}-{}", flavor, kind, file_stem)
}
#[repr(u8)]
enum DepInfoPathType {
// src/, e.g. src/lib.rs

View File

@ -26,19 +26,20 @@
//! # packages
//! .fingerprint/
//! # Each package is in a separate directory.
//! # Note that different target kinds have different filename prefixes.
//! $pkgname-$META/
//! # Set of source filenames for this package.
//! dep-lib-$pkgname-$META
//! dep-lib-$targetname
//! # Timestamp when this package was last built.
//! invoked.timestamp
//! # The fingerprint hash.
//! lib-$pkgname-$META
//! lib-$targetname
//! # Detailed information used for logging the reason why
//! # something is being recompiled.
//! lib-$pkgname-$META.json
//! lib-$targetname.json
//! # The console output from the compiler. This is cached
//! # so that warnings can be redisplayed for "fresh" units.
//! output
//! output-lib-$targetname
//!
//! # This is the root directory for all rustc artifacts except build
//! # scripts, examples, and test and bench executables. Almost every

View File

@ -4,6 +4,7 @@ mod build_plan;
mod compilation;
mod compile_kind;
mod context;
mod crate_type;
mod custom_build;
mod fingerprint;
mod job;
@ -30,15 +31,17 @@ use lazycell::LazyCell;
use log::debug;
pub use self::build_config::{BuildConfig, CompileMode, MessageFormat};
pub use self::build_context::{BuildContext, FileFlavor, RustcTargetData, TargetInfo};
pub use self::build_context::{BuildContext, FileFlavor, FileType, RustcTargetData, TargetInfo};
use self::build_plan::BuildPlan;
pub use self::compilation::{Compilation, Doctest};
pub use self::compile_kind::{CompileKind, CompileTarget};
pub use self::context::{Context, Metadata};
pub use self::crate_type::CrateType;
pub use self::custom_build::{BuildOutput, BuildScriptOutputs, BuildScripts};
pub use self::job::Freshness;
use self::job::{Job, Work};
use self::job_queue::{JobQueue, JobState};
pub(crate) use self::layout::Layout;
use self::output_depinfo::output_depinfo;
use self::unit_graph::UnitDep;
pub use crate::core::compiler::unit::{Unit, UnitInterner};
@ -190,17 +193,12 @@ fn rustc(cx: &mut Context<'_, '_>, unit: &Unit, exec: &Arc<dyn Executor>) -> Car
// don't pass the `-l` flags.
let pass_l_flag = unit.target.is_lib() || !unit.pkg.targets().iter().any(|t| t.is_lib());
let pass_cdylib_link_args = unit.target.is_cdylib();
let do_rename = unit.target.allows_dashes() && !unit.mode.is_any_test();
let real_name = unit.target.name().to_string();
let crate_name = unit.target.crate_name();
// Rely on `target_filenames` iterator as source of truth rather than rederiving filestem.
let rustc_dep_info_loc = if do_rename && cx.files().metadata(unit).is_none() {
root.join(&crate_name)
} else {
root.join(&cx.files().file_stem(unit))
}
.with_extension("d");
let dep_info_name = match cx.files().metadata(unit) {
Some(metadata) => format!("{}-{}.d", unit.target.crate_name(), metadata),
None => format!("{}.d", unit.target.crate_name()),
};
let rustc_dep_info_loc = root.join(dep_info_name);
let dep_info_loc = fingerprint::dep_info_loc(cx, unit);
rustc.args(cx.bcx.rustflags_args(unit));
@ -292,20 +290,6 @@ fn rustc(cx: &mut Context<'_, '_>, unit: &Unit, exec: &Arc<dyn Executor>) -> Car
.chain_err(|| format!("could not compile `{}`.", name))?;
}
if do_rename && real_name != crate_name {
let dst = &outputs[0].path;
let src = dst.with_file_name(
dst.file_name()
.unwrap()
.to_str()
.unwrap()
.replace(&real_name, &crate_name),
);
if src.exists() && src.file_name() != dst.file_name() {
fs::rename(&src, &dst).chain_err(|| format!("could not rename crate {:?}", src))?;
}
}
if rustc_dep_info_loc.exists() {
fingerprint::translate_dep_info(
&rustc_dep_info_loc,
@ -532,7 +516,7 @@ where
fn prepare_rustc(
cx: &mut Context<'_, '_>,
crate_types: &[&str],
crate_types: &[CrateType],
unit: &Unit,
) -> CargoResult<ProcessBuilder> {
let is_primary = cx.is_primary_package(unit);
@ -734,7 +718,7 @@ fn build_base_args(
cx: &mut Context<'_, '_>,
cmd: &mut ProcessBuilder,
unit: &Unit,
crate_types: &[&str],
crate_types: &[CrateType],
) -> CargoResult<()> {
assert!(!unit.mode.is_run_custom_build());
@ -764,7 +748,7 @@ fn build_base_args(
if !test {
for crate_type in crate_types.iter() {
cmd.arg("--crate-type").arg(crate_type);
cmd.arg("--crate-type").arg(crate_type.as_str());
}
}
@ -780,7 +764,7 @@ fn build_base_args(
}
let prefer_dynamic = (unit.target.for_host() && !unit.target.is_custom_build())
|| (crate_types.contains(&"dylib") && bcx.ws.members().any(|p| *p != unit.pkg));
|| (crate_types.contains(&CrateType::Dylib) && bcx.ws.members().any(|p| *p != unit.pkg));
if prefer_dynamic {
cmd.arg("-C").arg("prefer-dynamic");
}
@ -984,7 +968,7 @@ fn build_deps_args(
// error in the future (see PR #4797).
if !deps
.iter()
.any(|dep| !dep.unit.mode.is_doc() && dep.unit.target.linkable())
.any(|dep| !dep.unit.mode.is_doc() && dep.unit.target.is_linkable())
{
if let Some(dep) = deps
.iter()
@ -1067,19 +1051,18 @@ pub fn extern_args(
};
let outputs = cx.outputs(&dep.unit)?;
let mut outputs = outputs.iter().filter_map(|output| match output.flavor {
FileFlavor::Linkable { rmeta } => Some((output, rmeta)),
_ => None,
});
if cx.only_requires_rmeta(unit, &dep.unit) {
let (output, _rmeta) = outputs
.find(|(_output, rmeta)| *rmeta)
.expect("failed to find rlib dep for pipelined dep");
if cx.only_requires_rmeta(unit, &dep.unit) || dep.unit.mode.is_check() {
// Example: rlib dependency for an rlib, rmeta is all that is required.
let output = outputs
.iter()
.find(|output| output.flavor == FileFlavor::Rmeta)
.expect("failed to find rmeta dep for pipelined dep");
pass(&output.path);
} else {
for (output, rmeta) in outputs {
if !rmeta {
// Example: a bin needs `rlib` for dependencies, it cannot use rmeta.
for output in outputs.iter() {
if output.flavor == FileFlavor::Linkable {
pass(&output.path);
}
}
@ -1088,7 +1071,7 @@ pub fn extern_args(
};
for dep in deps {
if dep.unit.target.linkable() && !dep.unit.mode.is_doc() {
if dep.unit.target.is_linkable() && !dep.unit.mode.is_doc() {
link_to(dep, dep.extern_crate_name, dep.noprelude)?;
}
}

View File

@ -132,7 +132,7 @@ pub fn output_depinfo(cx: &mut Context<'_, '_>, unit: &Unit) -> CargoResult<()>
for output in cx
.outputs(unit)?
.iter()
.filter(|o| o.flavor != FileFlavor::DebugInfo)
.filter(|o| !matches!(o.flavor, FileFlavor::DebugInfo | FileFlavor::Auxiliary))
{
if let Some(ref link_dst) = output.hardlink {
let output_path = link_dst.with_extension("d");

View File

@ -1,5 +1,5 @@
use crate::core::compiler::{CompileKind, CompileMode};
use crate::core::manifest::{LibKind, Target, TargetKind};
use crate::core::compiler::{CompileKind, CompileMode, CrateType};
use crate::core::manifest::{Target, TargetKind};
use crate::core::{profiles::Profile, InternedString, Package};
use crate::util::hex::short_hash;
use crate::util::Config;
@ -178,9 +178,9 @@ impl UnitInterner {
//
// At some point in the future, it would be nice to have a
// first-class way of overriding or specifying crate-types.
(true, TargetKind::Lib(crate_types)) if crate_types.contains(&LibKind::Dylib) => {
(true, TargetKind::Lib(crate_types)) if crate_types.contains(&CrateType::Dylib) => {
let mut new_target = Target::clone(target);
new_target.set_kind(TargetKind::Lib(vec![LibKind::Rlib]));
new_target.set_kind(TargetKind::Lib(vec![CrateType::Rlib]));
new_target
}
_ => target.clone(),

View File

@ -476,7 +476,7 @@ fn maybe_lib(
unit.pkg
.targets()
.iter()
.find(|t| t.linkable())
.find(|t| t.is_linkable())
.map(|t| {
let mode = check_or_build_mode(unit.mode, t);
new_unit_dep(
@ -681,7 +681,7 @@ fn connect_run_custom_build_deps(unit_dependencies: &mut UnitGraph) {
// Only deps with `links`.
.filter(|other| {
other.unit.pkg != unit.pkg
&& other.unit.target.linkable()
&& other.unit.target.is_linkable()
&& other.unit.pkg.manifest().links().is_some()
})
// Get the RunCustomBuild for other lib.

View File

@ -10,6 +10,7 @@ use serde::ser;
use serde::Serialize;
use url::Url;
use crate::core::compiler::CrateType;
use crate::core::interning::InternedString;
use crate::core::resolver::ResolveBehavior;
use crate::core::{Dependency, PackageId, PackageIdSpec, SourceId, Summary};
@ -96,73 +97,13 @@ pub struct ManifestMetadata {
pub links: Option<String>,
}
#[derive(Clone, PartialEq, Eq, Hash, PartialOrd, Ord)]
pub enum LibKind {
Lib,
Rlib,
Dylib,
ProcMacro,
Other(String),
}
impl LibKind {
/// Returns the argument suitable for `--crate-type` to pass to rustc.
pub fn crate_type(&self) -> &str {
match *self {
LibKind::Lib => "lib",
LibKind::Rlib => "rlib",
LibKind::Dylib => "dylib",
LibKind::ProcMacro => "proc-macro",
LibKind::Other(ref s) => s,
}
}
pub fn linkable(&self) -> bool {
match *self {
LibKind::Lib | LibKind::Rlib | LibKind::Dylib | LibKind::ProcMacro => true,
LibKind::Other(..) => false,
}
}
pub fn requires_upstream_objects(&self) -> bool {
match *self {
// "lib" == "rlib" and is a compilation that doesn't actually
// require upstream object files to exist, only upstream metadata
// files. As a result, it doesn't require upstream artifacts
LibKind::Lib | LibKind::Rlib => false,
// Everything else, however, is some form of "linkable output" or
// something that requires upstream object files.
_ => true,
}
}
}
impl fmt::Debug for LibKind {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
self.crate_type().fmt(f)
}
}
impl<'a> From<&'a String> for LibKind {
fn from(string: &'a String) -> Self {
match string.as_ref() {
"lib" => LibKind::Lib,
"rlib" => LibKind::Rlib,
"dylib" => LibKind::Dylib,
"proc-macro" => LibKind::ProcMacro,
s => LibKind::Other(s.to_string()),
}
}
}
#[derive(Clone, Hash, PartialEq, Eq, PartialOrd, Ord)]
pub enum TargetKind {
Lib(Vec<LibKind>),
Lib(Vec<CrateType>),
Bin,
Test,
Bench,
ExampleLib(Vec<LibKind>),
ExampleLib(Vec<CrateType>),
ExampleBin,
CustomBuild,
}
@ -173,8 +114,8 @@ impl ser::Serialize for TargetKind {
S: ser::Serializer,
{
use self::TargetKind::*;
match *self {
Lib(ref kinds) => s.collect_seq(kinds.iter().map(LibKind::crate_type)),
match self {
Lib(kinds) => s.collect_seq(kinds.iter().map(|t| t.to_string())),
Bin => ["bin"].serialize(s),
ExampleBin | ExampleLib(_) => ["example"].serialize(s),
Test => ["test"].serialize(s),
@ -223,6 +164,18 @@ impl TargetKind {
_ => true,
}
}
/// Returns the arguments suitable for `--crate-type` to pass to rustc.
pub fn rustc_crate_types(&self) -> Vec<CrateType> {
match self {
TargetKind::Lib(kinds) | TargetKind::ExampleLib(kinds) => kinds.clone(),
TargetKind::CustomBuild
| TargetKind::Bench
| TargetKind::Test
| TargetKind::ExampleBin
| TargetKind::Bin => vec![CrateType::Bin],
}
}
}
/// Information about a binary, a library, an example, etc. that is part of the
@ -303,7 +256,7 @@ struct SerializedTarget<'a> {
kind: &'a TargetKind,
/// Corresponds to `--crate-type` compiler attribute.
/// See https://doc.rust-lang.org/reference/linkage.html
crate_types: Vec<&'a str>,
crate_types: Vec<CrateType>,
name: &'a str,
src_path: Option<&'a PathBuf>,
edition: &'a str,
@ -659,7 +612,7 @@ impl Target {
pub fn lib_target(
name: &str,
crate_targets: Vec<LibKind>,
crate_targets: Vec<CrateType>,
src_path: PathBuf,
edition: Edition,
) -> Target {
@ -712,15 +665,12 @@ impl Target {
pub fn example_target(
name: &str,
crate_targets: Vec<LibKind>,
crate_targets: Vec<CrateType>,
src_path: PathBuf,
required_features: Option<Vec<String>>,
edition: Edition,
) -> Target {
let kind = if crate_targets.is_empty()
|| crate_targets
.iter()
.all(|t| *t == LibKind::Other("bin".into()))
let kind = if crate_targets.is_empty() || crate_targets.iter().all(|t| *t == CrateType::Bin)
{
TargetKind::ExampleBin
} else {
@ -812,21 +762,13 @@ impl Target {
pub fn doctestable(&self) -> bool {
match self.kind() {
TargetKind::Lib(ref kinds) => kinds
.iter()
.any(|k| *k == LibKind::Rlib || *k == LibKind::Lib || *k == LibKind::ProcMacro),
TargetKind::Lib(ref kinds) => kinds.iter().any(|k| {
*k == CrateType::Rlib || *k == CrateType::Lib || *k == CrateType::ProcMacro
}),
_ => false,
}
}
/// Whether or not this target allows dashes in its filename.
///
/// Rustc will always emit filenames with underscores, and Cargo will
/// rename them back to dashes if this is true.
pub fn allows_dashes(&self) -> bool {
self.is_bin() || self.is_example() || self.is_custom_build()
}
pub fn is_lib(&self) -> bool {
match self.kind() {
TargetKind::Lib(_) => true,
@ -836,29 +778,25 @@ impl Target {
pub fn is_dylib(&self) -> bool {
match self.kind() {
TargetKind::Lib(libs) => libs.iter().any(|l| *l == LibKind::Dylib),
TargetKind::Lib(libs) => libs.iter().any(|l| *l == CrateType::Dylib),
_ => false,
}
}
pub fn is_cdylib(&self) -> bool {
let libs = match self.kind() {
TargetKind::Lib(libs) => libs,
_ => return false,
};
libs.iter().any(|l| match *l {
LibKind::Other(ref s) => s == "cdylib",
match self.kind() {
TargetKind::Lib(libs) => libs.iter().any(|l| *l == CrateType::Cdylib),
_ => false,
})
}
}
/// Returns whether this target produces an artifact which can be linked
/// into a Rust crate.
///
/// This only returns true for certain kinds of libraries.
pub fn linkable(&self) -> bool {
pub fn is_linkable(&self) -> bool {
match self.kind() {
TargetKind::Lib(kinds) => kinds.iter().any(|k| k.linkable()),
TargetKind::Lib(kinds) => kinds.iter().any(|k| k.is_linkable()),
_ => false,
}
}
@ -900,25 +838,16 @@ impl Target {
}
/// Returns the arguments suitable for `--crate-type` to pass to rustc.
pub fn rustc_crate_types(&self) -> Vec<&str> {
match self.kind() {
TargetKind::Lib(ref kinds) | TargetKind::ExampleLib(ref kinds) => {
kinds.iter().map(LibKind::crate_type).collect()
}
TargetKind::CustomBuild
| TargetKind::Bench
| TargetKind::Test
| TargetKind::ExampleBin
| TargetKind::Bin => vec!["bin"],
}
pub fn rustc_crate_types(&self) -> Vec<CrateType> {
self.kind().rustc_crate_types()
}
pub fn can_lto(&self) -> bool {
match self.kind() {
TargetKind::Lib(ref v) => {
!v.contains(&LibKind::Rlib)
&& !v.contains(&LibKind::Dylib)
&& !v.contains(&LibKind::Lib)
TargetKind::Lib(v) => {
!v.contains(&CrateType::Rlib)
&& !v.contains(&CrateType::Dylib)
&& !v.contains(&CrateType::Lib)
}
_ => true,
}

View File

@ -5,7 +5,7 @@ pub use self::features::{
pub use self::features::{CliUnstable, Edition, Feature, Features};
pub use self::interning::InternedString;
pub use self::manifest::{EitherManifest, VirtualManifest};
pub use self::manifest::{LibKind, Manifest, Target, TargetKind};
pub use self::manifest::{Manifest, Target, TargetKind};
pub use self::package::{Package, PackageSet};
pub use self::package_id::PackageId;
pub use self::package_id_spec::PackageIdSpec;

View File

@ -1,22 +1,12 @@
use crate::core::InternedString;
use std::collections::HashMap;
use std::fs;
use std::path::Path;
use crate::core::compiler::unit_dependencies;
use crate::core::compiler::BuildContext;
use crate::core::compiler::{
BuildConfig, CompileKind, CompileMode, Context, RustcTargetData, UnitInterner,
};
use crate::core::profiles::{Profiles, UnitFor};
use crate::core::resolver::features::HasDevUnits;
use crate::core::resolver::ResolveOpts;
use crate::core::{PackageIdSpec, Workspace};
use crate::core::compiler::{CompileKind, CompileMode, Layout, RustcTargetData};
use crate::core::profiles::Profiles;
use crate::core::{InternedString, PackageIdSpec, Workspace};
use crate::ops;
use crate::ops::resolve::WorkspaceResolve;
use crate::util::errors::{CargoResult, CargoResultExt};
use crate::util::paths;
use crate::util::Config;
use std::fs;
use std::path::Path;
pub struct CleanOptions<'a> {
pub config: &'a Config,
@ -61,135 +51,131 @@ pub fn clean(ws: &Workspace<'_>, opts: &CleanOptions<'_>) -> CargoResult<()> {
if opts.spec.is_empty() {
return rm_rf(&target_dir.into_path_unlocked(), config);
}
let mut build_config = BuildConfig::new(config, Some(1), &opts.targets, CompileMode::Build)?;
build_config.requested_profile = opts.requested_profile;
let target_data = RustcTargetData::new(ws, &build_config.requested_kinds)?;
// Resolve for default features. In the future, `cargo clean` should be rewritten
// so that it doesn't need to guess filename hashes.
let resolve_opts = ResolveOpts::new(
/*dev_deps*/ true,
&[],
/*all features*/ false,
/*default*/ true,
);
let specs = opts
.spec
.iter()
.map(|spec| PackageIdSpec::parse(spec))
.collect::<CargoResult<Vec<_>>>()?;
let ws_resolve = ops::resolve_ws_with_opts(
ws,
&target_data,
&build_config.requested_kinds,
&resolve_opts,
&specs,
HasDevUnits::Yes,
)?;
let WorkspaceResolve {
pkg_set,
targeted_resolve: resolve,
resolved_features: features,
..
} = ws_resolve;
let interner = UnitInterner::new();
let mut units = Vec::new();
// Clean specific packages.
let requested_kinds = CompileKind::from_requested_targets(config, &opts.targets)?;
let target_data = RustcTargetData::new(ws, &requested_kinds)?;
let (pkg_set, resolve) = ops::resolve_ws(ws)?;
let prof_dir_name = profiles.get_dir_name();
let host_layout = Layout::new(ws, None, &prof_dir_name)?;
// Convert requested kinds to a Vec of layouts.
let target_layouts: Vec<(CompileKind, Layout)> = requested_kinds
.into_iter()
.filter_map(|kind| match kind {
CompileKind::Target(target) => match Layout::new(ws, Some(target), &prof_dir_name) {
Ok(layout) => Some(Ok((kind, layout))),
Err(e) => Some(Err(e)),
},
CompileKind::Host => None,
})
.collect::<CargoResult<_>>()?;
// A Vec of layouts. This is a little convoluted because there can only be
// one host_layout.
let layouts = if opts.targets.is_empty() {
vec![(CompileKind::Host, &host_layout)]
} else {
target_layouts
.iter()
.map(|(kind, layout)| (*kind, layout))
.collect()
};
// Create a Vec that also includes the host for things that need to clean both.
let layouts_with_host: Vec<(CompileKind, &Layout)> =
std::iter::once((CompileKind::Host, &host_layout))
.chain(layouts.iter().map(|(k, l)| (*k, *l)))
.collect();
for spec in opts.spec.iter() {
// Translate the spec to a Package
let pkgid = resolve.query(spec)?;
let pkg = pkg_set.get_one(pkgid)?;
// Cleaning individual rustdoc crates is currently not supported.
// For example, the search index would need to be rebuilt to fully
// remove it (otherwise you're left with lots of broken links).
// Doc tests produce no output.
// Generate all relevant `Unit` targets for this package
for target in pkg.targets() {
for kind in build_config
.requested_kinds
.iter()
.chain(Some(&CompileKind::Host))
{
for mode in CompileMode::all_modes() {
for unit_for in UnitFor::all_values() {
let profile = if mode.is_run_custom_build() {
profiles.get_profile_run_custom_build(&profiles.get_profile(
pkg.package_id(),
ws.is_member(pkg),
/*is_local*/ true,
*unit_for,
CompileMode::Build,
))
} else {
profiles.get_profile(
pkg.package_id(),
ws.is_member(pkg),
/*is_local*/ true,
*unit_for,
*mode,
)
};
// Use unverified here since this is being more
// exhaustive than what is actually needed.
let features_for = unit_for.map_to_features_for();
let features = features
.activated_features_unverified(pkg.package_id(), features_for)
.unwrap_or_default();
units.push(interner.intern(
pkg, target, profile, *kind, *mode, features, /*is_std*/ false,
));
}
}
}
// Get Packages for the specified specs.
let mut pkg_ids = Vec::new();
for spec_str in opts.spec.iter() {
// Translate the spec to a Package.
let spec = PackageIdSpec::parse(spec_str)?;
if spec.version().is_some() {
config.shell().warn(&format!(
"version qualifier in `-p {}` is ignored, \
cleaning all versions of `{}` found",
spec_str,
spec.name()
))?;
}
if spec.url().is_some() {
config.shell().warn(&format!(
"url qualifier in `-p {}` ignored, \
cleaning all versions of `{}` found",
spec_str,
spec.name()
))?;
}
let matches: Vec<_> = resolve.iter().filter(|id| spec.matches(*id)).collect();
if matches.is_empty() {
anyhow::bail!("package ID specification `{}` matched no packages", spec);
}
pkg_ids.extend(matches);
}
let packages = pkg_set.get_many(pkg_ids)?;
let unit_graph = unit_dependencies::build_unit_dependencies(
ws,
&pkg_set,
&resolve,
&features,
None,
&units,
&Default::default(),
build_config.mode,
&target_data,
&profiles,
&interner,
)?;
let extra_args = HashMap::new();
let bcx = BuildContext::new(
ws,
pkg_set,
&build_config,
profiles,
extra_args,
target_data,
units,
unit_graph,
)?;
let mut cx = Context::new(&bcx)?;
cx.prepare_units()?;
for pkg in packages {
let pkg_dir = format!("{}-*", pkg.name());
for unit in &bcx.roots {
if unit.mode.is_doc() || unit.mode.is_doc_test() {
// Cleaning individual rustdoc crates is currently not supported.
// For example, the search index would need to be rebuilt to fully
// remove it (otherwise you're left with lots of broken links).
// Doc tests produce no output.
continue;
// Clean fingerprints.
for (_, layout) in &layouts_with_host {
rm_rf_glob(&layout.fingerprint().join(&pkg_dir), config)?;
}
rm_rf(&cx.files().fingerprint_dir(unit), config)?;
if unit.target.is_custom_build() {
if unit.mode.is_run_custom_build() {
rm_rf(&cx.files().build_script_out_dir(unit), config)?;
} else {
rm_rf(&cx.files().build_script_dir(unit), config)?;
for target in pkg.targets() {
if target.is_custom_build() {
// Get both the build_script_build and the output directory.
for (_, layout) in &layouts_with_host {
rm_rf_glob(&layout.build().join(&pkg_dir), config)?;
}
continue;
}
continue;
}
let crate_name = target.crate_name();
for &mode in &[
CompileMode::Build,
CompileMode::Test,
CompileMode::Check { test: false },
] {
for (compile_kind, layout) in &layouts {
let triple = target_data.short_name(compile_kind);
for output in cx.outputs(unit)?.iter() {
rm_rf(&output.path, config)?;
if let Some(ref dst) = output.hardlink {
rm_rf(dst, config)?;
let (file_types, _unsupported) = target_data
.info(*compile_kind)
.rustc_outputs(mode, target.kind(), triple)?;
let (dir, uplift_dir) = if target.is_example() {
(layout.examples(), layout.examples())
} else {
(layout.deps(), layout.dest())
};
for file_type in file_types {
// Some files include a hash in the filename, some don't.
let hashed_name = file_type.output_filename(target, Some("*"));
let unhashed_name = file_type.output_filename(target, None);
rm_rf_glob(&dir.join(&hashed_name), config)?;
rm_rf(&dir.join(&unhashed_name), config)?;
// Remove dep-info file generated by rustc. It is not tracked in
// file_types. It does not have a prefix.
let hashed_dep_info = dir.join(format!("{}-*.d", crate_name));
let unhashed_dep_info = dir.join(format!("{}.d", crate_name));
rm_rf_glob(&hashed_dep_info, config)?;
rm_rf(&unhashed_dep_info, config)?;
// Remove the uplifted copy.
let uplifted_path = uplift_dir.join(file_type.uplift_filename(target));
rm_rf(&uplifted_path, config)?;
// Dep-info generated by Cargo itself.
let dep_info = uplifted_path.with_extension("d");
rm_rf(&dep_info, config)?;
}
// TODO: what to do about build_script_build?
let incremental = layout.incremental().join(format!("{}-*", crate_name));
rm_rf_glob(&incremental, config)?;
}
}
}
}
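
To make the glob strategy above concrete, here is a small standalone sketch of the kinds of patterns `cargo clean -p` ends up deleting for a hypothetical `foo-bar` library crate building an rlib on Linux with the default `target/debug` layout. All paths and names below are illustrative assumptions, not captured Cargo output:

```rust
use std::path::PathBuf;

// Illustrative only: the flavor of glob patterns produced by the loop above
// for a library target `foo-bar` (crate name `foo_bar`).
fn main() {
    let deps = PathBuf::from("target/debug/deps");
    let dest = PathBuf::from("target/debug");
    let patterns = [
        deps.join("libfoo_bar-*.rlib"),      // hashed rustc output
        deps.join("libfoo_bar.rlib"),        // unhashed variant, if present
        deps.join("foo_bar-*.d"),            // rustc dep-info (hashed)
        deps.join("foo_bar.d"),              // rustc dep-info (unhashed)
        dest.join("libfoo_bar.rlib"),        // uplifted copy
        dest.join("libfoo_bar.d"),           // dep-info generated by Cargo itself
        dest.join(".fingerprint/foo-bar-*"), // fingerprint directories
        dest.join("incremental/foo_bar-*"),  // incremental cache
    ];
    for p in &patterns {
        println!("{}", p.display());
    }
}
```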
@ -197,8 +183,19 @@ pub fn clean(ws: &Workspace<'_>, opts: &CleanOptions<'_>) -> CargoResult<()> {
Ok(())
}
fn rm_rf_glob(pattern: &Path, config: &Config) -> CargoResult<()> {
// TODO: Display utf8 warning to user? Or switch to globset?
let pattern = pattern
.to_str()
.ok_or_else(|| anyhow::anyhow!("expected utf-8 path"))?;
for path in glob::glob(pattern)? {
rm_rf(&path?, config)?;
}
Ok(())
}
fn rm_rf(path: &Path, config: &Config) -> CargoResult<()> {
let m = fs::metadata(path);
let m = fs::symlink_metadata(path);
if m.as_ref().map(|s| s.is_dir()).unwrap_or(false) {
config
.shell()
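
One easy-to-miss change in the helpers above is the switch from `fs::metadata` to `fs::symlink_metadata` in `rm_rf`. A minimal sketch of the assumed rationale (not Cargo's exact code): `fs::metadata` follows symlinks, so a link pointing at a directory would be reported as a directory and the link target could be recursed into, whereas `fs::symlink_metadata` inspects the link itself so only the link entry is removed.

```rust
use std::fs;
use std::io;
use std::path::Path;

// Sketch of deleting a path without following symlinks (assumed rationale,
// not Cargo's exact code). With `symlink_metadata`, a symlink to a directory
// reports `is_dir() == false`, so it is removed as a single entry instead of
// recursing through whatever it points at. (Windows directory symlinks may
// need `remove_dir` instead of `remove_file`; omitted here for brevity.)
fn remove_no_follow(path: &Path) -> io::Result<()> {
    let meta = fs::symlink_metadata(path)?;
    if meta.is_dir() {
        fs::remove_dir_all(path)
    } else {
        fs::remove_file(path)
    }
}
```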


@ -832,9 +832,11 @@ fn generate_targets(
for proposal in filter_targets(packages, Target::is_lib, false, mode) {
let Proposal { target, pkg, .. } = proposal;
if mode.is_doc_test() && !target.doctestable() {
let types = target.rustc_crate_types();
let types_str: Vec<&str> = types.iter().map(|t| t.as_str()).collect();
ws.config().shell().warn(format!(
"doc tests are not supported for crate type(s) `{}` in package `{}`",
target.rustc_crate_types().join(", "),
types_str.join(", "),
pkg.name()
))?;
} else {


@ -1,6 +1,6 @@
pub use self::cargo_clean::{clean, CleanOptions};
pub use self::cargo_compile::{
compile, compile_with_exec, compile_ws, resolve_all_features, CompileOptions,
compile, compile_with_exec, compile_ws, create_bcx, resolve_all_features, CompileOptions,
};
pub use self::cargo_compile::{CompileFilter, FilterRule, LibRule, Packages};
pub use self::cargo_doc::{doc, DocOptions};


@ -14,7 +14,7 @@ use serde::{Deserialize, Serialize};
use url::Url;
use crate::core::dependency::DepKind;
use crate::core::manifest::{LibKind, ManifestMetadata, TargetSourcePath, Warnings};
use crate::core::manifest::{ManifestMetadata, TargetSourcePath, Warnings};
use crate::core::resolver::ResolveBehavior;
use crate::core::{Dependency, InternedString, Manifest, PackageId, Summary, Target};
use crate::core::{Edition, EitherManifest, Feature, Features, VirtualManifest, Workspace};


@ -15,9 +15,10 @@ use std::fs::{self, DirEntry};
use std::path::{Path, PathBuf};
use super::{
LibKind, PathValue, StringOrBool, StringOrVec, TomlBenchTarget, TomlBinTarget,
TomlExampleTarget, TomlLibTarget, TomlManifest, TomlTarget, TomlTestTarget,
PathValue, StringOrBool, StringOrVec, TomlBenchTarget, TomlBinTarget, TomlExampleTarget,
TomlLibTarget, TomlManifest, TomlTarget, TomlTestTarget,
};
use crate::core::compiler::CrateType;
use crate::core::{Edition, Feature, Features, Target};
use crate::util::errors::{CargoResult, CargoResultExt};
use crate::util::restricted_names;
@ -222,15 +223,15 @@ fn clean_lib(
if kinds.len() > 1 {
anyhow::bail!("cannot mix `proc-macro` crate type with others");
}
vec![LibKind::ProcMacro]
vec![CrateType::ProcMacro]
}
(_, Some(true), Some(true)) => {
anyhow::bail!("`lib.plugin` and `lib.proc-macro` cannot both be `true`")
}
(Some(kinds), _, _) => kinds.iter().map(|s| s.into()).collect(),
(None, Some(true), _) => vec![LibKind::Dylib],
(None, _, Some(true)) => vec![LibKind::ProcMacro],
(None, _, _) => vec![LibKind::Lib],
(None, Some(true), _) => vec![CrateType::Dylib],
(None, _, Some(true)) => vec![CrateType::ProcMacro],
(None, _, _) => vec![CrateType::Lib],
};
let mut target = Target::lib_target(&lib.name(), crate_types, path, edition);
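
The `kinds.iter().map(|s| s.into()).collect()` arm above assumes a string-to-`CrateType` conversion. A hedged sketch of what that conversion could look like, reusing the hypothetical `CrateType` from the earlier sketch (Cargo's real conversion may differ, for example in how it handles unknown strings):

```rust
// Hypothetical conversion backing `s.into()` in clean_lib above; unknown
// strings fall back to an `Other` variant so arbitrary `crate-type` values
// can still be passed through to rustc.
impl From<&str> for CrateType {
    fn from(s: &str) -> CrateType {
        match s {
            "bin" => CrateType::Bin,
            "lib" => CrateType::Lib,
            "rlib" => CrateType::Rlib,
            "dylib" => CrateType::Dylib,
            "cdylib" => CrateType::Cdylib,
            "staticlib" => CrateType::Staticlib,
            "proc-macro" => CrateType::ProcMacro,
            other => CrateType::Other(other.to_string()),
        }
    }
}
```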


@ -1614,7 +1614,7 @@ fn json_artifact_includes_executable_for_benchmark() {
{
"executable": "[..]/foo/target/release/deps/benchmark-[..][EXE]",
"features": [],
"filenames": [ "[..]/foo/target/release/deps/benchmark-[..][EXE]" ],
"filenames": "{...}",
"fresh": false,
"package_id": "foo 0.0.1 ([..])",
"profile": "{...}",


@ -154,9 +154,7 @@ fn cargo_build_plan_build_script() {
"env": "{...}",
"kind": null,
"links": "{...}",
"outputs": [
"[..]/foo/target/debug/build/[..]/build_script_build-[..]"
],
"outputs": "{...}",
"package_name": "foo",
"package_version": "0.5.0",
"program": "rustc",


@ -194,7 +194,7 @@ fn clears_cache_after_fix() {
// Fill the cache.
p.cargo("check").with_stderr_contains("[..]asdf[..]").run();
let cpath = p
.glob("target/debug/.fingerprint/foo-*/output")
.glob("target/debug/.fingerprint/foo-*/output-*")
.next()
.unwrap()
.unwrap();
@ -215,7 +215,10 @@ fn clears_cache_after_fix() {
",
)
.run();
assert_eq!(p.glob("target/debug/.fingerprint/foo-*/output").count(), 0);
assert_eq!(
p.glob("target/debug/.fingerprint/foo-*/output-*").count(),
0
);
// And again, check the cache is correct.
p.cargo("check")
@ -253,7 +256,10 @@ fn rustdoc() {
let rustdoc_stderr = as_str(&rustdoc_output.stderr);
assert!(rustdoc_stderr.contains("private"));
assert!(rustdoc_stderr.contains("\x1b["));
assert_eq!(p.glob("target/debug/.fingerprint/foo-*/output").count(), 1);
assert_eq!(
p.glob("target/debug/.fingerprint/foo-*/output-*").count(),
1
);
// Check the cached output.
let rustdoc_output = p
@ -331,14 +337,23 @@ fn doesnt_create_extra_files() {
p.cargo("build").run();
assert_eq!(p.glob("target/debug/.fingerprint/foo-*/output").count(), 0);
assert_eq!(p.glob("target/debug/.fingerprint/dep-*/output").count(), 0);
assert_eq!(
p.glob("target/debug/.fingerprint/foo-*/output-*").count(),
0
);
assert_eq!(
p.glob("target/debug/.fingerprint/dep-*/output-*").count(),
0
);
if is_coarse_mtime() {
sleep_ms(1000);
}
p.change_file("src/lib.rs", "fn unused() {}");
p.cargo("build").run();
assert_eq!(p.glob("target/debug/.fingerprint/foo-*/output").count(), 1);
assert_eq!(
p.glob("target/debug/.fingerprint/foo-*/output-*").count(),
1
);
}
#[cargo_test]
@ -491,3 +506,23 @@ fn rustc_workspace_wrapper() {
.with_stdout_does_not_contain("WRAPPER CALLED: rustc --crate-name foo src/lib.rs [..]")
.run();
}
#[cargo_test]
fn wacky_hashless_fingerprint() {
// On Windows, executables don't have hashes. This checks for a bad
// assumption that caused bad caching.
let p = project()
.file("src/bin/a.rs", "fn main() { let unused = 1; }")
.file("src/bin/b.rs", "fn main() {}")
.build();
p.cargo("build --bin b")
.with_stderr_does_not_contain("[..]unused[..]")
.run();
p.cargo("build --bin a")
.with_stderr_contains("[..]unused[..]")
.run();
// This should not pick up the cache from `a`.
p.cargo("build --bin b")
.with_stderr_does_not_contain("[..]unused[..]")
.run();
}


@ -1,9 +1,10 @@
//! Tests for the `cargo clean` command.
use std::env;
use cargo_test_support::paths::CargoPathExt;
use cargo_test_support::registry::Package;
use cargo_test_support::{basic_bin_manifest, basic_manifest, git, main_file, project};
use cargo_test_support::{basic_bin_manifest, basic_manifest, git, main_file, project, rustc_host};
use std::env;
use std::path::Path;
#[cargo_test]
fn cargo_clean_simple() {
@ -291,6 +292,7 @@ fn clean_verbose() {
[REMOVING] [..]
[REMOVING] [..]
[REMOVING] [..]
[REMOVING] [..]
",
)
.run();
@ -319,3 +321,163 @@ fn clean_remove_rlib_rmeta() {
assert!(!p.target_debug_dir().join("libfoo.rlib").exists());
assert!(!rmeta.exists());
}
#[cargo_test]
fn package_cleans_all_the_things() {
// -p cleans everything
// Use dashes everywhere to make sure dash/underscore stuff is handled.
for crate_type in &["rlib", "dylib", "cdylib", "staticlib", "proc-macro"] {
// Try each crate type individually since the behavior changes when
// they are combined.
let p = project()
.file(
"Cargo.toml",
&format!(
r#"
[package]
name = "foo-bar"
version = "0.1.0"
[lib]
crate-type = ["{}"]
"#,
crate_type
),
)
.file("src/lib.rs", "")
.build();
p.cargo("build").run();
p.cargo("clean -p foo-bar").run();
assert_all_clean(&p.build_dir());
}
let p = project()
.file(
"Cargo.toml",
r#"
[package]
name = "foo-bar"
version = "0.1.0"
edition = "2018"
[lib]
crate-type = ["rlib", "dylib", "staticlib"]
[[example]]
name = "foo-ex-rlib"
crate-type = ["rlib"]
test = true
[[example]]
name = "foo-ex-cdylib"
crate-type = ["cdylib"]
test = true
[[example]]
name = "foo-ex-bin"
test = true
"#,
)
.file("src/lib.rs", "")
.file("src/main.rs", "fn main() {}")
.file("src/bin/other-main.rs", "fn main() {}")
.file("examples/foo-ex-rlib.rs", "")
.file("examples/foo-ex-cdylib.rs", "")
.file("examples/foo-ex-bin.rs", "fn main() {}")
.file("tests/foo-test.rs", "")
.file("benches/foo-bench.rs", "")
.file("build.rs", "fn main() {}")
.build();
p.cargo("build --all-targets")
.env("CARGO_INCREMENTAL", "1")
.run();
p.cargo("test --all-targets")
.env("CARGO_INCREMENTAL", "1")
.run();
p.cargo("check --all-targets")
.env("CARGO_INCREMENTAL", "1")
.run();
p.cargo("clean -p foo-bar").run();
assert_all_clean(&p.build_dir());
// Try some targets.
p.cargo("build --all-targets --target")
.arg(rustc_host())
.run();
p.cargo("clean -p foo-bar --target").arg(rustc_host()).run();
assert_all_clean(&p.build_dir());
}
// Ensures that all files for the package have been deleted.
fn assert_all_clean(build_dir: &Path) {
let walker = walkdir::WalkDir::new(build_dir).into_iter();
for entry in walker.filter_entry(|e| {
let path = e.path();
// This is a known limitation: clean can't differentiate between the
// build scripts of different packages.
!(path
.file_name()
.unwrap()
.to_str()
.unwrap()
.starts_with("build_script_build")
&& path
.parent()
.unwrap()
.file_name()
.unwrap()
.to_str()
.unwrap()
== "incremental")
}) {
let entry = entry.unwrap();
let path = entry.path();
if let ".rustc_info.json" | ".cargo-lock" = path.file_name().unwrap().to_str().unwrap() {
continue;
}
if path.is_symlink() || path.is_file() {
panic!("{:?} was not cleaned", path);
}
}
}
#[cargo_test]
fn clean_spec_multiple() {
// clean -p foo where foo matches multiple versions
Package::new("bar", "1.0.0").publish();
Package::new("bar", "2.0.0").publish();
let p = project()
.file(
"Cargo.toml",
r#"
[package]
name = "foo"
version = "0.1.0"
[dependencies]
bar1 = {version="1.0", package="bar"}
bar2 = {version="2.0", package="bar"}
"#,
)
.file("src/lib.rs", "")
.build();
p.cargo("build").run();
p.cargo("clean -p bar:1.0.0")
.with_stderr(
"warning: version qualifier in `-p bar:1.0.0` is ignored, \
cleaning all versions of `bar` found",
)
.run();
let mut walker = walkdir::WalkDir::new(p.build_dir())
.into_iter()
.filter_map(|e| e.ok())
.filter(|e| {
let n = e.file_name().to_str().unwrap();
n.starts_with("bar") || n.starts_with("libbar")
});
if let Some(e) = walker.next() {
panic!("{:?} was not cleaned", e.path());
}
}


@ -272,13 +272,13 @@ fn relative_depinfo_paths_ws() {
assert_deps_contains(
&p,
"target/debug/.fingerprint/pm-*/dep-lib-pm-*",
"target/debug/.fingerprint/pm-*/dep-lib-pm",
&[(1, "src/lib.rs"), (2, "debug/deps/libpmdep-*.rlib")],
);
assert_deps_contains(
&p,
&format!("target/{}/debug/.fingerprint/foo-*/dep-bin-foo*", host),
&format!("target/{}/debug/.fingerprint/foo-*/dep-bin-foo", host),
&[
(1, "src/main.rs"),
(
@ -296,7 +296,7 @@ fn relative_depinfo_paths_ws() {
assert_deps_contains(
&p,
"target/debug/.fingerprint/foo-*/dep-build-script-build_script_build-*",
"target/debug/.fingerprint/foo-*/dep-build-script-build-script-build",
&[(1, "build.rs"), (2, "debug/deps/libbdep-*.rlib")],
);
@ -400,13 +400,13 @@ fn relative_depinfo_paths_no_ws() {
assert_deps_contains(
&p,
"target/debug/.fingerprint/pm-*/dep-lib-pm-*",
"target/debug/.fingerprint/pm-*/dep-lib-pm",
&[(1, "src/lib.rs"), (2, "debug/deps/libpmdep-*.rlib")],
);
assert_deps_contains(
&p,
"target/debug/.fingerprint/foo-*/dep-bin-foo*",
"target/debug/.fingerprint/foo-*/dep-bin-foo",
&[
(1, "src/main.rs"),
(
@ -424,7 +424,7 @@ fn relative_depinfo_paths_no_ws() {
assert_deps_contains(
&p,
"target/debug/.fingerprint/foo-*/dep-build-script-build_script_build-*",
"target/debug/.fingerprint/foo-*/dep-build-script-build-script-build",
&[(1, "build.rs"), (2, "debug/deps/libbdep-*.rlib")],
);
@ -461,7 +461,7 @@ fn reg_dep_source_not_tracked() {
assert_deps(
&p,
"target/debug/.fingerprint/regdep-*/dep-lib-regdep-*",
"target/debug/.fingerprint/regdep-*/dep-lib-regdep",
|info_path, entries| {
for (kind, path) in entries {
if *kind == 1 {
@ -513,7 +513,7 @@ fn canonical_path() {
assert_deps_contains(
&p,
"target/debug/.fingerprint/foo-*/dep-lib-foo-*",
"target/debug/.fingerprint/foo-*/dep-lib-foo",
&[(1, "src/lib.rs"), (2, "debug/deps/libregdep-*.rmeta")],
);
}


@ -497,7 +497,7 @@ fn metabuild_build_plan() {
"compile_mode": "build",
"kind": null,
"deps": [0, 1],
"outputs": ["[..]/target/debug/build/foo-[..]/metabuild_foo-[..][EXE]"],
"outputs": "{...}",
"links": "{...}",
"program": "rustc",
"args": "{...}",
@ -686,9 +686,7 @@ fn metabuild_json_artifact() {
{
"executable": null,
"features": [],
"filenames": [
"[..]/foo/target/debug/build/foo-[..]/metabuild-foo[EXE]"
],
"filenames": "{...}",
"fresh": false,
"package_id": "foo [..]",
"profile": "{...}",


@ -90,8 +90,8 @@ fn dynamic_library_with_debug() {
check_dir_contents(
&p.root().join("out"),
&["libfoo.so"],
&["libfoo.dylib"],
&["foo.dll", "foo.dll.lib"],
&["libfoo.dylib", "libfoo.dylib.dSYM"],
&["foo.dll", "foo.dll.exp", "foo.dll.lib", "foo.pdb"],
&["foo.dll", "libfoo.dll.a"],
);
}


@ -3360,7 +3360,7 @@ fn json_artifact_includes_test_flag() {
"name":"foo",
"src_path":"[..]lib.rs"
},
"filenames":["[..]/foo-[..]"],
"filenames":"{...}",
"fresh": false
}