Merge branch 'master' into trywithout

Daniel Wagner-Hall 2020-05-02 11:50:04 +01:00
commit 624ce68126
76 changed files with 1456 additions and 661 deletions


@@ -1,12 +1,64 @@
 # Changelog

-## Cargo 1.44 (2020-06-04)
-[bda50510...HEAD](https://github.com/rust-lang/cargo/compare/bda50510...HEAD)
+## Cargo 1.45 (2020-07-16)
+[ebda5065e...HEAD](https://github.com/rust-lang/cargo/compare/ebda5065e...HEAD)

 ### Added
+
+### Changed
+- Changed official documentation to recommend `.cargo/config.toml` filenames
+  (with the `.toml` extension). `.toml` extension support was added in 1.39.
+  [#8121](https://github.com/rust-lang/cargo/pull/8121)
+- The `registry.index` config value is no longer allowed (it has been
+  deprecated for 4 years).
+  [#7973](https://github.com/rust-lang/cargo/pull/7973)
+- An error is generated if both `--index` and `--registry` are passed
+  (previously `--index` was silently ignored).
+  [#7973](https://github.com/rust-lang/cargo/pull/7973)
+- The `registry.token` config value is no longer used with the `--index` flag.
+  This is intended to avoid potentially leaking the crates.io token to another
+  registry.
+  [#7973](https://github.com/rust-lang/cargo/pull/7973)
+- Added a warning if `registry.token` is used with source replacement. It is
+  intended this will be an error in future versions.
+  [#7973](https://github.com/rust-lang/cargo/pull/7973)
+- Windows GNU targets now copy `.dll.a` import library files for DLL crate
+  types to the output directory.
+  [#8141](https://github.com/rust-lang/cargo/pull/8141)
+- Dylibs for all dependencies are now unconditionally copied to the output
+  directory. Some obscure scenarios can cause an old dylib to be referenced
+  between builds, and this ensures that all the latest copies are used.
+  [#8139](https://github.com/rust-lang/cargo/pull/8139)
+
+### Fixed
+- Fixed copying Windows `.pdb` files to the output directory when the filename
+  contained dashes.
+  [#8123](https://github.com/rust-lang/cargo/pull/8123)
+
+### Nightly only
+- Fixed passing the full path for `--target` to `rustdoc` when using JSON spec
+  targets.
+  [#8094](https://github.com/rust-lang/cargo/pull/8094)
+- `-Cembed-bitcode=no` renamed to `-Cbitcode-in-rlib=no`
+  [#8134](https://github.com/rust-lang/cargo/pull/8134)
+- Added new `resolver` field to `Cargo.toml` to opt-in to the new feature
+  resolver.
+  [#8129](https://github.com/rust-lang/cargo/pull/8129)
+
+## Cargo 1.44 (2020-06-04)
+[bda50510...ebda5065e](https://github.com/rust-lang/cargo/compare/bda50510...ebda5065e)
+
+### Added
+- 🔥 Added the `cargo tree` command.
+  [docs](https://doc.rust-lang.org/nightly/cargo/commands/cargo-tree.html)
+  [#8062](https://github.com/rust-lang/cargo/pull/8062)
 - Added warnings if a package has Windows-restricted filenames (like `nul`,
   `con`, `aux`, `prn`, etc.).
   [#7959](https://github.com/rust-lang/cargo/pull/7959)
+- Added a `"build-finished"` JSON message when compilation is complete so that
+  tools can detect when they can stop listening for JSON messages with
+  commands like `cargo run` or `cargo test`.
+  [#8069](https://github.com/rust-lang/cargo/pull/8069)

 ### Changed
 - Valid package names are now restricted to Unicode XID identifiers. This is
@@ -19,22 +71,79 @@
   [#7959](https://github.com/rust-lang/cargo/pull/7959)
 - Tests are no longer hard-linked into the output directory (`target/debug/`).
   This ensures tools will have access to debug symbols and execute tests in
-  the same was as Cargo. Tools should use JSON messages to discover the path
+  the same way as Cargo. Tools should use JSON messages to discover the path
   to the executable.
   [#7965](https://github.com/rust-lang/cargo/pull/7965)
 - Updating git submodules now displays an "Updating" message for each
   submodule.
   [#7989](https://github.com/rust-lang/cargo/pull/7989)
+- File modification times are now preserved when extracting a `.crate` file.
+  This reverses the change made in 1.40 where the mtime was not preserved.
+  [#7935](https://github.com/rust-lang/cargo/pull/7935)
+- Build script warnings are now displayed separately when the build script
+  fails.
+  [#8017](https://github.com/rust-lang/cargo/pull/8017)
+- Removed the `git-checkout` subcommand.
+  [#8040](https://github.com/rust-lang/cargo/pull/8040)
+- The progress bar is now enabled for all unix platforms. Previously it was
+  only Linux, macOS, and FreeBSD.
+  [#8054](https://github.com/rust-lang/cargo/pull/8054)
+- Artifacts generated by pre-release versions of `rustc` now share the same
+  filenames. This means that changing nightly versions will not leave stale
+  files in the build directory.
+  [#8073](https://github.com/rust-lang/cargo/pull/8073)
+- Invalid package names are rejected when using renamed dependencies.
+  [#8090](https://github.com/rust-lang/cargo/pull/8090)
+- Added a certain class of HTTP2 errors as "spurious" that will get retried.
+  [#8102](https://github.com/rust-lang/cargo/pull/8102)

 ### Fixed
 - Cargo no longer buffers excessive amounts of compiler output in memory.
   [#7838](https://github.com/rust-lang/cargo/pull/7838)
 - Symbolic links in git repositories now work on Windows.
   [#7996](https://github.com/rust-lang/cargo/pull/7996)
+- Fixed an issue where `profile.dev` was not loaded from a config file with
+  `cargo test` when the `dev` profile was not defined in `Cargo.toml`.
+  [#8012](https://github.com/rust-lang/cargo/pull/8012)
+- When a binary is built as an implicit dependency of an integration test,
+  it now checks `dep_name/feature_name` syntax in `required-features` correctly.
+  [#8020](https://github.com/rust-lang/cargo/pull/8020)
+- Fixed an issue where Cargo would not detect that an executable (such as an
+  integration test) needs to be rebuilt when the previous build was
+  interrupted with Ctrl-C.
+  [#8087](https://github.com/rust-lang/cargo/pull/8087)
+- Protect against some (unknown) situations where Cargo could panic when the
+  system monotonic clock doesn't appear to be monotonic.
+  [#8114](https://github.com/rust-lang/cargo/pull/8114)

 ### Nightly only
 - Fixed panic with new feature resolver and required-features.
   [#7962](https://github.com/rust-lang/cargo/pull/7962)
+- Added `RUSTC_WORKSPACE_WRAPPER` environment variable, which provides a way
+  to wrap `rustc` for workspace members only, and affects the filename hash so
+  that artifacts produced by the wrapper are cached separately. This usage can
+  be seen on nightly clippy with `cargo clippy -Zunstable-options`.
+  [#7533](https://github.com/rust-lang/cargo/pull/7533)
+- Added `--unit-graph` CLI option to display Cargo's internal dependency graph
+  as JSON.
+  [#7977](https://github.com/rust-lang/cargo/pull/7977)
+- Changed `-Zbuild_dep` to `-Zhost_dep`, and added proc-macros to the feature
+  decoupling logic.
+  [#8003](https://github.com/rust-lang/cargo/pull/8003)
+  [#8028](https://github.com/rust-lang/cargo/pull/8028)
+- Fixed so that `--crate-version` is not automatically passed when the flag
+  is found in `RUSTDOCFLAGS`.
+  [#8014](https://github.com/rust-lang/cargo/pull/8014)
+- Fixed panic with `-Zfeatures=dev_dep` and `check --profile=test`.
+  [#8027](https://github.com/rust-lang/cargo/pull/8027)
+- Fixed panic with `-Zfeatures=itarget` with certain host dependencies.
+  [#8048](https://github.com/rust-lang/cargo/pull/8048)
+- Added support for `-Cembed-bitcode=no`, which provides a performance boost
+  and disk-space usage reduction for non-LTO builds.
+  [#8066](https://github.com/rust-lang/cargo/pull/8066)
+- `-Zpackage-features` has been extended with several changes intended to make
+  it easier to select features on the command-line in a workspace.
+  [#8074](https://github.com/rust-lang/cargo/pull/8074)

 ## Cargo 1.43 (2020-04-23)
 [9d32b7b0...rust-1.43.0](https://github.com/rust-lang/cargo/compare/9d32b7b0...rust-1.43.0)
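The `"build-finished"` entry above gives tools a definite end-of-stream marker for Cargo's JSON output. A minimal sketch of how a tool might watch for it, assuming one JSON object per line; a real tool should parse the JSON properly (e.g. with `serde_json`) rather than match substrings:

```rust
// Sketch: detect Cargo's terminating "build-finished" JSON message by
// checking each line's `reason` field. Substring matching is a shortcut
// for illustration only.
fn is_build_finished(line: &str) -> bool {
    line.contains("\"reason\":\"build-finished\"")
}

fn main() {
    // Hypothetical message stream, heavily simplified.
    let messages = [
        r#"{"reason":"compiler-artifact","package_id":"demo 0.1.0"}"#,
        r#"{"reason":"build-finished","success":true}"#,
    ];
    for msg in &messages {
        if is_build_finished(msg) {
            println!("done listening");
            break;
        }
    }
}
```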


@@ -1,6 +1,6 @@
 [package]
 name = "cargo"
-version = "0.45.0"
+version = "0.46.0"
 edition = "2018"
 authors = ["Yehuda Katz <wycats@gmail.com>",
            "Carl Lerche <me@carllerche.com>",
@@ -32,7 +32,7 @@ pretty_env_logger = { version = "0.4", optional = true }
 anyhow = "1.0"
 filetime = "0.2.9"
 flate2 = { version = "1.0.3", default-features = false, features = ["zlib"] }
-git2 = "0.13.1"
+git2 = "0.13.5"
 git2-curl = "0.14.0"
 glob = "0.3.0"
 hex = "0.4"
@@ -44,7 +44,7 @@ jobserver = "0.1.21"
 lazycell = "1.2.0"
 libc = "0.2"
 log = "0.4.6"
-libgit2-sys = "0.12.1"
+libgit2-sys = "0.12.5"
 memchr = "2.1.3"
 num_cpus = "1.0"
 opener = "0.4"


@@ -41,6 +41,9 @@ jobs:
       x86_64-msvc:
         TOOLCHAIN: stable-x86_64-pc-windows-msvc
         OTHER_TARGET: i686-pc-windows-msvc
+      x86_64-gnu:
+        TOOLCHAIN: nightly-x86_64-pc-windows-gnu
+        OTHER_TARGET: i686-pc-windows-gnu
 - job: rustfmt
   pool:


@@ -4,7 +4,7 @@ steps:
 rustup set profile minimal
 rustup component remove --toolchain=$TOOLCHAIN rust-docs || echo "already removed"
 rustup update --no-self-update $TOOLCHAIN
-if [ "$TOOLCHAIN" = "nightly" ]; then
+if [[ "$TOOLCHAIN" == "nightly"* ]]; then
   rustup component add --toolchain=$TOOLCHAIN rustc-dev
 fi
 rustup default $TOOLCHAIN


@@ -190,6 +190,8 @@ pub fn alternate() -> &'static str {
         "i686-unknown-linux-gnu"
     } else if cfg!(all(target_os = "windows", target_env = "msvc")) {
         "i686-pc-windows-msvc"
+    } else if cfg!(all(target_os = "windows", target_env = "gnu")) {
+        "i686-pc-windows-gnu"
     } else {
         panic!("This test should be gated on cross_compile::disabled.");
     }
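The branch added above picks an alternate target triple at compile time with the `cfg!` macro. A self-contained sketch of the same pattern; the fallback arm here is an illustrative simplification (the real helper panics on unsupported hosts):

```rust
// Compile-time target selection with cfg!, as in the test-support
// alternate() helper. The final arm is a stand-in fallback, not the
// real helper's behavior.
fn alternate_target() -> &'static str {
    if cfg!(all(target_os = "windows", target_env = "msvc")) {
        "i686-pc-windows-msvc"
    } else if cfg!(all(target_os = "windows", target_env = "gnu")) {
        "i686-pc-windows-gnu"
    } else {
        "i686-unknown-linux-gnu"
    }
}

fn main() {
    println!("alternate target: {}", alternate_target());
}
```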


@@ -117,7 +117,6 @@ use std::path::{Path, PathBuf};
 use std::process::{Command, Output};
 use std::str;
 use std::time::{self, Duration};
-use std::usize;
 use cargo::util::{is_ci, CargoResult, ProcessBuilder, ProcessError, Rustc};
 use serde_json::{self, Value};
@@ -1723,6 +1722,7 @@ fn _process(t: &OsStr) -> cargo::util::ProcessBuilder {
         .env_remove("RUSTDOC")
         .env_remove("RUSTC_WRAPPER")
         .env_remove("RUSTFLAGS")
+        .env_remove("RUSTDOCFLAGS")
         .env_remove("XDG_CONFIG_HOME") // see #2345
         .env("GIT_CONFIG_NOSYSTEM", "1") // keep trying to sandbox ourselves
         .env_remove("EMAIL")


@@ -25,7 +25,7 @@ proptest! {
                 0
             } else {
                 // but that local builds will give a small clear test case.
-                std::u32::MAX
+                u32::MAX
             },
             result_cache: prop::test_runner::basic_result_cache,
             .. ProptestConfig::default()
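The change above swaps the legacy `std::u32::MAX` module constant for the `u32::MAX` associated constant (stable since Rust 1.43); both have the same value:

```rust
// The associated constant u32::MAX replaces the legacy std::u32::MAX
// module constant.
fn main() {
    let max = u32::MAX;
    assert_eq!(max, 4_294_967_295);
    println!("u32::MAX = {}", max);
}
```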


@@ -28,7 +28,7 @@ pub fn exec(config: &mut Config, args: &ArgMatches<'_>) -> CliResult {
     let opts = CleanOptions {
         config,
         spec: values(args, "package"),
-        target: args.target(),
+        targets: args.targets(),
         requested_profile: args.get_profile_name(config, "dev", ProfileChecking::Checked)?,
         profile_specified: args.is_present("profile") || args.is_present("release"),
         doc: args.is_present("doc"),


@@ -28,7 +28,7 @@ pub fn exec(config: &mut Config, args: &ArgMatches<'_>) -> CliResult {
     let opts = FetchOptions {
         config,
-        target: args.target(),
+        targets: args.targets(),
     };
     let _ = ops::fetch(&ws, &opts)?;
     Ok(())


@@ -12,13 +12,11 @@ pub fn cli() -> App {
         )
         .arg(opt("quiet", "No output printed to stdout").short("q"))
         .arg_features()
-        .arg(
-            opt(
-                "filter-platform",
-                "Only include resolve dependencies matching the given target-triple",
-            )
-            .value_name("TRIPLE"),
-        )
+        .arg(multi_opt(
+            "filter-platform",
+            "TRIPLE",
+            "Only include resolve dependencies matching the given target-triple",
+        ))
         .arg(opt(
             "no-deps",
             "Output information only about the workspace members \
@@ -51,7 +49,7 @@ pub fn exec(config: &mut Config, args: &ArgMatches<'_>) -> CliResult {
         all_features: args.is_present("all-features"),
         no_default_features: args.is_present("no-default-features"),
         no_deps: args.is_present("no-deps"),
-        filter_platform: args.value_of("filter-platform").map(|s| s.to_string()),
+        filter_platforms: args._values_of("filter-platform"),
         version,
     };


@@ -42,7 +42,7 @@ pub fn exec(config: &mut Config, args: &ArgMatches<'_>) -> CliResult {
         list: args.is_present("list"),
         check_metadata: !args.is_present("no-metadata"),
         allow_dirty: args.is_present("allow-dirty"),
-        target: args.target(),
+        targets: args.targets(),
         jobs: args.jobs()?,
         features: args._values_of("features"),
         all_features: args.is_present("all-features"),


@@ -40,7 +40,7 @@ pub fn exec(config: &mut Config, args: &ArgMatches<'_>) -> CliResult {
         index,
         verify: !args.is_present("no-verify"),
         allow_dirty: args.is_present("allow-dirty"),
-        target: args.target(),
+        targets: args.targets(),
         jobs: args.jobs()?,
         dry_run: args.is_present("dry-run"),
         registry,


@@ -129,15 +129,15 @@ pub fn exec(config: &mut Config, args: &ArgMatches<'_>) -> CliResult {
         )?;
     }
-    let target = if args.is_present("all-targets") {
+    let targets = if args.is_present("all-targets") {
         config
             .shell()
             .warn("the --all-targets flag has been changed to --target=all")?;
-        Some("all")
+        vec!["all".to_string()]
     } else {
-        args.value_of("target")
+        args._values_of("target")
     };
-    let target = tree::Target::from_cli(target);
+    let target = tree::Target::from_cli(targets);
     let edge_kinds = parse_edge_kinds(config, args)?;
     let graph_features = edge_kinds.contains(&EdgeKind::Feature);
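The `cargo tree` change above moves from a single optional `--target` to a list, with `"all"` as a sentinel. A hypothetical sketch of a `from_cli`-style dispatch; the enum and its variants here are illustrative stand-ins, not Cargo's actual `tree::Target` type:

```rust
// Hypothetical stand-in for tree::Target: empty means host, "all" means
// every target, otherwise keep the explicit triples.
#[derive(Debug, PartialEq)]
enum Target {
    Host,
    All,
    Specific(Vec<String>),
}

fn from_cli(targets: Vec<String>) -> Target {
    if targets.is_empty() {
        Target::Host
    } else if targets.iter().any(|t| t == "all") {
        Target::All
    } else {
        Target::Specific(targets)
    }
}

fn main() {
    println!("{:?}", from_cli(vec!["x86_64-unknown-linux-gnu".to_string()]));
}
```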


@@ -2,6 +2,7 @@ use crate::core::compiler::CompileKind;
 use crate::core::interning::InternedString;
 use crate::util::ProcessBuilder;
 use crate::util::{CargoResult, Config, RustfixDiagnosticServer};
+use anyhow::bail;
 use serde::ser;
 use std::cell::RefCell;
 use std::path::PathBuf;
@@ -10,7 +11,7 @@ use std::path::PathBuf;
 #[derive(Debug)]
 pub struct BuildConfig {
     /// The requested kind of compilation for this session
-    pub requested_kind: CompileKind,
+    pub requested_kinds: Vec<CompileKind>,
     /// Number of rustc jobs to run in parallel.
     pub jobs: u32,
     /// Build profile
@@ -50,12 +51,11 @@ impl BuildConfig {
     pub fn new(
         config: &Config,
         jobs: Option<u32>,
-        requested_target: &Option<String>,
+        requested_targets: &[String],
         mode: CompileMode,
     ) -> CargoResult<BuildConfig> {
         let cfg = config.build_config()?;
-        let requested_kind =
-            CompileKind::from_requested_target(config, requested_target.as_deref())?;
+        let requested_kinds = CompileKind::from_requested_targets(config, requested_targets)?;
         if jobs == Some(0) {
             anyhow::bail!("jobs must be at least 1")
         }
@@ -69,7 +69,7 @@ impl BuildConfig {
         let jobs = jobs.or(cfg.jobs).unwrap_or(::num_cpus::get() as u32);
         Ok(BuildConfig {
-            requested_kind,
+            requested_kinds,
             jobs,
             requested_profile: InternedString::new("dev"),
             mode,
@@ -95,6 +95,13 @@ impl BuildConfig {
     pub fn test(&self) -> bool {
         self.mode == CompileMode::Test || self.mode == CompileMode::Bench
     }
+
+    pub fn single_requested_kind(&self) -> CargoResult<CompileKind> {
+        match self.requested_kinds.len() {
+            1 => Ok(self.requested_kinds[0]),
+            _ => bail!("only one `--target` argument is supported"),
+        }
+    }
 }

 #[derive(Clone, Copy, Debug, PartialEq, Eq)]
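The new `single_requested_kind` helper above lets call sites that still assume a single `--target` fail with a clear error. A standalone sketch of the same guard, using a plain `Result` and a simplified stand-in for Cargo's `CompileKind`:

```rust
// Stand-in for Cargo's CompileKind; the real Target variant carries an
// interned target triple.
#[derive(Clone, Copy, Debug, PartialEq)]
enum CompileKind {
    Host,
    Target, // simplified: no triple payload
}

// Mirrors BuildConfig::single_requested_kind: succeed only when exactly
// one kind was requested.
fn single_requested_kind(kinds: &[CompileKind]) -> Result<CompileKind, String> {
    match kinds.len() {
        1 => Ok(kinds[0]),
        _ => Err("only one `--target` argument is supported".to_string()),
    }
}

fn main() {
    println!("{:?}", single_requested_kind(&[CompileKind::Host]));
}
```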


@@ -1,8 +1,8 @@
 use crate::core::compiler::unit_graph::UnitGraph;
 use crate::core::compiler::{BuildConfig, CompileKind, Unit};
 use crate::core::profiles::Profiles;
+use crate::core::PackageSet;
 use crate::core::{InternedString, Workspace};
-use crate::core::{PackageId, PackageSet};
 use crate::util::config::Config;
 use crate::util::errors::CargoResult;
 use crate::util::Rustc;
@@ -99,10 +99,6 @@ impl<'a, 'cfg> BuildContext<'a, 'cfg> {
         &self.target_data.info(unit.kind).rustdocflags
     }

-    pub fn show_warnings(&self, pkg: PackageId) -> bool {
-        pkg.source_id().is_path() || self.config.extra_verbose()
-    }
-
     pub fn extra_args_for(&self, unit: &Unit) -> Option<&Vec<String>> {
         self.extra_compiler_args.get(unit)
     }


@@ -90,11 +90,18 @@ impl FileType {
 impl TargetInfo {
     pub fn new(
         config: &Config,
-        requested_kind: CompileKind,
+        requested_kinds: &[CompileKind],
         rustc: &Rustc,
         kind: CompileKind,
     ) -> CargoResult<TargetInfo> {
-        let rustflags = env_args(config, requested_kind, &rustc.host, None, kind, "RUSTFLAGS")?;
+        let rustflags = env_args(
+            config,
+            requested_kinds,
+            &rustc.host,
+            None,
+            kind,
+            "RUSTFLAGS",
+        )?;
         let mut process = rustc.process();
         process
             .arg("-")
@@ -180,7 +187,7 @@ impl TargetInfo {
             // information
             rustflags: env_args(
                 config,
-                requested_kind,
+                requested_kinds,
                 &rustc.host,
                 Some(&cfg),
                 kind,
@@ -188,7 +195,7 @@ impl TargetInfo {
             )?,
             rustdocflags: env_args(
                 config,
-                requested_kind,
+                requested_kinds,
                 &rustc.host,
                 Some(&cfg),
                 kind,
@@ -257,6 +264,17 @@ impl TargetInfo {
                 flavor: FileFlavor::Normal,
                 should_replace_hyphens: false,
             })
+        } else if target_triple.ends_with("windows-gnu")
+            && crate_type.ends_with("dylib")
+            && suffix == ".dll"
+        {
+            // LD can link DLL directly, but LLD requires the import library.
+            ret.push(FileType {
+                suffix: ".dll.a".to_string(),
+                prefix: "lib".to_string(),
+                flavor: FileFlavor::Normal,
+                should_replace_hyphens: false,
+            })
         }

         // See rust-lang/cargo#4535.
@@ -405,7 +423,7 @@ fn output_err_info(cmd: &ProcessBuilder, stdout: &str, stderr: &str) -> String {
 /// scripts, ...), even if it is the same as the target.
 fn env_args(
     config: &Config,
-    requested_kind: CompileKind,
+    requested_kinds: &[CompileKind],
     host_triple: &str,
     target_cfg: Option<&[Cfg]>,
     kind: CompileKind,
@@ -430,7 +448,7 @@ fn env_args(
     // This means that, e.g., even if the specified --target is the
     // same as the host, build scripts in plugins won't get
     // RUSTFLAGS.
-    if !requested_kind.is_host() && kind.is_host() {
+    if requested_kinds != &[CompileKind::Host] && kind.is_host() {
         // This is probably a build script or plugin and we're
         // compiling with --target. In this scenario there are
         // no rustflags we can apply.
@@ -515,20 +533,25 @@ pub struct RustcTargetData {
 }

 impl RustcTargetData {
-    pub fn new(ws: &Workspace<'_>, requested_kind: CompileKind) -> CargoResult<RustcTargetData> {
+    pub fn new(
+        ws: &Workspace<'_>,
+        requested_kinds: &[CompileKind],
+    ) -> CargoResult<RustcTargetData> {
         let config = ws.config();
         let rustc = config.load_global_rustc(Some(ws))?;
         let host_config = config.target_cfg_triple(&rustc.host)?;
-        let host_info = TargetInfo::new(config, requested_kind, &rustc, CompileKind::Host)?;
+        let host_info = TargetInfo::new(config, requested_kinds, &rustc, CompileKind::Host)?;
         let mut target_config = HashMap::new();
         let mut target_info = HashMap::new();
-        if let CompileKind::Target(target) = requested_kind {
-            let tcfg = config.target_cfg_triple(target.short_name())?;
-            target_config.insert(target, tcfg);
-            target_info.insert(
-                target,
-                TargetInfo::new(config, requested_kind, &rustc, CompileKind::Target(target))?,
-            );
+        for kind in requested_kinds {
+            if let CompileKind::Target(target) = *kind {
+                let tcfg = config.target_cfg_triple(target.short_name())?;
+                target_config.insert(target, tcfg);
+                target_info.insert(
+                    target,
+                    TargetInfo::new(config, requested_kinds, &rustc, *kind)?,
+                );
+            }
         }

         Ok(RustcTargetData {
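The rewritten `env_args` condition above generalizes "a `--target` was given" from a single kind to a list: RUSTFLAGS are withheld from host units (build scripts, plugins) whenever the requested kinds are anything other than exactly `[Host]`. A reduced sketch of that predicate, using a simplified stand-in for `CompileKind`:

```rust
// Simplified stand-in; Cargo's Target variant carries the triple.
#[derive(Clone, Copy, Debug, PartialEq)]
enum CompileKind {
    Host,
    Target,
}

// Mirrors the rewritten env_args check: skip RUSTFLAGS for host units
// (build scripts/plugins) when any explicit --target was requested.
fn skip_rustflags(requested_kinds: &[CompileKind], unit_kind: CompileKind) -> bool {
    requested_kinds != &[CompileKind::Host] && unit_kind == CompileKind::Host
}

fn main() {
    // No --target: host units still get RUSTFLAGS.
    println!("{}", skip_rustflags(&[CompileKind::Host], CompileKind::Host));
    // Explicit --target: build scripts (host units) get none.
    println!("{}", skip_rustflags(&[CompileKind::Target], CompileKind::Host));
}
```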


@@ -8,15 +8,14 @@ use semver::Version;
 use super::BuildContext;
 use crate::core::compiler::CompileKind;
-use crate::core::{Edition, Package, PackageId, Target};
+use crate::core::compiler::Unit;
+use crate::core::{Edition, Package, PackageId};
 use crate::util::{self, config, join_paths, process, CargoResult, Config, ProcessBuilder};

 /// Structure with enough information to run `rustdoc --test`.
 pub struct Doctest {
-    /// The package being doc-tested.
-    pub package: Package,
-    /// The target being tested (currently always the package's lib).
-    pub target: Target,
+    /// What's being doctested
+    pub unit: Unit,
     /// Arguments needed to pass to rustdoc to run this test.
     pub args: Vec<OsString>,
     /// Whether or not -Zunstable-options is needed.
@@ -26,11 +25,12 @@ pub struct Doctest {
 /// A structure returning the result of a compilation.
 pub struct Compilation<'cfg> {
     /// An array of all tests created during this compilation.
-    /// `(package, target, path_to_test_exe)`
-    pub tests: Vec<(Package, Target, PathBuf)>,
+    /// `(unit, path_to_test_exe)` where `unit` contains information such as the
+    /// package, compile target, etc.
+    pub tests: Vec<(Unit, PathBuf)>,
     /// An array of all binaries created.
-    pub binaries: Vec<PathBuf>,
+    pub binaries: Vec<(Unit, PathBuf)>,
     /// All directories for the output of native build commands.
     ///
@@ -41,20 +41,17 @@ pub struct Compilation<'cfg> {
     pub native_dirs: BTreeSet<PathBuf>,
     /// Root output directory (for the local package's artifacts)
-    pub root_output: PathBuf,
+    pub root_output: HashMap<CompileKind, PathBuf>,
     /// Output directory for rust dependencies.
     /// May be for the host or for a specific target.
-    pub deps_output: PathBuf,
-    /// Output directory for the rust host dependencies.
-    pub host_deps_output: PathBuf,
-    /// The path to rustc's own libstd
-    pub host_dylib_path: PathBuf,
-    /// The path to libstd for the target
-    pub target_dylib_path: PathBuf,
+    pub deps_output: HashMap<CompileKind, PathBuf>,
+    /// The path to the host libdir for the compiler used
+    sysroot_host_libdir: PathBuf,
+    /// The path to libstd for each target
+    sysroot_target_libdir: HashMap<CompileKind, PathBuf>,
     /// Extra environment variables that were passed to compilations and should
     /// be passed to future invocations of programs.
@@ -69,9 +66,6 @@ pub struct Compilation<'cfg> {
     /// Flags to pass to rustdoc when invoked from cargo test, per package.
     pub rustdocflags: HashMap<PackageId, Vec<String>>,
-    pub host: String,
-    pub target: String,
     config: &'cfg Config,
     /// Rustc process to be used by default
@@ -82,7 +76,7 @@ pub struct Compilation<'cfg> {
     /// rustc_workspace_wrapper_process
     primary_rustc_process: Option<ProcessBuilder>,
-    target_runner: Option<(PathBuf, Vec<String>)>,
+    target_runners: HashMap<CompileKind, Option<(PathBuf, Vec<String>)>>,
 }

 impl<'cfg> Compilation<'cfg> {
@@ -100,23 +94,28 @@ impl<'cfg> Compilation<'cfg> {
             }
         }
-        let default_kind = bcx.build_config.requested_kind;
         Ok(Compilation {
             // TODO: deprecated; remove.
             native_dirs: BTreeSet::new(),
-            root_output: PathBuf::from("/"),
-            deps_output: PathBuf::from("/"),
-            host_deps_output: PathBuf::from("/"),
-            host_dylib_path: bcx
+            root_output: HashMap::new(),
+            deps_output: HashMap::new(),
+            sysroot_host_libdir: bcx
                 .target_data
                 .info(CompileKind::Host)
                 .sysroot_host_libdir
                 .clone(),
-            target_dylib_path: bcx
-                .target_data
-                .info(default_kind)
-                .sysroot_target_libdir
-                .clone(),
+            sysroot_target_libdir: bcx
+                .build_config
+                .requested_kinds
+                .iter()
+                .chain(Some(&CompileKind::Host))
+                .map(|kind| {
+                    (
+                        *kind,
+                        bcx.target_data.info(*kind).sysroot_target_libdir.clone(),
+                    )
+                })
+                .collect(),
             tests: Vec::new(),
             binaries: Vec::new(),
             extra_env: HashMap::new(),
@@ -127,16 +126,20 @@ impl<'cfg> Compilation<'cfg> {
             rustc_process: rustc,
             rustc_workspace_wrapper_process,
             primary_rustc_process,
-            host: bcx.host_triple().to_string(),
-            target: bcx.target_data.short_name(&default_kind).to_string(),
-            target_runner: target_runner(bcx, default_kind)?,
+            target_runners: bcx
+                .build_config
+                .requested_kinds
+                .iter()
+                .chain(Some(&CompileKind::Host))
+                .map(|kind| Ok((*kind, target_runner(bcx, *kind)?)))
+                .collect::<CargoResult<HashMap<_, _>>>()?,
         })
     }

     /// See `process`.
     pub fn rustc_process(
         &self,
-        pkg: &Package,
+        unit: &Unit,
         is_primary: bool,
         is_workspace: bool,
     ) -> CargoResult<ProcessBuilder> {
@@ -148,17 +151,22 @@ impl<'cfg> Compilation<'cfg> {
             self.rustc_process.clone()
         };
-        self.fill_env(rustc, pkg, true)
+        self.fill_env(rustc, &unit.pkg, unit.kind, true)
     }

     /// See `process`.
-    pub fn rustdoc_process(&self, pkg: &Package, target: &Target) -> CargoResult<ProcessBuilder> {
-        let mut p = self.fill_env(process(&*self.config.rustdoc()?), pkg, false)?;
-        if target.edition() != Edition::Edition2015 {
-            p.arg(format!("--edition={}", target.edition()));
+    pub fn rustdoc_process(&self, unit: &Unit) -> CargoResult<ProcessBuilder> {
+        let mut p = self.fill_env(
+            process(&*self.config.rustdoc()?),
+            &unit.pkg,
+            unit.kind,
+            true,
+        )?;
+        if unit.target.edition() != Edition::Edition2015 {
+            p.arg(format!("--edition={}", unit.target.edition()));
         }
-        for crate_type in target.rustc_crate_types() {
+        for crate_type in unit.target.rustc_crate_types() {
             p.arg("--crate-type").arg(crate_type);
         }
@@ -171,20 +179,21 @@ impl<'cfg> Compilation<'cfg> {
         cmd: T,
         pkg: &Package,
     ) -> CargoResult<ProcessBuilder> {
-        self.fill_env(process(cmd), pkg, true)
+        self.fill_env(process(cmd), pkg, CompileKind::Host, false)
     }

-    pub fn target_runner(&self) -> &Option<(PathBuf, Vec<String>)> {
-        &self.target_runner
+    pub fn target_runner(&self, kind: CompileKind) -> Option<&(PathBuf, Vec<String>)> {
+        self.target_runners.get(&kind).and_then(|x| x.as_ref())
     }

     /// See `process`.
     pub fn target_process<T: AsRef<OsStr>>(
         &self,
         cmd: T,
+        kind: CompileKind,
         pkg: &Package,
     ) -> CargoResult<ProcessBuilder> {
-        let builder = if let Some((ref runner, ref args)) = *self.target_runner() {
+        let builder = if let Some((runner, args)) = self.target_runner(kind) {
             let mut builder = process(runner);
             builder.args(args);
             builder.arg(cmd);
@@ -192,7 +201,7 @@ impl<'cfg> Compilation<'cfg> {
         } else {
             process(cmd)
         };
-        self.fill_env(builder, pkg, false)
+        self.fill_env(builder, pkg, kind, false)
     }

     /// Prepares a new process with an appropriate environment to run against
@@ -204,26 +213,28 @@ impl<'cfg> Compilation<'cfg> {
         &self,
         mut cmd: ProcessBuilder,
         pkg: &Package,
-        is_host: bool,
+        kind: CompileKind,
+        is_rustc_tool: bool,
     ) -> CargoResult<ProcessBuilder> {
-        let mut search_path = if is_host {
-            let mut search_path = vec![self.host_deps_output.clone()];
-            search_path.push(self.host_dylib_path.clone());
-            search_path
+        let mut search_path = Vec::new();
+        if is_rustc_tool {
+            search_path.push(self.deps_output[&CompileKind::Host].clone());
+            search_path.push(self.sysroot_host_libdir.clone());
         } else {
-            let mut search_path =
-                super::filter_dynamic_search_path(self.native_dirs.iter(), &self.root_output);
-            search_path.push(self.deps_output.clone());
-            search_path.push(self.root_output.clone());
+            search_path.extend(super::filter_dynamic_search_path(
+                self.native_dirs.iter(),
+                &self.root_output[&kind],
+            ));
+            search_path.push(self.deps_output[&kind].clone());
+            search_path.push(self.root_output[&kind].clone());

             // For build-std, we don't want to accidentally pull in any shared
             // libs from the sysroot that ships with rustc. This may not be
             // required (at least I cannot craft a situation where it
// matters), but is here to be safe. // matters), but is here to be safe.
if self.config.cli_unstable().build_std.is_none() { if self.config.cli_unstable().build_std.is_none() {
search_path.push(self.target_dylib_path.clone()); search_path.push(self.sysroot_target_libdir[&kind].clone());
} }
search_path }
};
let dylib_path = util::dylib_path(); let dylib_path = util::dylib_path();
let dylib_path_is_empty = dylib_path.is_empty(); let dylib_path_is_empty = dylib_path.is_empty();
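The change from a single runner to a per-kind map can be sketched in isolation. This is a simplified stand-in for Cargo's `CompileKind` and `target_runners` field (the enum and struct names here are illustrative, not Cargo's real types):

```rust
use std::collections::HashMap;
use std::path::PathBuf;

// Simplified stand-in for Cargo's CompileKind.
#[derive(Clone, Copy, PartialEq, Eq, Hash)]
enum Kind {
    Host,
    Target(&'static str),
}

struct Compilation {
    // One (possibly absent) runner per requested kind.
    target_runners: HashMap<Kind, Option<(PathBuf, Vec<String>)>>,
}

impl Compilation {
    // Mirrors the new signature: look up by kind, flatten the inner Option.
    fn target_runner(&self, kind: Kind) -> Option<&(PathBuf, Vec<String>)> {
        self.target_runners.get(&kind).and_then(|x| x.as_ref())
    }
}

fn main() {
    let mut target_runners = HashMap::new();
    target_runners.insert(Kind::Host, None);
    target_runners.insert(
        Kind::Target("thumbv7m-none-eabi"),
        Some((PathBuf::from("qemu-runner"), vec!["--quiet".to_string()])),
    );
    let c = Compilation { target_runners };
    assert!(c.target_runner(Kind::Host).is_none());
    assert!(c.target_runner(Kind::Target("thumbv7m-none-eabi")).is_some());
}
```

The `get(&kind).and_then(|x| x.as_ref())` chain collapses the two layers ("is this kind known?" and "does it have a runner?") into a single `Option<&_>` for callers.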

View File

@@ -1,7 +1,9 @@
 use crate::core::{InternedString, Target};
 use crate::util::errors::{CargoResult, CargoResultExt};
 use crate::util::Config;
+use anyhow::bail;
 use serde::Serialize;
+use std::collections::BTreeSet;
 use std::path::Path;

 /// Indicator for how a unit is being compiled.
@@ -41,30 +43,42 @@ impl CompileKind {
         }
     }

-    /// Creates a new `CompileKind` based on the requested target.
+    /// Creates a new list of `CompileKind` based on the requested list of
+    /// targets.
     ///
-    /// If no target is given, this consults the config if the default is set.
-    /// Otherwise returns `CompileKind::Host`.
-    pub fn from_requested_target(
+    /// If no targets are given then this returns a single-element vector with
+    /// `CompileKind::Host`.
+    pub fn from_requested_targets(
         config: &Config,
-        target: Option<&str>,
-    ) -> CargoResult<CompileKind> {
-        let kind = match target {
-            Some(s) => CompileKind::Target(CompileTarget::new(s)?),
-            None => match &config.build_config()?.target {
-                Some(val) => {
-                    let value = if val.raw_value().ends_with(".json") {
-                        let path = val.clone().resolve_path(config);
-                        path.to_str().expect("must be utf-8 in toml").to_string()
-                    } else {
-                        val.raw_value().to_string()
-                    };
-                    CompileKind::Target(CompileTarget::new(&value)?)
-                }
-                None => CompileKind::Host,
-            },
+        targets: &[String],
+    ) -> CargoResult<Vec<CompileKind>> {
+        if targets.len() > 1 && !config.cli_unstable().multitarget {
+            bail!("specifying multiple `--target` flags requires `-Zmultitarget`")
+        }
+        if targets.len() != 0 {
+            return Ok(targets
+                .iter()
+                .map(|value| Ok(CompileKind::Target(CompileTarget::new(value)?)))
+                // First collect into a set to deduplicate any `--target` passed
+                // more than once...
+                .collect::<CargoResult<BTreeSet<_>>>()?
+                // ... then generate a flat list for everything else to use.
+                .into_iter()
+                .collect());
+        }
+        let kind = match &config.build_config()?.target {
+            Some(val) => {
+                let value = if val.raw_value().ends_with(".json") {
+                    let path = val.clone().resolve_path(config);
+                    path.to_str().expect("must be utf-8 in toml").to_string()
+                } else {
+                    val.raw_value().to_string()
+                };
+                CompileKind::Target(CompileTarget::new(&value)?)
+            }
+            None => CompileKind::Host,
        };
-        Ok(kind)
+        Ok(vec![kind])
     }
 }
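The deduplication trick in `from_requested_targets` (collect into an ordered set, then back into a flat list) can be demonstrated standalone; this sketch uses plain strings in place of `CompileTarget`:

```rust
use std::collections::BTreeSet;

// Deduplicate repeated `--target` values while producing a flat,
// deterministically ordered list, the same shape as the BTreeSet
// round-trip in `from_requested_targets`.
fn dedup_targets(targets: &[String]) -> Vec<String> {
    targets
        .iter()
        .cloned()
        .collect::<BTreeSet<_>>()
        .into_iter()
        .collect()
}

fn main() {
    let targets = vec![
        "x86_64-unknown-linux-gnu".to_string(),
        "wasm32-unknown-unknown".to_string(),
        "x86_64-unknown-linux-gnu".to_string(),
    ];
    let deduped = dedup_targets(&targets);
    // The duplicate is dropped; BTreeSet iteration is sorted, so the
    // result is stable regardless of the order flags were passed in.
    assert_eq!(deduped.len(), 2);
}
```

Using a `BTreeSet` rather than a `HashSet` is what makes the resulting order deterministic, which matters for reproducible build plans.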

View File

@@ -133,7 +133,7 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
             .map(|unit| (unit, LazyCell::new()))
             .collect();
         CompilationFiles {
-            ws: &cx.bcx.ws,
+            ws: cx.bcx.ws,
             host,
             target,
             export_dir: cx.bcx.build_config.export_dir.clone(),
@@ -332,7 +332,7 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
         // we don't want to link it up.
         if out_dir.ends_with("deps") {
             // Don't lift up library dependencies.
-            if unit.target.is_bin() || self.roots.contains(unit) {
+            if unit.target.is_bin() || self.roots.contains(unit) || unit.target.is_dylib() {
                 Some((
                     out_dir.parent().unwrap().to_owned(),
                     if unit.mode.is_any_test() {

View File

@@ -165,13 +165,13 @@ impl<'a, 'cfg> Context<'a, 'cfg> {
             let bindst = output.bin_dst();

             if unit.mode == CompileMode::Test {
-                self.compilation.tests.push((
-                    unit.pkg.clone(),
-                    unit.target.clone(),
-                    output.path.clone(),
-                ));
+                self.compilation
+                    .tests
+                    .push((unit.clone(), output.path.clone()));
             } else if unit.target.is_executable() {
-                self.compilation.binaries.push(bindst.clone());
+                self.compilation
+                    .binaries
+                    .push((unit.clone(), bindst.clone()));
             }
         }
@@ -199,8 +199,7 @@ impl<'a, 'cfg> Context<'a, 'cfg> {
             let mut unstable_opts = false;
             let args = compiler::extern_args(&self, unit, &mut unstable_opts)?;
             self.compilation.to_doc_test.push(compilation::Doctest {
-                package: unit.pkg.clone(),
-                target: unit.target.clone(),
+                unit: unit.clone(),
                 args,
                 unstable_opts,
             });
@@ -273,9 +272,11 @@ impl<'a, 'cfg> Context<'a, 'cfg> {
         let dest = self.bcx.profiles.get_dir_name();
         let host_layout = Layout::new(self.bcx.ws, None, &dest)?;
         let mut targets = HashMap::new();
-        if let CompileKind::Target(target) = self.bcx.build_config.requested_kind {
-            let layout = Layout::new(self.bcx.ws, Some(target), &dest)?;
-            targets.insert(target, layout);
+        for kind in self.bcx.build_config.requested_kinds.iter() {
+            if let CompileKind::Target(target) = *kind {
+                let layout = Layout::new(self.bcx.ws, Some(target), &dest)?;
+                targets.insert(target, layout);
+            }
         }
         self.primary_packages
             .extend(self.bcx.roots.iter().map(|u| u.pkg.package_id()));
@@ -302,12 +303,22 @@ impl<'a, 'cfg> Context<'a, 'cfg> {
                 .chain_err(|| "couldn't prepare build directories")?;
         }

-        self.compilation.host_deps_output = self.files_mut().host.deps().to_path_buf();
-
         let files = self.files.as_ref().unwrap();
-        let layout = files.layout(self.bcx.build_config.requested_kind);
-        self.compilation.root_output = layout.dest().to_path_buf();
-        self.compilation.deps_output = layout.deps().to_path_buf();
+        for &kind in self
+            .bcx
+            .build_config
+            .requested_kinds
+            .iter()
+            .chain(Some(&CompileKind::Host))
+        {
+            let layout = files.layout(kind);
+            self.compilation
+                .root_output
+                .insert(kind, layout.dest().to_path_buf());
+            self.compilation
+                .deps_output
+                .insert(kind, layout.deps().to_path_buf());
+        }
         Ok(())
     }

View File

@@ -387,12 +387,9 @@ fn build_work(cx: &mut Context<'_, '_>, unit: &Unit) -> CargoResult<Job> {
             // state informing what variables were discovered via our script as
             // well.
             paths::write(&output_file, &output.stdout)?;
-            log::debug!(
-                "rewinding custom script output mtime {:?} to {}",
-                output_file,
-                timestamp
-            );
-            filetime::set_file_times(output_file, timestamp, timestamp)?;
+            // This mtime shift allows Cargo to detect if a source file was
+            // modified in the middle of the build.
+            paths::set_file_time_no_err(output_file, timestamp);
             paths::write(&err_file, &output.stderr)?;
             paths::write(&root_output_file, util::path2bytes(&script_out_dir)?)?;
             let parsed_output =

View File

@@ -1209,7 +1209,12 @@ fn calculate_normal(cx: &mut Context<'_, '_>, unit: &Unit) -> CargoResult<Fingerprint> {
     let target_root = target_root(cx);
     let local = if unit.mode.is_doc() {
         // rustdoc does not have dep-info files.
-        let fingerprint = pkg_fingerprint(cx.bcx, &unit.pkg)?;
+        let fingerprint = pkg_fingerprint(cx.bcx, &unit.pkg).chain_err(|| {
+            format!(
+                "failed to determine package fingerprint for documenting {}",
+                unit.pkg
+            )
+        })?;
         vec![LocalFingerprint::Precalculated(fingerprint)]
     } else {
         let dep_info = dep_info_loc(cx, unit);
@@ -1270,7 +1275,18 @@ fn calculate_run_custom_build(cx: &mut Context<'_, '_>, unit: &Unit) -> CargoResult<Fingerprint> {
     // the whole crate.
     let (gen_local, overridden) = build_script_local_fingerprints(cx, unit);
     let deps = &cx.build_explicit_deps[unit];
-    let local = (gen_local)(deps, Some(&|| pkg_fingerprint(cx.bcx, &unit.pkg)))?.unwrap();
+    let local = (gen_local)(
+        deps,
+        Some(&|| {
+            pkg_fingerprint(cx.bcx, &unit.pkg).chain_err(|| {
+                format!(
+                    "failed to determine package fingerprint for build script for {}",
+                    unit.pkg
+                )
+            })
+        }),
+    )?
+    .unwrap();
     let output = deps.build_script_output.clone();

     // Include any dependencies of our execution, which is typically just the
@@ -1521,7 +1537,7 @@ fn compare_old_fingerprint(
         // update the mtime so other cleaners know we used it
         let t = FileTime::from_system_time(SystemTime::now());
         debug!("mtime-on-use forcing {:?} to {}", loc, t);
-        filetime::set_file_times(loc, t, t)?;
+        paths::set_file_time_no_err(loc, t);
     }
     let new_hash = new_fingerprint.hash();

View File

@@ -889,7 +889,7 @@ impl<'cfg> DrainState<'cfg> {
         artifact: Artifact,
         cx: &mut Context<'_, '_>,
     ) -> CargoResult<()> {
-        if unit.mode.is_run_custom_build() && cx.bcx.show_warnings(unit.pkg.package_id()) {
+        if unit.mode.is_run_custom_build() && unit.show_warnings(cx.bcx.config) {
            self.emit_warnings(None, unit, cx)?;
        }
        let unlocked = self.queue.finish(unit, &artifact);

View File

@@ -136,7 +136,7 @@ fn compile<'cfg>(
         };
         work.then(link_targets(cx, unit, false)?)
     } else {
-        let work = if cx.bcx.show_warnings(unit.pkg.package_id()) {
+        let work = if unit.show_warnings(bcx.config) {
             replay_output_cache(
                 unit.pkg.package_id(),
                 &unit.target,
@@ -223,6 +223,7 @@ fn rustc(cx: &mut Context<'_, '_>, unit: &Unit, exec: &Arc<dyn Executor>) -> CargoResult<Work> {
         .to_path_buf();
     let fingerprint_dir = cx.files().fingerprint_dir(unit);
     let script_metadata = cx.find_build_script_metadata(unit.clone());
+    let is_local = unit.is_local();

     return Ok(Work::new(move |state| {
         // Only at runtime have we discovered what the extra -L and -l
@@ -312,7 +313,7 @@ fn rustc(cx: &mut Context<'_, '_>, unit: &Unit, exec: &Arc<dyn Executor>) -> CargoResult<Work> {
                 &pkg_root,
                 &target_dir,
                 // Do not track source files in the fingerprint for registry dependencies.
-                current_id.source_id().is_path(),
+                is_local,
             )
             .chain_err(|| {
                 internal(format!(
@@ -320,8 +321,9 @@ fn rustc(cx: &mut Context<'_, '_>, unit: &Unit, exec: &Arc<dyn Executor>) -> CargoResult<Work> {
                     rustc_dep_info_loc.display()
                 ))
             })?;
-            debug!("rewinding mtime of {:?} to {}", dep_info_loc, timestamp);
-            filetime::set_file_times(dep_info_loc, timestamp, timestamp)?;
+            // This mtime shift allows Cargo to detect if a source file was
+            // modified in the middle of the build.
+            paths::set_file_time_no_err(dep_info_loc, timestamp);
         }

         Ok(())
@@ -537,7 +539,7 @@ fn prepare_rustc(
     let mut base = cx
         .compilation
-        .rustc_process(&unit.pkg, is_primary, is_workspace)?;
+        .rustc_process(unit, is_primary, is_workspace)?;
     if cx.bcx.config.cli_unstable().jobserver_per_rustc {
         let client = cx.new_jobserver()?;
         base.inherit_jobserver(&client);
@@ -553,7 +555,7 @@ fn prepare_rustc(
 fn rustdoc(cx: &mut Context<'_, '_>, unit: &Unit) -> CargoResult<Work> {
     let bcx = cx.bcx;
-    let mut rustdoc = cx.compilation.rustdoc_process(&unit.pkg, &unit.target)?;
+    let mut rustdoc = cx.compilation.rustdoc_process(unit)?;
     rustdoc.inherit_jobserver(&cx.jobserver);
     rustdoc.arg("--crate-name").arg(&unit.target.crate_name());
     add_path_args(bcx, unit, &mut rustdoc);
@@ -687,12 +689,12 @@ fn add_path_args(bcx: &BuildContext<'_, '_>, unit: &Unit, cmd: &mut ProcessBuilder) {
 fn add_cap_lints(bcx: &BuildContext<'_, '_>, unit: &Unit, cmd: &mut ProcessBuilder) {
     // If this is an upstream dep we don't want warnings from, turn off all
     // lints.
-    if !bcx.show_warnings(unit.pkg.package_id()) {
+    if !unit.show_warnings(bcx.config) {
         cmd.arg("--cap-lints").arg("allow");
     // If this is an upstream dep but we *do* want warnings, make sure that they
     // don't fail compilation.
-    } else if !unit.pkg.package_id().source_id().is_path() {
+    } else if !unit.is_local() {
         cmd.arg("--cap-lints").arg("warn");
     }
 }

View File

@@ -96,8 +96,7 @@ fn add_deps_for_unit(
     // Recursively traverse all transitive dependencies
     let unit_deps = Vec::from(cx.unit_deps(unit)); // Create vec due to mutable borrow.
     for dep in unit_deps {
-        let source_id = dep.unit.pkg.package_id().source_id();
-        if source_id.is_path() {
+        if unit.is_local() {
             add_deps_for_unit(deps, cx, &dep.unit, visited)?;
         }
     }

View File

@@ -34,7 +34,7 @@ pub fn parse_unstable_flag(value: Option<&str>) -> Vec<String> {
 pub fn resolve_std<'cfg>(
     ws: &Workspace<'cfg>,
     target_data: &RustcTargetData,
-    requested_target: CompileKind,
+    requested_targets: &[CompileKind],
     crates: &[String],
 ) -> CargoResult<(PackageSet<'cfg>, Resolve, ResolvedFeatures)> {
     let src_path = detect_sysroot_src_path(target_data)?;
@@ -107,7 +107,7 @@ pub fn resolve_std<'cfg>(
     let resolve = ops::resolve_ws_with_opts(
         &std_ws,
         target_data,
-        requested_target,
+        requested_targets,
         &opts,
         &specs,
         HasDevUnits::No,
@@ -126,11 +126,11 @@ pub fn generate_std_roots(
     crates: &[String],
     std_resolve: &Resolve,
     std_features: &ResolvedFeatures,
-    kind: CompileKind,
+    kinds: &[CompileKind],
     package_set: &PackageSet<'_>,
     interner: &UnitInterner,
     profiles: &Profiles,
-) -> CargoResult<Vec<Unit>> {
+) -> CargoResult<HashMap<CompileKind, Vec<Unit>>> {
     // Generate the root Units for the standard library.
     let std_ids = crates
         .iter()
@@ -138,29 +138,42 @@ pub fn generate_std_roots(
         .collect::<CargoResult<Vec<PackageId>>>()?;
     // Convert PackageId to Package.
     let std_pkgs = package_set.get_many(std_ids)?;
-    // Generate a list of Units.
-    std_pkgs
-        .into_iter()
-        .map(|pkg| {
-            let lib = pkg
-                .targets()
-                .iter()
-                .find(|t| t.is_lib())
-                .expect("std has a lib");
-            let unit_for = UnitFor::new_normal();
-            // I don't think we need to bother with Check here, the difference
-            // in time is minimal, and the difference in caching is
-            // significant.
-            let mode = CompileMode::Build;
-            let profile =
-                profiles.get_profile(pkg.package_id(), /*is_member*/ false, unit_for, mode);
-            let features =
-                std_features.activated_features(pkg.package_id(), FeaturesFor::NormalOrDev);
-            Ok(interner.intern(
-                pkg, lib, profile, kind, mode, features, /*is_std*/ true,
-            ))
-        })
-        .collect::<CargoResult<Vec<_>>>()
+    // Generate a map of Units for each kind requested.
+    let mut ret = HashMap::new();
+    for pkg in std_pkgs {
+        let lib = pkg
+            .targets()
+            .iter()
+            .find(|t| t.is_lib())
+            .expect("std has a lib");
+        let unit_for = UnitFor::new_normal();
+        // I don't think we need to bother with Check here, the difference
+        // in time is minimal, and the difference in caching is
+        // significant.
+        let mode = CompileMode::Build;
+        let profile = profiles.get_profile(
+            pkg.package_id(),
+            /*is_member*/ false,
+            /*is_local*/ false,
+            unit_for,
+            mode,
+        );
+        let features = std_features.activated_features(pkg.package_id(), FeaturesFor::NormalOrDev);

+        for kind in kinds {
+            let list = ret.entry(*kind).or_insert(Vec::new());
+            list.push(interner.intern(
+                pkg,
+                lib,
+                profile,
+                *kind,
+                mode,
+                features.clone(),
+                /*is_std*/ true,
+            ));
+        }
+    }
+    return Ok(ret);
 }
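The per-kind grouping in `generate_std_roots` is a straightforward use of `HashMap`'s entry API. A minimal sketch, with kinds and units reduced to strings:

```rust
use std::collections::HashMap;

// Build a map from kind to a list of values with entry().or_insert(),
// the same shape as generate_std_roots returning
// HashMap<CompileKind, Vec<Unit>>.
fn group_per_kind(
    kinds: &[&'static str],
    crates: &[&'static str],
) -> HashMap<&'static str, Vec<String>> {
    let mut ret = HashMap::new();
    for krate in crates {
        for kind in kinds {
            // Create the Vec the first time a kind is seen, then append.
            let list: &mut Vec<String> = ret.entry(*kind).or_insert(Vec::new());
            list.push(format!("{}@{}", krate, kind));
        }
    }
    ret
}

fn main() {
    let map = group_per_kind(&["host", "wasm32"], &["core", "std"]);
    // Every requested kind gets its own copy of every std crate.
    assert_eq!(map["host"].len(), 2);
    assert_eq!(map["wasm32"], vec!["core@wasm32", "std@wasm32"]);
}
```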
fn detect_sysroot_src_path(target_data: &RustcTargetData) -> CargoResult<PathBuf> { fn detect_sysroot_src_path(target_data: &RustcTargetData) -> CargoResult<PathBuf> {

View File

@@ -614,7 +614,13 @@ fn render_rustc_info(bcx: &BuildContext<'_, '_>) -> String {
         .lines()
         .next()
         .expect("rustc version");
-    let requested_target = bcx.target_data.short_name(&bcx.build_config.requested_kind);
+    let requested_target = bcx
+        .build_config
+        .requested_kinds
+        .iter()
+        .map(|kind| bcx.target_data.short_name(kind))
+        .collect::<Vec<_>>()
+        .join(", ");
     format!(
         "{}<br>Host: {}<br>Target: {}",
         version,
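The map-then-join rendering above is a common Rust idiom; a self-contained sketch with string kinds in place of `CompileKind`:

```rust
// Render multiple requested targets as one display string, as the
// timings report now does with map + collect + join.
fn short_names(kinds: &[&'static str]) -> String {
    kinds
        .iter()
        .map(|kind| kind.to_string())
        .collect::<Vec<_>>()
        .join(", ")
}

fn main() {
    assert_eq!(
        short_names(&["x86_64-unknown-linux-gnu", "wasm32-unknown-unknown"]),
        "x86_64-unknown-linux-gnu, wasm32-unknown-unknown"
    );
    // A single kind produces no separator at all.
    assert_eq!(short_names(&["x86_64-unknown-linux-gnu"]), "x86_64-unknown-linux-gnu");
}
```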

View File

@@ -2,6 +2,7 @@ use crate::core::compiler::{CompileKind, CompileMode};
 use crate::core::manifest::{LibKind, Target, TargetKind};
 use crate::core::{profiles::Profile, InternedString, Package};
 use crate::util::hex::short_hash;
+use crate::util::Config;
 use std::cell::RefCell;
 use std::collections::HashSet;
 use std::fmt;
@@ -67,6 +68,19 @@ impl UnitInner {
     pub fn requires_upstream_objects(&self) -> bool {
         self.mode.is_any_test() || self.target.kind().requires_upstream_objects()
     }
+
+    /// Returns whether or not this is a "local" package.
+    ///
+    /// A "local" package is one that the user can likely edit, or otherwise
+    /// wants warnings, etc.
+    pub fn is_local(&self) -> bool {
+        self.pkg.package_id().source_id().is_path() && !self.is_std
+    }
+
+    /// Returns whether or not warnings should be displayed for this unit.
+    pub fn show_warnings(&self, config: &Config) -> bool {
+        self.is_local() || config.extra_verbose()
+    }
 }

 impl Unit {
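The two new predicates compose a simple boolean policy: path-source packages are "local" unless they are the vendored standard library, and warnings show for local units or under extra verbosity. A minimal model (the struct and fields here are stand-ins, not Cargo's real `Unit`):

```rust
// A minimal model of the new Unit::is_local / show_warnings predicates.
struct Unit {
    is_path_source: bool, // package comes from a local path, not a registry
    is_std: bool,         // part of the -Zbuild-std standard library
}

impl Unit {
    fn is_local(&self) -> bool {
        // std is compiled from path sources in the sysroot, but it is not
        // something the user edits, so it is never "local".
        self.is_path_source && !self.is_std
    }

    fn show_warnings(&self, extra_verbose: bool) -> bool {
        self.is_local() || extra_verbose
    }
}

fn main() {
    let member = Unit { is_path_source: true, is_std: false };
    let std_unit = Unit { is_path_source: true, is_std: true };
    assert!(member.show_warnings(false));
    assert!(!std_unit.show_warnings(false));
    assert!(std_unit.show_warnings(true)); // extra verbosity shows everything
}
```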

View File

@@ -55,7 +55,7 @@ pub fn build_unit_dependencies<'a, 'cfg>(
     features: &'a ResolvedFeatures,
     std_resolve: Option<&'a (Resolve, ResolvedFeatures)>,
     roots: &[Unit],
-    std_roots: &[Unit],
+    std_roots: &HashMap<CompileKind, Vec<Unit>>,
     global_mode: CompileMode,
     target_data: &'a RustcTargetData,
     profiles: &'a Profiles,
@@ -108,14 +108,16 @@ pub fn build_unit_dependencies<'a, 'cfg>(
 /// Compute all the dependencies for the standard library.
 fn calc_deps_of_std(
     mut state: &mut State<'_, '_>,
-    std_roots: &[Unit],
+    std_roots: &HashMap<CompileKind, Vec<Unit>>,
 ) -> CargoResult<Option<UnitGraph>> {
     if std_roots.is_empty() {
         return Ok(None);
     }
     // Compute dependencies for the standard library.
     state.is_std = true;
-    deps_of_roots(std_roots, &mut state)?;
+    for roots in std_roots.values() {
+        deps_of_roots(roots, &mut state)?;
+    }
     state.is_std = false;
     Ok(Some(std::mem::replace(
         &mut state.unit_dependencies,
@@ -124,11 +126,15 @@ fn calc_deps_of_std(
 }

 /// Add the standard library units to the `unit_dependencies`.
-fn attach_std_deps(state: &mut State<'_, '_>, std_roots: &[Unit], std_unit_deps: UnitGraph) {
+fn attach_std_deps(
+    state: &mut State<'_, '_>,
+    std_roots: &HashMap<CompileKind, Vec<Unit>>,
+    std_unit_deps: UnitGraph,
+) {
     // Attach the standard library as a dependency of every target unit.
     for (unit, deps) in state.unit_dependencies.iter_mut() {
         if !unit.kind.is_host() && !unit.mode.is_run_custom_build() {
-            deps.extend(std_roots.iter().map(|unit| UnitDep {
+            deps.extend(std_roots[&unit.kind].iter().map(|unit| UnitDep {
                 unit: unit.clone(),
                 unit_for: UnitFor::new_normal(),
                 extern_crate_name: unit.pkg.name(),
@@ -574,10 +580,14 @@ fn new_unit_dep(
     kind: CompileKind,
     mode: CompileMode,
 ) -> CargoResult<UnitDep> {
-    let profile =
-        state
-            .profiles
-            .get_profile(pkg.package_id(), state.ws.is_member(pkg), unit_for, mode);
+    let is_local = pkg.package_id().source_id().is_path() && !state.is_std;
+    let profile = state.profiles.get_profile(
+        pkg.package_id(),
+        state.ws.is_member(pkg),
+        is_local,
+        unit_for,
+        mode,
+    );
     new_unit_dep_with_profile(state, parent, pkg, target, unit_for, kind, mode, profile)
 }

View File

@@ -352,6 +352,7 @@ pub struct CliUnstable {
     pub features: Option<Vec<String>>,
     pub crate_versions: bool,
     pub separate_nightlies: bool,
+    pub multitarget: bool,
 }

 impl CliUnstable {
@@ -430,6 +431,7 @@ impl CliUnstable {
             "features" => self.features = Some(parse_features(v)),
             "crate-versions" => self.crate_versions = parse_empty(k, v)?,
             "separate-nightlies" => self.separate_nightlies = parse_empty(k, v)?,
+            "multitarget" => self.multitarget = parse_empty(k, v)?,
             _ => bail!("unknown `-Z` flag specified: {}", k),
         }
View File

@@ -353,7 +353,7 @@ compact_debug! {
                         kinds.clone(),
                         self.src_path.path().unwrap().to_path_buf(),
                         self.edition,
-                    ).inner.clone(),
+                    ).inner,
                     format!("lib_target({:?}, {:?}, {:?}, {:?})",
                             self.name, kinds, self.src_path, self.edition),
                 )
@@ -366,21 +366,21 @@ compact_debug! {
                             &self.name,
                             path.to_path_buf(),
                             self.edition,
-                        ).inner.clone(),
+                        ).inner,
                         format!("custom_build_target({:?}, {:?}, {:?})",
                                 self.name, path, self.edition),
                     )
                 }
                 TargetSourcePath::Metabuild => {
                     (
-                        Target::metabuild_target(&self.name).inner.clone(),
+                        Target::metabuild_target(&self.name).inner,
                         format!("metabuild_target({:?})", self.name),
                     )
                 }
             }
         }
         _ => (
-            Target::new(self.src_path.clone(), self.edition).inner.clone(),
+            Target::new(self.src_path.clone(), self.edition).inner,
             format!("with_path({:?}, {:?})", self.src_path, self.edition),
         ),
     }

View File

@@ -472,7 +472,7 @@ impl<'cfg> PackageSet<'cfg> {
         resolve: &Resolve,
         root_ids: &[PackageId],
         has_dev_units: HasDevUnits,
-        requested_kind: CompileKind,
+        requested_kinds: &[CompileKind],
         target_data: &RustcTargetData,
     ) -> CargoResult<()> {
         fn collect_used_deps(
@@ -480,7 +480,7 @@ impl<'cfg> PackageSet<'cfg> {
             resolve: &Resolve,
             pkg_id: PackageId,
             has_dev_units: HasDevUnits,
-            requested_kind: CompileKind,
+            requested_kinds: &[CompileKind],
             target_data: &RustcTargetData,
         ) -> CargoResult<()> {
             if !used.insert(pkg_id) {
@@ -495,9 +495,11 @@ impl<'cfg> PackageSet<'cfg> {
                 // dependencies are used both for target and host. To tighten this
                 // up, this function would need to track "for_host" similar to how
                 // unit dependencies handles it.
-                if !target_data.dep_platform_activated(dep, requested_kind)
-                    && !target_data.dep_platform_activated(dep, CompileKind::Host)
-                {
+                let activated = requested_kinds
+                    .iter()
+                    .chain(Some(&CompileKind::Host))
+                    .any(|kind| target_data.dep_platform_activated(dep, *kind));
+                if !activated {
                     return false;
                 }
                 true
@@ -509,7 +511,7 @@ impl<'cfg> PackageSet<'cfg> {
                     resolve,
                     dep_id,
                     has_dev_units,
-                    requested_kind,
+                    requested_kinds,
                     target_data,
                 )?;
             }
@@ -527,7 +529,7 @@ impl<'cfg> PackageSet<'cfg> {
                 resolve,
                 *id,
                 has_dev_units,
-                requested_kind,
+                requested_kinds,
                 target_data,
             )?;
         }
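The `requested_kinds.iter().chain(...).any(...)` pattern above checks a dependency against every requested kind plus the host (for build scripts and proc-macros). A reduced sketch, with kinds as strings and platform activation as a caller-supplied closure:

```rust
// A dependency is kept if it activates for any requested kind, or for
// the host, mirroring the chained `any` in collect_used_deps.
fn activated(requested_kinds: &[&'static str], is_active_for: impl Fn(&str) -> bool) -> bool {
    requested_kinds
        .iter()
        .chain(Some(&"host"))
        .any(|kind| is_active_for(kind))
}

fn main() {
    // A windows-only dependency is kept when windows was requested...
    assert!(activated(&["windows", "linux"], |k| k == "windows"));
    // ...and a host-only dependency (e.g. for build scripts) is always kept.
    assert!(activated(&["linux"], |k| k == "host"));
    // A dependency for an unrequested platform is filtered out.
    assert!(!activated(&["linux"], |k| k == "macos"));
}
```

Chaining `Some(&"host")` onto the iterator avoids allocating a temporary list just to add the one extra element.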

View File

@@ -287,6 +287,7 @@ impl Profiles {
         &self,
         pkg_id: PackageId,
         is_member: bool,
+        is_local: bool,
         unit_for: UnitFor,
         mode: CompileMode,
     ) -> Profile {
@@ -360,7 +361,7 @@ impl Profiles {
         // itself (aka crates.io / git dependencies)
         //
         // (see also https://github.com/rust-lang/cargo/issues/3972)
-        if !pkg_id.source_id().is_path() {
+        if !is_local {
             profile.incremental = false;
         }
         profile.name = profile_name;

View File

@@ -175,7 +175,7 @@ impl ConflictCache {
         dep: &Dependency,
         must_contain: Option<PackageId>,
     ) -> Option<&ConflictMap> {
-        let out = self.find(dep, &|id| cx.is_active(id), must_contain, std::usize::MAX);
+        let out = self.find(dep, &|id| cx.is_active(id), must_contain, usize::MAX);
         if cfg!(debug_assertions) {
             if let Some(c) = &out {
                 assert!(cx.is_conflicting(None, c).is_some());
View File

@@ -246,8 +246,8 @@ impl ResolvedFeatures {
 pub struct FeatureResolver<'a, 'cfg> {
     ws: &'a Workspace<'cfg>,
     target_data: &'a RustcTargetData,
-    /// The platform to build for, requested by the user.
-    requested_target: CompileKind,
+    /// The platforms to build for, requested by the user.
+    requested_targets: &'a [CompileKind],
     resolve: &'a Resolve,
     package_set: &'a PackageSet<'cfg>,
     /// Options that change how the feature resolver operates.
@@ -269,7 +269,7 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
         package_set: &'a PackageSet<'cfg>,
         requested_features: &RequestedFeatures,
         specs: &[PackageIdSpec],
-        requested_target: CompileKind,
+        requested_targets: &[CompileKind],
         has_dev_units: HasDevUnits,
     ) -> CargoResult<ResolvedFeatures> {
         use crate::util::profile;
@@ -287,7 +287,7 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
         let mut r = FeatureResolver {
             ws,
             target_data,
-            requested_target,
+            requested_targets,
             resolve,
             package_set,
             opts,
@@ -536,8 +536,9 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
                     .dep_platform_activated(dep, CompileKind::Host);
             }
             // Not a build dependency, and not for a build script, so must be Target.
-            self.target_data
-                .dep_platform_activated(dep, self.requested_target)
+            self.requested_targets
+                .iter()
+                .any(|kind| self.target_data.dep_platform_activated(dep, *kind))
         };
         self.resolve
             .deps(pkg_id)


@@ -792,7 +792,9 @@ impl<'cfg> Workspace<'cfg> {
             if !manifest.patch().is_empty() {
                 emit_warning("patch")?;
             }
-            if manifest.resolve_behavior() != self.resolve_behavior {
+            if manifest.resolve_behavior().is_some()
+                && manifest.resolve_behavior() != self.resolve_behavior
+            {
                 // Only warn if they don't match.
                 emit_warning("resolver")?;
             }


@@ -23,7 +23,7 @@ pub struct CleanOptions<'a> {
     /// A list of packages to clean. If empty, everything is cleaned.
     pub spec: Vec<String>,
     /// The target arch triple to clean, or None for the host arch
-    pub target: Option<String>,
+    pub targets: Vec<String>,
     /// Whether to clean the release directory
     pub profile_specified: bool,
     /// Whether to clean the directory of a certain build profile
@@ -61,9 +61,9 @@ pub fn clean(ws: &Workspace<'_>, opts: &CleanOptions<'_>) -> CargoResult<()> {
     if opts.spec.is_empty() {
         return rm_rf(&target_dir.into_path_unlocked(), config);
     }
-    let mut build_config = BuildConfig::new(config, Some(1), &opts.target, CompileMode::Build)?;
+    let mut build_config = BuildConfig::new(config, Some(1), &opts.targets, CompileMode::Build)?;
     build_config.requested_profile = opts.requested_profile;
-    let target_data = RustcTargetData::new(ws, build_config.requested_kind)?;
+    let target_data = RustcTargetData::new(ws, &build_config.requested_kinds)?;
     // Resolve for default features. In the future, `cargo clean` should be rewritten
     // so that it doesn't need to guess filename hashes.
     let resolve_opts = ResolveOpts::new(
@@ -80,7 +80,7 @@ pub fn clean(ws: &Workspace<'_>, opts: &CleanOptions<'_>) -> CargoResult<()> {
     let ws_resolve = ops::resolve_ws_with_opts(
         ws,
         &target_data,
-        build_config.requested_kind,
+        &build_config.requested_kinds,
         &resolve_opts,
         &specs,
         HasDevUnits::Yes,
@@ -102,13 +102,18 @@ pub fn clean(ws: &Workspace<'_>, opts: &CleanOptions<'_>) -> CargoResult<()> {
         // Generate all relevant `Unit` targets for this package
         for target in pkg.targets() {
-            for kind in [CompileKind::Host, build_config.requested_kind].iter() {
+            for kind in build_config
+                .requested_kinds
+                .iter()
+                .chain(Some(&CompileKind::Host))
+            {
                 for mode in CompileMode::all_modes() {
                     for unit_for in UnitFor::all_values() {
                         let profile = if mode.is_run_custom_build() {
                             profiles.get_profile_run_custom_build(&profiles.get_profile(
                                 pkg.package_id(),
                                 ws.is_member(pkg),
+                                /*is_local*/ true,
                                 *unit_for,
                                 CompileMode::Build,
                             ))
@@ -116,6 +121,7 @@ pub fn clean(ws: &Workspace<'_>, opts: &CleanOptions<'_>) -> CargoResult<()> {
                             profiles.get_profile(
                                 pkg.package_id(),
                                 ws.is_member(pkg),
+                                /*is_local*/ true,
                                 *unit_for,
                                 *mode,
                             )
@@ -141,7 +147,7 @@ pub fn clean(ws: &Workspace<'_>, opts: &CleanOptions<'_>) -> CargoResult<()> {
         &features,
         None,
         &units,
-        &[],
+        &Default::default(),
         build_config.mode,
         &target_data,
         &profiles,


@@ -79,7 +79,7 @@ pub struct CompileOptions {
 impl<'a> CompileOptions {
     pub fn new(config: &Config, mode: CompileMode) -> CargoResult<CompileOptions> {
         Ok(CompileOptions {
-            build_config: BuildConfig::new(config, None, &None, mode)?,
+            build_config: BuildConfig::new(config, None, &[], mode)?,
             features: Vec::new(),
             all_features: false,
             no_default_features: false,
@@ -310,7 +310,7 @@ pub fn create_bcx<'a, 'cfg>(
         }
     }

-    let target_data = RustcTargetData::new(ws, build_config.requested_kind)?;
+    let target_data = RustcTargetData::new(ws, &build_config.requested_kinds)?;
     let specs = spec.to_package_id_specs(ws)?;
     let dev_deps = ws.require_optional_deps() || filter.need_dev_deps(build_config.mode);
@@ -323,7 +323,7 @@ pub fn create_bcx<'a, 'cfg>(
     let resolve = ops::resolve_ws_with_opts(
         ws,
         &target_data,
-        build_config.requested_kind,
+        &build_config.requested_kinds,
         &opts,
         &specs,
         has_dev_units,
@@ -341,14 +341,14 @@ pub fn create_bcx<'a, 'cfg>(
                 .shell()
                 .warn("-Zbuild-std does not currently fully support --build-plan")?;
         }
-        if build_config.requested_kind.is_host() {
+        if build_config.requested_kinds[0].is_host() {
             // TODO: This should eventually be fixed. Unfortunately it is not
             // easy to get the host triple in BuildConfig. Consider changing
             // requested_target to an enum, or some other approach.
             anyhow::bail!("-Zbuild-std requires --target");
         }
         let (std_package_set, std_resolve, std_features) =
-            standard_lib::resolve_std(ws, &target_data, build_config.requested_kind, crates)?;
+            standard_lib::resolve_std(ws, &target_data, &build_config.requested_kinds, crates)?;
         pkg_set.add_set(std_package_set);
         Some((std_resolve, std_features))
     } else {
@@ -413,13 +413,13 @@ pub fn create_bcx<'a, 'cfg>(
         ws,
         &to_builds,
         filter,
-        build_config.requested_kind,
+        &build_config.requested_kinds,
         build_config.mode,
         &resolve,
         &resolved_features,
         &pkg_set,
         &profiles,
-        &interner,
+        interner,
     )?;

     let std_roots = if let Some(crates) = &config.cli_unstable().build_std {
@@ -440,13 +440,13 @@ pub fn create_bcx<'a, 'cfg>(
             &crates,
             std_resolve,
             std_features,
-            build_config.requested_kind,
+            &build_config.requested_kinds,
             &pkg_set,
-            &interner,
+            interner,
             &profiles,
         )?
     } else {
-        Vec::new()
+        Default::default()
     };

     let mut extra_compiler_args = HashMap::new();
@@ -491,7 +491,7 @@ pub fn create_bcx<'a, 'cfg>(
         build_config.mode,
         &target_data,
         &profiles,
-        &interner,
+        interner,
     )?;

     let bcx = BuildContext::new(
@@ -694,7 +694,7 @@ fn generate_targets(
     ws: &Workspace<'_>,
     packages: &[&Package],
     filter: &CompileFilter,
-    default_arch_kind: CompileKind,
+    requested_kinds: &[CompileKind],
     mode: CompileMode,
     resolve: &Resolve,
     resolved_features: &features::ResolvedFeatures,
@@ -703,80 +703,90 @@ fn generate_targets(
     interner: &UnitInterner,
 ) -> CargoResult<Vec<Unit>> {
     let config = ws.config();
-    // Helper for creating a `Unit` struct.
-    let new_unit = |pkg: &Package, target: &Target, target_mode: CompileMode| {
-        let unit_for = if target_mode.is_any_test() {
-            // NOTE: the `UnitFor` here is subtle. If you have a profile
-            // with `panic` set, the `panic` flag is cleared for
-            // tests/benchmarks and their dependencies. If this
-            // was `normal`, then the lib would get compiled three
-            // times (once with panic, once without, and once with
-            // `--test`).
-            //
-            // This would cause a problem for doc tests, which would fail
-            // because `rustdoc` would attempt to link with both libraries
-            // at the same time. Also, it's probably not important (or
-            // even desirable?) for rustdoc to link with a lib with
-            // `panic` set.
-            //
-            // As a consequence, Examples and Binaries get compiled
-            // without `panic` set. This probably isn't a bad deal.
-            //
-            // Forcing the lib to be compiled three times during `cargo
-            // test` is probably also not desirable.
-            UnitFor::new_test(config)
-        } else if target.for_host() {
-            // Proc macro / plugin should not have `panic` set.
-            UnitFor::new_compiler()
-        } else {
-            UnitFor::new_normal()
-        };
-        // Custom build units are added in `build_unit_dependencies`.
-        assert!(!target.is_custom_build());
-        let target_mode = match target_mode {
-            CompileMode::Test => {
-                if target.is_example() && !filter.is_specific() && !target.tested() {
-                    // Examples are included as regular binaries to verify
-                    // that they compile.
-                    CompileMode::Build
-                } else {
-                    CompileMode::Test
-                }
-            }
-            CompileMode::Build => match *target.kind() {
-                TargetKind::Test => CompileMode::Test,
-                TargetKind::Bench => CompileMode::Bench,
-                _ => CompileMode::Build,
-            },
-            // `CompileMode::Bench` is only used to inform `filter_default_targets`
-            // which command is being used (`cargo bench`). Afterwards, tests
-            // and benches are treated identically. Switching the mode allows
-            // de-duplication of units that are essentially identical. For
-            // example, `cargo build --all-targets --release` creates the units
-            // (lib profile:bench, mode:test) and (lib profile:bench, mode:bench)
-            // and since these are the same, we want them to be de-duplicated in
-            // `unit_dependencies`.
-            CompileMode::Bench => CompileMode::Test,
-            _ => target_mode,
-        };
-        let kind = default_arch_kind.for_target(target);
-        let profile =
-            profiles.get_profile(pkg.package_id(), ws.is_member(pkg), unit_for, target_mode);
-        // No need to worry about build-dependencies, roots are never build dependencies.
-        let features_for = FeaturesFor::from_for_host(target.proc_macro());
-        let features =
-            Vec::from(resolved_features.activated_features(pkg.package_id(), features_for));
-        interner.intern(
-            pkg,
-            target,
-            profile,
-            kind,
-            target_mode,
-            features,
-            /*is_std*/ false,
-        )
-    };
+    // Helper for creating a list of `Unit` structures
+    let new_unit =
+        |units: &mut HashSet<Unit>, pkg: &Package, target: &Target, target_mode: CompileMode| {
+            let unit_for = if target_mode.is_any_test() {
+                // NOTE: the `UnitFor` here is subtle. If you have a profile
+                // with `panic` set, the `panic` flag is cleared for
+                // tests/benchmarks and their dependencies. If this
+                // was `normal`, then the lib would get compiled three
+                // times (once with panic, once without, and once with
+                // `--test`).
+                //
+                // This would cause a problem for doc tests, which would fail
+                // because `rustdoc` would attempt to link with both libraries
+                // at the same time. Also, it's probably not important (or
+                // even desirable?) for rustdoc to link with a lib with
+                // `panic` set.
+                //
+                // As a consequence, Examples and Binaries get compiled
+                // without `panic` set. This probably isn't a bad deal.
+                //
+                // Forcing the lib to be compiled three times during `cargo
+                // test` is probably also not desirable.
+                UnitFor::new_test(config)
+            } else if target.for_host() {
+                // Proc macro / plugin should not have `panic` set.
+                UnitFor::new_compiler()
+            } else {
+                UnitFor::new_normal()
+            };
+            // Custom build units are added in `build_unit_dependencies`.
+            assert!(!target.is_custom_build());
+            let target_mode = match target_mode {
+                CompileMode::Test => {
+                    if target.is_example() && !filter.is_specific() && !target.tested() {
+                        // Examples are included as regular binaries to verify
+                        // that they compile.
+                        CompileMode::Build
+                    } else {
+                        CompileMode::Test
+                    }
+                }
+                CompileMode::Build => match *target.kind() {
+                    TargetKind::Test => CompileMode::Test,
+                    TargetKind::Bench => CompileMode::Bench,
+                    _ => CompileMode::Build,
+                },
+                // `CompileMode::Bench` is only used to inform `filter_default_targets`
+                // which command is being used (`cargo bench`). Afterwards, tests
+                // and benches are treated identically. Switching the mode allows
+                // de-duplication of units that are essentially identical. For
+                // example, `cargo build --all-targets --release` creates the units
+                // (lib profile:bench, mode:test) and (lib profile:bench, mode:bench)
+                // and since these are the same, we want them to be de-duplicated in
+                // `unit_dependencies`.
+                CompileMode::Bench => CompileMode::Test,
+                _ => target_mode,
+            };
+
+            let is_local = pkg.package_id().source_id().is_path();
+            let profile = profiles.get_profile(
+                pkg.package_id(),
+                ws.is_member(pkg),
+                is_local,
+                unit_for,
+                target_mode,
+            );
+
+            // No need to worry about build-dependencies, roots are never build dependencies.
+            let features_for = FeaturesFor::from_for_host(target.proc_macro());
+            let features = resolved_features.activated_features(pkg.package_id(), features_for);
+
+            for kind in requested_kinds {
+                let unit = interner.intern(
+                    pkg,
+                    target,
+                    profile,
+                    kind.for_target(target),
+                    target_mode,
+                    features.clone(),
+                    /*is_std*/ false,
+                );
+                units.insert(unit);
+            }
+        };

     // Create a list of proposed targets.
     let mut proposals: Vec<Proposal<'_>> = Vec::new();
@@ -921,8 +931,7 @@ fn generate_targets(
                 None => Vec::new(),
             };
             if target.is_lib() || unavailable_features.is_empty() {
-                let unit = new_unit(pkg, target, mode);
-                units.insert(unit);
+                new_unit(&mut units, pkg, target, mode);
             } else if requires_features {
                 let required_features = target.required_features().unwrap();
                 let quoted_required_features: Vec<String> = required_features


@@ -25,12 +25,11 @@ pub fn doc(ws: &Workspace<'_>, options: &DocOptions) -> CargoResult<()> {
         options.compile_opts.all_features,
         !options.compile_opts.no_default_features,
     );
-    let requested_kind = options.compile_opts.build_config.requested_kind;
-    let target_data = RustcTargetData::new(ws, requested_kind)?;
+    let target_data = RustcTargetData::new(ws, &options.compile_opts.build_config.requested_kinds)?;
     let ws_resolve = ops::resolve_ws_with_opts(
         ws,
         &target_data,
-        requested_kind,
+        &options.compile_opts.build_config.requested_kinds,
         &opts,
         &specs,
         HasDevUnits::No,
@@ -69,15 +68,20 @@ pub fn doc(ws: &Workspace<'_>, options: &DocOptions) -> CargoResult<()> {
         }
     }

+    let open_kind = if options.open_result {
+        Some(options.compile_opts.build_config.single_requested_kind()?)
+    } else {
+        None
+    };
+
     let compilation = ops::compile(ws, &options.compile_opts)?;

-    if options.open_result {
+    if let Some(kind) = open_kind {
         let name = match names.first() {
             Some(s) => s.to_string(),
             None => return Ok(()),
         };
-        let path = compilation
-            .root_output
+        let path = compilation.root_output[&kind]
             .with_file_name("doc")
             .join(&name)
             .join("index.html");


@@ -1,4 +1,4 @@
-use crate::core::compiler::{BuildConfig, CompileMode, TargetInfo};
+use crate::core::compiler::{BuildConfig, CompileMode, RustcTargetData};
 use crate::core::{PackageSet, Resolve, Workspace};
 use crate::ops;
 use crate::util::CargoResult;
@@ -8,7 +8,7 @@ use std::collections::HashSet;
 pub struct FetchOptions<'a> {
     pub config: &'a Config,
     /// The target arch triple to fetch dependencies for
-    pub target: Option<String>,
+    pub targets: Vec<String>,
 }

 /// Executes `cargo fetch`.
@@ -21,14 +21,8 @@ pub fn fetch<'a>(
     let jobs = Some(1);
     let config = ws.config();
-    let build_config = BuildConfig::new(config, jobs, &options.target, CompileMode::Build)?;
-    let rustc = config.load_global_rustc(Some(ws))?;
-    let target_info = TargetInfo::new(
-        config,
-        build_config.requested_kind,
-        &rustc,
-        build_config.requested_kind,
-    )?;
+    let build_config = BuildConfig::new(config, jobs, &options.targets, CompileMode::Build)?;
+    let data = RustcTargetData::new(ws, &build_config.requested_kinds)?;

     let mut fetched_packages = HashSet::new();
     let mut deps_to_fetch = ws.members().map(|p| p.package_id()).collect::<Vec<_>>();
     let mut to_download = Vec::new();
@@ -43,20 +37,21 @@ pub fn fetch<'a>(
             .deps(id)
             .filter(|&(_id, deps)| {
                 deps.iter().any(|d| {
-                    // If no target was specified then all dependencies can
-                    // be fetched.
-                    let target = match options.target {
-                        Some(ref t) => t,
-                        None => return true,
-                    };
-                    // If this dependency is only available for certain
-                    // platforms, make sure we're only fetching it for that
-                    // platform.
-                    let platform = match d.platform() {
-                        Some(p) => p,
-                        None => return true,
-                    };
-                    platform.matches(target, target_info.cfg())
+                    // If no target was specified then all dependencies are
+                    // fetched.
+                    if options.targets.len() == 0 {
+                        return true;
+                    }
+
+                    // Otherwise we only download this dependency if any of the
+                    // requested platforms would match this dependency. Note
+                    // that this is a bit lossy because not all dependencies are
+                    // always compiled for all platforms, but it should be
+                    // "close enough" for now.
+                    build_config
+                        .requested_kinds
+                        .iter()
+                        .any(|kind| data.dep_platform_activated(d, *kind))
                 })
             })
             .map(|(id, _deps)| id);


@@ -376,7 +376,7 @@ fn install_one(
     let mut binaries: Vec<(&str, &Path)> = compile
         .binaries
         .iter()
-        .map(|bin| {
+        .map(|(_, bin)| {
             let name = bin.file_name().unwrap();
             if let Some(s) = name.to_str() {
                 Ok((s, bin.as_ref()))
@@ -612,7 +612,7 @@ fn make_ws_rustc_target<'cfg>(
     ws.set_require_optional_deps(false);
     let rustc = config.load_global_rustc(Some(&ws))?;
-    let target = match &opts.build_config.requested_kind {
+    let target = match &opts.build_config.single_requested_kind()? {
         CompileKind::Host => rustc.host.as_str().to_owned(),
         CompileKind::Target(target) => target.short_name().to_owned(),
     };


@@ -1,4 +1,4 @@
-use crate::core::compiler::{CompileKind, CompileTarget, RustcTargetData};
+use crate::core::compiler::{CompileKind, RustcTargetData};
 use crate::core::dependency::DepKind;
 use crate::core::resolver::{HasDevUnits, Resolve, ResolveOpts};
 use crate::core::{Dependency, InternedString, Package, PackageId, Workspace};
@@ -17,7 +17,7 @@ pub struct OutputMetadataOptions {
     pub all_features: bool,
     pub no_deps: bool,
     pub version: u32,
-    pub filter_platform: Option<String>,
+    pub filter_platforms: Vec<String>,
 }

 /// Loads the manifest, resolves the dependencies of the package to the concrete
@@ -105,11 +105,9 @@ fn build_resolve_graph(
 ) -> CargoResult<(Vec<Package>, MetadataResolve)> {
     // TODO: Without --filter-platform, features are being resolved for `host` only.
     // How should this work?
-    let requested_kind = match &metadata_opts.filter_platform {
-        Some(t) => CompileKind::Target(CompileTarget::new(t)?),
-        None => CompileKind::Host,
-    };
-    let target_data = RustcTargetData::new(ws, requested_kind)?;
+    let requested_kinds =
+        CompileKind::from_requested_targets(ws.config(), &metadata_opts.filter_platforms)?;
+    let target_data = RustcTargetData::new(ws, &requested_kinds)?;
     // Resolve entire workspace.
     let specs = Packages::All.to_package_id_specs(ws)?;
     let resolve_opts = ResolveOpts::new(
@@ -121,7 +119,7 @@ fn build_resolve_graph(
     let ws_resolve = ops::resolve_ws_with_opts(
         ws,
         &target_data,
-        requested_kind,
+        &requested_kinds,
         &resolve_opts,
         &specs,
         HasDevUnits::Yes,
@@ -147,7 +145,7 @@ fn build_resolve_graph(
             &ws_resolve.targeted_resolve,
             &package_map,
             &target_data,
-            requested_kind,
+            &requested_kinds,
         );
     }
     // Get a Vec of Packages.
@@ -168,7 +166,7 @@ fn build_resolve_graph_r(
     resolve: &Resolve,
     package_map: &HashMap<PackageId, Package>,
     target_data: &RustcTargetData,
-    requested_kind: CompileKind,
+    requested_kinds: &[CompileKind],
 ) {
     if node_map.contains_key(&pkg_id) {
         return;
@@ -177,12 +175,15 @@ fn build_resolve_graph_r(
     let deps: Vec<Dep> = resolve
         .deps(pkg_id)
-        .filter(|(_dep_id, deps)| match requested_kind {
-            CompileKind::Target(_) => deps
-                .iter()
-                .any(|dep| target_data.dep_platform_activated(dep, requested_kind)),
-            // No --filter-platform is interpreted as "all platforms".
-            CompileKind::Host => true,
+        .filter(|(_dep_id, deps)| {
+            if requested_kinds == &[CompileKind::Host] {
+                true
+            } else {
+                requested_kinds.iter().any(|kind| {
+                    deps.iter()
+                        .any(|dep| target_data.dep_platform_activated(dep, *kind))
+                })
+            }
         })
         .filter_map(|(dep_id, deps)| {
             let dep_kinds: Vec<_> = deps.iter().map(DepKindInfo::from).collect();
@@ -213,7 +214,7 @@ fn build_resolve_graph_r(
             resolve,
             package_map,
             target_data,
-            requested_kind,
+            requested_kinds,
         );
     }
 }


@@ -29,7 +29,7 @@ pub struct PackageOpts<'cfg> {
     pub allow_dirty: bool,
     pub verify: bool,
     pub jobs: Option<u32>,
-    pub target: Option<String>,
+    pub targets: Vec<String>,
     pub features: Vec<String>,
     pub all_features: bool,
     pub no_default_features: bool,
@@ -50,8 +50,17 @@ struct ArchiveFile {
 enum FileContents {
     /// Absolute path to the file on disk to add to the archive.
     OnDisk(PathBuf),
-    /// Contents of a file generated in memory.
-    Generated(String),
+    /// Generates a file.
+    Generated(GeneratedFile),
+}
+
+enum GeneratedFile {
+    /// Generates `Cargo.toml` by rewriting the original.
+    Manifest,
+    /// Generates `Cargo.lock` in some cases (like if there is a binary).
+    Lockfile,
+    /// Adds a `.cargo-vcs_info.json` file if in a (clean) git repo.
+    VcsInfo(String),
 }

 pub fn package(ws: &Workspace<'_>, opts: &PackageOpts<'_>) -> CargoResult<Option<FileLock>> {
@@ -71,8 +80,6 @@ pub fn package(ws: &Workspace<'_>, opts: &PackageOpts<'_>) -> CargoResult<Option
         check_metadata(pkg, config)?;
     }
-    verify_dependencies(pkg)?;
-
     if !pkg.manifest().exclude().is_empty() && !pkg.manifest().include().is_empty() {
         config.shell().warn(
             "both package.include and package.exclude are specified; \
@@ -100,6 +107,8 @@ pub fn package(ws: &Workspace<'_>, opts: &PackageOpts<'_>) -> CargoResult<Option
         return Ok(None);
     }

+    verify_dependencies(pkg)?;
+
     let filename = format!("{}-{}.crate", pkg.name(), pkg.version());
     let dir = ws.target_dir().join("package");
     let mut dst = {
@@ -156,11 +165,10 @@ fn build_ar_list(
                     rel_str: "Cargo.toml.orig".to_string(),
                     contents: FileContents::OnDisk(src_file),
                 });
-                let generated = pkg.to_registry_toml(ws)?;
                 result.push(ArchiveFile {
                     rel_path,
                     rel_str,
-                    contents: FileContents::Generated(generated),
+                    contents: FileContents::Generated(GeneratedFile::Manifest),
                 });
             }
             "Cargo.lock" => continue,
@@ -179,18 +187,17 @@ fn build_ar_list(
         }
     }
     if pkg.include_lockfile() {
-        let new_lock = build_lock(ws)?;
         result.push(ArchiveFile {
             rel_path: PathBuf::from("Cargo.lock"),
             rel_str: "Cargo.lock".to_string(),
-            contents: FileContents::Generated(new_lock),
+            contents: FileContents::Generated(GeneratedFile::Lockfile),
         });
     }
     if let Some(vcs_info) = vcs_info {
         result.push(ArchiveFile {
             rel_path: PathBuf::from(VCS_INFO_FILE),
             rel_str: VCS_INFO_FILE.to_string(),
-            contents: FileContents::Generated(vcs_info),
+            contents: FileContents::Generated(GeneratedFile::VcsInfo(vcs_info)),
         });
     }
     if let Some(license_file) = &pkg.manifest().metadata().license_file {
@@ -530,7 +537,12 @@ fn tar(
                     format!("could not archive source file `{}`", disk_path.display())
                 })?;
             }
-            FileContents::Generated(contents) => {
+            FileContents::Generated(generated_kind) => {
+                let contents = match generated_kind {
+                    GeneratedFile::Manifest => pkg.to_registry_toml(ws)?,
+                    GeneratedFile::Lockfile => build_lock(ws)?,
+                    GeneratedFile::VcsInfo(s) => s,
+                };
                 header.set_entry_type(EntryType::file());
                 header.set_mode(0o644);
                 header.set_mtime(
@@ -704,7 +716,7 @@ fn run_verify(ws: &Workspace<'_>, tar: &FileLock, opts: &PackageOpts<'_>) -> Car
     ops::compile_with_exec(
         &ws,
         &ops::CompileOptions {
-            build_config: BuildConfig::new(config, opts.jobs, &opts.target, CompileMode::Build)?,
+            build_config: BuildConfig::new(config, opts.jobs, &opts.targets, CompileMode::Build)?,
             features: opts.features.clone(),
             no_default_features: opts.no_default_features,
             all_features: opts.all_features,
@@ -823,25 +835,12 @@ fn check_filename(file: &Path, shell: &mut Shell) -> CargoResult<()> {
             file.display()
         )
     }
-    let mut check_windows = |name| -> CargoResult<()> {
-        if restricted_names::is_windows_reserved(name) {
-            shell.warn(format!(
-                "file {} is a reserved Windows filename, \
-                 it will not work on Windows platforms",
-                file.display()
-            ))?;
-        }
-        Ok(())
-    };
-    for component in file.iter() {
-        if let Some(component) = component.to_str() {
-            check_windows(component)?;
-        }
-    }
-    if file.extension().is_some() {
-        if let Some(stem) = file.file_stem().and_then(|s| s.to_str()) {
-            check_windows(stem)?;
-        }
+    if restricted_names::is_windows_reserved_path(file) {
+        shell.warn(format!(
+            "file {} is a reserved Windows filename, \
+             it will not work on Windows platforms",
+            file.display()
+        ))?;
     }
     Ok(())
 }


@@ -70,16 +70,19 @@ pub fn run(
         }
     }

+    // `cargo run` is only compatible with one `--target` flag at most
+    options.build_config.single_requested_kind()?;
+
     let compile = ops::compile(ws, options)?;
     assert_eq!(compile.binaries.len(), 1);
-    let exe = &compile.binaries[0];
+    let (unit, exe) = &compile.binaries[0];
     let exe = match exe.strip_prefix(config.cwd()) {
         Ok(path) if path.file_name() == Some(path.as_os_str()) => Path::new(".").join(path),
         Ok(path) => path.to_path_buf(),
         Err(_) => exe.to_path_buf(),
     };
     let pkg = bins[0].0;
-    let mut process = compile.target_process(exe, pkg)?;
+    let mut process = compile.target_process(exe, unit.kind, pkg)?;
     process.args(args).cwd(config.cwd());
     config.shell().status("Running", process.to_string())?;


@ -64,9 +64,7 @@ pub fn run_benches(
fn compile_tests<'a>(ws: &Workspace<'a>, options: &TestOptions) -> CargoResult<Compilation<'a>> { fn compile_tests<'a>(ws: &Workspace<'a>, options: &TestOptions) -> CargoResult<Compilation<'a>> {
let mut compilation = ops::compile(ws, &options.compile_opts)?; let mut compilation = ops::compile(ws, &options.compile_opts)?;
compilation compilation.tests.sort();
.tests
.sort_by(|a, b| (a.0.package_id(), &a.1, &a.2).cmp(&(b.0.package_id(), &b.1, &b.2)));
Ok(compilation) Ok(compilation)
} }
@ -78,16 +76,14 @@ fn run_unit_tests(
compilation: &Compilation<'_>, compilation: &Compilation<'_>,
) -> CargoResult<(Test, Vec<ProcessError>)> { ) -> CargoResult<(Test, Vec<ProcessError>)> {
let cwd = config.cwd(); let cwd = config.cwd();
let mut errors = Vec::new(); let mut errors = Vec::new();
for &(ref pkg, ref target, ref exe) in &compilation.tests { for (unit, exe) in compilation.tests.iter() {
let kind = target.kind(); let test = unit.target.name().to_string();
let test = target.name().to_string();
let exe_display = exe.strip_prefix(cwd).unwrap_or(exe).display(); let exe_display = exe.strip_prefix(cwd).unwrap_or(exe).display();
let mut cmd = compilation.target_process(exe, pkg)?; let mut cmd = compilation.target_process(exe, unit.kind, &unit.pkg)?;
cmd.args(test_args); cmd.args(test_args);
if target.harness() && config.shell().verbosity() == Verbosity::Quiet { if unit.target.harness() && config.shell().verbosity() == Verbosity::Quiet {
cmd.arg("--quiet"); cmd.arg("--quiet");
} }
config config
@@ -102,7 +98,12 @@ fn run_unit_tests(
match result { match result {
Err(e) => { Err(e) => {
let e = e.downcast::<ProcessError>()?; let e = e.downcast::<ProcessError>()?;
errors.push((kind.clone(), test.clone(), pkg.name().to_string(), e)); errors.push((
unit.target.kind().clone(),
test.clone(),
unit.pkg.name().to_string(),
e,
));
if !options.no_fail_fast { if !options.no_fail_fast {
break; break;
} }
@@ -137,48 +138,44 @@ fn run_doc_tests(
) -> CargoResult<(Test, Vec<ProcessError>)> { ) -> CargoResult<(Test, Vec<ProcessError>)> {
let mut errors = Vec::new(); let mut errors = Vec::new();
// The unstable doctest-xcompile feature enables both per-target-ignores and
// cross-compiling doctests. As a side effect, this feature also gates running
// doctests with runtools when target == host.
let doctest_xcompile = config.cli_unstable().doctest_xcompile;
let mut runtool: &Option<(std::path::PathBuf, Vec<String>)> = &None;
if doctest_xcompile {
runtool = compilation.target_runner();
} else if compilation.host != compilation.target {
return Ok((Test::Doc, errors));
}
for doctest_info in &compilation.to_doc_test { for doctest_info in &compilation.to_doc_test {
let Doctest { let Doctest {
package,
target,
args, args,
unstable_opts, unstable_opts,
unit,
} = doctest_info; } = doctest_info;
config.shell().status("Doc-tests", target.name())?;
let mut p = compilation.rustdoc_process(package, target)?;
p.arg("--test")
.arg(target.src_path().path().unwrap())
.arg("--crate-name")
.arg(&target.crate_name());
if doctest_xcompile { // Skip any `--target` tests unless `doctest-xcompile` is specified.
if let CompileKind::Target(target) = options.compile_opts.build_config.requested_kind { if !config.cli_unstable().doctest_xcompile && !unit.kind.is_host() {
continue;
}
config.shell().status("Doc-tests", unit.target.name())?;
let mut p = compilation.rustdoc_process(unit)?;
p.arg("--test")
.arg(unit.target.src_path().path().unwrap())
.arg("--crate-name")
.arg(&unit.target.crate_name());
if config.cli_unstable().doctest_xcompile {
if let CompileKind::Target(target) = unit.kind {
// use `rustc_target()` to properly handle JSON target paths // use `rustc_target()` to properly handle JSON target paths
p.arg("--target").arg(target.rustc_target()); p.arg("--target").arg(target.rustc_target());
} }
p.arg("-Zunstable-options"); p.arg("-Zunstable-options");
p.arg("--enable-per-target-ignores"); p.arg("--enable-per-target-ignores");
} if let Some((runtool, runtool_args)) = compilation.target_runner(unit.kind) {
p.arg("--runtool").arg(runtool);
if let Some((runtool, runtool_args)) = runtool { for arg in runtool_args {
p.arg("--runtool").arg(runtool); p.arg("--runtool-arg").arg(arg);
for arg in runtool_args { }
p.arg("--runtool-arg").arg(arg);
} }
} }
for &rust_dep in &[&compilation.deps_output] { for &rust_dep in &[
&compilation.deps_output[&unit.kind],
&compilation.deps_output[&CompileKind::Host],
] {
let mut arg = OsString::from("dependency="); let mut arg = OsString::from("dependency=");
arg.push(rust_dep); arg.push(rust_dep);
p.arg("-L").arg(arg); p.arg("-L").arg(arg);
@@ -188,17 +185,11 @@ fn run_doc_tests(
p.arg("-L").arg(native_dep); p.arg("-L").arg(native_dep);
} }
for &host_rust_dep in &[&compilation.host_deps_output] {
let mut arg = OsString::from("dependency=");
arg.push(host_rust_dep);
p.arg("-L").arg(arg);
}
for arg in test_args { for arg in test_args {
p.arg("--test-args").arg(arg); p.arg("--test-args").arg(arg);
} }
if let Some(cfgs) = compilation.cfgs.get(&package.package_id()) { if let Some(cfgs) = compilation.cfgs.get(&unit.pkg.package_id()) {
for cfg in cfgs.iter() { for cfg in cfgs.iter() {
p.arg("--cfg").arg(cfg); p.arg("--cfg").arg(cfg);
} }
@@ -212,7 +203,7 @@ fn run_doc_tests(
p.arg("-Zunstable-options"); p.arg("-Zunstable-options");
} }
if let Some(flags) = compilation.rustdocflags.get(&package.package_id()) { if let Some(flags) = compilation.rustdocflags.get(&unit.pkg.package_id()) {
p.args(flags); p.args(flags);
} }
config config
@@ -42,7 +42,7 @@ pub struct PublishOpts<'cfg> {
pub verify: bool, pub verify: bool,
pub allow_dirty: bool, pub allow_dirty: bool,
pub jobs: Option<u32>, pub jobs: Option<u32>,
pub target: Option<String>, pub targets: Vec<String>,
pub dry_run: bool, pub dry_run: bool,
pub registry: Option<String>, pub registry: Option<String>,
pub features: Vec<String>, pub features: Vec<String>,
@@ -88,7 +88,7 @@ pub fn publish(ws: &Workspace<'_>, opts: &PublishOpts<'_>) -> CargoResult<()> {
list: false, list: false,
check_metadata: true, check_metadata: true,
allow_dirty: opts.allow_dirty, allow_dirty: opts.allow_dirty,
target: opts.target.clone(), targets: opts.targets.clone(),
jobs: opts.jobs, jobs: opts.jobs,
features: opts.features.clone(), features: opts.features.clone(),
all_features: opts.all_features, all_features: opts.all_features,
@@ -811,7 +811,7 @@ fn get_source_id(
reg: Option<&String>, reg: Option<&String>,
) -> CargoResult<SourceId> { ) -> CargoResult<SourceId> {
match (reg, index) { match (reg, index) {
(Some(r), _) => SourceId::alt_registry(config, &r), (Some(r), _) => SourceId::alt_registry(config, r),
(_, Some(i)) => SourceId::for_registry(&i.into_url()?), (_, Some(i)) => SourceId::for_registry(&i.into_url()?),
_ => { _ => {
let map = SourceConfigMap::new(config)?; let map = SourceConfigMap::new(config)?;
@@ -75,7 +75,7 @@ pub fn resolve_ws<'a>(ws: &Workspace<'a>) -> CargoResult<(PackageSet<'a>, Resolv
pub fn resolve_ws_with_opts<'cfg>( pub fn resolve_ws_with_opts<'cfg>(
ws: &Workspace<'cfg>, ws: &Workspace<'cfg>,
target_data: &RustcTargetData, target_data: &RustcTargetData,
requested_target: CompileKind, requested_targets: &[CompileKind],
opts: &ResolveOpts, opts: &ResolveOpts,
specs: &[PackageIdSpec], specs: &[PackageIdSpec],
has_dev_units: HasDevUnits, has_dev_units: HasDevUnits,
@@ -127,7 +127,7 @@ pub fn resolve_ws_with_opts<'cfg>(
let pkg_set = get_resolved_packages(&resolved_with_overrides, registry)?; let pkg_set = get_resolved_packages(&resolved_with_overrides, registry)?;
let member_ids = ws let member_ids = ws
.members_with_features(&specs, &opts.features)? .members_with_features(specs, &opts.features)?
.into_iter() .into_iter()
.map(|(p, _fts)| p.package_id()) .map(|(p, _fts)| p.package_id())
.collect::<Vec<_>>(); .collect::<Vec<_>>();
@@ -135,8 +135,8 @@ pub fn resolve_ws_with_opts<'cfg>(
&resolved_with_overrides, &resolved_with_overrides,
&member_ids, &member_ids,
has_dev_units, has_dev_units,
requested_target, requested_targets,
&target_data, target_data,
)?; )?;
let resolved_features = FeatureResolver::resolve( let resolved_features = FeatureResolver::resolve(
@@ -146,7 +146,7 @@ pub fn resolve_ws_with_opts<'cfg>(
&pkg_set, &pkg_set,
&opts.features, &opts.features,
specs, specs,
requested_target, requested_targets,
has_dev_units, has_dev_units,
)?; )?;
@@ -251,7 +251,7 @@ pub fn build<'a>(
specs: &[PackageIdSpec], specs: &[PackageIdSpec],
requested_features: &RequestedFeatures, requested_features: &RequestedFeatures,
target_data: &RustcTargetData, target_data: &RustcTargetData,
requested_kind: CompileKind, requested_kinds: &[CompileKind],
package_map: HashMap<PackageId, &'a Package>, package_map: HashMap<PackageId, &'a Package>,
opts: &TreeOptions, opts: &TreeOptions,
) -> CargoResult<Graph<'a>> { ) -> CargoResult<Graph<'a>> {
@@ -261,19 +261,21 @@ pub fn build<'a>(
for (member, requested_features) in members_with_features { for (member, requested_features) in members_with_features {
let member_id = member.package_id(); let member_id = member.package_id();
let features_for = FeaturesFor::from_for_host(member.proc_macro()); let features_for = FeaturesFor::from_for_host(member.proc_macro());
let member_index = add_pkg( for kind in requested_kinds {
&mut graph, let member_index = add_pkg(
resolve, &mut graph,
resolved_features, resolve,
member_id, resolved_features,
features_for, member_id,
target_data, features_for,
requested_kind, target_data,
opts, *kind,
); opts,
if opts.graph_features { );
let fmap = resolve.summary(member_id).features(); if opts.graph_features {
add_cli_features(&mut graph, member_index, &requested_features, fmap); let fmap = resolve.summary(member_id).features();
add_cli_features(&mut graph, member_index, &requested_features, fmap);
}
} }
} }
if opts.graph_features { if opts.graph_features {
@@ -49,16 +49,16 @@ pub struct TreeOptions {
#[derive(PartialEq)] #[derive(PartialEq)]
pub enum Target { pub enum Target {
Host, Host,
Specific(String), Specific(Vec<String>),
All, All,
} }
impl Target { impl Target {
pub fn from_cli(target: Option<&str>) -> Target { pub fn from_cli(targets: Vec<String>) -> Target {
match target { match targets.len() {
None => Target::Host, 0 => Target::Host,
Some("all") => Target::All, 1 if targets[0] == "all" => Target::All,
Some(target) => Target::Specific(target.to_string()), _ => Target::Specific(targets),
} }
} }
} }
@@ -126,14 +126,14 @@ pub fn build_and_print(ws: &Workspace<'_>, opts: &TreeOptions) -> CargoResult<()
if opts.graph_features && opts.duplicates { if opts.graph_features && opts.duplicates {
bail!("the `-e features` flag does not support `--duplicates`"); bail!("the `-e features` flag does not support `--duplicates`");
} }
let requested_target = match &opts.target { let requested_targets = match &opts.target {
Target::All | Target::Host => None, Target::All | Target::Host => Vec::new(),
Target::Specific(t) => Some(t.as_ref()), Target::Specific(t) => t.clone(),
}; };
// TODO: Target::All is broken with -Zfeatures=itarget. To handle that properly, // TODO: Target::All is broken with -Zfeatures=itarget. To handle that properly,
// `FeatureResolver` will need to be taught what "all" means. // `FeatureResolver` will need to be taught what "all" means.
let requested_kind = CompileKind::from_requested_target(ws.config(), requested_target)?; let requested_kinds = CompileKind::from_requested_targets(ws.config(), &requested_targets)?;
let target_data = RustcTargetData::new(ws, requested_kind)?; let target_data = RustcTargetData::new(ws, &requested_kinds)?;
let specs = opts.packages.to_package_id_specs(ws)?; let specs = opts.packages.to_package_id_specs(ws)?;
let resolve_opts = ResolveOpts::new( let resolve_opts = ResolveOpts::new(
/*dev_deps*/ true, /*dev_deps*/ true,
@@ -152,7 +152,7 @@ pub fn build_and_print(ws: &Workspace<'_>, opts: &TreeOptions) -> CargoResult<()
let ws_resolve = ops::resolve_ws_with_opts( let ws_resolve = ops::resolve_ws_with_opts(
ws, ws,
&target_data, &target_data,
requested_kind, &requested_kinds,
&resolve_opts, &resolve_opts,
&specs, &specs,
has_dev, has_dev,
@@ -172,7 +172,7 @@ pub fn build_and_print(ws: &Workspace<'_>, opts: &TreeOptions) -> CargoResult<()
&specs, &specs,
&resolve_opts.features, &resolve_opts.features,
&target_data, &target_data,
requested_kind, &requested_kinds,
package_map, package_map,
opts, opts,
)?; )?;
@@ -96,6 +96,15 @@ impl<'cfg> PathSource<'cfg> {
/// are relevant for building this package, but it also contains logic to /// are relevant for building this package, but it also contains logic to
/// use other methods like .gitignore to filter the list of files. /// use other methods like .gitignore to filter the list of files.
pub fn list_files(&self, pkg: &Package) -> CargoResult<Vec<PathBuf>> { pub fn list_files(&self, pkg: &Package) -> CargoResult<Vec<PathBuf>> {
self._list_files(pkg).chain_err(|| {
format!(
"failed to determine list of files in {}",
pkg.root().display()
)
})
}
fn _list_files(&self, pkg: &Package) -> CargoResult<Vec<PathBuf>> {
let root = pkg.root(); let root = pkg.root();
let no_include_option = pkg.manifest().include().is_empty(); let no_include_option = pkg.manifest().include().is_empty();
@@ -111,17 +120,21 @@ impl<'cfg> PathSource<'cfg> {
} }
let ignore_include = include_builder.build()?; let ignore_include = include_builder.build()?;
let ignore_should_package = |relative_path: &Path| -> CargoResult<bool> { let ignore_should_package = |relative_path: &Path, is_dir: bool| -> CargoResult<bool> {
// "Include" and "exclude" options are mutually exclusive. // "Include" and "exclude" options are mutually exclusive.
if no_include_option { if no_include_option {
match ignore_exclude match ignore_exclude.matched_path_or_any_parents(relative_path, is_dir) {
.matched_path_or_any_parents(relative_path, /* is_dir */ false)
{
Match::None => Ok(true), Match::None => Ok(true),
Match::Ignore(_) => Ok(false), Match::Ignore(_) => Ok(false),
Match::Whitelist(_) => Ok(true), Match::Whitelist(_) => Ok(true),
} }
} else { } else {
if is_dir {
// Generally, include directives don't list every
// directory (nor should they!). Just skip all directory
// checks, and only check files.
return Ok(true);
}
match ignore_include match ignore_include
.matched_path_or_any_parents(relative_path, /* is_dir */ false) .matched_path_or_any_parents(relative_path, /* is_dir */ false)
{ {
@@ -132,7 +145,7 @@ impl<'cfg> PathSource<'cfg> {
} }
}; };
let mut filter = |path: &Path| -> CargoResult<bool> { let mut filter = |path: &Path, is_dir: bool| -> CargoResult<bool> {
let relative_path = path.strip_prefix(root)?; let relative_path = path.strip_prefix(root)?;
let rel = relative_path.as_os_str(); let rel = relative_path.as_os_str();
@@ -142,13 +155,13 @@ impl<'cfg> PathSource<'cfg> {
return Ok(true); return Ok(true);
} }
ignore_should_package(relative_path) ignore_should_package(relative_path, is_dir)
}; };
// Attempt Git-prepopulate only if no `include` (see rust-lang/cargo#4135). // Attempt Git-prepopulate only if no `include` (see rust-lang/cargo#4135).
if no_include_option { if no_include_option {
if let Some(result) = self.discover_git_and_list_files(pkg, root, &mut filter) { if let Some(result) = self.discover_git_and_list_files(pkg, root, &mut filter)? {
return result; return Ok(result);
} }
// no include option and not git repo discovered (see rust-lang/cargo#7183). // no include option and not git repo discovered (see rust-lang/cargo#7183).
return self.list_files_walk_except_dot_files_and_dirs(pkg, &mut filter); return self.list_files_walk_except_dot_files_and_dirs(pkg, &mut filter);
@@ -162,50 +175,53 @@ impl<'cfg> PathSource<'cfg> {
&self, &self,
pkg: &Package, pkg: &Package,
root: &Path, root: &Path,
filter: &mut dyn FnMut(&Path) -> CargoResult<bool>, filter: &mut dyn FnMut(&Path, bool) -> CargoResult<bool>,
) -> Option<CargoResult<Vec<PathBuf>>> { ) -> CargoResult<Option<Vec<PathBuf>>> {
// If this package is in a Git repository, then we really do want to let repo = match git2::Repository::discover(root) {
// query the Git repository as it takes into account items such as Ok(repo) => repo,
// `.gitignore`. We're not quite sure where the Git repository is, Err(e) => {
// however, so we do a bit of a probe. log::debug!(
// "could not discover git repo at or above {}: {}",
// We walk this package's path upwards and look for a sibling root.display(),
// `Cargo.toml` and `.git` directory. If we find one then we assume that e
// we're part of that repository. );
let mut cur = root; return Ok(None);
loop {
if cur.join("Cargo.toml").is_file() {
// If we find a Git repository next to this `Cargo.toml`, we still
// check to see if we are indeed part of the index. If not, then
// this is likely an unrelated Git repo, so keep going.
if let Ok(repo) = git2::Repository::open(cur) {
let index = match repo.index() {
Ok(index) => index,
Err(err) => return Some(Err(err.into())),
};
let path = root.strip_prefix(cur).unwrap().join("Cargo.toml");
if index.get_path(&path, 0).is_some() {
return Some(self.list_files_git(pkg, &repo, filter));
}
}
} }
// Don't cross submodule boundaries. };
if cur.join(".git").is_dir() { let index = repo
break; .index()
} .chain_err(|| format!("failed to open git index at {}", repo.path().display()))?;
match cur.parent() { let repo_root = repo.workdir().ok_or_else(|| {
Some(parent) => cur = parent, anyhow::format_err!(
None => break, "did not expect repo at {} to be bare",
repo.path().display()
)
})?;
let repo_relative_path = match paths::strip_prefix_canonical(root, repo_root) {
Ok(p) => p,
Err(e) => {
log::warn!(
"cannot determine if path `{:?}` is in git repo `{:?}`: {:?}",
root,
repo_root,
e
);
return Ok(None);
} }
};
let manifest_path = repo_relative_path.join("Cargo.toml");
if index.get_path(&manifest_path, 0).is_some() {
return Ok(Some(self.list_files_git(pkg, &repo, filter)?));
} }
None // Package Cargo.toml is not in git, don't use git to guide our selection.
Ok(None)
} }
fn list_files_git( fn list_files_git(
&self, &self,
pkg: &Package, pkg: &Package,
repo: &git2::Repository, repo: &git2::Repository,
filter: &mut dyn FnMut(&Path) -> CargoResult<bool>, filter: &mut dyn FnMut(&Path, bool) -> CargoResult<bool>,
) -> CargoResult<Vec<PathBuf>> { ) -> CargoResult<Vec<PathBuf>> {
warn!("list_files_git {}", pkg.package_id()); warn!("list_files_git {}", pkg.package_id());
let index = repo.index()?; let index = repo.index()?;
@@ -289,7 +305,10 @@ impl<'cfg> PathSource<'cfg> {
continue; continue;
} }
if is_dir.unwrap_or_else(|| file_path.is_dir()) { // `is_dir` is None for symlinks. The `unwrap` checks if the
// symlink points to a directory.
let is_dir = is_dir.unwrap_or_else(|| file_path.is_dir());
if is_dir {
warn!(" found submodule {}", file_path.display()); warn!(" found submodule {}", file_path.display());
let rel = file_path.strip_prefix(root)?; let rel = file_path.strip_prefix(root)?;
let rel = rel.to_str().ok_or_else(|| { let rel = rel.to_str().ok_or_else(|| {
@@ -307,7 +326,8 @@ impl<'cfg> PathSource<'cfg> {
PathSource::walk(&file_path, &mut ret, false, filter)?; PathSource::walk(&file_path, &mut ret, false, filter)?;
} }
} }
} else if (*filter)(&file_path)? { } else if (*filter)(&file_path, is_dir)? {
assert!(!is_dir);
// We found a file! // We found a file!
warn!(" found {}", file_path.display()); warn!(" found {}", file_path.display());
ret.push(file_path); ret.push(file_path);
@@ -338,29 +358,28 @@ impl<'cfg> PathSource<'cfg> {
fn list_files_walk_except_dot_files_and_dirs( fn list_files_walk_except_dot_files_and_dirs(
&self, &self,
pkg: &Package, pkg: &Package,
filter: &mut dyn FnMut(&Path) -> CargoResult<bool>, filter: &mut dyn FnMut(&Path, bool) -> CargoResult<bool>,
) -> CargoResult<Vec<PathBuf>> { ) -> CargoResult<Vec<PathBuf>> {
let root = pkg.root(); let root = pkg.root();
let mut exclude_dot_files_dir_builder = GitignoreBuilder::new(root); let mut exclude_dot_files_dir_builder = GitignoreBuilder::new(root);
exclude_dot_files_dir_builder.add_line(None, ".*")?; exclude_dot_files_dir_builder.add_line(None, ".*")?;
let ignore_dot_files_and_dirs = exclude_dot_files_dir_builder.build()?; let ignore_dot_files_and_dirs = exclude_dot_files_dir_builder.build()?;
let mut filter_ignore_dot_files_and_dirs = |path: &Path| -> CargoResult<bool> { let mut filter_ignore_dot_files_and_dirs =
let relative_path = path.strip_prefix(root)?; |path: &Path, is_dir: bool| -> CargoResult<bool> {
match ignore_dot_files_and_dirs let relative_path = path.strip_prefix(root)?;
.matched_path_or_any_parents(relative_path, /* is_dir */ false) match ignore_dot_files_and_dirs.matched_path_or_any_parents(relative_path, is_dir) {
{ Match::Ignore(_) => Ok(false),
Match::Ignore(_) => Ok(false), _ => filter(path, is_dir),
_ => filter(path), }
} };
};
self.list_files_walk(pkg, &mut filter_ignore_dot_files_and_dirs) self.list_files_walk(pkg, &mut filter_ignore_dot_files_and_dirs)
} }
fn list_files_walk( fn list_files_walk(
&self, &self,
pkg: &Package, pkg: &Package,
filter: &mut dyn FnMut(&Path) -> CargoResult<bool>, filter: &mut dyn FnMut(&Path, bool) -> CargoResult<bool>,
) -> CargoResult<Vec<PathBuf>> { ) -> CargoResult<Vec<PathBuf>> {
let mut ret = Vec::new(); let mut ret = Vec::new();
PathSource::walk(pkg.root(), &mut ret, true, filter)?; PathSource::walk(pkg.root(), &mut ret, true, filter)?;
@@ -371,12 +390,14 @@ impl<'cfg> PathSource<'cfg> {
path: &Path, path: &Path,
ret: &mut Vec<PathBuf>, ret: &mut Vec<PathBuf>,
is_root: bool, is_root: bool,
filter: &mut dyn FnMut(&Path) -> CargoResult<bool>, filter: &mut dyn FnMut(&Path, bool) -> CargoResult<bool>,
) -> CargoResult<()> { ) -> CargoResult<()> {
if !path.is_dir() { let is_dir = path.is_dir();
if (*filter)(path)? { if !is_root && !(*filter)(path, is_dir)? {
ret.push(path.to_path_buf()); return Ok(());
} }
if !is_dir {
ret.push(path.to_path_buf());
return Ok(()); return Ok(());
} }
// Don't recurse into any sub-packages that we have. // Don't recurse into any sub-packages that we have.
@@ -415,7 +436,12 @@ impl<'cfg> PathSource<'cfg> {
let mut max = FileTime::zero(); let mut max = FileTime::zero();
let mut max_path = PathBuf::new(); let mut max_path = PathBuf::new();
for file in self.list_files(pkg)? { for file in self.list_files(pkg).chain_err(|| {
format!(
"failed to determine the most recently modified file in {}",
pkg.root().display()
)
})? {
// An `fs::stat` error here is either because path is a // An `fs::stat` error here is either because path is a
// broken symlink, a permissions error, or a race // broken symlink, a permissions error, or a race
// condition where this path was `rm`-ed -- either way, // condition where this path was `rm`-ed -- either way,
@@ -178,7 +178,7 @@ use crate::sources::PathSource;
use crate::util::errors::CargoResultExt; use crate::util::errors::CargoResultExt;
use crate::util::hex; use crate::util::hex;
use crate::util::into_url::IntoUrl; use crate::util::into_url::IntoUrl;
use crate::util::{CargoResult, Config, Filesystem}; use crate::util::{restricted_names, CargoResult, Config, Filesystem};
const PACKAGE_SOURCE_LOCK: &str = ".cargo-ok"; const PACKAGE_SOURCE_LOCK: &str = ".cargo-ok";
pub const CRATES_IO_INDEX: &str = "https://github.com/rust-lang/crates.io-index"; pub const CRATES_IO_INDEX: &str = "https://github.com/rust-lang/crates.io-index";
@@ -495,11 +495,18 @@ impl<'cfg> RegistrySource<'cfg> {
prefix prefix
) )
} }
// Unpacking failed
// Once that's verified, unpack the entry as usual. let mut result = entry.unpack_in(parent).map_err(anyhow::Error::from);
entry if cfg!(windows) && restricted_names::is_windows_reserved_path(&entry_path) {
.unpack_in(parent) result = result.chain_err(|| {
.chain_err(|| format!("failed to unpack entry at `{}`", entry_path.display()))?; format!(
"`{}` appears to contain a reserved Windows path, \
it cannot be extracted on Windows",
entry_path.display()
)
});
}
result.chain_err(|| format!("failed to unpack entry at `{}`", entry_path.display()))?;
} }
// Write to the lock file to indicate that unpacking was successful. // Write to the lock file to indicate that unpacking was successful.
@@ -139,7 +139,7 @@ pub trait AppExt: Sized {
} }
fn arg_target_triple(self, target: &'static str) -> Self { fn arg_target_triple(self, target: &'static str) -> Self {
self._arg(opt("target", target).value_name("TRIPLE")) self._arg(multi_opt("target", target, "TRIPLE"))
} }
fn arg_target_dir(self) -> Self { fn arg_target_dir(self) -> Self {
@@ -321,8 +321,8 @@ pub trait ArgMatchesExt {
self.value_of_u32("jobs") self.value_of_u32("jobs")
} }
fn target(&self) -> Option<String> { fn targets(&self) -> Vec<String> {
self._value_of("target").map(|s| s.to_string()) self._values_of("target")
} }
fn get_profile_name( fn get_profile_name(
@@ -454,7 +454,7 @@ pub trait ArgMatchesExt {
} }
} }
let mut build_config = BuildConfig::new(config, self.jobs()?, &self.target(), mode)?; let mut build_config = BuildConfig::new(config, self.jobs()?, &self.targets(), mode)?;
build_config.message_format = message_format.unwrap_or(MessageFormat::Human); build_config.message_format = message_format.unwrap_or(MessageFormat::Human);
build_config.requested_profile = self.get_profile_name(config, "dev", profile_checking)?; build_config.requested_profile = self.get_profile_name(config, "dev", profile_checking)?;
build_config.build_plan = self._is_present("build-plan"); build_config.build_plan = self._is_present("build-plan");
@@ -34,31 +34,29 @@ impl<'a> Retry<'a> {
} }
fn maybe_spurious(err: &Error) -> bool { fn maybe_spurious(err: &Error) -> bool {
for e in err.chain() { if let Some(git_err) = err.downcast_ref::<git2::Error>() {
if let Some(git_err) = e.downcast_ref::<git2::Error>() { match git_err.class() {
match git_err.class() { git2::ErrorClass::Net | git2::ErrorClass::Os => return true,
git2::ErrorClass::Net | git2::ErrorClass::Os => return true, _ => (),
_ => (),
}
} }
if let Some(curl_err) = e.downcast_ref::<curl::Error>() { }
if curl_err.is_couldnt_connect() if let Some(curl_err) = err.downcast_ref::<curl::Error>() {
|| curl_err.is_couldnt_resolve_proxy() if curl_err.is_couldnt_connect()
|| curl_err.is_couldnt_resolve_host() || curl_err.is_couldnt_resolve_proxy()
|| curl_err.is_operation_timedout() || curl_err.is_couldnt_resolve_host()
|| curl_err.is_recv_error() || curl_err.is_operation_timedout()
|| curl_err.is_http2_error() || curl_err.is_recv_error()
|| curl_err.is_http2_stream_error() || curl_err.is_http2_error()
|| curl_err.is_ssl_connect_error() || curl_err.is_http2_stream_error()
|| curl_err.is_partial_file() || curl_err.is_ssl_connect_error()
{ || curl_err.is_partial_file()
return true; {
} return true;
} }
if let Some(not_200) = e.downcast_ref::<HttpNot200>() { }
if 500 <= not_200.code && not_200.code < 600 { if let Some(not_200) = err.downcast_ref::<HttpNot200>() {
return true; if 500 <= not_200.code && not_200.code < 600 {
} return true;
} }
} }
false false
@@ -393,3 +393,43 @@ fn _link_or_copy(src: &Path, dst: &Path) -> CargoResult<()> {
})?; })?;
Ok(()) Ok(())
} }
/// Changes the filesystem mtime (and atime if possible) for the given file.
///
/// This intentionally does not return an error, as this is sometimes not
/// supported on network filesystems. For the current uses in Cargo, this is a
/// "best effort" approach, and errors shouldn't be propagated.
pub fn set_file_time_no_err<P: AsRef<Path>>(path: P, time: FileTime) {
let path = path.as_ref();
match filetime::set_file_times(path, time, time) {
Ok(()) => log::debug!("set file mtime {} to {}", path.display(), time),
Err(e) => log::warn!(
"could not set mtime of {} to {}: {:?}",
path.display(),
time,
e
),
}
}
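Cargo's helper above uses the `filetime` crate; the same best-effort, log-and-continue pattern can be sketched with only the standard library (`File::set_modified`, stable since Rust 1.75). The name `set_mtime_no_err` is illustrative, not Cargo's:

```rust
use std::fs::File;
use std::path::Path;
use std::time::SystemTime;

// Best-effort mtime update: log and continue instead of returning an error,
// since some network filesystems reject timestamp changes.
fn set_mtime_no_err(path: &Path, time: SystemTime) {
    let result = File::options()
        .write(true)
        .open(path)
        .and_then(|f| f.set_modified(time));
    if let Err(e) = result {
        eprintln!("could not set mtime of {}: {:?}", path.display(), e);
    }
}

fn main() {
    let path = std::env::temp_dir().join("mtime_demo.txt");
    std::fs::write(&path, "x").unwrap();
    // Rewind the file's mtime to the Unix epoch; failures are only logged.
    set_mtime_no_err(&path, SystemTime::UNIX_EPOCH);
    let modified = std::fs::metadata(&path).unwrap().modified().unwrap();
    assert_eq!(modified, SystemTime::UNIX_EPOCH);
}
```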
/// Strips `base` from `path`.
///
/// This canonicalizes both paths before stripping. This is useful if the
/// paths are obtained in different ways, and one or the other may or may not
/// have been normalized in some way.
pub fn strip_prefix_canonical<P: AsRef<Path>>(
path: P,
base: P,
) -> Result<PathBuf, std::path::StripPrefixError> {
// Not all filesystems support canonicalize. Just ignore if it doesn't work.
let safe_canonicalize = |path: &Path| match path.canonicalize() {
Ok(p) => p,
Err(e) => {
log::warn!("cannot canonicalize {:?}: {:?}", path, e);
path.to_path_buf()
}
};
let canon_path = safe_canonicalize(path.as_ref());
let canon_base = safe_canonicalize(base.as_ref());
canon_path.strip_prefix(canon_base).map(|p| p.to_path_buf())
}
@@ -2,6 +2,7 @@
use crate::util::CargoResult; use crate::util::CargoResult;
use anyhow::bail; use anyhow::bail;
use std::path::Path;
/// Returns `true` if the name contains non-ASCII characters. /// Returns `true` if the name contains non-ASCII characters.
pub fn is_non_ascii_name(name: &str) -> bool { pub fn is_non_ascii_name(name: &str) -> bool {
@@ -81,3 +82,13 @@ pub fn validate_package_name(name: &str, what: &str, help: &str) -> CargoResult<
} }
Ok(()) Ok(())
} }
/// Check the entire path for names reserved in Windows.
pub fn is_windows_reserved_path(path: &Path) -> bool {
path.iter()
.filter_map(|component| component.to_str())
.any(|component| {
let stem = component.split('.').next().unwrap();
is_windows_reserved(stem)
})
}
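The new helper rejects a path if any component's stem (the text before the first `.`) is a reserved Windows device name, so `aux.c` is caught but `auxiliary.c` is not. A self-contained sketch of that check; the inlined `is_windows_reserved` stand-in below is abbreviated, while Cargo's real list also covers `COM1`-`COM9` and `LPT1`-`LPT9`:

```rust
use std::path::Path;

// Abbreviated stand-in for Cargo's `is_windows_reserved`; the real check
// also covers COM1-COM9 and LPT1-LPT9, case-insensitively.
fn is_windows_reserved(name: &str) -> bool {
    matches!(name.to_ascii_lowercase().as_str(), "con" | "prn" | "aux" | "nul")
}

// A path is rejected if any component's stem (text before the first `.`)
// is a reserved device name.
fn is_windows_reserved_path(path: &Path) -> bool {
    path.iter()
        .filter_map(|component| component.to_str())
        .any(|component| {
            let stem = component.split('.').next().unwrap();
            is_windows_reserved(stem)
        })
}

fn main() {
    assert!(is_windows_reserved_path(Path::new("src/aux.c")));
    assert!(is_windows_reserved_path(Path::new("NUL.txt")));
    assert!(!is_windows_reserved_path(Path::new("src/auxiliary.c")));
}
```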
@@ -48,7 +48,11 @@ alter the way the JSON messages are computed and rendered. See the description
of the `--message-format` option in the [build command documentation] for more of the `--message-format` option in the [build command documentation] for more
details. details.
If you are using Rust, the [cargo_metadata] crate can be used to parse these
messages.
[build command documentation]: ../commands/cargo-build.md [build command documentation]: ../commands/cargo-build.md
[cargo_metadata]: https://crates.io/crates/cargo_metadata
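Each JSON message is a single object per line on stdout, keyed by a `"reason"` field. A dependency-free sketch of filtering compiler diagnostics out of such a stream; a real tool should use a proper JSON parser such as the `cargo_metadata` crate mentioned above rather than this naive substring check:

```rust
// Filter `cargo build --message-format=json` output down to compiler
// diagnostics. Naive substring matching stands in for real JSON parsing.
fn compiler_messages(stream: &str) -> Vec<&str> {
    stream
        .lines()
        .filter(|line| line.contains("\"reason\":\"compiler-message\""))
        .collect()
}

fn main() {
    let stream = concat!(
        "{\"reason\":\"compiler-artifact\",\"target\":{\"name\":\"foo\"}}\n",
        "{\"reason\":\"compiler-message\",\"message\":{\"level\":\"warning\"}}\n",
        "{\"reason\":\"build-finished\",\"success\":true}",
    );
    let msgs = compiler_messages(stream);
    assert_eq!(msgs.len(), 1);
    println!("{} diagnostic(s)", msgs.len());
}
```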
#### Compiler messages #### Compiler messages
@@ -99,6 +99,25 @@ information from `.cargo/config.toml`. See the rustc issue for more information.
cargo test --target foo -Zdoctest-xcompile cargo test --target foo -Zdoctest-xcompile
``` ```
### multitarget
* Tracking Issue: [#8176](https://github.com/rust-lang/cargo/issues/8176)
This flag allows passing multiple `--target` flags to the selected `cargo`
subcommand. When multiple `--target` flags are passed, the selected build
targets will be built for each of the specified architectures.
For example, to compile a library for both 32-bit and 64-bit:
```
cargo build --target x86_64-unknown-linux-gnu --target i686-unknown-linux-gnu
```
or running tests for both targets:
```
cargo test --target x86_64-unknown-linux-gnu --target i686-unknown-linux-gnu
```
### Custom named profiles ### Custom named profiles
* Tracking Issue: [rust-lang/cargo#6988](https://github.com/rust-lang/cargo/issues/6988) * Tracking Issue: [rust-lang/cargo#6988](https://github.com/rust-lang/cargo/issues/6988)
@@ -3769,7 +3769,11 @@ fn cdylib_not_lifted() {
p.cargo("build").run(); p.cargo("build").run();
let files = if cfg!(windows) { let files = if cfg!(windows) {
vec!["foo.dll.lib", "foo.dll.exp", "foo.dll"] if cfg!(target_env = "msvc") {
vec!["foo.dll.lib", "foo.dll.exp", "foo.dll"]
} else {
vec!["libfoo.dll.a", "foo.dll"]
}
} else if cfg!(target_os = "macos") { } else if cfg!(target_os = "macos") {
vec!["libfoo.dylib"] vec!["libfoo.dylib"]
} else { } else {
@@ -3803,7 +3807,11 @@ fn cdylib_final_outputs() {
p.cargo("build").run(); p.cargo("build").run();
let files = if cfg!(windows) { let files = if cfg!(windows) {
vec!["foo_bar.dll.lib", "foo_bar.dll"] if cfg!(target_env = "msvc") {
vec!["foo_bar.dll.lib", "foo_bar.dll"]
} else {
vec!["foo_bar.dll", "libfoo_bar.dll.a"]
}
} else if cfg!(target_os = "macos") { } else if cfg!(target_os = "macos") {
vec!["libfoo_bar.dylib"] vec!["libfoo_bar.dylib"]
} else { } else {
@@ -1663,7 +1663,7 @@ fn build_script_with_dynamic_native_dependency() {
let src = root.join(&file); let src = root.join(&file);
let dst = out_dir.join(&file); let dst = out_dir.join(&file);
fs::copy(src, dst).unwrap(); fs::copy(src, dst).unwrap();
if cfg!(windows) { if cfg!(target_env = "msvc") {
fs::copy(root.join("builder.dll.lib"), fs::copy(root.join("builder.dll.lib"),
out_dir.join("builder.dll.lib")).unwrap(); out_dir.join("builder.dll.lib")).unwrap();
} }
@@ -3977,7 +3977,9 @@ fn links_interrupted_can_restart() {
fn build_script_scan_eacces() { fn build_script_scan_eacces() {
// build.rs causes a scan of the whole project, which can be a problem if // build.rs causes a scan of the whole project, which can be a problem if
// a directory is not accessible. // a directory is not accessible.
use cargo_test_support::git;
use std::os::unix::fs::PermissionsExt; use std::os::unix::fs::PermissionsExt;
let p = project() let p = project()
.file("src/lib.rs", "") .file("src/lib.rs", "")
.file("build.rs", "fn main() {}") .file("build.rs", "fn main() {}")
@@ -3985,12 +3987,21 @@ fn build_script_scan_eacces() {
.build(); .build();
let path = p.root().join("secrets"); let path = p.root().join("secrets");
fs::set_permissions(&path, fs::Permissions::from_mode(0)).unwrap(); fs::set_permissions(&path, fs::Permissions::from_mode(0)).unwrap();
// "Caused by" is a string from libc such as the following: // The last "Caused by" is a string from libc such as the following:
// Permission denied (os error 13) // Permission denied (os error 13)
p.cargo("build") p.cargo("build")
.with_stderr( .with_stderr(
"\ "\
[ERROR] cannot read \"[..]/foo/secrets\" [ERROR] failed to determine package fingerprint for build script for foo v0.0.1 ([..]/foo)
Caused by:
failed to determine the most recently modified file in [..]/foo
Caused by:
failed to determine list of files in [..]/foo
Caused by:
cannot read \"[..]/foo/secrets\"
Caused by: Caused by:
[..] [..]
@ -3998,5 +4009,28 @@ Caused by:
) )
.with_status(101) .with_status(101)
.run(); .run();
// Try `package.exclude` to skip a directory.
p.change_file(
"Cargo.toml",
r#"
[package]
name = "foo"
version = "0.0.1"
exclude = ["secrets"]
"#,
);
p.cargo("build").run();
// Try with git. This succeeds because the git status walker ignores
// directories it can't access.
p.change_file("Cargo.toml", &basic_manifest("foo", "0.0.1"));
p.build_dir().rm_rf();
let repo = git::init(&p.root());
git::add(&repo);
git::commit(&repo);
p.cargo("build").run();
// Restore permissions so that the directory can be deleted.
fs::set_permissions(&path, fs::Permissions::from_mode(0o755)).unwrap(); fs::set_permissions(&path, fs::Permissions::from_mode(0o755)).unwrap();
} }
@ -396,7 +396,6 @@ fn no_cross_doctests() {
 [COMPILING] foo v0.0.1 ([CWD])
 [FINISHED] test [unoptimized + debuginfo] target(s) in [..]
 [RUNNING] target/{triple}/debug/deps/foo-[..][EXE]
-[DOCTEST] foo
 ",
     triple = target
 ))
@ -8,52 +8,8 @@ use std::thread;

 use cargo_test_support::{project, slow_cpu_multiplier};

-#[cfg(unix)]
-fn enabled() -> bool {
-    true
-}
-
-// On Windows support for these tests is only enabled through the usage of job
-// objects. Support for nested job objects, however, was added in recent-ish
-// versions of Windows, so this test may not always be able to succeed.
-//
-// As a result, we try to add ourselves to a job object here to see if that
-// can succeed or not.
-#[cfg(windows)]
-fn enabled() -> bool {
-    use winapi::um::{handleapi, jobapi, jobapi2, processthreadsapi};
-    unsafe {
-        // If we're not currently in a job, then we can definitely run these
-        // tests.
-        let me = processthreadsapi::GetCurrentProcess();
-        let mut ret = 0;
-        let r = jobapi::IsProcessInJob(me, 0 as *mut _, &mut ret);
-        assert_ne!(r, 0);
-        if ret == ::winapi::shared::minwindef::FALSE {
-            return true;
-        }
-
-        // If we are in a job, then we can run these tests if we can be added to
-        // a nested job (as we're going to create a nested job no matter what as
-        // part of these tests).
-        //
-        // If we can't be added to a nested job, then these tests will
-        // definitely fail, and there's not much we can do about that.
-        let job = jobapi2::CreateJobObjectW(0 as *mut _, 0 as *const _);
-        assert!(!job.is_null());
-        let r = jobapi2::AssignProcessToJobObject(job, me);
-        handleapi::CloseHandle(job);
-        r != 0
-    }
-}
-
 #[cargo_test]
 fn ctrl_c_kills_everyone() {
-    if !enabled() {
-        return;
-    }
-
     let listener = TcpListener::bind("127.0.0.1:0").unwrap();
     let addr = listener.local_addr().unwrap();

@ -132,7 +88,7 @@ fn ctrl_c_kills_everyone() {
 }

 #[cfg(unix)]
-fn ctrl_c(child: &mut Child) {
+pub fn ctrl_c(child: &mut Child) {
     let r = unsafe { libc::kill(-(child.id() as i32), libc::SIGINT) };
     if r < 0 {
         panic!("failed to kill: {}", io::Error::last_os_error());
@ -140,6 +96,6 @@ fn ctrl_c(child: &mut Child) {
 }

 #[cfg(windows)]
-fn ctrl_c(child: &mut Child) {
+pub fn ctrl_c(child: &mut Child) {
     child.kill().unwrap();
 }
@ -1592,6 +1592,17 @@ fn resolver_enables_new_features() {
 p.cargo("run --bin a")
     .masquerade_as_nightly_cargo()
     .env("EXPECTED_FEATS", "1")
+    .with_stderr(
+        "\
+[UPDATING] [..]
+[DOWNLOADING] crates ...
+[DOWNLOADED] common [..]
+[COMPILING] common v1.0.0
+[COMPILING] a v0.1.0 [..]
+[FINISHED] [..]
+[RUNNING] `target/debug/a[EXE]`
+",
+    )
     .run();

 // only normal+dev
@ -10,6 +10,7 @@ use std::process::Stdio;
 use std::thread;
 use std::time::SystemTime;

+use super::death;
 use cargo_test_support::paths::{self, CargoPathExt};
 use cargo_test_support::registry::Package;
 use cargo_test_support::{basic_manifest, is_coarse_mtime, project, rustc_host, sleep_ms};
@ -2316,8 +2317,14 @@ LLVM version: 9.0
 fn linking_interrupted() {
     // Interrupt during the linking phase shouldn't leave test executable as "fresh".
-    let listener = TcpListener::bind("127.0.0.1:0").unwrap();
-    let addr = listener.local_addr().unwrap();
+
+    // This is used to detect when linking starts, then to pause the linker so
+    // that the test can kill cargo.
+    let link_listener = TcpListener::bind("127.0.0.1:0").unwrap();
+    let link_addr = link_listener.local_addr().unwrap();
+
+    // This is used to detect when rustc exits.
+    let rustc_listener = TcpListener::bind("127.0.0.1:0").unwrap();
+    let rustc_addr = rustc_listener.local_addr().unwrap();

     // Create a linker that we can interrupt.
     let linker = project()
@ -2326,8 +2333,6 @@ fn linking_interrupted() {
         .file(
             "src/main.rs",
             &r#"
-            use std::io::Read;
             fn main() {
                 // Figure out the output filename.
                 let output = match std::env::args().find(|a| a.starts_with("/OUT:")) {
@ -2346,43 +2351,79 @@ fn linking_interrupted() {
                 std::fs::write(&output, "").unwrap();
                 // Tell the test that we are ready to be interrupted.
                 let mut socket = std::net::TcpStream::connect("__ADDR__").unwrap();
-                // Wait for the test to tell us to exit.
-                let _ = socket.read(&mut [0; 1]);
+                // Wait for the test to kill us.
+                std::thread::sleep(std::time::Duration::new(60, 0));
             }
             "#
-            .replace("__ADDR__", &addr.to_string()),
+            .replace("__ADDR__", &link_addr.to_string()),
         )
         .build();
     linker.cargo("build").run();

+    // Create a wrapper around rustc that will tell us when rustc is finished.
+    let rustc = project()
+        .at("rustc-waiter")
+        .file("Cargo.toml", &basic_manifest("rustc-waiter", "1.0.0"))
+        .file(
+            "src/main.rs",
+            &r#"
+            fn main() {
+                let mut conn = None;
+                // Check for a normal build (not -vV or --print).
+                if std::env::args().any(|arg| arg == "t1") {
+                    // Tell the test that rustc has started.
+                    conn = Some(std::net::TcpStream::connect("__ADDR__").unwrap());
+                }
+                let status = std::process::Command::new("rustc")
+                    .args(std::env::args().skip(1))
+                    .status()
+                    .expect("rustc to run");
+                std::process::exit(status.code().unwrap_or(1));
+            }
+            "#
+            .replace("__ADDR__", &rustc_addr.to_string()),
+        )
+        .build();
+    rustc.cargo("build").run();

     // Build it once so that the fingerprint gets saved to disk.
     let p = project()
         .file("src/lib.rs", "")
         .file("tests/t1.rs", "")
         .build();
     p.cargo("test --test t1 --no-run").run();

     // Make a change, start a build, then interrupt it.
     p.change_file("src/lib.rs", "// modified");
     let linker_env = format!(
         "CARGO_TARGET_{}_LINKER",
         rustc_host().to_uppercase().replace('-', "_")
     );
+    // NOTE: This assumes that the paths to the linker or rustc are not in the
+    // fingerprint. But maybe they should be?
     let mut cmd = p
         .cargo("test --test t1 --no-run")
         .env(&linker_env, linker.bin("linker"))
+        .env("RUSTC", rustc.bin("rustc-waiter"))
         .build_command();
     let mut child = cmd
         .stdout(Stdio::null())
         .stderr(Stdio::null())
+        .env("__CARGO_TEST_SETSID_PLEASE_DONT_USE_ELSEWHERE", "1")
         .spawn()
         .unwrap();
+    // Wait for rustc to start.
+    let mut rustc_conn = rustc_listener.accept().unwrap().0;
     // Wait for linking to start.
-    let mut conn = listener.accept().unwrap().0;
+    drop(link_listener.accept().unwrap());

     // Interrupt the child.
-    child.kill().unwrap();
-    // Note: rustc and the linker are still running, let them exit here.
-    conn.write(b"X").unwrap();
+    death::ctrl_c(&mut child);
+    assert!(!child.wait().unwrap().success());
+    // Wait for rustc to exit. If we don't wait, then the command below could
+    // start while rustc is still being torn down.
+    let mut buf = [0];
+    drop(rustc_conn.read_exact(&mut buf));

     // Build again, shouldn't be fresh.
     p.cargo("test --test t1")
@ -10,7 +10,9 @@ use cargo_test_support::install::{
 };
 use cargo_test_support::paths;
 use cargo_test_support::registry::Package;
-use cargo_test_support::{basic_manifest, cargo_process, project, NO_SUCH_FILE_ERR_MSG};
+use cargo_test_support::{
+    basic_manifest, cargo_process, project, symlink_supported, t, NO_SUCH_FILE_ERR_MSG,
+};

 fn pkg(name: &str, vers: &str) {
Package::new(name, vers) Package::new(name, vers)
@ -1458,3 +1460,40 @@ fn git_install_reads_workspace_manifest() {
     .with_stderr_contains("  invalid type: integer `3`[..]")
     .run();
 }
+
+#[cargo_test]
+fn install_git_with_symlink_home() {
+    // Ensure that `cargo install` with a git repo is OK when CARGO_HOME is a
+    // symlink, and uses a build script.
+    if !symlink_supported() {
+        return;
+    }
+    let p = git::new("foo", |p| {
+        p.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
+            .file("src/main.rs", "fn main() {}")
+            // This triggers discover_git_and_list_files for detecting changed files.
+            .file("build.rs", "fn main() {}")
+    });
+    #[cfg(unix)]
+    use std::os::unix::fs::symlink;
+    #[cfg(windows)]
+    use std::os::windows::fs::symlink_dir as symlink;
+    let actual = paths::root().join("actual-home");
+    t!(std::fs::create_dir(&actual));
+    t!(symlink(&actual, paths::home().join(".cargo")));
+    cargo_process("install --git")
+        .arg(p.url().to_string())
+        .with_stderr(
+            "\
+[UPDATING] git repository [..]
+[INSTALLING] foo v1.0.0 [..]
+[COMPILING] foo v1.0.0 [..]
+[FINISHED] [..]
+[INSTALLING] [..]home/.cargo/bin/foo[..]
+[INSTALLED] package `foo [..]
+[WARNING] be sure to add [..]
+",
+        )
+        .run();
+}
@ -67,6 +67,7 @@ mod message_format;
 mod metabuild;
 mod metadata;
 mod minimal_versions;
+mod multitarget;
 mod net_config;
 mod new;
 mod offline;
@ -0,0 +1,144 @@
//! Tests for multiple `--target` flags to subcommands
use cargo_test_support::{basic_manifest, cross_compile, project, rustc_host};
#[cargo_test]
fn double_target_rejected() {
let p = project()
.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
.file("src/main.rs", "fn main() {}")
.build();
p.cargo("build --target a --target b")
.with_stderr("error: specifying multiple `--target` flags requires `-Zmultitarget`")
.with_status(101)
.run();
}
#[cargo_test]
fn simple_build() {
if cross_compile::disabled() {
return;
}
let t1 = cross_compile::alternate();
let t2 = rustc_host();
let p = project()
.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
.file("src/main.rs", "fn main() {}")
.build();
p.cargo("build -Z multitarget")
.arg("--target")
.arg(&t1)
.arg("--target")
.arg(&t2)
.masquerade_as_nightly_cargo()
.run();
assert!(p.target_bin(&t1, "foo").is_file());
assert!(p.target_bin(&t2, "foo").is_file());
}
#[cargo_test]
fn simple_test() {
if !cross_compile::can_run_on_host() {
return;
}
let t1 = cross_compile::alternate();
let t2 = rustc_host();
let p = project()
.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
.file("src/lib.rs", "fn main() {}")
.build();
p.cargo("test -Z multitarget")
.arg("--target")
.arg(&t1)
.arg("--target")
.arg(&t2)
.masquerade_as_nightly_cargo()
.with_stderr_contains(&format!("[RUNNING] [..]{}[..]", t1))
.with_stderr_contains(&format!("[RUNNING] [..]{}[..]", t2))
.run();
}
#[cargo_test]
fn simple_run() {
let p = project()
.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
.file("src/main.rs", "fn main() {}")
.build();
p.cargo("run -Z multitarget --target a --target b")
.with_stderr("error: only one `--target` argument is supported")
.with_status(101)
.masquerade_as_nightly_cargo()
.run();
}
#[cargo_test]
fn simple_doc() {
if cross_compile::disabled() {
return;
}
let t1 = cross_compile::alternate();
let t2 = rustc_host();
let p = project()
.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
.file("src/lib.rs", "//! empty lib")
.build();
p.cargo("doc -Z multitarget")
.arg("--target")
.arg(&t1)
.arg("--target")
.arg(&t2)
.masquerade_as_nightly_cargo()
.run();
assert!(p.build_dir().join(&t1).join("doc/foo/index.html").is_file());
assert!(p.build_dir().join(&t2).join("doc/foo/index.html").is_file());
}
#[cargo_test]
fn simple_check() {
if cross_compile::disabled() {
return;
}
let t1 = cross_compile::alternate();
let t2 = rustc_host();
let p = project()
.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
.file("src/main.rs", "fn main() {}")
.build();
p.cargo("check -Z multitarget")
.arg("--target")
.arg(&t1)
.arg("--target")
.arg(&t2)
.masquerade_as_nightly_cargo()
.run();
}
#[cargo_test]
fn same_value_twice() {
if cross_compile::disabled() {
return;
}
let t = rustc_host();
let p = project()
.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
.file("src/main.rs", "fn main() {}")
.build();
p.cargo("build -Z multitarget")
.arg("--target")
.arg(&t)
.arg("--target")
.arg(&t)
.masquerade_as_nightly_cargo()
.run();
assert!(p.target_bin(&t, "foo").is_file());
}
@ -20,6 +20,7 @@ fn binary_with_debug() {
         &["foo"],
         &["foo", "foo.dSYM"],
         &["foo.exe", "foo.pdb"],
+        &["foo.exe"],
     );
 }
@ -55,6 +56,7 @@ fn static_library_with_debug() {
         &["libfoo.a"],
         &["libfoo.a"],
         &["foo.lib"],
+        &["libfoo.a"],
     );
 }
@ -90,6 +92,7 @@ fn dynamic_library_with_debug() {
         &["libfoo.so"],
         &["libfoo.dylib"],
         &["foo.dll", "foo.dll.lib"],
+        &["foo.dll", "libfoo.dll.a"],
     );
 }
@ -124,6 +127,7 @@ fn rlib_with_debug() {
         &["libfoo.rlib"],
         &["libfoo.rlib"],
         &["libfoo.rlib"],
+        &["libfoo.rlib"],
     );
 }
@ -167,6 +171,7 @@ fn include_only_the_binary_from_the_current_package() {
         &["foo"],
         &["foo", "foo.dSYM"],
         &["foo.exe", "foo.pdb"],
+        &["foo.exe"],
     );
 }
@ -242,6 +247,7 @@ fn avoid_build_scripts() {
         &["a", "b"],
         &["a", "a.dSYM", "b", "b.dSYM"],
         &["a.exe", "a.pdb", "b.exe", "b.pdb"],
+        &["a.exe", "b.exe"],
     );
 }
@ -266,6 +272,7 @@ fn cargo_build_out_dir() {
         &["foo"],
         &["foo", "foo.dSYM"],
         &["foo.exe", "foo.pdb"],
+        &["foo.exe"],
     );
 }
@ -273,10 +280,15 @@ fn check_dir_contents(
     out_dir: &Path,
     expected_linux: &[&str],
     expected_mac: &[&str],
-    expected_win: &[&str],
+    expected_win_msvc: &[&str],
+    expected_win_gnu: &[&str],
 ) {
     let expected = if cfg!(target_os = "windows") {
-        expected_win
+        if cfg!(target_env = "msvc") {
+            expected_win_msvc
+        } else {
+            expected_win_gnu
+        }
     } else if cfg!(target_os = "macos") {
         expected_mac
     } else {
@ -1,12 +1,12 @@
 //! Tests for the `cargo package` command.

 use cargo_test_support::paths::CargoPathExt;
-use cargo_test_support::registry::Package;
+use cargo_test_support::publish::validate_crate_contents;
+use cargo_test_support::registry::{self, Package};
 use cargo_test_support::{
-    basic_manifest, cargo_process, git, path2url, paths, project, publish::validate_crate_contents,
-    registry, symlink_supported, t,
+    basic_manifest, cargo_process, git, path2url, paths, project, symlink_supported, t,
 };
-use std::fs::{read_to_string, File};
+use std::fs::{self, read_to_string, File};
 use std::path::Path;

 #[cargo_test]
@ -1691,3 +1691,157 @@ fn package_restricted_windows() {
     )
     .run();
 }
#[cargo_test]
fn finds_git_in_parent() {
// Test where `Cargo.toml` is not in the root of the git repo.
let repo_path = paths::root().join("repo");
fs::create_dir(&repo_path).unwrap();
let p = project()
.at("repo/foo")
.file("Cargo.toml", &basic_manifest("foo", "0.1.0"))
.file("src/lib.rs", "")
.build();
let repo = git::init(&repo_path);
git::add(&repo);
git::commit(&repo);
p.change_file("ignoreme", "");
p.change_file("ignoreme2", "");
p.cargo("package --list --allow-dirty")
.with_stdout(
"\
Cargo.toml
Cargo.toml.orig
ignoreme
ignoreme2
src/lib.rs
",
)
.run();
p.change_file(".gitignore", "ignoreme");
p.cargo("package --list --allow-dirty")
.with_stdout(
"\
.gitignore
Cargo.toml
Cargo.toml.orig
ignoreme2
src/lib.rs
",
)
.run();
fs::write(repo_path.join(".gitignore"), "ignoreme2").unwrap();
p.cargo("package --list --allow-dirty")
.with_stdout(
"\
.gitignore
Cargo.toml
Cargo.toml.orig
src/lib.rs
",
)
.run();
}
#[cargo_test]
#[cfg(windows)]
fn reserved_windows_name() {
Package::new("bar", "1.0.0")
.file("src/lib.rs", "pub mod aux;")
.file("src/aux.rs", "")
.publish();
let p = project()
.file(
"Cargo.toml",
r#"
[project]
name = "foo"
version = "0.0.1"
authors = []
license = "MIT"
description = "foo"
[dependencies]
bar = "1.0.0"
"#,
)
.file("src/main.rs", "extern crate bar;\nfn main() { }")
.build();
p.cargo("package")
.with_status(101)
.with_stderr_contains(
"\
error: failed to verify package tarball
Caused by:
failed to download replaced source registry `[..]`
Caused by:
failed to unpack package `[..] `[..]`)`
Caused by:
failed to unpack entry at `[..]aux.rs`
Caused by:
`[..]aux.rs` appears to contain a reserved Windows path, it cannot be extracted on Windows
Caused by:
failed to unpack `[..]aux.rs`
Caused by:
failed to unpack `[..]aux.rs` into `[..]aux.rs`",
)
.run();
}
#[cargo_test]
fn list_with_path_and_lock() {
// Allow --list even for something that isn't packageable.
// Init an empty registry because a versionless path dep will search for
// the package on crates.io.
registry::init();
let p = project()
.file(
"Cargo.toml",
r#"
[package]
name = "foo"
version = "0.1.0"
license = "MIT"
description = "foo"
homepage = "foo"
[dependencies]
bar = {path="bar"}
"#,
)
.file("src/main.rs", "fn main() {}")
.file("bar/Cargo.toml", &basic_manifest("bar", "0.1.0"))
.file("bar/src/lib.rs", "")
.build();
p.cargo("package --list")
.with_stdout(
"\
Cargo.lock
Cargo.toml
Cargo.toml.orig
src/main.rs
",
)
.run();
p.cargo("package")
.with_status(101)
.with_stderr(
"\
error: all path dependencies must have a version specified when packaging.
dependency `bar` does not specify a version.
",
)
.run();
}
@ -180,7 +180,7 @@ fn plugin_with_dynamic_native_dependency() {
     let src = root.join(&file);
     let dst = out_dir.join(&file);
     fs::copy(src, dst).unwrap();
-    if cfg!(windows) {
+    if cfg!(target_env = "msvc") {
         fs::copy(root.join("builder.dll.lib"),
                  out_dir.join("builder.dll.lib")).unwrap();
     }
@ -435,5 +435,5 @@ fn shared_panic_abort_plugins() {
         .file("baz/src/lib.rs", "")
         .build();

-    p.cargo("build").run();
+    p.cargo("build -v").run();
 }
@ -405,7 +405,8 @@ fn named_config_profile() {
     let dep_pkg = PackageId::new("dep", "0.1.0", crates_io).unwrap();

     // normal package
-    let p = profiles.get_profile(a_pkg, true, UnitFor::new_normal(), CompileMode::Build);
+    let mode = CompileMode::Build;
+    let p = profiles.get_profile(a_pkg, true, true, UnitFor::new_normal(), mode);
     assert_eq!(p.name, "foo");
     assert_eq!(p.codegen_units, Some(2)); // "foo" from config
     assert_eq!(p.opt_level, "1"); // "middle" from manifest
@ -414,7 +415,7 @@ fn named_config_profile() {
     assert_eq!(p.overflow_checks, true); // "dev" built-in (ignore package override)

     // build-override
-    let bo = profiles.get_profile(a_pkg, true, UnitFor::new_host(false), CompileMode::Build);
+    let bo = profiles.get_profile(a_pkg, true, true, UnitFor::new_host(false), mode);
     assert_eq!(bo.name, "foo");
     assert_eq!(bo.codegen_units, Some(6)); // "foo" build override from config
     assert_eq!(bo.opt_level, "1"); // SAME as normal
@ -423,7 +424,7 @@ fn named_config_profile() {
     assert_eq!(bo.overflow_checks, true); // SAME as normal

     // package overrides
-    let po = profiles.get_profile(dep_pkg, false, UnitFor::new_normal(), CompileMode::Build);
+    let po = profiles.get_profile(dep_pkg, false, true, UnitFor::new_normal(), mode);
     assert_eq!(po.name, "foo");
     assert_eq!(po.codegen_units, Some(7)); // "foo" package override from config
     assert_eq!(po.opt_level, "1"); // SAME as normal
@ -297,8 +297,8 @@ fn no_warn_workspace_extras() {
         .cwd("a")
         .with_stderr(
             "\
-[UPDATING] `[..]` index
 [PACKAGING] a v0.1.0 ([..])
+[UPDATING] `[..]` index
 ",
         )
         .run();
@ -328,10 +328,10 @@ fn warn_package_with_yanked() {
     p.cargo("package --no-verify")
         .with_stderr(
             "\
+[PACKAGING] foo v0.0.1 ([..])
 [UPDATING] `[..]` index
 [WARNING] package `bar v0.1.0` in Cargo.lock is yanked in registry \
 `crates.io`, consider updating to a version that is not yanked
-[PACKAGING] foo v0.0.1 ([..])
 ",
         )
         .run();
@ -469,6 +469,7 @@ fn ignore_lockfile_inner() {
             "\
 [PACKAGING] bar v0.0.1 ([..])
 [ARCHIVING] .cargo_vcs_info.json
+[ARCHIVING] .gitignore
 [ARCHIVING] Cargo.lock
 [ARCHIVING] Cargo.toml
 [ARCHIVING] Cargo.toml.orig
@ -359,13 +359,18 @@ fn package_with_path_deps() {
         .file("notyet/src/lib.rs", "")
         .build();

-    p.cargo("package -v")
+    p.cargo("package")
         .with_status(101)
         .with_stderr_contains(
             "\
-[ERROR] no matching package named `notyet` found
-location searched: registry [..]
-required by package `foo v0.0.1 ([..])`
+[PACKAGING] foo [..]
+[UPDATING] [..]
+[ERROR] failed to prepare local package for uploading
+
+Caused by:
+  no matching package named `notyet` found
+  location searched: registry `https://github.com/rust-lang/crates.io-index`
+  required by package `foo v0.0.1 [..]`
 ",
         )
         .run();
@ -375,8 +380,8 @@ required by package `foo v0.0.1 ([..])`
     p.cargo("package")
         .with_stderr(
             "\
-[UPDATING] `[..]` index
 [PACKAGING] foo v0.0.1 ([CWD])
+[UPDATING] `[..]` index
 [VERIFYING] foo v0.0.1 ([CWD])
 [DOWNLOADING] crates ...
 [DOWNLOADED] notyet v0.0.1 (registry `[ROOT][..]`)
@ -849,6 +849,7 @@ fn run_with_library_paths() {
             fn main() {{
                 let search_path = std::env::var_os("{}").unwrap();
                 let paths = std::env::split_paths(&search_path).collect::<Vec<_>>();
+                println!("{{:#?}}", paths);
                 assert!(paths.contains(&r#"{}"#.into()));
                 assert!(paths.contains(&r#"{}"#.into()));
             }}
@ -6,9 +6,9 @@ use std::env;
 #[cargo_test]
 fn rustc_info_cache() {
-    // TODO: need to gate this on nightly as soon as -Cbitcode-in-rlib lands in
-    // nightly
-    if true {
+    // Needs `-Cbitcode-in-rlib` to ride to stable before this can be enabled
+    // everywhere.
+    if !cargo_test_support::is_nightly() {
         return;
     }
@ -21,6 +21,12 @@ fn setup() -> Option<Setup> {
         return None;
     }

+    if cfg!(all(target_os = "windows", target_env = "gnu")) {
+        // FIXME: contains object files that we don't handle yet:
+        // https://github.com/rust-lang/wg-cargo-std-aware/issues/46
+        return None;
+    }
+
     // Our mock sysroot requires a few packages from crates.io, so make sure
     // they're "published" to crates.io. Also edit their code a bit to make sure
     // that they have access to our custom crates with custom apis.
@ -297,7 +303,7 @@ fn lib_nostd() {
             r#"
             #![no_std]
             pub fn foo() {
-                assert_eq!(core::u8::MIN, 0);
+                assert_eq!(u8::MIN, 0);
             }
             "#,
         )
@ -512,7 +518,7 @@ fn doctest() {
         )
         .build();

-    p.cargo("test --doc -v")
+    p.cargo("test --doc -v -Zdoctest-xcompile")
         .build_std(&setup)
         .with_stdout_contains("test src/lib.rs - f [..] ... ok")
         .target_host()
@ -570,3 +576,31 @@ fn macro_expanded_shadow() {
     p.cargo("build -v").build_std(&setup).target_host().run();
 }
#[cargo_test]
fn ignores_incremental() {
// Incremental is not really needed for std, make sure it is disabled.
// Incremental also tends to have bugs that affect std libraries more than
// any other crate.
let setup = match setup() {
Some(s) => s,
None => return,
};
let p = project().file("src/lib.rs", "").build();
p.cargo("build")
.env("CARGO_INCREMENTAL", "1")
.build_std(&setup)
.target_host()
.run();
let incremental: Vec<_> = p
.glob(format!("target/{}/debug/incremental/*", rustc_host()))
.map(|e| e.unwrap())
.collect();
assert_eq!(incremental.len(), 1);
assert!(incremental[0]
.file_name()
.unwrap()
.to_str()
.unwrap()
.starts_with("foo-"));
}
@ -129,7 +129,7 @@ fn cargo_test_overflow_checks() {
             use std::panic;
             pub fn main() {
                 let r = panic::catch_unwind(|| {
-                    [1, i32::max_value()].iter().sum::<i32>();
+                    [1, i32::MAX].iter().sum::<i32>();
                 });
                 assert!(r.is_err());
             }"#,