+Version 1.8.0 (2016-04-14)
+==========================
+
+Language
+--------
+
+* Rust supports overloading of compound assignment operators like
+ `+=` by implementing the [`AddAssign`], [`SubAssign`],
+ [`MulAssign`], [`DivAssign`], [`RemAssign`], [`BitAndAssign`],
+ [`BitOrAssign`], [`BitXorAssign`], [`ShlAssign`], or [`ShrAssign`]
+ traits. [RFC 953].
+* Empty structs can be defined with braces, as in `struct Foo { }`, in
+ addition to the non-braced form, `struct Foo;`. [RFC 218].
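The new `*Assign` traits let user-defined types opt into the compound assignment operators. A minimal sketch (the `Point` type here is hypothetical, not part of the release):

```rust
use std::ops::AddAssign;

#[derive(Debug, Clone, Copy, PartialEq)]
struct Point {
    x: i32,
    y: i32,
}

// Implementing `AddAssign` makes `+=` work on `Point` values.
impl AddAssign for Point {
    fn add_assign(&mut self, other: Point) {
        self.x += other.x;
        self.y += other.y;
    }
}

fn main() {
    let mut p = Point { x: 1, y: 2 };
    p += Point { x: 3, y: 4 };
    assert_eq!(p, Point { x: 4, y: 6 });
}
```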
+
+Libraries
+---------
+
+* Stabilized APIs:
+ * [`str::encode_utf16`][] (renamed from `utf16_units`)
+ * [`str::EncodeUtf16`][] (renamed from `Utf16Units`)
+ * [`Ref::map`]
+ * [`RefMut::map`]
+ * [`ptr::drop_in_place`]
+ * [`time::Instant`]
+ * [`time::SystemTime`]
+ * [`Instant::now`]
+ * [`Instant::duration_since`][] (renamed from `duration_from_earlier`)
+ * [`Instant::elapsed`]
+ * [`SystemTime::now`]
+ * [`SystemTime::duration_since`][] (renamed from `duration_from_earlier`)
+ * [`SystemTime::elapsed`]
+ * Various `Add`/`Sub` impls for `Instant` and `SystemTime`
+ * [`SystemTimeError`]
+ * [`SystemTimeError::duration`]
+ * Various impls for `SystemTimeError`
+ * [`UNIX_EPOCH`]
+ * [`AddAssign`], [`SubAssign`], [`MulAssign`], [`DivAssign`],
+ [`RemAssign`], [`BitAndAssign`], [`BitOrAssign`],
+ [`BitXorAssign`], [`ShlAssign`], [`ShrAssign`].
+* [The `write!` and `writeln!` macros correctly emit errors if any of
+ their arguments can't be formatted][1.8w].
+* [Various I/O functions support large files on 32-bit Linux][1.8l].
+* [The Unix-specific `raw` modules, which contain a number of
+ redefined C types, are deprecated][1.8r], including `os::raw::unix`,
+ `os::raw::macos`, and `os::raw::linux`. These modules defined types
+ such as `ino_t` and `dev_t`. The inconsistency of these definitions
+ across platforms was making it difficult to implement `std`
+ correctly. Those that need these definitions should use the `libc`
+ crate. [RFC 1415].
+* The Unix-specific `MetadataExt` traits, including
+ `os::unix::fs::MetadataExt`, which expose values such as inode
+ numbers [no longer return platform-specific types][1.8r], but
+ instead return widened integers. [RFC 1415].
+* [`btree_set::{IntoIter, Iter, Range}` are covariant][1.8cv].
+* [Atomic loads and stores are no longer volatile][1.8a].
+* [All types in `sync::mpsc` implement `fmt::Debug`][1.8mp].
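The newly stabilized time APIs compose as follows; a small sketch of typical usage (the computation being timed is arbitrary):

```rust
use std::time::{Instant, SystemTime, UNIX_EPOCH};

fn main() {
    // Monotonic time, suitable for measuring elapsed intervals.
    let start = Instant::now();
    let total: u64 = (0..1_000).sum();
    let elapsed = start.elapsed();

    // Wall-clock time; `duration_since` can fail if the clock was
    // set before the reference point, hence the Result.
    let since_epoch = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .expect("system clock is set before the Unix epoch");

    println!("sum={} took {:?}; {}s since the epoch",
             total, elapsed, since_epoch.as_secs());
}
```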
+
+Performance
+-----------
+
+* [Inlining hash functions leads to a 3% compile-time improvement in
+ some workloads][1.8h].
+* When using jemalloc, its symbols are [unprefixed so that it
+ overrides the libc malloc implementation][1.8h]. This means that for
+ rustc, LLVM is now using jemalloc, which results in a 6%
+ compile-time improvement on a specific workload.
+* [Avoid quadratic growth in function size due to cleanups][1.8cu].
+
+Misc
+----
+
+* [32-bit MSVC builds finally implement unwinding][1.8ms].
+ i686-pc-windows-msvc is now considered a tier-1 platform.
+* [The `--print target-list` flag prints a list of supported targets][1.8t].
+* [The `--print cfg` flag prints the `cfg`s defined for the current
+ target][1.8cf].
+* [`rustc` can be built with a new Cargo-based build system, written
+ in Rust][1.8b]. It will eventually replace Rust's Makefile-based
+ build system. To enable it, configure with `configure --rustbuild`.
+* [Errors for non-exhaustive `match` patterns now list up to 3 missing
+ variants while also indicating the total number of missing variants
+ if more than 3][1.8m].
+* [Executable stacks are disabled on Linux and BSD][1.8nx].
+* The Rust Project now publishes binary releases of the standard
+ library for a number of tier-2 targets:
+ `armv7-unknown-linux-gnueabihf`, `powerpc-unknown-linux-gnu`,
+ `powerpc64-unknown-linux-gnu`, `powerpc64le-unknown-linux-gnu`, and
+ `x86_64-rumprun-netbsd`. These can be installed with
+ tools such as [multirust][1.8mr].
+
+Cargo
+-----
+
+* [`cargo init` creates a new Cargo project in the current
+ directory][1.8ci]. It is otherwise like `cargo new`.
+* [Cargo has configuration keys for `-v` and
+ `--color`][1.8cc]. `verbose` and `color`, respectively, go in the
+ `[term]` section of `.cargo/config`.
+* [Configuration keys that evaluate to strings or integers can be set
+ via environment variables][1.8ce]. For example, the `build.jobs` key
+ can be set via `CARGO_BUILD_JOBS`. Environment variables take
+ precedence over config files.
+* [Target-specific dependencies support Rust `cfg` syntax for
+ describing targets][1.8cfg] so that dependencies for multiple
+ targets can be specified together. [RFC 1361].
+* [The environment variables `CARGO_TARGET_DIR`, `RUSTC`, and
+ `RUSTDOC` take precedence over the `build.target-dir`,
+ `build.rustc`, and `build.rustdoc` configuration values][1.8cv].
+* [The child process tree is killed on Windows when Cargo is
+ killed][1.8ck].
+* [The `build.target` configuration value sets the target platform,
+ like `--target`][1.8ct].
+
+Compatibility Notes
+-------------------
+
+* [Unstable compiler flags have been further restricted][1.8u]. Since
+ 1.0 `-Z` flags have been considered unstable, and other flags that
+ were considered unstable additionally required passing `-Z
+ unstable-options` to access. Unlike unstable language and library
+ features, though, these options have been accessible on the stable
+ release channel. Going forward, *new unstable flags will not be
+ available on the stable release channel*, and old unstable flags
+ will warn about their usage. In the future, all unstable flags will
+ be unavailable on the stable release channel.
+* [It is no longer possible to `match` on empty enum variants using
+ the `Variant(..)` syntax][1.8v]. This has been a warning since 1.6.
+* The Unix-specific `MetadataExt` traits, including
+ `os::unix::fs::MetadataExt`, which expose values such as inode
+ numbers [no longer return platform-specific types][1.8r], but
+ instead return widened integers. [RFC 1415].
+* [Modules sourced from the filesystem cannot appear within arbitrary
+ blocks, but only within other modules][1.8m].
+* [`--cfg` compiler flags are parsed strictly as identifiers][1.8c].
+* On Unix, [stack overflow triggers a runtime abort instead of a
+ SIGSEGV][1.8so].
+* [`Command::spawn` and its equivalents return an error if any of
+ their command-line arguments contain interior `NUL`s][1.8n].
+* [Tuple and unit enum variants from other crates are in the type
+ namespace][1.8tn].
+* [On Windows `rustc` emits `.lib` files for the `staticlib` library
+ type instead of `.a` files][1.8st]. Additionally, for the MSVC
+ toolchain, `rustc` emits import libraries named `foo.dll.lib`
+ instead of `foo.lib`.
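The `Variant(..)` restriction in practice; `E` and `describe` are illustrative names, not from the release:

```rust
enum E {
    A,      // unit variant
    B(i32), // tuple variant
}

fn describe(e: &E) -> &'static str {
    match *e {
        // Writing `E::A(..)` here was a warning since 1.6 and is now
        // rejected; unit variants must be matched without parentheses.
        E::A => "unit",
        E::B(..) => "tuple",
    }
}

fn main() {
    assert_eq!(describe(&E::A), "unit");
    assert_eq!(describe(&E::B(7)), "tuple");
}
```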
+
+
+[1.8a]: https://github.com/rust-lang/rust/pull/30962
+[1.8b]: https://github.com/rust-lang/rust/pull/31123
+[1.8c]: https://github.com/rust-lang/rust/pull/31530
+[1.8cc]: https://github.com/rust-lang/cargo/pull/2397
+[1.8ce]: https://github.com/rust-lang/cargo/pull/2398
+[1.8cf]: https://github.com/rust-lang/rust/pull/31278
+[1.8cfg]: https://github.com/rust-lang/cargo/pull/2328
+[1.8ci]: https://github.com/rust-lang/cargo/pull/2081
+[1.8ck]: https://github.com/rust-lang/cargo/pull/2370
+[1.8ct]: https://github.com/rust-lang/cargo/pull/2335
+[1.8cu]: https://github.com/rust-lang/rust/pull/31390
+[1.8cv]: https://github.com/rust-lang/cargo/issues/2365
+[1.8cv]: https://github.com/rust-lang/rust/pull/30998
+[1.8h]: https://github.com/rust-lang/rust/pull/31460
+[1.8l]: https://github.com/rust-lang/rust/pull/31668
+[1.8m]: https://github.com/rust-lang/rust/pull/31020
+[1.8m]: https://github.com/rust-lang/rust/pull/31534
+[1.8mp]: https://github.com/rust-lang/rust/pull/30894
+[1.8mr]: https://users.rust-lang.org/t/multirust-0-8-with-cross-std-installation/4901
+[1.8ms]: https://github.com/rust-lang/rust/pull/30448
+[1.8n]: https://github.com/rust-lang/rust/pull/31056
+[1.8nx]: https://github.com/rust-lang/rust/pull/30859
+[1.8r]: https://github.com/rust-lang/rust/pull/31551
+[1.8so]: https://github.com/rust-lang/rust/pull/31333
+[1.8st]: https://github.com/rust-lang/rust/pull/29520
+[1.8t]: https://github.com/rust-lang/rust/pull/31358
+[1.8tn]: https://github.com/rust-lang/rust/pull/30882
+[1.8u]: https://github.com/rust-lang/rust/pull/31793
+[1.8v]: https://github.com/rust-lang/rust/pull/31757
+[1.8w]: https://github.com/rust-lang/rust/pull/31904
+[RFC 1361]: https://github.com/rust-lang/rfcs/blob/master/text/1361-cargo-cfg-dependencies.md
+[RFC 1415]: https://github.com/rust-lang/rfcs/blob/master/text/1415-trim-std-os.md
+[RFC 218]: https://github.com/rust-lang/rfcs/blob/master/text/0218-empty-struct-with-braces.md
+[RFC 953]: https://github.com/rust-lang/rfcs/blob/master/text/0953-op-assign.md
+[`AddAssign`]: http://doc.rust-lang.org/nightly/std/ops/trait.AddAssign.html
+[`BitAndAssign`]: http://doc.rust-lang.org/nightly/std/ops/trait.BitAndAssign.html
+[`BitOrAssign`]: http://doc.rust-lang.org/nightly/std/ops/trait.BitOrAssign.html
+[`BitXorAssign`]: http://doc.rust-lang.org/nightly/std/ops/trait.BitXorAssign.html
+[`DivAssign`]: http://doc.rust-lang.org/nightly/std/ops/trait.DivAssign.html
+[`Instant::duration_since`]: http://doc.rust-lang.org/nightly/std/time/struct.Instant.html#method.duration_since
+[`Instant::elapsed`]: http://doc.rust-lang.org/nightly/std/time/struct.Instant.html#method.elapsed
+[`Instant::now`]: http://doc.rust-lang.org/nightly/std/time/struct.Instant.html#method.now
+[`MulAssign`]: http://doc.rust-lang.org/nightly/std/ops/trait.MulAssign.html
+[`Ref::map`]: http://doc.rust-lang.org/nightly/std/cell/struct.Ref.html#method.map
+[`RefMut::map`]: http://doc.rust-lang.org/nightly/std/cell/struct.RefMut.html#method.map
+[`RemAssign`]: http://doc.rust-lang.org/nightly/std/ops/trait.RemAssign.html
+[`ShlAssign`]: http://doc.rust-lang.org/nightly/std/ops/trait.ShlAssign.html
+[`ShrAssign`]: http://doc.rust-lang.org/nightly/std/ops/trait.ShrAssign.html
+[`SubAssign`]: http://doc.rust-lang.org/nightly/std/ops/trait.SubAssign.html
+[`SystemTime::duration_since`]: http://doc.rust-lang.org/nightly/std/time/struct.SystemTime.html#method.duration_since
+[`SystemTime::elapsed`]: http://doc.rust-lang.org/nightly/std/time/struct.SystemTime.html#method.elapsed
+[`SystemTime::now`]: http://doc.rust-lang.org/nightly/std/time/struct.SystemTime.html#method.now
+[`SystemTimeError::duration`]: http://doc.rust-lang.org/nightly/std/time/struct.SystemTimeError.html#method.duration
+[`SystemTimeError`]: http://doc.rust-lang.org/nightly/std/time/struct.SystemTimeError.html
+[`UNIX_EPOCH`]: http://doc.rust-lang.org/nightly/std/time/constant.UNIX_EPOCH.html
+[`ptr::drop_in_place`]: http://doc.rust-lang.org/nightly/std/ptr/fn.drop_in_place.html
+[`str::EncodeUtf16`]: http://doc.rust-lang.org/nightly/std/str/struct.EncodeUtf16.html
+[`str::encode_utf16`]: http://doc.rust-lang.org/nightly/std/primitive.str.html#method.encode_utf16
+[`time::Instant`]: http://doc.rust-lang.org/nightly/std/time/struct.Instant.html
+[`time::SystemTime`]: http://doc.rust-lang.org/nightly/std/time/struct.SystemTime.html
+
+
Version 1.7.0 (2016-03-03)
==========================
if [ -n "$CFG_ENABLE_ORBIT" ]; then putvar CFG_ENABLE_ORBIT; fi
-# A magic value that allows the compiler to use unstable features
-# during the bootstrap even when doing so would normally be an error
-# because of feature staging or because the build turns on
-# warnings-as-errors and unstable features default to warnings. The
-# build has to match this key in an env var. Meant to be a mild
-# deterrent from users just turning on unstable features on the stable
-# channel.
-# Basing CFG_BOOTSTRAP_KEY on CFG_BOOTSTRAP_KEY lets it get picked up
-# during a Makefile reconfig.
-CFG_BOOTSTRAP_KEY="${CFG_BOOTSTRAP_KEY-`date +%H:%M:%S`}"
-putvar CFG_BOOTSTRAP_KEY
-
step_msg "looking for build programs"
probe_need CFG_CURLORWGET curl wget
rustc_trans rustc_back rustc_llvm rustc_privacy rustc_lint \
rustc_data_structures rustc_platform_intrinsics \
rustc_plugin rustc_metadata rustc_passes rustc_save_analysis \
- rustc_const_eval rustc_const_math
+ rustc_const_eval rustc_const_math rustc_incremental
HOST_CRATES := syntax syntax_ext $(RUSTC_CRATES) rustdoc fmt_macros \
flate arena graphviz rbml log serialize
TOOLS := compiletest rustdoc rustc rustbook error_index_generator
DEPS_rustc_driver := arena flate getopts graphviz libc rustc rustc_back rustc_borrowck \
rustc_typeck rustc_mir rustc_resolve log syntax serialize rustc_llvm \
rustc_trans rustc_privacy rustc_lint rustc_plugin \
- rustc_metadata syntax_ext rustc_passes rustc_save_analysis rustc_const_eval
+ rustc_metadata syntax_ext rustc_passes rustc_save_analysis rustc_const_eval \
+ rustc_incremental
DEPS_rustc_lint := rustc log syntax rustc_const_eval
DEPS_rustc_llvm := native:rustllvm libc std rustc_bitflags
DEPS_rustc_metadata := rustc syntax rbml rustc_const_math
DEPS_rustc_privacy := rustc log syntax
DEPS_rustc_trans := arena flate getopts graphviz libc rustc rustc_back rustc_mir \
log syntax serialize rustc_llvm rustc_platform_intrinsics \
- rustc_const_math rustc_const_eval
+ rustc_const_math rustc_const_eval rustc_incremental
+DEPS_rustc_incremental := rbml rustc serialize rustc_data_structures
DEPS_rustc_save_analysis := rustc log syntax
DEPS_rustc_typeck := rustc syntax rustc_platform_intrinsics rustc_const_math \
rustc_const_eval
# versions in the same place
CFG_FILENAME_EXTRA=$(shell printf '%s' $(CFG_RELEASE)$(CFG_EXTRA_FILENAME) | $(CFG_HASH_COMMAND))
+# A magic value that allows the compiler to use unstable features during the
+# bootstrap even when doing so would normally be an error because of feature
+# staging or because the build turns on warnings-as-errors and unstable features
+# default to warnings. The build has to match this key in an env var.
+#
+# This value is keyed off the release to ensure that all compilers for one
+# particular release have the same bootstrap key. Note that this is
+# intentionally not "secure" by any definition, this is largely just a deterrent
+# from users enabling unstable features on the stable compiler.
+CFG_BOOTSTRAP_KEY=$(CFG_FILENAME_EXTRA)
+
ifeq ($(CFG_RELEASE_CHANNEL),stable)
# This is the normal semver version string, e.g. "0.12.0", "0.12.0-nightly"
CFG_RELEASE=$(CFG_RELEASE_NUM)
check-stage$(1)-T-$(2)-H-$(3)-doc-crates-exec \
check-stage$(1)-T-$(2)-H-$(3)-debuginfo-gdb-exec \
check-stage$(1)-T-$(2)-H-$(3)-debuginfo-lldb-exec \
+ check-stage$(1)-T-$(2)-H-$(3)-incremental-exec \
check-stage$(1)-T-$(2)-H-$(3)-doc-exec \
check-stage$(1)-T-$(2)-H-$(3)-pretty-exec
CODEGEN_RS := $(call rwildcard,$(S)src/test/codegen/,*.rs)
CODEGEN_CC := $(call rwildcard,$(S)src/test/codegen/,*.cc)
CODEGEN_UNITS_RS := $(call rwildcard,$(S)src/test/codegen-units/,*.rs)
+INCREMENTAL_RS := $(call rwildcard,$(S)src/test/incremental/,*.rs)
RUSTDOCCK_RS := $(call rwildcard,$(S)src/test/rustdoc/,*.rs)
RPASS_TESTS := $(RPASS_RS)
DEBUGINFO_LLDB_TESTS := $(DEBUGINFO_LLDB_RS)
CODEGEN_TESTS := $(CODEGEN_RS) $(CODEGEN_CC)
CODEGEN_UNITS_TESTS := $(CODEGEN_UNITS_RS)
+INCREMENTAL_TESTS := $(INCREMENTAL_RS)
RUSTDOCCK_TESTS := $(RUSTDOCCK_RS)
CTEST_SRC_BASE_rpass = run-pass
CTEST_MODE_codegen-units = codegen-units
CTEST_RUNTOOL_codegen-units = $(CTEST_RUNTOOL)
+CTEST_SRC_BASE_incremental = incremental
+CTEST_BUILD_BASE_incremental = incremental
+CTEST_MODE_incremental = incremental
+CTEST_RUNTOOL_incremental = $(CTEST_RUNTOOL)
+
CTEST_SRC_BASE_rustdocck = rustdoc
CTEST_BUILD_BASE_rustdocck = rustdoc
CTEST_MODE_rustdocck = rustdoc
$(S)src/etc/lldb_rust_formatters.py
CTEST_DEPS_codegen_$(1)-T-$(2)-H-$(3) = $$(CODEGEN_TESTS)
CTEST_DEPS_codegen-units_$(1)-T-$(2)-H-$(3) = $$(CODEGEN_UNITS_TESTS)
+CTEST_DEPS_incremental_$(1)-T-$(2)-H-$(3) = $$(INCREMENTAL_TESTS)
CTEST_DEPS_rustdocck_$(1)-T-$(2)-H-$(3) = $$(RUSTDOCCK_TESTS) \
$$(HBIN$(1)_H_$(3))/rustdoc$$(X_$(3)) \
$(S)src/etc/htmldocck.py
endef
CTEST_NAMES = rpass rpass-valgrind rpass-full rfail-full cfail-full rfail cfail pfail \
- debuginfo-gdb debuginfo-lldb codegen codegen-units rustdocck
+ debuginfo-gdb debuginfo-lldb codegen codegen-units rustdocck incremental
$(foreach host,$(CFG_HOST), \
$(eval $(foreach target,$(CFG_TARGET), \
debuginfo-lldb \
codegen \
codegen-units \
+ incremental \
doc \
$(foreach docname,$(DOC_NAMES),doc-$(docname)) \
pretty \
version = "0.0.0"
dependencies = [
"build_helper 0.1.0",
- "cmake 0.1.13 (registry+https://github.com/rust-lang/crates.io-index)",
+ "cmake 0.1.16 (registry+https://github.com/rust-lang/crates.io-index)",
"filetime 0.1.10 (registry+https://github.com/rust-lang/crates.io-index)",
- "gcc 0.3.25 (registry+https://github.com/rust-lang/crates.io-index)",
+ "gcc 0.3.26 (registry+https://github.com/rust-lang/crates.io-index)",
"getopts 0.2.14 (registry+https://github.com/rust-lang/crates.io-index)",
"kernel32-sys 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.7 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.9 (registry+https://github.com/rust-lang/crates.io-index)",
"num_cpus 0.2.11 (registry+https://github.com/rust-lang/crates.io-index)",
- "rustc-serialize 0.3.18 (registry+https://github.com/rust-lang/crates.io-index)",
- "toml 0.1.27 (registry+https://github.com/rust-lang/crates.io-index)",
- "winapi 0.2.5 (registry+https://github.com/rust-lang/crates.io-index)",
+ "rustc-serialize 0.3.19 (registry+https://github.com/rust-lang/crates.io-index)",
+ "toml 0.1.28 (registry+https://github.com/rust-lang/crates.io-index)",
+ "winapi 0.2.6 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
[[package]]
name = "cmake"
-version = "0.1.13"
+version = "0.1.16"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "gcc 0.3.25 (registry+https://github.com/rust-lang/crates.io-index)",
+ "gcc 0.3.26 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
version = "0.1.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "libc 0.2.7 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.9 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "gcc"
-version = "0.3.25"
+version = "0.3.26"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "winapi 0.2.5 (registry+https://github.com/rust-lang/crates.io-index)",
+ "winapi 0.2.6 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi-build 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "libc"
-version = "0.2.7"
+version = "0.2.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
version = "0.2.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "libc 0.2.7 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.9 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "rustc-serialize"
-version = "0.3.18"
+version = "0.3.19"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "toml"
-version = "0.1.27"
+version = "0.1.28"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
- "rustc-serialize 0.3.18 (registry+https://github.com/rust-lang/crates.io-index)",
+ "rustc-serialize 0.3.19 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "winapi"
-version = "0.2.5"
+version = "0.2.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
}
pub fn cargotest(build: &Build, stage: u32, host: &str) {
+
let ref compiler = Compiler::new(stage, host);
+
+ // Configure PATH to find the right rustc. NB. we have to use PATH
+ // and not RUSTC because the Cargo test suite has tests that will
+ // fail if rustc is not spelled `rustc`.
+ let path = build.sysroot(compiler).join("bin");
+ let old_path = ::std::env::var("PATH").expect("PATH is not set");
+ let sep = if cfg!(windows) { ";" } else { ":" };
+ let ref newpath = format!("{}{}{}", path.display(), sep, old_path);
+
build.run(build.tool_cmd(compiler, "cargotest")
- .env("RUSTC", build.compiler_path(compiler))
- .env("RUSTDOC", build.rustdoc(compiler))
+ .env("PATH", newpath)
.arg(&build.cargo));
}
use build_helper::output;
-use build::util::{exe, staticlib, libdir, mtime, is_dylib};
+use build::util::{exe, staticlib, libdir, mtime, is_dylib, copy};
use build::{Build, Compiler, Mode};
/// Build the standard library.
let libdir = build.sysroot_libdir(compiler, target);
let _ = fs::remove_dir_all(&libdir);
t!(fs::create_dir_all(&libdir));
- t!(fs::hard_link(&build.compiler_rt_built.borrow()[target],
- libdir.join(staticlib("compiler-rt", target))));
+ copy(&build.compiler_rt_built.borrow()[target],
+ &libdir.join(staticlib("compiler-rt", target)));
build_startup_objects(build, target, &libdir);
if host != compiler.host {
let _ = fs::remove_dir_all(&libdir);
t!(fs::create_dir_all(&libdir));
- t!(fs::hard_link(&build.compiler_rt_built.borrow()[target],
- libdir.join(staticlib("compiler-rt", target))));
+ copy(&build.compiler_rt_built.borrow()[target],
+ &libdir.join(staticlib("compiler-rt", target)));
}
add_to_sysroot(&out_dir, &libdir);
/// Only required for musl targets that statically link to libc
fn copy_third_party_objects(build: &Build, target: &str, into: &Path) {
for &obj in &["crt1.o", "crti.o", "crtn.o"] {
- t!(fs::copy(compiler_file(build.cc(target), obj), into.join(obj)));
+ copy(&compiler_file(build.cc(target), obj), &into.join(obj));
}
}
}
for obj in ["crt2.o", "dllcrt2.o"].iter() {
- t!(fs::copy(compiler_file(build.cc(target), obj), into.join(obj)));
+ copy(&compiler_file(build.cc(target), obj), &into.join(obj));
}
}
build.cargo_out(compiler, Mode::Libtest, target).join("libtest_shim.rlib")
}
-fn compiler_file(compiler: &Path, file: &str) -> String {
- output(Command::new(compiler)
- .arg(format!("-print-file-name={}", file))).trim().to_string()
+fn compiler_file(compiler: &Path, file: &str) -> PathBuf {
+ let out = output(Command::new(compiler)
+ .arg(format!("-print-file-name={}", file)));
+ PathBuf::from(out.trim())
}
/// Prepare a new compiler from the artifacts in `stage`
for f in t!(fs::read_dir(&src_libdir)).map(|f| t!(f)) {
let filename = f.file_name().into_string().unwrap();
if is_dylib(&filename) {
- t!(fs::hard_link(&f.path(), sysroot_libdir.join(&filename)));
+ copy(&f.path(), &sysroot_libdir.join(&filename));
}
}
t!(fs::create_dir_all(&bindir));
let compiler = build.compiler_path(&Compiler::new(stage, host));
let _ = fs::remove_file(&compiler);
- t!(fs::hard_link(rustc, compiler));
+ copy(&rustc, &compiler);
// See if rustdoc exists to link it into place
let rustdoc = exe("rustdoc", host);
let rustdoc_dst = bindir.join(&rustdoc);
if fs::metadata(&rustdoc_src).is_ok() {
let _ = fs::remove_file(&rustdoc_dst);
- t!(fs::hard_link(&rustdoc_src, &rustdoc_dst));
+ copy(&rustdoc_src, &rustdoc_dst);
}
}
let (_, path) = paths.iter().map(|path| {
(mtime(&path).seconds(), path)
}).max().unwrap();
- t!(fs::hard_link(&path,
- sysroot_dst.join(path.file_name().unwrap())));
+ copy(&path, &sysroot_dst.join(path.file_name().unwrap()));
}
}
.arg(format!("--image-dir={}", sanitize_sh(&image)))
.arg(format!("--work-dir={}", sanitize_sh(&tmpdir(build))))
.arg(format!("--output-dir={}", sanitize_sh(&distdir(build))))
- .arg(format!("--package-name={}", name))
+ .arg(format!("--package-name={}-{}", name, host))
.arg("--component-name=rust-docs")
.arg("--legacy-manifest-dirs=rustlib,cargo")
.arg("--bulk-dirs=share/doc/rust/html");
// As part of this step, *also* copy the docs directory to a directory which
// buildbot typically uploads.
- let dst = distdir(build).join("doc").join(&build.package_vers);
- t!(fs::create_dir_all(&dst));
- cp_r(&src, &dst);
+ if host == build.config.build {
+ let dst = distdir(build).join("doc").join(&build.package_vers);
+ t!(fs::create_dir_all(&dst));
+ cp_r(&src, &dst);
+ }
}
pub fn mingw(build: &Build, host: &str) {
use build::{Build, Compiler, Mode};
use build::util::{up_to_date, cp_r};
-pub fn rustbook(build: &Build, stage: u32, host: &str, name: &str, out: &Path) {
+pub fn rustbook(build: &Build, stage: u32, target: &str, name: &str, out: &Path) {
t!(fs::create_dir_all(out));
let out = out.join(name);
- let compiler = Compiler::new(stage, host);
+ let compiler = Compiler::new(stage, &build.config.build);
let src = build.src.join("src/doc").join(name);
let index = out.join("index.html");
let rustbook = build.tool(&compiler, "rustbook");
if up_to_date(&src, &index) && up_to_date(&rustbook, &index) {
return
}
- println!("Rustbook stage{} ({}) - {}", stage, host, name);
+ println!("Rustbook stage{} ({}) - {}", stage, target, name);
let _ = fs::remove_dir_all(&out);
build.run(build.tool_cmd(&compiler, "rustbook")
.arg("build")
.arg(out));
}
-pub fn standalone(build: &Build, stage: u32, host: &str, out: &Path) {
- println!("Documenting stage{} standalone ({})", stage, host);
+pub fn standalone(build: &Build, stage: u32, target: &str, out: &Path) {
+ println!("Documenting stage{} standalone ({})", stage, target);
t!(fs::create_dir_all(out));
- let compiler = Compiler::new(stage, host);
+ let compiler = Compiler::new(stage, &build.config.build);
let favicon = build.src.join("src/doc/favicon.inc");
let footer = build.src.join("src/doc/footer.inc");
}
}
-pub fn std(build: &Build, stage: u32, host: &str, out: &Path) {
- println!("Documenting stage{} std ({})", stage, host);
- let compiler = Compiler::new(stage, host);
+pub fn std(build: &Build, stage: u32, target: &str, out: &Path) {
+ println!("Documenting stage{} std ({})", stage, target);
+ t!(fs::create_dir_all(out));
+ let compiler = Compiler::new(stage, &build.config.build);
let out_dir = build.stage_out(&compiler, Mode::Libstd)
- .join(host).join("doc");
+ .join(target).join("doc");
let rustdoc = build.rustdoc(&compiler);
build.clear_if_dirty(&out_dir, &rustdoc);
- let mut cargo = build.cargo(&compiler, Mode::Libstd, host, "doc");
+ let mut cargo = build.cargo(&compiler, Mode::Libstd, target, "doc");
cargo.arg("--manifest-path")
.arg(build.src.join("src/rustc/std_shim/Cargo.toml"))
.arg("--features").arg(build.std_features());
cp_r(&out_dir, out)
}
-pub fn test(build: &Build, stage: u32, host: &str, out: &Path) {
- println!("Documenting stage{} test ({})", stage, host);
- let compiler = Compiler::new(stage, host);
+pub fn test(build: &Build, stage: u32, target: &str, out: &Path) {
+ println!("Documenting stage{} test ({})", stage, target);
+ let compiler = Compiler::new(stage, &build.config.build);
let out_dir = build.stage_out(&compiler, Mode::Libtest)
- .join(host).join("doc");
+ .join(target).join("doc");
let rustdoc = build.rustdoc(&compiler);
build.clear_if_dirty(&out_dir, &rustdoc);
- let mut cargo = build.cargo(&compiler, Mode::Libtest, host, "doc");
+ let mut cargo = build.cargo(&compiler, Mode::Libtest, target, "doc");
cargo.arg("--manifest-path")
.arg(build.src.join("src/rustc/test_shim/Cargo.toml"));
build.run(&mut cargo);
cp_r(&out_dir, out)
}
-pub fn rustc(build: &Build, stage: u32, host: &str, out: &Path) {
- println!("Documenting stage{} compiler ({})", stage, host);
- let compiler = Compiler::new(stage, host);
+pub fn rustc(build: &Build, stage: u32, target: &str, out: &Path) {
+ println!("Documenting stage{} compiler ({})", stage, target);
+ let compiler = Compiler::new(stage, &build.config.build);
let out_dir = build.stage_out(&compiler, Mode::Librustc)
- .join(host).join("doc");
+ .join(target).join("doc");
let rustdoc = build.rustdoc(&compiler);
if !up_to_date(&rustdoc, &out_dir.join("rustc/index.html")) {
t!(fs::remove_dir_all(&out_dir));
}
- let mut cargo = build.cargo(&compiler, Mode::Librustc, host, "doc");
+ let mut cargo = build.cargo(&compiler, Mode::Librustc, target, "doc");
cargo.arg("--manifest-path")
.arg(build.src.join("src/rustc/Cargo.toml"))
.arg("--features").arg(build.rustc_features());
cp_r(&out_dir, out)
}
-pub fn error_index(build: &Build, stage: u32, host: &str, out: &Path) {
- println!("Documenting stage{} error index ({})", stage, host);
- let compiler = Compiler::new(stage, host);
+pub fn error_index(build: &Build, stage: u32, target: &str, out: &Path) {
+ println!("Documenting stage{} error index ({})", stage, target);
+ t!(fs::create_dir_all(out));
+ let compiler = Compiler::new(stage, &build.config.build);
let mut index = build.tool_cmd(&compiler, "error_index_generator");
index.arg("html");
index.arg(out.join("error-index.html"));
}
}
}
+
+ for host in build.flags.host.iter() {
+ if !build.config.host.contains(host) {
+ panic!("specified host `{}` is not in the ./configure list", host);
+ }
+ }
+ for target in build.flags.target.iter() {
+ if !build.config.target.contains(target) {
+ panic!("specified target `{}` is not in the ./configure list",
+ target);
+ }
+ }
}
vec![self.llvm(()).target(&build.config.build)]
}
Source::Llvm { _dummy } => Vec::new(),
+
+ // Note that all doc targets depend on artifacts from the build
+ // architecture, not the target (which is where we're generating
+ // docs into).
Source::DocStd { stage } => {
- vec![self.libstd(self.compiler(stage))]
+ let compiler = self.target(&build.config.build).compiler(stage);
+ vec![self.libstd(compiler)]
}
Source::DocTest { stage } => {
- vec![self.libtest(self.compiler(stage))]
+ let compiler = self.target(&build.config.build).compiler(stage);
+ vec![self.libtest(compiler)]
}
Source::DocBook { stage } |
Source::DocNomicon { stage } |
Source::DocStyle { stage } => {
- vec![self.tool_rustbook(stage)]
+ vec![self.target(&build.config.build).tool_rustbook(stage)]
}
Source::DocErrorIndex { stage } => {
- vec![self.tool_error_index(stage)]
+ vec![self.target(&build.config.build).tool_error_index(stage)]
}
Source::DocStandalone { stage } => {
- vec![self.rustc(stage)]
+ vec![self.target(&build.config.build).rustc(stage)]
}
Source::DocRustc { stage } => {
vec![self.doc_test(stage)]
vec![self.librustc(self.compiler(stage))]
}
Source::ToolCargoTest { stage } => {
- vec![self.libstd(self.compiler(stage))]
+ vec![self.librustc(self.compiler(stage))]
}
Source::DistDocs { stage } => vec![self.doc(stage)],
Source::Dist { stage } => {
let mut base = Vec::new();
- base.push(self.dist_docs(stage));
for host in build.config.host.iter() {
let host = self.target(host);
let compiler = self.compiler(stage);
for target in build.config.target.iter() {
- base.push(self.target(target).dist_std(compiler));
+ let target = self.target(target);
+ base.push(target.dist_docs(stage));
+ base.push(target.dist_std(compiler));
}
}
return base
}).unwrap_or(FileTime::zero())
}
+pub fn copy(src: &Path, dst: &Path) {
+ let res = fs::hard_link(src, dst);
+ let res = res.or_else(|_| fs::copy(src, dst).map(|_| ()));
+ if let Err(e) = res {
+ panic!("failed to copy `{}` to `{}`: {}", src.display(),
+ dst.display(), e)
+ }
+}
+
pub fn cp_r(src: &Path, dst: &Path) {
for f in t!(fs::read_dir(src)) {
let f = t!(f);
cp_r(&path, &dst);
} else {
let _ = fs::remove_file(&dst);
- t!(fs::hard_link(&path, dst));
+ copy(&path, &dst);
}
}
}
if target.contains("musl") || target.contains("msvc") {
PathBuf::from("ar")
} else {
+ let parent = cc.parent().unwrap();
let file = cc.file_name().unwrap().to_str().unwrap();
- cc.parent().unwrap().join(file.replace("gcc", "ar")
- .replace("cc", "ar")
- .replace("clang", "ar"))
+ for suffix in &["gcc", "cc", "clang"] {
+ if let Some(idx) = file.rfind(suffix) {
+ let mut file = file[..idx].to_owned();
+ file.push_str("ar");
+ return parent.join(&file);
+ }
+ }
+ parent.join(file)
}
}
DebugInfoLldb,
Codegen,
Rustdoc,
- CodegenUnits
+ CodegenUnits,
+ Incremental,
}
impl FromStr for Mode {
"codegen" => Ok(Codegen),
"rustdoc" => Ok(Rustdoc),
"codegen-units" => Ok(CodegenUnits),
+ "incremental" => Ok(Incremental),
_ => Err(()),
}
}
Codegen => "codegen",
Rustdoc => "rustdoc",
CodegenUnits => "codegen-units",
+ Incremental => "incremental",
}, f)
}
}
#![feature(box_syntax)]
#![feature(libc)]
#![feature(rustc_private)]
-#![feature(str_char)]
#![feature(test)]
#![feature(question_mark)]
reqopt("", "aux-base", "directory to find auxiliary test files", "PATH"),
reqopt("", "stage-id", "the target-stage identifier", "stageN-TARGET"),
reqopt("", "mode", "which sort of compile tests to run",
- "(compile-fail|parse-fail|run-fail|run-pass|run-pass-valgrind|pretty|debug-info)"),
+ "(compile-fail|parse-fail|run-fail|run-pass|\
+ run-pass-valgrind|pretty|debug-info|incremental)"),
optflag("", "ignored", "run tests marked as ignored"),
optopt("", "runtool", "supervisor program to run tests under \
(eg. emulator, valgrind)", "PROGRAM"),
// used to be a regex "(^|[^0-9])([0-9]\.[0-9]+)"
for (pos, c) in full_version_line.char_indices() {
- if !c.is_digit(10) { continue }
- if pos + 2 >= full_version_line.len() { continue }
- if full_version_line.char_at(pos + 1) != '.' { continue }
- if !full_version_line.char_at(pos + 2).is_digit(10) { continue }
- if pos > 0 && full_version_line.char_at_reverse(pos).is_digit(10) {
+ if !c.is_digit(10) {
+ continue
+ }
+ if pos + 2 >= full_version_line.len() {
+ continue
+ }
+ if full_version_line[pos + 1..].chars().next().unwrap() != '.' {
+ continue
+ }
+ if !full_version_line[pos + 2..].chars().next().unwrap().is_digit(10) {
+ continue
+ }
+ if pos > 0 && full_version_line[..pos].chars().next_back()
+ .unwrap().is_digit(10) {
continue
}
let mut end = pos + 3;
while end < full_version_line.len() &&
- full_version_line.char_at(end).is_digit(10) {
+ full_version_line[end..].chars().next()
+ .unwrap().is_digit(10) {
end += 1;
}
return Some(full_version_line[pos..end].to_owned());
for (pos, l) in full_version_line.char_indices() {
if l != 'l' && l != 'L' { continue }
if pos + 5 >= full_version_line.len() { continue }
- let l = full_version_line.char_at(pos + 1);
+ let l = full_version_line[pos + 1..].chars().next().unwrap();
if l != 'l' && l != 'L' { continue }
- let d = full_version_line.char_at(pos + 2);
+ let d = full_version_line[pos + 2..].chars().next().unwrap();
if d != 'd' && d != 'D' { continue }
- let b = full_version_line.char_at(pos + 3);
+ let b = full_version_line[pos + 3..].chars().next().unwrap();
if b != 'b' && b != 'B' { continue }
- let dash = full_version_line.char_at(pos + 4);
+ let dash = full_version_line[pos + 4..].chars().next().unwrap();
if dash != '-' { continue }
let vers = full_version_line[pos + 5..].chars().take_while(|c| {
tag: &str)
-> Option<(WhichLine, ExpectedError)> {
let start = match line.find(tag) { Some(i) => i, None => return None };
- let (follow, adjusts) = if line.char_at(start + tag.len()) == '|' {
+ let (follow, adjusts) = if line[start + tag.len()..].chars().next().unwrap() == '|' {
(true, 0)
} else {
(false, line[start + tag.len()..].chars().take_while(|c| *c == '^').count())
use common::Config;
use common::{CompileFail, ParseFail, Pretty, RunFail, RunPass, RunPassValgrind};
use common::{Codegen, DebugInfoLldb, DebugInfoGdb, Rustdoc, CodegenUnits};
+use common::Incremental;
use errors::{self, ErrorKind};
use header::TestProps;
use header;
Codegen => run_codegen_test(&config, &props, &testpaths),
Rustdoc => run_rustdoc_test(&config, &props, &testpaths),
CodegenUnits => run_codegen_units_test(&config, &props, &testpaths),
+ Incremental => run_incremental_test(&config, &props, &testpaths),
}
}
if *idx >= haystack.len() {
return false;
}
- let ch = haystack.char_at(*idx);
+ let ch = haystack[*idx..].chars().next().unwrap();
if ch != needle {
return false;
}
fn scan_integer(haystack: &str, idx: &mut usize) -> bool {
let mut i = *idx;
while i < haystack.len() {
- let ch = haystack.char_at(i);
+ let ch = haystack[i..].chars().next().unwrap();
if ch < '0' || '9' < ch {
break;
}
if haystack_i >= haystack.len() {
return false;
}
- let ch = haystack.char_at(haystack_i);
+ let ch = haystack[haystack_i..].chars().next().unwrap();
haystack_i += ch.len_utf8();
if !scan_char(needle, ch, &mut needle_i) {
return false;
panic!();
}
}
+
+fn run_incremental_test(config: &Config, props: &TestProps, testpaths: &TestPaths) {
+ // Basic plan for a test incremental/foo/bar.rs:
+ // - load list of revisions pass1, fail2, pass3
+ // - each should begin with `rpass`, `rfail`, or `cfail`
+ // - if `rpass`, expect compile and execution to succeed
+ // - if `cfail`, expect compilation to fail
+ // - if `rfail`, expect execution to fail
+ // - create a directory build/foo/bar.incremental
+ // - compile foo/bar.rs with -Z incremental=.../foo/bar.incremental and -C pass1
+ // - because name of revision starts with "pass", expect success
+ // - compile foo/bar.rs with -Z incremental=.../foo/bar.incremental and -C fail2
+ // - because name of revision starts with "fail", expect an error
+ // - load expected errors as usual, but filter for those that end in `[fail2]`
+ // - compile foo/bar.rs with -Z incremental=.../foo/bar.incremental and -C pass3
+ // - because name of revision starts with "pass", expect success
+ // - execute build/foo/bar.exe and save output
+ //
+ // FIXME -- use non-incremental mode as an oracle? That doesn't apply
+ // to #[rustc_dirty] and #[rustc_clean] tests, though.
+
+ assert!(!props.revisions.is_empty(), "incremental tests require a list of revisions");
+
+ let output_base_name = output_base_name(config, testpaths);
+
+ // Create the incremental workproduct directory.
+ let incremental_dir = output_base_name.with_extension("incremental");
+ if incremental_dir.exists() {
+ fs::remove_dir_all(&incremental_dir).unwrap();
+ }
+ fs::create_dir_all(&incremental_dir).unwrap();
+
+ if config.verbose {
+ println!("incremental_dir={}", incremental_dir.display());
+ }
+
+ for revision in &props.revisions {
+ let mut revision_props = props.clone();
+ header::load_props_into(&mut revision_props, &testpaths.file, Some(&revision));
+
+ revision_props.compile_flags.extend(vec![
+ "-Z".to_string(),
+ format!("incremental={}", incremental_dir.display()),
+ "--cfg".to_string(),
+ revision.to_string(),
+ ]);
+
+ if config.verbose {
+ println!("revision={:?} revision_props={:#?}", revision, revision_props);
+ }
+
+ if revision.starts_with("rpass") {
+ run_rpass_test_revision(config, &revision_props, testpaths, Some(&revision));
+ } else if revision.starts_with("rfail") {
+ run_rfail_test_revision(config, &revision_props, testpaths, Some(&revision));
+ } else if revision.starts_with("cfail") {
+ run_cfail_test_revision(config, &revision_props, testpaths, Some(&revision));
+ } else {
+ fatal(
+ Some(revision),
+ "revision name must begin with rpass, rfail, or cfail");
+ }
+ }
+}
incorrectly also helps rule out data races, one of the worst kinds of
concurrency bugs.
-As an example, here is a Rust program that could have a data race in many
+As an example, here is a Rust program that would have a data race in many
languages. It will not compile:
```ignore
for i in 0..3 {
thread::spawn(move || {
- data[i] += 1;
+ data[0] += i;
});
}
```text
8:17 error: capture of moved value: `data`
- data[i] += 1;
+ data[0] += i;
^~~~
```
`data` gets moved out of `main` in the first call to `spawn()`, so subsequent
calls in the loop cannot use this variable.
-Note that this specific example will not cause a data race since different array
-indices are being accessed. But this can't be determined at compile time, and in
-a similar situation where `i` is a constant or is random, you would have a data
-race.
-
So, we need some type that lets us have more than one owning reference to a
value. Usually, we'd use `Rc<T>` for this, which is a reference counted type
that provides shared ownership. It has some runtime bookkeeping that keeps track
// use it in a thread
thread::spawn(move || {
- data_ref[i] += 1;
+ data_ref[0] += i;
});
}
for i in 0..3 {
let data = data.clone();
thread::spawn(move || {
- data[i] += 1;
+ data[0] += i;
});
}
```text
<anon>:11:24 error: cannot borrow immutable borrowed content as mutable
-<anon>:11 data[i] += 1;
+<anon>:11 data[0] += i;
^~~~
```
let data = data.clone();
thread::spawn(move || {
let mut data = data.lock().unwrap();
- data[i] += 1;
+ data[0] += i;
});
}
# let data = data.clone();
thread::spawn(move || {
let mut data = data.lock().unwrap();
- data[i] += 1;
+ data[0] += i;
});
# }
# thread::sleep(Duration::from_millis(50));
#[unsafe_no_drop_flag]
#[stable(feature = "rust1", since = "1.0.0")]
pub struct Arc<T: ?Sized> {
- // FIXME #12808: strange name to try to avoid interfering with
- // field accesses of the contained type via Deref
- _ptr: Shared<ArcInner<T>>,
+ ptr: Shared<ArcInner<T>>,
}
#[stable(feature = "rust1", since = "1.0.0")]
#[unsafe_no_drop_flag]
#[stable(feature = "arc_weak", since = "1.4.0")]
pub struct Weak<T: ?Sized> {
- // FIXME #12808: strange name to try to avoid interfering with
- // field accesses of the contained type via Deref
- _ptr: Shared<ArcInner<T>>,
+ ptr: Shared<ArcInner<T>>,
}
#[stable(feature = "arc_weak", since = "1.4.0")]
weak: atomic::AtomicUsize::new(1),
data: data,
};
- Arc { _ptr: unsafe { Shared::new(Box::into_raw(x)) } }
+ Arc { ptr: unsafe { Shared::new(Box::into_raw(x)) } }
}
/// Unwraps the contained value if the `Arc<T>` has exactly one strong reference.
atomic::fence(Acquire);
unsafe {
- let ptr = *this._ptr;
+ let ptr = *this.ptr;
let elem = ptr::read(&(*ptr).data);
// Make a weak pointer to clean up the implicit strong-weak reference
- let _weak = Weak { _ptr: this._ptr };
+ let _weak = Weak { ptr: this.ptr };
mem::forget(this);
Ok(elem)
// synchronize with the write coming from `is_unique`, so that the
// events prior to that write happen before this read.
match this.inner().weak.compare_exchange_weak(cur, cur + 1, Acquire, Relaxed) {
- Ok(_) => return Weak { _ptr: this._ptr },
+ Ok(_) => return Weak { ptr: this.ptr },
Err(old) => cur = old,
}
}
// `ArcInner` structure itself is `Sync` because the inner data is
// `Sync` as well, so we're ok loaning out an immutable pointer to these
// contents.
- unsafe { &**self._ptr }
+ unsafe { &**self.ptr }
}
// Non-inlined part of `drop`.
#[inline(never)]
unsafe fn drop_slow(&mut self) {
- let ptr = *self._ptr;
+ let ptr = *self.ptr;
// Destroy the data at this time, even though we may not free the box
// allocation itself (there may still be weak pointers lying around).
}
}
- Arc { _ptr: self._ptr }
+ Arc { ptr: self.ptr }
}
}
// Materialize our own implicit weak pointer, so that it can clean
// up the ArcInner as needed.
- let weak = Weak { _ptr: this._ptr };
+ let weak = Weak { ptr: this.ptr };
// mark the data itself as already deallocated
unsafe {
// here (due to zeroing) because data is no longer accessed by
// other threads (due to there being no more strong refs at this
// point).
- let mut swap = Arc::new(ptr::read(&(**weak._ptr).data));
+ let mut swap = Arc::new(ptr::read(&(**weak.ptr).data));
mem::swap(this, &mut swap);
mem::forget(swap);
}
// As with `get_mut()`, the unsafety is ok because our reference was
// either unique to begin with, or became one upon cloning the contents.
unsafe {
- let inner = &mut **this._ptr;
+ let inner = &mut **this.ptr;
&mut inner.data
}
}
// the Arc itself to be `mut`, so we're returning the only possible
// reference to the inner data.
unsafe {
- let inner = &mut **this._ptr;
+ let inner = &mut **this.ptr;
Some(&mut inner.data)
}
} else {
// This structure has #[unsafe_no_drop_flag], so this drop glue may run
// more than once (but it is guaranteed to be zeroed after the first if
// it's run more than once)
- let thin = *self._ptr as *const ();
+ let thin = *self.ptr as *const ();
if thin as usize == mem::POST_DROP_USIZE {
return;
// Relaxed is valid for the same reason it is on Arc's Clone impl
match inner.strong.compare_exchange_weak(n, n + 1, Relaxed, Relaxed) {
- Ok(_) => return Some(Arc { _ptr: self._ptr }),
+ Ok(_) => return Some(Arc { ptr: self.ptr }),
Err(old) => n = old,
}
}
#[inline]
fn inner(&self) -> &ArcInner<T> {
// See comments above for why this is "safe"
- unsafe { &**self._ptr }
+ unsafe { &**self.ptr }
}
}
}
}
- return Weak { _ptr: self._ptr };
+ return Weak { ptr: self.ptr };
}
}
/// } // implicit drop
/// ```
fn drop(&mut self) {
- let ptr = *self._ptr;
+ let ptr = *self.ptr;
let thin = ptr as *const ();
// see comments above for why this check is here
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized> fmt::Pointer for Arc<T> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
- fmt::Pointer::fmt(&*self._ptr, f)
+ fmt::Pointer::fmt(&*self.ptr, f)
}
}
issue = "30425")]
pub fn new() -> Weak<T> {
unsafe {
- Weak { _ptr: Shared::new(Box::into_raw(box ArcInner {
+ Weak { ptr: Shared::new(Box::into_raw(box ArcInner {
strong: atomic::AtomicUsize::new(0),
weak: atomic::AtomicUsize::new(1),
data: uninitialized(),
#[unsafe_no_drop_flag]
#[stable(feature = "rust1", since = "1.0.0")]
pub struct Rc<T: ?Sized> {
- // FIXME #12808: strange names to try to avoid interfering with field
- // accesses of the contained type via Deref
- _ptr: Shared<RcBox<T>>,
+ ptr: Shared<RcBox<T>>,
}
#[stable(feature = "rust1", since = "1.0.0")]
// pointers, which ensures that the weak destructor never frees
// the allocation while the strong destructor is running, even
// if the weak pointer is stored inside the strong one.
- _ptr: Shared::new(Box::into_raw(box RcBox {
+ ptr: Shared::new(Box::into_raw(box RcBox {
strong: Cell::new(1),
weak: Cell::new(1),
value: value,
// pointer while also handling drop logic by just crafting a
// fake Weak.
this.dec_strong();
- let _weak = Weak { _ptr: this._ptr };
+ let _weak = Weak { ptr: this.ptr };
forget(this);
Ok(val)
}
#[stable(feature = "rc_weak", since = "1.4.0")]
pub fn downgrade(this: &Self) -> Weak<T> {
this.inc_weak();
- Weak { _ptr: this._ptr }
+ Weak { ptr: this.ptr }
}
/// Get the number of weak references to this value.
#[stable(feature = "rc_unique", since = "1.4.0")]
pub fn get_mut(this: &mut Self) -> Option<&mut T> {
if Rc::is_unique(this) {
- let inner = unsafe { &mut **this._ptr };
+ let inner = unsafe { &mut **this.ptr };
Some(&mut inner.value)
} else {
None
} else if Rc::weak_count(this) != 0 {
// Can just steal the data, all that's left is Weaks
unsafe {
- let mut swap = Rc::new(ptr::read(&(**this._ptr).value));
+ let mut swap = Rc::new(ptr::read(&(**this.ptr).value));
mem::swap(this, &mut swap);
swap.dec_strong();
// Remove implicit strong-weak ref (no need to craft a fake
// reference count is guaranteed to be 1 at this point, and we required
// the `Rc<T>` itself to be `mut`, so we're returning the only possible
// reference to the inner value.
- let inner = unsafe { &mut **this._ptr };
+ let inner = unsafe { &mut **this.ptr };
&mut inner.value
}
}
#[unsafe_destructor_blind_to_params]
fn drop(&mut self) {
unsafe {
- let ptr = *self._ptr;
+ let ptr = *self.ptr;
let thin = ptr as *const ();
if thin as usize != mem::POST_DROP_USIZE {
#[inline]
fn clone(&self) -> Rc<T> {
self.inc_strong();
- Rc { _ptr: self._ptr }
+ Rc { ptr: self.ptr }
}
}
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized> fmt::Pointer for Rc<T> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
- fmt::Pointer::fmt(&*self._ptr, f)
+ fmt::Pointer::fmt(&*self.ptr, f)
}
}
#[unsafe_no_drop_flag]
#[stable(feature = "rc_weak", since = "1.4.0")]
pub struct Weak<T: ?Sized> {
- // FIXME #12808: strange names to try to avoid interfering with
- // field accesses of the contained type via Deref
- _ptr: Shared<RcBox<T>>,
+ ptr: Shared<RcBox<T>>,
}
#[stable(feature = "rc_weak", since = "1.4.0")]
None
} else {
self.inc_strong();
- Some(Rc { _ptr: self._ptr })
+ Some(Rc { ptr: self.ptr })
}
}
}
/// ```
fn drop(&mut self) {
unsafe {
- let ptr = *self._ptr;
+ let ptr = *self.ptr;
let thin = ptr as *const ();
if thin as usize != mem::POST_DROP_USIZE {
#[inline]
fn clone(&self) -> Weak<T> {
self.inc_weak();
- Weak { _ptr: self._ptr }
+ Weak { ptr: self.ptr }
}
}
pub fn new() -> Weak<T> {
unsafe {
Weak {
- _ptr: Shared::new(Box::into_raw(box RcBox {
+ ptr: Shared::new(Box::into_raw(box RcBox {
strong: Cell::new(0),
weak: Cell::new(1),
value: uninitialized(),
// the contract anyway.
// This allows the null check to be elided in the destructor if we
// manipulated the reference count in the same function.
- assume(!(*(&self._ptr as *const _ as *const *const ())).is_null());
- &(**self._ptr)
+ assume(!(*(&self.ptr as *const _ as *const *const ())).is_null());
+ &(**self.ptr)
}
}
}
// the contract anyway.
// This allows the null check to be elided in the destructor if we
// manipulated the reference count in the same function.
- assume(!(*(&self._ptr as *const _ as *const *const ())).is_null());
- &(**self._ptr)
+ assume(!(*(&self.ptr as *const _ as *const *const ())).is_null());
+ &(**self.ptr)
}
}
}
/// The value may be any borrowed form of the set's value type,
/// but the ordering on the borrowed form *must* match the
/// ordering on the value type.
- #[unstable(feature = "set_recovery", issue = "28050")]
+ #[stable(feature = "set_recovery", since = "1.9.0")]
pub fn get<Q: ?Sized>(&self, value: &Q) -> Option<&T>
where T: Borrow<Q>,
Q: Ord
/// Adds a value to the set, replacing the existing value, if any, that is equal to the given
/// one. Returns the replaced value.
- #[unstable(feature = "set_recovery", issue = "28050")]
+ #[stable(feature = "set_recovery", since = "1.9.0")]
pub fn replace(&mut self, value: T) -> Option<T> {
Recover::replace(&mut self.map, value)
}
/// The value may be any borrowed form of the set's value type,
/// but the ordering on the borrowed form *must* match the
/// ordering on the value type.
- #[unstable(feature = "set_recovery", issue = "28050")]
+ #[stable(feature = "set_recovery", since = "1.9.0")]
pub fn take<Q: ?Sized>(&mut self, value: &Q) -> Option<T>
where T: Borrow<Q>,
Q: Ord
test(no_crate_inject, attr(allow(unused_variables), deny(warnings))))]
#![cfg_attr(test, allow(deprecated))] // rand
-#![cfg_attr(not(test), feature(copy_from_slice))] // impl [T]
#![cfg_attr(not(stage0), deny(warnings))]
#![feature(alloc)]
#![feature(box_patterns)]
#![feature(box_syntax)]
#![feature(core_intrinsics)]
-#![feature(decode_utf16)]
#![feature(dropck_parametricity)]
#![feature(fmt_internals)]
#![feature(heap_api)]
/// # Example
///
/// ```rust
- /// #![feature(copy_from_slice)]
/// let mut dst = [0, 0, 0];
/// let src = [1, 2, 3];
///
/// dst.copy_from_slice(&src);
/// assert_eq!(src, dst);
/// ```
- #[unstable(feature = "copy_from_slice", issue = "31755")]
+ #[stable(feature = "copy_from_slice", since = "1.9.0")]
pub fn copy_from_slice(&mut self, src: &[T]) where T: Copy {
core_slice::SliceExt::copy_from_slice(self, src)
}
/// # Examples
///
/// ```
- /// #![feature(str_char)]
- ///
/// let s = "Löwe 老虎 Léopard";
/// assert!(s.is_char_boundary(0));
/// // start of `老`
/// // third byte of `老`
/// assert!(!s.is_char_boundary(8));
/// ```
- #[unstable(feature = "str_char",
- reason = "it is unclear whether this method pulls its weight \
- with the existence of the char_indices iterator or \
- this method may want to be replaced with checked \
- slicing",
- issue = "27754")]
+ #[stable(feature = "is_char_boundary", since = "1.9.0")]
#[inline]
pub fn is_char_boundary(&self, index: usize) -> bool {
core_str::StrExt::is_char_boundary(self, index)
///
/// ```
/// #![feature(str_char)]
+ /// #![allow(deprecated)]
///
/// use std::str::CharRange;
///
removed altogether",
issue = "27754")]
#[inline]
+ #[rustc_deprecated(reason = "use slicing plus chars() plus len_utf8",
+ since = "1.9.0")]
+ #[allow(deprecated)]
pub fn char_range_at(&self, start: usize) -> CharRange {
core_str::StrExt::char_range_at(self, start)
}
///
/// ```
/// #![feature(str_char)]
+ /// #![allow(deprecated)]
///
/// use std::str::CharRange;
///
eventually removed altogether",
issue = "27754")]
#[inline]
+ #[rustc_deprecated(reason = "use slicing plus chars().rev() plus len_utf8",
+ since = "1.9.0")]
+ #[allow(deprecated)]
pub fn char_range_at_reverse(&self, start: usize) -> CharRange {
core_str::StrExt::char_range_at_reverse(self, start)
}
///
/// ```
/// #![feature(str_char)]
+ /// #![allow(deprecated)]
///
/// let s = "abπc";
/// assert_eq!(s.char_at(1), 'b');
subslice",
issue = "27754")]
#[inline]
+ #[allow(deprecated)]
+ #[rustc_deprecated(reason = "use slicing plus chars()",
+ since = "1.9.0")]
pub fn char_at(&self, i: usize) -> char {
core_str::StrExt::char_at(self, i)
}
///
/// ```
/// #![feature(str_char)]
+ /// #![allow(deprecated)]
///
/// let s = "abπc";
/// assert_eq!(s.char_at_reverse(1), 'a');
cases generate panics",
issue = "27754")]
#[inline]
+ #[rustc_deprecated(reason = "use slicing plus chars().rev()",
+ since = "1.9.0")]
+ #[allow(deprecated)]
pub fn char_at_reverse(&self, i: usize) -> char {
core_str::StrExt::char_at_reverse(self, i)
}
///
/// ```
/// #![feature(str_char)]
+ /// #![allow(deprecated)]
///
/// let s = "Łódź"; // \u{141}o\u{301}dz\u{301}
/// let (c, s1) = s.slice_shift_char().unwrap();
and/or char_indices iterators",
issue = "27754")]
#[inline]
+ #[rustc_deprecated(reason = "use chars() plus Chars::as_str",
+ since = "1.9.0")]
+ #[allow(deprecated)]
pub fn slice_shift_char(&self) -> Option<(char, &str)> {
core_str::StrExt::slice_shift_char(self)
}
#[inline]
#[stable(feature = "rust1", since = "1.0.0")]
pub fn pop(&mut self) -> Option<char> {
- let len = self.len();
- if len == 0 {
- return None;
- }
-
- let ch = self.char_at_reverse(len);
+ let ch = match self.chars().rev().next() {
+ Some(ch) => ch,
+ None => return None,
+ };
+ let newlen = self.len() - ch.len_utf8();
unsafe {
- self.vec.set_len(len - ch.len_utf8());
+ self.vec.set_len(newlen);
}
Some(ch)
}
#[inline]
#[stable(feature = "rust1", since = "1.0.0")]
pub fn remove(&mut self, idx: usize) -> char {
- let len = self.len();
- assert!(idx < len);
+ let ch = match self[idx..].chars().next() {
+ Some(ch) => ch,
+ None => panic!("cannot remove a char from the end of a string"),
+ };
- let ch = self.char_at(idx);
let next = idx + ch.len_utf8();
+ let len = self.len();
unsafe {
ptr::copy(self.vec.as_ptr().offset(next as isize),
self.vec.as_mut_ptr().offset(idx as isize),
#![deny(warnings)]
-#![feature(ascii)]
#![feature(binary_heap_extras)]
#![feature(box_syntax)]
#![feature(btree_range)]
#![feature(collections)]
#![feature(collections_bound)]
-#![feature(copy_from_slice)]
#![feature(const_fn)]
#![feature(fn_traits)]
#![feature(enumset)]
#![feature(map_values_mut)]
#![feature(pattern)]
#![feature(rand)]
-#![feature(set_recovery)]
#![feature(step_by)]
#![feature(str_char)]
#![feature(str_escape)]
}
#[test]
+#[allow(deprecated)]
fn test_slice_shift_char() {
let data = "ประเทศไทย中";
assert_eq!(data.slice_shift_char(), Some(('ป', "ระเทศไทย中")));
}
#[test]
+#[allow(deprecated)]
fn test_slice_shift_char_2() {
let empty = "";
assert_eq!(empty.slice_shift_char(), None);
}
#[test]
+#[allow(deprecated)]
fn test_char_at() {
let s = "ศไทย中华Việt Nam";
let v = vec!['ศ','ไ','ท','ย','中','华','V','i','ệ','t',' ','N','a','m'];
}
#[test]
+#[allow(deprecated)]
fn test_char_at_reverse() {
let s = "ศไทย中华Việt Nam";
let v = vec!['ศ','ไ','ท','ย','中','华','V','i','ệ','t',' ','N','a','m'];
}
#[test]
+#[allow(deprecated)]
fn test_char_range_at() {
let data = "b¢€𤭢𤭢€¢b";
assert_eq!('b', data.char_range_at(0).ch);
}
#[test]
+#[allow(deprecated)]
fn test_char_range_at_reverse_underflow() {
assert_eq!("abc".char_range_at_reverse(0).next, 0);
}
pub fn borrow(&self) -> Ref<T> {
match BorrowRef::new(&self.borrow) {
Some(b) => Ref {
- _value: unsafe { &*self.value.get() },
- _borrow: b,
+ value: unsafe { &*self.value.get() },
+ borrow: b,
},
None => panic!("RefCell<T> already mutably borrowed"),
}
pub fn borrow_mut(&self) -> RefMut<T> {
match BorrowRefMut::new(&self.borrow) {
Some(b) => RefMut {
- _value: unsafe { &mut *self.value.get() },
- _borrow: b,
+ value: unsafe { &mut *self.value.get() },
+ borrow: b,
},
None => panic!("RefCell<T> already borrowed"),
}
impl<T: ?Sized + Eq> Eq for RefCell<T> {}
struct BorrowRef<'b> {
- _borrow: &'b Cell<BorrowFlag>,
+ borrow: &'b Cell<BorrowFlag>,
}
impl<'b> BorrowRef<'b> {
WRITING => None,
b => {
borrow.set(b + 1);
- Some(BorrowRef { _borrow: borrow })
+ Some(BorrowRef { borrow: borrow })
},
}
}
impl<'b> Drop for BorrowRef<'b> {
#[inline]
fn drop(&mut self) {
- let borrow = self._borrow.get();
+ let borrow = self.borrow.get();
debug_assert!(borrow != WRITING && borrow != UNUSED);
- self._borrow.set(borrow - 1);
+ self.borrow.set(borrow - 1);
}
}
fn clone(&self) -> BorrowRef<'b> {
// Since this Ref exists, we know the borrow flag
// is not set to WRITING.
- let borrow = self._borrow.get();
+ let borrow = self.borrow.get();
debug_assert!(borrow != WRITING && borrow != UNUSED);
- self._borrow.set(borrow + 1);
- BorrowRef { _borrow: self._borrow }
+ self.borrow.set(borrow + 1);
+ BorrowRef { borrow: self.borrow }
}
}
/// See the [module-level documentation](index.html) for more.
#[stable(feature = "rust1", since = "1.0.0")]
pub struct Ref<'b, T: ?Sized + 'b> {
- // FIXME #12808: strange name to try to avoid interfering with
- // field accesses of the contained type via Deref
- _value: &'b T,
- _borrow: BorrowRef<'b>,
+ value: &'b T,
+ borrow: BorrowRef<'b>,
}
#[stable(feature = "rust1", since = "1.0.0")]
#[inline]
fn deref(&self) -> &T {
- self._value
+ self.value
}
}
#[inline]
pub fn clone(orig: &Ref<'b, T>) -> Ref<'b, T> {
Ref {
- _value: orig._value,
- _borrow: orig._borrow.clone(),
+ value: orig.value,
+ borrow: orig.borrow.clone(),
}
}
where F: FnOnce(&T) -> &U
{
Ref {
- _value: f(orig._value),
- _borrow: orig._borrow,
+ value: f(orig.value),
+ borrow: orig.borrow,
}
}
pub fn filter_map<U: ?Sized, F>(orig: Ref<'b, T>, f: F) -> Option<Ref<'b, U>>
where F: FnOnce(&T) -> Option<&U>
{
- f(orig._value).map(move |new| Ref {
- _value: new,
- _borrow: orig._borrow,
+ f(orig.value).map(move |new| Ref {
+ value: new,
+ borrow: orig.borrow,
})
}
}
where F: FnOnce(&mut T) -> &mut U
{
RefMut {
- _value: f(orig._value),
- _borrow: orig._borrow,
+ value: f(orig.value),
+ borrow: orig.borrow,
}
}
pub fn filter_map<U: ?Sized, F>(orig: RefMut<'b, T>, f: F) -> Option<RefMut<'b, U>>
where F: FnOnce(&mut T) -> Option<&mut U>
{
- let RefMut { _value, _borrow } = orig;
- f(_value).map(move |new| RefMut {
- _value: new,
- _borrow: _borrow,
+ let RefMut { value, borrow } = orig;
+ f(value).map(move |new| RefMut {
+ value: new,
+ borrow: borrow,
})
}
}
struct BorrowRefMut<'b> {
- _borrow: &'b Cell<BorrowFlag>,
+ borrow: &'b Cell<BorrowFlag>,
}
impl<'b> Drop for BorrowRefMut<'b> {
#[inline]
fn drop(&mut self) {
- let borrow = self._borrow.get();
+ let borrow = self.borrow.get();
debug_assert!(borrow == WRITING);
- self._borrow.set(UNUSED);
+ self.borrow.set(UNUSED);
}
}
match borrow.get() {
UNUSED => {
borrow.set(WRITING);
- Some(BorrowRefMut { _borrow: borrow })
+ Some(BorrowRefMut { borrow: borrow })
},
_ => None,
}
/// See the [module-level documentation](index.html) for more.
#[stable(feature = "rust1", since = "1.0.0")]
pub struct RefMut<'b, T: ?Sized + 'b> {
- // FIXME #12808: strange name to try to avoid interfering with
- // field accesses of the contained type via Deref
- _value: &'b mut T,
- _borrow: BorrowRefMut<'b>,
+ value: &'b mut T,
+ borrow: BorrowRefMut<'b>,
}
#[stable(feature = "rust1", since = "1.0.0")]
#[inline]
fn deref(&self) -> &T {
- self._value
+ self.value
}
}
impl<'b, T: ?Sized> DerefMut for RefMut<'b, T> {
#[inline]
fn deref_mut(&mut self) -> &mut T {
- self._value
+ self.value
}
}
/// Basic usage:
///
/// ```
- /// assert_eq!(u32::from_str_radix("A", 16), Ok(10));
+ /// assert_eq!(i32::from_str_radix("A", 16), Ok(10));
/// ```
#[stable(feature = "rust1", since = "1.0.0")]
pub fn from_str_radix(src: &str, radix: u32) -> Result<Self, ParseIntError> {
/// Basic usage:
///
/// ```
- /// let n = 0b01001100u8;
+ /// let n = -0b1000_0000i8;
///
- /// assert_eq!(n.count_ones(), 3);
+ /// assert_eq!(n.count_ones(), 1);
/// ```
#[stable(feature = "rust1", since = "1.0.0")]
#[inline]
/// Basic usage:
///
/// ```
- /// let n = 0b01001100u8;
+ /// let n = -0b1000_0000i8;
///
- /// assert_eq!(n.count_zeros(), 5);
+ /// assert_eq!(n.count_zeros(), 7);
/// ```
#[stable(feature = "rust1", since = "1.0.0")]
#[inline]
/// Basic usage:
///
/// ```
- /// let n = 0b0101000u16;
+ /// let n = -1i16;
///
- /// assert_eq!(n.leading_zeros(), 10);
+ /// assert_eq!(n.leading_zeros(), 0);
/// ```
#[stable(feature = "rust1", since = "1.0.0")]
#[inline]
/// Basic usage:
///
/// ```
- /// let n = 0b0101000u16;
+ /// let n = -4i8;
///
- /// assert_eq!(n.trailing_zeros(), 3);
+ /// assert_eq!(n.trailing_zeros(), 2);
/// ```
#[stable(feature = "rust1", since = "1.0.0")]
#[inline]
/// Basic usage:
///
/// ```
- /// let n = 0x0123456789ABCDEFu64;
- /// let m = 0x3456789ABCDEF012u64;
+ /// let n = 0x0123456789ABCDEFi64;
+ /// let m = -0x76543210FEDCBA99i64;
///
- /// assert_eq!(n.rotate_left(12), m);
+ /// assert_eq!(n.rotate_left(32), m);
/// ```
#[stable(feature = "rust1", since = "1.0.0")]
#[inline]
/// Basic usage:
///
/// ```
- /// let n = 0x0123456789ABCDEFu64;
- /// let m = 0xDEF0123456789ABCu64;
+ /// let n = 0x0123456789ABCDEFi64;
+ /// let m = -0xFEDCBA987654322i64;
///
- /// assert_eq!(n.rotate_right(12), m);
+ /// assert_eq!(n.rotate_right(4), m);
/// ```
#[stable(feature = "rust1", since = "1.0.0")]
#[inline]
/// Basic usage:
///
/// ```
- /// let n = 0x0123456789ABCDEFu64;
- /// let m = 0xEFCDAB8967452301u64;
+ /// let n = 0x0123456789ABCDEFi64;
+ /// let m = -0x1032547698BADCFFi64;
///
/// assert_eq!(n.swap_bytes(), m);
/// ```
/// Basic usage:
///
/// ```
- /// let n = 0x0123456789ABCDEFu64;
+ /// let n = 0x0123456789ABCDEFi64;
///
/// if cfg!(target_endian = "big") {
- /// assert_eq!(u64::from_be(n), n)
+ /// assert_eq!(i64::from_be(n), n)
/// } else {
- /// assert_eq!(u64::from_be(n), n.swap_bytes())
+ /// assert_eq!(i64::from_be(n), n.swap_bytes())
/// }
/// ```
#[stable(feature = "rust1", since = "1.0.0")]
/// Basic usage:
///
/// ```
- /// let n = 0x0123456789ABCDEFu64;
+ /// let n = 0x0123456789ABCDEFi64;
///
/// if cfg!(target_endian = "little") {
- /// assert_eq!(u64::from_le(n), n)
+ /// assert_eq!(i64::from_le(n), n)
/// } else {
- /// assert_eq!(u64::from_le(n), n.swap_bytes())
+ /// assert_eq!(i64::from_le(n), n.swap_bytes())
/// }
/// ```
#[stable(feature = "rust1", since = "1.0.0")]
/// Basic usage:
///
/// ```
- /// let n = 0x0123456789ABCDEFu64;
+ /// let n = 0x0123456789ABCDEFi64;
///
/// if cfg!(target_endian = "big") {
/// assert_eq!(n.to_be(), n)
/// Basic usage:
///
/// ```
- /// let n = 0x0123456789ABCDEFu64;
+ /// let n = 0x0123456789ABCDEFi64;
///
/// if cfg!(target_endian = "little") {
/// assert_eq!(n.to_le(), n)
/// Basic usage:
///
/// ```
- /// assert_eq!(5u16.checked_add(65530), Some(65535));
- /// assert_eq!(6u16.checked_add(65530), None);
+ /// assert_eq!(7i16.checked_add(32760), Some(32767));
+ /// assert_eq!(8i16.checked_add(32760), None);
/// ```
#[stable(feature = "rust1", since = "1.0.0")]
#[inline]
/// Basic usage:
///
/// ```
- /// assert_eq!(5u8.checked_mul(51), Some(255));
- /// assert_eq!(5u8.checked_mul(52), None);
+ /// assert_eq!(6i8.checked_mul(21), Some(126));
+ /// assert_eq!(6i8.checked_mul(22), None);
/// ```
#[stable(feature = "rust1", since = "1.0.0")]
#[inline]
/// Basic usage:
///
/// ```
- /// assert_eq!(1u8.wrapping_shl(7), 128);
- /// assert_eq!(1u8.wrapping_shl(8), 1);
+ /// assert_eq!((-1i8).wrapping_shl(7), -128);
+ /// assert_eq!((-1i8).wrapping_shl(8), -1);
/// ```
#[stable(feature = "num_wrapping", since = "1.2.0")]
#[inline(always)]
/// Basic usage:
///
/// ```
- /// assert_eq!(128u8.wrapping_shr(7), 1);
- /// assert_eq!(128u8.wrapping_shr(8), 128);
+ /// assert_eq!((-128i8).wrapping_shr(7), -1);
+ /// assert_eq!((-128i8).wrapping_shr(8), -128);
/// ```
#[stable(feature = "num_wrapping", since = "1.2.0")]
#[inline(always)]
///
/// Leading and trailing whitespace represent an error.
///
- /// # Arguments
- ///
- /// * src - A string slice
- /// * radix - The base to use. Must lie in the range [2 .. 36]
+ /// # Examples
///
- /// # Return value
+ /// Basic usage:
///
- /// `Err(ParseIntError)` if the string did not represent a valid number.
- /// Otherwise, `Ok(n)` where `n` is the integer represented by `src`.
+ /// ```
+ /// assert_eq!(u32::from_str_radix("A", 16), Ok(10));
+ /// ```
#[stable(feature = "rust1", since = "1.0.0")]
pub fn from_str_radix(src: &str, radix: u32) -> Result<Self, ParseIntError> {
from_str_radix(src, radix)
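A short sketch of the documented behavior, including the whitespace rule stated above:

```rust
fn main() {
    // from_str_radix parses digits in the given base (2..=36); invalid
    // input, including leading or trailing whitespace, yields an Err.
    assert_eq!(u32::from_str_radix("ff", 16), Ok(255));
    assert_eq!(u32::from_str_radix("101", 2), Ok(5));
    assert!(u32::from_str_radix(" 42", 10).is_err());
}
```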
/// Basic usage:
///
/// ```
- /// assert_eq!(100i8.wrapping_rem(10), 0);
+ /// assert_eq!(100u8.wrapping_rem(10), 0);
/// ```
#[stable(feature = "num_wrapping", since = "1.2.0")]
#[inline(always)]
/// where `mask` removes any high-order bits of `rhs` that
/// would cause the shift to exceed the bitwidth of the type.
///
+ /// Note that this is *not* the same as a rotate-left; the
+ /// RHS of a wrapping shift-left is restricted to the range
+ /// of the type, rather than the bits shifted out of the LHS
+ /// being returned to the other end. The primitive integer
+ /// types all implement a `rotate_left` function, which may
+ /// be what you want instead.
+ ///
/// # Examples
///
/// Basic usage:
/// where `mask` removes any high-order bits of `rhs` that
/// would cause the shift to exceed the bitwidth of the type.
///
+ /// Note that this is *not* the same as a rotate-right; the
+ /// RHS of a wrapping shift-right is restricted to the range
+ /// of the type, rather than the bits shifted out of the LHS
+ /// being returned to the other end. The primitive integer
+ /// types all implement a `rotate_right` function, which may
+ /// be what you want instead.
+ ///
/// # Examples
///
/// Basic usage:
///
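The wrapping-shift vs. rotate distinction the new doc notes draw can be seen side by side; a small sketch:

```rust
fn main() {
    // wrapping_shl masks the shift amount to the type's bit width; the
    // bits shifted out of the LHS are simply lost.
    assert_eq!(0b1000_0001u8.wrapping_shl(1), 0b0000_0010);
    // rotate_left instead carries those bits back in at the other end.
    assert_eq!(0b1000_0001u8.rotate_left(1), 0b0000_0011);
    // Shifting by the full bit width wraps around to a shift of zero.
    assert_eq!(1u8.wrapping_shl(8), 1);
}
```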
/// Volatile operations are intended to act on I/O memory, and are guaranteed
/// to not be elided or reordered by the compiler across other volatile
-/// operations. See the LLVM documentation on [[volatile]].
+/// operations.
///
-/// [volatile]: http://llvm.org/docs/LangRef.html#volatile-memory-accesses
+/// # Notes
+///
+/// Rust does not currently have a rigorously and formally defined memory model,
+/// so the precise semantics of what "volatile" means here is subject to change
+/// over time. That being said, the semantics will almost always end up pretty
+/// similar to [C11's definition of volatile][c11].
+///
+/// [c11]: http://www.open-std.org/jtc1/sc22/wg14/www/docs/n1570.pdf
///
/// # Safety
///
/// `zero_memory`, or `copy_memory`). Note that `*src = foo` counts as a use
/// because it will attempt to drop the value previously at `*src`.
#[inline]
-#[unstable(feature = "volatile", reason = "recently added", issue = "31756")]
+#[stable(feature = "volatile", since = "1.9.0")]
pub unsafe fn read_volatile<T>(src: *const T) -> T {
intrinsics::volatile_load(src)
}
///
/// Volatile operations are intended to act on I/O memory, and are guaranteed
/// to not be elided or reordered by the compiler across other volatile
-/// operations. See the LLVM documentation on [[volatile]].
+/// operations.
+///
+/// # Notes
///
-/// [volatile]: http://llvm.org/docs/LangRef.html#volatile-memory-accesses
+/// Rust does not currently have a rigorously and formally defined memory model,
+/// so the precise semantics of what "volatile" means here is subject to change
+/// over time. That being said, the semantics will almost always end up pretty
+/// similar to [C11's definition of volatile][c11].
+///
+/// [c11]: http://www.open-std.org/jtc1/sc22/wg14/www/docs/n1570.pdf
///
/// # Safety
///
/// This is appropriate for initializing uninitialized memory, or overwriting
/// memory that has previously been `read` from.
#[inline]
-#[unstable(feature = "volatile", reason = "recently added", issue = "31756")]
+#[stable(feature = "volatile", since = "1.9.0")]
pub unsafe fn write_volatile<T>(dst: *mut T, src: T) {
intrinsics::volatile_store(dst, src);
}
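A minimal sketch of the newly stabilized pair in use (ordinary memory here, standing in for the I/O memory the docs describe):

```rust
use std::ptr;

fn main() {
    let mut x = 0u32;
    unsafe {
        // Volatile accesses go through the pointer and are not elided or
        // reordered relative to other volatile operations.
        ptr::write_volatile(&mut x, 42);
        assert_eq!(ptr::read_volatile(&x), 42);
    }
}
```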
/// operation because the returned value could be pointing to invalid
/// memory.
///
+ /// Additionally, the lifetime `'a` returned is arbitrarily chosen and does
+ /// not necessarily reflect the actual lifetime of the data.
+ ///
/// # Examples
///
/// Basic usage:
/// }
/// }
/// ```
- #[unstable(feature = "ptr_as_ref",
- reason = "Option is not clearly the right return type, and we \
- may want to tie the return lifetime to a borrow of \
- the raw pointer",
- issue = "27780")]
+ #[stable(feature = "ptr_as_ref", since = "1.9.0")]
#[inline]
- pub unsafe fn as_ref<'a>(&self) -> Option<&'a T> where T: Sized {
+ pub unsafe fn as_ref<'a>(self) -> Option<&'a T> where T: Sized {
if self.is_null() {
None
} else {
- Some(&**self)
+ Some(&*self)
}
}
/// operation because the returned value could be pointing to invalid
/// memory.
///
+ /// Additionally, the lifetime `'a` returned is arbitrarily chosen and does
+ /// not necessarily reflect the actual lifetime of the data.
+ ///
/// # Examples
///
/// Basic usage:
/// }
/// }
/// ```
- #[unstable(feature = "ptr_as_ref",
- reason = "Option is not clearly the right return type, and we \
- may want to tie the return lifetime to a borrow of \
- the raw pointer",
- issue = "27780")]
+ #[stable(feature = "ptr_as_ref", since = "1.9.0")]
#[inline]
- pub unsafe fn as_ref<'a>(&self) -> Option<&'a T> where T: Sized {
+ pub unsafe fn as_ref<'a>(self) -> Option<&'a T> where T: Sized {
if self.is_null() {
None
} else {
- Some(&**self)
+ Some(&*self)
}
}
/// # Safety
///
/// As with `as_ref`, this is unsafe because it cannot verify the validity
- /// of the returned pointer.
+ /// of the returned pointer, nor can it ensure that the lifetime `'a`
+ /// returned is indeed a valid lifetime for the contained data.
///
/// # Examples
///
/// let mut s = [1, 2, 3];
/// let ptr: *mut u32 = s.as_mut_ptr();
/// ```
- #[unstable(feature = "ptr_as_ref",
- reason = "return value does not necessarily convey all possible \
- information",
- issue = "27780")]
+ #[stable(feature = "ptr_as_ref", since = "1.9.0")]
#[inline]
- pub unsafe fn as_mut<'a>(&self) -> Option<&'a mut T> where T: Sized {
+ pub unsafe fn as_mut<'a>(self) -> Option<&'a mut T> where T: Sized {
if self.is_null() {
None
} else {
- Some(&mut **self)
+ Some(&mut *self)
}
}
}
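The stabilized `as_ref`/`as_mut` turn a null check plus dereference into one call; a small sketch of the pattern:

```rust
fn main() {
    let mut v = 7i32;
    let p: *mut i32 = &mut v;
    let null: *const i32 = std::ptr::null();
    unsafe {
        // as_ref/as_mut return None for null pointers; for non-null
        // pointers the caller must guarantee validity, and the returned
        // lifetime 'a is arbitrary, as the new docs warn.
        assert_eq!(null.as_ref(), None);
        if let Some(r) = p.as_mut() {
            *r += 1;
        }
    }
    assert_eq!(v, 8);
}
```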
/// ```
#[repr(C)]
#[allow(missing_debug_implementations)]
+#[rustc_deprecated(reason = "use raw accessors/constructors in `slice` module",
+ since = "1.9.0")]
+#[unstable(feature = "raw", issue = "27751")]
pub struct Slice<T> {
pub data: *const T,
pub len: usize,
}
+#[allow(deprecated)]
impl<T> Copy for Slice<T> {}
+#[allow(deprecated)]
impl<T> Clone for Slice<T> {
fn clone(&self) -> Slice<T> { *self }
}
/// This trait is meant to map equivalences between raw structs and their
/// corresponding Rust values.
+#[rustc_deprecated(reason = "use raw accessors/constructors in `slice` module",
+ since = "1.9.0")]
+#[unstable(feature = "raw", issue = "27751")]
pub unsafe trait Repr<T> {
/// This function "unwraps" a Rust value (without consuming it) into its raw
/// struct representation. This can be used to read/write different values
fn repr(&self) -> T { unsafe { mem::transmute_copy(&self) } }
}
+#[allow(deprecated)]
unsafe impl<T> Repr<Slice<T>> for [T] {}
+#[allow(deprecated)]
unsafe impl Repr<Slice<u8>> for str {}
//! # #[allow(dead_code)]
//! enum Result<T, E> {
//! Ok(T),
-//! Err(E)
+//! Err(E),
//! }
//! ```
//!
//! None => Err("invalid header length"),
//! Some(&1) => Ok(Version::Version1),
//! Some(&2) => Ok(Version::Version2),
-//! Some(_) => Err("invalid version")
+//! Some(_) => Err("invalid version"),
//! }
//! }
//!
/// Contains the error value
#[stable(feature = "rust1", since = "1.0.0")]
- Err(#[stable(feature = "rust1", since = "1.0.0")] E)
+ Err(#[stable(feature = "rust1", since = "1.0.0")] E),
}
/////////////////////////////////////////////////////////////////////////////
///
/// # Examples
///
+ /// Basic usage:
+ ///
/// ```
/// let x: Result<i32, &str> = Ok(-3);
/// assert_eq!(x.is_ok(), true);
///
/// # Examples
///
+ /// Basic usage:
+ ///
/// ```
/// let x: Result<i32, &str> = Ok(-3);
/// assert_eq!(x.is_err(), false);
///
/// # Examples
///
+ /// Basic usage:
+ ///
/// ```
/// let x: Result<u32, &str> = Ok(2);
/// assert_eq!(x.ok(), Some(2));
///
/// # Examples
///
+ /// Basic usage:
+ ///
/// ```
/// let x: Result<u32, &str> = Ok(2);
/// assert_eq!(x.err(), None);
/// Produces a new `Result`, containing a reference
/// into the original, leaving the original in place.
///
+ /// # Examples
+ ///
+ /// Basic usage:
+ ///
/// ```
/// let x: Result<u32, &str> = Ok(2);
/// assert_eq!(x.as_ref(), Ok(&2));
/// Converts from `Result<T, E>` to `Result<&mut T, &mut E>`
///
+ /// # Examples
+ ///
+ /// Basic usage:
+ ///
/// ```
/// fn mutate(r: &mut Result<i32, i32>) {
/// match r.as_mut() {
///
/// # Examples
///
+ /// Basic usage:
+ ///
/// ```
/// fn stringify(x: u32) -> String { format!("error code: {}", x) }
///
///
/// # Examples
///
+ /// Basic usage:
+ ///
/// ```
/// let x: Result<u32, &str> = Ok(7);
/// assert_eq!(x.iter().next(), Some(&7));
///
/// # Examples
///
+ /// Basic usage:
+ ///
/// ```
/// let mut x: Result<u32, &str> = Ok(7);
/// match x.iter_mut().next() {
///
/// # Examples
///
+ /// Basic usage:
+ ///
/// ```
/// let x: Result<u32, &str> = Ok(2);
/// let y: Result<&str, &str> = Err("late error");
///
/// # Examples
///
+ /// Basic usage:
+ ///
/// ```
/// fn sq(x: u32) -> Result<u32, u32> { Ok(x * x) }
/// fn err(x: u32) -> Result<u32, u32> { Err(x) }
///
/// # Examples
///
+ /// Basic usage:
+ ///
/// ```
/// let x: Result<u32, &str> = Ok(2);
/// let y: Result<u32, &str> = Err("late error");
///
/// # Examples
///
+ /// Basic usage:
+ ///
/// ```
/// fn sq(x: u32) -> Result<u32, u32> { Ok(x * x) }
/// fn err(x: u32) -> Result<u32, u32> { Err(x) }
///
/// # Examples
///
+ /// Basic usage:
+ ///
/// ```
/// let optb = 2;
/// let x: Result<u32, &str> = Ok(9);
///
/// # Examples
///
+ /// Basic usage:
+ ///
/// ```
/// fn count(x: &str) -> usize { x.len() }
///
///
/// # Examples
///
+ /// Basic usage:
+ ///
/// ```
/// let x: Result<u32, &str> = Ok(2);
/// assert_eq!(x.unwrap(), 2);
/// passed message, and the content of the `Err`.
///
/// # Examples
+ ///
+ /// Basic usage:
+ ///
/// ```{.should_panic}
/// let x: Result<u32, &str> = Err("emergency failure");
/// x.expect("Testing expect"); // panics with `Testing expect: emergency failure`
///
/// # Examples
///
+ /// Basic usage:
+ ///
/// ```
/// let x: Result<u32, &str> = Ok(5);
/// let v: Vec<u32> = x.into_iter().collect();
use ptr;
use mem;
use marker::{Copy, Send, Sync, self};
-use raw::Repr;
-// Avoid conflicts with *both* the Slice trait (buggy) and the `slice::raw` module.
-use raw::Slice as RawSlice;
+#[repr(C)]
+struct Repr<T> {
+ pub data: *const T,
+ pub len: usize,
+}
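The private `Repr` struct above mirrors the in-memory layout of a slice; a sketch of that fact using the safe-to-call-from-unsafe constructor rather than `transmute`:

```rust
use std::slice;

fn main() {
    let v = vec![10, 20, 30];
    // A &[T] is represented as a (data pointer, length) pair -- the
    // layout Repr spells out -- and from_raw_parts reassembles one.
    // v must outlive the borrow s.
    let s: &[i32] = unsafe { slice::from_raw_parts(v.as_ptr(), v.len()) };
    assert_eq!(s, &[10, 20, 30][..]);
}
```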
//
// Extension traits
fn ends_with(&self, needle: &[Self::Item]) -> bool where Self::Item: PartialEq;
#[stable(feature = "clone_from_slice", since = "1.7.0")]
- fn clone_from_slice(&mut self, &[Self::Item]) where Self::Item: Clone;
- #[unstable(feature = "copy_from_slice", issue = "31755")]
+ fn clone_from_slice(&mut self, src: &[Self::Item]) where Self::Item: Clone;
+ #[stable(feature = "copy_from_slice", since = "1.9.0")]
fn copy_from_slice(&mut self, src: &[Self::Item]) where Self::Item: Copy;
}
}
#[inline]
- fn len(&self) -> usize { self.repr().len }
+ fn len(&self) -> usize {
+ unsafe {
+ mem::transmute::<&[T], Repr<T>>(self).len
+ }
+ }
#[inline]
fn get_mut(&mut self, index: usize) -> Option<&mut T> {
}
/// Immutable slice iterator
+///
+/// # Examples
+///
+/// Basic usage:
+///
+/// ```
+/// // First, we declare a slice (a `&[i32]` here) whose `iter` method returns the `Iter` struct:
+/// let slice = &[1, 2, 3];
+///
+/// // Then, we iterate over it:
+/// for element in slice.iter() {
+/// println!("{}", element);
+/// }
+/// ```
#[stable(feature = "rust1", since = "1.0.0")]
pub struct Iter<'a, T: 'a> {
ptr: *const T,
///
/// This has the same lifetime as the original slice, and so the
/// iterator can continue to be used while this exists.
+ ///
+ /// # Examples
+ ///
+ /// Basic usage:
+ ///
+ /// ```
+    /// // First, we declare a slice (a `&[i32]` here) whose `iter` method
+    /// // returns the `Iter` struct:
+ /// let slice = &[1, 2, 3];
+ ///
+ /// // Then, we get the iterator:
+ /// let mut iter = slice.iter();
+    /// // If we print what the `as_slice` method returns here, we get "[1, 2, 3]":
+ /// println!("{:?}", iter.as_slice());
+ ///
+ /// // Next, we move to the second element of the slice:
+ /// iter.next();
+ /// // Now `as_slice` returns "[2, 3]":
+ /// println!("{:?}", iter.as_slice());
+ /// ```
#[stable(feature = "iter_to_slice", since = "1.4.0")]
pub fn as_slice(&self) -> &'a [T] {
make_slice!(self.ptr, self.end)
}
/// Mutable slice iterator.
+///
+/// # Examples
+///
+/// Basic usage:
+///
+/// ```
+/// // First, we declare a mutable slice (a `&mut [i32]` here) whose `iter_mut`
+/// // method returns the `IterMut` struct:
+/// let mut slice = &mut [1, 2, 3];
+///
+/// // Then, we iterate over it and increment each element value:
+/// for element in slice.iter_mut() {
+/// *element += 1;
+/// }
+///
+/// // We now have "[2, 3, 4]":
+/// println!("{:?}", slice);
+/// ```
#[stable(feature = "rust1", since = "1.0.0")]
pub struct IterMut<'a, T: 'a> {
ptr: *mut T,
/// to consume the iterator. Consider using the `Slice` and
/// `SliceMut` implementations for obtaining slices with more
/// restricted lifetimes that do not consume the iterator.
+ ///
+ /// # Examples
+ ///
+ /// Basic usage:
+ ///
+ /// ```
+    /// // First, we declare a mutable slice (a `&mut [i32]` here) whose
+    /// // `iter_mut` method returns the `IterMut` struct:
+ /// let mut slice = &mut [1, 2, 3];
+ ///
+ /// {
+ /// // Then, we get the iterator:
+ /// let mut iter = slice.iter_mut();
+    /// // We advance to the next element:
+ /// iter.next();
+    /// // If we print what the `into_slice` method returns here, we get "[2, 3]":
+ /// println!("{:?}", iter.into_slice());
+ /// }
+ ///
+ /// // Now let's modify a value of the slice:
+ /// {
+ /// // First we get back the iterator:
+ /// let mut iter = slice.iter_mut();
+ /// // We change the value of the first element of the slice returned by the `next` method:
+ /// *iter.next().unwrap() += 1;
+ /// }
+    /// // The slice is now "[2, 2, 3]":
+ /// println!("{:?}", slice);
+ /// ```
#[stable(feature = "iter_to_slice", since = "1.4.0")]
pub fn into_slice(self) -> &'a mut [T] {
make_mut_slice!(self.ptr, self.end)
#[inline]
#[stable(feature = "rust1", since = "1.0.0")]
pub unsafe fn from_raw_parts<'a, T>(p: *const T, len: usize) -> &'a [T] {
- mem::transmute(RawSlice { data: p, len: len })
+ mem::transmute(Repr { data: p, len: len })
}
/// Performs the same functionality as `from_raw_parts`, except that a mutable
#[inline]
#[stable(feature = "rust1", since = "1.0.0")]
pub unsafe fn from_raw_parts_mut<'a, T>(p: *mut T, len: usize) -> &'a mut [T] {
- mem::transmute(RawSlice { data: p, len: len })
+ mem::transmute(Repr { data: p, len: len })
}
//
use mem;
use ops::{Fn, FnMut, FnOnce};
use option::Option::{self, None, Some};
-use raw::{Repr, Slice};
use result::Result::{self, Ok, Err};
use slice::{self, SliceExt};
#[stable(feature = "core", since = "1.6.0")]
fn trim_right_matches<'a, P: Pattern<'a>>(&'a self, pat: P) -> &'a str
where P::Searcher: ReverseSearcher<'a>;
- #[unstable(feature = "str_char",
- reason = "it is unclear whether this method pulls its weight \
- with the existence of the char_indices iterator or \
- this method may want to be replaced with checked \
- slicing",
- issue = "27754")]
+ #[stable(feature = "is_char_boundary", since = "1.9.0")]
fn is_char_boundary(&self, index: usize) -> bool;
#[unstable(feature = "str_char",
reason = "often replaced by char_indices, this method may \
be removed in favor of just char_at() or eventually \
removed altogether",
issue = "27754")]
+ #[rustc_deprecated(reason = "use slicing plus chars() plus len_utf8",
+ since = "1.9.0")]
fn char_range_at(&self, start: usize) -> CharRange;
#[unstable(feature = "str_char",
reason = "often replaced by char_indices, this method may \
be removed in favor of just char_at_reverse() or \
eventually removed altogether",
issue = "27754")]
+ #[rustc_deprecated(reason = "use slicing plus chars().rev() plus len_utf8",
+ since = "1.9.0")]
fn char_range_at_reverse(&self, start: usize) -> CharRange;
#[unstable(feature = "str_char",
reason = "frequently replaced by the chars() iterator, this \
iterators or by getting the first char from a \
subslice",
issue = "27754")]
+ #[rustc_deprecated(reason = "use slicing plus chars()",
+ since = "1.9.0")]
fn char_at(&self, i: usize) -> char;
#[unstable(feature = "str_char",
reason = "see char_at for more details, but reverse semantics \
are also somewhat unclear, especially with which \
cases generate panics",
issue = "27754")]
+ #[rustc_deprecated(reason = "use slicing plus chars().rev()",
+ since = "1.9.0")]
fn char_at_reverse(&self, i: usize) -> char;
#[stable(feature = "core", since = "1.6.0")]
fn as_bytes(&self) -> &[u8];
may not be warranted with the existence of the chars \
and/or char_indices iterators",
issue = "27754")]
+ #[rustc_deprecated(reason = "use chars() plus Chars::as_str",
+ since = "1.9.0")]
fn slice_shift_char(&self) -> Option<(char, &str)>;
#[stable(feature = "core", since = "1.6.0")]
fn as_ptr(&self) -> *const u8;
#[inline]
unsafe fn slice_unchecked(&self, begin: usize, end: usize) -> &str {
- mem::transmute(Slice {
- data: self.as_ptr().offset(begin as isize),
- len: end - begin,
- })
+ let ptr = self.as_ptr().offset(begin as isize);
+ let len = end - begin;
+ from_utf8_unchecked(slice::from_raw_parts(ptr, len))
}
#[inline]
unsafe fn slice_mut_unchecked(&mut self, begin: usize, end: usize) -> &mut str {
- mem::transmute(Slice {
- data: self.as_ptr().offset(begin as isize),
- len: end - begin,
- })
+ let ptr = self.as_ptr().offset(begin as isize);
+ let len = end - begin;
+ mem::transmute(slice::from_raw_parts_mut(ptr as *mut u8, len))
}
#[inline]
if index == 0 || index == self.len() { return true; }
match self.as_bytes().get(index) {
None => false,
- Some(&b) => b < 128 || b >= 192,
+ // This is bit magic equivalent to: b < 128 || b >= 192
+ Some(&b) => (b as i8) >= -0x40,
}
}
}
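The bit trick in `is_char_boundary` can be checked exhaustively; a small sketch verifying the equivalence the comment claims:

```rust
fn main() {
    // UTF-8 continuation bytes are 0b10xx_xxxx (0x80..=0xBF). Read as i8,
    // exactly those fall below -0x40, so (b as i8) >= -0x40 matches
    // b < 128 || b >= 192 for every byte value.
    for b in 0u8..=255 {
        assert_eq!((b as i8) >= -0x40, b < 128 || b >= 192);
    }
}
```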
#[inline]
+ #[allow(deprecated)]
fn char_at(&self, i: usize) -> char {
self.char_range_at(i).ch
}
#[inline]
+ #[allow(deprecated)]
fn char_at_reverse(&self, i: usize) -> char {
self.char_range_at_reverse(i).ch
}
}
#[inline]
+ #[allow(deprecated)]
fn slice_shift_char(&self) -> Option<(char, &str)> {
if self.is_empty() {
None
}
#[inline]
- fn len(&self) -> usize { self.repr().len }
+ fn len(&self) -> usize {
+ self.as_bytes().len()
+ }
#[inline]
fn is_empty(&self) -> bool { self.len() == 0 }
// option. This file may not be copied, modified, or distributed
// except according to those terms.
+use std::char;
+
#[test]
fn test_is_lowercase() {
assert!('a'.is_lowercase());
#[test]
fn test_decode_utf16() {
fn check(s: &[u16], expected: &[Result<char, u16>]) {
- assert_eq!(::std::char::decode_utf16(s.iter().cloned()).collect::<Vec<_>>(), expected);
+ let v = char::decode_utf16(s.iter().cloned())
+ .map(|r| r.map_err(|e| e.unpaired_surrogate()))
+ .collect::<Vec<_>>();
+ assert_eq!(v, expected);
}
check(&[0xD800, 0x41, 0x42], &[Err(0xD800), Ok('A'), Ok('B')]);
check(&[0xD800, 0], &[Err(0xD800), Ok('\0')]);
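The updated test reflects the API change from bare `u16` errors to `DecodeUtf16Error`; a standalone sketch of the new shape:

```rust
use std::char;

fn main() {
    // decode_utf16 yields Result<char, DecodeUtf16Error>; the offending
    // code unit is recovered via unpaired_surrogate().
    let units = [0xD834u16, 0xDD1E, 0x0041]; // a surrogate pair, then 'A'
    let decoded: Vec<Result<char, u16>> = char::decode_utf16(units.iter().cloned())
        .map(|r| r.map_err(|e| e.unpaired_surrogate()))
        .collect();
    assert_eq!(decoded, vec![Ok('\u{1D11E}'), Ok('A')]);
}
```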
#![feature(box_syntax)]
#![feature(cell_extras)]
#![feature(const_fn)]
-#![feature(copy_from_slice)]
#![feature(core_float)]
#![feature(core_private_bignum)]
#![feature(core_private_diy_float)]
#![feature(dec2flt)]
-#![feature(decode_utf16)]
#![feature(fixed_size_array)]
#![feature(float_extras)]
#![feature(flt2dec)]
#![feature(libc)]
#![feature(nonzero)]
#![feature(peekable_is_empty)]
-#![feature(ptr_as_ref)]
#![feature(rand)]
#![feature(raw)]
#![feature(slice_patterns)]
#![deny(missing_docs)]
#![feature(staged_api)]
-#![feature(str_char)]
use self::Name::*;
use self::HasArg::*;
impl Name {
fn from_str(nm: &str) -> Name {
if nm.len() == 1 {
- Short(nm.char_at(0))
+ Short(nm.chars().next().unwrap())
} else {
Long(nm.to_owned())
}
}
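The replacement for the deprecated `char_at` used here generalizes to any byte offset; a small sketch:

```rust
fn main() {
    let s = "héllo";
    // chars().next() replaces the deprecated char_at(0): it decodes the
    // first code point rather than indexing into the bytes.
    assert_eq!(s.chars().next(), Some('h'));
    // For a char at a given byte offset, slice first, then decode --
    // the cur[j..].chars().next() pattern used in the parsing loop.
    assert_eq!(s[1..].chars().next(), Some('é'));
}
```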
(1, 0) => {
Opt {
- name: Short(short_name.char_at(0)),
+ name: Short(short_name.chars().next().unwrap()),
hasarg: hasarg,
occur: occur,
aliases: Vec::new(),
hasarg: hasarg,
occur: occur,
aliases: vec![Opt {
- name: Short(short_name.char_at(0)),
+ name: Short(short_name.chars().next().unwrap()),
hasarg: hasarg,
occur: occur,
aliases: Vec::new(),
let mut j = 1;
names = Vec::new();
while j < curlen {
- let ch = cur.char_at(j);
+ let ch = cur[j..].chars().next().unwrap();
let opt = Short(ch);
// In a series of potential options (eg. -aheJ), if we
test(attr(deny(warnings))))]
#![cfg_attr(not(stage0), deny(warnings))]
-#![feature(copy_from_slice)]
#![feature(rustc_private)]
#![feature(staged_api)]
#![feature(question_mark)]
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+use std::fmt::Debug;
+
+#[derive(Copy, Clone, Debug, PartialEq, Eq, Hash, RustcEncodable, RustcDecodable)]
+pub enum DepNode<D: Clone + Debug> {
+ // The `D` type is "how definitions are identified".
+ // During compilation, it is always `DefId`, but when serializing
+ // it is mapped to `DefPath`.
+
+ // Represents the `Krate` as a whole (the `hir::Krate` value) (as
+ // distinct from the krate module). This is basically a hash of
+ // the entire krate, so if you read from `Krate` (e.g., by calling
+ // `tcx.map.krate()`), we will have to assume that any change
+ // means that you need to be recompiled. This is because the
+ // `Krate` value gives you access to all other items. To avoid
+ // this fate, do not call `tcx.map.krate()`; instead, prefer
+ // wrappers like `tcx.visit_all_items_in_krate()`. If there is no
+ // suitable wrapper, you can use `tcx.dep_graph.ignore()` to gain
+ // access to the krate, but you must remember to add suitable
+ // edges yourself for the individual items that you read.
+ Krate,
+
+ // Represents the HIR node with the given node-id
+ Hir(D),
+
+ // Represents different phases in the compiler.
+ CrateReader,
+ CollectLanguageItems,
+ CheckStaticRecursion,
+ ResolveLifetimes,
+ RegionResolveCrate,
+ CheckLoops,
+ PluginRegistrar,
+ StabilityIndex,
+ CollectItem(D),
+ Coherence,
+ EffectCheck,
+ Liveness,
+ Resolve,
+ EntryPoint,
+ CheckEntryFn,
+ CoherenceCheckImpl(D),
+ CoherenceOverlapCheck(D),
+ CoherenceOverlapCheckSpecial(D),
+ CoherenceOverlapInherentCheck(D),
+ CoherenceOrphanCheck(D),
+ Variance,
+ WfCheck(D),
+ TypeckItemType(D),
+ TypeckItemBody(D),
+ Dropck,
+ DropckImpl(D),
+ CheckConst(D),
+ Privacy,
+ IntrinsicCheck(D),
+ MatchCheck(D),
+ MirMapConstruction(D),
+ MirTypeck(D),
+ BorrowCheck(D),
+ RvalueCheck(D),
+ Reachability,
+ DeadCheck,
+ StabilityCheck,
+ LateLintCheck,
+ IntrinsicUseCheck,
+ TransCrate,
+ TransCrateItem(D),
+ TransInlinedItem(D),
+ TransWriteMetadata,
+
+ // Nodes representing bits of computed IR in the tcx. Each shared
+ // table in the tcx (or elsewhere) maps to one of these
+ // nodes. Often we map multiple tables to the same node if there
+ // is no point in distinguishing them (e.g., both the type and
+ // predicates for an item wind up in `ItemSignature`). Other
+ // times, such as `ImplItems` vs `TraitItemDefIds`, tables which
+    // might be mergeable are kept distinct because the sets of def-ids
+ // to which they apply are disjoint, and hence we might as well
+ // have distinct labels for easier debugging.
+ ImplOrTraitItems(D),
+ ItemSignature(D),
+ FieldTy(D),
+ TraitItemDefIds(D),
+ InherentImpls(D),
+ ImplItems(D),
+
+ // The set of impls for a given trait. Ultimately, it would be
+ // nice to get more fine-grained here (e.g., to include a
+ // simplified type), but we can't do that until we restructure the
+ // HIR to distinguish the *header* of an impl from its body. This
+ // is because changes to the header may change the self-type of
+ // the impl and hence would require us to be more conservative
+ // than changes in the impl body.
+ TraitImpls(D),
+
+ // Nodes representing caches. To properly handle a true cache, we
+ // don't use a DepTrackingMap, but rather we push a task node.
+ // Otherwise the write into the map would be incorrectly
+ // attributed to the first task that happened to fill the cache,
+ // which would yield an overly conservative dep-graph.
+ TraitItems(D),
+ ReprHints(D),
+ TraitSelect(D),
+}
+
+impl<D: Clone + Debug> DepNode<D> {
+ /// Used in testing
+ pub fn from_label_string(label: &str, data: D) -> Result<DepNode<D>, ()> {
+ macro_rules! check {
+ ($($name:ident,)*) => {
+ match label {
+ $(stringify!($name) => Ok(DepNode::$name(data)),)*
+ _ => Err(())
+ }
+ }
+ }
+
+ check! {
+ CollectItem,
+ BorrowCheck,
+ TransCrateItem,
+ TypeckItemType,
+ TypeckItemBody,
+ ImplOrTraitItems,
+ ItemSignature,
+ FieldTy,
+ TraitItemDefIds,
+ InherentImpls,
+ ImplItems,
+ TraitImpls,
+ ReprHints,
+ }
+ }
+
+ pub fn map_def<E, OP>(&self, mut op: OP) -> Option<DepNode<E>>
+ where OP: FnMut(&D) -> Option<E>, E: Clone + Debug
+ {
+ use self::DepNode::*;
+
+ match *self {
+ Krate => Some(Krate),
+ CrateReader => Some(CrateReader),
+ CollectLanguageItems => Some(CollectLanguageItems),
+ CheckStaticRecursion => Some(CheckStaticRecursion),
+ ResolveLifetimes => Some(ResolveLifetimes),
+ RegionResolveCrate => Some(RegionResolveCrate),
+ CheckLoops => Some(CheckLoops),
+ PluginRegistrar => Some(PluginRegistrar),
+ StabilityIndex => Some(StabilityIndex),
+ Coherence => Some(Coherence),
+ EffectCheck => Some(EffectCheck),
+ Liveness => Some(Liveness),
+ Resolve => Some(Resolve),
+ EntryPoint => Some(EntryPoint),
+ CheckEntryFn => Some(CheckEntryFn),
+ Variance => Some(Variance),
+ Dropck => Some(Dropck),
+ Privacy => Some(Privacy),
+ Reachability => Some(Reachability),
+ DeadCheck => Some(DeadCheck),
+ StabilityCheck => Some(StabilityCheck),
+ LateLintCheck => Some(LateLintCheck),
+ IntrinsicUseCheck => Some(IntrinsicUseCheck),
+ TransCrate => Some(TransCrate),
+ TransWriteMetadata => Some(TransWriteMetadata),
+ Hir(ref d) => op(d).map(Hir),
+ CollectItem(ref d) => op(d).map(CollectItem),
+ CoherenceCheckImpl(ref d) => op(d).map(CoherenceCheckImpl),
+ CoherenceOverlapCheck(ref d) => op(d).map(CoherenceOverlapCheck),
+ CoherenceOverlapCheckSpecial(ref d) => op(d).map(CoherenceOverlapCheckSpecial),
+ CoherenceOverlapInherentCheck(ref d) => op(d).map(CoherenceOverlapInherentCheck),
+ CoherenceOrphanCheck(ref d) => op(d).map(CoherenceOrphanCheck),
+ WfCheck(ref d) => op(d).map(WfCheck),
+ TypeckItemType(ref d) => op(d).map(TypeckItemType),
+ TypeckItemBody(ref d) => op(d).map(TypeckItemBody),
+ DropckImpl(ref d) => op(d).map(DropckImpl),
+ CheckConst(ref d) => op(d).map(CheckConst),
+ IntrinsicCheck(ref d) => op(d).map(IntrinsicCheck),
+ MatchCheck(ref d) => op(d).map(MatchCheck),
+ MirMapConstruction(ref d) => op(d).map(MirMapConstruction),
+ MirTypeck(ref d) => op(d).map(MirTypeck),
+ BorrowCheck(ref d) => op(d).map(BorrowCheck),
+ RvalueCheck(ref d) => op(d).map(RvalueCheck),
+ TransCrateItem(ref d) => op(d).map(TransCrateItem),
+ TransInlinedItem(ref d) => op(d).map(TransInlinedItem),
+ ImplOrTraitItems(ref d) => op(d).map(ImplOrTraitItems),
+ ItemSignature(ref d) => op(d).map(ItemSignature),
+ FieldTy(ref d) => op(d).map(FieldTy),
+ TraitItemDefIds(ref d) => op(d).map(TraitItemDefIds),
+ InherentImpls(ref d) => op(d).map(InherentImpls),
+ ImplItems(ref d) => op(d).map(ImplItems),
+ TraitImpls(ref d) => op(d).map(TraitImpls),
+ TraitItems(ref d) => op(d).map(TraitItems),
+ ReprHints(ref d) => op(d).map(ReprHints),
+ TraitSelect(ref d) => op(d).map(TraitSelect),
+ }
+ }
+}
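The `map_def` method above applies one mechanical rule per variant: data-free variants map to themselves, data-carrying variants map through `op`. A minimal standalone sketch of that pattern, using a hypothetical two-variant `Node` enum rather than the full `DepNode`:

```rust
use std::fmt::Debug;

// A sketch of the DepNode::map_def pattern: convert the definition
// identifier type D to E, yielding None if any conversion fails.
#[derive(Clone, Debug, PartialEq)]
enum Node<D: Clone + Debug> {
    Krate,   // carries no definition data
    Hir(D),  // carries a definition identifier
}

impl<D: Clone + Debug> Node<D> {
    fn map_def<E, OP>(&self, mut op: OP) -> Option<Node<E>>
        where OP: FnMut(&D) -> Option<E>, E: Clone + Debug
    {
        match *self {
            Node::Krate => Some(Node::Krate),
            Node::Hir(ref d) => op(d).map(Node::Hir),
        }
    }
}

fn main() {
    let n: Node<u32> = Node::Hir(7);
    assert_eq!(n.map_def(|&d| Some(d.to_string())), Some(Node::Hir("7".to_string())));
    assert_eq!(n.map_def(|_| None::<String>), None);
}
```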
// option. This file may not be copied, modified, or distributed
// except according to those terms.
+use hir::def_id::DefId;
use rustc_data_structures::fnv::FnvHashMap;
use std::cell::RefCell;
use std::ops::Index;
pub trait DepTrackingMapConfig {
type Key: Eq + Hash + Clone;
type Value: Clone;
- fn to_dep_node(key: &Self::Key) -> DepNode;
+ fn to_dep_node(key: &Self::Key) -> DepNode<DefId>;
}
impl<M: DepTrackingMapConfig> DepTrackingMap<M> {
// except according to those terms.
use rustc_data_structures::fnv::{FnvHashMap, FnvHashSet};
+use std::fmt::Debug;
+use std::hash::Hash;
use super::{DepGraphQuery, DepNode};
-pub struct DepGraphEdges {
- nodes: Vec<DepNode>,
- indices: FnvHashMap<DepNode, IdIndex>,
+pub struct DepGraphEdges<D: Clone + Debug + Eq + Hash> {
+ nodes: Vec<DepNode<D>>,
+ indices: FnvHashMap<DepNode<D>, IdIndex>,
edges: FnvHashSet<(IdIndex, IdIndex)>,
open_nodes: Vec<OpenNode>,
}
Ignore,
}
-impl DepGraphEdges {
- pub fn new() -> DepGraphEdges {
+impl<D: Clone + Debug + Eq + Hash> DepGraphEdges<D> {
+ pub fn new() -> DepGraphEdges<D> {
DepGraphEdges {
nodes: vec![],
indices: FnvHashMap(),
}
}
- fn id(&self, index: IdIndex) -> DepNode {
- self.nodes[index.index()]
+ fn id(&self, index: IdIndex) -> DepNode<D> {
+ self.nodes[index.index()].clone()
}
/// Creates a node for `id` in the graph.
- fn make_node(&mut self, id: DepNode) -> IdIndex {
+ fn make_node(&mut self, id: DepNode<D>) -> IdIndex {
if let Some(&i) = self.indices.get(&id) {
return i;
}
assert_eq!(popped_node, OpenNode::Ignore);
}
- pub fn push_task(&mut self, key: DepNode) {
+ pub fn push_task(&mut self, key: DepNode<D>) {
let top_node = self.current_node();
let new_node = self.make_node(key);
}
}
- pub fn pop_task(&mut self, key: DepNode) {
+ pub fn pop_task(&mut self, key: DepNode<D>) {
let popped_node = self.open_nodes.pop().unwrap();
assert_eq!(OpenNode::Node(self.indices[&key]), popped_node);
}
/// Indicates that the current task `C` reads `v` by adding an
/// edge from `v` to `C`. If there is no current task, panics. If
/// you want to suppress this edge, use `ignore`.
- pub fn read(&mut self, v: DepNode) {
+ pub fn read(&mut self, v: DepNode<D>) {
let source = self.make_node(v);
self.add_edge_from_current_node(|current| (source, current))
}
/// Indicates that the current task `C` writes `v` by adding an
/// edge from `C` to `v`. If there is no current task, panics. If
/// you want to suppress this edge, use `ignore`.
- pub fn write(&mut self, v: DepNode) {
+ pub fn write(&mut self, v: DepNode<D>) {
let target = self.make_node(v);
self.add_edge_from_current_node(|current| (current, target))
}
}
}
- pub fn query(&self) -> DepGraphQuery {
+ pub fn query(&self) -> DepGraphQuery<D> {
let edges: Vec<_> = self.edges.iter()
.map(|&(i, j)| (self.id(i), self.id(j)))
.collect();
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+use hir::def_id::DefId;
+use std::rc::Rc;
+
+use super::dep_node::DepNode;
+use super::query::DepGraphQuery;
+use super::raii;
+use super::thread::{DepGraphThreadData, DepMessage};
+
+#[derive(Clone)]
+pub struct DepGraph {
+ data: Rc<DepGraphThreadData>
+}
+
+impl DepGraph {
+ pub fn new(enabled: bool) -> DepGraph {
+ DepGraph {
+ data: Rc::new(DepGraphThreadData::new(enabled))
+ }
+ }
+
+ /// True if we are actually building a dep-graph. If this returns false,
+ /// then the other methods on this `DepGraph` will have no net effect.
+ #[inline]
+ pub fn enabled(&self) -> bool {
+ self.data.enabled()
+ }
+
+ pub fn query(&self) -> DepGraphQuery<DefId> {
+ self.data.query()
+ }
+
+ pub fn in_ignore<'graph>(&'graph self) -> raii::IgnoreTask<'graph> {
+ raii::IgnoreTask::new(&self.data)
+ }
+
+ pub fn in_task<'graph>(&'graph self, key: DepNode<DefId>) -> raii::DepTask<'graph> {
+ raii::DepTask::new(&self.data, key)
+ }
+
+ pub fn with_ignore<OP,R>(&self, op: OP) -> R
+ where OP: FnOnce() -> R
+ {
+ let _task = self.in_ignore();
+ op()
+ }
+
+ pub fn with_task<OP,R>(&self, key: DepNode<DefId>, op: OP) -> R
+ where OP: FnOnce() -> R
+ {
+ let _task = self.in_task(key);
+ op()
+ }
+
+ pub fn read(&self, v: DepNode<DefId>) {
+ self.data.enqueue(DepMessage::Read(v));
+ }
+
+ pub fn write(&self, v: DepNode<DefId>) {
+ self.data.enqueue(DepMessage::Write(v));
+ }
+}
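The new `graph.rs` above exposes both RAII guards (`in_task`/`in_ignore`) and closure wrappers (`with_task`/`with_ignore`) that bracket a piece of work with push/pop messages. A minimal sketch of that pattern, using hypothetical simplified types (a string key and a log in place of `DepNode` and the worker thread):

```rust
// Sketch of the with_task pattern: an RAII guard pushes a task key on
// creation and pops it on drop, so `with_task` brackets the closure.
// All types here are simplified stand-ins, not the rustc API.
use std::cell::RefCell;

struct DepGraph {
    log: RefCell<Vec<String>>,
}

struct DepTask<'g> {
    graph: &'g DepGraph,
}

impl<'g> Drop for DepTask<'g> {
    fn drop(&mut self) {
        // mirrors DepMessage::PopTask being enqueued on drop
        self.graph.log.borrow_mut().push("pop".to_string());
    }
}

impl DepGraph {
    fn new() -> DepGraph {
        DepGraph { log: RefCell::new(Vec::new()) }
    }

    fn in_task(&self, key: &str) -> DepTask {
        // mirrors DepMessage::PushTask
        self.log.borrow_mut().push(format!("push {}", key));
        DepTask { graph: self }
    }

    fn with_task<OP, R>(&self, key: &str, op: OP) -> R
        where OP: FnOnce() -> R
    {
        let _task = self.in_task(key);
        op()
    }
}

fn main() {
    let graph = DepGraph::new();
    let result = graph.with_task("Hir(foo)", || 42);
    assert_eq!(result, 42);
    // the task was pushed before the closure ran and popped after
    assert_eq!(*graph.log.borrow(),
               vec!["push Hir(foo)".to_string(), "pop".to_string()]);
}
```

The RAII form matters because the pop must happen even on early return: dropping `_task` restores the previous task unconditionally.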
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-use self::thread::{DepGraphThreadData, DepMessage};
-use hir::def_id::DefId;
-use syntax::ast::NodeId;
-use ty::TyCtxt;
-use hir;
-use hir::intravisit::Visitor;
-use std::rc::Rc;
-
+mod dep_node;
mod dep_tracking_map;
mod edges;
+mod graph;
mod query;
mod raii;
mod thread;
-
-#[derive(Copy, Clone, Debug, PartialEq, Eq, Hash)]
-pub enum DepNode {
- // Represents the `Krate` as a whole (the `hir::Krate` value) (as
- // distinct from the krate module). This is basically a hash of
- // the entire krate, so if you read from `Krate` (e.g., by calling
- // `tcx.map.krate()`), we will have to assume that any change
- // means that you need to be recompiled. This is because the
- // `Krate` value gives you access to all other items. To avoid
- // this fate, do not call `tcx.map.krate()`; instead, prefer
- // wrappers like `tcx.visit_all_items_in_krate()`. If there is no
- // suitable wrapper, you can use `tcx.dep_graph.ignore()` to gain
- // access to the krate, but you must remember to add suitable
- // edges yourself for the individual items that you read.
- Krate,
-
- // Represents the HIR node with the given node-id
- Hir(DefId),
-
- // Represents different phases in the compiler.
- CrateReader,
- CollectLanguageItems,
- CheckStaticRecursion,
- ResolveLifetimes,
- RegionResolveCrate,
- CheckLoops,
- PluginRegistrar,
- StabilityIndex,
- CollectItem(DefId),
- Coherence,
- EffectCheck,
- Liveness,
- Resolve,
- EntryPoint,
- CheckEntryFn,
- CoherenceCheckImpl(DefId),
- CoherenceOverlapCheck(DefId),
- CoherenceOverlapCheckSpecial(DefId),
- CoherenceOverlapInherentCheck(DefId),
- CoherenceOrphanCheck(DefId),
- Variance,
- WfCheck(DefId),
- TypeckItemType(DefId),
- TypeckItemBody(DefId),
- Dropck,
- DropckImpl(DefId),
- CheckConst(DefId),
- Privacy,
- IntrinsicCheck(DefId),
- MatchCheck(DefId),
- MirMapConstruction(DefId),
- MirTypeck(NodeId),
- BorrowCheck(DefId),
- RvalueCheck(DefId),
- Reachability,
- DeadCheck,
- StabilityCheck,
- LateLintCheck,
- IntrinsicUseCheck,
- TransCrate,
- TransCrateItem(DefId),
- TransInlinedItem(DefId),
- TransWriteMetadata,
-
- // Nodes representing bits of computed IR in the tcx. Each shared
- // table in the tcx (or elsewhere) maps to one of these
- // nodes. Often we map multiple tables to the same node if there
- // is no point in distinguishing them (e.g., both the type and
- // predicates for an item wind up in `ItemSignature`). Other
- // times, such as `ImplItems` vs `TraitItemDefIds`, tables which
- // might be mergeable are kept distinct because the sets of def-ids
- // to which they apply are disjoint, and hence we might as well
- // have distinct labels for easier debugging.
- ImplOrTraitItems(DefId),
- ItemSignature(DefId),
- FieldTy(DefId),
- TraitItemDefIds(DefId),
- InherentImpls(DefId),
- ImplItems(DefId),
-
- // The set of impls for a given trait. Ultimately, it would be
- // nice to get more fine-grained here (e.g., to include a
- // simplified type), but we can't do that until we restructure the
- // HIR to distinguish the *header* of an impl from its body. This
- // is because changes to the header may change the self-type of
- // the impl and hence would require us to be more conservative
- // than changes in the impl body.
- TraitImpls(DefId),
-
- // Nodes representing caches. To properly handle a true cache, we
- // don't use a DepTrackingMap, but rather we push a task node.
- // Otherwise the write into the map would be incorrectly
- // attributed to the first task that happened to fill the cache,
- // which would yield an overly conservative dep-graph.
- TraitItems(DefId),
- ReprHints(DefId),
- TraitSelect(DefId),
-}
-
-#[derive(Clone)]
-pub struct DepGraph {
- data: Rc<DepGraphThreadData>
-}
-
-impl DepGraph {
- pub fn new(enabled: bool) -> DepGraph {
- DepGraph {
- data: Rc::new(DepGraphThreadData::new(enabled))
- }
- }
-
- /// True if we are actually building a dep-graph. If this returns false,
- /// then the other methods on this `DepGraph` will have no net effect.
- #[inline]
- pub fn enabled(&self) -> bool {
- self.data.enabled()
- }
-
- pub fn query(&self) -> DepGraphQuery {
- self.data.query()
- }
-
- pub fn in_ignore<'graph>(&'graph self) -> raii::IgnoreTask<'graph> {
- raii::IgnoreTask::new(&self.data)
- }
-
- pub fn in_task<'graph>(&'graph self, key: DepNode) -> raii::DepTask<'graph> {
- raii::DepTask::new(&self.data, key)
- }
-
- pub fn with_ignore<OP,R>(&self, op: OP) -> R
- where OP: FnOnce() -> R
- {
- let _task = self.in_ignore();
- op()
- }
-
- pub fn with_task<OP,R>(&self, key: DepNode, op: OP) -> R
- where OP: FnOnce() -> R
- {
- let _task = self.in_task(key);
- op()
- }
-
- pub fn read(&self, v: DepNode) {
- self.data.enqueue(DepMessage::Read(v));
- }
-
- pub fn write(&self, v: DepNode) {
- self.data.enqueue(DepMessage::Write(v));
- }
-}
+mod visit;
pub use self::dep_tracking_map::{DepTrackingMap, DepTrackingMapConfig};
-
+pub use self::dep_node::DepNode;
+pub use self::graph::DepGraph;
pub use self::query::DepGraphQuery;
-
-/// Visit all the items in the krate in some order. When visiting a
-/// particular item, first create a dep-node by calling `dep_node_fn`
-/// and push that onto the dep-graph stack of tasks, and also create a
-/// read edge from the corresponding AST node. This is used in
-/// compiler passes to automatically record the item that they are
-/// working on.
-pub fn visit_all_items_in_krate<'tcx,V,F>(tcx: &TyCtxt<'tcx>,
- mut dep_node_fn: F,
- visitor: &mut V)
- where F: FnMut(DefId) -> DepNode, V: Visitor<'tcx>
-{
- struct TrackingVisitor<'visit, 'tcx: 'visit, F: 'visit, V: 'visit> {
- tcx: &'visit TyCtxt<'tcx>,
- dep_node_fn: &'visit mut F,
- visitor: &'visit mut V
- }
-
- impl<'visit, 'tcx, F, V> Visitor<'tcx> for TrackingVisitor<'visit, 'tcx, F, V>
- where F: FnMut(DefId) -> DepNode, V: Visitor<'tcx>
- {
- fn visit_item(&mut self, i: &'tcx hir::Item) {
- let item_def_id = self.tcx.map.local_def_id(i.id);
- let task_id = (self.dep_node_fn)(item_def_id);
- let _task = self.tcx.dep_graph.in_task(task_id);
- debug!("Started task {:?}", task_id);
- self.tcx.dep_graph.read(DepNode::Hir(item_def_id));
- self.visitor.visit_item(i)
- }
- }
-
- let krate = tcx.dep_graph.with_ignore(|| tcx.map.krate());
- let mut tracking_visitor = TrackingVisitor {
- tcx: tcx,
- dep_node_fn: &mut dep_node_fn,
- visitor: visitor
- };
- krate.visit_all_items(&mut tracking_visitor)
-}
+pub use self::visit::visit_all_items_in_krate;
use rustc_data_structures::fnv::FnvHashMap;
use rustc_data_structures::graph::{Graph, NodeIndex};
+use std::fmt::Debug;
+use std::hash::Hash;
use super::DepNode;
-pub struct DepGraphQuery {
- pub graph: Graph<DepNode, ()>,
- pub indices: FnvHashMap<DepNode, NodeIndex>,
+pub struct DepGraphQuery<D: Clone + Debug + Hash + Eq> {
+ pub graph: Graph<DepNode<D>, ()>,
+ pub indices: FnvHashMap<DepNode<D>, NodeIndex>,
}
-impl DepGraphQuery {
- pub fn new(nodes: &[DepNode], edges: &[(DepNode, DepNode)]) -> DepGraphQuery {
+impl<D: Clone + Debug + Hash + Eq> DepGraphQuery<D> {
+ pub fn new(nodes: &[DepNode<D>],
+ edges: &[(DepNode<D>, DepNode<D>)])
+ -> DepGraphQuery<D> {
let mut graph = Graph::new();
let mut indices = FnvHashMap();
for node in nodes {
}
}
- pub fn nodes(&self) -> Vec<DepNode> {
+ pub fn contains_node(&self, node: &DepNode<D>) -> bool {
+ self.indices.contains_key(&node)
+ }
+
+ pub fn nodes(&self) -> Vec<DepNode<D>> {
self.graph.all_nodes()
.iter()
.map(|n| n.data.clone())
.collect()
}
- pub fn edges(&self) -> Vec<(DepNode,DepNode)> {
+ pub fn edges(&self) -> Vec<(DepNode<D>,DepNode<D>)> {
self.graph.all_edges()
.iter()
.map(|edge| (edge.source(), edge.target()))
- .map(|(s, t)| (self.graph.node_data(s).clone(), self.graph.node_data(t).clone()))
+ .map(|(s, t)| (self.graph.node_data(s).clone(),
+ self.graph.node_data(t).clone()))
.collect()
}
/// All nodes reachable from `node`. In other words, things that
/// will have to be recomputed if `node` changes.
- pub fn dependents(&self, node: DepNode) -> Vec<DepNode> {
+ pub fn transitive_dependents(&self, node: DepNode<D>) -> Vec<DepNode<D>> {
if let Some(&index) = self.indices.get(&node) {
self.graph.depth_traverse(index)
- .map(|dependent_node| self.graph.node_data(dependent_node).clone())
+ .map(|s| self.graph.node_data(s).clone())
+ .collect()
+ } else {
+ vec![]
+ }
+ }
+
+ /// Just the outgoing edges from `node`.
+ pub fn immediate_dependents(&self, node: DepNode<D>) -> Vec<DepNode<D>> {
+ if let Some(&index) = self.indices.get(&node) {
+ self.graph.successor_nodes(index)
+ .map(|s| self.graph.node_data(s).clone())
.collect()
} else {
vec![]
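The rename from `dependents` to `transitive_dependents`, paired with the new `immediate_dependents`, makes the reachability semantics explicit: one is a full depth-first traversal, the other just the direct successor edges. A toy illustration on a hand-rolled edge list (illustrative helpers, not the `rustc_data_structures::graph` API):

```rust
// Toy sketch of transitive vs immediate dependents on an edge list
// A -> B -> C; the graph/query types are simplified stand-ins.
use std::collections::HashSet;

fn immediate(edges: &[(&str, &str)], node: &str) -> Vec<String> {
    edges.iter()
         .filter(|&&(s, _)| s == node)
         .map(|&(_, t)| t.to_string())
         .collect()
}

fn transitive(edges: &[(&str, &str)], node: &str) -> Vec<String> {
    // depth-first traversal from `node`, including the start node,
    // mirroring what `graph.depth_traverse(index)` yields
    let mut seen = HashSet::new();
    let mut stack = vec![node.to_string()];
    let mut out = Vec::new();
    while let Some(n) = stack.pop() {
        if !seen.insert(n.clone()) {
            continue;
        }
        for t in immediate(edges, &n) {
            stack.push(t);
        }
        out.push(n);
    }
    out
}

fn main() {
    let edges = [("A", "B"), ("B", "C")];
    // only the direct edge is an immediate dependent
    assert_eq!(immediate(&edges, "A"), vec!["B".to_string()]);
    // but C must also be recomputed if A changes
    assert!(transitive(&edges, "A").contains(&"C".to_string()));
}
```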
// option. This file may not be copied, modified, or distributed
// except according to those terms.
+use hir::def_id::DefId;
use super::DepNode;
use super::thread::{DepGraphThreadData, DepMessage};
pub struct DepTask<'graph> {
data: &'graph DepGraphThreadData,
- key: DepNode,
+ key: DepNode<DefId>,
}
impl<'graph> DepTask<'graph> {
- pub fn new(data: &'graph DepGraphThreadData, key: DepNode) -> DepTask<'graph> {
+ pub fn new(data: &'graph DepGraphThreadData, key: DepNode<DefId>)
+ -> DepTask<'graph> {
data.enqueue(DepMessage::PushTask(key));
DepTask { data: data, key: key }
}
//! to accumulate more messages. This way we only ever have two vectors
//! allocated (and both have a fairly large capacity).
+use hir::def_id::DefId;
use rustc_data_structures::veccell::VecCell;
use std::cell::Cell;
use std::sync::mpsc::{self, Sender, Receiver};
use super::edges::DepGraphEdges;
pub enum DepMessage {
- Read(DepNode),
- Write(DepNode),
- PushTask(DepNode),
- PopTask(DepNode),
+ Read(DepNode<DefId>),
+ Write(DepNode<DefId>),
+ PushTask(DepNode<DefId>),
+ PopTask(DepNode<DefId>),
PushIgnore,
PopIgnore,
Query,
swap_out: Sender<Vec<DepMessage>>,
// where to receive query results
- query_in: Receiver<DepGraphQuery>,
+ query_in: Receiver<DepGraphQuery<DefId>>,
}
const INITIAL_CAPACITY: usize = 2048;
self.swap_out.send(old_messages).unwrap();
}
- pub fn query(&self) -> DepGraphQuery {
+ pub fn query(&self) -> DepGraphQuery<DefId> {
assert!(self.enabled, "cannot query if dep graph construction not enabled");
self.enqueue(DepMessage::Query);
self.swap();
/// Definition of the depgraph thread.
pub fn main(swap_in: Receiver<Vec<DepMessage>>,
swap_out: Sender<Vec<DepMessage>>,
- query_out: Sender<DepGraphQuery>) {
+ query_out: Sender<DepGraphQuery<DefId>>) {
let mut edges = DepGraphEdges::new();
// the compiler thread always expects a fresh buffer to be
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+use hir;
+use hir::def_id::DefId;
+use hir::intravisit::Visitor;
+use ty::TyCtxt;
+
+use super::dep_node::DepNode;
+
+
+/// Visit all the items in the krate in some order. When visiting a
+/// particular item, first create a dep-node by calling `dep_node_fn`
+/// and push that onto the dep-graph stack of tasks, and also create a
+/// read edge from the corresponding AST node. This is used in
+/// compiler passes to automatically record the item that they are
+/// working on.
+pub fn visit_all_items_in_krate<'tcx,V,F>(tcx: &TyCtxt<'tcx>,
+ mut dep_node_fn: F,
+ visitor: &mut V)
+ where F: FnMut(DefId) -> DepNode<DefId>, V: Visitor<'tcx>
+{
+ struct TrackingVisitor<'visit, 'tcx: 'visit, F: 'visit, V: 'visit> {
+ tcx: &'visit TyCtxt<'tcx>,
+ dep_node_fn: &'visit mut F,
+ visitor: &'visit mut V
+ }
+
+ impl<'visit, 'tcx, F, V> Visitor<'tcx> for TrackingVisitor<'visit, 'tcx, F, V>
+ where F: FnMut(DefId) -> DepNode<DefId>, V: Visitor<'tcx>
+ {
+ fn visit_item(&mut self, i: &'tcx hir::Item) {
+ let item_def_id = self.tcx.map.local_def_id(i.id);
+ let task_id = (self.dep_node_fn)(item_def_id);
+ let _task = self.tcx.dep_graph.in_task(task_id);
+ debug!("Started task {:?}", task_id);
+ self.tcx.dep_graph.read(DepNode::Hir(item_def_id));
+ self.visitor.visit_item(i)
+ }
+ }
+
+ let krate = tcx.dep_graph.with_ignore(|| tcx.map.krate());
+ let mut tracking_visitor = TrackingVisitor {
+ tcx: tcx,
+ dep_node_fn: &mut dep_node_fn,
+ visitor: visitor
+ };
+ krate.visit_all_items(&mut tracking_visitor)
+}
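The `TrackingVisitor` above is a wrapper-visitor: it intercepts each item, opens a task and records a read, then delegates to the inner visitor. The shape of that pattern, with hypothetical simplified traits standing in for the HIR `Visitor`:

```rust
// Sketch of the wrapper-visitor pattern: the tracking wrapper logs a
// task+read for each item before delegating to the wrapped visitor.
// `Visitor` here is a one-method stand-in for hir::intravisit::Visitor.
trait Visitor {
    fn visit_item(&mut self, item: &str);
}

struct Collector {
    seen: Vec<String>,
}

impl Visitor for Collector {
    fn visit_item(&mut self, item: &str) {
        self.seen.push(item.to_string());
    }
}

struct Tracking<'a, V: Visitor + 'a> {
    log: &'a mut Vec<String>,
    inner: &'a mut V,
}

impl<'a, V: Visitor> Visitor for Tracking<'a, V> {
    fn visit_item(&mut self, item: &str) {
        // stands in for in_task(dep_node_fn(def_id)) + read(Hir(def_id))
        self.log.push(format!("task+read for {}", item));
        self.inner.visit_item(item);
    }
}

fn main() {
    let mut log = Vec::new();
    let mut collector = Collector { seen: Vec::new() };
    {
        let mut tracking = Tracking { log: &mut log, inner: &mut collector };
        for item in &["foo", "bar"] {
            tracking.visit_item(item);
        }
    }
    assert_eq!(collector.seen, vec!["foo".to_string(), "bar".to_string()]);
    assert_eq!(log.len(), 2);
}
```

Passes that use `visit_all_items_in_krate` get correct dep-graph edges without each visitor having to remember the task/read bookkeeping itself.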
}
}
+ pub fn retrace_path(&self, path: &DefPath) -> Option<DefIndex> {
+ debug!("retrace_path(path={:?})", path);
+
+ // we assume that we only want to retrace paths relative to
+ // the crate root
+ assert!(path.is_local());
+
+ let root_key = DefKey {
+ parent: None,
+ disambiguated_data: DisambiguatedDefPathData {
+ data: DefPathData::CrateRoot,
+ disambiguator: 0,
+ },
+ };
+ let root_id = self.key_map[&root_key];
+
+ debug!("retrace_path: root_id={:?}", root_id);
+
+ let mut id = root_id;
+ for data in &path.data {
+ let key = DefKey { parent: Some(id), disambiguated_data: data.clone() };
+ debug!("key = {:?}", key);
+ id = match self.key_map.get(&key) {
+ Some(&id) => id,
+ None => return None
+ };
+ }
+
+ Some(id)
+ }
+
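`retrace_path` walks the `key_map` one segment at a time: starting from the crate-root id, each path segment plus the current id forms the next key, and any missing segment aborts the whole lookup. A sketch of that walk, with `(parent, name)` tuples as hypothetical stand-ins for `DefKey`:

```rust
// Sketch of the retrace_path lookup over a simplified key map.
// Keys are (parent_id, segment_name); `None` parent marks the root.
use std::collections::HashMap;

fn retrace(key_map: &HashMap<(Option<u32>, &str), u32>,
           path: &[&str]) -> Option<u32> {
    // hypothetical root key, standing in for DefPathData::CrateRoot
    let root = *key_map.get(&(None, "root"))?;
    let mut id = root;
    for &seg in path {
        // one hash lookup per path segment; bail out on any miss
        id = *key_map.get(&(Some(id), seg))?;
    }
    Some(id)
}

fn main() {
    let mut m = HashMap::new();
    m.insert((None, "root"), 0u32);
    m.insert((Some(0u32), "foo"), 1);
    m.insert((Some(1), "bar"), 2);
    assert_eq!(retrace(&m, &["foo", "bar"]), Some(2));
    // a stale or unknown segment yields None rather than panicking
    assert_eq!(retrace(&m, &["foo", "baz"]), None);
}
```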
pub fn create_def_with_parent(&mut self,
parent: Option<DefIndex>,
node_id: ast::NodeId,
data: DefPathData)
-> DefIndex {
+ debug!("create_def_with_parent(parent={:?}, node_id={:?}, data={:?})",
+ parent, node_id, data);
+
assert!(!self.node_map.contains_key(&node_id),
"adding a def'n for node-id {:?} and data {:?} but a previous def'n exists: {:?}",
node_id,
data,
self.data[self.node_map[&node_id].as_usize()]);
+ assert!(parent.is_some() ^ match data {
+ DefPathData::CrateRoot | DefPathData::InlinedRoot(_) => true,
+ _ => false,
+ });
+
// Find a unique DefKey. This basically means incrementing the disambiguator
// until we get no match.
let mut key = DefKey {
key.disambiguated_data.disambiguator += 1;
}
+ debug!("create_def_with_parent: after disambiguation, key = {:?}", key);
+
// Create the definition.
let index = DefIndex::new(self.data.len());
self.data.push(DefData { key: key.clone(), node_id: node_id });
+ debug!("create_def_with_parent: node_map[{:?}] = {:?}", node_id, index);
self.node_map.insert(node_id, index);
+ debug!("create_def_with_parent: key_map[{:?}] = {:?}", key, index);
self.key_map.insert(key, index);
+
index
}
}
self.dep_graph.read(self.dep_node(id));
}
- fn dep_node(&self, id0: NodeId) -> DepNode {
+ fn dep_node(&self, id0: NodeId) -> DepNode<DefId> {
let map = self.map.borrow();
let mut id = id0;
loop {
self.definitions.borrow().def_path(def_id.index)
}
+ pub fn retrace_path(&self, path: &DefPath) -> Option<DefId> {
+ self.definitions.borrow().retrace_path(path)
+ .map(DefId::local)
+ }
+
pub fn local_def_id(&self, node: NodeId) -> DefId {
self.opt_local_def_id(node).unwrap_or_else(|| {
bug!("local_def_id: no entry for `{}`, which has a map of `{:?}`",
let mut out_idx = 0;
self.commasep(Inconsistent, &a.outputs, |s, out| {
- match out.constraint.slice_shift_char() {
- Some(('=', operand)) if out.is_rw => {
- s.print_string(&format!("+{}", operand), ast::StrStyle::Cooked)?
+ let mut ch = out.constraint.chars();
+ match ch.next() {
+ Some('=') if out.is_rw => {
+ s.print_string(&format!("+{}", ch.as_str()),
+ ast::StrStyle::Cooked)?
}
- _ => s.print_string(&out.constraint, ast::StrStyle::Cooked)?,
+ _ => s.print_string(&out.constraint,
+ ast::StrStyle::Cooked)?,
}
s.popen()?;
s.print_expr(&outputs[out_idx])?;
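The hunk above replaces the unstable `slice_shift_char` (split a `&str` into its first char and the rest) with a stable idiom: advance a `Chars` iterator one step and take the remainder via `as_str`. Extracted as a helper for clarity:

```rust
// Stable replacement for the removed str::slice_shift_char: drive the
// Chars iterator one step, then `as_str` yields the unconsumed tail.
fn shift_char(s: &str) -> Option<(char, &str)> {
    let mut ch = s.chars();
    ch.next().map(|c| (c, ch.as_str()))
}

fn main() {
    // e.g. splitting the '=' off an asm output constraint like "=r"
    assert_eq!(shift_char("=r"), Some(('=', "r")));
    assert_eq!(shift_char(""), None);
}
```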
//! Original issue: https://github.com/rust-lang/rust/issues/10207
use std::fmt;
-use std::hash::{Hash, SipHasher, Hasher};
-use hir;
-use hir::intravisit as visit;
#[derive(Clone, PartialEq, Debug)]
pub struct Svh {
}
impl Svh {
- pub fn new(hash: &str) -> Svh {
+ /// Create a new `Svh` given the hash. If you actually want to
+ /// compute the SVH from some HIR, you want the `calculate_svh`
+ /// function found in `librustc_trans`.
+ pub fn new(hash: String) -> Svh {
assert!(hash.len() == 16);
- Svh { hash: hash.to_string() }
+ Svh { hash: hash }
}
- pub fn as_str<'a>(&'a self) -> &'a str {
- &self.hash
- }
-
- pub fn calculate(crate_disambiguator: &str, krate: &hir::Crate) -> Svh {
- // FIXME (#14132): This is better than it used to be, but it still not
- // ideal. We now attempt to hash only the relevant portions of the
- // Crate AST as well as the top-level crate attributes. (However,
- // the hashing of the crate attributes should be double-checked
- // to ensure it is not incorporating implementation artifacts into
- // the hash that are not otherwise visible.)
-
- // FIXME: this should use SHA1, not SipHash. SipHash is not built to
- // avoid collisions.
- let mut state = SipHasher::new();
-
- "crate_disambiguator".hash(&mut state);
- crate_disambiguator.len().hash(&mut state);
- crate_disambiguator.hash(&mut state);
-
- {
- let mut visit = svh_visitor::make(&mut state, krate);
- visit::walk_crate(&mut visit, krate);
- }
-
- // FIXME (#14132): This hash is still sensitive to e.g. the
- // spans of the crate Attributes and their underlying
- // MetaItems; we should make ContentHashable impl for those
- // types and then use hash_content. But, since all crate
- // attributes should appear near beginning of the file, it is
- // not such a big deal to be sensitive to their spans for now.
- //
- // We hash only the MetaItems instead of the entire Attribute
- // to avoid hashing the AttrId
- for attr in &krate.attrs {
- attr.node.value.hash(&mut state);
- }
-
- let hash = state.finish();
- return Svh {
- hash: (0..64).step_by(4).map(|i| hex(hash >> i)).collect()
- };
+ pub fn from_hash(hash: u64) -> Svh {
+ return Svh::new((0..64).step_by(4).map(|i| hex(hash >> i)).collect());
fn hex(b: u64) -> char {
let b = (b & 0xf) as u8;
b as char
}
}
+
+ pub fn as_str<'a>(&'a self) -> &'a str {
+ &self.hash
+ }
}
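The new `Svh::from_hash` renders the `u64` hash as 16 hex digits, least-significant nibble first (note `hash >> i` for `i = 0, 4, ..., 60`). A self-contained sketch of that encoding, reconstructing the elided body of the `hex` helper as an assumption:

```rust
// Sketch of the from_hash nibble encoding. The body of `hex` is an
// assumed reconstruction (standard nibble-to-hex-digit mapping); the
// iteration order matches the diff: least-significant nibble first.
fn hex(b: u64) -> char {
    let b = (b & 0xf) as u8;
    let b = match b {
        0..=9 => b'0' + b,
        _ => b'a' + b - 10,
    };
    b as char
}

fn svh_string(hash: u64) -> String {
    (0..64).step_by(4).map(|i| hex(hash >> i)).collect()
}

fn main() {
    assert_eq!(svh_string(0), "0000000000000000");
    // least-significant nibble comes first
    assert_eq!(svh_string(0x1), "1000000000000000");
    assert_eq!(svh_string(0xab), "ba00000000000000");
}
```

Splitting `from_hash` out of `calculate` is what lets the actual SVH computation move to `librustc_trans`, as the new doc comment on `Svh::new` directs.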
impl fmt::Display for Svh {
f.pad(self.as_str())
}
}
-
-// FIXME (#14132): Even this SVH computation still has implementation
-// artifacts: namely, the order of item declaration will affect the
-// hash computation, but for many kinds of items the order of
-// declaration should be irrelevant to the ABI.
-
-mod svh_visitor {
- pub use self::SawExprComponent::*;
- pub use self::SawStmtComponent::*;
- use self::SawAbiComponent::*;
- use syntax::ast::{self, Name, NodeId};
- use syntax::codemap::Span;
- use syntax::parse::token;
- use hir::intravisit as visit;
- use hir::intravisit::{Visitor, FnKind};
- use hir::*;
- use hir;
-
- use std::hash::{Hash, SipHasher};
-
- pub struct StrictVersionHashVisitor<'a> {
- pub krate: &'a Crate,
- pub st: &'a mut SipHasher,
- }
-
- pub fn make<'a>(st: &'a mut SipHasher, krate: &'a Crate) -> StrictVersionHashVisitor<'a> {
- StrictVersionHashVisitor { st: st, krate: krate }
- }
-
- // To off-load the bulk of the hash-computation on #[derive(Hash)],
- // we define a set of enums corresponding to the content that our
- // crate visitor will encounter as it traverses the ast.
- //
- // The important invariant is that all of the Saw*Component enums
- // do not carry any Spans, Names, or Idents.
- //
- // Not carrying any Names/Idents is the important fix for problem
- // noted on PR #13948: using the ident.name as the basis for a
- // hash leads to unstable SVH, because ident.name is just an index
- // into intern table (i.e. essentially a random address), not
- // computed from the name content.
- //
- // With the below enums, the SVH computation is not sensitive to
- // artifacts of how rustc was invoked nor of how the source code
- // was laid out. (Or at least it is *less* sensitive.)
-
- // This enum represents the different potential bits of code the
- // visitor could encounter that could affect the ABI for the crate,
- // and assigns each a distinct tag to feed into the hash computation.
- #[derive(Hash)]
- enum SawAbiComponent<'a> {
-
- // FIXME (#14132): should we include (some function of)
- // ident.ctxt as well?
- SawIdent(token::InternedString),
- SawStructDef(token::InternedString),
-
- SawLifetime(token::InternedString),
- SawLifetimeDef(token::InternedString),
-
- SawMod,
- SawForeignItem,
- SawItem,
- SawDecl,
- SawTy,
- SawGenerics,
- SawFn,
- SawTraitItem,
- SawImplItem,
- SawStructField,
- SawVariant,
- SawExplicitSelf,
- SawPath,
- SawBlock,
- SawPat,
- SawLocal,
- SawArm,
- SawExpr(SawExprComponent<'a>),
- SawStmt(SawStmtComponent),
- }
-
- /// SawExprComponent carries all of the information that we want
- /// to include in the hash that *won't* be covered by the
- /// subsequent recursive traversal of the expression's
- /// substructure by the visitor.
- ///
- /// We know every Expr_ variant is covered by a variant because
- /// `fn saw_expr` maps each to some case below. Ensuring that
- /// each variant carries an appropriate payload has to be verified
- /// by hand.
- ///
- /// (However, getting that *exactly* right is not so important
- /// because the SVH is just a developer convenience; there is no
- /// guarantee of collision-freedom, hash collisions are just
- /// (hopefully) unlikely.)
- #[derive(Hash)]
- pub enum SawExprComponent<'a> {
-
- SawExprLoop(Option<token::InternedString>),
- SawExprField(token::InternedString),
- SawExprTupField(usize),
- SawExprBreak(Option<token::InternedString>),
- SawExprAgain(Option<token::InternedString>),
-
- SawExprBox,
- SawExprVec,
- SawExprCall,
- SawExprMethodCall,
- SawExprTup,
- SawExprBinary(hir::BinOp_),
- SawExprUnary(hir::UnOp),
- SawExprLit(ast::LitKind),
- SawExprCast,
- SawExprType,
- SawExprIf,
- SawExprWhile,
- SawExprMatch,
- SawExprClosure,
- SawExprBlock,
- SawExprAssign,
- SawExprAssignOp(hir::BinOp_),
- SawExprIndex,
- SawExprPath(Option<usize>),
- SawExprAddrOf(hir::Mutability),
- SawExprRet,
- SawExprInlineAsm(&'a hir::InlineAsm),
- SawExprStruct,
- SawExprRepeat,
- }
-
- fn saw_expr<'a>(node: &'a Expr_) -> SawExprComponent<'a> {
- match *node {
- ExprBox(..) => SawExprBox,
- ExprVec(..) => SawExprVec,
- ExprCall(..) => SawExprCall,
- ExprMethodCall(..) => SawExprMethodCall,
- ExprTup(..) => SawExprTup,
- ExprBinary(op, _, _) => SawExprBinary(op.node),
- ExprUnary(op, _) => SawExprUnary(op),
- ExprLit(ref lit) => SawExprLit(lit.node.clone()),
- ExprCast(..) => SawExprCast,
- ExprType(..) => SawExprType,
- ExprIf(..) => SawExprIf,
- ExprWhile(..) => SawExprWhile,
- ExprLoop(_, id) => SawExprLoop(id.map(|id| id.name.as_str())),
- ExprMatch(..) => SawExprMatch,
- ExprClosure(..) => SawExprClosure,
- ExprBlock(..) => SawExprBlock,
- ExprAssign(..) => SawExprAssign,
- ExprAssignOp(op, _, _) => SawExprAssignOp(op.node),
- ExprField(_, name) => SawExprField(name.node.as_str()),
- ExprTupField(_, id) => SawExprTupField(id.node),
- ExprIndex(..) => SawExprIndex,
- ExprPath(ref qself, _) => SawExprPath(qself.as_ref().map(|q| q.position)),
- ExprAddrOf(m, _) => SawExprAddrOf(m),
- ExprBreak(id) => SawExprBreak(id.map(|id| id.node.name.as_str())),
- ExprAgain(id) => SawExprAgain(id.map(|id| id.node.name.as_str())),
- ExprRet(..) => SawExprRet,
- ExprInlineAsm(ref a,_,_) => SawExprInlineAsm(a),
- ExprStruct(..) => SawExprStruct,
- ExprRepeat(..) => SawExprRepeat,
- }
- }
-
- /// SawStmtComponent is analogous to SawExprComponent, but for statements.
- #[derive(Hash)]
- pub enum SawStmtComponent {
- SawStmtDecl,
- SawStmtExpr,
- SawStmtSemi,
- }
-
- fn saw_stmt(node: &Stmt_) -> SawStmtComponent {
- match *node {
- StmtDecl(..) => SawStmtDecl,
- StmtExpr(..) => SawStmtExpr,
- StmtSemi(..) => SawStmtSemi,
- }
- }
-
- impl<'a> Visitor<'a> for StrictVersionHashVisitor<'a> {
- fn visit_nested_item(&mut self, item: ItemId) {
- self.visit_item(self.krate.item(item.id))
- }
-
- fn visit_variant_data(&mut self, s: &'a VariantData, name: Name,
- g: &'a Generics, _: NodeId, _: Span) {
- SawStructDef(name.as_str()).hash(self.st);
- visit::walk_generics(self, g);
- visit::walk_struct_def(self, s)
- }
-
- fn visit_variant(&mut self, v: &'a Variant, g: &'a Generics, item_id: NodeId) {
- SawVariant.hash(self.st);
- // walk_variant does not call walk_generics, so do it here.
- visit::walk_generics(self, g);
- visit::walk_variant(self, v, g, item_id)
- }
-
- // All of the remaining methods just record (in the hash
- // SipHasher) that the visitor saw that particular variant
- // (with its payload), and continue walking as the default
- // visitor would.
- //
- // Some of the implementations have some notes as to how one
- // might try to make their SVH computation less discerning
- // (e.g. by incorporating reachability analysis). But
- // currently all of their implementations are uniform and
- // uninteresting.
- //
- // (If you edit a method such that it deviates from the
- // pattern, please move that method up above this comment.)
-
- fn visit_name(&mut self, _: Span, name: Name) {
- SawIdent(name.as_str()).hash(self.st);
- }
-
- fn visit_lifetime(&mut self, l: &'a Lifetime) {
- SawLifetime(l.name.as_str()).hash(self.st);
- }
-
- fn visit_lifetime_def(&mut self, l: &'a LifetimeDef) {
- SawLifetimeDef(l.lifetime.name.as_str()).hash(self.st);
- }
-
- // We do recursively walk the bodies of functions/methods
- // (rather than omitting their bodies from the hash) since
- // monomorphization and cross-crate inlining generally implies
- // that a change to a crate body will require downstream
- // crates to be recompiled.
- fn visit_expr(&mut self, ex: &'a Expr) {
- SawExpr(saw_expr(&ex.node)).hash(self.st); visit::walk_expr(self, ex)
- }
-
- fn visit_stmt(&mut self, s: &'a Stmt) {
- SawStmt(saw_stmt(&s.node)).hash(self.st); visit::walk_stmt(self, s)
- }
-
- fn visit_foreign_item(&mut self, i: &'a ForeignItem) {
- // FIXME (#14132) ideally we would incorporate privacy (or
- // perhaps reachability) somewhere here, so foreign items
- // that do not leak into downstream crates would not be
- // part of the ABI.
- SawForeignItem.hash(self.st); visit::walk_foreign_item(self, i)
- }
-
- fn visit_item(&mut self, i: &'a Item) {
- // FIXME (#14132) ideally would incorporate reachability
- // analysis somewhere here, so items that never leak into
- // downstream crates (e.g. via monomorphisation or
- // inlining) would not be part of the ABI.
- SawItem.hash(self.st); visit::walk_item(self, i)
- }
-
- fn visit_mod(&mut self, m: &'a Mod, _s: Span, _n: NodeId) {
- SawMod.hash(self.st); visit::walk_mod(self, m)
- }
-
- fn visit_decl(&mut self, d: &'a Decl) {
- SawDecl.hash(self.st); visit::walk_decl(self, d)
- }
-
- fn visit_ty(&mut self, t: &'a Ty) {
- SawTy.hash(self.st); visit::walk_ty(self, t)
- }
-
- fn visit_generics(&mut self, g: &'a Generics) {
- SawGenerics.hash(self.st); visit::walk_generics(self, g)
- }
-
- fn visit_fn(&mut self, fk: FnKind<'a>, fd: &'a FnDecl,
- b: &'a Block, s: Span, _: NodeId) {
- SawFn.hash(self.st); visit::walk_fn(self, fk, fd, b, s)
- }
-
- fn visit_trait_item(&mut self, ti: &'a TraitItem) {
- SawTraitItem.hash(self.st); visit::walk_trait_item(self, ti)
- }
-
- fn visit_impl_item(&mut self, ii: &'a ImplItem) {
- SawImplItem.hash(self.st); visit::walk_impl_item(self, ii)
- }
-
- fn visit_struct_field(&mut self, s: &'a StructField) {
- SawStructField.hash(self.st); visit::walk_struct_field(self, s)
- }
-
- fn visit_explicit_self(&mut self, es: &'a ExplicitSelf) {
- SawExplicitSelf.hash(self.st); visit::walk_explicit_self(self, es)
- }
-
- fn visit_path(&mut self, path: &'a Path, _: ast::NodeId) {
- SawPath.hash(self.st); visit::walk_path(self, path)
- }
-
- fn visit_path_list_item(&mut self, prefix: &'a Path, item: &'a PathListItem) {
- SawPath.hash(self.st); visit::walk_path_list_item(self, prefix, item)
- }
-
- fn visit_block(&mut self, b: &'a Block) {
- SawBlock.hash(self.st); visit::walk_block(self, b)
- }
-
- fn visit_pat(&mut self, p: &'a Pat) {
- SawPat.hash(self.st); visit::walk_pat(self, p)
- }
-
- fn visit_local(&mut self, l: &'a Local) {
- SawLocal.hash(self.st); visit::walk_local(self, l)
- }
-
- fn visit_arm(&mut self, a: &'a Arm) {
- SawArm.hash(self.st); visit::walk_arm(self, a)
- }
- }
-}
#![feature(box_syntax)]
#![feature(collections)]
#![feature(const_fn)]
-#![feature(copy_from_slice)]
#![feature(enumset)]
#![feature(iter_arith)]
#![feature(libc)]
#![feature(slice_patterns)]
#![feature(staged_api)]
#![feature(step_by)]
-#![feature(str_char)]
#![feature(question_mark)]
#![cfg_attr(test, feature(test))]
ty::tls::with_opt(|opt_tcx| {
if let Some(tcx) = opt_tcx {
- let data = tcx.region_maps.code_extents.borrow()[self.0 as usize];
- write!(f, "/{:?}", data)?;
+ if let Some(data) = tcx.region_maps.code_extents.borrow().get(self.0 as usize) {
+ write!(f, "/{:?}", data)?;
+ }
}
Ok(())
})?;
self.visit_operand(arg);
}
if let Some((ref $($mutability)* destination, target)) = *destination {
- self.visit_lvalue(destination, LvalueContext::Store);
+ self.visit_lvalue(destination, LvalueContext::Call);
self.visit_branch(block, target);
}
cleanup.map(|t| self.visit_branch(block, t));
#[derive(Copy, Clone, Debug)]
pub enum LvalueContext {
- // Appears as LHS of an assignment or as dest of a call
+ // Appears as LHS of an assignment
Store,
+ // Dest of a call
+ Call,
+
// Being dropped
Drop,
pub continue_parse_after_error: bool,
pub mir_opt_level: usize,
- /// if true, build up the dep-graph
- pub build_dep_graph: bool,
-
- /// if true, -Z dump-dep-graph was passed to dump out the dep-graph
- pub dump_dep_graph: bool,
+ /// if Some, enable incremental compilation, using the given
+ /// directory to store intermediate results
+ pub incremental: Option<PathBuf>,
pub no_analysis: bool,
pub debugging_opts: DebuggingOptions,
treat_err_as_bug: false,
continue_parse_after_error: false,
mir_opt_level: 1,
- build_dep_graph: false,
- dump_dep_graph: false,
+ incremental: None,
no_analysis: false,
debugging_opts: basic_debugging_options(),
prints: Vec::new(),
}
}
+impl Options {
+ /// True if there is a reason to build the dep graph.
+ pub fn build_dep_graph(&self) -> bool {
+ self.incremental.is_some() ||
+ self.debugging_opts.dump_dep_graph ||
+ self.debugging_opts.query_dep_graph
+ }
+}
+
// The type of entry function, so
// users can have their own entry
// functions that don't start a
"treat all errors that occur as bugs"),
continue_parse_after_error: bool = (false, parse_bool,
"attempt to recover from parse errors (experimental)"),
- incr_comp: bool = (false, parse_bool,
+ incremental: Option<String> = (None, parse_opt_string,
"enable incremental compilation (experimental)"),
dump_dep_graph: bool = (false, parse_bool,
"dump the dependency graph to $RUST_DEP_GRAPH (default: /tmp/dep_graph.gv)"),
+ query_dep_graph: bool = (false, parse_bool,
+ "enable queries of the dependency graph for regression testing"),
no_analysis: bool = (false, parse_bool,
"parse and expand the source, but run no analysis"),
extra_plugins: Vec<String> = (Vec::new(), parse_list,
let treat_err_as_bug = debugging_opts.treat_err_as_bug;
let continue_parse_after_error = debugging_opts.continue_parse_after_error;
let mir_opt_level = debugging_opts.mir_opt_level.unwrap_or(1);
- let incremental_compilation = debugging_opts.incr_comp;
- let dump_dep_graph = debugging_opts.dump_dep_graph;
let no_analysis = debugging_opts.no_analysis;
let mut output_types = HashMap::new();
let crate_name = matches.opt_str("crate-name");
+ let incremental = debugging_opts.incremental.as_ref().map(|m| PathBuf::from(m));
+
Options {
crate_types: crate_types,
gc: gc,
treat_err_as_bug: treat_err_as_bug,
continue_parse_after_error: continue_parse_after_error,
mir_opt_level: mir_opt_level,
- build_dep_graph: incremental_compilation || dump_dep_graph,
- dump_dep_graph: dump_dep_graph,
+ incremental: incremental,
no_analysis: no_analysis,
debugging_opts: debugging_opts,
prints: prints,
return None
}
let first = msg.match_indices("expected").filter(|s| {
- s.0 > 0 && (msg.char_at_reverse(s.0) == ' ' ||
- msg.char_at_reverse(s.0) == '(')
+ let last = msg[..s.0].chars().rev().next();
+ last == Some(' ') || last == Some('(')
}).map(|(a, b)| (a - 1, a + b.len()));
let second = msg.match_indices("found").filter(|s| {
- msg.char_at_reverse(s.0) == ' '
+ msg[..s.0].chars().rev().next() == Some(' ')
}).map(|(a, b)| (a - 1, a + b.len()));
let mut new_msg = String::new();
// except according to those terms.
use dep_graph::DepNode;
+use hir::def_id::DefId;
use ty::{Ty, TyS};
use ty::tls;
}
#[inline]
- pub fn get(&self, dep_node: DepNode) -> Option<Ty<'tcx>> {
+ pub fn get(&self, dep_node: DepNode<DefId>) -> Option<Ty<'tcx>> {
tls::with(|tcx| tcx.dep_graph.read(dep_node));
self.untracked_get()
}
}
#[inline]
- pub fn unwrap(&self, dep_node: DepNode) -> Ty<'tcx> {
+ pub fn unwrap(&self, dep_node: DepNode<DefId>) -> Ty<'tcx> {
self.get(dep_node).unwrap()
}
- pub fn fulfill(&self, dep_node: DepNode, value: Ty<'lt>) {
+ pub fn fulfill(&self, dep_node: DepNode<DefId>, value: Ty<'lt>) {
tls::with(|tcx| tcx.dep_graph.write(dep_node));
// Invariant (A) is fulfilled, because by (B), every alias
impl<'tcx> DepTrackingMapConfig for $ty_name<'tcx> {
type Key = $key;
type Value = $value;
- fn to_dep_node(key: &$key) -> DepNode { DepNode::$node_name(*key) }
+ fn to_dep_node(key: &$key) -> DepNode<DefId> { DepNode::$node_name(*key) }
}
}
}
}
/// Creates the dep-node for selecting/evaluating this trait reference.
- fn dep_node(&self) -> DepNode {
+ fn dep_node(&self) -> DepNode<DefId> {
DepNode::TraitSelect(self.def_id())
}
self.0.def_id()
}
- pub fn dep_node(&self) -> DepNode {
+ pub fn dep_node(&self) -> DepNode<DefId> {
// ok to skip binder since depnode does not care about regions
self.0.dep_node()
}
pub fn visit_all_items_in_krate<V,F>(&self,
dep_node_fn: F,
visitor: &mut V)
- where F: FnMut(DefId) -> DepNode, V: Visitor<'tcx>
+ where F: FnMut(DefId) -> DepNode<DefId>, V: Visitor<'tcx>
{
dep_graph::visit_all_items_in_krate(self, dep_node_fn, visitor);
}
#![feature(box_syntax)]
#![feature(const_fn)]
-#![feature(copy_from_slice)]
#![feature(libc)]
#![feature(rand)]
#![feature(rustc_private)]
key!(target_family, optional);
key!(is_like_osx, bool);
key!(is_like_windows, bool);
+ key!(is_like_msvc, bool);
key!(linker_is_gnu, bool);
key!(has_rpath, bool);
key!(no_compiler_rt, bool);
rustc_plugin = { path = "../librustc_plugin" }
rustc_passes = { path = "../librustc_passes" }
rustc_privacy = { path = "../librustc_privacy" }
+rustc_incremental = { path = "../librustc_incremental" }
rustc_resolve = { path = "../librustc_resolve" }
rustc_save_analysis = { path = "../librustc_save_analysis" }
rustc_trans = { path = "../librustc_trans" }
use rustc::util::nodemap::NodeSet;
use rustc_back::sha2::{Sha256, Digest};
use rustc_borrowck as borrowck;
+use rustc_incremental;
use rustc_resolve as resolve;
use rustc_metadata::macro_import;
use rustc_metadata::creader::LocalCrateReader;
let expanded_crate = assign_node_ids(sess, expanded_crate);
// Lower ast -> hir.
let lcx = LoweringContext::new(sess, Some(&expanded_crate));
- let dep_graph = DepGraph::new(sess.opts.build_dep_graph);
+ let dep_graph = DepGraph::new(sess.opts.build_dep_graph());
let mut hir_forest = time(sess.time_passes(),
"lowering ast -> hir",
|| hir_map::Forest::new(lower_crate(&lcx, &expanded_crate),
index,
name,
|tcx| {
+ time(time_passes,
+ "load_dep_graph",
+ || rustc_incremental::load_dep_graph(tcx));
+
// passes are timed inside typeck
try_with_f!(typeck::check_crate(tcx, trait_map), (tcx, None, analysis));
passes.run_passes(tcx, &mut mir_map);
});
+ let translation =
+ time(time_passes,
+ "translation",
+ move || trans::trans_crate(tcx, &mir_map, analysis));
+
time(time_passes,
- "translation",
- move || trans::trans_crate(tcx, &mir_map, analysis))
+ "assert dep graph",
+ move || rustc_incremental::assert_dep_graph(tcx));
+
+ time(time_passes,
+ "serialize dep graph",
+ move || rustc_incremental::save_dep_graph(tcx));
+
+ translation
}
/// Run LLVM itself, producing a bitcode file, assembly file or object file
extern crate rustc_lint;
extern crate rustc_plugin;
extern crate rustc_privacy;
+extern crate rustc_incremental;
extern crate rustc_metadata;
extern crate rustc_mir;
extern crate rustc_resolve;
--- /dev/null
+[package]
+authors = ["The Rust Project Developers"]
+name = "rustc_incremental"
+version = "0.0.0"
+
+[lib]
+name = "rustc_incremental"
+path = "lib.rs"
+crate-type = ["dylib"]
+
+[dependencies]
+graphviz = { path = "../libgraphviz" }
+rbml = { path = "../librbml" }
+rustc = { path = "../librustc" }
+rustc_data_structures = { path = "../librustc_data_structures" }
+serialize = { path = "../libserialize" }
+log = { path = "../liblog" }
+syntax = { path = "../libsyntax" }
--- /dev/null
+// Copyright 2012-2015 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+//! This pass is only used for UNIT TESTS and DEBUGGING CODE around
+//! dependency graph construction. It serves two purposes: it dumps
+//! graphs in graphviz form to disk, and it searches for
+//! `#[rustc_if_this_changed]` and `#[rustc_then_this_would_need]`
+//! annotations. These annotations can be used to test whether paths
+//! exist in the graph. These checks run after trans, so they view the
+//! final state of the dependency graph. Note that there are similar
+//! assertions found in `persist::dirty_clean` which check the
+//! **initial** state of the dependency graph, just after it has been
+//! loaded from disk.
+//!
+//! In this code, we report an error on each `rustc_then_this_would_need`
+//! annotation. If a path exists from some `rustc_if_this_changed`
+//! source, we report "OK"; otherwise, we report "no path from `foo`"
+//! for each case where no path exists. `compile-fail` tests can then
+//! be used to check when paths exist or do not.
+//!
+//! The full form of the `rustc_if_this_changed` annotation is
+//! `#[rustc_if_this_changed(id)]`. The `id` argument is optional and
+//! defaults to `"id"` if omitted.
+//!
+//! Example:
+//!
+//! ```
+//! #[rustc_if_this_changed]
+//! fn foo() { }
+//!
+//! #[rustc_then_this_would_need("trans")] //~ ERROR no path from `foo`
+//! fn bar() { }
+//!
+//! #[rustc_then_this_would_need("trans")] //~ ERROR OK
+//! fn baz() { foo(); }
+//! ```
+
+use graphviz as dot;
+use rustc::dep_graph::{DepGraphQuery, DepNode};
+use rustc::hir::def_id::DefId;
+use rustc::ty::TyCtxt;
+use rustc_data_structures::fnv::{FnvHashMap, FnvHashSet};
+use rustc_data_structures::graph::{Direction, INCOMING, OUTGOING, NodeIndex};
+use rustc::hir;
+use rustc::hir::intravisit::Visitor;
+use graphviz::IntoCow;
+use std::env;
+use std::fs::File;
+use std::io::Write;
+use syntax::ast;
+use syntax::attr::AttrMetaMethods;
+use syntax::codemap::Span;
+use syntax::parse::token::InternedString;
+
+const IF_THIS_CHANGED: &'static str = "rustc_if_this_changed";
+const THEN_THIS_WOULD_NEED: &'static str = "rustc_then_this_would_need";
+const ID: &'static str = "id";
+
+pub fn assert_dep_graph(tcx: &TyCtxt) {
+ let _ignore = tcx.dep_graph.in_ignore();
+
+ if tcx.sess.opts.debugging_opts.dump_dep_graph {
+ dump_graph(tcx);
+ }
+
+ // Find annotations supplied by user (if any).
+ let (if_this_changed, then_this_would_need) = {
+ let mut visitor = IfThisChanged { tcx: tcx,
+ if_this_changed: FnvHashMap(),
+ then_this_would_need: FnvHashMap() };
+ tcx.map.krate().visit_all_items(&mut visitor);
+ (visitor.if_this_changed, visitor.then_this_would_need)
+ };
+
+ if !if_this_changed.is_empty() || !then_this_would_need.is_empty() {
+ assert!(tcx.sess.opts.debugging_opts.query_dep_graph,
+ "cannot use the `#[{}]` or `#[{}]` annotations \
+ without supplying `-Z query-dep-graph`",
+ IF_THIS_CHANGED, THEN_THIS_WOULD_NEED);
+ }
+
+ // Check paths.
+ check_paths(tcx, &if_this_changed, &then_this_would_need);
+}
+
+type SourceHashMap =
+ FnvHashMap<InternedString,
+ FnvHashSet<(Span, DefId, DepNode<DefId>)>>;
+type TargetHashMap =
+ FnvHashMap<InternedString,
+ FnvHashSet<(Span, InternedString, ast::NodeId, DepNode<DefId>)>>;
+
+struct IfThisChanged<'a, 'tcx:'a> {
+ tcx: &'a TyCtxt<'tcx>,
+ if_this_changed: SourceHashMap,
+ then_this_would_need: TargetHashMap,
+}
+
+impl<'a, 'tcx> IfThisChanged<'a, 'tcx> {
+ fn process_attrs(&mut self, node_id: ast::NodeId, def_id: DefId) {
+ for attr in self.tcx.get_attrs(def_id).iter() {
+ if attr.check_name(IF_THIS_CHANGED) {
+ let mut id = None;
+ for meta_item in attr.meta_item_list().unwrap_or_default() {
+ match meta_item.node {
+ ast::MetaItemKind::Word(ref s) if id.is_none() => id = Some(s.clone()),
+ _ => {
+ self.tcx.sess.span_err(
+ meta_item.span,
+ &format!("unexpected meta-item {:?}", meta_item.node));
+ }
+ }
+ }
+ let id = id.unwrap_or(InternedString::new(ID));
+ self.if_this_changed.entry(id)
+ .or_insert(FnvHashSet())
+ .insert((attr.span, def_id, DepNode::Hir(def_id)));
+ } else if attr.check_name(THEN_THIS_WOULD_NEED) {
+ let mut dep_node_interned = None;
+ let mut id = None;
+ for meta_item in attr.meta_item_list().unwrap_or_default() {
+ match meta_item.node {
+ ast::MetaItemKind::Word(ref s) if dep_node_interned.is_none() =>
+ dep_node_interned = Some(s.clone()),
+ ast::MetaItemKind::Word(ref s) if id.is_none() =>
+ id = Some(s.clone()),
+ _ => {
+ self.tcx.sess.span_err(
+ meta_item.span,
+ &format!("unexpected meta-item {:?}", meta_item.node));
+ }
+ }
+ }
+ let dep_node = match dep_node_interned {
+ Some(ref n) => {
+ match DepNode::from_label_string(&n[..], def_id) {
+ Ok(n) => n,
+ Err(()) => {
+ self.tcx.sess.span_fatal(
+ attr.span,
+ &format!("unrecognized DepNode variant {:?}", n));
+ }
+ }
+ }
+ None => {
+ self.tcx.sess.span_fatal(
+ attr.span,
+ &format!("missing DepNode variant"));
+ }
+ };
+ let id = id.unwrap_or(InternedString::new(ID));
+ self.then_this_would_need
+ .entry(id)
+ .or_insert(FnvHashSet())
+ .insert((attr.span, dep_node_interned.clone().unwrap(), node_id, dep_node));
+ }
+ }
+ }
+}
+
+impl<'a, 'tcx> Visitor<'tcx> for IfThisChanged<'a, 'tcx> {
+ fn visit_item(&mut self, item: &'tcx hir::Item) {
+ let def_id = self.tcx.map.local_def_id(item.id);
+ self.process_attrs(item.id, def_id);
+ }
+}
+
+fn check_paths(tcx: &TyCtxt,
+ if_this_changed: &SourceHashMap,
+ then_this_would_need: &TargetHashMap)
+{
+ // Return early here so as not to construct the query, which is not cheap.
+ if if_this_changed.is_empty() {
+ return;
+ }
+ let query = tcx.dep_graph.query();
+ for (id, sources) in if_this_changed {
+ let targets = match then_this_would_need.get(id) {
+ Some(targets) => targets,
+ None => {
+ for &(source_span, _, _) in sources.iter().take(1) {
+ tcx.sess.span_err(
+ source_span,
+ &format!("no targets for id `{}`", id));
+ }
+ continue;
+ }
+ };
+
+ for &(_, source_def_id, source_dep_node) in sources {
+ let dependents = query.transitive_dependents(source_dep_node);
+ for &(target_span, ref target_pass, _, ref target_dep_node) in targets {
+ if !dependents.contains(&target_dep_node) {
+ tcx.sess.span_err(
+ target_span,
+ &format!("no path from `{}` to `{}`",
+ tcx.item_path_str(source_def_id),
+ target_pass));
+ } else {
+ tcx.sess.span_err(
+ target_span,
+ &format!("OK"));
+ }
+ }
+ }
+ }
+}
+
+fn dump_graph(tcx: &TyCtxt) {
+ let path: String = env::var("RUST_DEP_GRAPH").unwrap_or_else(|_| format!("dep_graph"));
+ let query = tcx.dep_graph.query();
+
+ let nodes = match env::var("RUST_DEP_GRAPH_FILTER") {
+ Ok(string) => {
+ // Expect one of: "-> target", "source -> target", or "source ->".
+ let parts: Vec<_> = string.split("->").collect();
+ if parts.len() > 2 {
+ bug!("Invalid RUST_DEP_GRAPH_FILTER: expected '[source] -> [target]'");
+ }
+ let sources = node_set(&query, &parts[0]);
+ let targets = node_set(&query, &parts[1]);
+ filter_nodes(&query, &sources, &targets)
+ }
+ Err(_) => {
+ query.nodes()
+ .into_iter()
+ .collect()
+ }
+ };
+ let edges = filter_edges(&query, &nodes);
+
+ { // dump a .txt file with just the edges:
+ let txt_path = format!("{}.txt", path);
+ let mut file = File::create(&txt_path).unwrap();
+ for &(source, target) in &edges {
+ write!(file, "{:?} -> {:?}\n", source, target).unwrap();
+ }
+ }
+
+ { // dump a .dot file in graphviz format:
+ let dot_path = format!("{}.dot", path);
+ let mut v = Vec::new();
+ dot::render(&GraphvizDepGraph(nodes, edges), &mut v).unwrap();
+ File::create(&dot_path).and_then(|mut f| f.write_all(&v)).unwrap();
+ }
+}
+
+pub struct GraphvizDepGraph(FnvHashSet<DepNode<DefId>>,
+ Vec<(DepNode<DefId>, DepNode<DefId>)>);
+
+impl<'a, 'tcx> dot::GraphWalk<'a> for GraphvizDepGraph {
+ type Node = DepNode<DefId>;
+ type Edge = (DepNode<DefId>, DepNode<DefId>);
+ fn nodes(&self) -> dot::Nodes<DepNode<DefId>> {
+ let nodes: Vec<_> = self.0.iter().cloned().collect();
+ nodes.into_cow()
+ }
+ fn edges(&self) -> dot::Edges<(DepNode<DefId>, DepNode<DefId>)> {
+ self.1[..].into_cow()
+ }
+ fn source(&self, edge: &(DepNode<DefId>, DepNode<DefId>)) -> DepNode<DefId> {
+ edge.0
+ }
+ fn target(&self, edge: &(DepNode<DefId>, DepNode<DefId>)) -> DepNode<DefId> {
+ edge.1
+ }
+}
+
+impl<'a, 'tcx> dot::Labeller<'a> for GraphvizDepGraph {
+ type Node = DepNode<DefId>;
+ type Edge = (DepNode<DefId>, DepNode<DefId>);
+ fn graph_id(&self) -> dot::Id {
+ dot::Id::new("DependencyGraph").unwrap()
+ }
+ fn node_id(&self, n: &DepNode<DefId>) -> dot::Id {
+ let s: String =
+ format!("{:?}", n).chars()
+ .map(|c| if c == '_' || c.is_alphanumeric() { c } else { '_' })
+ .collect();
+ debug!("n={:?} s={:?}", n, s);
+ dot::Id::new(s).unwrap()
+ }
+ fn node_label(&self, n: &DepNode<DefId>) -> dot::LabelText {
+ dot::LabelText::label(format!("{:?}", n))
+ }
+}
+
+// Given an optional filter like `"x,y,z"`, returns either `None` (no
+// filter) or the set of nodes whose labels contain all of those
+// substrings.
+fn node_set(query: &DepGraphQuery<DefId>, filter: &str)
+ -> Option<FnvHashSet<DepNode<DefId>>>
+{
+ debug!("node_set(filter={:?})", filter);
+
+ if filter.trim().is_empty() {
+ return None;
+ }
+
+ let filters: Vec<&str> = filter.split("&").map(|s| s.trim()).collect();
+
+ debug!("node_set: filters={:?}", filters);
+
+ Some(query.nodes()
+ .into_iter()
+ .filter(|n| {
+ let s = format!("{:?}", n);
+ filters.iter().all(|f| s.contains(f))
+ })
+ .collect())
+}
+
+fn filter_nodes(query: &DepGraphQuery<DefId>,
+ sources: &Option<FnvHashSet<DepNode<DefId>>>,
+ targets: &Option<FnvHashSet<DepNode<DefId>>>)
+ -> FnvHashSet<DepNode<DefId>>
+{
+ if let &Some(ref sources) = sources {
+ if let &Some(ref targets) = targets {
+ walk_between(query, sources, targets)
+ } else {
+ walk_nodes(query, sources, OUTGOING)
+ }
+ } else if let &Some(ref targets) = targets {
+ walk_nodes(query, targets, INCOMING)
+ } else {
+ query.nodes().into_iter().collect()
+ }
+}
+
+fn walk_nodes(query: &DepGraphQuery<DefId>,
+ starts: &FnvHashSet<DepNode<DefId>>,
+ direction: Direction)
+ -> FnvHashSet<DepNode<DefId>>
+{
+ let mut set = FnvHashSet();
+ for start in starts {
+ debug!("walk_nodes: start={:?} outgoing?={:?}", start, direction == OUTGOING);
+ if set.insert(*start) {
+ let mut stack = vec![query.indices[start]];
+ while let Some(index) = stack.pop() {
+ for (_, edge) in query.graph.adjacent_edges(index, direction) {
+ let neighbor_index = edge.source_or_target(direction);
+ let neighbor = query.graph.node_data(neighbor_index);
+ if set.insert(*neighbor) {
+ stack.push(neighbor_index);
+ }
+ }
+ }
+ }
+ }
+ set
+}
+
+fn walk_between(query: &DepGraphQuery<DefId>,
+ sources: &FnvHashSet<DepNode<DefId>>,
+ targets: &FnvHashSet<DepNode<DefId>>)
+ -> FnvHashSet<DepNode<DefId>>
+{
+ // This is a bit tricky. We want to include a node only if it is:
+ // (a) reachable from a source and (b) will reach a target. And we
+ // have to be careful about cycles etc. Luckily efficiency is not
+ // a big concern!
+
+ #[derive(Copy, Clone, PartialEq)]
+ enum State { Undecided, Deciding, Included, Excluded }
+
+ let mut node_states = vec![State::Undecided; query.graph.len_nodes()];
+
+ for &target in targets {
+ node_states[query.indices[&target].0] = State::Included;
+ }
+
+ for source in sources.iter().map(|n| query.indices[n]) {
+ recurse(query, &mut node_states, source);
+ }
+
+ return query.nodes()
+ .into_iter()
+ .filter(|n| {
+ let index = query.indices[n];
+ node_states[index.0] == State::Included
+ })
+ .collect();
+
+ fn recurse(query: &DepGraphQuery<DefId>,
+ node_states: &mut [State],
+ node: NodeIndex)
+ -> bool
+ {
+ match node_states[node.0] {
+ // known to reach a target
+ State::Included => return true,
+
+ // known not to reach a target
+ State::Excluded => return false,
+
+ // backedge, not yet known, say false
+ State::Deciding => return false,
+
+ State::Undecided => { }
+ }
+
+ node_states[node.0] = State::Deciding;
+
+ for neighbor_index in query.graph.successor_nodes(node) {
+ if recurse(query, node_states, neighbor_index) {
+ node_states[node.0] = State::Included;
+ }
+ }
+
+ // if we didn't find a path to target, then set to excluded
+ if node_states[node.0] == State::Deciding {
+ node_states[node.0] = State::Excluded;
+ false
+ } else {
+ assert!(node_states[node.0] == State::Included);
+ true
+ }
+ }
+}
+
+fn filter_edges(query: &DepGraphQuery<DefId>,
+ nodes: &FnvHashSet<DepNode<DefId>>)
+ -> Vec<(DepNode<DefId>, DepNode<DefId>)>
+{
+ query.edges()
+ .into_iter()
+ .filter(|&(source, target)| nodes.contains(&source) && nodes.contains(&target))
+ .collect()
+}
--- /dev/null
+// Copyright 2012-2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+//! Calculation of a Strict Version Hash for crates. For a lengthy
+//! comment explaining the general idea, see `librustc/middle/svh.rs`.
+
+use std::hash::{Hash, SipHasher, Hasher};
+use rustc::hir::def_id::{CRATE_DEF_INDEX, DefId};
+use rustc::hir::svh::Svh;
+use rustc::ty;
+use rustc::hir::intravisit::{self, Visitor};
+
+use self::svh_visitor::StrictVersionHashVisitor;
+
+pub trait SvhCalculate {
+ /// Calculate the SVH for an entire krate.
+ fn calculate_krate_hash(&self) -> Svh;
+
+ /// Calculate the SVH for a particular item.
+ fn calculate_item_hash(&self, def_id: DefId) -> u64;
+}
+
+impl<'tcx> SvhCalculate for ty::TyCtxt<'tcx> {
+ fn calculate_krate_hash(&self) -> Svh {
+        // FIXME (#14132): This is better than it used to be, but it is still not
+ // ideal. We now attempt to hash only the relevant portions of the
+ // Crate AST as well as the top-level crate attributes. (However,
+ // the hashing of the crate attributes should be double-checked
+ // to ensure it is not incorporating implementation artifacts into
+ // the hash that are not otherwise visible.)
+
+ let crate_disambiguator = self.sess.crate_disambiguator.get();
+ let krate = self.map.krate();
+
+ // FIXME: this should use SHA1, not SipHash. SipHash is not built to
+ // avoid collisions.
+ let mut state = SipHasher::new();
+ debug!("state: {:?}", state);
+
+ // FIXME(#32753) -- at (*) we `to_le` for endianness, but is
+ // this enough, and does it matter anyway?
+ "crate_disambiguator".hash(&mut state);
+ crate_disambiguator.as_str().len().to_le().hash(&mut state); // (*)
+ crate_disambiguator.as_str().hash(&mut state);
+
+ debug!("crate_disambiguator: {:?}", crate_disambiguator.as_str());
+ debug!("state: {:?}", state);
+
+ {
+ let mut visit = StrictVersionHashVisitor::new(&mut state, self);
+ krate.visit_all_items(&mut visit);
+ }
+
+ // FIXME (#14132): This hash is still sensitive to e.g. the
+ // spans of the crate Attributes and their underlying
+ // MetaItems; we should make ContentHashable impl for those
+ // types and then use hash_content. But, since all crate
+ // attributes should appear near beginning of the file, it is
+ // not such a big deal to be sensitive to their spans for now.
+ //
+ // We hash only the MetaItems instead of the entire Attribute
+ // to avoid hashing the AttrId
+ for attr in &krate.attrs {
+ debug!("krate attr {:?}", attr);
+ attr.node.value.hash(&mut state);
+ }
+
+ Svh::from_hash(state.finish())
+ }
+
+ fn calculate_item_hash(&self, def_id: DefId) -> u64 {
+ assert!(def_id.is_local());
+
+ let mut state = SipHasher::new();
+
+ {
+ let mut visit = StrictVersionHashVisitor::new(&mut state, self);
+ if def_id.index == CRATE_DEF_INDEX {
+ // the crate root itself is not registered in the map
+ // as an item, so we have to fetch it this way
+ let krate = self.map.krate();
+ intravisit::walk_crate(&mut visit, krate);
+ } else {
+ let node_id = self.map.as_local_node_id(def_id).unwrap();
+ visit.visit_item(self.map.expect_item(node_id));
+ }
+ }
+
+ state.finish()
+ }
+}
+
+// FIXME (#14132): Even this SVH computation still has implementation
+// artifacts: namely, the order of item declaration will affect the
+// hash computation, but for many kinds of items the order of
+// declaration should be irrelevant to the ABI.
+
+mod svh_visitor {
+ pub use self::SawExprComponent::*;
+ pub use self::SawStmtComponent::*;
+ use self::SawAbiComponent::*;
+ use syntax::ast::{self, Name, NodeId};
+ use syntax::codemap::Span;
+ use syntax::parse::token;
+ use rustc::ty;
+ use rustc::hir;
+ use rustc::hir::*;
+ use rustc::hir::intravisit as visit;
+ use rustc::hir::intravisit::{Visitor, FnKind};
+
+ use std::hash::{Hash, SipHasher};
+
+ pub struct StrictVersionHashVisitor<'a, 'tcx: 'a> {
+ pub tcx: &'a ty::TyCtxt<'tcx>,
+ pub st: &'a mut SipHasher,
+ }
+
+ impl<'a, 'tcx> StrictVersionHashVisitor<'a, 'tcx> {
+ pub fn new(st: &'a mut SipHasher,
+ tcx: &'a ty::TyCtxt<'tcx>)
+ -> Self {
+ StrictVersionHashVisitor { st: st, tcx: tcx }
+ }
+ }
+
+    // To off-load the bulk of the hash-computation onto #[derive(Hash)],
+ // we define a set of enums corresponding to the content that our
+ // crate visitor will encounter as it traverses the ast.
+ //
+ // The important invariant is that all of the Saw*Component enums
+ // do not carry any Spans, Names, or Idents.
+ //
+    // Not carrying any Names/Idents is the important fix for the
+    // problem noted on PR #13948: using the ident.name as the basis
+    // for a hash leads to an unstable SVH, because ident.name is just
+    // an index into the intern table (i.e. essentially a random
+    // address), not computed from the name content.
+ //
+ // With the below enums, the SVH computation is not sensitive to
+ // artifacts of how rustc was invoked nor of how the source code
+ // was laid out. (Or at least it is *less* sensitive.)
+
+ // This enum represents the different potential bits of code the
+ // visitor could encounter that could affect the ABI for the crate,
+ // and assigns each a distinct tag to feed into the hash computation.
+ #[derive(Hash)]
+ enum SawAbiComponent<'a> {
+
+ // FIXME (#14132): should we include (some function of)
+ // ident.ctxt as well?
+ SawIdent(token::InternedString),
+ SawStructDef(token::InternedString),
+
+ SawLifetime(token::InternedString),
+ SawLifetimeDef(token::InternedString),
+
+ SawMod,
+ SawForeignItem,
+ SawItem,
+ SawDecl,
+ SawTy,
+ SawGenerics,
+ SawFn,
+ SawTraitItem,
+ SawImplItem,
+ SawStructField,
+ SawVariant,
+ SawExplicitSelf,
+ SawPath,
+ SawBlock,
+ SawPat,
+ SawLocal,
+ SawArm,
+ SawExpr(SawExprComponent<'a>),
+ SawStmt(SawStmtComponent),
+ }
+
+ /// SawExprComponent carries all of the information that we want
+ /// to include in the hash that *won't* be covered by the
+ /// subsequent recursive traversal of the expression's
+ /// substructure by the visitor.
+ ///
+ /// We know every Expr_ variant is covered by a variant because
+ /// `fn saw_expr` maps each to some case below. Ensuring that
+ /// each variant carries an appropriate payload has to be verified
+ /// by hand.
+ ///
+ /// (However, getting that *exactly* right is not so important
+ /// because the SVH is just a developer convenience; there is no
+ /// guarantee of collision-freedom, hash collisions are just
+ /// (hopefully) unlikely.)
+ #[derive(Hash)]
+ pub enum SawExprComponent<'a> {
+
+ SawExprLoop(Option<token::InternedString>),
+ SawExprField(token::InternedString),
+ SawExprTupField(usize),
+ SawExprBreak(Option<token::InternedString>),
+ SawExprAgain(Option<token::InternedString>),
+
+ SawExprBox,
+ SawExprVec,
+ SawExprCall,
+ SawExprMethodCall,
+ SawExprTup,
+ SawExprBinary(hir::BinOp_),
+ SawExprUnary(hir::UnOp),
+ SawExprLit(ast::LitKind),
+ SawExprCast,
+ SawExprType,
+ SawExprIf,
+ SawExprWhile,
+ SawExprMatch,
+ SawExprClosure,
+ SawExprBlock,
+ SawExprAssign,
+ SawExprAssignOp(hir::BinOp_),
+ SawExprIndex,
+ SawExprPath(Option<usize>),
+ SawExprAddrOf(hir::Mutability),
+ SawExprRet,
+ SawExprInlineAsm(&'a hir::InlineAsm),
+ SawExprStruct,
+ SawExprRepeat,
+ }
+
+ fn saw_expr<'a>(node: &'a Expr_) -> SawExprComponent<'a> {
+ match *node {
+ ExprBox(..) => SawExprBox,
+ ExprVec(..) => SawExprVec,
+ ExprCall(..) => SawExprCall,
+ ExprMethodCall(..) => SawExprMethodCall,
+ ExprTup(..) => SawExprTup,
+ ExprBinary(op, _, _) => SawExprBinary(op.node),
+ ExprUnary(op, _) => SawExprUnary(op),
+ ExprLit(ref lit) => SawExprLit(lit.node.clone()),
+ ExprCast(..) => SawExprCast,
+ ExprType(..) => SawExprType,
+ ExprIf(..) => SawExprIf,
+ ExprWhile(..) => SawExprWhile,
+ ExprLoop(_, id) => SawExprLoop(id.map(|id| id.name.as_str())),
+ ExprMatch(..) => SawExprMatch,
+ ExprClosure(..) => SawExprClosure,
+ ExprBlock(..) => SawExprBlock,
+ ExprAssign(..) => SawExprAssign,
+ ExprAssignOp(op, _, _) => SawExprAssignOp(op.node),
+ ExprField(_, name) => SawExprField(name.node.as_str()),
+ ExprTupField(_, id) => SawExprTupField(id.node),
+ ExprIndex(..) => SawExprIndex,
+ ExprPath(ref qself, _) => SawExprPath(qself.as_ref().map(|q| q.position)),
+ ExprAddrOf(m, _) => SawExprAddrOf(m),
+ ExprBreak(id) => SawExprBreak(id.map(|id| id.node.name.as_str())),
+ ExprAgain(id) => SawExprAgain(id.map(|id| id.node.name.as_str())),
+ ExprRet(..) => SawExprRet,
+ ExprInlineAsm(ref a,_,_) => SawExprInlineAsm(a),
+ ExprStruct(..) => SawExprStruct,
+ ExprRepeat(..) => SawExprRepeat,
+ }
+ }
+
+ /// SawStmtComponent is analogous to SawExprComponent, but for statements.
+ #[derive(Hash)]
+ pub enum SawStmtComponent {
+ SawStmtDecl,
+ SawStmtExpr,
+ SawStmtSemi,
+ }
+
+ fn saw_stmt(node: &Stmt_) -> SawStmtComponent {
+ match *node {
+ StmtDecl(..) => SawStmtDecl,
+ StmtExpr(..) => SawStmtExpr,
+ StmtSemi(..) => SawStmtSemi,
+ }
+ }
+
+ impl<'a, 'tcx> Visitor<'a> for StrictVersionHashVisitor<'a, 'tcx> {
+ fn visit_nested_item(&mut self, item: ItemId) {
+ debug!("visit_nested_item: {:?} st={:?}", item, self.st);
+ let def_path = self.tcx.map.def_path_from_id(item.id);
+ def_path.hash(self.st);
+ }
+
+ fn visit_variant_data(&mut self, s: &'a VariantData, name: Name,
+ g: &'a Generics, _: NodeId, _: Span) {
+ SawStructDef(name.as_str()).hash(self.st);
+ visit::walk_generics(self, g);
+ visit::walk_struct_def(self, s)
+ }
+
+ fn visit_variant(&mut self, v: &'a Variant, g: &'a Generics, item_id: NodeId) {
+ SawVariant.hash(self.st);
+ // walk_variant does not call walk_generics, so do it here.
+ visit::walk_generics(self, g);
+ visit::walk_variant(self, v, g, item_id)
+ }
+
+ // All of the remaining methods just record (in the hash
+ // SipHasher) that the visitor saw that particular variant
+ // (with its payload), and continue walking as the default
+ // visitor would.
+ //
+ // Some of the implementations have some notes as to how one
+ // might try to make their SVH computation less discerning
+ // (e.g. by incorporating reachability analysis). But
+ // currently all of their implementations are uniform and
+ // uninteresting.
+ //
+ // (If you edit a method such that it deviates from the
+ // pattern, please move that method up above this comment.)
+
+ fn visit_name(&mut self, _: Span, name: Name) {
+ SawIdent(name.as_str()).hash(self.st);
+ }
+
+ fn visit_lifetime(&mut self, l: &'a Lifetime) {
+ SawLifetime(l.name.as_str()).hash(self.st);
+ }
+
+ fn visit_lifetime_def(&mut self, l: &'a LifetimeDef) {
+ SawLifetimeDef(l.lifetime.name.as_str()).hash(self.st);
+ }
+
+ // We do recursively walk the bodies of functions/methods
+ // (rather than omitting their bodies from the hash) since
+ // monomorphization and cross-crate inlining generally implies
+ // that a change to a crate body will require downstream
+ // crates to be recompiled.
+ fn visit_expr(&mut self, ex: &'a Expr) {
+ SawExpr(saw_expr(&ex.node)).hash(self.st); visit::walk_expr(self, ex)
+ }
+
+ fn visit_stmt(&mut self, s: &'a Stmt) {
+ SawStmt(saw_stmt(&s.node)).hash(self.st); visit::walk_stmt(self, s)
+ }
+
+ fn visit_foreign_item(&mut self, i: &'a ForeignItem) {
+ // FIXME (#14132) ideally we would incorporate privacy (or
+ // perhaps reachability) somewhere here, so foreign items
+ // that do not leak into downstream crates would not be
+ // part of the ABI.
+ SawForeignItem.hash(self.st); visit::walk_foreign_item(self, i)
+ }
+
+ fn visit_item(&mut self, i: &'a Item) {
+ debug!("visit_item: {:?} st={:?}", i, self.st);
+ // FIXME (#14132) ideally would incorporate reachability
+ // analysis somewhere here, so items that never leak into
+ // downstream crates (e.g. via monomorphisation or
+ // inlining) would not be part of the ABI.
+ SawItem.hash(self.st); visit::walk_item(self, i)
+ }
+
+ fn visit_mod(&mut self, m: &'a Mod, _s: Span, _n: NodeId) {
+ SawMod.hash(self.st); visit::walk_mod(self, m)
+ }
+
+ fn visit_decl(&mut self, d: &'a Decl) {
+ SawDecl.hash(self.st); visit::walk_decl(self, d)
+ }
+
+ fn visit_ty(&mut self, t: &'a Ty) {
+ SawTy.hash(self.st); visit::walk_ty(self, t)
+ }
+
+ fn visit_generics(&mut self, g: &'a Generics) {
+ SawGenerics.hash(self.st); visit::walk_generics(self, g)
+ }
+
+ fn visit_fn(&mut self, fk: FnKind<'a>, fd: &'a FnDecl,
+ b: &'a Block, s: Span, _: NodeId) {
+ SawFn.hash(self.st); visit::walk_fn(self, fk, fd, b, s)
+ }
+
+ fn visit_trait_item(&mut self, ti: &'a TraitItem) {
+ SawTraitItem.hash(self.st); visit::walk_trait_item(self, ti)
+ }
+
+ fn visit_impl_item(&mut self, ii: &'a ImplItem) {
+ SawImplItem.hash(self.st); visit::walk_impl_item(self, ii)
+ }
+
+ fn visit_struct_field(&mut self, s: &'a StructField) {
+ SawStructField.hash(self.st); visit::walk_struct_field(self, s)
+ }
+
+ fn visit_explicit_self(&mut self, es: &'a ExplicitSelf) {
+ SawExplicitSelf.hash(self.st); visit::walk_explicit_self(self, es)
+ }
+
+ fn visit_path(&mut self, path: &'a Path, _: ast::NodeId) {
+ SawPath.hash(self.st); visit::walk_path(self, path)
+ }
+
+ fn visit_path_list_item(&mut self, prefix: &'a Path, item: &'a PathListItem) {
+ SawPath.hash(self.st); visit::walk_path_list_item(self, prefix, item)
+ }
+
+ fn visit_block(&mut self, b: &'a Block) {
+ SawBlock.hash(self.st); visit::walk_block(self, b)
+ }
+
+ fn visit_pat(&mut self, p: &'a Pat) {
+ SawPat.hash(self.st); visit::walk_pat(self, p)
+ }
+
+ fn visit_local(&mut self, l: &'a Local) {
+ SawLocal.hash(self.st); visit::walk_local(self, l)
+ }
+
+ fn visit_arm(&mut self, a: &'a Arm) {
+ SawArm.hash(self.st); visit::walk_arm(self, a)
+ }
+ }
+}
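The `Saw*` enums above exist so that `#[derive(Hash)]` feeds structural content into the hasher, carrying string contents rather than intern-table indices or spans. A minimal sketch of that idea with a hypothetical `Saw` enum (using std's `DefaultHasher` here in place of `SipHasher`); identical component streams hash identically, and changing the carried content changes the hash:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Hypothetical miniature of the Saw* scheme: each structural event
// carries the *content* it saw (e.g. the name's characters), never an
// interned index or a span, so the hash is stable across sessions.
#[derive(Hash)]
enum Saw {
    Item,
    Ident(String), // name content, not an intern-table index
    Block,
}

/// Hash a stream of structural components into one value.
fn svh(components: &[Saw]) -> u64 {
    let mut state = DefaultHasher::new();
    for c in components {
        c.hash(&mut state);
    }
    state.finish()
}

fn main() {
    let a = svh(&[Saw::Item, Saw::Ident("foo".to_string()), Saw::Block]);
    let b = svh(&[Saw::Item, Saw::Ident("foo".to_string()), Saw::Block]);
    assert_eq!(a, b); // identical content, identical hash
}
```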
--- /dev/null
+// Copyright 2012-2013 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+//! Support for serializing the dep-graph and reloading it.
+
+#![crate_name = "rustc_incremental"]
+#![unstable(feature = "rustc_private", issue = "27812")]
+#![crate_type = "dylib"]
+#![crate_type = "rlib"]
+#![doc(html_logo_url = "https://www.rust-lang.org/logos/rust-logo-128x128-blk-v2.png",
+ html_favicon_url = "https://doc.rust-lang.org/favicon.ico",
+ html_root_url = "https://doc.rust-lang.org/nightly/")]
+#![cfg_attr(not(stage0), deny(warnings))]
+
+#![feature(rustc_private)]
+#![feature(staged_api)]
+
+extern crate graphviz;
+extern crate rbml;
+#[macro_use] extern crate rustc;
+extern crate rustc_data_structures;
+extern crate serialize as rustc_serialize;
+
+#[macro_use] extern crate log;
+#[macro_use] extern crate syntax;
+
+mod assert_dep_graph;
+mod calculate_svh;
+mod persist;
+
+pub use assert_dep_graph::assert_dep_graph;
+pub use calculate_svh::SvhCalculate;
+pub use persist::load_dep_graph;
+pub use persist::save_dep_graph;
--- /dev/null
+This is the code to load/save the dependency graph. Loading is assumed
+to run early in compilation, and saving at the very end. When loading,
+the basic idea is that we will load up the dependency graph from the
+previous compilation and compare the hashes of our HIR nodes to the
+hashes of the HIR nodes that existed at the time. For each node whose
+hash has changed, or which no longer exists in the new HIR, we can
+remove that node from the old graph along with any nodes that depend
+on it. Whatever remains is then added to the new graph. (If any of
+those nodes or edges already existed, re-adding them would have no
+effect, but since loading runs first thing, they do not.)
+
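The removal step described above is a fixed-point computation. A hypothetical standalone sketch (plain `u32` ids stand in for `DepNode<DefId>`, and this only loosely mirrors the real `compute_clean_edges`):

```rust
use std::collections::HashSet;

// Given the saved edges and an initial set of dirty nodes, repeatedly
// drop edges touching a dirty node; a dirty source also dirties its
// target. Returns the edges that survive (the "clean" subgraph).
fn propagate_dirty(edges: &[(u32, u32)], dirty: &mut HashSet<u32>) -> Vec<(u32, u32)> {
    let mut clean_edges: Vec<(u32, u32)> = edges.to_vec();
    let mut previous = usize::MAX; // force at least one pass
    while dirty.len() != previous {
        previous = dirty.len();
        let mut i = 0;
        while i < clean_edges.len() {
            let (source, target) = clean_edges[i];
            if dirty.contains(&source) {
                // dirty source: the target becomes dirty too
                clean_edges.swap_remove(i);
                dirty.insert(target);
            } else if dirty.contains(&target) {
                // dirty target: drop the edge, source stays clean
                clean_edges.swap_remove(i);
            } else {
                i += 1;
            }
        }
    }
    clean_edges
}

fn main() {
    // graph: 1 -> 2 -> 3, 4 -> 5; node 1 starts out dirty
    let mut dirty: HashSet<u32> = vec![1].into_iter().collect();
    let clean = propagate_dirty(&[(1, 2), (2, 3), (4, 5)], &mut dirty);
    assert!(dirty.contains(&2) && dirty.contains(&3));
    assert_eq!(clean, vec![(4, 5)]);
}
```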
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+//! The data that we will serialize and deserialize.
+
+use rustc::dep_graph::DepNode;
+use rustc_serialize::{Decoder as RustcDecoder,
+ Encodable as RustcEncodable, Encoder as RustcEncoder};
+
+use super::directory::DefPathIndex;
+
+#[derive(Debug, RustcEncodable, RustcDecodable)]
+pub struct SerializedDepGraph {
+ pub nodes: Vec<DepNode<DefPathIndex>>,
+ pub edges: Vec<SerializedEdge>,
+ pub hashes: Vec<SerializedHash>,
+}
+
+pub type SerializedEdge = (DepNode<DefPathIndex>, DepNode<DefPathIndex>);
+
+#[derive(Debug, RustcEncodable, RustcDecodable)]
+pub struct SerializedHash {
+ pub index: DefPathIndex,
+
+ /// the hash itself, computed by `calculate_item_hash`
+ pub hash: u64,
+}
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+//! Code to convert a DefId into a DefPath (when serializing) and then
+//! back again (when deserializing). Note that the new DefId will not
+//! necessarily be the same as the old one (and of course the item
+//! might even have been removed in the meantime).
+
+use rustc::dep_graph::DepNode;
+use rustc::hir::map::DefPath;
+use rustc::hir::def_id::DefId;
+use rustc::ty;
+use rustc::util::nodemap::DefIdMap;
+use rustc_serialize::{Decoder as RustcDecoder,
+ Encodable as RustcEncodable, Encoder as RustcEncoder};
+use std::fmt::{self, Debug};
+
+/// Index into the DefIdDirectory
+#[derive(Copy, Clone, Debug, PartialOrd, Ord, Hash, PartialEq, Eq,
+ RustcEncodable, RustcDecodable)]
+pub struct DefPathIndex {
+ index: u32
+}
+
+#[derive(RustcEncodable, RustcDecodable)]
+pub struct DefIdDirectory {
+ // N.B. don't use Removable here because these def-ids are loaded
+ // directly without remapping, so loading them should not fail.
+ paths: Vec<DefPath>
+}
+
+impl DefIdDirectory {
+ pub fn new() -> DefIdDirectory {
+ DefIdDirectory { paths: vec![] }
+ }
+
+ pub fn retrace(&self, tcx: &ty::TyCtxt) -> RetracedDefIdDirectory {
+ let ids = self.paths.iter()
+ .map(|path| tcx.map.retrace_path(path))
+ .collect();
+ RetracedDefIdDirectory { ids: ids }
+ }
+}
+
+#[derive(Debug, RustcEncodable, RustcDecodable)]
+pub struct RetracedDefIdDirectory {
+ ids: Vec<Option<DefId>>
+}
+
+impl RetracedDefIdDirectory {
+ pub fn def_id(&self, index: DefPathIndex) -> Option<DefId> {
+ self.ids[index.index as usize]
+ }
+
+ pub fn map(&self, node: DepNode<DefPathIndex>) -> Option<DepNode<DefId>> {
+ node.map_def(|&index| self.def_id(index))
+ }
+}
+
+pub struct DefIdDirectoryBuilder<'a,'tcx:'a> {
+ tcx: &'a ty::TyCtxt<'tcx>,
+ hash: DefIdMap<Option<DefPathIndex>>,
+ directory: DefIdDirectory,
+}
+
+impl<'a,'tcx> DefIdDirectoryBuilder<'a,'tcx> {
+ pub fn new(tcx: &'a ty::TyCtxt<'tcx>) -> DefIdDirectoryBuilder<'a, 'tcx> {
+ DefIdDirectoryBuilder {
+ tcx: tcx,
+ hash: DefIdMap(),
+ directory: DefIdDirectory::new()
+ }
+ }
+
+ pub fn add(&mut self, def_id: DefId) -> Option<DefPathIndex> {
+ if !def_id.is_local() {
+ // FIXME(#32015) clarify story about cross-crate dep tracking
+ return None;
+ }
+
+ let tcx = self.tcx;
+ let paths = &mut self.directory.paths;
+ self.hash.entry(def_id)
+ .or_insert_with(|| {
+ let def_path = tcx.def_path(def_id);
+ if !def_path.is_local() {
+ return None;
+ }
+ let index = paths.len() as u32;
+ paths.push(def_path);
+ Some(DefPathIndex { index: index })
+ })
+ .clone()
+ }
+
+ pub fn map(&mut self, node: DepNode<DefId>) -> Option<DepNode<DefPathIndex>> {
+ node.map_def(|&def_id| self.add(def_id))
+ }
+
+ pub fn into_directory(self) -> DefIdDirectory {
+ self.directory
+ }
+}
+
+impl Debug for DefIdDirectory {
+ fn fmt(&self, fmt: &mut fmt::Formatter) -> Result<(), fmt::Error> {
+ fmt.debug_list()
+ .entries(self.paths.iter().enumerate())
+ .finish()
+ }
+}
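The interning scheme in `DefIdDirectoryBuilder` can be sketched standalone (hedged: plain strings stand in for `DefPath`s, `u32` indices for `DefPathIndex`, and "retracing" is modeled as a lookup into the new session's table — none of this is the real rustc API):

```rust
use std::collections::HashMap;

// Each distinct path is stored once; repeated adds return the same
// stable index, which is what gets serialized in place of a DefId.
struct Builder {
    index_of: HashMap<String, u32>,
    paths: Vec<String>,
}

impl Builder {
    fn new() -> Builder {
        Builder { index_of: HashMap::new(), paths: Vec::new() }
    }

    fn add(&mut self, path: &str) -> u32 {
        let paths = &mut self.paths;
        *self.index_of.entry(path.to_string()).or_insert_with(|| {
            let index = paths.len() as u32;
            paths.push(path.to_string());
            index
        })
    }
}

// "Retracing": map each saved path to its id in the new session, if it
// still exists there; removed items yield None.
fn retrace(paths: &[String], new_session: &HashMap<String, u32>) -> Vec<Option<u32>> {
    paths.iter().map(|p| new_session.get(p).cloned()).collect()
}
```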
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+//! Debugging code to test the state of the dependency graph just
+//! after it is loaded from disk. For each node marked with
+//! `#[rustc_clean]` or `#[rustc_dirty]`, we will check that a
+//! suitable node for that item either appears or does not appear in
+//! the dep-graph, as appropriate:
+//!
+//! - `#[rustc_dirty(label="TypeckItemBody", cfg="rev2")]`: if we are
+//! compiling under `#[cfg(rev2)]`, then there MUST NOT be a node
+//! `DepNode::TypeckItemBody(X)`, where `X` is the def-id of the
+//! current node.
+//! - `#[rustc_clean(label="TypeckItemBody", cfg="rev2")]` same as above,
+//! except that the node MUST exist.
+//!
+//! Errors are reported if we are in the suitable configuration but
+//! the required condition is not met.
+
+use rustc::dep_graph::{DepGraphQuery, DepNode};
+use rustc::hir;
+use rustc::hir::def_id::DefId;
+use rustc::hir::intravisit::Visitor;
+use syntax::ast::{self, Attribute, MetaItem};
+use syntax::attr::AttrMetaMethods;
+use syntax::parse::token::InternedString;
+use rustc::ty;
+
+const DIRTY: &'static str = "rustc_dirty";
+const CLEAN: &'static str = "rustc_clean";
+const LABEL: &'static str = "label";
+const CFG: &'static str = "cfg";
+
+pub fn check_dirty_clean_annotations(tcx: &ty::TyCtxt) {
+ let _ignore = tcx.dep_graph.in_ignore();
+ let query = tcx.dep_graph.query();
+ let krate = tcx.map.krate();
+ krate.visit_all_items(&mut DirtyCleanVisitor {
+ tcx: tcx,
+ query: &query,
+ });
+}
+
+pub struct DirtyCleanVisitor<'a, 'tcx:'a> {
+ tcx: &'a ty::TyCtxt<'tcx>,
+ query: &'a DepGraphQuery<DefId>,
+}
+
+impl<'a, 'tcx> DirtyCleanVisitor<'a, 'tcx> {
+ fn expect_associated_value(&self, item: &MetaItem) -> InternedString {
+ if let Some(value) = item.value_str() {
+ value
+ } else {
+ self.tcx.sess.span_fatal(
+ item.span,
+ &format!("associated value expected for `{}`", item.name()));
+ }
+ }
+
+ /// Given a `#[rustc_dirty]` or `#[rustc_clean]` attribute, scan
+ /// for a `cfg="foo"` attribute and check whether we have a cfg
+ /// flag called `foo`.
+ fn check_config(&self, attr: &ast::Attribute) -> bool {
+ debug!("check_config(attr={:?})", attr);
+ let config = &self.tcx.map.krate().config;
+ debug!("check_config: config={:?}", config);
+ for item in attr.meta_item_list().unwrap_or(&[]) {
+ if item.check_name(CFG) {
+ let value = self.expect_associated_value(item);
+ debug!("check_config: searching for cfg {:?}", value);
+ for cfg in &config[..] {
+ if cfg.check_name(&value[..]) {
+ debug!("check_config: matched {:?}", cfg);
+ return true;
+ }
+ }
+ }
+ }
+ debug!("check_config: no match found");
+ return false;
+ }
+
+ fn dep_node(&self, attr: &Attribute, def_id: DefId) -> DepNode<DefId> {
+ for item in attr.meta_item_list().unwrap_or(&[]) {
+ if item.check_name(LABEL) {
+ let value = self.expect_associated_value(item);
+ match DepNode::from_label_string(&value[..], def_id) {
+ Ok(dep_node) => return dep_node,
+ Err(()) => {
+ self.tcx.sess.span_fatal(
+ item.span,
+ &format!("dep-node label `{}` not recognized", value));
+ }
+ }
+ }
+ }
+
+ self.tcx.sess.span_fatal(attr.span, "no `label` found");
+ }
+
+ fn dep_node_str(&self, dep_node: DepNode<DefId>) -> DepNode<String> {
+ dep_node.map_def(|&def_id| Some(self.tcx.item_path_str(def_id))).unwrap()
+ }
+
+ fn assert_dirty(&self, item: &hir::Item, dep_node: DepNode<DefId>) {
+ debug!("assert_dirty({:?})", dep_node);
+
+ if self.query.contains_node(&dep_node) {
+ let dep_node_str = self.dep_node_str(dep_node);
+ self.tcx.sess.span_err(
+ item.span,
+ &format!("`{:?}` found in dep graph, but should be dirty", dep_node_str));
+ }
+ }
+
+ fn assert_clean(&self, item: &hir::Item, dep_node: DepNode<DefId>) {
+ debug!("assert_clean({:?})", dep_node);
+
+ if !self.query.contains_node(&dep_node) {
+ let dep_node_str = self.dep_node_str(dep_node);
+ self.tcx.sess.span_err(
+ item.span,
+ &format!("`{:?}` not found in dep graph, but should be clean", dep_node_str));
+ }
+ }
+}
+
+impl<'a, 'tcx> Visitor<'tcx> for DirtyCleanVisitor<'a, 'tcx> {
+ fn visit_item(&mut self, item: &'tcx hir::Item) {
+ let def_id = self.tcx.map.local_def_id(item.id);
+ for attr in self.tcx.get_attrs(def_id).iter() {
+ if attr.check_name(DIRTY) {
+ if self.check_config(attr) {
+ self.assert_dirty(item, self.dep_node(attr, def_id));
+ }
+ } else if attr.check_name(CLEAN) {
+ if self.check_config(attr) {
+ self.assert_clean(item, self.dep_node(attr, def_id));
+ }
+ }
+ }
+ }
+}
+
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+//! Code to save/load the dep-graph from files.
+
+use calculate_svh::SvhCalculate;
+use rbml::Error;
+use rbml::opaque::Decoder;
+use rustc::dep_graph::DepNode;
+use rustc::hir::def_id::DefId;
+use rustc::ty;
+use rustc_data_structures::fnv::FnvHashSet;
+use rustc_serialize::Decodable as RustcDecodable;
+use std::io::Read;
+use std::fs::File;
+use std::path::Path;
+
+use super::data::*;
+use super::directory::*;
+use super::dirty_clean;
+use super::util::*;
+
+type DirtyNodes = FnvHashSet<DepNode<DefId>>;
+
+type CleanEdges = Vec<(DepNode<DefId>, DepNode<DefId>)>;
+
+/// If we are in incremental mode, and a previous dep-graph exists,
+/// then load up those nodes/edges that are still valid into the
+/// dep-graph for this session. (This is assumed to be running very
+/// early in compilation, before we've really done any work, but
+/// actually it doesn't matter all that much.) See `README.md` for
+/// more general overview.
+pub fn load_dep_graph<'tcx>(tcx: &ty::TyCtxt<'tcx>) {
+ let _ignore = tcx.dep_graph.in_ignore();
+
+ if let Some(dep_graph) = dep_graph_path(tcx) {
+ // FIXME(#32754) lock file?
+ load_dep_graph_if_exists(tcx, &dep_graph);
+ dirty_clean::check_dirty_clean_annotations(tcx);
+ }
+}
+
+pub fn load_dep_graph_if_exists<'tcx>(tcx: &ty::TyCtxt<'tcx>, path: &Path) {
+ if !path.exists() {
+ return;
+ }
+
+ let mut data = vec![];
+ match
+ File::open(path)
+ .and_then(|mut file| file.read_to_end(&mut data))
+ {
+ Ok(_) => { }
+ Err(err) => {
+ tcx.sess.err(
+ &format!("could not load dep-graph from `{}`: {}",
+ path.display(), err));
+ return;
+ }
+ }
+
+ match decode_dep_graph(tcx, &data) {
+ Ok(()) => { }
+ Err(err) => {
+ bug!("decoding error in dep-graph from `{}`: {}", path.display(), err);
+ }
+ }
+}
+
+pub fn decode_dep_graph<'tcx>(tcx: &ty::TyCtxt<'tcx>, data: &[u8])
+ -> Result<(), Error>
+{
+ // Deserialize the directory and dep-graph.
+ let mut decoder = Decoder::new(data, 0);
+ let directory = try!(DefIdDirectory::decode(&mut decoder));
+ let serialized_dep_graph = try!(SerializedDepGraph::decode(&mut decoder));
+
+ debug!("decode_dep_graph: directory = {:#?}", directory);
+ debug!("decode_dep_graph: serialized_dep_graph = {:#?}", serialized_dep_graph);
+
+ // Retrace the paths in the directory to find their current location (if any).
+ let retraced = directory.retrace(tcx);
+
+ debug!("decode_dep_graph: retraced = {:#?}", retraced);
+
+ // Compute the set of Hir nodes whose data has changed.
+ let mut dirty_nodes =
+ initial_dirty_nodes(tcx, &serialized_dep_graph.hashes, &retraced);
+
+ debug!("decode_dep_graph: initial dirty_nodes = {:#?}", dirty_nodes);
+
+ // Find all DepNodes reachable from that core set. This loop
+ // iterates repeatedly over the list of edges whose source is not
+ // known to be dirty (`clean_edges`). If it finds an edge whose
+ // source is dirty, it removes it from that list and adds the
+ // target to `dirty_nodes`. It stops when it reaches a fixed
+ // point.
+ let clean_edges = compute_clean_edges(&serialized_dep_graph.edges,
+ &retraced,
+ &mut dirty_nodes);
+
+ // Add synthetic `foo->foo` edges for each clean node `foo` that
+ // we had before. This is sort of a hack to create clean nodes in
+ // the graph, since the existence of a node is a signal that the
+ // work it represents need not be repeated.
+ let clean_nodes =
+ serialized_dep_graph.nodes
+ .iter()
+ .filter_map(|&node| retraced.map(node))
+ .filter(|node| !dirty_nodes.contains(node))
+ .map(|node| (node, node));
+
+ // Add nodes and edges that are not dirty into our main graph.
+ let dep_graph = tcx.dep_graph.clone();
+ for (source, target) in clean_edges.into_iter().chain(clean_nodes) {
+ let _task = dep_graph.in_task(target);
+ dep_graph.read(source);
+
+ debug!("decode_dep_graph: clean edge: {:?} -> {:?}", source, target);
+ }
+
+ Ok(())
+}
+
+fn initial_dirty_nodes<'tcx>(tcx: &ty::TyCtxt<'tcx>,
+ hashed_items: &[SerializedHash],
+ retraced: &RetracedDefIdDirectory)
+ -> DirtyNodes {
+ let mut items_removed = false;
+ let mut dirty_nodes = FnvHashSet();
+ for hashed_item in hashed_items {
+ match retraced.def_id(hashed_item.index) {
+ Some(def_id) => {
+ // FIXME(#32753) -- should we use a distinct hash here
+ let current_hash = tcx.calculate_item_hash(def_id);
+ debug!("initial_dirty_nodes: hash of {:?} is {:?}, was {:?}",
+ def_id, current_hash, hashed_item.hash);
+ if current_hash != hashed_item.hash {
+ dirty_nodes.insert(DepNode::Hir(def_id));
+ }
+ }
+ None => {
+ items_removed = true;
+ }
+ }
+ }
+
+ // If any of the items in the krate have changed, then we consider
+ // the meta-node `Krate` to be dirty, since it represents work that
+ // (potentially) read the contents of every single item.
+ if items_removed || !dirty_nodes.is_empty() {
+ dirty_nodes.insert(DepNode::Krate);
+ }
+
+ dirty_nodes
+}
+
+fn compute_clean_edges(serialized_edges: &[SerializedEdge],
+ retraced: &RetracedDefIdDirectory,
+ dirty_nodes: &mut DirtyNodes)
+ -> CleanEdges {
+ // Build up an initial list of edges. Include an edge (source,
+ // target) if neither node has been removed. If the source has
+ // been removed, add target to the list of dirty nodes.
+ let mut clean_edges = Vec::with_capacity(serialized_edges.len());
+ for &(serialized_source, serialized_target) in serialized_edges {
+ if let Some(target) = retraced.map(serialized_target) {
+ if let Some(source) = retraced.map(serialized_source) {
+ clean_edges.push((source, target))
+ } else {
+ // source removed, target must be dirty
+ dirty_nodes.insert(target);
+ }
+ } else {
+ // target removed, ignore the edge
+ }
+ }
+
+ debug!("compute_clean_edges: dirty_nodes={:#?}", dirty_nodes);
+
+ // Propagate dirty marks by iterating repeatedly over
+ // `clean_edges`. If we find an edge `(source, target)` where
+ // `source` is dirty, add `target` to the list of dirty nodes and
+ // remove it. Keep doing this until we find no more dirty nodes.
+ let mut previous_size = 0;
+ while dirty_nodes.len() > previous_size {
+ debug!("compute_clean_edges: previous_size={}", previous_size);
+ previous_size = dirty_nodes.len();
+ let mut i = 0;
+ while i < clean_edges.len() {
+ if dirty_nodes.contains(&clean_edges[i].0) {
+ let (source, target) = clean_edges.swap_remove(i);
+ debug!("compute_clean_edges: dirty source {:?} -> {:?}",
+ source, target);
+ dirty_nodes.insert(target);
+ } else if dirty_nodes.contains(&clean_edges[i].1) {
+ let (source, target) = clean_edges.swap_remove(i);
+ debug!("compute_clean_edges: dirty target {:?} -> {:?}",
+ source, target);
+ } else {
+ i += 1;
+ }
+ }
+ }
+
+ clean_edges
+}
--- /dev/null
+// Copyright 2012-2015 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+//! When in incremental mode, this pass dumps out the dependency graph
+//! into the given directory. At the same time, it also hashes the
+//! various HIR nodes.
+
+mod data;
+mod directory;
+mod dirty_clean;
+mod load;
+mod save;
+mod util;
+
+pub use self::load::load_dep_graph;
+pub use self::save::save_dep_graph;
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+use calculate_svh::SvhCalculate;
+use rbml::opaque::Encoder;
+use rustc::dep_graph::DepNode;
+use rustc::ty;
+use rustc_serialize::{Encodable as RustcEncodable};
+use std::io::{self, Cursor, Write};
+use std::fs::{self, File};
+
+use super::data::*;
+use super::directory::*;
+use super::util::*;
+
+pub fn save_dep_graph<'tcx>(tcx: &ty::TyCtxt<'tcx>) {
+ let _ignore = tcx.dep_graph.in_ignore();
+
+ if let Some(dep_graph) = dep_graph_path(tcx) {
+ // FIXME(#32754) lock file?
+
+ // delete the old dep-graph, if any
+ if dep_graph.exists() {
+ match fs::remove_file(&dep_graph) {
+ Ok(()) => { }
+ Err(err) => {
+ tcx.sess.err(
+ &format!("unable to delete old dep-graph at `{}`: {}",
+ dep_graph.display(), err));
+ return;
+ }
+ }
+ }
+
+ // generate the data in a memory buffer
+ let mut wr = Cursor::new(Vec::new());
+ match encode_dep_graph(tcx, &mut Encoder::new(&mut wr)) {
+ Ok(()) => { }
+ Err(err) => {
+ tcx.sess.err(
+ &format!("could not encode dep-graph to `{}`: {}",
+ dep_graph.display(), err));
+ return;
+ }
+ }
+
+ // write the data out
+ let data = wr.into_inner();
+ match
+ File::create(&dep_graph)
+ .and_then(|mut file| file.write_all(&data))
+ {
+ Ok(_) => { }
+ Err(err) => {
+ tcx.sess.err(
+ &format!("failed to write dep-graph to `{}`: {}",
+ dep_graph.display(), err));
+ return;
+ }
+ }
+ }
+}
+
+pub fn encode_dep_graph<'tcx>(tcx: &ty::TyCtxt<'tcx>,
+ encoder: &mut Encoder)
+ -> io::Result<()>
+{
+ // Here we take advantage of how RBML allows us to skip around
+ // and encode the depgraph as a two-part structure:
+ //
+ // ```
+ // <dep-graph>[SerializedDepGraph]</dep-graph> // tag 0
+ // <directory>[DefIdDirectory]</directory> // tag 1
+ // ```
+ //
+ // Then later we can load the directory by skipping to find tag 1.
+
+ let query = tcx.dep_graph.query();
+
+ let mut builder = DefIdDirectoryBuilder::new(tcx);
+
+ // Create hashes for things we can persist.
+ let hashes =
+ query.nodes()
+ .into_iter()
+ .filter_map(|dep_node| match dep_node {
+ DepNode::Hir(def_id) => {
+ assert!(def_id.is_local());
+ builder.add(def_id)
+ .map(|index| {
+ // FIXME(#32753) -- should we use a distinct hash here
+ let hash = tcx.calculate_item_hash(def_id);
+ SerializedHash { index: index, hash: hash }
+ })
+ }
+ _ => None
+ })
+ .collect();
+
+ // Create the serialized dep-graph, dropping nodes that are
+ // from other crates or from inlined items.
+ //
+ // FIXME(#32015) fix handling of other crates
+ let graph = SerializedDepGraph {
+ nodes: query.nodes().into_iter()
+ .flat_map(|node| builder.map(node))
+ .collect(),
+ edges: query.edges().into_iter()
+ .flat_map(|(source_node, target_node)| {
+ builder.map(source_node)
+ .and_then(|source| {
+ builder.map(target_node)
+ .map(|target| (source, target))
+ })
+ })
+ .collect(),
+ hashes: hashes,
+ };
+
+ debug!("graph = {:#?}", graph);
+
+ // Encode the directory and then the graph data.
+ let directory = builder.into_directory();
+ try!(directory.encode(encoder));
+ try!(graph.encode(encoder));
+
+ Ok(())
+}
+
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+use rustc::ty;
+use std::fs;
+use std::path::PathBuf;
+
+pub fn dep_graph_path<'tcx>(tcx: &ty::TyCtxt<'tcx>) -> Option<PathBuf> {
+ // For now, just save/load dep-graph from
+ // directory/dep_graph.rbml
+ tcx.sess.opts.incremental.as_ref().and_then(|incr_dir| {
+ match fs::create_dir_all(&incr_dir) {
+ Ok(()) => {}
+ Err(err) => {
+ tcx.sess.err(
+ &format!("could not create the directory `{}`: {}",
+ incr_dir.display(), err));
+ return None;
+ }
+ }
+
+ Some(incr_dir.join("dep_graph.rbml"))
+ })
+}
+
// start with a non-lowercase letter rather than non-uppercase
// ones (some scripts don't have a concept of upper/lowercase)
- !name.is_empty() && !name.char_at(0).is_lowercase() && !name.contains('_')
+ !name.is_empty() &&
+ !name.chars().next().unwrap().is_lowercase() &&
+ !name.contains('_')
}
fn to_camel_case(s: &str) -> String {
#![feature(rustc_private)]
#![feature(slice_patterns)]
#![feature(staged_api)]
-#![feature(str_char)]
#[macro_use]
extern crate syntax;
reader::tagged_docs(depsdoc, tag_crate_dep).enumerate().map(|(crate_num, depdoc)| {
let name = docstr(depdoc, tag_crate_dep_crate_name);
- let hash = Svh::new(&docstr(depdoc, tag_crate_dep_hash));
+ let hash = Svh::new(docstr(depdoc, tag_crate_dep_hash));
let doc = reader::get_doc(depdoc, tag_crate_dep_explicitly_linked);
let explicitly_linked = reader::doc_as_u8(doc) != 0;
CrateDep {
pub fn maybe_get_crate_hash(data: &[u8]) -> Option<Svh> {
let cratedoc = rbml::Doc::new(data);
reader::maybe_get_doc(cratedoc, tag_crate_hash).map(|doc| {
- Svh::new(doc.as_str_slice())
+ Svh::new(doc.as_str_slice().to_string())
})
}
pub fn get_crate_hash(data: &[u8]) -> Svh {
let cratedoc = rbml::Doc::new(data);
let hashdoc = reader::get_doc(cratedoc, tag_crate_hash);
- Svh::new(hashdoc.as_str_slice())
+ Svh::new(hashdoc.as_str_slice().to_string())
}
pub fn maybe_get_crate_name(data: &[u8]) -> Option<&str> {
// broken MIR, so try not to report duplicate errors.
return;
}
- let _task = tcx.dep_graph.in_task(DepNode::MirTypeck(id));
+ let def_id = tcx.map.local_def_id(id);
+ let _task = tcx.dep_graph.in_task(DepNode::MirTypeck(def_id));
let param_env = ty::ParameterEnvironment::for_item(tcx, id);
let infcx = infer::new_infer_ctxt(tcx,
&tcx.tables,
(&Failed(_), &Failed(_)) => {
let resolutions = target_module.resolutions.borrow();
let names = resolutions.iter().filter_map(|(&(ref name, _), resolution)| {
+ if *name == source { return None; } // Never suggest the same name
match *resolution.borrow() {
NameResolution { binding: Some(_), .. } => Some(name),
NameResolution { single_imports: SingleImports::None, .. } => None,
Some(name) => format!(". Did you mean to use `{}`?", name),
None => "".to_owned(),
};
- let msg = format!("There is no `{}` in `{}`{}",
- source,
- module_to_string(target_module), lev_suggestion);
+ let module_str = module_to_string(target_module);
+ let msg = if &module_str == "???" {
+ format!("There is no `{}` in the crate root{}", source, lev_suggestion)
+ } else {
+ format!("There is no `{}` in `{}`{}", source, module_str, lev_suggestion)
+ };
return Failed(Some((directive.span, msg)));
}
_ => (),
rustc_const_eval = { path = "../librustc_const_eval" }
rustc_const_math = { path = "../librustc_const_math" }
rustc_data_structures = { path = "../librustc_data_structures" }
+rustc_incremental = { path = "../librustc_incremental" }
rustc_llvm = { path = "../librustc_llvm" }
rustc_mir = { path = "../librustc_mir" }
rustc_platform_intrinsics = { path = "../librustc_platform_intrinsics" }
/// Only later will `original_ty` aka `%Foo` be used in the LLVM function
/// pointer type, without ever having introspected it.
pub ty: Type,
+ /// Signedness for integer types, None for other types
+ pub signedness: Option<bool>,
/// Coerced LLVM Type
pub cast: Option<Type>,
/// Dummy argument, which is emitted before the real argument
kind: ArgKind::Direct,
original_ty: original_ty,
ty: ty,
+ signedness: None,
cast: None,
pad: None,
attrs: llvm::Attributes::default()
self.kind = ArgKind::Ignore;
}
+ pub fn extend_integer_width_to(&mut self, bits: u64) {
+ // Only integers have signedness
+ if let Some(signed) = self.signedness {
+ if self.ty.int_width() < bits {
+ self.attrs.set(if signed {
+ llvm::Attribute::SExt
+ } else {
+ llvm::Attribute::ZExt
+ });
+ }
+ }
+ }
+
pub fn is_indirect(&self) -> bool {
self.kind == ArgKind::Indirect
}
} else {
let mut arg = ArgType::new(type_of::type_of(ccx, ty),
type_of::sizing_type_of(ccx, ty));
+ if ty.is_integral() {
+ arg.signedness = Some(ty.is_signed());
+ }
if llsize_of_real(ccx, arg.ty) == 0 {
// For some forsaken reason, x86_64-pc-windows-gnu
// doesn't ignore zero-sized struct arguments.
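The rule that `extend_integer_width_to` implements can be sketched in isolation (hedged: `Ext` is a stand-in for the LLVM `SExt`/`ZExt` attributes, and the `Option<bool>` mirrors the new `signedness` field):

```rust
// Integer arguments narrower than the target width are marked for sign
// extension if signed, zero extension if unsigned; non-integer types
// (signedness None) and already-wide integers are left alone.
#[derive(Debug, PartialEq)]
enum Ext { None, Sext, Zext }

fn extension_for(signedness: Option<bool>, int_width: u64, target_bits: u64) -> Ext {
    match signedness {
        // Only integers carry a signedness, so only integers extend.
        Some(signed) if int_width < target_bits => {
            if signed { Ext::Sext } else { Ext::Zext }
        }
        _ => Ext::None,
    }
}

fn main() {
    assert_eq!(extension_for(Some(true), 8, 32), Ext::Sext);
    assert_eq!(extension_for(Some(false), 16, 32), Ext::Zext);
    assert_eq!(extension_for(None, 8, 32), Ext::None);
}
```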
+++ /dev/null
-// Copyright 2012-2015 The Rust Project Developers. See the COPYRIGHT
-// file at the top-level directory of this distribution and at
-// http://rust-lang.org/COPYRIGHT.
-//
-// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
-// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
-// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
-// option. This file may not be copied, modified, or distributed
-// except according to those terms.
-
-//! This pass is only used for the UNIT TESTS and DEBUGGING NEEDS
-//! around dependency graph construction. It serves two purposes; it
-//! will dump graphs in graphviz form to disk, and it searches for
-//! `#[rustc_if_this_changed]` and `#[rustc_then_this_would_need]`
-//! annotations. These annotations can be used to test whether paths
-//! exist in the graph. We report errors on each
-//! `rustc_if_this_changed` annotation. If a path exists in all
-//! cases, then we would report "all path(s) exist". Otherwise, we
-//! report: "no path to `foo`" for each case where no path exists.
-//! `compile-fail` tests can then be used to check when paths exist or
-//! do not.
-//!
-//! The full form of the `rustc_if_this_changed` annotation is
-//! `#[rustc_if_this_changed(id)]`. The `"id"` is optional and
-//! defaults to `"id"` if omitted.
-//!
-//! Example:
-//!
-//! ```
-//! #[rustc_if_this_changed]
-//! fn foo() { }
-//!
-//! #[rustc_then_this_would_need("trans")] //~ ERROR no path from `foo`
-//! fn bar() { }
-//!
-//! #[rustc_then_this_would_need("trans")] //~ ERROR OK
-//! fn baz() { foo(); }
-//! ```
-
-use graphviz as dot;
-use rustc::dep_graph::{DepGraphQuery, DepNode};
-use rustc::hir::def_id::DefId;
-use rustc::ty::TyCtxt;
-use rustc_data_structures::fnv::{FnvHashMap, FnvHashSet};
-use rustc_data_structures::graph::{Direction, INCOMING, OUTGOING, NodeIndex};
-use rustc::hir;
-use rustc::hir::intravisit::Visitor;
-use graphviz::IntoCow;
-use std::env;
-use std::fs::File;
-use std::io::Write;
-use syntax::ast;
-use syntax::attr::AttrMetaMethods;
-use syntax::codemap::Span;
-use syntax::parse::token::InternedString;
-
-const IF_THIS_CHANGED: &'static str = "rustc_if_this_changed";
-const THEN_THIS_WOULD_NEED: &'static str = "rustc_then_this_would_need";
-const ID: &'static str = "id";
-
-pub fn assert_dep_graph(tcx: &TyCtxt) {
- let _ignore = tcx.dep_graph.in_ignore();
-
- if tcx.sess.opts.dump_dep_graph {
- dump_graph(tcx);
- }
-
- // Find annotations supplied by user (if any).
- let (if_this_changed, then_this_would_need) = {
- let mut visitor = IfThisChanged { tcx: tcx,
- if_this_changed: FnvHashMap(),
- then_this_would_need: FnvHashMap() };
- tcx.map.krate().visit_all_items(&mut visitor);
- (visitor.if_this_changed, visitor.then_this_would_need)
- };
-
- // Check paths.
- check_paths(tcx, &if_this_changed, &then_this_would_need);
-}
-
-type SourceHashMap = FnvHashMap<InternedString,
- FnvHashSet<(Span, DefId, DepNode)>>;
-type TargetHashMap = FnvHashMap<InternedString,
- FnvHashSet<(Span, InternedString, ast::NodeId, DepNode)>>;
-
-struct IfThisChanged<'a, 'tcx:'a> {
- tcx: &'a TyCtxt<'tcx>,
- if_this_changed: SourceHashMap,
- then_this_would_need: TargetHashMap,
-}
-
-impl<'a, 'tcx> IfThisChanged<'a, 'tcx> {
- fn process_attrs(&mut self, node_id: ast::NodeId, def_id: DefId) {
- for attr in self.tcx.get_attrs(def_id).iter() {
- if attr.check_name(IF_THIS_CHANGED) {
- let mut id = None;
- for meta_item in attr.meta_item_list().unwrap_or_default() {
- match meta_item.node {
- ast::MetaItemKind::Word(ref s) if id.is_none() => id = Some(s.clone()),
- _ => {
- self.tcx.sess.span_err(
- meta_item.span,
- &format!("unexpected meta-item {:?}", meta_item.node));
- }
- }
- }
- let id = id.unwrap_or(InternedString::new(ID));
- self.if_this_changed.entry(id)
- .or_insert(FnvHashSet())
- .insert((attr.span, def_id, DepNode::Hir(def_id)));
- } else if attr.check_name(THEN_THIS_WOULD_NEED) {
- let mut dep_node_interned = None;
- let mut id = None;
- for meta_item in attr.meta_item_list().unwrap_or_default() {
- match meta_item.node {
- ast::MetaItemKind::Word(ref s) if dep_node_interned.is_none() =>
- dep_node_interned = Some(s.clone()),
- ast::MetaItemKind::Word(ref s) if id.is_none() =>
- id = Some(s.clone()),
- _ => {
- self.tcx.sess.span_err(
- meta_item.span,
- &format!("unexpected meta-item {:?}", meta_item.node));
- }
- }
- }
- let dep_node_str = dep_node_interned.as_ref().map(|s| &**s);
- macro_rules! match_depnode_name {
- ($input:expr, $def_id:expr, match { $($variant:ident,)* } else $y:expr) => {
- match $input {
- $(Some(stringify!($variant)) => DepNode::$variant($def_id),)*
- _ => $y
- }
- }
- }
- let dep_node = match_depnode_name! {
- dep_node_str, def_id, match {
- CollectItem,
- BorrowCheck,
- TransCrateItem,
- TypeckItemType,
- TypeckItemBody,
- ImplOrTraitItems,
- ItemSignature,
- FieldTy,
- TraitItemDefIds,
- InherentImpls,
- ImplItems,
- TraitImpls,
- ReprHints,
- } else {
- self.tcx.sess.span_fatal(
- attr.span,
- &format!("unrecognized DepNode variant {:?}", dep_node_str));
- }
- };
- let id = id.unwrap_or(InternedString::new(ID));
- self.then_this_would_need
- .entry(id)
- .or_insert(FnvHashSet())
- .insert((attr.span, dep_node_interned.clone().unwrap(), node_id, dep_node));
- }
- }
- }
-}
-
-impl<'a, 'tcx> Visitor<'tcx> for IfThisChanged<'a, 'tcx> {
- fn visit_item(&mut self, item: &'tcx hir::Item) {
- let def_id = self.tcx.map.local_def_id(item.id);
- self.process_attrs(item.id, def_id);
- }
-}
-
-fn check_paths(tcx: &TyCtxt,
- if_this_changed: &SourceHashMap,
- then_this_would_need: &TargetHashMap)
-{
- // Return early here so as not to construct the query, which is not cheap.
- if if_this_changed.is_empty() {
- return;
- }
- let query = tcx.dep_graph.query();
- for (id, sources) in if_this_changed {
- let targets = match then_this_would_need.get(id) {
- Some(targets) => targets,
- None => {
- for &(source_span, _, _) in sources.iter().take(1) {
- tcx.sess.span_err(
- source_span,
- &format!("no targets for id `{}`", id));
- }
- continue;
- }
- };
-
- for &(_, source_def_id, source_dep_node) in sources {
- let dependents = query.dependents(source_dep_node);
- for &(target_span, ref target_pass, _, ref target_dep_node) in targets {
- if !dependents.contains(&target_dep_node) {
- tcx.sess.span_err(
- target_span,
- &format!("no path from `{}` to `{}`",
- tcx.item_path_str(source_def_id),
- target_pass));
- } else {
- tcx.sess.span_err(
- target_span,
- &format!("OK"));
- }
- }
- }
- }
-}
-
-fn dump_graph(tcx: &TyCtxt) {
- let path: String = env::var("RUST_DEP_GRAPH").unwrap_or_else(|_| format!("dep_graph"));
- let query = tcx.dep_graph.query();
-
- let nodes = match env::var("RUST_DEP_GRAPH_FILTER") {
- Ok(string) => {
- // Expect one of: "-> target", "source -> target", or "source ->".
- let parts: Vec<_> = string.split("->").collect();
- if parts.len() > 2 {
- bug!("Invalid RUST_DEP_GRAPH_FILTER: expected '[source] -> [target]'");
- }
- let sources = node_set(&query, &parts[0]);
- let targets = node_set(&query, &parts[1]);
- filter_nodes(&query, &sources, &targets)
- }
- Err(_) => {
- query.nodes()
- .into_iter()
- .collect()
- }
- };
- let edges = filter_edges(&query, &nodes);
-
- { // dump a .txt file with just the edges:
- let txt_path = format!("{}.txt", path);
- let mut file = File::create(&txt_path).unwrap();
- for &(source, target) in &edges {
- write!(file, "{:?} -> {:?}\n", source, target).unwrap();
- }
- }
-
- { // dump a .dot file in graphviz format:
- let dot_path = format!("{}.dot", path);
- let mut v = Vec::new();
- dot::render(&GraphvizDepGraph(nodes, edges), &mut v).unwrap();
- File::create(&dot_path).and_then(|mut f| f.write_all(&v)).unwrap();
- }
-}
-
-pub struct GraphvizDepGraph(FnvHashSet<DepNode>, Vec<(DepNode, DepNode)>);
-
-impl<'a, 'tcx> dot::GraphWalk<'a> for GraphvizDepGraph {
- type Node = DepNode;
- type Edge = (DepNode, DepNode);
- fn nodes(&self) -> dot::Nodes<DepNode> {
- let nodes: Vec<_> = self.0.iter().cloned().collect();
- nodes.into_cow()
- }
- fn edges(&self) -> dot::Edges<(DepNode, DepNode)> {
- self.1[..].into_cow()
- }
- fn source(&self, edge: &(DepNode, DepNode)) -> DepNode {
- edge.0
- }
- fn target(&self, edge: &(DepNode, DepNode)) -> DepNode {
- edge.1
- }
-}
-
-impl<'a, 'tcx> dot::Labeller<'a> for GraphvizDepGraph {
- type Node = DepNode;
- type Edge = (DepNode, DepNode);
- fn graph_id(&self) -> dot::Id {
- dot::Id::new("DependencyGraph").unwrap()
- }
- fn node_id(&self, n: &DepNode) -> dot::Id {
- let s: String =
- format!("{:?}", n).chars()
- .map(|c| if c == '_' || c.is_alphanumeric() { c } else { '_' })
- .collect();
- debug!("n={:?} s={:?}", n, s);
- dot::Id::new(s).unwrap()
- }
- fn node_label(&self, n: &DepNode) -> dot::LabelText {
- dot::LabelText::label(format!("{:?}", n))
- }
-}
-
-// Given an optional filter like `"x,y,z"`, returns either `None` (no
-// filter) or the set of nodes whose labels contain all of those
-// substrings.
-fn node_set(query: &DepGraphQuery, filter: &str) -> Option<FnvHashSet<DepNode>> {
- debug!("node_set(filter={:?})", filter);
-
- if filter.trim().is_empty() {
- return None;
- }
-
- let filters: Vec<&str> = filter.split("&").map(|s| s.trim()).collect();
-
- debug!("node_set: filters={:?}", filters);
-
- Some(query.nodes()
- .into_iter()
- .filter(|n| {
- let s = format!("{:?}", n);
- filters.iter().all(|f| s.contains(f))
- })
- .collect())
-}
-
-fn filter_nodes(query: &DepGraphQuery,
- sources: &Option<FnvHashSet<DepNode>>,
- targets: &Option<FnvHashSet<DepNode>>)
- -> FnvHashSet<DepNode>
-{
- if let &Some(ref sources) = sources {
- if let &Some(ref targets) = targets {
- walk_between(query, sources, targets)
- } else {
- walk_nodes(query, sources, OUTGOING)
- }
- } else if let &Some(ref targets) = targets {
- walk_nodes(query, targets, INCOMING)
- } else {
- query.nodes().into_iter().collect()
- }
-}
-
-fn walk_nodes(query: &DepGraphQuery,
- starts: &FnvHashSet<DepNode>,
- direction: Direction)
- -> FnvHashSet<DepNode>
-{
- let mut set = FnvHashSet();
- for start in starts {
- debug!("walk_nodes: start={:?} outgoing?={:?}", start, direction == OUTGOING);
- if set.insert(*start) {
- let mut stack = vec![query.indices[start]];
- while let Some(index) = stack.pop() {
- for (_, edge) in query.graph.adjacent_edges(index, direction) {
- let neighbor_index = edge.source_or_target(direction);
- let neighbor = query.graph.node_data(neighbor_index);
- if set.insert(*neighbor) {
- stack.push(neighbor_index);
- }
- }
- }
- }
- }
- set
-}
-
-fn walk_between(query: &DepGraphQuery,
- sources: &FnvHashSet<DepNode>,
- targets: &FnvHashSet<DepNode>)
- -> FnvHashSet<DepNode>
-{
- // This is a bit tricky. We want to include a node only if it is:
- // (a) reachable from a source and (b) will reach a target. And we
- // have to be careful about cycles etc. Luckily efficiency is not
- // a big concern!
-
- #[derive(Copy, Clone, PartialEq)]
- enum State { Undecided, Deciding, Included, Excluded }
-
- let mut node_states = vec![State::Undecided; query.graph.len_nodes()];
-
- for &target in targets {
- node_states[query.indices[&target].0] = State::Included;
- }
-
- for source in sources.iter().map(|n| query.indices[n]) {
- recurse(query, &mut node_states, source);
- }
-
- return query.nodes()
- .into_iter()
- .filter(|n| {
- let index = query.indices[n];
- node_states[index.0] == State::Included
- })
- .collect();
-
- fn recurse(query: &DepGraphQuery,
- node_states: &mut [State],
- node: NodeIndex)
- -> bool
- {
- match node_states[node.0] {
- // known to reach a target
- State::Included => return true,
-
- // known not to reach a target
- State::Excluded => return false,
-
- // backedge, not yet known, say false
- State::Deciding => return false,
-
- State::Undecided => { }
- }
-
- node_states[node.0] = State::Deciding;
-
- for neighbor_index in query.graph.successor_nodes(node) {
- if recurse(query, node_states, neighbor_index) {
- node_states[node.0] = State::Included;
- }
- }
-
- // if we didn't find a path to target, then set to excluded
- if node_states[node.0] == State::Deciding {
- node_states[node.0] = State::Excluded;
- false
- } else {
- assert!(node_states[node.0] == State::Included);
- true
- }
- }
-}
-
-fn filter_edges(query: &DepGraphQuery,
- nodes: &FnvHashSet<DepNode>)
- -> Vec<(DepNode, DepNode)>
-{
- query.edges()
- .into_iter()
- .filter(|&(source, target)| nodes.contains(&source) && nodes.contains(&target))
- .collect()
-}
use super::rpath::RPathConfig;
use super::rpath;
use super::msvc;
-use super::svh::Svh;
use session::config;
use session::config::NoDebugInfo;
use session::config::{OutputFilenames, Input, OutputType};
use CrateTranslation;
use util::common::time;
use util::fs::fix_windows_verbatim_for_gcc;
+use rustc::ty::TyCtxt;
use rustc_back::tempdir::TempDir;
+use rustc_incremental::SvhCalculate;
use std::ascii;
use std::char;
use std::env;
use syntax::codemap::Span;
use syntax::attr::AttrMetaMethods;
-use rustc::hir;
-
// RLIB LLVM-BYTECODE OBJECT LAYOUT
// Version 1
// Bytes Data
}
"rust_out".to_string()
+
}
-pub fn build_link_meta(sess: &Session,
- krate: &hir::Crate,
+pub fn build_link_meta(tcx: &TyCtxt,
name: &str)
-> LinkMeta {
let r = LinkMeta {
crate_name: name.to_owned(),
- crate_hash: Svh::calculate(&sess.crate_disambiguator.get().as_str(), krate),
+ crate_hash: tcx.calculate_krate_hash(),
};
info!("{:?}", r);
return r;
use _match;
use abi::{self, Abi, FnType};
use adt;
-use assert_dep_graph;
use attributes;
use build::*;
use builder::{Builder, noname};
}
}
- let link_meta = link::build_link_meta(&tcx.sess, krate, name);
+ let link_meta = link::build_link_meta(&tcx, name);
let codegen_units = tcx.sess.opts.cg.codegen_units;
let shared_ccx = SharedCrateContext::new(&link_meta.crate_name,
};
let no_builtins = attr::contains_name(&krate.attrs, "no_builtins");
- assert_dep_graph::assert_dep_graph(tcx);
-
CrateTranslation {
modules: modules,
metadata_module: metadata_module,
fn classify_ret_ty(ccx: &CrateContext, ret: &mut ArgType) {
if is_reg_ty(ret.ty) {
+ ret.extend_integer_width_to(32);
return;
}
if let Some((base_ty, members)) = is_homogenous_aggregate_ty(ret.ty) {
fn classify_arg_ty(ccx: &CrateContext, arg: &mut ArgType) {
if is_reg_ty(arg.ty) {
+ arg.extend_integer_width_to(32);
return;
}
if let Some((base_ty, members)) = is_homogenous_aggregate_ty(arg.ty) {
fn classify_ret_ty(ccx: &CrateContext, ret: &mut ArgType, align_fn: TyAlignFn) {
if is_reg_ty(ret.ty) {
+ ret.extend_integer_width_to(32);
return;
}
let size = ty_size(ret.ty, align_fn);
fn classify_arg_ty(ccx: &CrateContext, arg: &mut ArgType, align_fn: TyAlignFn) {
if is_reg_ty(arg.ty) {
+ arg.extend_integer_width_to(32);
return;
}
let align = align_fn(arg.ty);
}
}
+fn classify_ret_ty(ccx: &CrateContext, ret: &mut ArgType) {
+ if is_reg_ty(ret.ty) {
+ ret.extend_integer_width_to(32);
+ } else {
+ ret.make_indirect(ccx);
+ }
+}
+
fn classify_arg_ty(ccx: &CrateContext, arg: &mut ArgType, offset: &mut usize) {
let orig_offset = *offset;
let size = ty_size(arg.ty) * 8;
if !is_reg_ty(arg.ty) {
arg.cast = Some(struct_ty(ccx, arg.ty));
arg.pad = padding_ty(ccx, align, orig_offset);
+ } else {
+ arg.extend_integer_width_to(32);
}
}
}
pub fn compute_abi_info(ccx: &CrateContext, fty: &mut FnType) {
- if !fty.ret.is_ignore() && !is_reg_ty(fty.ret.ty) {
- fty.ret.make_indirect(ccx);
+ if !fty.ret.is_ignore() {
+ classify_ret_ty(ccx, &mut fty.ret);
}
let mut offset = if fty.ret.is_indirect() { 4 } else { 0 };
}
}
+fn classify_ret_ty(ccx: &CrateContext, ret: &mut ArgType) {
+ if is_reg_ty(ret.ty) {
+ ret.extend_integer_width_to(32);
+ } else {
+ ret.make_indirect(ccx);
+ }
+}
+
fn classify_arg_ty(ccx: &CrateContext, arg: &mut ArgType, offset: &mut usize) {
let orig_offset = *offset;
let size = ty_size(arg.ty) * 8;
if !is_reg_ty(arg.ty) {
arg.cast = Some(struct_ty(ccx, arg.ty));
arg.pad = padding_ty(ccx, align, orig_offset);
+ } else {
+ arg.extend_integer_width_to(32);
}
}
}
pub fn compute_abi_info(ccx: &CrateContext, fty: &mut FnType) {
- if !fty.ret.is_ignore() && !is_reg_ty(fty.ret.ty) {
- fty.ret.make_indirect(ccx);
+ if !fty.ret.is_ignore() {
+ classify_ret_ty(ccx, &mut fty.ret);
}
let mut offset = if fty.ret.is_indirect() { 4 } else { 0 };
fn classify_ret_ty(ccx: &CrateContext, ret: &mut ArgType) {
if is_reg_ty(ret.ty) {
+ ret.extend_integer_width_to(64);
return;
}
fn classify_arg_ty(ccx: &CrateContext, arg: &mut ArgType) {
if is_reg_ty(arg.ty) {
+ arg.extend_integer_width_to(64);
return;
}
use super::machine::*;
pub fn compute_abi_info(ccx: &CrateContext, fty: &mut FnType) {
- if !fty.ret.is_ignore() && fty.ret.ty.kind() == Struct {
- // Returning a structure. Most often, this will use
- // a hidden first argument. On some platforms, though,
- // small structs are returned as integers.
- //
- // Some links:
- // http://www.angelcode.com/dev/callconv/callconv.html
- // Clang's ABI handling is in lib/CodeGen/TargetInfo.cpp
- let t = &ccx.sess().target.target;
- if t.options.is_like_osx || t.options.is_like_windows {
- match llsize_of_alloc(ccx, fty.ret.ty) {
- 1 => fty.ret.cast = Some(Type::i8(ccx)),
- 2 => fty.ret.cast = Some(Type::i16(ccx)),
- 4 => fty.ret.cast = Some(Type::i32(ccx)),
- 8 => fty.ret.cast = Some(Type::i64(ccx)),
- _ => fty.ret.make_indirect(ccx)
+ if !fty.ret.is_ignore() {
+ if fty.ret.ty.kind() == Struct {
+ // Returning a structure. Most often, this will use
+ // a hidden first argument. On some platforms, though,
+ // small structs are returned as integers.
+ //
+ // Some links:
+ // http://www.angelcode.com/dev/callconv/callconv.html
+ // Clang's ABI handling is in lib/CodeGen/TargetInfo.cpp
+ let t = &ccx.sess().target.target;
+ if t.options.is_like_osx || t.options.is_like_windows {
+ match llsize_of_alloc(ccx, fty.ret.ty) {
+ 1 => fty.ret.cast = Some(Type::i8(ccx)),
+ 2 => fty.ret.cast = Some(Type::i16(ccx)),
+ 4 => fty.ret.cast = Some(Type::i32(ccx)),
+ 8 => fty.ret.cast = Some(Type::i64(ccx)),
+ _ => fty.ret.make_indirect(ccx)
+ }
+ } else {
+ fty.ret.make_indirect(ccx);
}
} else {
- fty.ret.make_indirect(ccx);
+ fty.ret.extend_integer_width_to(32);
}
}
if arg.ty.kind() == Struct {
arg.make_indirect(ccx);
arg.attrs.set(Attribute::ByVal);
+ } else {
+ arg.extend_integer_width_to(32);
}
}
}
} else {
arg.cast = Some(llreg_ty(ccx, &cls));
}
+ } else {
+ arg.extend_integer_width_to(32);
}
}
8 => a.cast = Some(Type::i64(ccx)),
_ => a.make_indirect(ccx)
}
+ } else {
+ a.extend_integer_width_to(32);
}
};
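Several of the ABI hunks above add `extend_integer_width_to(32)` (or 64): C calling conventions require integer arguments narrower than a register to be widened, sign-extending signed types and zero-extending unsigned ones. The distinction in plain Rust casts (an illustration of the rule, not the trans code itself):

```rust
fn main() {
    // Signed types sign-extend: the upper bits replicate the sign bit.
    let narrow: i8 = -1;              // bit pattern 0xFF
    let widened = narrow as i32;      // bit pattern 0xFFFF_FFFF
    assert_eq!(widened, -1);
    assert_eq!(widened as u32, 0xFFFF_FFFF);

    // Unsigned types zero-extend: the upper bits become zero.
    let narrow: u8 = 0xFF;
    let widened = narrow as u32;
    assert_eq!(widened, 0x0000_00FF);
}
```

Either way the numeric value is preserved; which extension to emit depends on the type's signedness, which is why the helper is applied per `ArgType` rather than globally.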
impl<'tcx> DepTrackingMapConfig for TraitSelectionCache<'tcx> {
type Key = ty::PolyTraitRef<'tcx>;
type Value = traits::Vtable<'tcx, ()>;
- fn to_dep_node(key: &ty::PolyTraitRef<'tcx>) -> DepNode {
+ fn to_dep_node(key: &ty::PolyTraitRef<'tcx>) -> DepNode<DefId> {
ty::tls::with(|tcx| {
let lifted_key = tcx.lift(key).unwrap();
lifted_key.to_poly_trait_predicate().dep_node()
use syntax::parse::token;
-const DW_LANG_RUST: c_uint = 0x9000;
+// From DWARF 5.
+// See http://www.dwarfstd.org/ShowIssue.php?issue=140129.1
+const DW_LANG_RUST: c_uint = 0x1c;
#[allow(non_upper_case_globals)]
const DW_ATE_boolean: c_uint = 0x02;
#[allow(non_upper_case_globals)]
output: &mut String) {
// First, find out the 'real' def_id of the type. Items inlined from
// other crates have to be mapped back to their source.
- let source_def_id = if let Some(node_id) = cx.tcx().map.as_local_node_id(def_id) {
+ let def_id = if let Some(node_id) = cx.tcx().map.as_local_node_id(def_id) {
match cx.external_srcs().borrow().get(&node_id).cloned() {
Some(source_def_id) => {
// The given def_id identifies the inlined copy of a
def_id
};
- // Get the crate hash as first part of the identifier.
- let crate_hash = if source_def_id.is_local() {
- cx.link_meta().crate_hash.clone()
+ // Get the crate name/disambiguator as first part of the identifier.
+ let crate_name = if def_id.is_local() {
+ cx.tcx().crate_name.clone()
} else {
- cx.sess().cstore.crate_hash(source_def_id.krate)
+ cx.sess().cstore.original_crate_name(def_id.krate)
};
+ let crate_disambiguator = cx.tcx().crate_disambiguator(def_id.krate);
- output.push_str(crate_hash.as_str());
+ output.push_str(&crate_name[..]);
output.push_str("/");
+ output.push_str(&crate_disambiguator[..]);
+ output.push_str("/");
+ // Add the def-index as the second part
output.push_str(&format!("{:x}", def_id.index.as_usize()));
- // Maybe check that there is no self type here.
-
let tps = substs.types.get_slice(subst::TypeSpace);
if !tps.is_empty() {
output.push('<');
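The debuginfo hunk above replaces the crate hash with a `crate_name/disambiguator/def_index` prefix for unique type ids, with the def-index rendered in hex. The resulting identifier shape can be sketched as (function name and inputs are illustrative, not the rustc API):

```rust
// Build a type-id prefix the way the new code does: crate name,
// then crate disambiguator, then the def-index as lowercase hex.
fn type_id_prefix(crate_name: &str, disambiguator: &str, def_index: usize) -> String {
    let mut output = String::new();
    output.push_str(crate_name);
    output.push_str("/");
    output.push_str(disambiguator);
    output.push_str("/");
    output.push_str(&format!("{:x}", def_index));
    output
}

fn main() {
    assert_eq!(type_id_prefix("mycrate", "d1sambig", 255), "mycrate/d1sambig/ff");
}
```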
#[macro_use] extern crate rustc;
extern crate rustc_back;
extern crate rustc_data_structures;
+extern crate rustc_incremental;
pub extern crate rustc_llvm as llvm;
extern crate rustc_mir;
extern crate rustc_platform_intrinsics as intrinsics;
mod abi;
mod adt;
mod asm;
-mod assert_dep_graph;
mod attributes;
mod base;
mod basic_block;
match *lvalue {
mir::Lvalue::Temp(index) => {
match context {
+ LvalueContext::Call => {
+ self.mark_assigned(index as usize);
+ }
LvalueContext::Consume => {
}
LvalueContext::Store |
use llvm::{self, BasicBlockRef, ValueRef, OperandBundleDef};
use rustc::ty;
use rustc::mir::repr as mir;
-use abi::{Abi, FnType};
+use abi::{Abi, FnType, ArgType};
use adt;
use base;
use build;
use callee::{Callee, CalleeData, Fn, Intrinsic, NamedTupleConstructor, Virtual};
-use common::{self, Block, BlockAndBuilder, C_undef};
+use common::{self, type_is_fat_ptr, Block, BlockAndBuilder, C_undef};
use debuginfo::DebugLoc;
use Disr;
use machine::{llalign_of_min, llbitsize_of_real};
use glue;
use type_::Type;
-use super::{MirContext, drop};
+use super::{MirContext, TempRef, drop};
use super::lvalue::{LvalueRef, load_fat_ptr};
use super::operand::OperandRef;
use super::operand::OperandValue::{self, FatPtr, Immediate, Ref};
_ => bug!("{} is not callable", callee.ty)
};
+ let sig = bcx.tcx().erase_late_bound_regions(sig);
+
// Handle intrinsics old trans wants Expr's for, ourselves.
let intrinsic = match (&callee.ty.sty, &callee.data) {
(&ty::TyFnDef(def_id, _, _), &Intrinsic) => {
if intrinsic == Some("transmute") {
let &(ref dest, target) = destination.as_ref().unwrap();
- let dst = self.trans_lvalue(&bcx, dest);
- let mut val = self.trans_operand(&bcx, &args[0]);
- if let ty::TyFnDef(def_id, substs, _) = val.ty.sty {
- let llouttype = type_of::type_of(bcx.ccx(), dst.ty.to_ty(bcx.tcx()));
- let out_type_size = llbitsize_of_real(bcx.ccx(), llouttype);
- if out_type_size != 0 {
- // FIXME #19925 Remove this hack after a release cycle.
- let f = Callee::def(bcx.ccx(), def_id, substs);
- let datum = f.reify(bcx.ccx());
- val = OperandRef {
- val: OperandValue::Immediate(datum.val),
- ty: datum.ty
- };
- }
- }
+ self.with_lvalue_ref(&bcx, dest, |this, dest| {
+ this.trans_transmute(&bcx, &args[0], dest);
+ });
- let llty = type_of::type_of(bcx.ccx(), val.ty);
- let cast_ptr = bcx.pointercast(dst.llval, llty.ptr_to());
- self.store_operand(&bcx, cast_ptr, val);
self.set_operand_dropped(&bcx, &args[0]);
funclet_br(bcx, self.llblock(target));
return;
}
- let extra_args = &args[sig.0.inputs.len()..];
+ let extra_args = &args[sig.inputs.len()..];
let extra_args = extra_args.iter().map(|op_arg| {
self.mir.operand_ty(bcx.tcx(), op_arg)
}).collect::<Vec<_>>();
let mut llargs = Vec::with_capacity(arg_count);
// Prepare the return value destination
- let ret_dest = if let Some((ref d, _)) = *destination {
- let dest = self.trans_lvalue(&bcx, d);
- if fn_ty.ret.is_indirect() {
- llargs.push(dest.llval);
- None
- } else if fn_ty.ret.is_ignore() {
- None
+ let ret_dest = if let Some((ref dest, _)) = *destination {
+ let is_intrinsic = if let Intrinsic = callee.data {
+ true
} else {
- Some(dest)
- }
+ false
+ };
+ self.make_return_dest(&bcx, dest, &fn_ty.ret, &mut llargs, is_intrinsic)
} else {
- None
+ ReturnDest::Nothing
};
// Split the rust-call tupled arguments off.
use expr::{Ignore, SaveIn};
use intrinsic::trans_intrinsic_call;
- let (dest, llargs) = if fn_ty.ret.is_indirect() {
- (SaveIn(llargs[0]), &llargs[1..])
- } else if let Some(dest) = ret_dest {
- (SaveIn(dest.llval), &llargs[..])
- } else {
- (Ignore, &llargs[..])
+ let (dest, llargs) = match ret_dest {
+ _ if fn_ty.ret.is_indirect() => {
+ (SaveIn(llargs[0]), &llargs[1..])
+ }
+ ReturnDest::Nothing => (Ignore, &llargs[..]),
+ ReturnDest::IndirectOperand(dst, _) |
+ ReturnDest::Store(dst) => (SaveIn(dst), &llargs[..]),
+ ReturnDest::DirectOperand(_) =>
+ bug!("Cannot use direct operand with an intrinsic call")
};
bcx.with_block(|bcx| {
- let res = trans_intrinsic_call(bcx, callee.ty, &fn_ty,
+ trans_intrinsic_call(bcx, callee.ty, &fn_ty,
ArgVals(llargs), dest,
DebugLoc::None);
- let bcx = res.bcx.build();
- if let Some((_, target)) = *destination {
- for op in args {
- self.set_operand_dropped(&bcx, op);
- }
- funclet_br(bcx, self.llblock(target));
- } else {
- // trans_intrinsic_call already used Unreachable.
- // bcx.unreachable();
- }
});
+
+ if let ReturnDest::IndirectOperand(dst, _) = ret_dest {
+ // Make a fake operand for store_return
+ let op = OperandRef {
+ val: OperandValue::Ref(dst),
+ ty: sig.output.unwrap()
+ };
+ self.store_return(&bcx, ret_dest, fn_ty.ret, op);
+ }
+
+ if let Some((_, target)) = *destination {
+ for op in args {
+ self.set_operand_dropped(&bcx, op);
+ }
+ funclet_br(bcx, self.llblock(target));
+ } else {
+ // trans_intrinsic_call already used Unreachable.
+ // bcx.unreachable();
+ }
+
return;
}
Fn(f) => f,
if destination.is_some() {
let ret_bcx = ret_bcx.build();
ret_bcx.at_start(|ret_bcx| {
- if let Some(ret_dest) = ret_dest {
- fn_ty.ret.store(&ret_bcx, invokeret, ret_dest.llval);
- }
+ let op = OperandRef {
+ val: OperandValue::Immediate(invokeret),
+ ty: sig.output.unwrap()
+ };
+ self.store_return(&ret_bcx, ret_dest, fn_ty.ret, op);
for op in args {
self.set_operand_dropped(&ret_bcx, op);
}
let llret = bcx.call(fn_ptr, &llargs, cleanup_bundle.as_ref());
fn_ty.apply_attrs_callsite(llret);
if let Some((_, target)) = *destination {
- if let Some(ret_dest) = ret_dest {
- fn_ty.ret.store(&bcx, llret, ret_dest.llval);
- }
+ let op = OperandRef {
+ val: OperandValue::Immediate(llret),
+ ty: sig.output.unwrap()
+ };
+ self.store_return(&bcx, ret_dest, fn_ty.ret, op);
for op in args {
self.set_operand_dropped(&bcx, op);
}
pub fn llblock(&self, bb: mir::BasicBlock) -> BasicBlockRef {
self.blocks[bb.index()].llbb
}
+
+ fn make_return_dest(&mut self, bcx: &BlockAndBuilder<'bcx, 'tcx>,
+ dest: &mir::Lvalue<'tcx>, fn_ret_ty: &ArgType,
+ llargs: &mut Vec<ValueRef>, is_intrinsic: bool) -> ReturnDest {
+ // If the return is ignored, we can just return a do-nothing ReturnDest
+ if fn_ret_ty.is_ignore() {
+ return ReturnDest::Nothing;
+ }
+ let dest = match *dest {
+ mir::Lvalue::Temp(idx) => {
+ let lvalue_ty = self.mir.lvalue_ty(bcx.tcx(), dest);
+ let lvalue_ty = bcx.monomorphize(&lvalue_ty);
+ let ret_ty = lvalue_ty.to_ty(bcx.tcx());
+ match self.temps[idx as usize] {
+ TempRef::Lvalue(dest) => dest,
+ TempRef::Operand(None) => {
+ // Handle temporary lvalues, specifically Operand ones, as
+ // they don't have allocas
+ return if fn_ret_ty.is_indirect() {
+ // Odd, but possible, case, we have an operand temporary,
+ // but the calling convention has an indirect return.
+ let tmp = bcx.with_block(|bcx| {
+ base::alloc_ty(bcx, ret_ty, "tmp_ret")
+ });
+ llargs.push(tmp);
+ ReturnDest::IndirectOperand(tmp, idx)
+ } else if is_intrinsic {
+ // Currently, intrinsics always need a location to store
+ // the result, so we create a temporary alloca for the
+ // result.
+ let tmp = bcx.with_block(|bcx| {
+ base::alloc_ty(bcx, ret_ty, "tmp_ret")
+ });
+ ReturnDest::IndirectOperand(tmp, idx)
+ } else {
+ ReturnDest::DirectOperand(idx)
+ };
+ }
+ TempRef::Operand(Some(_)) => {
+ bug!("lvalue temp already assigned to");
+ }
+ }
+ }
+ _ => self.trans_lvalue(bcx, dest)
+ };
+ if fn_ret_ty.is_indirect() {
+ llargs.push(dest.llval);
+ ReturnDest::Nothing
+ } else {
+ ReturnDest::Store(dest.llval)
+ }
+ }
+
+ fn trans_transmute(&mut self, bcx: &BlockAndBuilder<'bcx, 'tcx>,
+ src: &mir::Operand<'tcx>, dst: LvalueRef<'tcx>) {
+ let mut val = self.trans_operand(bcx, src);
+ if let ty::TyFnDef(def_id, substs, _) = val.ty.sty {
+ let llouttype = type_of::type_of(bcx.ccx(), dst.ty.to_ty(bcx.tcx()));
+ let out_type_size = llbitsize_of_real(bcx.ccx(), llouttype);
+ if out_type_size != 0 {
+ // FIXME #19925 Remove this hack after a release cycle.
+ let f = Callee::def(bcx.ccx(), def_id, substs);
+ let datum = f.reify(bcx.ccx());
+ val = OperandRef {
+ val: OperandValue::Immediate(datum.val),
+ ty: datum.ty
+ };
+ }
+ }
+
+ let llty = type_of::type_of(bcx.ccx(), val.ty);
+ let cast_ptr = bcx.pointercast(dst.llval, llty.ptr_to());
+ self.store_operand(bcx, cast_ptr, val);
+ }
+
+ // Stores the return value of a function call into its final location.
+ fn store_return(&mut self,
+ bcx: &BlockAndBuilder<'bcx, 'tcx>,
+ dest: ReturnDest,
+ ret_ty: ArgType,
+ op: OperandRef<'tcx>) {
+ use self::ReturnDest::*;
+
+ match dest {
+ Nothing => (),
+ Store(dst) => ret_ty.store(bcx, op.immediate(), dst),
+ IndirectOperand(tmp, idx) => {
+ let op = self.trans_load(bcx, tmp, op.ty);
+ self.temps[idx as usize] = TempRef::Operand(Some(op));
+ }
+ DirectOperand(idx) => {
+ let op = if type_is_fat_ptr(bcx.tcx(), op.ty) {
+ let llval = op.immediate();
+ let ptr = bcx.extract_value(llval, 0);
+ let meta = bcx.extract_value(llval, 1);
+
+ OperandRef {
+ val: OperandValue::FatPtr(ptr, meta),
+ ty: op.ty
+ }
+ } else {
+ op
+ };
+ self.temps[idx as usize] = TempRef::Operand(Some(op));
+ }
+ }
+ }
+}
+
+enum ReturnDest {
+ // Do nothing, the return value is indirect or ignored
+ Nothing,
+ // Store the return value to the pointer
+ Store(ValueRef),
+ // Stores an indirect return value to an operand temporary lvalue
+ IndirectOperand(ValueRef, u32),
+ // Stores a direct return value to an operand temporary lvalue
+ DirectOperand(u32)
}
}
}
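For an operand temporary, `make_return_dest` above picks among the `ReturnDest` strategies from a few flags: an ignored return needs nothing, an indirect ABI return and an intrinsic both need a stack slot, and everything else stays a direct operand. That decision tree, modeled on plain data (a sketch of the branching only, not the trans API, and omitting the lvalue/`Store` path):

```rust
#[derive(Debug, PartialEq)]
enum Dest { Nothing, IndirectOperand, DirectOperand }

// Mirror of make_return_dest's branching for an unassigned operand
// temporary: indirect returns and intrinsic results need memory,
// everything else can stay a direct operand.
fn choose_dest(is_ignore: bool, is_indirect: bool, is_intrinsic: bool) -> Dest {
    if is_ignore {
        Dest::Nothing
    } else if is_indirect || is_intrinsic {
        Dest::IndirectOperand
    } else {
        Dest::DirectOperand
    }
}

fn main() {
    assert_eq!(choose_dest(true, false, false), Dest::Nothing);
    assert_eq!(choose_dest(false, true, false), Dest::IndirectOperand);
    assert_eq!(choose_dest(false, false, true), Dest::IndirectOperand);
    assert_eq!(choose_dest(false, false, false), Dest::DirectOperand);
}
```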
+ // Perform an action using the given Lvalue.
+ // If the Lvalue is an empty TempRef::Operand, then a temporary stack slot
+ // is created first, then used as an operand to update the Lvalue.
+ pub fn with_lvalue_ref<F, U>(&mut self, bcx: &BlockAndBuilder<'bcx, 'tcx>,
+ lvalue: &mir::Lvalue<'tcx>, f: F) -> U
+ where F: FnOnce(&mut Self, LvalueRef<'tcx>) -> U
+ {
+ match *lvalue {
+ mir::Lvalue::Temp(idx) => {
+ match self.temps[idx as usize] {
+ TempRef::Lvalue(lvalue) => f(self, lvalue),
+ TempRef::Operand(None) => {
+ let lvalue_ty = self.mir.lvalue_ty(bcx.tcx(), lvalue);
+ let lvalue_ty = bcx.monomorphize(&lvalue_ty);
+ let lvalue = LvalueRef::alloca(bcx,
+ lvalue_ty.to_ty(bcx.tcx()),
+ "lvalue_temp");
+ let ret = f(self, lvalue);
+ let op = self.trans_load(bcx, lvalue.llval, lvalue_ty.to_ty(bcx.tcx()));
+ self.temps[idx as usize] = TempRef::Operand(Some(op));
+ ret
+ }
+ TempRef::Operand(Some(_)) => {
+ bug!("Lvalue temp already set");
+ }
+ }
+ }
+ _ => {
+ let lvalue = self.trans_lvalue(bcx, lvalue);
+ f(self, lvalue)
+ }
+ }
+ }
+
/// Adjust the bitwidth of an index since LLVM is less forgiving
/// than we are.
///
debug!("type_of {:?}", t);
- assert!(!t.has_escaping_regions());
+ assert!(!t.has_escaping_regions(), "{:?} has escaping regions", t);
// Replace any typedef'd types with their equivalent non-typedef
// type. This ensures that all LLVM nominal types that contain
ty::TyFloat(ast::FloatTy::F32) => {
fcx.type_error_message(arg.span,
|t| {
- format!("can't pass an {} to variadic \
- function, cast to c_double", t)
+ format!("can't pass an `{}` to variadic \
+ function, cast to `c_double`", t)
}, arg_ty, None);
}
ty::TyInt(ast::IntTy::I8) | ty::TyInt(ast::IntTy::I16) | ty::TyBool => {
fcx.type_error_message(arg.span, |t| {
- format!("can't pass {} to variadic \
- function, cast to c_int",
+ format!("can't pass `{}` to variadic \
+ function, cast to `c_int`",
t)
}, arg_ty, None);
}
ty::TyUint(ast::UintTy::U8) | ty::TyUint(ast::UintTy::U16) => {
fcx.type_error_message(arg.span, |t| {
- format!("can't pass {} to variadic \
- function, cast to c_uint",
+ format!("can't pass `{}` to variadic \
+ function, cast to `c_uint`",
t)
}, arg_ty, None);
}
+ ty::TyFnDef(_, _, f) => {
+ let ptr_ty = fcx.tcx().mk_ty(ty::TyFnPtr(f));
+ let ptr_ty = fcx.infcx().resolve_type_vars_if_possible(&ptr_ty);
+ fcx.type_error_message(arg.span,
+ |t| {
+ format!("can't pass `{}` to variadic \
+ function, cast to `{}`", t, ptr_ty)
+ }, arg_ty, None);
+ }
_ => {}
}
}
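The new `TyFnDef` arm extends the existing checks for C's default argument promotions: in a variadic call, `float` promotes to `double` and integer types narrower than `int` promote to `int`, so passing an unpromoted value is rejected with a suggestion to cast. The casts the error messages ask for, in plain Rust (illustrative values only):

```rust
use std::os::raw::{c_double, c_int, c_uint};

fn main() {
    // f32 must be passed as c_double (C's float -> double promotion).
    let f: f32 = 1.5;
    let as_double = f as c_double;
    assert_eq!(as_double, 1.5f64);

    // i8, i16, and bool must be widened to c_int.
    let n: i16 = -7;
    let as_int = n as c_int;
    assert_eq!(as_int, -7);

    // u8 and u16 widen to c_uint.
    let u: u16 = 65535;
    let as_uint = u as c_uint;
    assert_eq!(as_uint, 65535);
}
```

The `TyFnDef` case follows the same pattern: a zero-sized function item has no ABI-level representation, so it must first be cast to the corresponding function pointer type.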
```
"##,
+E0520: r##"
+A non-default implementation was already made on this type, so it cannot be
+specialized further. Erroneous code example:
+
+```compile_fail
+#![feature(specialization)]
+
+trait SpaceLlama {
+ fn fly(&self);
+}
+
+// applies to all T
+impl<T> SpaceLlama for T {
+ default fn fly(&self) {}
+}
+
+// non-default impl
+// applies to all `Clone` T and overrides the previous impl
+impl<T: Clone> SpaceLlama for T {
+ fn fly(&self) {}
+}
+
+// since `i32` is `Clone`, this conflicts with the previous implementation
+impl SpaceLlama for i32 {
+ default fn fly(&self) {}
+ // error: item `fly` is provided by an `impl` that specializes
+ // another, but the item in the parent `impl` is not marked
+ // `default` and so it cannot be specialized.
+}
+```
+
+Specialization only allows you to override `default` functions in
+implementations.
+
+To fix this error, you need to mark all the parent implementations as `default`.
+Example:
+
+```
+#![feature(specialization)]
+
+trait SpaceLlama {
+ fn fly(&self);
+}
+
+// applies to all T
+impl<T> SpaceLlama for T {
+ default fn fly(&self) {} // This is a parent implementation.
+}
+
+// applies to all `Clone` T; overrides the previous impl
+impl<T: Clone> SpaceLlama for T {
+ default fn fly(&self) {} // This is a parent implementation but was
+ // previously not a default one, causing the error
+}
+
+// applies to i32, overrides the previous two impls
+impl SpaceLlama for i32 {
+ fn fly(&self) {} // And now that's ok!
+}
+```
+"##,
+
}
register_diagnostics! {
// type `{}` was overridden
E0436, // functional record update requires a struct
E0513, // no type for local variable ..
- E0520, // cannot specialize non-default item
E0521 // redundant default implementations of trait
}
#![stable(feature = "rust1", since = "1.0.0")]
use core::char::CharExt as C;
-use core::option::Option::{self, Some, None};
-use core::iter::Iterator;
+use core::fmt;
use tables::{derived_property, property, general_category, conversions};
// stable reexports
}
/// An iterator that decodes UTF-16 encoded code points from an iterator of `u16`s.
-#[unstable(feature = "decode_utf16", reason = "recently exposed", issue = "27830")]
+#[stable(feature = "decode_utf16", since = "1.9.0")]
#[derive(Clone)]
pub struct DecodeUtf16<I>
where I: Iterator<Item = u16>
buf: Option<u16>,
}
+/// An error that can be returned when decoding UTF-16 code points.
+#[stable(feature = "decode_utf16", since = "1.9.0")]
+#[derive(Debug, Clone, Eq, PartialEq)]
+pub struct DecodeUtf16Error {
+ code: u16,
+}
+
/// Create an iterator over the UTF-16 encoded code points in `iter`,
/// returning unpaired surrogates as `Err`s.
///
/// Basic usage:
///
/// ```
-/// #![feature(decode_utf16)]
-///
/// use std::char::decode_utf16;
///
/// fn main() {
/// 0x0073, 0xDD1E, 0x0069, 0x0063,
/// 0xD834];
///
-/// assert_eq!(decode_utf16(v.iter().cloned()).collect::<Vec<_>>(),
+/// assert_eq!(decode_utf16(v.iter().cloned())
+/// .map(|r| r.map_err(|e| e.unpaired_surrogate()))
+/// .collect::<Vec<_>>(),
/// vec![Ok('𝄞'),
/// Ok('m'), Ok('u'), Ok('s'),
/// Err(0xDD1E),
/// A lossy decoder can be obtained by replacing `Err` results with the replacement character:
///
/// ```
-/// #![feature(decode_utf16)]
-///
/// use std::char::{decode_utf16, REPLACEMENT_CHARACTER};
///
/// fn main() {
/// "𝄞mus�ic�");
/// }
/// ```
-#[unstable(feature = "decode_utf16", reason = "recently exposed", issue = "27830")]
+#[stable(feature = "decode_utf16", since = "1.9.0")]
#[inline]
pub fn decode_utf16<I: IntoIterator<Item = u16>>(iter: I) -> DecodeUtf16<I::IntoIter> {
DecodeUtf16 {
}
}
-#[unstable(feature = "decode_utf16", reason = "recently exposed", issue = "27830")]
+#[stable(feature = "decode_utf16", since = "1.9.0")]
impl<I: Iterator<Item=u16>> Iterator for DecodeUtf16<I> {
- type Item = Result<char, u16>;
+ type Item = Result<char, DecodeUtf16Error>;
- fn next(&mut self) -> Option<Result<char, u16>> {
+ fn next(&mut self) -> Option<Result<char, DecodeUtf16Error>> {
let u = match self.buf.take() {
Some(buf) => buf,
None => match self.iter.next() {
Some(Ok(unsafe { from_u32_unchecked(u as u32) }))
} else if u >= 0xDC00 {
// a trailing surrogate
- Some(Err(u))
+ Some(Err(DecodeUtf16Error { code: u }))
} else {
let u2 = match self.iter.next() {
Some(u2) => u2,
// eof
- None => return Some(Err(u)),
+ None => return Some(Err(DecodeUtf16Error { code: u })),
};
if u2 < 0xDC00 || u2 > 0xDFFF {
// not a trailing surrogate so we're not a valid
// surrogate pair, so rewind to redecode u2 next time.
self.buf = Some(u2);
- return Some(Err(u));
+ return Some(Err(DecodeUtf16Error { code: u }));
}
// all ok, so let's decode it.
}
}
-/// `U+FFFD REPLACEMENT CHARACTER` (�) is used in Unicode to represent a decoding error.
+impl DecodeUtf16Error {
+ /// Returns the unpaired surrogate which caused this error.
+ #[stable(feature = "decode_utf16", since = "1.9.0")]
+ pub fn unpaired_surrogate(&self) -> u16 {
+ self.code
+ }
+}
+
+#[stable(feature = "decode_utf16", since = "1.9.0")]
+impl fmt::Display for DecodeUtf16Error {
+ fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+ write!(f, "unpaired surrogate found: {:x}", self.code)
+ }
+}
+
+/// `U+FFFD REPLACEMENT CHARACTER` (�) is used in Unicode to represent a
+/// decoding error.
+///
/// It can occur, for example, when giving ill-formed UTF-8 bytes to
/// [`String::from_utf8_lossy`](../../std/string/struct.String.html#method.from_utf8_lossy).
-#[unstable(feature = "decode_utf16", reason = "recently added", issue = "27830")]
+#[stable(feature = "decode_utf16", since = "1.9.0")]
pub const REPLACEMENT_CHARACTER: char = '\u{FFFD}';
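Taken together, the stabilized `decode_utf16` API surface (the iterator, `DecodeUtf16Error::unpaired_surrogate`, and `REPLACEMENT_CHARACTER`) can be exercised as below. This is a minimal usage sketch, not code from the patch itself:

```rust
use std::char::{decode_utf16, REPLACEMENT_CHARACTER};

fn main() {
    // U+1D11E MUSICAL SYMBOL G CLEF as a surrogate pair, then 'm',
    // then a lone trailing surrogate that cannot be decoded.
    let v = [0xD834, 0xDD1E, 0x006D, 0xDD1E];

    // Errors carry the unpaired surrogate via `DecodeUtf16Error`.
    let decoded: Vec<_> = decode_utf16(v.iter().cloned())
        .map(|r| r.map_err(|e| e.unpaired_surrogate()))
        .collect();
    assert_eq!(decoded, vec![Ok('𝄞'), Ok('m'), Err(0xDD1E)]);

    // A lossy decoder substitutes U+FFFD for each error.
    let lossy: String = decode_utf16(v.iter().cloned())
        .map(|r| r.unwrap_or(REPLACEMENT_CHARACTER))
        .collect();
    assert_eq!(lossy, "𝄞m\u{FFFD}");
}
```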
use visit_ast;
use html::item_type::ItemType;
-/// A stable identifier to the particular version of JSON output.
-/// Increment this when the `Crate` and related structures change.
-pub const SCHEMA_VERSION: &'static str = "0.8.3";
-
mod inline;
mod simplify;
#![feature(box_patterns)]
#![feature(box_syntax)]
#![feature(libc)]
-#![feature(recover)]
#![feature(rustc_private)]
#![feature(set_stdio)]
#![feature(slice_patterns)]
#![feature(staged_api)]
-#![feature(std_panic)]
#![feature(test)]
#![feature(unicode)]
#![feature(question_mark)]
use std::collections::HashMap;
use std::default::Default;
use std::env;
-use std::fs::File;
-use std::io::{self, Read, Write};
+use std::io::Read;
use std::path::PathBuf;
use std::process;
use std::rc::Rc;
use std::sync::mpsc::channel;
use externalfiles::ExternalHtml;
-use serialize::Decodable;
-use serialize::json::{self, Json};
use rustc::session::search_paths::SearchPaths;
use rustc::session::config::{ErrorOutputType, RustcOptGroup, nightly_options};
-// reexported from `clean` so it can be easily updated with the mod itself
-pub use clean::SCHEMA_VERSION;
-
#[macro_use]
pub mod externalfiles;
struct Output {
krate: clean::Crate,
- json_plugins: Vec<plugins::PluginJson>,
passes: Vec<String>,
}
stable(optflag("V", "version", "print rustdoc's version")),
stable(optflag("v", "verbose", "use verbose output")),
stable(optopt("r", "input-format", "the input type of the specified file",
- "[rust|json]")),
+ "[rust]")),
stable(optopt("w", "output-format", "the output type to write",
- "[html|json]")),
+ "[html]")),
stable(optopt("o", "output", "where to place the output", "PATH")),
stable(optopt("", "crate-name", "specify the name of this crate", "NAME")),
stable(optmulti("L", "library-path", "directory to add to crate search path",
return 1;
}
};
- let Output { krate, json_plugins, passes, } = out;
+ let Output { krate, passes, } = out;
info!("going to format");
match matches.opt_str("w").as_ref().map(|s| &**s) {
Some("html") | None => {
css_file_extension)
.expect("failed to generate documentation")
}
- Some("json") => {
- json_output(krate, json_plugins,
- output.unwrap_or(PathBuf::from("doc.json")))
- .expect("failed to write json")
- }
Some(s) => {
println!("unknown output format: {}", s);
return 1;
matches: &getopts::Matches) -> Result<Output, String> {
match matches.opt_str("r").as_ref().map(|s| &**s) {
Some("rust") => Ok(rust_input(input, externs, matches)),
- Some("json") => json_input(input),
Some(s) => Err(format!("unknown input format: {}", s)),
None => {
- if input.ends_with(".json") {
- json_input(input)
- } else {
- Ok(rust_input(input, externs, matches))
- }
+ Ok(rust_input(input, externs, matches))
}
}
}
// Run everything!
info!("Executing passes/plugins");
- let (krate, json) = pm.run_plugins(krate);
- Output { krate: krate, json_plugins: json, passes: passes }
-}
-
-/// This input format purely deserializes the json output file. No passes are
-/// run over the deserialized output.
-fn json_input(input: &str) -> Result<Output, String> {
- let mut bytes = Vec::new();
- if let Err(e) = File::open(input).and_then(|mut f| f.read_to_end(&mut bytes)) {
- return Err(format!("couldn't open {}: {}", input, e))
- }
- match json::from_reader(&mut &bytes[..]) {
- Err(s) => Err(format!("{:?}", s)),
- Ok(Json::Object(obj)) => {
- let mut obj = obj;
- // Make sure the schema is what we expect
- match obj.remove(&"schema".to_string()) {
- Some(Json::String(version)) => {
- if version != SCHEMA_VERSION {
- return Err(format!(
- "sorry, but I only understand version {}",
- SCHEMA_VERSION))
- }
- }
- Some(..) => return Err("malformed json".to_string()),
- None => return Err("expected a schema version".to_string()),
- }
- let krate = match obj.remove(&"crate".to_string()) {
- Some(json) => {
- let mut d = json::Decoder::new(json);
- Decodable::decode(&mut d).unwrap()
- }
- None => return Err("malformed json".to_string()),
- };
- // FIXME: this should read from the "plugins" field, but currently
- // Json doesn't implement decodable...
- let plugin_output = Vec::new();
- Ok(Output { krate: krate, json_plugins: plugin_output, passes: Vec::new(), })
- }
- Ok(..) => {
- Err("malformed json input: expected an object at the \
- top".to_string())
- }
- }
-}
-
-/// Outputs the crate/plugin json as a giant json blob at the specified
-/// destination.
-fn json_output(krate: clean::Crate, res: Vec<plugins::PluginJson> ,
- dst: PathBuf) -> io::Result<()> {
- // {
- // "schema": version,
- // "crate": { parsed crate ... },
- // "plugins": { output of plugins ... }
- // }
- let mut json = std::collections::BTreeMap::new();
- json.insert("schema".to_string(), Json::String(SCHEMA_VERSION.to_string()));
- let plugins_json = res.into_iter()
- .filter_map(|opt| {
- opt.map(|(string, json)| (string.to_string(), json))
- }).collect();
-
- // FIXME #8335: yuck, Rust -> str -> JSON round trip! No way to .encode
- // straight to the Rust JSON representation.
- let crate_json_str = format!("{}", json::as_json(&krate));
- let crate_json = json::from_str(&crate_json_str).expect("Rust generated JSON is invalid");
-
- json.insert("crate".to_string(), crate_json);
- json.insert("plugins".to_string(), Json::Object(plugins_json));
-
- let mut file = File::create(&dst)?;
- write!(&mut file, "{}", Json::Object(json))
+ let krate = pm.run_plugins(krate);
+ Output { krate: krate, passes: passes }
}
};
// strip any traits implemented on stripped items
- let krate = {
+ {
struct ImplStripper<'a> {
stripped: &'a mut DefIdSet
}
}
let mut stripper = ImplStripper{ stripped: &mut stripped };
stripper.fold_crate(krate)
- };
-
- (krate, None)
+ }
}
/// Strip private items from the point of view of a crate or externally from a
// strip all private implementations of traits
{
let mut stripper = ImplStripper(&retained);
- krate = stripper.fold_crate(krate);
+ stripper.fold_crate(krate)
}
- (krate, None)
}
struct Stripper<'a> {
self.fold_item_recur(i)
};
- i.and_then(|i| { match i.inner {
- // emptied modules/impls have no need to exist
- clean::ModuleItem(ref m)
- if m.items.is_empty() &&
- i.doc_value().is_none() => None,
- clean::ImplItem(ref i) if i.items.is_empty() => None,
- _ => {
- self.retained.insert(i.def_id);
- Some(i)
+ i.and_then(|i| {
+ match i.inner {
+ // emptied modules/impls have no need to exist
+ clean::ModuleItem(ref m)
+ if m.items.is_empty() &&
+ i.doc_value().is_none() => None,
+ clean::ImplItem(ref i) if i.items.is_empty() => None,
+ _ => {
+ self.retained.insert(i.def_id);
+ Some(i)
+ }
}
- }})
+ })
}
}
}
pub fn strip_priv_imports(krate: clean::Crate) -> plugins::PluginResult {
- (ImportStripper.fold_crate(krate), None)
+ ImportStripper.fold_crate(krate)
}
pub fn unindent_comments(krate: clean::Crate) -> plugins::PluginResult {
}
let mut cleaner = CommentCleaner;
let krate = cleaner.fold_crate(krate);
- (krate, None)
+ krate
}
pub fn collapse_docs(krate: clean::Crate) -> plugins::PluginResult {
}
let mut collapser = Collapser;
let krate = collapser.fold_crate(krate);
- (krate, None)
+ krate
}
pub fn unindent(s: &str) -> String {
use clean;
-use serialize::json;
use std::mem;
use std::string::String;
use std::path::PathBuf;
use rustc_back::dynamic_lib as dl;
-pub type PluginJson = Option<(String, json::Json)>;
-pub type PluginResult = (clean::Crate, PluginJson);
+pub type PluginResult = clean::Crate;
pub type PluginCallback = fn (clean::Crate) -> PluginResult;
/// Manages loading and running of plugins
self.callbacks.push(plugin);
}
/// Run all the loaded plugins over the crate, returning their results
- pub fn run_plugins(&self, krate: clean::Crate) -> (clean::Crate, Vec<PluginJson> ) {
- let mut out_json = Vec::new();
- let mut krate = krate;
+ pub fn run_plugins(&self, mut krate: clean::Crate) -> clean::Crate {
for &callback in &self.callbacks {
- let (c, res) = callback(krate);
- krate = c;
- out_json.push(res);
+ krate = callback(krate);
}
- (krate, out_json)
+ krate
}
}
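With the JSON output gone, `run_plugins` reduces to threading the crate through a list of `fn(Crate) -> Crate` callbacks. A standalone sketch of that fold-style pipeline, using a stand-in `Krate` type (all names here are hypothetical, not rustdoc's real types):

```rust
// `Krate` stands in for rustdoc's `clean::Crate`; each pass is now
// just a plain function from crate to crate.
struct Krate {
    passes_run: Vec<&'static str>,
}

type Pass = fn(Krate) -> Krate;

fn collapse(mut k: Krate) -> Krate {
    k.passes_run.push("collapse-docs");
    k
}

fn unindent(mut k: Krate) -> Krate {
    k.passes_run.push("unindent-comments");
    k
}

// Mirrors the simplified `run_plugins`: feed the crate through each
// callback in order, returning the final result.
fn run_passes(mut krate: Krate, passes: &[Pass]) -> Krate {
    for &pass in passes {
        krate = pass(krate);
    }
    krate
}

fn main() {
    let krate = Krate { passes_run: Vec::new() };
    let krate = run_passes(krate, &[collapse, unindent]);
    assert_eq!(krate.passes_run, ["collapse-docs", "unindent-comments"]);
}
```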
use std::io::prelude::*;
use std::io;
use std::path::PathBuf;
-use std::panic::{self, AssertRecoverSafe};
+use std::panic::{self, AssertUnwindSafe};
use std::process::Command;
use std::rc::Rc;
use std::str;
if let Some(name) = crate_name {
krate.name = name;
}
- let (krate, _) = passes::collapse_docs(krate);
- let (krate, _) = passes::unindent_comments(krate);
+ let krate = passes::collapse_docs(krate);
+ let krate = passes::unindent_comments(krate);
let mut collector = Collector::new(krate.name.to_string(),
cfgs,
control.after_analysis.stop = Compilation::Stop;
}
- match {
- let b_sess = AssertRecoverSafe(&sess);
- let b_cstore = AssertRecoverSafe(&cstore);
- let b_cfg = AssertRecoverSafe(cfg.clone());
- let b_control = AssertRecoverSafe(&control);
-
- panic::recover(|| {
- driver::compile_input(&b_sess, &b_cstore, (*b_cfg).clone(),
- &input, &out,
- &None, None, &b_control)
- })
- } {
+ let res = panic::catch_unwind(AssertUnwindSafe(|| {
+ driver::compile_input(&sess, &cstore, cfg.clone(),
+ &input, &out,
+ &None, None, &control)
+ }));
+
+ match res {
Ok(r) => {
match r {
Err(count) if count > 0 && compile_fail == false => {
buf >>= 4;
continue
}
- _ => return Err(InvalidHexCharacter(self.char_at(idx), idx)),
+ _ => {
+ let ch = self[idx..].chars().next().unwrap();
+ return Err(InvalidHexCharacter(ch, idx))
+ }
}
modulus += 1;
#![feature(enumset)]
#![feature(rustc_private)]
#![feature(staged_api)]
-#![feature(str_char)]
#![feature(unicode)]
#![feature(question_mark)]
#![cfg_attr(test, feature(test))]
/// # Examples
///
/// ```
- /// #![feature(ascii)]
- ///
/// use std::ascii::AsciiExt;
///
/// let mut ascii = 'a';
///
/// assert_eq!('A', ascii);
/// ```
- #[unstable(feature = "ascii", issue = "27809")]
+ #[stable(feature = "ascii", since = "1.9.0")]
fn make_ascii_uppercase(&mut self);
/// Converts this type to its ASCII lower case equivalent in-place.
/// # Examples
///
/// ```
- /// #![feature(ascii)]
- ///
/// use std::ascii::AsciiExt;
///
/// let mut ascii = 'A';
///
/// assert_eq!('a', ascii);
/// ```
- #[unstable(feature = "ascii", issue = "27809")]
+ #[stable(feature = "ascii", since = "1.9.0")]
fn make_ascii_lowercase(&mut self);
}
}
/// Returns a reference to the map's hasher.
- #[unstable(feature = "hashmap_public_hasher", reason = "don't want to make insta-stable",
- issue = "31262")]
+ #[stable(feature = "hashmap_public_hasher", since = "1.9.0")]
pub fn hasher(&self) -> &S {
&self.hash_builder
}
}
/// Returns a reference to the set's hasher.
- #[unstable(feature = "hashmap_public_hasher", reason = "don't want to make insta-stable",
- issue = "31262")]
+ #[stable(feature = "hashmap_public_hasher", since = "1.9.0")]
pub fn hasher(&self) -> &S {
self.map.hasher()
}
/// The value may be any borrowed form of the set's value type, but
/// `Hash` and `Eq` on the borrowed form *must* match those for
/// the value type.
- #[unstable(feature = "set_recovery", issue = "28050")]
+ #[stable(feature = "set_recovery", since = "1.9.0")]
pub fn get<Q: ?Sized>(&self, value: &Q) -> Option<&T>
where T: Borrow<Q>, Q: Hash + Eq
{
/// Adds a value to the set, replacing the existing value, if any, that is equal to the given
/// one. Returns the replaced value.
- #[unstable(feature = "set_recovery", issue = "28050")]
+ #[stable(feature = "set_recovery", since = "1.9.0")]
pub fn replace(&mut self, value: T) -> Option<T> {
Recover::replace(&mut self.map, value)
}
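The stabilized `set_recovery` methods (`get`, `replace`, and `take`) let callers recover the value stored in a `HashSet` rather than just test membership. A minimal usage sketch:

```rust
use std::collections::HashSet;

fn main() {
    let mut set: HashSet<String> = HashSet::new();
    set.insert("alpha".to_string());

    // `get` returns a reference to the stored value, looked up via a
    // borrowed form (`&str` here, since `String: Borrow<str>`).
    assert_eq!(set.get("alpha").map(|s| s.as_str()), Some("alpha"));

    // `replace` swaps in an equal value and hands back the old one.
    let old = set.replace("alpha".to_string());
    assert_eq!(old, Some("alpha".to_string()));

    // `take` removes and returns the stored value by ownership.
    assert_eq!(set.take("alpha"), Some("alpha".to_string()));
    assert!(set.is_empty());
}
```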
/// The value may be any borrowed form of the set's value type, but
/// `Hash` and `Eq` on the borrowed form *must* match those for
/// the value type.
- #[unstable(feature = "set_recovery", issue = "28050")]
+ #[stable(feature = "set_recovery", since = "1.9.0")]
pub fn take<Q: ?Sized>(&mut self, value: &Q) -> Option<T>
where T: Borrow<Q>, Q: Hash + Eq
{
use any::TypeId;
use boxed::Box;
-use convert::From;
+use char;
use fmt::{self, Debug, Display};
use marker::{Send, Sync, Reflect};
use mem::transmute;
use num;
-use option::Option::{self, Some, None};
-use result::Result::{self, Ok, Err};
use raw::TraitObject;
use str;
use string::{self, String};
}
}
+#[stable(feature = "decode_utf16", since = "1.9.0")]
+impl Error for char::DecodeUtf16Error {
+ fn description(&self) -> &str {
+ "unpaired surrogate found"
+ }
+}
+
#[stable(feature = "box_error", since = "1.7.0")]
impl<T: Error> Error for Box<T> {
fn description(&self) -> &str {
self.inner.push_slice(&s.as_ref().inner)
}
- /// Creates a new `OsString` with the given capacity. The string will be
- /// able to hold exactly `capacity` bytes without reallocating. If
- /// `capacity` is 0, the string will not allocate.
+ /// Creates a new `OsString` with the given capacity.
+ ///
+ /// The string will be able to hold exactly `capacity` length units of other
+ /// OS strings without reallocating. If `capacity` is 0, the string will not
+ /// allocate.
///
/// See main `OsString` documentation information about encoding.
- #[unstable(feature = "osstring_simple_functions",
- reason = "recently added", issue = "29453")]
+ #[stable(feature = "osstring_simple_functions", since = "1.9.0")]
pub fn with_capacity(capacity: usize) -> OsString {
OsString {
inner: Buf::with_capacity(capacity)
}
/// Truncates the `OsString` to zero length.
- #[unstable(feature = "osstring_simple_functions",
- reason = "recently added", issue = "29453")]
+ #[stable(feature = "osstring_simple_functions", since = "1.9.0")]
pub fn clear(&mut self) {
self.inner.clear()
}
- /// Returns the number of bytes this `OsString` can hold without
- /// reallocating.
+ /// Returns the capacity this `OsString` can hold without reallocating.
///
/// See `OsString` introduction for information about encoding.
- #[unstable(feature = "osstring_simple_functions",
- reason = "recently added", issue = "29453")]
+ #[stable(feature = "osstring_simple_functions", since = "1.9.0")]
pub fn capacity(&self) -> usize {
self.inner.capacity()
}
- /// Reserves capacity for at least `additional` more bytes to be inserted
- /// in the given `OsString`. The collection may reserve more space to avoid
- /// frequent reallocations.
- #[unstable(feature = "osstring_simple_functions",
- reason = "recently added", issue = "29453")]
+ /// Reserves capacity for at least `additional` more length units to be
+ /// inserted in the given `OsString`.
+ ///
+ /// The collection may reserve more space to avoid frequent reallocations.
+ #[stable(feature = "osstring_simple_functions", since = "1.9.0")]
pub fn reserve(&mut self, additional: usize) {
self.inner.reserve(additional)
}
- /// Reserves the minimum capacity for exactly `additional` more bytes to be
- /// inserted in the given `OsString`. Does nothing if the capacity is
+ /// Reserves the minimum capacity for exactly `additional` more length units
+ /// to be inserted in the given `OsString`. Does nothing if the capacity is
/// already sufficient.
///
/// Note that the allocator may give the collection more space than it
/// requests. Therefore capacity can not be relied upon to be precisely
/// minimal. Prefer reserve if future insertions are expected.
- #[unstable(feature = "osstring_simple_functions",
- reason = "recently added", issue = "29453")]
+ #[stable(feature = "osstring_simple_functions", since = "1.9.0")]
pub fn reserve_exact(&mut self, additional: usize) {
self.inner.reserve_exact(additional)
}
}
/// Checks whether the `OsStr` is empty.
- #[unstable(feature = "osstring_simple_functions",
- reason = "recently added", issue = "29453")]
+ #[stable(feature = "osstring_simple_functions", since = "1.9.0")]
pub fn is_empty(&self) -> bool {
self.inner.inner.is_empty()
}
- /// Returns the number of bytes in this `OsStr`.
+ /// Returns the length of this `OsStr`.
+ ///
+ /// Note that this does **not** return the number of bytes in this string
+ /// as, for example, OS strings on Windows are encoded as a list of `u16`
+ /// rather than a list of bytes. This number is simply useful for passing to
+ /// other methods like `OsString::with_capacity` to avoid reallocations.
///
- /// See `OsStr` introduction for information about encoding.
- #[unstable(feature = "osstring_simple_functions",
- reason = "recently added", issue = "29453")]
+ /// See `OsStr` introduction for more information about encoding.
+ #[stable(feature = "osstring_simple_functions", since = "1.9.0")]
pub fn len(&self) -> usize {
self.inner.inner.len()
}
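The stabilized `osstring_simple_functions` APIs (`with_capacity`, `capacity`, `reserve`, `clear`, `len`, `is_empty`) can be exercised as below. A minimal sketch; note that, as the docs above stress, capacities and lengths are in the platform's internal length units, not necessarily bytes:

```rust
use std::ffi::{OsStr, OsString};

fn main() {
    // Pre-size the buffer to avoid reallocation while building.
    let mut s = OsString::with_capacity(10);
    assert!(s.capacity() >= 10);

    s.push("hello");
    assert_eq!(s.len(), OsStr::new("hello").len());

    // Ask for room for at least 16 more length units.
    s.reserve(16);
    assert!(s.capacity() >= s.len() + 16);

    // Truncate to zero length; capacity is retained.
    s.clear();
    assert!(s.as_os_str().is_empty());
    assert_eq!(s.len(), 0);
}
```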
/// The returned `File` is a reference to the same state that this object
/// references. Both handles will read and write with the same cursor
/// position.
- #[unstable(feature = "file_try_clone", reason = "newly added", issue = "31405")]
+ #[stable(feature = "file_try_clone", since = "1.9.0")]
pub fn try_clone(&self) -> io::Result<File> {
Ok(File {
inner: self.inner.duplicate()?
/// # Examples
///
/// ```no_run
- /// #![feature(expand_open_options)]
/// use std::fs::OpenOptions;
///
/// let file = OpenOptions::new().write(true)
/// .create_new(true)
/// .open("foo.txt");
/// ```
- #[unstable(feature = "expand_open_options",
- reason = "recently added",
- issue = "30014")]
+ #[stable(feature = "expand_open_options2", since = "1.9.0")]
pub fn create_new(&mut self, create_new: bool) -> &mut OpenOptions {
self.0.create_new(create_new); self
}
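The stabilized `create_new` flag requests creation and exclusivity in a single atomic step, failing with `AlreadyExists` if the file is already present. A minimal sketch using a temp-dir path (the filename is arbitrary):

```rust
use std::fs::{self, OpenOptions};
use std::io::ErrorKind;

fn main() -> std::io::Result<()> {
    let path = std::env::temp_dir().join("create_new_demo.txt");
    let _ = fs::remove_file(&path); // start from a clean slate

    // First open succeeds and creates the file atomically.
    OpenOptions::new().write(true).create_new(true).open(&path)?;

    // A second attempt now reports AlreadyExists instead of truncating
    // or reusing the file.
    let err = OpenOptions::new()
        .write(true)
        .create_new(true)
        .open(&path)
        .unwrap_err();
    assert_eq!(err.kind(), ErrorKind::AlreadyExists);

    fs::remove_file(&path)
}
```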
mod util;
mod stdio;
-const DEFAULT_BUF_SIZE: usize = 64 * 1024;
+const DEFAULT_BUF_SIZE: usize = 8 * 1024;
// A few methods below (read_to_string, read_line) will append data into a
// `String` buffer, but we need to be pretty careful when doing this. The
}
}
Some(match str::from_utf8(&buf[..width]).ok() {
- Some(s) => Ok(s.char_at(0)),
+ Some(s) => Ok(s.chars().next().unwrap()),
None => Err(CharsError::NotUtf8),
})
}
#![feature(collections)]
#![feature(collections_bound)]
#![feature(const_fn)]
-#![feature(copy_from_slice)]
#![feature(core_float)]
#![feature(core_intrinsics)]
-#![feature(decode_utf16)]
#![feature(dropck_parametricity)]
#![feature(float_extras)]
#![feature(float_from_str_radix)]
}
/// Change the IP address associated with this socket address.
- #[unstable(feature = "sockaddr_setters", reason = "recent addition", issue = "31572")]
+ #[stable(feature = "sockaddr_setters", since = "1.9.0")]
pub fn set_ip(&mut self, new_ip: IpAddr) {
// `match (*self, new_ip)` would have us mutate a copy of self only to throw it away.
match (self, new_ip) {
}
/// Change the port number associated with this socket address.
- #[unstable(feature = "sockaddr_setters", reason = "recent addition", issue = "31572")]
+ #[stable(feature = "sockaddr_setters", since = "1.9.0")]
pub fn set_port(&mut self, new_port: u16) {
match *self {
SocketAddr::V4(ref mut a) => a.set_port(new_port),
}
/// Change the IP address associated with this socket address.
- #[unstable(feature = "sockaddr_setters", reason = "recent addition", issue = "31572")]
- pub fn set_ip(&mut self, new_ip: Ipv4Addr) { self.inner.sin_addr = *new_ip.as_inner() }
+ #[stable(feature = "sockaddr_setters", since = "1.9.0")]
+ pub fn set_ip(&mut self, new_ip: Ipv4Addr) {
+ self.inner.sin_addr = *new_ip.as_inner()
+ }
/// Returns the port number associated with this socket address.
#[stable(feature = "rust1", since = "1.0.0")]
- pub fn port(&self) -> u16 { ntoh(self.inner.sin_port) }
+ pub fn port(&self) -> u16 {
+ ntoh(self.inner.sin_port)
+ }
/// Change the port number associated with this socket address.
- #[unstable(feature = "sockaddr_setters", reason = "recent addition", issue = "31572")]
- pub fn set_port(&mut self, new_port: u16) { self.inner.sin_port = hton(new_port) }
+ #[stable(feature = "sockaddr_setters", since = "1.9.0")]
+ pub fn set_port(&mut self, new_port: u16) {
+ self.inner.sin_port = hton(new_port);
+ }
}
impl SocketAddrV6 {
}
/// Change the IP address associated with this socket address.
- #[unstable(feature = "sockaddr_setters", reason = "recent addition", issue = "31572")]
- pub fn set_ip(&mut self, new_ip: Ipv6Addr) { self.inner.sin6_addr = *new_ip.as_inner() }
+ #[stable(feature = "sockaddr_setters", since = "1.9.0")]
+ pub fn set_ip(&mut self, new_ip: Ipv6Addr) {
+ self.inner.sin6_addr = *new_ip.as_inner()
+ }
/// Returns the port number associated with this socket address.
#[stable(feature = "rust1", since = "1.0.0")]
- pub fn port(&self) -> u16 { ntoh(self.inner.sin6_port) }
+ pub fn port(&self) -> u16 {
+ ntoh(self.inner.sin6_port)
+ }
/// Change the port number associated with this socket address.
- #[unstable(feature = "sockaddr_setters", reason = "recent addition", issue = "31572")]
- pub fn set_port(&mut self, new_port: u16) { self.inner.sin6_port = hton(new_port) }
+ #[stable(feature = "sockaddr_setters", since = "1.9.0")]
+ pub fn set_port(&mut self, new_port: u16) {
+ self.inner.sin6_port = hton(new_port);
+ }
/// Returns the flow information associated with this address,
/// corresponding to the `sin6_flowinfo` field in C.
#[stable(feature = "rust1", since = "1.0.0")]
- pub fn flowinfo(&self) -> u32 { self.inner.sin6_flowinfo }
+ pub fn flowinfo(&self) -> u32 {
+ self.inner.sin6_flowinfo
+ }
/// Change the flow information associated with this socket address.
- #[unstable(feature = "sockaddr_setters", reason = "recent addition", issue = "31572")]
+ #[stable(feature = "sockaddr_setters", since = "1.9.0")]
pub fn set_flowinfo(&mut self, new_flowinfo: u32) {
self.inner.sin6_flowinfo = new_flowinfo;
}
/// Returns the scope ID associated with this address,
/// corresponding to the `sin6_scope_id` field in C.
#[stable(feature = "rust1", since = "1.0.0")]
- pub fn scope_id(&self) -> u32 { self.inner.sin6_scope_id }
+ pub fn scope_id(&self) -> u32 {
+ self.inner.sin6_scope_id
+ }
/// Change the scope ID associated with this socket address.
- #[unstable(feature = "sockaddr_setters", reason = "recent addition", issue = "31572")]
+ #[stable(feature = "sockaddr_setters", since = "1.9.0")]
pub fn set_scope_id(&mut self, new_scope_id: u32) {
self.inner.sin6_scope_id = new_scope_id;
}
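The stabilized `sockaddr_setters` methods allow mutating an existing `SocketAddr` in place instead of rebuilding it from parts. A minimal usage sketch:

```rust
use std::net::{IpAddr, Ipv4Addr, SocketAddr};

fn main() {
    let mut addr = SocketAddr::new(IpAddr::V4(Ipv4Addr::new(127, 0, 0, 1)), 8080);

    // Change only the port, keeping the IP address.
    addr.set_port(9090);
    assert_eq!(addr.port(), 9090);

    // Change only the IP address, keeping the port.
    addr.set_ip(IpAddr::V4(Ipv4Addr::new(10, 0, 0, 1)));
    assert_eq!(addr.ip(), IpAddr::V4(Ipv4Addr::new(10, 0, 0, 1)));
    assert_eq!(addr.port(), 9090);
}
```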
#[stable(feature = "metadata_ext2", since = "1.8.0")]
fn st_ctime_nsec(&self) -> i64;
#[stable(feature = "metadata_ext2", since = "1.8.0")]
- fn st_birthtime(&self) -> i64;
- #[stable(feature = "metadata_ext2", since = "1.8.0")]
- fn st_birthtime_nsec(&self) -> i64;
- #[stable(feature = "metadata_ext2", since = "1.8.0")]
fn st_blksize(&self) -> u64;
#[stable(feature = "metadata_ext2", since = "1.8.0")]
fn st_blocks(&self) -> u64;
fn st_ctime_nsec(&self) -> i64 {
self.as_inner().as_inner().st_ctime_nsec as i64
}
- fn st_birthtime(&self) -> i64 {
- self.as_inner().as_inner().st_birthtime as i64
- }
- fn st_birthtime_nsec(&self) -> i64 {
- self.as_inner().as_inner().st_birthtime_nsec as i64
- }
fn st_blksize(&self) -> u64 {
self.as_inner().as_inner().st_blksize as u64
}
//! Panic support in the standard library
-#![unstable(feature = "std_panic", reason = "awaiting feedback",
- issue = "27719")]
+#![stable(feature = "std_panic", since = "1.9.0")]
use any::Any;
use boxed::Box;
use sys_common::unwind;
use thread::Result;
+#[unstable(feature = "panic_handler", issue = "30449")]
pub use panicking::{take_hook, set_hook, PanicInfo, Location};
///
/// "speed bump" to alert users of `recover` that broken invariants may be
/// witnessed and may need to be accounted for.
///
-/// ## Who implements `RecoverSafe`?
+/// ## Who implements `UnwindSafe`?
///
/// Types such as `&mut T` and `&RefCell<T>` are examples which are **not**
/// recover safe. The general idea is that any mutable state which can be shared
/// poisoning by default. They still allow witnessing a broken invariant, but
/// they already provide their own "speed bumps" to do so.
///
-/// ## When should `RecoverSafe` be used?
+/// ## When should `UnwindSafe` be used?
///
/// It is not intended that most types or functions need to worry about this trait.
/// It is only used as a bound on the `recover` function and as mentioned above,
/// wrapper struct in this module can be used to force this trait to be
/// implemented for any closed over variables passed to the `recover` function
/// (more on this below).
-#[unstable(feature = "recover", reason = "awaiting feedback", issue = "27719")]
+#[stable(feature = "catch_unwind", since = "1.9.0")]
#[rustc_on_unimplemented = "the type {Self} may not be safely transferred \
across a recover boundary"]
+pub trait UnwindSafe {}
+
+/// Deprecated, renamed to `UnwindSafe`
+#[unstable(feature = "recover", reason = "awaiting feedback", issue = "27719")]
+#[rustc_deprecated(reason = "renamed to `UnwindSafe`", since = "1.9.0")]
pub trait RecoverSafe {}
+#[unstable(feature = "recover", reason = "awaiting feedback", issue = "27719")]
+#[allow(deprecated)]
+impl<T: UnwindSafe> RecoverSafe for T {}
/// A marker trait representing types where a shared reference is considered
/// recover safe.
/// interior mutability.
///
/// This is a "helper marker trait" used to provide impl blocks for the
-/// `RecoverSafe` trait, for more information see that documentation.
-#[unstable(feature = "recover", reason = "awaiting feedback", issue = "27719")]
+/// `UnwindSafe` trait, for more information see that documentation.
+#[stable(feature = "catch_unwind", since = "1.9.0")]
#[rustc_on_unimplemented = "the type {Self} contains interior mutability \
and a reference may not be safely transferrable \
across a recover boundary"]
-pub trait RefRecoverSafe {}
+pub trait RefUnwindSafe {}
/// A simple wrapper around a type to assert that it is panic safe.
///
///
/// # Examples
///
-/// One way to use `AssertRecoverSafe` is to assert that the entire closure
+/// One way to use `AssertUnwindSafe` is to assert that the entire closure
/// itself is recover safe, bypassing all checks for all variables:
///
/// ```
-/// #![feature(recover, std_panic)]
-///
-/// use std::panic::{self, AssertRecoverSafe};
+/// use std::panic::{self, AssertUnwindSafe};
///
/// let mut variable = 4;
///
/// // This code will not compile because the closure captures `&mut variable`
/// // which is not considered panic safe by default.
///
-/// // panic::recover(|| {
+/// // panic::catch_unwind(|| {
/// // variable += 3;
/// // });
///
-/// // This, however, will compile due to the `AssertRecoverSafe` wrapper
-/// let result = panic::recover(AssertRecoverSafe(|| {
+/// // This, however, will compile due to the `AssertUnwindSafe` wrapper
+/// let result = panic::catch_unwind(AssertUnwindSafe(|| {
/// variable += 3;
/// }));
/// // ...
/// ```
///
/// Wrapping the entire closure amounts to a blanket assertion that all captured
-/// variables are recover safe. This has the downside that if new captures are
-/// added in the future, they will also be considered recover safe. Therefore,
+/// variables are unwind safe. This has the downside that if new captures are
+/// added in the future, they will also be considered unwind safe. Therefore,
/// you may prefer to just wrap individual captures, as shown below. This is
/// more annotation, but it ensures that if a new capture is added which is not
-/// recover safe, you will get a compilation error at that time, which will
+/// unwind safe, you will get a compilation error at that time, which will
/// allow you to consider whether that new capture in fact represents a bug or
/// not.
///
/// ```
-/// #![feature(recover, std_panic)]
-///
-/// use std::panic::{self, AssertRecoverSafe};
+/// use std::panic::{self, AssertUnwindSafe};
///
/// let mut variable = 4;
/// let other_capture = 3;
///
/// let result = {
-/// let mut wrapper = AssertRecoverSafe(&mut variable);
-/// panic::recover(move || {
+/// let mut wrapper = AssertUnwindSafe(&mut variable);
+/// panic::catch_unwind(move || {
/// **wrapper += other_capture;
/// })
/// };
/// // ...
/// ```
-#[unstable(feature = "recover", reason = "awaiting feedback", issue = "27719")]
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+pub struct AssertUnwindSafe<T>(
+ #[stable(feature = "catch_unwind", since = "1.9.0")]
+ pub T
+);
+
+/// Deprecated, renamed to `AssertUnwindSafe`
+#[unstable(feature = "recover", issue = "27719")]
+#[rustc_deprecated(reason = "renamed to `AssertUnwindSafe`", since = "1.9.0")]
pub struct AssertRecoverSafe<T>(pub T);
-// Implementations of the `RecoverSafe` trait:
+// Implementations of the `UnwindSafe` trait:
//
-// * By default everything is recover safe
-// * pointers T contains mutability of some form are not recover safe
+// * By default everything is unwind safe
+// * pointers to types containing mutability of some form are not unwind safe
// * Unique, an owning pointer, lifts an implementation
-// * Types like Mutex/RwLock which are explicilty poisoned are recover safe
-// * Our custom AssertRecoverSafe wrapper is indeed recover safe
-impl RecoverSafe for .. {}
-impl<'a, T: ?Sized> !RecoverSafe for &'a mut T {}
-impl<'a, T: RefRecoverSafe + ?Sized> RecoverSafe for &'a T {}
-impl<T: RefRecoverSafe + ?Sized> RecoverSafe for *const T {}
-impl<T: RefRecoverSafe + ?Sized> RecoverSafe for *mut T {}
-impl<T: RecoverSafe> RecoverSafe for Unique<T> {}
-impl<T: RefRecoverSafe + ?Sized> RecoverSafe for Shared<T> {}
-impl<T: ?Sized> RecoverSafe for Mutex<T> {}
-impl<T: ?Sized> RecoverSafe for RwLock<T> {}
-impl<T> RecoverSafe for AssertRecoverSafe<T> {}
+// * Types like Mutex/RwLock which are explicitly poisoned are unwind safe
+// * Our custom AssertUnwindSafe wrapper is indeed unwind safe
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl UnwindSafe for .. {}
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<'a, T: ?Sized> !UnwindSafe for &'a mut T {}
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<'a, T: RefUnwindSafe + ?Sized> UnwindSafe for &'a T {}
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<T: RefUnwindSafe + ?Sized> UnwindSafe for *const T {}
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<T: RefUnwindSafe + ?Sized> UnwindSafe for *mut T {}
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<T: UnwindSafe> UnwindSafe for Unique<T> {}
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<T: RefUnwindSafe + ?Sized> UnwindSafe for Shared<T> {}
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<T: ?Sized> UnwindSafe for Mutex<T> {}
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<T: ?Sized> UnwindSafe for RwLock<T> {}
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<T> UnwindSafe for AssertUnwindSafe<T> {}
+#[unstable(feature = "recover", issue = "27719")]
+#[allow(deprecated)]
+impl<T> UnwindSafe for AssertRecoverSafe<T> {}
// not covered via the Shared impl above b/c the inner contents use
// Cell/AtomicUsize, but the usage here is recover safe so we can lift the
// impl up one level to Arc/Rc itself
-impl<T: RefRecoverSafe + ?Sized> RecoverSafe for Rc<T> {}
-impl<T: RefRecoverSafe + ?Sized> RecoverSafe for Arc<T> {}
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<T: RefUnwindSafe + ?Sized> UnwindSafe for Rc<T> {}
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<T: RefUnwindSafe + ?Sized> UnwindSafe for Arc<T> {}
// Pretty simple implementations for the `RefRecoverSafe` marker trait,
// basically just saying that this is a marker trait and `UnsafeCell` is the
// only thing which doesn't implement it (which then transitively applies to
// everything else).
-impl RefRecoverSafe for .. {}
-impl<T: ?Sized> !RefRecoverSafe for UnsafeCell<T> {}
-impl<T> RefRecoverSafe for AssertRecoverSafe<T> {}
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl RefUnwindSafe for .. {}
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<T: ?Sized> !RefUnwindSafe for UnsafeCell<T> {}
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<T> RefUnwindSafe for AssertUnwindSafe<T> {}
+#[unstable(feature = "recover", issue = "27719")]
+#[allow(deprecated)]
+impl<T> RefUnwindSafe for AssertRecoverSafe<T> {}
+
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<T> Deref for AssertUnwindSafe<T> {
+ type Target = T;
+
+ fn deref(&self) -> &T {
+ &self.0
+ }
+}
+
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<T> DerefMut for AssertUnwindSafe<T> {
+ fn deref_mut(&mut self) -> &mut T {
+ &mut self.0
+ }
+}
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+impl<R, F: FnOnce() -> R> FnOnce<()> for AssertUnwindSafe<F> {
+ type Output = R;
+
+ extern "rust-call" fn call_once(self, _args: ()) -> R {
+ (self.0)()
+ }
+}
+
+#[allow(deprecated)]
impl<T> AssertRecoverSafe<T> {
/// Creates a new `AssertRecoverSafe` wrapper around the provided type.
#[unstable(feature = "recover", reason = "awaiting feedback", issue = "27719")]
}
}
+#[unstable(feature = "recover", issue = "27719")]
+#[allow(deprecated)]
impl<T> Deref for AssertRecoverSafe<T> {
type Target = T;
}
}
+#[unstable(feature = "recover", issue = "27719")]
+#[allow(deprecated)]
impl<T> DerefMut for AssertRecoverSafe<T> {
fn deref_mut(&mut self) -> &mut T {
&mut self.0
}
}
+#[unstable(feature = "recover", issue = "27719")]
+#[allow(deprecated)]
impl<R, F: FnOnce() -> R> FnOnce<()> for AssertRecoverSafe<F> {
type Output = R;
}
}
-/// Invokes a closure, capturing the cause of panic if one occurs.
+/// Invokes a closure, capturing the cause of an unwinding panic if one occurs.
///
/// This function will return `Ok` with the closure's result if the closure
/// does not panic, and will return `Err(cause)` if the closure panics. The
///
/// It is **not** recommended to use this function for a general try/catch
/// mechanism. The `Result` type is more appropriate to use for functions that
-/// can fail on a regular basis.
-///
-/// The closure provided is required to adhere to the `RecoverSafe` to ensure
-/// that all captured variables are safe to cross this recover boundary. The
-/// purpose of this bound is to encode the concept of [exception safety][rfc] in
-/// the type system. Most usage of this function should not need to worry about
-/// this bound as programs are naturally panic safe without `unsafe` code. If it
-/// becomes a problem the associated `AssertRecoverSafe` wrapper type in this
+/// can fail on a regular basis. Additionally, this function is not guaranteed
+/// to catch all panics; see the "Notes" section below.
+///
+/// The closure provided is required to adhere to the `UnwindSafe` bound to
+/// ensure that all captured variables are safe to cross this boundary. The purpose of
+/// this bound is to encode the concept of [exception safety][rfc] in the type
+/// system. Most usage of this function should not need to worry about this
+/// bound as programs are naturally panic safe without `unsafe` code. If it
+/// becomes a problem the associated `AssertUnwindSafe` wrapper type in this
/// module can be used to quickly assert that the usage here is indeed exception
/// safe.
///
/// [rfc]: https://github.com/rust-lang/rfcs/blob/master/text/1236-stabilize-catch-panic.md
///
+/// # Notes
+///
+/// Note that this function **may not catch all panics** in Rust. A panic in
+/// Rust is not always implemented via unwinding, but can be implemented by
+/// aborting the process as well. This function *only* catches unwinding panics,
+/// not those that abort the process.
+///
/// # Examples
///
/// ```
-/// #![feature(recover, std_panic)]
-///
/// use std::panic;
///
-/// let result = panic::recover(|| {
+/// let result = panic::catch_unwind(|| {
/// println!("hello!");
/// });
/// assert!(result.is_ok());
///
-/// let result = panic::recover(|| {
+/// let result = panic::catch_unwind(|| {
/// panic!("oh no!");
/// });
/// assert!(result.is_err());
/// ```
-#[unstable(feature = "recover", reason = "awaiting feedback", issue = "27719")]
-pub fn recover<F: FnOnce() -> R + RecoverSafe, R>(f: F) -> Result<R> {
+#[stable(feature = "catch_unwind", since = "1.9.0")]
+pub fn catch_unwind<F: FnOnce() -> R + UnwindSafe, R>(f: F) -> Result<R> {
let mut result = None;
unsafe {
let result = &mut result;
Ok(result.unwrap())
}
+/// Deprecated, renamed to `catch_unwind`
+#[unstable(feature = "recover", reason = "awaiting feedback", issue = "27719")]
+#[rustc_deprecated(reason = "renamed to `catch_unwind`", since = "1.9.0")]
+pub fn recover<F: FnOnce() -> R + UnwindSafe, R>(f: F) -> Result<R> {
+ catch_unwind(f)
+}
+
/// Triggers a panic without invoking the panic handler.
///
-/// This is designed to be used in conjunction with `recover` to, for example,
-/// carry a panic across a layer of C code.
+/// This is designed to be used in conjunction with `catch_unwind` to, for
+/// example, carry a panic across a layer of C code.
+///
+/// # Notes
+///
+/// Note that panics in Rust are not always implemented via unwinding, but they
+/// may be implemented by aborting the process instead. If panics are
+/// implemented this way, then calling this function will abort the process
+/// rather than trigger an unwind.
///
/// # Examples
///
/// ```should_panic
-/// #![feature(std_panic, recover, panic_propagate)]
-///
/// use std::panic;
///
-/// let result = panic::recover(|| {
+/// let result = panic::catch_unwind(|| {
/// panic!("oh no!");
/// });
///
/// if let Err(err) = result {
-/// panic::propagate(err);
+/// panic::resume_unwind(err);
/// }
/// ```
+#[stable(feature = "resume_unwind", since = "1.9.0")]
+pub fn resume_unwind(payload: Box<Any + Send>) -> ! {
+ unwind::rust_panic(payload)
+}
+
+/// Deprecated, use `resume_unwind` instead
#[unstable(feature = "panic_propagate", reason = "awaiting feedback", issue = "30752")]
+#[rustc_deprecated(reason = "renamed to `resume_unwind`", since = "1.9.0")]
pub fn propagate(payload: Box<Any + Send>) -> ! {
- unwind::rust_panic(payload)
+ resume_unwind(payload)
}
Done = 3,
}
-/// A Windows path prefix, e.g. `C:` or `\server\share`.
+/// A Windows path prefix, e.g. `C:` or `\\server\share`.
///
/// Does not occur on Unix.
#[stable(feature = "rust1", since = "1.0.0")]
#[derive(Copy, Clone, PartialEq, Eq, PartialOrd, Ord, Hash, Debug)]
#[stable(feature = "rust1", since = "1.0.0")]
pub enum Component<'a> {
- /// A Windows path prefix, e.g. `C:` or `\server\share`.
+ /// A Windows path prefix, e.g. `C:` or `\\server\share`.
///
/// Does not occur on Unix.
#[stable(feature = "rust1", since = "1.0.0")]
sys_common::args::init(argc, argv);
// Let's run some code!
- let res = panic::recover(mem::transmute::<_, fn()>(main));
+ let res = panic::catch_unwind(mem::transmute::<_, fn()>(main));
sys_common::cleanup();
res.is_err()
};
static O: Once = Once::new();
// poison the once
- let t = panic::recover(|| {
+ let t = panic::catch_unwind(|| {
O.call_once(|| panic!());
});
assert!(t.is_err());
// poisoning propagates
- let t = panic::recover(|| {
+ let t = panic::catch_unwind(|| {
O.call_once(|| {});
});
assert!(t.is_err());
static O: Once = Once::new();
// poison the once
- let t = panic::recover(|| {
+ let t = panic::catch_unwind(|| {
O.call_once(|| panic!());
});
assert!(t.is_err());
first = false;
}
let mut rest = inner;
- while rest.char_at(0).is_numeric() {
+ while rest.chars().next().unwrap().is_numeric() {
rest = &rest[1..];
}
let i: usize = inner[.. (inner.len() - rest.len())].parse().unwrap();
context: *mut uw::_Unwind_Context
) -> uw::_Unwind_Reason_Code
{
+ // Backtraces on ARM will call the personality routine with
+ // state == _US_VIRTUAL_UNWIND_FRAME | _US_FORCE_UNWIND. In those cases
+ // we want to continue unwinding the stack, otherwise all our backtraces
+ // would end at __rust_try.
if (state as c_int & uw::_US_ACTION_MASK as c_int)
- == uw::_US_VIRTUAL_UNWIND_FRAME as c_int { // search phase
+ == uw::_US_VIRTUAL_UNWIND_FRAME as c_int
+ && (state as c_int & uw::_US_FORCE_UNWIND as c_int) == 0 { // search phase
uw::_URC_HANDLER_FOUND // catch!
}
else { // cleanup phase
match item {
Ok(ch) => string.push_char(ch),
Err(surrogate) => {
+ let surrogate = surrogate.unpaired_surrogate();
// Surrogates are known to be in the code point range.
- let code_point = unsafe { CodePoint::from_u32_unchecked(surrogate as u32) };
+ let code_point = unsafe {
+ CodePoint::from_u32_unchecked(surrogate as u32)
+ };
// Skip the WTF-8 concatenation check,
// surrogate pairs are already decoded by decode_utf16
string.push_code_point_unchecked(code_point)
#[doc(no_inline)] #[stable(feature = "rust1", since = "1.0.0")]
pub use super::fs::{PermissionsExt, OpenOptionsExt, MetadataExt, FileTypeExt};
#[doc(no_inline)] #[stable(feature = "rust1", since = "1.0.0")]
- pub use super::fs::{DirEntryExt};
+ pub use super::fs::DirEntryExt;
+ #[doc(no_inline)] #[stable(feature = "rust1", since = "1.0.0")]
+ pub use super::thread::JoinHandleExt;
#[doc(no_inline)] #[stable(feature = "rust1", since = "1.0.0")]
pub use super::process::{CommandExt, ExitStatusExt};
}
/// (the daemon) in the same session.
#[unstable(feature = "process_session_leader", reason = "recently added",
issue = "27811")]
+ #[rustc_deprecated(reason = "use `before_exec` instead",
+ since = "1.9.0")]
fn session_leader(&mut self, on: bool) -> &mut process::Command;
/// Schedules a closure to be run just before the `exec` function is
/// file descriptors may have changed. If a "transactional spawn" is
/// required to gracefully handle errors it is recommended to use the
/// cross-platform `spawn` instead.
- #[unstable(feature = "process_exec", issue = "31398")]
+ #[stable(feature = "process_exec2", since = "1.9.0")]
fn exec(&mut self) -> io::Error;
}
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-//! Unix-specific extensions to primitives in the `std::process` module.
+//! Unix-specific extensions to primitives in the `std::thread` module.
-#![unstable(feature = "thread_extensions", issue = "29791")]
+#![stable(feature = "thread_extensions", since = "1.9.0")]
#[allow(deprecated)]
use os::unix::raw::pthread_t;
use sys_common::{AsInner, IntoInner};
use thread::JoinHandle;
-#[unstable(feature = "thread_extensions", issue = "29791")]
+#[stable(feature = "thread_extensions", since = "1.9.0")]
#[allow(deprecated)]
pub type RawPthread = pthread_t;
/// Unix-specific extensions to `std::thread::JoinHandle`
-#[unstable(feature = "thread_extensions", issue = "29791")]
+#[stable(feature = "thread_extensions", since = "1.9.0")]
pub trait JoinHandleExt {
/// Extracts the raw pthread_t without taking ownership
+ #[stable(feature = "thread_extensions", since = "1.9.0")]
fn as_pthread_t(&self) -> RawPthread;
+
/// Consumes the thread, returning the raw pthread_t
///
/// This function **transfers ownership** of the underlying pthread_t to
/// the caller. Callers are then the unique owners of the pthread_t and
/// must either detach or join the pthread_t once it's no longer needed.
+ #[stable(feature = "thread_extensions", since = "1.9.0")]
fn into_pthread_t(self) -> RawPthread;
}
-#[unstable(feature = "thread_extensions", issue = "29791")]
+#[stable(feature = "thread_extensions", since = "1.9.0")]
impl<T> JoinHandleExt for JoinHandle<T> {
fn as_pthread_t(&self) -> RawPthread {
self.as_inner().id() as RawPthread
}
+
fn into_pthread_t(self) -> RawPthread {
self.into_inner().into_id() as RawPthread
}
static ENV_LOCK: StaticMutex = StaticMutex::new();
/// Returns the platform-specific value of errno
+#[cfg(not(target_os = "dragonfly"))]
pub fn errno() -> i32 {
extern {
#[cfg_attr(any(target_os = "linux", target_os = "emscripten"),
target_env = "newlib"),
link_name = "__errno")]
#[cfg_attr(target_os = "solaris", link_name = "___errno")]
- #[cfg_attr(target_os = "dragonfly", link_name = "__dfly_error")]
#[cfg_attr(any(target_os = "macos",
target_os = "ios",
target_os = "freebsd"),
}
}
+#[cfg(target_os = "dragonfly")]
+pub fn errno() -> i32 {
+ extern {
+ #[thread_local]
+ static errno: c_int;
+ }
+
+ errno as i32
+}
+
/// Gets a detailed string description for the given error number.
pub fn error_string(errno: i32) -> String {
extern {
}
}
+ #[cfg(not(target_os = "dragonfly"))]
+ pub type clock_t = libc::c_int;
+ #[cfg(target_os = "dragonfly")]
+ pub type clock_t = libc::c_ulong;
+
impl Timespec {
- pub fn now(clock: libc::c_int) -> Timespec {
+ pub fn now(clock: clock_t) -> Timespec {
let mut t = Timespec {
t: libc::timespec {
tv_sec: 0,
//! Extensions to `std::thread` for Windows.
-#![unstable(feature = "thread_extensions", issue = "29791")]
+#![stable(feature = "thread_extensions", since = "1.9.0")]
use os::windows::io::{RawHandle, AsRawHandle, IntoRawHandle};
use thread;
use sys_common::{AsInner, IntoInner};
+#[stable(feature = "thread_extensions", since = "1.9.0")]
impl<T> AsRawHandle for thread::JoinHandle<T> {
fn as_raw_handle(&self) -> RawHandle {
self.as_inner().handle().raw() as *mut _
}
}
+#[stable(feature = "thread_extensions", since = "1.9.0")]
impl<T> IntoRawHandle for thread::JoinHandle<T> {
fn into_raw_handle(self) -> RawHandle {
self.into_inner().into_handle().into_raw() as *mut _
("naked_functions", "1.9.0", Some(32408), Active),
// allow empty structs and enum variants with braces
- ("braced_empty_structs", "1.5.0", Some(29720), Accepted),
+ ("braced_empty_structs", "1.8.0", Some(29720), Accepted),
// allow overloading augmented assignment operations like `a += b`
- ("augmented_assignments", "1.5.0", Some(28235), Accepted),
+ ("augmented_assignments", "1.8.0", Some(28235), Accepted),
// allow `#[no_debug]`
("no_debug", "1.5.0", Some(29721), Active),
("stmt_expr_attributes", "1.6.0", Some(15701), Active),
// Allows `#[deprecated]` attribute
- ("deprecated", "1.6.0", Some(29935), Active),
+ ("deprecated", "1.9.0", Some(29935), Accepted),
// allow using type ascription in expressions
("type_ascription", "1.6.0", Some(23416), Active),
"the `#[rustc_if_this_changed]` attribute \
is just used for rustc unit tests \
and will never be stable")),
+ ("rustc_dirty", Whitelisted, Gated("rustc_attrs",
+ "the `#[rustc_dirty]` attribute \
+ is just used for rustc unit tests \
+ and will never be stable")),
+ ("rustc_clean", Whitelisted, Gated("rustc_attrs",
+ "the `#[rustc_clean]` attribute \
+ is just used for rustc unit tests \
+ and will never be stable")),
("rustc_symbol_name", Whitelisted, Gated("rustc_attrs",
"internal rustc attributes will never be stable")),
("rustc_item_path", Whitelisted, Gated("rustc_attrs",
("must_use", Whitelisted, Ungated),
("stable", Whitelisted, Ungated),
("unstable", Whitelisted, Ungated),
- ("deprecated", Normal, Gated("deprecated", "`#[deprecated]` attribute is unstable")),
+ ("deprecated", Normal, Ungated),
("rustc_paren_sugar", Normal, Gated("unboxed_closures",
"unboxed_closures are still evolving")),
#![feature(libc)]
#![feature(rustc_private)]
#![feature(staged_api)]
-#![feature(str_char)]
#![feature(str_escape)]
#![feature(unicode)]
#![feature(question_mark)]
loop {
// expr?
while self.eat(&token::Question) {
- let hi = self.span.hi;
+ let hi = self.last_span.hi;
e = self.mk_expr(lo, hi, ExprKind::Try(e), None);
}
self.commasep(Inconsistent, &a.outputs,
|s, out| {
- match out.constraint.slice_shift_char() {
- Some(('=', operand)) if out.is_rw => {
- s.print_string(&format!("+{}", operand),
+ let mut ch = out.constraint.chars();
+ match ch.next() {
+ Some('=') if out.is_rw => {
+ s.print_string(&format!("+{}", ch.as_str()),
ast::StrStyle::Cooked)?
}
- _ => s.print_string(&out.constraint, ast::StrStyle::Cooked)?
+ _ => s.print_string(&out.constraint,
+ ast::StrStyle::Cooked)?
}
s.popen()?;
s.print_expr(&out.expr)?;
// It's the opposite of '=&' which means that the memory
// cannot be shared with any other operand (usually when
// a register is clobbered early.)
- let output = match constraint.slice_shift_char() {
- Some(('=', _)) => None,
- Some(('+', operand)) => {
+ let mut ch = constraint.chars();
+ let output = match ch.next() {
+ Some('=') => None,
+ Some('+') => {
Some(token::intern_and_get_ident(&format!(
- "={}", operand)))
+ "={}", ch.as_str())))
}
_ => {
cx.span_err(span, "output operand constraint lacks '=' or '+'");
let is_rw = output.is_some();
let is_indirect = constraint.contains("*");
outputs.push(ast::InlineAsmOutput {
- constraint: output.unwrap_or(constraint),
+ constraint: output.unwrap_or(constraint.clone()),
expr: out,
is_rw: is_rw,
is_indirect: is_indirect,
#![feature(rustc_private)]
#![feature(staged_api)]
-#![feature(str_char)]
extern crate fmt_macros;
#[macro_use] extern crate log;
va_end(pairs);
return sum / n;
}
+
+int32_t rust_int8_to_int32(int8_t x) {
+ return (int32_t)x;
+}
// Test that when a trait impl changes, fns whose body uses that trait
// must also be recompiled.
-// compile-flags: -Z incr-comp
+// compile-flags: -Z query-dep-graph
#![feature(rustc_attrs)]
#![allow(warnings)]
// Test that immediate callers have to change when callee changes, but
// not callers' callers.
-// compile-flags: -Z incr-comp
+// compile-flags: -Z query-dep-graph
#![feature(rustc_attrs)]
#![allow(dead_code)]
// Test cases where a changing struct appears in the signature of fns
// and methods.
-// compile-flags: -Z incr-comp
+// compile-flags: -Z query-dep-graph
#![feature(rustc_attrs)]
#![allow(dead_code)]
// Test that adding an impl to a trait `Foo` DOES affect functions
// that only use `Bar` if they have methods in common.
-// compile-flags: -Z incr-comp
+// compile-flags: -Z query-dep-graph
#![feature(rustc_attrs)]
#![allow(dead_code)]
// Test that adding an impl to a trait `Foo` does not affect functions
// that only use `Bar`, so long as they do not have methods in common.
-// compile-flags: -Z incr-comp
+// compile-flags: -Z query-dep-graph
#![feature(rustc_attrs)]
#![allow(warnings)]
// Test that when a trait impl changes, fns whose body uses that trait
// must also be recompiled.
-// compile-flags: -Z incr-comp
+// compile-flags: -Z query-dep-graph
#![feature(rustc_attrs)]
#![allow(warnings)]
// Test that two unrelated functions have no trans dependency.
-// compile-flags: -Z incr-comp
+// compile-flags: -Z query-dep-graph
#![feature(rustc_attrs)]
#![allow(dead_code)]
// #[deprecated] can't be used in staged api
-#![feature(deprecated, staged_api)]
+#![feature(staged_api)]
#![stable(feature = "test_feature", since = "1.0.0")]
// aux-build:deprecation-lint.rs
-#![feature(deprecated)]
-
#![deny(deprecated)]
#![allow(warnings)]
// Various checks that deprecation attributes are used correctly
-#![feature(deprecated)]
-
mod bogus_attribute_types_1 {
#[deprecated(since = "a", note = "a", reason)] //~ ERROR unknown meta item 'reason'
fn f1() { }
mod core {} // Check that private crates are not glob imported
}
+mod bar {
+ pub extern crate core;
+}
+
+mod baz {
+ pub use bar::*;
+ use self::core::cell; // Check that public extern crates are glob imported
+}
+
#[rustc_error]
fn main() {} //~ ERROR compilation successful
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#![feature(rustc_attrs)]
+
+mod a {
+ pub mod b { pub struct Foo; }
+
+ pub mod c {
+ use super::b;
+ pub struct Bar(pub b::Foo);
+ }
+
+ pub use self::c::*;
+}
+
+#[rustc_error]
+fn main() { //~ ERROR compilation successful
+ let _ = a::c::Bar(a::b::Foo);
+ let _ = a::Bar(a::b::Foo);
+}
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#![feature(rustc_attrs)]
+#![allow(unused)]
+
+extern crate core;
+use core as core_export;
+use self::x::*;
+mod x {}
+
+#[rustc_error]
+fn main() {} //~ ERROR compilation successful
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+extern {
+ fn foo(a: i32, ...);
+}
+
+fn bar(_: *const u8) {}
+
+fn main() {
+ unsafe {
+ foo(0, bar);
+ //~^ ERROR can't pass `fn(*const u8) {bar}` to variadic function, cast to `fn(*const u8)`
+ }
+}
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+use bar::Foo; //~ ERROR There is no `Foo` in `bar` [E0432]
+mod bar {
+ use Foo; //~ ERROR There is no `Foo` in the crate root [E0432]
+}
+
+fn main() {}
#![allow(dead_code)]
#![feature(recover)]
-use std::panic::RecoverSafe;
+use std::panic::UnwindSafe;
-fn assert<T: RecoverSafe + ?Sized>() {}
+fn assert<T: UnwindSafe + ?Sized>() {}
fn main() {
- assert::<&mut i32>(); //~ ERROR: RecoverSafe` is not satisfied
+ assert::<&mut i32>(); //~ ERROR: UnwindSafe` is not satisfied
}
use std::boxed::HEAP; //~ ERROR use of unstable library feature
let _ = HEAP <- { //~ ERROR use of unstable library feature
- ::core::raw::Slice { //~ ERROR use of unstable library feature
- data: &42, //~ ERROR use of unstable library feature
- len: 1 //~ ERROR use of unstable library feature
- }
+ HEAP //~ ERROR use of unstable library feature
};
}
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#![feature(question_mark)]
+
+// Make sure that the span of try shorthand does not include the trailing
+// semicolon.
+fn a() -> Result<i32, ()> {
+ Err(5)?; //~ ERROR 16:5: 16:12
+ Ok(1)
+}
+
+fn main() {}
mod foo {
use self::{self};
- //~^ ERROR unresolved import `self`. There is no `self` in `???`
+ //~^ ERROR unresolved import `self`. There is no `self` in the crate root
use super::{self};
- //~^ ERROR unresolved import `super`. There is no `super` in `???`
+ //~^ ERROR unresolved import `super`. There is no `super` in the crate root
}
fn main() {}
//~| expected variadic fn
//~| found non-variadic function
- foo(1, 2, 3f32); //~ ERROR: can't pass an f32 to variadic function, cast to c_double
- foo(1, 2, true); //~ ERROR: can't pass bool to variadic function, cast to c_int
- foo(1, 2, 1i8); //~ ERROR: can't pass i8 to variadic function, cast to c_int
- foo(1, 2, 1u8); //~ ERROR: can't pass u8 to variadic function, cast to c_uint
- foo(1, 2, 1i16); //~ ERROR: can't pass i16 to variadic function, cast to c_int
- foo(1, 2, 1u16); //~ ERROR: can't pass u16 to variadic function, cast to c_uint
+ foo(1, 2, 3f32); //~ ERROR: can't pass an `f32` to variadic function, cast to `c_double`
+ foo(1, 2, true); //~ ERROR: can't pass `bool` to variadic function, cast to `c_int`
+ foo(1, 2, 1i8); //~ ERROR: can't pass `i8` to variadic function, cast to `c_int`
+ foo(1, 2, 1u8); //~ ERROR: can't pass `u8` to variadic function, cast to `c_uint`
+ foo(1, 2, 1i16); //~ ERROR: can't pass `i16` to variadic function, cast to `c_int`
+ foo(1, 2, 1u16); //~ ERROR: can't pass `u16` to variadic function, cast to `c_uint`
}
}
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+// revisions: rpass1 cfail2
+
+#![allow(warnings)]
+#![feature(rustc_attrs)]
+
+// Sanity check for the dirty-clean system. Give the opposite
+// annotations that we expect to see, so that we check that errors are
+// reported.
+
+fn main() { }
+
+mod x {
+ #[cfg(rpass1)]
+ pub fn x() -> usize {
+ 22
+ }
+
+ #[cfg(cfail2)]
+ pub fn x() -> u32 {
+ 22
+ }
+}
+
+mod y {
+ use x;
+
+ #[rustc_clean(label="TypeckItemBody", cfg="cfail2")]
+ #[rustc_clean(label="TransCrateItem", cfg="cfail2")]
+ pub fn y() {
+ //[cfail2]~^ ERROR `TypeckItemBody("y::y")` not found in dep graph, but should be clean
+ //[cfail2]~| ERROR `TransCrateItem("y::y")` not found in dep graph, but should be clean
+ x::x();
+ }
+}
+
+mod z {
+ #[rustc_dirty(label="TypeckItemBody", cfg="cfail2")]
+ #[rustc_dirty(label="TransCrateItem", cfg="cfail2")]
+ pub fn z() {
+ //[cfail2]~^ ERROR `TypeckItemBody("z::z")` found in dep graph, but should be dirty
+ //[cfail2]~| ERROR `TransCrateItem("z::z")` found in dep graph, but should be dirty
+ }
+}
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+// revisions: rpass1 rpass2
+
+#![allow(warnings)]
+#![feature(rustc_attrs)]
+
+fn main() { }
+
+mod x {
+ #[cfg(rpass1)]
+ pub fn x() -> i32 {
+ 1
+ }
+
+ #[cfg(rpass2)]
+ pub fn x() -> i32 {
+ 2
+ }
+}
+
+mod y {
+ use x;
+
+ #[rustc_dirty(label="TypeckItemBody", cfg="rpass2")]
+ pub fn y() {
+ x::x();
+ }
+}
+
+mod z {
+ use y;
+
+ #[rustc_clean(label="TypeckItemBody", cfg="rpass2")]
+ pub fn z() {
+ y::y();
+ }
+}
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+// revisions: rpass1 rpass2
+
+#![allow(warnings)]
+#![feature(rustc_attrs)]
+
+// Here the only thing which changes is the string constant in `x`.
+// Therefore, the compiler deduces (correctly) that typeck is not
+// needed even for callers of `x`.
+//
+// It is not entirely clear why `TransCrateItem` invalidates `y` and
+// `z`; I think it's because of the structure of trans. -nmatsakis
+
+fn main() { }
+
+mod x {
+ #[cfg(rpass1)]
+ pub fn x() {
+ println!("1");
+ }
+
+ #[cfg(rpass2)]
+ #[rustc_dirty(label="TypeckItemBody", cfg="rpass2")]
+ #[rustc_dirty(label="TransCrateItem", cfg="rpass2")]
+ pub fn x() {
+ println!("2");
+ }
+}
+
+mod y {
+ use x;
+
+ #[rustc_clean(label="TypeckItemBody", cfg="rpass2")]
+ #[rustc_clean(label="TransCrateItem", cfg="rpass2")]
+ pub fn y() {
+ x::x();
+ }
+}
+
+mod z {
+ use y;
+
+ #[rustc_clean(label="TypeckItemBody", cfg="rpass2")]
+ #[rustc_clean(label="TransCrateItem", cfg="rpass2")]
+ pub fn z() {
+ y::y();
+ }
+}
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+// compile-flags: -Z parse-only -Z continue-parse-after-error
+
+pub fn test() {
+ foo(|_|) //~ ERROR unexpected token: `)`
+}
+
+fn main() { }
let krate = driver::assign_node_ids(&sess, krate);
let lcx = LoweringContext::new(&sess, Some(&krate));
- let dep_graph = DepGraph::new(sess.opts.build_dep_graph);
+ let dep_graph = DepGraph::new(sess.opts.build_dep_graph());
let mut hir_forest = ast_map::Forest::new(lower_crate(&lcx, &krate), dep_graph);
let arenas = ty::CtxtArenas::new();
let ast_map = driver::make_map(&sess, &mut hir_forest);
+++ /dev/null
--include ../tools.mk
-all:
- $(HOST_RPATH_ENV) $(RUSTDOC) -w json -o $(TMPDIR)/doc.json foo.rs
- $(HOST_RPATH_ENV) $(RUSTDOC) -o $(TMPDIR)/doc $(TMPDIR)/doc.json
+++ /dev/null
-// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
-// file at the top-level directory of this distribution and at
-// http://rust-lang.org/COPYRIGHT.
-//
-// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
-// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
-// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
-// option. This file may not be copied, modified, or distributed
-// except according to those terms.
-
-#![crate_name = "foo"]
-
-//! Very docs
-
-pub mod bar {
-
- /// So correct
- pub mod baz {
- /// Much detail
- pub fn baz() { }
- }
-
- /// *wow*
- pub trait Doge { fn dummy(&self) { } }
-}
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#[link(name = "rust_test_helpers")]
+extern {
+ fn rust_int8_to_int32(_: i8) -> i32;
+}
+
+fn main() {
+ let x = unsafe {
+ rust_int8_to_int32(-1)
+ };
+
+ assert!(x == -1);
+}
#![allow(dead_code)]
#![feature(recover)]
-use std::panic::{RecoverSafe, AssertRecoverSafe};
+use std::panic::{UnwindSafe, AssertUnwindSafe};
use std::cell::RefCell;
use std::sync::{Mutex, RwLock, Arc};
use std::rc::Rc;
struct Foo { a: i32 }
-fn assert<T: RecoverSafe + ?Sized>() {}
+fn assert<T: UnwindSafe + ?Sized>() {}
fn main() {
assert::<i32>();
assert::<Mutex<T>>();
assert::<RwLock<T>>();
}
- fn baz<T: RecoverSafe>() {
+ fn baz<T: UnwindSafe>() {
assert::<Box<T>>();
assert::<Vec<T>>();
assert::<RefCell<T>>();
- assert::<AssertRecoverSafe<T>>();
- assert::<&AssertRecoverSafe<T>>();
- assert::<Rc<AssertRecoverSafe<T>>>();
- assert::<Arc<AssertRecoverSafe<T>>>();
+ assert::<AssertUnwindSafe<T>>();
+ assert::<&AssertUnwindSafe<T>>();
+ assert::<Rc<AssertUnwindSafe<T>>>();
+ assert::<Arc<AssertUnwindSafe<T>>>();
}
}
const TEST_REPOS: &'static [(&'static str, &'static str, Option<&'static str>)] = &[
("https://github.com/rust-lang/cargo",
- "ff02b156f094fb83e70acd965c83c9286411914e",
+ "fae9c539388f1b7c70c31fd0a21b5dd9cd071177",
None),
("https://github.com/iron/iron",
"16c858ec2901e2992fe5e529780f59fa8ed12903",