1 //! This module provides a framework on top of the normal MIR dataflow framework to simplify the
2 //! implementation of analyses that track the values stored in places of interest.
4 //! The default methods of [`ValueAnalysis`] (prefixed with `super_` instead of `handle_`)
5 //! provide some behavior that should be valid for all abstract domains that are based only on the
6 //! value stored in a certain place. On top of these default rules, an implementation should
7 //! override some of the `handle_` methods. For an example, see `ConstAnalysis`.
9 //! An implementation must also provide a [`Map`]. Before the anaylsis begins, all places that
10 //! should be tracked during the analysis must be registered. Currently, the projections of these
11 //! places may only contain derefs, fields and downcasts (otherwise registration fails). During the
12 //! analysis, no new places can be registered.
14 //! Note that if you want to track values behind references, you have to register the dereferenced
15 //! place. For example: Assume `let x = (0, 0)` and that we want to propagate values from `x.0` and
16 //! `x.1` also through the assignment `let y = &x`. In this case, we should register `x.0`, `x.1`,
17 //! `(*y).0` and `(*y).1`.
22 //! Warning: This is a semi-formal attempt to argue for the correctness of this analysis. If you
23 //! find any weak spots, let me know! Recommended reading: Abstract Interpretation. We will use the
24 //! term "place" to refer to a place expression (like `mir::Place`), and we will call the
25 //! underlying entity "object". For instance, `*_1` and `*_2` are not the same place, but depending
26 //! on the value of `_1` and `_2`, they could refer to the same object. Also, the same place can
27 //! refer to different objects during execution. If `_1` is reassigned, then `*_1` may refer to
28 //! different objects before and after assignment. Additionally, when saying "access to a place",
29 //! what we really mean is "access to an object denoted by arbitrary projections of that place".
31 //! In the following, we will assume a constant propagation analysis. Our analysis is correct if
32 //! every transfer function is correct. This is the case if for every pair (f, f#) and abstract
33 //! state s, we have f(y(s)) <= y(f#(s)), where s is a mapping from tracked place to top, bottom or
34 //! a constant. Since pointers (and mutable references) are not tracked, but can be used to change
35 //! values in the concrete domain, f# must assume that all places that can be affected in this way
36 //! for a given program point are already marked with top in s (otherwise many assignments and
37 //! function calls would have no choice but to mark all tracked places with top). This leads us to
38 //! an invariant: For all possible program points where there could possibly exist means of mutable
39 //! access to a tracked place (in the concrete domain), this place must be assigned to top (in the
40 //! abstract domain). The concretization function y can be defined as expected for the constant
41 //! propagation analysis, although the concrete state of course contains all kinds of non-tracked
42 //! data. However, by the invariant above, no mutable access to tracked places that are not marked
43 //! with top may be introduced.
45 //! Note that we (at least currently) do not differentiate between "this place may assume different
46 //! values" and "a pointer to this place escaped the analysis". However, we still want to handle
47 //! assignments to constants as usual for f#. This adds an assumption: Whenever we have an
48 //! assignment that is captured by the analysis, all mutable access to the underlying place (which
49 //! is not observable by the analysis) must be invalidated. This is (hopefully) covered by Stacked
52 //! To be continued...
54 //! The bottom state denotes uninitalized memory.
59 //! - (A1) Assignment to any tracked place invalidates all pointers that could be used to change
60 //! the underlying value.
61 //! - (A2) `StorageLive`, `StorageDead` and `Deinit` make the underlying memory at least
62 //! uninitialized (at least in the sense that declaring access UB is also fine).
63 //! - (A3) An assignment with `State::assign_place_idx` either involves non-overlapping places, or
64 //! the places are the same.
65 //! - (A4) If the value behind a reference to a `Freeze` place is changed, dereferencing the
68 use std::fmt::{Debug, Formatter};
70 use rustc_data_structures::fx::FxHashMap;
71 use rustc_index::vec::IndexVec;
72 use rustc_middle::mir::tcx::PlaceTy;
73 use rustc_middle::mir::*;
74 use rustc_middle::ty::{self, Ty, TyCtxt};
75 use rustc_target::abi::VariantIdx;
78 fmt::DebugWithContext, lattice::FlatSet, Analysis, AnalysisDomain, CallReturnPlaces,
79 JoinSemiLattice, SwitchIntEdgeEffects,
82 pub trait ValueAnalysis<'tcx> {
83 /// For each place of interest, the analysis tracks a value of the given type.
84 type Value: Clone + JoinSemiLattice + HasBottom + HasTop;
86 const NAME: &'static str;
88 fn map(&self) -> ⤅
90 fn handle_statement(&self, statement: &Statement<'tcx>, state: &mut State<Self::Value>) {
91 self.super_statement(statement, state)
94 fn super_statement(&self, statement: &Statement<'tcx>, state: &mut State<Self::Value>) {
95 match &statement.kind {
96 StatementKind::Assign(box (place, rvalue)) => {
97 self.handle_assign(*place, rvalue, state);
99 StatementKind::SetDiscriminant { .. } => {
100 // Could treat this as writing a constant to a pseudo-place.
101 // But discriminants are currently not tracked, so we do nothing.
102 // Related: https://github.com/rust-lang/unsafe-code-guidelines/issues/84
104 StatementKind::Intrinsic(box intrinsic) => {
105 self.handle_intrinsic(intrinsic, state);
107 StatementKind::StorageLive(local) | StatementKind::StorageDead(local) => {
109 state.flood_with(Place::from(*local).as_ref(), self.map(), Self::Value::bottom());
111 StatementKind::Deinit(box place) => {
113 state.flood_with(place.as_ref(), self.map(), Self::Value::bottom());
116 | StatementKind::Retag(..)
117 | StatementKind::FakeRead(..)
118 | StatementKind::Coverage(..)
119 | StatementKind::AscribeUserType(..) => (),
125 intrinsic: &NonDivergingIntrinsic<'tcx>,
126 state: &mut State<Self::Value>,
128 self.super_intrinsic(intrinsic, state);
133 intrinsic: &NonDivergingIntrinsic<'tcx>,
134 state: &mut State<Self::Value>,
137 NonDivergingIntrinsic::Assume(..) => {
138 // Could use this, but ignoring it is sound.
140 NonDivergingIntrinsic::CopyNonOverlapping(CopyNonOverlapping { dst, .. }) => {
141 if let Some(place) = dst.place() {
142 state.flood(place.as_ref(), self.map());
151 rvalue: &Rvalue<'tcx>,
152 state: &mut State<Self::Value>,
154 self.super_assign(target, rvalue, state)
160 rvalue: &Rvalue<'tcx>,
161 state: &mut State<Self::Value>,
163 let result = self.handle_rvalue(rvalue, state);
164 state.assign(target.as_ref(), result, self.map());
169 rvalue: &Rvalue<'tcx>,
170 state: &mut State<Self::Value>,
171 ) -> ValueOrPlaceOrRef<Self::Value> {
172 self.super_rvalue(rvalue, state)
177 rvalue: &Rvalue<'tcx>,
178 state: &mut State<Self::Value>,
179 ) -> ValueOrPlaceOrRef<Self::Value> {
181 Rvalue::Use(operand) => self.handle_operand(operand, state).into(),
182 Rvalue::Ref(_, BorrowKind::Shared, place) => self
184 .find(place.as_ref())
185 .map(ValueOrPlaceOrRef::Ref)
186 .unwrap_or(ValueOrPlaceOrRef::Unknown),
187 Rvalue::Ref(_, _, place) | Rvalue::AddressOf(_, place) => {
188 state.flood(place.as_ref(), self.map());
189 ValueOrPlaceOrRef::Unknown
191 Rvalue::CopyForDeref(place) => {
192 self.handle_operand(&Operand::Copy(*place), state).into()
194 _ => ValueOrPlaceOrRef::Unknown,
200 operand: &Operand<'tcx>,
201 state: &mut State<Self::Value>,
202 ) -> ValueOrPlace<Self::Value> {
203 self.super_operand(operand, state)
208 operand: &Operand<'tcx>,
209 state: &mut State<Self::Value>,
210 ) -> ValueOrPlace<Self::Value> {
212 Operand::Constant(box constant) => {
213 ValueOrPlace::Value(self.handle_constant(constant, state))
215 Operand::Copy(place) | Operand::Move(place) => {
216 // On move, we would ideally flood the place with bottom. But with the current
217 // framework this is not possible (similar to `InterpCx::eval_operand`).
219 .find(place.as_ref())
220 .map(ValueOrPlace::Place)
221 .unwrap_or(ValueOrPlace::Unknown)
228 constant: &Constant<'tcx>,
229 state: &mut State<Self::Value>,
231 self.super_constant(constant, state)
236 _constant: &Constant<'tcx>,
237 _state: &mut State<Self::Value>,
242 /// The effect of a successful function call return should not be
243 /// applied here, see [`Analysis::apply_terminator_effect`].
244 fn handle_terminator(&self, terminator: &Terminator<'tcx>, state: &mut State<Self::Value>) {
245 self.super_terminator(terminator, state)
248 fn super_terminator(&self, terminator: &Terminator<'tcx>, state: &mut State<Self::Value>) {
249 match &terminator.kind {
250 TerminatorKind::Call { .. } | TerminatorKind::InlineAsm { .. } => {
251 // Effect is applied by `handle_call_return`.
253 TerminatorKind::Drop { place, .. } => {
254 // Place can still be accessed after drop, and drop has mutable access to it.
255 state.flood(place.as_ref(), self.map());
257 TerminatorKind::DropAndReplace { .. } | TerminatorKind::Yield { .. } => {
258 // They would have an effect, but are not allowed in this phase.
259 bug!("encountered disallowed terminator");
262 // The other terminators can be ignored.
267 fn handle_call_return(
269 return_places: CallReturnPlaces<'_, 'tcx>,
270 state: &mut State<Self::Value>,
272 self.super_call_return(return_places, state)
275 fn super_call_return(
277 return_places: CallReturnPlaces<'_, 'tcx>,
278 state: &mut State<Self::Value>,
280 return_places.for_each(|place| {
281 state.flood(place.as_ref(), self.map());
285 fn handle_switch_int(
287 discr: &Operand<'tcx>,
288 apply_edge_effects: &mut impl SwitchIntEdgeEffects<State<Self::Value>>,
290 self.super_switch_int(discr, apply_edge_effects)
295 _discr: &Operand<'tcx>,
296 _apply_edge_effects: &mut impl SwitchIntEdgeEffects<State<Self::Value>>,
300 fn wrap(self) -> ValueAnalysisWrapper<Self>
304 ValueAnalysisWrapper(self)
308 pub struct ValueAnalysisWrapper<T>(pub T);
310 impl<'tcx, T: ValueAnalysis<'tcx>> AnalysisDomain<'tcx> for ValueAnalysisWrapper<T> {
311 type Domain = State<T::Value>;
313 type Direction = crate::Forward;
315 const NAME: &'static str = T::NAME;
317 fn bottom_value(&self, _body: &Body<'tcx>) -> Self::Domain {
318 State(StateData::Unreachable)
321 fn initialize_start_block(&self, body: &Body<'tcx>, state: &mut Self::Domain) {
322 // The initial state maps all tracked places of argument projections to ⊤ and the rest to ⊥.
323 assert!(matches!(state.0, StateData::Unreachable));
324 let values = IndexVec::from_elem_n(T::Value::bottom(), self.0.map().value_count);
325 *state = State(StateData::Reachable(values));
326 for arg in body.args_iter() {
327 state.flood(PlaceRef { local: arg, projection: &[] }, self.0.map());
332 impl<'tcx, T> Analysis<'tcx> for ValueAnalysisWrapper<T>
334 T: ValueAnalysis<'tcx>,
336 fn apply_statement_effect(
338 state: &mut Self::Domain,
339 statement: &Statement<'tcx>,
342 if state.is_reachable() {
343 self.0.handle_statement(statement, state);
347 fn apply_terminator_effect(
349 state: &mut Self::Domain,
350 terminator: &Terminator<'tcx>,
353 if state.is_reachable() {
354 self.0.handle_terminator(terminator, state);
358 fn apply_call_return_effect(
360 state: &mut Self::Domain,
362 return_places: crate::CallReturnPlaces<'_, 'tcx>,
364 if state.is_reachable() {
365 self.0.handle_call_return(return_places, state)
369 fn apply_switch_int_edge_effects(
372 discr: &Operand<'tcx>,
373 apply_edge_effects: &mut impl SwitchIntEdgeEffects<Self::Domain>,
375 // FIXME: Dataflow framework provides no access to current state here.
376 self.0.handle_switch_int(discr, apply_edge_effects)
380 rustc_index::newtype_index!(
381 /// This index uniquely identifies a place.
383 /// Not every place has a `PlaceIndex`, and not every `PlaceIndex` correspondends to a tracked
384 /// place. However, every tracked place and all places along its projection have a `PlaceIndex`.
385 pub struct PlaceIndex {}
388 rustc_index::newtype_index!(
389 /// This index uniquely identifies a tracked place and therefore a slot in [`State`].
391 /// It is an implementation detail of this module.
396 #[derive(PartialEq, Eq, Clone, Debug)]
398 Reachable(IndexVec<ValueIndex, V>),
402 /// The dataflow state for an instance of [`ValueAnalysis`].
404 /// Every instance specifies a lattice that represents the possible values of a single tracked
405 /// place. If we call this lattice `V` and set set of tracked places `P`, then a [`State`] is an
406 /// element of `{unreachable} ∪ (P -> V)`. This again forms a lattice, where the bottom element is
407 /// `unreachable` and the top element is the mapping `p ↦ ⊤`. Note that the mapping `p ↦ ⊥` is not
408 /// the bottom element (because joining an unreachable and any other reachable state yields a
409 /// reachable state). All operations on unreachable states are ignored.
411 /// Flooding means assigning a value (by default `⊤`) to all tracked projections of a given place.
412 #[derive(PartialEq, Eq, Clone, Debug)]
413 pub struct State<V>(StateData<V>);
415 impl<V: Clone + HasTop> State<V> {
416 pub fn is_reachable(&self) -> bool {
417 matches!(&self.0, StateData::Reachable(_))
420 pub fn mark_unreachable(&mut self) {
421 self.0 = StateData::Unreachable;
424 pub fn flood_all(&mut self) {
425 self.flood_all_with(V::top())
428 pub fn flood_all_with(&mut self, value: V) {
429 let StateData::Reachable(values) = &mut self.0 else { return };
430 values.raw.fill(value);
433 pub fn flood_with(&mut self, place: PlaceRef<'_>, map: &Map, value: V) {
434 if let Some(root) = map.find(place) {
435 self.flood_idx_with(root, map, value);
439 pub fn flood(&mut self, place: PlaceRef<'_>, map: &Map) {
440 self.flood_with(place, map, V::top())
443 pub fn flood_idx_with(&mut self, place: PlaceIndex, map: &Map, value: V) {
444 let StateData::Reachable(values) = &mut self.0 else { return };
445 map.preorder_invoke(place, &mut |place| {
446 if let Some(vi) = map.places[place].value_index {
447 values[vi] = value.clone();
452 pub fn flood_idx(&mut self, place: PlaceIndex, map: &Map) {
453 self.flood_idx_with(place, map, V::top())
456 pub fn assign_place_idx(&mut self, target: PlaceIndex, source: PlaceIndex, map: &Map) {
457 // We use (A3) and copy all entries one after another.
458 let StateData::Reachable(values) = &mut self.0 else { return };
460 // If both places are tracked, we copy the value to the target. If the target is tracked,
461 // but the source is not, we have to invalidate the value in target. If the target is not
462 // tracked, then we don't have to do anything.
463 if let Some(target_value) = map.places[target].value_index {
464 if let Some(source_value) = map.places[source].value_index {
465 values[target_value] = values[source_value].clone();
467 values[target_value] = V::top();
470 for target_child in map.children(target) {
471 // Try to find corresponding child and recurse. Reasoning is similar as above.
472 let projection = map.places[target_child].proj_elem.unwrap();
473 if let Some(source_child) = map.projections.get(&(source, projection)) {
474 self.assign_place_idx(target_child, *source_child, map);
476 self.flood_idx(target_child, map);
481 pub fn assign(&mut self, target: PlaceRef<'_>, result: ValueOrPlaceOrRef<V>, map: &Map) {
482 if let Some(target) = map.find(target) {
483 self.assign_idx(target, result, map);
485 // We don't track this place nor any projections, assignment can be ignored.
489 pub fn assign_idx(&mut self, target: PlaceIndex, result: ValueOrPlaceOrRef<V>, map: &Map) {
491 ValueOrPlaceOrRef::Value(value) => {
492 // First flood the target place in case we also track any projections (although
493 // this scenario is currently not well-supported by the API).
494 self.flood_idx(target, map);
495 let StateData::Reachable(values) = &mut self.0 else { return };
496 if let Some(value_index) = map.places[target].value_index {
497 values[value_index] = value;
500 ValueOrPlaceOrRef::Place(source) => self.assign_place_idx(target, source, map),
501 ValueOrPlaceOrRef::Ref(source) => {
502 let StateData::Reachable(values) = &mut self.0 else { return };
503 if let Some(value_index) = map.places[target].value_index {
504 values[value_index] = V::top();
506 // Instead of tracking of *where* a reference points to (as in, which place), we
507 // track *what* it points to (as in, what do we know about the target). For an
508 // assignment `x = &y`, we thus copy the info we have for `y` to `*x`. This is
509 // sound because we only track places that are `Freeze`, and (A4).
510 if let Some(target_deref) = map.apply_elem(target, ProjElem::Deref) {
511 self.assign_place_idx(target_deref, source, map);
514 ValueOrPlaceOrRef::Unknown => {
515 self.flood_idx(target, map);
520 pub fn get(&self, place: PlaceRef<'_>, map: &Map) -> V {
521 map.find(place).map(|place| self.get_idx(place, map)).unwrap_or(V::top())
524 pub fn get_idx(&self, place: PlaceIndex, map: &Map) -> V {
526 StateData::Reachable(values) => {
527 map.places[place].value_index.map(|v| values[v].clone()).unwrap_or(V::top())
529 StateData::Unreachable => V::top(),
534 impl<V: JoinSemiLattice + Clone> JoinSemiLattice for State<V> {
535 fn join(&mut self, other: &Self) -> bool {
536 match (&mut self.0, &other.0) {
537 (_, StateData::Unreachable) => false,
538 (StateData::Unreachable, _) => {
539 *self = other.clone();
542 (StateData::Reachable(this), StateData::Reachable(other)) => this.join(other),
549 locals: IndexVec<Local, Option<PlaceIndex>>,
550 projections: FxHashMap<(PlaceIndex, ProjElem), PlaceIndex>,
551 places: IndexVec<PlaceIndex, PlaceInfo>,
556 pub fn new() -> Self {
558 locals: IndexVec::new(),
559 projections: FxHashMap::default(),
560 places: IndexVec::new(),
565 /// Register all places with suitable types up to a certain derefence depth (to prevent cycles).
566 pub fn register_with_filter<'tcx>(
569 source: &impl HasLocalDecls<'tcx>,
571 mut filter: impl FnMut(Ty<'tcx>) -> bool,
573 let mut projection = Vec::new();
574 for (local, decl) in source.local_decls().iter_enumerated() {
575 self.register_with_filter_rec(
586 fn register_with_filter_rec<'tcx>(
591 projection: &mut Vec<PlaceElem<'tcx>>,
593 filter: &mut impl FnMut(Ty<'tcx>) -> bool,
596 // This might fail if `ty` is not scalar.
597 let _ = self.register_with_ty(local, projection, ty);
600 if let Some(ty::TypeAndMut { ty, .. }) = ty.builtin_deref(false) {
601 projection.push(PlaceElem::Deref);
602 self.register_with_filter_rec(tcx, max_derefs - 1, local, projection, ty, filter);
606 iter_fields(ty, tcx, |variant, field, ty| {
607 if variant.is_some() {
608 // Downcasts are currently not supported.
611 projection.push(PlaceElem::Field(field, ty));
612 self.register_with_filter_rec(tcx, max_derefs, local, projection, ty, filter);
620 projection: &[PlaceElem<'tcx>],
621 ) -> Result<PlaceIndex, ()> {
622 // Get the base index of the local.
624 *self.locals.get_or_insert_with(local, || self.places.push(PlaceInfo::new(None)));
626 // Apply the projection.
627 for &elem in projection {
629 PlaceElem::Downcast(..) => return Err(()),
632 let elem = elem.try_into()?;
633 index = *self.projections.entry((index, elem)).or_insert_with(|| {
634 // Prepend new child to the linked list.
635 let next = self.places.push(PlaceInfo::new(Some(elem)));
636 self.places[next].next_sibling = self.places[index].first_child;
637 self.places[index].first_child = Some(next);
645 pub fn register<'tcx>(
648 projection: &[PlaceElem<'tcx>],
649 decls: &impl HasLocalDecls<'tcx>,
651 ) -> Result<(), ()> {
654 .fold(PlaceTy::from_ty(decls.local_decls()[local].ty), |place_ty, &elem| {
655 place_ty.projection_ty(tcx, elem)
658 let place_ty = Place::ty_from(local, projection, decls, tcx);
659 if place_ty.variant_index.is_some() {
662 self.register_with_ty(local, projection, place_ty.ty)
665 fn register_with_ty<'tcx>(
668 projection: &[PlaceElem<'tcx>],
670 ) -> Result<(), ()> {
672 // Currently, only scalar types are allowed, because they are atomic
673 // and therefore do not require invalidation of parent places.
677 // FIXME: Check that the place is `Freeze`.
679 let place = self.make_place(local, projection)?;
681 // Allocate a value slot if it doesn't have one.
682 if self.places[place].value_index.is_none() {
683 self.places[place].value_index = Some(self.value_count.into());
684 self.value_count += 1;
690 pub fn apply_elem(&self, place: PlaceIndex, elem: ProjElem) -> Option<PlaceIndex> {
691 self.projections.get(&(place, elem)).copied()
694 pub fn find(&self, place: PlaceRef<'_>) -> Option<PlaceIndex> {
695 let mut index = *self.locals.get(place.local)?.as_ref()?;
697 for &elem in place.projection {
698 index = self.apply_elem(index, elem.try_into().ok()?)?;
704 pub fn children(&self, parent: PlaceIndex) -> impl Iterator<Item = PlaceIndex> + '_ {
705 Children::new(self, parent)
708 pub fn preorder_invoke(&self, root: PlaceIndex, f: &mut impl FnMut(PlaceIndex)) {
710 for child in self.children(root) {
711 self.preorder_invoke(child, f);
718 next_sibling: Option<PlaceIndex>,
719 first_child: Option<PlaceIndex>,
720 /// The projection used to go from parent to this node (only None for root).
721 proj_elem: Option<ProjElem>,
722 value_index: Option<ValueIndex>,
726 fn new(proj_elem: Option<ProjElem>) -> Self {
727 Self { next_sibling: None, first_child: None, proj_elem, value_index: None }
731 struct Children<'a> {
733 next: Option<PlaceIndex>,
736 impl<'a> Children<'a> {
737 fn new(map: &'a Map, parent: PlaceIndex) -> Self {
738 Self { map, next: map.places[parent].first_child }
742 impl<'a> Iterator for Children<'a> {
743 type Item = PlaceIndex;
745 fn next(&mut self) -> Option<Self::Item> {
748 self.next = self.map.places[child].next_sibling;
756 // FIXME: See if we can get rid of `Unknown`.
757 pub enum ValueOrPlace<V> {
763 pub enum ValueOrPlaceOrRef<V> {
770 impl<V> From<ValueOrPlace<V>> for ValueOrPlaceOrRef<V> {
771 fn from(x: ValueOrPlace<V>) -> Self {
773 ValueOrPlace::Value(value) => ValueOrPlaceOrRef::Value(value),
774 ValueOrPlace::Place(place) => ValueOrPlaceOrRef::Place(place),
775 ValueOrPlace::Unknown => ValueOrPlaceOrRef::Unknown,
780 pub trait HasBottom {
788 impl<V> HasBottom for FlatSet<V> {
789 fn bottom() -> Self {
794 impl<V> HasTop for FlatSet<V> {
800 /// Currently, we only track places through deref and field projections.
802 /// For now, downcast is not allowed due to aliasing between variants (see #101168).
803 #[derive(Copy, Clone, Debug, PartialEq, Eq, Hash)]
809 impl<V, T> TryFrom<ProjectionElem<V, T>> for ProjElem {
812 fn try_from(value: ProjectionElem<V, T>) -> Result<Self, Self::Error> {
814 ProjectionElem::Deref => Ok(ProjElem::Deref),
815 ProjectionElem::Field(field, _) => Ok(ProjElem::Field(field)),
821 fn iter_fields<'tcx>(
824 mut f: impl FnMut(Option<VariantIdx>, Field, Ty<'tcx>),
828 for (field, ty) in list.iter().enumerate() {
829 f(None, field.into(), ty);
832 ty::Adt(def, substs) => {
833 for (v_index, v_def) in def.variants().iter_enumerated() {
834 for (f_index, f_def) in v_def.fields.iter().enumerate() {
835 let field_ty = f_def.ty(tcx, substs);
837 .try_normalize_erasing_regions(ty::ParamEnv::reveal_all(), field_ty)
838 .unwrap_or(field_ty);
839 f(Some(v_index), f_index.into(), field_ty);
843 ty::Closure(_, substs) => {
844 iter_fields(substs.as_closure().tupled_upvars_ty(), tcx, f);
850 fn debug_with_context_rec<V: Debug + Eq>(
853 new: &IndexVec<ValueIndex, V>,
854 old: Option<&IndexVec<ValueIndex, V>>,
856 f: &mut Formatter<'_>,
857 ) -> std::fmt::Result {
858 if let Some(value) = map.places[place].value_index {
860 None => writeln!(f, "{}: {:?}", place_str, new[value])?,
862 if new[value] != old[value] {
863 writeln!(f, "\u{001f}-{}: {:?}", place_str, old[value])?;
864 writeln!(f, "\u{001f}+{}: {:?}", place_str, new[value])?;
870 for child in map.children(place) {
871 let info_elem = map.places[child].proj_elem.unwrap();
872 let child_place_str = match info_elem {
873 ProjElem::Deref => format!("*{}", place_str),
874 ProjElem::Field(field) => {
875 if place_str.starts_with("*") {
876 format!("({}).{}", place_str, field.index())
878 format!("{}.{}", place_str, field.index())
882 debug_with_context_rec(child, &child_place_str, new, old, map, f)?;
888 fn debug_with_context<V: Debug + Eq>(
889 new: &IndexVec<ValueIndex, V>,
890 old: Option<&IndexVec<ValueIndex, V>>,
892 f: &mut Formatter<'_>,
893 ) -> std::fmt::Result {
894 for (local, place) in map.locals.iter_enumerated() {
895 if let Some(place) = place {
896 debug_with_context_rec(*place, &format!("{:?}", local), new, old, map, f)?;
902 impl<'tcx, T> DebugWithContext<ValueAnalysisWrapper<T>> for State<T::Value>
904 T: ValueAnalysis<'tcx>,
907 fn fmt_with(&self, ctxt: &ValueAnalysisWrapper<T>, f: &mut Formatter<'_>) -> std::fmt::Result {
909 StateData::Reachable(values) => debug_with_context(values, None, ctxt.0.map(), f),
910 StateData::Unreachable => write!(f, "unreachable"),
917 ctxt: &ValueAnalysisWrapper<T>,
918 f: &mut Formatter<'_>,
919 ) -> std::fmt::Result {
920 match (&self.0, &old.0) {
921 (StateData::Reachable(this), StateData::Reachable(old)) => {
922 debug_with_context(this, Some(old), ctxt.0.map(), f)
924 _ => Ok(()), // Consider printing something here.