<Dd> An event E causally depends on an event C if, and only if, (i) if C had occurred, then E would have occurred, and (ii) if C had not occurred, then E would not have occurred. </Dd> <P> Causation is then defined as a chain of causal dependence. That is, C causes E if and only if there exists a sequence of events C, D_1, D_2, ..., D_n, E such that each event in the sequence causally depends on the previous one. </P> <P> Note that the analysis does not purport to explain how we make causal judgements or how we reason about causation, but rather to give a metaphysical account of what it is for a causal relation to hold between some pair of events. If correct, the analysis has the power to explain certain features of causation. Knowing that causation is a matter of counterfactual dependence, we may reflect on the nature of counterfactual dependence to account for the nature of causation. For example, in his paper "Counterfactual Dependence and Time's Arrow," Lewis sought to account for the time-directedness of counterfactual dependence in terms of the semantics of the counterfactual conditional. If correct, this theory can explain a fundamental feature of our experience: that we can causally affect the future but not the past. </P> <P> Interpreting causation as a deterministic relation means that if A causes B, then A must always be followed by B. In this sense, war does not cause deaths, nor does smoking cause cancer or emphysema. As a result, many turn to a notion of probabilistic causation. Informally, A ("The person is a smoker") probabilistically causes B ("The person has now or will have cancer at some time in the future") if the information that A occurred increases the likelihood of B's occurrence. Formally, P(B|A) ≥ P(B), where P(B|A) is the conditional probability that B will occur given the information that A occurred, and P(B) is the probability that B will occur with no knowledge of whether A did or did not occur.
This intuitive condition is not adequate as a definition of probabilistic causation because it is too general and thus does not match our intuitive notion of cause and effect. For example, if A denotes the event "The person is a smoker," B denotes the event "The person now has or will have cancer at some time in the future," and C denotes the event "The person now has or will have emphysema at some time in the future," then the following three relationships hold: P(B|A) ≥ P(B), P(C|A) ≥ P(C), and P(B|C) ≥ P(B). The last relationship states that knowing that the person has emphysema increases the likelihood that he will have cancer. The reason is that the information that the person has emphysema increases the likelihood that the person is a smoker, which in turn increases the likelihood that the person will have cancer. However, we would not want to conclude that having emphysema causes cancer. Thus, we need additional conditions, such as the temporal relationship of A to B and a rational explanation of the mechanism of action. This last requirement is hard to quantify, so different authors prefer somewhat different definitions. The publications by Anderson and Vastag (2004), Lauría and Duchessi (2006), Gupta and Kim (2008), Lee et al. (2011), and Cardenas, Voordijk, and Dewulf (2017) give a number of examples of tests for probabilistic causation assertions in applications in different fields. </P>
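The three relationships above can be reproduced numerically with a simple common-cause model, in which smoking (A) raises the probability of both cancer (B) and emphysema (C), while B and C are conditionally independent given A, so neither causes the other. This is a minimal sketch; all probability values are hypothetical, chosen only for illustration.

```python
# Common-cause sketch: A = smoker, B = cancer, C = emphysema.
# Hypothetical assumption: B and C are conditionally independent given A,
# i.e. smoking promotes each of them, but neither causes the other.
# All numbers below are made up for illustration.

p_A = 0.30                                  # P(A): person is a smoker
p_B_given_A, p_B_given_notA = 0.20, 0.05    # cancer rates by smoking status
p_C_given_A, p_C_given_notA = 0.25, 0.02    # emphysema rates by smoking status

# Marginals via the law of total probability.
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)   # = 0.095
p_C = p_C_given_A * p_A + p_C_given_notA * (1 - p_A)   # = 0.089

# P(B and C), using conditional independence of B and C given A.
p_BC = (p_B_given_A * p_C_given_A * p_A
        + p_B_given_notA * p_C_given_notA * (1 - p_A))

p_B_given_C = p_BC / p_C

print(f"P(B|A) = {p_B_given_A:.3f} >= P(B) = {p_B:.3f}")  # smoking raises cancer risk
print(f"P(C|A) = {p_C_given_A:.3f} >= P(C) = {p_C:.3f}")  # smoking raises emphysema risk
print(f"P(B|C) = {p_B_given_C:.3f} >= P(B) = {p_B:.3f}")  # yet C does not cause B
```

In this model P(B|C) ≈ 0.176 exceeds P(B) = 0.095 even though no causal arrow runs from emphysema to cancer: observing emphysema is evidence of smoking, which in turn raises the probability of cancer, exactly the indirect route described in the text.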

Are causal relations between events a feature of the human mind