The weighted stochastic simulation algorithm (wSSA) recently introduced by Kuwahara and Mura [J. Chem. Phys. 129, 165101 (2008)] is an innovative variation on the stochastic simulation algorithm (SSA). It enables one to estimate, with much less computational effort than was previously thought possible using a Monte Carlo simulation procedure, the probability that a specified event will occur in a chemically reacting system within a specified time when that probability is very small. This paper presents some procedural extensions to the wSSA that enhance its effectiveness in practical applications. The paper also attempts to clarify some theoretical issues connected with the wSSA, including its connection to first passage time theory and its relation to the SSA.

## REFERENCES

*no* importance sampling is used can also be deduced through the following line of reasoning: Abbreviating $p(x_0,E;t)\equiv p$, the $n$ runs are analogous to $n$ tosses of a coin that has probability $p$ of coming up successful. We know from elementary statistics that the number of successful runs should then be the *binomial* (or Bernoulli) random variable with mean $np$ and variance $np(1-p)$. When $n$ is very large, that binomial random variable can be approximated by the normal random variable with the same mean and variance. Multiplying that random variable by $n^{-1}$ gives the *fraction* of the $n$ runs that are successful. Random variable theory tells us that it too will be (approximately) normal, but with mean $n^{-1}\cdot np=p$ and variance $(n^{-1})^2\, np(1-p)=p(1-p)/n$, and hence standard deviation $\sqrt{p(1-p)/n}$. The latter, with $p=m_n/n$, is precisely uncertainty (9a). Essentially this argument was given in Appendix B of Ref. 1. But there is apparently no way to generalize this line of reasoning to the case where the weights of the successful runs are not all unity; hence the need for the procedure described in the text.
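The binomial-to-normal argument above can be checked numerically: if each batch of $n$ unweighted runs yields an estimate $m_n/n$, then the spread of that estimate over many independent batches should match $\sqrt{p(1-p)/n}$. The following is a minimal sketch of such a check, not part of the original paper; the function name, the chosen $p$, and the batch sizes are illustrative assumptions.

```python
import math
import random

def estimate_success_fraction(p_true, n, rng):
    """Simulate n unweighted runs, each 'successful' with probability p_true,
    and return the estimated probability p_hat = m_n / n together with the
    one-standard-deviation uncertainty sqrt(p_hat * (1 - p_hat) / n)."""
    m_n = sum(1 for _ in range(n) if rng.random() < p_true)  # successful runs
    p_hat = m_n / n
    return p_hat, math.sqrt(p_hat * (1.0 - p_hat) / n)

rng = random.Random(42)           # fixed seed so the sketch is reproducible
p_true, n, batches = 0.02, 10_000, 500

# Spread of p_hat across many independent batches of n runs.
estimates = [estimate_success_fraction(p_true, n, rng)[0] for _ in range(batches)]
mean_est = sum(estimates) / batches
empirical_sd = math.sqrt(sum((e - mean_est) ** 2 for e in estimates) / batches)
theoretical_sd = math.sqrt(p_true * (1.0 - p_true) / n)

print(f"empirical sd  = {empirical_sd:.2e}")
print(f"theoretical sd = {theoretical_sd:.2e}")
```

The two printed values should agree to within the sampling error of the batch experiment itself, which is the content of uncertainty (9a) for the unweighted case. The point of the surrounding note is that no such closed-form check is available once the successful runs carry unequal importance-sampling weights.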