Pinboard Popular: popular items from Pinboard
http://feeds.pinboard.in
Mon, 16 Dec 2019 11:00:53 +0100

AWS Events Content
https://aws.amazon.com/events/events-content/?awsf_filter-series=%2Aall&awsm_page-cards=1
Mon, 16 Dec 2019 06:47:00 +0100

Experience our event content at your fingertips. Explore, view, and download presentation decks from your favorite sessions and discover what’s new. Learn from AWS experts, customers, and partners to continue your educational journey in the cloud.

For the third time this year, GOP rejects election-security bill | MSNBC
Mon, 16 Dec 2019 06:47:00 +0100

Over the summer, a pair of senators – one Democrat and one Republican – partnered on a new election-security proposal, the Defending Elections from Threats by Establishing Redlines (DETER) Act. The idea behind Sens. Chris Van Hollen’s (D-Md.) and Marco Rubio’s (R-Fla.) bill was pretty straightforward: if U.S. intelligence agencies were to determine that Russia interfered in another federal election, new sanctions would kick in targeting Russia’s finance, defense, and energy sectors.

The point, obviously, would be to create a disincentive, letting the Kremlin know in advance that Russia would face significant economic consequences if Moscow once again attacked our democratic institutions.

The bill picked up a bipartisan group of co-sponsors, and it seemed like the sort of proposal that might even have a chance in the Republican-led Senate. At least that was the hope before it was blocked yesterday on the Senate floor. Axios reported:

A Republican senator is blocking bipartisan legislation meant to counter foreign election interference, saying it is more anti-Trump than anti-Russia.

Sen. Mike Crapo (R-Idaho) objected Tuesday when Sen. Chris Van Hollen (D-Md.) sought consent to pass the DETER bill, as reported by The Hill…. The stalled legislation comes as U.S. intelligence agencies predict Russia and other foreign countries will attempt to interfere in the 2020 election.

Van Hollen, the lead sponsor, explained, “This has nothing to do with President Trump, this has to do with protecting our elections.” Crapo, the chairman of the Senate Banking Committee, was unmoved.

“The mechanisms in this bill have been designed more to attack the Trump administration and Republicans than to attack the Russians and those who would attack our country and our elections,” the Idaho Republican argued.

I’m not altogether sure how Crapo arrived at that conclusion, or why exactly he believes a bipartisan proposal to impose sanctions on Russia would, as a practical matter, effectively represent an “attack” on the Trump administration and Republicans.

What’s more, if these circumstances seem familiar, it’s because the DETER Act isn’t the only election-security measure to be rejected by Senate Republicans.

As regular readers may recall, the Democratic-led House passed the “Securing America’s Federal Elections Act” (SAFE Act), which would, among other things, require voting systems to use backup paper ballots, mandate tech safeguards, and provide resources to states to improve their election-security measures.

In October, however, when Sens. Mark Warner (D-Va.), Amy Klobuchar (D-Minn.), and Ron Wyden (D-Ore.) tried to pass a package of election-related measures – including a Senate companion to the SAFE Act – Sen. Marsha Blackburn (R-Tenn.) blocked the effort.

Soon after, the House also passed the Stopping Harmful Interference in Elections for a Lasting Democracy (SHIELD) Act, which would, among other things, require candidates to notify law enforcement authorities in the event of a foreign power offering campaign assistance.

Senate Majority Leader Mitch McConnell (R-Ky.) – who picked up the “Moscow Mitch” moniker after balking at other bills on election security – said his GOP-led chamber would ignore this bill, too.

Occasionally, Republican lawmakers will make the case that they’re genuinely interested in doing something on the issue, especially in the face of warnings from U.S. officials about the likelihood of further Russian intervention, but they simply cannot go along with the Democratic-led proposals, even the ones with GOP co-sponsors.

In fact, even Crapo, while blocking a bipartisan bill yesterday and claiming the bipartisan bill wasn’t bipartisan enough, said “maybe” senators can “come together” at some point to pass an election-security bill.

What it would take to make Republicans happy as they reject one bill after another is unclear.

http://www.msnbc.com/rachel-maddow-show/the-third-time-year-gop-rejects-election-security-bill

Sex Differences in Personality are Large and Important - Marginal REVOLUTION
Mon, 16 Dec 2019 06:47:00 +0100

Men and women are different. A seemingly obvious fact to most of humanity, but a long-time subject of controversy within psychology. New large-scale results using better empirical methods are resolving the debate, however, in favor of the person in the street. The basic story is that at the broadest level (OCEAN) differences are relatively small, but that is because there are large offsetting differences between men and women at lower levels of aggregation. Scott Barry Kaufman, writing at Scientific American, has a very good review of the evidence:

At the broad level, we have traits such as extraversion, neuroticism, and agreeableness. But when you look at the specific facets of each of these broad factors, you realize that there are some traits that males score higher on (on average), and some traits that females score higher on (on average), so the differences cancel each other out. This canceling out gives the appearance that sex differences in personality don’t exist when in reality they very much do exist.

For instance, males and females on average don’t differ much on extraversion. However, at the narrow level, you can see that males on average are more assertive (an aspect of extraversion) whereas females on average are more sociable and friendly (another aspect of extraversion). So what does the overall picture look like for males and females on average when going deeper than the broad level of personality?

On average, males tend to be more dominant, assertive, risk-prone, thrill-seeking, tough-minded, emotionally stable, utilitarian, and open to abstract ideas. Males also tend to score higher on self-estimates of intelligence, even though sex differences in general intelligence measured as an ability are negligible [2]. Men also tend to form larger, competitive groups in which hierarchies tend to be stable and in which individual relationships tend to require little emotional investment. In terms of communication style, males tend to use more assertive speech and are more likely to interrupt people (both men and women) more often – especially intrusive interruptions – which can be interpreted as a form of dominant behavior.

…In contrast, females, on average, tend to be more sociable, sensitive, warm, compassionate, polite, anxious, self-doubting, and more open to aesthetics. On average, women are more interested in intimate, cooperative dyadic relationships that are more emotion-focused and characterized by unstable hierarchies and strong egalitarian norms. Where aggression does arise, it tends to be more indirect and less openly confrontational. Females also tend to display better communication skills, displaying higher verbal ability and the ability to decode other people’s nonverbal behavior. Women also tend to use more affiliative and tentative speech in their language, and tend to be more expressive in both their facial expressions and bodily language (although men tend to adopt a more expansive, open posture). On average, women also tend to smile and cry more frequently than men, although these effects are very contextual and the differences are substantially larger when males and females believe they are being observed than when they believe they are alone.

Moreover, the differences in the subcategories are all correlated, so while one might argue that the differences on any single subcategory are small, when you put them all together the differences between male and female personalities are large and systematic.

Relatively small differences across multiple traits can add up to substantial differences when considered as a whole profile of traits. Take the human face, for example. If you were to just take a particular feature of the face – such as mouth width, forehead height, or eye size – you would have difficulty differentiating between a male face and a female face. You simply can’t tell a male eyeball from a female eyeball, for instance. However, a look at the combination of facial features produces two very distinct clusters of male vs. female faces. In fact, observers can correctly determine sex from pictures with greater than 95% accuracy [4]. Here’s an interesting question: does the same apply to the domain of personality?

…There now exist four large-scale studies that use this multivariate methodology (see here, here, here, and here). All four studies are conducted cross-culturally and report on an analysis of narrow personality traits (which, as you may recall, is where most of the action is when it comes to sex differences). Critically, all four studies converge on the same basic finding: when looking at the overall gestalt of human personality, there is a truly striking difference between the typical male and female personality profiles.

Just how striking? Well, actually, really striking. In one recent study, Tim Kaiser, Marco Del Giudice, and Tom Booth analyzed personality data from 31,637 people across a number of English-speaking countries. The size of global sex differences was D = 2.10 (it was D = 2.06 for just the United States). To put this number in context, a D= 2.10 means a classification accuracy of 85%. In other words, their data suggests that the probability that a randomly picked individual will be correctly classified as male or female based on knowledge of their global personality profile is 85% (after correcting for the unreliability of the personality tests).
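The arithmetic linking D to classification accuracy can be sketched directly. Under the standard assumption of two equal-covariance normal groups separated by Mahalanobis distance D, the optimal classifier's accuracy is Φ(D/2), where Φ is the standard normal CDF; this is a sketch of that relationship, not code from the study itself.

```python
from math import erf, sqrt

def normal_cdf(x: float) -> float:
    """Standard normal CDF computed via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def classification_accuracy(d: float) -> float:
    """Optimal classification accuracy for two equal-covariance normal
    groups separated by Mahalanobis distance d: Phi(d / 2)."""
    return normal_cdf(d / 2.0)

print(round(classification_accuracy(2.10), 3))  # global sample, D = 2.10
print(round(classification_accuracy(2.06), 3))  # U.S. sample, D = 2.06
```

Both values land at roughly 0.85, matching the 85% figure quoted in the text.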

In other words, you can predict whether a person is male or female from their personality traits almost as well as by looking at their face. Overall, the big differences are as follows:

Consistent with prior research, the researchers found that the following traits are most exaggerated among females when considered separately from the rest of the gestalt: sensitivity, tender-mindedness, warmth, anxiety, appreciation of beauty, and openness to change. For males, the most exaggerated traits were emotional stability, assertiveness/dominance, dutifulness, conservatism, and conformity to social hierarchy and traditional structure.

I have also pointed out that gender equality magnifies differences in gender choices and behavior which is probably one reason why fewer women enter STEM fields in societies with greater equality. Consistent with this, personality differences between the sexes are large in all cultures but “for all of these personality effects the sex differences tend to be larger– not smaller– in more individualistic, gender-egalitarian countries.”

Addendum: See John Nye and co-authors on testosterone and finger length for some biological correlations.

https://marginalrevolution.com/marginalrevolution/2019/12/sex-differences-in-personality-are-large-and-important.html

Ex-Breitbart editor says Stephen Miller is a white supremacist, and she was too - CNNPolitics
Mon, 16 Dec 2019 06:47:00 +0100

(CNN) She was already a racist when she took a publishing job in Washington, DC. But when she became a reporter for Breitbart News, Katie McHugh says she was taken to new depths of hate with the help of Stephen Miller.

Emails show the two were in frequent contact between 2015 and 2016 while he was working for then Sen. Jeff Sessions and later on the Trump presidential campaign.

McHugh says on Miller's way to the White House, where he is now a senior adviser deeply involved in shaping immigration policy in consonance with his hardline views, he was constantly sending her far-right material, encouraging her to use their arguments in her articles.

Former Breitbart reporter Katie McHugh said she didn't think about the people she hurt at the time.

McHugh was a willing acolyte of Miller who she says further radicalized her.


"I was a white nationalist," she says. "Whatever you want to call it -- white nationalist, white supremacist. But that part [of me] is dead."

She says that in private, Miller showed his true colors, pushing white supremacist ideas that echoed his hardline views on restricting immigration, in order to get them onto Breitbart's website.

McHugh has shared several hundred emails with the Southern Poverty Law Center and now some with CNN showing Miller's contacts from 2015 to 2016. She says Miller stopped reaching out once he got a position in the White House.

She agreed to a CNN interview -- her first on camera -- because she wants to sound the alarm about Miller.

She says she is doing this as part of her own journey to healing and repentance for the life she used to live and the people she hurt.

Miller has not responded to a detailed request for comment. He has never denied the veracity of the emails.

The White House has not commented on McHugh's interview. When the emails were first revealed last month a White House spokesperson told CNN: "While Mr. Miller condemns racism and bigotry in all forms, those defaming him are trying to deny his Jewish identity, which is a pernicious form of anti-Semitism."

McHugh says she was introduced to Miller in June 2015 by Breitbart colleague Matthew Boyle when she became a reporter, after editing the site's homepage and stories. She was 23 at the time.

Miller was interviewed by Steve Bannon during his coverage of the 2016 New Hampshire primary.

"Miller was introduced to me as someone that I would take editorial direction from as I was reporting on the immigration beat and criminal justice beat," she says. "It was not like, 'Here's someone from a Senate office, he may pitch you stories.' It was understood that Miller had editorial control over the political section," she alleges.

Elizabeth Moore, vice president of public relations and communications for Breitbart, said in a statement to CNN: "This person (McHugh) was fired years ago for a multitude of reasons, including lying, and now you have an even better idea why she was fired. Having said that, it is not exactly a newsflash that political staffers pitch stories to journalists -- sometimes those pitches are successful, sometimes not."

McHugh said Miller would point her towards crimes committed by undocumented migrants, such as the killing of Kate Steinle in San Francisco, with the subtext that curbing immigration from certain countries would cut crime. And she would seek out his opinion too.

In October 2015, McHugh asked Miller if he thought a natural disaster in Mexico could drive people to the US border. He replied: "100 percent," according to emails McHugh gave to the SPLC and then CNN.

He then raised the possibility that those potential migrants could be allowed to stay in the US with Temporary Protected Status (TPS) -- the special category given to Haitian survivors of the devastating 2010 earthquake among others.

TPS is given to citizens of countries who are unable to safely return because of an environmental disaster, a war or extraordinary conditions that are temporary.

"Wow. Ok. Is there precedent for this?" McHugh asked, to which Miller responded with a link to an article on an extremist website that promotes the racist "great replacement" theory that white people are facing genocide.

McHugh told CNN: "I do want to emphasize ... that those emails are now White House policy."

The Trump administration decided not to offer the humanitarian relief of TPS to survivors of Hurricane Dorian that laid waste to some of the Bahamas this summer. The US is also in the process of rescinding TPS previously granted to people from El Salvador, Haiti, Honduras, Nepal, Nicaragua and Sudan. The administration has said the original dire conditions are no longer present.

Calling someone a white supremacist is a very strong and personal attack, but McHugh does not hesitate to denounce Miller based on what she knows of him and how she saw in him a kindred spirit when she was on a racist path.

"I would absolutely call him a white supremacist," she says. His driving ideology is "white supremacy and anti-immigration especially," she adds.

McHugh herself once followed the same hate as Miller. When she first moved to Washington, she dated a white nationalist and they and their friends would hang out in a home they dubbed "the house of hate."

She tweeted virulently racist and Islamophobic statements but says she was still having fun and a social life.

When she went to work for Breitbart, she says she became more isolated -- working long hours remotely by herself and that helped to make her susceptible to what she calls her "radicalization" by Miller and others.

"My world got more sealed off and I got much more intense. I got prideful, and I got angrier and angrier," she says. "Unless you stop, you know, objects in motion, stay in motion. It just gets worse."

At the time, she was enjoying the success and being close to a policy maker whom she said also had the ear of then Executive Chairman Steve Bannon and other leaders at Breitbart.

Stephen Miller and Steve Bannon, here with deputy National Security Adviser Dina Powell in 2017, both ended up working for President Trump.

"It's very exciting to shape the news," McHugh says. "I wasn't self-aware enough ... to realize like what I was doing was extremely harmful. I hoped that we would bounce ideas off each other and ... it was nice to be able to talk to someone because I was very isolated and I had hoped ... we could be kind of friends."

That didn't happen, McHugh says, but she and Miller remained in repeated, sometimes almost constant, contact.

"We spoke so frequently and we were friendly to each other, but it was never like, there wasn't a friendship there because it wasn't like, 'Hey, how's your day going?'"

And McHugh admits she was traveling further and further down the rabbit hole of intolerance, sending out vile tweets that eventually led to her firing after they were highlighted by CNN among others.

McHugh now says getting fired from Breitbart was the best thing that could have happened to her. She took jobs with extremely far right websites soon after her firing, but was eventually let go from those jobs too.

"I was able to break away from what was frankly a toxic culture and a radicalization machine, especially for young people like me."

Miller has been instrumental in immigration policies.

She says she first began to see Miller had feet of clay a few weeks after being fired when he had a contentious exchange with CNN's Jim Acosta over US immigration and the poem on the Statue of Liberty that calls out for "your tired, your poor, your huddled masses yearning to breathe free."

"You see him trashing the Emma Lazarus poem at the bottom of the Statue of Liberty," she says. "It struck me as odd that he would direct such, like, vitriol about welcoming like the most desperate people in the world into a better country for that, like a safer place."

When Acosta suggested Miller was "trying to engineer the racial and ethnic flow of people into this country," he hit back, accusing the reporter of "cosmopolitan bias." Critics noted that phrase has been used to counter arguments by racist regimes for decades.

McHugh describes shedding her white supremacist views as akin to "pulling shrapnel from my brain."

Once vilified by the center and the left, she is now a target for all sides, but she knows she is on her journey away from being a white supremacist.

"I just think it's important to speak about this publicly because people need to know," she says. "It's a serious danger and I see other people younger than me going down that same path."

More than 100 Democrats in the House and 27 senators have called for Miller to go.

But McHugh says, "Not a single Republican has called for Miller's resignation. That should terrify us as a country."

McHugh's politics have now swung to the left. She has even donated to Democratic Sen. Bernie Sanders' presidential campaign, partly because of his calls for "Medicare for All," she says.

She has diabetes, alopecia and other medical issues. She has no permanent home right now and only a part-time job but says she would like to give more to Sanders' campaign if she becomes able.

At one point she breaks down. Her shoulders shaking. Tears welling up in her eyes. She apologizes for the harm she says she has caused. She says she is now doing what her Catholic upbringing has taught her. Making amends and repenting. She wants Miller to do the same. And resign.

"I was in a very dark, very small world and I was a very angry person."

CNN's Mallory Simon contributed to this story.

https://www.cnn.com/2019/12/13/politics/katie-mchugh-stephen-miller/index.html

Stochastic processes book
Mon, 16 Dec 2019 06:47:00 +0100

(with 33 illustrations)
Gordan Žitković
Department of Mathematics
The University of Texas at Austin

Chapter 1
Probability review
The probable is what usually happens.
Aristotle
It is a truth very certain that when it is not in our power to determine what is
true, we ought to follow what is most probable.
Descartes - Discourse on Method
It is remarkable that a science which began with the consideration of games of
chance should have become the most important object of human knowledge.
Pierre Simon Laplace - Théorie Analytique des Probabilités, 1812
Anyone who considers arithmetic methods of producing random digits is, of
course, in a state of sin.
John von Neumann - quoted in Comic Sections by D. MacHale
I say unto you: a man must have chaos yet within him to be able to give birth to
a dancing star: I say unto you: ye have chaos yet within you . . .
Friedrich Nietzsche - Thus Spake Zarathustra
1.1 Random variables

Probability is about random variables. Instead of giving a precise definition, let us just mention that a random variable can be thought of as an uncertain, numerical (i.e., with values in $\mathbb{R}$) quantity. While it is true that we do not know with certainty what value a random variable $X$ will take, we usually know how to compute the probability that its value will be in some subset of $\mathbb{R}$. For example, we might be interested in $P[X \geq 7]$, $P[X \in [2, 3.1]]$ or $P[X \in \{1, 2, 3\}]$. The collection of all such probabilities is called the distribution of $X$. One has to be very careful not to confuse the random variable itself and its distribution. This point is particularly important when several random variables appear at the same time. When two random variables $X$ and $Y$ have the same distribution, i.e., when $P[X \in A] = P[Y \in A]$ for any set $A$, we say that $X$ and $Y$ are equally distributed and write $X \stackrel{(d)}{=} Y$.

1.2 Countable sets

Almost all random variables in this course will take only countably many values, so it is probably a good idea to review briefly what the word countable means. As you might know, the countable infinity is one of many different infinities we encounter in mathematics. Simply, a set is countable if it has the same number of elements as the set $\mathbb{N} = \{1, 2, \dots\}$ of natural numbers. More precisely, we say that a set $A$ is countable if there exists a function $f : \mathbb{N} \to A$ which is bijective (one-to-one and onto). You can think of $f$ as the correspondence that proves that there are exactly as many elements of $A$ as there are elements of $\mathbb{N}$. Alternatively, you can view $f$ as an ordering of $A$; it arranges $A$ into a particular order $A = \{a_1, a_2, \dots\}$, where $a_1 = f(1)$, $a_2 = f(2)$, etc.
Infinities are funny, however, as the following example shows.

Example 1.1.

1. $\mathbb{N}$ itself is countable; just use $f(n) = n$.

2. $\mathbb{N}_0 = \{0, 1, 2, 3, \dots\}$ is countable; use $f(n) = n - 1$. You can see here why I think that infinities are funny; the set $\mathbb{N}_0$ and the set $\mathbb{N}$ - which is its proper subset - have the same size.

3. $\mathbb{Z} = \{\dots, -2, -1, 0, 1, 2, 3, \dots\}$ is countable; now the function $f$ is a bit more complicated:
$$f(k) = \begin{cases} 2k + 1, & k \geq 0, \\ -2k, & k < 0. \end{cases}$$
You could think that $\mathbb{Z}$ is more than twice as large as $\mathbb{N}$, but it is not. It is the same size.

4. It gets even weirder. The set $\mathbb{N} \times \mathbb{N} = \{(m, n) : m \in \mathbb{N},\ n \in \mathbb{N}\}$ of all pairs of natural numbers is also countable. I leave it to you to construct the function $f$.

5. A similar argument shows that the set $\mathbb{Q}$ of all rational numbers (fractions) is also countable.

6. The set $[0, 1]$ of all real numbers between $0$ and $1$ is not countable; this fact was first proven by Georg Cantor, who used a neat trick called the diagonal argument.
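The bijection from item 3 can be checked computationally. This sketch (not part of the notes) verifies on a finite window that $f$ sends nonnegative integers to odd numbers and negative integers to even numbers, hitting every natural number exactly once.

```python
def f(k: int) -> int:
    """Bijection from Z onto N = {1, 2, 3, ...}:
    nonnegative k -> odd numbers, negative k -> even numbers."""
    return 2 * k + 1 if k >= 0 else -2 * k

# Check injectivity and surjectivity on a finite window of Z.
ks = range(-100, 101)
values = {f(k) for k in ks}
assert len(values) == len(ks)         # no two k's collide (one-to-one)
assert values == set(range(1, 202))   # exactly the numbers 1..201 are hit

# Reading Z off in the order a_1 = f^{-1}(1), a_2 = f^{-1}(2), ...
# arranges it as 0, -1, 1, -2, 2, ...
```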
1.3 Discrete random variables
A random variable is said to be discrete if it takes at most countably many values. More precisely,
X is said to be discreteif there exists a niteorcountable setS Rsuch that P[X 2S] = 1 ,
i.e., if we know with certainty that the only values Xcan take are those in S. The smallest set S
with that property is called the supportofX. If we want to stress that the support corresponds
to the random variable X, we write
X.
Some supports appear more often then the others:
1. If Xtakes only the values 1;2 ;3 ; : : : , we say that XisN-valued .
2. If we allow 0(in addition to N), so that P[X 2N
0] = 1
, we say that XisN
0-valued Last Updated: December 24, 2010
5Intro to Stochastic Processes: Lecture Notes

CHAPTER 1. PROBABILITY REVIEW
3. Sometimes, it is convenient to allow discrete random variables to take the value $+\infty$. This is mostly the case when we model the waiting time until the first occurrence of an event which may or may not ever happen. If it never happens, we will be waiting forever, and the waiting time will be $+\infty$. In those cases - when $S = \{1, 2, 3, \dots, +\infty\} = \mathbb{N} \cup \{+\infty\}$ - we say that the random variable is extended $\mathbb{N}$-valued. The same applies to the case of $\mathbb{N}_0$ (instead of $\mathbb{N}$), and we talk about the extended $\mathbb{N}_0$-valued random variables. Sometimes the adjective extended is left out, and we talk about $\mathbb{N}_0$-valued random variables, even though we allow them to take the value $+\infty$. This sounds more confusing than it actually is.

4. Occasionally, we want our random variables to take values which are not necessarily numbers (think about $H$ and $T$ as the possible outcomes of a coin toss, or the suit of a randomly chosen playing card). If the collection of all possible values (like $\{H, T\}$ or $\{\heartsuit, \spadesuit, \clubsuit, \diamondsuit\}$) is countable, we still call such random variables discrete. We will see more of that when we start talking about Markov chains.
Discrete random variables are very nice due to the following fact: in order to be able to compute any conceivable probability involving a discrete random variable $X$, it is enough to know how to compute the probabilities $P[X = x]$, for all $x \in S$. Indeed, if we are interested in figuring out how much $P[X \in B]$ is, for some set $B \subseteq \mathbb{R}$ ($B = [3, 6]$, or $B = [-2, 1)$), we simply pick all $x \in S$ which are also in $B$ and sum their probabilities. In mathematical notation, we have
$$P[X \in B] = \sum_{x \in S \cap B} P[X = x].$$
For this reason, the distribution of any discrete random variable $X$ is usually described via a table
$$X \sim \begin{pmatrix} x_1 & x_2 & x_3 & \dots \\ p_1 & p_2 & p_3 & \dots \end{pmatrix},$$
where the top row lists all the elements of $S$ (the support of $X$) and the bottom row lists their probabilities ($p_i = P[X = x_i]$, $i \in \mathbb{N}$). When the random variable is $\mathbb{N}$-valued (or $\mathbb{N}_0$-valued), the situation is even simpler because we know what $x_1, x_2, \dots$ are, and we identify the distribution of $X$ with the sequence $p_1, p_2, \dots$ (or $p_0, p_1, p_2, \dots$ in the $\mathbb{N}_0$-valued case), which we call the probability mass function (pmf) of the random variable $X$. What about the extended $\mathbb{N}_0$-valued case? It is just as simple, because we can compute the probability $P[X = +\infty]$ if we know all the probabilities $p_i = P[X = i]$, $i \in \mathbb{N}_0$. Indeed, we use the fact that
$$P[X = 0] + P[X = 1] + \dots + P[X = +\infty] = 1,$$
so that $P[X = +\infty] = 1 - \sum_{i=0}^{\infty} p_i$, where $p_i = P[X = i]$. In other words, if you are given a probability mass function $(p_0, p_1, \dots)$, you simply need to compute the sum $\sum_{i=0}^{\infty} p_i$. If it happens to be equal to $1$, you can safely conclude that $X$ never takes the value $+\infty$. Otherwise, the probability of $+\infty$ is positive.
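The recipe above translates directly into code. In this sketch (not from the notes, with a made-up pmf table), $P[X \in B]$ is a sum over $S \cap B$, and in the extended case $P[X = +\infty]$ is whatever mass the finite values leave over.

```python
# Hypothetical pmf table: support values mapped to their probabilities.
pmf = {1: 0.5, 2: 0.25, 3: 0.125}   # sums to 0.875, so P[X = +inf] = 0.125

def prob(pmf: dict, B) -> float:
    """P[X in B]: sum P[X = x] over all x in the support that lie in B."""
    return sum(p for x, p in pmf.items() if x in B)

def prob_infinity(pmf: dict) -> float:
    """P[X = +inf]: whatever probability the finite values leave over."""
    return 1.0 - sum(pmf.values())

print(prob(pmf, {1, 3}))     # P[X in {1, 3}] = 0.5 + 0.125 = 0.625
print(prob_infinity(pmf))    # 1 - 0.875 = 0.125
```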
The random variables for which $S = \{0, 1\}$ are especially useful. They are called indicators. The name comes from the fact that you should think of such variables as signal lights; if $X = 1$, an event of interest has happened, and if $X = 0$, it has not happened. In other words, $X$ indicates the occurrence of an event. The notation we use is quite suggestive; for example, if $Y$ is the outcome of a coin toss, and we want to know whether Heads ($H$) occurred, we write $X = \mathbf{1}_{\{Y = H\}}$.

Example 1.2. Suppose that two dice are thrown, so that $Y_1$ and $Y_2$ are the numbers obtained (both $Y_1$ and $Y_2$ are discrete random variables with $S = \{1, 2, 3, 4, 5, 6\}$). If we are interested in the probability that their sum is at least $9$, we proceed as follows. We define the random variable $Z$ - the sum of $Y_1$ and $Y_2$ - by $Z = Y_1 + Y_2$. Another random variable, let us call it $X$, is defined by $X = \mathbf{1}_{\{Z \geq 9\}}$, i.e.,
$$X = \begin{cases} 1, & Z \geq 9, \\ 0, & Z < 9. \end{cases}$$
With such a set-up, $X$ signals whether the event of interest has happened, and we can state our original problem in terms of $X$: compute $P[X = 1]$! Can you compute it?
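A quick brute-force check of the example (a sketch, not from the notes): enumerate all 36 equally likely outcomes and count those with $Z \geq 9$.

```python
from fractions import Fraction
from itertools import product

# All equally likely outcomes (Y1, Y2) of the two dice throws.
outcomes = list(product(range(1, 7), repeat=2))
favorable = sum(1 for y1, y2 in outcomes if y1 + y2 >= 9)

# P[X = 1] = P[Z >= 9] = (number of favorable outcomes) / 36.
p = Fraction(favorable, len(outcomes))
print(p)   # 10/36 reduced to lowest terms: 5/18
```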
1.4 Expectation

For a discrete random variable $X$ with support $S_X$, we define the expectation $E[X]$ of $X$ by
$$E[X] = \sum_{x \in S_X} x\, P[X = x],$$
as long as the (possibly) infinite sum $\sum_{x \in S_X} x\, P[X = x]$ converges absolutely. When the sum does not converge, or if it converges only conditionally, we say that the expectation of $X$ is not defined.

When the random variable in question is $\mathbb{N}_0$-valued, the expression above simplifies to
$$E[X] = \sum_{i=0}^{\infty} i\, p_i,$$
where $p_i = P[X = i]$, for $i \in \mathbb{N}_0$. Unlike in the general case, the absolute convergence of the defining series can fail in essentially one way, i.e., when
$$\lim_{n \to \infty} \sum_{i=0}^{n} i\, p_i = +\infty.$$
In that case, the expectation does not formally exist. We still write $E[X] = +\infty$, but really mean that the defining sum diverges towards infinity.

Once we know what the expectation is, we can easily define several more common terms:
Definition 1.3. Let $X$ be a discrete random variable.

- If the expectation $E[X]$ exists, we say that $X$ is integrable.
- If $E[X^2] < \infty$ (i.e., if $X^2$ is integrable), $X$ is called square-integrable.
- If $E[|X|^m] < \infty$, for some $m > 0$, we say that $X$ has a finite $m$-th moment.
- If $X$ has a finite $m$-th moment, the expectation $E[|X - E[X]|^m]$ exists, and we call it the $m$-th central moment.

It can be shown that the expectation $E$ possesses the following properties, where $X$ and $Y$ are both assumed to be integrable:

CHAPTER 1. PROBABILITY REVIEW
1.
E[ X + Y ] = E [X ] + E [Y ], for ; 2R (linearity of expectation) .
2. E[X ] E[Y ]if P[X Y] = 1 (monotonicity of expectation) .
De nition 1.4. LetXbe a square-integrable random variable. We de ne the varianceVar[X]
by Var[X] = E[(X m)2
]; where m=E[X ]:
The square-root p Var[
X]is called the standard deviation ofX.
Remark 1.5. Each square-integrable random variable is automatically integrable. Also, if the $m$-th moment exists, then all lower moments also exist.

We still need to define what happens with random variables that take the value $+\infty$, but that is very easy. We stipulate that $E[X]$ does not exist (i.e., $E[X] = +\infty$) as long as $P[X = +\infty] > 0$. Simply put, the expectation of a random variable is infinite if there is a positive chance (no matter how small) that it will take the value $+\infty$.
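The defining sums above are easy to evaluate mechanically for any finitely supported pmf. The following sketch (my own illustration, not part of the notes; the function names are arbitrary) computes $E[X]$ and $\mathrm{Var}[X]$ for a fair die:

```python
from fractions import Fraction

def expectation(pmf):
    """E[X] = sum of x * P[X = x] over the (finite) support."""
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    """Var[X] = E[(X - m)^2] with m = E[X]."""
    m = expectation(pmf)
    return sum((x - m) ** 2 * p for x, p in pmf.items())

# A fair die: support {1,...,6}, each value with probability 1/6.
die = {x: Fraction(1, 6) for x in range(1, 7)}
print(expectation(die))  # 7/2
print(variance(die))     # 35/12
```

Exact rational arithmetic (`fractions`) avoids floating-point noise in such finite computations.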
1.5 Events and probability

Probability is usually first explained in terms of the sample space or probability space (which we denote by $\Omega$ in these notes) and various subsets of $\Omega$, which are called events.¹ Events contain elementary events, i.e., elements of the probability space, usually denoted by $\omega$. For example, if we are interested in the likelihood of getting an odd number as the sum of the outcomes of two dice throws, we build a probability space
$$\Omega = \{(1,1), (1,2), \ldots, (1,6), (2,1), (2,2), \ldots, (2,6), \ldots, (6,1), (6,2), \ldots, (6,6)\}$$
and define the event $A$ which consists of all pairs $(k, l) \in \Omega$ such that $k + l$ is an odd number, i.e.,
$$A = \{(1,2), (1,4), (1,6), (2,1), (2,3), \ldots, (6,1), (6,3), (6,5)\}.$$
One can think of events as very simple random variables. Indeed, if, for an event $A$, we define the random variable $\mathbf{1}_A$ by
$$\mathbf{1}_A = \begin{cases} 1, & A \text{ happened},\\ 0, & A \text{ did not happen}, \end{cases}$$
we get the indicator random variable mentioned above. Conversely, for any indicator random variable $X$, we define the indicated event $A$ as the set of all elementary events at which $X$ takes the value $1$.

What does all this have to do with probability? The analogy goes one step further. If we apply the notion of expectation to the indicator random variable $X = \mathbf{1}_A$, we get the probability of $A$:
$$E[\mathbf{1}_A] = P[A].$$
Indeed, $\mathbf{1}_A$ takes the value $1$ on $A$, and the value $0$ on the complement $A^c = \Omega \setminus A$. Therefore, $E[\mathbf{1}_A] = 1 \cdot P[A] + 0 \cdot P[A^c] = P[A]$.
¹ When $\Omega$ is infinite, not all of its subsets can be considered events, for very strange technical reasons. We will disregard that fact for the rest of the course. If you feel curious as to why that is the case, google the Banach–Tarski paradox, and try to find a connection.
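The dice example above can be checked by brute force. The following sketch (mine, not from the notes) enumerates $\Omega$ for two dice and verifies that $E[\mathbf{1}_A] = P[A]$ for the event $A$ that the sum is odd:

```python
from fractions import Fraction
from itertools import product

# The sample space for two dice throws: all 36 ordered pairs.
omega = list(product(range(1, 7), repeat=2))

# The event A: the sum of the two dice is odd.
A = [(k, l) for (k, l) in omega if (k + l) % 2 == 1]

# P[A] under the uniform probability on Omega.
p_A = Fraction(len(A), len(omega))

# E[1_A]: the indicator equals 1 on A and 0 elsewhere.
e_indicator = sum(Fraction(1, len(omega)) for w in omega if w in A)

print(p_A, e_indicator)  # 1/2 1/2
```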
1.6 Dependence and independence

One of the main differences between random variables and (deterministic, or non-random) quantities is that, in the former case, the whole is more than the sum of its parts. What do I mean by that? When two random variables, say $X$ and $Y$, are considered in the same setting, you must specify more than just their distributions if you want to compute probabilities that involve both of them. Here are two examples.

1. We throw two dice, and denote the outcome on the first one by $X$ and on the second one by $Y$.
2. We throw two dice, denote the outcome of the first one by $X$, set $Y = 7 - X$, and forget about the second die.

In both cases, $X$ and $Y$ have the same distribution
$$X, Y \sim \begin{pmatrix} 1 & 2 & 3 & 4 & 5 & 6 \\ \tfrac16 & \tfrac16 & \tfrac16 & \tfrac16 & \tfrac16 & \tfrac16 \end{pmatrix}.$$
The pairs $(X, Y)$ are, however, very different in the two examples. In the first one, if the value of $X$ is revealed, it will not affect our view of the value of $Y$. Indeed, the dice are not connected in any way (they are independent in the language of probability). In the second case, the knowledge of $X$ allows us to say what $Y$ is without any doubt: it is $7 - X$.

This example shows that when more than one random variable is considered, one needs to obtain external information about their relationship; not everything can be deduced only by looking at their distributions (pmfs, etc.).

One of the most common forms of relationship two random variables can have is the one of example (1) above, i.e., no relationship at all. More formally, we say that two (discrete) random variables $X$ and $Y$ are independent if
$$P[X = x \text{ and } Y = y] = P[X = x]\, P[Y = y],$$
for all $x$ and $y$ in the respective supports $\mathcal{S}_X$ and $\mathcal{S}_Y$ of $X$ and $Y$. The same concept can be applied to events, and we say that two events $A$ and $B$ are independent if
$$P[A \cap B] = P[A]\, P[B].$$
The notion of independence is central to probability theory (and this course) because it is relatively easy to spot in real life. If there is no physical mechanism that ties two events together (like the two dice we throw), we are inclined to declare them independent.² One of the most important tasks in probabilistic modelling is the identification of the (small number of) independent random variables which serve as building blocks for a big complex system. You will see many examples of that as we proceed through the course.
² Actually, true independence does not exist in reality, save, perhaps, a few quantum-theoretic phenomena. Even with apparently independent random variables, dependence can sneak in in the most sly of ways. Here is a funny example: a recent survey found a large correlation between the sale of diapers and the sale of six-packs of beer across many Walmart stores throughout the country. At first, these two appear independent, but I am sure you can come up with many an amusing story about why they should, actually, be quite dependent.
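The two dice examples can be told apart mechanically: independence is exactly the factorization of the joint pmf into the product of the marginals. A small sketch (my own, not from the notes; it assumes the second example uses $Y = 7 - X$, so that $Y$ is again uniform on $\{1, \ldots, 6\}$):

```python
from fractions import Fraction
from itertools import product

sixth = Fraction(1, 6)

# Example 1: two independent dice; the joint pmf is uniform on the 36 pairs.
joint1 = {(x, y): sixth * sixth for x, y in product(range(1, 7), repeat=2)}

# Example 2: Y = 7 - X; only the six pairs (x, 7 - x) carry probability.
joint2 = {(x, 7 - x): sixth for x in range(1, 7)}

def is_independent(joint):
    """Check the factorization P[X=x and Y=y] == P[X=x] P[Y=y]."""
    px, py = {}, {}
    for (x, y), prob in joint.items():
        px[x] = px.get(x, 0) + prob
        py[y] = py.get(y, 0) + prob
    return all(joint.get((x, y), 0) == px[x] * py[y]
               for x in px for y in py)

print(is_independent(joint1))  # True
print(is_independent(joint2))  # False
```

Both joints have identical (uniform) marginals; only the first factorizes.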
1.7 Conditional probability

When two random variables are not independent, we still want to know how the knowledge of the exact value of one of them affects our guesses about the value of the other. That is what conditional probability is for. We start with the definition, and we state it for events first: for two events $A$, $B$ such that $P[B] > 0$, the conditional probability $P[A|B]$ of $A$ given $B$ is defined as
$$P[A|B] = \frac{P[A \cap B]}{P[B]}.$$
The conditional probability is not defined when $P[B] = 0$ (otherwise, we would be computing $\tfrac{0}{0}$; why?). Every statement in the sequel which involves conditional probability will be assumed to hold only when $P[B] > 0$, without explicit mention.
Conditional-probability calculations often use one of the following two formulas. Both of them use the familiar concept of a partition. If you forgot what it is, here is a definition: a collection $A_1, A_2, \ldots, A_n$ of events is called a partition of $\Omega$ if (a) $A_1 \cup A_2 \cup \cdots \cup A_n = \Omega$, and (b) $A_i \cap A_j = \emptyset$ for all pairs $i, j = 1, \ldots, n$ with $i \ne j$. So, let $A_1, \ldots, A_n$ be a partition of $\Omega$, and let $B$ be an event.

1. The Law of Total Probability.
$$P[B] = \sum_{i=1}^{n} P[B|A_i]\, P[A_i].$$

2. Bayes' formula. For $k = 1, \ldots, n$, we have
$$P[A_k|B] = \frac{P[B|A_k]\, P[A_k]}{\sum_{i=1}^{n} P[B|A_i]\, P[A_i]}.$$

Even though the formulas above are stated for finite partitions, they remain true when the number of $A_k$'s is countably infinite. The finite sums have to be replaced by infinite series, however.
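As a quick numerical illustration of both formulas (my own hypothetical setup, not from the notes: three machines produce $\tfrac12$, $\tfrac13$ and $\tfrac16$ of all items, with defect rates 1%, 2% and 3%):

```python
from fractions import Fraction

# Partition A1, A2, A3 of Omega (which machine produced the item),
# and B = "the item is defective".
prior = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]
p_B_given_A = [Fraction(1, 100), Fraction(2, 100), Fraction(3, 100)]

# Law of Total Probability: P[B] = sum_i P[B|A_i] P[A_i].
p_B = sum(pb * pa for pb, pa in zip(p_B_given_A, prior))

# Bayes' formula: P[A_k|B] = P[B|A_k] P[A_k] / P[B].
posterior = [pb * pa / p_B for pb, pa in zip(p_B_given_A, prior)]

print(p_B)             # 1/60
print(sum(posterior))  # 1
```

The posterior probabilities again sum to one, so conditioning on $B$ produces a genuine distribution over the partition.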
Random variables can be substituted for events in the definition of conditional probability as follows: for two random variables $X$ and $Y$, the conditional probability that $X = x$, given $Y = y$ (with $x$ and $y$ in the respective supports $\mathcal{S}_X$ and $\mathcal{S}_Y$) is given by
$$P[X = x | Y = y] = \frac{P[X = x \text{ and } Y = y]}{P[Y = y]}.$$
The formula above produces a different probability distribution for each $y$. This is called the conditional distribution of $X$, given $Y = y$. We give a simple example to illustrate this concept. Let $X$ be the number of heads obtained when two coins are thrown, and let $Y$ be the indicator of the event that the second coin shows heads. The distribution of $X$ is binomial:
$$X \sim \begin{pmatrix} 0 & 1 & 2 \\ \tfrac14 & \tfrac12 & \tfrac14 \end{pmatrix},$$
or, in the more compact notation which we use when the support is clear from the context, $X \sim (\tfrac14, \tfrac12, \tfrac14)$. The random variable $Y$ has the Bernoulli distribution $Y \sim (\tfrac12, \tfrac12)$.
What happens to the distribution of $X$ when we are told that $Y = 0$, i.e., that the second coin shows tails? In that case we have
$$P[X = x | Y = 0] = \begin{cases} \dfrac{P[X = 0, Y = 0]}{P[Y = 0]} = \dfrac{P[\text{the pattern is TT}]}{P[Y = 0]} = \dfrac{1/4}{1/2} = \dfrac12, & x = 0,\\[2ex] \dfrac{P[X = 1, Y = 0]}{P[Y = 0]} = \dfrac{P[\text{the pattern is HT}]}{P[Y = 0]} = \dfrac{1/4}{1/2} = \dfrac12, & x = 1,\\[2ex] \dfrac{P[X = 2, Y = 0]}{P[Y = 0]} = \dfrac{P[\text{well, there is no such pattern}]}{P[Y = 0]} = \dfrac{0}{1/2} = 0, & x = 2. \end{cases}$$
Thus, the conditional distribution of $X$, given $Y = 0$, is $(\tfrac12, \tfrac12, 0)$. A similar calculation shows that the conditional distribution of $X$, given $Y = 1$, is $(0, \tfrac12, \tfrac12)$.
The moral of the story is that the additional information contained in $Y$ can alter our views about the unknown value of $X$, via the concept of conditional probability. One final remark about the relationship between independence and conditional probability: suppose that the random variables $X$ and $Y$ are independent. Then the knowledge of $Y$ should not affect how we think about $X$; indeed,
$$P[X = x | Y = y] = \frac{P[X = x, Y = y]}{P[Y = y]} = \frac{P[X = x]\, P[Y = y]}{P[Y = y]} = P[X = x].$$
The conditional distribution does not depend on $y$, and coincides with the unconditional one.
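The two-coin computation above can be replicated mechanically. The following sketch (mine, not from the notes) recovers both conditional distributions of $X$ given $Y$:

```python
from fractions import Fraction
from itertools import product

# Two fair coins; all four head/tail patterns are equally likely.
patterns = list(product("HT", repeat=2))
p = Fraction(1, 4)

def cond_dist_X_given_Y(y):
    """Conditional pmf of X (number of heads) given that Y (the
    indicator that the second coin shows heads) equals y."""
    matching = [pat for pat in patterns if (pat[1] == "H") == (y == 1)]
    p_Y = len(matching) * p                       # P[Y = y]
    dist = {}
    for pat in matching:
        x = pat.count("H")
        dist[x] = dist.get(x, Fraction(0)) + p / p_Y
    return dist

print(sorted(cond_dist_X_given_Y(0).items()))  # [(0, Fraction(1, 2)), (1, Fraction(1, 2))]
print(sorted(cond_dist_X_given_Y(1).items()))  # [(1, Fraction(1, 2)), (2, Fraction(1, 2))]
```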
The notion of independence for two random variables can easily be generalized to larger collections.

Definition 1.6. Random variables $X_1, X_2, \ldots, X_n$ are said to be independent if
$$P[X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n] = P[X_1 = x_1]\, P[X_2 = x_2] \cdots P[X_n = x_n],$$
for all $x_1, x_2, \ldots, x_n$.

An infinite collection of random variables is said to be independent if all of its finite subcollections are independent.
Independence is often used in the following way:

Proposition 1.7. Let $X_1, \ldots, X_n$ be independent random variables. Then

1. $g_1(X_1)$, \ldots, $g_n(X_n)$ are also independent, for (practically) all functions $g_1, \ldots, g_n$,
2. if $X_1$, \ldots, $X_n$ are integrable, then the product $X_1 \cdots X_n$ is integrable and
$$E[X_1 \cdots X_n] = E[X_1] \cdots E[X_n], \text{ and}$$
3. if $X_1$, \ldots, $X_n$ are square-integrable, then
$$\mathrm{Var}[X_1 + \cdots + X_n] = \mathrm{Var}[X_1] + \cdots + \mathrm{Var}[X_n].$$
Equivalently,
$$\mathrm{Cov}[X_i, X_j] = E[(X_i - E[X_i])(X_j - E[X_j])] = 0,$$
for all $i \ne j \in \{1, 2, \ldots, n\}$.
Remark 1.8. The last statement says that independent random variables are uncorrelated. The converse is not true: there are uncorrelated random variables which are not independent.
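A standard counterexample (my illustration, not from the notes): $X$ uniform on $\{-1, 0, 1\}$ and $Y = X^2$ are uncorrelated but clearly dependent.

```python
from fractions import Fraction

# X uniform on {-1, 0, 1}, Y = X^2; the joint pmf has three atoms.
third = Fraction(1, 3)
joint = {(x, x * x): third for x in (-1, 0, 1)}

EX = sum(x * p for (x, y), p in joint.items())       # 0
EY = sum(y * p for (x, y), p in joint.items())       # 2/3
EXY = sum(x * y * p for (x, y), p in joint.items())  # 0
cov = EXY - EX * EY
print(cov)  # 0, so X and Y are uncorrelated

# Yet they are not independent: P[X=1, Y=0] = 0 while P[X=1]P[Y=0] > 0.
pX1 = sum(p for (x, y), p in joint.items() if x == 1)
pY0 = sum(p for (x, y), p in joint.items() if y == 0)
print(joint.get((1, 0), 0), pX1 * pY0)  # 0 1/9
```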
When several random variables $(X_1, X_2, \ldots, X_n)$ are considered in the same setting, we often group them together into a random vector. The distribution of the random vector $X = (X_1, \ldots, X_n)$ is the collection of all probabilities of the form
$$P[X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n],$$
where $x_1, x_2, \ldots, x_n$ range through all numbers in the appropriate supports. Unlike in the case of a single random variable, writing down the distributions of random vectors in tables is a bit more difficult. In the two-dimensional case, one would need an entire matrix, and in higher dimensions some sort of a hologram would be the only hope.

The distributions of the components $X_1, \ldots, X_n$ of the random vector $X$ are called the marginal distributions of the random variables $X_1, \ldots, X_n$. When we want to stress the fact that the random variables $X_1, \ldots, X_n$ are part of the same random vector, we call the distribution of $X$ the joint distribution of $X_1, \ldots, X_n$. It is important to note that, unless the random variables $X_1, \ldots, X_n$ are a priori known to be independent, the joint distribution holds more information about $X$ than all the marginal distributions together.
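Marginals are obtained from a joint distribution by summing out the other coordinates; the converse direction fails, since different joints can share the same marginals. A sketch (mine, with a made-up two-dimensional joint pmf):

```python
from fractions import Fraction

# A made-up joint pmf of (X1, X2), written out entry by entry.
joint = {
    (0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
    (1, 0): Fraction(1, 2), (1, 1): Fraction(0),
}

# Marginal distributions: sum the joint pmf over the other coordinate.
marg1, marg2 = {}, {}
for (x1, x2), p in joint.items():
    marg1[x1] = marg1.get(x1, Fraction(0)) + p
    marg2[x2] = marg2.get(x2, Fraction(0)) + p

print(marg1)  # {0: Fraction(1, 2), 1: Fraction(1, 2)}
print(marg2)  # {0: Fraction(3, 4), 1: Fraction(1, 4)}

# The independent coupling of the same marginals is a *different* joint:
# marginals alone do not pin down the joint distribution.
indep = {(a, b): marg1[a] * marg2[b] for a in marg1 for b in marg2}
print(indep != joint)  # True
```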
1.8 Examples

Here is a short list of some of the most important discrete random variables. You will learn about generating functions soon.

Example 1.9.

Bernoulli. Success ($1$) or failure ($0$), with probability $p$ of success (if success is encoded by $1$, failure by $-1$, and $p = \tfrac12$, we call it the coin toss).
- parameters: $p \in (0,1)$ ($q = 1 - p$)
- notation: $b(p)$
- support: $\{0, 1\}$
- pmf: $p_0 = q = 1 - p$ and $p_1 = p$
- generating function: $ps + q$
- mean: $p$
- standard deviation: $\sqrt{pq}$
- figure: the mass function of a Bernoulli distribution with $p = 1/3$.
Binomial. The number of successes in $n$ repetitions of a Bernoulli trial with success probability $p$.
- parameters: $n \in \mathbb{N}$, $p \in (0,1)$ ($q = 1 - p$)
- notation: $b(n, p)$
- support: $\{0, 1, \ldots, n\}$
- pmf: $p_k = \binom{n}{k} p^k q^{n-k}$, $k = 0, \ldots, n$
- generating function: $(ps + q)^n$
- mean: $np$
- standard deviation: $\sqrt{npq}$
- figure: mass functions of three binomial distributions with $n = 50$ and $p = 0.05$ (blue), $p = 0.5$ (purple) and $p = 0.8$ (yellow).
Poisson. The number of spelling mistakes one makes while typing a single page.
- parameters: $\lambda > 0$
- notation: $p(\lambda)$
- support: $\mathbb{N}_0$
- pmf: $p_k = \dfrac{e^{-\lambda} \lambda^k}{k!}$, $k \in \mathbb{N}_0$
- generating function: $e^{\lambda(s - 1)}$
- mean: $\lambda$
- standard deviation: $\sqrt{\lambda}$
- figure: mass functions of two Poisson distributions with parameters $\lambda = 0.9$ (blue) and $\lambda = 10$ (purple).
Geometric. The number of failures in repeated Bernoulli trials with parameter $p$ before the first success.
- parameters: $p \in (0,1)$, $q = 1 - p$
- notation: $g(p)$
- support: $\mathbb{N}_0$
- pmf: $p_k = p q^k$, $k \in \mathbb{N}_0$
- generating function: $\dfrac{p}{1 - qs}$
- mean: $\dfrac{q}{p}$
- standard deviation: $\dfrac{\sqrt{q}}{p}$
- figure: mass functions of two geometric distributions with parameters $p = 0.1$ (blue) and $p = 0.4$ (purple).

Negative Binomial. The number of failures it takes to obtain $r$ successes in repeated independent Bernoulli trials with success probability $p$.
- parameters: $r \in \mathbb{N}$, $p \in (0,1)$ ($q = 1 - p$)
- notation: $g(r, p)$
- support: $\mathbb{N}_0$
- pmf: $p_k = \binom{k + r - 1}{k} p^r q^k$, $k \in \mathbb{N}_0$
- generating function: $\left(\dfrac{p}{1 - qs}\right)^r$
- mean: $\dfrac{rq}{p}$
- standard deviation: $\dfrac{\sqrt{qr}}{p}$
- figure: mass functions of two negative binomial distributions with $r = 100$, $p = 0.6$ (blue) and $r = 25$, $p = 0.9$ (purple).
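The entries of the list above can be sanity-checked numerically. A sketch (mine, not from the notes) verifying the stated means of the binomial and geometric distributions, and that the pmfs sum to one:

```python
from math import comb, exp, factorial, isclose

def binomial_pmf(n, p, k):
    # C(n, k) p^k q^(n-k)
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def geometric_pmf(p, k):
    # number of failures before the first success: p q^k, k in N_0
    return p * (1 - p) ** k

def poisson_pmf(lam, k):
    # e^(-lam) lam^k / k!
    return exp(-lam) * lam**k / factorial(k)

n, p = 10, 0.3

# The binomial pmf sums to 1 and has mean np.
total = sum(binomial_pmf(n, p, k) for k in range(n + 1))
mean_binom = sum(k * binomial_pmf(n, p, k) for k in range(n + 1))
print(isclose(total, 1.0), isclose(mean_binom, n * p))  # True True

# The geometric mean is q/p (truncating the infinite series far out).
mean_geom = sum(k * geometric_pmf(p, k) for k in range(10_000))
print(isclose(mean_geom, (1 - p) / p))  # True

# The Poisson pmf sums to 1.
print(isclose(sum(poisson_pmf(2.0, k) for k in range(100)), 1.0))  # True
```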

Chapter 2
Mathematica in 15 min

Mathematica is a glorified calculator. Here is how to use it.¹
2.1 Basic Syntax

- Symbols +, -, /, ^, * are all supported by Mathematica. Multiplication can be represented by a space between variables: a x + b and a*x + b are identical.
- Warning: Mathematica is case-sensitive. For example, the command to exit is Quit, and not quit or QUIT.
- Brackets are used around function arguments. Write Sin[x], not Sin(x) or Sin{x}.
- Parentheses ( ) group terms for math operations: (Sin[x]+Cos[y])*(Tan[z]+z^2).
- If you end an expression with a ; (semicolon), it will be executed, but its output will not be shown. This is useful, e.g., in simulations.
- Braces { } are used for lists.
- Names can refer to variables, expressions, functions, matrices, graphs, etc. A name is assigned using name = object. An expression may contain undefined names:

¹ Actually, this is just the tip of the iceberg. It can do many, many other things.
In[1]:= A = {1, 2, 3}
Out[1]= {1, 2, 3}

In[5]:= A = (a + b)^3
Out[5]= (a + b)^3

In[6]:= A^2
Out[6]= (a + b)^6

CHAPTER 2. MATHEMATICA IN 15 MIN

- The percent sign % stores the value of the previous result.

2.2 Numerical Approximation

- N[expr] gives the approximate numerical value of an expression, variable, or command:
- N[%] gives the numerical value of the previous result:
- N[expr,n] gives n digits of precision for the expression expr:
- Expressions whose result can't be represented exactly don't give a value unless you request approximation:

2.3 Expression Manipulation

- Expand[expr] (algebraically) expands the expression expr:

In[7]:= 5 + 3
Out[7]= 8

In[8]:= %^2
Out[8]= 64

In[9]:= N[Sqrt[2]]
Out[9]= 1.41421

In[17]:= E + Pi
Out[17]= E + Pi

In[18]:= N[%]
Out[18]= 5.85987

In[14]:= N[Pi, 30]
Out[14]= 3.14159265358979323846264338328

In[11]:= Sin[3]
Out[11]= Sin[3]

In[12]:= N[Sin[3]]
Out[12]= 0.14112

- Factor[expr] factors the expression expr:
- Simplify[expr] performs all kinds of simplifications on the expression expr:

2.4 Lists and Functions

- If L is a list, its length is given by Length[L]. The nth element of L can be accessed by L[[n]] (note the double brackets):
- Addition, subtraction, multiplication and division can be applied to lists element by element:
In[19]:= Expand[(a + b)^2]
Out[19]= a^2 + 2 a b + b^2

In[20]:= Factor[a^2 - b^2]
Out[20]= (a - b) (a + b)

In[21]:= Factor[x^2 - 5 x + 6]
Out[21]= (-3 + x) (-2 + x)

In[35]:= A = x/(1 - x) + x/(1 + x)
Out[35]= x/(1 - x) + x/(1 + x)

In[36]:= Simplify[A]
Out[36]= (2 x)/(1 - x^2)

In[43]:= L = {2, 4, 6, 8, 10}
Out[43]= {2, 4, 6, 8, 10}

In[44]:= L[[3]]
Out[44]= 6

In[1]:= L = {1, 3, 4}; K = {3, 4, 2};

In[2]:= L + K
Out[2]= {4, 7, 6}

In[3]:= L/K
Out[3]= {1/3, 3/4, 2}
- If the expression expr depends on a variable (say i), Table[expr,{i,m,n}] produces a list of the values of the expression expr as i ranges from m to n.
- The same works with two indices; you will get a list of lists.
- It is possible to define your own functions in Mathematica. Just use the underscore syntax f[x_]=expr, where expr is some expression involving x:
- To apply the function f (either built-in, like Sin, or defined by you) to each element of the list L, you can use the command Map with syntax Map[f,L]:
- If you want to add all the elements of a list L, use Total[L]. The list of the same length as L, but whose kth element is given by the sum of the first k elements of L, is given by Accumulate[L]:
In[37]:= Table[i^2, {i, 0, 5}]
Out[37]= {0, 1, 4, 9, 16, 25}

In[40]:= Table[i^j, {i, 1, 3}, {j, 2, 3}]
Out[40]= {{1, 1}, {4, 8}, {9, 27}}

In[47]:= f[x_] = x^2
Out[47]= x^2

In[48]:= f[x + y]
Out[48]= (x + y)^2

In[50]:= f[x_] = 3 x
Out[50]= 3 x

In[51]:= L = {1, 2, 3, 4}
Out[51]= {1, 2, 3, 4}

In[52]:= Map[f, L]
Out[52]= {3, 6, 9, 12}

In[8]:= L = {1, 2, 3, 4, 5}
Out[8]= {1, 2, 3, 4, 5}

In[9]:= Accumulate[L]
Out[9]= {1, 3, 6, 10, 15}

In[10]:= Total[L]
Out[10]= 15

2.5 Linear Algebra

- In Mathematica, a matrix is a nested list, i.e., a list whose elements are lists. By convention, matrices are represented row by row (the inner lists are row vectors).
- To access the element in the ith row and jth column of the matrix A, type A[[i,j]] or A[[i]][[j]]:
- MatrixForm[expr] displays expr as a matrix (provided it is a nested list):
- The commands Transpose[A], Inverse[A], Det[A], Tr[A] and MatrixRank[A] return the transpose, inverse, determinant, trace and rank of the matrix A, respectively.
- To compute the nth power of the matrix A, use MatrixPower[A,n].
- The identity matrix of order n is produced by IdentityMatrix[n].
- If A and B are matrices of the same order, A+B and A-B are their sum and difference.
In[59]:= A = {{2, 1, 3}, {5, 6, 9}}
Out[59]= {{2, 1, 3}, {5, 6, 9}}

In[60]:= A[[2, 3]]
Out[60]= 9

In[61]:= A[[2]][[3]]
Out[61]= 9

In[9]:= A = Table[i 2^j, {i, 2, 5}, {j, 1, 2}]
Out[9]= {{4, 8}, {6, 12}, {8, 16}, {10, 20}}

In[10]:= MatrixForm[A]
Out[10]//MatrixForm= (the matrix displayed in the usual rectangular form)
- If A and B are of compatible orders, A.B (that is, a dot between them) is the matrix product of A and B.
- For a square matrix A, CharacteristicPolynomial[A,x] is the characteristic polynomial det(xI - A) in the variable x:
- To get eigenvalues and eigenvectors, use Eigenvalues[A] and Eigenvectors[A]. The result will be the list containing the eigenvalues in the Eigenvalues case, and the list of eigenvectors of A in the Eigenvectors case:

2.6 Predefined Constants

- A number of constants are predefined by Mathematica: Pi, I (√-1), E (2.71828...), Infinity. Don't use I, E (or D) for variable names; Mathematica will object.
- A number of standard functions are built into Mathematica: Sqrt[], Exp[], Log[], Sin[], ArcSin[], Cos[], etc.
2.7 Calculus

- D[f,x] gives the derivative of f with respect to x. For the first few derivatives you can use f'[x], f''[x], etc.
- D[f,{x,n}] gives the nth derivative of f with respect to x.
- D[f,x,y] gives the mixed derivative of f with respect to x and y.
In[40]:= A = {{3, 4}, {2, 1}}
Out[40]= {{3, 4}, {2, 1}}

In[42]:= CharacteristicPolynomial[A, x]
Out[42]= -5 - 4 x + x^2

In[52]:= A = {{3, 4}, {2, 1}}
Out[52]= {{3, 4}, {2, 1}}

In[53]:= Eigenvalues[A]
Out[53]= {5, -1}

In[54]:= Eigenvectors[A]
Out[54]= {{2, 1}, {-1, 1}}

In[66]:= D[x^k, x]
Out[66]= k x^(-1 + k)

- Integrate[f,x] gives the indefinite integral of f with respect to x:
- Integrate[f,{x,a,b}] gives the definite integral of f on the interval [a, b] (a or b can be Infinity (∞) or -Infinity (-∞)):
- NIntegrate[f,{x,a,b}] gives a numerical approximation of the definite integral. This usually returns an answer when Integrate[] doesn't work:
- Sum[expr,{n,a,b}] evaluates the (finite or infinite) sum. Use NSum for a numerical approximation.
- DSolve[eqn,y,x] solves (gives the general solution to) an ordinary differential equation for the function y in the variable x:
- To calculate using initial or boundary conditions, use DSolve[{eqn,conds},y,x]:
In[67]:= Integrate[Log[x], x]
Out[67]= -x + x Log[x]

In[72]:= Integrate[Exp[-2 x], {x, 0, Infinity}]
Out[72]= 1/2

In[76]:= Integrate[1/(x + Sin[x]), {x, 1, 2}]
Out[76]= Integrate[1/(x + Sin[x]), {x, 1, 2}]  (returned unevaluated)

In[77]:= NIntegrate[1/(x + Sin[x]), {x, 1, 2}]
Out[77]= 0.414085

In[80]:= Sum[1/k^4, {k, 1, Infinity}]
Out[80]= Pi^4/90

In[88]:= DSolve[y''[x] + y[x] == x, y[x], x]
Out[88]= {{y[x] -> x + C[1] Cos[x] + C[2] Sin[x]}}

In[93]:= DSolve[{y'[x] == y[x]^2, y[0] == 1}, y[x], x]
Out[93]= {{y[x] -> 1/(1 - x)}}

2.8 Solving Equations

- Algebraic equations are solved with Solve[lhs==rhs,x], where x is the variable with respect to which you want to solve the equation. Be sure to use == and not = in equations. Mathematica returns the list of all solutions:
- FindRoot[f,{x,x0}] is used to find a root when Solve[] does not work. It solves for x numerically, using the initial value x0:

2.9 Graphics

- Plot[expr,{x,a,b}] plots the expression expr, in the variable x, from a to b:
- Plot3D[expr,{x,a,b},{y,c,d}] produces a 3D plot in two variables:
In[81]:= Solve[x^3 == x, x]
Out[81]= {{x -> -1}, {x -> 0}, {x -> 1}}

In[82]:= FindRoot[Cos[x] == x, {x, 1}]
Out[82]= {x -> 0.739085}

In[83]:= Plot[Sin[x], {x, 1, 3}]
Out[83]= (the graph of Sin[x] on [1, 3])

In[84]:= Plot3D[Sin[x^2 + y^2], {x, 2, 3}, {y, -2, 4}]
Out[84]= (a 3D surface plot)

- If L is a list of the form L = {{x1, y1}, {x2, y2}, ..., {xn, yn}}, you can use the command ListPlot[L] to display a graph consisting of the points (x1, y1), ..., (xn, yn):

2.10 Probability Distributions and Simulation

- PDF[distr,x] and CDF[distr,x] return the pdf (pmf in the discrete case) and the cdf of the distribution distr in the variable x. distr can be one of NormalDistribution[m,s], ExponentialDistribution[l], UniformDistribution[{a,b}], BinomialDistribution[n,p], and many, many others (see ?PDF and follow various links from there).
- Use ExpectedValue[expr,distr,x] to compute the expectation E[f(X)], where expr is the expression for the function f in the variable x:
- There is no command for the generating function, but you can get it by computing the characteristic function and changing the variable a bit: CharacteristicFunction[distr, -I Log[s]]:
In[11]:= L = Table[{i, i^2}, {i, 0, 4}]
Out[11]= {{0, 0}, {1, 1}, {2, 4}, {3, 9}, {4, 16}}

In[12]:= ListPlot[L]
Out[12]= (a plot of the five points)

In[23]:= distr = PoissonDistribution[λ]
Out[23]= PoissonDistribution[λ]

In[25]:= PDF[distr, x]
Out[25]= (E^-λ λ^x)/x!

In[27]:= ExpectedValue[x^3, distr, x]
Out[27]= λ + 3 λ^2 + λ^3

- To get a random number (uniformly distributed between 0 and 1), use RandomReal[]. A uniformly distributed random number on the interval [a, b] can be obtained by RandomReal[{a,b}]. For a list of n uniform random numbers on [a, b], write RandomReal[{a,b},n].
- If you need a random number from a particular continuous distribution (normal, say), use RandomReal[distr], or RandomReal[distr,n] if you need n draws.
- When drawing from a discrete distribution, use RandomInteger instead.
- If L is a list of numbers, Histogram[L] displays a histogram of L (you need to load the package Histograms by issuing the command <<Histograms` before you can use it):

2.11 Help Commands

- ?name returns information about name
- ??name adds extra information about name
- Options[command] returns all options that may be set for a given command
In[22]:= distr = PoissonDistribution[λ]
Out[22]= PoissonDistribution[λ]

In[23]:= CharacteristicFunction[distr, -I Log[s]]
Out[23]= E^((-1 + s) λ)

In[2]:= RandomReal[]
Out[2]= 0.168904

In[3]:= RandomReal[{7, 9}]
Out[3]= 7.83027

In[5]:= RandomReal[{0, 1}, 3]
Out[5]= {0.368422, 0.961658, 0.692345}

In[7]:= L = RandomReal[NormalDistribution[0, 1], 100];

In[10]:= <<Histograms`

In[12]:= Histogram[L]
Out[12]= (a histogram of the 100 draws)

- ?pattern returns the list of matching names (used when you forget a command). pattern contains one or more asterisks *, which match any string. Try ?*Plot*.

2.12 Common Mistakes

- Mathematica is case sensitive: Sin is not sin.
- Don't confuse braces, brackets, and parentheses: {}, [], ( ).
- Leave spaces between variables: write a x^2 instead of ax^2 if you want to get a·x².
- Matrix multiplication uses . instead of * or a space.
- Don't use = instead of == in Solve or DSolve.
- If you are using an older version of Mathematica, a function might be defined in an external package which has to be loaded before the function can be used. For example, in some versions, the command <<Graphics` needs to be given before any plots can be made. The symbol at the end is not an apostrophe; it is the backquote, on the key above TAB.
- Using Integrate[] around a singular point can yield wrong answers. (Use NIntegrate[] to check.)
- Don't forget the underscore _ when you define a function.

Chapter 3
Stochastic Processes
Definition 3.1. Let $T$ be a subset of $[0, \infty)$. A family of random variables $\{X_t\}_{t \in T}$, indexed by $T$, is called a stochastic (or random) process. When $T = \mathbb{N}$ (or $T = \mathbb{N}_0$), $\{X_t\}_{t \in T}$ is said to be a discrete-time process, and when $T = [0, \infty)$, it is called a continuous-time process.

When $T$ is a singleton (say $T = \{1\}$), the process $\{X_t\}_{t \in T} = X_1$ is really just a single random variable. When $T$ is finite (e.g., $T = \{1, 2, \ldots, n\}$), we get a random vector. Therefore, stochastic processes are generalizations of random vectors. The interpretation is, however, somewhat different. While the components of a random vector usually (not always) stand for different spatial coordinates, the index $t \in T$ is more often than not interpreted as time. Stochastic processes usually model the evolution of a random system in time. When $T = [0, \infty)$ (continuous-time processes), the value of the process can change at every instant. When $T = \mathbb{N}$ (discrete-time processes), the changes occur discretely.

In contrast to the case of random vectors or random variables, it is not easy to define a notion of a density (or a probability mass function) for a stochastic process. Without going into details about why exactly this is a problem, let me just mention that the main culprit is infinity. One usually considers a family of (discrete, continuous, etc.) finite-dimensional distributions, i.e., the joint distributions of the random vectors
$$(X_{t_1}, X_{t_2}, \ldots, X_{t_n}),$$
for all $n \in \mathbb{N}$ and all choices $t_1, \ldots, t_n \in T$.

The notion of a stochastic process is very important both in mathematical theory and in its applications in science, engineering, economics, etc. It is used to model a large number of various phenomena where the quantity of interest varies discretely or continuously through time in a non-predictable fashion.

Every stochastic process can be viewed as a function of two variables, $t$ and $\omega$. For each fixed $t$, $\omega \mapsto X_t(\omega)$ is a random variable, as postulated in the definition. However, if we change our point of view and keep $\omega$ fixed, we see that the stochastic process is a function mapping $\omega$ to the real-valued function $t \mapsto X_t(\omega)$. These functions are called the trajectories of the stochastic process $X$.

CHAPTER 3. STOCHASTIC PROCESSES
The figures on the left show two different trajectories of a simple random walk,^a i.e., each one corresponds to a (different) frozen $\omega \in \Omega$, but $t$ goes from $0$ to $30$.

^a We will define the simple random walk later. For now, let us just say that it behaves as follows. It starts at $x = 0$ for $t = 0$. After that, a fair coin is tossed and we move up (to $x = 1$) if heads is observed, and down (to $x = -1$) if we see tails. The procedure is repeated at $t = 1, 2, \ldots$, and the position at $t + 1$ is determined in the same way, independently of all the coin tosses before (note that the position at $t = k$ can be any of the following: $x = -k$, $x = -k + 2$, \ldots, $x = k - 2$, $x = k$).

Unlike the figures above, the two pictures on the right show two time-slices of the same random process; in each graph, the time $t$ is fixed ($t = 15$ vs. $t = 25$), but the various values the random variables $X_{15}$ and $X_{25}$ can take are presented through their probability mass functions.

3.1 The canonical probability space
When one deals with infinite-index ($\#T = +\infty$) stochastic processes, the construction of the probability space $(\Omega, \mathcal{F}, P)$ to support a given model is usually quite a technical matter. This course does not suffer from that problem, because all our models can be implemented on a special probability space. We start with the sample space $\Omega$:
$$\Omega = [0,1] \times [0,1] \times \cdots = [0,1]^{\infty},$$
and any generic element of $\Omega$ will be a sequence $\omega = (\omega_0, \omega_1, \omega_2, \ldots)$ of real numbers in $[0,1]$. For $n \in \mathbb{N}_0$, we define the mapping $\xi_n : \Omega \to [0,1]$ which simply chooses the $n$-th coordinate:
$$\xi_n(\omega) = \omega_n.$$
The proof of the following theorem can be found in advanced probability books:

Theorem 3.2. There exists a $\sigma$-algebra $\mathcal{F}$ and a probability $P$ on $\Omega$ such that
1. each $\xi_n$, $n \in \mathbb{N}_0$, is a random variable with the uniform distribution on $[0,1]$, and
2. the sequence $\{\xi_n\}_{n \in \mathbb{N}_0}$ is independent.

Remark 3.3. One should think of the sample space $\Omega$ as the source of all the randomness in the system: the elementary event $\omega \in \Omega$ is chosen by a process beyond our control, and the exact value of $\omega$ is assumed to be unknown. All the other parts of the system are possibly complicated, but deterministic, functions of $\omega$ (random variables). When a coin is tossed, only a single drop of randomness is needed: the outcome of a coin toss. When several coins are tossed, more randomness is involved and the sample space must be bigger. When a system involves an infinite number of random variables (like a stochastic process with infinite $T$), a large sample space $\Omega$ is needed.
3.2 Constructing the Random Walk

Let us show how to construct the simple random walk on the canonical probability space $(\Omega, \mathcal{F}, P)$ from Theorem 3.2. First of all, we need a definition of the simple random walk:

Definition 3.4. A stochastic process $\{X_n\}_{n \in \mathbb{N}_0}$ is called a simple random walk if
1. $X_0 = 0$,
2. the increment $X_{n+1} - X_n$ is independent of $(X_0, X_1, \ldots, X_n)$ for each $n \in \mathbb{N}_0$, and
3. the increment $X_{n+1} - X_n$ has the coin-toss distribution, i.e.,
$$P[X_{n+1} - X_n = 1] = P[X_{n+1} - X_n = -1] = \tfrac12.$$

For the sequence $\{\xi_n\}_{n \in \mathbb{N}}$ given by Theorem 3.2, define the following new sequence $\{\delta_n\}_{n \in \mathbb{N}}$ of random variables:
$$\delta_n = \begin{cases} 1, & \xi_n \le \tfrac12,\\ -1, & \text{otherwise.} \end{cases}$$
Then, we set
$$X_0 = 0, \qquad X_n = \sum_{k=1}^{n} \delta_k, \quad n \in \mathbb{N}.$$
Intuitively, we use each $\xi_n$ to emulate a coin toss and then define the value of the process $X$ at time $n$ as the cumulative sum of the first $n$ coin tosses.

Proposition 3.5. The sequence $\{X_n\}_{n \in \mathbb{N}_0}$ defined above is a simple random walk.
Proof. (1) is trivially true. To get (2), we first note that $\{\delta_n\}_{n \in \mathbb{N}}$ is an independent sequence (as it has been constructed by applying a deterministic function to each element of the independent sequence $\{\xi_n\}_{n \in \mathbb{N}}$). Therefore, the increment $X_{n+1} - X_n = \delta_{n+1}$ is independent of all the previous coin tosses $\delta_1, \ldots, \delta_n$. What we need to prove, though, is that it is independent of all the previous values of the process $X$. These previous values are nothing but linear combinations of the coin tosses $\delta_1, \ldots, \delta_n$, so they must also be independent of $\delta_{n+1}$. Finally, to get (3), we compute
$$P[X_{n+1} - X_n = 1] = P[\delta_{n+1} = 1] = P[\xi_{n+1} \le \tfrac12] = \tfrac12.$$
A similar computation shows that $P[X_{n+1} - X_n = -1] = \tfrac12$.
3.3 Simulation

Another way of thinking about sample spaces, and randomness in general, is through the notion of simulation. Simulation is what I did to produce the two trajectories of the random walk above; a computer tossed a fair coin for me 30 times and I followed the procedure described above to construct a trajectory of the random walk. If I asked the computer to repeat the process, I would get a different 30 coin-tosses¹. This procedure is the exact same one we imagine nature (or casino equipment) follows whenever a non-deterministic situation is involved. The difference is, of course, that if we use the random walk to model our winnings in a fair gamble, it is much cheaper and faster to use the computer than to go out and stake (and possibly lose) large amounts of money. Another obvious advantage of the simulation approach is that it can be repeated; a simulation can be run many times and various statistics (mean, variance, etc.) can be computed.

More technically, every simulation involves two separate inputs. The first one is the actual sequence of outcomes of coin-tosses. The other one is the structure of the model - I have to teach the computer to go up if heads shows and to go down if tails shows, and to repeat the same procedure several times. In more complicated situations this structure will be more complicated. What is remarkable is that the first ingredient, the coin-tosses, will stay almost as simple as in the random-walk case, even in the most complicated models. In fact, all we need is a sequence of so-called random numbers. You will see through the many examples presented in this course that if I can get my computer to produce an independent sequence of uniformly distributed numbers between $0$ and $1$ (these are the random numbers), I can simulate trajectories of all important stochastic processes. Just to start you thinking, here is how to produce a coin-toss from a random number: declare heads if the random number drawn is between $0$ and $0.5$, and declare tails otherwise.
3.3.1 Random number generation

Before we get into the intricacies of simulation of complicated stochastic processes, let us spend some time on the (seemingly) simple procedure of the generation of a single random number. In other words, how do you teach a computer to give you a random number between $0$ and $1$? Theoretically, the answer is "You can't!". In practice, you can get quite close. The question of what actually constitutes a random number is surprisingly deep, and we will not even touch it in this course.

Suppose we have written a computer program, a random number generator (RNG) - call it rand - which produces a random number between $0$ and $1$ every time we call it. So far, there is nothing that prevents rand from always returning the same number $0.4$, or from alternating between $0.3$ and $0.83$. Such an implementation of rand will, however, hardly qualify as an RNG, since the values it spits out come in a predictable order. We should, therefore, require any candidate for a random number generator to produce a sequence of numbers which is as unpredictable as possible. This is, admittedly, a hard task for a computer having only deterministic functions in its arsenal, and that is why random-number-generator design is such a difficult field. The state of affairs is that we speak of "good" or "less good" random number generators, based on some statistical properties of the produced sequences of numbers.

¹ Actually, I would get the exact same 30 coin-tosses with probability 0.000000001.
One of the most important requirements is that our RNG produce uniformly distributed numbers in $[0,1]$ - namely, the sequence of numbers produced by rand will have to cover the interval $[0,1]$ evenly, and, in the long run, the number of random numbers in each subinterval $[a,b]$ of $[0,1]$ should be proportional to the length $b - a$ of the interval. This requirement is hardly enough, because the sequence

$0,\ 0.1,\ 0.2,\ \dots,\ 0.8,\ 0.9,\ 1,\ 0.05,\ 0.15,\ 0.25,\ \dots,\ 0.85,\ 0.95,\ 0.025,\ 0.075,\ 0.125,\ 0.175,\ \dots$

will do the trick while being perfectly predictable. To remedy the inadequacy of the RNGs satisfying only the requirement of uniform distribution, we might require rand to have the property that the pairs of produced numbers cover the square $[0,1] \times [0,1]$ uniformly. That means that, in the long run, the proportion of pairs falling in a patch $A$ of the square $[0,1] \times [0,1]$ will be proportional to its area. Of course, one could continue with such requirements and ask for triples, quadruples, etc., of random numbers to be uniform in $[0,1]^3$, $[0,1]^4$, .... The highest dimension $n$ such that the RNG produces uniformly distributed numbers in $[0,1]^n$ is called the order of the RNG. A widely-used RNG called the Mersenne Twister has an order of 623.
Another problem with RNGs is that the numbers produced will start to repeat after a while (this is a fact of life and of the finiteness of your computer's memory). The number of calls it takes for an RNG to start repeating its output is called the period of the RNG. You might have wondered how it is that an RNG produces a different number each time it is called, since, after all, it is only a function written in some programming language. Most often, RNGs use a hidden variable called the random seed, which stores the last output of rand and is used as an (invisible) input to the function rand the next time it is called. If we use the same seed twice, the RNG will produce the same number, and so the period of the RNG is limited by the number of possible seeds. It is worth remarking that actual random number generators usually produce a random integer between $0$ and some large number RAND_MAX, and report the result normalized (divided) by RAND_MAX to get a number in $[0,1)$.
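The interplay of seed, period, and normalization by RAND_MAX can be illustrated with a toy linear congruential generator. This is only a didactic sketch (the constants below are the classic "minimal standard" Lehmer parameters, not anything the notes prescribe); do not use it where quality randomness matters.

```python
class ToyLCG:
    """Toy linear congruential generator: seed_{k+1} = A * seed_k mod M.
    The hidden state (the 'random seed') is updated on every call."""
    M = 2**31 - 1          # plays the role of RAND_MAX
    A = 16807              # multiplier of the 'minimal standard' LCG

    def __init__(self, seed=1):
        self.seed = seed   # same seed => same output sequence

    def rand(self):
        self.seed = (self.A * self.seed) % self.M
        return self.seed / self.M   # normalize into [0, 1)

g1, g2 = ToyLCG(seed=42), ToyLCG(seed=42)
```

Since the next output is a deterministic function of the seed, two generators started from the same seed produce identical sequences, and the period can never exceed the number of possible seeds ($M - 1$ here).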
3.3.2 Simulation of Random Variables

Having found a random number generator good enough for our purposes (the one used by Mathematica is just fine), we might want to use it to simulate random variables with distributions different from the uniform on $[0,1]$ (coin-tosses, normal, exponential, ...). This is almost always achieved through transformations of the output of an RNG, and we will present several methods for dealing with this problem. A typical procedure (see the Box-Muller method below for an exception) works as follows: a real (deterministic) function $f: [0,1] \to \mathbb{R}$ - called the transformation function - is applied to rand. The result is a random variable whose distribution depends on the choice of $f$. Note that the transformation function is by no means unique. In fact, if $\gamma \sim U[0,1]$, then $f(\gamma)$ and $\hat{f}(\gamma)$, where $\hat{f}(x) = f(1-x)$, have the same distribution (why?).

What follows is a list of procedures commonly used to simulate popular random variables:
1. Discrete Random Variables. Let $X$ have a discrete distribution given by

$X \sim \begin{pmatrix} x_1 & x_2 & \dots & x_n \\ p_1 & p_2 & \dots & p_n \end{pmatrix}.$

For discrete distributions taking an infinite number of values we can always truncate at a very large $n$ and approximate with a distribution similar to the one of $X$.

We know that the probabilities $p_1$, $p_2$, ..., $p_n$ add up to $1$, so we define the numbers $0 = q_0 < q_1 < \dots < q_n = 1$ by

$q_0 = 0,\quad q_1 = p_1,\quad q_2 = p_1 + p_2,\quad \dots,\quad q_n = p_1 + p_2 + \dots + p_n = 1.$

To simulate our discrete random variable $X$, we call rand and then return $x_1$ if $0 \le \texttt{rand} < q_1$, return $x_2$ if $q_1 \le \texttt{rand} < q_2$, and so on. It is quite obvious that this procedure indeed simulates a random variable with the distribution of $X$. The transformation function $f$ is in this case given by

$f(x) = \begin{cases} x_1, & 0 \le x < q_1 \\ x_2, & q_1 \le x < q_2 \\ \;\vdots \\ x_n, & q_{n-1} \le x \le 1. \end{cases}$
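The procedure above is a short loop in code. A minimal sketch (names are ours): accumulate the cumulative sums $q_1, q_2, \dots$ and return the first $x_i$ whose $q_i$ exceeds the drawn random number.

```python
import random

def sample_discrete(xs, ps, rng=random.random):
    """Return x_i with probability p_i using one call to rand:
    walk through the cumulative sums q_1, q_2, ..., q_n and return
    the first x_i for which rand < q_i."""
    u, q = rng(), 0.0
    for x, p in zip(xs, ps):
        q += p               # q takes the values q_1, q_2, ..., q_n
        if u < q:
            return x
    return xs[-1]            # guard against floating-point round-off near u = 1

draws = [sample_discrete([1, 2, 3], [0.2, 0.5, 0.3]) for _ in range(10_000)]
```

In the long run, the empirical frequency of each value should approach its assigned probability.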
2. The Method of Inverse Functions. The basic observation in this method is that, for any continuous random variable $X$ with distribution function $F_X$, the random variable $Y = F_X(X)$ is uniformly distributed on $[0,1]$. By inverting the distribution function $F_X$ and applying it to $Y$, we recover $X$. Therefore, if we wish to simulate a random variable with an invertible distribution function $F$, we first simulate a uniform random variable on $[0,1]$ (using rand) and then apply the function $F^{-1}$ to the result. In other words, we use $f = F^{-1}$ as the transformation function. Of course, this method fails if we cannot write $F^{-1}$ in closed form.
Example 3.6. (Exponential Distribution) Let us apply the method of inverse functions to the simulation of an exponentially distributed random variable $X$ with parameter $\lambda$. Remember that the density $f_X$ of $X$ is given by

$f_X(x) = \lambda \exp(-\lambda x),\ x > 0, \quad \text{and so} \quad F_X(x) = 1 - \exp(-\lambda x),\ x > 0,$

and so $F_X^{-1}(y) = -\frac{1}{\lambda} \log(1-y)$. Since $1 - \texttt{rand}$ has the same $U[0,1]$-distribution as rand, we conclude that $f(x) = -\frac{1}{\lambda} \log(x)$ works as a transformation function in this case, i.e., that

$-\frac{1}{\lambda} \log(\texttt{rand})$

has the required $\mathrm{Exp}(\lambda)$-distribution.
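In code, the example reads as follows (a sketch with our own names; we use $1 - \texttt{rand}$ inside the logarithm only to avoid $\log 0$, which is legitimate since $1 - \texttt{rand}$ is again uniform):

```python
import math
import random

def exponential(lam, rng=random.random):
    """Inverse-function method: -(1/lam) * log(1 - rand) is Exp(lam).
    1 - rand lies in (0, 1], so the logarithm is always defined."""
    return -math.log(1.0 - rng()) / lam

lam = 2.0
sample = [exponential(lam) for _ in range(100_000)]
mean = sum(sample) / len(sample)   # should be close to E[X] = 1/lam
```

With 100,000 draws the sample mean should sit very close to $1/\lambda = 0.5$.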
Example 3.7. (Cauchy Distribution) The Cauchy distribution is defined through its density function

$f_X(x) = \frac{1}{\pi} \, \frac{1}{1 + x^2}.$

The distribution function $F_X$ can be determined explicitly in this example:

$F_X(x) = \frac{1}{\pi} \int_{-\infty}^{x} \frac{1}{1 + t^2}\, dt = \frac{1}{\pi} \left( \frac{\pi}{2} + \arctan(x) \right), \quad \text{and so} \quad F_X^{-1}(y) = \tan\left( \pi \left( y - \frac{1}{2} \right) \right),$

yielding that $f(x) = \tan(\pi (x - \frac{1}{2}))$ is a transformation function for the Cauchy random variable, i.e., $\tan(\pi(\texttt{rand} - 0.5))$ will simulate a Cauchy random variable for you.
3. The Box-Muller Method. This method is useful for simulating normal random variables, since for them the method of inverse functions fails (there is no closed-form expression for the distribution function of a standard normal). Note that this method does not fall under the category of transformation-function methods as described above. You will see, though, that it is very similar in spirit. It is based on a clever trick, but the complete proof is a bit technical, so we omit it.

Proposition 3.8. Let $\gamma_1$ and $\gamma_2$ be independent $U[0,1]$-distributed random variables. Then the random variables

$X_1 = \sqrt{-2 \log(\gamma_1)} \cos(2\pi \gamma_2), \quad X_2 = \sqrt{-2 \log(\gamma_1)} \sin(2\pi \gamma_2)$

are independent and standard normal ($N(0,1)$).

Therefore, in order to simulate a normal random variable with mean $\mu = 0$ and variance $\sigma^2 = 1$, we call the function rand twice to produce two random numbers rand1 and rand2. The numbers

$X_1 = \sqrt{-2 \log(\texttt{rand1})} \cos(2\pi\, \texttt{rand2}), \quad X_2 = \sqrt{-2 \log(\texttt{rand1})} \sin(2\pi\, \texttt{rand2})$

will be two independent normals. Note that it is necessary to call the function rand twice, but we also get two normal random numbers out of it. It is not hard to write a procedure which will produce two normal random numbers in this way on every second call, return one of them and store the other for the next call. In the spirit of the discussion above, the function $f = (f_1, f_2) : (0,1] \times [0,1] \to \mathbb{R}^2$ given by

$f_1(x, y) = \sqrt{-2 \log(x)} \cos(2\pi y), \quad f_2(x, y) = \sqrt{-2 \log(x)} \sin(2\pi y)$

can be considered a transformation function in this case.
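A direct sketch of the method (our names; we replace rand1 by $1 - \texttt{rand}$ so the logarithm is always defined, which changes nothing in distribution):

```python
import math
import random

def box_muller(rng=random.random):
    """One Box-Muller step: two uniforms in, two independent N(0,1) out."""
    r1 = 1.0 - rng()                 # in (0, 1], so log(r1) is safe
    r2 = rng()
    rho = math.sqrt(-2.0 * math.log(r1))
    return (rho * math.cos(2 * math.pi * r2),
            rho * math.sin(2 * math.pi * r2))

pairs = [box_muller() for _ in range(50_000)]
zs = [z for pair in pairs for z in pair]          # 100,000 normal draws
mean = sum(zs) / len(zs)
var = sum(z * z for z in zs) / len(zs)
```

The empirical mean and variance of the pooled draws should be close to $0$ and $1$.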
4. Method of the Central Limit Theorem. The following algorithm is often used to simulate a normal random variable:

(a) Simulate 12 independent uniform random variables (rands) - $\gamma_1, \gamma_2, \dots, \gamma_{12}$.

(b) Set $X = \gamma_1 + \gamma_2 + \dots + \gamma_{12} - 6$.

The distribution of $X$ is very close to the distribution of a unit normal, although not exactly equal (e.g., $\mathbb{P}[X > 6] = 0$, while $\mathbb{P}[Z > 6] \neq 0$ for a true normal $Z$). The reason why $X$ approximates the normal distribution well comes from the following theorem:

Theorem 3.9. Let $X_1, X_2, \dots$ be a sequence of independent random variables, all having the same (square-integrable) distribution. Set $\mu = \mathbb{E}[X_1]\ (= \mathbb{E}[X_2] = \dots)$ and $\sigma^2 = \mathrm{Var}[X_1]\ (= \mathrm{Var}[X_2] = \dots)$. The sequence of normalized random variables

$\frac{(X_1 + X_2 + \dots + X_n) - n\mu}{\sigma \sqrt{n}}$

converges to the standard normal random variable (in a mathematically precise sense).

The choice of exactly 12 rands (as opposed to 11 or 35) comes from practice: it seems to achieve satisfactory performance with relatively low computational cost. Also, the standard deviation of a $U[0,1]$ random variable is $1/\sqrt{12}$, so the denominator $\sigma\sqrt{n}$ conveniently becomes $1$ for $n = 12$. It might seem a bit wasteful to use 12 calls of rand in order to produce one draw from the unit normal. If you try it out, you will see, however, that it is of comparable speed to the Box-Muller method described above; while Box-Muller uses the computationally expensive $\cos$, $\sin$, $\sqrt{\cdot}$ and $\log$, this method uses only addition and subtraction. The final verdict of the comparison of the two methods will depend on the architecture you are running the code on, and on the quality of the implementation of the functions $\cos$, $\sin$, etc.
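Steps (a) and (b) above are one line of code each. A sketch (ours, not the notes'):

```python
import random

def clt_normal(rng=random.random):
    """Sum of 12 independent uniforms minus 6: approximately N(0, 1).
    Uses only additions; no cos, sin, sqrt or log."""
    return sum(rng() for _ in range(12)) - 6.0

zs = [clt_normal() for _ in range(100_000)]
mean = sum(zs) / len(zs)
var = sum(z * z for z in zs) / len(zs)
```

Unlike a true normal, every draw is confined to $[-6, 6]$, but the first two moments match: the mean of the sum of 12 uniforms is $6$ and its variance is $12 \cdot \frac{1}{12} = 1$.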
5. Other Methods. There are a number of other methods for transforming the output of rand into random numbers with a prescribed density (the rejection method, the Poisson trick, ...). You can read about them in the free online copy of Numerical Recipes in C at
http://www.library.cornell.edu/nr/bookcpdf.html
3.4 Monte Carlo Integration

Having described some of the procedures and methods used for simulation of various random objects (variables, vectors, processes), we turn to an application in probability and numerical mathematics. We start off with the following version of the Law of Large Numbers, which constitutes the theory behind most of the Monte Carlo applications.

Theorem 3.10. (Law of Large Numbers) Let $X_1, X_2, \dots$ be a sequence of independent, identically distributed random variables, and let $g : \mathbb{R} \to \mathbb{R}$ be a function such that $\mu = \mathbb{E}[g(X_1)]\ (= \mathbb{E}[g(X_2)] = \dots)$ exists. Then

$\frac{g(X_1) + g(X_2) + \dots + g(X_n)}{n} \to \mu = \int_{-\infty}^{\infty} g(x) f_{X_1}(x)\, dx, \quad \text{as } n \to \infty.$

The key idea of Monte Carlo integration is the following. Suppose that the quantity $y$ we are interested in can be written as

$y = \int_{-\infty}^{\infty} g(x) f_X(x)\, dx$

for some random variable $X$ with density $f_X$ and some function $g$, and that $x_1, x_2, \dots$ are random numbers distributed according to the distribution with density $f_X$. Then the average

$\frac{1}{n} \left( g(x_1) + g(x_2) + \dots + g(x_n) \right)$

will approximate $y$.

It can be shown that the accuracy of the approximation behaves like $1/\sqrt{n}$, so that you have to quadruple the number of simulations if you want to double the precision of your approximation.
Example 3.11.

1. (numerical integration) Let $g$ be a function on $[0,1]$. To approximate the integral $\int_0^1 g(x)\, dx$ we can take a sequence of $n$ $U[0,1]$ random numbers $x_1, x_2, \dots$ and use

$\int_0^1 g(x)\, dx \approx \frac{g(x_1) + g(x_2) + \dots + g(x_n)}{n},$

because the density of $X \sim U[0,1]$ is given by

$f_X(x) = \begin{cases} 1, & 0 \le x \le 1 \\ 0, & \text{otherwise.} \end{cases}$

2. (estimating probabilities) Let $Y$ be a random variable with the density function $f_Y$. If we are interested in the probability $\mathbb{P}[Y \in [a,b]]$ for some $a < b$, we simulate $n$ draws $y_1, y_2, \dots, y_n$ from the distribution $F_Y$, and the required approximation is

$\mathbb{P}[Y \in [a,b]] \approx \frac{\text{number of } y_k\text{'s falling in the interval } [a,b]}{n}.$

One of the nicest things about the Monte Carlo method is that even if the density of the random variable is not available, but you can simulate draws from it, you can still perform the calculation above and get the desired approximation. Of course, everything works in the same way for probabilities involving random vectors in any number of dimensions.

3. (approximating $\pi$) We can devise a simple procedure for approximating $\pi \approx 3.141592$ by using the Monte Carlo method. All we have to do is remember that $\pi$ is the area of the unit disk. Therefore, $\pi/4$ equals the portion of the area of the unit disk lying in the positive quadrant, and we can write

$\frac{\pi}{4} = \int_0^1 \int_0^1 g(x,y)\, dx\, dy, \quad \text{where } g(x,y) = \begin{cases} 1, & x^2 + y^2 \le 1 \\ 0, & \text{otherwise.} \end{cases}$

So, simulate $n$ pairs $(x_i, y_i)$, $i = 1, \dots, n$, of uniformly distributed random numbers and count how many of them fall in the upper quarter of the unit circle, i.e., how many satisfy $x_i^2 + y_i^2 \le 1$, and divide by $n$. Multiply your result by $4$, and you should be close to $\pi$. How close? Well, that is another story ... Experiment!
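You can run the experiment from part 3 of Example 3.11 yourself; here is a minimal sketch (function name is ours):

```python
import random

def estimate_pi(n, rng=random.random):
    """Monte Carlo estimate of pi: the fraction of uniform points in
    [0,1]^2 landing inside the quarter-disk x^2 + y^2 <= 1, times 4."""
    hits = sum(1 for _ in range(n) if rng() ** 2 + rng() ** 2 <= 1.0)
    return 4.0 * hits / n

pi_hat = estimate_pi(1_000_000)
```

With a million pairs, the $1/\sqrt{n}$ error estimate predicts an accuracy of a few thousandths.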
Chapter 4
The Simple Random Walk
4.1 Construction
We have defined and constructed a random walk $\{X_n\}_{n \in \mathbb{N}_0}$ in the previous lecture. Our next task is to study some of its mathematical properties. Let us give a definition of a slightly more general creature.

Definition 4.1. A sequence $\{X_n\}_{n \in \mathbb{N}_0}$ of random variables is called a simple random walk (with parameter $p \in (0,1)$) if

1. $X_0 = 0$,

2. $X_{n+1} - X_n$ is independent of $(X_0, X_1, \dots, X_n)$ for all $n \in \mathbb{N}_0$, and

3. the random variable $X_{n+1} - X_n$ has the following distribution:

$\begin{pmatrix} -1 & 1 \\ q & p \end{pmatrix},$

where, as usual, $q = 1 - p$.

If $p = \frac{1}{2}$, the random walk is called symmetric.

The adjective "simple" comes from the fact that the size of each step is fixed (equal to $1$) and it is only the direction that is random. One can study more general random walks where each step comes from an arbitrary prescribed probability distribution.

Proposition 4.2. Let $\{X_n\}_{n \in \mathbb{N}_0}$ be a simple random walk with parameter $p$. The distribution of the random variable $X_n$ is discrete with support $\{-n, -n+2, \dots, n-2, n\}$, and probabilities

$p_l = \mathbb{P}[X_n = l] = \binom{n}{\frac{l+n}{2}} p^{(n+l)/2} q^{(n-l)/2}, \quad l = -n, -n+2, \dots, n-2, n. \quad (4.1)$

Proof. $X_n$ is composed of $n$ independent steps $\xi_k = X_k - X_{k-1}$, $k = 1, \dots, n$, each of which goes either up or down. In order to reach level $l$ in those $n$ steps, the number $u$ of up-steps and the number $d$ of down-steps must satisfy $u - d = l$ (and $u + d = n$). Therefore, $u = \frac{n+l}{2}$ and $d = \frac{n-l}{2}$.
CHAPTER 4. THE SIMPLE RANDOM WALK
The number of ways we can choose these $u$ up-steps from the total of $n$ is $\binom{n}{\frac{n+l}{2}}$, which, together with the fact that the probability of any trajectory with exactly $u$ up-steps is $p^u q^{n-u}$, gives the probability (4.1) above. Equivalently, we could have noticed that the random variable $\frac{n + X_n}{2}$ has the binomial $b(n,p)$-distribution.

The proof of Proposition 4.2 uses the simple idea already hinted at in the previous lecture: view the random walk as a random trajectory in some space of trajectories, and compute the required probability by simply counting the number of trajectories in the subset (event) you are interested in and adding them all together, weighted by their probabilities. To prepare the ground for the future results, let $C$ be the set of all possible trajectories:

$C = \{(x_0, x_1, \dots, x_n) : x_0 = 0,\ x_{k+1} - x_k = \pm 1,\ k \le n-1\}.$

You can think of the first $n$ steps of a random walk simply as a probability distribution on the state-space $C$.

The figure on the right shows the superposition of all trajectories in $C$ for $n = 4$ and a particular one - $(0, 1, 0, 1, 2)$ - in red.

4.2 The maximum
Now we know how to compute the probabilities related to the position of the random walk $\{X_n\}_{n \in \mathbb{N}_0}$ at a fixed future time $n$. A mathematically more interesting question can be posed about the maximum of the random walk on $\{0, 1, \dots, n\}$. A nice expression for this probability is available for the case of symmetric simple random walks.

Proposition 4.3. Let $\{X_n\}_{n \in \mathbb{N}_0}$ be a symmetric simple random walk, suppose $n \ge 2$, and let $M_n = \max(X_0, \dots, X_n)$ be the maximal value of $\{X_n\}_{n \in \mathbb{N}_0}$ on the interval $0, 1, \dots, n$. The support of $M_n$ is $\{0, 1, \dots, n\}$ and its probability mass function is given by

$p_l = \mathbb{P}[M_n = l] = \binom{n}{\lfloor \frac{n+l+1}{2} \rfloor} 2^{-n}, \quad l = 0, \dots, n.$

Proof. Let us first pick a level $l \in \{0, 1, \dots, n\}$ and compute the auxiliary probability $q_l = \mathbb{P}[M_n \ge l]$ by counting the number of trajectories whose maximal level reached is at least $l$. Indeed, the symmetry assumption ensures that all trajectories are equally likely. More precisely, let $A_l \subseteq C$ be given by

$A_l = \{(x_0, x_1, \dots, x_n) \in C : \max_{k=0,\dots,n} x_k \ge l\} = \{(x_0, x_1, \dots, x_n) \in C : x_k \ge l \text{ for at least one } k \in \{0, \dots, n\}\}.$

Then $\mathbb{P}[M_n \ge l] = 2^{-n}\, \# A_l$, where $\# A$ denotes the number of elements in the set $A$. When $l = 0$, we clearly have $\mathbb{P}[M_n \ge 0] = 1$, since $X_0 = 0$.

To count the number of elements in $A_l$, we use the following clever observation (known as the reflection principle):
Claim 4.4. For $l \in \mathbb{N}$, we have

$\# A_l = 2\, \#\{(x_0, x_1, \dots, x_n) : x_n > l\} + \#\{(x_0, x_1, \dots, x_n) : x_n = l\}. \quad (4.2)$
Proof of Claim 4.4. We start by defining a bijective transformation which maps trajectories into trajectories. For a trajectory $(x_0, x_1, \dots, x_n) \in A_l$, let $k(l) = k(l; (x_0, x_1, \dots, x_n))$ be the smallest value of the index $k$ such that $x_k \ge l$. In the stochastic-process-theory parlance, $k(l)$ is the first hitting time of the set $\{l, l+1, \dots\}$. We know that $k(l)$ is well-defined (since we are only applying it to trajectories in $A_l$) and that it takes values in the set $\{1, \dots, n\}$. With $k(l)$ at our disposal, let $(y_0, y_1, \dots, y_n) \in C$ be the trajectory obtained from $(x_0, x_1, \dots, x_n)$ by the following procedure:

1. do nothing until you get to $k(l)$: $y_0 = x_0$, $y_1 = x_1$, ..., $y_{k(l)} = x_{k(l)}$;

2. use the flipped values for the coin-tosses from $k(l)$ onwards: $y_{k(l)+1} - y_{k(l)} = -(x_{k(l)+1} - x_{k(l)})$, $y_{k(l)+2} - y_{k(l)+1} = -(x_{k(l)+2} - x_{k(l)+1})$, ..., $y_n - y_{n-1} = -(x_n - x_{n-1})$.

The picture on the right shows two trajectories: a blue one and its reflection in red, with $n = 15$, $l = 4$ and $k(l) = 8$. Graphically, $(y_0, \dots, y_n)$ looks like $(x_0, \dots, x_n)$ until it hits the level $l$, and then follows its reflection around the level $l$, so that $y_k - l = l - x_k$ for $k \ge k(l)$. If $k(l) = n$, then $(x_0, x_1, \dots, x_n) = (y_0, y_1, \dots, y_n)$. It is clear that $(y_0, y_1, \dots, y_n)$ is in $C$. Let us denote this transformation by $\Phi : A_l \to C$, $\Phi(x_0, x_1, \dots, x_n) = (y_0, y_1, \dots, y_n)$, and call it the reflection map.

The first important property of the reflection map is that it is its own inverse: apply $\Phi$ to any $(y_0, y_1, \dots, y_n)$ in $A_l$, and you will get the original $(x_0, x_1, \dots, x_n)$. In other words, $\Phi \circ \Phi = \mathrm{Id}$, i.e., $\Phi$ is an involution. It follows immediately that $\Phi$ is a bijection from $A_l$ onto $A_l$.

To get to the second important property of $\Phi$, let us split the set $A_l$ into three parts according to the value of $x_n$:

1. $A_l^> = \{(x_0, x_1, \dots, x_n) \in A_l : x_n > l\}$,

2. $A_l^= = \{(x_0, x_1, \dots, x_n) \in A_l : x_n = l\}$, and

3. $A_l^< = \{(x_0, x_1, \dots, x_n) \in A_l : x_n < l\}$,

so that

$\Phi(A_l^>) = A_l^<, \quad \Phi(A_l^<) = A_l^>, \quad \text{and} \quad \Phi(A_l^=) = A_l^=.$

We should note that, in the definition of $A_l^>$ and $A_l^=$, the a priori stipulation that $(x_0, x_1, \dots, x_n) \in A_l$ is unnecessary. Indeed, if $x_n \ge l$, you must already be in $A_l$. Therefore, by the bijectivity of $\Phi$,
we have

$\# A_l^< = \# A_l^> = \#\{(x_0, x_1, \dots, x_n) : x_n > l\},$

and so

$\# A_l = 2\, \#\{(x_0, x_1, \dots, x_n) : x_n > l\} + \#\{(x_0, x_1, \dots, x_n) : x_n = l\},$

just as we claimed.

Now that we have (4.2), we can easily rewrite it as follows:

$\mathbb{P}[M_n \ge l] = \mathbb{P}[X_n = l] + 2 \sum_{j > l} \mathbb{P}[X_n = j] = \sum_{j > l} \mathbb{P}[X_n = j] + \sum_{j \ge l} \mathbb{P}[X_n = j].$

Finally, we subtract $\mathbb{P}[M_n \ge l+1]$ from $\mathbb{P}[M_n \ge l]$ to get the expression for $\mathbb{P}[M_n = l]$:

$\mathbb{P}[M_n = l] = \mathbb{P}[X_n = l+1] + \mathbb{P}[X_n = l].$

It remains to note that only one of the probabilities $\mathbb{P}[X_n = l+1]$ and $\mathbb{P}[X_n = l]$ is non-zero: the first one if $n$ and $l$ have different parity, and the second one otherwise. In either case the non-zero probability is given by

$\binom{n}{\lfloor \frac{n+l+1}{2} \rfloor} 2^{-n}.$
Let us use the reflection principle to solve a classical problem in combinatorics.
Example 4.5 (The Ballot Problem). Suppose that two candidates, Daisy and Oscar, are running for office, and $n \in \mathbb{N}$ voters cast their ballots. Votes are counted by the same official, one by one, until all $n$ of them have been processed (like in the old days). After each ballot is opened, the official records the number of votes each candidate has received so far. At the end, the official announces that Daisy has won by a margin of $m > 0$ votes, i.e., that Daisy got $(n+m)/2$ votes and Oscar the remaining $(n-m)/2$ votes. What is the probability that at no time during the counting has Oscar been in the lead?

We assume that the order in which the official counts the votes is completely independent of the actual votes, and that each voter chooses Daisy with probability $p \in (0,1)$ and Oscar with probability $q = 1 - p$. For $k \le n$, let $X_k$ be the number of votes received by Daisy minus the number of votes received by Oscar in the first $k$ ballots. When the $(k+1)$-st vote is counted, $X_k$ either increases by $1$ (if the vote was for Daisy), or decreases by $1$ otherwise. The votes are independent of each other and $X_0 = 0$, so $X_k$, $0 \le k \le n$, is (the beginning of) a simple random walk. The probability of an up-step is $p \in (0,1)$, so this random walk is not necessarily symmetric. The ballot problem can now be restated as follows:

What is the probability that $X_k \ge 0$ for all $k \in \{0, \dots, n\}$, given that $X_n = m$?

The first step towards understanding the solution is the realization that the exact value of $p$ does not matter. Indeed, we are interested in the conditional probability $\mathbb{P}[F \,|\, G] = \mathbb{P}[F \cap G] / \mathbb{P}[G]$, where $F$ denotes the family of all trajectories that always stay non-negative and $G$ the family of those that reach $m$ at time $n$. Each trajectory in $G$ has $(n+m)/2$ up-steps and $(n-m)/2$ down-steps, so its probability weight is always equal to $p^{(n+m)/2} q^{(n-m)/2}$. Therefore,

$\mathbb{P}[F \,|\, G] = \frac{\mathbb{P}[F \cap G]}{\mathbb{P}[G]} = \frac{\#(F \cap G)\, p^{(n+m)/2} q^{(n-m)/2}}{\# G\, p^{(n+m)/2} q^{(n-m)/2}} = \frac{\#(F \cap G)}{\# G}. \quad (4.3)$

We already know how to count the number of paths in $G$ - it is equal to $\binom{n}{(n+m)/2}$ - so all that remains to be done is to count the number of paths in $G \cap F$.

The paths in $G \cap F$ form the portion of all the paths in $G$ which don't hit the level $l = -1$, so that $\#(G \cap F) = \# G - \# H$, where $H$ is the set of all paths which finish at $m$, but cross (or, at least, touch) the level $l = -1$ in the process. Can we use the reflection principle to find $\# H$? Yes, we can. In fact, you can convince yourself that the reflection of any path in $H$ around the level $l = -1$ after its first hitting time of that level produces a path that starts at $0$ and ends at $-m-2$. Conversely, the same procedure applied to such a path yields a path in $H$. The number of paths from $0$ to $-m-2$ is easy to count - it is equal to $\binom{n}{(n+m)/2 + 1}$. Putting everything together, we get

$\mathbb{P}[F \,|\, G] = \frac{\binom{n}{k} - \binom{n}{k+1}}{\binom{n}{k}} = \frac{2k + 1 - n}{k + 1}, \quad \text{where } k = \frac{n+m}{2}.$

The last equality follows from the definition of the binomial coefficients, $\binom{n}{k} = \frac{n!}{k!(n-k)!}$.
The Ballot Problem has a long history (going back to at least 1887) and has spurred a lot of research in combinatorics and probability. In fact, people still write research papers on some of its generalizations. When posed outside the context of probability, it is often phrased as "in how many ways can the counting be performed ..." (the difference being only in the normalizing factor $\binom{n}{k}$ appearing in (4.3) above). The special case $m = 0$ seems to be even more popular: the number of $2n$-step paths from $0$ to $0$ never going below zero is called the Catalan number and equals

$C_n = \frac{1}{n+1} \binom{2n}{n}.$

Can you derive this expression from (4.3)? If you want to test your understanding a bit further, here is an identity (called Segner's recurrence formula) satisfied by the Catalan numbers:

$C_n = \sum_{i=1}^{n} C_{i-1} C_{n-i}, \quad n \in \mathbb{N}.$

Can you prove it using the Ballot-problem interpretation?
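Both the closed form and Segner's recurrence are easy to compute, so you can check your derivations numerically. A sketch (ours, assuming Python's standard `math.comb`):

```python
from math import comb

def catalan(n):
    """Closed form: C_n = (1/(n+1)) * C(2n, n)."""
    return comb(2 * n, n) // (n + 1)

def catalan_segner(n):
    """Segner's recurrence: C_k = sum_{i=1}^{k} C_{i-1} C_{k-i}, C_0 = 1."""
    c = [1]
    for k in range(1, n + 1):
        c.append(sum(c[i - 1] * c[k - i] for i in range(1, k + 1)))
    return c[n]
```

The two functions agree for every $n$, giving the familiar sequence $1, 1, 2, 5, 14, 42, \dots$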
Chapter 5
Generating functions
The path-counting method used in the previous lecture only works for computations related to the first $n$ steps of the random walk, where $n$ is given in advance. We will see later that most of the interesting questions do not fall into this category. For example, the distribution of the time it takes for the random walk to hit the level $l \neq 0$ is like that. There is no way to give an a-priori bound on the number of steps it will take to get to $l$ (in fact, the expectation of this random variable is $+\infty$). To deal with a wider class of properties of random walks (and other processes), we need to develop some new mathematical tools.

5.1 Definition and first properties

The distribution of an $\mathbb{N}_0$-valued random variable $X$ is completely determined by the sequence $\{p_n\}_{n \in \mathbb{N}_0}$ of numbers in $[0,1]$ given by

$p_n = \mathbb{P}[X = n], \quad n \in \mathbb{N}_0.$

As a sequence of real numbers, $\{p_n\}_{n \in \mathbb{N}_0}$ can be used to construct a power series:

$P_X(s) = \sum_{k=0}^{\infty} p_k s^k. \quad (5.1)$

It follows from the fact that $\sum_n |p_n| \le 1$ that the radius of convergence¹ of $\{p_n\}_{n \in \mathbb{N}_0}$ is at least equal to $1$. Therefore, $P_X$ is well defined for $s \in [-1,1]$, and, perhaps, elsewhere, too.

Definition 5.1. The function $P_X$ given by $P_X(s) = \sum_{k=0}^{\infty} p_k s^k$ is called the generating function of the random variable $X$, or, more precisely, of its pmf $\{p_n\}_{n \in \mathbb{N}_0}$.

Before we proceed, let us find an expression for the generating functions of some of the popular $\mathbb{N}_0$-valued random variables.

Example 5.2.

¹ Remember that the radius of convergence of a power series $\sum_{k=0}^{\infty} a_k x^k$ is the largest number $R \in [0, \infty]$ such that $\sum_{k=0}^{\infty} a_k x^k$ converges absolutely whenever $|x| < R$.
CHAPTER 5. GENERATING FUNCTIONS
1. Bernoulli - $b(p)$. Here $p_0 = q$, $p_1 = p$, and $p_n = 0$ for $n \ge 2$. Therefore,

$P_X(s) = ps + q.$

2. Binomial - $b(n,p)$. Since $p_k = \binom{n}{k} p^k q^{n-k}$, $k = 0, \dots, n$, we have

$P_X(s) = \sum_{k=0}^{n} \binom{n}{k} p^k q^{n-k} s^k = (ps + q)^n,$

by the binomial theorem.

3. Geometric - $g(p)$. For $k \in \mathbb{N}_0$, $p_k = q^k p$, so that

$P_X(s) = \sum_{k=0}^{\infty} q^k s^k p = p \sum_{k=0}^{\infty} (qs)^k = \frac{p}{1 - qs}.$

4. Poisson - $p(\lambda)$. Given that $p_k = e^{-\lambda} \frac{\lambda^k}{k!}$, $k \in \mathbb{N}_0$, we have

$P_X(s) = \sum_{k=0}^{\infty} e^{-\lambda} \frac{\lambda^k}{k!} s^k = e^{-\lambda} \sum_{k=0}^{\infty} \frac{(\lambda s)^k}{k!} = e^{-\lambda} e^{\lambda s} = e^{\lambda(s-1)}.$
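These closed forms are easy to sanity-check numerically: truncate the power series $\sum_k p_k s^k$ and compare with the formula. A sketch (ours, not part of the notes), for the geometric and Poisson cases:

```python
import math

def pgf_truncated(pmf, s, terms=100):
    """Approximate P_X(s) = sum_k p_k s^k by truncating the power series."""
    return sum(pmf(k) * s ** k for k in range(terms))

p, lam, s = 0.3, 1.5, 0.7
q = 1 - p

# geometric g(p): p_k = q^k p, closed form p / (1 - q s)
geom = pgf_truncated(lambda k: q ** k * p, s)

# Poisson p(lambda): p_k = e^{-lambda} lambda^k / k!, closed form e^{lambda(s-1)}
pois = pgf_truncated(lambda k: math.exp(-lam) * lam ** k / math.factorial(k), s)
```

Since both tails decay quickly for $|s| < 1$, one hundred terms already agree with the closed forms to machine precision.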
Some of the most useful analytic properties of $P_X$ are listed in the following proposition.

Proposition 5.3. Let $X$ be an $\mathbb{N}_0$-valued random variable, let $\{p_n\}_{n \in \mathbb{N}_0}$ be its pmf, and let $P_X$ be its generating function. Then

1. $P_X(s) = \mathbb{E}[s^X]$, $s \in [-1,1]$,

2. $P_X(s)$ is convex and non-decreasing, with $0 \le P_X(s) \le 1$ for $s \in [0,1]$,

3. $P_X(s)$ is infinitely differentiable on $(-1,1)$, with

$\frac{d^n}{ds^n} P_X(s) = \sum_{k=n}^{\infty} k(k-1) \cdots (k-n+1)\, s^{k-n} p_k, \quad n \in \mathbb{N}. \quad (5.2)$

In particular, $p_n = \frac{1}{n!} \left. \frac{d^n}{ds^n} P_X(s) \right|_{s=0}$, and so $s \mapsto P_X(s)$ uniquely determines the sequence $\{p_n\}_{n \in \mathbb{N}_0}$.

Proof. Statement (1) follows directly from the formula $\mathbb{E}[g(X)] = \sum_{k=0}^{\infty} g(k) p_k$, applied to $g(x) = s^x$. As far as (3) is concerned, we only note that the expression (5.2) is exactly what you would get if you differentiated the expression (5.1) term by term. The rigorous proof of the fact that this is allowed is beyond the scope of these notes. With (3) at our disposal, (2) follows from the fact that the first two derivatives of the function $P_X$ are non-negative and that $P_X(1) = 1$.

Remark 5.4.

1. If you know about moment-generating functions, you will notice that $P_X(s) = M_X(\log(s))$ for $s \in (0,1)$, where $M_X(\lambda) = \mathbb{E}[\exp(\lambda X)]$ is the moment-generating function of $X$.
2. Generating functions can be used with sequences $\{a_n\}_{n \in \mathbb{N}_0}$ which are not necessarily pmfs of random variables. The method is useful for any sequence $\{a_n\}_{n \in \mathbb{N}_0}$ such that the power series $\sum_{k=0}^{\infty} a_k s^k$ has a positive (non-zero) radius of convergence.

3. The name "generating function" comes from the last part of property (3). The knowledge of $P_X$ implies the knowledge of the whole sequence $\{p_n\}_{n \in \mathbb{N}_0}$. Put differently, $P_X$ generates the whole distribution of $X$.

Remark 5.5. Note that the true radius of convergence varies from distribution to distribution. It is infinite in (1), (2) and (4), and equal to $1/q > 1$ in (3), in Example 5.2. For the distribution with pmf given by $p_k = C (k+1)^{-2}$, where $C = \left( \sum_{k=0}^{\infty} \frac{1}{(k+1)^2} \right)^{-1}$, the radius of convergence is exactly equal to $1$. Can you see why?
5.2 Convolution and moments

The true power of generating functions comes from the fact that they behave very well under the usual operations in probability.

Definition 5.6. Let $\{p_n\}_{n\in\mathbb{N}_0}$ and $\{q_n\}_{n\in\mathbb{N}_0}$ be two probability-mass functions. The convolution $p * q$ of $\{p_n\}_{n\in\mathbb{N}_0}$ and $\{q_n\}_{n\in\mathbb{N}_0}$ is the sequence $\{r_n\}_{n\in\mathbb{N}_0}$, where
\[ r_n = \sum_{k=0}^{n} p_k q_{n-k}, \quad n \in \mathbb{N}_0. \]
This abstractly-defined operation will become much clearer once we prove the following proposition:

Proposition 5.7. Let $X, Y$ be two independent $\mathbb{N}_0$-valued random variables with pmfs $\{p_n\}_{n\in\mathbb{N}_0}$ and $\{q_n\}_{n\in\mathbb{N}_0}$. Then the sum $Z = X + Y$ is also $\mathbb{N}_0$-valued and its pmf is the convolution of $\{p_n\}_{n\in\mathbb{N}_0}$ and $\{q_n\}_{n\in\mathbb{N}_0}$ in the sense of Definition 5.6.

Proof. Clearly, $Z$ is $\mathbb{N}_0$-valued. To obtain an expression for its pmf, we use the law of total probability:
\[ \mathbb{P}[Z=n] = \sum_{k=0}^{n} \mathbb{P}[X=k]\,\mathbb{P}[Z=n\,|\,X=k]. \]
However, $\mathbb{P}[Z=n|X=k] = \mathbb{P}[X+Y=n|X=k] = \mathbb{P}[Y=n-k|X=k] = \mathbb{P}[Y=n-k]$, where the last equality follows from the independence of $X$ and $Y$. Therefore,
\[ \mathbb{P}[Z=n] = \sum_{k=0}^{n} \mathbb{P}[X=k]\,\mathbb{P}[Y=n-k] = \sum_{k=0}^{n} p_k q_{n-k}. \]

Corollary 5.8. Let $\{p_n\}_{n\in\mathbb{N}_0}$ and $\{q_n\}_{n\in\mathbb{N}_0}$ be any two pmfs.

1. Convolution is commutative, i.e., $p * q = q * p$.
2. The convolution of two pmfs is a pmf, i.e., $r_n \ge 0$ for all $n\in\mathbb{N}_0$ and $\sum_{k=0}^{\infty} r_k = 1$, for $r = p * q$.
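For a concrete feel for Definition 5.6 and Proposition 5.7, here is a small sketch (the two pmfs are made up for illustration) that convolves two pmfs directly and checks the result against the empirical distribution of $X + Y$:

```python
import random

def convolve(p, q):
    """(p * q)_n = sum_k p_k q_{n-k} for two finitely-supported pmfs."""
    r = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] += pi * qj
    return r

p = [0.2, 0.5, 0.3]   # pmf of X on {0, 1, 2}
q = [0.6, 0.3, 0.1]   # pmf of Y on {0, 1, 2}
r = convolve(p, q)    # pmf of X + Y on {0, ..., 4}

# Monte Carlo check that r really is the pmf of X + Y for independent X, Y
rng = random.Random(0)
TRIALS = 200_000
counts = [0] * len(r)
for _ in range(TRIALS):
    z = rng.choices(range(3), p)[0] + rng.choices(range(3), q)[0]
    counts[z] += 1
empirical = [c / TRIALS for c in counts]
```

Corollary 5.8(2) shows up as `sum(r) == 1` up to rounding.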
Corollary 5.9. Let $\{p_n\}_{n\in\mathbb{N}_0}$ and $\{q_n\}_{n\in\mathbb{N}_0}$ be any two pmfs, and let
\[ P(s) = \sum_{k=0}^{\infty} p_k s^k \quad \text{and} \quad Q(s) = \sum_{k=0}^{\infty} q_k s^k \]
be their generating functions. Then the generating function $R(s) = \sum_{k=0}^{\infty} r_k s^k$ of the convolution $r = p * q$ is given by
\[ R(s) = P(s)\,Q(s). \]
Equivalently, the generating function $P_{X+Y}$ of the sum of two independent $\mathbb{N}_0$-valued random variables is equal to the product
\[ P_{X+Y}(s) = P_X(s)\,P_Y(s) \]
of the generating functions $P_X$ and $P_Y$ of $X$ and $Y$.
Example 5.10.

1. The binomial $b(n,p)$ distribution is a sum of $n$ independent Bernoullis $b(p)$. Therefore, if we apply Corollary 5.9 $n$ times to the generating function $(q + ps)$ of the Bernoulli $b(p)$ distribution, we immediately get that the generating function of the binomial is $(q+ps)\cdots(q+ps) = (q+ps)^n$.

2. More generally, we can show that the sum of $m$ independent random variables with the $b(n,p)$ distribution has a binomial distribution $b(mn,p)$. If you try to sum binomials with different values of the parameter $p$, you will not get a binomial.

3. What is even more interesting, the following statement can be shown: suppose that the sum $Z$ of two independent $\mathbb{N}_0$-valued random variables $X$ and $Y$ is binomially distributed with parameters $n$ and $p$. Then both $X$ and $Y$ are binomial with parameters $n_X, p$ and $n_Y, p$, where $n_X + n_Y = n$. In other words, the only way to get a binomial as a sum of independent random variables is the trivial one.

Another useful thing about generating functions is that they make the computation of moments easier.
Proposition 5.11. Let $\{p_n\}_{n\in\mathbb{N}_0}$ be the pmf of an $\mathbb{N}_0$-valued random variable $X$ and let $P(s)$ be its generating function. For $n \in \mathbb{N}$, the following two statements are equivalent:

1. $\mathbb{E}[X^n] < \infty$,
2. $\frac{d^n P(s)}{ds^n}\big|_{s=1}$ exists (in the sense that the left limit $\lim_{s \nearrow 1} \frac{d^n P(s)}{ds^n}$ exists).

In either case, we have
\[ \mathbb{E}[X(X-1)(X-2)\dots(X-n+1)] = \frac{d^n}{ds^n} P(s) \Big|_{s=1}. \]

The quantities
\[ \mathbb{E}[X],\ \mathbb{E}[X(X-1)],\ \mathbb{E}[X(X-1)(X-2)],\ \dots \]
are called the factorial moments of the random variable $X$. You can get the classical moments from the factorial moments by solving a system of linear equations. It is very simple for the first few:
\begin{align*}
\mathbb{E}[X] &= \mathbb{E}[X],\\
\mathbb{E}[X^2] &= \mathbb{E}[X(X-1)] + \mathbb{E}[X],\\
\mathbb{E}[X^3] &= \mathbb{E}[X(X-1)(X-2)] + 3\,\mathbb{E}[X(X-1)] + \mathbb{E}[X],\ \dots
\end{align*}
A useful identity which follows directly from the above results is the following:
\[ \operatorname{Var}[X] = P''(1) + P'(1) - (P'(1))^2, \]
and it is valid if the first two derivatives of $P$ at $1$ exist.
Example 5.12. Let $X$ be a Poisson random variable with parameter $\lambda$. Its generating function is given by
\[ P_X(s) = e^{\lambda(s-1)}. \]
Therefore, $\frac{d^n}{ds^n} P_X(1) = \lambda^n$, and so the sequence $(\mathbb{E}[X], \mathbb{E}[X(X-1)], \mathbb{E}[X(X-1)(X-2)], \dots)$ of factorial moments of $X$ is just $(\lambda, \lambda^2, \lambda^3, \dots)$. It follows that
\begin{align*}
\mathbb{E}[X] &= \lambda,\\
\mathbb{E}[X^2] &= \lambda^2 + \lambda, \quad \operatorname{Var}[X] = \lambda,\\
\mathbb{E}[X^3] &= \lambda^3 + 3\lambda^2 + \lambda,\ \dots
\end{align*}
5.3 Random sums and Wald's identity

Our next application of generating functions in the theory of stochastic processes deals with so-called random sums. Let $\{\xi_n\}_{n\in\mathbb{N}}$ be a sequence of random variables, and let $N$ be a random time (a random time is simply an $\mathbb{N}_0 \cup \{+\infty\}$-valued random variable). We can define the random variable
\[ Y = \sum_{k=0}^{N} \xi_k \]
by
\[ Y(\omega) = \begin{cases} 0, & N(\omega) = 0,\\ \sum_{k=1}^{N(\omega)} \xi_k(\omega), & N(\omega) \ge 1, \end{cases} \qquad \text{for } \omega \in \Omega. \]
More generally, for an arbitrary stochastic process $\{X_n\}_{n\in\mathbb{N}_0}$ and a random time $N$ (with $\mathbb{P}[N = +\infty] = 0$), we define the random variable $X_N$ by $X_N(\omega) = X_{N(\omega)}(\omega)$, for $\omega \in \Omega$. When $N$ is a constant ($N = n$), then $X_N$ is simply equal to $X_n$. In general, think of $X_N$ as a value of the stochastic process $X$ taken at a time which is itself random. If $X_n = \sum_{k=1}^{n} \xi_k$, then $X_N = \sum_{k=1}^{N} \xi_k$.

Example 5.13. Let $\{\xi_n\}_{n\in\mathbb{N}}$ be the increments of a symmetric simple random walk (coin-tosses), and let $N$ have the following distribution:

   N    |  0     1     2
 prob.  | 1/3   1/3   1/3
which is independent of $\{\xi_n\}_{n\in\mathbb{N}}$ (it is very important to specify the dependence structure between $N$ and $\{\xi_n\}_{n\in\mathbb{N}}$ in this setting!). Let us compute the distribution of $Y = \sum_{k=0}^{N} \xi_k$ in this case. This is where we, typically, use the formula of total probability:
\begin{align*}
\mathbb{P}[Y=m] &= \mathbb{P}[Y=m|N=0]\,\mathbb{P}[N=0] + \mathbb{P}[Y=m|N=1]\,\mathbb{P}[N=1] + \mathbb{P}[Y=m|N=2]\,\mathbb{P}[N=2]\\
&= \mathbb{P}\Big[\sum_{k=0}^{N} \xi_k = m \,\Big|\, N=0\Big]\,\mathbb{P}[N=0] + \mathbb{P}\Big[\sum_{k=0}^{N} \xi_k = m \,\Big|\, N=1\Big]\,\mathbb{P}[N=1]\\
&\quad + \mathbb{P}\Big[\sum_{k=0}^{N} \xi_k = m \,\Big|\, N=2\Big]\,\mathbb{P}[N=2]\\
&= \tfrac{1}{3}\big(\mathbb{P}[0 = m] + \mathbb{P}[\xi_1 = m] + \mathbb{P}[\xi_1 + \xi_2 = m]\big).
\end{align*}
When $m = 1$ (for example), we get
\[ \mathbb{P}[Y = 1] = \frac{0 + \frac{1}{2} + 0}{3} = 1/6. \]
Perform the computation for some other values of $m$ for yourself.
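A direct simulation of Example 5.13 (a sketch; the sample size is arbitrary) confirms the value $\mathbb{P}[Y=1] = 1/6$:

```python
import random

rng = random.Random(42)
TRIALS = 300_000

def sample_Y():
    """Y = xi_1 + ... + xi_N with N uniform on {0, 1, 2},
    drawn independently of the +-1 coin-toss increments."""
    n = rng.randrange(3)
    return sum(rng.choice((-1, 1)) for _ in range(n))

freq = sum(sample_Y() == 1 for _ in range(TRIALS)) / TRIALS
```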
What happens when $N$ and $\{\xi_n\}_{n\in\mathbb{N}}$ are dependent? This will usually be the case in practice, as the value of the time $N$ when we stop adding increments will typically depend on the behaviour of the sum itself.

Example 5.14. Let $\{\xi_n\}_{n\in\mathbb{N}}$ be as above - we can think of a situation where a gambler repeatedly plays the same game in which a coin is tossed and the gambler wins a dollar if the outcome is heads and loses a dollar otherwise. A smart gambler enters the game and decides on the following tactic: "Let's see how the first game goes. If I lose, I'll play another 2 games and hopefully cover my losses, and if I win, I'll quit then and there." The described strategy amounts to the choice of the random time $N$ as follows:
\[ N(\omega) = \begin{cases} 1, & \xi_1 = 1,\\ 3, & \xi_1 = -1. \end{cases} \]
Then
\[ Y(\omega) = \begin{cases} 1, & \xi_1 = 1,\\ \xi_1 + \xi_2 + \xi_3, & \xi_1 = -1, \end{cases} \]
so that
\[ \mathbb{P}[Y = 1] = \mathbb{P}[\xi_1 = 1] + \mathbb{P}[\xi_2 + \xi_3 = 2]\,\mathbb{P}[\xi_1 = -1] = \tfrac{1}{2}\big(1 + \tfrac{1}{4}\big) = \tfrac{5}{8}. \]
Similarly, we get $\mathbb{P}[Y=-1] = \tfrac14$ and $\mathbb{P}[Y=-3] = \tfrac18$. The expectation $\mathbb{E}[Y]$ is equal to $1 \cdot \tfrac58 + (-1)\cdot\tfrac14 + (-3)\cdot\tfrac18 = 0$. This is not an accident. One of the first powerful results of the beautiful martingale theory states that no matter how smart a strategy you employ, you cannot beat a fair gamble.
We will return to the general (non-independent) case in the next lecture. Let us use generating functions to give a full description of the distribution of $Y = \sum_{k=0}^{N} \xi_k$ in this case.

Proposition 5.15. Let $\{\xi_n\}_{n\in\mathbb{N}}$ be a sequence of independent $\mathbb{N}_0$-valued random variables, all of which share the same distribution with pmf $\{p_n\}_{n\in\mathbb{N}_0}$ and generating function $P_\xi(s)$. Let $N$ be a random time independent of $\{\xi_n\}_{n\in\mathbb{N}}$. Then the generating function $P_Y$ of the random sum $Y = \sum_{k=0}^{N} \xi_k$ is given by
\[ P_Y(s) = P_N(P_\xi(s)). \]

Proof. We use the idea from Example 5.13 and condition on the possible values of $N$. We also use the following fact (Tonelli's theorem) without proof:
\[ \text{If } a_{ik} \ge 0 \text{ for all } i,k, \text{ then } \sum_{k=0}^{\infty} \sum_{i=0}^{\infty} a_{ik} = \sum_{i=0}^{\infty} \sum_{k=0}^{\infty} a_{ik}. \tag{5.3} \]
\begin{align*}
P_Y(s) &= \sum_{k=0}^{\infty} s^k\, \mathbb{P}[Y=k]\\
&= \sum_{k=0}^{\infty} s^k \sum_{i=0}^{\infty} \mathbb{P}[Y=k|N=i]\,\mathbb{P}[N=i]\\
&= \sum_{k=0}^{\infty} s^k \sum_{i=0}^{\infty} \mathbb{P}\Big[\sum_{j=0}^{i} \xi_j = k\Big]\,\mathbb{P}[N=i] \quad \text{(by independence)}\\
&= \sum_{i=0}^{\infty} \sum_{k=0}^{\infty} s^k\, \mathbb{P}\Big[\sum_{j=0}^{i} \xi_j = k\Big]\,\mathbb{P}[N=i] \quad \text{(by Tonelli, (5.3))}\\
&= \sum_{i=0}^{\infty} \mathbb{P}[N=i] \sum_{k=0}^{\infty} s^k\, \mathbb{P}\Big[\sum_{j=0}^{i} \xi_j = k\Big].
\end{align*}
By (iteration of) Corollary 5.9, we know that the generating function of the random variable $\sum_{j=0}^{i} \xi_j$ - which is exactly what the second sum above represents - is $(P_\xi(s))^i$. Therefore, the chain of equalities above can be continued as
\[ = \sum_{i=0}^{\infty} \mathbb{P}[N=i]\,(P_\xi(s))^i = P_N(P_\xi(s)). \]

Corollary 5.16 (Wald's Identity I). Let $\{\xi_n\}_{n\in\mathbb{N}}$ and $N$ be as in Proposition 5.15. Suppose, also, that $\mathbb{E}[N] < \infty$ and $\mathbb{E}[\xi_1] < \infty$. Then
\[ \mathbb{E}\Big[\sum_{k=0}^{N} \xi_k\Big] = \mathbb{E}[N]\,\mathbb{E}[\xi_1]. \]
Proof. We just apply the composition rule for derivatives (the chain rule) to the equality $P_Y = P_N \circ P_\xi$ to get
\[ P_Y'(s) = P_N'(P_\xi(s))\, P_\xi'(s). \]
After we let $s \nearrow 1$, we get
\[ \mathbb{E}[Y] = P_Y'(1) = P_N'(P_\xi(1))\, P_\xi'(1) = P_N'(1)\, P_\xi'(1) = \mathbb{E}[N]\,\mathbb{E}[\xi_1]. \]

Example 5.17. Every time the Springfield Wildcats play in the Superbowl, their chance of winning is $p \in (0,1)$. The number of years between two Superbowls they get to play in has the Poisson distribution $p(\lambda)$, $\lambda > 0$. What is the expected number of years $Y$ between consecutive Superbowl wins?

Let $\{\xi_n\}_{n\in\mathbb{N}}$ be the sequence of independent $p(\lambda)$-random variables modeling the number of years between consecutive Superbowl appearances by the Wildcats. Moreover, let $N$ be a geometric $g(p)$ random variable with success probability $p$. Then
\[ Y = \sum_{k=0}^{N} \xi_k. \]
Indeed, every time the Wildcats lose the Superbowl, another $\xi$ years have to pass before they get another chance, and the whole thing stops when they finally win. To compute the expectation of $Y$ we use Corollary 5.16:
\[ \mathbb{E}[Y] = \mathbb{E}[N]\,\mathbb{E}[\xi_k] = \frac{1-p}{p}\,\lambda, \]
since $\mathbb{E}[N] = \frac{1-p}{p}$ for $N \sim g(p)$ and $\mathbb{E}[\xi_k] = \lambda$.
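Wald's Identity I, as used in Example 5.17, is easy to test by simulation; the sketch below (parameter values are arbitrary, and the Poisson sampler is Knuth's classic multiplication method) estimates $\mathbb{E}[Y]$ and compares it with $\frac{1-p}{p}\lambda$:

```python
import math
import random

rng = random.Random(7)
p, lam = 0.4, 2.5
TRIALS = 100_000

def poisson(lam):
    """Knuth's method: multiply uniforms until the product drops below e^{-lam}."""
    L = math.exp(-lam)
    k, prod = 0, rng.random()
    while prod > L:
        k += 1
        prod *= rng.random()
    return k

def geometric(p):
    """Number of failures before the first success (pmf q^k p on N_0)."""
    n = 0
    while rng.random() >= p:
        n += 1
    return n

total = 0
for _ in range(TRIALS):
    total += sum(poisson(lam) for _ in range(geometric(p)))
mean_Y = total / TRIALS
wald = (1 - p) / p * lam   # E[N] E[xi_1] = ((1 - p)/p) lam = 3.75 here
```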
Chapter 6
Random walks - advanced methods
6.1 Stopping times

The last application of generating functions dealt with sums evaluated between $0$ and some random time $N$. An especially interesting case occurs when the value of $N$ depends directly on the evolution of the underlying stochastic process. Even more important is the case where time's arrow is taken into account. If you think of $N$ as the time you stop adding new terms to the sum, it is usually the case that you are not allowed (or able) to see the values of the terms you would get if you continued adding. Think of an investor in the stock market. Her decision to stop and sell her stocks can depend only on the information available up to the moment of the decision. Otherwise, she would sell at the absolute maximum and buy at the absolute minimum, making tons of money in the process. Of course, this is not possible unless you are clairvoyant, so we mere mortals have to restrict our choices to so-called stopping times.
Definition 6.1. Let $\{X_n\}_{n\in\mathbb{N}_0}$ be a stochastic process. A random variable $T$ taking values in $\mathbb{N}_0 \cup \{+\infty\}$ is said to be a stopping time with respect to $\{X_n\}_{n\in\mathbb{N}_0}$ if for each $n\in\mathbb{N}_0$ there exists a function $G_n : \mathbb{R}^{n+1} \to \{0,1\}$ such that
\[ \mathbf{1}_{\{T=n\}} = G_n(X_0, X_1, \dots, X_n), \quad \text{for all } n\in\mathbb{N}_0. \]
The functions $G_n$ are called the decision functions, and should be thought of as a black box which takes the values of the process $\{X_n\}_{n\in\mathbb{N}_0}$ observed up to the present point and outputs either $0$ or $1$. The value $0$ means "keep going" and $1$ means "stop". The whole point is that the decision has to be based only on the available observations and not on the future ones.
Example 6.2.

1. The simplest examples of stopping times are (non-random) deterministic times. Just set $T = 5$ (or $T = 723$, or $T = n_0$ for any $n_0 \in \mathbb{N}_0 \cup \{+\infty\}$), no matter what the state of the world $\omega \in \Omega$ is. The family of decision rules is easy to construct:
\[ G_n(x_0, x_1, \dots, x_n) = \begin{cases} 1, & n = n_0,\\ 0, & n \ne n_0. \end{cases} \]
The decision functions $G_n$ do not depend on the values of $X_0, X_1, \dots, X_n$ at all. A gambler who stops gambling after 20 games, no matter what the winnings or losses are, uses such a rule.
2. Probably the best-known examples of stopping times are (first) hitting times. They can be defined for general stochastic processes, but we will stick to simple random walks for the purposes of this example. So, let $X_n = \sum_{k=0}^{n} \xi_k$ be a simple random walk, and let $T_l$ be the first time $X$ hits the level $l \in \mathbb{N}$. More precisely, we use the following slightly non-intuitive but mathematically correct definition:
\[ T_l = \min\{n \in \mathbb{N}_0 : X_n = l\}. \]
The set $\{n \in \mathbb{N}_0 : X_n = l\}$ is the collection of all time-points at which $X$ visits the level $l$. The earliest one - the minimum of that set - is the first hitting time of $l$. In states of the world $\omega \in \Omega$ in which the level $l$ just never gets reached, i.e., when $\{n \in \mathbb{N}_0 : X_n = l\}$ is an empty set, we set $T_l(\omega) = +\infty$. In order to show that $T_l$ is indeed a stopping time, we need to construct the decision functions $G_n$, $n \in \mathbb{N}_0$. Let us start with $n = 0$. We would have $T_l = 0$ in the (impossible) case $X_0 = l$, so we always have $G_0(X_0) = 0$. How about $n \in \mathbb{N}$? For the value of $T_l$ to be equal to exactly $n$, two things must happen:

(a) $X_n = l$ (the level $l$ must actually be hit at time $n$), and
(b) $X_{n-1} \ne l$, $X_{n-2} \ne l$, ..., $X_1 \ne l$, $X_0 \ne l$ (the level $l$ has not been hit before).

Therefore,
\[ G_n(x_0, x_1, \dots, x_n) = \begin{cases} 1, & x_0 \ne l,\ x_1 \ne l,\ \dots,\ x_{n-1} \ne l,\ x_n = l,\\ 0, & \text{otherwise}. \end{cases} \]
The hitting time $T_2$ of the level $l = 2$ for a particular trajectory of a symmetric simple random walk is depicted below.
[Figure: a trajectory of a symmetric simple random walk, with the first hitting time $T_2$ of the level $2$ and the time $T_M$ of the last visit to the running maximum marked.]

3. How about something that is not a stopping time? Let $n_0$ be an arbitrary time-horizon and let $T_M$ be the last time during $0, \dots, n_0$ that the random walk visits its maximum during $0, \dots, n_0$ (see the picture above). If you bought a stock at time $t = 0$, had to sell it some time before $n_0$, and had the ability to predict the future, this is one of the points at which you would choose to sell it. Of course, it is impossible to decide whether $T_M = n$, for some $n \in \{0, \dots, n_0 - 1\}$, without the knowledge of the values of the random walk after $n$. More precisely, let us sketch the proof of the fact that $T_M$ is not a stopping time. Suppose, to the contrary, that it is, and let $G_n$ be the family of decision functions. Consider the following two trajectories: $(0, 1, 2, 3, \dots, n-1, n)$ and $(0, 1, 2, 3, \dots, n-1, n-2)$. They differ only in the direction of the last step. They also differ in the fact that $T_M = n$ for the first one and $T_M = n-1$ for the second one. On the other hand, by the definition of the decision functions, we have
\[ \mathbf{1}_{\{T_M = n-1\}} = G_{n-1}(X_0, \dots, X_{n-1}). \]
The right-hand side is equal for both trajectories, while the left-hand side equals $0$ for the first one and $1$ for the second one. A contradiction.
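The decision-function formalism of Example 6.2(2) translates almost verbatim into code; a sketch (the function names are mine, not from the notes):

```python
def hitting_time_decision(l):
    """Decision functions G_n for the first hitting time T_l of level l:
    G_n(x_0, ..., x_n) = 1 exactly when the walk first reaches l at step n."""
    def G(*xs):
        return 1 if xs[-1] == l and all(x != l for x in xs[:-1]) else 0
    return G

def stopping_time(path, l):
    """Evaluate T_l on an observed path by querying the decision functions."""
    G = hitting_time_decision(l)
    for n in range(len(path)):
        if G(*path[: n + 1]) == 1:
            return n
    return float('inf')   # level never reached on this (finite) path

path = [0, 1, 0, 1, 2, 3, 2]   # a sample trajectory x_0, ..., x_6
first_hit = stopping_time(path, 2)   # the first visit of level 2 is at n = 4
```

Note that `G` only ever looks at the observations up to the current step, which is exactly the point of Definition 6.1.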
6.2 Wald's identity II

Having defined the notion of a stopping time, let us try to compute something with it. The random variables $\{\xi_n\}_{n\in\mathbb{N}}$ in the statement of the theorem below are only assumed to be independent of each other and identically distributed. To make things simpler, you can think of $\{\xi_n\}_{n\in\mathbb{N}}$ as increments of a simple random walk. Before we state the main result, here is an extremely useful identity:

Proposition 6.3. Let $N$ be an $\mathbb{N}_0$-valued random variable. Then
\[ \mathbb{E}[N] = \sum_{k=1}^{\infty} \mathbb{P}[N \ge k]. \]

Proof. Clearly, $\mathbb{P}[N \ge k] = \sum_{j \ge k} \mathbb{P}[N=j]$, so (note what happens to the indices when we switch the sums)
\[ \sum_{k=1}^{\infty} \mathbb{P}[N \ge k] = \sum_{k=1}^{\infty} \sum_{j \ge k} \mathbb{P}[N=j] = \sum_{j=1}^{\infty} \sum_{k=1}^{j} \mathbb{P}[N=j] = \sum_{j=1}^{\infty} j\,\mathbb{P}[N=j] = \mathbb{E}[N]. \]

Theorem 6.4 (Wald's Identity II). Let $\{\xi_n\}_{n\in\mathbb{N}}$ be a sequence of independent, identically distributed random variables with $\mathbb{E}[|\xi_1|] < \infty$. Set
\[ X_n = \sum_{k=1}^{n} \xi_k, \quad n \in \mathbb{N}_0. \]
If $T$ is an $\{X_n\}_{n\in\mathbb{N}_0}$-stopping time such that $\mathbb{E}[T] < \infty$, then
\[ \mathbb{E}[X_T] = \mathbb{E}[\xi_1]\,\mathbb{E}[T]. \]
Proof. Here is another way of writing the sum $\sum_{k=1}^{T} \xi_k$:
\[ \sum_{k=1}^{T} \xi_k = \sum_{k=1}^{\infty} \xi_k \mathbf{1}_{\{k \le T\}}. \]
The idea behind it is simple: add all the values of $\xi_k$ for $k \le T$ and keep adding zeros (since $\xi_k \mathbf{1}_{\{k \le T\}} = 0$ for $k > T$) after that. Taking the expectation of both sides and switching $\mathbb{E}$ and $\sum$ (this can be justified, but the argument is technical and we omit it here) yields:
\[ \mathbb{E}\Big[\sum_{k=1}^{T} \xi_k\Big] = \sum_{k=1}^{\infty} \mathbb{E}[\mathbf{1}_{\{k \le T\}}\, \xi_k]. \tag{6.1} \]
Let us examine the term $\mathbb{E}[\xi_k \mathbf{1}_{\{k \le T\}}]$ in some detail. We first note that
\[ \mathbf{1}_{\{k \le T\}} = 1 - \mathbf{1}_{\{k > T\}} = 1 - \mathbf{1}_{\{T \le k-1\}} = 1 - \sum_{j=0}^{k-1} \mathbf{1}_{\{T=j\}}. \]
Therefore,
\[ \mathbb{E}[\xi_k \mathbf{1}_{\{k \le T\}}] = \mathbb{E}[\xi_k] - \sum_{j=0}^{k-1} \mathbb{E}[\xi_k \mathbf{1}_{\{T=j\}}]. \]
By the assumption that $T$ is a stopping time, the indicator $\mathbf{1}_{\{T=j\}}$ can be represented as $\mathbf{1}_{\{T=j\}} = G_j(X_0, \dots, X_j)$, and, because each $X_i$ is just a sum of the increments, we can actually write $\mathbf{1}_{\{T=j\}}$ as a function of $\xi_1, \dots, \xi_j$ only - say $\mathbf{1}_{\{T=j\}} = H_j(\xi_1, \dots, \xi_j)$. By the independence of $(\xi_1, \dots, \xi_j)$ from $\xi_k$ (because $j < k$) we have
\[ \mathbb{E}[\xi_k \mathbf{1}_{\{T=j\}}] = \mathbb{E}[\xi_k H_j(\xi_1, \dots, \xi_j)] = \mathbb{E}[\xi_k]\,\mathbb{E}[H_j(\xi_1, \dots, \xi_j)] = \mathbb{E}[\xi_k]\,\mathbb{E}[\mathbf{1}_{\{T=j\}}] = \mathbb{E}[\xi_k]\,\mathbb{P}[T=j]. \]
Therefore,
\[ \mathbb{E}[\xi_k \mathbf{1}_{\{k \le T\}}] = \mathbb{E}[\xi_k] - \sum_{j=0}^{k-1} \mathbb{E}[\xi_k]\,\mathbb{P}[T=j] = \mathbb{E}[\xi_k]\,\mathbb{P}[T \ge k] = \mathbb{E}[\xi_1]\,\mathbb{P}[T \ge k], \]
where the last equality follows from the fact that all $\xi_k$ have the same distribution. Going back to (6.1), we get
\[ \mathbb{E}[X_T] = \mathbb{E}\Big[\sum_{k=1}^{T} \xi_k\Big] = \sum_{k=1}^{\infty} \mathbb{E}[\xi_1]\,\mathbb{P}[T \ge k] = \mathbb{E}[\xi_1] \sum_{k=1}^{\infty} \mathbb{P}[T \ge k] = \mathbb{E}[\xi_1]\,\mathbb{E}[T], \]
where we use Proposition 6.3 for the last equality.

Example 6.5 (Gambler's ruin problem). A gambler starts with $x \in \mathbb{N}$ dollars and repeatedly plays a game in which he wins a dollar with probability $\frac12$ and loses a dollar with probability $\frac12$. He decides to stop when one of the following two things happens:

1. he goes bankrupt, i.e., his wealth hits $0$, or
2. he makes enough money, i.e., his wealth reaches some level $a > x$.

The classical gambler's ruin problem asks the following question: what is the probability that the gambler will make $a$ dollars before he goes bankrupt?
The gambler's wealth $\{W_n\}_{n\in\mathbb{N}}$ is modelled by a simple random walk starting from $x$, whose increments $\xi_k = W_k - W_{k-1}$ are coin-tosses. Then $W_n = x + X_n$, where $X_n = \sum_{k=0}^{n} \xi_k$, $n \in \mathbb{N}_0$. Let $T$ be the time the gambler stops. We can represent $T$ in two different (but equivalent) ways. On the one hand, we can think of $T$ as the smaller of the two hitting times $T_{-x}$ and $T_{a-x}$ of the levels $-x$ and $a-x$ for the random walk $\{X_n\}_{n\in\mathbb{N}_0}$ (remember that $W_n = x + X_n$, so these two correspond to the hitting times for the process $\{W_n\}_{n\in\mathbb{N}_0}$ of the levels $0$ and $a$). On the other hand, we can think of $T$ as the first hitting time of the two-element set $\{-x, a-x\}$ for the process $\{X_n\}_{n\in\mathbb{N}_0}$. In either case, it is quite clear that $T$ is a stopping time (can you write down the decision functions?). We will see later that the probability that the gambler's wealth will remain strictly between $0$ and $a$ forever is zero, so $\mathbb{P}[T < \infty] = 1$. Moreover, we will prove that $\mathbb{E}[T] < \infty$.

What can we say about the random variable $X_T$ - the gambler's wealth (minus $x$) at the random time $T$? Clearly, it is either equal to $-x$ or to $a-x$, and the probabilities $p_0$ and $p_a$ with which it takes these values are exactly what we are after in this problem. We know that, since there are no other values $X_T$ can take, we must have $p_0 + p_a = 1$. Wald's identity gives us the second equation for $p_0$ and $p_a$:
\[ \mathbb{E}[X_T] = \mathbb{E}[\xi_1]\,\mathbb{E}[T] = 0 \cdot \mathbb{E}[T] = 0, \]
so
\[ 0 = \mathbb{E}[X_T] = p_0(-x) + p_a(a-x). \]
These two linear equations with two unknowns yield
\[ p_0 = \frac{a-x}{a}, \quad p_a = \frac{x}{a}. \]
It is remarkable that the two probabilities are proportional to the amounts of money the gambler stands to make (lose) in the two outcomes. The situation is different when $p \ne \frac12$.
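A simulation sketch of Example 6.5 (the trial count and the values of $x$ and $a$ are arbitrary) reproduces $p_0 = (a-x)/a$:

```python
import random

rng = random.Random(1)

def ruin_probability(x, a, trials=100_000):
    """Estimate the probability that a fair-coin gambler starting at x
    hits 0 before a, by direct simulation."""
    ruined = 0
    for _ in range(trials):
        w = x
        while 0 < w < a:
            w += rng.choice((-1, 1))
        ruined += (w == 0)
    return ruined / trials

x, a = 3, 10
est = ruin_probability(x, a)   # compare with (a - x)/a = 0.7
```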
6.3 The distribution of the first hitting time $T_1$

6.3.1 A recursive formula

Let $\{X_n\}_{n\in\mathbb{N}_0}$ be a simple random walk, with probability $p$ of stepping up. Let $T_1 = \min\{n \in \mathbb{N}_0 : X_n = 1\}$ be the first hitting time of the level $l = 1$, and let $\{p_n\}_{n\in\mathbb{N}_0}$ be its pmf, i.e., $p_n = \mathbb{P}[T_1 = n]$, $n \in \mathbb{N}_0$. The goal of this section is to use the powerful generating-function methods to find $\{p_n\}_{n\in\mathbb{N}_0}$. You cannot get from $0$ to $1$ in an even number of steps, so $p_{2n} = 0$, $n \in \mathbb{N}_0$. Also, $p_1 = p$ - you just have to go up on the first step. What about $n > 1$? In order to go from $0$ to $1$ in $n > 1$ steps (and not before!), the first step needs to be down, and then you need to climb up from $-1$ to $1$ in $n-1$ steps. Climbing from $-1$ to $1$ is exactly the same as climbing from $-1$ to $0$ and then climbing from $0$ to $1$. If it took $j$ steps to go from $-1$ to $0$, it will have to take $n-1-j$ steps to go from $0$ to $1$, where $j$ can be anything from $1$ to $n-2$, in order to finish the job in exactly $n-1$ steps. So, in formulas, we have
\[ \mathbb{P}[T_1 = n] = q \sum_{j=1}^{n-2} \mathbb{P}[\text{exactly } j \text{ steps to first hit } 0 \text{ from } -1 \text{ and exactly } n-1-j \text{ steps to first hit } 1 \text{ from } 0]. \tag{6.2} \]
Taking $j$ steps from $-1$ to $0$ is exactly the same as taking $j$ steps from $0$ to $1$, so
\[ \mathbb{P}[\text{exactly } j \text{ steps to first hit } 0 \text{ from } -1] = \mathbb{P}[T_1 = j] = p_j. \]
By the same token, $\mathbb{P}[\text{exactly } n-1-j \text{ steps to first hit } 1 \text{ from } 0] = \mathbb{P}[T_1 = n-1-j] = p_{n-1-j}$.
Finally, I claim that the two events are independent of each other¹. Indeed, once we have reached $0$, the future increments of the random walk behave exactly the same as the increments of a fresh random walk starting from zero - they are independent of everything. Equivalently, knowledge of everything that happened until the moment the random walk hit $0$ for the first time does not change our perception (and estimation) of what is going to happen later - in this case the likelihood of hitting $1$ in exactly $n-1-j$ steps. This property is called the regeneration property or the strong Lévy property of random walks. More precisely (but still not entirely precisely), we can make the following claim:

Let $\{X_n\}_{n\in\mathbb{N}_0}$ be a simple random walk and let $T$ be any $\mathbb{N}_0$-valued stopping time. Define the process $\{Y_n\}_{n\in\mathbb{N}_0}$ by $Y_n = X_{T+n} - X_T$. Then $\{Y_n\}_{n\in\mathbb{N}_0}$ is also a simple random walk, and it is independent of $X$ up to $T$.

In order to check your understanding, try to convince yourself that the requirement that $T$ be a stopping time is necessary - find an example of a random time $T$ which is not a stopping time for which the statement above fails.

We can go back to the distribution of the hitting time $T_1$, and use our newly-found independence together with (6.2) to obtain the following recursion:
\[ p_n = q \sum_{j=1}^{n-2} p_j p_{n-j-1}, \quad n > 1, \qquad p_0 = 0, \quad p_1 = p. \tag{6.3} \]
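Recursion (6.3) is straightforward to implement; a sketch with an arbitrary up-probability:

```python
p = 0.6
q = 1 - p
N = 12

pmf = [0.0] * (N + 1)     # pmf[n] = P[T_1 = n]
pmf[1] = p                # p_0 = 0, p_1 = p
for n in range(2, N + 1):
    pmf[n] = q * sum(pmf[j] * pmf[n - j - 1] for j in range(1, n - 1))
```

As a spot check, the only three-step path that first hits $1$ at time $3$ is down-up-up, so `pmf[3]` should equal $q p^2$, and all even entries should vanish.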
6.3.2 Generating-function approach

This is where generating functions step in. We will use (6.3) to derive an equation for the generating function $P(s) = \sum_{k=0}^{\infty} p_k s^k$. The sum on the right-hand side of (6.3) looks a little bit like a convolution, so let us compare it to the following expansion of the square $P(s)^2$:
\[ P(s)^2 = \sum_{k=0}^{\infty} \Big( \sum_{i=0}^{k} p_i p_{k-i} \Big) s^k. \]

¹A demanding reader will object at this point, since my definitions of the two events are somewhat loose. I beg forgiveness.
The inner sum $\sum_{i=0}^{k} p_i p_{k-i}$ needs to be split into several parts to get an expression which matches (6.3):
\[ \sum_{i=0}^{k} p_i p_{k-i} = \underbrace{p_0 p_k}_{0} + \sum_{i=1}^{k-1} p_i p_{k-i} + \underbrace{p_k p_0}_{0} = \sum_{i=1}^{(k+1)-2} p_i p_{(k+1)-i-1} = q^{-1} p_{k+1}, \quad k \ge 2. \]
Therefore, since the coefficients of $P(s)^2$ start at $s^2$, we have
\[ qs\,P(s)^2 = qs \sum_{k=2}^{\infty} q^{-1} p_{k+1} s^k = \sum_{k=2}^{\infty} p_{k+1} s^{k+1} = P(s) - ps, \]
which is nothing but a quadratic equation for $P$. It admits two solutions (for each $s$):
\[ P(s) = \frac{1 \pm \sqrt{1 - 4pqs^2}}{2qs}. \]
One of the two solutions is always greater than $1$ in absolute value, so it cannot correspond to a value of a generating function, and we conclude that
\[ P(s) = \frac{1 - \sqrt{1 - 4pqs^2}}{2qs}, \quad \text{for } |s| \le \frac{1}{2\sqrt{pq}}. \]
It remains to extract the information about $\{p_n\}_{n\in\mathbb{N}_0}$ from $P$. The obvious way to do it is to compute higher and higher derivatives of $P$ at $s = 0$. There is an easier way, though. The square root appearing in the formula for $P$ is an expression of the form $(1+x)^{1/2}$, and the (generalized) binomial formula can be used:
\[ (1+x)^{\alpha} = \sum_{k=0}^{\infty} \binom{\alpha}{k} x^k, \quad \text{where } \binom{\alpha}{k} = \frac{\alpha(\alpha-1)\dots(\alpha-k+1)}{k!},\ k \in \mathbb{N},\ \alpha \in \mathbb{R}. \]
Therefore,
\[ P(s) = \frac{1}{2qs} - \frac{1}{2qs} \sum_{k=0}^{\infty} \binom{1/2}{k} (-4pqs^2)^k = \sum_{k=1}^{\infty} s^{2k-1}\, \frac{1}{2q}\, (4pq)^k\, (-1)^{k-1} \binom{1/2}{k}, \tag{6.4} \]
and so
\[ p_{2k-1} = \frac{1}{2q} (4pq)^k (-1)^{k-1} \binom{1/2}{k}, \quad p_{2k-2} = 0, \quad k \in \mathbb{N}. \]
This expression can be simplified a bit further: the formula for $\binom{1/2}{k}$ evaluates to
\[ \binom{1/2}{k} = (-1)^{k-1} \frac{1}{4^{k-1}} \frac{1}{k} \binom{2k-3}{k-2}, \quad k \ge 2. \]
Thus,
\[ p_{2k-1} = p^k q^{k-1}\, \frac{2}{k} \binom{2k-3}{k-2}, \quad k \ge 2. \]
This last expression might remind you of something related to the reflection principle. And it is related! Can you derive the formula for $p_{2k-1}$ from the reflection principle? How would you deal with the fact that the random walk here is not symmetric?
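The closed form can be checked against recursion (6.3) numerically; a self-contained sketch (the recursion is re-stated so the snippet runs on its own, parameters arbitrary):

```python
from math import comb

p, q, N = 0.6, 0.4, 21

# pmf of T_1 via recursion (6.3)
pmf = [0.0] * (N + 1)
pmf[1] = p
for n in range(2, N + 1):
    pmf[n] = q * sum(pmf[j] * pmf[n - j - 1] for j in range(1, n - 1))

# closed form: p_{2k-1} = p^k q^{k-1} (2/k) C(2k-3, k-2), for k >= 2
mismatch = max(abs(pmf[2 * k - 1]
                   - p ** k * q ** (k - 1) * 2 / k * comb(2 * k - 3, k - 2))
               for k in range(2, 11))
```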
6.3.3 Do we actually hit $1$ sooner or later?

What happens if we try to evaluate $P(1)$? We should get $1$, right? In fact, what we get is the following:
\[ P(1) = \frac{1 - \sqrt{1 - 4pq}}{2q} = \frac{1 - |p-q|}{2q} = \begin{cases} 1, & p \ge \frac12,\\ \frac{p}{q}, & p < \frac12. \end{cases} \]
Clearly, $P(1) < 1$ when $p < q$. The explanation is simple - the random walk may fail to hit the level $1$ at all, if $p < q$. In that case $P(1) = \sum_{k=0}^{\infty} p_k = \mathbb{P}[T_1 < \infty] < 1$, or, equivalently, $\mathbb{P}[T_1 = +\infty] > 0$. It is remarkable that if $p = \frac12$, the random walk will always hit $1$ sooner or later, but this does not need to happen if $p < \frac12$. What we have here is an example of a phenomenon known as criticality: many physical systems exhibit qualitatively different behavior depending on whether the value of a certain parameter $p$ lies above or below a certain critical value $p = p_c$.
6.3.4 Expected time until we hit $1$?

Another question that generating functions can help us answer is the following one: how long, on average, do we need to wait before $1$ is hit? When $p < \frac12$, $\mathbb{P}[T_1 = +\infty] > 0$, so we can immediately conclude that $\mathbb{E}[T_1] = +\infty$, by definition. The case $p \ge \frac12$ is more interesting. Following the recipe from the lecture on generating functions, we compute the derivative of $P(s)$ and get
\[ P'(s) = \frac{2p}{\sqrt{1 - 4pqs^2}} - \frac{1 - \sqrt{1 - 4pqs^2}}{2qs^2}. \]
When $p = \frac12$, we get
\[ \lim_{s \nearrow 1} P'(s) = \lim_{s \nearrow 1} \Big( \frac{1}{\sqrt{1-s^2}} - \frac{1 - \sqrt{1-s^2}}{s^2} \Big) = +\infty, \]
and conclude that $\mathbb{E}[T_1] = +\infty$. For $p > \frac12$, the situation is less severe:
\[ \lim_{s \nearrow 1} P'(s) = \frac{1}{p-q}. \]
We can summarize the situation in the following table:

             P[T_1 < oo]    E[T_1]
 p < 1/2        p/q          +oo
 p = 1/2         1           +oo
 p > 1/2         1          1/(p-q)
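The criticality phenomenon in the table is visible in simulation; the sketch below (cap and sample size are arbitrary choices of mine, and a walk that has not hit $1$ within the cap is counted as never hitting) estimates $\mathbb{P}[T_1 < \infty]$ for a subcritical $p$:

```python
import random

rng = random.Random(3)
p = 0.4
TRIALS, MAX_STEPS = 50_000, 500   # the downward drift makes late hits vanishingly rare

hits = 0
for _ in range(TRIALS):
    x = 0
    for _ in range(MAX_STEPS):
        x += 1 if rng.random() < p else -1
        if x == 1:
            hits += 1
            break

estimate = hits / TRIALS   # compare with p/q = 2/3
```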
Chapter 7
Branching processes
7.1 A bit of history

In the mid-19th century several aristocratic families in Victorian England realized that their family names could become extinct. Was it just unfounded paranoia, or did something real prompt them to come to this conclusion? They decided to ask around, and Sir Francis Galton (a polymath, anthropologist, eugenicist, tropical explorer, geographer, inventor, meteorologist, proto-geneticist, psychometrician and statistician, and half-cousin of Charles Darwin) posed the following question (1873, Educational Times):

How many male children (on average) must each generation of a family have in order for the family name to continue in perpetuity?

The first complete answer came from Reverend Henry William Watson soon after, and the two wrote a joint paper entitled "On the probability of extinction of families" in 1874. By the end of this lecture, you will be able to give a precise answer to Galton's question.

[Portrait: Sir Francis Galton]
7.2 A mathematical model

The model proposed by Watson was the following:

1. A population starts with one individual at time $n = 0$: $Z_0 = 1$.

2. After one unit of time (at time $n = 1$) the sole individual produces $Z_1$ identical clones of itself and dies. $Z_1$ is an $\mathbb{N}_0$-valued random variable.

3. (a) If $Z_1$ happens to be equal to $0$, the population is dead and nothing happens at any future time $n \ge 2$.
(b) If $Z_1 > 0$, a unit of time later, each of the $Z_1$ individuals gives birth to a random number of children and dies. The first one has $Z_{1,1}$ children, the second one $Z_{1,2}$ children, etc. The last, $Z_1$-th, one gives birth to $Z_{1,Z_1}$ children. We assume that the distribution of the number of children is the same for each individual in every generation and independent both of the number of individuals in the generation and of the number of children the others have. This distribution, shared by all $Z_{n,i}$ and $Z_1$, is called the offspring distribution. The total number of individuals in the second generation is now
\[ Z_2 = \sum_{k=1}^{Z_1} Z_{1,k}. \]

(c) The third, fourth, etc. generations are produced in the same way. If it ever happens that $Z_n = 0$ for some $n$, then $Z_m = 0$ for all $m \ge n$ - the population is extinct. Otherwise,
\[ Z_{n+1} = \sum_{k=1}^{Z_n} Z_{n,k}. \]

Definition 7.1. A stochastic process with the properties described in (1), (2) and (3) above is called a (simple) branching process.

The mechanism that produces the next generation from the present one can differ from application to application. It is the offspring distribution alone that determines the evolution of a branching process. With this new formalism, we can pose Galton's question more precisely:

Under what conditions on the offspring distribution will the process $\{Z_n\}_{n\in\mathbb{N}_0}$ never go extinct, i.e., when does
\[ \mathbb{P}[Z_n \ge 1 \text{ for all } n \in \mathbb{N}_0] = 1 \tag{7.1} \]
hold?
7.3 Construction and simulation of branching processes
CHAPTER 7. BRANCHING PROCESSES
(Last Updated: December 24, 2010)

Before we answer Galton's question, let us figure out how to simulate a branching process, for a given offspring distribution $\{p_n\}_{n\in\mathbb{N}_0}$ ($p_k = \mathbb{P}[Z_1 = k]$). The distribution $\{p_n\}_{n\in\mathbb{N}_0}$ is $\mathbb{N}_0$-valued - we have learned how to simulate such distributions in Lecture 3. We can, therefore, assume that a transformation function $F$ is known, i.e., that the random variable $\xi = F(\gamma)$ is $\mathbb{N}_0$-valued with pmf $\{p_n\}_{n\in\mathbb{N}_0}$, where $\gamma \sim U[0,1]$.

Some time ago we assumed that a probability space with a sequence $\{\gamma_n\}_{n\in\mathbb{N}_0}$ of independent $U[0,1]$ random variables is given. We think of $\{\gamma_n\}_{n\in\mathbb{N}_0}$ as a sequence of random numbers produced by a computer. Let us first apply the function $F$ to each member of $\{\gamma_n\}_{n\in\mathbb{N}_0}$ to obtain an independent sequence $\{\xi_n\}_{n\in\mathbb{N}_0}$ of $\mathbb{N}_0$-valued random variables with pmf $\{p_n\}_{n\in\mathbb{N}_0}$. In the case of a simple random walk, we would be done at this point - an accumulation of the first $n$ elements of $\{\xi_n\}_{n\in\mathbb{N}_0}$ would give you the value $X_n$ of the random walk at time $n$. Branching processes are a bit more complicated; the increment $Z_{n+1} - Z_n$ depends on $Z_n$: the more individuals in a generation, the more offspring they will produce. In other words, we need a black box with two inputs - randomness and $Z_n$ - which will produce $Z_{n+1}$. What do we mean by randomness? Ideally, we would need exactly $Z_n$ (unused) elements of $\{\xi_n\}_{n\in\mathbb{N}_0}$ to simulate the number of children for each of the $Z_n$ members of generation $n$. This is exactly how one would do it in practice: given the size $Z_n$ of generation $n$, one would draw $Z_n$ simulations from the distribution $\{p_n\}_{n\in\mathbb{N}_0}$, and sum up the results to get $Z_{n+1}$. Mathematically, it is easier to be more wasteful. The sequence $\{\xi_n\}_{n\in\mathbb{N}_0}$ can be rearranged into a double sequence¹ $\{Z_{n,i}\}_{n\in\mathbb{N}_0,\, i\in\mathbb{N}}$. In words, instead of one sequence of independent random variables with pmf $\{p_n\}_{n\in\mathbb{N}_0}$, we have a sequence of sequences. Such an abundance allows us to feed the whole row $\{Z_{n,i}\}_{i\in\mathbb{N}}$ into the black box which produces $Z_{n+1}$ from $Z_n$. You can think of $Z_{n,i}$ as the number of children the $i$-th individual in the $n$-th generation would have had he been born. The black box uses only the first $Z_n$ elements of $\{Z_{n,i}\}_{i\in\mathbb{N}}$ and discards the rest:

$$Z_0 = 1, \qquad Z_{n+1} = \sum_{i=1}^{Z_n} Z_{n,i},$$

where all $\{Z_{n,i}\}_{n\in\mathbb{N}_0,\, i\in\mathbb{N}}$ are independent of each other and have the same distribution with pmf $\{p_n\}_{n\in\mathbb{N}_0}$. Once we learn a bit more about the probabilistic structure of $\{Z_n\}_{n\in\mathbb{N}_0}$, we will describe another way to simulate it.
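The "practical" recipe above - draw $Z_n$ offspring counts from $\{p_n\}$ and sum them - can be sketched in a few lines of Python. This is our own illustration, not part of the notes; the function names and the inverse-transform sampler (the function $F$ from the text) are assumptions for the sketch, and the pmf is passed as a finite list $[p_0, p_1, \dots]$.

```python
import random

def sample_offspring(pmf, u):
    """Inverse-transform method: map a uniform sample u in [0, 1)
    to an offspring count with pmf [p_0, p_1, ...] (the function F)."""
    cumulative = 0.0
    for k, p in enumerate(pmf):
        cumulative += p
        if u < cumulative:
            return k
    return len(pmf) - 1  # guard against floating-point round-off

def simulate_branching(pmf, generations, rng):
    """Return [Z_0, Z_1, ..., Z_generations] with Z_0 = 1:
    each Z_{n+1} is a sum of Z_n independent offspring counts."""
    z = [1]  # Z_0 = 1
    for _ in range(generations):
        z.append(sum(sample_offspring(pmf, rng.random()) for _ in range(z[-1])))
    return z

# e.g. offspring pmf p_0 = 0.25, p_1 = 0.5, p_2 = 0.25 (numbers are ours)
trajectory = simulate_branching([0.25, 0.5, 0.25], 10, random.Random(42))
```

For the deterministic offspring distributions of the examples below ($p_1 = 1$, or $p_k = 1$ for some $k$), the simulated trajectory agrees with the exact population sizes $Z_n = 1$ and $Z_n = k^n$.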
7.4 A generating-function approach

Having defined and constructed a branching process with offspring distribution $\{p_n\}_{n\in\mathbb{N}_0}$, let us analyze its probabilistic structure. The first question that needs to be answered is the following:

What is the distribution of $Z_n$, for $n\in\mathbb{N}_0$?

It is clear that $Z_n$ must be $\mathbb{N}_0$-valued, so its distribution is completely described by its pmf, which is, in turn, completely determined by its generating function. While an explicit expression for the pmf of $Z_n$ may not be available, its generating function can always be computed:

Proposition 7.2. Let $\{Z_n\}_{n\in\mathbb{N}_0}$ be a branching process, and let the generating function of its offspring distribution $\{p_n\}_{n\in\mathbb{N}_0}$ be given by $P(s)$. Then the generating function of $Z_n$ is the $n$-fold composition of $P$ with itself, i.e.,

$$P_{Z_n}(s) = \underbrace{P(P(\dots P(s)\dots))}_{n\ P\text{'s}}, \qquad \text{for } n\ge 1.$$

Proof. For $n = 1$, the distribution of $Z_1$ is exactly $\{p_n\}_{n\in\mathbb{N}_0}$, so $P_{Z_1}(s) = P(s)$. Suppose that the statement of the proposition holds for some $n\in\mathbb{N}$. Then

$$Z_{n+1} = \sum_{i=1}^{Z_n} Z_{n,i}$$

can be viewed as a random sum of independent random variables with pmf $\{p_n\}_{n\in\mathbb{N}_0}$, where the number of summands $Z_n$ is independent of $\{Z_{n,i}\}_{i\in\mathbb{N}}$. By Proposition 5.16 in the lecture on generating functions, the generating function $P_{Z_{n+1}}$ of $Z_{n+1}$ is the composition of the generating function $P(s)$ of each of the summands and the generating function $P_{Z_n}$ of the random number of summands $Z_n$. Therefore,

$$P_{Z_{n+1}}(s) = P_{Z_n}(P(s)) = \underbrace{P(P(\dots P(P(s))\dots))}_{n+1\ P\text{'s}},$$

and the full statement of the Proposition follows by induction. □

Let us use Proposition 7.2 in some simple examples.

¹Can you find a one-to-one and onto mapping from $\mathbb{N}$ into $\mathbb{N}\times\mathbb{N}$?
Example 7.3. Let $\{Z_n\}_{n\in\mathbb{N}_0}$ be a branching process with offspring distribution $\{p_n\}_{n\in\mathbb{N}_0}$. In the first three examples no randomness occurs and the population growth can be described exactly. In the other examples, more interesting things happen.

1. $p_0 = 1$, $p_n = 0$, $n\in\mathbb{N}$:
In this case $Z_0 = 1$ and $Z_n = 0$ for all $n\in\mathbb{N}$. This infertile population dies after the first generation.

2. $p_0 = 0$, $p_1 = 1$, $p_n = 0$, $n\ge 2$:
Each individual produces exactly one child before he/she dies. The population size is always 1: $Z_n = 1$, $n\in\mathbb{N}_0$.

3. $p_0 = 0$, $p_1 = 0$, \dots, $p_k = 1$, $p_n = 0$, $n\ne k$, for some $k\ge 2$:
Here, there are $k$ kids per individual, so the population grows exponentially: $P(s) = s^k$, so $P_{Z_n}(s) = ((\dots(s^k)^k\dots)^k)^k = s^{k^n}$. Therefore, $Z_n = k^n$, for $n\in\mathbb{N}$.

4. $p_0 = p$, $p_1 = q = 1-p$, $p_n = 0$, $n\ge 2$:
Each individual tosses a (biased) coin and has one child if the outcome is heads, or dies childless if the outcome is tails. The generating function of the offspring distribution is $P(s) = p + qs$. Therefore,

$$P_{Z_n}(s) = \underbrace{(p + q(p + q(p + q(\dots(p + qs)))))}_{n\text{ pairs of parentheses}}.$$

The expression above can be simplified considerably. One needs to realize two things:

(a) After all the products above are expanded, the resulting expression must be of the form $A + Bs$, for some $A, B$. If you inspect the expression for $P_{Z_n}$ even more closely, you will see that the coefficient $B$ next to $s$ is just $q^n$.

(b) $P_{Z_n}$ is a generating function of a probability distribution, so $A + B = 1$.

Therefore,

$$P_{Z_n}(s) = (1 - q^n) + q^n s.$$

Of course, the value of $Z_n$ will be equal to 1 if and only if all of the coin-tosses of its ancestors turned out to be heads. The probability of that event is $q^n$. So we didn't need Proposition 7.2 after all.

This example can be interpreted alternatively as follows. Each individual has exactly one child, but its gender is determined at random - male with probability $q$ and female with probability $p$. Assuming that all females change their last name when they marry, and assuming that all of them marry, $Z_n$ is just the number of individuals carrying the family name after $n$ generations.

5. $p_0 = p^2$, $p_1 = 2pq$, $p_2 = q^2$, $p_n = 0$, $n\ge 3$:
In this case each individual has exactly two children and their gender is female with probability $p$ and male with probability $q$, independently of each other. The generating function $P$ of the offspring distribution $\{p_n\}_{n\in\mathbb{N}_0}$ is given by $P(s) = (p + qs)^2$. Then

$$P_{Z_n}(s) = \underbrace{(p + q(p + q(\dots(p + qs)^2\dots)^2)^2)}_{n\text{ pairs of parentheses}}.$$

Unlike the example above, it is not so easy to simplify the above expression.
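The closed form in case 4 can be checked against Proposition 7.2 numerically by composing $P$ with itself. A minimal sketch (the helper `compose` and the specific values of $p$ and $n$ are our own choices for illustration):

```python
def compose(P, s, n):
    """Evaluate the n-fold composition P(P(...P(s)...)) from Proposition 7.2."""
    for _ in range(n):
        s = P(s)
    return s

# Example 7.3, case 4: offspring gf P(s) = p + q s, so P_{Z_n}(s) = 1 - q^n + q^n s
p, q = 0.3, 0.7
P = lambda s: p + q * s
n, s = 5, 0.5
numeric = compose(P, s, n)
closed_form = (1 - q**n) + q**n * s
```

The two values agree up to floating-point rounding, which is a direct numerical confirmation of the simplification in (a) and (b) above.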
Proposition 7.2 can be used to compute the mean and variance of the population size $Z_n$, for $n\in\mathbb{N}$.

Proposition 7.4. Let $\{p_n\}_{n\in\mathbb{N}_0}$ be the pmf of the offspring distribution of a branching process $\{Z_n\}_{n\in\mathbb{N}_0}$. If $\{p_n\}_{n\in\mathbb{N}_0}$ admits an expectation, i.e., if

$$\mu = \sum_{k=0}^{\infty} k p_k < \infty,$$

then

$$\mathbb{E}[Z_n] = \mu^n. \tag{7.2}$$

If the variance of $\{p_n\}_{n\in\mathbb{N}_0}$ is also finite, i.e., if

$$\sigma^2 = \sum_{k=0}^{\infty} (k-\mu)^2 p_k < \infty,$$

then

$$\operatorname{Var}[Z_n] = \sigma^2 \mu^{n-1}\left(1 + \mu + \mu^2 + \dots + \mu^{n-1}\right) = \begin{cases} \sigma^2 \mu^{n-1}\dfrac{\mu^{n}-1}{\mu-1}, & \mu \ne 1,\\[2mm] \sigma^2 n, & \mu = 1. \end{cases} \tag{7.3}$$

(Note that for $n = 1$ both expressions reduce to $\operatorname{Var}[Z_1] = \sigma^2$, as they must.)

Proof. Since the distribution of $Z_1$ is just $\{p_n\}_{n\in\mathbb{N}_0}$, it is clear that $\mathbb{E}[Z_1] = \mu$ and $\operatorname{Var}[Z_1] = \sigma^2$. We proceed by induction and assume that the formulas (7.2) and (7.3) hold for $n\in\mathbb{N}$. By Proposition 7.2, the generating function $P_{Z_{n+1}}$ is given as the composition $P_{Z_{n+1}}(s) = P_{Z_n}(P(s))$. Therefore, if we use the identity $\mathbb{E}[Z_{n+1}] = P'_{Z_{n+1}}(1)$, we get

$$P'_{Z_{n+1}}(1) = P'_{Z_n}(P(1))\, P'(1) = P'_{Z_n}(1)\, P'(1) = \mathbb{E}[Z_n]\,\mathbb{E}[Z_1] = \mu^n \cdot \mu = \mu^{n+1}.$$

A similar (but more complicated and less illuminating) argument can be used to establish (7.3). □
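Both moment formulas can be verified against the exact distribution of $Z_n$, computed by dynamic programming for a finite-support offspring pmf. The sketch below is ours (helper names are assumptions); it builds the pmf of $Z_{n+1}$ from the pmf of $Z_n$ by conditioning on $Z_n = m$ and convolving $m$ copies of the offspring pmf:

```python
def convolve(f, g):
    """pmf of the sum of two independent N_0-valued random variables."""
    h = [0.0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            h[i + j] += a * b
    return h

def generation_pmf(offspring, n):
    """Exact pmf of Z_n (with Z_0 = 1) for a finite-support offspring pmf."""
    dist = [0.0, 1.0]                # pmf of Z_0: point mass at 1
    for _ in range(n):
        power = [1.0]                # pmf of a sum of 0 offspring counts
        new = [0.0]
        for m, prob in enumerate(dist):
            if len(new) < len(power):
                new += [0.0] * (len(power) - len(new))
            for k, w in enumerate(power):
                new[k] += prob * w   # Z_n = m contributes prob * (m-fold convolution)
            power = convolve(power, offspring)
        dist = new
    return dist
```

For the offspring pmf $[0.25, 0.5, 0.25]$ (our example values), $\mu = 1$ and $\sigma^2 = 0.5$, so the mean of $Z_n$ stays at 1 while the variance grows linearly in $n$.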
7.5 Extinction probability

We now turn to the central question (the one posed by Galton). We define extinction to be the following event:

$$E = \{\omega\in\Omega : Z_n(\omega) = 0 \text{ for some } n\in\mathbb{N}\}.$$

It is a property of the branching process that $Z_m = 0$ for all $m\ge n$ whenever $Z_n = 0$. Therefore, we can write $E$ as an increasing union of the sets $E_n$, where

$$E_n = \{\omega\in\Omega : Z_n(\omega) = 0\}.$$

Therefore, the sequence $\{\mathbb{P}[E_n]\}_{n\in\mathbb{N}}$ is non-decreasing, and continuity of probability (see the very first lecture) implies that

$$\mathbb{P}[E] = \lim_{n\to\infty} \mathbb{P}[E_n].$$

The number $\mathbb{P}[E]$ is called the extinction probability. Using generating functions and, in particular, the fact that $\mathbb{P}[E_n] = \mathbb{P}[Z_n = 0] = P_{Z_n}(0)$, we get

$$\mathbb{P}[E] = \lim_{n\to\infty} P_{Z_n}(0) = \lim_{n\to\infty} \underbrace{P(P(\dots P(0)\dots))}_{n\ P\text{'s}}.$$

It is amazing that this probability can be computed, even if the explicit form of the generating function $P_{Z_n}$ is not known.

Proposition 7.5. The extinction probability $p = \mathbb{P}[E]$ is the smallest non-negative solution of the equation

$$x = P(x), \qquad \text{called the extinction equation},$$

where $P$ is the generating function of the offspring distribution.

Proof. Let us show first that $p = \mathbb{P}[E]$ is a solution of the equation $x = P(x)$. Indeed, $P$ is a continuous function, so $P(\lim_{n\to\infty} x_n) = \lim_{n\to\infty} P(x_n)$ for every convergent sequence $\{x_n\}_{n\in\mathbb{N}_0}$ in $[0,1]$ with $x_n \to x_\infty$. Let us take the particular sequence given by

$$x_n = \underbrace{P(P(\dots P(0)\dots))}_{n\ P\text{'s}}.$$

Then

1. $p = \mathbb{P}[E] = \lim_{n\to\infty} x_n$, and
2. $P(x_n) = x_{n+1}$.

Therefore,

$$p = \lim_{n\to\infty} x_n = \lim_{n\to\infty} x_{n+1} = \lim_{n\to\infty} P(x_n) = P\big(\lim_{n\to\infty} x_n\big) = P(p),$$

and so $p$ solves the equation $P(x) = x$.

The fact that $p = \mathbb{P}[E]$ is the smallest solution of $x = P(x)$ on $[0,1]$ is a bit trickier to get. Let $p'$ be another solution of $x = P(x)$ on $[0,1]$. Since $0 \le p'$ and $P$ is a non-decreasing function, we have

$$P(0) \le P(p') = p'.$$

We can apply the function $P$ to both sides of the inequality above to get

$$P(P(0)) \le P(P(p')) = P(p') = p'.$$

Continuing in the same way we get

$$\mathbb{P}[E_n] = \underbrace{P(P(\dots P(0)\dots))}_{n\ P\text{'s}} \le p',$$

so that $p = \mathbb{P}[E] = \lim_{n\to\infty} \mathbb{P}[E_n] \le \lim_{n\to\infty} p' = p'$. Thus $p$ is not larger than any other solution $p'$ of $x = P(x)$. □

Example 7.6.
Let us compute the extinction probabilities in the cases from Example 7.3.

1. $p_0 = 1$, $p_n = 0$, $n\in\mathbb{N}$:
No need to use any theorems - $\mathbb{P}[E] = 1$ in this case.

2. $p_0 = 0$, $p_1 = 1$, $p_n = 0$, $n\ge 2$:
Like above, the situation is clear - $\mathbb{P}[E] = 0$.

3. $p_0 = 0$, $p_1 = 0$, \dots, $p_k = 1$, $p_n = 0$, $n\ne k$, for some $k\ge 2$:
No extinction here - $\mathbb{P}[E] = 0$.

4. $p_0 = p$, $p_1 = q = 1-p$, $p_n = 0$, $n\ge 2$:
Since $P(s) = p + qs$, the extinction equation is $s = p + qs$. If $p = 0$, the smallest non-negative solution is $s = 0$, so no extinction occurs. If $p > 0$, the only solution is $s = 1$ - extinction is guaranteed. It is interesting to note the jump in the extinction probability as $p$ changes from 0 to a positive number.

5. $p_0 = p^2$, $p_1 = 2pq$, $p_2 = q^2$, $p_n = 0$, $n\ge 3$:
Here $P(s) = (p + qs)^2$, so the extinction equation reads

$$s = (p + qs)^2.$$

This is a quadratic equation in $s$ and its solutions are $s_1 = 1$ and $s_2 = p^2/q^2$, if we assume that $q > 0$. When $p < q$, the smaller of the two is $s_2$. When $p \ge q$, $s = 1$ is the smallest solution. Therefore,

$$\mathbb{P}[E] = \min\left(1, \frac{p^2}{q^2}\right).$$
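The proof of Proposition 7.5 is also a numerical recipe: iterating $x \mapsto P(x)$ starting from $x = 0$ converges to the extinction probability. A small sketch (ours; the tolerance and iteration cap are arbitrary choices):

```python
def extinction_probability(P, tol=1e-12, max_iter=10_000):
    """Iterate x <- P(x) from x = 0; the limit is the smallest
    non-negative solution of the extinction equation x = P(x)."""
    x = 0.0
    for _ in range(max_iter):
        nxt = P(x)
        if abs(nxt - x) < tol:
            return nxt
        x = nxt
    return x

# Example 7.6, case 5, with p = 0.4, q = 0.6 (our example values):
p, q = 0.4, 0.6
est = extinction_probability(lambda s: (p + q * s) ** 2)
# closed form from the text: min(1, p^2/q^2)
```

For $p = 0.4 < q = 0.6$ the iteration settles at $p^2/q^2 = 4/9$, and for case 4 with $p > 0$ it climbs to 1, matching the discussion above.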

Chapter 8

Markov Chains

8.1 The Markov property

Simply put, a stochastic process has the Markov property if its future evolution depends only on its current position, not on how it got there. Here is a more precise, mathematical definition. It will be assumed throughout this course that any stochastic process $\{X_n\}_{n\in\mathbb{N}_0}$ takes values in a countable set $S$ - the state space. Usually, $S$ will be either $\mathbb{N}_0$ (as in the case of branching processes) or $\mathbb{Z}$ (random walks). Sometimes, a more general, but still countable, state space $S$ will be needed. A generic element of $S$ will be denoted by $i$ or $j$.

Definition 8.1. A stochastic process $\{X_n\}_{n\in\mathbb{N}_0}$ taking values in a countable state space $S$ is called a Markov chain (or said to have the Markov property) if

$$\mathbb{P}[X_{n+1} = i_{n+1} \mid X_n = i_n, X_{n-1} = i_{n-1}, \dots, X_1 = i_1, X_0 = i_0] = \mathbb{P}[X_{n+1} = i_{n+1} \mid X_n = i_n], \tag{8.1}$$

for all $n\in\mathbb{N}_0$ and all $i_0, i_1, \dots, i_n, i_{n+1}\in S$, whenever the two conditional probabilities are well-defined, i.e., when $\mathbb{P}[X_n = i_n, \dots, X_1 = i_1, X_0 = i_0] > 0$.

The Markov property is typically checked in the following way: one computes the left-hand side of (8.1) and shows that its value does not depend on $i_{n-1}, i_{n-2}, \dots, i_1, i_0$ (why is that enough?). The condition $\mathbb{P}[X_n = i_n, \dots, X_0 = i_0] > 0$ will be assumed (without explicit mention) every time we write a conditional expression like the one in (8.1).

All chains in this course will be homogeneous, i.e., the conditional probabilities $\mathbb{P}[X_{n+1} = j \mid X_n = i]$ will not depend on the current time $n\in\mathbb{N}_0$:

$$\mathbb{P}[X_{n+1} = j \mid X_n = i] = \mathbb{P}[X_{m+1} = j \mid X_m = i], \qquad \text{for all } m, n\in\mathbb{N}_0.$$
Markov chains are (relatively) easy to work with because the Markov property allows us to compute all the probabilities, expectations, etc. we might be interested in by using only two ingredients:

1. Initial probabilities $a^{(0)} = \{a^{(0)}_i : i\in S\}$, $a^{(0)}_i = \mathbb{P}[X_0 = i]$ - the initial probability distribution of the process, and

2. Transition probabilities $p_{ij} = \mathbb{P}[X_{n+1} = j \mid X_n = i]$ - the mechanism that the process uses to jump around.

Indeed, if one knows all $a^{(0)}_i$ and all $p_{ij}$, and wants to compute a joint distribution $\mathbb{P}[X_n = i_n, X_{n-1} = i_{n-1}, \dots, X_0 = i_0]$, one needs to use the definition of conditional probability and the Markov property several times (the multiplication theorem from your elementary probability course) to get

$$\begin{aligned}
\mathbb{P}[X_n = i_n, \dots, X_0 = i_0] &= \mathbb{P}[X_n = i_n \mid X_{n-1} = i_{n-1}, \dots, X_0 = i_0]\, \mathbb{P}[X_{n-1} = i_{n-1}, \dots, X_0 = i_0] \\
&= \mathbb{P}[X_n = i_n \mid X_{n-1} = i_{n-1}]\, \mathbb{P}[X_{n-1} = i_{n-1}, \dots, X_0 = i_0] \\
&= p_{i_{n-1} i_n}\, \mathbb{P}[X_{n-1} = i_{n-1}, \dots, X_0 = i_0].
\end{aligned}$$

Repeating the same procedure, we get

$$\mathbb{P}[X_n = i_n, \dots, X_0 = i_0] = p_{i_{n-1} i_n}\, p_{i_{n-2} i_{n-1}} \cdots p_{i_0 i_1}\, a^{(0)}_{i_0}.$$

When $S$ is finite, there is no loss of generality in assuming that $S = \{1, 2, \dots, n\}$, and then we usually organize the entries of $a^{(0)}$ into a row vector

$$a^{(0)} = \left(a^{(0)}_1, a^{(0)}_2, \dots, a^{(0)}_n\right),$$

and the transition probabilities $p_{ij}$ into a square matrix $P$, where

$$P = \begin{bmatrix} p_{11} & p_{12} & \dots & p_{1n} \\ p_{21} & p_{22} & \dots & p_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ p_{n1} & p_{n2} & \dots & p_{nn} \end{bmatrix}.$$

In the general case ($S$ possibly infinite), one can still use the vector and matrix notation as before, but it becomes quite clumsy. For example, if $S = \mathbb{Z}$, $P$ is a doubly-infinite matrix

$$P = \begin{bmatrix} \ddots & \vdots & \vdots & \vdots & \\ \dots & p_{-1,-1} & p_{-1,0} & p_{-1,1} & \dots \\ \dots & p_{0,-1} & p_{0,0} & p_{0,1} & \dots \\ \dots & p_{1,-1} & p_{1,0} & p_{1,1} & \dots \\ & \vdots & \vdots & \vdots & \ddots \end{bmatrix}.$$
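The product formula for joint probabilities can be translated into code directly. A minimal sketch (ours; the two-state chain and its entries are invented purely for illustration):

```python
def path_probability(a0, P, path):
    """P[X_0 = i_0, ..., X_n = i_n] = a0[i_0] * p_{i_0 i_1} * ... * p_{i_{n-1} i_n}."""
    prob = a0[path[0]]
    for i, j in zip(path, path[1:]):
        prob *= P[i][j]
    return prob

# a two-state chain (the numbers are ours, purely for illustration)
P = [[0.9, 0.1],
     [0.2, 0.8]]
a0 = [1.0, 0.0]          # start in state 0 with probability 1
```

For the path $(0, 0, 1)$ this multiplies out to $a^{(0)}_0\, p_{00}\, p_{01} = 1 \cdot 0.9 \cdot 0.1$.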
8.2 Examples

Here are some examples of Markov chains - for each one we write down the transition matrix. The initial distribution is sometimes left unspecified because it does not really change anything.

1. Random walks. Let $\{X_n\}_{n\in\mathbb{N}_0}$ be a simple random walk. Let us show that it indeed has the Markov property (8.1). Remember, first, that $X_n = \sum_{k=1}^{n} \xi_k$, where the $\xi_k$ are independent coin-tosses. For a choice of $i_0, \dots, i_{n+1}$ (such that $i_0 = 0$ and $i_{k+1} - i_k = \pm 1$) we have

$$\begin{aligned}
\mathbb{P}[X_{n+1} = i_{n+1} \mid X_n = i_n, X_{n-1} = i_{n-1}, \dots, X_1 = i_1, X_0 = i_0]
&= \mathbb{P}[X_{n+1} - X_n = i_{n+1} - i_n \mid X_n = i_n, \dots, X_0 = i_0] \\
&= \mathbb{P}[\xi_{n+1} = i_{n+1} - i_n \mid X_n = i_n, \dots, X_0 = i_0] \\
&= \mathbb{P}[\xi_{n+1} = i_{n+1} - i_n],
\end{aligned}$$

where the last equality follows from the fact that the increment $\xi_{n+1}$ is independent of the previous increments, and, therefore, of the values of $X_1, X_2, \dots, X_n$. The last line above does not depend on $i_{n-1}, \dots, i_1, i_0$, so $X$ indeed has the Markov property.

The state space $S$ of $\{X_n\}_{n\in\mathbb{N}_0}$ is the set $\mathbb{Z}$ of all integers, and the initial distribution $a^{(0)}$ is very simple: we start at 0 with probability 1 (so that $a^{(0)}_0 = 1$ and $a^{(0)}_i = 0$, for $i\ne 0$). The transition probabilities are simple to write down:

$$p_{ij} = \begin{cases} p, & j = i+1,\\ q, & j = i-1,\\ 0, & \text{otherwise}. \end{cases}$$

These can be written down in a doubly-infinite tridiagonal matrix,

$$P = \begin{bmatrix} \ddots & & & & \\ \dots & 0 & p & 0 & \dots \\ \dots & q & 0 & p & \dots \\ \dots & 0 & q & 0 & \dots \\ & & & & \ddots \end{bmatrix},$$

but it does not help our understanding much.
2. Branching processes. Let $\{X_n\}_{n\in\mathbb{N}_0}$ be a simple branching process with the branching distribution $\{p_n\}_{n\in\mathbb{N}_0}$. As you surely remember, it is constructed as follows: $X_0 = 1$ and $X_{n+1} = \sum_{k=1}^{X_n} X_{n,k}$, where $\{X_{n,k}\}_{n\in\mathbb{N}_0,\, k\in\mathbb{N}}$ is a family of independent random variables with distribution $\{p_n\}_{n\in\mathbb{N}_0}$. It is now not very difficult to show that $\{X_n\}_{n\in\mathbb{N}_0}$ is a Markov chain:

$$\begin{aligned}
\mathbb{P}[X_{n+1} = i_{n+1} \mid X_n = i_n, X_{n-1} = i_{n-1}, \dots, X_1 = i_1, X_0 = i_0]
&= \mathbb{P}\Big[\sum_{k=1}^{X_n} X_{n,k} = i_{n+1} \,\Big|\, X_n = i_n, \dots, X_0 = i_0\Big] \\
&= \mathbb{P}\Big[\sum_{k=1}^{i_n} X_{n,k} = i_{n+1} \,\Big|\, X_n = i_n, \dots, X_0 = i_0\Big] \\
&= \mathbb{P}\Big[\sum_{k=1}^{i_n} X_{n,k} = i_{n+1}\Big],
\end{aligned}$$

where, just like in the random-walk case, the last equality follows from the fact that the random variables $X_{n,k}$, $k\in\mathbb{N}$, are independent of all $X_{m,k}$, $m < n$, $k\in\mathbb{N}$. In particular, they are independent of $X_n, X_{n-1}, \dots, X_1, X_0$, which are obtained as combinations of $X_{m,k}$, $m < n$, $k\in\mathbb{N}$.

The computation above also reveals the structure of the transition probabilities, $p_{ij}$, $i,j\in S = \mathbb{N}_0$:

$$p_{ij} = \mathbb{P}\Big[\sum_{k=1}^{i} X_{n,k} = j\Big].$$

There is little we can do to make the expression above more explicit, but we can remember generating functions and write $P_i(s) = \sum_{j=0}^{\infty} p_{ij} s^j$ (remember that each row of the transition matrix is a probability distribution). Thus, $P_i(s) = (P(s))^i$ (why?), where $P(s) = \sum_{k=0}^{\infty} p_k s^k$ is the generating function of the branching probability. Analogously to the random-walk case, we have

$$a^{(0)}_i = \begin{cases} 1, & i = 1,\\ 0, & i \ne 1. \end{cases}$$
3. Gambler's ruin. In Gambler's ruin, a gambler starts with $\$x$, where $0 \le x \le a$, $a\in\mathbb{N}$, and in each play wins a dollar (with probability $p\in(0,1)$) and loses a dollar (with probability $q = 1-p$). When the gambler reaches either $0$ or $a$, the game stops. The transition probabilities are similar to those of a random walk, but differ from them at the boundaries $0$ and $a$. The state space is finite, $S = \{0, 1, \dots, a\}$, and the matrix $P$ is, therefore, given by

$$P = \begin{bmatrix}
1 & 0 & 0 & 0 & \dots & 0 & 0 & 0 \\
q & 0 & p & 0 & \dots & 0 & 0 & 0 \\
0 & q & 0 & p & \dots & 0 & 0 & 0 \\
0 & 0 & q & 0 & \dots & 0 & 0 & 0 \\
\vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots & \vdots \\
0 & 0 & 0 & 0 & \dots & 0 & p & 0 \\
0 & 0 & 0 & 0 & \dots & q & 0 & p \\
0 & 0 & 0 & 0 & \dots & 0 & 0 & 1
\end{bmatrix}.$$

The initial distribution is deterministic:

$$a^{(0)}_i = \begin{cases} 1, & i = x,\\ 0, & i \ne x. \end{cases}$$
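Building this matrix programmatically is a one-liner per boundary condition. A small sketch (ours; function name and the example values $a = 5$, $p = 0.6$ are arbitrary):

```python
def gamblers_ruin_matrix(a, p):
    """Transition matrix on S = {0, 1, ..., a} with absorbing boundaries."""
    q = 1.0 - p
    P = [[0.0] * (a + 1) for _ in range(a + 1)]
    P[0][0] = 1.0        # ruined: stay at 0 forever
    P[a][a] = 1.0        # reached the target a: stay there
    for i in range(1, a):
        P[i][i + 1] = p  # win a dollar
        P[i][i - 1] = q  # lose a dollar
    return P

P = gamblers_ruin_matrix(5, 0.6)
```

Every row sums to 1, i.e., the result is a stochastic matrix, as required of any transition matrix.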
4. Regime switching. Consider a system with two different states; think of a simple weather forecast (rain/no rain), high/low water level in a reservoir, high/low volatility regime in a financial market, high/low level of economic growth, etc. Suppose that the states are called $0$ and $1$ and the probabilities $p_{01}$ and $p_{10}$ of switching states are given. The probabilities $p_{00} = 1 - p_{01}$ and $p_{11} = 1 - p_{10}$ correspond to the system staying in the same state. The transition matrix for this Markov chain with $S = \{0, 1\}$ is

$$P = \begin{bmatrix} p_{00} & p_{01} \\ p_{10} & p_{11} \end{bmatrix}.$$

When $p_{01}$ and $p_{10}$ are large (close to 1), the system nervously jumps between the two states. When they are small, there are long periods of stability (staying in the same state).
5. Deterministically monotone Markov chain. A stochastic process $\{X_n\}_{n\in\mathbb{N}_0}$ with state space $S = \mathbb{N}_0$ such that $X_n = n$ for $n\in\mathbb{N}_0$ (no randomness here) is called a deterministically monotone Markov chain (DMMC). The transition matrix looks something like this:

$$P = \begin{bmatrix} 0 & 1 & 0 & 0 & \dots \\ 0 & 0 & 1 & 0 & \dots \\ 0 & 0 & 0 & 1 & \dots \\ \vdots & \vdots & \vdots & \vdots & \ddots \end{bmatrix}.$$
6. Not a Markov chain. Consider a frog jumping from lotus leaf to lotus leaf in a small forest pond. Suppose that there are $N$ leaves, so that the state space can be described as $S = \{1, 2, \dots, N\}$. The frog starts on leaf 1 at time $n = 0$, and jumps around in the following fashion: at time 0 it chooses any leaf except for the one it is currently sitting on (with equal probability) and then jumps to it. At time $n > 0$, it chooses any leaf other than the one it is sitting on and the one it visited immediately before (with equal probability) and jumps to it. The position $\{X_n\}_{n\in\mathbb{N}_0}$ of the frog is not a Markov chain. Indeed, we have

$$\mathbb{P}[X_3 = 1 \mid X_2 = 2, X_1 = 3] = \frac{1}{N-2}, \qquad \text{while} \qquad \mathbb{P}[X_3 = 1 \mid X_2 = 2, X_1 = 1] = 0.$$

A more dramatic version of this example would be the one where the frog remembers all the leaves it had visited before, and only chooses among the remaining ones for the next jump.
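The failure of the Markov property can be exhibited by exact enumeration of the frog's trajectories for a small pond. The sketch below is ours (we pick $N = 5$; exact `Fraction` arithmetic avoids rounding); it shows that the conditional law of $X_3$ given $X_2$ genuinely changes with $X_1$:

```python
from fractions import Fraction

def frog_paths(N, length):
    """Exact distribution over frog trajectories (X_0, ..., X_length), X_0 = 1."""
    dist = {(1,): Fraction(1)}
    for _ in range(length):
        new = {}
        for path, w in dist.items():
            here = path[-1]
            prev = path[-2] if len(path) > 1 else None
            # forbidden: current leaf, and (after time 0) the previous leaf
            targets = [s for s in range(1, N + 1) if s != here and s != prev]
            for t in targets:
                key = path + (t,)
                new[key] = new.get(key, 0) + w / len(targets)
        dist = new
    return dist

def conditional(dist, event, given):
    num = sum(w for p, w in dist.items() if given(p) and event(p))
    den = sum(w for p, w in dist.items() if given(p))
    return num / den

dist = frog_paths(5, 3)
# P[X_3 = 3 | X_2 = 2, X_1 = 3] = 0, but P[X_3 = 3 | X_2 = 2, X_1 = 4] = 1/(N-2):
# the past beyond X_2 matters, so X is not a Markov chain.
```

(The text's second conditioning event $\{X_1 = 1\}$ has probability 0 here, since the frog must leave leaf 1 at time 0; conditioning on two different reachable values of $X_1$ makes the same point.)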
7. Making a non-Markov chain into a Markov chain. How can we turn the process of Example 6 into a Markov chain? Obviously, the problem is that the frog has to remember the number of the leaf it came from in order to decide where to jump next. The way out is to make this information a part of the state. In other words, we need to change the state space. Instead of just $S = \{1, 2, \dots, N\}$, we set $S = \{(i, j) : i, j\in \{1, 2, \dots, N\}\}$. In words, the state of the process will now contain not only the number of the current leaf (i.e., $i$) but also the number of the leaf we came from (i.e., $j$). There is a bit of freedom with the initial state, but we simply assume that we start from $(1,1)$. Starting from the state $(i, j)$, the frog can jump to any state of the form $(k, i)$, $k \ne i, j$ (with equal probabilities). Note that some states will never be visited (like $(i, i)$ for $i \ne 1$), so we could have reduced the state space a little bit right from the start.
8. A more complicated example. Let $\{X_n\}_{n\in\mathbb{N}_0}$ be a simple symmetric random walk. The absolute-value process $Y_n = |X_n|$, $n\in\mathbb{N}_0$, is also a Markov chain. This process is sometimes called the reflected random walk.

In order to establish the Markov property, we let $i_0, \dots, i_{n+1}$ be non-negative integers with $|i_{k+1} - i_k| = 1$ for all $0 \le k \le n$ (the state space is $S = \mathbb{N}_0$). We need to show that the conditional probability

$$\mathbb{P}[\,|X_{n+1}| = i_{n+1} \mid |X_n| = i_n, \dots, |X_0| = i_0\,] \tag{8.2}$$

does not depend on $i_{n-1}, \dots, i_0$. We write

$$\mathbb{P}[\,|X_{n+1}| = i_{n+1} \mid |X_n| = i_n, \dots, |X_0| = i_0\,] = \mathbb{P}[X_{n+1} = i_{n+1} \mid |X_n| = i_n, \dots, |X_0| = i_0] + \mathbb{P}[X_{n+1} = -i_{n+1} \mid |X_n| = i_n, \dots, |X_0| = i_0], \tag{8.3}$$

and concentrate on the first conditional probability on the right-hand side, assuming that $i_n > 0$ (the case $i_n = 0$ is easier and is left to the reader who needs practice). Let us use the law of total probability (see Problem 1 in HW 6) with $A_1 = \{X_n = i_n\}$, $A_2 = \{X_n = -i_n\}$ and $B = \{|X_n| = i_n, \dots, |X_0| = i_0\}$. Since $A_2\cap B = \{X_n = -i_n\}\cap B$, we have

$$\begin{aligned}
\mathbb{P}[X_{n+1} = i_{n+1} \mid B] &= \mathbb{P}[X_{n+1} = i_{n+1} \mid B\cap A_1]\,\mathbb{P}[A_1 \mid B] + \mathbb{P}[X_{n+1} = i_{n+1} \mid B\cap A_2]\,\mathbb{P}[A_2 \mid B] \\
&= \mathbb{P}[X_{n+1} = i_{n+1} \mid B\cap\{X_n = i_n\}]\,\mathbb{P}[X_n = i_n \mid B] \\
&\quad + \mathbb{P}[X_{n+1} = i_{n+1} \mid B\cap\{X_n = -i_n\}]\,\mathbb{P}[X_n = -i_n \mid B].
\end{aligned}$$

Conditionally on $X_n = i_n$, the probability that $X_{n+1} = i_{n+1}$ does not depend on the extra information $B$ might contain. Therefore,

$$\mathbb{P}[X_{n+1} = i_{n+1} \mid B\cap\{X_n = i_n\}] = \mathbb{P}[X_{n+1} = i_{n+1} \mid X_n = i_n],$$

and, similarly,

$$\mathbb{P}[X_{n+1} = i_{n+1} \mid B\cap\{X_n = -i_n\}] = \mathbb{P}[X_{n+1} = i_{n+1} \mid X_n = -i_n].$$

How about the term $\mathbb{P}[X_n = i_n \mid B]$ (with $\mathbb{P}[X_n = -i_n \mid B]$ being completely analogous)? If we show that

$$\mathbb{P}[X_n = i_n \mid B] = \mathbb{P}[X_n = i_n \mid |X_n| = i_n], \tag{8.4}$$

we would be making great progress. There is a rigorous way of doing this, but it is quite technical and not very illuminating. The idea is simple, though: for every path $(0, X_1(\omega), \dots, X_n(\omega))$, the flipped path $(0, -X_1(\omega), \dots, -X_n(\omega))$ is equally likely and gives exactly the same sequence of absolute values. Therefore, the knowledge of $B$ does not permit us to distinguish between them. In particular, for every (possible) sequence of absolute values $|x_0| = i_0, |x_1| = i_1, \dots, |x_n| = i_n$ there are as many actual paths $(x_0, x_1, \dots, x_n)$ that end up in $i_n$ as those that end up in $-i_n$. Therefore,

$$\mathbb{P}[X_n = i_n \mid B] = \mathbb{P}[X_n = i_n \mid |X_n| = i_n] = \tfrac{1}{2}.$$

Similarly, $\mathbb{P}[X_n = -i_n \mid B] = \tfrac{1}{2}$.

Going back to the decomposition (8.3), we have

$$\mathbb{P}[X_{n+1} = i_{n+1} \mid B] = \tfrac{1}{2}\,\mathbb{P}[X_{n+1} = i_{n+1} \mid B\cap\{X_n = i_n\}] + \tfrac{1}{2}\,\mathbb{P}[X_{n+1} = i_{n+1} \mid B\cap\{X_n = -i_n\}].$$

Given that $i_n > 0$, the first conditional probability above equals $\mathbb{P}[X_{n+1} = i_{n+1} \mid X_n = i_n]$ because of the Markov property; as far as $X_{n+1}$ is concerned, the information $B\cap\{X_n = i_n\}$ is the same as just $\{X_n = i_n\}$. The second conditional probability is 0: it is impossible for $X_{n+1}$ to be equal to $i_{n+1} > 0$ if $X_n = -i_n < 0$. Therefore, $\mathbb{P}[X_{n+1} = i_{n+1} \mid B] = 1/4$. An essentially identical argument shows that $\mathbb{P}[X_{n+1} = -i_{n+1} \mid B] = 1/4$. Therefore, $\mathbb{P}[\,|X_{n+1}| = i_{n+1} \mid B\,] = \tfrac{1}{2}$ - which is independent of $i_{n-1}, \dots, i_1, i_0$ (it looks like it is also independent of $i_n$, but it is not; this probability is equal to 0 unless $|i_{n+1} - i_n| = 1$).
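The path-flipping argument can be checked by brute force: enumerate all $2^n$ equally likely sign sequences of the symmetric walk and compute the conditional probabilities exactly. This sketch is ours (the concrete event $B$ and horizon $n = 4$ are arbitrary choices); exact `Fraction` weights make the $1/4$ and $1/2$ come out on the nose:

```python
from fractions import Fraction
from itertools import product

def walk_paths(n):
    """All 2^n simple symmetric random-walk paths (X_0 = 0, ..., X_n),
    each carrying probability 2^-n."""
    w = Fraction(1, 2 ** n)
    paths = []
    for signs in product((-1, 1), repeat=n):
        x, path = 0, [0]
        for s in signs:
            x += s
            path.append(x)
        paths.append((tuple(path), w))
    return paths

paths = walk_paths(4)
# the event B = {|X_1| = 1, |X_2| = 2, |X_3| = 3} from the argument above
B = lambda p: (abs(p[1]), abs(p[2]), abs(p[3])) == (1, 2, 3)
den = sum(w for p, w in paths if B(p))
num_plus = sum(w for p, w in paths if B(p) and p[4] == 2)       # X_4 = +2
num_abs = sum(w for p, w in paths if B(p) and abs(p[4]) == 2)   # |X_4| = 2
```

Exactly as the computation predicts, $\mathbb{P}[X_4 = 2 \mid B] = 1/4$ and $\mathbb{P}[\,|X_4| = 2 \mid B\,] = 1/2$.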
9. A more realistic example. In a game of tennis, the scoring system is as follows: both players (let us call them Amélie and Björn) start with the score of 0. Each time Amélie wins a point, her score moves a step up in the following hierarchy:

$$0 \mapsto 15 \mapsto 30 \mapsto 40.$$

Once Amélie reaches 40 and scores a point, three things can happen:

1. if Björn's score is 30 or less, Amélie wins the game,
2. if Björn's score is 40, Amélie's score moves up to advantage, and
3. if Björn's score is advantage, nothing happens to Amélie's score, but Björn's score falls back to 40.

Finally, if Amélie's score is advantage and she wins a point, she wins the game. The situation is entirely symmetric for Björn. We suppose that the probability that Amélie wins each point is $p\in(0,1)$, independently of the current score.

[Figure: the transition graph of the game-of-tennis chain. Markov chains with a finite number of states are usually represented by directed graphs (like the one in the figure). The nodes are states; two states $i, j$ are linked by a (directed) edge if the transition probability $p_{ij}$ is non-zero, and the number $p_{ij}$ is written above the link. If $p_{ij} = 0$, no edge is drawn.]

A situation like this is a typical example of a Markov chain in an applied setting. What are the states of the process? We obviously need to know both players' scores and we also need to know if one of the players has won the game. Therefore, a possible state space is the following:

$$S = \big\{\text{Amélie wins}, \text{Björn wins}, (0,0), (0,15), (0,30), (0,40), (15,0), (15,15), (15,30), (15,40), (30,0), (30,15), (30,30), (30,40), (40,0), (40,15), (40,30), (40,40), (40,\text{Adv}), (\text{Adv},40)\big\}. \tag{8.5}$$

It is not hard to assign probabilities to transitions between states. Once we reach either Amélie wins or Björn wins, the game stops. We can assume that the chain remains in that state forever, i.e., the state is absorbing. The initial distribution is quite simple - we always start from the same state $(0,0)$, so that $a^{(0)}_{(0,0)} = 1$ and $a^{(0)}_i = 0$ for all $i\in S\setminus\{(0,0)\}$.

How about the transition matrix? When the number of states is big ($\#S = 20$ in this case), transition matrices are useful in computer memory, but not so much on paper. Just for the fun of it, here is the transition matrix for our game-of-tennis chain, with the states ordered as in (8.5):

    1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 q 0 0 p 0 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 q 0 0 p 0 0 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 q 0 0 p 0 0 0 0 0 0 0 0 0 0 0
    0 q 0 0 0 0 0 0 0 p 0 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 q 0 0 p 0 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 q 0 0 p 0 0 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 q 0 0 p 0 0 0 0 0 0 0
    0 q 0 0 0 0 0 0 0 0 0 0 0 p 0 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 q 0 0 p 0 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 q 0 0 p 0 0 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 q 0 0 p 0 0 0
    0 q 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 p 0 0
    p 0 0 0 0 0 0 0 0 0 0 0 0 0 0 q 0 0 0 0
    p 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 q 0 0 0
    p 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 q 0 0
    0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 q p
    0 q 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 p 0 0
    p 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 q 0 0

Here is a question we will learn how to answer later:

Question 8.2. Does the structure of a game of tennis make it easier or harder for the better player to win? In other words, if you had to play against Roger Federer (I am rudely assuming that he is better than you), would you have a better chance of winning if you only played a point, or if you actually played the whole game?
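One can already take a numerical peek at Question 8.2 by computing the absorption probability of the "Amélie wins" state directly from the game's rules. The sketch below is our own (not the method the notes will develop later); it folds the advantage states into the closed form for the win probability from deuce, $w = p^2/(1 - 2pq)$, obtained by solving $w = p^2 + 2pq\,w$:

```python
from functools import lru_cache

def game_win_probability(p):
    """Chance that a player who wins each point with probability p wins
    a whole game; a, b count points won (3 means a score of 40)."""
    q = 1.0 - p
    deuce = p * p / (1.0 - 2.0 * p * q)   # solve w = p^2 + 2*p*q*w

    @lru_cache(maxsize=None)
    def win(a, b):
        if a == 4:
            return 1.0                    # game won
        if b == 4:
            return 0.0                    # game lost
        if a == 3 and b == 3:
            return deuce                  # (40, 40): advantage play
        return p * win(a + 1, b) + q * win(a, b + 1)

    return win(0, 0)
```

For $p = 0.5$ the game is fair, and for $p > 1/2$ the game-win probability exceeds $p$ itself - playing a whole game amplifies the better player's edge.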
8.3 Chapman-Kolmogorov relations

The transition probabilities $p_{ij}$, $i,j\in S$, tell us how a Markov chain jumps from a state to a state in one step. How about several steps, i.e., how does one compute probabilities like $\mathbb{P}[X_{k+n} = j \mid X_k = i]$, $n\in\mathbb{N}$? Since we are assuming that all of our chains are homogeneous (transition probabilities do not change with time), this probability does not depend on the time $k$, and we set

$$p^{(n)}_{ij} = \mathbb{P}[X_{k+n} = j \mid X_k = i] = \mathbb{P}[X_n = j \mid X_0 = i].$$

It is sometimes useful to have a more compact notation for this last conditional probability, so we write

$$\mathbb{P}_i[A] = \mathbb{P}[A \mid X_0 = i], \qquad \text{for any event } A.$$

Therefore, $p^{(n)}_{ij} = \mathbb{P}_i[X_n = j]$. For $n = 0$, we clearly have

$$p^{(0)}_{ij} = \begin{cases} 1, & i = j,\\ 0, & i \ne j. \end{cases}$$

Once we have defined the multi-step transition probabilities $p^{(n)}_{ij}$, $i,j\in S$, $n\in\mathbb{N}_0$, we need to be able to compute them. This computation is central in various applications of Markov chains: it relates the small-time (one-step) behavior, which is usually easy to observe and model, to the long-time (multi-step) behavior, which is really of interest. Before we state the main result in this direction, let us remember how matrices are multiplied. When $A$ and $B$ are $n\times n$ matrices, the product $C = AB$ is also an $n\times n$ matrix and its $ij$-entry $C_{ij}$ is given as

$$C_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj}.$$

There is nothing special about finiteness in the above definition. If $A$ and $B$ were infinite matrices $A = (A_{ij})_{i,j\in S}$, $B = (B_{ij})_{i,j\in S}$ for some countable set $S$, the same procedure could be used to define $C = AB$. Indeed, $C$ will also be an $S\times S$ matrix and

$$C_{ij} = \sum_{k\in S} A_{ik} B_{kj},$$

as long as the (infinite) sum above converges absolutely. In the case of a typical transition matrix $P$, convergence will not be a problem since $P$ is a stochastic matrix, i.e., it has the following two properties (why?):

1. $p_{ij} \ge 0$, for all $i,j\in S$, and
2. $\sum_{j\in S} p_{ij} = 1$, for all $i\in S$ (in particular, $p_{ij}\in[0,1]$, for all $i,j$).

When $P = (p_{ij})_{i,j\in S}$ and $P' = (p'_{ij})_{i,j\in S}$ are two $S\times S$ stochastic matrices, the series $\sum_{k\in S} p_{ik} p'_{kj}$ converges absolutely since $0 \le p'_{kj} \le 1$ for all $k, j\in S$, and so

$$\sum_{k\in S} p_{ik}\, p'_{kj} \le \sum_{k\in S} p_{ik} = 1, \qquad \text{for all } i,j\in S.$$

Moreover, the product $C$ of two stochastic matrices $A$ and $B$ is always a stochastic matrix: the entries of $C$ are clearly non-negative and (by Tonelli's theorem)

$$\sum_{j\in S} C_{ij} = \sum_{j\in S}\sum_{k\in S} A_{ik} B_{kj} = \sum_{k\in S}\sum_{j\in S} A_{ik} B_{kj} = \sum_{k\in S} A_{ik} \underbrace{\sum_{j\in S} B_{kj}}_{=1} = \sum_{k\in S} A_{ik} = 1.$$
Proposition 8.3. LetPn
be the n-th (matrix) power of the transition matrix P. Then p(
n )
ij =
( P n
)ij , for
i; j2S. Last Updated: December 24, 2010
71Intro to Stochastic Processes: Lecture Notes

CHAPTER 8. MARKOV CHAINS
Proof.
We proceed by induction. For n= 1 the statement follows directly from the de nition of
the matrix P. Supposing that p(
n )
ij = (
Pn
)ij for all
i; j, we have
p (
n +1)
ij =
P[X
n+1 =
jjX
0=
i]
= X
k 2 S P
[X
1=
kjX
0=
i]P [X
n+1 =
jjX
0=
i; X
1=
k]
= X
k 2 S P
[X
1=
kjX
0=
i]P [X
n+1 =
jjX
1=
k]
= X
k 2 S P
[X
1=
kjX
0=
i]P [X
n=
jjX
0=
k]
= X
k 2 S p
ik p(
n )
kj :
where the second equality follows from the law of total probability, the third one from the Markov
property, and the fourth one from homogeneity. The last sum above is nothing but the expression
for the matrix product of Pand Pn
, and so we have proven the induction step. Using Proposition 8.3, we can write a simple expression for the distribution of the random
variable X
n, for
n2 N
0. Remember that the initial dsitribution (the distribution of
X
0) is denoted
by a(0)
= ( a(0)
i )
i2 S . Analogously, we de ne the vector
a(
n )
= ( a(
n )
i )
i2 S by
a (
n )
i =
P[X
n=
i]; i 2S:
Using the law of total probability, we have
a(
n )
i =
P[X
n=
i] = X
k 2 S P
[X
0=
k]P [X
n=
ijX
0=
k] = X
k 2 S a
(0)
k p(
n )
ki :
We usually interpret a(0)
as a (row) vector, so the above relationship can be expressed using
vector-matrix multiplication a(
n )
= a(0)
Pn
:
The following corollary shows a simple, yet fundamental, relationship between different multi-step transition probabilities p^{(n)}_{ij}.

Corollary 8.4 (Chapman-Kolmogorov relations). For n, m ∈ N_0 and i, j ∈ S we have

    p^{(m+n)}_{ij} = ∑_{k∈S} p^{(m)}_{ik} p^{(n)}_{kj}.

Proof. The statement follows directly from the matrix equality P^{m+n} = P^m P^n.
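Entrywise, the Chapman-Kolmogorov relations say exactly that P^{m+n} = P^m P^n, which is easy to verify numerically. A quick numpy check (the two-state matrix below is an arbitrary illustrative choice):

```python
import numpy as np

# Any stochastic matrix will do for the check
P = np.array([[0.1, 0.9],
              [0.6, 0.4]])
m, n = 2, 3

lhs = np.linalg.matrix_power(P, m + n)                             # P^(m+n)
rhs = np.linalg.matrix_power(P, m) @ np.linalg.matrix_power(P, n)  # P^m P^n
# entrywise: p^(m+n)_ij = sum_k p^(m)_ik p^(n)_kj
```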

It is usually difficult to compute P^n for a general transition matrix P and a large n. We will see later that it is easier to find the limiting values lim_{n→∞} p^{(n)}_{ij}. In the meantime, here is a simple example where this can be done by hand.
Example 8.5. In the setting of a Regime Switching chain (Example 4.), let us write a for p_{01} and b for p_{10} to simplify the notation, so that the transition matrix looks like this:

    P = ( 1-a    a  )
        (  b    1-b )

The characteristic equation det(λI - P) = 0 of the matrix P is

    0 = det(λI - P) = ((λ-1) + a)((λ-1) + b) - ab = (λ-1)(λ - (1-a-b)).

The eigenvalues are, therefore, λ_1 = 1 and λ_2 = 1-a-b. The eigenvectors are v_1 = (1, 1)^T and v_2 = (a, -b)^T, so that with

    V = ( 1   a )   and   D = ( λ_1   0  ) = ( 1      0    )
        ( 1  -b )             ( 0    λ_2 )   ( 0   1-a-b   )

we have PV = VD, i.e., P = VDV^{-1}. This representation is very useful for taking matrix powers:

    P^n = (VDV^{-1})(VDV^{-1}) ... (VDV^{-1}) = VD^nV^{-1} = V ( 1  0 ; 0  (1-a-b)^n ) V^{-1}.

Assuming a + b > 0 (i.e., P is not the identity matrix ( 1  0 ; 0  1 )), we have

    V^{-1} = 1/(a+b) ( b   a ; 1  -1 ),

and

    P^n = VD^nV^{-1} = ( 1  a ; 1  -b ) ( 1  0 ; 0  (1-a-b)^n ) · 1/(a+b) ( b  a ; 1  -1 )
        = 1/(a+b) ( b  a ; b  a ) + (1-a-b)^n/(a+b) ( a  -a ; -b  b ),

i.e., in entries,

    p^{(n)}_{00} = b/(a+b) + (1-a-b)^n a/(a+b),   p^{(n)}_{01} = a/(a+b) - (1-a-b)^n a/(a+b),
    p^{(n)}_{10} = b/(a+b) - (1-a-b)^n b/(a+b),   p^{(n)}_{11} = a/(a+b) + (1-a-b)^n b/(a+b).

The expression for P^n above tells us a lot about the structure of the multi-step probabilities p^{(n)}_{ij} for large n. Note that the second matrix on the right-hand side above comes multiplied by (1-a-b)^n, which tends to 0 as n → ∞, unless we are in the uninteresting situation a = b = 0 or a = b = 1. Therefore,

    P^n ≈ 1/(a+b) ( b  a ; b  a )   for large n.

The fact that the rows of the right-hand side above are equal points to the fact that, for large n, p^{(n)}_{ij} does not depend (much) on the initial state i. In other words, this Markov chain forgets its initial condition after a long period of time. This is a rule more than an exception, and we will study such phenomena in the following lectures.
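The closed form for P^n is easy to confirm numerically. A numpy sketch (the values a = 0.3 and b = 0.2 are arbitrary illustrative choices):

```python
import numpy as np

a, b = 0.3, 0.2                     # hypothetical values of a = p_01, b = p_10
P = np.array([[1 - a, a],
              [b, 1 - b]])

n = 50
Pn = np.linalg.matrix_power(P, n)

# Closed form from the example: limiting part plus a (1-a-b)^n correction
limit = np.array([[b, a],
                  [b, a]]) / (a + b)
corr = ((1 - a - b) ** n / (a + b)) * np.array([[a, -a],
                                                [-b, b]])
```

With (1-a-b)^50 = 0.5^50 of order 10^{-15}, P^50 is already numerically indistinguishable from the limiting matrix.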

Chapter 9
The Stochastics package
Stochastics is a Mathematica package (a collection of functions) for various calculations, simulation and visualization of Markov chains. Here is a short user's guide.

9.1 Installation

You can download the file Stochastics.m from the course web-site. It needs to be put in a directory/folder on your system that Mathematica knows about. The easiest way to do this is to type $Path in Mathematica. The output will be a list of folders. You simply copy the file Stochastics.m into any of the folders listed. Restart Mathematica, and that is it.

Every time you want to use functions from Stochastics, you issue the command <<Stochastics` (note that the last symbol ` is not an apostrophe; it can be found above the Tab key on your keyboard). If you want to get information about the syntax and usage of a certain command (say BuildChain), just type ?BuildChain, and Mathematica will display a short help paragraph.
9.2 Building Chains
The first thing you need to learn is how to tell Mathematica about the structure of the Markov chain you want to analyze. As you know, two ingredients are needed for that: the initial distribution and the set of transition probabilities. Since the transition matrix is usually quite sparse (has many zero elements), it is typically faster to specify those probabilities as a list of transitions of the form {FromState, ToState, probability}. That is exactly how Stochastics is set up. In order to store the information about a Markov chain that Mathematica can do further computations with, you issue the command MyChain=BuildChain[Triplets,InitialDistribution], where MyChain is just the name of the variable (it could be anything), Triplets is a list each of whose elements is a triplet of the form {FromState, ToState, probability}, and InitialDistribution is a list of pairs of the form {State, probability}, where probability = P[X_0 = State]. Here is an example for the simple Markov chain with three states A, B and C, the initial distribution P[X_0 = A] = 1/2, P[X_0 = B] = 1/4 and P[X_0 = C] = 1/4, and the transition graph that looks like this:
The code I used to define it is reproduced below.

The Tennis Chain example we covered in class is already implemented in the package, so you don't have to type it in yourself. All you need to do is issue the command MyChain=TennisChain[p], where p is the probability of winning a rally for Amélie.
9.3 Getting information about a chain

Once you have built a chain (assume that its name is MyChain) you can get information about it using the following commands. We start by listing the simple ones:

States[MyChain] returns the list of all states.

InitialDistribution[MyChain] returns the list of initial probabilities. The order is the same as that returned by the command States.

TransitionMatrix[MyChain] returns the transition matrix.

Probability[s1,s2,n,MyChain] returns the n-step probability P[X_n = s2 | X_0 = s1].

NumberOfStates[MyChain] returns the number of states.

Classes[MyChain] returns a list whose elements are the communication classes (lists themselves) of the chain.

Recurrence[MyChain] returns a list of zeros and ones, one for each state: 1 if the state is recurrent and 0 otherwise.

TransientStates[MyChain] returns the list of transient states.

RecurrentStates[MyChain] returns the list of recurrent states.

ClassNumber[s,MyChain] returns the number of the communication class the state s belongs to, i.e., the index of the element of Classes[MyChain] that s belongs to.

See Example 9.1 for a Mathematica notebook which illustrates these commands using MyChain defined above. The following commands help with more complicated (matrix) computations:

QMatrix[MyChain], RMatrix[MyChain] and PMatrix[MyChain] return the matrices Q, R and P from the canonical decomposition of MyChain.
[Figure: the transition graph of the chain - states A, B and C, with arrows labeled 0.5, 0.5, 1.0, 0.5 and 0.5; this is the output of PlotChain[MyChain] (see Section 9.5).]

In[1]:=  <<Stochastics`
In[34]:= MyChain = BuildChain[{{A, B, 1/2}, {A, A, 1/2}, {B, C, 1}, {C, C, 1/2}, {C, B, 1/2}},
                              {{A, 1/2}, {B, 1/4}, {C, 1/4}}];

CanonicalForm[MyChain] returns the 2x2 block matrix representation of the canonical form of the chain MyChain.

FundamentalMatrix[MyChain] returns the fundamental matrix (I - Q)^{-1} of the chain.

See Example 9.2 for a related Mathematica notebook.
9.4 Simulation
The functions in Stochastics can be used to perform repeated simulations of Markov chains:

SimulateFirst[MyChain] outputs a single draw from the initial distribution of the chain.

SimulateNext[s,MyChain] outputs a single draw from tomorrow's position of the chain if it is in the state s today.

SimulatePaths[nsim,nsteps,MyChain] outputs a matrix with nsim rows and nsteps columns; each row is a simulated path of the chain of length nsteps.
See Example 9.3 for an illustration.
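In plain Python, a simulation in the spirit of SimulatePaths can be sketched as follows (an illustrative analogue for a chain with integer-labeled states A=0, B=1, C=2, not the package's actual code):

```python
import random

def simulate_path(P, a0, nsteps, rng):
    """Simulate one path of a finite Markov chain: draw X_0 from the initial
    distribution a0, then repeatedly draw the next state from the row of P
    corresponding to the current state."""
    states = list(range(len(P)))
    path = [rng.choices(states, weights=a0)[0]]
    for _ in range(nsteps - 1):
        path.append(rng.choices(states, weights=P[path[-1]])[0])
    return path

P = [[0.5, 0.5, 0.0],    # the A, B, C example chain (A=0, B=1, C=2)
     [0.0, 0.0, 1.0],
     [0.0, 0.5, 0.5]]
a0 = [0.5, 0.25, 0.25]
rng = random.Random(1)
paths = [simulate_path(P, a0, 10, rng) for _ in range(3)]   # 3 paths of length 10
```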
9.5 Plots
Finally, the package Stochastics can produce pretty pictures and animations of your chains:

PlotChain[MyChain,Options] produces a graphical representation of the Markov chain MyChain. The Options argument is optional and can be used to change the look and feel of the plot. Any option that the built-in function GraphPlot accepts can be used.

AnimateChain[nsteps,MyChain] produces an animation of a single simulated trajectory of MyChain.

The picture in Section 9.2 is the output of the command PlotChain[MyChain].

9.6 Examples

Example 9.1. Simple commands:

In[1]:=  <<Stochastics`
In[2]:=  MyChain = BuildChain[{{A, B, 1/2}, {A, A, 1/2}, {B, C, 1}, {C, C, 1/2}, {C, B, 1/2}},
                              {{A, 1/2}, {B, 1/4}, {C, 1/4}}];
In[3]:=  States[MyChain]
Out[3]=  {A, B, C}
In[4]:=  NumberOfStates[MyChain]
Out[4]=  3
In[6]:=  InitialDistribution[MyChain]
Out[6]=  {1/2, 1/4, 1/4}
In[7]:=  TransitionMatrix[MyChain]
Out[7]=  {{1/2, 1/2, 0}, {0, 0, 1}, {0, 1/2, 1/2}}
In[8]:=  MatrixForm[TransitionMatrix[MyChain]]
Out[8]//MatrixForm=
         1/2  1/2  0
         0    0    1
         0    1/2  1/2
In[9]:=  Probability[A, B, 3, MyChain]
Out[9]=  3/8
In[10]:= Classes[MyChain]
Out[10]= {{A}, {B, C}}
In[11]:= Recurrence[MyChain]
Out[11]= {0, 1, 1}
In[12]:= TransientStates[MyChain]
Out[12]= {A}
In[13]:= RecurrentStates[MyChain]
Out[13]= {B, C}

Example 9.2. Canonical-form related commands (a continuation of Example 9.1):

In[14]:= QMatrix[MyChain]
Out[14]= {{1/2}}
In[15]:= RMatrix[MyChain]
Out[15]= {{1/2, 0}}
In[20]:= {MatrixForm[M[[1, 1]]], MatrixForm[M[[1, 2]]],
          MatrixForm[M[[2, 1]]], MatrixForm[M[[2, 2]]]}
Out[20]//MatrixForm=
         M[[1,1]] = ( 0    1  )    M[[1,2]] = ( 0 )
                    ( 1/2  1/2)               ( 0 )
         M[[2,1]] = ( 1/2  0 )     M[[2,2]] = ( 1/2 )
In[21]:= FundamentalMatrix[MyChain]
Out[21]= {{2}}

Example 9.3. Simulations (a continuation of Example 9.1):

In[22]:= SimulateFirst[MyChain]
Out[22]= A
In[23]:= SimulateFirst[MyChain]
Out[23]= C
In[24]:= SimulateNext[A, MyChain]
Out[24]= A
In[28]:= SimulateNext[A, MyChain]
Out[28]= B
In[36]:= MatrixForm[SimulatePaths[3, 10, MyChain]]
Out[36]//MatrixForm=
         C  B  C  B  C  C  B  C  B  C
         A  A  B  C  C  B  C  B  C  B
         A  A  A  B  C  B  C  C  B  C

Chapter 10
Classification of States

There will be a lot of definitions and some theory before we get to examples. You might want to peek into the last part (examples) as the notions are being introduced; it will help your understanding.
10.1 The Communication Relation

Let {X_n}_{n∈N_0} be a Markov chain on the state space S. For a given set B of states, define the hitting time τ_B of B as

    τ_B = min{n ∈ N_0 : X_n ∈ B}.    (10.1)

We know that τ_B is, in fact, a stopping time with respect to {X_n}_{n∈N_0}. When B consists of only one element, B = {i}, we simply write τ_i for τ_{{i}}; τ_i is the first time the Markov chain {X_n}_{n∈N_0} hits the state i. As always, we allow τ_B to take the value +∞; it means that no state in B is ever hit.

The hitting times are important both for immediate applications of {X_n}_{n∈N_0}, as well as for a better understanding of the structure of Markov chains.

Example 10.1. Let {X_n}_{n∈N_0} be the chain which models a game of tennis (Example 9., in Section 2. of Lecture 8). The probability of winning for (say) Amélie can be phrased in terms of hitting times:

    P[Amélie wins] = P[τ_{i_A} < τ_{i_B}],

where i_A = "Amélie wins" and i_B = "Björn wins" (the two absorbing states of the chain). We will learn how to compute such probabilities in the subsequent lectures.
Having introduced the hitting times τ_B, let us give a few more definitions. Recall that the notation P_i[A] is used to mean P[A | X_0 = i] (for any event A). In practice, we use P_i to signify that we are starting the chain from the state i, i.e., P_i corresponds to a Markov chain whose transition matrix is the same as that of {X_n}_{n∈N_0}, but whose initial distribution is given by P_i[X_0 = j] = 0 if j ≠ i and P_i[X_0 = i] = 1.

Definition 10.2. A state i ∈ S is said to communicate with the state j ∈ S, denoted by i → j, if

    P_i[τ_j < ∞] > 0.

Intuitively, i communicates with j if there is a non-zero chance that the Markov chain X will eventually visit j if it starts from i. Sometimes we also say that j is a consequent of i, or that j is accessible from i.

Example 10.3. In the tennis example, every state is accessible from (0,0) (the fact that p ∈ (0,1) is important here), but (0,0) is not accessible from any other state. The consequents of (40,40) are (40,40) itself, (40,Adv), (Adv,40), "Amélie wins" and "Björn wins".

Before we examine some properties of the relation →, here is a simple but useful characterization. Before we give it, we recall the following fact about probability: let {A_n}_{n∈N_0} be an increasing sequence of events, i.e., A_n ⊆ A_{n+1} for all n ∈ N_0. Then

    P[∪_{n∈N_0} A_n] = lim_n P[A_n],

and the sequence inside the limit is non-decreasing.
Proposition 10.4. i → j if and only if p^{(n)}_{ij} > 0 for some n ∈ N_0.

Proof. The event A = {τ_j < ∞} can be written as an increasing union

    A = ∪_{n∈N} A_n, where A_n = {τ_j ≤ n}.

Therefore,

    P_i[τ_j < ∞] = P_i[A] = lim_n P_i[A_n] = lim_n P_i[τ_j ≤ n],

and the sequence P_i[A_n], n ∈ N, is non-decreasing. In particular,

    P_i[τ_j < ∞] ≥ P_i[A_n], for all n ∈ N.    (10.2)

Suppose, first, that p^{(n)}_{ij} > 0 for some n. Since τ_j is the first time j is visited, we have

    P_i[A_n] = P_i[τ_j ≤ n] ≥ P_i[X_n = j] = p^{(n)}_{ij} > 0.

By (10.2), we have P_i[τ_j < ∞] > 0, and so, i → j.

Conversely, suppose that i → j, i.e., P_i[A] > 0. Since P_i[A] = lim_n P_i[A_n], we must have P_i[A_{n_0}] > 0 for some n_0 (and then for all larger n), i.e., P_i[τ_j ≤ n_0] > 0. When τ_j ≤ n_0, we must have X_0 = j or X_1 = j or ... or X_{n_0} = j, i.e.,

    {τ_j ≤ n_0} ⊆ ∪_{k=0}^{n_0} {X_k = j},

and so

    0 < P_i[τ_j ≤ n_0] ≤ P_i[∪_{k=0}^{n_0} {X_k = j}] ≤ ∑_{k=0}^{n_0} P_i[X_k = j].

Therefore, P_i[X_n = j] > 0 for at least one n ∈ {0, 1, ..., n_0}. In other words, p^{(n)}_{ij} > 0 for at least one n ∈ {0, 1, ..., n_0}.

Proposition 10.5. For all i, j, k ∈ S, we have

1. i → i,

2. i → j, j → k ⇒ i → k.

Proof. 1. If we start from the state i ∈ S we are already there (note that 0 is allowed as a value for τ_B in (10.1)), i.e., τ_i = 0 when X_0 = i.

2. Using Proposition 10.4, it will be enough to show that p^{(n)}_{ik} > 0 for some n ∈ N. By the same Proposition, we know that p^{(n_1)}_{ij} > 0 and p^{(n_2)}_{jk} > 0 for some n_1, n_2 ∈ N_0. By the Chapman-Kolmogorov relations, with n = n_1 + n_2, we have

    p^{(n)}_{ik} = ∑_{l∈S} p^{(n_1)}_{il} p^{(n_2)}_{lk} ≥ p^{(n_1)}_{ij} p^{(n_2)}_{jk} > 0.

Remark 10.6. The inequality p^{(n)}_{ik} ≥ p^{(n_1)}_{il} p^{(n_2)}_{lk} is valid for all i, l, k ∈ S, as long as n_1 + n_2 = n. It will come in handy later.
10.2 Classes

Definition 10.7. We say that the states i and j in S intercommunicate, denoted by i ↔ j, if i → j and j → i. A set B ⊆ S of states is called irreducible if i ↔ j for all i, j ∈ B.

Unlike the relation of communication, the relation of intercommunication is symmetric. Moreover, we have the following three properties (the result follows directly from Proposition 10.4, so we omit the proof):

Proposition 10.8. The relation ↔ is an equivalence relation on S, i.e., for all i, j, k ∈ S, we have

1. i ↔ i (reflexivity),

2. i ↔ j ⇒ j ↔ i (symmetry), and

3. i ↔ j, j ↔ k ⇒ i ↔ k (transitivity).

The fact that ↔ is an equivalence relation allows us to split the state space S into equivalence classes with respect to ↔. In other words, we can write

    S = S_1 ∪ S_2 ∪ S_3 ∪ ...,

where S_1, S_2, ... are mutually exclusive (disjoint) and all states in a particular S_n intercommunicate, while no two states from different equivalence classes S_n and S_m do. The sets S_1, S_2, ... are called classes of the chain {X_n}_{n∈N_0}. Equivalently, one can say that classes are maximal irreducible sets, in the sense that they are irreducible and no class is a subset of a (strictly larger) irreducible set. A cookbook algorithm for class identification would involve the following steps:

1. Start from an arbitrary state (call it 1).

2. Identify all states j that intercommunicate with it (don't forget that i ↔ i for all i).

3. That is your first class; call it C_1. If there are no elements left, then there is only one class, C_1 = S. If there is an element in S \ C_1, repeat the procedure above starting from that element.
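The cookbook algorithm above amounts to computing mutual reachability. Here is a short Python sketch (illustrative only, not the Stochastics package) that recovers the classes of a finite chain from its transition matrix:

```python
import numpy as np

def communication_classes(P):
    """Split the states 0..n-1 into classes of the relation <->,
    given the transition matrix P as a numpy array (illustrative sketch)."""
    n = len(P)
    # reach[i, j] is True iff i -> j; start with one-step arrows plus i -> i
    reach = (P > 0) | np.eye(n, dtype=bool)
    for k in range(n):                       # Floyd-Warshall style transitive closure
        reach |= reach[:, k:k + 1] & reach[k:k + 1, :]
    classes, seen = [], set()
    for i in range(n):
        if i in seen:
            continue
        cls = [j for j in range(n) if reach[i, j] and reach[j, i]]
        classes.append(cls)
        seen.update(cls)
    return classes

# The three-state chain with transient A=0 and recurrent class {B, C} = {1, 2}
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.5, 0.5]])
```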
The notion of a class is especially useful in relation to another natural concept:

Definition 10.9. A set B ⊆ S of states is said to be closed if i ↛ j for all i ∈ B and all j ∈ S \ B. A state i ∈ S such that the set {i} is closed is called absorbing.

Here is a nice characterization of closed sets:

Proposition 10.10. A set B of states is closed if and only if p_{ij} = 0 for all i ∈ B and all j ∈ B^c = S \ B.

Proof. Suppose, first, that B is closed. Then for i ∈ B and j ∈ B^c, we have i ↛ j, i.e., p^{(n)}_{ij} = 0 for all n ∈ N. In particular, p_{ij} = 0.

Conversely, suppose that p_{ij} = 0 for all i ∈ B, j ∈ B^c. We need to show that i ↛ j (i.e., p^{(n)}_{ij} = 0 for all n ∈ N) for all i ∈ B, j ∈ B^c. Suppose, to the contrary, that there exist i ∈ B and j ∈ B^c such that p^{(n)}_{ij} > 0 for some n ∈ N. Since p_{ij} = 0, we must have n > 1. Out of all n > 1 such that p^{(n)}_{ik} > 0 for some k ∈ B^c, we pick the smallest one (let us call it n_0), so that p^{(n_0-1)}_{ik} = 0 for all k ∈ B^c. By the Chapman-Kolmogorov relation, we have

    p^{(n_0)}_{ij} = ∑_{k∈S} p^{(n_0-1)}_{ik} p_{kj} = ∑_{k∈B} p^{(n_0-1)}_{ik} p_{kj} + ∑_{k∈B^c} p^{(n_0-1)}_{ik} p_{kj}.    (10.3)

The terms in the first sum on the right-hand side of (10.3) are all zero because p_{kj} = 0 (k ∈ B and j ∈ B^c), and the terms of the second one are also all zero because p^{(n_0-1)}_{ik} = 0 for all k ∈ B^c. Therefore, p^{(n_0)}_{ij} = 0 - a contradiction.

Intuitively, a set of states is closed if it has the property that the chain {X_n}_{n∈N_0} stays in it forever, once it enters it. In general, if B is closed, it does not have to follow that S \ B is closed. Also, a class does not have to be closed, and a closed set does not have to be a class. Here are some examples:
Example 10.11. Consider the tennis chain of the previous lecture and consider the following three sets of states:

1. B = {"Amélie wins"}: closed and a class, but S \ B is not closed,

2. B = S \ {(0,0)}: closed, but not a class, and

3. B = {(0,0)}: a class, but not closed.
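Proposition 10.10 below makes closedness easy to test mechanically: one only has to check that the block of P from B into its complement vanishes. A Python sketch (the helper is_closed and the three-state chain are illustrative assumptions, with states indexed 0, 1, 2):

```python
import numpy as np

def is_closed(B, P):
    """Check the one-step criterion for closedness: p_ij = 0 for every
    i in B and every j outside B (B is a list of state indices)."""
    Bc = [j for j in range(len(P)) if j not in B]
    return bool(np.all(P[np.ix_(B, Bc)] == 0))

# Chain with transient state 0 and closed recurrent class {1, 2}
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.5, 0.5]])
```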
There is a relationship between classes and the notion of closedness:

Proposition 10.12. Every closed set B is a union of classes.

Proof. Let B̂ be the union of all classes C such that C ∩ B ≠ ∅. In other words, take all the elements of B and throw in all the states which intercommunicate with them. I claim that B̂ = B. Clearly, B ⊆ B̂, so we need to show that B̂ ⊆ B. Suppose, to the contrary, that there exists j ∈ B̂ \ B. By construction, j intercommunicates with some i ∈ B. In particular, i → j. By closedness of B, we must have j ∈ B. This is a contradiction with the assumption that j ∈ B̂ \ B.

Example 10.13. A converse of Proposition 10.12 is not true. Just take the set B = {(0,0), (0,15)} in the tennis example. It is a union of classes, but it is not closed.
10.3 Transience and recurrence

It is often important to know whether a Markov chain will ever return to its initial state, and if so, how often. The notions of transience and recurrence address these questions.

Definition 10.14. The (first) visit time to the state j, denoted by τ_j(1), is defined as

    τ_j(1) = min{n ∈ N : X_n = j}.

As usual, τ_j(1) = +∞ if X_n ≠ j for all n ∈ N.

Note that the definition of the random variable τ_j(1) differs from the definition of τ_j in that the minimum here is taken over the set N of natural numbers, while the set of non-negative integers N_0 is used for τ_j. When X_0 ≠ j, the hitting time τ_j and the visit time τ_j(1) coincide. The important difference occurs when X_0 = j. In that case τ_j = 0 (we are already there), but it is always true that τ_j(1) ≥ 1. It can even happen that P_i[τ_i(1) = ∞] = 1.
Definition 10.15. A state i ∈ S is said to be

1. recurrent if P_i[τ_i(1) < ∞] = 1,

2. positive recurrent if E_i[τ_i(1)] < ∞ (E_i means expectation when the probability is P_i),

3. null recurrent if it is recurrent, but not positive recurrent,

4. transient if it is not recurrent.

A state is recurrent if we are sure we will come back to it eventually (with probability 1). It is positive recurrent if the time between two consecutive visits has finite expectation. Null recurrence means that we will return, but the waiting time may be very long. A state is transient if there is a positive chance (however small) that the chain will never return to it.

Remember that the greatest common divisor (GCD) of a set A of natural numbers is the largest number d such that d divides each k ∈ A, i.e., such that each k ∈ A is of the form k = ld for some l ∈ N.

Definition 10.16. The period d(i) of a state i ∈ S is the greatest common divisor of the return-time set R(i) = {n ∈ N : p^{(n)}_{ii} > 0} of the state i. When R(i) = ∅, we set d(i) = 1. A state i ∈ S is called aperiodic if d(i) = 1.

Example 10.17. Consider the Markov chain with three states and the transition matrix

    P = ( 0  1  0 )
        ( 0  0  1 )
        ( 1  0  0 ).

The return set for each state i ∈ {1, 2, 3} is given by

    R(i) = {3, 6, 9, 12, ...},

so d(i) = 3 for all i ∈ {1, 2, 3}. However, if we change the probabilities a bit:

    P̂ = ( 0    1    0  )
        ( 0    0    1  )
        ( 1/2  0   1/2 ),

the situation changes drastically:

    R(1) = {3, 4, 5, 6, ...},  R(2) = {3, 4, 5, 6, ...},  R(3) = {1, 2, 3, 4, 5, 6, ...},

so that d(i) = 1 for i ∈ {1, 2, 3}.
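The periods in Example 10.17 can be recomputed mechanically as the gcd of a truncated return set {n ≤ max_n : p^{(n)}_{ii} > 0}. A Python sketch (the function period is an illustrative assumption; states are indexed from 0, and truncating at max_n = 50 is enough for these small chains):

```python
import numpy as np
from math import gcd
from functools import reduce

def period(i, P, max_n=50):
    """Gcd of the return times n <= max_n with p^(n)_ii > 0 (a sketch;
    the truncation is harmless for small chains like the ones above)."""
    R = []
    Pn = np.eye(len(P))
    for n in range(1, max_n + 1):
        Pn = Pn @ P                 # Pn is now the n-th power of P
        if Pn[i, i] > 0:
            R.append(n)
    return reduce(gcd, R) if R else 1   # the notes set d(i) = 1 when R(i) is empty

P = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])              # the cyclic chain from Example 10.17
Phat = np.array([[0, 1, 0],
                 [0, 0, 1],
                 [0.5, 0, 0.5]])       # the perturbed chain
```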
10.4 Examples

Random walk: p ∈ (0,1).

Communication and classes. Clearly, it is possible to go from any state i to either i+1 or i-1 in one step, so i → i+1 and i → i-1 for all i ∈ S. By transitivity of communication, we have i → i+1 → i+2 → ... → i+k. Similarly, i → i-k for any k ∈ N. Therefore, i → j for all i, j ∈ S, and so, i ↔ j for all i, j ∈ S, and the whole S is one big class.

Closed sets. The only closed set is S itself.

Transience and recurrence. We studied transience and recurrence in the lectures about random walks (we just did not call them that). The situation depends heavily on the probability p of making an up-step. If p > 1/2, there is a positive probability that the first step will be up, so that X_1 = 1. Then, we know that there is a positive probability that the walk will never hit 0 again. Therefore, there is a positive probability of never returning to 0, which means that the state 0 is transient. A similar argument can be made for any state i and any probability p ≠ 1/2. What happens when p = 1/2? In order to come back to 0, the walk needs to return there from its position at time n = 1. If it went up, then we have to wait for the walk to hit 0 starting from 1. We have shown that this will happen sooner or later, but that the expected time it takes is infinite. The same argument works if X_1 = -1. All in all, 0 (and all other states) are null recurrent (recurrent, but not positive recurrent).

Periodicity. Starting from any state i ∈ S, we can return to it after 2, 4, 6, ... steps. Therefore, the return set R(i) is always given by R(i) = {2, 4, 6, ...} and so d(i) = 2 for all i ∈ S.

Gambler's ruin: p ∈ (0,1).

Communication and classes. The winning state a and the losing state 0 are clearly absorbing, and form one-element classes. The other a-1 states intercommunicate among each other, so they form a class of their own. This class is not closed (you can - and will - exit it and get absorbed sooner or later).

Transience and recurrence. The absorbing states 0 and a are (trivially) positive recurrent. All the other states are transient: starting from any state i ∈ {1, 2, ..., a-1}, there is a positive probability (equal to p^{a-i}) of winning every one of the next a-i games and, thus, getting absorbed in a before returning to i.

Periodicity. The absorbing states have period 1 since R(0) = R(a) = N. The other states have period 2 (just like in the case of a random walk).
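The absorption behavior above can also be watched by simulation. A Python sketch (the function name and parameters are illustrative assumptions); for p = 1/2 the classical win probability from fortune i is i/a, which the empirical frequency should approximate:

```python
import random

def play_until_absorbed(i, a, p, rng):
    """Run one gambler's-ruin game from fortune i until absorption at 0 or a;
    return the absorbing state reached (illustrative sketch)."""
    x = i
    while 0 < x < a:
        x += 1 if rng.random() < p else -1
    return x

rng = random.Random(7)
a, p, i = 5, 0.5, 2
n_games = 20000
wins = sum(play_until_absorbed(i, a, p, rng) == a for _ in range(n_games))
win_freq = wins / n_games   # should be close to i/a = 0.4 for p = 1/2
```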
Deterministically monotone Markov chain.

Communication and classes. A state i communicates with the state j if and only if j ≥ i. Therefore, i ↔ j if and only if i = j, and so, each i ∈ S is in a class by itself.

Closed sets. The closed sets are precisely the sets of the form B = {i, i+1, i+2, ...}, for i ∈ N.

Transience and recurrence. All states are transient.

Periodicity. The return set R(i) is empty for each i ∈ S, so d(i) = 1, for all i ∈ S.

A game of tennis.

Communication and classes. All the states except for those in E = {(40,Adv), (40,40), (Adv,40), "Amélie wins", "Björn wins"} intercommunicate only with themselves, so each i ∈ S \ E is in a class by itself. The winning states "Amélie wins" and "Björn wins" are absorbing, and, so, also form classes with one element. Finally, the three states in {(40,Adv), (40,40), (Adv,40)} intercommunicate with each other, so they form the last class.

Periodicity. The states i in S \ E have the property that p^{(n)}_{ii} = 0 for all n ∈ N, so d(i) = 1. The winning states are absorbing, so d(i) = 1 for i ∈ {"Amélie wins", "Björn wins"}. Finally, the return set for the remaining three states is {2, 4, 6, ...}, so their period is 2.

Chapter 11
More on Transience and recurrence
11.1 A criterion for recurrence
The definition of recurrence from the previous lecture is conceptually simple, but it gives us no clue about how to actually decide whether a particular state in a specific Markov chain is recurrent. A criterion stated entirely in terms of the transition matrix P would be nice.
Before we give it, we need to introduce some notation. For two (not necessarily different) states i, j ∈ S, let f^{(n)}_{ij} be the probability that it will take exactly n steps for the first visit to j (starting from i) to occur, i.e.,

    f^{(n)}_{ij} = P_i[τ_j(1) = n] = P[X_n = j, X_{n-1} ≠ j, X_{n-2} ≠ j, ..., X_2 ≠ j, X_1 ≠ j | X_0 = i].

Let f_{ij} denote the probability that j will eventually be reached from i, i.e.,

    f_{ij} = ∑_{n=1}^{∞} f^{(n)}_{ij}.

Clearly, we have the following equivalence: i is recurrent if and only if f_{ii} = 1.

The reason the quantities f^{(n)}_{ij} are useful lies in the following recursive relationship:
Proposition 11.1. For n ∈ N and i, j ∈ S, we have

    p^{(n)}_{ij} = ∑_{m=1}^{n} p^{(n-m)}_{jj} f^{(m)}_{ij}.

Proof. In preparation for the rest of the proof, let us reiterate that the event {τ_j(1) = m} can be written as

    {τ_j(1) = m} = {X_m = j, X_{m-1} ≠ j, X_{m-2} ≠ j, ..., X_1 ≠ j}.

Using the law of total probability, we can split the event {X_n = j} according to the value of the random variable τ_j(1) (the values of τ_j(1) larger than n do not appear in the sum since X_n cannot be equal to j in those cases):

    p^{(n)}_{ij} = P[X_n = j | X_0 = i] = ∑_{m=1}^{n} P[X_n = j | τ_j(1) = m, X_0 = i] P[τ_j(1) = m | X_0 = i]
                 = ∑_{m=1}^{n} P[X_n = j | X_m = j, X_{m-1} ≠ j, ..., X_1 ≠ j, X_0 = i] f^{(m)}_{ij}
                 = ∑_{m=1}^{n} P[X_n = j | X_m = j] f^{(m)}_{ij} = ∑_{m=1}^{n} p^{(n-m)}_{jj} f^{(m)}_{ij}.

An important corollary to Proposition 11.1 is the following characterization of recurrence:
Proposition 11.2. A state i ∈ S is recurrent if and only if

    ∑_{n=1}^{∞} p^{(n)}_{ii} = +∞.

Proof. For i = j, Proposition 11.1 states that

    p^{(n)}_{ii} = ∑_{m=1}^{n} p^{(n-m)}_{ii} f^{(m)}_{ii}.

Summing over all n from 1 to N ∈ N, we get

    ∑_{n=1}^{N} p^{(n)}_{ii} = ∑_{n=1}^{N} ∑_{m=1}^{n} p^{(n-m)}_{ii} f^{(m)}_{ii} = ∑_{n=1}^{N} ∑_{m=1}^{N} 1_{{m ≤ n}} p^{(n-m)}_{ii} f^{(m)}_{ii}.    (11.1)

We set s_N = ∑_{n=1}^{N} p^{(n)}_{ii} for N ∈ N, remember that p^{(0)}_{ii} = 1, and interchange the order of summation to get

    s_N = ∑_{m=1}^{N} ∑_{n=m}^{N} p^{(n-m)}_{ii} f^{(m)}_{ii} = ∑_{m=1}^{N} f^{(m)}_{ii} ∑_{n=0}^{N-m} p^{(n)}_{ii} = ∑_{m=1}^{N} f^{(m)}_{ii} (1 + s_{N-m}).

The sequence {s_N}_{N∈N} is non-decreasing, so

    s_N ≤ ∑_{m=1}^{N} f^{(m)}_{ii} (1 + s_N) ≤ (1 + s_N) f_{ii}.

Therefore,

    f_{ii} ≥ lim_N s_N / (1 + s_N).

If ∑_{n=1}^{∞} p^{(n)}_{ii} = +∞, then lim_{N→∞} s_N/(1 + s_N) = 1, so f_{ii} ≥ 1. On the other hand, f_{ii} = P_i[τ_i(1) < ∞], so f_{ii} ≤ 1. Therefore, f_{ii} = 1, and, so, the state i is recurrent.

Conversely, suppose that i is recurrent, i.e., f_{ii} = 1. We can repeat the procedure from above (but with +∞ instead of N, and using Tonelli's theorem to interchange the order of the sums) to get

    ∑_{n=1}^{∞} p^{(n)}_{ii} = ∑_{m=1}^{∞} f^{(m)}_{ii} (1 + ∑_{n=1}^{∞} p^{(n)}_{ii}) = 1 + ∑_{n=1}^{∞} p^{(n)}_{ii}.

This can happen only if ∑_{n=1}^{∞} p^{(n)}_{ii} = +∞.
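For the simple random walk, the criterion of Proposition 11.2 can be watched in action: by the formula from the random-walk lectures, p^{(2k)}_{00} = C(2k, k)(p(1-p))^k (and p^{(n)}_{00} = 0 for odd n), and the partial sums of these probabilities keep growing precisely when p = 1/2. A Python sketch:

```python
def partial_sum(N, p):
    """Sum of the return probabilities p^(n)_00, n = 1..N, for the simple
    random walk with up-step probability p.  Odd n contribute 0, and the
    even terms p^(2k)_00 = C(2k, k) (p(1-p))^k are built by a stable
    recursion (the ratio of consecutive even terms is 2(2k-1)/k * p(1-p))."""
    pq = p * (1 - p)
    term, total = 1.0, 0.0        # term starts at p^(0)_00 = 1
    for k in range(1, N // 2 + 1):
        term *= 2 * (2 * k - 1) / k * pq
        total += term
    return total

# For p = 1/2 the partial sums diverge (0 is recurrent); for p != 1/2 they
# settle down to a finite value (0 is transient).
s_half_1k = partial_sum(1000, 0.5)
s_half_4k = partial_sum(4000, 0.5)
s_biased = partial_sum(4000, 0.7)   # known limit: 1/sqrt(1 - 4p(1-p)) - 1 = 1.5
```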
11.2 Class properties

Certain properties of states are shared by all elements of a class. Knowing which properties have this feature is useful for a simple reason - if you can check them for a single class member, you automatically know that all other elements of the class share them.

Definition 11.3. A property is called a class property if it holds for all states in a class whenever it holds for any one particular state in that class.

Put differently, a property is a class property if and only if either all states in a class have it or none does.
Proposition 11.4. Transience and recurrence are class properties.

Proof. Suppose that the state i is recurrent, and that j is in its class, i.e., that i ↔ j. Then, there exist natural numbers m and k such that p^{(m)}_{ij} > 0 and p^{(k)}_{ji} > 0. By the Chapman-Kolmogorov relations, for each n ∈ N, we have

    p^{(n+m+k)}_{jj} = ∑_{l_1∈S} ∑_{l_2∈S} p^{(k)}_{j l_1} p^{(n)}_{l_1 l_2} p^{(m)}_{l_2 j} ≥ p^{(k)}_{ji} p^{(n)}_{ii} p^{(m)}_{ij}.

In other words, there exists a positive constant c (take c = p^{(k)}_{ji} p^{(m)}_{ij}), independent of n, such that

    p^{(n+m+k)}_{jj} ≥ c p^{(n)}_{ii}.

Therefore, by recurrence of i we have ∑_{n=1}^{∞} p^{(n)}_{ii} = +∞, and

    ∑_{n=1}^{∞} p^{(n)}_{jj} ≥ ∑_{n=m+k+1}^{∞} p^{(n)}_{jj} = ∑_{n=1}^{∞} p^{(n+m+k)}_{jj} ≥ c ∑_{n=1}^{∞} p^{(n)}_{ii} = +∞,

and so, j is recurrent. Therefore, recurrence is a class property.

Since transience is just the opposite of recurrence, it is clear that transience is also a class property.

Proposition 11.5. Period is a class property, i.e., all elements of a class have the same period.

Proof. Let d = d(i) be the period of the state i, and let j ↔ i. Then, there exist natural numbers m and k such that p^{(m)}_{ij} > 0 and p^{(k)}_{ji} > 0. By Chapman-Kolmogorov,

    p^{(m+k)}_{ii} ≥ p^{(m)}_{ij} p^{(k)}_{ji} > 0, and so m + k ∈ R(i).    (11.2)

Similarly, for any n ∈ R(j),

    p^{(m+k+n)}_{ii} ≥ p^{(m)}_{ij} p^{(n)}_{jj} p^{(k)}_{ji} > 0, so m + k + n ∈ R(i).    (11.3)

By (11.2), d(i) divides m + k and, by (11.3), d(i) divides m + k + n. Therefore, d(i) divides n for each n ∈ R(j), and so, d(i) ≤ d(j), because d(j) is the greatest common divisor of R(j). The same argument with the roles of i and j switched shows that d(j) ≤ d(i). Therefore, d(i) = d(j).

11.3 A canonical decomposition
Now that we know that transience and recurrence are class properties, we can introduce the notion of the canonical decomposition of a Markov chain. Let S_1, S_2, ... be the collection of all classes; some of them contain recurrent states and some transient ones. Proposition 11.4 tells us that if there is one recurrent state in a class, then all states in the class must be recurrent. Thus, it makes sense to call the whole class recurrent. Similarly, the classes which are not recurrent consist entirely of transient states, so we call them transient. There are at most countably many states, so the number of all classes is also at most countable. In particular, there are only countably (or finitely) many recurrent classes, and we usually denote them by C_1, C_2, .... Transient classes are denoted by T_1, T_2, .... There is no particular rule in the choice of indices 1, 2, 3, ... for particular classes. The only point is that they can be enumerated, because there are at most countably many of them.

The distinction between different transient classes is usually not very important, so we pack all transient states together into a set T = T_1 ∪ T_2 ∪ ....

Definition 11.6. Let S be the state space of a Markov chain {X_n}_{n∈N_0}. Let C_1, C_2, ... be its recurrent classes, and T_1, T_2, ... the transient ones, and let T = T_1 ∪ T_2 ∪ ... be their union. The decomposition

    S = T ∪ C_1 ∪ C_2 ∪ C_3 ∪ ...

is called the canonical decomposition of the (state space of the) Markov chain {X_n}_{n∈N_0}.

The reason that recurrent classes are important is simple - they can be interpreted as Markov chains themselves. In order for such an interpretation to be possible, we need to make sure that the Markov chain stays in a recurrent class if it starts there. In other words, we have the following important proposition:

CHAPTER 11. MORE ON TRANSIENCE AND RECURRENCE
Proposition 11.7. Recurrent classes are closed.

Proof. Suppose, contrary to the statement of the proposition, that there exist two states $i \ne j$ such that

1. $i$ is recurrent,
2. $i \to j$, and
3. $j \not\to i$.

The idea of the proof is the following: whenever the transition $i \to j$ occurs (perhaps in several steps), the chain will never return to $i$, since $j \not\to i$. By recurrence of $i$, that is not possible. Therefore $i \not\to j$ - a contradiction.
More formally, the recurrence of $i$ means that (starting from $i$, i.e., under $\mathbb{P}_i$) the event
$$A = \{X_n = i \text{ for infinitely many } n \in \mathbb{N}\}$$
has probability $1$, i.e., $\mathbb{P}_i[A] = 1$. The law of total probability implies that
$$1 = \mathbb{P}_i[A] = \sum_{n \in \mathbb{N}} \mathbb{P}_i[A \mid \tau_j(1) = n]\, \mathbb{P}_i[\tau_j(1) = n] + \mathbb{P}_i[A \mid \tau_j(1) = +\infty]\, \mathbb{P}_i[\tau_j(1) = +\infty]. \tag{11.4}$$
On the event $\{\tau_j(1) = n\}$, the chain can visit the state $i$ at most $n-1$ times, because it cannot hit $i$ after it visits $j$ (remember that $j \not\to i$). Therefore, $\mathbb{P}[A \mid \tau_j(1) = n] = 0$ for all $n \in \mathbb{N}$. It follows that
$$1 = \mathbb{P}_i[A \mid \tau_j(1) = +\infty]\, \mathbb{P}_i[\tau_j(1) = +\infty].$$
Both of the terms on the right-hand side above are probabilities whose product equals $1$, which forces both of them to be equal to $1$. In particular, $\mathbb{P}_i[\tau_j(1) = +\infty] = 1$ or, phrased differently, $i \not\to j$ - a contradiction.

Together with the canonical decomposition, we introduce the canonical form of the transition matrix $P$. The idea is to order the states in $S$ with the canonical decomposition in mind. We start with all the states in $C_1$, followed by all the states in $C_2$, etc. Finally, we include all the states in $T$. The resulting matrix looks like this:
$$P = \begin{bmatrix}
P_1 & 0 & 0 & \dots \\
0 & P_2 & 0 & \dots \\
0 & 0 & P_3 & \dots \\
\vdots & \vdots & \vdots & \ddots \\
Q_1 & Q_2 & Q_3 & \dots
\end{bmatrix},$$
where the entries should be interpreted as matrices: $P_1$ is the transition matrix within the first class, i.e., $P_1 = (p_{ij};\ i \in C_1,\ j \in C_1)$, etc., and $Q_k$ contains the transition probabilities from the transient states to the states in the (recurrent) class $C_k$. Note that Proposition 11.7 implies that each $P_k$ is a stochastic matrix or, equivalently, that all the entries in the rows of $P_k$ outside of $P_k$ are zeros.

We finish the discussion of the canonical decomposition with an important result and one of its consequences.

Proposition 11.8. Suppose that the state space $S$ is finite. Then there exists at least one recurrent state.

Proof. The transition matrix $P$ is stochastic, and so are all its powers $P^n$. In particular, we have
$$1 = \sum_{j \in S} p^{(n)}_{ij} \quad \text{for all } i \in S.$$
Summing the above equality over all $n \in \mathbb{N}$ and switching the order of summation gives
$$+\infty = \sum_{n \in \mathbb{N}} 1 = \sum_{n \in \mathbb{N}} \sum_{j \in S} p^{(n)}_{ij} = \sum_{j \in S} \sum_{n \in \mathbb{N}} p^{(n)}_{ij}.$$
We conclude that $\sum_{n \in \mathbb{N}} p^{(n)}_{ij} = +\infty$ for at least one state $j$ (this is where the finiteness of $S$ is crucial).

We claim that $j$ is a recurrent state. Suppose, to the contrary, that it is transient. If we sum the equality from Proposition 11.1 over $n \in \mathbb{N}$, we get
$$\sum_{n \in \mathbb{N}} p^{(n)}_{ij} = \sum_{n \in \mathbb{N}} \sum_{m=1}^{n} p^{(n-m)}_{jj} f^{(m)}_{ij} = \sum_{m \in \mathbb{N}} \sum_{n=m}^{\infty} p^{(n-m)}_{jj} f^{(m)}_{ij} = \sum_{m \in \mathbb{N}} f^{(m)}_{ij} \sum_{n=0}^{\infty} p^{(n)}_{jj} = f_{ij} \sum_{n=0}^{\infty} p^{(n)}_{jj}. \tag{11.5}$$
Transience of $j$ implies that $\sum_{n=1}^{\infty} p^{(n)}_{jj} < \infty$ and, by definition, $f_{ij} \le 1$ for any $i, j$. Therefore, $\sum_{n \in \mathbb{N}} p^{(n)}_{ij} < \infty$ - a contradiction.

Remark 11.9. If $S$ is not finite, it is not true that recurrent states must exist. Just remember the Deterministically-monotone Markov chain example, or the random walk with $p \ne \frac{1}{2}$. All states are transient there.
In the finite state-space case, we have the following dichotomy:

Corollary 11.10. A class of a Markov chain on a finite state space is recurrent if and only if it is closed.
Proof. We know that recurrent classes are closed. In order to show the converse, we need to prove that transient classes are not closed. Suppose, to the contrary, that there exists a finite state-space Markov chain with a closed transient class $T$. Since $T$ is closed, we can view it as the state space of the restricted Markov chain. This new Markov chain has a finite number of states, so there exists a recurrent state. This is a contradiction with the assumption that $T$ consists only of transient states.

Remark 11.11. Again, finiteness is necessary. For a random walk on $\mathbb{Z}$, all states intercommunicate. In particular, there is only one class, $\mathbb{Z}$ itself, and it is trivially closed. If $p \ne \frac{1}{2}$, however, all states are transient, and so $\mathbb{Z}$ is a closed and transient class.
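Corollary 11.10 suggests a mechanical way to carry out the canonical decomposition of a finite chain: compute the communication classes (mutual reachability) and mark a class recurrent exactly when it is closed. Here is a small Python sketch of that procedure (the helper's name and the use of a transitive-closure loop are our own choices; the example chain is Gambler's ruin with $a = 3$, which reappears in the next chapter):

```python
# Classify the states of a finite Markov chain into communication classes,
# marking a class recurrent iff it is closed (Corollary 11.10).

def classify(P):
    n = len(P)
    # reach[i][j]: can the chain get from i to j in zero or more steps?
    reach = [[i == j or P[i][j] > 0 for j in range(n)] for i in range(n)]
    for k in range(n):                    # transitive closure (Floyd-Warshall style)
        for i in range(n):
            for j in range(n):
                if reach[i][k] and reach[k][j]:
                    reach[i][j] = True
    classes, seen = [], set()
    for i in range(n):
        if i in seen:
            continue
        cls = {j for j in range(n) if reach[i][j] and reach[j][i]}
        seen |= cls
        # closed iff no positive transition probability leaves the class
        closed = all(P[a][b] == 0 for a in cls for b in range(n) if b not in cls)
        classes.append((sorted(cls), closed))
    return classes

# Gambler's ruin with a = 3: states 0, 1, 2, 3; the states 0 and 3 are absorbing.
p = 0.4
P = [[1, 0, 0, 0],
     [1 - p, 0, p, 0],
     [0, 1 - p, 0, p],
     [0, 0, 0, 1]]
for cls, closed in classify(P):
    print(cls, "recurrent" if closed else "transient")
# → [0] recurrent, [1, 2] transient, [3] recurrent
```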

Chapter 12

Absorption and reward

Note: Even though it is not by any means necessary, we assume from this point onwards that all Markov chains have finite state spaces.
12.1 Absorption

Remember the Tennis example from a few lectures ago and the question we asked there, namely: how does the probability of winning a single point affect the probability of winning the overall game? An algorithm that will help you answer that question is described in this lecture.

The first step is to understand the structure of the question in light of the canonical decomposition of the previous lecture. In the Tennis example, all the states except for the winning ones are transient, and there are two one-element recurrent classes, {Amélie wins} and {Björn wins}. The chain starts from the transient state $(0,0)$, moves around a bit, and eventually gets absorbed in one of the two. The probability we are interested in is not the probability that the chain will eventually get absorbed; that probability is always $1$ (this is true in every finite Markov chain, but we do not give a proof of it here). We are, instead, interested in the probability that the absorption will occur in a particular state - the state {Amélie wins} in the Tennis example.
A more general version of the problem above is the following: let $i \in S$ be any state, and let $j$ be a recurrent state. If the set of all recurrent states is denoted by $C$, and if $\tau_C$ is the first hitting time of the set $C$, then $X_{\tau_C}$ denotes the first recurrent state visited by the chain. Equivalently, $X_{\tau_C}$ is the value of $X$ at the (random) time $\tau_C$; its value is the name of the state in which the chain happens to find itself the first time it hits the set of all recurrent states. For any two states $i, j \in S$, the absorption probability $u_{ij}$ is defined as
$$u_{ij} = \mathbb{P}_i[X_{\tau_C} = j] = \mathbb{P}_i[\text{the first recurrent state visited by } X \text{ is } j].$$
When $j$ is not a recurrent state, then $u_{ij} = 0$; $j$ cannot possibly be the first recurrent state we hit - it is not even recurrent. When $i = j$ is a recurrent state, then $u_{ii} = 1$ - we are in $i$ right from the start. The situation $i \in T$, $j \in C$ is the interesting one.
In many calculations related to Markov chains, the method of first-step decomposition works miracles. Simply, we cut the probability space according to what happened in the first step and use the law of total probability (assuming $i \in T$, $j \in C$):
$$u_{ij} = \mathbb{P}_i[X_{\tau_C} = j] = \sum_{k \in S} \mathbb{P}[X_{\tau_C} = j \mid X_0 = i,\ X_1 = k]\, \mathbb{P}[X_1 = k \mid X_0 = i] = \sum_{k \in S} \mathbb{P}[X_{\tau_C} = j \mid X_1 = k]\, p_{ik}.$$
The conditional probability $\mathbb{P}[X_{\tau_C} = j \mid X_1 = k]$ is an absorption probability, too. If $k = j$, then $\mathbb{P}[X_{\tau_C} = j \mid X_1 = k] = 1$. If $k \in C \setminus \{j\}$, then we are already in $C$, but in a state different from $j$, so $\mathbb{P}[X_{\tau_C} = j \mid X_1 = k] = 0$. Therefore, the sum above can be written as
$$u_{ij} = \sum_{k \in T} p_{ik} u_{kj} + p_{ij}, \tag{12.1}$$
which is a system of linear equations for the family $(u_{ij};\ i \in T,\ j \in C)$.

Linear systems are typically better understood when represented in matrix form. Let $U$ be a $T \times C$ matrix $U = (u_{ij};\ i \in T,\ j \in C)$, let $Q$ be the portion of the transition matrix $P$ corresponding to the transitions from $T$ to $T$, i.e., $Q = (p_{ij};\ i \in T,\ j \in T)$, and let $R$ contain all transitions from $T$ to $C$, i.e., $R = (p_{ij})_{i \in T,\ j \in C}$. If $P_C$ denotes the matrix of all transitions from $C$ to $C$, i.e., $P_C = (p_{ij};\ i \in C,\ j \in C)$, then the canonical form of $P$ looks like this:
$$P = \begin{bmatrix} P_C & 0 \\ R & Q \end{bmatrix}.$$
The system (12.1) now becomes
$$U = QU + R, \quad \text{i.e.,} \quad (I - Q)U = R.$$
If the matrix $I - Q$ happens to be invertible, we are in business, because we then have an explicit expression for $U$:
$$U = (I - Q)^{-1} R.$$
So, is $I - Q$ invertible? It is when the state space $S$ is finite, but you will not see the proof in these notes. When the inverse $(I - Q)^{-1}$ exists, it is called the fundamental matrix of the Markov chain.
Example 12.1. Before we turn to the Tennis example, let us analyze the simpler case of Gambler's ruin with $a = 3$. The states $0$ and $3$ are absorbing, and all the others are transient. Therefore $C_1 = \{0\}$, $C_2 = \{3\}$ and $T = T_1 = \{1, 2\}$. The transition matrix $P$ in the canonical form (the rows and columns represent the states in the order $0, 3, 1, 2$) is
$$P = \begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
1-p & 0 & 0 & p \\
0 & p & 1-p & 0
\end{bmatrix}.$$
Therefore,
$$R = \begin{bmatrix} 1-p & 0 \\ 0 & p \end{bmatrix} \quad \text{and} \quad Q = \begin{bmatrix} 0 & p \\ 1-p & 0 \end{bmatrix}.$$

The matrix $I - Q$ is a $2 \times 2$ matrix, so it is easy to invert:
$$(I - Q)^{-1} = \frac{1}{1 - p + p^2} \begin{bmatrix} 1 & p \\ 1-p & 1 \end{bmatrix}.$$
So
$$U = \frac{1}{1 - p + p^2} \begin{bmatrix} 1 & p \\ 1-p & 1 \end{bmatrix} \begin{bmatrix} 1-p & 0 \\ 0 & p \end{bmatrix} = \begin{bmatrix} \frac{1-p}{1-p+p^2} & \frac{p^2}{1-p+p^2} \\[4pt] \frac{(1-p)^2}{1-p+p^2} & \frac{p}{1-p+p^2} \end{bmatrix}.$$
Therefore, for example, if the initial wealth is $1$, the probability of getting rich before bankruptcy is $p^2/(1 - p + p^2)$.

We have already calculated this probability in one of the homework problems (HW4, Problem 4.4). There, we obtained that the desired probability equals $\big(1 - \frac{q}{p}\big)\big/\big(1 - (\frac{q}{p})^3\big)$. You can check that these two expressions are really the same.
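The same computation is easy to reproduce numerically. The following NumPy sketch (ours, not part of the notes) solves $(I - Q)U = R$ for the Gambler's ruin chain above and checks the closed-form entry $p^2/(1 - p + p^2)$:

```python
import numpy as np

# Absorption probabilities for Gambler's ruin with a = 3 (Example 12.1):
# transient states T = {1, 2}, recurrent classes {0} and {3}.
p = 0.4
Q = np.array([[0.0, p],
              [1 - p, 0.0]])            # transitions within T = {1, 2}
R = np.array([[1 - p, 0.0],
              [0.0, p]])                # transitions from T into {0} and {3}
U = np.linalg.solve(np.eye(2) - Q, R)   # solves (I - Q) U = R

# Starting from wealth 1, P[reach 3 before 0] = p^2 / (1 - p + p^2).
assert np.isclose(U[0, 1], p**2 / (1 - p + p**2))
assert np.allclose(U.sum(axis=1), 1.0)  # absorption somewhere is certain
print(U)
```

Using `np.linalg.solve` instead of forming $(I - Q)^{-1}$ explicitly is the standard numerically stable choice; for a $2 \times 2$ system the difference is of course invisible.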
Example 12.2. In the Tennis example, the transition matrix is $20 \times 20$, with only $2$ recurrent states (each in its own class). The matrix given in Lecture 8 is already in the canonical form (the recurrent states correspond to the first two rows/columns). In order to get $(I - Q)^{-1}$, we need to invert an $18 \times 18$ matrix. This is a job for a computer, and we use Mathematica ($P$ is the full transition matrix, and $3$ is the position of the state $(0,0)$ in the internal representation; after we remove the absorbing states, the number of $(0,0)$ becomes $1$ in the Mathematica internal order, and that is why we extract the element in the first row and first column of $U$ to get the result):

In[114]:= Q = P[[3 ;; 20, 3 ;; 20]]; R = P[[3 ;; 20, 1 ;; 2]];
In[115]:= F = Inverse[IdentityMatrix[18] - Q];
In[116]:= U = F.R;
In[117]:= U[[1, 1]]
Out[117]= p^4 + 4 p^4 q + 10 p^4 q^2 + (20 p^5 q^3)/(1 - 2 p q)

[Plot: this probability as a function of $p$, increasing from $0$ at $p = 0$ to $1$ at $p = 1$.]

12.2 Expected reward

Suppose that each time you visit a transient state $j \in T$ you receive a reward $g(j) \in \mathbb{R}$. The name reward is a bit misleading, since a negative $g(j)$ corresponds more to a fine than to a reward; it is just a name, anyway. Can we compute the expected total reward before absorption,
$$v_i = \mathbb{E}_i\Big[\sum_{n=0}^{\tau_C} g(X_n)\Big]$$
(where we set $g = 0$ outside $T$)?
And if we can, what is it good for? Many things, actually, as the following two special cases show:

1. If $g(j) = 1$ for all $j \in T$, then $v_i$ is the expected time until absorption. We will calculate $v_{(0,0)}$ in the Tennis example to compute the expected duration of a tennis game.
2. If $g(k) = 1$ and $g(j) = 0$ for $j \ne k$, then $v_i$ is the expected number of visits to the state $k$ before absorption. In the Tennis example, if $k = (40, 40)$, the value of $v_{(0,0)}$ is the expected number of times the score $(40, 40)$ is seen in a tennis game.
We compute $v_i$ using the first-step decomposition:
$$\begin{aligned}
v_i &= \mathbb{E}\Big[\sum_{n=0}^{\tau_C} g(X_n) \,\Big|\, X_0 = i\Big] = g(i) + \mathbb{E}\Big[\sum_{n=1}^{\tau_C} g(X_n) \,\Big|\, X_0 = i\Big] \\
&= g(i) + \sum_{k \in S} \mathbb{E}\Big[\sum_{n=1}^{\tau_C} g(X_n) \,\Big|\, X_0 = i,\ X_1 = k\Big]\, \mathbb{P}[X_1 = k \mid X_0 = i] \\
&= g(i) + \sum_{k \in S} p_{ik}\, \mathbb{E}\Big[\sum_{n=1}^{\tau_C} g(X_n) \,\Big|\, X_1 = k\Big].
\end{aligned} \tag{12.2}$$
If $k \in T$, then the homogeneity of the chain implies that
$$\mathbb{E}\Big[\sum_{n=1}^{\tau_C} g(X_n) \,\Big|\, X_1 = k\Big] = \mathbb{E}\Big[\sum_{n=0}^{\tau_C} g(X_n) \,\Big|\, X_0 = k\Big] = v_k.$$
When $k \notin T$, then
$$\mathbb{E}\Big[\sum_{n=1}^{\tau_C} g(X_n) \,\Big|\, X_1 = k\Big] = 0,$$
because we have arrived and no more rewards are going to be collected. Therefore, for $i \in T$ we have
$$v_i = g(i) + \sum_{k \in T} p_{ik} v_k.$$
If we organize all $v_i$ and all $g(i)$ into column vectors $v = (v_i;\ i \in T)$, $g = (g(i);\ i \in T)$, we get
$$v = Qv + g, \quad \text{i.e.,} \quad v = (I - Q)^{-1} g.$$
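As a quick numerical illustration (our own sketch, using the Gambler's ruin chain of Example 12.1 and the reward $g \equiv 1$), the formula $v = (I - Q)^{-1} g$ gives the expected time until absorption:

```python
import numpy as np

# Expected number of steps until absorption for Gambler's ruin with a = 3,
# via v = (I - Q)^{-1} g with g = 1 on the transient states {1, 2}.
p = 0.5
Q = np.array([[0.0, p],
              [1 - p, 0.0]])
g = np.ones(2)                          # reward 1 per visit -> expected duration
v = np.linalg.solve(np.eye(2) - Q, g)

# For the symmetric walk the expected absorption time from wealth i is i(a - i),
# so here v = (2, 2).
assert np.allclose(v, [2.0, 2.0])
print(v)
```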
Having derived the general formula for various rewards, we can give an interpretation of the fundamental matrix itself. Let us pick a transient state $j$ and use the reward function $g$ given by
$$g(k) = \mathbf{1}_{\{k = j\}} = \begin{cases} 1, & k = j \\ 0, & k \ne j. \end{cases}$$
By the discussion above, the $i$-th entry in $v = (I - Q)^{-1} g$ is the expected reward when we start from the state $i$. Given the form of the reward function, $v_i$ is the expected number of visits to the state $j$ when we start from $i$. On the other hand, as the product of the matrix $(I - Q)^{-1}$ and the vector $g = (0, 0, \dots, 1, \dots, 0)$, $v_i$ is nothing but the $(i,j)$-entry in $(I - Q)^{-1}$:

Proposition 12.3. For two transient states $i$ and $j$, the $(i,j)$-th entry in the fundamental matrix $(I - Q)^{-1}$ is the expected number of visits to the state $j$ prior to absorption if the chain starts from the state $i$ (the time $0$ is counted if $i = j$).
Example 12.4. We continue the analysis of the Tennis chain from Example 12.2. We set $g(i) = 1$ for all transient states $i$ to find the expected duration of a tennis game:

In[87]:= G = Table[1, {i, 1, 18}];
In[86]:= f[p_] = Simplify[(F.G)[[1, 1]] /. q -> 1 - p]
Out[86]= (4 (1 - p + p^2 + 6 p^3 - 18 p^4 + 18 p^5 - 6 p^6))/(1 - 2 p + 2 p^2)

[Plot: the expected duration as a function of $p$; it equals $4$ at $p = 0$ and $p = 1$ and peaks at $p = \frac{1}{2}$.]

When $p = 0$, the course of the game is totally predictable and Amélie wins in $4$ points. The same holds when $p = 1$, only it is Björn who wins with probability $1$ this time. In between, we see that the expected game-length varies between $4$ and about $7$ (actually, the exact maximum is $6.75$), and it is longest when $p = \frac{1}{2}$.

How about the expected number of deuces (scores $(40,40)$)? We can compute that too, by setting $g(i) = 0$ for $i \ne (40,40)$ and $g((40,40)) = 1$ (the number $16$ used below is just the Mathematica

internal number of the state $(40,40)$ among the transient states):

In[117]:= G = Table[If[i == 16, 1, 0], {i, 1, 18}]
Out[117]= {0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0}
In[118]:= f[p_] = Simplify[(F.G)[[1, 1]] /. q -> 1 - p]
Out[118]= (20 (1 - p)^3 p^3)/(1 - 2 p + 2 p^2)

The plot of the obtained expression, as a function of $p$, shows that the expected number of deuces varies between $0$ and a bit more than $0.6$ (the exact maximum is $0.625$ and corresponds to the case $p = \frac{1}{2}$). When asked, people would usually give a higher estimate for this quantity. The reason is that the distribution of the number of deuces is heavily skewed; a histogram of the number of deuces in a simulation of $1000$ tennis games with $p = \frac{1}{2}$ shows that most of the games have no deuces. However, in the cases where a deuce happens, it is quite possible that it will be repeated. From a psychological point of view, we tend to forget all the games without deuces and focus on those with at least one deuce when we make predictions. In fact, the expected number of deuces given that the game contains at least one deuce is approximately equal to $2.1$.
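The simulation described above is easy to reproduce. Here is a hypothetical Python sketch (our own point-by-point bookkeeping, not the notes' Mathematica session) estimating both the expected number of deuces and its conditional counterpart:

```python
import random

# Simulate tennis games: Björn wins each point with probability p; a deuce is
# any tied score with both players at >= 3 points. The game ends when one
# player has >= 4 points and leads by at least 2.

def deuces_in_game(p, rng):
    a = b = deuces = 0                  # Amélie's and Björn's points
    while not ((a >= 4 or b >= 4) and abs(a - b) >= 2):
        if rng.random() < p:
            b += 1                      # Björn wins the point
        else:
            a += 1                      # Amélie wins the point
        if a == b >= 3:
            deuces += 1                 # the score is deuce (again)
    return deuces

rng = random.Random(2010)
counts = [deuces_in_game(0.5, rng) for _ in range(20000)]
mean = sum(counts) / len(counts)        # theory: 20 p^3 q^3 / (1 - 2 p q) = 0.625
games_with_deuce = [c for c in counts if c >= 1]
cond_mean = sum(games_with_deuce) / len(games_with_deuce)
print(mean, cond_mean)
```

With $20{,}000$ games, the first estimate lands close to $0.625$, and the conditional mean close to $2$, matching the discussion above.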

Chapter 13

Stationary and Limiting Distributions

Transitions between different states of a Markov chain describe the short-time behavior of the chain. In most models used in the physical and social sciences, systems change states many times per second. In a rare few, the time scale of the steps is measured in hours or days. What is of interest, however, is the long-term behavior of the system, measured in thousands, millions, or even billions of steps. Here is an example: for a typical liquid stock traded on the New York Stock Exchange, there is a trade every few seconds, and each trade changes the price (state) of the stock a little bit. What is of interest to an investor is, however, the distribution of the stock price in 6 months, in a year or in 30 years - just in time for retirement. A back-of-an-envelope calculation shows that there are, approximately, 50 million trades in 30 years. So, a grasp of the very-long-time behavior of a Markov chain is one of the most important achievements of probability in general, and of stochastic-process theory in particular. We only scratch the surface in this lecture.
13.1 Stationary and limiting distributions

Definition 13.1. A stochastic process $\{X_n\}_{n \in \mathbb{N}_0}$ is said to be stationary if the random vectors
$$(X_0, X_1, X_2, \dots, X_k) \quad \text{and} \quad (X_m, X_{m+1}, X_{m+2}, \dots, X_{m+k})$$
have the same (joint) distribution for all $m, k \in \mathbb{N}_0$.
For stationary processes, all random variables $X_0, X_1, \dots$ have the same distribution (just take $k = 0$ in the definition). That condition is, however, only necessary: the pairs $(X_0, X_1)$ and $(X_m, X_{m+1})$ must also be equally distributed as random vectors, and the same goes for triplets, etc. Intuitively, a stochastic process is stationary if, statistically, it does not evolve; its probabilistic behavior today is the same as its probabilistic behavior in a billion years. It is sometimes useful to think about stationarity in the following way: if a system is left to evolve for a long time, it will reach an equilibrium state and fluctuate around it forever. We can expect that such a system will look similar a million years from now and a billion years from now. It might, however, not resemble its present state at all. Think about a glass of water into which we drop a tiny drop of ink. Immediately after that, the glass will be clear, with a tiny black speck. The ink starts to diffuse and the speck starts to grow immediately. It won't be long before the whole glass is of uniform black color - the ink has permeated every last corner of the glass. After that, nothing much happens. The ink will never spontaneously return to its initial state.[1] Ink is composed of many small particles which do not interact with each other too much. They do, however, get bombarded by the molecules of water, and this bombardment makes them behave as random walks which simply bounce back once they hit the glass wall (this phenomenon is called diffusion). Each ink particle will wander off in its own direction and, quite soon, they will be everywhere. Eventually, the distribution of the ink in the glass becomes very close to uniform, and no amount of further activity will change that - you just cannot get more random than the uniform distribution in a glass of water.

Let us get back to mathematics and give two simple examples: one of a process which is not stationary, and the other of a typical stationary process.
Example 13.2. The simple random walk is not stationary. Indeed, $X_0$ is a constant, while $X_1$ takes two values with equal probabilities, so they cannot have the same distribution.

For an example of a stationary process, take a regime-switching chain $\{X_n\}_{n \in \mathbb{N}_0}$ with $p_{01} = p_{10} = 1$ and the initial distribution $\mathbb{P}[X_0 = 0] = \mathbb{P}[X_0 = 1] = \frac{1}{2}$. Then $X_n = X_0$ if $n$ is even, and $X_n = 1 - X_0$ if $n$ is odd. Moreover, $X_0$ and $1 - X_0$ have the same distribution (Bernoulli with $p = \frac{1}{2}$), and so $X_0, X_1, \dots$ all have the same distribution. How about $k$-tuples? Why do $(X_0, X_1, \dots, X_k)$ and $(X_m, X_{m+1}, \dots, X_{m+k})$ have the same distribution? For $i_0, i_1, \dots, i_k \in \{0, 1\}$, by the Markov property, we have
$$\mathbb{P}[X_0 = i_0, X_1 = i_1, \dots, X_k = i_k] = \mathbb{P}[X_0 = i_0]\, p_{i_0 i_1} p_{i_1 i_2} \dots p_{i_{k-1} i_k} = \tfrac{1}{2}\, p_{i_0 i_1} p_{i_1 i_2} \dots p_{i_{k-1} i_k}.$$
In the same manner,
$$\mathbb{P}[X_m = i_0, X_{m+1} = i_1, \dots, X_{m+k} = i_k] = \mathbb{P}[X_m = i_0]\, p_{i_0 i_1} p_{i_1 i_2} \dots p_{i_{k-1} i_k} = \tfrac{1}{2}\, p_{i_0 i_1} p_{i_1 i_2} \dots p_{i_{k-1} i_k},$$
so the two distributions are identical.

The second example above is quite instructive. We took a Markov chain and gave it an initial distribution with the property that $X_0$ and $X_m$ have the same distribution for all $m \in \mathbb{N}_0$. Magically, the whole process became stationary. This is not a coincidence; we can play the same trick with any Markov chain, as long as an initial distribution with the above property can be found. Actually, such a distribution is so important that it even has a name:
Definition 13.3. A distribution $\pi = (\pi_i)_{i \in S}$ on the state space $S$ of a Markov chain with transition matrix $P$ is called a stationary distribution if
$$\mathbb{P}[X_1 = i] = \pi_i \text{ for all } i \in S, \quad \text{whenever} \quad \mathbb{P}[X_0 = i] = \pi_i \text{ for all } i \in S.$$
In words, $\pi$ is called a stationary distribution if the distribution of $X_1$ is equal to that of $X_0$ whenever the distribution of $X_0$ is $\pi$. Here is a hands-on characterization:

Proposition 13.4. A vector $\pi = (\pi_i;\ i \in S)$ with $\sum_{i \in S} \pi_i = 1$ is a stationary distribution if and only if
$$\pi P = \pi,$$
when $\pi$ is interpreted as a row vector. In that case the Markov chain with initial distribution $\pi$ and transition matrix $P$ is stationary, and the distribution of $X_m$ is $\pi$ for all $m \in \mathbb{N}_0$.

[1] It will, actually, but it will take an unimaginably long time.

Proof. Suppose, first, that $\pi$ is a stationary distribution, and let $\{X_n\}_{n \in \mathbb{N}_0}$ be a Markov chain with initial distribution $a^{(0)} = \pi$ and transition matrix $P$. Then
$$a^{(1)} = a^{(0)} P = \pi P.$$
By the assumption, the distribution $a^{(1)}$ of $X_1$ is $\pi$. Therefore, $\pi = \pi P$.

Conversely, suppose that $\pi = \pi P$. We can directly prove more than just the fact that $\pi$ is a stationary distribution: we can prove that the process $\{X_n\}_{n \in \mathbb{N}_0}$ is stationary. Let $\{X_n\}_{n \in \mathbb{N}_0}$ be a Markov chain with initial distribution $\pi$ and transition matrix $P$. We need to show that $\{X_n\}_{n \in \mathbb{N}_0}$ is stationary. In order to do that, we first note that all random variables $X_m$, $m \in \mathbb{N}_0$, have the same distribution. Indeed, the distribution $a^{(m)}$ of $X_m$ is given by
$$a^{(m)} = a^{(0)} P^m = \pi P^m = (\pi P) P^{m-1} = \pi P^{m-1} = \dots = \pi.$$
Next, we pick $m, k \in \mathbb{N}_0$ and a $(k+1)$-tuple $i_0, i_1, \dots, i_k$ of elements of $S$. By the Markov property, we have
$$\mathbb{P}[X_m = i_0, X_{m+1} = i_1, \dots, X_{m+k} = i_k] = \mathbb{P}[X_m = i_0]\, p_{i_0 i_1} p_{i_1 i_2} \dots p_{i_{k-1} i_k} = \pi_{i_0}\, p_{i_0 i_1} p_{i_1 i_2} \dots p_{i_{k-1} i_k}.$$
This last expression does not depend on $m$, so we can conclude that $\{X_n\}_{n \in \mathbb{N}_0}$ is stationary.

Example 13.5 (A model of diffusion in a glass). Let us get back to the story about the glass of water and analyze a simplified model of that phenomenon. Our glass will be represented by the set $\{0, 1, 2, \dots, a\}$, where $0$ and $a$ are the positions adjacent to the walls of the glass. The ink particle performs a simple random walk inside the glass. Once it reaches the state $0$, it either takes a step to the right to $1$ (with probability $\frac{1}{2}$) or tries to go left (also with probability $\frac{1}{2}$). The passage to the left is blocked by the wall, so the particle ends up staying where it is. The same thing happens at the other wall. All in all, we get a Markov chain with the following transition matrix:
$$P = \begin{bmatrix}
\frac{1}{2} & \frac{1}{2} & 0 & 0 & \dots & 0 & 0 & 0 \\
\frac{1}{2} & 0 & \frac{1}{2} & 0 & \dots & 0 & 0 & 0 \\
0 & \frac{1}{2} & 0 & \frac{1}{2} & \dots & 0 & 0 & 0 \\
\vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots & \vdots \\
0 & 0 & 0 & 0 & \dots & \frac{1}{2} & 0 & \frac{1}{2} \\
0 & 0 & 0 & 0 & \dots & 0 & \frac{1}{2} & \frac{1}{2}
\end{bmatrix}.$$
Let us see what happens when we start the chain with a distribution concentrated at $a/2$ (assuming that $a$ is even); a graphical representation of the distributions of $X_3$, $X_{12}$, $X_{30}$, $X_{200}$, $X_{700}$ and $X_{5000}$ when $a = 30$ represents the behavior of the system very well (the $y$-axis is on a different scale on the first two plots).

[Plots: the distributions of $X_3$, $X_{12}$, $X_{30}$, $X_{200}$, $X_{700}$ and $X_{5000}$ on $\{0, 1, \dots, 30\}$, flattening toward the uniform distribution.]

How about if we start from a different initial distribution? [Plots: the same six distributions, started from a different initial distribution.] As you can see, the distribution changes rapidly at first but then, once it has reached the equilibrium, it changes remarkably little (compare $X_{700}$ and $X_{5000}$). Also, the equilibrium distribution is very uniform and does not depend on the initial distribution; this is exactly what you would expect from the long-term distribution of ink in a glass.

Let us show that the uniform distribution $\pi = (1/(a+1), 1/(a+1), \dots, 1/(a+1)) = (1/31, 1/31, \dots, 1/31)$ on $\{0, 1, 2, \dots, a\}$ is indeed the (unique) stationary distribution. By Proposition 13.4, we need to solve the equation $\pi = \pi P$. If we write out this system of equations line by line, we get
$$\begin{aligned}
\pi_0 &= \tfrac{1}{2}(\pi_0 + \pi_1) \\
\pi_1 &= \tfrac{1}{2}(\pi_0 + \pi_2) \\
\pi_2 &= \tfrac{1}{2}(\pi_1 + \pi_3) \\
\pi_3 &= \tfrac{1}{2}(\pi_2 + \pi_4) \\
&\ \vdots \\
\pi_{a-1} &= \tfrac{1}{2}(\pi_{a-2} + \pi_a) \\
\pi_a &= \tfrac{1}{2}(\pi_{a-1} + \pi_a).
\end{aligned} \tag{13.1}$$
The first equation yields $\pi_0 = \pi_1$. Using that in the second one, we get $\pi_1 = \pi_2$, etc. Finally, we know that $\pi$ is a probability distribution, so that $\pi_0 + \pi_1 + \dots + \pi_a = 1$. Therefore, $\pi_i = 1/(a+1)$ for all $i \in S$, i.e., the uniform distribution on $S$ is the only stationary distribution.
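The conclusions of Example 13.5 are easy to check numerically. The following NumPy sketch (ours) builds the reflecting-walk matrix for $a = 30$, verifies $\pi P = \pi$ for the uniform $\pi$, and iterates the distribution of the chain started at the middle of the glass:

```python
import numpy as np

# Reflecting random walk on {0, 1, ..., a}: each step goes left or right with
# probability 1/2, and a step into a wall leaves the particle where it is.
a = 30
n = a + 1
P = np.zeros((n, n))
for i in range(n):
    P[i, max(i - 1, 0)] += 0.5     # try to step left
    P[i, min(i + 1, a)] += 0.5     # try to step right

pi = np.full(n, 1.0 / n)           # the uniform distribution
assert np.allclose(pi @ P, pi)     # pi is stationary: pi P = pi

mu = np.zeros(n)
mu[a // 2] = 1.0                   # the ink drop in the middle of the glass
for _ in range(5000):
    mu = mu @ P                    # distributions of X_1, X_2, ...
print(np.abs(mu - pi).max())       # essentially uniform after 5000 steps
```

Since the boundary states hold the particle with positive probability, the chain is aperiodic and the iterated distribution really does settle onto the uniform one, mirroring the plots of $X_{700}$ and $X_{5000}$ above.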

Can there be more than one stationary distribution? Can there be none? Sure; here is an example:

Example 13.6. For $P = I$, any distribution is stationary, and there are infinitely many.

A simple example where no stationary distribution exists can be constructed on an infinite state space. Take the Deterministically Monotone Markov Chain. Its transition matrix looks like the identity matrix, with the diagonal of ones shifted to the right. Therefore, the system of equations $\pi = \pi P$ reads
$$\pi_1 = \pi_2,\ \pi_2 = \pi_3,\ \dots,\ \pi_n = \pi_{n+1},\ \dots,$$
and so, for $\pi$ to be a stationary distribution, we must have $\pi_n = \pi_1$ for all $n \in \mathbb{N}$. Now, if $\pi_1 = 0$, $\pi$ is not a distribution (it sums to $0$, not $1$). But if $\pi_1 > 0$, then the sum is $+\infty$, so $\pi$ is not a distribution either. Intuitively, the chain never stabilizes; it just keeps moving to the right ad infinitum.
The example with many stationary distributions can be constructed on any state space, but the other one, where no stationary distribution exists, had to use an infinite one. Was that necessary? Yes. Before we show this fact, let us analyze the relation between stationary distributions and the properties of recurrence and transience. Here is our first result:
Proposition 13.7. Suppose that the state space $S$ of a Markov chain is finite, and let $S = C_1 \cup C_2 \cup \dots \cup C_m \cup T$ be its canonical decomposition. Then the following two statements are equivalent:

1. $\pi$ is a stationary distribution, and
2. $\pi^{C_k} = \pi^{C_k} P_k$ for $k = 1, \dots, m$, and $\pi^T = (0, 0, \dots, 0)$,

where
$$P = \begin{bmatrix}
P_1 & & 0 & 0 \\
& \ddots & & \vdots \\
0 & & P_m & 0 \\
& R & & Q
\end{bmatrix}$$
is the canonical form of the transition matrix, $\pi^{C_k} = (\pi_i;\ i \in C_k)$, $k = 1, 2, \dots, m$, and $\pi^T = (\pi_i;\ i \in T)$.
Proof. We write the equation $\pi = \pi P$ coordinatewise as $\pi_j = \sum_{i \in S} \pi_i p_{ij}$ and, by distinguishing the cases $i \in C_k$, $k \in \{1, 2, \dots, m\}$, and $i \in T$, we get the following system of matrix equations (alternatively, just write the system $\pi = \pi P$ in block-matrix form according to the canonical decomposition above):
$$\pi^{C_k} = \pi^{C_k} P_k + \pi^T R_k,\ k = 1, \dots, m, \quad \text{and} \quad \pi^T = \pi^T Q,$$
where $R_k$ denotes the columns of $R$ corresponding to $C_k$. The last equality can be read as follows: $\pi^T$ is in the row null-space of $I - Q$. We know, however, that $I - Q$ admits an inverse, so it is a regular square matrix. Its row null-space (as well as its column null-space) must be trivial and, consequently, $\pi^T = 0$.

Having established that $\pi^T = 0$, we can de-couple the system of equations above and write it as
$$\pi^{C_k} = \pi^{C_k} P_k,\ k = 1, \dots, m, \quad \text{and} \quad \pi^T = (0, 0, \dots, 0),$$
which is exactly what we needed to prove.

The other implication - the proof of which consists of a verification of the fact that each distribution from (2) above is indeed a stationary distribution - is left to the reader.

The moral of the story of Proposition 13.7 is the following: in order to compute the stationary distribution(s), classify the states and find the canonical decomposition of the state space. Then set $\pi_i = 0$ for every transient state $i$. What remains are the recurrent classes, and you can analyze each one separately. Note, however, that $\pi^{C_k}$ does not need to be a true distribution on $C_k$, since $\sum_{i \in C_k} \pi^{C_k}_i$ does not need to equal $1$. However, unless $\pi^{C_k} = (0, 0, \dots, 0)$, we can always multiply all its elements by a constant to make the sum equal to $1$.
Thanks to the results of Proposition 13.7, we can focus on Markov chains with only one class:

Definition 13.8. A Markov chain is said to be irreducible if there is only one class.

When a Markov chain is finite and irreducible, all of its states must be recurrent. Indeed, a finite Markov chain has at least one recurrent state, and recurrence is a class property.

Now that we have described the structure of the set of all stationary distributions, we still need to tackle the question of existence: are there any stationary distributions? In addition to giving an affirmative answer to this question, we will also show how to construct a stationary distribution and how to interpret it probabilistically.
Proposition 13.9. Let $\{X_n\}_{n \in \mathbb{N}_0}$ be an irreducible Markov chain with a finite state space. Then the following two statements hold:

1. All states are positive recurrent.
2. Let $i$ be a fixed (but arbitrary) recurrent state, and let
$$\psi_j = \mathbb{E}_i\Big[\sum_{n=0}^{\tau_i(1) - 1} \mathbf{1}_{\{X_n = j\}}\Big], \quad j \in S,$$
be the expected number of visits to the state $j$ between two consecutive visits to the state $i$. Then the vector $\pi$ given by $\pi_j = \frac{1}{m_i} \psi_j$, $j \in S$ (where $m_i = \mathbb{E}_i[\tau_i(1)]$), is a stationary distribution.
Remark 13.10. Even though it is not exceedingly hard, the proof of this proposition is a bit technical, so we omit it. It is important, however, to understand what the proposition states:

1. the expected number of visits $\psi_j$ to the state $j$ between two consecutive visits to the state $i$ can be related to a stationary distribution of the Markov chain by $\psi_j = m_i \pi_j$, and
2. when we set $j = i$, $\psi_i$ counts the number of visits to $i$ between two consecutive visits to $i$, which is always equal to $1$ (the first visit is counted and the last one is not). Therefore, $\psi_i = 1$, and so $\pi_i = \frac{1}{m_i}$.
Proposition 13.9 and Remark 13.10 are typically used in the following way: one computes the unique stationary distribution $\pi$ by solving the equation $\pi = \pi P$, and then draws conclusions about $m_i$ or the $\psi_j$'s.
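Here is a small NumPy sketch of that usage (ours; the two-state regime-switching chain and its parameters are hypothetical): solve for $\pi$ and read off the mean return time $m_0 = 1/\pi_0$.

```python
import numpy as np

# Two-state chain: 0 -> 1 with probability a, 1 -> 0 with probability b.
a, b = 0.3, 0.1
P = np.array([[1 - a, a],
              [b, 1 - b]])

# Solve pi P = pi with sum(pi) = 1: pi is the left eigenvector of P for
# eigenvalue 1, i.e. an eigenvector of P^T, normalized to sum 1.
w, V = np.linalg.eig(P.T)
pi = np.real(V[:, np.argmax(np.real(w))])
pi = pi / pi.sum()
assert np.allclose(pi, [b / (a + b), a / (a + b)])

# Mean return time to state 0 by first-step analysis: from 0 the chain either
# stays (1 step) or jumps to 1 and waits Geometric(b) steps to come back, so
# m_0 = 1 + a / b.  Proposition 13.9 says this must equal 1 / pi_0.
m0 = 1 + a / b
assert np.isclose(m0, 1 / pi[0])
print(pi, m0)
```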
Note also that the computation above is not a special case of a reward computation of the previous lecture. There, you need to start from a transient state and finish in a recurrent state, while here your starting and final states are the same.

13.2 Limiting distributions

Example 13.5 shows vividly how the distribution of ink quickly reaches the uniform equilibrium state. It is no coincidence that this limiting distribution happens to be a stationary distribution. Before we make this claim more precise, let us define rigorously what we mean by a limiting distribution:

Definition 13.11. A distribution $\pi = (\pi_i;\ i \in S)$ on the state space $S$ of a Markov chain with transition matrix $P$ is called a limiting distribution if
$$\lim_{n \to \infty} p^{(n)}_{ij} = \pi_j$$
for all $i, j \in S$.
Note that for $\pi$ to be a limiting distribution, all the limits in Definition 13.11 must exist. Once they do, $\pi$ is automatically a distribution: $\pi_j \ge 0$ (as a limit of non-negative numbers) and
$$\sum_{j \in S} \pi_j = \sum_{j \in S} \lim_{n \to \infty} p^{(n)}_{ij} = \lim_{n \to \infty} \sum_{j \in S} p^{(n)}_{ij} = \lim_{n \to \infty} 1 = 1$$
(the interchange of limit and sum is harmless because $S$ is finite). Also, note that independence of the initial state $i$ is built into the definition of the limiting distribution: the sequence $\{p^{(n)}_{ij}\}_{n \in \mathbb{N}}$ must tend to the same limit $\pi_j$ for all $i \in S$. Moreover, it follows immediately that there can be at most one limiting distribution.

The connection with stationary distributions is spelled out in the following propositions:
Proposition 13.12. Suppose that a Markov chain with transition matrix $P$ admits a limiting distribution $\pi = (\pi_i;\ i \in S)$. Then $\pi$ is a stationary distribution.

Proof. To show that $\pi$ is a stationary distribution, we need to verify that it satisfies $\pi = \pi P$, i.e., that
$$\pi_j = \sum_{i \in S} \pi_i p_{ij}.$$
We use the Chapman-Kolmogorov relation $p^{(n+1)}_{ij} = \sum_{k \in S} p^{(n)}_{ik} p_{kj}$ and start from the observation that $\pi_j = \lim_{n \to \infty} p^{(n+1)}_{ij}$ to get exactly what we need:
$$\pi_j = \lim_{n \to \infty} p^{(n+1)}_{ij} = \lim_{n \to \infty} \sum_{k \in S} p^{(n)}_{ik} p_{kj} = \sum_{k \in S} \Big(\lim_{n \to \infty} p^{(n)}_{ik}\Big) p_{kj} = \sum_{k \in S} \pi_k p_{kj}.$$

Example 13.13.
Limiting distributions don't need to exist, even when there are stationary ones.
Here are two examples:
1. Let {X_n}_{n∈N_0} be a regime-switching chain with

  P = ( 0 1
        1 0 )

Then

  p^(n)_ij = ½(1 + (−1)^n) if i = j, and ½(1 + (−1)^(n+1)) if i ≠ j.

Clearly, the sequence p^(n)_ij alternates between 0 and 1, so it cannot admit a limit. A stationary distribution exists: the system π = πP becomes

  π_0 = 0·π_0 + 1·π_1,
  π_1 = 1·π_0 + 0·π_1.

2. The system in the previous example implies that π_0 = π_1. In order for π to be a distribution on {0, 1}, we must have π_0 = π_1 = ½. So (½, ½) is a stationary distribution.
3. In the previous example the limiting distribution did not exist because the limits of the sequences p^(n)_ij did not exist. Another reason for the non-existence of limiting distributions can be dependence on the initial conditions: the limits lim_n p^(n)_ij may exist for all i, j, but their values can depend on the initial state i. The simplest example is a Markov chain with two states i = 1, 2, where p_11 = p_22 = 1. There are two recurrent (and therefore closed) classes, and the chain remains in the state it started in forever. Therefore, lim_n p^(n)_12 = 0 and lim_n p^(n)_22 = 1, so no limiting distribution exists despite the fact that all the limits lim_n p^(n)_ij do.
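These examples are easy to check numerically. A small sketch (not part of the notes) for the flip chain from the first example: p^(n)_00 alternates, so no limit exists, while (½, ½) is still stationary:

```python
def mat_mul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P = [[0, 1], [1, 0]]        # the flip ("regime-switching") chain
Pn = [[1, 0], [0, 1]]       # P^0 = identity
diagonal = []
for n in range(1, 7):
    Pn = mat_mul(Pn, P)
    diagonal.append(Pn[0][0])   # p(n)_00

# p(n)_00 alternates 0, 1, 0, 1, ... so lim p(n)_00 does not exist,
print(diagonal)                 # [0, 1, 0, 1, 0, 1]

# ... but pi = (1/2, 1/2) still satisfies pi = pi P.
pi = [0.5, 0.5]
piP = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
assert piP == pi
```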
Proposition 13.14. Let {X_n}_{n∈N_0} be a finite-state, irreducible and aperiodic Markov chain. Then the limiting distribution exists.
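A numerical illustration of Proposition 13.14 (a sketch, not part of the notes; the 3-state chain below is made up, and is irreducible and, thanks to its positive diagonal entries, aperiodic):

```python
def mat_mul(A, B):
    """Multiply two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[0.5, 0.5, 0.0],
     [0.2, 0.3, 0.5],
     [0.4, 0.0, 0.6]]

Pn = P
for _ in range(200):            # compute P^201
    Pn = mat_mul(Pn, P)

# Every row of P^n is (numerically) the same vector: the limit
# lim p(n)_ij exists and does not depend on the initial state i.
for row in Pn:
    assert all(abs(row[j] - Pn[0][j]) < 1e-12 for j in range(3))
assert abs(sum(Pn[0]) - 1) < 1e-9   # and that vector is a distribution
```

The common row is, of course, the stationary distribution of the chain.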
We conclude the discussion of limiting distributions with a version of the Law of Large
Numbers (LLN) for Markov chains. Before we state it, we recall the classical LLN for independent
variables:
Theorem 13.15. Let {X_n}_{n∈N_0} be a sequence of independent and identically distributed random variables, such that E[|X_0|] < ∞. Then

  lim_n (1/n) ∑_{k=0}^{n−1} X_k = E[X_0],

almost surely².
When {X_n}_{n∈N_0} is a Markov chain, two problems occur. First, the random variables X_0, X_1, ... are not independent. Second, X_k takes its values in the state space S, which does not necessarily consist of numbers, so expressions such as X_0 + X_1 or E[X_0] do not always make sense. To deal with the second problem, we pick a numerical function f: S → R (give each state a value) and form sums of the form f(X_0) + f(X_1) + ··· + f(X_{n−1}). Independence is more subtle, but it can be replaced with stationarity (loosely speaking), as in the following proposition (we skip the proof):

² Almost surely means with probability one, and states that the statement is true all the time for all practical purposes. It may fail, but only in extremely exceptional cases, such as the one where you toss a fair coin infinitely many times and get tails every single time. That means never, really.
Proposition 13.16. Let {X_n}_{n∈N_0} be an irreducible Markov chain with a finite state space S, let π = (π_i, i ∈ S) be the (unique) stationary distribution, and let f: S → R be an arbitrary function. Then

  lim_n (1/n) ∑_{k=0}^{n−1} f(X_k) = ∑_{i∈S} f(i) π_i,

almost surely, no matter what initial distribution we choose.
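Proposition 13.16 can be illustrated by simulation. A sketch (not part of the notes; the 2-state chain and all numbers are made up): take f = 1_{state 0}, so the ergodic average is the fraction of time spent in state 0, which should approach π_0.

```python
import random

random.seed(0)
a, b = 0.3, 0.1                  # P = [[1-a, a], [b, 1-b]]
pi0 = b / (a + b)                # stationary probability of state 0: 0.25

state, time_in_0, n = 0, 0, 200_000
for _ in range(n):
    time_in_0 += (state == 0)
    # take one step of the chain
    if state == 0:
        state = 1 if random.random() < a else 0
    else:
        state = 0 if random.random() < b else 1

# The long-run fraction of time in state 0 approaches pi_0.
assert abs(time_in_0 / n - pi0) < 0.02
```

With 200,000 steps the empirical fraction lands within a couple of percentage points of π_0 = 0.25, regardless of the initial state.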
Example 13.17. Let {X_n}_{n∈N_0} be an irreducible finite Markov chain, and let the function f: S → R be given by

  f(j) = 1_(i=j) = 1 if j = i, and 0 if j ≠ i.

Then, by the Law of Large Numbers, we have

  lim_n (1/n) ∑_{k=0}^{n−1} 1_{X_k = i} = ∑_{j∈S} 1_(i=j) π_j = π_i,

so that π_i can be interpreted as the long-run percentage of time that the Markov chain {X_n}_{n∈N_0} spends in the state i.]]>
https://web.ma.utexas.edu/users/gordanz/notes/introduction_to_stochastic_processes.pdf
https://web.ma.utexas.edu/users/gordanz/notes/introduction_to_stochastic_processes.pdfMon, 16 Dec 2019 06:47:00 +0100<![CDATA[The election in the media: against evasion and lies, good journalism is all we have | Alan Rusbridger | Politics | The Guardian]]>

In his first 1,000 days in office Donald Trump made 13,435 false or misleading claims, according to the good folk at the Washington Post who painstakingly monitor the president’s habit of bending the truth. How we Brits have smiled at this con man’s Teflon gift. Could never happen here.

But consider the lessons political managers around the world might have learned about our election and how we struggled to negotiate the increasingly blurred lines between truth and falsehood; facts and propaganda; openness and stealth; accountability and impunity; clarity and confusion; news and opinion.

It rather looks as if one or two skilled backroom manipulators (we can guess) studied Trump’s ability to persuade enough people that black is white and, rather than recoil in disgust, came to the opposite conclusion: it works.

One far off day we will discover whether 40 new hospitals will be built, and whether 20,000 new police officers will materialise along with 50,000 “new” nurses. It won’t be long before we learn whether we’ve now finally got Brexit “done” or whether this is just the start of a long and painful process of negotiating our future trading relationships with a greatly weakened hand.

We’ll learn the reality of whether there is to be frictionless trade between the mainland of Britain and the island of Ireland. We will read the truth about alleged Russian interference in the 2016 Brexit referendum … and much more. But by then life will have moved on, and maybe many of us will have forgotten the promises, evasions and outright lies of late 2019.

Coin one unforgettable message and stick to it. “Get Brexit done” was brilliant, never mind that the meaning of “Brexit” and “done” was far from clear: this is an age of simplicity, not complexity. Even the so-called mainstream media will do far more to amplify that slogan rather than question it. Try this stunt: slap the words on a JCB digger and drive it through a pile of polystyrene bricks ... and watch as news editors obligingly clear their front pages for the image. They are making posters, not doing journalism.

And remember that in most countries, governments have unusual power over public service broadcasters. So, in the event that television journalists seem to be getting too big for their boots, it is often useful to drop a heavy hint there will be a price to pay. Maybe Channel 4 has outlived its usefulness? Possibly it’s time to privatise the BBC? That should do the trick.

Old-fashioned press conferences should be kept to the minimum. A manifesto should say almost nothing. Gaffe-prone colleagues should be “disappeared”. If in real trouble, make things up. You’ll be amazed how readily even the best journalists will repeat unattributable fictions (see the “row” over the four-year-old boy in Leeds General Infirmary and what “happened” during the subsequent visit of health secretary Matt Hancock). By the time the journalists have corrected themselves and Twitter has spent 24 hours arguing about the truth, the world will have moved on.

So, as Trump has discovered, the liars, myth-makers and manipulators are in the ascendancy – and however valiantly individual journalists attempt to hold them to account (and many, especially at a local level, have tried magnificently) the dice are loaded against them.

The one over-riding thought is that for many years I looked at US newspapers and pitied colleagues there who “just” ran the newsroom, leaving comment pages to others. Pity has turned to envy. I now think it would be cleansing for all British national newspapers to split the responsibility for news and comment. It’s simply too hard for the average reader – especially, but not only online – to tell the difference.

And a hero? After the Yorkshire Evening Post’s reporting of the Leeds story was questioned, its editor in chief, James Mitchinson, wrote a long and considered reply to a reader who, on the basis of something she read on social media, thought the story was fake. Mitchinson’s reply courteously asks the reader why she would believe the word of a total stranger (who might not even exist) over a newspaper she had read for many years in good faith.

The fact the paper knew the story to be true was, said Mitchinson, down to “bog-standard journalism”. It was a powerful statement of why good journalism – independent and decently crafted – should matter. So let’s hear it for bog-standard journalism. There’s too little of it. It may not be enough, but it’s all we have.

Alan Rusbridger is chair of the Reuters Institute for the Study of Journalism

]]>
https://www.theguardian.com/politics/2019/dec/14/election-in-the-media-evasion-lies-good-journalism-is-all-we-have
https://www.theguardian.com/politics/2019/dec/14/election-in-the-media-evasion-lies-good-journalism-is-all-we-haveMon, 16 Dec 2019 06:47:00 +0100<![CDATA[Quoting in HTML: Quotations, Citations, and Blockquotes | CSS-Tricks]]>It’s all too common to see the incorrect HTML used for quotes in markup. In this article, let’s dig into all this, looking at different situations and different HTML tags to handle those situations.

There are three major HTML elements involved in quotations:

<blockquote>

<q>

<cite>

Let’s take a look.

Blockquotes

Blockquote tags are used for distinguishing quoted text from the rest of the content. My tenth grade English teacher drilled it into my head that any quote of four lines or longer should be set apart this way. The HTML spec has no such requirement, but as long as the text is a quote and you want it to be set apart from the surrounding text and tags, a blockquote is the semantic choice.

By default, browsers indent blockquotes by adding margin on each side.

As a flow element (i.e. “block level” element), blockquote can contain other elements inside it. For example, we can drop paragraphs in there with no problem:

<blockquote>
<p></p>
<p></p>
</blockquote>

But it could be other elements, too, like a heading or an unordered list:
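For example, a hypothetical sketch standing in for the demo embedded in the original article:

<blockquote>
<h2>On cupcake distribution</h2>
<ul>
<li>Everyone wants cupcakes.</li>
<li>Nobody shares cupcakes.</li>
</ul>
</blockquote>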

It’s important to note that blockquotes should only be used for quotations rather than as a decorative element in a design. This also aids accessibility as screen reader users can jump between blockquotes. Thus a blockquote element used solely for aesthetics could really confuse those users. If you need something more decorative that falls outside the bounds of extended quotations, then perhaps a div with a class is the way to go.

blockquote,
.callout-block {
/* These could share styling */
}

Quoting with Q

Q tags (<q>) are for inline quotes, or what my tenth grade teacher would say are those under four lines. Many modern browsers will automatically add quotation marks to the quote as pseudo elements but you may need a backup solution for older browsers.

Typical quotation marks are just as valid for inline quotes as the <q> element. The benefits of using <q>, however, are that it includes a cite attribute, automatic handling of quotation marks, and automatic handling of quote levels. <q> elements should not be used for sarcasm (e.g. “you would use a <q> tag for sarcasm, wouldn’t you?”), or signifying a word with air quotes (e.g. “awesome” is an “accurate” description of the author). But if you can figure out how to mark up air quotes, please let me know. Because that would be “awesome.”

The citation attribute

Both <q> and blockquotes can use a citation (cite) attribute. This attribute holds a URL that provides context and/or a reference for the quoted material. The spec makes a point of saying that the URL can be surrounded by spaces. (I’m not sure why that’s pointed out, but if you want to anger the semantic code deities, you’ll have to do more than throw spaces around.)

<p>The officer left a note saying <q cite="https://johnrhea.com/summons">You have been summoned to appear on the 4th day of January on charges of attempted reader bribery.</q></p>

That cite attribute isn’t visible to the user by default. You could add it in with a sprinkle of CSS magic like the following demo. You could even fiddle with it further to make the citation appear on hover.
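The demo itself doesn’t survive in this text version, but the idea is something like this sketch, using attr() to surface the URL (the selector and wording here are illustrative):

q[cite]::after {
content: " (source: " attr(cite) ")";
}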

Neither of those options are particularly great. If you need to cite a source such that users can see it and go to it, you should do it in HTML and probably with the <cite> element, which we’ll cover next.

The citation element

The <cite> tag should be used for referencing creative work rather than the person who said or wrote the quote. In other words, it’s not for quotes. Here are the examples from the spec:

<p>My favorite book is <cite>The Reality Dysfunction</cite> by
Peter F. Hamilton. My favorite comic is <cite>Pearls Before
Swine</cite> by Stephan Pastis. My favorite track is <cite>Jive
Samba</cite> by the Cannonball Adderley Sextet.</p>

If the author of this article told you he’d give you a cupcake, and you <cite> him by name, that would be semantically incorrect. Thus no cupcakes would change hands. If you cited the article in which he offered you a cupcake, that would be semantically correct, but since the author wouldn’t do that, you still wouldn’t get a cupcake. Sorry.

By default, browsers italicize cite tags and there’s no requirement that a <q> or <blockquote> be present to use the cite element. If you want to cite a book or other creative work, then slap it in the cite element. The semantic deities will smile on you for not using either <i> or <em> elements.

But where to put the cite element? Inside? Outside? The upside down? If we put it inside the <blockquote> or the <q>, we’re making it part of the quote. That's forbidden by the spec for just that reason.

<!-- This is apparently wrong -->
<blockquote>
Quote about cupcake distribution from an article
<cite>The Article</cite>
</blockquote>

Putting it outside just feels wrong and also requires you to have an enclosing element like a <div> if you wanted to style them together.

<div class="need-to-style-together">
<blockquote>
Quote about cupcake distribution from an article
</blockquote>
<cite>The Article</cite>
</div>

N.B. If you google this issue you may come across an HTML5 Doctor article from 2013 that contradicts much of what's laid out here. That said, every time it links to the docs for support, the docs agree with the article you're currently reading rather than the HTML5 Doctor article. Most likely the docs have changed since that article was written.

Hey, what about the figure element?

One way to mark up a quotation — and in a way that pleases the semantic code deities — is to put the blockquote within a figure element. Then, put the cite element and any other author or citation information in a figcaption.

<figure class="quote">
<blockquote>
But web browsers aren’t like web servers. If your back-end code is getting so big that it’s starting to run noticably slowly, you can throw more computing power at it by scaling up your server. That’s not an option on the front-end where you don’t really have one run-time environment—your end users have their own run-time environment with its own constraints around computing power and network connectivity.
</blockquote>
<figcaption>
— Jeremy Keith, <cite>Mental models</cite>
</figcaption>
</figure>

While this doubles the number of elements needed, there are several benefits:

It’s semantically correct for all four elements.

It allows you to both include and encapsulate author information beyond citing the name of the work.

It gives you an easy way to style the quote without resorting to divs, spans or wretchedness.

Neither <blockquote> nor <q> nor <cite> are to be used for dialogue and similar exchanges between speakers. And not <dialog> either! Those are for attention-grabbing modals, not dialogue as in conversational exchanges between people speaking or typing to each other. If you’re marking up dialogue, you can use whatever makes the most sense to you. There’s no semantic way to do it. That said, the spec suggests <p> tags and punctuation, with <span> or <b> tags to designate the speaker and <i> tags to mark stage directions.

Accessibility of quotes

From the research I’ve done, screen readers should not have any issue with understanding semantic-deity-approved <q>, <blockquote>, or <cite> tags.

More “ways” to “quote”

You can add quotation marks to a <blockquote> using CSS pseudo elements. The <q> element comes with quotation marks baked in so they need not be added, however adding them as pseudo-elements can be a workaround for older browsers that don’t automatically add them. Since this is how modern browsers add the quotation marks there's no danger of adding duplicate quotes. New browsers will overwrite the default pseudo elements, and older browsers that support pseudo elements will add the quotes.
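That fallback is the standard pseudo-element pattern; a sketch (essentially what modern UA stylesheets already do):

q::before {
content: open-quote;
}
q::after {
content: close-quote;
}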

But you can’t, like I did, assume that the above will always give you smart opening and closing quotes. Even if the font supports smart quotes, sometimes straight quotes will be displayed. To be safe, it’s better to use the quotes CSS property to up the intelligence on those quotation marks.

Now let’s look at quote levels. The <q> tag will automatically adjust quote levels.

Let’s say you’re in an area that uses the British convention of using single quotes. You could use the CSS quotes rule to put the opening and closing single quotes first in the list. Here’s an example of both ways:
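Something along these lines (values are illustrative):

/* Default-style convention: double quotes first, single quotes inside */
q {
quotes: "“" "”" "‘" "’";
}

/* British convention: single quotes first, double quotes inside */
q {
quotes: "‘" "’" "“" "”";
}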

There is no limit to nesting. Those nested <q> elements could even be within a blockquote that’s within a blockquote.

If you add quotation marks to a blockquote, know that the blockquote does not change the quote level the way a <q> tag does. If you expect to have quotes within a blockquote, you may want to add a descendant selector rule to start <q> elements within a blockquote at the single quote level (or double quotes if you follow British conventions).

blockquote q {
quotes: "‘" "’" "“" "”";
}

The last quote level you put in will continue through subsequent levels of quotation. To use the double, single, double, single… convention, add more levels to the CSS quotes property.

Many typography experts will tell you that hanging the quotation marks on blockquotes looks better (and they’re right). Hanging punctuation is, in this case, quotation marks that are pushed out from the text so that the characters of the text line up vertically.

One possibility in CSS is using a slightly negative value on the text-indent property. The exact negative indentation will vary by font, so be sure to double check the spacing with the font you end up using.

blockquote {
text-indent: -0.45em;
}

There is a nicer way to handle this by using the hanging-punctuation CSS property. It’s only supported in Safari at the time of this writing, so we’ll have to progressively enhance:

/* Fallback */
blockquote {
text-indent: -0.45em;
}
/* If there's support, erase the indent and use the property instead */
@supports ( hanging-punctuation: first) {
blockquote {
text-indent: 0;
hanging-punctuation: first;
}
}

Using hanging-punctuation is better because it’s less fiddly. It’ll just work when it can.

Why you’d need to do this, I’m not totally sure, but the quotation marks in a <q> tag are added as pseudo-elements in the UA stylesheet, so we’re able to select and style them — including animation — if we need to.

Wait, maybe we just solved the air quotes thing after all.

]]>
https://css-tricks.com/quoting-in-html-quotations-citations-and-blockquotes/
https://css-tricks.com/quoting-in-html-quotations-citations-and-blockquotes/Mon, 16 Dec 2019 06:47:00 +0100<![CDATA[A Dropbox account gave me stomach ulcers : sysadmin]]>

A Dropbox account gave me stomach ulcers

Score: 3219

Comments: 464

Author: lemmycaution0

Anyone ever find that "thing" that no one wants to talk about and is secretly holding the company together with shoestring, bubble gum, and paper clips? It's usually found at 4:45 on a Friday before a major holiday, and after it goes down a beet-red senior executive is screaming to the heavens that there's going to be a second Battle of Stalingrad if we don't get this previously unknown and undocumented "thing" back online. You email the alleged domain expert only to see they are out of office till 2099, so you email their manager only to get a bounce-back message that they haven't worked here since Barack's first term. I recently found one of those "things".

It all started with an acquisition of another company, we'll call them the insane asylum, that basically makes software for our industry. I am going to vaguely say my company is in the manufacturing world and buying the software gave us a competitive advantage. Of course no senior executive thinks about the difficulties the IT teams are now faced with in a merger. The first sign of something being amiss is when my coworkers and I were provisioning laptops and computers for employees from the insane asylum and we asked for requirements for each department. Everything seems to be going fine until I see the request from the insane asylum's development team. They wanted 40 laptops, each with 4 TB of storage, which is a hell of a lot for a work computer and could send them way over budget. I couldn't understand why they needed that much local storage so I called up the head of that department for an explanation, and his team danced around why they needed that much storage. I mean, we pay for cloud services for a reason. Basically we walked away with the team telling us they would try to make it work with less storage but never elaborating on why they requested it in the first place. I walked away from that phone call confused, and my co-worker who is Jamaican (not relevant except that he uses local colloquialisms that wind up being very funny later in this story) brought up that their behavior seemed bizarre, like why on earth would they plead the fifth when we pressed them with questions, we're honestly just looking to help. But work was piling up, and even though we hadn't been involved in the acquisition they had passed audit before we purchased them, so I let it go.

Flash forward three months to the present. 4 o'clock on Friday I'm wrapping up some day-to-day security stuff and getting ready for an Amazon sales meeting. I make it a point to freeze changes and projects in December. Everyone's on vacation and I don't want a major outage during the holidays. So I'm all prepared for a lull period until January 3rd. I was starting to get really annoyed with the insane asylum employees because they kept scheduling changes but always would pencil out 2 to 3 days of time to get everything done, even basic maintenance, without explaining why it was taking so long. I was beginning to think they had snails or something typing at the computers. I was catastrophically wrong. My young Jamaican colleague was monitoring my ticket queue while I was in the sales meeting. He got an escalation request from help desk; its contents were literally:

Something very weird is going on with the new dev team. Their app is suffering intermittent outages and slow responses, and network monitoring says they are seeing that team trying to move GBs of data on the network. Call them ASAP.

My poor colleague calls the team and things really start to unravel. They tell him many of the insane asylum's old IT folks were let go during the acquisition, including the guy who was responsible for increasing their storage when their app was close to hitting space capacity. They had assumed we had been doing it in his place. No problem, he could request a new virtual server or additional space in Amazon to mitigate the problem right now, and we could come up with a long-term plan once I got back to my desk. The person he's talking to immediately cuts him off and says that isn't necessary, they just need him to call Dropbox support. He's now very confused and asks why on Earth they are sending or storing information in Dropbox, that's a huge breach. He asks what information the app/website is pulling from the Dropbox and they drop a bombshell: they tell him the entire database is in Dropbox. At this point I'm told he began to look like he just stumbled out of the trenches in 1917. He asked them to elaborate because what they described didn't sound possible. It was, but it wasn't just the database, it was the entire app and website. The app was actually just a server instance in Heroku that was spun up whenever there was an update and would make crazy API calls to the Dropbox account to read information from hardcoded database files. He immediately called Dropbox support to figure out what in god's name was going on, and to his horror, after several escalations, gained access to the account and found that it had 497 TB of 500 TB of space used up and the team was on the verge of running out. This explained why they needed such large hard drives and why their changes were taking so long: it would take days to upload and download so much data to Dropbox, plus have all the devs resync their local Dropbox instances with the correct latest versions. This single Dropbox account was also their version control.

My colleague, perhaps prophesying that a tsunami of shit was about to be unleashed, started screaming "the blood of Jesus, the blood of Jesus, lord no, the blood of Jesus," which might be the Caribbean equivalent of "holy fucking shit." Unfortunately, the CISO happened to be in the room and was concerned about why one of her employees was having a breakdown, or whether she should start preparing for the second coming. Usually I look to put together bullet points and work actions before contacting the CISO in an emergency, because she often doesn't see the nuances of day-to-day operations. When this was all explained to her from street level, her head exploded. Meanwhile I'm falling asleep in a meeting, completely ignorant of the impending hurricane of shit I'm about to walk into, until an analyst stormed into the meeting like Pheidippides right before he collapsed after the battle of Marathon. He told us there was a potential privacy breach, that the CISO was already aware without being briefed, and that, on top of everything else, since the technical leads were in this doomed sales meeting, all the zoo animals were let loose in the office. My blood ran cold and we all rushed downstairs to a three-ring political circus. Our CISO is trying to justify to the CFO and the insane asylum employees that this is unacceptable: even if we get this back online and increase the Dropbox storage, this is a ticking bomb, and we need to start an emergency investigation to see if anyone, former employee or hacker, has accessed this Dropbox account. There is zero monitoring in place and they were sharing access willy-nilly with the whole team. Every team member had read/write access. Wary of losing this political battle and forcing her team to support this beast, she went with the nuclear option and emailed the general counsel explaining the risks. This is when shit really started to roll, because she interrupted the Lovecraftian cosmic horror otherwise known as general counsel's vacation to lob this turd grenade.
I spent all of that night coming up with a solution to migrate all this information and trying to confirm that there hasn't been a data breach yet. I would have been working the following morning as well, but I was in so much pain when I woke up, on top of having anxiety nightmares the whole night, that I went to the doctor and found out I have a stomach ulcer. I can't be certain, but I'm pretty sure this whole incident plus intervention from IT demons pushed my body over the edge. The solution is yet to be determined; it's a miracle I haven't shot a developer yet.

There's a lot of lessons to unpack here, but to this day it blows my mind what glue-stick-and-thumbtack solutions are in production. I'm concerned there are tons of companies out there where the standard operating procedure is to have stuff collecting electricity without anyone knowing what it is or how it works.

P.S. my son said I should write that I'm hoping my fellow IT veterans pour one out for me this weekend.

*****Update number 1*****

1. We are paying to upgrade the storage in Dropbox. I am not happy about this, but we're not going to win any friends in this battle if we come off as mules unwilling to offer a solution.

The cost of this much Dropbox storage is tens of thousands. I just found this out via an email, but the CFO is not clear in the message if it's per year or per month (more unlikely).

We are having four people work over the weekend to go through the data and understand what's going on. (You better believe they are making time and a half.)

I'm concerned there was a data leak or breach, and so is legal. We are still putting together a way to track who accessed what historically. I'm praying we don't find anything malicious.

If it's a situation where we don't have any historical information or logs, legal is considering accepting that we can't assume integrity and will send a notice to customers.

Audit has some explaining to do.

I'm taking a few days to deal with my ulcer and get an abscess in my mouth cleared up (may have been a result of the ulcer). This problem is not going to be magically programmed away, so I fully expect it to be waiting for me when I come back to the office in a few days.

My email and phone are ringing off the chain

****Update number 2*****

I feel bad because people's holidays are being interrupted, but a shit show is never convenient.

Upping the storage has not resolved all the issues and we're still on high alert.

Two of our senior devs, not insane asylum employees (also making time and a half; one of them gave up a vacation day), are getting involved to start documenting this mess. This is not my cup of tea, I don't make web applications, so this is over mine and some of the security staff's heads.

Both devs can't believe they did this. One is only 26 or 27 and can't believe in this day and age someone would think this is a proper version control system. The older colleague is from the Soviet Union and told us the only shit storm he remembers even being remotely as bad was when he was in university/army service right as communism was falling apart and he had to work with a computer in Russian, software written in his local language, and software guides written in English. Longest year and a half of his life, apparently.

****Update Number 3*****

The Soviet has come up with a plan; I just spoke to him over the phone a few hours ago. He already got the storage increased, but that doesn't fix all the other issues. He's going to freeze updates and have people download the latest version of each file manually onto a virtual server, then commit this to a private git repo. This is an extremely time-consuming and tedious task, but it will get the job done. God help the poor folks that draw the short straw on this assignment.

We have a post-mortem / come-to-Jesus moment with this dev team on Monday. I will not be attending, as I'm sick, but the Soviet, the CISO, my manager (the head of IT operations), and a very technical associate will be there to get a lay of the land. The Soviet also told me that if there is pushback, or if they start getting cold on giving him direct access to the Dropbox instance, he's going to shoot someone (I don't think he's kidding); he had to work on a Saturday because of these people.

My Jamaican co-worker is fine; he'd probably get a kick out of everyone's concern. But people tend to overreact / get worked up when security is involved.

The investigation is ongoing and there are some serious concerns. This company's old IT ticketing system was turned off / decommissioned; I jumped through hoops to get the archive out of a landfill. Apparently they have an IT ticket from a year and a half ago where an ex-employee tried to delete files, which is concerning. Not a big concern, but trying to figure out whether, for instance, an employee left the building after downloading Dropbox files to their home computer is ongoing. There are a lot of security implications to unpack.

It appears to be an enterprise Dropbox account. This is unconfirmed, but I hope a consumer account wouldn't even be possible. What concerns me is that some people were all using the same account for the Dropbox instance, while others created their own accounts and shared access with those accounts. People never cease to amaze.

The devs also told me there is some serious hackery going on with this web app; it probably has a bunch of vulnerabilities. Besides that, it's not just querying flat CSV files for info: there's also a fully functional SQLite database, which probably accounts for the poor performance. On top of everything else, they implemented SQLite incorrectly.
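The post doesn't say what "implemented SQLite incorrectly" means, but in web apps like this the classic mistake is building SQL strings from user input instead of binding parameters, which is both an injection vulnerability and a performance problem (no statement reuse). A minimal sketch of the difference, with an invented table and column names:

```python
# Illustrative only: the table, columns, and data are made up, not from
# the app in the post.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES (?, ?)", ("alice", "admin"))

def role_for(conn, name):
    # Unsafe version: f"SELECT role FROM users WHERE name = '{name}'"
    # lets any quote character in the input rewrite the query
    # (classic SQL injection).
    # Safe version: let the driver bind the value as data, not SQL.
    row = conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchone()
    return row[0] if row else None
```

With the parameterized form, an input like `alice' OR '1'='1` is just a literal string that matches nothing, instead of a clause that matches everything.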

****Update Number 4****

I think one day perhaps I'll write an IT lessons learned / horror story collection book. I'm not sure if people would actually read it.

I do have more stories to share, and I'm glad certain folks seem to enjoy how I write, but I do think this should be a serious discussion board, so I tend to make my posts more question/serious oriented. Even when I have a funny horror story, I try to point out the serious implications and lessons learned. Not sure if there's a subreddit where my stream-of-consciousness musings would be a better fit.

Antibiotics make a world of difference when you have a stomach ulcer.

]]>
https://www.reddit.com/r/sysadmin/comments/eaphr8/a_dropbox_account_gave_me_stomach_ulcers/
https://www.reddit.com/r/sysadmin/comments/eaphr8/a_dropbox_account_gave_me_stomach_ulcers/Mon, 16 Dec 2019 06:47:00 +0100<![CDATA[Hex Guess!]]>

Guess the background color hex

Answer: 100000

Closest to 0 is best.

]]>
http://hex-guess.glitch.me/
http://hex-guess.glitch.me/Mon, 16 Dec 2019 06:47:00 +0100<![CDATA[Boris’s Blundering Brilliance | Andrew Sullivan | NYMag Intelligencer]]>



It’s hard to take the British prime minister, Boris Johnson, completely seriously. Just look at him: a chubby, permanently disheveled toff with an accent that comes off as a parody of an upper-class twit, topped off by that trademark mop of silver-blond hair he deliberately musses up before venturing into the public eye. Then there are those photo-op moments in his long career that seem designed to make him look supremely silly — stuck dangling in midair on a zip line with little Union Jacks waving in his hands; rugby-tackling a 10-year-old in Japan; playing tug-of-war in a publicity stunt and collapsing, suited, onto the grass; or declaring at one point that he was more likely to be “reincarnated as an olive,” “locked in a disused fridge,” or “decapitated by a flying Frisbee” than to become prime minister.

And yet he has. And more than that: This comic figure has somehow managed to find himself at the center of the populist storms sweeping Britain and the West — first by becoming the most senior politician in Britain to back Brexit in 2016, and now by plotting a course that might actually bring the United Kingdom out of the epic, years-long, once-impossible-looking mess he helped make. Just over four months into office as PM, he appears poised to win an election he called and, if the polls are anywhere near correct, score a clear victory and take Britain out of the E.U. by the end of January.

Not so long ago, this Brexit scenario seemed inconceivable: What the E.U. demanded and the British Parliament could support seemed irreconcilable, and no single resolution to the Brexit referendum had enough support to budge the country’s politics out of a maddening stall. But here we are, with Boris having budged it.

In the politics of this, he has been helped by the British winner-takes-all electoral system; by a very unpopular Labour leader, Jeremy Corbyn; and by an opposition split down the middle on Brexit. But he deserves credit for getting the E.U. to change (if somewhat marginally) a final deal it long said was absolutely nonnegotiable and for seeing an opportunity that others didn’t — in the U.K., across Europe, and in the U.S. — to become not more ideological in a tumultuous era of uncorked populism but less so. And more nakedly opportunistic.

This “lovable buffoon,” as he’s called now by journalists and politicos as well as former Labour voters turning Tory in focus groups in the Midlands, has skillfully maneuvered toward a full term as prime minister and perhaps toward an era in British politics when the Conservative Party is defined less by Thatcherism than Borisism. Through complete lack of principle, endless charm, and ruthless ambition, he has managed to bring about a possibility that, not too long ago, probably only he allowed himself to fantasize about: that he would become not just prime minister but a significant one.

How has he pulled all this off? His critics argue that he has cynically become the British Trump, whipping up xenophobia and ugly racism, lying with abandon, and reinventing the British Tory party as a hard-right populist engine for the worst instincts of the deplorable masses. He is now attacked as a racist and reckless Little Englander, gleefully wrecking the British economy, polarizing the country, and threatening to break up the U.K. solely to advance his own narcissistic ambitions. Shallow, lazy, incompetent, and bigoted, this clown has somehow leveraged the fears of the many to advance the only thing he has ever genuinely believed in: his own destiny.

But there is another story to be told about him: that he has been serious all along, using his humor and ridiculousness to camouflage political instincts that have, in fact, been sharper than his peers’. He sensed the shifting populist tides of the 2010s before most other leading politicians did and grasped the Brexit issue as a path to power. But he also understood how important it was not to be fully captured by that raw xenophobic energy. He saw Brexit discontent as something the political Establishment needed to engage and co-opt rather than dismiss and demonize, and he approached the opportunity in a very different way from his sometime ally Nigel Farage, whose provincial extremism veered into outright racism and whose political career Johnson has now all but ended.

As a longtime liberal Tory, Boris, as he’s invariably called in the press and by the public, saw the deep unpopularity of his party’s legacy of fiscal austerity and the need to shift left in economic and social policy. So he is quietly forging a new conservatism — appealing to the working poor and aspiring middle classes, tough on immigration and crime, but much more generous in spending on hospitals and schools and science. Or so he says for now. And if he succeeds — by no means a sure thing, though at this point it almost seems foolish to bet against him — he won’t just be charting a new future for the U.K. but pioneering a path for other Western parties of the center right confronted by the rise of populist extremism.

1982: Johnson at Eton.

The parallels with Donald Trump are at first hard to resist: two well-off jokers with bad hair playing populist. But Trump sees himself, and is seen by his voters, as an outsider, locked out of the circles he wants to be in, the heir to a real-estate fortune with no political experience and a crude sense of humor, bristling with resentment, and with a background in reality television. He despises constitutional norms, displays no understanding of history or culture, and has a cold streak of cruelty deep in his soul. Boris is almost the opposite of this, his career a near-classic example of British Establishment insiderism with his deep learning, reverence for tradition, and a capacity to laugh at himself that is rare in most egos as big as his. In 2015, after Trump described parts of London as no-go areas because of Islamist influence, Johnson accused him of “a quite stupefying ignorance that makes him, frankly, unfit to hold the office of president.” Even as president, Trump is driven primarily by resentment. Boris, as always, is animated by entitlement. (The vibe of his pitch is almost that people like him should be in charge.)

Alexander Boris de Pfeffel Johnson was educated at Eton (like at least 20 prime ministers before him), then at Oxford (like 27). He was president of the Oxford Union, the university’s legendary debating society, like prime ministers Heath, Asquith, and Gladstone. From 1999 to 2005, he was the editor of the 191-year-old Spectator, the eclectic, Tory-leaning magazine, a position that has often been a stepping-stone to high political office. He has been a member of Parliament twice, from 2001 to 2008 and since 2015. He was mayor of London from 2008 to 2016, presiding over the Olympics. Then he was foreign secretary from 2016 to 2018. This résumé is close to a parody of the British elite — about as far from Trump as it is possible to imagine.

That Johnson sometimes appears as an outsider is largely a function of his personality and how he has skillfully marketed it: extremely smart but constitutionally lazy; upper class with a real feel for and delight in ordinary life; sexually promiscuous to an almost comical degree; a defender of rules as long as he is entitled to break them from time to time; a humorist and pun merchant who has succeeded in making his own aristocratic idiosyncrasies part of the joke; a ruthless careerist with a capacity for deceit and forgiveness; and a narcissist no one should even begin to trust. But of course, as loath as aristocrats of previous generations would have been to admit it, all of this may be even more characteristic of the country’s ruling class than a stiff-upper-lip sense of propriety.

It’s all there at the beginning at Eton. In his sparkling 2006 biography, his friend Andrew Gimson dug up Boris’s old school reports. They might have been written by his often-frustrated colleagues in politics: “Boris’s favored pace is the amble (with the odd last-minute sprint), which has been good enough so far, and I suppose enables him to smell the flowers along the way. It’s time, though, that with a greater commitment to the real business of scholarship … Boris could turn himself into a classicist of real distinction.” It didn’t quite turn out that way. A year later, this gentle critique harshened: “Boris really has adopted a disgracefully cavalier attitude to his classical studies … I think he honestly believes that it is churlish of us not to regard him as an exception, one who should be free of the network of obligation which binds everyone else.”

And yet somehow Boris leveraged this indolence and incompetence to gain popularity. Gimson’s biography notes that he couldn’t be bothered to remember his lines in a school play, so he posted them around the stage and rushed from prompt to prompt, garbling most of them, evoking howls of laughter from the crowd and intense frustration from his fellow actors. He acted in a Molière play in French but with an atrocious English accent that also stole the show. He made his laziness into a joke — and discovered that this made people laugh and warm to him. It was at Eton too that he first honed his signature look with his blond mop and scruffy, ill-fitting clothes. And it was at Eton that his ferocious bursts of energy, optimism, and enthusiasm became better known: “On the rugby field Boris was an absolute berserker,” one report noted. “There was a lot of yelling and hurling of himself reckless of life and limb, both his own and other people’s.” Even as a teenager, he wanted to be remembered for his passion for sex. In the equivalent of a yearbook, Boris posted a picture of himself with two scarves and a machine gun and a pledge to make “more notches on my phallocratic phallus.” It remains a rather staggering fact today that no one actually knows how many children Boris has, and he point-blank refuses to discuss any details of his private life in public.

At Oxford, it was the same performance. I overlapped with him for a year (1983–84) and, like him, was president of the Oxford Union. Compared with most of the toffs, he seemed to me endearing. So many other Etonians downplayed their upper-class origins, became lefties, smoked pot, softened their accents, and wore clothes indistinguishable from anyone else. But Boris wore his class as a clown costume — never hiding it but subtly mocking it with a performance that was as eccentric as it was self-aware. He made others feel as if they were in on a joke he had created, which somewhat defused the class resentment he might otherwise have been subject to and which, like many from the lower ranks of British society, I mostly shared.

But I gave him a pass because he was so splendidly colorful. In the Union, he routinely cracked everyone up, his debating technique less forensic than simply funny — saying something in an absurdly aristocratic and formal way and then adding some pop-culture reference as a punch line. He has never let go of that rhetorical formula. Not everyone liked him, of course, and he had a hard time getting elected. In his first attempt to become president, he lost to an earnest middle-class student who mocked his Etonian Toryism. So Boris, with his usual disaffected aplomb, reinvented himself as a Social Democrat, got elected and then declared his Tory allegiance.

He seemed to have come to Oxford fully formed, a handsome blond who joined the Bullingdon Club, a selective, upper-class, all-male clique that held dinner parties in various restaurants and was known for getting plastered and vandalizing the joint. It represented to me the worst elements of private-school privilege and exclusion. That I didn’t reflexively despise Boris — as I did most of them — is testimony to his personal charm. His chums — Viscount Althorp (now Charles Spencer), Princess Diana’s brother; and the eccentric British-Iranian Darius Guppy (later jailed for fraud) — seemed to have little in common besides going to Eton.

He did not, however, achieve a first-class degree, a rare occasion in his life when winging it failed. In his chosen profession as a journalist, he worked for the Times of London and the Daily Telegraph, often finding stories where others didn’t but also just making stuff up. In one Times story, he invented a quote to sex up the piece, then lied to his editors about it. He was fired when the person he “quoted” complained, but, using his connections, he managed to get a second job, at the Telegraph, the solid Tory non-tabloid. In a stroke of editing genius, he was assigned to cover the E.U. in Brussels, where his environmentalist father had been one of the first British officials to work for the European bureaucracy (assigned to controlling pollution), where boy Boris thereby attended elementary school for two years and where, as a journalist, he also proceeded to just make stuff up. But this time, the stuff he embellished or concocted — about the overweening ambitions of the E.U. and the absurdities of various E.U. regulations, on, say, the size of condoms — was almost designed to tickle Tory Telegraph readers.

Johnson had demonstrated no previous hostility to Europe and in fact was a passionate enthusiast for many aspects of European culture and history. In this way, he was a somewhat typical British elite of his generation, a comfortable cosmopolitan. Indeed, his great-grandfather Ali Kemal was a high-ranking Turkish politician who opposed the rise of Atatürk and was thrown to the mercies of a bloodthirsty mob as a result. So there was some irony in Boris’s becoming the xenophobic, Euroskeptic right’s favorite writer. He did it not with anger or polemic but with unrelenting scorn and humor. In time, his editor, Max Hastings, saw Boris’s antics for what they were, calling him a “cavorting charlatan” and lamenting that “we can scarcely strip the emperor’s clothes from a man who has built a career, or at least a lurid love life, out of strutting without them.” Still, Hastings didn’t fire one of the most popular writers in his paper.

But in 1999, Conrad Black, the Trump-pardoned fraudster, lured him to be editor of the Spectator, which Black owned. The core question before Boris got the job was whether he would stick to journalism and stay out of elected office. Even for a lively magazine like the Spectator, there was an obvious concern that an elected politician as editor would severely cramp the independence it had long prided itself on. “He gave us his solemn word of honor that he would not seek selection for any party, including the Conservatives,” Black subsequently told Gimson. But about two weeks after this promise, without telling Black, Boris applied to be the Tory candidate for two different seats. Somehow he charmed his way out of a pink slip.

It was during his editorship that the Spectator became known as the Sextator in the tabloids, an august old journal suddenly rife with scandalous affairs involving no fewer than five staffers and the home secretary. By this time, Boris had already had two wives (he committed adultery on his first with his second) and four young children and was still very busy adding many more “notches.” Even now, he is living in No. 10 Downing Street with a girlfriend, Carrie Symonds, who is slightly older than his children, while fending off stories of past trysts with an American model and tech consultant, Jennifer Arcuri.

In all this, he is no socially conservative hypocrite and rather a bon vivant. Boris defended Bill Clinton’s shenanigans in the 1990s, blaming Monica Lewinsky for the affair, excoriating the press for its prurience, and defending the desirability of lying about extramarital dalliances. And he did indeed lie about his. Confronted by rumors of an affair with Petronella Wyatt while he was editing the Spectator, he denounced the story as “an inverted pyramid of piffle.” It wasn’t. Wyatt had one abortion and one miscarriage, and the affair soon became public knowledge. At the same time, his attempt to be in Parliament while being a journalist began to crumble under the weight of its contradictions. When he was promoted to become Tory spokesman for the arts, his own magazine tripped him up. An unsigned 2004 editorial — not written by him — lamented the tendency of the inhabitants of Liverpool to be overly sentimental: “They see themselves whenever possible as victims, and resent their victim status; yet at the same time they wallow in it.” As editor, Boris nobly took full responsibility for the piece and never outed its actual author, a reactionary blowhard named Simon Heffer.

But the Tory leader of the time, Michael Howard, demanded Boris go to Liverpool and apologize personally. He did … and he didn’t. In scenes reminiscent of Veep, Boris dutifully went to Liverpool but couldn’t help himself and, when pressed, defended the editorial. In a subsequent column, he called his endeavor “Operation Scouse-Grovel” — scouse being a slang word for a Liverpudlian. An angry Howard soon fired him for the Wyatt affair, and the press turned viciously against Boris. It wasn’t long before he was also fired from the Spectator despite having grown its circulation substantially. And when David Cameron — a younger fellow Etonian — won the leadership of the Tories in 2005, Boris was left out of the top tier of his opposition team. His political and journalistic future looked dim.

The smooth moderation of Cameron, who as prime minister oversaw an austerity response to the financial crisis, wasn’t very compatible with Boris’s berserker temperament, but his politics were rarely to Cameron’s right until Brexit, despite the way they’ve been described in both Britain and the U.S. over the past few years. It’s an understandable misreading: In that time, Boris has allied himself with many of the most hard-core Euro-obsessives and social conservatives in the Tory party and seemed prepared earlier this year to lead the U.K. out of the E.U. without a deal — the most extreme Brexit position available then. He expelled from the party 21 moderate, rebellious Conservative MPs (who refused to entertain a “no deal” outcome) in the Brexit battle and formed a Cabinet that included many hard-core social conservatives. On top of which, he has been lambasted for a number of passages from his long journalistic career that suggest racism, classism, sexism, and homophobia — and that he viewed as satirical excesses. Trump deploys the same defense, but outside of Boris’s purple prose — “tank-topped bum-boys,” burka-wearing women looking like “letter boxes” — the evidence of his bigotry is a little thin. A bigot would be unlikely to win two elections as mayor of London, a vast multiracial, multicultural metropolis. And his Cabinet is the most ethnically diverse in British history.

Or take gay rights. Back in 2003, Johnson was one of a handful of Tories who rebelled against Conservative Party policy, voting for an end to the Thatcherite ban on teaching about homosexuality in state schools. Like many pols, he couldn’t handle marriage equality at first, but then he adjusted, becoming in 2010 one of the first senior Tory politicians to entertain it. As London mayor, he marched in several Pride parades, and as foreign secretary, he reversed a ban on rainbow flags at British embassies. On a trip to Russia, he defended gay rights, saying at a press conference with Sergei Lavrov that “we speak up for the LGBT community in Chechnya and elsewhere.”

Islamophobia? Johnson had previously favored the entry of Turkey, with 81 million Muslims, into the E.U. He is hostile to the illiberalism in contemporary Islam but has defended the religion as a whole: “Everything that most shocks us about Islam now — the sexism, the intolerance of dissent, the persecution of heresy and blasphemy, the droning about hell and shaitan, the destruction of works of art, the ferocious punishments — all of them have been characteristics of Christian Europe. It wasn’t so long ago that we were burning books and heretics ourselves.”

Boris also appreciated the moderation of Barack Obama: “He is patently not the Marxist subversive loony lefty that some of his detractors allege.” And, of course, he has shown a deep contempt for Donald Trump. In 2016, he said he was “genuinely worried that [Trump] could become president … I was in New York and some photographers were trying to take a picture of me and a girl walked down the pavement towards me and she stopped and she said, ‘Gee, is that Trump?’ It was one of the worst moments.”

The truth is Johnson has a record as a liberal Tory: a conservative who can celebrate “our fantastic National Health Service” and has no interest in politicians’ preaching about morality. And it was this conservatism that enabled him to become mayor of London, a largely Labour city, where he thrived. He brought back the double-decker bus; launched a successful, if unprofitable, bike-sharing scheme, “Boris Bikes”; backed an amnesty for illegal immigrants; banned booze on the tube; raised the recommended living wage in London; and presided over an Olympics that became a public-relations coup for the entire country. Crime declined — as it did everywhere. And Boris became one of the most famous cyclists in the city, careening back and forth, often on his mobile phone. By the end of his term, a YouGov poll found that almost twice as many Londoners thought he did a good job as mayor as those who didn’t.

As always, Johnson’s ideological flexibility was key — so much so that it led him to resist the more doctrinaire forces in his own party. As mayor, Johnson complained about the austerity measures of the Tory Cameron government. And as prime minister, he has immediately ramped up public spending on the police, schools, and hospitals. He shelved a previous proposal to lower the corporation tax and has focused on raising the income threshold at which Brits pay the equivalent of the Social Security tax and on raising the minimum wage nationwide. He has urged people to “Buy British” — a slogan anathema to market economics. Whether this is posturing or serious, no one knows exactly, but it sure is a sharp move rhetorically left for the Tories, away from the wealthy and austerity and toward the working poor and debt.

This encodes a very clear understanding that, in the wake of the 2008 crash, the global elite in London has thrived but the working and middle classes in the rest of the country have been, at best, treading water. Johnson has defended the bankers in the City of London (they pay a large amount of Britain’s taxes) and, as mayor, presided over the glitterification of the city. But he also understands that, in this new era, there is widespread support for nationalism rather than internationalism and for social welfare rather than unrestricted capitalism. Johnson intuited what the polling now shows: The “left-right” axis has morphed into an “open-closed” divide. On the one hand, there are those who have been winners in the 21st century and who favor the E.U. and international institutions, globalization, free trade, and mass immigration. On the other, there’s a rising non-elite group that defends the nation-state, opposes global capitalism, and wants to reduce immigration and put native-born workers first. Boris has definitely shifted the Tories into the latter camp, specifically through Brexit, a stance that appeals to more working-class voters — in exactly the same way that the GOP’s base has shifted to the less educated.

The public has noticed. In 2019, the polling shows that 48 percent of working-class voters now back the Tories, while only 31 percent back Labour. This means that, in the current election, the Conservatives find themselves competitive in northern seats, where Labour was once close to a religion, even as some prosperous Tory seats in the South have become vulnerable to the Liberal Democrats. Brexit cemented this. But as the former chancellor George Osborne told me, “trading Oxford West for a shot at Hartlepool is a hell of a gamble in the medium term.”

Boris’s play for nationalist votes may be calculated and opportunistic — “He’s entirely focus-group driven,” says a former colleague — but it has befuddled and angered old friends and many in his liberal family, who still see him as merely playacting his support for the ERG, the most pro-Brexit faction among the Tories. “He thinks the ERG are nutters,” one prior Cabinet minister told me. His brother Jo quit Boris’s Cabinet when it became clear that a no-deal Brexit was on the table. His sister, Rachel, quit the Tories in 2011 and joined the “remain” party, the Liberal Democrats, in 2017. “His brothers and sister don’t believe him on Europe,” said the minister. “There’s a lot of pain in the family … He’s not the Boris I knew. He’s harder.”

2012: Promoting the London Olympics as mayor.

In speaking with multiple school and college contemporaries of Boris’s and with colleagues and former colleagues, including Cabinet ministers, I soon discovered no deep friendships or political networks. Compared with the elaborate social political network of, say, David Cameron, he is a loner. “He doesn’t value his friends the way I do. He doesn’t care,” says a former colleague, who also says, “It was lovely to work with him.” “People attach themselves to him,” says the minister. But it rarely feels as if he attaches to them. He was anchored for a long time by his marriage to his second wife, Marina Wheeler, but constantly rocked the marriage with countless affairs. Some who know him suggest his attachment to consecutive lovers is the only way he can securely feel intimacy. Others simply believe that Boris has had one endless love affair with himself and that everything else is politics. Some see him as a persona rather than a person: “He has no purpose,” says an embittered old ally. “For someone so prodigiously talented to have no moral core is heartbreaking.”

What struck me in these conversations is how little he seems to have changed over the years since I knew him — as if his emotional development were arrested in college. What’s different now is that a series of lies and betrayals has alienated many. “The British people are going to have the same experience with Boris that everyone who has known him have understood,” says the former ally. “They will feel hugely let down.”

Johnson was brought up with three siblings by a father, Stanley Johnson, whose braggadocio and humor rival his son’s, and by a mother, Charlotte Fawcett, an artist with a liberal background. Stanley’s career came first, which led to constant upheaval (the family moved 32 times while the marriage lasted, according to Fawcett) and to Boris’s being born in New York City and brought up in the U.S., Belgium, and England. (Boris was a dual citizen of the U.S. and the U.K. until a couple of years ago.) Stanley treated his marriage vows as seriously as Boris would his, and this, along with the constant moving and not-so-secure finances, likely contributed to Fawcett’s nervous breakdown in 1974. She would be in and out of hospital with severe depression for some years before she divorced. Extremely close to her children, she was suddenly gone from their lives, and Boris was devastated.

There remains a hint of pathos in those droopy blue eyes of his. “He’s a much more introspective person than most people assume,” his former ally told me. “That strange look behind those brows is a vulnerability. People want to mother him.” And forgive him. Political rivals he has betrayed or fired have shockingly positive things to say about him. He’d follow up a sacking or a public row with texts begging forgiveness or making amends.

He seems most emotionally comfortable in front of an audience, cracking jokes. On television, even as he was a journalist and an MP, he became something of a star in his own right. He regularly went on a satirical quiz show on current events called Have I Got News for You? and was as funny as the professional comedians who were permanent guests. In turn, he was invited back to guest host. He began cracking people up on various interview shows and became a very rare politician with true pop-cultural appeal. He was the kind of celebrity figure who could advise the readers of GQ that under a Conservative government, “your wife will get bigger breasts and your chances of driving a BMW M3 will increase.”

Near the end of his second term as London mayor, Johnson broke yet another promise that he would not seek a parliamentary seat while mayor and reentered Parliament in the 2015 election, when Cameron shocked himself and everyone else by winning handily. This was not good for Boris’s career, as Cameron’s right-hand man, George Osborne, was widely regarded as the successor-in-waiting, and Boris was, in the words of one pol, “desolate” at the result. But the victory ensured that Cameron’s election promise of an E.U. referendum couldn’t be avoided, and almost all the political elite rallied around the “remain” camp, with most assuming that Boris would join them and Cameron having no idea he would be betrayed. In the end, after much dithering, Boris famously wrote two drafts of his announcement, one favoring “remain” and one “leave.” “I am a European. I lived many years in Brussels. I rather love the old place,” he wrote in the first paragraph of his pro-Brexit column. “And so I resent the way we continually confuse Europe — the home of the greatest and richest culture in the world, to which Britain is and will be an eternal contributor — with the political project of the European Union. It is, therefore, vital to stress that there is nothing necessarily anti-European or xenophobic in wanting to vote Leave on June 23.” Ultimately, he would campaign against his own government, becoming at once the most formidable politician behind the “leave” cause.

The “leave” campaign deceived voters. It famously claimed that Brexit would free up £350 million a week to spend on the NHS, a sum that represented the gross amount of money Britain gave to the E.U., not the net, which was less than half that. A notorious poster raised fears of mass immigration by showing a trail of dark-looking migrants with the slogan “Breaking Point.” There were also ugly last-minute scare stories that Turkey was going to be admitted to the E.U. and millions of Turks would be arriving soon — a position diametrically opposed to Boris’s long championing of Turkish E.U. membership. Boris didn’t personally run the campaign and can’t be faulted for things he didn’t say or do, but he neither protested nor tried to stop the lies. (It is also true that the “remain” campaign grossly overstated the immediate economic consequences of voting “leave.”) But the shock of the “leave” victory and the almost as shocking decision by Cameron to quit the day after suddenly gave Boris a shot at No. 10.

The day after Cameron resigned, Boris went to play cricket with his old chum Charles Spencer, rather than rally his allies for a leadership contest. Some of his fellow Tories found the idea of this reckless joker as prime minister too absurd, and his closest ally on the “leave” campaign, Michael Gove, stuck the knife in: “Boris is a big character with great abilities, and I enjoyed working with him in the referendum campaign … But there is something special about leading a party and leading a country, and I had the opportunity in the last few days to assess whether or not Boris could lead that team and build that unity. And I came reluctantly but firmly to the conclusion that, while Boris has many talents and attributes, he wasn’t capable of building that team.”

The backstabbing alienated most Tory MPs, and they gave Theresa May the job. She hugged Boris close and made him foreign secretary, but when her Brexit deal emerged as a supersoft one, Boris took a second big risk and quit the Cabinet in July 2018. May’s deal then flapped like a fish out of water on the floor of the Commons until it eventually expired.

[Photo: At the 2019 NATO summit. Dan Kitwood/Getty Images]

When May resigned, Johnson easily won the Tory leadership contest to succeed her. But it was a decimated party. May had backed “remain” in the referendum, and her failure to get Brexit done had made the Tory base furious and suspicious and the Conservatives almost a laughingstock. Public support tumbled from around 40 percent to 22 percent in the first half of 2019, its lowest share in recent history. Boris pledged he would get a new deal by credibly threatening to pull the country out of Europe with no deal — saying he would rather “die in a ditch” than let Britain’s E.U. membership go past October 31. He also promised there would be no compromise on the Irish border, which would remain open after Brexit, even though he offered no solution as to how this wouldn’t open a huge hole in the E.U. customs union. He attempted to prevent Parliament from intervening by proroguing it for a longer time than usual, a move swiftly ruled unconstitutional by the relatively new “supreme court” of the U.K.

As usual, he broke almost all his promises. Britain is still in the E.U. long after October 31, with an election on December 12. But on the one promise no one believed he could fulfill — a new deal with the E.U. — he succeeded. It turned out that a credible threat to leave without a deal (which May never made) concentrated minds considerably. And a burst of intense personal diplomacy with the Irish prime minister, Leo Varadkar — when Boris deployed his maximal charm — delivered a solution to the core Irish problem: a customs pseudo-border in the Irish Sea, an idea Boris had dismissed a year before. Emmanuel Macron congratulated the new prime minister: “He may be a colorful character sometimes, but we all are at times. He’s got a temper, but he’s a leader with a real strategic vision. Those who didn’t take him seriously were wrong.”

It was quite a coup, proof that Johnson could deliver, and the Tories rallied in the polls as the upstart Brexit Party plummeted. More important, the deal, unlike May’s, won its first procedural vote in Parliament, by a big majority of 30, as some Labourites had backed it. It seemed within Boris’s grasp to get the deal passed by a slightly extended deadline — some time into November of this year. And then he made a strategic gamble. Fearing that Parliament could still frustrate the deal down the line, rather than pressing on he decided to call an election for a new mandate to “get Brexit done.”

The decision was driven by Dominic Cummings, the controversial but brilliant guru who had engineered the Brexit vote. Boris brought him into No. 10. Cummings understood that Brexit was the Conservatives’ best issue and that they were vulnerable on other domestic issues, especially on austerity and public spending. If Boris delivered Brexit and then called an election, he argued, the campaign would be on Labour’s terms: domestic economic and social policy. But if the election was called quickly, before Brexit, it would be on Conservatives’ terms: “Get Brexit Done,” as Boris has repeated endlessly while campaigning. The fear was a rerun of the Churchill 1945 election, when a victorious war leader was thrown out once he was no longer needed and the Brits voted en masse for a socialist revolution. He also saw that Boris could unite the “leave” vote by winning back Brexit Party supporters, while the “remain” vote stayed hopelessly divided between the Labour and Liberal Democratic parties: a chance to create a new Tory coalition built on “leave.” Some Brexiteers resisted, seeing a chance to get Brexit passed without any election potentially messing things up.

So far, the gamble appears to be paying off. A huge poll of over 100,000 Brits by YouGov last month, using the same methods that had rightly predicted a hung Parliament in 2017, showed a possible Tory majority of 68 seats. In the poll, the Tories held on to their traditional base in the South but made striking gains in the North, turning long-held Labour seats into Tory ones overnight. It is the same dynamic that saw the Democrats lose the Rust Belt swing states in 2016. The poll shows Labour at 32 percent with the Lib Dems at 14, while the Tories have 43 percent support and the Brexit Party has collapsed to 3 percent. Boris’s strategy destroyed both the former U.K. Independence Party and then the Brexit Party — the two parties of the far right. Divide and conquer was how Thatcher won three times in a row in parliamentary seats despite never having majority support in the country as a whole. If Boris wins, it will be by the same strategy.

But his appeal is very different from Thatcher’s. Far from confronting people with hard economic choices and threatening ever-deeper austerity amid soaring unemployment, as she did, Boris is promising much more public spending than his Tory predecessors, in an era of very low unemployment, while trimming tax for the working and middle classes. The cut in corporation tax, planned by Theresa May, was scrapped. He plans big increases in spending on the National Health Service and schools and doubling the government science budget, while also getting tougher on crime and terrorism. Much of this appeals more to traditional Labour voters than to the London Tories who read The Economist.

And, of course, Brexit will not be “done,” as Boris promises. The Withdrawal Agreement is just the first step in a long and agonizing process of trade talks. Boris has promised these will be over by the end of 2020 and said so in the first televised debate between him and Corbyn. But no one can possibly believe that (and most people don’t). What Boris seems to be counting on is that he will conclude a withdrawal agreement by the end of January, make a huge fuss over it, declare the matter finished, and hope that most Brits will not want to immerse themselves in the mind-numbing details of trade talks. He’s gambling that Brexit is largely a symbolic issue — a new statement of British sovereignty and independence — and that the details of future trade don’t really matter. And he may be cynical about this but also right.

One sign of this possibility is the immigration issue. It was critical to the Brexit vote but disappeared as a major issue in the polls as soon as the referendum was over. The question has played almost no part in the current campaign even though Britain’s immigration system hasn’t changed significantly since 2016. The xenophobic ugliness that appeared before the referendum largely subsided afterward. It’s as if people just wanted to be heard on the subject, and to see a broad shift away from mass immigration, but didn’t actually care that much beyond that. It may be a function of the fact that E.U. migration to Britain has fallen drastically since the referendum, and Boris is pledging to transform the entire system toward the Australian model of selection based on proven abilities and skills. But the ability of most people to move on from difficult subjects once they feel they’ve been listened to should not be underestimated.

It is this aspect of Boris’s politics that some of his close allies insist has been misunderstood. He has done what no other conservative leader in the West has done: He has co-opted and thereby neutered the far right. The reactionary Brexit Party has all but collapsed since Boris took over. Anti-immigration fervor has calmed. The Tories have also moved back to the economic and social center under Johnson’s leadership. And there is a strategy to this. What Cummings and Johnson believe is that the E.U., far from being an engine for liberal progress, has, through its overreach and hubris, actually become a major cause of the rise of the far right across the Continent. By forcing many very different countries into one increasingly powerful Eurocratic rubric, the E.U. has spawned a nationalist reaction. From Germany and France to Hungary and Poland, the hardest right is gaining. Getting out of the E.U. is, Johnson and Cummings argue, a way to counter and disarm this nationalism and to transform it into a more benign patriotism. Only the Johnson Tories have grasped this, and the Johnson strategy is one every other major democracy should examine.

Consider, by contrast, Germany, where the center right is reeling and the extreme-right AfD has 91 seats in the Bundestag. Or, for that matter, France, where the mainstream right has collapsed and Marine Le Pen won 34 percent in the last presidential election. Compare it with the U.S., where the GOP has been overthrown by a far-right insurgency and turned into a disturbingly fascistic personality cult. Or Hungary and Poland, where reactionaries control the entire system. The Tories under Boris, helped in part by the winner-takes-all electoral system, have kept the far right at bay, now favor tax cuts for the poor, have a strong program for climate change, and have proposed an Australian-style immigration policy to defuse native panic. They are not socially conservative in the American sense. And all of this has been made possible by Boris Johnson’s shameless ability to shift and reinvent his politics, betray his allies, lie to the public, and advance his own career. One of those close to him told me that the next group he will betray is the ERG, the hard-right Tory Brexiteers. And if he wins this election by a solid margin and seizes the center, he may force the Labour Party to reexamine how far left it has traveled in the past few years.

What Boris is offering as an alternative is a Tory social democracy rooted in national pride and delivered with a spoonful of humor and entertainment. In some ways, his personality is part of the formula. His plummy voice and silly hair and constant jokes are deeply, even reassuringly, British even as demographic change has made Britishness seem fragile. And if you still believe in the nation-state, in liberal democracy, and have qualms about the unintended consequences of neoliberal economics, it’s about as decent a conservative political blend as is on offer in the West. It makes the GOP look deranged by contrast.

Yes, Boris has shifted and lied and betrayed on his path to this moment. But he will gladly point out that the same criticisms were made of Churchill, who switched parties, alienated almost everyone in the Establishment, and was regarded long into the 1930s as a crank and a joke with a funny way of speaking. But Churchill was right about the one thing that mattered, and Johnson not so subtly implies the same is true about him and Brexit. It takes a large ego to use Churchill as an analogy, and Brexit is hardly the Battle of Britain (and Churchill famously wanted a united Europe after the war). But in defense of Britain’s independence from foreign power and its unbroken national sovereignty, without foreign invasion, for a thousand years, you can see, or rather feel, the parallel. And when it comes to political analogies, Boris the opportunist will take what he can get.

*This article appears in the December 9, 2019, issue of New York Magazine.

]]>
http://nymag.com/intelligencer/2019/12/boris-johnson-brexit.html
http://nymag.com/intelligencer/2019/12/boris-johnson-brexit.htmlMon, 16 Dec 2019 06:47:00 +0100<![CDATA[dwarvesf/hidden: An ultra-light macOS utility that helps hide menu bar icons]]>

Hidden Bar

Hidden Bar lets you hide menu bar items to give your Mac a cleaner look.

]]>
https://github.com/dwarvesf/hidden
https://github.com/dwarvesf/hiddenMon, 16 Dec 2019 06:47:00 +0100<![CDATA[Time Converter - Conversion at a Glance - Pick best time to schedule conference calls, webinars, online meetings and phone calls.]]>
]]>
https://www.worldtimebuddy.com/
https://www.worldtimebuddy.com/Mon, 16 Dec 2019 06:47:00 +0100<![CDATA[We won the argument, but I regret we didn’t convert that into a majority for change | Jeremy Corbyn | Politics | The Guardian]]>

We are living in highly volatile times. Two and a half years ago, in the first general election I contested as Labour leader, our party increased its share of the popular vote by 10 percentage points. On Thursday, on a desperately disappointing night, we fell back eight points.

I have called for a period of reflection in the party, and there is no shortage of things to consider. I don’t believe these two contrasting election results can be understood in isolation.

The last few years have seen a series of political upheavals: the Scottish independence campaign, Labour’s transformation, Brexit, the Labour electoral surge, and now Johnson’s “Get Brexit Done” victory. None of that is a coincidence.

The political system is volatile because it is failing to generate stable support for the status quo following the financial crash of 2008. As Labour leader I’ve made a point of travelling to all parts of our country and listening to people, and I’ve been continually struck by how far trust has broken down in politics.

The gap between the richest and the rest has widened. Everyone can see that the economic and political system is not fair, does not deliver justice, and is stacked against the majority.

That has provided an opening for a more radical and hopeful politics that insists it doesn’t have to be like this, and that another world is possible. But it has also fuelled cynicism among many people who know things aren’t working for them, but don’t believe that can change.

I saw that most clearly in the former industrial areas of England and Wales where the wilful destruction of jobs and communities over 40 years has taken a heavy toll. It is no wonder that these areas provided the strongest backlash in the 2016 referendum and, regrettably for Labour, in the general election on Thursday.

In towns where the steelworks have closed, politics as a whole wasn’t trusted, but Boris Johnson’s promise to “get Brexit done” – sold as a blow to the system – was. Sadly that slogan will soon be exposed for the falsehood it is, shattering trust even further.

Despite our best efforts, and our attempts to make clear this would be a turning point for the whole direction of our country, the election became mainly about Brexit.

A Conservative party prepared to exploit divisions capitalised on the frustration created by its own failure to deliver on the referendum result – to the cost of a Labour party seeking to bring our country together to face the future.

The polarisation in the country over Brexit made it more difficult for a party with strong electoral support on both sides. I believe we paid a price for being seen by some as trying to straddle that divide or re-run the referendum.

We now need to listen to the voices of those in Stoke and Scunthorpe, Blyth and Bridgend, Grimsby and Glasgow, who didn’t support Labour. Our country has fundamentally changed since the financial crash and any political project that pretends otherwise is an indulgence.

Progress does not come in a simple straight line. Even though we lost seats heavily on Thursday, I believe the manifesto of 2019 and the movement behind it will be seen as historically important – a real attempt at building a force powerful enough to transform society for the many, not the few. For the first time in decades, many people have had hope for a better future.

That experience, shared by hundreds of thousands of people, cannot be erased. Our task as a movement, and as a party that has more than doubled in size, is not over: we now face the urgent task of defending the communities that will come under sustained assault from Boris Johnson’s government and the toxic deal he wants with Donald Trump.

And it must set about ensuring that sense of hope spreads and deepens. As socialists we seek to raise people’s expectations. People in our country deserve so much more – and they can have it, if we work together to achieve it.

I am proud that on austerity, on corporate power, on inequality and on the climate emergency we have won the arguments and rewritten the terms of political debate. But I regret that we did not succeed in converting that into a parliamentary majority for change.

There is no doubt that our policies are popular, from public ownership of rail and key utilities to a massive house-building programme and a pay rise for millions. The question is, how can we succeed in future where we didn’t this time?

There is no quick fix to overcome the distrust of many voters. Patronising them will not win them over. Labour has to earn their trust. That means the patient work of listening and standing with communities, especially as the government steps up its assault. And it means ensuring that the working class, in all its diversity, is the driving force within our party.

The media attacks on the Labour party for the last four and a half years were more ferocious than ever – and of course that has an impact on the outcome of elections. Anyone who stands up for real change will be met by the full force of media opposition.

The party needs a more robust strategy to meet this billionaire-owned and influenced hostility head-on and, where possible, turn it to our advantage.

We have suffered a heavy defeat, and I take my responsibility for it. Labour will soon have a new leader. But whoever that will be, our movement will continue to work for a more equal and just society, and a sustainable and peaceful world.

I’ve spent my life campaigning for those goals, and will continue to do so. The politics of hope must prevail.

]]>
https://www.theguardian.com/politics/2019/dec/14/we-won-the-argument-but-i-regret-we-didnt-convert-that-into-a-majority-for-change
https://www.theguardian.com/politics/2019/dec/14/we-won-the-argument-but-i-regret-we-didnt-convert-that-into-a-majority-for-changeMon, 16 Dec 2019 06:47:00 +0100<![CDATA[CS 144: Introduction to Computer Networking]]>

]]>
https://cs144.github.io/
https://cs144.github.io/Mon, 16 Dec 2019 06:47:00 +0100<![CDATA[Prime Leverage: How Amazon Wields Power in the Technology World - The New York Times]]>

Amazon Everywhere

Prime Leverage: How Amazon Wields Power in the Technology World

Software start-ups have a phrase for what Amazon is doing to them: ‘strip-mining’ them of their innovations.

SEATTLE — Elastic, a software start-up in Amsterdam, was rapidly building its business and had grown to 100 employees. Then Amazon came along.

In October 2015, Amazon’s cloud computing arm announced it was copying Elastic’s free software tool, which people use to search and analyze data, and would sell it as a paid service. Amazon went ahead even though Elastic’s product, called Elasticsearch, was already available on Amazon.

Within a year, Amazon was generating more money from what Elastic had built than the start-up itself, by making it easy for people to use the tool with its other offerings. So Elastic added premium features last year and limited what companies like Amazon could do with them. Amazon duplicated many of those features anyway and provided them free.

In September, Elastic fired back. It sued Amazon in federal court in California for violating its trademark because Amazon had called its product by the exact same name: Elasticsearch. Amazon “misleads consumers,” the start-up said in its complaint. Amazon denied it had done anything wrong. The case is pending.

Not since the mid-1990s, when Microsoft dominated the personal computer industry with Windows, has a technology platform instilled such fear in competitors as Amazon is now doing with its cloud computing arm. Its feud with Elastic illustrates how it brandishes power in that technical world.

While cloud computing may appear obscure and geeky, it underlies much of the internet. It has grown into one of the technology industry’s largest and most lucrative businesses, offering computing power and software to companies. And Amazon is its single-biggest provider.

Amazon has used its cloud computing arm — called Amazon Web Services, or A.W.S. for short — to copy and integrate software that other tech companies pioneered. It has given an edge to its own services by making them more convenient to use, burying rival offerings and bundling discounts to make its products less expensive. The moves drive customers toward Amazon while those responsible for the software may not see a cent.

Even so, smaller rivals say they have little choice but to work with Amazon. Given the company’s broad reach with customers, start-ups often agree to its restrictions on promoting their own products and voluntarily share client and product information with it. For the privilege of selling through A.W.S., the start-ups pay a cut of their sales back to Amazon.

Some of the companies have a phrase for what Amazon is doing: strip-mining software. By lifting other people’s innovations, trying to poach their engineers and profiting off what they made, Amazon is choking off the growth of would-be competitors and forcing them to reorient how they do business, the companies said.

All of this has fueled scrutiny of Amazon and whether it is abusing its market dominance and engaging in anticompetitive behavior. The company’s tactics have led several rivals to discuss bringing antitrust complaints against it. And regulators and lawmakers are examining its clout in the industry.

“People are afraid that Amazon’s ambitions are endless,” said Matthew Prince, chief executive of Cloudflare, an A.W.S. competitor that protects websites from attacks.

A.W.S. is just one prong of Amazon’s push to dominate large swaths of American industry. The company has transformed retailing, logistics, book publishing and Hollywood. It is rethinking how people buy prescription drugs, purchase real estate and build surveillance for their homes and cities.

But what Amazon is doing through A.W.S. is arguably more consequential. The company is the unquestioned market leader — triple the size of its nearest competitor, Microsoft — in the seismic shift to cloud computing. Millions of people unknowingly interact with A.W.S. every day when they stream movies on Netflix or store photos on Apple’s iCloud, services that run off Amazon’s machines.

Jeff Bezos, Amazon’s chief executive, once called A.W.S. an idea “no one asked for.” The service began in the early 2000s when the retailer struggled to assemble computer systems to start new projects and features. Once it built a common computer infrastructure, Amazon realized other companies needed similar capabilities.

Now companies like Airbnb and General Electric essentially rent computing from Amazon — otherwise known as using the “cloud” — instead of buying and running their own systems. Businesses can then store their information on Amazon machines, pluck data from them and analyze it.

For Amazon itself, A.W.S. has become crucial. The division generated $25 billion in sales last year — roughly the size of Starbucks — and is Amazon’s most profitable business. Those profits enable the company to plow money into many other industries.

In a statement, Amazon said the idea that it was strip-mining software was “silly and off-base.” It said it had contributed significantly to the software industry and that it acted in the best interest of customers.

Some tech companies said they had found more customers through A.W.S.; even some companies that have tangled with Amazon have grown. Elastic, for instance, went public last year and now has 1,600 employees.

But in interviews with more than 40 current and former Amazon employees and those of rivals, many said the costs of what the company was doing with A.W.S. were hidden. They said it was hard to measure how much business they had lost to Amazon, or how the threat of Amazon had turned off would-be investors. Many spoke on the condition of anonymity for fear of angering the company.

In February, seven software chief executives met in Silicon Valley and discussed bringing an antitrust lawsuit against the giant, said four people with knowledge of the gathering. Their grievances echoed a complaint by vendors who use Amazon’s shopping site: Once Amazon becomes a direct competitor, it is no longer a neutral party.

The C.E.O.s did not press forward with a legal action, partly out of concern that the process would take too long, the people said.

Now regulators are approaching some of Amazon’s software rivals. The House Judiciary Committee, which is investigating the big tech companies, asked Amazon in a September letter about A.W.S.’s practices. The Federal Trade Commission, which is also investigating Amazon, has questioned A.W.S. competitors, according to officials at two software companies who were called in but were not authorized to discuss the matter.

What Amazon is doing to software start-ups is unsustainable, said Salil Deshpande, founder of Uncorrelated, a venture capital firm.

“It has intercepted their monetization, it has forcibly wrestled control of software from their owners and it has siphoned customers to its own proprietary services,” he said.

‘Strip Mining’

When Amazon Web Services began last decade, Amazon was struggling to turn a consistent profit. A service to provide computing power seemed like a distraction.

Yet start-ups embraced A.W.S. They saved money because they did not need to buy their own computing equipment, while spending only on what they used. Soon more companies flocked to Amazon for computing infrastructure and, eventually, the software that ran on its machines.

In 2009, Amazon established a template for accelerating A.W.S.’s growth. That year, it introduced a service for managing a database, which is critical software to help companies organize information.

The A.W.S. database service, an instant hit with customers, did not run software that Amazon created. Instead, the company plucked from a freely shared option known as open source.

Open-source software has few parallels in business. It is akin to a coffee shop giving away coffee in the hope that people will pay for milk, sugar or pastries.

But open source is a tried and true model nurtured by the software industry to get technology to customers quickly. A community of enthusiasts often springs up around the shareable technology, contributing improvements and spreading the word about its benefits. Traditionally, open-source companies later earn money for customer support or from paid add-ons.

Technologists initially paid little attention to what Amazon had done with database software. Then in 2015, Amazon repeated the maneuver by copying Elasticsearch and offering its competing service.

This time, heads turned.

“There was a company that built a business around an open-source product that people like using and, suddenly, they have a competitor using their own stuff against them,” said Todd Persen, who started a non-open-source software company this year so there was “zero chance” that Amazon could lift his creations. His previous start-up, InfluxDB, was open source.

Again and again, the open-source software industry became a well that Amazon turned to. When it copied and integrated that software into A.W.S., it didn’t need permission or have to pay the start-ups for their work, creating a disincentive for others to innovate.

That left little recourse for many of these companies, which could not suddenly start charging money for what was free software. Some instead changed the rules around how their wares could be used, restricting Amazon and others who want to turn what they have created into a paid service.

Amazon has worked around some of their changes.

When Elastic, now based in Silicon Valley, shifted the rules for its software last year, Amazon said in a blog post that open-source software companies were “muddying the waters” by limiting access to certain users.

Shay Banon, Elastic’s chief executive, wrote at the time that Amazon’s actions were “masked with fake altruism.” Elastic declined to make Mr. Banon available for an interview.

Last year, MongoDB, a popular technology for organizing data in documents, also announced that it would require any company that manages its software as a web service to freely share the underlying technology. The move was widely viewed as a hedge against A.W.S., which does not openly share its technology for creating new services.

A.W.S. soon introduced its own technology with the look and feel of MongoDB’s older software, which did not fall under the new requirements.

That experience was top of mind this year when Dev Ittycheria, MongoDB’s chief executive, attended the dinner with the heads of six other software companies. Their conversation, held at the home of a Silicon Valley venture capitalist, shifted to something drastic: whether to publicly accuse Amazon of behaving like a monopoly.

At the meal, which included the heads of the software firms Confluent and Snowflake, some of the C.E.O.s said they faced an uneven playing field, according to the people with knowledge of the gathering. No complaint has materialized.

“A.W.S.’s success is built on strip-mining open-source technology,” said Michael Howard, chief executive of MariaDB, an open-source company. He estimated that Amazon made five times more revenue from running MariaDB software than his company generated from all of its businesses.

Andi Gutmans, an A.W.S. vice president, said some companies wanted to be “the only ones” to make money off open-source projects. He said Amazon was “committed to making sure that open-source projects remain truly open and customers get to choose how they use that open-source software — whether they choose A.W.S. or not.”

By the time A.W.S. held its first developer conference in 2012, Amazon was no longer the only big player in cloud computing. Microsoft and Google had introduced competing platforms.

So Amazon unveiled more software services to make A.W.S. indispensable. In a speech at the event, Andy Jassy, the head of A.W.S., said it wanted to “enable every imaginable use case.”

Amazon has since added A.W.S. services at a blistering pace, going from 30 in 2014 to about 175 as of December. It also built in a home-field advantage: simplicity and convenience.

Customers can add new A.W.S. services with a single click and use the same system to manage them. The new service is added to the same bill and requires no extra permission from a finance or compliance department.

In contrast, using a non-Amazon service on A.W.S. is more complicated.

Today when a customer logs onto A.W.S., they see a home page called the management console. At the center is a list of about 150 services. All are A.W.S.’s own products.

When someone types “MongoDB,” the search results do not fetch information for MongoDB’s service on A.W.S.; instead, they suggest an offering from Amazon that is “compatible with MongoDB.”

Even after a customer has selected a non-Amazon option, the company sometimes continues pushing its own product. When someone creates a new database, they are presented with an ad for Amazon’s own technology called Aurora. If they pick something else, Amazon still highlights its option as “recommended.”

Mr. Gutmans said A.W.S. worked closely with many companies to integrate their offerings “as seamlessly as possible.”

Banning Words

Amazon’s A.W.S. developer conference is now one of the world’s biggest technology events, drawing tens of thousands of people to Las Vegas every year.

The highlight is a speech from Mr. Jassy where he showcases new services. Because a new A.W.S. feature often spells hardship for some start-up, the presentation has earned the nickname “The Red Wedding,” after a bloody event in a “Game of Thrones” episode.

“Nobody knows who is going to get killed next,” said Corey Quinn of the Duckbill Group, who helps companies manage their A.W.S. bills and writes a newsletter called “Last Week in A.W.S.”

At last year’s conference, Amazon unveiled a new tool — Amazon CloudWatch Logs Insights — to help customers analyze information about its services.

Daniel Vassallo, a former A.W.S. software engineer who helped develop the product, said executives wanted to go after the market, but were worried it would look like Amazon was targeting a company called Splunk, which offers a similar tool and is also a major spender with A.W.S.

So Amazon previewed its new product to Splunk before the conference and agreed not to announce it during Mr. Jassy’s speech, Mr. Vassallo said.

“They weren’t particularly happy. Who would be?” Mr. Vassallo, who left Amazon in February, said of Splunk. “But we still went ahead and did it anyway.”

Splunk said it had a “strong partnership” with A.W.S. and declined to comment further.

Amazon has also created rules for its developer conference. Companies that pay tens of thousands or hundreds of thousands of dollars for a booth said they must submit their banners, pamphlets and news releases to Amazon for approval.

According to an A.W.S. document from August explaining marketing guidelines for companies it works with, Amazon bans certain words or phrases, such as “multi-cloud,” the concept of using two or more cloud platforms. An Amazon spokesman said it had stopped this practice.

Companies are also instructed to strike claims about being “the best,” “the first,” “the only,” “the leader,” unless substantiated by independent research.

‘Love-Hate Relationship’

Redis Labs was founded in 2011 in Tel Aviv, Israel, to build a business around managing a free software called Redis, which people use to organize and update data quickly. Amazon soon offered a competing paid service.

While that created a formidable rival to Redis Labs, Amazon’s move also validated Redis technology. The start-up has since raised $150 million, exemplifying the can’t-live-with-can’t-live-without relationship that many software companies have with Amazon.

Former Redis Labs employees estimate that Amazon generates as much as $1 billion a year from Redis technology — or at least 10 times more revenue than Redis Labs. They said Amazon also tried to poach its staff and undercut it with hefty discounts.

A.W.S. offers a discount to customers who commit to spending at least a certain amount with it, but it does not treat money spent on A.W.S.’s own services and rival services equally. Spending on outside services counts as only 50 cents on the dollar toward the balance. And discounts do not apply to non-Amazon products, according to A.W.S. customers.

If a customer still chooses Redis Labs through A.W.S., Redis Labs is required to kick back around 15 percent of its revenue to Amazon.

At one point, Amazon’s attempts to hire Redis Labs employees became so aggressive that executives removed some online biographies of its technical staff, said the former employees. A Redis Labs spokesman said the start-up had no recollection of that.

Some Redis Labs executives considered bringing an antitrust action against Amazon this year, the former employees said. Others balked because 80 percent of the start-up’s revenue came from customers on A.W.S.

“It was a love-hate relationship,” said Leena Joshi, a former vice president of marketing at Redis Labs. “On one hand, most of our customers ran on A.W.S. so it was in our interest to be tightly integrated with them. At the same time, we knew they were taking away our business.”

Redis Labs declined to comment on its revenues or A.W.S. actions. It said Amazon offered “important services.”

Not every company views A.W.S. as a threat. Ali Ghodsi, chief executive of Databricks, a San Francisco start-up that uses artificial intelligence to analyze data, said A.W.S. salespeople have lifted sales of his company’s products.

“I don’t see them using shenanigans to stop us,” he said.

But Saket Saurabh, chief executive of Nexla, a 14-person start-up in Millbrae, Calif., said he had reservations about Amazon.

In August, Amazon began a service for processing and monitoring data that competes with Nexla. Investors warned him about sharing too much information with the giant.

Mr. Saurabh went ahead anyway and signed his company up to work with Amazon in September. The reason? Amazon’s giant sales teams can give Nexla access to a vast audience.

“What choice do we have?” he said.

]]>
https://www.nytimes.com/2019/12/15/technology/amazon-aws-cloud-competition.html
https://www.nytimes.com/2019/12/15/technology/amazon-aws-cloud-competition.htmlMon, 16 Dec 2019 06:47:00 +0100<![CDATA[Microbrowsers Are Everywhere ◆ 24 ways]]>You’ve seen it everywhere - that little thumbnail preview of a website mentioned in a tweet, the expanded description in a Slack channel, or in WhatsApp group chat.

These link previews are so commonplace that we hardly pay any attention to how our site design might be impacting the generated preview. Yet these previews can be the most influential factor in attracting new audiences and increasing engagement - possibly more than SEO. Even more alarming is that most web analytics are blind to this traffic and can’t show you how these Microbrowsers are interacting with your site.

As we close out the year, here are five essential questions and ideas that every web dev should know about Microbrowsers.

1. What are Microbrowsers? How are they different from “normal” browsers?

We are all very familiar with the main browsers like Firefox, Safari, Chrome, Edge and Internet Explorer. Not to mention the many new browsers that use Chromium as the rendering engine but offer unique user experiences like Samsung Internet or Brave.

In contrast, Microbrowsers are a class of User-Agents that also visit website links, parse HTML and generate a user experience. But unlike those traditional browsers, the HTML parsing is limited and the rendering engine is singularly focused. The experience is not intended to be interactive. Rather the experience is intended to be representational - to give the user a hint of what exists on the other side of the URL.

Creating link previews is not new. Facebook and Twitter have been adding these link previews in posts for nearly a decade. That used to be the primary use case. Marketing teams created backlog items to adopt different microdata formats: Twitter Cards for Twitter and Open Graph annotations for Facebook. LinkedIn likewise embraced both Open Graph and oEmbed tags to help generate the previews.
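As an illustration, a head section annotated for these preview formats might look like this (the titles, copy and URLs here are placeholders, not a real page):

```html
<head>
  <title>Microbrowsers Are Everywhere</title>
  <!-- Open Graph tags, read by Facebook and most chat platforms -->
  <meta property="og:title" content="Microbrowsers Are Everywhere">
  <meta property="og:description" content="Why link previews deserve a place in your build checklist.">
  <meta property="og:image" content="https://example.com/preview-card.jpg">
  <!-- Twitter Card tags -->
  <meta name="twitter:card" content="summary_large_image">
  <meta name="twitter:title" content="Microbrowsers Are Everywhere">
  <meta name="twitter:image:src" content="https://example.com/preview-card.jpg">
</head>
```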

As group chats and other collaboration tools have become more prevalent, we have seen many features from the big social media platforms emerge. Particularly in recent years we’ve seen the adoption of the link unfurling behaviour in these chat platforms. Rather than reinventing the wheel, each platform looks for pre-existing micro data to generate the preview.

The challenge now is that each communication platform parses and interprets this micro-data in different ways, and presents the information in differing ways.

2. If Microbrowsers are everywhere, why don’t I see them in my analytics reports?

It’s easy to miss the traffic from Microbrowsers. This is for a number of reasons:

First, page requests from Microbrowsers don’t run JavaScript and they don’t accept cookies. The Google Analytics <script> block won’t be executed, and all cookies will be ignored by the rendering agent.

Second, if you were to do a log analysis based on HTTP logs from your CDN or web stack, you would see a relatively small volume of traffic. That is assuming you can identify the User-Agent strings. Some of these Microbrowsers impersonate real browsers, and others impersonate Facebook or Twitter. For example, iMessage uses the same User-Agent string for all these requests, and it hasn’t changed since iOS 9:

User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_1)
AppleWebKit/601.2.4 (KHTML, like Gecko)
Version/9.0.1 Safari/601.2.4
facebookexternalhit/1.1
Facebot Twitterbot/1.0

Finally, many platforms - particularly Facebook Messenger and Hangouts use centralized services to request the preview layout. This is in contrast to WhatsApp* and iMessage where you will see one request per user. In the centralized consumer approach your web servers will only see one request, but this one request might represent thousands of eyeballs.
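Since this traffic is easy to miss, a rough first pass at a log analysis is simply to match known preview-fetcher tokens in the User-Agent string. Here is a minimal sketch in Python; the token list is illustrative, not exhaustive, and impersonating agents (like iMessage above) only show up via the tokens they borrow:

```python
# Sketch: classify a log line's User-Agent as a likely link-preview fetcher.
# These tokens appear in real preview-fetcher UA strings, but the list is
# illustrative only -- platforms change and impersonate one another.
MICROBROWSER_TOKENS = (
    "facebookexternalhit",    # Facebook, and agents impersonating it
    "Twitterbot",
    "Slackbot-LinkExpanding",
    "WhatsApp",
)

def is_microbrowser(user_agent: str) -> bool:
    """Return True if the UA string contains a known preview-fetcher token."""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in MICROBROWSER_TOKENS)

# The iMessage UA string quoted above matches via its borrowed tokens:
imessage_ua = (
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_1) "
    "AppleWebKit/601.2.4 (KHTML, like Gecko) Version/9.0.1 Safari/601.2.4 "
    "facebookexternalhit/1.1 Facebot Twitterbot/1.0"
)
print(is_microbrowser(imessage_ua))  # True
```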

3. Microbrowsers are probably more important than Googlebot

We all know the importance of having our websites crawled by search engines like Googlebot. These bots are the lifeblood of lead generation and of discovering new users.

However, the real gold for marketers is from word-of-mouth discussions. Those conversations with your friends when you recommend a TV show, a brand of clothing, or share a news report. This is the most valuable kind of marketing.

Last year, when assembling the data for Cloudinary’s State of the Visual Media report, I discovered a very prominent usage pattern over the USA holiday season. From Thanksgiving all the way to Black Friday, the rate of link sharing skyrocketed as group chats shared deals and insights.

Zooming out (and normalizing for time-of-day), we can see that there is a daily cadence of link sharing and word of mouth referrals. It probably isn’t a shock to see that we predominantly share links in Slack between Monday and Friday, while WhatsApp is used all week long. Likewise, WhatsApp is most often used during our ‘break’ times like lunch or in the evening after we put the kids to bed.

While the link preview is increasingly common, there are two user behaviours to balance:

Users can be skeptical of links sent via SMS and other chats. We don’t want to be fooled into clicking a phishing link, and so we look for other cues to offer validation. This is why most platforms display the preview while also emphasizing the website URL’s host name.

Skimming. I’m sure you’ve had the experience coming out of a meeting or grocery store to find a group chat with 100 messages. As you scroll to catch up on the conversation, links can easily be skipped. In this way, users expect the preview to act as a summary to tell them how important it is to visit the link.

4. Microbrowsers are not real browsers (they just play one on TV)

As I previously mentioned, Microbrowsers pretend to be a browser in that they send the right HTTP headers and often send impersonating User-Agent strings. Yet, there are several characteristics that a web dev should be aware of.

First, Microbrowsers try to protect the User’s privacy. The user hasn’t decided to visit your site yet, and more importantly, the user is having a private conversation. The fact that your brand or website is mentioned should just make your ears burn, but you shouldn’t be able to listen in to the conversation.

For this reason, all Microbrowsers:

don’t execute JavaScript - so your React application won’t work

ignore all cookies - so your A/B or red/green cookies will be ignored

some will follow redirects, but will quickly time out after a few seconds and give up trying to expand the link.

there won’t be a Referer HTTP header when the user clicks the link for the full browser. In fact, a new user will appear as ‘direct’ traffic - as though they typed in the URL.

Second, Microbrowsers have a very small brain and very likely don’t use an advanced network algorithm. Most browsers will use a tokenizer to parse the HTML markup and send requests to the network stack asynchronously. Better yet, browsers will do some analysis of the resources needed before sending the async request to the network.

Based on observational experimentation, most platforms simply use a glorified for loop when parsing the HTML and often request the resources synchronously. This might be OK on fast Wi-Fi, but it can cause inconsistent experiences on flaky connections.

For example, iMessage will discover and load all <link rel="icon" > favicon, all <meta property="og:image" images, and all <meta name="twitter:image:src" before deciding what to render. Many sites still advertise 5 or more favicon sizes. This means that iMessage will download all favicons regardless of size and then not use them if it decides to instead render the image.

For this reason the meta markup that is included is important. The lighter the content, the more likely it will be to be rendered.
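The single-pass, small-brained parsing described above can be approximated in a few lines. This is a hypothetical sketch, not any platform’s actual implementation: one linear scan over the markup, collecting every Open Graph tag it sees, first value wins.

```python
from html.parser import HTMLParser

class PreviewParser(HTMLParser):
    """Toy microbrowser: one linear pass over the HTML, collecting
    <meta property="og:*"> tags. First value for each property wins."""

    def __init__(self):
        super().__init__()
        self.og = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        prop = attrs.get("property", "")
        if prop.startswith("og:") and prop not in self.og:
            self.og[prop] = attrs.get("content", "")

# Hypothetical markup with a duplicate og:image, as many pages have:
doc = """
<head>
  <meta property="og:title" content="Microbrowsers Are Everywhere">
  <meta property="og:image" content="https://example.com/card-1.jpg">
  <meta property="og:image" content="https://example.com/card-2.jpg">
</head>
"""
parser = PreviewParser()
parser.feed(doc)
print(parser.og["og:image"])  # the first image encountered wins
```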

5. Markup Matters

Since Microbrowsers are simple-brained browsers, it is all the more important to produce good markup. Here are a few good strategies:

It’s almost 2020: advertise one favicon size. Remove all the other <link rel="shortcut icon" and <link rel="icon" references.

Based on observational experimentation, the most commonly recognized microdata tags for previews are the Open Graph tags. When the OG and Twitter Card tags are missing, the default SEO <meta name="description" is used. However, since the description is often a string of nonsensical SEO-optimized phrases, users’ eyes will likely glaze over.

On that note, use good descriptive text.

Provide up to three <meta property="og:image" images. Most platforms will only load the first one, while others (notably iMessage) attempt to create a collage.

Use <meta property="og:video* with progressive (not streaming) video experiences.

Don’t use UA sniffing to hide the <meta> tags. Sites like Amazon do this to show the microdata-annotated version of the site only to Facebook and Twitter. But this can cause problems for Microbrowsers that don’t use the same impersonation convention. The result is a plain link without a preview.

Use the opportunity to tell your product story or summarize your ideas.
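Putting those strategies together, a lean, preview-friendly head might look something like this (URLs, sizes and copy are placeholders):

```html
<head>
  <!-- One favicon is enough; microbrowsers may fetch every size you advertise -->
  <link rel="icon" href="/favicon-192.png" sizes="192x192">
  <!-- Descriptive, human-readable preview text -->
  <meta property="og:title" content="Widget Co: Ship dashboards in minutes">
  <meta property="og:description" content="A plain-language summary of what the page offers.">
  <!-- Up to three images; most platforms use only the first -->
  <meta property="og:image" content="https://example.com/summary-1.jpg">
  <meta property="og:image" content="https://example.com/summary-2.jpg">
  <!-- Progressive (not streaming) video -->
  <meta property="og:video" content="https://example.com/intro.mp4">
</head>
```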

Summary

As more of our conversations happen in group chats and Slack channels, link previews are an important way for you to engage users before they start the journey on your site. Unfortunately, not all websites present good or compelling previews. (And now that you know what to look for, you won’t be able to unsee bad examples - I’m sorry.) To help users take the leap and visit your site, we need to make sure that all our pages are annotated with microdata. Better yet, we can use these previews to create compelling visual summaries.

]]>
https://24ways.org/2019/microbrowsers-are-everywhere/
https://24ways.org/2019/microbrowsers-are-everywhere/Mon, 16 Dec 2019 06:47:00 +0100<![CDATA[The Money Men Who Enabled Adam Neumann and the WeWork Debacle - WSJ]]>

In early October, WeWork’s board of directors trickled into a brick building in lower Manhattan where the startup had an office. After they took their seats around the conference room table, Mark Schwartz started to vent.

“I’ve stayed silent too long,” the 65-year-old former Goldman Sachs Group Inc. partner told the six other men on the board, including WeWork’s co-founder and chairman, Adam Neumann.

Mr. Schwartz aired his frustrations about the state of the company, which was perilously low on cash after years of freewheeling spending and had become the butt of jokes on Wall Street, according to people familiar with the meeting.

No more fantasies, he said, as advisers and others looked on. Now, he said, they needed to make decisions that would save the company.

Even more remarkable than the content of Mr. Schwartz’s blistering rebuke was the fact that it came so late. The banker had stayed silent so long that the story was almost over.

It was a spectacular fall for the company that months before had been America’s most valuable startup.

Little of WeWork’s trajectory would have been possible were it not for the collection of veteran executives and financiers from the upper echelons of Wall Street and Silicon Valley who enabled Mr. Neumann, a charismatic 40-year-old with little prior business experience.

Mr. Neumann mesmerized them with his pitch, which offered a vision for the property-leasing company as a tech startup with limitless potential to transform how people work and live.

Investors poured capital onto Mr. Neumann’s business bonfire and ceded control, rarely pushing back with any force despite mounting problems and year after year of missed projections.

Masayoshi Son, the CEO of SoftBank Group Corp., who helped inflate WeWork’s valuation to $47 billion, pushed an already wild-spending Mr. Neumann to act bigger and crazier. JPMorgan Chase & Co. CEO James Dimon and other bankers, instead of injecting a dose of reality, spent years championing Mr. Neumann and the company as they battled for the coveted IPO assignment.

The outside board directors, all of whom had decades of experience in business and finance, voted for years to approve decisions by Mr. Neumann that paved the way for WeWork’s near collapse. Some of them had potential conflicts of interest themselves.

The directors on the board let Mr. Neumann personally buy stakes in buildings that he would lease to WeWork. They gave him long-term voting control of the company in 2014, and allowed him to sell and borrow more than $1 billion against his WeWork stake. They approved hundreds of millions of dollars for acquisitions of tech companies that were viewed by top executives as wasteful spending, with little relation to WeWork’s core business.

We Worth

WeWork’s valuation has plunged since the beginning of the year.

[Chart: WeWork valuation, 2013–2019, in billions of dollars. January: $47 billion, last venture-funding round. September: $15 billion to $20 billion, expected IPO valuation range. Oct. 22: $8 billion, SoftBank investment. Sources: Dow Jones Venture Source (valuations by funding rounds); staff reports (Sept. and Oct.)]

The end result didn’t just blow up $39 billion of the company’s value, roughly the value of Delta Air Lines Inc. It was a watershed moment for Silicon Valley. For years, investors salivated over all-powerful founders who promised disruption and demanded control. After WeWork’s spectacular flameout, investors have grown skeptical of the model.

In the moment, there was little debate following Mr. Schwartz’s remarks in the Oct. 3 meeting. The company needed funds to avoid running out of money by the second week of November.

Mr. Neumann, who had repeatedly skipped board meetings, including as the company was planning the IPO, urged the board to move quickly. They needed to save the company and that was all that mattered, he said.

The beginning

WeWork was born in the wake of the global financial crisis. In 2009, Mr. Neumann and Miguel McKelvey, a trained architect, had success with a small property-leasing business in Brooklyn. The next year they opened their first office in Manhattan.

The New York economy was reheating and young entrepreneurs flocked into its tiny SoHo location. It proved an ideal time to be hunting for startup financing, with housing and banks hobbled by the recession and interest rates low.

Mr. Neumann excelled at the fundraising game. He laid out a vision for a set of “We”-branded businesses, such as office renting, housing, banking and business services, that could make money off young entrepreneurs in a changing workforce.

Within a couple of years, Mr. Neumann had piqued the interest of Michael Eisenberg, then an Israel-based partner at the vaunted venture-capital firm Benchmark Capital. Mr. Neumann flew to Benchmark’s Silicon Valley office to pitch its partners. The conclusion: Many were skeptical of the business, but they loved Mr. Neumann, and figured he had the charisma and instincts to build a huge company. Benchmark led WeWork’s first, Series A round of venture funding, which totaled $17.5 million.

WeWork’s business model was to lease long-term and charge higher rates to short-term small-business clients. That meant revenue relatively quickly exceeded the costs of operating its spaces. This is relatively common in real estate, but it looked extraordinary compared with software and hardware companies, which typically require years of investment.

DAG Ventures, another Silicon Valley venture-capital shop, invested in WeWork at a $440 million valuation. Then came JPMorgan’s asset-management arm, at $1.5 billion. T. Rowe Price Group Inc. jumped in at a $5 billion valuation in late 2014. Six months later, Fidelity Investments followed at $10 billion.

The rush of money gave startup CEOs extraordinary leverage. For decades, investors were used to being able to fire founders and steer the direction of their companies. Now investors competed to show how they were “founder friendly.” Founders were lionized for having giant vision, inspiration and a little bit of crazy.

Mr. Neumann had all these traits, and his eccentricities only seemed to entrance investors even more.

In late 2015, WeWork was completing an investment round led by Beijing-based Hony Capital Ltd. that pushed its valuation to $16 billion. Mr. Neumann invited its CEO, John Zhao, to a party at 110 Wall Street, where WeWork was about to open its first WeLive dormlike apartment building.

Toward the end of the night, Mr. Neumann led others to the roof of the 27-story building. There, guests passed around tequila shots. Mr. Neumann picked up a fire extinguisher and set it off, spraying Mr. Zhao and others with white foam.

The deal went through. Mr. Zhao joined WeWork’s board in July 2016.

Mr. Neumann continuously said profitability was just around the corner. In reality, its losses were swelling far larger every year.

Rosy Forecasts

WeWork's net income projections were routinely higher than its actual haul.

[Chart: five-year company projections vs. actual net income by year, 2015–2021. Projections made in June 2015, March 2016 and March 2017 climbed as high as $908 million to $1.3 billion in annual net income; actual results sank to a loss of $2.2 billion. *2019 actual net income through 3Q. Source: internal company documents]

A presentation for prospective investors in fall 2014 projected the company would turn a $4.2 million operating profit for the year. When the year was through, just three months later, the company reported an operating loss of $88 million on $74 million of revenue, according to internal documents.

In fact, WeWork has had only one profitable year in its history: 2012, when it generated about $1.7 million in net income, internal documents show.

Keeping control

As investors poured in more money, Mr. Neumann’s grip on WeWork tightened.

To maintain control, as part of the round in which T. Rowe Price invested in the company, Mr. Neumann restructured WeWork’s stock so that each of his shares had 10 times the votes of a normal one.

As part of the same deal, an entity Mr. Neumann controlled sold $40 million of stock. He did it twice again in 2015, selling an additional $80 million. It was a tiny portion of his stake—he was worth around $3 billion on paper—but Silicon Valley investors normally hated such sales. Startup founders were supposed to stay aligned with investors until everyone could sell, usually in an IPO or sale.

Some directors urged more restraint. Bruce Dunlevie, Benchmark’s representative on the board, resisted the voting control change, telling Mr. Neumann and other members of the board that absolute power corrupts absolutely.

Mr. Neumann prevailed, winning over the full board on both the voting control and the stock sales.

Benchmark partner Bill Gurley is known for criticizing venture firms that give too much power to founders. Several of its companies—including Snap Inc. and Uber Technologies Inc., where Benchmark held board seats—have been criticized on the same issues. In 2017, Mr. Gurley helped push out Travis Kalanick, Uber’s co-founder and CEO, amid scandals and concerns about corporate culture.

Mr. Gurley and most of the other Benchmark partners increasingly viewed Mr. Neumann as similar to Mr. Kalanick—a rogue CEO who needed to be reined in, according to people who have discussed the matter with the firm.

In 2017, five partners from Benchmark flew in from the Bay Area to Manhattan to meet Mr. Neumann. They raised concerns about issues including missed projections and Mr. Neumann’s stock sales, a person familiar with the meeting said.

Among Benchmark’s partners, Mr. Dunlevie tended to be deferential to Mr. Neumann, causing tensions within the firm, people familiar with the dynamics said. He criticized some of Mr. Neumann’s actions, but also frequently extolled the CEO’s vision, comparing him to Amazon.com Inc. Chief Executive Jeff Bezos.

Some directors saw their roles as more akin to advisers, rather than watchdogs or guardians for other shareholders, given that Mr. Neumann effectively controlled the board, according to people who have spoken with them. Mr. Neumann’s potent voting stock gave him the right to replace them or pack the board to outvote dissenters. Voting against Mr. Neumann would make it harder to register criticism in the future, some reasoned.

Multiple directors also shared some of the potential conflicts for which Mr. Neumann was later criticized.

He had always been open about hiring friends and family. WeWork’s executive ranks included his wife, Rebekah Neumann, the company’s chief brand officer. Mr. Neumann once told staff the board strongly resisted hiring Ms. Neumann, but he persevered, telling directors they could have both Neumanns at WeWork, or neither.

At an executive retreat in Montauk on Long Island, Mr. Neumann once raised a glass in a toast “to nepotism,” attendees said.

Among board members, Mr. Zhao’s son got a job at WeWork, as did the daughter of Mr. Dunlevie, who wasn’t involved in her hiring, people familiar with the matter said.

Lew Frankfort, former CEO of Coach Inc., borrowed from WeWork to buy stock and exercise some stock options early—a move typically made to save on taxes.

Another director, Steven Langman, had a deal with WeWork that could prove highly lucrative. His private-equity firm, Rhone Group, became a co-manager of WeWork’s real-estate fund business, which bought properties to lease to the company. Rhone was entitled to management fees and a percentage of profits on properties purchased.

Earlier this year, WeWork expanded its involvement in the real-estate fund business, diminishing the influence of Rhone—which initially had a 50% stake. In April 2019, WeWork gave 454,546 restricted shares to Mr. Langman “for his ongoing services to The We Company,” according to the prospectus. The award, granted over multiple years, would have been worth around $50 million at WeWork’s share price from the time. The share value has fallen by more than 80% since.

Ramping up

Into this freewheeling situation came SoftBank’s Mr. Son. From WeWork’s early days, speed was central to its narrative: Its goal was to build out more properties more quickly than any company ever. Mr. Neumann, who often spent meetings pacing the floor, was known for his frenetic energy.

Speed is an essential ingredient in Mr. Son’s narrative, too. He’s often said he decided to invest in Jack Ma, co-founder of Alibaba Group Holding Inc., within minutes of meeting him because of “the sparkles in his eyes.” (Mr. Ma and his wife, Cathy Ma, invested roughly $25 million in WeWork in 2016 as part of the round led by Hony, said people familiar with the investment.)

Mr. Son’s gut-based investment style became a hallmark of the $100 billion Vision Fund. The fund showered money on unprofitable startups, pushing up valuations many considered already overinflated. He allowed executives in his portfolio companies to cash out huge sums far before investors generated any returns.

Mr. Neumann said in an interview on CNBC this year that it took Mr. Son 28 minutes to make his initial decision in late 2016 to invest $4.4 billion in WeWork, time that included getting in and out of the car and touring the company’s headquarters.

Executives at SoftBank had looked at WeWork before—and passed.

After the 2016 initial agreement between Messrs. Son and Neumann, many on SoftBank’s team panned the deal, upset they were committing so heavily to a real-estate company. Ultimately, Mr. Son made the decisions about whether or not to invest.

In March 2017, for the first time after their handshake agreement, Mr. Neumann and a contingent of WeWork executives and advisers flew to Tokyo to meet Mr. Son and his team.

In a late-night meeting days before the trip, Mr. Neumann insisted they arrive with a gift for Mr. Son, and deemed that a giant artwork hanging in his own office, a collage made of electronics and other objects that spelled out WeWork, would be appropriate. It was too large to fly with the group on a private jet, so his team dispatched their logistics courier to ship it. The carrier put it in a crate on a commercial jet, at a cost of roughly $50,000. WeWork executives saw it hung up in SoftBank’s office in a subsequent visit.

Grow faster

The board’s newest directors, Ron Fisher and Mr. Schwartz, joined in 2017, representing SoftBank. Before the SoftBank deal, WeWork’s revenue was roughly doubling annually, an astonishing pace for a company then seven years old. Some executives hoped it would slow, so WeWork could start to focus on losses and logistical problems.

Mr. Neumann made clear to staff that the company’s new backer wanted WeWork to grow faster, not slower. He frequently cited what he said was Mr. Son’s advice: Don’t worry about profitability and grab as much market share as possible as quickly as possible. He told friends and colleagues he knew he was crazy, but Mr. Son told him to be crazier.

At one meeting, Mr. Son told Mr. Neumann he shouldn’t be proud of WeWork’s lean sales staff, and that it should aim to have 10,000 salespeople, a giant number for a company that had fewer than 10,000 total employees at the time. Mr. Neumann told his deputies to expand the sales staff more quickly.

Money poured into expansion in China and other Asian countries, highly competitive markets where losses were large.

Mr. Neumann helmed an array of new initiatives and acquisitions that had little or no connection to WeWork’s core business. Purchases included event-planning site Meetup.com, search-engine-optimization company Conductor, and coding school Flatiron School.

WeWork also opened an elementary school in Manhattan in 2018 called WeGrow. Mr. Neumann told staff the project came about after he and his wife were unable to find adequate schooling choices for their five children.

Directors frequently raised concerns about the proposed acquisitions, questioning Mr. Neumann on why the company was expanding into the disparate areas. Nevertheless, WeWork spent more than $500 million in two years on tech-related companies, with board approval.

Mr. Neumann considered other deals, including an acquisition of Cushman & Wakefield PLC, one of the country’s largest commercial real-estate services firms, which currently has a market capitalization of around $4.2 billion. He made an offer to buy salad chain Sweetgreen Inc., recently valued at $1.6 billion. And WeWork came close to paying over $1 billion for facilities management company BGIS before backing out late in the process.

WeWork’s own facilities became increasingly opulent. The sixth floor of its low-slung headquarters was redone, with a large section just for executives, including an exercise room. Mr. Neumann’s office included a sauna and an ice bath.

Mr. Neumann wanted a big presence in San Francisco as well. WeWork leased offices with sweeping views in the new Salesforce Tower, and ordered giant openings to be cut in the floor to make way for airy staircases—an expensive maneuver. A fitness club was added, with an ice bath. The total costs exceeded $550 a square foot, roughly three times what WeWork normally spends on renovating an office.

Then there was the jet.

WeWork had been renting jets for Mr. Neumann since the days when WeWork was valued at just $5 billion, but he wanted an upgrade. The Gulfstream G650ER was top of the line, with 16 plush seats, high-speed internet and two lavatories, including one just for the crew.

Multiple investors, including some directors, questioned the necessity of it, but Mr. Neumann was insistent. The company made the $63 million purchase, and the jet was delivered in summer 2018.

The business itself was straining to keep up its punishing growth rate as it grew larger. WeWork executives worried that in places such as Manhattan, where the company was already the largest private tenant, doubling WeWork every year could drive up the entire market for office rentals.

Facing tough deadlines to open multiple new buildings a week, staffers often shipped couches by air to arrive on time, which sometimes cost more than the couches themselves.

Employees were whipsawed by frequent design changes to make new offices more avant-garde. That meant rows of furniture just months old could go to waste. WeWork occasionally held sales at its New Jersey warehouse to clear out the older models, allowing employees to buy $1,000-plus midcentury-modern-style couches from brands such as Vitra for $100 or less.

When WeWork opened its first buildings in South Korea, it shipped thousands of mugs manufactured in China that failed to meet South Korea’s strict import laws. While those mugs sat in customs warehouses, WeWork bought thousands of mugs at higher prices in time for opening day.

Some investors were increasingly concerned with the business and its management, as well as stock sales by Mr. Neumann.

“We saw the valuation rise and the corporate governance erode,” said Eric Veiel, co-head of global equity at T. Rowe Price. Amid concerns over issues like Mr. Neumann’s purchases of property he leased to WeWork, the mutual-fund manager made clear to Mr. Neumann, WeWork management and the board it had grown sour on the company, he said.

“We sold as much as we possibly could,” he said, referring to two deals in 2017 and 2019, when SoftBank bought stock from existing investors.

More capital

By 2018, it was clear WeWork would need billions of dollars more to keep growing.

Though he was chairman, Mr. Neumann missed numerous board meetings throughout 2018, sending deputies instead. In at least one meeting, directors discussed the pace of growth.

Several directors told others they took comfort knowing WeWork would soon need to go public because of its need for more cash to keep growing. The public markets, they told each other, would help serve as a check on Mr. Neumann.

In board meetings, directors including Messrs. Schwartz, Dunlevie and Langman pushed Mr. Neumann to commit to a timetable for an IPO.

Messrs. Neumann and Son had other plans. In mid-2018, they started talking about a giant deal in which SoftBank would buy a majority stake in WeWork for roughly $20 billion, including buying out existing investors.

It wasn’t to be. SoftBank’s stock plunged amid a broader fall in technology stocks and concerns over the potential acquisition, while key SoftBank investors, including Saudi Arabia’s Public Investment Fund, opposed it. On Christmas Eve, Mr. Son told Mr. Neumann the deal wouldn’t work.

Mr. Neumann took WeWork’s jet to Hawaii and met Mr. Son in an attempt to come up with an alternative. Over breakfast, Mr. Son agreed to invest $2 billion, bringing the size of WeWork’s latest round of financing to $6 billion, and SoftBank’s total to $10 billion. The two men agreed the company’s valuation would be $47 billion, although people close to the deal say they never saw a clear explanation of how that number was determined. The deal called for $1 billion to go to buying shares from existing investors, allowing some on the board to sell.

Mr. Neumann said in a January interview on CNBC that the funding from SoftBank was “above and beyond what we need to fund the company for the next four to five years.”

The company would nearly run out of cash in November. WeWork ended up on a path to burn more than $3 billion for 2019.

Mr. Neumann told the Journal earlier this year that watching Mr. Son do his math was “beautiful to see.”

Going public

In need of more funding, WeWork began to turn to an IPO, even though Mr. Neumann felt more comfortable in the private markets.

Bankers up and down Wall Street had been wooing him for years with the hope of an eventual IPO, where they would win millions of dollars in fees and the prestige of bringing a giant company to the public markets.

Closest to the company had always been JPMorgan and Goldman Sachs, both WeWork investors. Mr. Neumann referred to JPMorgan’s Mr. Dimon as his personal banker.

It was almost literal: JPMorgan led a $500 million credit line to Mr. Neumann and lent another $97 million in other forms of debt, largely mortgages with low rates on his many homes. Mr. Dimon once ordered his bank to mimic some of WeWork’s office designs after a tour of a WeWork with Mr. Neumann.

In theory, investment bankers can provide prospective IPO companies practical expertise on the rigors of life as a public company owned by pension funds and individual investors.

In practice, the bankers supercharged WeWork’s visions of grandeur. They pitched an extraordinarily optimistic picture, giving Mr. Neumann and other executives more confidence in WeWork’s growth-heavy, loss-heavy strategy. JPMorgan told WeWork it thought the company would be worth as much as $60 billion, which was lower than estimates from other banks. Mr. Neumann said that wasn’t aggressive enough, a person familiar with the matter said.

Bankers at Goldman Sachs referenced Mother Teresa and Bob Marley in their pitch presentation. One slide was titled “Your path to $1 trillion,” referring to a target of a $1 trillion market capitalization within a decade or so.

Comparable companies, Goldman said, were Salesforce.com Inc., Amazon, Alibaba, Facebook Inc. and Alphabet Inc. The difference between WeWork and this cohort of companies, Goldman Sachs’ pitch deck said, was: “You are scaling faster.”

One slide said only, “Growth is paramount.”

Mr. Neumann and other executives began using the projections to justify WeWork’s $47 billion valuation to some employees and outsiders.

There were warning signs that public-market investors would be wary. WeWork executives and its bankers were aware that T. Rowe Price, a significant IPO investor, wouldn’t be investing in the offering. The fund’s co-head of global equity, Mr. Veiel, said they knew for years they wouldn’t invest, saying there was “mutual disinterest.”

As IPO preparations heated up, Mr. Neumann became distracted by surfing, a passion of his that became increasingly blended into the fabric of the company.

After spending part of the winter living in his house in Marin County, Calif., in early 2019, he moved back to New York, and relocated his Hawaii-based surf instructor and his family there, too. Mr. Neumann paid for their apartment in Manhattan, and some of the instructor’s children attended WeGrow, people familiar with the arrangement said.

Throughout the year, Mr. Neumann made surf trips to the Dominican Republic and the Maldives. During a week in early June, WeWork’s company plane made two trips between Costa Rica and New York.

Meanwhile, Mr. Neumann kept up surfing from the Hamptons and Montauk over the summer. Executives from WeWork and bankers and advisers including lawyers from Skadden, Arps, Slate, Meagher & Flom LLP worked with Mr. Neumann and his wife on IPO-related documents and presentations at their homes there.

Mr. Neumann oversaw a complex legal restructuring of the company that gave him and a cadre of other executives stock compensation with more favorable tax treatment than other employees at the company—a move approved by the board.

The mood changed drastically in mid-August.

SHARE YOUR THOUGHTS

How much control should investors give to a company’s founder? Join the conversation below.

After WeWork made its IPO paperwork public, potential investors, analysts and the media panned WeWork for its growing losses and lack of a path to profitability, and for Mr. Neumann’s string of conflicts. The language used to describe the company was widely derided. The prospectus was dedicated to the “energy of we,” and the company’s mission statement was to “elevate the world’s consciousness.”

The reaction sent WeWork’s expected valuation plummeting and prompted Mr. Neumann’s financial enablers to speak up more forcefully.

Mr. Neumann’s investment bankers from JPMorgan Chase and Goldman Sachs had been bracing for a rough response. While they were bullish in pitches months earlier, weeks before the IPO filing was made public, they warned Mr. Neumann that his unusual ties to the company and other governance decisions could cut the company’s stock price.

By the end of August, weeks before the IPO was planned to launch, WeWork’s valuation was expected to be less than half the $47 billion mark from January.

At Mr. Son’s behest, Mr. Neumann took the company jet to Tokyo, where Mr. Son argued to delay the offering, saying WeWork clearly wasn’t ready. Mr. Neumann rebuffed Mr. Son, saying he would push ahead.

As Mr. Neumann prepared to leave, Mr. Son offered some parting advice, according to people familiar with the conversation: This is going to be bad for you and bad for the company.

Investor search

In the days that followed, Mr. Neumann scoured the globe for others to commit to the IPO.

On Sept. 3 in London, he met Yasir al-Rumayyan, the head of Saudi Arabia’s sovereign-wealth fund, a big investor in SoftBank’s Vision Fund, according to people familiar with the meeting. Mr. al-Rumayyan didn’t invest.

Facing criticism that WeWork had no female directors, Mr. Neumann announced he would add Frances Frei, a professor of technology and operations at Harvard Business School, to its board.

Ms. Frei had been employed as a consultant to help improve management, including encouraging gender equality in hiring and setting up internal training programs, and her firm was given a three-year contract valued at roughly $5 million, including stock options. Some executives were frustrated by her use of private aircraft to travel from Boston to WeWork’s headquarters in Manhattan at WeWork’s expense.

The board, though, was annoyed Mr. Neumann hadn’t told them about the addition until after it was done. Directors vented to each other at a board meeting that followed—one in which Mr. Neumann was again absent. Soon after, Mr. Langman confronted Mr. Neumann, telling him his disengagement with the board was unacceptable.

Mr. Neumann showed up at the next meeting days later, apologized and pledged to attend.

In another meeting in WeWork’s headquarters, bankers from JPMorgan and Goldman, his main lawyer from Skadden Arps and several senior executives discussed more changes. Mr. Neumann initially said he didn’t want to do anything further. Two of JPMorgan’s bankers on the deal, Michael Millman and Noah Wintroub, told Mr. Neumann that the company had no chance of going public without changes.

Mr. Neumann eventually relented. The group spent hours, stretching long into the evening, getting Mr. Neumann to agree to everything.

The changes included a promise to appoint a lead independent director by the end of the year, halving his voting rights to 10 votes per share from 20, and eliminating a provision in which his wife, Rebekah Neumann—also a WeWork co-founder—would play a role in choosing Mr. Neumann’s successor.

That night, Mr. Neumann’s co-founder, Mr. McKelvey, called Nasdaq executives to tell them WeWork planned to list on their exchange when it debuted roughly two weeks later.

As bankers surveyed investors, it was clear the offering still might not have enough demand for the more than $3 billion WeWork wanted to raise—which was also necessary to gain access to another $6 billion in debt.

On several occasions after the prospectus was filed, Mr. Wintroub, one of the JPMorgan bankers, told Mr. Neumann that he needed to stop using marijuana and take the IPO process seriously.

Postponed

On the afternoon of Sunday, Sept. 15, bankers from JPMorgan and Goldman Sachs gathered at WeWork’s headquarters to discuss setting a potential price range for shares in the IPO, ahead of the kickoff of a two-week sales pitch to potential investors the following day. Mr. Neumann was expected to sit down with the bankers but never did.

Instead, he spent the day in another part of his company’s headquarters filming his portion of a video that would be used at all the pre-IPO investor meetings. Mr. Neumann had canceled numerous previous film shoots. The process didn’t wrap up until nearly midnight and was mixed with tequila and vodka shots, largely drunk by WeWork executives, as the night wore on.

By late Monday afternoon, at the urging of JPMorgan and Goldman Sachs’ bankers in a meeting at JPMorgan’s headquarters, Mr. Neumann agreed to postpone the IPO.

During a series of meetings that day with Mr. Dimon in attendance, bankers said investors were particularly concerned about Mr. Neumann and suggested he consider stepping down. Mary Erdoes, the bank’s asset-management chief, said many investors thought an IPO was untenable if he remained at the helm. Mr. Dimon and Goldman’s key banker, David Ludwig, pushed him to commit to other governance changes, including getting rid of his voting control.

Mr. Neumann was noncommittal. But the stage was set for his ouster.

On Wednesday, the Journal story was published, sparking chatter at a three-day meeting in Pasadena, Calif., run by SoftBank. Investors and executives at companies backed by SoftBank urged Mr. Son to move against Mr. Neumann.

Over the weekend, he did. His two board members, as well as others, pushed for Mr. Neumann’s ouster. Mr. Dimon added to the pressure on Sunday, advising Mr. Neumann the offering couldn’t go ahead if he stayed on as CEO. At a dinner that night, Messrs. Eisenberg, Dunlevie and Langman urged him to step down.

By the end of the weekend, it was clear to members of the board that Mr. Neumann would relinquish his role. If he didn’t give up his post, the company would run out of money and his stake could be worthless. When the board met without Mr. Neumann Monday morning, they largely spoke about who would lead the company.

He gave up control to two lieutenants, Artie Minson and Sebastian Gunningham, and board members began descending on the company to help lead its operations. The company hired prominent banker Peter Weinberg as an adviser to help the board sort through options.

SoftBank executives put together a rescue package with $5 billion in new financing that would value WeWork around $8 billion. Nearly $40 billion of valuation had vanished.

Mr. Neumann was still chairman and, given his voting shares, still had effective control of the company. To entice him to step aside, SoftBank offered a $185 million consulting fee and boosted the price it was offering to buy out existing shareholders to $19.19 a share, from $17.

Mr. Neumann remains a board observer. As part of his consulting fee, he promised not to start a competitor for four years.

One final inducement involved a large bill for personal travel on the jet. After a tally of surf vacations and other jaunts, plus some additional personal expenses, Mr. Neumann owed WeWork $1.75 million at the time of his ouster, according to shareholder documents.

As part of the deal, which was approved by the board, WeWork forgave the debt.

—Phred Dvorak contributed to this article.

—Photo Illustration at top by David Vogin; Photos: Bloomberg News, Reuters (3), Getty Images

]]>
https://www.wsj.com/articles/the-money-men-who-enabled-adam-neumann-and-the-wework-debacle-11576299616
https://www.wsj.com/articles/the-money-men-who-enabled-adam-neumann-and-the-wework-debacle-11576299616Mon, 16 Dec 2019 06:47:00 +0100<![CDATA[Untitled (https://www.motherjones.com/media/2019/12/bloomberg-just-bought-citylab-and-put-half-its-reporters-out-of-a-job/)]]>

Rick Scuteri/AP

Journalism has had a rough year full of layoffs across the country, and Bloomberg Media, the company owned by the latest entrant in the 2020 Democratic primary, just made life worse for a handful of reporters.

Earlier this week, Bloomberg Media announced its first acquisition in more than a decade, buying CityLab, an offshoot website of the Atlantic that reports on a variety of areas—development, housing, transit, the environment—from an urban policy perspective. And as part of that purchase, it appears half of the website’s editorial staff could lose their jobs. According to several CityLab staffers who spoke to Mother Jones on the condition of anonymity, they have been told by Atlantic management that after the sale is completed at the end of this month only seven people will make the transition—not enough jobs for CityLab’s 16 employees: 13 full-time staff and three contractors. The staffers were informed that they each would have to interview with Bloomberg Media management this week to keep their jobs.

Michael Bloomberg’s presidential campaign redirected requests to the company, and Bloomberg Media declined to comment. The Atlantic did not respond to an email from Mother Jones.

In today’s grim media landscape, the announcement of a publication’s sale is usually packaged with news of layoffs, but the initial reports of Bloomberg Media’s acquisition of CityLab didn’t mention any and hinted at a bright future. Adweek, which first reported on the sale, said that CityLab would “continue to operate as a standalone website and brand” under the new ownership, and Michael Finnegan, Atlantic Media’s president, said in a statement that Bloomberg Media would be an “ambitious new owner” for the site, committed to carrying out its mission.

But none of the details of Bloomberg Media’s acquisition of CityLab mentioned the future of the staff. When CityLab’s employees were briefed by Atlantic management on the sale on Tuesday, they were told that, as part of the acquisition, Bloomberg Media would only be bringing seven people from their staff on board. (CityLab’s masthead lists 15 people, but at least three of those people, and one not listed on the masthead, are full-time contractors, according to a CityLab staffer.) In that same meeting, a CityLab staffer says they were also given severance information by Atlantic management in the event that they are not hired by Bloomberg Media or do “not want to go forward with that [interview] process.” As part of the transition, Bloomberg Media executives have been conducting crash interviews this week with the current CityLab staff who wish to be considered for one of the seven positions.

“Over the last few days, most of the staff has received comments [from friends and on Twitter] saying ‘Congratulations!’ and asking if we’re moving to New York or what our new job title will be,” says one of the CityLab staffers Mother Jones spoke to. “But the reality is most of us are being laid off.”

As his nascent presidential candidacy revs up, Michael Bloomberg has held onto his 88 percent ownership stake in the company he founded, Bloomberg LP, though he has stepped down as its CEO. That has created complications for the company’s news division. In a newsroom memo sent out last month, Bloomberg News Editor-in-Chief John Micklethwait said that because of the conflict of interest, the news organization won’t do any in-depth investigative reporting of Bloomberg or the other Democratic presidential candidates. The policy is consistent with comments Bloomberg made last year about the possibility of a presidential run and the potential conflicts of interest that might arise from his business holdings. “I don’t want the reporters I’m paying to write a bad story about me,” he said on Radio Iowa. “I don’t want them to be independent.”

Last week, in response to the Bloomberg News memo that leaked, Bloomberg doubled down on the sentiment, telling CBS This Morning’s Gayle King that Bloomberg News employees would have to just deal with it. “They get a paycheck. But with your paycheck comes some restrictions and responsibilities,” he said. Bloomberg News’ new editorial policy on covering 2020 Democratic presidential candidates, as well as Bloomberg’s comment to King, has irked enough outlets that use Bloomberg News as a wire service that they’re reconsidering using its stories. In addition, Donald Trump’s reelection campaign said it won’t give Bloomberg News reporters credentials to cover campaign events because of its new policy, claiming that it exposes an open bias against the president. Meanwhile, several Bloomberg News editorial staff have taken leaves of absence to work for Bloomberg’s presidential campaign.

It’s unclear if CityLab under Bloomberg Media ownership would be immune from the Bloomberg News directive. Justin B. Smith, the CEO of Bloomberg Media, told Adweek that the decision to purchase CityLab had “no relation whatsoever” to the Bloomberg News directive to not investigate Bloomberg or any other Democratic candidate. In the near-decade of CityLab’s existence, it has done plenty of reporting—both positive and negative—about Bloomberg’s tenure and legacy as New York City’s mayor. Since announcing his presidential campaign, Bloomberg hasn’t said anything about distancing himself from his company, but in that Radio Iowa interview last year, he said he would completely divest himself from his media empire if he ran for president, either placing it all into a blind trust or selling it, if possible. So far, neither has happened, and Bloomberg has not mentioned either possibility since announcing his candidacy.

Some CityLab staffers say they have concerns about the Michael Bloomberg-Bloomberg News situation. “This would affect CityLab writers specifically in different ways,” says one staffer, explaining that some of CityLab’s staff writes about the specific plans that presidential candidates put forward as it relates to their areas of coverage, like housing, infrastructure, and other city-specific issues.

It makes sense that Bloomberg Media would want to purchase CityLab. For years, the Atlantic partnered with Bloomberg Philanthropies to host an annual CityLab conference. But the site started to show signs of struggle earlier this year when the Atlantic laid off CityLab’s dedicated business staff. “The writing’s been on the wall for a long time,” one of the staffers said about CityLab’s future with the Atlantic. Another staffer said that during the meeting in which Atlantic management announced the sale to Bloomberg Media, management made it clear “that the company is really focusing on the Atlantic and that they just feel that they can’t focus and invest in CityLab to make it robust.” (CityLab partners with Mother Jones and other news outlets through the Climate Desk collaboration.)

For CityLab’s staff, there’s a lot of uncertainty and anxiety about their future with Bloomberg Media. “We’re really proud of the work we do, and we really want to be able to keep doing it,” one staffer said. “I feel like whatever my outcome is I do feel some relief knowing that there is a next phase of CityLab coming up. And I’d say at this point I feel cautious optimism about it.”

]]>
https://www.motherjones.com/media/2019/12/bloomberg-just-bought-citylab-and-put-half-its-reporters-out-of-a-job/
https://www.motherjones.com/media/2019/12/bloomberg-just-bought-citylab-and-put-half-its-reporters-out-of-a-job/Mon, 16 Dec 2019 06:47:00 +0100<![CDATA[Cache of Crypto-Jewish recipes dating to Inquisition found in Miami kitchen | The Times of Israel]]>A few years ago, Genie Milgrom came across a treasure trove of old recipes stashed away in her elderly mother’s kitchen drawers. There were hundreds of them — some in tattered notebooks, others scribbled on crumbling scraps of paper.

Upon closer examination, it became apparent to Milgrom that these were the handwritten notations of generations of women in her family. The recipes had traveled as an intact, ever-growing collection from Spain to Portugal to Cuba to the United States, reflecting not only the lives of Milgrom’s ancestors, but also the hidden heritage they had for the most part unknowingly safeguarded since the time of the Spanish Inquisition.

‘Recipes of My 15 Grandmothers’ by Genie Milgrom (Gefen Publishing House)

Milgrom, who grew up devoutly Roman Catholic in Havana and Miami, has Crypto-Jewish roots. Her ancestors were Jews who practiced Judaism in secret while outwardly living as Christians to avoid being expelled, tortured, or killed by the Church. They were Crypto-Jews until the late 17th century, and lived as Catholics from then on. Through a decade-long, intense genealogical search, Milgrom discovered that she has an unbroken Jewish maternal lineage going back 22 generations to 1405 pre-Inquisition Spain and Portugal.

As a girl, Milgrom didn’t think to question the idiosyncratic customs her mother and grandmothers practiced in the kitchen.

Recipes didn’t mix milk and meat, eggs were always cracked into a separate bowl and inspected for blood before use, and rice and leafy green vegetables were washed carefully and checked for insects. Curiously, some recipes called for potato or corn starch instead of wheat flour. And perhaps most unusually, Milgrom was instructed by her Spanish-born grandmother that when preparing a large batch of dough, one had to always pull off a small piece, wrap it in foil, and throw it in the back of the oven to burn.

Some of the hundreds of pages of recipes passed down through the generations of Genie Milgrom’s Crypto-Jewish family. (Courtesy of Genie Milgrom)

“She told me it was for good luck,” Milgrom, 64, told The Times of Israel during a recent interview at a Jerusalem hotel.

When Milgrom was in her 30s, as a young divorcee she felt drawn to Judaism and she converted to Modern Orthodox Judaism. She married a Jewish man and then discovered she might have Crypto-Jewish heritage. She realized that her family’s unique culinary habits were actually related to kashrut — the Jewish dietary laws — and other commandments surrounding food preparation. For instance, tearing off a bit of dough and burning it is called hafrashat challah (separating challah), a Jewish religious commandment stemming from a biblical sacrifice.

Dulces en Almibar, fried sweets in a sugar syrup. (Courtesy of Genie Milgrom)

“Recipes of My 15 Grandmothers” presents 10 chicken dishes, 8 meat dishes, 3 fish dishes, 6 side dishes, 6 sauces, 42 desserts, and 7 beverages culled from the hundreds of recipes Milgrom found in her mother’s Miami kitchen. All the recipes included are kosher and labeled as either meat or dairy, with a note on whether they are also kosher for Passover.

Sorting, grouping, cross-referencing and translating the handwritten recipes from Spanish was challenging. But then came the monumental task of testing the recipes and adjusting them for modern and kosher-certified ingredients. For this, Milgrom enlisted some 50 friends, colleagues and family members to help. These included rediscovered long-lost cousins in Fermoselle, the village in western Spain to which Milgrom has traced her maternal lineage. These relatives are still making some of the same dishes Milgrom inherited from her grandmothers.

Rosquillas Dora, a dessert similar to a cinnamon and sugar donut. The recipe is from Milgrom’s paternal grandmother, who was born in Costa Rica. (Courtesy of Genie Milgrom)

Measurements proved to be a considerable challenge for many of the recipes. Archaic and colloquial formulations called for such things as: “an egg-full of oil,” “a glass-full of water,” “a tablespoon of egg,” “a little bit of sugar,” and “as much flour as it will hold.”

“One measurement was ‘a small woman’s hand of flour.’ It’s a good thing I have really small hands,” Milgrom joked.

One of the biggest complaints of the testers was that the dishes came out too eggy when using the number of eggs listed in the original recipes. “Clearly their eggs were smaller than the ones we have today,” Milgrom surmised.

Almost all the recipes reflect the agricultural environment of Fermoselle, which sits close to the border with Portugal and is not far from the cities of Zamora and Salamanca. Almonds, olive oil, garlic and anis (an anise-flavored liquor) appear as ingredients throughout the cookbook.

Genie Milgrom’s family (the Ramoses and Medinas) in Havana, Cuba in the 1940s. This was prior to the Cuban Revolution and their emigration to the US. (Courtesy of Genie Milgrom)

At the same time, the recipes convey the journeys Milgrom’s ancestors took from Spain to the New World. Her mother’s side migrated from Spain to Portugal to Cuba. Milgrom’s father’s side moved from Spain to the Canary Islands to Colombia to Costa Rica, with only a few family members eventually continuing on to Cuba. When she was four, Milgrom fled Cuba to the US with her family following the Cuban Revolution.

Once the family arrived in the tropics, new ingredients started appearing as either additions or replacements for ingredients that had been used in Spain. The anis that had flavored so many desserts (especially the Almibar sugar syrup for drizzling on sweets) was replaced with rum. Snapper took the place of cod. Avocado and turbinado sugar started popping up in recipes, and plain beet salad became beet and orange salad.

Genie Milgrom’s great-aunt Tia Paulita, who was born in Spain in the 1880s. It is thanks to her that many of the recipes were written down, collected, and passed down. (Courtesy of Genie Milgrom)

The family’s immigration to Cuba at the turn of the 20th century heralded the inclusion of pork. Before that time, the “other” white meat was conspicuously absent from the recipes handed down from generation to generation. However, one dish that was among the oil-stained pages found by Milgrom indicated that her Crypto-Jewish ancestors took pains to make it appear as though they were eating pork while still in Spain.

Milgrom’s great-aunt Tia Paulita was born in the 1880s, never married and remained in Spain. She kept alive a highly unusual recipe that had been passed down through the generations. It was finally written down by the family on the day she died in Madrid in 1936. The recipe is for Chuletas, a unique Crypto-Jewish presentation for a French toast creation that combines milk and bread to look like a pork chop. Chuleta is Spanish for “pork chop,” but this faux version is really a sugary-sweet treat.

“[Chuletas] would have fended off those trying to catch the new converts to Christianity who were hauled off to an Inquisition prison for not eating pork. This is the best look-alike to a pork chop that I have ever seen,” Milgrom wrote.

Genie Milgrom holding a plate of Chuletas. (Courtesy of Genie Milgrom)

Those hoping for beautiful color photographs of these unique dishes will be disappointed. Although the lack of images in “Recipes of My 15 Grandmothers” is regrettable, it is offset by the heartwarming and informative narratives Milgrom provides for each recipe. She writes about the history of each dish, as well as the challenges faced by her volunteer test cooks and bakers in trying to perfect them.

The author of four books (each published in both English and Spanish) and a lecturer and educator, Milgrom lives in Miami but travels widely for her latest initiative: The Converso Genealogy Project. It is a joint venture with Israel-based online genealogy platform MyHeritage aimed at digitizing all the extant Inquisition records worldwide.

The busy Milgrom is also finishing up work on a new kosher Cuban cookbook, titled “Salsa,” which she believes is the first of its kind.

In the meantime, she is contemplating how to best preserve the original, fragile recipe pages with her many grandmothers’ handwriting. They are special documents, which differ from the kind she has generally encountered in researching her family’s past. In these, the women — and not the men — are front and center.

“When I make this food and smell the aromas that I know they also smelled, it connects me so strongly with the grandmothers,” said Milgrom. “Knowing that I’m eating what they ate brings me closer to my heritage.”

I waited until the day before Rosh Hashanah to make this recipe myself as I felt that with all the dark rich fruits and honey it was the perfect one for the New Year Holiday and I was right! I was rushed because there was so much cooking to do but it was worth it once I got this cake into the oven! In my whole life I have never had my house full of the most amazing and spicy aromatic scents! The cake itself is rich but not overpowering and I drizzled dark honey all over. A keeper for the holidays for sure! I have already included my modifications in the recipe below. Enjoy!

Dark Fruit Cake, a great dessert for Rosh Hashanah. (Courtesy of Genie Milgrom)

½ cup margarine or butter
1 cup turbinado sugar
3 large eggs, separated
¼ cup dark honey
2 cups flour
2 teaspoons cinnamon
2 teaspoons allspice
½ teaspoon nutmeg
½ teaspoon clove powder
1 cup dark raisins
¼ cup grape juice
½ teaspoon baking soda
1 tablespoon hot water
1 ½ pounds dried fruit, cut up small (apricots, figs, cherries, dates and other fruits may be used; I did not use candied fruit, as I felt it might be too sweet, so I used dried fruit instead)

Mix together and sift the flour, allspice, cinnamon, nutmeg and clove powder three times. Beat the sugar and butter/margarine together, making sure the mixture is creamy. Beat the egg yolks and honey into the sugar and butter/margarine mixture. Whip the egg whites separately and fold them in. Add 1 ¾ cups of the flour slowly until well mixed. Toss the cut-up fruits in the remaining ¼ cup flour and add to the mixture. Add the grape juice. Dissolve the baking soda in the hot water and add to the mixture. Grease a large 9 x 11 rectangular glass mold and cover the bottom with greased parchment paper. Put the mixture into the mold and bake at 300 degrees for 1 ¼ hours, checking often. Drizzle dark honey over the top, with a little bit of liquor if desired.

]]>
https://www.timesofisrael.com/cache-of-crypto-jewish-recipes-dating-to-inquisition-found-in-miami-kitchen
https://www.timesofisrael.com/cache-of-crypto-jewish-recipes-dating-to-inquisition-found-in-miami-kitchenMon, 16 Dec 2019 06:47:00 +0100<![CDATA[Show HN: Happy Hues – Curated colors in context | Hacker News]]>

]]>
https://news.ycombinator.com/item?id=21780659
https://news.ycombinator.com/item?id=21780659Mon, 16 Dec 2019 06:47:00 +0100<![CDATA[I’m a 37-Year-Old Mom & I Spent Seven Days Online as an 11-Year-Old Girl. Here’s What I Learned.]]>

Note: This piece contains sexual content and descriptions of child sex abuse that could be disturbing to some readers. The messages, images, and conversations included here are real.

I’m standing in a bathroom with the hem of a pale blue sweatshirt bunched up under my chin as I weave an ace bandage tightly around my ribcage. The mirror serves as a guide as I wrap and wrap again the bandages around my sports bra, binding my chest. I step out of the bathroom and find our team waiting.

“This look OK?”

I get nods in response, and as Avery art directs, I pose my arms and tilt my head towards the camera. Normally, I’m not in clothes meant for a tween girl. Normally, I don’t have glitter polish on my nails and neon hair ties on my wrist. Normally, I’m dressed, I suppose, like your average 37-year-old mom. Jeans. Shirts that cover my midriff. Shoes with reasonable arch support.

Reid snaps a couple of photos of me. She scuttles off with Avery to our makeshift command center — a repurposed dining room now covered in cork boards and maps and papers and computer monitors. Will’s brow furrows as he quickly edits.

With the help of context — clothing, background, hair styling — and the magic of photo manipulation, we’re no longer staring at an image of me, an adult woman with crow’s feet.

I move to the kitchen to give him space. We’re gearing up for the heaviest part of the day, which we know from experience will be fast-paced and emotionally exhausting.

“It’s ready,” Will calls from the command center. A few of us gather around Will’s computer screen and examine.

Less than a year ago, Brian and I sat in a meeting where we wrestled with how exactly to talk to parents about online grooming. Back when Bark was a much smaller team, we encountered a particularly harrowing case of an online predator abusing a girl in middle school. She was only 12 years old, and this man was grooming her through her school email account, coercing her to send videos of herself performing sexual acts. We knew people like him were out there, but it floored us to see how quickly and deftly he was able to manipulate this child.

In 2018 alone, Bark alerted the FBI to 99 child predators. In 2019? That number is more than 300 — and counting. Each of these cases represents a real child experiencing real harm, and our challenge is to help parents and schools understand this new reality. But how do we tell stories without asking families to divulge too much? How do we explain online grooming to a generation who didn’t grow up with this danger? Numbers, though informative, are abstract and easy to gloss over.

I was frustrated by the problem we were facing, tapping my pen on the conference table and thinking out loud. “When parents think about predators,” I suggested to Brian, “they think about someone tossing their kid in a trunk and driving off. They don’t think about the unseen abuse that happens online. In a perfect world, we’d share a conversation from an actual predator, but that feels like traumatizing the victim all over again…”

I trailed off. We had gone in circles on this same concept.

“What if we just set up fake accounts ourselves to demonstrate to parents what can happen online?” Brian asked. I raised both eyebrows at the idea. Waited a beat to see if he was joking. He wasn’t.

That was nine months ago. Since then, we’ve created an entire team focused on the impromptu meeting Brian and I had in that conference room. We’ve formed continuous working relationships with the kinds of government law enforcement agencies that boast three-letter acronyms. We’ve had test runs, new hires, and countless other meetings. We’ve seen arrests and sentencings. We’ve provided testimony in court and invaluable information to investigations.

My own role changed to heading up this new special projects team. And to preserve the integrity of this project, this special projects team works largely behind the scenes and out of the limelight. We don’t appear on the company website, and our Twitter profile photos show inanimate objects instead of our actual faces. Brian and I are also the bridge between the team and law enforcement, with regular meetings and status updates, making sure we’re always working within not only their parameters, but those of the prosecuting attorneys. No one wants our hard work to go to waste because of missing evidence or even a hint of entrapment.

Here, now, in this media room, this isn’t our first rodeo. It’s not even our second or third rodeo. Over the past nine months, I’ve been 15-year-old Libby and 16-year-old Kait and 14-year-old Ava. I’ve been a studious sophomore contemplating bangs and a lacrosse player being raised by her aunt and an excitable junior eager for prom.

At this point, we’re seasoned veterans — but this is our first time using a persona this young. Tonight, my chest is tightly bound and my language reads significantly less mature.

Tonight, I am 11-year-old Bailey.

“Here we go,” I say to the room.

“You can do it, Sloane,” Reid says to me, patting my shoulder woodenly, but still assuredly. Reid’s chin is stern and she’s staring intently ahead. An attorney with a background in criminal law, Reid moved to the private sector and joined Bark when we launched this project. With a knowledge of law and a background in dealing with some gnarly crimes, Reid has been a welcome addition to the team. To an outsider, a shoulder pat might seem stiff, but from Reid, it feels like genuine care and support.

Pete — former military, now private security — who is quite literally three times my size, sits in the front living room. Tonight is certainly low risk, but on the days that have felt significantly scarier, he affords us all a little peace of mind.

I upload the photo to Instagram — a generic, innocuous selfie of Bailey with an ear-to-ear smile — and caption it.

v excitedd to see my friends this weekend at carly’s party! Ilysm!! followed by a string of emojis and a #friends hashtag

The photo publishes on Instagram and we wait quietly for something on the big screen to change.

This part never takes long. It’s always unnervingly fast.

At the beginning of the week, on the very first night as Bailey, two new messages came in under a minute after publishing a photo. We sat mouths agape as the numbers pinged up on the screen — 2, 3, 7, 15 messages from adult men over the course of two hours. Half of them could be charged with transfer of obscene content to a minor. That night, I had taken a breather and sat with my head in my hands.

Nine months of this, and we still continue to be stunned by the breadth of cruelty and perversion we see. I imagine this trend will continue tonight.

“Incoming,” Avery says, and we all look up at the TV. The Instagram notifications show that Bailey has three new requests for conversation.

“Hi! I was just wondering how long you’ve been a model for?”

“lol! im not a model,” I type quickly, hitting send.

“No!” he types, full of false incredulity. “You’re lying! If not, you should be a model. You’re so PRETTY.”

@ XXXastrolifer appears to be in his early 40s, but tells Bailey he’s 19. When she tells him she’s only 11, he doesn’t flinch.

The next message is from another man who greets Bailey harmlessly enough.

“Hi! How are you doing tonight?”

“Hi im good hbu”

“I’m doing alright, thank you. You are a very beautiful girl.”

I hear Josh next to me mutter. “Like clockwork.”

“Wow, thank u!”

“It’s true. I love your pictures on here. Does your mom and dad let you have a boyfriend yet?”

Bailey says no, but also, it’s not something they talk about a lot. I poll the parents in the room. They agree. Getting a boyfriend isn’t top of mind for an 11-year-old.

“Maybe I can be your Instagram bf if you would like? Up to you.”

I pause to respond to @ XXXastrolifer. The conversation ends like most of them do — in under five minutes, he sends Bailey a video of himself masturbating.

“Do you like that? Have you seen one of those before?”

I turn my attention back to @ XXXthisguy66, the would-be Instagram boyfriend. In a matter of minutes, it escalates from “An Instagram boyfriend means we can chat with each other, send selfies back and forth, and just be there for each other” to “Since we are together, are you ready to send sexy pics to each other?”

She’s 11, and doesn’t quite know what he means. He sends a photo of his erect penis, requests a photo of her shirtless, and assures her that he can teach her how to proceed.

“Well, a lot of boyfriends like it when their girlfriend give them a blowjob. Do you know what that means?”

“No I dont.”

“That means you take the dick in your hand and then you put your mouth over it and you suck on it like you would suck on your thumb.”

“I dont get it,” Bailey types back.

“You take my dick. You put it in your mouth, and you suck on it.”

“God,” Reid interjects, and I look at her. “A child’s first sex talk shouldn’t be with a man who wants to rape her.”

I turn back to the screen.

“But why?”

“Some girls like it, but it feels really good to the boy. That’s just what a boy likes. Now what a boy and a girl really like together is if I put my dick in between your legs and push it inside you. That is called sex. Or fucking.”

“Oh. I learned about sex”

“Whenever you get a chance, send me a picture of you without your shirt on, or send me a picture of in between your legs. I would really like that.”

“Like what kind of picture? In between my legs?”

“You know your vagina? Or some people call it a pussy. I would like to see it. Because that’s where my dick goes. But I would like to see your chest too.”

“I dont really have boobs yet,” Bailey replies. She doesn’t. She wears a training bra for the ritual and camaraderie of training-bra-wearing, but she doesn’t really need one. Not yet.

“It’s ok. I’m sure you still look great though. I would still suck on your nipples.”

“I’m not good at taking body pics.”

“It’s ok. Can you send me a picture of you sucking on your finger? That way I can imagine you giving me a blowjob like we talked about earlier. I’ll send you another pic of my dick.”

He does.

I exit the conversation with @ XXXastrolifer to see another nine requests pending. My phone rings loudly through the TV speakers, startling all of us. It’s an incoming Instagram video call from a new would-be abuser.

I make a snap decision to take it, drop my phone, and pull off my sweatshirt to swap it out for one with a hood. The room knows what I’m doing.

“Keep quiet, everyone,” Nathan states the unnecessary. With my hood up and the room dimly lit, I tilt my head to obscure my face and answer the call. Dominique on my left remains poised at the ready. A former costume designer, her skills with wigs and stage makeup are unmatched. Photos of my personas side-by-side don’t even look like they’re related. I’m Latina. I’m part Asian. I’m a blonde. I’m a redhead.

We’re greeted by a man with a British accent, breathing heavily and whispering into the phone.

“Hey. How are you? I want to see you.” He tilts his phone; he’s lying in bed, shirtless. I kick my voice up an octave.

“Ummmm. I’m shy.”

“No, baby, no. Don’t be shy,” he croons, his voice soft and persuasive.

“I can’t fucking take this,” Will says, and walks out of the room, shaking his head.

The rule at Bark is that we can all call a time-out whenever we want. We can step away whenever we need to. We can take a breather; we can schedule a therapy session. We can even rotate off the team.

That includes me, and I’m the (manipulated) face of our personas.

By the end of two-and-a-half hours, I’ve had seven video calls, ignored another two dozen, text-chatted with 17 men (some of whom had messaged her before, gearing back up in hopes of more interaction), and seen the genitalia of 11 of them. I’ve also fielded (and subsequently denied) multiple requests for above-the-waist nudity (in spite of being clear that Bailey’s breasts have not yet developed) and below-the-waist nudity.

The script we see is largely the same.

You’re so pretty.

You should be a model.

I’m older than you.

What would you do if you were here, baby?

Would you touch my dick if you were here?

Have you seen one before, baby?

Baby. They keep calling her baby without an ounce of irony.

Baby, you’re so beautiful.

Talk to me, baby.

I want you to put your mouth on my dick, baby.

Just get on video chat, baby.

Don’t be shy, baby.

Bailey is a child. Libby, Kait, Ava, Alessia, Lena, Isabella. All of my personas are — legally, emotionally, physically, intellectually. They have no agency, no ability to give consent. Perhaps society loves to point fingers and victim blame (What was she wearing??), but the answer is still the same. They’re all children. And like every case of abuse, a child is never at fault.

It’s just about midnight. I stopped doing video calls an hour ago, but my thumbs have been feverishly typing. My hair is pulled back into a ponytail and I’m chugging water like I just ran a half marathon. “The body keeps the score,” as the saying goes, and my body is calling uncle. The back of my t-shirt is damp, my eyes are bleary, my neck aches, and my heart is a little sick.

In the course of one week, over 52 men reached out to an 11-year-old girl. We sit with that stat as we soberly shut down the TV and the camcorder.

The work — while not necessarily physical — is emotionally taxing. Most of us on the team have kids, some of them the same age as the personas I play. It hits too close to home, but you don’t have to be a parent to be devastated by the predation of society’s most vulnerable.

It’s the end of the night, but every single conversation and photo still needs to be sorted, organized, and packaged to send to our law enforcement contacts. Any instance of child sexual abuse material is sent to NCMEC, the National Center for Missing and Exploited Children.

I text the law enforcement agent I work with most closely and give him a status update. We all pack up to head home, and frankly, we all look a little bruised. I can’t write this line without sounding completely self-aggrandizing, but the painful truth is that this work is hard and agonizing and very literally keeps us up at night. We could just stop. Pump the brakes. Divert our attention to the company’s day-to-day.

But the simple truth is that we know what’s at stake. The most obvious win — we’re helping identify sexual predators to the authorities and not only bring them to justice, but prevent them from abusing any more children. We’re also educating parents and schools about a nearly unbelievable reality that exists online. And from a technical standpoint, these stomach-turning conversations are training Bark’s artificial intelligence to become even better at monitoring for signs of grooming.

The brutal reality is that a predator doesn’t have to be in the same room, building, or even country to abuse a child.

I think about my kids. About my coworkers’ kids. About my own self decades ago as a young, uncertain, impressionable tween and then teen. I think about how I would have felt as Bailey. How I would have kept the abuses to myself, for fear of being shamed and blamed. How I would have suffered with it secretly and quietly. How I would have been a silent victim. How I don’t want that for any other kid — my own or anyone else’s.

The brutal reality is that a predator doesn’t have to be in the same room, building, or even country to abuse a child. And that’s what they’re doing — subjecting children to psychological and sexual abuse.

Knowing the pervasiveness of predation on the internet isn’t a burden. Not really. It’s a gift. One that helps us turn the tables on abusers. Our work has resulted in arrests of people who have shown the propensity and willingness to harm children. Technology has changed and so too have the methods by which predators find, communicate with, and harm children. If they can use technology to abuse children, we can use the same technology to help stop their crimes.

At home, I’m not Bailey. I’m a 37-year-old mom in wool socks, loading up the dishwasher and helping with homework. One of my kids is learning about sayings, proverbs, and idioms. She reads them aloud from her notebook. Bite the bullet. Through thick and thin. Kill two birds with one stone.

“Mom,” she looks at me, pencil poised mid-air. “Do you agree that ‘ignorance is bliss’?” I rinse off my hands and dry them with a dishtowel. I look at her jotting down notes. I am a biased parent, but she is a wonder. Full of joy and wit and curiosity, much like I’d imagine Bailey to be.

“No, honey. I don’t agree with that,” I say resolutely, pulling up a chair to sit next to her at the kitchen table. I lean on my elbow and peek at her homework assignment. “Knowledge is a gift.”

I repeat it to myself as I get back up and wipe down the counter. I mean it. And even on the worst days, I mean it.

Disclaimer: Out of an abundance of caution and due to pending criminal investigations, names — including the author’s — and inconsequential details have been edited for privacy and clarity.

Sloane Ryan runs the Special Projects Team at Bark, a tech company committed to child safety. You can email her at sloane.ryan@bark.us

]]>
https://medium.com/@sloane_ryan/im-a-37-year-old-mom-i-spent-seven-days-online-as-an-11-year-old-girl-here-s-what-i-learned-9825e81c8e7d
https://medium.com/@sloane_ryan/im-a-37-year-old-mom-i-spent-seven-days-online-as-an-11-year-old-girl-here-s-what-i-learned-9825e81c8e7dMon, 16 Dec 2019 06:47:00 +0100<![CDATA[Best Fiction Books 2019 — THE WHAT]]>

We spend all year reading copious amounts of fiction and have collectively torn through over a hundred books—ranging from lightweight beach reads to highly complex literature. Below is a list of our 2019 favorites in no particular order. We rarely get advance copies from publishers and buy most of our own books. Please click on our BUY links if you want to contribute to our book buying fund. Every little bit helps.

Happy Reading!

THE SURE THING.

The Dutch House by Ann Patchett. This is a story about the unbreakable bond between a brother and sister who grow up in a grand house in Philadelphia. Unfortunate circumstances herald a reversal of fortune which alters the course of their lives, intertwining their future while stirring up the past. It’s a contemporary take on the Hansel and Gretel fable but where the villains are also victims of circumstance. We’ve recommended this book to more than a dozen friends and it’s been loved across the board. BUY IT HERE

HISTORICAL FICTION.

The Flight Portfolio by Julie Orringer. Orringer's prose is like a butterfly wing: dazzling, intricate, precise, soaring. Her novel is the reimagining of the true story of Varian Fry, a dapper, hard-headed Harvard journalist who leaves his swank life in Manhattan to run a rescue network in the South of France that saved some of Europe's finest painters, poets, writers, and intellectuals, as well as 2,000+ other anti-Nazi and Jewish refugees during WWII. Varian's motley crew of unlikely operatives includes a glamorous American heiress, a thieving Legionnaire, and a housekeeper with a heart of gold. They manage to carry out their duties with panache and ample bottles of French wine and champagne in threadbare Marseille. And as they successfully smuggle Europe's intellectual capital to safer shores, they're faced with hard questions about the value and hierarchy of human life, love, and loyalty. BUY IT HERE

The Island of Sea Women by Lisa See. Before this book we knew little about Korean history and its violent struggles with Japan, China, and the U.S. The Island of Sea Women tells the story of friendship, betrayal, and the hardscrabble life between female deep-sea freedivers on the island of Jeju, spanning a period of Japanese colonialism in the 30s and 40s followed by WWII and the Korean War until the present day. BUY IT HERE

The Beekeeper of Aleppo by Christy Lefteri is a haunting story from the point of view of a gentle beekeeper who lives an idyllic life with his wife and son in the Syrian countryside. Many of us are aware (and deeply concerned) about the Syrian refugee crisis and other refugee crises around the world but few understand the harrowing journey these survivors must brave before, and if, they can find another home. It’s a critical read for us all as we enter an inevitable era of mass diaspora around the world caused by war, famine, climate change, and a struggle for survival. BUY IT HERE

HUMOROUS READS WITH AN EDGE.

Fleishman Is In Trouble by Taffy Brodesser-Akner captures all the angst, frustration, miscommunication, midlife crisis, and betrayal in a long-term marriage amidst the posh environs of New York's elite. This debut novel is being compared to hard-hitting male writers, a backhanded compliment literary reviewers often make when praising female authors. We prefer how Elizabeth Gilbert put it: “Just the sort of thing that Philip Roth or John Updike might have produced in their prime (except, of course, that the author understands women).” BUY IT HERE

Stay Up With Hugo Best by Erin Somers. Hugo Best is a David Letterman-like late night talk show host who has a penchant for the ladies that eventually catches up with him. Feeling unmoored after his forced retirement, he spontaneously invites a young, brooding comedy writer on his staff (whom he barely knows) for a long weekend at his luxurious compound in Connecticut. Despite their age gap, they find themselves at the same crossroads. Both share a mindset of desperation and uncertainty about what comes next and muddle through their despair together (think Lost In Translation). The exceedingly clever, comedic banter between the two is fun to read, and their struggle with the feelings of irrelevance and hopelessness that come at retirement age but also right before one’s thirties feels all too familiar. Erin Somers’ debut novel is definitely worth staying up for. BUY IT HERE

There’s A Word For That by Sloane Tanen. Set in London then Malibu, a wildly successful middle-aged (and then some) author of young adult fiction (think J.K. Rowling) is blindsided by an intervention staged by her agent and friends after some particularly unconscionable behavior at her milestone birthday party. When she lands in a California rehab, her journey takes an unexpected turn with a lovable mix of misfits, including her distant professorial son, a charming ex-husband, and a former child star. Drama ensues (and possibly salvation). Funny, painful, and acerbic as only the English can be. BUY IT HERE

The Editor by Steven Rowley is set in early-’90s Manhattan. It’s a positively delightful story about a gay, first-time author whose manuscript is singled out by the inimitable Jackie O., who becomes his editor at Doubleday, a position she held in real life from roughly 1977-1993. It’s a wonderful book about the endless mystery between mothers and sons; the Camelot days of the Kennedys and later the Clintons; forgiveness; and loving and supportive partnerships. Similar to Lillian Boxfish Takes A Walk, but with the added bonus of droll conversations and magical moments spent with Jackie (a fantasy I’m sure most people share). This is a book everyone will love and a must-gift for every Perennial on your list. BUY IT HERE

Join the 70,000+ readers who rely on us for WHAT to know, WHAT to buy, WHAT to try, and where to go. Everything to help you kick ass at whatever you do. Delivered to your inbox weekly. Subscribe here.

]]>
https://www.thewhatlist.com/daily/best-books2019
https://www.thewhatlist.com/daily/best-books2019Mon, 16 Dec 2019 06:47:00 +0100<![CDATA[The best sci-fi and fantasy books of 2019 - Polygon]]>The last year in science fiction and fantasy novels saw a welcome mix of debut voices and returning favorites. While genre fiction is often seen as an escapist form of literature (something that’s understandably needed in 2019), the best stories often address real-world concerns and anxieties.

As such, if there is any overarching theme linking this year’s books together, it’s a hope that people can remake or rebuild worlds: whether it’s by building towards a better future, exploring far-off places, or standing up to oppressors and hatred. Many of the books on our year-end list imagine better futures and alternate paths that could take us there, with a speculative twist.

With that in mind, here are our favorite reads of 2019.

Tor.com

On a distant, tidally-locked world called January, a young woman named Sophia is inadvertently branded a dissident and exiled to a city on the planet’s dark side. After the native lifeforms, the Gelet (called Crocodiles by the humans), save her life, Sophia, her friend Bianca, and their companions set out to change the world and humanity itself.

Charlie Jane Anders’ latest is a deeply compassionate and complicated read that interrogates privilege, love, and what it means to be human. The breathtaking adventure reminded me more than a little of Ursula K. Le Guin’s best stories.

Tin House

Alternate histories are often a way to juxtapose reality against what might have been; change one thing, and see how events in the world might have played out. K. Chess puts a spin on that trope with her debut novel Famous Men Who Never Lived. In our real-world New York City, refugees from an alternate dimension come through en masse when a nuclear disaster befalls their world.

Hel is one such refugee, whom Chess follows on her quest to memorialize the world that they’ve lost, starting with a copy of a popular science fiction novel from her own world, The Pyronauts. When it goes missing, she goes on a desperate search to track down the copy. Hers is a powerful, relevant story about what people will do to hold onto the worlds that they’ve lost, and how they move forward.

Knopf

Ted Chiang is responsible for some of the best-written science fiction in recent years, and his latest collection — the first since 2002’s Stories of Your Life and Others — brings together his latest, mind-blowing repertoire.

The nine-story collection includes the likes of “The Lifecycle of Software Objects”, a brilliant story about artificial intelligence, and “Exhalation,” about a scientist’s observations of the universe. The entire book is eye-opening, thoughtful, and some of the best that science fiction has to offer.

Orbit Books

In their penultimate volume of the Expanse series, James S.A. Corey’s heroes find themselves in a dark place. The Laconians — a fascist colonial world of former Martians — have taken over the solar system and human-settled space, and hold Captain James Holden captive. They have plans to investigate anomalies they’ve observed around the remnants of a long-dead alien civilization that built a vast ring network, and their explorations seem to be triggering a cataclysmic response.

As Corey wraps up their epic space opera series, they’re firing on all cylinders, playing with epic consequences for humanity, and showing that none of their long-running characters are safe from what could come. But they also put together a story that seems all-too-relevant in this day and age: a warning of the dangers that fascism and totalitarianism bring.

Tor Books

At the Osthorne Academy of Young Mages, a magical high school in California, the school’s faculty discover one of their own brutally killed in the library. When the initial investigation goes nowhere, the school’s headmaster hires a private investigator, Ivy Gamble. She reluctantly accepts the case (her estranged twin sister is an instructor there), and delves into a world of magical secrets and high school drama to try to figure out who was behind the act.

Gailey deftly weaves together tightly wound mystery and family drama, set against the backdrop of an engrossing fantasy world. Ivy works to reconnect with her sister and discovers a devastating secret at the heart of her family’s story.

Tor Books

Max Gladstone is best-known for his fantasy Craft Sequence novels, but Empress of Forever turns to space opera. In the near future, a tech billionaire named Vivian Liao is on the verge of taking over the world when she’s abruptly transported away from Earth to the end of the universe, summoned by a far-future Empress who wants to ensure her own power by stamping out any potential threat.

Gladstone’s novel is a fast, enthralling space opera yarn that addresses the dangers of power and how it’s used in the hands of an individual, and it makes for a good commentary on the excesses of Silicon Valley.

Redhook

A young girl named January grows up alone along the coast of Lake Champlain in Vermont, the ward of Mr. Locke, a wealthy benefactor who employs her father to collect strange artifacts from around the world. When her father goes missing and is presumed dead, January discovers a strange book, which leads her on a journey to uncover the true nature of her father’s work, only to discover that he’s the key to her own mysterious story.

Part portal fantasy, part coming-of-age adventure, and part meditation on the power and importance of storytelling, The Ten Thousand Doors of January quickly became one of my favorite novels — ever. It’s a powerful, heart-wrenching adventure of a young woman trying to figure out who she is, and how she can save the world.

Saga Press

Kameron Hurley’s latest is a riff on military science fiction classics like Robert Heinlein’s Starship Troopers or Joe Haldeman’s The Forever War. Humanity is engaged in a war against Mars, and one soldier, Dietz, is caught in the middle when they join a corporate military force, which can teleport soldiers onto the battlefield in a beam of light.

As they join the fight as part of the Light Brigade, time begins working differently for them: they become unstuck, and experience events out of order. The book is a scathing indictment of the nature of warfare and corporate feudalism, and it comes with a wonderful, recursive plot that glued me to my seat.

Riverhead Books

Marlon James is best known for winning the 2015 Man Booker prize for his novel A Brief History of Seven Killings. After that book’s publication, he noted that he wanted to turn his attention to fantasy, and to a long-standing complaint of his: that the genre often ignored or erased people of color from its narratives. The result was Black Leopard, Red Wolf, an epic fantasy that drew its inspiration from the African diaspora.

In the book, a man named Tracker is tasked with a mission: to track down a missing boy. As he embarks on his quest, he encounters other strange figures, and is forced to confront his own mysterious past. Set in a phenomenal world, it’s a strange, complicated, and thoughtful alternative to Game of Thrones.

Tor Books

Supernova Era by Cixin Liu, translated by Joel Martinsen

Chinese author Cixin Liu is best known for his novel The Three-Body Problem, the first major Chinese science-fiction novel to be translated into English. It kicked off a flurry of interest in the country’s science fiction scene (which has included other novels by Liu, as well as a major film adaptation, The Wandering Earth). In his latest, Supernova Era, Earth’s adult population is wiped out after a nearby star goes supernova, leaving the children of the world to take their place.

Like Liu’s other novels, it’s a book about big, grand ideas. He looks at what impact such a cataclysmic event might have on humanity, and how people and institutions figure out how to rebuild the world anew — and how it’s not just enough to survive, but to build a future that’s fulfilling and productive.

Tor Books

In the distant future, the Teixcalaanli Empire works to extend its reach to new star systems. When the ambassador of the fiercely independent Lsel Station dies unexpectedly, Ambassador Mahit Dzmare is sent to replace him, only to discover that his death was part of a conspiracy, tied to her home’s particular technological advances.

Arkady Martine, an author and historian, draws inspiration from the Byzantine Empire, and uses the novel to examine how a society’s institutional memory shapes its culture and future. It’s an engrossing look at colonialism and what’s lost when one culture subsumes another.

Tor Books

The Moon is the perfect, clean-slate environment for science fiction authors to use as a setting for alternative, political societies. Ian McDonald thrives with those types of stories, and brings his epic Luna trilogy to a close with Luna: Moon Rising.

In the preceding novels, Luna: New Moon and Luna: Wolf Moon, McDonald explores the pitfalls of a feudal, capitalistic world where corporate families (called Dragons) dominate the Moon and its inhabitants as they mine its surface for precious resources. When the Mackenzie family decimates the Cortas, Lucas Corta goes into hiding, plotting his revenge. In Moon Rising, he returns to retake control of the Moon, and McDonald examines the toll the entire endeavor has taken on all involved, and asks what type of future we should build when we eventually do colonize the Moon.

Tor.com

In Tamsyn Muir’s pulpy debut, Gideon the Ninth, her titular hero has grown up in the Ninth House, training to become a swordswoman and spending years trying to escape its grim walls. When the Emperor invites representatives from all of the houses to compete in a trial, Gideon is selected by her nemesis, the Ninth House’s Reverend Daughter and necromancer Harrowhark Nonagesimus, to accompany her.

The premise of this novel is essentially “lesbian necromancers in space,” and it’s a rollicking blend of pulp science fiction and horror, with plenty of sarcasm, swordplay, romance, and adventure.

Tor Books

Time travel and alternate universes are well-used genre tropes, with countless authors exploring all the ways that travellers work to change — or preserve — the past in order to keep the future as it is. Annalee Newitz takes a slightly different approach in their latest novel, The Future of Another Timeline, telling a story about factions of time travellers battling to make edits and change the future for the better.

Jumping between the Paleozoic, the 1800s, the 1990s, 2022, and other eras, time travel is a known thing in Newitz’s world: historians and activists jump back and forth in time to study the past. Editing the past isn’t easy to do, but the protagonist realizes that a dangerous group of travellers is working to create a timeline where women will be completely oppressed; with her companions, she works to counter their edits. It proves to be a timely novel that’s all too relevant in 2019.

Tor Books

Cixin Liu might have become the best-known science fiction writer to come out of China, but he’s far from the only one. Chen Qiufan’s Waste Tide is a far cry from Liu’s epic science fiction tales, taking a grim look at the near future of China, where impoverished workers struggle to make a living from the world’s electronic waste.

Waste Tide follows a series of people who come together on Silicon Isle: Mimi, a worker who heads there for work; Scott Brandle, an American who is trying to arrange a contract; and Chen Kaizong, a translator, all of whom find themselves wrapped up in a greater plot for control. It’s a book that reminded me quite a bit of Paolo Bacigalupi’s The Windup Girl, with a pointed commentary on class warfare and the lifecycle of the devices we use.

Saga Press

After releasing her debut urban fantasy novel Trail of Lightning last year, Rebecca Roanhorse earned considerable acclaim from the science-fiction community, including nominations for the prestigious Nebula and Hugo awards. Storm of Locusts picks up shortly after that first book, and it’s just as excellent.

Roanhorse sets her story in Dinétah, the traditional Navajo homeland, in a nearish future in which climate change has ravaged the world and ancient gods have returned to roam the Earth. Monster hunter Maggie Hoskie sets off after a friend goes missing, and uncovers a conspiracy led by a mysterious, charismatic cult leader. It’s a fast, exciting read that’s reminiscent of Neil Gaiman’s American Gods and Mad Max: Fury Road.

Orbit Books

A couple of years ago, Adrian Tchaikovsky released his first science fiction novel, Children of Time, an epic story of how humanity terraformed a distant world and the rise of a civilization of uplifted spiders that inhabited it. In this year’s standalone-ish sequel, Tchaikovsky returns to his universe with a new uplifted species, and their clash with a planet’s native lifeforms.

Like Children of Time, Children of Ruin covers vast swaths of time, jumping from generation to generation as human surveyors uplift another eight-legged creature on an aquatic world: the octopus. Jumping through time as a survey ship arrives in the system, Tchaikovsky explores the nature of consciousness, first contact, and how humanity could eventually spread to distant stars.

Black Stone

When an alien spaceship arrives at Earth and settles over the Virgin Islands, its mysterious, shape-shifting residents promise to deliver humanity untold advances and technologies. The Ynaa appear to come in peace, but their mission is shrouded in mystery, and any threat is met with extreme, disproportionate violence. After a young boy is brutally killed, the islands and their visitors find themselves on a path towards conflict that could destroy everything.

Turnbull’s debut novel is an entrancing, powerful work that explores the imbalance of power between the Ynaa and the islanders, and the archipelago’s long history of invasions.

Del Rey

In the nearish future, a plague sweeps across the world. The afflicted appear to be asleep and can’t be awoken, but they also begin to walk to a mysterious destination. A woman named Shana accompanies her sister as she walks, and as others follow in their footsteps, the country erupts into a crisis, with violent militias threatening to kill the sleepwalkers. Scientists work to figure out how to stop it before the country descends into anarchy.

Chuck Wendig’s latest has been compared to Stephen King’s The Stand, and over the course of the book, he examines how a country deals (badly) with a major crisis and how such disasters are merely an excuse for the emergence of long-standing bigotry, hatred, and racism. The book is an ambitious epic that holds up a mirror to the state of the world in 2019, and it’s not a pretty sight.

Grove Atlantic

In the early days of the Spanish Inquisition, a royal concubine named Fatima and a mapmaker named Hassan are forced to flee for their lives as their home of Granada is overtaken by the Inquisitors. They have good reason to flee: Hassan has two dangerous secrets. He’s queer, and he has the ability to change the fabric of reality with his map, adding new features to the world with the stroke of a pen.

G. Willow Wilson’s latest is a gripping adventure that finds the pair, with the help of mythical creatures, escaping across Spain and into the unknown as they seek safety in the mythical home of the Bird King. Wilson’s story is a powerful one about the dangers of oppressive ideologies, and the power that words have to create stories and entire worlds.

Vox Media has affiliate partnerships. These do not influence editorial content, though Vox Media may earn commissions for products purchased via affiliate links. For more information, see our ethics policy.

]]>
https://www.polygon.com/2019/12/13/21012122/best-books-science-fiction-fantasy-2019
https://www.polygon.com/2019/12/13/21012122/best-books-science-fiction-fantasy-2019Mon, 16 Dec 2019 06:47:00 +0100<![CDATA[Database Internals]]>

Have you ever wanted to learn more about Databases but did not know where to start? This is a book just for you.

We can treat databases and other infrastructure components as black boxes, but it doesn’t have to be that way. Sometimes we have to take a closer look at what’s going on because of performance issues. Sometimes databases misbehave, and we need to find out what exactly is going on. Some of us want to work in infrastructure and develop databases. This book’s main intention is to introduce you to the cornerstone concepts and help you understand how databases work.

The book consists of two parts, Storage Engines and Distributed Systems, since that’s where most of the differences between the vast majority of databases come from.

In Storage Engines, we start with taxonomy and terminology, then explore In-Place Update storage and discuss several B-Tree variants and their structure. Then we talk about binary data formats and file organization and explore the ways to compose efficient on-disk structures. After that, we go into detail on what techniques different databases use when implementing B-Trees and talk about related data structures such as the Page Buffer and the Write-Ahead Log, how to implement compression, and how to perform defragmentation and compaction. Finally, we discuss Log-Structured storage and explore a few different storage engine approaches, such as Bw-Trees, FD-Trees, CoW B-Trees, Bitcask, WiscKey, two/three-component LSM Trees, and some others.

In Distributed Systems, we start with basic concepts such as processes and links and start building more complex communication patterns. We quickly discover that communication is unreliable and discuss which guarantees we have and how to achieve them. We cover the important concepts of Failure Detection, Leader Election, and Gossip Dissemination. After that, we explore different Consistency Models and talk about ways to achieve them. After covering Atomic Commitment and Broadcast, we move to the pinnacle of Distributed Systems research: Consensus Algorithms.

This book includes references to 100+ papers, 10+ books, several open source database implementations, and other sources you can refer to for further study.

Taxonomy and Terminology

We discuss the precise definitions, use cases, applications, and differences between the existing sorts and classes of databases and storage engines: Column- vs. Row-Oriented stores, Memory- and Disk-based databases, In-Place Update and Immutable storage engines.

In-Place Update Storage Engines

Many modern databases such as PostgreSQL, MySQL, and many others implement variants of the mutable, in-place update data structure: the B-Tree. We’ll discuss its origins, binary on-disk layout, organisation, and popular variants such as Blink-Trees, B*-Trees, Copy-on-Write B-Trees, and many others.

Auxiliary Structures

Storage engines consist of a primary storage data structure and several auxiliary subsystems that take care of garbage collection, maintenance, and compression. Many modern databases use a Write-Ahead Log for restore and recovery and implement buffer management in the form of a Page Cache.
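
The write-ahead logging idea can be illustrated with a minimal sketch (this is an illustrative toy, not code from the book): every change is appended to a durable log before it is applied, so state can be rebuilt by replaying the log after a crash.

```python
import json

class WriteAheadLog:
    """Toy write-ahead log: records are appended to a log file before
    being applied, so state can be reconstructed after a crash."""

    def __init__(self, path):
        self.path = path

    def append(self, key, value):
        # Durability first: the record reaches the log before any data file.
        with open(self.path, "a") as f:
            f.write(json.dumps({"key": key, "value": value}) + "\n")
            f.flush()

    def replay(self):
        # Recovery: re-apply every logged record in order.
        state = {}
        try:
            with open(self.path) as f:
                for line in f:
                    record = json.loads(line)
                    state[record["key"]] = record["value"]
        except FileNotFoundError:
            pass  # no log yet, so the state is empty
        return state
```

Real implementations add sequence numbers, checksums, fsync calls, and checkpointing, but the append-then-apply principle is the same.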

Log-Structured Storage

With the advent of SSDs, we’ve seen many databases implementing and using Log-Structured storage. We’ll explore the whole spectrum of immutable data structures, ranging from B-Tree-like LSM-Trees and Bw-Trees to unsorted variants such as LLAMA, Bitcask, and WiscKey.
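
The core read and write paths of log-structured storage can be sketched roughly like this (a toy illustration, not code from the book): writes land in an in-memory memtable that is periodically flushed as an immutable, sorted run, and reads consult the memtable first, then the runs from newest to oldest.

```python
class LSMStore:
    """Toy LSM tree: writes go to an in-memory memtable; when it fills up,
    it is flushed as an immutable, sorted run (an "SSTable"). Reads check
    the memtable first, then runs from newest to oldest."""

    def __init__(self, memtable_limit=2):
        self.memtable = {}
        self.memtable_limit = memtable_limit
        self.sstables = []  # newest run last; each is a sorted list of pairs

    def put(self, key, value):
        self.memtable[key] = value
        if len(self.memtable) >= self.memtable_limit:
            self.flush()

    def flush(self):
        # An immutable, sorted run (kept in memory for this sketch).
        self.sstables.append(sorted(self.memtable.items()))
        self.memtable = {}

    def get(self, key):
        if key in self.memtable:
            return self.memtable[key]
        for run in reversed(self.sstables):  # the newest run wins
            for k, v in run:
                if k == key:
                    return v
        return None
```

Real LSM implementations add binary search within runs, Bloom filters to skip runs, and background compaction to merge runs and drop shadowed entries.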

Problems with Distributed Systems

How are distributed systems different from single-node ones? What are the FLP Impossibility and Two Generals problems? How do networking and communication via message passing put limits on what we can and cannot do, and how can we build reliable systems despite these complications?

Consistency Models

In replicated systems, where we have multiple copies of data, we have to keep nodes in sync to return consistent results. We talk about the concepts of Linearizability, Serializability, and Eventual and Causal Consistency, their guarantees, and their limitations.

Leader Election and Failure Detection

Many distributed databases use the concept of Leadership to have a single point of reference and make some of the decisions locally. However, both the leader and the participants may fail. We explore several Failure Detection algorithms that help us detect these failures and react to them.
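
The simplest failure-detection approach is a timeout on heartbeats, which can be sketched as follows (illustrative only; the class and time units are made up for this example):

```python
class HeartbeatFailureDetector:
    """Toy timeout-based failure detector: a node is suspected as failed
    if no heartbeat has been received from it within `timeout` time units."""

    def __init__(self, timeout):
        self.timeout = timeout
        self.last_seen = {}  # node -> time of last heartbeat

    def heartbeat(self, node, now):
        self.last_seen[node] = now

    def suspected(self, node, now):
        # A node we've never heard from is suspected, as is one whose
        # last heartbeat is older than the timeout.
        last = self.last_seen.get(node)
        return last is None or now - last > self.timeout
```

Production detectors such as the phi-accrual detector used by Apache Cassandra replace the fixed timeout with a statistical estimate of heartbeat arrival times, so suspicion grows gradually rather than flipping at a hard threshold.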

Broadcast and Consensus

With Atomic Commitment, Total Broadcast, and Consensus algorithms, distributed systems can make cluster-wide decisions and communicate them to the participants while preserving strong consistency guarantees. We discuss both traditional and cutting-edge algorithms used for that.

Alex is an Infrastructure Engineer, Apache Cassandra Committer, working on building data infrastructure and processing pipelines. He’s interested in CS Theory, algorithms, Distributed Systems, understanding how things work and sharing it with others through blogposts, articles and conference talks.

Where do I get a DRM-free version of the book?

Most definitely: just as with any O’Reilly book, you can get it from ebooks.com.

Is this book about NoSQL/Distributed or Traditional/Relational Databases?

This book does not dissect any specific database. Instead, it takes several of them apart to understand what’s inside. B-Trees are used both in relational databases, say, PostgreSQL, and in document databases such as MongoDB (WiredTiger). Similarly, there was an attempt to add LSM Trees to SQLite, while they’re used in Apache Cassandra.

Distributed systems concepts such as Two-Phase Commit, Gossip, Leader Election, and Failure Detection are not specific to the NoSQL movement and can be (and are) used in many databases. Moreover, we are witnessing a new generation of databases working at scale while offering rich query APIs and strong (or configurable) consistency guarantees. The book is about concepts that are seen in databases — all kinds of databases.

]]>
https://www.databass.dev/
https://www.databass.dev/Mon, 16 Dec 2019 06:47:00 +0100<![CDATA[Challenging projects every programmer should try | Hacker News]]>

Jacob Gottlieb was considering raising money for a hedge fund. One problem: His last one had collapsed in a scandal.

While Mr. Gottlieb wasn’t accused of wrongdoing, googling his name prominently surfaced news articles chronicling the demise of Visium Asset Management LP, which once managed $8 billion. The results also included articles about his top portfolio manager, who died by suicide days after he was indicted for insider trading in 2016, and Mr. Gottlieb’s former brother-in-law, an employee of Visium who was convicted of securities fraud. Searches also found coverage of Mr. Gottlieb’s messy divorce in New York’s tabloids.

Photo: Jacob Gottlieb, who was considering raising money for a hedge fund, hired Status Labs, a company specializing in so-called reputation management. (Jacob Kepler/Bloomberg News)

So last year Mr. Gottlieb hired Status Labs, an Austin, Texas-based company specializing in so-called reputation management. Its tactic: a favorable news blitz to eclipse the negative stories.

Afterward, articles about him began to appear on websites that are designed to look like independent news outlets but are not. Most contained flattering information about Mr. Gottlieb, praising his investment acumen and philanthropy, and came up high in recent Google searches. Google featured some of the articles on Google News.

His online makeover shows the steps some executives and public figures are taking to influence what comes up on the world’s top search engine. It also illustrates that despite Google’s promises to police misinformation, sites can still masquerade as news outlets and avoid Google’s detection.

Mr. Gottlieb said in an interview with the Journal that he found his press coverage unfair and wanted to fight back.

“I worked with this company to help me launch my new venture,” he said.

Status Labs interviewed Mr. Gottlieb on his interests and plans, and told him it would get him positive press coverage, according to a person familiar with the matter. He paid between $4,000 and $5,000 per month for the services.

The phone number for Medical Daily Times on its website rang at a pizzeria in Toronto. Until recently, the article’s author was listed as BJ Hetherington. The author couldn’t be reached for comment.

A black-and-white photo accompanied BJ Hetherington’s author page on Medical Daily Times. The photo is of a Canadian theater actor, Randy Hughson. A publicist for Mr. Hughson said he wasn’t aware his photo was being used with BJ Hetherington’s name until the Journal contacted him.

The information about Mr. Gottlieb’s donation is also inaccurate: An NYU spokeswoman said Mr. Gottlieb didn’t donate to that particular initiative, though he has donated to NYU on other occasions. Medical Daily Times didn’t check the information with Mr. Gottlieb, said a person familiar with the matter.

Medical Daily Times couldn’t be reached for comment.

Share Your Thoughts

Do you think people should have the right to influence what comes up about them online? Why or why not? Join the conversation below.

Status Labs’ client list has also included scandal-rocked companies, billionaires and public officials, said former employees. Status Labs provided services to Education Secretary Betsy DeVos, according to former Status Labs employees.

Status Labs’ services can cost in the tens of thousands of dollars per month, said people familiar with the matter.

Ryan Stonerock, a lawyer for Status Labs at law firm Harder LLP, said the company wouldn’t discuss its clients. A spokeswoman for Ms. DeVos at the Education Department said it “doesn’t sound like the department or the secretary have a relationship” with Status Labs and didn’t respond to other questions sent by the Journal.

Former Status Labs employees said that in addition to helping clients bury negative information in Google’s search results by gaming the tech giant’s algorithms, the company has also edited Wikipedia pages without disclosure of its role, something Wikipedia forbids.

Academics said these actions can deprive the public of information that may help them make more informed decisions.

Mr. Stonerock said Status Labs was founded to correct what it perceives as an imbalance of power between its clients and the information available online about them.

“A single false accusation can cause permanent damage to a person or a company’s hard-earned reputation,” said Mr. Stonerock. “This imbalance of power has made the first page of Google the first, and often times the last, impression for individuals and companies.”

Fifty-five Status Labs employees use “a variety of proprietary methods, which are always evolving” to help clients “disseminate positive and truthful information about themselves online,” he said. “An internal ethics committee at Status Labs vets the potential client and determines whether Status Labs can assist the client in an honest and truthful manner.”

Status Labs executives Darius Fisher and Jesse Boskoff agreed to an interview at its Austin, Texas, headquarters but canceled the meeting due to what Mr. Stonerock said was a “pressing client matter.”

Status Labs declined to respond to most of the Journal’s questions. Mr. Stonerock said the company wouldn’t comment “on its proprietary business methods and/or trade secrets.”

Google is under increased scrutiny from global regulators. Senators recently proposed bipartisan legislation that would require Google to disclose its algorithms. A recent investigation by the Journal found that the search giant has interfered with its algorithms and changed results. A Google spokeswoman disputed the Journal’s conclusions.

Google said that it tries to monitor deceptive behavior. But Google News, the unit of Google’s search engine that highlights news articles, featured several of the outlets that contained articles about Mr. Gottlieb and other Status Labs clients.

“Any efforts to enhance someone’s online presence shouldn’t involve deceptive tactics aimed at influencing Google Search rankings,” said Google, which removed the sites from Google News because they didn’t meet its transparency standards. Google News says it doesn’t allow “sites or accounts...that misrepresent or conceal their ownership or primary purpose.”

Mr. Gottlieb said the information Status Labs helped get published about him wouldn’t have affected his reputation with investors.

“Obviously nobody invests in a hedge fund...based on an article in a no-name online blog,” Mr. Gottlieb said in a statement, though he added it has “improved my reception on Tinder.”

Former Bank of America Corp. executive Omeed Malik also received services from Status Labs, according to people familiar with the matter.

Photo: Former Bank of America Corp. executive Omeed Malik. (Patrick McMullan)

The website Chronicle of Week, which used the tagline “Independent News,” published information on Mr. Malik after he became the subject of articles last year related to his firing from the bank. Mr. Malik denied allegations of sexual misconduct, later filed a defamation complaint against Bank of America and obtained an eight-figure settlement, the Journal previously reported.

In October, a Chronicle of Week article about Mr. Malik appeared on the first page of an ordinary Google search of his name. The article, dated July 23, 2019, cited his “vast pertinent experiences within multiple leadership roles” and said he “stands out as a leader within the industry.” It didn’t mention the allegations or the settlement.

The website where that article appeared has also gone by the names “Chronicle of the Week” and “Chronicle Week.” Chronicle Week described itself as “an online digital newspaper” and listed an editorial staff in February 2019. By October, after the Journal’s queries, that language and staff names had been removed, and a sentence was added that some of the information appearing on the site is paid for as sponsored content.

A Wikipedia page about Mr. Malik also became the first result in a Google search of his name, displacing news articles.

Following a Journal query, Wikipedia removed Mr. Malik’s page. Anne Clin, a Wikipedia editor involved in the decision, said Mr. Malik should never have had his own page because he isn’t notable enough. A lawyer for Mr. Malik, Tom Clare, didn’t comment on the removal and said Mr. Malik has disputed reports regarding the reasons for his departure from Bank of America.

Mr. Clare didn’t respond to questions about Status Labs. In a brief phone call in October, Mr. Malik said, “we view this as libelous.”

Status Labs provided services to Ms. DeVos before she became U.S. education secretary to suppress Google search hits connecting her to her brother, Blackwater founder Erik Prince, a person familiar with the matter said. Blackwater was a State Department contractor that was banned from Iraq after a deadly 2007 shooting of Iraqi civilians. Mr. Prince couldn’t be reached for comment.

Photo: Betsy DeVos is Education Secretary. One article about her that appeared online was titled “Betsy DeVos Positive News Article.” (Alex Edelman/ZUMA Press)

One article touting her accomplishments as a “reformer” appeared on chemfindit.com, a site that briefly used the same IP address as a company affiliated with executives of Status Labs called Blue Land Partners, according to a Journal analysis of data from Farsight Security Inc.

Another article about her appeared online titled “Betsy DeVos Positive News Article.” The August 2017 article was on a website called “Enable Diversity,” which also featured Mr. Gottlieb. The website has since been removed.

Disgraced blood-testing startup Theranos Inc. also received services from Status Labs, according to former employees. An editing account used by Status Labs was called Jppcap, according to people familiar with the matter. That account made several favorable edits to Theranos’ Wikipedia page. One edit removed a reference to an article in the Journal reporting Theranos devices often failed accuracy requirements. Theranos dissolved last year. A lawyer for Theranos founder Elizabeth Holmes and lawyers for Theranos while it was in business couldn’t be reached.

The co-founders of Status Labs, Mr. Fisher and Jordan French, also ran a company called Wiki-PR, which edited Wikipedia pages for clients, according to a former employee. Mr. French left Status Labs in 2015 following a dispute, according to that former employee and a press release from Mr. French. Status Labs was founded in 2012.

Wiki-PR edited clients’ Wikipedia pages without disclosure, according to a cease and desist letter sent by a lawyer for the Wikimedia Foundation, which oversees Wikipedia.

“When outside publicity firms and their agents conceal or misrepresent their identity by creating or allowing false, unauthorized or misleading user accounts, Wikipedia’s reputation is harmed,” the letter said. It added that the practice “is expressly prohibited by Wikipedia’s Terms of Use.”

The Wikimedia Foundation banned Wiki-PR and its “employees, contractors, owners and anyone who derives financial benefit from editing the English Wikipedia on behalf of Wiki-PR.com or its founders,” the letter said.

Status Labs continued to stealthily edit clients’ pages, said former employees.

The hedge fund of billionaire Ken Griffin, Citadel LLC, hired Status Labs to edit information on Wikipedia in 2015 about the fund’s investments and Mr. Griffin’s art collection, according to a person familiar with the matter. Citadel’s spokesman said “changes made to the Wikipedia pages in 2015 were to correct factual errors.”

Photo: Patrick T. Fallon/Bloomberg News

The Wikipedia account used, Jppcap, was the same one that worked on the Theranos page, according to a review of Wikipedia’s edits, which are public. The account didn’t disclose it was working on behalf of Status Labs or a paid client, which would have been a violation of Wikipedia’s terms, said Ms. Clin, the Wikipedia editor.

Mr. Gottlieb didn’t end up starting a new fund, but he is now managing his own money. His Google results still do not prioritize certain articles. In August, his former firm filed a lawsuit against the widow of Mr. Gottlieb’s deceased portfolio manager, seeking more than $100 million in repayment for money it said it paid the money manager.

The Journal published an article about the lawsuit and mentioned Mr. Gottlieb’s full name once and last name twice toward the end of the article. Many factors, such as the keywords used, help determine what Google turns up in search results.

In October, a Google search of Mr. Gottlieb’s name returned 19 pages of results. The Journal’s article wasn’t among them.

—Rob Barry, Jim Oberman and Russell Gold contributed to this article.

]]>
https://www.wsj.com/articles/how-the-1-scrubs-its-image-online-11576233000
https://www.wsj.com/articles/how-the-1-scrubs-its-image-online-11576233000Mon, 16 Dec 2019 06:47:00 +0100<![CDATA[The Loss Of Micro-Privacy]]>

How small design changes rewrote the rules of messaging and how we feel about one another

Few applications have affected mass consumer psychology as much as messaging apps. While social media helps us build communities, a following, and a digital presence, messaging enables us to stay in touch with people we care about. With the ongoing trend of more intimate and personal communication, a myriad of privacy scandals and general social media fatigue, messaging is here to stay.

When looking at it from a distance, however, it seems like messaging hasn’t really changed all that much over the last two and a half decades. It’s too easy to overlook the small design and privacy changes that fundamentally rewrote the rules of communication and how we feel when we talk to one another.

To better understand how we ended up where we are today and to fully appreciate the psychological ramifications of a series of seemingly small changes, we need to take a step back and go back to 1996 — the year when messaging as we know it started.

In the early 90s, five Israeli developers realized that most non-Unix users had no easy way to send instant messages to one another. The terminal was reserved for power users, and well-designed applications with a user-friendly GUI were still scarce. They got together and started working on a cross-platform messaging client for Windows and Mac, and gave it the catchy name ICQ (“I seek you”).

It didn’t take long until early versions of ICQ had most features we’re taking for granted in today’s instant messaging apps:

ICQ Version 99A

With ICQ 99a, the platform featured conversation history, user search, contact list grouping, and the iconic “Uh-uh” sound that played whenever you received a message. Within a very short time, ICQ amassed millions of users during a time when global internet traffic was a fraction of what it is today.

One of the critical challenges during this period was that users weren’t online at all times. During the age of 56k dial-up modems, chat rooms could feel like hanging out at an empty bar. The team came up with an ingenious and deceptively simple concept to let others know when users were available to chat: the online status.

The online status was the first widespread instance in digital communication of users giving up a tiny bit of privacy to make a service more engaging and useful. It all started as a seemingly perfect win-win situation: by sharing your online status with everyone in your contact list, you turned your computer into a less lonely place.

After signing onto the service, your friends would immediately get notified. As a result, most users found themselves chatting to someone within minutes. The product’s engagement increased, and the issue of lonely chat rooms soon became a thing of the past.
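The mechanism behind this can be sketched as a tiny publish/subscribe flow: when a user signs on, everyone who has them in their contact list gets a notification. This is only an illustration of the idea; the class and method names below are invented, not ICQ’s actual protocol.

```python
# Minimal sketch of an online-status (presence) broadcast.
# All names are illustrative, not ICQ's real protocol.

class PresenceService:
    def __init__(self):
        self.contacts = {}  # user -> set of mutual contacts
        self.inboxes = {}   # user -> list of received notifications

    def add_contact(self, user, contact):
        # Contact relationships are mutual, as in ICQ's contact lists.
        self.contacts.setdefault(user, set()).add(contact)
        self.contacts.setdefault(contact, set()).add(user)

    def sign_on(self, user):
        # Everyone who has `user` in their contact list gets notified.
        for contact in self.contacts.get(user, ()):
            self.inboxes.setdefault(contact, []).append(f"{user} is online")

service = PresenceService()
service.add_contact("alice", "bob")
service.sign_on("alice")
print(service.inboxes["bob"])  # ['alice is online']
```

The engagement effect described above falls out of this design directly: the sign-on event is pushed to contacts immediately, rather than waiting for them to check.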

While ICQ was taking the internet by storm, others quickly took notice and an array of messaging platforms started popping up.

MSN Messenger on Windows XP

The most infamous alternative to ICQ was MSN Messenger, which contained all the features that had defined ICQ’s success. The press release even emphasized the online status as one of its key features.

“MSN Messenger Service tells consumers when their friends, family and colleagues are online and enables them to exchange online messages and email with more than 40 million users.”

As the MSN user base grew, more and more users lamented that they didn’t feel in control. Upon logging on to the service, they were immediately pinged by people they didn’t necessarily want to talk to. The problem of lonely chat rooms was effectively replaced with a new problem: How can users be in control of who they want to talk to?

How can users be in control of who they want to talk to?

For many, not replying wasn’t a viable option as they felt guilty about ignoring incoming texts. It soon became clear that the automatic sign-in and public online status wasn’t without its flaws.

Microsoft’s response was to introduce a new feature that enabled users to “appear” as offline. With this small change, users regained some control over how openly they shared their online activity. It wasn’t all perfect, though.

In its wake, the offline status left behind a trail of paranoia that gave rise to tools for screening whether friends had blocked you. These third-party tools encouraged anyone to become a cyberspace Sherlock Holmes and check in on their contacts’ statuses.

As we will see, this is a common chain of events in the realm of messaging. Every change involving micro-privacy has a counter-reaction that can go from barely noticeable, to harmful, to downright problematic. So what is micro-privacy?

Since there didn’t seem to be an existing term for this concept, let’s coin one:

Micro-privacy refers to the small nuggets of information that reveal something about a user’s online activity.

What characterizes micro-privacy is that a minimal amount of information can have huge repercussions on product engagement, user behavior, and wellbeing.

In simple terms, design teams can build more engaging products by reducing privacy on two ends: either between the provider and its users, or between the users themselves. We spend a lot of time worrying about the former, but almost completely neglect the latter.

Let’s take a closer look through another example that might feel strangely familiar.

Microsoft was in trouble. Their platform had gained a lot of traction, but one of the things that kept plaguing early versions of MSN was flaky internet connections. When two users talked to one another, you could never tell whether the person on the other end was still there, had stepped away, or whether their connection had simply timed out. Sometimes, sending a message felt like sending it into a vortex: you never knew whether you were going to get something back.

In order to better set expectations, the chat community developed a linguistic toolbox to let others know when they might not respond immediately. As a result, chat rooms of the early 2000s were full of acronyms like AFK (away from keyboard) and BRB (be right back).

Then a team of engineers at Microsoft came up with a genius micro-interaction that would redefine the psychology of messaging as we know it forever.

In order to set expectations and make conversations feel more engaging, the team introduced what they called the typing indicator. Every time a user started writing a message, the client sent a signal to the server, which in turn informed the person on the other end that the user was typing. This was a massive technical bet: around 95% of all MSN traffic was not the content of the messages themselves, but simple bits of data that would trigger the iconic dots to show up and disappear! [2]

Karen is typing…
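A rough sketch of that mechanism, with invented names rather than MSN’s real wire format: the client emits small, throttled “typing” events, and the server relays them to the conversation partner. The throttling interval is an assumption for illustration.

```python
# Illustrative sketch of a typing indicator: the client emits tiny
# "typing" events that the server relays to the conversation partner.
# Event names, structure, and the 3-second throttle are assumptions,
# not MSN's actual protocol.
import time

class ChatServer:
    def __init__(self):
        self.events = []  # (recipient, sender, event) tuples relayed so far

    def relay(self, sender, recipient, event):
        self.events.append((recipient, sender, event))

class ChatClient:
    def __init__(self, server, user, peer):
        self.server, self.user, self.peer = server, user, peer
        self.last_signal = float("-inf")  # never signaled yet

    def on_keystroke(self):
        # Throttle: signal at most once every few seconds, so typing
        # doesn't flood the server with one event per character.
        now = time.monotonic()
        if now - self.last_signal > 3.0:
            self.server.relay(self.user, self.peer, "typing")
            self.last_signal = now

server = ChatServer()
karen = ChatClient(server, "karen", "bob")
for _ in range(10):   # ten rapid keystrokes...
    karen.on_keystroke()
print(server.events)  # ...are collapsed into a single "typing" event
```

This is why the indicator could dominate traffic while staying cheap: each event is a few bytes, sent only when typing starts or resumes after a pause.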

From an engagement model perspective, the typing indicator flipped all the right behavioral switches that got people hooked. Every time someone started typing, it created anticipation followed by a variable reward. Today, this is a well researched area in psychology that serves as a foundation for anyone attempting to build addictive products.

The typing indicator elegantly solved what the team had set out to solve. But it also did a bit more than that. Apart from increased engagement, it also single-handedly introduced a whole new level of emotional nuance to online communication. This seemingly small detail inadvertently conveyed things no message by itself ever could. Picture this scenario:

Bob: “Hey Anna! It was so great to meet you. You’d like to go out for a drink tonight?”

Anna: “Starts typing…”

Anna: “Stops typing…”

Anna: “Starts typing again…”

Anna: “Sure!”

How convinced is Anna really? You might have experienced it yourself: the angst of prolonged typing indicators followed by a short response or even worse: nothing! Bob might have been happier if he hadn’t observed Anna’s typing pattern. But he did. And now he wonders how such a tiny animation can have such a profound impact on how he feels…

It turns out Bob isn’t alone. It didn’t take long until users started coming up with strategies and hacks to get back control over their micro-privacy and online activity, from typing a message into a document and then copying it over, to thinking hard before even attempting to write anything.

This problem gets further exacerbated in modern applications that involve group chat, always-on messaging services, or dating apps. But this was still before the iPhone came along to change the internet as we know it.

Today, typing indicators are ubiquitous. And while it’s hard to argue that they didn’t make messaging more useful, they also made it more addictive by playing an innocent but powerful sleight of hand: we were handed an exciting pair of cards, at the cost of someone observing us from the other side.

Of course, this wasn’t the last time we happily played along.

Divorce lawyers in Italy know something that you and I don’t. But it first took a shift in technology for them to get to that insight. That shift kicked off in late 2007, when we went from a type of internet we used at home and at the office, to the type of internet that was with us at all times.

The introduction of the iPhone marked a technical leap that affected every imaginable aspect of computing and, with it, every aspect of society.

Brian Acton and Jan Koum went on vacation after leaving Yahoo! When they came back and tried the iPhone for the first time, they immediately saw huge potential in the device and its App Store model. They started working on a new type of messaging app that included an online status as part of the core messaging experience. They gave it a catchy and memorable name — WhatsApp — to sound like the colloquial “what’s up?” everyone’s familiar with.

Growth was relatively slow and the two almost decided to give up on their venture. That changed when Apple introduced a new service that almost instantly catapulted their brainchild to the top of the App Store. After integrating the newly released push notification system, their user base shot up to 250,000 in no time.

There were a couple of things that made WhatsApp different and attractive. First, it sent messages over the internet, so users no longer had to pay for every single SMS. Second, it re-introduced the online status that had originally been developed during the era of chat rooms and flaky internet connections over a decade earlier. And third, it featured the infamous typing indicator we’ve all come to love. All these things combined made WhatsApp feel light-years ahead of any traditional SMS application of its time.

Today, WhatsApp has more than a billion users and it’s the preferred way of sending messages in many countries all around the world. One of those countries is — you guessed it — Italy!

According to Gian Ettore Gassani — president of the Italian Association of Matrimonial Lawyers — WhatsApp messages sent by cheating spouses play an integral role in 40% of Italian divorce cases citing adultery, writes Rachel Thompson from Mashable.

The thing that often led to those deeply troublesome insights? The “last seen online” indicator. Unlike the traditional online status of the early 2000s, “last seen” added a new level of insight to written chat: the exact time someone last used WhatsApp.

Last seen online indicator (WhatsApp illustration)

Like any service that turned the knob on micro-privacy, the outcome was predictable: high user engagement at the cost of reduced user-to-user privacy.

What does it mean when your spouse was last seen online at 4:30 in the morning? Why would someone be online but not pick up the phone minutes after they had just been seen online? How come your secret crush and your best friend always seem to be online at the same time, coincidence?

Coincidence or not, users decided to start doing something about it to get their micro-privacy back. In very little time, the internet lit up with articles and tutorials, both written and in step-by-step video form. They ranged from creating a fake last-seen status, to freezing the time, to disabling the indicator altogether.

The Last Seen “feature” had such a strong psychological impact on users that some started referring to it as Last Seen Syndrome (LSS). In her research about how WhatsApp impacts youth, Dr. Anshu Bhatt notes “This app has been found to be highly addictive, which leaves a trace that becomes difficult to control”. The myriad of articles offering advice on how to control privacy, limit time spent in the app, and outsmart the Last Seen indicator further offers a glimpse into the challenges many users are facing today.

And just when it seemed there wasn’t any more micro-privacy we would willingly disclose, there was still one tiny area that went largely overlooked…

It was again a seemingly small “detail” that deeply reshaped our experience and expectations towards one another. Like many of the ideas we’ve discussed so far, this one too can be understood as loosely inspired by technology that was invented decades earlier. In this case, it was email.

Replying late to incoming texts or emails used to be simple: a short “only saw this now” was good enough to get back to someone without any feeling of guilt or fear of retaliation. Today, “only seeing this now” will hardly suffice, and we’re all in need of a better alibi.

Manually entering an email address was (and still is) an error-prone process. The idea of sending messages digitally was both novel and hard to grasp. Upon hitting the send button, users had very little information on whether their message had been delivered, was pending, or had been aborted. To offer more transparency and make email more understandable, engineers introduced Delivery Status Notifications (DSN). Through DSN, users gained more insight into what happened to their message after hitting the send button.
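The core idea can be sketched as a small status lifecycle that only moves forward, with the sender able to query the latest state of each message. The states and class names below are illustrative, not the actual DSN specification.

```python
# Sketch of a message-status lifecycle in the spirit of Delivery Status
# Notifications: each message moves through ordered states, and the
# sender can check the latest one. States and names are illustrative.
from enum import IntEnum

class Status(IntEnum):
    PENDING = 0    # handed to the server, outcome unknown
    SENT = 1       # accepted for delivery
    DELIVERED = 2  # reached the recipient's mailbox

class Outbox:
    def __init__(self):
        self.status = {}  # message id -> current Status

    def send(self, msg_id):
        self.status[msg_id] = Status.PENDING

    def notify(self, msg_id, new_status):
        # Status only moves forward: a delivered message
        # can never revert to pending.
        if new_status > self.status[msg_id]:
            self.status[msg_id] = new_status

outbox = Outbox()
outbox.send("m1")
outbox.notify("m1", Status.SENT)
outbox.notify("m1", Status.DELIVERED)
print(outbox.status["m1"].name)  # DELIVERED
```

Read receipts, discussed below, are essentially one more forward-only state appended to exactly this kind of lifecycle.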

Fast forward 30 years and the industry keeps solving similar problems. This time however, in a slightly different context and a slightly different moment in computing history.

In 2011, Apple introduced iMessage. What made iMessage different from its predecessor was that it seamlessly migrated users from sending messages through the traditional SMS protocol, to sending them over the web. This set the foundation needed for iMessage to evolve beyond a simple text messaging app.

Among the many newly introduced changes was an inconspicuous “feature” that quickly became known as one of the most contentious and controversial moves in the messaging space: read receipts.

Read receipts in iMessage

In no time, read receipts became the inspiration for plenty of lively drama, ranging from relationship issues, to heightened social expectations, to sadistic psychological games. And despite their effect — or perhaps precisely because of it — we took them any day over texting someone with a green bubble.

The introduction of read receipts marked a critical moment: seeing a message without replying was no longer understood as an oversight, but as a perceived act of ignoring someone. Sending a message slowly set in motion a feeling of being ignored for the sender and established an obligation to respond for the receiver.

When friends or romantic partners don’t text back after seeing a message, it no longer feels like a matter of patience. It can feel like being left behind. When our boss texts late after work to finish a presentation, we can no longer pretend we haven’t seen it. We either grind through the work, or had better have a good excuse lined up the next day.

As a result, people started tricking systems to stop systems from tricking them: from putting their phones into Flight Mode before opening the app, to only reading incoming texts from the lock screen, to holding off from reading the message altogether.

A study at the University of Copenhagen found that over 80% of participants had developed read-receipt avoidance strategies. Many participants also mentioned they started speculating and coming up with their own stories why the other person hadn’t responded yet.

Overall, none of the participants liked read receipts, and yet they kept them on because they wanted to know what was going on in other people’s lives. Some users went as far as intentionally turning on read receipts to explicitly convey that they were ignoring the person on the other end. Unlike some of the other forms of micro-privacy, read receipts, just like emojis and confetti, have become an active part of the conversation itself.

Today, read-receipts are ubiquitous. And while Apple was forthcoming enough to provide a way for users to turn read receipts off, other messaging clients did not.

When WhatsApp introduced the now well-known blue checkmarks, it instantly faced massive flak from its users. It took only a few weeks until another checkmark popped up in the privacy settings of the app to turn read receipts back off.

Read receipts aren’t about informing us whether our message was successfully delivered. They’re about offering us a glimpse into another person’s life. And while we’ve come to accept them as a constituent of modern messaging apps, time will tell whether they’ll remain so.

The story of the online status, typing indicators, and read receipts is a story about the unresolved and ever ongoing tension between privacy and engagement. And while we looked at this through the lens of messaging, I believe these insights apply to *any* product involving people interacting with one another in any way whatsoever.

One of the simplest and most insightful theories to come out of the field of organizational psychology is the idea of assuming good intentions. If we look for negativity in the world, that’s what we’re going to get. I believe most products we use today are designed with good intentions. But I also believe that designing with good intentions is no longer enough.

When we’re designing products that can reach parts of the world’s population, details aren’t details anymore, they become the design.

Product designers can no longer recount fairy tales about how design is turning the world into a better place through more engaging products. Whenever privacy is at stake, things just don’t get to be that simple. Designers who don’t ask themselves critically whether revealing any piece of user information is truly necessary, or whether it could have detrimental effects on users’ wellbeing, are effectively deciding not to do their jobs. Engagement is a one-dimensional variable that is easy to track, but it will not serve as a sustainable metric for the future we’re going to be designing for (or the future I want to live in).

As such, privacy remains one of the big and unresolved issues in our industry, and while we often worry about data leaks and agonize over how much companies know about us, we forget that it’s the small and barely noticeable losses of end-to-end user privacy that affect us socially the most. And while turning every privacy-related decision into a setting might be enticing, it’s ultimately shortsighted. Designers are well aware that most users won’t bother changing a default. And the act of changing a default ironically reveals something about users in itself, whether they want it to or not.

So what does a future that respects people’s micro-privacy feel like?

It’s knowing you can go online without having to fear what your online status may reveal about you. It’s liking someone’s photo without the anxiety of being called out for it. And above all, it’s reading a message without feeling guilty for not sending an immediate response.

Sounds idealistic? That’s because it is.

The design systems we’ve put in place created social expectations that seem irreversible. But they don’t have to be. And if any field should worry about keeping privacy and engagement in check, it’s us.

Thanks for reading. I‘d love to hear your thoughts. You can find me on Twitter.

]]>
https://medium.com/@azumbrunnen/the-loss-of-micro-privacy-baa088f0660d
https://medium.com/@azumbrunnen/the-loss-of-micro-privacy-baa088f0660dMon, 16 Dec 2019 06:47:00 +0100<![CDATA["My Car does not start when I buy Vanilla Ice Cream"]]>Did you ever imagine that an “Ice Cream” could shake the whole of General Motors? In 2010, the Pontiac Division of General Motors received a very funny complaint from one of its customers. It was so weird & bizarre that it took the entire company by storm. However, on reading the entire case, it definitely caught our interest, and we realized that this is by far an epic case of ‘Customer Care’. It teaches us that however weird a complaint is, never underestimate your client!

Below is the complaint, which was received by the Pontiac Division of General Motors:

“This is the second time I have written to you, and I don’t blame you for not answering me, because I sounded crazy, but it is a fact that we have a tradition in our family of Ice-Cream for dessert after dinner each night, but the kind of ice cream varies so, every night, after we’ve eaten, the whole family votes on which kind of ice cream we should have and I drive down to the store to get it. It’s also a fact that I recently purchased a new Pontiac and since then my trips to the store have created a problem…

You see, every time I buy a vanilla ice-cream, when I start back from the store my car won’t start. If I get any other kind of ice cream, the car starts just fine. I want you to know I’m serious about this question, no matter how silly it sounds. What is there about a Pontiac that makes it not start when I get vanilla ice cream, and easy to start whenever I get any other kind?”

The Pontiac President was understandably skeptical about the letter, but sent an Engineer to check it out anyway.

Learning Point No. 1– Always respond to your Customer Complaints & make them feel you took an Action!

Moving on, the Engineer & the Man drove to the ice cream store. It was vanilla ice cream that night and, sure enough, to the Engineer’s surprise, when they came back to the car, it actually wouldn’t start.

To be sure, the Engineer returned for three more nights. The first night, they got chocolate. The car started. The second night, he got strawberry. The car started. The third night he ordered vanilla. The car failed to start. Now that is something scary!

Now the Engineer, being a logical man, refused to believe that this man’s car was allergic to vanilla ice cream. He arranged, therefore, to continue his visits for as long as it took to solve the problem. Toward this end he began to take notes, jotting down all sorts of data: time of day, type of gas used, time to drive back and forth, etc.

In a short time, he had a clue: the man took less time to buy vanilla than any other flavor. Why? The answer was in the layout of the store. Vanilla, being the most popular flavor, was in a separate case at the front of the store for quick pickup. All the other flavors were kept in the back of the store at a different counter where it took considerably longer to check out the flavor.

Now, the question for the Engineer was why the car wouldn’t start when it took less time. Eureka – Time was now the problem – not the vanilla ice cream!

The engineer quickly came up with the answer, which was “Vapor Lock”.

So what actually happened?

Since the Man took extra time to get the other flavors, it allowed the engine to cool down sufficiently to start. When the Man got vanilla, the engine was still too hot for the vapor lock to dissipate!

Hence, the case of the Vanilla Ice-Cream was finally solved!

Learning Point No. 2 – Even crazy looking problems are sometimes real & all problems seem to be simple only when we find the solution, with cool thinking.

This was a cool move by General Motors, especially in this age of the internet, when news can go viral in a matter of seconds. Your reputation matters! Address your customers’ queries before they become complaints in this all-goes-viral digital world.

]]>
https://www.digitalrepublik.com/digital-marketing-newsletter/2015/05/10/my-car-does-not-start-when-i-buy-vanilla-ice-cream-said-a-man-to-general-motors/
https://www.digitalrepublik.com/digital-marketing-newsletter/2015/05/10/my-car-does-not-start-when-i-buy-vanilla-ice-cream-said-a-man-to-general-motors/Mon, 16 Dec 2019 06:47:00 +0100<![CDATA[Stanford CS 144: Introduction to Computer Networking | Hacker News]]>

]]>
https://news.ycombinator.com/item?id=21794270
https://news.ycombinator.com/item?id=21794270Mon, 16 Dec 2019 06:47:00 +0100<![CDATA[Best Movies of the Decade: Top Movies of 2010s]]>

From celestial collisions to man-eating aliens to human centipedes, here’s to another decade at the movies.*

Want to feel old? In 2010, the No. 1 movie at the box office was Avatar. Hypercool indie darlings Robert Pattinson and Kristen Stewart were the coupled-up stars of a lucrative supernatural teen romance. Netflix, which is likely to be competing with itself at the Oscars this year, was just starting to stream in Canada. Toy Story 3 came out, and now we’re on … well, Toy Story 4. Not everything has been marked by radical changes, even over 10 years and the release of so many movies. Thousands of movies. The sheer amount posed the first and biggest challenge of putting together this list: deciding which of the movies that filtered through theaters and VOD services and streaming platforms qualified for a ranking of the decade’s best films.

Perhaps arbitrarily, we decided that for a film to be eligible for our ranking, it had to have played in at least four theaters in the U.S. That cut down the crop to something slightly more manageable. But were we punishing smaller films simply because their distributors deemed them uncommercial? Cue the guilt, and the exceptions. A movie released in less than four theaters could force its way into the mix, we determined, if it had been nominated for an industry award — from the Academy, the Directors Guild, the New York Film Critics Circle, etc. Of course, we’re critics, so we decided to allow any film included in one of our previous best-of-the-year lists to compete too. Soon, we were permitting ourselves a few wild-card contenders, because list-making is more art than science. It’s more argument than science, too, which we proved over the course of a few mostly civil weeks of bickering and bantering over personal preferences, creating an appropriately chaotic point system for our ranking and then throwing that point system out the window — haphazardly moving things up, down, or off the top and bottom tiers entirely as we grew nearer to our deadline.

The list wasn’t a result of any consensus (to which the various annotations attest), and it certainly isn’t definitive. (Is there such a thing as a definitive ranking?) But it does reflect the highs and lows of a tumultuous decade. We hadn’t, for instance, planned for our top-three picks to be so … apocalyptic. But when it turned out that way, it felt entirely appropriate. The past 10 years have been marred by doomsday predictions about cinema, whether the harbingers are Netflix or superheroes or high frame rates. On the other hand, we absolutely did plan for our bottom choices to be incendiary, and to speak to larger tendencies in the industry that have filled us with dread — except, of course, when they fill us with delight. It’s messy, and it’s filled with contradictions, and we wouldn’t have it any other way. Here are the films of the 2010s, ranked, for better and worse.

It’s the end of the world, er, decade. (Pictured: Melancholia, Mad Max: Fury Road, Tree of Life.)

No image from this decade at the movies has felt as radical and reverberant as that of Kirsten Dunst luxuriating — like a Grande Odalisque of annihilation — in the light of the rogue planet set to destroy humanity in Melancholia. Lars von Trier’s 2011 magnum opus is a film about depression, and it’s a film about the end of the world, and more than anything, it’s a profoundly resonant film about how the two can feel indistinguishable from one another. Melancholia is about personal apocalypses so all-consuming that it can be hard to notice when an object from space burns up all the air in the atmosphere on its collision course with Earth. Von Trier knows his destructive impulses — he essentially sabotaged the premiere of this, the best thing he’s ever made, with his behavior at the press conference, which speaks to his close personal knowledge of the impulse to burn down everything around you. But what pushes this particular work into greatness is the tenderness it affords for characters like the one played by Charlotte Gainsbourg, who are invested in the world, and who fight, no matter how fruitlessly, against entropy. It may conjure up a vast emotional void, but it never forgets what it means, and how much it costs, to care. —Alison Willmore

Maybe movies were invented to capture life as we know it in motion. And maybe they were invented to show a man shooting flames out of an electric guitar while heading up a fleet of mutant vehicles riding to battle across a burnt-out desert wasteland. Mad Max: Fury Road is one of the greatest action movies ever made. It’s also a long-in-coming installment of a series that started in a recognizable future and that stretched itself into territory that’s nearly mythological. It’s proof that existing within a franchise doesn’t have to limit imagination or creativity and that a dystopian Mel Gibson revenge saga can end up passing the baton to a formidable warrior-mother figure played by Charlize Theron and blessed with the incredible moniker of Imperator Furiosa. George Miller’s spectacle is a testament to dreaming big — so big that it feels like the only appropriate way to take it in is with a grin on your face and tears pouring out of your eyes. —A.W.

Rebuttal: Many are puzzled why this often exciting but repetitive, formulaic chase movie has been so wildly embraced (“one of the best films of the decade!”) by pretentious film critics. Though a pretentious film critic, I say, “Moi aussi.” —David Edelstein

Stretching from the origins of time to the heat death of the universe, with a lyrical coming-of-age story sandwiched in between, the most ambitious film of Terrence Malick’s career — so ambitious he worked on it for 30-plus years and has now actually made it five times and counting — is a movie whose power matches the scale of its vision. Malick connects the somewhat autobiographical (and symphonically constructed) central story, about a trio of brothers growing up in mid-century Texas with a stern father and an angelic mother, to the metaphysical and astronomical forces at play in creation itself. Where many other filmmakers see good and evil, Malick sees constriction and expansion, tension and movement, retribution and acceptance, aggression and grace. —Bilge Ebiri

The Rider announced the talents of Chloé Zhao, who directs this film with untold grace and fine-tuned simplicity. It’s a beguiling blend of documentary and fiction that follows amateur actors playing fictionalized versions of themselves. The film charts the life of Brady Blackburn (Brady Jandreau provides the bruised performance), a Lakota Sioux on the Pine Ridge Reservation, in the aftermath of suffering a traumatic head injury during a bronco competition that prevents him from riding again. The Rider is a grim meditation on the nature of masculinity and what happens when we lose the ability to do what brings meaning to our lives. —Angelica Jade Bastién

Asghar Farhadi’s divorce drama came out in 2011, and, with apologies to Noah Baumbach, there has been no more finely wrought film about the dissolution of a relationship since. A Separation begins with an upper-middle-class couple from Tehran battling in front of a judge about whether their marriage is over, then builds into so much more — a spellbinding two-household tragedy that encompasses themes of class, faith, generational obligations, and the flight of human capital. That Iran, with its many restrictions on filmmakers, chose A Separation as its foreign-language Oscar submission (it won) speaks to how deftly the film’s political criticisms are woven into its human dramas. They’re so organic to the story as to evade threat of censorship because of how they’re portrayed — simply as the stuff of life. —A.W.

How can I do justice to director Barry Jenkins and co-writer Tarell Alvin McCraney’s evocative 2016 film? Watching Moonlight has brought me to tears, with its piercing turns by Ashton Sanders, Trevante Rhodes, and Mahershala Ali. There’s an overheated beauty to seeing my childhood home of Miami onscreen, shot exquisitely by cinematographer James Laxton, who renders the city in shades of ice, cobalt, and amber. I love the film’s smaller details, too: Chiron’s relationship with the ocean, how Kevin (André Holland) serves Chiron pollo a la plancha when they reconnect. Moonlight is the decade’s trembling, heartfelt coming-of-age story about queerness, blackness, and the ripple effects of addiction. —A.J.B.

An 11-year-old girl who spends her days in a Cincinnati boxing gym is drawn into the world of an all-girl dance squad, and her strained, aggressive reality is transformed into one of freedom, movement, and possibility — and then a mysterious, possibly symbolic illness starts to hit the squad. This is a glorious work of pure cinema: Director Anna Rose Holmer’s expressive use of space and motion conveys psychology and emotion in ways reams of dialogue and conventional “acting” never could. That said, the film’s young star, Royalty Hightower, is also one of the most exciting faces to emerge this decade. —B.E.

Kenneth Lonergan’s masterpiece was shot in the aughts, yanked from its director in the 2010s, and released in a longer (three-plus-hour) cut as a DVD supplement. Go for the longer version. It charts the agonizing journey of a teenage Manhattan girl, Lisa (Anna Paquin, fearless), a blasé relativist early on, to come to terms with the moral arc of the universe (or lack thereof) following a bus accident that cuts a woman in half. Lonergan knows that teenagers see and, more importantly, feel on a different level, and Lisa’s desperate attempts to communicate lead grown-ups to accuse her of overdramatizing. But that’s what teenagers do, says Lonergan, in a world in which people rarely connect or see the world through one another’s eyes. —D.E.

The best superhero movie of the Superhero-Movie Decade wasn’t part of the MCU or the DCU but rather this dazzling animated adventure in which Brooklyn teen Miles Morales becomes the new Spider-Man with a little help from a cavalcade of Spideys across multiple universes, each with his or her own style of animation and even formal and narrative logic. Relentlessly inventive and hilarious while also being enormously powerful, this is the closest cinema has come yet to replicating the aesthetic delirium of comic books. —B.E.

Sean Baker’s The Florida Project centers on rambunctious little kids (principally the astounding Brooklynn Prince) bopping around a transient motel not far from Disney World. Baker captures their crazy elation — playing pranks, mouthing off — but also the gnawing uncertainty of their lives, the grayness of their families’ financial precariousness a counterweight to the eye-popping artificial pinks and purples. Willem Dafoe is unforgettable as the motel manager who can’t fix what most needs fixing. —D.E.

Robert Greene’s uncommonly intimate documentary about the life of Brandy Burre, an actress who got her big break on The Wire but then moved to Beacon, New York, to start a family, is modest in setup — Burre was Greene’s next-door neighbor when he decided to start filming her — but insanely ambitious in execution and effect. In showing all the faces Burre must put on in her life — whether she’s trying to get her next gig, playing mom, or entertaining guests — the film imparts a great truth about the way we all perform our way through our lives. In its full-blooded, compassionate, complex portrait of its subject, it’s the rare documentary that achieves the emotional breadth of a great novel. —B.E.

Running just a little over an hour and consisting primarily of animated, featureless black-and-white stick figures, Don Hertzfeldt’s look at depression, dementia, death, and transcendence builds an insanely beautiful cinematic cathedral out of the simplest ideas. The title is both ironic and sincere: This is the story of an ordinary man dying of a brain disease (well, sort of) but somehow it’s also a life-affirming celebration of the awe-inspiring wonder of existence. To that end, the film includes mundane interactions that sometimes slip into surrealism, and wild experimental passages. An unimportant exchange suddenly reveals deeper realities; odd, throwaway images come back as soul-crushing memories. The utter meaninglessness and forgettable humiliation of an ordinary life is reimagined as a heartbreaking tribute to our common humanity. How can something so small, created by one guy slaving away with a pen and paper for years, be so complex, so indescribably transcendent? You have never, ever, ever seen anything like it. —B.E.

David Mackenzie’s haunting drama (from a witty, layered script by Taylor Sheridan) is the greatest Western of our post-financial-collapse era. It features bank robbers, rangers, cowboys, and Indians, but the time is the present and the West — here, West Texas — is a different place: The frontier that gave birth to symbols of “rugged individualism” is now a home for the collectively dispossessed, with Native Americans and white people who once upon a time took their land in the same sinking boat. Chris Pine and Ben Foster are the brothers who steal from banks that have stolen from others, Jeff Bridges the sardonic lawman on their tail. —D.E.

This is probably, what, the 78th best-of-list blurb you’re reading about Korean director Bong Joon-ho’s masterpiece? Is there anything left to say about it? Maybe just this: In telling the story of a lumpen family who insinuates itself into the lives of a wealthy family, only to discover there’s someone even lower on the food chain, director Bong has not only crafted a metaphor-engorged thriller about capitalist striving and class warfare, he’s also managed to reinvigorate the farce as a tool for artful social criticism. —B.E.

Late in Jonathan Glazer’s icy science-fiction film is a scene in which Scarlett Johansson’s alien lead curiously examines the landscape of her naked body. What could have felt gratuitous, even silly, instead is rendered with care and specificity. Loosely based on the strange novel of the same name by Michel Faber, the film follows Johansson’s alien through Glasgow and the Scottish Highlands as she searches for prey. It’s a striking parable about gender — its elasticity and its horrors. The film boasts a mesmerizing, lucid turn by the actress that ranks as some of the best work of her career. —A.J.B.

Director Park Chan-wook loosely adapts Fingersmith by Sarah Waters, transporting it to Japanese-occupied Korea, creating a culturally sumptuous queer tale brimming with turns of fortune and double crossings. The probing gaze of Park and cinematographer Chung Chung-hoon is rich with hypnotic detail and texture. Coupled with its evocative performances, particularly by Kim Min-hee as the mysterious and yearning Lady Hideko, watching The Handmaiden is like being lulled into submission by an ornate spell. —A.J.B.

Kirsten Johnson, who shot some of the most important documentaries of the last three decades — including films like Laura Poitras’s Citizenfour and The Oath, and Kirby Dick’s Derrida — uses discarded snippets and scenes from those previous efforts (as well as some of her own personal home-movie footage) to put together this marvelous memoir. It’s not that we see her in this footage, however; instead, we see blown takes, tripod adjustments, filmmaker interventions, drifty longueurs, and other bits of cinematic detritus that come together to create a poetically inflected portrait of the consciousness behind the camera. In so doing, Johnson not only gives us a glimpse into the observational, technical, and emotional work that filmmaking requires, she teaches us how to see anew. —B.E.

After a decade of running in place (to great box-office success), Quentin Tarantino found his sweet spot again — a fetishistic collage of Hollywood ’60s bric-a-brac that allows him to examine (or maybe just to live inside) the world that, for better or worse, shaped his fantasies. On paper, it’s reactionary: Two increasingly irrelevant white males from ’50s cowboy TV (Leonardo DiCaprio as the star, Brad Pitt his devoted stunt double/valet) recover their mojo enough to defend themselves against dirty hippie girls (Mansonites) and thereby save a blonde, pregnant movie princess from being butchered. But it’s more wistful pipe dream than manifesto, building to a denouement at once euphoric and heartbreaking. —D.E.

A high-flown title for a film of countless earthly pleasures, chief among them the faces of three fascinating performers: Juliette Binoche as an aging international star, Kristen Stewart as her jittery personal assistant, and Chloë Grace Moretz as the ripening young actress poised to seize the throne. The writer-director Olivier Assayas has a genius for using ephemeral, gossip-magazine ingredients — wealth, fashion, celebrity — as a springboard for that most timeless of themes: the ephemerality of us. There’s so little in the way of histrionics that it’s hard to put your finger on why the film is so terrifically intense. —D.E.

Early in Paul Schrader’s exacting, thrilling drama, Ethan Hawke’s struggling pastor Ernst Toller delivers a line — told to a man he’s counseling who is wrestling with the horror of bringing a child into a world beset by climate change — that stopped me cold: “I talked my son into a war that had no moral justification,” Toller says, referring to his son who died in Iraq, an event that ruptured his former marriage. Perhaps it was the 4:3 frame ratio that made every scene a titch more claustrophobic. Perhaps it was the world-weariness lining Hawke’s face. Perhaps it was Schrader’s cool eye upon him. But what this moment signaled to me is that I was in the hands of a truly striking filmmaker. —A.J.B.

Ozarks 17-year-old Ree Dolly (Jennifer Lawrence in her breakthrough role) embarks on a bloody, nightmarish odyssey to locate her lost father to keep the bank from foreclosing on the house in which she lives with her young siblings in Debra Granik’s harshly beautiful adaptation of Daniel Woodrell’s “Ozark noir.” The film achieves a mythical intensity, building to a midnight boat ride on what might be the River Styx — and to a silent scream that will echo forever in your mind. The performances of John Hawkes (as Ree’s meth-fueled uncle) and Dale Dickey (as a violent but all-too-human matriarch) are beyond praise. —D.E.

Christopher Nolan’s WWII epic, a nesting-doll of ticking-clock narratives built around the British Expeditionary Force’s 1940 evacuation from France, is the most ambitious film of his career to date, and perhaps also the most compassionate. The movie’s three timelines all play out in ways that foreground the subjectivity of the people experiencing them. And as the timelines and stories and characters collide amid the escalating delirium of war, what comes through is a touching narrative about the clarifying power of defeat and failure. —B.E.

Rebuttal: Christopher Nolan has his (often belligerent) enthusiasts, but some of us couldn’t tell one skinny white male from another and find his synchronized-swimming worldview as inane as it is labored. Great opening shot though. —D.E.

Lynne Ramsay’s experimental thriller, featuring Joaquin Phoenix as a mercenary-vigilante (no, really, he’s both) who finds missing people and takes brutal revenge, is an electrifying portrait of absence. In following a character whose great power is his ability to evade and disappear, it explores the psychic scars that propel his need for self-negation. A formally dazzling movie that represented a triumphant return to form for Ramsay after a series of aborted projects and poisonous press. So glad to have her back. —B.E.

Director Jung Byung-gil’s The Villainess opens with one of the most dazzling fight sequences in years, all shot from the point of view of our lead, Sook-hee (Kim Ok-bin). It’s bloody and exhilarating, with the camera swinging and swerving with dancerly grace. The action in this South Korean spectacle is undergirded by an intriguing, moving tale of control, power, and trauma anchored by Kim’s tremendous performance. —A.J.B.

Leos Carax’s ecstatic hallucination is an elegy for filmmaking and evidence of its power, with Denis Lavant as its steadfast avatar, chauffeured around Paris in a limousine to appointments that amount to dropping into different narratives. Their far-reaching variation — a motion-capture love scene, a neorealist family drama, and an intensely mournful encounter with Kylie Minogue — is a testament to the magic and madness of creating miniature worlds for the camera. But it’s the intermission that makes my heart explode with joy, with Lavant on accordion, joined by a bevy of other musicians for a cover of R.L. Burnside’s “Let My Baby Ride.” It’s shot in the Saint-Merri Church, of course, the sacred and profane in one glorious interlude — Trois! Douze! Merde! —A.W.

The thing about The Social Network is that it never really set out to be about the details of how Mark Zuckerberg founded Facebook in the first place. But it ended up being a spiritually accurate portrait anyway, by turning Zuckerberg’s rise into a fable about a young man who wants to be liked, and who, rather than attempt to be likable, builds a social-media empire so people won’t have any choice but to pay attention to him. The script remains the best thing Aaron Sorkin’s written, because it’s one of the few things of his in which the main character is actually supposed to be a dick. And whatever ragged ends of sympathy might have been there on the page get smoothed away by David Fincher’s butter-rich direction, which treats what happens to its subject, played so impeccably by Jesse Eisenberg, as a supervillain origin story. —A.W.

Abderrahmane Sissako’s deeply human fable about a real-life 2012 Islamist takeover in Mali avoids alarmist and exploitative clichés and instead finds terror and tragedy through the lightest of touches. The jihadi invaders are all too human, even goofy at times, which makes their casually monstrous actions that much more startling and horrific. —B.E.

On one level, the title of This Is Not a Film is an extremely dark joke — Jafar Panahi made it with his co-director, Mojtaba Mirtahmasb, while on house arrest, after having been sentenced to a 20-year ban on filmmaking by the Iranian government, and it was smuggled out to its Cannes premiere on a flash drive hidden inside a cake. On another level, it’s an acknowledgment of the amorphous nature and deceptive casualness of the 76-minute feature, which was shot in Panahi’s Tehran apartment building, partially on an iPhone. What starts as a chronicle of loneliness and resilience becomes a testament to its creator’s insatiable curiosity about the friends and strangers who cross his doorway. It’s a funny, tremendously sad work of protest, a reminder that you can ban someone from making films, but you can’t stop him from being a filmmaker. —A.W.

Freddie Quell (Joaquin Phoenix) is a man undone. He’s an erratic and yearning World War II vet struggling to adapt to quotidian life who becomes entangled with the likes of Lancaster Dodd (Philip Seymour Hoffman), the leader of what can only be described as a cult in its early years, and his wife, Peggy (Amy Adams), who wields more power in this close-knit community than it first seems. Writer-director Paul Thomas Anderson creates a work defined by its precision and details — the achingly serene blue of the ocean, light the color of melted gold, alcohol used as both healer and weapon. But what transfixes are the performances. Phoenix stands with crooked, hunched posture, making Freddie look like a living question mark. Hoffman portrays Lancaster Dodd with both ragged egoism, a hulking presence, and the shimmer of self-doubt. —A.J.B.

Four score and seven films — at least — might be contrived from the life of Abraham Lincoln, but Steven Spielberg and screenwriter Tony Kushner home in on a few months in 1865 leading to the vote in the U.S. House of Representatives on the 13th Amendment outlawing slavery. The prism is politics, the fine and coarse art of persuasion, the machine in a democracy through which ideals are translated into legislation and legislation into law. Daniel Day-Lewis speaks in a soft, cracked voice that lulls its listeners with indirection before driving home a lawyerly point. He captures what contemporaries described as Lincoln’s mysterious private sadness. You don’t feel you know Lincoln — few in his time claimed they did. But you feel you know what it was like to be in his presence. The film is based in part on Doris Kearns Goodwin’s book Team of Rivals: The Political Genius of Abraham Lincoln, widely reported to be on Barack Obama’s bedside table after he won the presidency. Can Lincoln be taken as a smack at Republicans or a gentle rebuke to Obama, who lacked the Lincolnesque wiles to entice his rivals to the table? —D.E.

Jennifer Kent’s horror film is best known for its malevolent titular creature whose graphically rendered design immediately grounds itself in your imagination (so much so that it’s become an unexpected queer icon). It follows Amelia (Essie Davis), a widow raising her annoying-as-hell son alone, riddled with exhaustion and increasing unease over the figure of the Babadook she first encounters in a pop-up book. It isn’t just genuinely scary, but a layered treatise on unexpected loss and mental illness. —A.J.B.

A sprawling nocturnal procedural that turns into a metaphysical reverie before making a sharp turn into harsh daytime realism, Turkish director Nuri Bilge Ceylan’s masterpiece is a murder mystery unlike any other. It’s less about guilt and violence and obfuscation — the traditional material of the crime drama — and more about how our mundane lives are constantly influenced by the dreamlike forces of symbol, myth, and romance. —B.E.

There’s nothing supernatural about Christian Petzold’s Phoenix, but it is, nevertheless, a ghost story in which a woman returns from the (presumed) dead to find herself haunting the bombed-out remains of the Berlin life she used to have before the war — before she was sent to a concentration camp, possibly betrayed by someone close to her. Phoenix is a postwar noir, an incredible showcase for star Nina Hoss, and a reworking of Vertigo from the opposing perspective. More than anything else, though, it’s an exploration of trauma so great that its survivor has yet to begin to reckon with it. —A.W.

A triumph of humanist filmmaking. Brie Larson (in her breakout film) is Grace, a counselor at a short-term residential foster facility for at-risk kids, where many of her charges stay for years — and where Grace must confront her own history of abuse. Writer-director Destin Daniel Cretton has a brutally real design: Every seeming breakthrough is followed by a harsh fallback. Lakeith Stanfield is a kid whose raps are howls of rage against his mother, and Kaitlyn Dever is a studiously blasé emo girl — until the demonic rage comes. —D.E.

In Two Days, One Night, written and directed by the Dardenne brothers, Marion Cotillard delivers a raw nerve of a performance as Sandra, a woman returning from mental-health leave to work at a small solar-panel factory only to find her position on the line in precarious circumstances: Management has realized it can force Sandra’s colleagues to cover her shifts, making her redundant. The company offers each stand-in a €1,000 bonus to do so. Sandra must now convince her peers to turn down those bonuses so she can keep her job. Two Days, One Night follows the character as she makes her case and unearths fraught emotions. It’s an austere, working-class drama brimming with genuine feeling and power. —A.J.B.

A dark, delectable comedy involving two distant cousins: the formidable Lady Sarah (Rachel Weisz) and the wily, on-the-make Abigail (Emma Stone), each vying to be the favorite of the ailing Queen Anne (Olivia Colman). There’s so much to marvel at in this film — its sharp script by Deborah Davis and Tony McNamara, the lush costume design by Sandy Powell, Yorgos Lanthimos’s sharp direction that sets aflame even minute moments with intrigue, sexual and otherwise. But I return to The Favourite for its tremendous performances — especially Weisz’s cunning, sultry turn — that work in concert to create a film of piercing magnitude. —A.J.B.

And made for television, really, but shown in enough theaters to qualify for encomiums and awards from film critics — and to make us once again muse on the dwindling distance between the various means of exhibition. Using amazing archival footage and fresh interviews, Ezra Edelman’s 467-minute O.J. Simpson epic pokes and prods, extrapolates and interpolates. We see the fractious world out of which the inhumanly handsome and talented black football star emerged, and the impact of that world on his psyche. The horrible irony lingers — that this man with zero interest in being a symbol for his race became an instrument of black revenge on a police force that had brutalized it for decades. —D.E.

Toward the end of Joshua Oppenheimer’s documentary about the Indonesian genocide, one of its subjects begins retching after returning to a spot where, by his account, he committed many murders as a leader of a Sumatran death squad. It’s one of the most disturbing images I’ve ever seen — as though someone’s body were trying to acknowledge what his mind still refused to. The Act of Killing is an extraordinary experiment, a way of using cinema to test the boundaries of denial and erasure by having two government-sanctioned killers reenact the atrocities they participated in, in increasingly fantastical interpretations. It’s not a record of history so much as it is a document of how history is erased, and how it nevertheless lingers in the memories and in the very forms of those who survived it — and who perpetrated its worst crimes. —A.W.

Madeline (Helena Howard) is a teenage actress encouraged by her theater director (Molly Parker) to blur the lines between the character she’ll be playing onstage and the actual life she leads with her mother (Miranda July). Writer-director Josephine Decker pushes the boundaries of reality and dreams, creation and personhood, through a series of bold aesthetic and narrative choices. POV shots disorient. Scenes are blurred at the edges. It’s a slippery, exasperating, transcendent film that haunts long after seeing it. —A.J.B.

The decade’s greatest bookends are the first and last scenes of Mother, which feature the nameless main character, a widow played by Kim Hye-ja, dancing. The first is a deadpan lark, the second absolutely devastating, and how the film navigates from one to the other is a testament to the electric unpredictability of director Bong Joon-ho’s tonal shifts. Mother has the setup of an unlikely detective story in which a middle-aged woman attempts to clear her son’s name after he’s accused of murder. But what it becomes is so much darker and more profound — a brilliant meditation on the monstrous side of maternal love, a tie forever binding you to someone, no matter how much hurt comes with it. —A.W.

Mike Mills’s melancholy comedy goes down so easily, you can forget how inventive it is: a philosophical, free-form (sometimes madcap) weave of past and present that eases you into the mind of its hero (Ewan McGregor) as he agonizes over his emotional inheritance from a father who has come out of the closet at age 75. That’s well and good for the dad (at the end of life), but Mills’s fictional alter ego has been scarred from growing up in a family of secrets in a culture of façades (presented via archetypal photos): He has no experience bonding for real. A long-overdue Oscar went to Christopher Plummer, who’s light and lithe, buoyed by his new life among the boys. —D.E.

For the past three decades, Keanu Reeves has prevailed as one of our most beguiling modern stars. With 2014’s John Wick, Reeves synthesizes his greatest strengths: unerring cool, an astute understanding of loneliness, and a facility with the ways our bodies communicate the stories we tell ourselves in order to live. The neo-noir-tinged action flick, written by Derek Kolstad, takes a simple premise — an ex-assassin plagued by grief returns to his former life when the sniveling son (Alfie Allen) of a powerful mob boss kills his dog — wringing from it supreme, wholly cinematic pleasures. Surprising performances by Willem Dafoe, Ian McShane, Adrianne Palicki, and Michael Nyqvist. Neon-drenched gun battles. A lightning-bright, fresh mythos. What makes the film rise to the level of one of the best of the decade is how directors Chad Stahelski and David Leitch, former stuntmen and coordinators who met Reeves on The Matrix, understand the beauty and mayhem of the human figure, capturing its contours with an unprecedented clarity. —A.J.B.

Rebuttal: Respectfully, this is as basic as thrillers get: You killed my dog, prepare to die. I will concede that the sequel, John Wick: Chapter 2, was sensational, its carnage so balletic it was almost abstract. (The third part, Parabellum, had some great stuff amid the bloat.) —D.E.

What starts off as an immersive, loving re-creation of the American folk-rock scene in the 1960s becomes, in the Coen brothers’ hands, a kind of anti-Odyssey. Following the travails of a promising but way too abrasive and strident folkie (played by Oscar Isaac, becoming a star before our very eyes) who has too much integrity to sell out, and not enough talent or charisma or luck to break out big, they give us a journey of failure masquerading as triumph. Listen carefully to the background song of the final scene: It’s Bob Dylan, turning into the kind of star our hero will never become. Sad! —B.E.

Blockbusters largely left reality behind in the 2010s in favor of the fantastic and the intensely franchised, and while Ryan Coogler played a prime role in that with Black Panther, it’s his Creed that’s lingered with me as ideal big-screen entertainment, as well as a reminder that earthbound stories can also feel larger than life. With Creed, Coogler didn’t just deftly craft the kind of sports drama that gets a theater full of people cheering in the aisles — he created one that celebrated Rocky while also interrogating its place as a Great White Hope fantasy. Adonis Creed is a fascinatingly complicated underdog for a new millennium, and Michael B. Jordan is a true-blue movie star who’s equally compelling in virtuosic fight scenes, romantic interludes, and tender sequences of mentorship. That the sequel, which Coogler wasn’t directly involved with, was disappointing was all the more in line with the boxing series’ brand. —A.W.

Howard Ratner (Adam Sandler) is a jewelry-store owner, and, more importantly, he’s a gambler, and the Safdie brothers’ latest is an adrenaline-ridden adventure in being addicted to having your whole life riding on your next big bet. Watching it is like taking a joyride in a car with its brakes cut, following Howard as he careens around the city attempting to balance his business, his family, his mistress (the awesome Julia Fox), and his debts, and doing an outstandingly terrible job of it. The Safdie brothers have always had a way with live-wire subjects and intensely New York settings, but Uncut Gems is in its own league, a movie about a man who thrives on chaos that replicates his point of view with a cinematic jolt of sensory overload. —A.W.

Rebuttal: I’d like this movie more if anyone in it behaved like an actual human being. —B.E.

I stumbled onto the independent Irish horror film A Dark Song when it was still streaming on Netflix and was blown away by the arresting simplicity of its staging and visual landscape, along with its lead performance by Catherine Walker. It’s a claustrophobic film about a grieving mother (Walker) who abruptly lost her son and who hires a gruff occultist (Steve Oram) to perform an arcane ritual that would allow her to summon a guardian angel. Tense and riveting, A Dark Song grapples with the nature of grief in a way that terrifies and emotionally bruises in equal measure. —A.J.B.

It may not feel like it at first, but Mia Hansen-Løve’s Eden is a horror story — one about staying too long at the party, about looking up and realizing that everyone around you grew up while you’re still trying to make a go of youthful ambitions. The rub is that Paul (Félix de Givry), the young, and then no longer quite so young, man at its center, isn’t a failure in his pursuit of being a DJ. He’s just not enough of a success to live off it. The highlights of his journey, like the stretch in which he travels to New York to perform for an adoring crowd at MoMA PS1, are intoxicating, and then time slips by and reality rises up unavoidably under his feet like the ground beneath a skydiver. There are a lot of movies about chasing your dreams, and almost none about coming to terms with moving on from them — and Eden is a masterful reflection of the latter. —A.W.

Long story, but this Georgian masterpiece never actually saw the theatrical light of day after premiering at Sundance, garnering wild acclaim and getting picked up by Netflix — who promptly buried it deep in their lineup with little announcement or fanfare or screenings or anything. It’s the story of a middle-aged woman who decides one day to leave her husband and her grown kids and move into an apartment by herself, not for any scandalous reasons but because she wants to be by herself, free of obligations and expectations and all the doublethink that life demands. That’s a simple idea, but the filmmaking here is outrageously beautiful, with every moment ringing achingly true. —B.E.

Martin Scorsese’s Judas Iscariot saga is a threnody for lost grace, a work of self-abnegation set in the gangster milieu where Scorsese normally showboats. Shaped around the 1975 (presumed) killing of Teamsters president Jimmy Hoffa, the film is notable for what it doesn’t have: flashy set pieces, whip pans to carnage, or Rolling Stones songs to pump up the adrenaline. The violence is brusque, flat — un-mythic. The computer de-aging of the characters half works: It doesn’t make you suspend your disbelief, but the knowledge that the stars are old men adds to the poignancy. Robert De Niro and Al Pacino are very fine, but it’s Joe Pesci who anchors the film. Who could imagine the pop-top Pesci as a gangster who seeks to modulate every encounter, accepting that murder is inevitable but, sadly, seeing it as the ultimate failure? —D.E.

Rebuttal: I love The Irishman, but Silence was Scorsese’s true masterpiece this decade. —B.E.

Maren Ade’s chronicle of a father and his semi-estranged adult daughter is an exquisite miracle of tone — a true tragicomedy, a movie about deep familial dysfunction that plays out via a series of escalating dares. The superb Sandra Hüller is the uptight Ines, and Peter Simonischek is her puckish father, Winfried, and after an initial impulsive visit to his child goes wrong, he comes back to try again, in character as the fictional oddball of the title. Toni Erdmann is funny in structure and often terribly pitiful in practice, a story of two people who love each other and are fundamentally unable to communicate, culminating in the world’s most fraught performance of “Greatest Love of All.” —A.W.

Director Pawel Pawlikowski’s austere drama, following a young novice nun in 1960s Poland who uncovers her Jewish roots, is a movie about buried secrets, restricted lives, the return of the repressed — and as such, the eerily still black-and-white photography represents not just a bold visual choice, but an emotional one as well. After making films for years in England, the director announced his return to Poland with this Oscar-winning movie. He followed it up with last year’s almost equally monumental Cold War, confirming his status as one of the great cinematic masters of our time. —B.E.

Comedies always go underrepresented on best-of lists, this one included, because it can feel harder to gauge how they’re going to hold up over time. Will, for instance, a mockumentary about the rise and fall of a pop-rapper named Conner4Real remain as funny in a few years as it was when it came out? Yes. The answer is yes. The Lonely Island’s riff on This Is Spinal Tap remains deliriously good, even as its most specific 2016 details have started to make it a micro period piece. The Macklemore skewering, hoverboards, and home-appliance partnership were, anyway, just the trappings of what is, at heart, an enduring story about the fickleness of celebrity, the enduring bonds of friendship, and Seal getting attacked by wolves during a viral proposal gone wrong. —A.W.

Weirdly, Damien Chazelle’s exuberantly original, medium-budget romantic musical ended up standing in for the white Hollywood Establishment against the outsider indie Moonlight, when any other year it might have been hailed as the closest thing since The Umbrellas of Cherbourg to what might be called a “unified field theory” of music and film. The flow of the camera, the vibrant colors of the set and costumes, the gait of the gorgeous leads (Emma Stone, Ryan Gosling) enhance everything else, so the stylishness seems exponential, if not existential. —D.E.

So close, yet so far. (Pictured: The Tale of the Princess Kaguya.)
Photo: Hatake Jimusho/GNDHDDTK

24 Frames | 45 Years | A Better Man | A Fantastic Woman | A Star Is Born | A Touch of Sin | Ad Astra | After the Storm | Almayer’s Folly | Amazing Grace | American Factory | American Hustle | Amour | An Oversimplification of Her Beauty | Anna Karenina | Annihilation | Anomalisa | Aquarius | Ash Is Purest White | Before Midnight | Beloved | Best of Enemies | Beyond the Lights | Big Eyes | Black Mother | Blackhat | Blind | Blue Caprice | Blue Valentine | Bone Tomahawk | BPM | Brooklyn | Burning | Byzantium | Caesar Must Die | Call Me By Your Name | Captain Phillips | Carlos | Carol | Catfish | Certified Copy | Chronicle | Climax | Coherence | Cold War | Cosmopolis | Dark Horse | Dear White People | Django Unchained | Dogtooth | Edge of Tomorrow | Eighth Grade | Elle | Ex Machina | Extraterrestrial | Faces Places | Fast Five | Felicite | Fire at Sea | First They Killed My Father | Fish Tank | Fog | Force Majeure | Ford vs. Ferrari | Foxcatcher | Frances Ha | Get Out | Goodbye First Love | Graduation | Happy as Lazzaro | Haywire | Heart of a Dog | Heaven Knows What | Hell and Back Again | Her | Hereditary | Hissein Habre, a Chadian Tragedy | Home | Hustlers | I Am Love | I Am Not Your Negro | In the Fade | In the Family | Inception | Inherent Vice | Interstellar | Into the Abyss | It Follows | Jackie | Jafar Panahi’s Taxi | James White | Jauja | Julieta | Kedi | Keep the Lights On | Knight of Cups | Kubo and the Two Strings | Last Train | Let Me In | Leviathan | Life of Pi | Little Men | Logan | Love & Friendship | Love Is Strange | Loveless | Manakamana | Manchester by the Sea | Margin Call | Marriage Story | Martha Marcy May Marlene | Marwencol | Me and You | Meek’s Cutoff | Minding the Gap | Monos | Moonrise Kingdom | Mountains May Depart | Mudbound | Mustang | Nightcrawler | Okja | Oklahoma City | Pain and Gain | Paterson | Phantom Thread | Poetry | Prometheus | Psychohydrography | Raw | Room | Room 237 | Rust and Bone | Saint Laurent | Samsara | Selma | Sembene! 
| Shame | Silence | Skyfall | Snowpiercer | Starless Dreams | Step Up to the Plate | Stories We Tell | Stray Dogs | Support the Girls | Sweetgrass | Take Shelter | Tangerine | The Adventures of Tintin | The Arbor | The Bling Ring | The Cabin in the Woods | The Counselor | The Dark Knight Rises | The Death of Stalin | The Deep Blue Sea | The Diary of a Teenage Girl | The Duke of Burgundy | The Eclipse | The Edge of Seventeen | The Ghost Writer | The Grand Budapest Hotel | The Great Beauty | The Green | The Grey | The Guest | The House | The Hunt | The Immigrant | The Invisible Woman | The Jungle Book | The Keeping Room | The LEGO Movie | The Loneliest Planet | The Lost City of Z | The Martian | The Mend | The Mill and the Cross | The Past | The Perks of Being a Wallflower | The Post | The Queen of Versailles | The Raid: Redemption | The Second Mother | The Souvenir | The Spectacular Now | The Square | The Tale of the Princess Kaguya | The Trip | The Witch | The World’s End | Things to Come | To the Wonder | Tower | Toy Story 3 | Train to Busan | True Grit | Uncle Boonmee Who Can Recall His Past Lives | Vincere | War Horse | Warrior | We Are the Best! | Weekend | Weiner | What We Do in the Shadows | Where Is Kyra? | Whiplash | Wild Grass | Wild Rose | Wild Tales | Wonderstruck | Young Adult | Zero Dark Thirty

1,000 Times Good Night | 10 Cloverfield Lane | 10 Days in a Madhouse | 10 Years | 10,000 Km | 100 Bloody Acres | 102 Not Out | 10x10 | 11-11-11 | 12 O'Clock Boys | 12 Strong | 13 Assassins | 13 Minutes | 13 Sins | 16 Bars | 17 Girls | 180 South | 1898: Los ultimos de Filipinas | 1911 | 1915 | 1917 | 1945 | 1991 | 2 Days in New York | 2 Guns | 2 States | 20 Feet from Stardom | 20 Once Again | 20,000 Days on Earth | 20th Century Women | 21 and Over | 21 Bridges | 21 Jump Street | 23 Blast | 28 Hotel Rooms | 3 1/2 Minutes, 10 Bullets | 3 Days to Kill | 3 Faces | 3 Geezers! | 3 Generations | 3 Hearts | 3 Idiotas | 3 Weeks in Yerevan | 30 Beats | 30 Minutes or Less | 300: Rise of An Empire | 306 Hollywood | 31 | 3100: Run and Become | 311 Enlarged to Show Detail 3 | 35 and Ticking | 350 Days - Legends. Champions. Survivors. | 36 Saints | 360 | 42 | 42nd Street: The Musical | 44 Inch Chest | 47 Meters Down | 47 Meters Down: Uncaged | 47 Ronin | 5 Broken Cameras | 5 Days of War | 5 Flights Up | 5 to 7 | 50 to 1 | 56 Up | 5B | 6 Underground | 7 Boxes | 7 Chinese Brothers | 7 Days in Entebbe | 7 Khoon Maaf | 7 Witches | 71 | 71 Into the Fire | 78/52: Hitchcock's Shower Scene | 85: The Greatest Team in Football History | 8: The Mormon Proposition | 9/11 | 90 Minutes in Heaven | 99 Homes | A Bad Moms Christmas | A Ballerina’s Tale | A Band Called Death | A Beautiful Day in the Neighborhood | A Beautiful Life | A Beautiful Now | A Beautiful Planet | A Better Life | A Bigger Splash | A Birder's Guide to Everything | A Bit of Bad Luck | A Borrowed Identity | A Boy Called Po | A Boy. A Girl. A Dream.
| A Brilliant Young Mind | A Brother's Love | A Cat in Paris | A Chance in the World - Premiere | A Ciambra | A Coffee in Berlin | A Cool Fish | A Cure for Wellness | A Dangerous Method | A Dog's Journey | A Dog's Purpose | A Dog's Way Home | A Faithful Man | A Field in England | A Fierce Green Fire | A Film Unfinished | A Five Star Life | A Gentleman | A Ghost Story | A Girl and a Gun | A Girl Like Grace | A Girl Walks Home Alone at Night | A Glimpse Inside the Mind of Charles Swan III | A Good Day to Die Hard | A Good Old Fashioned Orgy | A Green Story | A Haunted House | A Hidden Life | A Hijacking | A Hologram for the King | A Journey Through Time with Anthony | A Kid Like Jake | A La Mala | A Late Quartet | A LEGO Brickumentary | A Letter to Momo | A Little Bit of Heaven | A Little Chaos | A Little Help | A Long Way Down | A Long Way Off | A Man Called Ove | A Master Builder | A Matter of Faith | A Melody to Remember | A Million Little Pieces | A Million Ways to Die in the West | A Monster Calls | A Monster with a Thousand Heads | A Most Violent Year | A Most Wanted Man | A Nightmare on Elm Street | A Pigeon Sat on a Branch Reflecting on Existence | A Place at the Table | A Private War | A Prophet | A Question Of Faith | A Quiet Passion | A Quiet Place | A Reason | A Resurrection | A Royal Affair | A Royal Night Out | A Silent Voice | A Simple Life | A Single Shot | A Street Cat Named Bob | A Summer's Tale | A Tale of Love and Darkness | A Taxi Driver | A Teacher | A Thousand Words | A Tuba to Cuba | A United Kingdom | A Very Harold & Kumar 3D Christmas | A Walk Among the Tombstones | A Walk in the Woods | A War | A Werewolf Boy | A Wizard's Tale | A Woman's Life | A Woman, a Gun and a Noodle Shop | A Wrinkle in Time | A.C.O.D. | A.X.L.
| Aarakshan | Abacus: Small Enough to Jail | ABCD (Any Body Can Dance) | ABCD 2 | ABCs of Death 2 | Abduction | Abominable | About Elly | Above and Beyond | Above and Beyond: NASA's Journey to Tomorrow | Abracadabra | Abraham Lincoln: Vampire Hunter | Absolutely Anything | Absolutely Fabulous: The Movie | Abuse of Weakness | Act of Valor | Action Jackson | Action Point | Adderall Diaries | Addicted | Addiction Incorporated | Adore | Adrift | Adult Beginners | Adult World | Advanced Style | Ae Dil Hai Mushkil | Aferim! | Afflicted | African Cats | After | After Auschwitz | After Earth | After Love | After The Ball | After the Wedding | After Tiller | After.Life | Afterimage | Afternoon Delight | Afternoon of a Faun | Aftershock | Against the Sun | Age of Uprising: The Legend of Michael Kohlhaas | Agent Mr. Chan | Agent Vinod | Agneepath | Agora | Ai Weiwei: Never Sorry | Ai Weiwei: The Fake Case | Aida's Secrets | Ain't In It For My Health: A Film About Levon Helm | Ain't Them Bodies Saints | Air Racers 3D | Airpocalypse | Ajami | Alabama Moon | Aladdin | Alan Partridge: The Movie | Albert Nobbs | Alex Cross | Alexander and the Terrible, Horrible, No Good, Very Bad Day | Alice in Wonderland | Alice Through the Looking Glass | Alien Abduction | Alien Intrusion: Unmasking a Deception | Alita: Battle Angel | Alive and Kicking | Alive Inside | All About Nina | All Eyez on Me | All Good Things | All I See is You | All is Bright | All Is Lost | All Is True | All Saints | All the Money in the World | All These Sleepless Nights | All Things Must Pass | All Together | All's Faire in Love | All's Well, End's Well | Allegiance To Broadway | Allied | Almost Christmas | Almost Friends | Almost Holy | Almost Human | Aloft | Aloha | Alone in Berlin | Alone Yet Not Alone | Along with the Gods: The Last 49 Days | Along with the Gods: The Two Worlds | Alpha | Alpha and Omega | Already Tomorrow in Hong Kong | Altered Perception | Alvin and the Chipmunks: Chipwrecked | Alvin and the
Chipmunks: The Road Chip | Always at the Carlyle | Always Kabhi Kabhi | Always Miss You | Always Shine | Amanda & Jack Go Glamping | America | American Animals | American Assassin | American Chaos | American Dharma | American Dream: Detroit | American Dresser | American Honey | American Made | American Made Movie | American Pastoral | American Promise | American Reunion | American Satan | American Sniper | American Ultra | American Woman | American: The Bill Hicks Story | Americons | AmeriGeddon | Amigo | Amira & Sam | Amityville: The Awakening | Amy | An Acceptable Loss | An Actor Prepares | An Evening with Beverly Luff Linn | An Honest Liar | An Inconvenient Sequel: Truth to Power | An Interview with God | Anchorman 2: The Legend Continues | And So It Goes | And They're Off | And While We Were Here | Andhadhun | Andy Irons: Kissed by God | Anesthesia | Angel has Fallen | Aniara | Animal Kingdom | Animals | Anita | Anjaana Anjaani | Anna | Anna | Anna and the Apocalypse | Annabelle | Annabelle Comes Home | Annabelle: Creation | Annie | Anohana The Movie: The Flower We Saw That Day | Anonymous | Anonymous | Another Earth | Another WolfCop | Another Year | Answers to Nothing | Antarctica: A Year on Ice | Antarctica: Ice & Sky | Anthropocene: The Human Epoch | Anthropoid | Antonio Lopez 1970: Sex Fashion & Disco | Any Day Now | Anything | Apollo 11 | Apollo 18 | Apparition Hill | Approaching Midnight | Approaching the Unknown | Appropriate Behavior | April and the Extraordinary World | Aquaman | Aquarela | Araby | Arbitrage | Architects of Denial | Arctic | Ardor | Area 51 | Argento's Dracula 3D | Arjun: The Warrior Prince | Armed | Armstrong | Arrival | Art and Craft | Art of the Steal | Arthur Christmas | Arthur Newman | As Above/So Below | As I Open My Eyes | Asbury Park: Riot, Redemption, Rock & Roll | Ashby | Ask Dr.
Ruth | Assassin's Creed | Assassination | Assassination Nation | Assaulted: Civil Rights Under Fire | Asterix: The Secret of the Magic Potion | Asura: The City of Madness | At Any Price | At Berkeley | At First Light | At Middleton | Atomic Blonde | Attack on Titan: Part 1 | Attack on Titan: Part 2 | Attack the Block | August: Osage County | Augustine | Austenland | Author: The JT LeRoy Story | Awake: The Life of Yogananda | Baaghi | Baaghi 2 | Baahubali 2: The Conclusion | Baahubali: The Beginning | Baar Baar Dekho | Babies | Baby Driver | Bachelorette | Back in Time | Back to 1942 | Back to Burgundy | Back to the Future Da | Back to the Jurassic | Backstreet Boys: Show 'Em What You're Made Of | Backwards | Bad Blood the Hunger | Bad Grandmas | Bad Lucky Goat | Bad Match | Bad Milo! | Bad Mom | Bad Reputation | Bad Samaritan | Bad Santa 2 | Bad Times At The El Royale | Bad Words | Badla | Badlapur | Badrinath Ki Dulhania | Bag of Marbles | Baggage Claim | Bajatey Raho | Bajirao Mastani | Bajrangi Bhaijaan | Ballerina | Ballet 422 | Balloon | Ballplayer: Pelotero | Balls to the Wall | Band Aid | Band Baaja Baaraat | Band of Robbers | Bang Bang | Bang! The Bert Berns Story | Bangistan | Bangkok Revenge | Banjo | Barbara | Barbershop: The Next Cut | Barefoot | Barely Lethal | Barfi! 
| Barney's Version | Batkid Begins | Batla House | Batman v Superman: Dawn of Justice | Batman: The Killing Joke | Batti Gul Meter Chalu | Battle of Jangsari | Battle of Memories | Battle of the Brides | Battle of the Sexes | Battle of the Year | Battle: Los Angeles | Battlefield America | Baywatch | Be Here Now: The Andy Whitfield Story | Be Natural: The Untold Story of Alice Guy-Blaché | Be Somebody | Beach Rats | Bears | Beast | Beastly | Beasts of No Nation | Beasts of the Southern Wild | Beatriz At Dinner | Beats, Rhymes & Life: The Travels of a Tribe Called Quest | Beautiful Accident | Beautiful Creatures | Beautifully Broken | Beauty and the Beast | Beauty and the Dogs | Because of Gracia | Becoming Astrid | Becoming Traviata | Befikre | Before I Disappear | Before I Fall | Before I Go To Sleep | Before We Go | Before We Vanish | Before You Know It | Begin Again | Beginning of the Great Revival | Beijing Love Story | Being 17 | Being Charlie | Being Elmo: A Puppeteer's Journey | Being Flynn | Being Frank | Beirut | Bel Ami | Bel Canto | Believe | Believe | Believe Me | Believer | Belle | Belle and Sebastian, Friends for Life | Bellflower | Beloved Sisters | Ben is Back | Ben-Hur | Beneath the Harvest Sky | Bennett's War | Berberian Sound Studio | Bereavement | Berlin Syndrome | Bernie | Bert Stern: Original Mad Man | Besharam | Best F(r)iends Movie | Best F(r)iends Volume Two | Best Worst Thing That Ever Could Have Happened | Beta Test | Bethany Hamilton: Unstoppable | Bethlehem | Better Days | Better Living Through Chemistry | Better Than Something: Jay Reatard | Better Watch Out | Bettie Page Reveals All | Between Me and My Mind | Between the Lines | Beuys

At Eternity’s Gate | Beautiful Boy | Black Mass | BlacKkKlansman | Blaze | Bohemian Rhapsody | Cesar Chavez | Churchill | Colette | First Man | Florence Foster Jenkins | Gainsbourg: A Heroic Life | Genius | Get On Up | Hacksaw Ridge | Hitchcock | I, Tonya | J. Edgar | Joy | Judy | Legend | Lizzie | Love & Mercy | Loving Pablo | Miles Ahead | Million Dollar Arm | Molly’s Game | Moneyball | Mr. Turner | Nowhere Boy | On the Basis of Sex | Richard Jewell | Rocketman | Straight Outta Compton | The Fighter | The Imitation Game | The Iron Lady | The Runaways | The Theory of Everything | The Wolf of Wall Street | Tolkien | Vice

Beware of Mr. Baker | Beyond the Black Rainbow | Beyond the Hills | Beyond the Mask | Beyond the Reach | Bhaag Milkha Bhaag | Bharat | Bharat Ane Nenu | Bhavesh Joshi Superhero | Bhutto | Bicycling with Moliere | Bidder 70 | Big Bad Wolves | Big Brother | Big Game | Big Hero 6 | Big Miracle | Big Mommas: Like Father, Like Son | Big Sonia | Big Star: Nothing Can Hurt Me | Big Stone Gap | Big Sur | Bigger | Biggest Little Farm | Bilal: A New Breed of Hero | Bill Cunningham New York | Bill Nye: Science Guy | Bill W. | Billy Lynn's Long Halftime Walk | Bird People | Birdboy: The Forgotten Children | Birds of Passage | Birth of the Dragon | Birth of the Living Dead | Bisbee '17 | Bitter Harvest | Biutiful | Bjork - Biophilia Live | Black '47 | Black and Blue | Black Butler: Book of the Atlantic | Black Christmas | Black Death | Black Nativity | Black or White | Black Out | Black Panthers: Vanguard of the Revolution | Black Sea | Black Souls | Black Swan | Blackbird | Blackfish | Blackthorn | Blackway | Blade of the Immortal | Blade Runner 2049 | Blair Witch | Blancanieves | Bleed for This | Blended | Bless Me Ultima | Blinded By the Light | Blindspotting | Blink of an Eye | Blockers | Blood Done Sign My Name | Blood Fest | Blood Ties | Bloodworth | Blue Exorcist The Movie | Blue Is the Warmest Color | Blue Like Jazz | Blue Ruin | Bluebeard | Bodied | Body at Brighton Rock | Bodyguard | Bol | Bol Bachchan | Bolshoi Ballet: Hero of our Time | Bomb City | Bombay Velvet | Bombshell | Bombshell: The Hedy Lamarr Story | Boo! A Madea Halloween | Book Club | Booksmart | Boom for Real: The Late Teenage Years of Jean-Michel Basquiat | Border | Borg vs.
McEnroe | Borgman | Born in China | Born to be Blue | Born to Be Wild | Boruto: Naruto the Movie | Boston: An American Running Story | Boulevard | Boundaries | Boxing Gym | Boy | Boy and the World | Boy Erased | Boy Meets Girl | Boyhood | Brad's Status | Bran Nue Dae | Branded | Brave | Brave New Jersey | Break Ke Baad | Breaking In (2018) | Breaking Upwards | Breakthrough | Breakup Buddies | Breath | Breathe | Breathe | Breathe In | Brian Banks | Brick Mansions | Bricked | Bride Flight | Bridesmaids | Bridge of Spies | Bright Days Ahead | Bright Ones | Brightburn | Brighton Rock | Brigsby Bear | Brimstone and Glory | Bring the Soul: The Movie | Bringing Up Bobby | Britt-Marie Was Here | Brittany Runs a Marathon | Broadway Idiot | Broken Circle Breakdown | Broken City | Bronx Gothic | Brooklyn Castle | Brooklyn's Finest | Brother Nature | Brotherly Love | Brothers: Blood Against Blood | BTS World Tour: Love Yourself in Seoul | Buck | Bucky Larson: Born to Be a Star | Buddies in India | Buddy | Budrus | Buen Dia, Ramon | Buena Vista Social Club: Adios | Bullet to the Head | Bullet Vanishes | Bullhead | Bullitt County | Bully | Bumblebee | Burden | Buried | Burlesque | Burn | Burn the Stage: The Movie | Burning Bodhi | Burnt | Busco Novio Para Mi Mujer | Buster's Mal Heart | But Always | But Deliver Us from Evil | Butter | Buttons: A New Musical Film | Buybust | Buzzard | Buñuel in the Labyrinth of the Turtles | By the Grace of God | By the Sea | Bye Bye Germany | C'est Si Bon | C.O.G. | Ca$h | Caesar & Otto's Summer Camp Massacre | Cafe Society | Caged No More | Cairo Time | Cake | California Typewriter | Calvary | Camille Claudel 1915 | Camp | Camp X-Ray | Can You Dig This | Can You Ever Forgive Me? 
| Can't Stand Losing You: Surviving the Police | Canal Street | Canopy | Cantinflas | Capernaum | Capital | Captain Fantastic | Captain Underpants: The First Epic Movie | Captive | Captive State | Capture the Flag | Carancho | Carmine Street Guitars | Carnage | Carrie | Carrie Pilby | Cars 2 | Cars 3 | Cartas a Elena | Cartel Land | Carter High | Casa De Mi Padre | Case 39 | Casino Jack | Casino Jack and the United States of Money | Cassandro, the Exotico! | Cat Run | Catching the Sun | Cats | Cats & Dogs: The Revenge of Kitty Galore | Cave of Forgotten Dreams | Cavemen | CBGB | Cedar Rapids | Celeste and Jesse Forever | Cemetery of Splendor | Censored Voices | Central Intelligence | Centurion | Ceremony | Certain Women | Cezanne et moi | Chaar Sahibzaade: Rise of Band Singh Bahadur | Chain Letter | Chakravyuh | Chalo Dilli | Champion | Champion | Chance Pe Dance | Chappaquiddick | Chappie | Chapter & Verse | Charlie Countryman | Charlie Says | Charlie St. Cloud | Charlie's Angels | Charlie's Country | Chasing Einstein | Chasing Ice | Chasing Madoff | Chasing Mavericks | Chasing the Blues | Chasing the Dragon | Chasing the Dragon 2: Wild Wild Bunch | Chasing Trane: The John Coltrane Documentary | Chavela | Cheap Thrills | Cheatin' | Chef | Chef Flynn | Chely Wright: Wish Me Away | Chennai Express | Chernobyl Diaries | Chevalier | Chhichhore | Chi-Raq | Chicken with Plums | Chico & Rita | Child 44 | Child's Play | Child's Pose | Chillar Party | Chimpanzee | China Heavyweight | Chinese Puzzle | CHiPs | Chloe | Chocolate City | Chonda Pierce: Enough | Chonda Pierce: Unashamed | Chongqing Hot Pot | Chris Brown: Welcome To My Life | Christian Mingle | Christine | Christmas Eve | Christmas Jars | Chronic | Chuck | Cinco De Mayo: La Batalla | Cinderella | Circo | Circumstance | Cirque Du Soleil: Worlds Away | Citadel | Citizen Jane | Citizen Koch | Citizenfour | City Island | City of Ghosts | City of Gold | City of Life and Death | City of Rock | Claire's Camera | Clash of
the Titans | Class Rank | Clemency | Client 9: The Rise and Fall of Eliot Spitzer | Cliffs of Freedom | Clinton Road | Closed Circuit | Closed Curtain | Closet Monster | Cloud Atlas | Clown | Club Life | Cock and Bull | Cocktail | Coco | Coco Chanel & Igor Stravinsky | Code Black | Code Geass: Lelouch of the Resurrection | Cold Blood | Cold Case Hammarskjöld | Cold Comes the Night | Cold in July | Cold Pursuit | Cold War 2 | Cold Weather | Collide | Colliding Dreams | Colombiana | Colonia | Colossal | Columbus | Combat Obscura | Come Back to Me | Come Out and Play | Come What May | Comet | Comic-Con Episode IV: A Fan's Hope | Coming Home | Coming Through the Rye | Command and Control | Commitment | Compadres | Complete Unknown | Compliance | Computer Chess | Conan O'Brien Can't Stop | Conan the Barbarian | Concussion | Concussion | Condorito: La Pelicula | Confidential Assignment | Connected | Contagion | Contemporary Color | Contraband | Conviction | Cook County | Cool It | Cooties | Cop Car | Cop Out | Copperhead | Coriolanus | Countdown | Countdown to Zero | Country Strong | Courageous | Cowboys & Aliens | Cowgirls n' Angels | Cracks | Crawl | Crazy Horse | Crazy on the Outside | Crazy Wisdom | Creation | Creative Control | Creature | Creed II | Crime After Crime | Criminal | Crimson Peak | Crooked Arrows | Cropsey | Crown Heights | Crystal Fairy | Cuban Fury | Cunningham | Custody | Cutie and the Boxer | Cynthia | Cyrus | Dabangg | Dabangg 2 | Dallas Buyers Club | Damsel | Damsels in Distress | Dancer | Dancing Across Borders | Dancing in Jaff | Dangal | Dangerous Liaisons | Danny Collins | Danny Says | Dark Horse | Dark Money | Dark Phoenix | Dark Places | Dark Shadows | Dark Skies | Dark Star: H.R.
Giger's World | Dark Waters | Darkest Hour | Darling Companion | Dave Made a Maze | David and Goliath | David Crosby: Remember My Name | David Lynch: The Art Life | Dawn of the Planet of the Apes | Dawson City: Frozen Time | Daybreakers | De De Pyaar De | De Mai Tinh | De Palma | Dead Awake | Dead Man Down | Dead Snow 2: Red vs. Dead | Deadfall | Deadly Renovations | Deadpool | Deadpool 2 | Deadtime | Dealt | Dean | Dear John | Dear Mr. Watterson | Dear Zindagi | Death at a Funeral | Death House | Death of a Nation | Death Wish | Deceptive Practice: The Mysteries and Mentors of Ricky Jay | Declaration of War | Decoding Annie Parker | Deep Gold | Deepwater Horizon | Default | Defendor | Deitrick Haddon's A Beautiful Soul | Delhi Belly | Deli Man | Deliver Us From Evil | Delivery Man | Demolition | Demon | Den of Thieves | Denial | Desert Dancer | Desert Flower | Desi Boyz | Desierto | Desolation Center | Despicable Me 2 | Despicable Me 3 | Destroyer | Detachment | Detective Byomkesh Bakshy | Detective Chinatown | Detective Chinatown 2 | Detective Dee and the Mystery of the Phantom Flame | Detective Dee: The Four Heavenly Kings | Detective K: Secret of the Living Dead | Detective K: Secret of the Lost Island | Detour | Detroit | Detropia | Devil | Devil and Angel | Devil's Due | Dheepan | Dhobi Ghat | Dhoom 3 | Diamantino | Diana | Diana Ross: Her Life, Love And Legacy | Diana Vreeland: The Eye Has to Travel | Diane | Diary of a Chambermaid | Diary of a Wimpy Kid | Diary of a Wimpy Kid: Dog Days | Diary of a Wimpy Kid: Rodrick Rules | Diary of a Wimpy Kid: The Long Haul | Dictator | Difret | Digging for Fire | Dil Dhadakne Do | Dilwale | Dina | Dinner for Schmucks | Dior and I | Diplomacy | Dirty Girl | Dirty Grandpa | Dirty Wars | Disconnect | Dishkiyaoon | Dishoom | Disney's Christopher Robin | Disobedience | Disorder | District B13: Ultimatum | Divergent | Divide and Conquer: The Story of Roger Ailes | Django | Do I Sound Gay? | Do Not Resist | Do You Believe? 
| Doctor Sleep | Doctor Who: Logopolis | Dog Days | Dogman | Dolemite Is My Name | Dolores | Dolphin Tale | Dolphin Tale 2 | Dom Hemingway | Don 2 | Don Jon | Don McKay | Don Verdean

Don't Be Afraid of the Dark | Don't Blink - Robert Frank | Don't Breathe | Don't Let Go | Don't Stop Believin': Everyman's Journey | Don't Think I've Forgotten: Cambodia's Lost Rock and Roll | Don't Think Twice | Don't Worry He Won't Get Far on Foot | Donald Cried | Dongju: The Portrait of a Poet | Dope | Dora and the Lost City of Gold | Dorfman in Love | Double Dhamaal | Double Lover | Double Trouble | Douchebag | Dough | Down for Life | Downsizing | Downton Abbey | Dr. Cabbie | Dr. Seuss' The Grinch | Dr. Seuss' The Lorax | Dracula Untold | Dragon | Dragon Ball Super: Broly | Dragon Ball Z: Battle of Gods | Dragon Ball Z: Resurrection 'F' | Dragon Blade | Dream House | Dreams Rewired | Dredd | Drinking Buddies | Drive | Drive Angry | Drug War | Drunk Stoned Brilliant Dead: The Story of the National Lampoon | Drunk Wedding | Duckweed | Due Date | Dum Maaro Dum | Dumb and Dumber To | Dumbo | Dylan Dog: Dead of Night | Eames: The Architect and the Painter | Early Man | Earth to Echo | Earth: One Amazing Day | Easy Money | Eat That Question: Frank Zappa in His Own Words | Eating Animals | Eating You Alive | ECCO | Echo in the Canyon | Eddie the Eagle | Eddie the Sleepwalking Cannibal | Edge of Darkness | Edie | Effie Gray | Eisenstein in Guanajuato | Ek Ladki Ko Dekha Toh Aisa Laga | Ek Main Aur Ekk Tu | Ek Tha Tiger | Ek Thi Daayan | Ek Villain | El Angel | El Bulli: Cooking in Progress | El Chicano | El Clan | El Coyote | El Jeremias | El Pacto | Elaine Stritch: Shoot Me | Elektra Luxx | Elena | Eli | Elite Squad: The Enemy Within | Elizabeth Blue | Elles | Elliot: The Littlest Reindeer | Elsa & Fred | Elstree 1976 | Elvis & Nixon | Elysium | Embrace of the Serpent | Embrace: The Documentary | Emperor | En el Septimo Dia | End of the Century | End of Watch | Endless Love | Endless Poetry | Enemies of the People | Enemy | English Vinglish | Enough Said | Entertainment | Entourage | Epic | Equals | Equity | Ernest & Celestine | Escape Fire | Escape From Planet Earth 
| Escape from Tomorrow | Escape Plan | Escape Room | Escapes | Escobar: Paradise Lost | Europa Report | Eva | Eva Hesse | Evangelion 2.0: You Can (Not) Advance | Evangelion: 3.0 You Can (Not) Redo | Even the Rain | Everest | Every Day | Every Day | Every Secret Thing | Everybody Knows | Everybody Loves Somebody | Everybody Wants Some!! | Everybody's Everything | Everything Must Go | Everything, Everything | Evidence of a Haunting | Evil Dead | Evocateur: The Morton Downey Jr. Movie | Evolution | Ex Libris: The New York Public Library | Ex-File 3 | Exit | Exodus: Gods and Kings | Expedition to the End of the World | Expelled from Paradise | Experimenter | Explosion | Exporting Raymond | Extraction | Extraordinary Measures | Extraordinary Mission | Extreme Job | Extremely Loud & Incredibly Close | Eye in the Sky | Fabricated City | Fading Gigolo | Fagara | Fahrenheit 11/9 | Fair Game | Fairy Tail: Dragon Cry | Faith of Our Fathers | Faith, Hope & Love | Falcon Rising | Fall in Love Like a Star | Family | Fan | Fantastic Beasts and Where To Find Them | Fantastic Beasts: The Crimes of Grindelwald | Fantastic Four | Fantastic Fungi | Far from the Madding Crowd | Far from the Tree | Far Out Isn't Far Enough: The Tomi Ungerer Story | Farewell My Queen | Farmageddon | Fast & Furious 6 | Fast & Furious Presents: Hobbs & Shaw | Fast Color | Faster | Fate/Stay Night: Heaven's Feel - I. Presage Flower | Fate/Stay Night: Heaven's Feel - II. 
Lost Butterfly | Father Figures | Father of My Children | Fatima | Faust | Fed Up | Feed the Fish | Felix and Meira | Fences | Feng Shui | Ferdinand | Ferrari Ki Sawaari | Fetih 1453: The Conquest of Constantinople | Fiddler: A Miracle of Miracles | Fifty Shades Darker | Fifty Shades Freed | Fifty Shades of Black | Fifty Shades of Grey | Fighting with My Family | Fill the Void | Filly Brown | Film Stars Don't Die in Liverpool | Filmistaan | Filmworker | Filth | Filth to Ashes, Flesh to Dust | Final Destination 5 | Final Portrait | Final: The Rapture | Finders Keepers | Finding Dory | Finding Fanny | Finding Fela | Finding Joe | Finding Oscar | Finding Vivian Maier | Finding Your Feet | Fire in the Blood | Fireflies in the Garden | Fireworks | Fireworks Wednesday | First Love | First Position | First We Take Brooklyn | Fist Fight | Fists of Legend | Fitoor | Five Feet Apart | Five Nights in Maine | Five Seasons: The Gardens of Piet Oudolf | Flamenco, Flamenco | Flatliners | Flipped | Flower | Flowers | Flying Monsters | Flying Swords of Dragon Gate | Focus | Follow Me: The Yoni Netanyahu Story | Followers | Footloose | Footnote | For a Few Bullets | For a Woman | For Ahkeem | For Greater Glory | For No Good Reason | For the Love of Spock | Forever My Girl | Forever Young | Forgiveness of Blood | Forks Over Knives | Formosa Betrayed | Fort McCoy | Four Lions | Foxtrot | Fragments of Truth | Framing John DeLorean | Francofonia | Frank | Frank and Lola | Frank Miller's Sin City: A Dame to Kill For | Frank Serpico | Frankenweenie | Frankie | Frankie & Alice | Frantz | Freak Show | Freakonomics | Freaks | Freaks of Nature | Free Angela and All Political Prisoners | Free Birds | Free Fire | Free Men | Free Solo | Free State of Jones | Free the Mind | Free Trip to Egypt | Freeheld | Freetown | Friend Request | Friends and Romans | Friends with Kids | Fright Night | From Beneath | From Paris with Love | From Prada to Nada | From the Land of the Moon | From Up on Poppy Hill 
| Frontera | Frozen | Frozen | Frozen II | Fruitvale Station | Fukrey | Fullmetal Alchemist: The Sacred Star of Milos | Fun Size | Funan | Furie | Furious | Furious 7 | Furry Vengeance | Fury | Futuro Beach | G.I. Joe: Retaliation | Gabbar is Back | Gabo: The Creation of Gabriel Garcia Marquez | Game | Game Day | Game Night | Gang of Ghosts | Gangster's Paradise: Jerusalema | Gauguin: Voyage to Tahiti | Gemini | Gemini Man | Gemma Bovery | General Education | General Magic | Generation Found | Generation Iron | Generation War | Generation Wealth | Genesis: Paradise Lost | Genius Within: The Inner Life of Glenn Gould | Gentleman | George A. Romero's Survival of the Dead | George Takei's Allegiance | George Takei's Allegiance | Geostorm | Gerhard Richter Painting | Get Hard | Get Him to the Greek | Get Low | Getaway | Gett: The Trial of Viviane Amsalem | Getting Grace | Ghanchakkar | GhettoPhysics: Will the Real Pimps and Hos Please Stand Up? | Ghost Fleet | Ghost in the Shell | Ghost in the Shell: The New Movie | Ghost Rider: Spirit of Vengeance | Ghost Stories | Ghost Team | Ghost Team One | Ghostbusters | Giant Little Ones | Gift | Gifted | Gimme Danger | Gimme Shelter | Gimme the Loot | Ginger & Rosa | Girl Asleep | Girl in Progress | Girl Model | Girl Most Likely | Girl on a Bicycle | Girl Rising | Girlfriend Boyfriend | Girlhood | Girls of the Sun | Girls Trip | Girls vs Gangsters | Give Me Liberty | Gladiators of Rome | Gleason | Glen Campbell... I'll Be Me | Gloria | Gloria Bell | GMO OMG | Gnomeo and Juliet | Go Away Mr. 
Tumor | Go For It | Go For Sisters | Go Goa Gone | God Bless America | God Bless the Broken Road | God Help the Girl | God Knows Where I Am | God Loves Uganda | God of Vampires | God of War | God the Father | God's Not Dead | God's Not Dead 2 | God's Not Dead: A Light in Darkness | God's Own Country | God's Pocket | Godard Mon Amour | Gods of Egypt | GODSPEED The Race Across America | Godzilla | Godzilla: King of the Monsters | Godzilla: The Japanese Original | Going Attractions: The Definitive Story of the American Drive-in Movie | Going Attractions: The Definitive Story of the Movie Palace | Going in Style | Going the Distance | Gold | Goldbuster | Golden Exits | Golden Job | Golden Slumber | Goldstone | Golmaal 3 | Golmaal Again | Gone | Gone Doggy Gone | Gone Girl | Gonjiam: Haunted Asylum | Good Boys | Good Kill | Good Manners | Good Ol' Freda | Good Time | Goodbye Christopher Robin | Goodbye Mr. Loser | Goodbye to Language | Goodnight Mommy | Gook | Goon | Goosebumps | Goosebumps 2: Haunted Halloween | Gore Vidal: United States of Amnesia | Gori Tere Pyaar Mein | Gosnell: The Trial of America's Biggest Serial Killer | Grace Jones:Bloodlight and Bami | Grace Unplugged | Graceland | Grand Masti | Grand Piano | Grandma | Grandmaster | Granito: How to Nail a Dictator | Gravity | Gray Matter | Great Directors | Great Expectations | Greater | Greedy Lying Bastards | Green Lantern | Green Room | Green Zone | Greenberg | Greener Grass | Greta | Gridiron Heroes | Gringo| Growing Up Smith| Grown Ups | Grudge Match | Gueros | Gulliver's Travels | Gully Boy | Gun Hill Road | Gurukulam | Guy and Madeline on a Park Bench | Guzaarish | Hagazussa | Haider | Haikara-San: Here Comes Miss Modern | Hail Satan? | Hail, Caesar! 
| Hal | Hale County This Morning, This Evening | Half of a Yellow Sun | Hall Pass | Halloween | Halston | Hamari Adhuri Kahani | Hampstead | Hands of Stone | Handsome Harry | Hanna | Hannah Arendt | Hansel and Gretel: Witch Hunters | Happy Bhaag Jayegi | Happy Christmas | Happy Death Day | Happy Death Day 2U | Happy End | Happy Ending | Happy Feet Two | Happy People: A Year in the Taiga | Happy Phirr Bhag Jayegi | Happy Tears | Happy Valley | Happy, Happy | HappyThankYouMorePlease | Hara-Kiri: Death of a Samurai | Hardcore Henry | Hardflip | Hare Krishna! The Mantra, the Movement and the Swami Who Started it All | Harlan: In the Shadow of Jew Suss | Harmonium | Harold and Lillian: A Hollywood Love Story | Harriet | Harry & Snowman | Harry Benson: Shoot First | Harry Brown | Harry Dean Stanton: Partly Fiction | Harry Potter and the Deathly Hallows Part 1 | Harry Potter and the Deathly Hallows Part 2 | Harvest | Hasee Toh Phasee | Hatchet 2 | Hateship Loveship | Hating Breitbart | Haute Cuisine | Hava Nagila | Have a Nice Day | Hayride 2 | Hazlo Como Hombre | He Matado a mi Marido! | He Named Me Malala | Head Full of Honey | Head Games | Headhunters | Heading Home: The Tale of Team Israel | Heartbeats | Heartbreaker | Hearts Beat Loud | Heaven is for Real | Heavy Trip | Heavy Water | Hecho En Mexico | Hector And The Search For Happiness | Heist | Helicopter Eela | Hell and Back | Hell Baby | Hell Fest | Hellbound? | Hellboy | Hellion | Hello Herman | Hello I Must Be Going | Hello, My Name is Doris | Hemingway's Garden of Eden | Heneral Luna | Henry's Crime | Her Smell

About Last Night | About Time | Admission | Always Be My Maybe | Bridget Jones’s Baby | Crazy Rich Asians | Crazy, Stupid, Love | Easy A | Isn’t It Romantic | Juliet, Naked | Just Go With It | Just Wright | Larry Crowne | Last Christmas | Leap Year | Love, Simon | Mamma Mia! Here We Go Again | Midnight in Paris | New Year’s Eve | Nobody’s Fool | Obvious Child | Plus One | Ruby Sparks | Set It Up | Silver Linings Playbook | Sleeping with Other People | Something Borrowed | The Back-Up Plan | The Big Sick | The Switch | To All the Boys I’ve Loved Before | Valentine’s Day | What Men Want | Yesterday

Herb and Dorothy 50x50 | Hercules | Here and Now | Here and There | Here Comes the Boom | Hereafter | Hermano | Hermia & Helena | Hero | Heroine | Heropanti | Hesburgh | Hesher | Hey, Boo: Harper Lee & To Kill A Mockingbird | Hichki | Hidden Figures | Hide Away | Hideaway (Le Refuge) | Hieronymus Bosch: Touched by the Devil | High Life | High on the Hog | High School | High Strung | High Strung Free Dance | High-Rise | Higher Ground | Highway | Hillary's America: The Secret History of the Democratic Party | Hillsong - Let Hope Rise | Himmatwala | History of Jazz: Oxygen for the Ears | Hit and Run | Hit So Hard | Hitchcock/Truffaut | Hitler's Hollywood | Hitman: Agent 47 | Ho Mann Jahaan | Hobo With a Shotgun | Hockney | Holiday | Holla II | Holmes and Watson | Holy Hell | Holy Rollers | Home | Home Again | Home Run | Honeyland | Hoodwinked Too! Hood vs. Evil | Hooligan Sparrow | Hop | Hope Springs | Horns | Hostiles | Hot Pursuit | Hot Tub Time Machine | Hot Tub Time Machine 2 | Hot Water | Hotel Artemis | Hotel by the River | Hotel Mumbai | Hotel Transylvania | Hotel Transylvania 2 | Hotel Transylvania 3: Summer Vacation | House at the End of The Street | Housefull | Housefull 2 | Housefull 3 | Housefull 4 | How Do You Know | How He Fell in Love | How I Live Now | How Long Will I Love U | How Much Does Your Building Weigh, Mr. Foster? 
| How to be a Latin Lover | How to Be Single | How to Let Go of the World and Love All Things Climate Can't Change | How to Live Forever | How to Make Money Selling Drugs | How to Survive a Plague | How to Talk to Girls at Parties | How to Train Your Dragon | How to Train Your Dragon 2 | How to Train Your Dragon: The Hidden World | How Victor 'The Garlic' Took Alexey 'The Stud' to the Nursing Home | Hubble 3D | Hugh Hefner: Playboy, Activist and Rebel | Hugo | Human Capital | Human Flow | Humor Me | Humpty Sharma Ki Dulhania | Hungry Hearts | Hunky Dory | Hunt for the Wilderpeople | Hunter Gatherer | Husbands in Goa | Hyde Park on Hudson | Hyena | Hyena Road | Hysteria | I | I Am Ali | I Am Big Bird | I Am Divine | I Am Eleven| I Am Not a Witch | I Am Not Madame Bovary | I Am Number Four | I am the Blues | I Belonged to You | I Called Him Morgan | I Can Only Imagine | I Declare War | I Do... Until I Don't | I Give It a Year | I Got the Hook Up 2 | I Hate Luv Storys | I Kissed a Vampire | I Love You Both | I Love You, Phillip Morris | I Origins | I Saw the Devil | I Saw the Light | I Smile Back | I Spit on Your Grave | I Still See You | I Used to Be Darker | I Want to Eat Your Pancreas | I Want Your Money | I Will Follow | I Wish | I'll Push You | I'll Take Your Dead | I'm In Love With a Church Girl | I'm Not Ashamed | I'm So Excited | I, Frankenstein | Ice Age: Collision Course | Ice Age: Continental Drift | Ice Dragon: The Legend of the Blue Daisies | Iceman | Iceman | Identity Thief | If a Tree Falls: A Story of the Earth Liberation Front | If Beale Street Could Talk | If I Stay | If I Were You | If the Dancer Dance | If You Are the One 2 | If You Build It | Ilo Ilo | Immigration Tango | Immortal Hero | Immortals | In a Better World | In a Valley of Violence | In a World | In Another Country | In Between | In Bloom | In Darkness | In Fabric | In Jackson Heights | In Like Flynn | In No Great Hurry: 13 Lessons in Life with Saul Leiter | In Order of Disappearance | 
In Our Hands: The Battle for Jerusalem | In Search of Greatness | In Secret | In The Aisles | In the Heart of the Sea | In the House | In the House of Flies | In the Land of Blood and Honey | In the Name of my Daughter | In the Shadow of Women | In the Steps of Trisha Brown | In This Corner of the World | In Time | InAPPropriate Comedy | Incarnate | Incendies | Incredibles 2 | Independence Day: Resurgence | India's Most Wanted | Indian Horse | Indignation | Indivisible | Inequality for All | Inescapable | Inferno | Infinitely Polar Bear | Ingrid Bergman: In Her Own Words | Ingrid Goes West | Inhumans | Inkubus | Inni | Innocence | Inside Job | Inside the Mind of Leonardo Da Vinci in 3D | Insidious | Insidious Chapter 2 | Insidious Chapter 3 | Insidious: The Last Key | InSight | Inspector Bellamy | Instant Family | Instructions Not Included | Into Eternity | Into the Forest | Into The Storm | Into the Woods | Ip Man 2: Legend of the Grandmaster | Ip Man 3 | Ip Man: The Final Fight | Iris | Iron Sky | Irrational Man | Irving Berlin's Holiday Inn The Broadway | Is Genesis History? | Is It Wrong to Try to Pick Up Girls in a Dungeon?: Arrow of the Orion | Is That a Gun in Your Pocket? | Is the Man Who Is Tall Happy?
| Isle Of Dogs | ISM | Ismael's Ghosts | Isn't It Romantic | It | It Comes At Night | It's a Disaster | It's Kind of a Funny Story | It: Chapter Two | ITTEFAQ | Itzhak | Ivory Tower | Ixcanul | Iyengar | Izzy Gets the Fuck Across Town | Jab Harry Met Sejal | Jab Tak Hai Jaan| Jack Goes Boating | Jack Reacher | Jack Reacher: Never Go Back | Jack Ryan: Shadow Recruit | Jack the Giant Slayer | Jackass 3-D | Jackass Presents: Bad Grandpa | Jagga Jasoos | Jai Ho | James Cameron's Deepsea Challenge 3D | Jane | Jane and Emma | Jane Eyre | Jane Got a Gun | Janis: Little Girl Blue | Jason Bourne | Jason Mraz: Have It All The Movie | Jay & Silent Bob Reboot | Jay Myself | Jayne Mansfield's Car | Jealousy | Jean-Michel Basquiat: The Radiant Child | Jeepers Creepers 3 | Jem and the Holograms | Jeremiah Tower | Jersey Boys | Jerusalem | Jesus Is King | Jet Trash | Jewtopia | Jexi | Jig | Jigsaw | Jim Allison: Breakthrough | Jim Henson's Holiday Special with Fraggle Rock and Emmet Otter | Jimi: All Is By My Side | Jimmy P | Jimmy Vestvood: Amerikan Hero | Jimmy's Hall | Jinn | Jiro Dreams of Sushi | Joan Rivers: A Piece of Work | Jodi Breakers | Jodorowsky's Dune | Joe | John Carter | John Dies at the End | John McEnroe: In the Realm of Perfection | John Rabe | John Wick: Chapter 3 - Parabellum | John Wick: Chapter Two | Johnny English Reborn | Johnny English Strikes Again | Jojo Rabbit | Joker | Joker | Jolly Llb 2 | Jonah Hex | Joseph Pulitzer: Voice of the People | Journey 2: The Mysterious Island | Journey to the South Pacific | Journey to the West | Journey to the West: The Demons Strike Back | Journey's End | Joyful Noise | Judwaa 2 | Judy Moody and the NOT Bummer Summer | Julia | Jumanji: The Next Level | Jumanji: Welcome to the Jungle | Jumping the Broom | Junglee | Jupiter Ascending | Jurassic World | Just a Breath Away | Just a Sigh | Just Getting Started | Just Mercy | Just One Drop | Justice | Justice League | Justin Bieber: Never Say Never | K-12 | K: Missing Kings 
| Kaashmora | Kabali | Kaboom | Kahaani 2 | Kahlil Gibran's The Prophet | Kai Po Che | Kaili Blues | Kalank | Kapoor & Sons | Kaptaan | Karl Marx City | Karthik Calling Karthik | Karwaan | Katti Batti | Katy Perry: Part of Me | Keanu | Kedarnath | Keep On Keepin' On | Keep the Change | Keep Watching | Keeping Up with the Joneses | Kelly & Cal | Kepler's Dream | Keyhole | Khalid: Free Spirit | Khatta Meetha | Khiladi 786 | Khoobsurat | Ki & Ka | Kick | Kick-Ass | Kick-Ass 2 | Kickboxer Retaliation | Kicks | Kid With a Bike | Kidnap | Kids For Cash | Kill List | Kill Me Three Times | Kill the Messenger | Kill Your Darlings | Kill Zone 2 | Killer Elite | Killer Joe | Killer Unicorn | Killerman | Killers | Killing Season | Killing Them Softly | Kilo Two Bravo | Kindergarten Teacher | King Arthur: Legend of the Sword | King Georges | King of Thieves | Kingdom Men Rising | Kings (2018) | Kings Faith | Kings of Pastry | Kings of the Evening | Kingsglaive: Final Fantasy XV | Kingsman: The Golden Circle | Kingsman: The Secret Service | Kinky Boots The Musical (2019) | Kinyarwanda | Kirk Cameron REVIVE US 2 | Kirk Cameron's Saving Christmas | Kirk Cameron: Connect | Kiss of the Damned | Kisses | Kites | Klown | Klown Forever | Knife+Heart | Knight & Day | Knights of Badassdom | Knives and Skin | Knives Out | Koch | Kochadaiiyaan | Kon-Tiki | Kong: Skull Island | KonoSuba: God's Blessing on this Wonderful World! Legend of Crimson | Korengal | Krampus | Krisha | Krrish 3 | Krystal | Ktown Cowboys | Kuleana | Kumare | Kumiko, The Treasure Hunter | Kundo: Age of the Rampant | Kung Fu Killer | Kung Fu Yoga | Kusama: Infinity | L!fe Happens | L'Amour Fou | L'attesa | L.A. Slasher | L.O.R.D: Legend of Ravaging Dynasties | La Boda de Valentina | La Camioneta | La Mission | La Sapienza | La Soga | Labios Rojos | Labyrinth of Lies | Ladrones | Lady Bird | Lady Macbeth | Laggies | Lamb | Lambert & Stamp | Land Ho!
| Land of Mine | Landfill Harmonic | Landline | Language of a Broken Heart| Larry Crowne | Last Cab to Darwin | Last Call at the Oasis| Last Christmas | Last Days in Vietnam | Last Flag Flying | Last Flight of the Champion | Last Letter | Last Men in Aleppo | Last Night | Last Ounce of Courage | Last Passenger | Last Rampage | Last Vegas | Last Weekend | Late Night | Lawless | Lay the Favorite | Lazer Team | LBJ | Le Chef | Le Havre | Le Quattro Volte | Le Week-End | League of Gods | Lean on Pete | Leaning Into The Wind | Leap! | Learning to Drive | Least Among Saints | Leave No Trace | Leaves of Grass | Leaving | Lebanon | Lebanon, Pa. | Lee Daniels' The Butler | Left Behind | Legend of the Fist: The Return of Chen Zhen | Legend of the Guardians: The Owls of Ga'Hoole | Legend of the Naga Pearls | Legendary | Legends from the Sky | Legends of Oz: Dorothy's Return | Legion | Lekar Hum Deewana Dil | Leo Da Vinci: Mission Mona Lisa | Leonie | Les cowboys | Les Miserables | Let it Rain | Let the Bullets Fly | Let the Corpses Tan | Let the Fire Burn | Let the Sunshine In | Let there be Light | Let Yourself Go | Let's Be Cops | Let's Get Married | Letters from Baghdad | Letters to God | Letters to Juliet | Level Up | Leviathan | Liar's Autobiography | Liberal Arts | Life & Nothing More | Life | Life After Beth | Life as We Know It | Life During Wartime | Life in a Day | Life Itself | Life of a King | Life of Crime | Life of the Party | Life, Above All | Life, Animated | Light of My Life | Lights Out | Like Arrows | Like Crazy | Like Dandelion Dust | Like Father, Like Son | Like for Likes | Like Me | Like Someone in Love | Like Sunday, Like Rain | Lila & Eve | Lilting | Limelight | Limitless | Linda Ronstadt: The Sound of My Voice | Line Walker 2 Invisible Spy | Linsanity | Lion | Listen to Me Marlon | Little | Little Accidents | Little Boy | Little Fockers | Little Italy | Little Joe | Little Pink House | Little White Lies | Little Women | Little Women | Live By Night | 
Living in Emergency | Liyana | Liz and the Blue Bird | Lo and Behold, Reveries of the Connected World | Lobster Cop | Locke | Lockout | Logan Lucky | Lola Versus | Lolo | London Fields | London Has Fallen | Lone Survivor | Long Day's Journey Into Night | Long Strange Trip - The Untold Story of The Grateful Dead | Looking for Eric | Looking Up | Looper | Loopers: The Caddie's Long Walk | Lootera | Lords of Chaos | Lore | Loro | Los Domirriqueños 2 | Los Reyes | Lost & Found | Lost and Found in Armenia | Lost in Hong Kong | Lost in Paris | Lost in Thailand | Lost Woods | Lottery Ticket | Louder Than a Bomb | Louder than Bombs | Love & Taxes | Love | Love | Love After Love | Love and Honor | Love and Lost | Love and Other Drugs | Love At First Fight | Love Crime | Love in Space | Love in the Buff | Love is All You Need | Love is in the Air | Love Live! Sunshine!! The School Idol Movie Over The Rainbow | Love Live! The School Idol Movie | Love Me True | Love on the Cloud | Love Ranch | Love the Coopers | Love Thy Nature | Love U Mr. Kalakaar | Love, Antosha | Love, Cecil | Love, Gilda | Love, Kennedy | Love, Rosie | Lovely Molly | Lovely, Still | Loving | Loving Vincent | Low Down | Lowriders | Lu Over the Wall | Luce | Lucha Mexico | Luck-Key | Lucky | Lucky | Lucky Them | Lucy | Lucy in the Sky | Luis & the Aliens | Luka Chuppi | LUV | Luv Shuv Tey Chicken Khurana | Lycan | M.S. Dhoni: The Untold Story | Ma ma | Macbeth | MacGruber | Machete | Machete Kills | Machine Gun Preacher | Madagascar 3: Europe's Most Wanted | Madame Bovary | Made In Abyss: Journey's Dawn | Made in Dagenham | Mademoiselle C | Mademoiselle Chambon | Maggie | Maggie's Plan | Magic in the Moonlight | Magic Mike | Magic Mike XXL | Magic to Win | Magic Trip | Magnus | Maiden | Maidentrip | Main Tera Hero | Make Your Move | Making the Five Heartbeats | Making Waves: The Art of Cinematic Sound | Maleficent | Maleficent: Mistress of Evil | Malek | Mali Blues | Mama | Mamma Mia! 
Here We Go Again | Man From Reno | Man of Steel | Man of Tai Chi | Man on a Ledge | Man on a Mission | Man Underground | Mandela: Long Walk to Freedom | Mandy | Manglehorn | Maniac | Manifesto | Manmarziyaan | Manolo: The Boy Who Made Shoes for Lizards | Mansfield 66/67 | Mansome | Mantra: Sounds into Silence | Mao's Last Dancer | Mapplethorpe

13 Hours | Donnybrook | Flight | Hold the Dark | Little Woods | Mickey and the Bear | Only the Brave | Parkland | Standoff at Sparrow Creek | Stretch | The Kitchen | The Lone Ranger | The Walk | World War Z

If we had to choose one, the most medium movie of the decade is probably Ivan Reitman’s Draft Day. It stars Kevin Costner as a general manager who’s trying to get the most out of his team’s No. 1 draft pick. It’s not really an NFL procedural; if it were devoted to the nitty-gritty of the business, it’d be more interesting. But instead it just is. If middling had a face, it’d be someone’s from Draft Day. —A.W.

Jennifer Garner doesn’t deserve this.

Maps to the Stars | Maquia: When the Promised Flower Blooms (Subtitled) | Marguerite | Maria by Callas | Marianne & Leonard: Words of Love | Marie Curie: The Courage of Knowledge | Marie's Story | Marina Abramovic: The Artist is Present | Marjaavaan | Marjorie Prime | Mark Felt: The Man Who Brought Down the White House | Marley | Marmaduke | Marrowbone | Mars Needs Moms | Marshall | Mary and the Witch's Flower | Mary Kom | Mary Magdalene | Mary Queen of Scots | Mary Shelley | Mas Negro Que La Noche | Masquerade | Master | Master Z: Ip Man Legacy | Masterminds | Mastizaade | Matangi/Maya/M.I.A. | Match | Matthias & Maxime | Maudie | Mausam | Max | Max Rose | Max Steel | Maximum Ride| May in the Summer | May It Last: A Portrait Of The Avett Brothers | Mayday Life | Maze | Mazinger Z: INFINITY | McCanick | McFarland, USA | McQueen | Me and Earl and the Dying Girl | Me Before You | Mechanic: Resurrection | Meerkats | Meet Me in Montenegro | Meet Monica Velour | Meet the Blacks | Meet the Mormons | Meet the Patels | Meeting Gorbachev | Megamind | Megan Leavey | Memoir of a Murderer | Memoir of War | Memories of the Sword | Memory: The Origins of Alien | Memphis | Men & Chicken | Men in Black International | Men, Women & Children | Menashe | Mental | Menteur | Meow Wolf: Origin Story | Merchants of Doubt | Mere Brother Ki Dulhan | Meru | Mesrine: Killer Instinct | Mesrine: Public Enemy No. 
1 | Metallica Through the Never | MFKZ | Mia and the White Lion | Mia Madre | MIB 3 | Michael Moore In TrumpLand | Micmacs | Microbe & Gasoline | Mid-August Lunch | Mid90s | Middle Men | Middle of Nowhere | Middle School: The Worst Years of My Life | Midnight Diner | Midnight in Paris | Midnight Reckoning | Midnight Special | Midnight Sun | Midnight Traveler | Midnight's Children | Midsommar | Midway | Mifune: The Last Samurai | Mike and Dave Need Wedding Dates | Mike Wallace is Here | Mile 22 | Miles Davis: Birth of Cool | Milford Graves Full Mantis | Mine 9 | Minimalism: A Documentary About the Important Things | Minions | Miracles from Heaven | Mirai | Miral | Mirror Mirror | Misery Loves Comedy | Miss & Mrs. Cops | Miss Bala | Miss Hokusai | Miss Minoes | Miss Peregrine's Home for Peculiar Children | Miss Sharon Jones | Miss You Already | Missing Link | Mission Control: The Unsung Heroes of Apollo | Mission Mangal | Mission Park | Missionary | Mississippi Grind | Mistaken for Strangers | Mister America | Mistress America | Moana | Mobile Suit Gundam NT (Narrative) | Mohenjo Daro | Mojave | Mojin: The Lost Legend | Mojin: The Worm Valley | Moka | Mommy | Moms' Night Out | Money | Money for Nothing: Inside the Federal Reserve | Money Monster | Monk With a Camera | Monkey Kingdom | Monogamy | Monrovia, Indiana | Monsieur Lazhar | Monster Family | Monster Hunt | Monster Hunt 2 | Monster Trucks | Monsters | Monsters and Men | Monsters University | Monte Carlo | Monumental: In Search of America's National Treasure | Mood Indigo | Moonlight Sonata: Deafness in Three Movements | Mooz-lum | More than Blue | More Than Funny: Everybody Has a Punchline | More Than Honey | Morgan | Morning | Morning Glory | Morris from America | Mortal Engines | Mortdecai | Moscow Never Sleeps | Moses| Mother and Child | Mother of George | mother! | Mother's Day | Motherless Brooklyn | Mountain | Movie 43 | Mozart's Sister | Mr. Church | Mr. Donkey | Mr. 
Gaga: A True Story of Love and Dance | Mr. Holmes | Mr. Nobody | Mr. Peabody & Sherman | Mr. Pip | Mr. Popper's Penguins | Mr. Right | Mr. Six | Mr. X | Ms. Purple | Mubarakan | Much Ado About Nothing | Mud | Mugamoodi | Mukkabaaz | Mully | Multiple Sarcasms | Mumia: Long Distance Revolutionary | Munger Road | Munna Michael | Muppets Most Wanted | Muran | Murder 2 | Murder on the Orient Express | Muscle Shoals | Museo | Museum Hours | Musical Chairs | My Afternoons with Margueritte | My All American | My Best Friend's Wedding | My Big Fat Greek Wedding 2 | My Brother The Devil | My Cousin Rachel | My Dear Liar | My Dog Tulip | My Entire High School Sinking Into the Sea | My Father Die | My Friend Dahmer | My Golden Days | My Hero Academia: Two Heroes | My Journey Through French Cinema | My King | My Kingdom | My Life as a Zucchini | My Little Pony: The Movie | My Love, Don't Cross that River | My Lucky Star | My Name is Khan | My Old Lady | My People, My Country | My Perestroika | My Reincarnation | My Scientology Movie | My Son | My Soul to Take | My Uncle Rafael | My Way | My Week with Marilyn | My Worst Nightmare | Mysteries of Lisbon | N'Secure | Naam Hai Akira | Namaste England | Namiya | Nancy | Nanny McPhee Returns | Napping Princess | Narcissister Organ Player | Narco Cultura | NAS: Time is Illmatic | Nasty Baby | National Bird | National Gallery | Ne Zha | Nebraska | Need for Speed | Neerja | Neil Young Journeys | Neon Bull | Neruda| Nerve | Never | Never Goin' Back | Never Heard | Never Let Me Go | Never Look Away | Never Surrender: A Galaxy Quest Documentary | Never-Ending Man: Hayao Miyazaki | New World | New York, New York | Next Goal Wins | NH10 | Nick Saban: Gamechanger | Nico, 1988 | Night at the Museum: Secret of the Tomb | Night Catches Us | Night Is Short, Walk On Girl | Night Moves | Night School | NightLights | Nine Lives | Nitro Circus the Movie 3D | No | No culpes al karma | No Eres Tu, Soy Yo | No Escape | No God, No Master | No Good Deed | 
No Greater Love | No Home Movie | No Manches Frida | No Manches Frida 2 | No One Killed Jessica | No One Knows About Persian Cats | No One Lives | No One's Life Is Easy: So I Married an Anti-Fan | No Pay, Nudity | No Place on Earth | No Problem | No Safe Spaces | No Tears for the Dead | Noah | Noble | Nobody Else But You | Nobody Walks | Nocturama | Nocturnal Animals | Noma - My Perfect Storm | Non-Fiction | Non-Stop | Norm of the North | Norman Lear: Just Another Version of You | Norman: The Moderate Rise and Tragic Fall of a New York Fixer | North Face | Northern Limit Line | Northern Soul | Nostalgia | Nostalgia for the Light | Not Fade Away | Not Today | Nothing Bad Can Happen | Nothing Left to Fear | Nothing to Lose | Nothing to Lose 2 | November | Novitiate | Now You See Me 2 | Now, Forager | Nowitzki | Nureyev | Nuts! | Nymphomaniac: Volume I | Nymphomaniac: Volume II | Oasis: Supersonic | Obit. | Oblivion | OC87 | Occupy Unmasked | Ocean Waves | Oceans | October Baby | October Baby | October Country | Oculus | Ode to Joy | Ode to My Father | Of Fathers and Sons | Of Gods and Men | Office Christmas Party | Official Secrets | Oh Lucy! | Ok Jaanu | Okko's Inn | Old Fashioned | Old Stone | Oldboy | Olivia Experiment | Olympus Has Fallen | Omar | On Any Sunday: The Next Chapter | On Chesil Beach | On Her Shoulders | On My Way | On the Ice | On the Job | On The Map | On the Other Side of the Tracks | On the Road | On Wings of Eagles | Once I Was a Beehive | Once Upon a Deadpool | Once Upon A Time | Ondine | One Chance | One Child Nation | One Cut of the Dead | One Day | One Direction: This is Us | One For the Money | One Last Thing | One Piece Film: Gold | One Piece: Stampede | One Small Hitch | One Track Heart: The Story of Krishna Das | One Week and a Day | Ong Bak 3 | Only God Forgives | Only Lovers Left Alive | Only You | In The BIG Balloon Adventure | Oolong Courtyard | Operation Chromite | Operation Finale | Operation Mekong | Operation Red Sea | Ophelia

127 Hours | 22 Jump Street | 50/50 | A Simple Favor | Alien: Covenant | Arctic Dogs | Bad Teacher | Child of God | Cloudy with a Chance of Meatballs 2 | Daddy’s Home | Daddy’s Home 2 | Date Night | Despicable Me | Eat Pray Love | For a Good Time, Call… | Game Night | Goat | Gulliver’s Travels | Homefront | Honey Boy | Horrible Bosses | Horrible Bosses 2 | Howl | Hunter Killer | I Don’t Know How She Does It | I Feel Pretty | I’ll See You In My Dreams | Intruders | Jeff, Who Lives at Home | Jobs | Kill the Irishman | Kin | King Cobra | Kung Fu Panda 2 | Kung Fu Panda 3 | Lemon | Long Shot | Lovelace | Made in Cleveland | Memoria | Neighbors | Neighbors 2: Sorority Rising | Oz The Great and Powerful | Palo Alto | Paul | Rise of the Planet of the Apes | Sausage Party | Sex Tape | Spring

In his first 1,000 days in office, Donald Trump made 13,435 false or misleading claims, according to the good folk at the Washington Post, who painstakingly monitor the president’s habit of bending the truth. How we Brits have smiled at this con man’s Teflon gift. Could never happen here. But consider the lessons political managers around the world might have learned from our election, and how we struggled to negotiate the increasingly blurred lines between truth and falsehood; facts and propaganda; openness and stealth; accountability and impunity; clarity and confusion; news and opinion.

It rather looks as if one or two skilled backroom manipulators (we can guess) studied Trump’s ability to persuade enough people that black is white and, rather than recoil in disgust, came to the opposite conclusion: it works.

One far-off day we will discover whether 40 new hospitals will be built, and whether 20,000 new police officers will materialise along with 50,000 “new” nurses. It won’t be long before we learn whether we’ve now finally got Brexit “done” or whether this is just the start of a long and painful process of negotiating our future trading relationships with a greatly weakened hand.

We’ll learn the reality of whether there is to be frictionless trade between the mainland of Britain and the island of Ireland. We will read the truth about alleged Russian interference in the 2016 Brexit referendum … and much more. But by then life will have moved on, and maybe many of us will have forgotten the promises, evasions and outright lies of late 2019.

Lessons learned? That, in an age of information chaos, you can get away with almost any amount of misleading the public. You can doctor videos; suppress information; avoid challenging interviews – but only after your opponents have been thoroughly grilled. You can expel dissenting journalists from the press pack or hide in a fridge. You can rebrand your party’s press office as a fake “fact-checking” website. In the end, none of it matters.

Coin one unforgettable message and stick to it. “Get Brexit done” was brilliant, never mind that the meaning of “Brexit” and “done” was far from clear: this is an age of simplicity, not complexity. Even the so-called mainstream media will do far more to amplify that slogan than to question it. Try this stunt: slap the words on a JCB digger and drive it through a pile of polystyrene bricks ... and watch as news editors obligingly clear their front pages for the image. They are making posters, not doing journalism.

And remember that in most countries, governments have unusual power over public service broadcasters. So, in the event that television journalists seem to be getting too big for their boots, it is often useful to drop a heavy hint that there will be a price to pay. Maybe Channel 4 has outlived its usefulness? Possibly it’s time to privatise the BBC? That should do the trick.

Old-fashioned press conferences should be kept to the minimum. A manifesto should say almost nothing. Gaffe-prone colleagues should be “disappeared”. If in real trouble, make things up. You’ll be amazed how readily even the best journalists will repeat unattributable fictions (see the “row” over the four-year-old boy in Leeds General Infirmary and what “happened” during the subsequent visit of health secretary Matt Hancock). By the time the journalists have corrected themselves and Twitter has spent 24 hours arguing about the truth, the world will have moved on.

So, as Trump has discovered, the liars, myth-makers and manipulators are in the ascendancy – and however valiantly individual journalists attempt to hold them to account (and many, especially at a local level, have tried magnificently) the dice are loaded against them.

The one overriding thought is that for many years I looked at US newspapers and pitied colleagues there who “just” ran the newsroom, leaving comment pages to others. Pity has turned to envy. I now think it would be cleansing for all British national newspapers to split the responsibility for news and comment. It’s simply too hard for the average reader – especially, but not only, online – to tell the difference.

And a hero? After the Yorkshire Evening Post’s reporting of the Leeds story was questioned, its editor-in-chief, James Mitchinson, wrote a long and considered reply to a reader who, on the basis of something she read on social media, thought the story was fake. Mitchinson’s reply courteously asks the reader why she would believe the word of a total stranger (who might not even exist) over a newspaper she had read for many years in good faith. The fact the paper knew the story to be true was, said Mitchinson, down to “bog-standard journalism”. It was a powerful statement of why good journalism – independent and decently crafted – should matter. So let’s hear it for bog-standard journalism. There’s too little of it. It may not be enough, but it’s all we have.

Alan Rusbridger is chair of the Reuters Institute for the Study of Journalism