gorgeousgen9487

2022-07-14

Here's a problem I thought of that I don't know how to approach:

You have a fair coin that you keep on flipping. After every flip, you perform a hypothesis test based on all coin flips thus far, with significance level $\alpha $, where your null hypothesis is that the coin is fair and your alternative hypothesis is that the coin is not fair. In terms of $\alpha $, what is the expected number of flips before the first time that you reject the null hypothesis?
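The question does not pin down which hypothesis test is used. As a minimal sketch of the process, assume a two-sided z-test of $H_0\colon p = 1/2$ at $\alpha = 0.05$ (the function names and the choice of test are my own, not part of the question):

```python
import math
import random

def z_reject(heads, n):
    """Two-sided z-test of H0: p = 1/2 after n flips at alpha = 0.05.

    Reject when |heads - n/2| exceeds z_{alpha/2} * sqrt(n)/2,
    with z_{0.025} ~= 1.96 hard-coded to stay stdlib-only.
    """
    z_crit = 1.959963984540054
    return abs(heads - n / 2) > z_crit * math.sqrt(n) / 2

def flips_until_rejection(max_flips=100_000):
    """Flip a fair coin, testing after every flip; return the flip count
    at the first rejection, or None if none occurs within max_flips."""
    heads = 0
    for n in range(1, max_flips + 1):
        heads += random.random() < 0.5
        if z_reject(heads, n):
            return n
    return None
```

With this particular test, the law of the iterated logarithm implies the running z-statistic crosses any fixed critical value infinitely often, so a rejection eventually occurs with probability 1, though possibly only after a very long wait.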

Edit based on comment below: For what values of $\alpha $ is the answer to the question above finite? For those values for which it is infinite, what is the probability that the null hypothesis will ever be rejected, in terms of $\alpha $?

Edit 2: My post was edited to say "You believe that you have a fair coin." The coin is in fact fair, and you know that. You do the hypothesis tests anyway. Otherwise the problem is unapproachable because you don't know the probability that any particular toss will come up a certain way.

sniokd

2022-07-15

EDIT: This answer was unclear to OP at first, so I tried to make it clearer with a new approach. Apparently that raised another legitimate doubt, so I have now put both answers together and clarified them further. (I may still be wrong, but I'll try to express myself better.)

What you are looking for is the expected number of tosses before we make a Type I error (rejecting ${H}_{0}$ when it is true). The probability of that on any single test is precisely $\alpha $ (that is another way to define it).

So $P(\text{Type I error})=\alpha $

Let ${X}_{n}$ be the event of rejecting ${H}_{0}$ on the ${n}^{th}$ test.

Now, $E[{X}_{1}]=\alpha $ is the expected number of games (a game being one full run of this procedure, started fresh with a new coin) in which ${H}_{0}$ is rejected on the first test. $E[{X}_{1}+{X}_{2}]=E[{X}_{1}]+E[{X}_{2}]$ is the expected number of games in which ${H}_{0}$ is rejected on either the first or the second test. Note that for most $\alpha $ this is still below 1, so in a single game we do not yet expect to have rejected ${H}_{0}$.

When do we expect to have rejected ${H}_{0}$? Precisely when the expected number of games in which we reject ${H}_{0}$ reaches 1. Therefore, we look for $n$ such that

$E[X_1+X_2+\cdots+X_n]=1$
$E[X_1+X_2+\cdots+X_n]=nE[X_1]=n\alpha=1$
$n=\frac{1}{\alpha}$
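The threshold step can be checked numerically; for example, with $\alpha=0.05$ the cumulative expected count $n\alpha$ first reaches 1 at $n=20$:

```python
alpha = 0.05
# The cumulative expected number of games rejected by test n is n*alpha;
# find the smallest n with n*alpha >= 1, i.e. n = 1/alpha = 20 here.
n = 1
while n * alpha < 1:
    n += 1
print(n)  # 20
```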

The other answer goes like this: Let the variable $T$ count the number of tests up to and including the first rejection. We look for $E[T]$.

Also, using the previous notation, $P(X_n)=\alpha(1-\alpha)^{n-1}$, where $X_n$ is now read as "the first rejection happens at test $n$". (I am aware this assumes independence between the events $X_n$ and $X_{n-1}$, which does not strictly hold for tests run on the same growing sequence of flips; but since I am only after the expected value, by linearity of expectation it should not be a problem, though I admit the notation is loose.)

$E[T]=\sum_{n=1}^{\infty} nP(X_n)=\sum_{n=1}^{\infty} n\alpha(1-\alpha)^{n-1}=\alpha\sum_{n=1}^{\infty} n(1-\alpha)^{n-1}$
$=\alpha\sum_{n=0}^{\infty}(n+1)(1-\alpha)^{n}=\alpha\left(\sum_{n=0}^{\infty} n(1-\alpha)^{n}+\sum_{n=0}^{\infty}(1-\alpha)^{n}\right)=\alpha\left(\frac{1-\alpha}{\alpha^{2}}+\frac{1}{\alpha}\right)$
$E[T]=\frac{1}{\alpha}$
