Solve $T(n) = T(n-1) + T(\frac{n}{2}) + n$

Using a recursion tree, I tried solving $T(n) = T(n-1) + T(\frac{n}{2}) + n$. The tree has two branches: one for $T(n-1)$ and the other for $T(\frac{n}{2})$. Since the $T(n-1)$ branch reaches the greater depth, I solved that part of the tree, and it gave me $O(2^n)$. Is that correct, or is it $O(n^2)$? And how can we solve it by the substitution method?


If you want the final answer, skip to the end of this answer.

$T(n) = T(n - 1) + T(n/2) + n$

The solution is at least quadratic: the $T(n - 1) + n$ part alone already gives $\Omega(n^2)$. My guess is that the $T(n/2)$ term contributes an extra $\log(n)$ factor.

So let's check whether $T(n) \in O(n^2 \log(n))$.

Does there exist an $M$ such that the bound holds for all sufficiently large $n$:

$$\exists M > 0, \; \exists n_0, \; \forall n > n_0: \quad T(n) \le M n^2 \log(n)$$

Known: $T(n) = T(n - 1) + T(n/2) + n$

Substitute the known recursion into $T(n) \le M n^2 \log(n)$:

$$T(n - 1) + T(n/2) + n \le M n^2 \log(n)$$

$$T(n - 1) + T(n/2) \le M n^2 \log(n) - n$$

We need two inductive hypotheses to check this, so we'll be using super violent strong induction:

Inductive hypotheses: $$T(n - 1) \le M (n - 1)^2 \log(n - 1)$$ $$T(n/2) \le M (n/2)^2 \log(n/2)$$

We have inductive hypotheses of the forms $a \le b$ and $c \le d$, and we wish to use them to prove a statement of the form $a + c \le e$; for that it suffices to establish that $b + d \le e$.

$$M (n - 1)^2 \log(n - 1) + M (n/2)^2 \log(n/2) \le M n^2 \log(n) - n$$

So we want to find whether there is any $M$, not depending on $n$, that makes the above statement true for all sufficiently large $n$.

$$M \left((n - 1)^2 \log(n - 1) + (n/2)^2 \log(n/2) - n^2 \log(n)\right) \le - n$$

$$M \left(n^2 \log(n - 1) - 2n\log(n - 1) + \log(n - 1) + \tfrac{1}{4}n^2 \log(n) - \tfrac{1}{4}n^2\log(2) - n^2 \log(n)\right) \le -n$$

$$M \left(n^2 \left(\log(n - 1) - \tfrac{3}{4} \log(n) - \tfrac{1}{4}\log(2)\right) - 2n\log(n - 1) + \log(n - 1) \right) \le -n$$

The coefficient of $n^2$ is $\log(n - 1) - \tfrac{3}{4} \log(n) - \tfrac{1}{4}\log(2)$, which behaves like $\tfrac{1}{4}\log(n)$ and is therefore positive for sufficiently large $n$. So the left-hand side is eventually positive while the right-hand side is negative; since $M$ must be positive, no value of $M$ makes the inequality hold for all sufficiently large $n$.

Therefore we cannot conclude that $T(n) \in O(n^2 \log(n))$. One step in the above (replacing the recursive terms by their inductive upper bounds) was an implication rather than an equivalence, so we can't necessarily say the bound is false either. We'll check another bound.
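As a quick numerical sanity check (a sketch, not part of the proof): the script below tabulates the recurrence bottom-up, assuming a base case $T(1) = 1$ and $\lfloor n/2 \rfloor$ for the half-size term, neither of which is specified in the original problem. The ratio $T(n)/(n^2 \log n)$ keeps growing, which is consistent with $n^2 \log n$ not being an upper bound:

```python
import math

def build_T(n_max):
    """Tabulate T(n) = T(n-1) + T(n//2) + n bottom-up.

    Assumptions not in the original problem: T(1) = 1 and the
    half-size argument is floor(n/2) so everything stays an integer.
    """
    T = [0] * (n_max + 1)
    T[1] = 1
    for n in range(2, n_max + 1):
        T[n] = T[n - 1] + T[n // 2] + n
    return T

T = build_T(1 << 14)
for n in (1 << k for k in range(4, 15)):
    # If T(n) were O(n^2 log n), this ratio would stay bounded.
    print(f"n = {n:6d}   T(n) / (n^2 log n) = {T[n] / (n * n * math.log(n)):.3e}")
```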


Let's see if any polynomial bound works. Is $T(n) \in O(n^c)$ for some $c$?

$$T(n) \le M n^c$$

Known: $T(n) = T(n - 1) + T(n/2) + n$

Substitute the known recursion into $T(n) \le M n^c$:

$$T(n - 1) + T(n/2) + n \le M n^c$$

$$T(n - 1) + T(n/2) \le M n^c - n$$

Two inductive hypotheses again: $$T(n - 1) \le M (n - 1)^c$$ $$ T(n/2) \le M (n/2)^c$$

As before, we have inductive hypotheses of the forms $a \le b$ and $c \le d$, and to prove a statement of the form $a + c \le e$ it suffices to establish that $b + d \le e$.

$$M (n - 1)^c + M (n/2)^c \le M n^c - n$$

So we want to find whether $\exists M, \forall n > n_0$, the above statement holds. Expanding $(n - 1)^c$ with the binomial theorem, let's ignore all terms of degree less than $c$, since we are only concerned with $n$ above a threshold.

$$M ((n - 1)^c + (n/2)^c - n^c) \le - n$$

$$M (n^c + \frac{1}{2^c} n^c - n^c + \dots) \le -n$$

$$M \frac{1}{2^c} n^c + \dots \le -n$$

The leading term on the left is positive for large $n$ while the right-hand side is negative, so again no $M$ works: we can't establish a bound of the form $O(n^c)$ for any fixed $c$ this way.
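The same kind of numerical sketch (again assuming $T(1) = 1$ and floor division, as above) suggests this is not just a failure of the proof technique: the effective exponent $\log T(n) / \log n$ keeps climbing as $n$ grows instead of leveling off at any fixed $c$:

```python
import math

N = 1 << 16
T = [0] * (N + 1)
T[1] = 1  # assumed base case
for n in range(2, N + 1):
    T[n] = T[n - 1] + T[n // 2] + n  # floor division assumed for n/2

for n in (1 << k for k in range(4, 17)):
    # For a polynomial bound Theta(n^c) this would approach the fixed c.
    print(f"n = {n:6d}   log T(n) / log n = {math.log(T[n]) / math.log(n):.3f}")
```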


I think someone suggested checking whether $T(n) \in O(n^{\log(n)})$. That would be an interesting result.

$$T(n) \le M n^{\log(n)}$$

Known: $T(n) = T(n - 1) + T(n/2) + n$

Substitute the known recursion into $T(n) \le M n^{\log(n)}$:

$$T(n - 1) + T(n/2) + n \le M n^{\log(n)}$$

$$T(n - 1) + T(n/2) \le M n^{\log(n)} - n$$

Two inductive hypotheses again: $$ T(n - 1) \le M (n - 1)^{\log(n - 1)}$$ $$ T(n/2) \le M (n/2)^{\log(n / 2)}$$

Combining the inductive hypotheses exactly as before:

$$M (n - 1)^{\log(n - 1)} + M (n/2)^{\log(n / 2)} \le M n^{\log(n)} - n$$

$$M \left((n - 1)^{\log(n - 1)} + (n/2)^{\log(n / 2)} - n^{\log(n)}\right) \le - n$$

$$M \left(n^{\log(n)} - (n - 1)^{\log(n - 1)} - (n/2)^{\log(n / 2)}\right) \ge n$$

At this point it's pretty obvious that we've found a solution. Yay. Try $M=1$.

$$n^{\log(n)} - (n - 1)^{\log(n - 1)} - (n/2)^{\log(n / 2)} \ge n$$

This holds for $n > 5.4$, a witness for $n_0$.
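You can check that claim numerically; the sketch below evaluates the $M = 1$ inequality with $\log$ taken as the natural logarithm, which appears to be the convention behind the $n > 5.4$ figure (an assumption on my part):

```python
import math

def gap(n):
    # LHS minus RHS of the M = 1 inequality:
    #   n^log(n) - (n-1)^log(n-1) - (n/2)^log(n/2) >= n
    # using the natural logarithm.
    return (n ** math.log(n)
            - (n - 1) ** math.log(n - 1)
            - (n / 2) ** math.log(n / 2)
            - n)

for n in [5.0, 5.3, 5.4, 5.5, 6.0, 10.0, 100.0]:
    # A positive gap means the inequality holds at this n.
    print(f"n = {n:6.1f}   gap = {gap(n):12.3f}")
```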

Therefore we can conclude that $T(n) \in O(n^{\log(n)})$.

We can all be happy now. Please double check this for typos.


Following up on Michael's answer, let $S(n) = T(n) + 2n + 2$. Calculation shows that $$ S(n) = S(n-1) + S(n/2). $$ (This is assuming that by $n/2$ you actually mean $n/2$ rather than $\lfloor n/2 \rfloor$ or the like.)

Mahler showed that every function satisfying the very similar functional equation $S(n+1) = S(n) + S(n/2)$ has the asymptotic form $$ S(n) = \Theta\left( \sum_{m=0}^\infty 2^{-m(m-1)/2} \frac{n^m}{m!} \right) . $$ A function satisfying our functional equation probably has the very same asymptotic form.

Stirling's approximation gives $$ 2^{-m(m-1)/2} \frac{n^m}{m!} \sim \frac{n^m}{\sqrt{2\pi m}(m/e)^m \sqrt{2}^{m(m-1)}}. $$ Taking the logarithm of the $m$th term, we get roughly $$ m\log n - \Theta(m^2). $$ This is maximized when $\log n = \Theta(m)$, in which case the value of the logarithm is $\Theta(\log^2 n)$.

We deduce that the sum itself is roughly $$ \exp \Theta(\log^2 n) = n^{\Theta(\log n)}. $$ Therefore (modulo some inaccuracies), $$ T(n) = n^{\Theta(\log n)}. $$
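For completeness, here is the calculation behind the substitution $S(n) = T(n) + 2n + 2$ above (a sketch, taking $n/2$ exactly as in the stated assumption):

$$ S(n-1) + S(n/2) = \big(T(n-1) + 2(n-1) + 2\big) + \big(T(n/2) + n + 2\big) = \underbrace{T(n-1) + T(n/2) + n}_{=\,T(n)} + 2n + 2 = S(n). $$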


I asked a similar question earlier this year, and found a technical paper on it from 1948. Even without the extra $+n$ at the end, the formula was pretty cool.

how many ways to make change, asymptotics

For this problem, without the floor function, try $T(n)=an+b$, and solve for $a$ and $b$.
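Carrying out that hint (a sketch): substituting $T(n) = an + b$ into the recurrence and collecting terms gives

$$ an + b = a(n-1) + b + \frac{an}{2} + b + n \;\Longrightarrow\; 0 = \left(\frac{a}{2} + 1\right)n + (b - a) \;\Longrightarrow\; a = b = -2, $$

so $T(n) = -2n - 2$ is a particular solution; it is exactly the shift behind the substitution $S(n) = T(n) + 2n + 2$ in the previous answer.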