Complexities of binary tree traversals

What is the time complexity of in-order, post-order, and pre-order traversal of a binary tree? Is it O(n), O(log n), or O(n^2)?


In-order, Pre-order, and Post-order traversals are Depth-First traversals.

For a Graph, the complexity of a Depth First Traversal is O(n + m), where n is the number of nodes, and m is the number of edges.

Since a Binary Tree is also a Graph, the same applies here. The complexity of each of these Depth-First traversals is O(n + m).

Although each node in a Binary Tree can have at most 2 children, every node except the root has exactly one edge to its parent, so the total number of edges in a Binary Tree is n-1, where n is the total number of nodes.

The complexity then becomes O(n + n-1), which is O(n).
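
To make the edge-counting argument concrete, here is a minimal sketch (the `Node` class and the example tree are made up for illustration) of an iterative depth-first traversal that counts the nodes and edges it touches:

```python
# Minimal sketch: iterative depth-first traversal of a binary tree.
# Each node is popped from the stack exactly once, and each edge is
# followed exactly once, giving O(n + m) = O(n + (n - 1)) = O(n).
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def dfs_count(root):
    nodes_visited = 0
    edges_followed = 0
    stack = [root] if root else []
    while stack:
        node = stack.pop()
        nodes_visited += 1           # every node enters the stack once
        for child in (node.left, node.right):
            if child is not None:
                edges_followed += 1  # every edge is crossed once
                stack.append(child)
    return nodes_visited, edges_followed

# A 5-node tree: 5 nodes, 4 edges, i.e. m = n - 1.
tree = Node(1, Node(2, Node(4), Node(5)), Node(3))
print(dfs_count(tree))  # (5, 4)
```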


O(n), because you traverse each node once. Or rather, the amount of work you do for each node is constant (it does not depend on the rest of the nodes).
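
As a minimal sketch of this point (the `Node` class here is just for illustration), a recursive in-order traversal does a constant amount of work per node, so the total is n * O(1) = O(n):

```python
# Minimal sketch: recursive in-order traversal. The work done at each
# node (one print) is constant, so the total work over n nodes is O(n).
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def inorder(node):
    if node is None:
        return
    inorder(node.left)
    print(node.value)    # the per-node work: O(1)
    inorder(node.right)

inorder(Node(2, Node(1), Node(3)))  # prints 1, 2, 3
```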


Introduction

Hi

I was asked this question today in class, and it is a good question! I will explain here and hopefully get my more formal answer reviewed or corrected where it is wrong. :)

Previous Answers

The observation by @Assaf is also correct, since binary tree traversal recursively visits each node exactly once.

But, since it is a recursive algorithm, you often have to use more advanced methods to analyze its run-time performance. When dealing with a sequential algorithm or one that uses for-loops, summations are often enough; for a recursive algorithm, you typically set up and solve a recurrence instead. So, what follows is a more detailed explanation of this analysis for those who are curious.

The Recurrence

As previously stated,

T(n) = 2*T(n/2) + 1

where T(n) is the number of operations executed by the traversal algorithm (in-order, pre-order, or post-order makes no difference), assuming a roughly balanced tree so that each recursive call handles about half of the nodes. (For a completely skewed tree the recurrence is T(n) = T(n-1) + c instead, which also solves to Θ(n).)

Explanation of the Recurrence

There are two T(n/2) terms because in-order, pre-order, and post-order traversals all call themselves on both the left and the right child node. So, think of each recursive call as a T(n/2): **left T(n/2) + right T(n/2) = 2\*T(n/2)**. The "1" comes from any other constant-time operations within the function, like printing the node value, et cetera. (It could honestly be 1 or any other constant number, and the asymptotic run-time would still compute to the same value; the explanation follows.)
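
One way to sanity-check the recurrence is to instrument the traversal and count every recursive call; this counting scheme is my own, just for illustration:

```python
# Minimal sketch: count every recursive call of a traversal. A tree with
# n nodes triggers n calls on real nodes plus n + 1 calls on empty
# children, i.e. 2n + 1 calls in total -- linear in n, matching Theta(n).
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def count_calls(node):
    calls = 1                        # this invocation counts as one
    if node is not None:
        calls += count_calls(node.left)
        calls += count_calls(node.right)
    return calls

# Perfect tree with 7 nodes -> 2 * 7 + 1 = 15 calls.
tree = Node(4, Node(2, Node(1), Node(3)), Node(6, Node(5), Node(7)))
print(count_calls(tree))  # 15
```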

Analysis

This recurrence can actually be solved to a Big-Theta bound using the Master theorem, so I will apply it here.

T(n) = 2*T(n/2) + constant

where constant is some constant (could be 1 or any other constant).

Using the Master theorem, we match this against the general form T(n) = a*T(n/b) + f(n).

So, a = 2 and b = 2. For f(n), write f(n) = n^c; since f(n) is a constant, f(n) = 1 = n^0, and it follows that c = 0.

From here, we can see that a = 2 and b^c = 2^0 = 1. So, a > b^c, i.e. 2 > 2^0. Equivalently, c < log_b(a), i.e. 0 < log_2(2) = 1, which puts us in Case 1 of the Master theorem.

From here, we have T(n) = Θ(n^(log_b(a))) = Θ(n^(log_2(2))) = Θ(n^1) = Θ(n).
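
For reference, here is the same computation written out in standard notation:

```latex
% Master theorem, Case 1: if f(n) = O(n^{\log_b a - \epsilon}) for some
% \epsilon > 0, then T(n) = \Theta(n^{\log_b a}).
T(n) = 2\,T(n/2) + c, \qquad a = 2,\ b = 2,\ f(n) = c = \Theta(n^0)
\log_b a = \log_2 2 = 1, \qquad 0 < 1 \ \Rightarrow\ \text{Case 1 applies}
T(n) = \Theta\left(n^{\log_b a}\right) = \Theta(n^1) = \Theta(n)
```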

If you're not familiar with Θ(n), it is "similar" (please bear with me :) ) to O(n), but it is a tighter bound, i.e. a tighter approximation of the run-time. Θ(n) means the run-time is bounded both above by O(n) and below by Ω(n).

I hope this helps. Take care.