Algorithm Analysis
Algorithm analysis is the process of determining the efficiency of an algorithm in terms of time and space complexity. It helps in understanding how an algorithm performs as the input size grows, which is crucial for solving Leetcode problems efficiently.
Table of Contents
- Algorithm Analysis Basics
- Time Complexity
- Space Complexity
- Common Algorithm Analysis Patterns
- Related Leetcode Questions
Algorithm Analysis Basics
- Definition: Algorithm analysis involves evaluating the efficiency of an algorithm in terms of time and space complexity.
- Key Concepts:
- Time Complexity: Measures the amount of time an algorithm takes to complete as a function of the input size.
- Space Complexity: Measures the amount of memory an algorithm uses as a function of the input size.
- Big O Notation: Describes an asymptotic upper bound on an algorithm's growth rate, most often used to express the worst-case scenario. The sketch below shows how time and space complexity can trade off on the same problem.
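To see why this matters in practice, here is a minimal sketch (not from the original notes; the class and method names are made up for illustration) that solves the same problem two ways: a brute-force double loop that runs in O(n^2) time with O(1) extra space, and a HashSet-based pass that runs in O(n) time at the cost of O(n) extra space.

import java.util.HashSet;
import java.util.Set;

public class ComplexityDemo {
    // O(n^2) time, O(1) extra space: compare every pair of elements.
    static boolean containsDuplicateBrute(int[] nums) {
        for (int i = 0; i < nums.length; i++) {
            for (int j = i + 1; j < nums.length; j++) {
                if (nums[i] == nums[j]) return true;
            }
        }
        return false;
    }

    // O(n) time, O(n) extra space: trade memory for speed with a HashSet.
    static boolean containsDuplicateFast(int[] nums) {
        Set<Integer> seen = new HashSet<>();
        for (int num : nums) {
            if (!seen.add(num)) return true; // add returns false if the value was already present
        }
        return false;
    }

    public static void main(String[] args) {
        int[] nums = {3, 1, 4, 1, 5};
        System.out.println(containsDuplicateBrute(nums)); // true
        System.out.println(containsDuplicateFast(nums));  // true
    }
}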
Time Complexity
1. Constant Time - O(1)
- Definition: The algorithm takes the same amount of time regardless of the input size.
- Example: Accessing an element in an array by index.
public int getElement(int[] arr, int index) {
    return arr[index]; // O(1) time complexity
}
2. Linear Time - O(n)
- Definition: The algorithm’s time complexity grows linearly with the input size.
- Example: Finding the maximum element in an array.
public int findMax(int[] arr) {
    int max = arr[0];
    for (int num : arr) { // O(n) time complexity: one pass over the array
        if (num > max) {
            max = num;
        }
    }
    return max;
}
3. Logarithmic Time - O(log n)
- Definition: The algorithm’s time complexity grows logarithmically with the input size.
- Example: Binary search in a sorted array.
public int binarySearch(int[] arr, int target) {
    int left = 0, right = arr.length - 1;
    while (left <= right) { // O(log n) time complexity: the search range halves each iteration
        int mid = left + (right - left) / 2;
        if (arr[mid] == target) {
            return mid;
        } else if (arr[mid] < target) {
            left = mid + 1;
        } else {
            right = mid - 1;
        }
    }
    return -1;
}
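One usage note worth remembering: binary search only works on a sorted array. A quick sketch of calling the method above (the sample array is illustrative):

int[] sorted = {1, 3, 5, 7, 9, 11};
System.out.println(binarySearch(sorted, 7)); // 3 (index of the target)
System.out.println(binarySearch(sorted, 4)); // -1 (target not present)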
4. Quadratic Time - O(n^2)
- Definition: The algorithm’s time complexity grows quadratically with the input size.
- Example: Bubble sort.
public void bubbleSort(int[] arr) {
    int n = arr.length;
    for (int i = 0; i < n - 1; i++) { // O(n^2) time complexity: nested passes over the array
        for (int j = 0; j < n - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {
                int temp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
            }
        }
    }
}
5. Exponential Time - O(2^n)
- Definition: The algorithm’s time complexity grows exponentially with the input size.
- Example: Generating all subsets of a set.
// Requires: import java.util.ArrayList; import java.util.List;
public List<List<Integer>> subsets(int[] nums) {
    List<List<Integer>> result = new ArrayList<>();
    backtrack(nums, 0, new ArrayList<>(), result); // O(2^n) time complexity: each element is either included or excluded
    return result;
}

private void backtrack(int[] nums, int start, List<Integer> path, List<List<Integer>> result) {
    result.add(new ArrayList<>(path));
    for (int i = start; i < nums.length; i++) {
        path.add(nums[i]);
        backtrack(nums, i + 1, path, result);
        path.remove(path.size() - 1);
    }
}
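A brief usage sketch, assuming the two methods above live in a class hypothetically named SubsetDemo: for an input of size 3 the result holds 2^3 = 8 subsets, which is exactly why the running time is exponential.

int[] nums = {1, 2, 3};
List<List<Integer>> all = new SubsetDemo().subsets(nums);
System.out.println(all.size()); // 8, i.e. 2^3 subsets (including the empty set)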
Space Complexity
1. Constant Space - O(1)
- Definition: The algorithm uses a fixed amount of memory regardless of the input size.
- Example: Swapping two numbers.
public void swap(int a, int b) {
    int temp = a; // O(1) space complexity: one extra variable regardless of input size
    a = b;
    b = temp;
    // Note: Java passes primitives by value, so this swaps local copies only;
    // the point of the example is the constant amount of extra memory used.
}
2. Linear Space - O(n)
- Definition: The algorithm’s space complexity grows linearly with the input size.
- Example: Storing elements in an array.
public int[] copyArray(int[] arr) {
    int[] copy = new int[arr.length]; // O(n) space complexity
    for (int i = 0; i < arr.length; i++) {
        copy[i] = arr[i];
    }
    return copy;
}
3. Quadratic Space - O(n^2)
- Definition: The algorithm’s space complexity grows quadratically with the input size.
- Example: Storing a 2D matrix.
public int[][] createMatrix(int n) {
    int[][] matrix = new int[n][n]; // O(n^2) space complexity
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            matrix[i][j] = i + j;
        }
    }
    return matrix;
}
Common Algorithm Analysis Patterns
1. Divide and Conquer
- Problem: Break the problem into smaller subproblems, solve each subproblem, and combine the results.
- Example: Merge sort.
public void mergeSort(int[] arr, int left, int right) {
    if (left < right) {
        int mid = left + (right - left) / 2;
        mergeSort(arr, left, mid);      // recursion depth is O(log n)
        mergeSort(arr, mid + 1, right); // recursion depth is O(log n)
        merge(arr, left, mid, right);   // O(n) work per level
    }
}

private void merge(int[] arr, int left, int mid, int right) {
    int n1 = mid - left + 1;
    int n2 = right - mid;
    int[] L = new int[n1]; // O(n) space complexity
    int[] R = new int[n2]; // O(n) space complexity
    for (int i = 0; i < n1; i++) {
        L[i] = arr[left + i];
    }
    for (int j = 0; j < n2; j++) {
        R[j] = arr[mid + 1 + j];
    }
    int i = 0, j = 0, k = left;
    while (i < n1 && j < n2) {
        if (L[i] <= R[j]) {
            arr[k] = L[i];
            i++;
        } else {
            arr[k] = R[j];
            j++;
        }
        k++;
    }
    while (i < n1) {
        arr[k] = L[i];
        i++;
        k++;
    }
    while (j < n2) {
        arr[k] = R[j];
        j++;
        k++;
    }
}
- Time Complexity: O(n log n) (the array is halved O(log n) times and each level does O(n) merging work)
- Space Complexity: O(n) (for the temporary arrays used during merging)
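A quick usage sketch of the methods above (the sample array is illustrative; printing uses java.util.Arrays):

int[] arr = {5, 2, 9, 1, 5, 6};
mergeSort(arr, 0, arr.length - 1);
System.out.println(Arrays.toString(arr)); // [1, 2, 5, 5, 6, 9]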
2. Dynamic Programming
- Problem: Solve the problem by breaking it down into simpler subproblems and storing the results of subproblems to avoid redundant computations.
- Example: Fibonacci sequence.
public int fib(int n) {
    if (n <= 1) return n;
    int[] dp = new int[n + 1]; // O(n) space complexity
    dp[0] = 0;
    dp[1] = 1;
    for (int i = 2; i <= n; i++) { // O(n) time complexity
        dp[i] = dp[i - 1] + dp[i - 2];
    }
    return dp[n];
}
- Time Complexity: O(n)
- Space Complexity: O(n) (see the O(1)-space variant sketched below)
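As a side note not covered above, the recurrence only ever needs the previous two values, so the O(n) array can be dropped for O(1) extra space while keeping O(n) time; a minimal sketch:

public int fibConstantSpace(int n) {
    if (n <= 1) return n;
    int prev = 0, curr = 1; // only the last two values are kept: O(1) extra space
    for (int i = 2; i <= n; i++) { // still O(n) time
        int next = prev + curr;
        prev = curr;
        curr = next;
    }
    return curr;
}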
3. Greedy Algorithms
- Problem: Make locally optimal choices at each step with the hope of finding a global optimum.
- Example: Activity selection.
// Requires: import java.util.Arrays;
public int activitySelection(int[] start, int[] end) {
    int n = start.length;
    int[][] activities = new int[n][2]; // O(n) space complexity
    for (int i = 0; i < n; i++) {
        activities[i][0] = start[i];
        activities[i][1] = end[i];
    }
    Arrays.sort(activities, (a, b) -> Integer.compare(a[1], b[1])); // O(n log n): sort by end time (Integer.compare avoids overflow)
    int count = 1;
    int lastEnd = activities[0][1];
    for (int i = 1; i < n; i++) { // O(n) time complexity
        if (activities[i][0] >= lastEnd) {
            count++;
            lastEnd = activities[i][1];
        }
    }
    return count;
}
- Time Complexity: O(n log n)
- Space Complexity: O(n)
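A small usage sketch with made-up sample data (the arrays below are illustrative, not from the original notes):

int[] start = {1, 3, 0, 5, 8, 5};
int[] end   = {2, 4, 6, 7, 9, 9};
// Greedily picking by earliest end time selects (1,2), (3,4), (5,7), (8,9).
System.out.println(activitySelection(start, end)); // 4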