Algorithms and Data Structures in Java

Complete guide to mastering algorithms and data structures with Java implementations

Introduction

Algorithms and Data Structures form the foundation of computer science and software development. Understanding these concepts is crucial for writing efficient, scalable, and maintainable code.

What are Data Structures?

Data structures are ways of organizing and storing data in a computer so that it can be accessed and modified efficiently. They define the relationship between data and the operations that can be performed on that data.

What are Algorithms?

Algorithms are step-by-step procedures or formulas for solving problems. They describe how to perform a computation and are independent of any programming language.

Why Java? Java provides excellent built-in data structures through the Collections Framework, but understanding the underlying implementations helps you choose the right tool for each problem.

Data Structures

Data structures can be classified into two main categories:

  • Linear Data Structures: Arrays, Linked Lists, Stacks, Queues
  • Non-Linear Data Structures: Trees, Graphs, Hash Tables

Arrays

Arrays are the simplest data structure: a collection of elements stored in contiguous memory locations.

Concept: Memory Layout

Arrays store elements in contiguous memory locations, meaning each element is stored right next to the previous one. This allows for:

  • O(1) Access: Direct access using index calculation: address = base_address + (index × element_size)
  • Cache Efficiency: Contiguous memory improves CPU cache performance
  • Fixed Size: Size must be known at declaration (or use dynamic arrays)

Visualization: Array Memory Layout

Index:   0    1    2    3    4
Value:  10   20   30   40   50

Memory Addresses: If base address is 1000 and each integer is 4 bytes:
Index 0: 1000, Index 1: 1004, Index 2: 1008, Index 3: 1012, Index 4: 1016
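
The address arithmetic above can be sketched directly. This is a minimal illustration (the class name `AddressDemo` and its `address` helper are invented here), using the base address and element size from the example:

```java
public class AddressDemo {
    // Address of element i = base_address + i * element_size (the formula above)
    static int address(int base, int elementSize, int index) {
        return base + index * elementSize;
    }

    public static void main(String[] args) {
        // Base address 1000, 4-byte ints, as in the example
        for (int i = 0; i < 5; i++) {
            System.out.println("Index " + i + ": " + address(1000, 4, i));
        }
        // prints 1000, 1004, 1008, 1012, 1016
    }
}
```

Because the address is a single multiplication and addition, access cost does not depend on the array's size, which is exactly why indexing is O(1).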

Array Declaration and Initialization

// Array declaration
int[] numbers = new int[5];
String[] names = {"Alice", "Bob", "Charlie"};

// Accessing elements
int first = numbers[0];
numbers[0] = 10;

// Array length
int length = numbers.length;

Dynamic Array Implementation

public class DynamicArray {
    private int[] array;
    private int size;
    private int capacity;

    public DynamicArray() {
        capacity = 2;
        array = new int[capacity];
        size = 0;
    }

    public void add(int element) {
        if (size >= capacity) {
            resize();
        }
        array[size++] = element;
    }

    private void resize() {
        capacity *= 2;
        int[] newArray = new int[capacity];
        System.arraycopy(array, 0, newArray, 0, size);
        array = newArray;
    }

    public int get(int index) {
        if (index < 0 || index >= size) {
            throw new IndexOutOfBoundsException();
        }
        return array[index];
    }

    public int size() {
        return size;
    }
}
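
The doubling policy can be observed in isolation. The sketch below (class name `GrowthDemo` is illustrative) mirrors the resize rule of the `DynamicArray` above, starting at capacity 2 and doubling whenever the array is full:

```java
public class GrowthDemo {
    // Capacity after n add() calls, mirroring the doubling policy above:
    // start at capacity 2, double whenever size reaches capacity
    static int capacityAfter(int adds) {
        int capacity = 2, size = 0;
        for (int i = 0; i < adds; i++) {
            if (size == capacity) capacity *= 2;
            size++;
        }
        return capacity;
    }

    public static void main(String[] args) {
        for (int n : new int[]{2, 3, 5, 9}) {
            System.out.println(n + " adds -> capacity " + capacityAfter(n));
        }
        // 2 adds -> 2, 3 adds -> 4, 5 adds -> 8, 9 adds -> 16
    }
}
```

Each resize copies all existing elements, but because capacity doubles, the total copying cost over n adds is O(n), giving O(1) amortized time per add.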

Time Complexity

Operation   Time Complexity
Access      O(1)
Search      O(n)
Insertion   O(n)
Deletion    O(n)

Practice Questions - Arrays 8 Questions

Q1
What is the time complexity to access an element in an array by index?
O(n)
O(1)
O(log n)
O(n log n)
Answer: B - O(1)

Array access by index is O(1) because arrays use direct memory addressing. The memory address of any element can be calculated directly using: base_address + (index × element_size), allowing constant-time access regardless of array size.

Q2
What happens when you try to access an array index that is out of bounds in Java?
The program continues with a default value
The array automatically resizes
ArrayIndexOutOfBoundsException is thrown
The last element is returned
Answer: C - ArrayIndexOutOfBoundsException is thrown

Java throws ArrayIndexOutOfBoundsException when you try to access an array element with an invalid index (negative or >= array.length). It is an unchecked runtime exception: if it is not caught, the current thread terminates.

Q3
What is the time complexity of inserting an element at the beginning of a dynamic array (ArrayList) in Java?
O(n)
O(1)
O(log n)
O(n²)
Answer: A - O(n)

Inserting at the beginning requires shifting all existing elements one position to the right, which takes O(n) time. This is why ArrayList is better for insertions at the end (O(1) amortized) rather than at the beginning.

Q4
Which of the following is the correct way to declare and initialize a 2D array in Java?
int[][] arr = new int[3][];
int arr[][] = new int[3][4];
int[] arr[] = {{1,2},{3,4}};
All of the above
Answer: D - All of the above

Java allows multiple syntaxes for 2D arrays: int[][] arr, int arr[][], and int[] arr[] are all valid. You can also create jagged arrays (different row lengths) or initialize with values directly.

Q5
What is the space complexity of a dynamic array that doubles its size when full?
O(1)
O(n)
O(n²)
O(log n)
Answer: B - O(n)

The space complexity is O(n) where n is the number of elements. Even though the array may allocate more space than needed (up to 2n-1), the space complexity is still linear with respect to the number of elements stored.

Q6
What is the best case time complexity for searching an element in an unsorted array?
O(log n)
O(n log n)
O(1)
O(n)
Answer: C - O(1)

The best case occurs when the element is found at the first position (index 0), requiring only one comparison. However, the average and worst case are O(n) for unsorted arrays.

Q7
Given an array of size n, what is the maximum number of comparisons needed to find the maximum element?
n-1
n
n+1
2n
Answer: A - n-1

To find the maximum, you need to compare each element with the current maximum. Starting with the first element as max, you need n-1 comparisons to check the remaining n-1 elements.

Q8
What is the time complexity of reversing an array in-place?
O(1)
O(log n)
O(n log n)
O(n)
Answer: D - O(n)

Reversing an array in-place requires swapping elements from both ends until the middle, which takes n/2 swaps = O(n) time. The space complexity is O(1) since no extra array is needed.
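
The two-ends swap described above can be sketched as follows (class name `ReverseDemo` is illustrative):

```java
public class ReverseDemo {
    // In-place reversal: swap from both ends toward the middle.
    // n/2 swaps -> O(n) time, O(1) extra space
    static void reverse(int[] a) {
        int left = 0, right = a.length - 1;
        while (left < right) {
            int tmp = a[left];
            a[left++] = a[right];
            a[right--] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] a = {10, 20, 30, 40, 50};
        reverse(a);
        System.out.println(java.util.Arrays.toString(a));   // [50, 40, 30, 20, 10]
    }
}
```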

Linked Lists

A linked list is a linear data structure where elements are stored in nodes, and each node points to the next node in the sequence.

Concept: Node Structure

Each node in a linked list contains:

  • Data: The actual value stored
  • Next Pointer: Reference to the next node (null for the last node)

Key Difference from Arrays: Elements are not stored contiguously in memory. Each node can be anywhere in memory, connected only by pointers.

Visualization: Singly Linked List Structure

Head -> [10|•] -> [20|•] -> [30|•] -> [40|•] -> [50|•] -> null
         Node1     Node2     Node3     Node4     Node5

Each node: [Data | Next Pointer]
Memory Layout: Nodes can be scattered in memory. Only the "next" pointer connects them.

Visualization: Insertion at Beginning

Step 1: Create new node with data = 5
Step 2: Set new node's next = current head
Step 3: Update head to point to new node

Before: Head -> 10 -> 20 -> 30 -> null
After:  Head -> 5 -> 10 -> 20 -> 30 -> null

Singly Linked List Implementation

public class ListNode {
    int val;
    ListNode next;

    ListNode(int val) {
        this.val = val;
        this.next = null;
    }
}

public class LinkedList {
    private ListNode head;
    private int size;

    public LinkedList() {
        head = null;
        size = 0;
    }

    public void add(int val) {
        ListNode newNode = new ListNode(val);
        if (head == null) {
            head = newNode;
        } else {
            ListNode current = head;
            while (current.next != null) {
                current = current.next;
            }
            current.next = newNode;
        }
        size++;
    }

    public void addFirst(int val) {
        ListNode newNode = new ListNode(val);
        newNode.next = head;
        head = newNode;
        size++;
    }

    public boolean remove(int val) {
        if (head == null) return false;
        
        if (head.val == val) {
            head = head.next;
            size--;
            return true;
        }

        ListNode current = head;
        while (current.next != null) {
            if (current.next.val == val) {
                current.next = current.next.next;
                size--;
                return true;
            }
            current = current.next;
        }
        return false;
    }

    public void display() {
        ListNode current = head;
        while (current != null) {
            System.out.print(current.val + " -> ");
            current = current.next;
        }
        System.out.println("null");
    }

    public int size() {
        return size;
    }
}

Doubly Linked List

public class DoublyListNode {
    int val;
    DoublyListNode next;
    DoublyListNode prev;

    DoublyListNode(int val) {
        this.val = val;
        this.next = null;
        this.prev = null;
    }
}

public class DoublyLinkedList {
    private DoublyListNode head;
    private DoublyListNode tail;
    private int size;

    public DoublyLinkedList() {
        head = null;
        tail = null;
        size = 0;
    }

    public void add(int val) {
        DoublyListNode newNode = new DoublyListNode(val);
        if (head == null) {
            head = tail = newNode;
        } else {
            tail.next = newNode;
            newNode.prev = tail;
            tail = newNode;
        }
        size++;
    }

    public void remove(int val) {
        DoublyListNode current = head;
        while (current != null) {
            if (current.val == val) {
                if (current.prev != null) {
                    current.prev.next = current.next;
                } else {
                    head = current.next;
                }
                if (current.next != null) {
                    current.next.prev = current.prev;
                } else {
                    tail = current.prev;
                }
                size--;
                return;
            }
            current = current.next;
        }
    }
}

Time Complexity

Operation               Singly Linked List   Doubly Linked List
Access                  O(n)                 O(n)
Search                  O(n)                 O(n)
Insertion (beginning)   O(1)                 O(1)
Insertion (end)         O(n)                 O(1)
Deletion (given node)   O(n)                 O(1)

Insertion at the end is O(1) for a singly linked list too if a tail pointer is maintained; the implementation above traverses the whole list, so it is O(n). Deletion is O(1) only once the node is located; finding a node by value is O(n) in both variants.

Practice Questions - Linked Lists 10 Questions

Q9
What is the time complexity of inserting an element at the beginning of a singly linked list?
O(n)
O(1)
O(log n)
O(n log n)
Answer: B - O(1)

Inserting at the beginning of a linked list is O(1) because you only need to create a new node, set its next pointer to the current head, and update the head pointer. No traversal is needed.

Q10
What is the time complexity of searching for an element in a singly linked list?
O(n)
O(1)
O(log n)
O(n²)
Answer: A - O(n)

Searching requires traversing the list from head to find the element, which in the worst case requires checking all n nodes, making it O(n) time complexity.

Q11
Which algorithm is used to detect a cycle in a linked list?
Binary Search
Depth First Search
Floyd's Cycle Detection (Tortoise and Hare)
Breadth First Search
Answer: C - Floyd's Cycle Detection

Floyd's Cycle Detection uses two pointers (slow and fast) moving at different speeds. If there's a cycle, they will eventually meet. This is also known as the "tortoise and hare" algorithm.
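
The tortoise-and-hare idea can be sketched in a few lines. The class and method names below (`CycleDemo`, `hasCycle`) are invented for illustration:

```java
public class CycleDemo {
    static class Node {
        int val;
        Node next;
        Node(int val) { this.val = val; }
    }

    // Floyd's cycle detection: fast moves two steps, slow moves one.
    // If there is a cycle, fast eventually laps slow and they meet.
    static boolean hasCycle(Node head) {
        Node slow = head, fast = head;
        while (fast != null && fast.next != null) {
            slow = slow.next;
            fast = fast.next.next;
            if (slow == fast) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        Node a = new Node(1), b = new Node(2), c = new Node(3);
        a.next = b;
        b.next = c;
        System.out.println(hasCycle(a));   // false
        c.next = b;                        // create a cycle: 3 -> 2
        System.out.println(hasCycle(a));   // true
    }
}
```

The method uses O(1) extra space, which is the main advantage over remembering visited nodes in a set.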

Q12
What is the main advantage of a doubly linked list over a singly linked list?
Takes less memory
Faster insertion
Faster deletion
Can traverse in both directions
Answer: D - Can traverse in both directions

Doubly linked lists have pointers to both next and previous nodes, allowing bidirectional traversal. However, they use more memory and require updating two pointers for insertions/deletions.

Q13
What is the time complexity of finding the middle element of a linked list using the two-pointer technique?
O(1)
O(n)
O(log n)
O(n²)
Answer: B - O(n)

Using two pointers (slow moves 1 step, fast moves 2 steps), we traverse the list once. When fast reaches the end, slow is at the middle. This is still O(n) time, but it finds the middle in a single pass instead of first counting the nodes and then walking to position n/2.

Q14
What happens if you try to reverse a linked list with a cycle?
Infinite loop
NullPointerException
The cycle is broken
Only the non-cyclic part is reversed
Answer: A - Infinite loop

Reversing a linked list with a cycle will cause an infinite loop because the reversal algorithm will keep traversing the cycle indefinitely. Always check for cycles before reversing.

Q15
In a circular linked list, how do you identify if you've completed one full traversal?
Check if current == null
Count nodes until count == size
Check if current == head
Use a visited flag
Answer: C - Check if current == head

In a circular linked list, the last node points back to the head. So when traversing, if current == head (and you've moved at least once), you've completed one full cycle.

Q16
What is the space complexity of reversing a linked list iteratively?
O(n)
O(log n)
O(n log n)
O(1)
Answer: D - O(1)

Iterative reversal uses only a constant amount of extra space (for prev, curr, and next pointers), making it O(1) space. Recursive reversal would be O(n) due to the call stack.
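
The three-pointer reversal reads as follows; `ReverseListDemo` and its `reverse` method are illustrative names:

```java
public class ReverseListDemo {
    static class Node {
        int val;
        Node next;
        Node(int val) { this.val = val; }
    }

    // Iterative reversal with three pointers: O(n) time, O(1) extra space
    static Node reverse(Node head) {
        Node prev = null, curr = head;
        while (curr != null) {
            Node next = curr.next;   // remember the rest of the list
            curr.next = prev;        // flip the pointer
            prev = curr;
            curr = next;
        }
        return prev;                 // new head
    }

    public static void main(String[] args) {
        Node head = new Node(1);
        head.next = new Node(2);
        head.next.next = new Node(3);
        Node rev = reverse(head);
        StringBuilder sb = new StringBuilder();
        for (Node n = rev; n != null; n = n.next) sb.append(n.val).append(" ");
        System.out.println(sb.toString().trim());   // 3 2 1
    }
}
```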

Q17
Given two sorted linked lists, what is the time complexity of merging them into one sorted list?
O(1)
O(n + m)
O(n log m)
O(n × m)
Answer: B - O(n + m)

Merging two sorted lists requires visiting each node once from both lists. If list1 has n nodes and list2 has m nodes, the time complexity is O(n + m), which is linear.

Q18
What is the worst-case time complexity of removing a node from a linked list when you only have a reference to that node (not the previous node)?
O(1) by copying next node's data
O(n) to find previous node
O(log n)
Cannot be done
Answer: A - O(1) by copying next node's data

You can delete a node in O(1) by copying the next node's data to the current node and then deleting the next node. This works unless it's the last node, which requires O(n) to find the previous node.

Stacks and Queues

Stacks and queues are linear data structures that follow specific ordering principles.

Stack Implementation

A stack follows LIFO (Last In First Out) principle.

import java.util.EmptyStackException;

public class Stack {
    private int[] stack;
    private int top;
    private int capacity;

    public Stack(int capacity) {
        this.capacity = capacity;
        this.stack = new int[capacity];
        this.top = -1;
    }

    public void push(int element) {
        if (top >= capacity - 1) {
            throw new IllegalStateException("Stack is full");
        }
        stack[++top] = element;
    }

    public int pop() {
        if (isEmpty()) {
            throw new EmptyStackException();
        }
        return stack[top--];
    }

    public int peek() {
        if (isEmpty()) {
            throw new EmptyStackException();
        }
        return stack[top];
    }

    public boolean isEmpty() {
        return top == -1;
    }

    public int size() {
        return top + 1;
    }
}
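
A classic application of the LIFO discipline is checking balanced brackets: the most recently opened bracket must be the first one closed. A self-contained sketch (using java.util.ArrayDeque as the stack; class and method names are invented here):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class BalancedBrackets {
    // Push opening brackets; each closing bracket must match the top of the stack
    static boolean isBalanced(String s) {
        Deque<Character> stack = new ArrayDeque<>();
        for (char c : s.toCharArray()) {
            if (c == '(' || c == '[' || c == '{') {
                stack.push(c);
            } else if (c == ')' || c == ']' || c == '}') {
                if (stack.isEmpty()) return false;   // closing with nothing open
                char open = stack.pop();
                if ((c == ')' && open != '(') ||
                    (c == ']' && open != '[') ||
                    (c == '}' && open != '{')) return false;
            }
        }
        return stack.isEmpty();   // every opener must have been closed
    }

    public static void main(String[] args) {
        System.out.println(isBalanced("{[()]}"));  // true
        System.out.println(isBalanced("{[(])}"));  // false
    }
}
```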

Queue Implementation

A queue follows FIFO (First In First Out) principle.

public class Queue {
    private int[] queue;
    private int front;
    private int rear;
    private int size;
    private int capacity;

    public Queue(int capacity) {
        this.capacity = capacity;
        this.queue = new int[capacity];
        this.front = 0;
        this.rear = -1;
        this.size = 0;
    }

    public void enqueue(int element) {
        if (isFull()) {
            throw new IllegalStateException("Queue is full");
        }
        rear = (rear + 1) % capacity;
        queue[rear] = element;
        size++;
    }

    public int dequeue() {
        if (isEmpty()) {
            throw new IllegalStateException("Queue is empty");
        }
        int element = queue[front];
        front = (front + 1) % capacity;
        size--;
        return element;
    }

    public int peek() {
        if (isEmpty()) {
            throw new IllegalStateException("Queue is empty");
        }
        return queue[front];
    }

    public boolean isEmpty() {
        return size == 0;
    }

    public boolean isFull() {
        return size == capacity;
    }
}
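
The wrap-around arithmetic `(rear + 1) % capacity` used above can be watched in isolation; `CircularIndexDemo` is an illustrative name:

```java
public class CircularIndexDemo {
    // Advance an index one slot, wrapping back to 0 at capacity
    static int next(int index, int capacity) {
        return (index + 1) % capacity;
    }

    public static void main(String[] args) {
        int capacity = 3;
        int rear = -1;   // same initial value as the Queue above
        for (int i = 0; i < 5; i++) {
            rear = next(rear, capacity);
            System.out.print(rear + " ");
        }
        System.out.println();
        // prints: 0 1 2 0 1
    }
}
```

This wrapping is what lets the queue reuse slots freed by dequeue without ever shifting elements, keeping both operations O(1).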

Using Java Collections

import java.util.Stack;
import java.util.Queue;
import java.util.LinkedList;
import java.util.ArrayDeque;

// Stack using Java Collections
Stack<Integer> stack = new Stack<>();
stack.push(10);
int top = stack.pop();

// Queue using LinkedList
Queue<String> queue = new LinkedList<>();
queue.offer("First");
queue.offer("Second");
String head = queue.poll();        // "First" (FIFO)

// Deque (Double-ended queue)
ArrayDeque<Integer> deque = new ArrayDeque<>();
deque.addFirst(1);
deque.addLast(2);
int front = deque.removeFirst();   // 1
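
Note that the Javadoc for java.util.Stack itself recommends ArrayDeque for new code (Stack is a legacy class with synchronized methods). A deque used as a stack looks like this:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class DequeAsStack {
    public static void main(String[] args) {
        // ArrayDeque's push/pop/peek operate on the head, giving LIFO order
        Deque<Integer> stack = new ArrayDeque<>();
        stack.push(1);
        stack.push(2);
        System.out.println(stack.pop());   // 2 (last in, first out)
        System.out.println(stack.peek());  // 1
    }
}
```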

Practice Questions - Stacks and Queues 8 Questions

Q19
What is the time complexity of push and pop operations in a stack implemented using an array?
O(n)
O(1)
O(log n)
O(n²)
Answer: B - O(1)

Push and pop operations in a stack are O(1) because they only involve accessing the top element, which can be done in constant time using an index pointer.

Q20
Which data structure is used to implement function call stack in programming languages?
Queue
Stack
Linked List
Tree
Answer: B - Stack

Function calls use a stack (LIFO) to manage execution. When a function is called, it's pushed onto the stack, and when it returns, it's popped.

Q21
What is the time complexity of enqueue and dequeue operations in a queue implemented using a circular array?
O(n)
O(1)
O(log n)
O(n log n)
Answer: B - O(1)

Using a circular array with front and rear pointers, both enqueue and dequeue operations can be performed in O(1) time.

Q22
Which algorithm uses a stack for its implementation?
BFS
DFS
Dijkstra's
Binary Search
Answer: B - DFS

Depth-First Search uses a stack (either explicitly or via recursion) to keep track of vertices to visit next.

Q23
What happens when you try to pop from an empty stack?
Returns null
Returns 0
Throws StackUnderflowException
Returns -1
Answer: C - Throws StackUnderflowException

Popping from an empty stack is an error condition and should raise an exception. Textbook implementations often call this a "stack underflow"; Java's java.util.Stack throws EmptyStackException in this case.

Q24
What is the space complexity of implementing a stack using a linked list?
O(1)
O(n)
O(log n)
O(n²)
Answer: B - O(n)

Each node in the linked list requires space for data and a pointer, so the space complexity is O(n) where n is the number of elements.

Q25
In a priority queue, what is the time complexity of extracting the maximum element?
O(1)
O(log n)
O(n)
O(n log n)
Answer: B - O(log n)

When implemented as a max-heap, extracting the maximum requires O(log n) time to maintain the heap property after removal.

Q26
Which of the following is NOT a valid application of stacks?
Expression evaluation
Function call management
BFS traversal
Undo operations
Answer: C - BFS traversal

BFS uses a queue, not a stack. Stacks are used for expression evaluation (postfix/infix), function calls, and undo operations.

Trees

Trees are hierarchical data structures with a root node and child nodes. Each node can have zero or more children.

Binary Tree Node

public class TreeNode {
    int val;
    TreeNode left;
    TreeNode right;

    TreeNode(int val) {
        this.val = val;
        this.left = null;
        this.right = null;
    }
}

Binary Tree Traversals

import java.util.LinkedList;
import java.util.Queue;

public class BinaryTree {
    private TreeNode root;

    // Pre-order: Root -> Left -> Right
    public void preOrder(TreeNode node) {
        if (node == null) return;
        System.out.print(node.val + " ");
        preOrder(node.left);
        preOrder(node.right);
    }

    // In-order: Left -> Root -> Right
    public void inOrder(TreeNode node) {
        if (node == null) return;
        inOrder(node.left);
        System.out.print(node.val + " ");
        inOrder(node.right);
    }

    // Post-order: Left -> Right -> Root
    public void postOrder(TreeNode node) {
        if (node == null) return;
        postOrder(node.left);
        postOrder(node.right);
        System.out.print(node.val + " ");
    }

    // Level-order (BFS)
    public void levelOrder(TreeNode root) {
        if (root == null) return;
        
        Queue<TreeNode> queue = new LinkedList<>();
        queue.offer(root);
        
        while (!queue.isEmpty()) {
            TreeNode node = queue.poll();
            System.out.print(node.val + " ");
            
            if (node.left != null) {
                queue.offer(node.left);
            }
            if (node.right != null) {
                queue.offer(node.right);
            }
        }
    }
}
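
The four traversal orders can be compared on a small concrete tree. The sketch below is self-contained (it defines its own minimal node class rather than reusing TreeNode, and collects values into lists instead of printing; all names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class TraversalDemo {
    static class Node {
        int val;
        Node left, right;
        Node(int val) { this.val = val; }
    }

    // In-order: Left -> Root -> Right
    static void inOrder(Node n, List<Integer> out) {
        if (n == null) return;
        inOrder(n.left, out);
        out.add(n.val);
        inOrder(n.right, out);
    }

    // Pre-order: Root -> Left -> Right
    static void preOrder(Node n, List<Integer> out) {
        if (n == null) return;
        out.add(n.val);
        preOrder(n.left, out);
        preOrder(n.right, out);
    }

    public static void main(String[] args) {
        //        1
        //       / \
        //      2   3
        //     / \
        //    4   5
        Node root = new Node(1);
        root.left = new Node(2);
        root.right = new Node(3);
        root.left.left = new Node(4);
        root.left.right = new Node(5);

        List<Integer> in = new ArrayList<>();
        inOrder(root, in);
        System.out.println(in);    // [4, 2, 5, 1, 3]

        List<Integer> pre = new ArrayList<>();
        preOrder(root, pre);
        System.out.println(pre);   // [1, 2, 4, 5, 3]
    }
}
```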

Binary Search Tree (BST)

public class BinarySearchTree {
    private TreeNode root;

    public void insert(int val) {
        root = insertRecursive(root, val);
    }

    private TreeNode insertRecursive(TreeNode root, int val) {
        if (root == null) {
            return new TreeNode(val);
        }

        if (val < root.val) {
            root.left = insertRecursive(root.left, val);
        } else if (val > root.val) {
            root.right = insertRecursive(root.right, val);
        }

        return root;
    }

    public boolean search(int val) {
        return searchRecursive(root, val);
    }

    private boolean searchRecursive(TreeNode root, int val) {
        if (root == null) return false;
        if (root.val == val) return true;
        
        if (val < root.val) {
            return searchRecursive(root.left, val);
        } else {
            return searchRecursive(root.right, val);
        }
    }

    public void delete(int val) {
        root = deleteRecursive(root, val);
    }

    private TreeNode deleteRecursive(TreeNode root, int val) {
        if (root == null) return null;

        if (val < root.val) {
            root.left = deleteRecursive(root.left, val);
        } else if (val > root.val) {
            root.right = deleteRecursive(root.right, val);
        } else {
            // Node to delete found
            if (root.left == null) {
                return root.right;
            } else if (root.right == null) {
                return root.left;
            }

            // Node with two children
            root.val = minValue(root.right);
            root.right = deleteRecursive(root.right, root.val);
        }

        return root;
    }

    private int minValue(TreeNode root) {
        int min = root.val;
        while (root.left != null) {
            min = root.left.val;
            root = root.left;
        }
        return min;
    }
}
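
A quick check of the BST ordering property: inserting values in arbitrary order and walking the tree in-order yields them sorted. This self-contained sketch condenses the insert and in-order logic above (names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class BstDemo {
    static class Node {
        int val;
        Node left, right;
        Node(int val) { this.val = val; }
    }

    // Same recursive insert as the BST above: smaller goes left, larger goes right
    static Node insert(Node root, int val) {
        if (root == null) return new Node(val);
        if (val < root.val) root.left = insert(root.left, val);
        else if (val > root.val) root.right = insert(root.right, val);
        return root;
    }

    // In-order traversal of a BST visits values in ascending order
    static void inOrder(Node n, List<Integer> out) {
        if (n == null) return;
        inOrder(n.left, out);
        out.add(n.val);
        inOrder(n.right, out);
    }

    public static void main(String[] args) {
        Node root = null;
        for (int v : new int[]{50, 30, 70, 20, 40}) {
            root = insert(root, v);
        }
        List<Integer> sorted = new ArrayList<>();
        inOrder(root, sorted);
        System.out.println(sorted);   // [20, 30, 40, 50, 70]
    }
}
```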

Practice Questions - Trees 12 Questions

Q27
What is the time complexity of searching for an element in a Binary Search Tree (BST)?
O(1)
O(log n)
O(n)
O(n log n)
Answer: B - O(log n)

In a balanced BST, searching takes O(log n) time as we eliminate half the tree at each step. However, in worst case (skewed tree), it can be O(n).

Q28
What is the worst-case time complexity of searching in a BST?
O(log n)
O(1)
O(n)
O(n²)
Answer: C - O(n)

In a skewed BST (all nodes on one side), searching degenerates to a linear search, resulting in O(n) time complexity.

Q29
What is the time complexity of in-order traversal of a binary tree?
O(n)
O(log n)
O(n log n)
O(1)
Answer: A - O(n)

In-order traversal visits each node exactly once, so the time complexity is O(n) where n is the number of nodes.

Q30
What is the maximum number of nodes in a binary tree of height h?
2^h
2^(h+1) - 1
h^2
2h
Answer: B - 2^(h+1) - 1

A perfect binary tree of height h (height measured in edges) contains 1 + 2 + 4 + ... + 2^h = 2^(h+1) - 1 nodes, which is the maximum possible for that height.

Q31
In a complete binary tree, what is the relationship between the number of leaf nodes and internal nodes?
Leaves = Internal nodes
Leaves = Internal nodes + 1
Leaves = Internal nodes - 1
Leaves = Internal nodes + 1 (for binary trees)
Answer: D - Leaves = Internal nodes + 1

In a full binary tree, where every internal node has exactly two children, the number of leaf nodes is always one more than the number of internal nodes. This is a standard counting property of such trees.

Q32
What is the time complexity of finding the minimum element in a BST?
O(1)
O(n)
O(log n)
O(n log n)
Answer: C - O(log n)

In a balanced BST, finding the minimum requires traversing from root to the leftmost node, which takes O(log n) time in a balanced tree, or O(n) in worst case.

Q33
What is the space complexity of recursive tree traversal?
O(h) where h is height
O(n)
O(log n)
O(1)
Answer: A - O(h) where h is height

Recursive traversal uses the call stack, which has a depth equal to the height of the tree. For balanced trees, h = log n, but for skewed trees, h = n.

Q34
Which traversal gives sorted order for a BST?
Pre-order
In-order
Post-order
Level-order
Answer: B - In-order

In-order traversal (left-root-right) of a BST visits nodes in ascending order, giving a sorted sequence.

Q35
What is the time complexity of deleting a node with two children from a BST?
O(1)
O(n)
O(n²)
O(log n)
Answer: D - O(log n)

Deleting a node with two children requires finding the inorder successor (or predecessor), which takes O(log n) in a balanced BST, then replacing and adjusting pointers.

Q36
What is the minimum height of a binary tree with n nodes?
n
log n
⌊log₂(n)⌋
n/2
Answer: C - ⌊log₂(n)⌋

The minimum height occurs when the tree is complete/balanced. For n nodes, the minimum height is ⌊log₂(n)⌋, which is achieved in a complete binary tree.

Q37
What data structure is typically used to implement level-order (BFS) traversal of a tree?
Queue
Stack
Array
Linked List
Answer: A - Queue

Level-order traversal uses a queue to process nodes level by level. Nodes are enqueued as they're discovered and dequeued for processing.

Q38
In a balanced BST with n nodes, what is the worst-case time complexity of insertion?
O(1)
O(log n)
O(n)
O(n log n)
Answer: B - O(log n)

In a balanced BST, insertion requires finding the correct position by traversing from root to leaf, which takes O(log n) time. The tree remains balanced after insertion.

Graphs

Graphs are collections of nodes (vertices) connected by edges. They can be directed or undirected, weighted or unweighted.

Graph Representation: Adjacency List

import java.util.*;

public class Graph {
    private int vertices;
    private List<List<Integer>> adjacencyList;

    public Graph(int vertices) {
        this.vertices = vertices;
        this.adjacencyList = new ArrayList<>();
        for (int i = 0; i < vertices; i++) {
            adjacencyList.add(new ArrayList<>());
        }
    }

    public void addEdge(int source, int destination) {
        adjacencyList.get(source).add(destination);
        // For undirected graph, uncomment:
        // adjacencyList.get(destination).add(source);
    }

    public void printGraph() {
        for (int i = 0; i < vertices; i++) {
            System.out.print("Vertex " + i + ": ");
            for (Integer neighbor : adjacencyList.get(i)) {
                System.out.print(neighbor + " ");
            }
            System.out.println();
        }
    }
}

Graph Traversal: BFS

public void bfs(int startVertex) {
    boolean[] visited = new boolean[vertices];
    Queue<Integer> queue = new LinkedList<>();

    visited[startVertex] = true;
    queue.offer(startVertex);

    while (!queue.isEmpty()) {
        int vertex = queue.poll();
        System.out.print(vertex + " ");

        for (Integer neighbor : adjacencyList.get(vertex)) {
            if (!visited[neighbor]) {
                visited[neighbor] = true;
                queue.offer(neighbor);
            }
        }
    }
}

Graph Traversal: DFS

public void dfs(int startVertex) {
    boolean[] visited = new boolean[vertices];
    dfsRecursive(startVertex, visited);
}

private void dfsRecursive(int vertex, boolean[] visited) {
    visited[vertex] = true;
    System.out.print(vertex + " ");

    for (Integer neighbor : adjacencyList.get(vertex)) {
        if (!visited[neighbor]) {
            dfsRecursive(neighbor, visited);
        }
    }
}
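
The BFS and DFS methods above are meant to live inside the Graph class. Here is a self-contained sketch that builds a small undirected graph and returns the BFS visit order (class and method names are illustrative):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

public class GraphDemo {
    // BFS from `start`, returning vertices in visit order
    static List<Integer> bfs(List<List<Integer>> adj, int start) {
        boolean[] visited = new boolean[adj.size()];
        Queue<Integer> queue = new ArrayDeque<>();
        List<Integer> order = new ArrayList<>();
        visited[start] = true;
        queue.offer(start);
        while (!queue.isEmpty()) {
            int v = queue.poll();
            order.add(v);
            for (int nb : adj.get(v)) {
                if (!visited[nb]) {
                    visited[nb] = true;
                    queue.offer(nb);
                }
            }
        }
        return order;
    }

    public static void main(String[] args) {
        List<List<Integer>> adj = new ArrayList<>();
        for (int i = 0; i < 5; i++) adj.add(new ArrayList<>());
        int[][] edges = {{0, 1}, {0, 2}, {1, 3}, {2, 4}};
        for (int[] e : edges) {          // undirected: add both directions
            adj.get(e[0]).add(e[1]);
            adj.get(e[1]).add(e[0]);
        }
        System.out.println(bfs(adj, 0));   // [0, 1, 2, 3, 4]
    }
}
```

Note how vertices at distance 1 from the start (1 and 2) come out before vertices at distance 2 (3 and 4): BFS explores level by level.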

Practice Questions - Graphs 10 Questions

Q39
What is the time complexity of BFS traversal of a graph with V vertices and E edges?
O(V)
O(V + E)
O(V × E)
O(E)
Answer: B - O(V + E)

BFS visits each vertex once (O(V)) and examines each edge once (O(E)), resulting in O(V + E) time complexity.

Q40
What is the space complexity of DFS traversal using recursion?
O(V)
O(E)
O(V + E)
O(1)
Answer: A - O(V)

DFS recursion uses the call stack, which can have at most V frames (one per vertex in the longest path), plus O(V) for the visited array.

Q41
Which data structure is used to implement BFS?
Stack
Priority Queue
Queue
Heap
Answer: C - Queue

BFS uses a queue (FIFO) to process vertices level by level. Vertices are enqueued as discovered and dequeued for processing.

Q42
What is the time complexity of checking if two vertices are connected in an adjacency matrix representation?
O(V)
O(E)
O(V + E)
O(1)
Answer: D - O(1)

In an adjacency matrix, checking if vertex i is connected to vertex j is done by accessing matrix[i][j], which is O(1) time.

Q43
What is the space complexity of storing a graph using an adjacency list?
O(V)
O(V + E)
O(V²)
O(E)
Answer: B - O(V + E)

Adjacency list uses O(V) space for the array of lists and O(E) space for storing all edges, resulting in O(V + E) total space.

Q44
Which algorithm can detect cycles in a directed graph?
DFS with color coding
BFS only
Dijkstra's algorithm
Binary search
Answer: A - DFS with color coding

DFS with color coding (white/gray/black) can detect cycles in directed graphs. A back edge (gray to gray) indicates a cycle.

Q45
What is the time complexity of finding all neighbors of a vertex in an adjacency list?
O(1)
O(V)
O(degree of vertex)
O(E)
Answer: C - O(degree of vertex)

In an adjacency list, finding neighbors requires iterating through the list for that vertex, which takes O(degree) time where degree is the number of edges connected to that vertex.

Q46
What is the maximum number of edges in a directed graph with V vertices?
V(V-1)/2
V(V-1)
2V
Answer: B - V(V-1)

In a directed graph without self-loops, each of the V vertices can have an edge to each of the other V - 1 vertices, so the maximum number of edges is V(V-1). If self-loops were allowed, the maximum would be V².

Q47
What is the time complexity of DFS traversal of a graph?
O(V + E)
O(V)
O(E)
O(V × E)
Answer: A - O(V + E)

DFS visits each vertex once (O(V)) and examines each edge once (O(E)), resulting in O(V + E) time complexity, same as BFS.

Q48
Which representation is better for sparse graphs (few edges)?
Adjacency Matrix
Both are equal
Neither
Adjacency List
Answer: D - Adjacency List

Adjacency list uses O(V + E) space, which is better for sparse graphs. Adjacency matrix uses O(V²) space regardless of edge count, wasting space for sparse graphs.

Hash Tables

Hash tables provide fast insertion, deletion, and lookup operations using a hash function to map keys to array indices.

Simple Hash Table Implementation

public class HashTable {
    private static class Entry {
        String key;
        Integer value;
        Entry next;

        Entry(String key, Integer value) {
            this.key = key;
            this.value = value;
            this.next = null;
        }
    }

    private Entry[] table;
    private int capacity;
    private int size;

    public HashTable(int capacity) {
        this.capacity = capacity;
        this.table = new Entry[capacity];
        this.size = 0;
    }

    private int hash(String key) {
        // Mask the sign bit instead of Math.abs: Math.abs(Integer.MIN_VALUE)
        // is still negative, which would produce an invalid index
        return (key.hashCode() & 0x7fffffff) % capacity;
    }

    public void put(String key, Integer value) {
        int index = hash(key);
        Entry entry = table[index];

        // Check if key already exists
        while (entry != null) {
            if (entry.key.equals(key)) {
                entry.value = value;
                return;
            }
            entry = entry.next;
        }

        // Add new entry
        Entry newEntry = new Entry(key, value);
        newEntry.next = table[index];
        table[index] = newEntry;
        size++;
    }

    public Integer get(String key) {
        int index = hash(key);
        Entry entry = table[index];

        while (entry != null) {
            if (entry.key.equals(key)) {
                return entry.value;
            }
            entry = entry.next;
        }

        return null;
    }

    public boolean remove(String key) {
        int index = hash(key);
        Entry entry = table[index];
        Entry prev = null;

        while (entry != null) {
            if (entry.key.equals(key)) {
                if (prev == null) {
                    table[index] = entry.next;
                } else {
                    prev.next = entry.next;
                }
                size--;
                return true;
            }
            prev = entry;
            entry = entry.next;
        }

        return false;
    }
}

Using Java HashMap

import java.util.HashMap;
import java.util.Map;

// HashMap usage
Map<String, Integer> map = new HashMap<>();
map.put("apple", 5);
map.put("banana", 3);
int count = map.get("apple");
boolean exists = map.containsKey("banana");
map.remove("apple");

// Iterating over HashMap
for (Map.Entry<String, Integer> entry : map.entrySet()) {
    System.out.println(entry.getKey() + ": " + entry.getValue());
}

Practice Questions - Hash Tables 8 Questions

Q49
What is the average time complexity of search, insert, and delete operations in a hash table?
O(n)
O(1)
O(log n)
O(n log n)
Answer: B - O(1)

With a good hash function and proper load factor management, hash table operations average O(1) time. Worst case can be O(n) due to collisions.

Q50
What is the worst-case time complexity of hash table operations?
O(1)
O(log n)
O(n)
O(n²)
Answer: C - O(n)

In worst case, all keys hash to the same bucket, causing a linear search through the collision chain, resulting in O(n) time.

Q51
Which technique is used to handle collisions in hash tables?
Chaining or Open Addressing
Binary Search
Sorting
Recursion
Answer: A - Chaining or Open Addressing

Collisions are handled by chaining (linked list at each bucket) or open addressing (probing to find next available slot).
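Chaining is what the HashTable class above uses. Open addressing can be sketched as a minimal linear-probing table; the names, the fixed capacity, and the omission of deletion and resizing are simplifications for illustration, and the probe loop assumes the table never fills up:

```java
public class LinearProbingTable {
    private final String[] keys;
    private final Integer[] values;
    private final int capacity;

    public LinearProbingTable(int capacity) {
        this.capacity = capacity;
        this.keys = new String[capacity];
        this.values = new Integer[capacity];
    }

    private int hash(String key) {
        return (key.hashCode() & 0x7fffffff) % capacity;
    }

    // On collision, probe the next slot (wrapping around) until an empty
    // slot or the matching key is found, instead of chaining entries.
    public void put(String key, Integer value) {
        int i = hash(key);
        while (keys[i] != null && !keys[i].equals(key)) {
            i = (i + 1) % capacity;  // linear probe
        }
        keys[i] = key;
        values[i] = value;
    }

    public Integer get(String key) {
        int i = hash(key);
        while (keys[i] != null) {
            if (keys[i].equals(key)) return values[i];
            i = (i + 1) % capacity;
        }
        return null;
    }
}
```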

Q52
What is the ideal load factor for a hash table to maintain O(1) average performance?
0.5
1.0
2.0
0.75
Answer: D - 0.75

A load factor of 0.75 (75% full) is commonly used. When exceeded, the table is resized to maintain performance. Java's HashMap uses 0.75 as default.

Q53
What happens when a hash table's load factor exceeds the threshold?
Operations become O(log n)
Table is resized (rehashed)
New entries are rejected
Hash function changes
Answer: B - Table is resized (rehashed)

When load factor exceeds threshold, the table is resized (typically doubled) and all elements are rehashed to new positions to maintain performance.
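As a small illustration of why resizing forces a full rehash: the bucket index depends on the capacity, so the same key generally lands in a different bucket after doubling. The helper below uses the same bucket = hash mod capacity rule as the HashTable above; the specific keys are arbitrary:

```java
public class RehashSketch {
    // Bucket index depends on capacity: hash(key) % capacity.
    static int index(String key, int capacity) {
        return (key.hashCode() & 0x7fffffff) % capacity;
    }

    public static void main(String[] args) {
        String[] keys = {"apple", "banana", "cherry"};
        for (String k : keys) {
            // After doubling capacity the bucket generally changes, so each
            // of the n entries must be re-inserted: rehashing is O(n).
            System.out.println(k + ": bucket " + index(k, 8)
                    + " at capacity 8, bucket " + index(k, 16) + " at capacity 16");
        }
    }
}
```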

Q54
What is the time complexity of rehashing all elements in a hash table?
O(1)
O(log n)
O(n)
O(n²)
Answer: C - O(n)

Rehashing requires visiting all n elements and computing their new hash values, taking O(n) time. However, this happens infrequently, so amortized cost remains O(1).

Q55
In Java's HashMap, what methods must be overridden for custom objects used as keys?
equals() and hashCode()
equals() only
hashCode() only
toString() and equals()
Answer: A - equals() and hashCode()

Both equals() and hashCode() must be overridden. hashCode() determines the bucket, and equals() checks for equality within the bucket. They must be consistent.
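A sketch of such a key class (Point is an illustrative example, not taken from the code above):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

public class Point {
    private final int x, y;

    public Point(int x, int y) { this.x = x; this.y = y; }

    // hashCode() picks the bucket; equals() resolves matches within it.
    // Contract: objects that are equal MUST return equal hash codes.
    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Point)) return false;
        Point p = (Point) o;
        return x == p.x && y == p.y;
    }

    @Override
    public int hashCode() {
        return Objects.hash(x, y);
    }

    public static void main(String[] args) {
        Map<Point, String> map = new HashMap<>();
        map.put(new Point(1, 2), "marker");
        // Lookup with a *different* but equal instance works only because
        // both methods are overridden consistently.
        System.out.println(map.get(new Point(1, 2)));
    }
}
```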

Q56
What is the space complexity of a hash table with n elements?
O(1)
O(log n)
O(n log n)
O(n)
Answer: D - O(n)

Hash table stores n elements, so space complexity is O(n). Additional overhead for buckets/arrays is also O(n), making total space O(n).

Sorting Algorithms

Sorting algorithms arrange elements in a particular order (ascending or descending).

Concept: Sorting Algorithm Categories

  • Comparison-based: Compare elements (Bubble, Quick, Merge) - Lower bound: O(n log n)
  • Non-comparison: Use properties of data (Counting, Radix) - Can achieve O(n)
  • Stable: Maintains relative order of equal elements (Merge, Insertion)
  • In-place: Uses O(1) extra space (Quick, Heap, Bubble)
  • Adaptive: Performs better on partially sorted data (Insertion, Bubble)

Bubble Sort

Visualization: Bubble Sort Step-by-Step

Concept: Repeatedly swap adjacent elements if they're in wrong order. Largest element "bubbles" to the end.

Initial:  64 34 25 12 22 11 90
Pass 1:   34 25 12 22 11 64 90   (64 bubbles to the end)
Pass 2:   25 12 22 11 34 64 90   (34 settles into place)
Pass 3:   12 22 11 25 34 64 90   (25 settles into place)
Pass 4:   12 11 22 25 34 64 90   (22 settles into place)
Pass 5:   11 12 22 25 34 64 90   (12 settles into place)
Sorted:   11 12 22 25 34 64 90

Time Complexity: O(n²) | Space Complexity: O(1)

public void bubbleSort(int[] arr) {
    int n = arr.length;
    for (int i = 0; i < n - 1; i++) {
        for (int j = 0; j < n - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {
                // Swap
                int temp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
            }
        }
    }
}
// Time Complexity: O(n²)
// Space Complexity: O(1)
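The version above always makes n-1 passes. An optimized variant, referenced by the best-case discussion later in this guide, tracks whether a pass made any swaps and stops early; that early exit is what gives Bubble Sort its O(n) best case on already-sorted input:

```java
import java.util.Arrays;

public class OptimizedBubbleSort {
    public static void bubbleSort(int[] arr) {
        int n = arr.length;
        for (int i = 0; i < n - 1; i++) {
            boolean swapped = false;
            for (int j = 0; j < n - i - 1; j++) {
                if (arr[j] > arr[j + 1]) {
                    int temp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = temp;
                    swapped = true;
                }
            }
            if (!swapped) break;  // no swaps: array already sorted, O(n) best case
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 1, 4, 2, 8};
        bubbleSort(data);
        System.out.println(Arrays.toString(data));
    }
}
```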
                

Selection Sort

public void selectionSort(int[] arr) {
    int n = arr.length;
    for (int i = 0; i < n - 1; i++) {
        int minIndex = i;
        for (int j = i + 1; j < n; j++) {
            if (arr[j] < arr[minIndex]) {
                minIndex = j;
            }
        }
        // Swap
        int temp = arr[minIndex];
        arr[minIndex] = arr[i];
        arr[i] = temp;
    }
}
// Time Complexity: O(n²)
// Space Complexity: O(1)
                

Insertion Sort

public void insertionSort(int[] arr) {
    int n = arr.length;
    for (int i = 1; i < n; i++) {
        int key = arr[i];
        int j = i - 1;
        
        while (j >= 0 && arr[j] > key) {
            arr[j + 1] = arr[j];
            j--;
        }
        arr[j + 1] = key;
    }
}
// Time Complexity: O(n²)
// Space Complexity: O(1)
                

Merge Sort

public void mergeSort(int[] arr, int left, int right) {
    if (left < right) {
        int mid = left + (right - left) / 2;
        
        mergeSort(arr, left, mid);
        mergeSort(arr, mid + 1, right);
        
        merge(arr, left, mid, right);
    }
}

private void merge(int[] arr, int left, int mid, int right) {
    int n1 = mid - left + 1;
    int n2 = right - mid;
    
    int[] leftArr = new int[n1];
    int[] rightArr = new int[n2];
    
    System.arraycopy(arr, left, leftArr, 0, n1);
    System.arraycopy(arr, mid + 1, rightArr, 0, n2);
    
    int i = 0, j = 0, k = left;
    
    while (i < n1 && j < n2) {
        if (leftArr[i] <= rightArr[j]) {
            arr[k++] = leftArr[i++];
        } else {
            arr[k++] = rightArr[j++];
        }
    }
    
    while (i < n1) {
        arr[k++] = leftArr[i++];
    }
    
    while (j < n2) {
        arr[k++] = rightArr[j++];
    }
}
// Time Complexity: O(n log n)
// Space Complexity: O(n)
                

Quick Sort

public void quickSort(int[] arr, int low, int high) {
    if (low < high) {
        int pi = partition(arr, low, high);
        
        quickSort(arr, low, pi - 1);
        quickSort(arr, pi + 1, high);
    }
}

private int partition(int[] arr, int low, int high) {
    int pivot = arr[high];
    int i = low - 1;
    
    for (int j = low; j < high; j++) {
        if (arr[j] < pivot) {
            i++;
            int temp = arr[i];
            arr[i] = arr[j];
            arr[j] = temp;
        }
    }
    
    int temp = arr[i + 1];
    arr[i + 1] = arr[high];
    arr[high] = temp;
    
    return i + 1;
}
// Time Complexity: O(n log n) average, O(n²) worst case
// Space Complexity: O(log n)
                

Sorting Algorithm Comparison

Algorithm      | Best Case  | Average Case | Worst Case | Space    | Stable
Bubble Sort    | O(n)       | O(n²)        | O(n²)      | O(1)     | Yes
Selection Sort | O(n²)      | O(n²)        | O(n²)      | O(1)     | No
Insertion Sort | O(n)       | O(n²)        | O(n²)      | O(1)     | Yes
Merge Sort     | O(n log n) | O(n log n)   | O(n log n) | O(n)     | Yes
Quick Sort     | O(n log n) | O(n log n)   | O(n²)      | O(log n) | No

Practice Questions - Sorting Algorithms 12 Questions

Q57
What is the time complexity of Merge Sort?
O(n)
O(n²)
O(n log n)
O(log n)
Answer: C - O(n log n)

Merge Sort divides the array in half (log n levels) and merges n elements at each level, resulting in O(n log n) time complexity in all cases.

Q58
What is the worst-case time complexity of Quick Sort?
O(n log n)
O(n²)
O(n)
O(log n)
Answer: B - O(n²)

Quick Sort's worst case occurs when the pivot is always the smallest or largest element, causing O(n²) time. Average case is O(n log n).

Q59
Which sorting algorithm is stable?
Merge Sort
Quick Sort
Heap Sort
Selection Sort
Answer: A - Merge Sort

Merge Sort is stable because it preserves the relative order of equal elements. Quick Sort and Heap Sort are not stable by default.

Q60
What is the space complexity of Merge Sort?
O(1)
O(log n)
O(n log n)
O(n)
Answer: D - O(n)

Merge Sort requires O(n) extra space for the temporary arrays used during merging. This is the main disadvantage compared to in-place sorts.

Q61
What is the best-case time complexity of Bubble Sort?
O(n²)
O(n)
O(n log n)
O(log n)
Answer: B - O(n)

With an optimized Bubble Sort that checks if any swaps occurred, if the array is already sorted, it can detect this in one pass, making best case O(n).

Q62
Which sorting algorithm has O(1) space complexity?
Merge Sort
Counting Sort
Bubble Sort
Radix Sort
Answer: C - Bubble Sort

Bubble Sort, Insertion Sort, Selection Sort, and Heap Sort are in-place algorithms with O(1) space complexity. Merge Sort requires O(n) space.

Q63
What is the time complexity of Heap Sort?
O(n log n)
O(n)
O(n²)
O(log n)
Answer: A - O(n log n)

Heap Sort builds a heap in O(n) time, then extracts n elements, each extraction taking O(log n), resulting in O(n log n) overall time complexity.
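Heap Sort isn't implemented elsewhere in this guide; a minimal sketch of the two phases described here:

```java
import java.util.Arrays;

public class HeapSort {
    public static void heapSort(int[] arr) {
        int n = arr.length;
        // Phase 1: build a max-heap in O(n).
        for (int i = n / 2 - 1; i >= 0; i--) heapify(arr, n, i);
        // Phase 2: extract the max n times; each sift-down is O(log n),
        // so the whole sort is O(n log n).
        for (int i = n - 1; i > 0; i--) {
            int temp = arr[0]; arr[0] = arr[i]; arr[i] = temp;
            heapify(arr, i, 0);
        }
    }

    // Sift the element at index i down until the heap property holds
    // for the first n elements.
    private static void heapify(int[] arr, int n, int i) {
        int largest = i, left = 2 * i + 1, right = 2 * i + 2;
        if (left < n && arr[left] > arr[largest]) largest = left;
        if (right < n && arr[right] > arr[largest]) largest = right;
        if (largest != i) {
            int temp = arr[i]; arr[i] = arr[largest]; arr[largest] = temp;
            heapify(arr, n, largest);
        }
    }

    public static void main(String[] args) {
        int[] data = {12, 11, 13, 5, 6, 7};
        heapSort(data);
        System.out.println(Arrays.toString(data));
    }
}
```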

Q64
Which sorting algorithm is best for small arrays (typically < 10 elements)?
Merge Sort
Quick Sort
Heap Sort
Insertion Sort
Answer: D - Insertion Sort

Insertion Sort is efficient for small arrays due to its simplicity and low overhead. Many hybrid algorithms (like Timsort) use Insertion Sort for small subarrays.

Q65
What is the average time complexity of Quick Sort?
O(n)
O(n log n)
O(n²)
O(log n)
Answer: B - O(n log n)

On average, Quick Sort partitions the array into roughly equal halves, resulting in O(n log n) time complexity. This is why it's widely used despite O(n²) worst case.

Q66
Which sorting algorithm maintains relative order of elements with equal keys?
Quick Sort
Heap Sort
Insertion Sort
Selection Sort
Answer: C - Insertion Sort

Insertion Sort is stable - it preserves the relative order of equal elements. Merge Sort is also stable, while Quick Sort and Heap Sort are not.

Q67
What is the time complexity of Counting Sort for n elements with range [0, k]?
O(n + k)
O(n log n)
O(n × k)
O(k)
Answer: A - O(n + k)

Counting Sort counts occurrences (O(n)), then reconstructs the array (O(n + k)). It's efficient when k is small compared to n.
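A sketch of the simple tally form of Counting Sort for values in [0, k]. Note this form is not stable; the stable prefix-sum variant is the one used inside Radix Sort:

```java
import java.util.Arrays;

public class CountingSort {
    // Counting Sort for values in [0, k]: O(n + k) time, O(k) extra space.
    public static void countingSort(int[] arr, int k) {
        int[] count = new int[k + 1];
        for (int value : arr) count[value]++;       // O(n): tally occurrences
        int idx = 0;
        for (int value = 0; value <= k; value++) {  // O(n + k): rebuild array
            while (count[value]-- > 0) arr[idx++] = value;
        }
    }

    public static void main(String[] args) {
        int[] data = {4, 2, 2, 8, 3, 3, 1};
        countingSort(data, 8);
        System.out.println(Arrays.toString(data));
    }
}
```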

Q68
Which sorting algorithm uses the divide-and-conquer approach?
Bubble Sort
Insertion Sort
Selection Sort
Merge Sort and Quick Sort
Answer: D - Merge Sort and Quick Sort

Both Merge Sort and Quick Sort use divide-and-conquer: they divide the problem into smaller subproblems, solve them recursively, and combine results.

Searching Algorithms

Searching algorithms find the location of a target value within a data structure.

Linear Search

public int linearSearch(int[] arr, int target) {
    for (int i = 0; i < arr.length; i++) {
        if (arr[i] == target) {
            return i;
        }
    }
    return -1; // Not found
}
// Time Complexity: O(n)
// Space Complexity: O(1)
                

Binary Search

// Iterative approach
public int binarySearch(int[] arr, int target) {
    int left = 0;
    int right = arr.length - 1;
    
    while (left <= right) {
        int mid = left + (right - left) / 2;
        
        if (arr[mid] == target) {
            return mid;
        } else if (arr[mid] < target) {
            left = mid + 1;
        } else {
            right = mid - 1;
        }
    }
    
    return -1; // Not found
}

// Recursive approach
public int binarySearchRecursive(int[] arr, int target, int left, int right) {
    if (left > right) {
        return -1;
    }
    
    int mid = left + (right - left) / 2;
    
    if (arr[mid] == target) {
        return mid;
    } else if (arr[mid] < target) {
        return binarySearchRecursive(arr, target, mid + 1, right);
    } else {
        return binarySearchRecursive(arr, target, left, mid - 1);
    }
}
// Time Complexity: O(log n)
// Space Complexity: O(1) iterative, O(log n) recursive
                

Practice Questions - Searching Algorithms 8 Questions

Q69
What is the time complexity of Binary Search on a sorted array?
O(n)
O(log n)
O(n log n)
O(1)
Answer: B - O(log n)

Binary Search eliminates half the search space at each step, resulting in O(log n) time complexity for searching in a sorted array.

Q70
What is the time complexity of Linear Search?
O(n)
O(log n)
O(1)
O(n²)
Answer: A - O(n)

Linear Search checks each element sequentially until found or the end is reached, resulting in O(n) time complexity in the worst case.

Q71
What is the prerequisite for Binary Search?
Array must be unsorted
Array must contain only integers
Array must be sorted
Array must have unique elements
Answer: C - Array must be sorted

Binary Search requires the array to be sorted (ascending or descending) to correctly eliminate half the search space at each step.

Q72
What is the best-case time complexity of Linear Search?
O(n)
O(1)
O(log n)
O(n log n)
Answer: B - O(1)

Best case occurs when the target element is at the first position, requiring only one comparison, making it O(1).

Q73
What is the space complexity of iterative Binary Search?
O(n)
O(log n)
O(n log n)
O(1)
Answer: D - O(1)

Iterative Binary Search uses only a few variables (left, right, mid), making it O(1) space. Recursive Binary Search uses O(log n) space for the call stack.

Q74
How many comparisons are needed in the worst case for Binary Search on an array of size n?
⌊log₂(n)⌋ + 1
n
n/2
log₂(n)
Answer: A - ⌊log₂(n)⌋ + 1

In worst case, Binary Search needs ⌊log₂(n)⌋ comparisons to narrow down to one element, plus one final comparison, totaling ⌊log₂(n)⌋ + 1.

Q75
Which search algorithm can be used on unsorted arrays?
Binary Search
Ternary Search
Linear Search
Jump Search
Answer: C - Linear Search

Linear Search works on both sorted and unsorted arrays. Binary Search, Ternary Search, and Jump Search all require sorted arrays.

Q76
What is the time complexity of searching in a hash table (average case)?
O(n)
O(1)
O(log n)
O(n log n)
Answer: B - O(1)

With a good hash function and proper load factor, hash table search averages O(1) time, making it faster than Binary Search for large datasets.

Graph Algorithms

Graph algorithms solve problems related to graphs, such as finding shortest paths, detecting cycles, or finding connected components.

Dijkstra's Algorithm (Shortest Path)

import java.util.*;

public class Dijkstra {
    public int[] dijkstra(int[][] graph, int source) {
        int vertices = graph.length;
        int[] distances = new int[vertices];
        boolean[] visited = new boolean[vertices];
        
        Arrays.fill(distances, Integer.MAX_VALUE);
        distances[source] = 0;
        
        PriorityQueue<int[]> pq = new PriorityQueue<>(
            (a, b) -> Integer.compare(a[1], b[1])  // safer than a[1] - b[1], which can overflow
        );
        pq.offer(new int[]{source, 0});
        
        while (!pq.isEmpty()) {
            int[] current = pq.poll();
            int u = current[0];
            
            if (visited[u]) continue;
            visited[u] = true;
            
            for (int v = 0; v < vertices; v++) {
                if (graph[u][v] != 0 && !visited[v]) {
                    int newDist = distances[u] + graph[u][v];
                    if (newDist < distances[v]) {
                        distances[v] = newDist;
                        pq.offer(new int[]{v, newDist});
                    }
                }
            }
        }
        
        return distances;
    }
}

Topological Sort

import java.util.*;

public class TopologicalSort {
    public List<Integer> topologicalSort(int vertices, List<List<Integer>> graph) {
        int[] inDegree = new int[vertices];
        
        // Calculate in-degrees
        for (int i = 0; i < vertices; i++) {
            for (int neighbor : graph.get(i)) {
                inDegree[neighbor]++;
            }
        }
        
        Queue<Integer> queue = new LinkedList<>();
        for (int i = 0; i < vertices; i++) {
            if (inDegree[i] == 0) {
                queue.offer(i);
            }
        }
        
        List<Integer> result = new ArrayList<>();
        while (!queue.isEmpty()) {
            int u = queue.poll();
            result.add(u);
            
            for (int v : graph.get(u)) {
                inDegree[v]--;
                if (inDegree[v] == 0) {
                    queue.offer(v);
                }
            }
        }
        
        return result;
    }
}

Practice Questions - Graph Algorithms 8 Questions

Q77
What is the time complexity of Dijkstra's algorithm using a binary heap?
O(V)
O(V + E)
O((V + E) log V)
O(V²)
Answer: C - O((V + E) log V)

With a binary heap, each extract-min takes O(log V) and we do it V times, plus O(log V) for each edge relaxation, resulting in O((V + E) log V).

Q78
Dijkstra's algorithm works correctly only when:
All edge weights are non-negative
Graph is acyclic
Graph is undirected
Graph is connected
Answer: A - All edge weights are non-negative

Dijkstra's algorithm assumes non-negative weights. With negative weights, it may not find the shortest path. Use Bellman-Ford for negative weights.

Q79
What is the time complexity of Topological Sort?
O(V)
O(V + E)
O(V × E)
O(V²)
Answer: B - O(V + E)

Topological Sort visits each vertex once and examines each edge once, resulting in O(V + E) time complexity.

Q80
Topological Sort can be applied to:
Any graph
Undirected graphs only
Complete graphs
Directed Acyclic Graphs (DAGs)
Answer: D - Directed Acyclic Graphs (DAGs)

Topological Sort only works on DAGs. If a cycle exists, there's no valid topological ordering.

Q81
What is the time complexity of finding shortest paths from a single source using Bellman-Ford algorithm?
O(V × E)
O(V + E)
O(V log V)
O(V²)
Answer: A - O(V × E)

Bellman-Ford relaxes all edges V-1 times, resulting in O(V × E) time complexity. It can handle negative weights unlike Dijkstra's.

Q82
Which algorithm can detect negative cycles in a graph?
Dijkstra's
DFS
Bellman-Ford
BFS
Answer: C - Bellman-Ford

Bellman-Ford can detect negative cycles by checking if distances can still be improved after V-1 iterations. If yes, a negative cycle exists.
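Bellman-Ford isn't implemented elsewhere in this guide; a minimal sketch including the extra pass that detects negative cycles. The edge-list format {from, to, weight} is an assumption of this sketch:

```java
import java.util.Arrays;

public class BellmanFord {
    // Returns shortest distances from source, or null if a negative
    // cycle reachable from the source exists.
    public static long[] shortestPaths(int vertices, int[][] edges, int source) {
        long[] dist = new long[vertices];
        Arrays.fill(dist, Long.MAX_VALUE / 2);  // "infinity" that can't overflow on add
        dist[source] = 0;

        // Relax every edge V-1 times: O(V × E) total.
        for (int i = 0; i < vertices - 1; i++) {
            for (int[] e : edges) {
                if (dist[e[0]] + e[2] < dist[e[1]]) {
                    dist[e[1]] = dist[e[0]] + e[2];
                }
            }
        }
        // One more pass: any further improvement means a negative cycle.
        for (int[] e : edges) {
            if (dist[e[0]] + e[2] < dist[e[1]]) return null;
        }
        return dist;
    }
}
```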

Q83
What is the time complexity of finding all pairs shortest paths using Floyd-Warshall algorithm?
O(V + E)
O(V³)
O(V²)
O(V × E)
Answer: B - O(V³)

Floyd-Warshall uses three nested loops over V vertices, resulting in O(V³) time complexity. It finds shortest paths between all pairs of vertices.

Q84
Which algorithm is used to find minimum spanning tree?
Dijkstra's
Bellman-Ford
Topological Sort
Kruskal's or Prim's
Answer: D - Kruskal's or Prim's

Kruskal's and Prim's algorithms find Minimum Spanning Trees. Kruskal's uses Union-Find, Prim's uses a priority queue. Both are greedy algorithms.

Dynamic Programming

Dynamic Programming is a technique for solving complex problems by breaking them down into simpler subproblems and storing the results.

Fibonacci Sequence (Memoization)

import java.util.HashMap;
import java.util.Map;

public class Fibonacci {
    private Map<Integer, Long> memo = new HashMap<>();
    
    public long fibonacci(int n) {
        if (n <= 1) {
            return n;
        }
        
        if (memo.containsKey(n)) {
            return memo.get(n);
        }
        
        long result = fibonacci(n - 1) + fibonacci(n - 2);
        memo.put(n, result);
        return result;
    }
    
    // Bottom-up approach
    public long fibonacciDP(int n) {
        if (n <= 1) return n;
        
        long[] dp = new long[n + 1];
        dp[0] = 0;
        dp[1] = 1;
        
        for (int i = 2; i <= n; i++) {
            dp[i] = dp[i - 1] + dp[i - 2];
        }
        
        return dp[n];
    }
}

Longest Common Subsequence

public int longestCommonSubsequence(String text1, String text2) {
    int m = text1.length();
    int n = text2.length();
    int[][] dp = new int[m + 1][n + 1];
    
    for (int i = 1; i <= m; i++) {
        for (int j = 1; j <= n; j++) {
            if (text1.charAt(i - 1) == text2.charAt(j - 1)) {
                dp[i][j] = dp[i - 1][j - 1] + 1;
            } else {
                dp[i][j] = Math.max(dp[i - 1][j], dp[i][j - 1]);
            }
        }
    }
    
    return dp[m][n];
}

0/1 Knapsack Problem

public int knapsack(int[] weights, int[] values, int capacity) {
    int n = weights.length;
    int[][] dp = new int[n + 1][capacity + 1];
    
    for (int i = 1; i <= n; i++) {
        for (int w = 1; w <= capacity; w++) {
            if (weights[i - 1] <= w) {
                dp[i][w] = Math.max(
                    dp[i - 1][w],
                    dp[i - 1][w - weights[i - 1]] + values[i - 1]
                );
            } else {
                dp[i][w] = dp[i - 1][w];
            }
        }
    }
    
    return dp[n][capacity];
}

Practice Questions - Dynamic Programming 10 Questions

Q85
What is the main principle behind Dynamic Programming?
Divide and conquer
Store solutions to subproblems to avoid recomputation
Greedy selection
Random selection
Answer: B - Store solutions to subproblems to avoid recomputation

DP optimizes recursive solutions by storing results of subproblems (memoization/tabulation) to avoid solving the same subproblem multiple times.

Q86
What is the time complexity of Fibonacci using memoization?
O(2^n)
O(n²)
O(n)
O(log n)
Answer: C - O(n)

With memoization, each Fibonacci number is computed only once, reducing time from O(2^n) to O(n). Space is also O(n) for the memo table.

Q87
What are the two approaches to implement Dynamic Programming?
Top-down (memoization) and Bottom-up (tabulation)
Recursive and Iterative
Greedy and Divide-conquer
BFS and DFS
Answer: A - Top-down (memoization) and Bottom-up (tabulation)

Top-down starts from the problem and recursively solves subproblems with memoization. Bottom-up builds solutions from smallest subproblems upward using a table.

Q88
What is the time complexity of the 0/1 Knapsack problem using DP?
O(n)
O(n log n)
O(2^n)
O(n × W) where W is capacity
Answer: D - O(n × W) where W is capacity

DP solution fills a 2D table of size n × W, where each cell takes O(1) time to compute, resulting in O(n × W) time complexity.

Q89
What is the time complexity of Longest Common Subsequence (LCS) using DP?
O(n)
O(m × n)
O(2^(m+n))
O(m + n)
Answer: B - O(m × n)

LCS fills a 2D table of size m × n where m and n are lengths of the two strings. Each cell takes O(1) time, resulting in O(m × n) complexity.

Q90
Which property must a problem have to be solved using Dynamic Programming?
Greedy choice property
Optimal substructure only
Optimal substructure and overlapping subproblems
Divide and conquer property
Answer: C - Optimal substructure and overlapping subproblems

DP requires optimal substructure (optimal solution contains optimal solutions to subproblems) and overlapping subproblems (same subproblems recur).

Q91
What is the space complexity of Fibonacci using bottom-up DP?
O(1) with optimization
O(n)
O(n²)
O(log n)
Answer: A - O(1) with optimization

By storing only the last two values instead of the entire array, Fibonacci can be computed in O(1) space. Standard bottom-up uses O(n) space.
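The O(1)-space optimization described here, as a sketch:

```java
public class FibonacciOptimized {
    // Keeps only the last two values instead of the whole dp[] array:
    // O(n) time, O(1) space.
    public static long fibonacci(int n) {
        if (n <= 1) return n;
        long prev = 0, curr = 1;
        for (int i = 2; i <= n; i++) {
            long next = prev + curr;
            prev = curr;
            curr = next;
        }
        return curr;
    }

    public static void main(String[] args) {
        System.out.println(fibonacci(10));  // 55
    }
}
```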

Q92
What is the difference between memoization and tabulation?
Memoization is iterative, tabulation is recursive
Tabulation uses less space
Memoization is faster
Memoization is top-down recursive, tabulation is bottom-up iterative
Answer: D - Memoization is top-down recursive, tabulation is bottom-up iterative

Memoization starts from the main problem and recursively solves subproblems. Tabulation builds solutions from smallest subproblems upward iteratively.

Q93
What is the time complexity of computing nth Fibonacci number using naive recursion?
O(n)
O(2^n)
O(n²)
O(log n)
Answer: B - O(2^n)

Naive recursion recalculates the same values multiple times, creating an exponential recursion tree with O(2^n) time complexity.

Q94
In DP, what does "state" refer to?
The current position in the array
The algorithm being used
The parameters that define a subproblem
The data structure used
Answer: C - The parameters that define a subproblem

State represents the parameters that uniquely identify a subproblem. For example, in LCS, state is (i, j) representing positions in the two strings.

Time Complexity Analysis

Time complexity describes how the runtime of an algorithm grows with the input size.

Big O Notation

Notation   | Name         | Example
O(1)       | Constant     | Array access, HashMap lookup
O(log n)   | Logarithmic  | Binary search, BST operations
O(n)       | Linear       | Linear search, traversing an array
O(n log n) | Linearithmic | Merge sort, Quick sort (average)
O(n²)      | Quadratic    | Bubble sort, nested loops
O(2ⁿ)      | Exponential  | Recursive Fibonacci (naive)
Tip: When analyzing time complexity, focus on the dominant term and ignore constants. For example, O(3n + 5) simplifies to O(n).

Practice Questions - Time Complexity 6 Questions

Q95
What does Big O notation represent?
Exact running time
Upper bound on growth rate
Lower bound on growth rate
Average case complexity
Answer: B - Upper bound on growth rate

Big O notation describes the worst-case upper bound of an algorithm's time complexity, showing how runtime grows as input size increases.

Q96
What does O(3n² + 2n + 1) simplify to?
O(3n²)
O(n² + n)
O(n²)
O(3n² + 2n)
Answer: C - O(n²)

We drop constants and lower-order terms. O(3n² + 2n + 1) simplifies to O(n²) as n² is the dominant term.

Q97
Which of the following has the best time complexity?
O(1)
O(log n)
O(n)
O(n log n)
Answer: A - O(1)

O(1) is constant time - the best possible. Then O(log n), O(n), O(n log n), O(n²), etc. in increasing order of complexity.

Q98
What is the time complexity of nested loops where outer loop runs n times and inner loop runs m times?
O(n)
O(m)
O(n + m)
O(n × m)
Answer: D - O(n × m)

Nested loops multiply their complexities. Outer loop runs n times, and for each iteration, inner loop runs m times, resulting in O(n × m) total operations.

Q99
What is the time complexity of an algorithm that performs a constant-time operation n² times?
O(n)
O(n²)
O(1)
O(log n)
Answer: B - O(n²)

If an O(1) operation is performed n² times, the total time complexity is O(1 × n²) = O(n²).

Q100
Which notation represents the tight bound (both upper and lower bound)?
Big O
Big Omega (Ω)
Big Theta (Θ)
Little o
Answer: C - Big Theta (Θ)

Big Theta (Θ) represents a tight bound - both upper and lower bounds are the same. Big O is upper bound, Big Omega (Ω) is lower bound.

Best Practices

1. Choose the Right Data Structure

  • Use ArrayList for frequent random access
  • Use LinkedList for frequent insertions/deletions
  • Use HashMap for O(1) lookups
  • Use TreeSet for sorted unique elements
  • Use PriorityQueue for priority-based operations

2. Optimize Space Usage

  • Use iterative solutions when possible to avoid stack overflow
  • Reuse arrays/collections instead of creating new ones
  • Consider space-time tradeoffs

3. Handle Edge Cases

  • Empty arrays/lists
  • Single element
  • Null values
  • Duplicate values
  • Very large inputs

4. Code Readability

  • Use meaningful variable names
  • Add comments for complex logic
  • Break down complex algorithms into smaller functions
  • Follow Java naming conventions

5. Testing

  • Test with small inputs first
  • Test edge cases
  • Test with large inputs for performance
  • Verify correctness with known outputs
Remember: Understanding the problem thoroughly before coding is more important than writing code quickly. Take time to analyze the requirements and choose the appropriate algorithm and data structure.

Summary

This guide covered essential algorithms and data structures in Java:

  • Data Structures: Arrays, Linked Lists, Stacks, Queues, Trees, Graphs, Hash Tables
  • Sorting: Bubble, Selection, Insertion, Merge, Quick Sort
  • Searching: Linear and Binary Search
  • Graph Algorithms: BFS, DFS, Dijkstra's, Topological Sort
  • Dynamic Programming: Memoization, Tabulation, Classic Problems

Practice implementing these concepts and solving problems on platforms like LeetCode, HackerRank, or CodeChef to strengthen your understanding.
