Department of Informatics
Faculty of Industrial Engineering
Universitas Pembangunan Nasional “Veteran” Yogyakarta
Data Structure
Andi Nurkholis, S.Kom., M.Kom.
Recursion: Concept & Implementation
November 3, 2025
Definition of Recursion
Recursion is a programming technique in which a function calls itself—either
directly or indirectly—to solve a large problem by breaking it down into
smaller, similar subproblems. This process continues until it reaches a base
case, which can be solved directly without further recursive calls. In essence,
recursion defines a problem in terms of itself.
A simple example: the factorial of n (written as n!) is defined as follows:
n! = n × (n - 1)!   // recursive case: defined in terms of the previous factorial, for n > 1
1! = 1              // base case that stops the recursion
Structure of Recursion
The basic structure of a recursive function consists of two key components:
1. Base Case: The condition that stops the recursion. Without a base case, the
function calls would continue indefinitely, leading to a stack overflow.
2. Recursive Case: Breaks the problem into smaller parts and calls the function
itself.
General structure of a recursive function:
data_type function(parameter) {
    if (base_case_condition) {
        // base case: direct solution
        return value;
    } else {
        // recursive case: call the function again on a smaller problem
        return function(smaller_parameter);
    }
}
Example of Recursion
1. Factorial Recursion
2. Fibonacci Recursion
Factorial Recursion
One of the classic examples of recursion
is the factorial function. The factorial of a
number n (denoted as n!) is the product of
all positive integers up to n.
Factorial Formula:
$n! = n \times (n - 1)!$ for $n > 0$
$0! = 1$ (base case)
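Below is a minimal C sketch of this definition (the function name factorial and the use of long are illustrative choices):

#include <stdio.h>

// Recursive factorial: 0! = 1 is the base case,
// n! = n * (n - 1)! is the recursive case.
long factorial(int n) {
    if (n == 0) {
        return 1;                       // base case
    }
    return n * factorial(n - 1);        // recursive case
}

int main(void) {
    printf("5! = %ld\n", factorial(5)); // prints 5! = 120
    return 0;
}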
Fibonacci Recursion
The Fibonacci sequence is a series of
numbers that begins with 0 and 1, where
each subsequent number is the sum of the
two preceding ones.
Fibonacci Sequence Formula:
$F(0) = 0$
$F(1) = 1$
$F(n) = F(n-1) + F(n-2)$ for $n > 1$
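A direct C sketch of this definition (naive version; as discussed later, the two recursive calls make it exponential without memoization):

#include <stdio.h>

long fib(int n) {
    if (n == 0) return 0;               // base case F(0) = 0
    if (n == 1) return 1;               // base case F(1) = 1
    return fib(n - 1) + fib(n - 2);     // recursive case F(n) = F(n-1) + F(n-2)
}

int main(void) {
    printf("F(10) = %ld\n", fib(10));   // prints F(10) = 55
    return 0;
}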
Workflow of Recursion
1. Function Call with Smaller Arguments
2. Information Storage in the Call Stack
3. Reaching the Base Case and Backtracking Process
4. Combining Results to Obtain the Final Solution
Workflow of Recursion
1. Function Call with Smaller Arguments
✓ A recursive function breaks a large problem into smaller subproblems.
✓ Each new call uses a simpler argument, moving closer to the base case.
✓ Example: In factorial calculation, factorial(5) calls factorial(4), then factorial(3), and so on.
✓ Illustration: factorial(5) → 5 × factorial(4) → 5 × (4 × factorial(3)) → … until factorial(1)
Workflow of Recursion
2. Information Storage in the Call Stack
✓ Each function call stores its parameters, return address, and local variables on the call stack.
✓ This stack operates on the LIFO (Last In, First Out) principle: the last call made is the first one to complete.
✓ The call stack lets the program know where to return once each recursive call finishes.
✓ Example: When factorial(5) is called, the call stack holds every call, each waiting for the result of the next, deeper call, until factorial(1) is reached as the base case.
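A small instrumented sketch (the printf lines are added purely for illustration) makes this visible: entering each call corresponds to a push, and returning corresponds to a pop, in LIFO order.

#include <stdio.h>

long factorial(int n) {
    printf("push: factorial(%d)\n", n);            // frame pushed onto the call stack
    if (n == 1) {
        printf("base case: factorial(1) = 1\n");
        return 1;
    }
    long result = n * factorial(n - 1);
    printf("pop:  factorial(%d) returns %ld\n", n, result);  // frame popped (backtracking)
    return result;
}

int main(void) {
    factorial(5);   // pushes factorial(5) ... factorial(1), then pops in reverse order
    return 0;
}

The output shows the pushes in the order 5, 4, 3, 2, 1 and the returns in the reverse order 2, 3, 4, 5, which is exactly the LIFO behaviour and backtracking described in the following steps.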
Workflow of Recursion
3. Reaching the Base Case and Backtracking Process
✓ Recursion continues until it reaches the base case, the simplest condition that can be resolved directly without further function calls.
✓ Once the base case is found, the function begins returning values to its previous callers.
✓ This process is called backtracking: results are returned sequentially from the bottom to the top of the stack.
✓ Example: factorial(1) returns 1, then this result is used by factorial(2) to compute 2 × 1 = 2, and so on until factorial(5) obtains its final result.
Workflow of Recursion
4. Combining Results to Obtain the Final Solution
✓ Each recursive call that finishes execution returns a value to its caller.
✓ These returned values are combined according to the operations defined within the function.
✓ Ultimately, the value returned by the first (root) call becomes the final answer to the given problem.
✓ Example (Factorial): 1! = 1, 2! = 2, 3! = 6, 4! = 24, 5! = 120; each value is obtained by multiplying n by the factorial of the previous number.
Advantages of Recursion
✓ Simplicity and Elegance: Recursion often produces cleaner and more understandable solutions.
✓ Code Reduction: In many cases, recursive functions allow problems to be solved with less code compared to using iteration.
✓ Solution for Complex Problems: Recursion is highly useful for problems that can be divided into smaller subproblems.
Disadvantages of Recursion
✓ Memory Usage: Each additional function call is stored in the call stack, which can lead to excessive memory usage. In cases of deep recursion, this may cause a stack overflow.
✓ Performance Efficiency: Some recursive algorithms (such as the Fibonacci example above) may be inefficient for large inputs, as they repeatedly compute the same values. This can be optimized using memoization or iteration techniques.
✓ More Difficult Debugging: Debugging recursive code is often more complex than debugging iterative code.
Recursion Optimization
Recursion simplifies code but can be inefficient because repeated calculations increase execution time. Memoization is an optimization technique for recursion that stores previously computed results in a cache, so repeated calls do not recompute them; they simply retrieve the stored results.
How Memoization Works:
✓ The function is called with a specific argument.
✓ If the result already exists in the cache, it is returned immediately.
✓ If not, the function computes the result, stores it in the cache, and then returns the value.
✓ Subsequent calls with the same argument simply fetch the value from the cache, avoiding redundant computation.
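A minimal memoized Fibonacci sketch in C, using a plain array as the cache (the array size and the use of 0 as the "not yet computed" marker are illustrative assumptions):

#include <stdio.h>

#define MAX_N 64
static long cache[MAX_N];   // global array is zero-initialized; 0 means "not cached"

long fib_memo(int n) {
    if (n <= 1) return n;                 // base cases F(0) = 0, F(1) = 1
    if (cache[n] != 0) return cache[n];   // already computed: fetch from the cache
    cache[n] = fib_memo(n - 1) + fib_memo(n - 2);   // compute once and store
    return cache[n];
}

int main(void) {
    printf("F(40) = %ld\n", fib_memo(40));   // each F(k) is computed only once: O(n) calls
    return 0;
}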
Recursion Optimization
Characteristics of Memoization:
✓ Recursion-Based: Typically used in recursive functions that perform repeated calculations on the same arguments.
✓ Uses a Cache (temporary storage): The cache can be an array, hash table, or other data structure.
✓ Reduces Redundancy: Avoids unnecessary repeated computations.
✓ Improves Efficiency: Time complexity can decrease significantly, for example from O(2^n) to O(n) in the Fibonacci problem.
Recursion & Iteration
✓ Simple vs. Complex: Recursion is well suited to complex, hierarchical problems such as tree traversal or divide-and-conquer algorithms, and recursive code tends to be more concise and easier to read. Iteration is better for simple problems: it uses direct loops, making it more efficient and easier to control.
✓ Memory Usage: Recursion requires space in the call stack for each function call, risking a stack overflow if the recursion depth is too large. Iteration is more memory-efficient since it does not accumulate function calls, making it safe for large loops.
✓ Time Efficiency: Recursion carries function-call overhead, which generally makes it slower. Iteration runs directly within loops, making it faster.
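For comparison, an iterative factorial sketch: a plain loop replaces the chain of recursive calls, so stack usage stays constant (the function name is illustrative):

#include <stdio.h>

long factorial_iter(int n) {
    long result = 1;
    for (int i = 2; i <= n; i++) {   // plain loop, no recursive calls, constant stack usage
        result *= i;
    }
    return result;
}

int main(void) {
    printf("5! = %ld\n", factorial_iter(5));   // prints 5! = 120
    return 0;
}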
Department of Informatics
Faculty of Industrial Engineering
Universitas Pembangunan Nasional “Veteran” Yogyakarta
Andi Nurkholis, S.Kom., M.Kom.
November 3, 2025
Thank You
