Self Studies

Algorithms Test 1

Weekly Quiz Competition
  • Question 1
    2 / -0.33

When using Dijkstra’s algorithm to find the shortest path in a graph, which of the following statements is not true?

    Solution

    Concept:

Dijkstra’s algorithm finds single-source shortest paths, i.e. the shortest path from one source vertex to every other vertex. All edge weights must be non-negative; vertices unreachable from the source simply keep distance ∞.

    Explanation:

V | a  | b  | c  | d  | e   | f   | g
a | 0a | 3a | 5a | 6a | Inf | Inf | Inf
b | 0a | 3a | 5a | 5b | Inf | Inf | Inf
c | 0a | 3a | 5a | 5b | 11c | 8c  | 12c
d | 0a | 3a | 5a | 5b | 11c | 8c  | 12c
f | 0a | 3a | 5a | 5b | 11c | 8c  | 9f

(Each entry is "distance + predecessor"; V is the vertex finalized at that step.)

Reading predecessors from the last row (g = 9f, f = 8c, c = 5a), the shortest distance from a to g is 9, along the path a → c → f → g.
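The table-filling procedure corresponds to the standard Dijkstra loop: repeatedly pick the closest unvisited vertex and relax its outgoing edges. A minimal C sketch follows; the 4-vertex graph used in the usage note is a hypothetical example, since the question's edge list is not given (g[i][j] < 0 means "no edge"):

```c
#include <limits.h>

#define N 4   /* vertex count for the hypothetical example graph */

/* Dijkstra over an adjacency matrix: pick the closest unvisited
   vertex, mark it done, then relax its outgoing edges. */
void dijkstra(int g[N][N], int src, int dist[N]) {
    int done[N] = {0};
    for (int i = 0; i < N; i++) dist[i] = INT_MAX;
    dist[src] = 0;
    for (int iter = 0; iter < N; iter++) {
        int u = -1;
        for (int v = 0; v < N; v++)
            if (!done[v] && (u == -1 || dist[v] < dist[u]))
                u = v;
        if (dist[u] == INT_MAX) break;   /* remaining vertices unreachable */
        done[u] = 1;
        for (int v = 0; v < N; v++)      /* relax edges out of u */
            if (g[u][v] >= 0 && dist[u] + g[u][v] < dist[v])
                dist[v] = dist[u] + g[u][v];
    }
}
```

For example, with undirected edges 0–1 (weight 1), 1–2 (weight 2), 0–2 (weight 4) and 2–3 (weight 1), the distances from vertex 0 come out as 0, 1, 3, 4.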

  • Question 2
    2 / -0.33

    What is recurrence for worst case of QuickSort and what is the time complexity in Worst case?

    Solution
    • The worst case of QuickSort occurs when the picked pivot is always the smallest or largest remaining element (e.g. a corner element of an already sorted array). In the worst case, QuickSort recursively calls one subproblem of size 0 and another of size (n − 1).
    • So the recurrence is T(n) = T(n − 1) + T(0) + O(n).
    • This can be rewritten as T(n) = T(n − 1) + O(n), which solves to O(n²).

    void exchange(int *a, int *b)
    {
        int temp;
        temp = *a;
        *a = *b;
        *b = temp;
    }

    int partition(int arr[], int si, int ei)
    {
        int x = arr[ei];   /* last element as pivot */
        int i = (si - 1);
        int j;

        for (j = si; j <= ei - 1; j++)
        {
            if (arr[j] <= x)
            {
                i++;
                exchange(&arr[i], &arr[j]);
            }
        }
        exchange(&arr[i + 1], &arr[ei]);
        return (i + 1);
    }

    /* Implementation of Quick Sort
       arr[] --> Array to be sorted
       si    --> Starting index
       ei    --> Ending index */
    void quickSort(int arr[], int si, int ei)
    {
        int pi; /* Partitioning index */
        if (si < ei)
        {
            pi = partition(arr, si, ei);
            quickSort(arr, si, pi - 1);
            quickSort(arr, pi + 1, ei);
        }
    }

  • Question 3
    2 / -0.33

Which of the following sorting algorithms in its typical implementation gives the best performance when applied to an array which is sorted or almost sorted (at most one or two elements are misplaced)?

    Solution

    Insertion sort takes linear time when input array is sorted or almost sorted (maximum 1 or 2 elements are misplaced). All other sorting algorithms mentioned above will take more than linear time in their typical implementation.
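A minimal C sketch of insertion sort shows why it is linear on almost-sorted input: the inner while loop exits immediately whenever a[i] is already in place, so a sorted array costs only one comparison per element.

```c
/* Insertion sort: on an almost-sorted array the inner loop does
   almost no shifting, so the total work is close to n steps. */
void insertion_sort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int key = a[i];
        int j = i - 1;
        while (j >= 0 && a[j] > key) {   /* shift larger elements right */
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;                  /* drop key into the gap */
    }
}
```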

  • Question 4
    2 / -0.33
    What is the worst-case and average-case time complexity of the Binary search?
    Solution

    Binary search algorithm:

    Binary search algorithm is used to find an element in an already sorted array.

    Step 1:

    Find the middle element of the array and compare it with the element being searched for; if it matches, return true.

    Step 2:

    If not, halve the range: if the element being searched for is less than the middle element, search the left half, otherwise search the right half.

    Step 3:

    Repeat this process until the element is found or the range is empty.

    Explanation:

    Worst case: search for 50 in the sorted array below (indices 0 to 7):

    11  12  15  24  35  50  51  63

    mid = (0 + 7) / 2 = 3, a[3] = 24; since 50 > 24, search the right half

    mid = (4 + 7) / 2 = 5, a[5] = 50 → matched

    In the worst case the range keeps halving until a single element remains, which takes ⌈log₂ n⌉ comparisons, so

    T(n) = O(log n)

    Also, for average case:

    T(n) = O(log n)
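The trace above corresponds to the standard iterative binary search; a minimal C sketch:

```c
/* Iterative binary search over a sorted array.
   Returns the index of key, or -1 if it is absent. */
int binary_search(const int a[], int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* avoids overflow of (lo + hi) */
        if (a[mid] == key) return mid;
        if (a[mid] < key) lo = mid + 1; /* discard the left half */
        else              hi = mid - 1; /* discard the right half */
    }
    return -1;
}
```

On the array from the question, searching for 50 returns index 5 after two probes (mid = 3, then mid = 5).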
  • Question 5
    2 / -0.33
    Consider the below-given statement written in C language:

int A[] = {25, 9, 15, 30, 11, 17, 22};

    Bubble sort is applied on array A. Which element is present at index 2 after the second iteration?

    Solution

    Concept:

    Bubble sort, sometimes referred to as sinking sort, is a simple sorting algorithm that repeatedly steps through the list, compares adjacent pairs, and swaps them if they are in the wrong order. The pass through the list is repeated until the list is sorted.

    Explanation:

    In 1st iteration: 25, 9, 15, 30, 11, 17, 22

    Output: 9, 15, 25, 11, 17, 22, 30

    In 2nd iteration: 9, 15, 25, 11, 17, 22, 30

    Output: 9, 15, 11, 17, 22, 25, 30

    11 is at index 2.

    Important Point:

int A[] = {25, 9, 15, 30, 11, 17, 22}; // indexing starts at 0
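A single bubble-sort pass can be sketched in C; running it twice on the question's array reproduces the state after the second iteration, with 11 at index 2.

```c
/* One bubble-sort pass: compare adjacent pairs and swap when out of
   order. After pass p, the p largest elements sit at the end. */
void bubble_pass(int a[], int n) {
    for (int j = 0; j + 1 < n; j++) {
        if (a[j] > a[j + 1]) {
            int t = a[j];
            a[j] = a[j + 1];
            a[j + 1] = t;
        }
    }
}
```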

  • Question 6
    2 / -0.33

    What is the time complexity of the block of code below, where i, j, n, and m are integers?

        for (i = 0; i < n; i++)
            for (j = m; j > 0; j = j / 2)
                printf("Hello world");
    Solution

    The outer loop runs n times:

    for (i = 0; i < n; i++) → complexity O(n).

    The inner loop halves j each pass, so it runs ⌊log2 m⌋ + 1 times:

    for (j = m; j > 0; j = j / 2) → complexity O(log2 m).

    Since the outer loop is O(n) and the inner loop is O(log2 m),

    Time complexity of the code block = O(n × log2 m)
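The O(n log m) bound can be checked empirically by replacing the printf with a counter; a minimal sketch:

```c
/* Count how many times the innermost statement of the question's
   nested loops executes for given n and m. */
long count_iters(int n, int m) {
    long c = 0;
    for (int i = 0; i < n; i++)
        for (int j = m; j > 0; j = j / 2)
            c++;                 /* inner loop: floor(log2 m) + 1 passes */
    return c;
}
```

For n = 3, m = 8 the inner loop visits j = 8, 4, 2, 1, so the count is 3 × 4 = 12.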
  • Question 7
    2 / -0.33

    Consider the following table:

    Algorithms                  Design Paradigms
    (A) Huffman Coding          (I) Divide and Conquer
    (B) Merge sort              (II) Greedy
    (C) Coin change problem     (III) Dynamic Programming

    Match the algorithms to the design paradigms they are based on.
    Solution
    • Huffman Coding uses the Greedy approach — (A)-(II)
    • Merge sort is an efficient sorting algorithm that uses a divide-and-conquer approach to order elements in an array — (B)-(I)
    • The Coin change problem is solved with Dynamic Programming — (C)-(III)
  • Question 8
    2 / -0.33

Suppose we are sorting an array of eight integers using quicksort, and we have just finished the first partitioning, with the array looking like this: 2 5 1 7 9 12 11 10. Which statement is correct?

    Solution

    7 and 9 both are at their correct positions (as in a sorted array). Also, all elements on left of 7 and 9 are smaller than 7 and 9 respectively and on right are greater than 7 and 9 respectively.

  • Question 9
    2 / -0.33

    Consider a situation where swap operation is very costly. Which of the following sorting algorithms should be preferred so that the number of swap operations are minimized in general?

    Solution

    Selection sort makes O(n) swaps which is minimum among all sorting algorithms mentioned above.

  • Question 10
    2 / -0.33
If T(n) = 2T(n/2) + n log n, then T(n) = ?
    Solution

    Master Theorem: T(n) = aT(n/b) + f(n); a ≥ 1, b > 1.

    Here a = 2, b = 2, f(n) = n log n

    \(n^{log_ba} = n^{log_22}=n\)

    f(n) = Θ(n^{log_b a} · log n), so the extended case 2 of the Master Theorem applies (f(n) = Θ(n^{log_b a} log^k n) with k = 1). Note that case 3 does not apply: n log n is larger than n, but not polynomially larger.

    ∴ T(n) = O(n (log n)²)
  • Question 11
    2 / -0.33
    If G = (V,E) is a directed graph, then length of adjacency list is ______ and it requires ______ amount of memory.
    Solution
    If G is a directed graph, the sum of the lengths of all the adjacency lists is |E|. If G is an undirected graph, the sum of the lengths of all the adjacency lists is 2|E|. For both directed and undirected graphs, the adjacency-list representation has the desirable property that the amount of memory it requires is θ(V+E).
  • Question 12
    2 / -0.33

    Suppose we are sorting an array of eight integers using heapsort, and we have just finished some heapify (either maxheapify or minheapify) operations. The array now looks like this: 16 14 15 10 12 27 28. How many heapify operations have been performed on root of heap?

    Solution

In Heapsort, we first build a heap, then repeat the following until the heap size becomes 1: (a) swap the root with the last element, (b) call heapify on the root, (c) reduce the heap size by 1. In this question, heapify has been called a few times, and the last two elements of the given array are the two maximum elements of the array. So the situation is clear: it is max-heapify, which has been called 2 times.

  • Question 13
    2 / -0.33

Consider 5 sorted arrays of sizes 25, 31, 37, 49 and 57. All the arrays are merged into a single sorted array using the optimal algorithm. What is the maximum number of comparisons that will be needed in the worst case?

    Solution

    Using the optimal merge pattern (always merge the two smallest lists; merging lists of sizes p and q needs at most p + q − 1 comparisons in the worst case):

    Sizes in ascending order: 25, 31, 37, 49, 57

    Merge(25, 31) → new size (56)

    Number of comparisons = 25 + 31 – 1 = 55 

    Sequence: 37, 49, 56, 57

    Merge(37, 49) → new size (86)

    Number of comparisons = 37 + 49 – 1 = 85  

    Sequence:  56, 57, 86

    Merge(56, 57) → new size (113)

    Number of comparisons = 56 + 57 – 1 = 112

    Sequence: 86, 113

    Merge(86, 113) → new size (199)

    Number of comparisons = 86 + 113 – 1 = 198  

    Therefore, total number of comparisons, 55+ 85 + 112 + 198 = 450
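The repeated merge-the-two-smallest computation above can be sketched in C. Note the function consumes its sizes array in place; the O(n²) selection of the two smallest lists is an assumption of the sketch (a priority queue would be used at scale):

```c
/* Optimal merge pattern: repeatedly merge the two smallest lists.
   Merging lists of sizes p and q costs at most p + q - 1 comparisons.
   Mutates sizes[]; returns the total worst-case comparison count. */
long optimal_merge_cost(int sizes[], int n) {
    long total = 0;
    while (n > 1) {
        /* find indices of the two smallest remaining lists */
        int i0 = 0, i1 = 1;
        if (sizes[i1] < sizes[i0]) { i0 = 1; i1 = 0; }
        for (int i = 2; i < n; i++) {
            if (sizes[i] < sizes[i0])      { i1 = i0; i0 = i; }
            else if (sizes[i] < sizes[i1]) { i1 = i; }
        }
        total += sizes[i0] + sizes[i1] - 1;  /* worst-case merge cost */
        sizes[i0] += sizes[i1];              /* merged list replaces i0 */
        sizes[i1] = sizes[n - 1];            /* drop the other list */
        n--;
    }
    return total;
}
```

On the sizes 25, 31, 37, 49, 57 it reproduces the total of 450 comparisons.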

  • Question 14
    2 / -0.33

    What is the best time complexity of bubble sort?

    Solution

Bubble sort's one significant advantage over most other algorithms (even quicksort, though not insertion sort) is that detecting an already-sorted list is built into the algorithm: if a pass performs no swaps, the list is sorted. When the list is already sorted (best case), the complexity of bubble sort is only O(n).

  • Question 15
    2 / -0.33

    You have to sort 1 GB of data with only 100 MB of available main memory. Which sorting technique will be most appropriate?

    Solution

The data can be sorted using external sorting, which uses a merging technique:
1. Divide the data into 10 chunks of 100 MB each.
2. Sort each chunk in main memory and write it to disk.
3. Load the first items of each sorted chunk into main memory.
4. Output the smallest item in main memory to disk, then load the next item from the chunk whose item was chosen.
5. Repeat step 4 until all items have been output.
Steps 3 to 5 are the merging phase.

  • Question 16
    2 / -0.33

    Given an unsorted array. The array has this property that every element in array is at most k distance from its position in sorted array where k is a positive integer smaller than size of array. Which sorting algorithm can be easily modified for sorting this array and what is the obtainable time complexity?

    Solution
    1. Create a min-heap with the first k+1 elements and a separate result array.
    2. Because every element is at most k positions away from its sorted position, the smallest element is guaranteed to be among these k+1 elements.
    3. Remove the smallest element from the min-heap (extract-min) and append it to the result array.
    4. Insert the next element from the unsorted array into the min-heap; the next smallest element is now guaranteed to be in the heap. Repeat extract-min and insert until the unsorted array is exhausted, then extract the remaining heap elements one by one.

    Time Complexity

    1. O(k) to build the initial min-heap.
    2. O(log k) per extract-min / insert.
    3. O((n − k) log k) for the remaining elements.

    So overall: O(k) + O((n − k) log k) = O(n log k)
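The steps above can be sketched in C with a sliding min-heap. The fixed heap capacity of 64 (i.e. k + 1 ≤ 64) is an assumption of the sketch:

```c
/* Restore the min-heap property downward from index i. */
void sift_down(int h[], int size, int i) {
    for (;;) {
        int s = i, l = 2 * i + 1, r = 2 * i + 2;
        if (l < size && h[l] < h[s]) s = l;
        if (r < size && h[r] < h[s]) s = r;
        if (s == i) break;
        int t = h[i]; h[i] = h[s]; h[s] = t;
        i = s;
    }
}

/* Sort a k-sorted array (every element at most k positions from its
   sorted position) using a min-heap of k+1 elements. */
void sort_k_sorted(int a[], int n, int k) {
    int heap[64];                         /* assumes k + 1 <= 64 */
    int hs = (k + 1 < n) ? k + 1 : n;
    for (int i = 0; i < hs; i++) heap[i] = a[i];
    for (int i = hs / 2 - 1; i >= 0; i--) /* build heap: O(k) */
        sift_down(heap, hs, i);

    int out = 0, next = hs;
    while (hs > 0) {
        a[out++] = heap[0];               /* extract-min: O(log k) */
        if (next < n) heap[0] = a[next++];/* refill from the input */
        else          heap[0] = heap[--hs];
        sift_down(heap, hs, 0);
    }
}
```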

  • Question 17
    2 / -0.33

Which of the following is not a stable sorting algorithm in its typical implementation?

    Solution

    Out of the given options quick sort is the only algorithm which is not stable. Merge sort is a stable sorting algorithm.

  • Question 18
    2 / -0.33

    Which of the following is not true about comparison based sorting algorithms?

    Solution

Any comparison-based sorting algorithm can be made stable by using position as a tie-breaker when two elements compare equal. Counting Sort is not a comparison-based sorting algorithm. Heap Sort, however, is comparison-based, so the statement "Heap Sort is not a comparison-based sorting algorithm" is the one that is not true.

  • Question 19
    2 / -0.33

    What is the worst case time complexity of insertion sort where position of the data to be inserted is calculated using binary search?

    Solution

Applying binary search to find the position of the data to be inserted doesn't reduce the time complexity of insertion sort, because inserting at that position involves two steps: 1. Find the position. 2. Shift the data from that position one step right to create a gap for the new element. Binary search reduces step 1 from O(n) to O(log n), but step 2 still takes O(n). So the overall complexity remains O(n²).

  • Question 20
    2 / -0.33

    Consider a hash table with 9 slots. The hash function is ℎ(k) = k mod 9. The collisions are resolved by chaining. The following 9 keys are inserted in the order: 5, 28, 19, 15, 20, 33, 12, 17, 10. The maximum, minimum, and average chain lengths in the hash table, respectively, are

    Solution

    Concept:

    Keys: 5, 28, 19, 15, 20, 33, 12, 17, 10

    h(k) = k mod 9, collisions resolved by chaining:

    5 → 5, 28 → 1, 19 → 1, 15 → 6, 20 → 2, 33 → 6, 12 → 3, 17 → 8, 10 → 1

    Maximum chain length = 3 (slot 1: 28 → 19 → 10)

    Minimum chain length = 0 (slots 0, 4 and 7 have no elements)

    \({\rm{Average\;chain\;length\;}} = \frac{{0 + 3 + 1 + 1 + 0 + 1 + 2 + 0 + 1}}{9} = \frac{9}{9} = 1\)

    Hence option 1 is the correct answer.
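Since only the chain lengths matter here, a C sketch can count bucket sizes directly instead of building the linked lists:

```c
/* Chain lengths for h(k) = k mod 9 with chaining: each key lands in
   bucket k % 9, so the chain length is just the bucket count. */
void chain_lengths(const int keys[], int n, int len[9]) {
    for (int s = 0; s < 9; s++) len[s] = 0;
    for (int i = 0; i < n; i++)
        len[keys[i] % 9]++;
}
```

Running it on the question's keys gives max = 3 (slot 1), min = 0 (slots 0, 4, 7), average = 9/9 = 1.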

  • Question 21
    2 / -0.33

    Which sorting algorithm will take least time when all elements of input array are identical? Consider typical implementations of sorting algorithms.

    Solution

An array of identical elements is already sorted, and insertion sort takes O(n) time when the input array is already sorted. So insertion sort will take the least time.

  • Question 22
    2 / -0.33

Consider a hypothetical system in which merge sort is run on an array whose elements are 2 bytes each. The size of the array is 2 MB. The maximum time taken by the merge sort to process the array is 256 seconds. What is the size of the input in MB of a problem that can be solved in 81.92 minutes?

    Solution

    Data:

    T1 = 256 s = 2^8 s

    number of elements n1 = (2 MB ÷ 2 B) = 2^20

    T2 = 81.92 min = 81.92 × 60 s

    Formula:

    The time complexity of merge sort is O(n log2 n), so

    T = c × n × log2 n, where c is a constant

    Calculation:

    T1 = c × n1 × log2 n1

    2^8 = c × 2^20 × log2 2^20 = c × 2^20 × 20

    1 = c × 2^12 × 20

    \(\therefore c = \frac{1}{{{5×2^{14}}}}\)

    T2 = c × n2 × log2 n2

    \(81.92 × 60 = \frac{1}{{{5×2^{14}}}}\;{n_2}{\log _2}{n_2}\)

    n2 log2 n2 = 24576 × 2^14 = 24 × 2^24

    ∴ n2 = 2^24

    Size = 2 bytes × 2^24 = 32 MB

  • Question 23
    2 / -0.33

Consider two strings M = "abababbaa" and N = "babaa". The length of the longest common subsequence of the two strings is r, and the number of times that length appears in the dynamic-programming matrix is n. What is the value of r + r×n + n² ?

    NOTE: Common subsequence may not necessarily be contiguous

    Solution

    Concepts:

    Initialize the first row and first column with 0.

    If the characters match: A[i, j] = A[i − 1, j − 1] + 1

    If they don't match: A[i, j] = max(A[i − 1, j], A[i, j − 1])

    Table of longest common subsequence:

     

    N ↓ \ M →      a  b  a  b  a  b  b  a  a
               0   0  0  0  0  0  0  0  0  0
    b          0   0  1  1  1  1  1  1  1  1
    a          0   1  1  2  2  2  2  2  2  2
    b          0   1  2  2  3  3  3  3  3  3
    a          0   1  2  3  3  4  4  4  4  4
    a          0   1  2  3  3  4  4  4  5  5

    Length of the longest common subsequence: r = 5

    The value 5 appears n = 2 times in the matrix.

    r + r×n + n² = 5 + 5×2 + 2² = 19

    The value is 19.
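The table above is exactly what a bottom-up LCS implementation computes; a minimal C sketch (the fixed 16×16 table is an assumption of the sketch, sized for these short strings):

```c
#include <string.h>

/* Bottom-up LCS length: dp[i][j] = LCS of m[0..i) and n[0..j). */
int lcs_length(const char *m, const char *n) {
    int lm = (int)strlen(m), ln = (int)strlen(n);
    int dp[16][16] = {{0}};   /* assumes both strings shorter than 16 */
    for (int i = 1; i <= lm; i++)
        for (int j = 1; j <= ln; j++)
            dp[i][j] = (m[i - 1] == n[j - 1])
                ? dp[i - 1][j - 1] + 1                      /* match */
                : (dp[i - 1][j] > dp[i][j - 1]              /* no match */
                       ? dp[i - 1][j] : dp[i][j - 1]);
    return dp[lm][ln];
}
```

For M = "abababbaa" and N = "babaa" it returns 5 ("babaa" itself is a subsequence of M).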

  • Question 24
    2 / -0.33

    In quick sort, for sorting n elements, the (n/4)th smallest element is selected as pivot using an O(n) time algorithm. What is the worst case time complexity of the quick sort?

    Solution

The recurrence becomes T(n) = T(n/4) + T(3n/4) + cn. Solving this recurrence gives θ(n log n).

  • Question 25
    2 / -0.33

Consider 10 jobs, A to J. Each job takes 1 unit of time to execute, and only one job can be executed at a time. Each job has a profit and a deadline; the profit is earned only if the job is completed before its deadline. The maximum profit earned is _____.

    Job       A   B   C   D   E   F   G   H   I   J
    Profit    25  35  40  30  18  25  33  43  19  25
    Deadline  4   7   4   7   1   4   6   5   3   6

    Solution

    1. Array size = min(max(deadline), number of jobs) = min(7, 10) = 7

    2. Sort the jobs in descending order of profit.

    3. Fill the array, placing each job in the latest free slot at or before its deadline.

    Array:

    Slot      1   2   3   4   5   6   7
    Job       A   J   D   C   H   G   B
    Profit    25  25  30  40  43  33  35

    Maximum profit = 25 + 25 + 30 + 40 + 43 + 33 + 35 = 231
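The greedy placement can be sketched in C. The fixed array bounds (16) are assumptions of the sketch, sufficient for this question's 10 jobs and 7 slots:

```c
/* Greedy job sequencing: take jobs in decreasing order of profit and
   put each one into the latest free slot at or before its deadline. */
int max_profit(const int profit[], const int deadline[], int n, int maxslots) {
    int used[16]   = {0};   /* used[s]: is time slot s occupied? (maxslots < 16) */
    int placed[16] = {0};   /* placed[i]: has job i been considered? (n <= 16) */
    int total = 0;
    for (int k = 0; k < n; k++) {
        int best = -1;      /* unconsidered job with the highest profit */
        for (int i = 0; i < n; i++)
            if (!placed[i] && (best < 0 || profit[i] > profit[best]))
                best = i;
        placed[best] = 1;
        int d = deadline[best] < maxslots ? deadline[best] : maxslots;
        for (int s = d; s >= 1; s--)        /* latest free slot <= deadline */
            if (!used[s]) {
                used[s] = 1;
                total += profit[best];
                break;
            }
    }
    return total;
}
```

On the question's profits and deadlines it reproduces the maximum profit of 231.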
