403. Frog Jump


A frog is crossing a river. The river is divided into x units and at each unit there may or may not exist a stone. The frog can jump on a stone, but it must not jump into the water.

Given a list of stones' positions (in units) in sorted ascending order, determine if the frog is able to cross the river by landing on the last stone. Initially, the frog is on the first stone and assume the first jump must be 1 unit.

If the frog's last jump was k units, then its next jump must be either k - 1, k, or k + 1 units. Note that the frog can only jump in the forward direction.

Note:

  • The number of stones is ≥ 2 and is < 1,100.
  • Each stone's position will be a non-negative integer < 2³¹.
  • The first stone's position is always 0.

Example 1:

[0,1,3,5,6,8,12,17]

There are a total of 8 stones.
The first stone at the 0th unit, second stone at the 1st unit,
third stone at the 3rd unit, and so on...
The last stone at the 17th unit.

Return true. The frog can jump to the last stone by jumping 
1 unit to the 2nd stone, then 2 units to the 3rd stone, then 
2 units to the 4th stone, then 3 units to the 6th stone, 
4 units to the 7th stone, and 5 units to the 8th stone.

Example 2:

[0,1,2,3,4,8,9,11]

Return false. There is no way to jump to the last stone as 
the gap between the 5th and 6th stone is too large.


Summary

Given a sorted array containing the positions at which stones are present in a river, we need to determine whether it is possible for a frog to cross the river by stepping on these stones, provided that the frog starts at position 0 and, if its previous jump was of size k, its next jump can be of size k - 1, k or k + 1.

Solution


Approach #1 Brute Force [Time Limit Exceeded]

In the brute force approach, we make use of a recursive function which takes the given stone array, the current position and the current jump size as input arguments. We start with currentPosition = 0 and jumpsize = 0. Then, for every function call, we start from the currentPosition and check whether a stone exists at currentPosition + newjumpsize, where newjumpsize can be jumpsize - 1, jumpsize or jumpsize + 1. In order to check whether a stone exists at the specified positions, we scan the elements of the array linearly. If a stone exists at any of these positions, we call the recursive function again with the same stone array, the new position and the new jump size as the parameters. If we are able to reach the end of the stone array through any of these calls, we return true to indicate the possibility of reaching the end.
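
A minimal Java sketch of this recursion (the method name canCross and the exact parameter layout are illustrative assumptions, not code from the original analysis):

public class Solution {
    public boolean canCross(int[] stones) {
        return canCross(stones, 0, 0);
    }

    private boolean canCross(int[] stones, int ind, int jumpsize) {
        // Reaching the last stone means the frog can cross.
        if (ind == stones.length - 1) {
            return true;
        }
        // Try jumps of size jumpsize - 1, jumpsize and jumpsize + 1.
        for (int newJump = jumpsize - 1; newJump <= jumpsize + 1; newJump++) {
            if (newJump <= 0) {
                continue;   // the frog can only jump forward
            }
            // Linear scan for a stone at position stones[ind] + newJump.
            for (int i = ind + 1; i < stones.length; i++) {
                if (stones[i] == stones[ind] + newJump
                        && canCross(stones, i, newJump)) {
                    return true;
                }
            }
        }
        return false;
    }
}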

Complexity Analysis

  • Time complexity : O(3^n). The recursion tree can grow up to 3^n nodes, since up to three jump sizes are tried from every stone.
  • Space complexity : O(n). Recursion of depth up to n is used.

Approach #2 Better Brute Force [Time Limit Exceeded]

Algorithm

In the previous brute force approach, we need to find whether a stone exists at currentPosition + newjumpsize, where newjumpsize can be either jumpsize - 1, jumpsize or jumpsize + 1. But in order to check if a stone exists at the specified location, we searched the given array linearly. To optimize this, we can use binary search to look for the element in the given array, since it is sorted. The rest of the method remains the same.
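
A sketch of the same recursion with the linear scan replaced by java.util.Arrays.binarySearch (again, the helper names are illustrative assumptions):

import java.util.Arrays;

public class Solution {
    public boolean canCross(int[] stones) {
        return canCross(stones, 0, 0);
    }

    private boolean canCross(int[] stones, int ind, int jumpsize) {
        if (ind == stones.length - 1) {
            return true;
        }
        for (int newJump = jumpsize - 1; newJump <= jumpsize + 1; newJump++) {
            if (newJump <= 0) {
                continue;
            }
            // Binary search for a stone at stones[ind] + newJump,
            // restricted to indices after the current one.
            int nextInd = Arrays.binarySearch(stones, ind + 1, stones.length,
                                              stones[ind] + newJump);
            if (nextInd >= 0 && canCross(stones, nextInd, newJump)) {
                return true;
            }
        }
        return false;
    }
}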

Complexity Analysis

  • Time complexity : O(3^n). The recursion tree can still grow up to 3^n nodes; only the work done per node is reduced to O(log n).
  • Space complexity : O(n). Recursion of depth up to n is used.

Approach #3 Using Memoization [Accepted]

Algorithm

Another problem with the above approaches is that the same function call can be reached through different paths. For example, for a given currentIndex, the recursive function can be called with the same jumpsize, say j; this could result from a previous jump of size j - 1, j or j + 1. Thus, many redundant function calls could be made, prolonging the running time. This redundancy can be removed by making use of memoization. We make use of a 2-D memo array, initialized with -1s, to store the result returned from a function call for a particular currentIndex and jumpsize. If the same currentIndex and jumpsize is encountered again, we can return the result directly from the memo array. This helps to prune the search tree to a great extent.
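
A memoized sketch of the recursion from Approach #1; the encoding of the memo entries (-1 = unknown, 0 = false, 1 = true) is an illustrative choice:

import java.util.Arrays;

public class Solution {
    public boolean canCross(int[] stones) {
        int n = stones.length;
        int[][] memo = new int[n][n];
        for (int[] row : memo) {
            Arrays.fill(row, -1);          // -1 marks an uncomputed state
        }
        return canCross(stones, 0, 0, memo) == 1;
    }

    private int canCross(int[] stones, int ind, int jumpsize, int[][] memo) {
        if (memo[ind][jumpsize] >= 0) {
            return memo[ind][jumpsize];    // reuse a previously computed result
        }
        if (ind == stones.length - 1) {
            return memo[ind][jumpsize] = 1;
        }
        for (int newJump = jumpsize - 1; newJump <= jumpsize + 1; newJump++) {
            if (newJump <= 0) {
                continue;
            }
            for (int i = ind + 1; i < stones.length; i++) {
                if (stones[i] == stones[ind] + newJump
                        && canCross(stones, i, newJump, memo) == 1) {
                    return memo[ind][jumpsize] = 1;
                }
            }
        }
        return memo[ind][jumpsize] = 0;
    }
}

Since the jump size after landing on the stone at index i can never exceed i, a memo matrix of size n x n is sufficient.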

Complexity Analysis

  • Time complexity : O(n^3). Memoization reduces the search to at most n^2 distinct (currentIndex, jumpsize) states, and each state performs a linear scan of the array.

  • Space complexity : O(n^2). A memo matrix of size n x n is used.


Approach #4 Using Memoization with Binary Search [Accepted]

Algorithm

We can optimize the above memoization approach if we make use of binary search to find whether a stone exists at currentPosition + newjumpsize, instead of searching linearly.
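
A sketch combining the memoization of Approach #3 with the binary search of Approach #2; only the existence check changes:

import java.util.Arrays;

public class Solution {
    public boolean canCross(int[] stones) {
        int n = stones.length;
        int[][] memo = new int[n][n];
        for (int[] row : memo) {
            Arrays.fill(row, -1);
        }
        return canCross(stones, 0, 0, memo) == 1;
    }

    private int canCross(int[] stones, int ind, int jumpsize, int[][] memo) {
        if (memo[ind][jumpsize] >= 0) {
            return memo[ind][jumpsize];
        }
        if (ind == stones.length - 1) {
            return memo[ind][jumpsize] = 1;
        }
        for (int newJump = jumpsize - 1; newJump <= jumpsize + 1; newJump++) {
            if (newJump <= 0) {
                continue;
            }
            // Binary search replaces the linear scan of Approach #3.
            int nextInd = Arrays.binarySearch(stones, ind + 1, stones.length,
                                              stones[ind] + newJump);
            if (nextInd >= 0 && canCross(stones, nextInd, newJump, memo) == 1) {
                return memo[ind][jumpsize] = 1;
            }
        }
        return memo[ind][jumpsize] = 0;
    }
}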

Complexity Analysis

  • Time complexity : O(n^2 log n). At most n^2 (currentIndex, jumpsize) entries of the memo matrix are computed, and for every entry the binary search takes O(log n) time.

  • Space complexity : O(n^2). A memo matrix of size n x n is used.


Approach #5 Using Dynamic Programming [Accepted]

Algorithm

In the DP approach, we make use of a hashmap which contains (key, value) pairs such that key refers to a position at which a stone is present and value is the set of jump sizes that can lead to that position. We start by building a hashmap whose keys are all the positions at which a stone is present and whose values are all empty sets, except for position 0, whose set contains 0. Then, we traverse the elements (positions) of the given stone array in sequential order. For the current position, and for every jumpsize in its set, we check whether currentPosition + newjumpsize exists in the hashmap, where newjumpsize can be either jumpsize - 1, jumpsize or jumpsize + 1. If so, we add newjumpsize to the set stored for that position. We continue in the same manner. If, at the end, the set corresponding to the last position is non-empty, we conclude that reaching the end is possible; otherwise, it isn't.
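
A sketch of this hashmap-based DP; the variable names are illustrative:

import java.util.HashMap;
import java.util.HashSet;

public class Solution {
    public boolean canCross(int[] stones) {
        // map.get(position) = set of jump sizes with which the frog can land there.
        HashMap<Integer, HashSet<Integer>> map = new HashMap<>();
        for (int pos : stones) {
            map.put(pos, new HashSet<>());
        }
        map.get(0).add(0);                 // the frog starts at position 0
        for (int pos : stones) {
            for (int jumpsize : map.get(pos)) {
                for (int newJump = jumpsize - 1; newJump <= jumpsize + 1; newJump++) {
                    // A forward jump of newJump units is possible only if a
                    // stone exists at pos + newJump.
                    if (newJump > 0 && map.containsKey(pos + newJump)) {
                        map.get(pos + newJump).add(newJump);
                    }
                }
            }
        }
        // The last stone is reachable iff some jump size leads to it.
        return !map.get(stones[stones.length - 1]).isEmpty();
    }
}

Note that every update touches only a set at a position strictly greater than the one currently being traversed, so iterating the positions in ascending order is safe.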


Complexity Analysis

  • Time complexity : O(n^2). There are two nested loops: one over the stone positions and one over the jump sizes stored for each position.

  • Space complexity : O(n^2). The hashmap contains n keys, and the set stored for each key can grow up to size n.


Analysis written by: @vinod23