# Graph Search-Based Motion Planner with Motion Primitives
This is a programming exercise for the lectures **Introduction to Artificial Intelligence** (WS19) and **Gems of Informatics II** delivered at the Department of Informatics, TUM. Please clone this repository or download it using the button at the upper-right corner. The repository has the following folder structure:
```text
commonroad-search/
├── GSMP/
...
...
```

Navigate your terminal to the `commonroad-search/` folder and start Jupyter Notebook:

```
$ jupyter notebook
```
On the page that opens, navigate to `notebooks/tutorials/` and follow the tutorials `tutorial_commonroad-search.ipynb`, `cr_uninformed_search_tutorial.ipynb`, and `cr_informed_search_tutorial.ipynb`. Remember to refer to `pdfs/0_Guide_for_Exercise.pdf` for additional explanations. The executed Jupyter notebooks for the tutorials can also be found [here](https://commonroad.in.tum.de/tutorials/).
# Tutorial: Informed Search Algorithms and CommonRoad Search
This tutorial shows how we can use motion primitives in informed search algorithms to find a trajectory that connects an **initial state** and a **goal region**.
## How to use this tutorial
Before you start with this tutorial, make sure that
* you have read through the tutorial for [CommonRoad-io](https://commonroad.in.tum.de/static/docs/commonroad-io/index.html). Its tutorial can be found [here](https://gitlab.lrz.de/tum-cps/commonroad_io/-/tree/master/commonroad%2Ftutorials).
* you have installed all necessary modules for CommonRoad Search according to the [installation manual](https://gitlab.lrz.de/tum-cps/commonroad-search/-/blob/master/README.md).
* you have done the [tutorial on uninformed search algorithms](https://gitlab.lrz.de/tum-cps/commonroad-search/-/blob/master/notebooks/tutorials/cr_uninformed_search_tutorial.ipynb).
Let's start with importing the modules and loading the CommonRoad scenario.
As mentioned in the tutorial on uninformed search, Greedy-Best-First Search (GBFS) is based on Best-First Search and uses the heuristic cost h(n) as its evaluation function f(n). For this application, we need a heuristic that estimates the time to reach the goal. Therefore, we calculate the Euclidean distance from the final matched state to the goal region and divide it by the velocity in the final matched state. This is a very simple heuristic, but it works for our example.
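This time-to-go estimate is easy to reproduce outside the notebook. A minimal sketch in plain Python, assuming the position and the goal's center are given as 2D tuples (the function name and state representation are illustrative, not the notebook's actual API):

```python
import math

def heuristic_time_to_goal(position, velocity, goal_center):
    """Estimated time to reach the goal: Euclidean distance to the
    goal's center divided by the current velocity."""
    distance = math.dist(position, goal_center)  # straight-line distance [m]
    if velocity <= 0.0:
        return math.inf  # standing still: no useful time estimate
    return distance / velocity  # estimated time to go [s]

# 30 m away from the goal center while driving at 10 m/s
print(heuristic_time_to_goal((0.0, 0.0), 10.0, (30.0, 0.0)))  # 3.0
```

Note that the estimate is admissible in the sense that no trajectory can be shorter than the straight line to the goal at the current speed.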
Before we run the algorithm you can have a look at the implementation of the Best-First Search and the evaluation function.
```python
def search_alg(self):
    '''
    Implementation of Best-First Search (tree search) using a priority queue.
    '''
    ...
```
Short reminder: When executing the following code block, you will see a "visualize" button directly beneath the "iteration" slider if you are running this notebook for the first time. Otherwise, you can always find the button at the bottom.
Click the "visualize" button and let the search algorithm run through. Once it has completed, you can use the slider to inspect all iterations step by step.
Congratulations! You finished the tutorial on informed search and CommonRoad Search. Now you are ready to implement your own search algorithms and heuristics to solve more complex planning problems.
# Tutorial: Uninformed Search Algorithms and CommonRoad Search
This tutorial shows how we can use motion primitives, i.e., short trajectory pieces, in uninformed search algorithms to find a trajectory that connects an **initial state** and a **goal region**.
## How to use this tutorial
Before you start with this tutorial, make sure that
* you have read through the tutorial for [CommonRoad-io](https://commonroad.in.tum.de/static/docs/commonroad-io/index.html). Its tutorial can be found [here](https://gitlab.lrz.de/tum-cps/commonroad_io/-/tree/master/commonroad%2Ftutorials). The API of CommonRoad-io can be found [here](https://commonroad.in.tum.de/static/docs/commonroad-io/api/index.html).
* you have installed all necessary modules for CommonRoad Search according to the [installation manual](https://gitlab.lrz.de/tum-cps/commonroad-search/-/blob/master/README.md).
Let's start with importing the modules we need for setting up the automaton and the CommonRoad scenario.
In the following, we load the motion primitives from an XML file and generate a maneuver automaton.
The maneuver automaton for this tutorial consists of 7 motion primitives and stores the connectivity between them.
Some additional explanations on the motion primitives:
* The motion primitives are generated for the Kinematic Single-Track Model (see [Vehicle Model Documentation](https://gitlab.lrz.de/tum-cps/commonroad-vehicle-models/blob/master/vehicleModels_commonRoad.pdf)), and the vehicle parameters are those of a BMW 320i (vehicle_type_id=2).
* The motion primitives drive with constant velocity, while the steering angle changes with a constant steering-angle velocity. We generated motion primitives for all combinations of initial- and final-state steering angles of 0 rad and 0.2 rad, i.e., 4 combinations. The three motion primitives for turning left are mirrored for turning right, resulting in 7 motion primitives in total.
* Motion primitives can only be connected if their initial/final velocities and initial/final steering angles match.
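The counting argument above can be checked with a short enumeration; the angle list and the mirroring rule come from the text, while the code itself is purely illustrative:

```python
from itertools import product

# Steering angle [rad] at the start and at the end of a primitive
angles = [0.0, 0.2]
combinations = list(product(angles, repeat=2))
print(len(combinations))  # 4 combinations of initial/final steering angle

# The primitive that keeps the wheels straight (0 -> 0) needs no mirror;
# the three combinations involving a turn are mirrored to obtain right turns.
turning = [c for c in combinations if c != (0.0, 0.0)]
total = 1 + 2 * len(turning)
print(total)  # 7 motion primitives in the automaton
```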
In the next step, we set up Breadth-First Search (BFS) with the generated maneuver automaton to obtain a trajectory from the initial state to the goal region. The initial state and the goal region are specified in the planning problem.
When executing the following code block, you will see a "visualize" button directly beneath the "iteration" slider if you are running this notebook for the first time. Otherwise, you can always find the button at the bottom.
Click the "visualize" button and let the search algorithm run through. Once it has completed, you can use the slider to inspect all iterations step by step.
Now we show the same example for Depth-First Search (DFS). We use a simple implementation of DFS that is similar to the BFS implementation but uses a last-in-first-out (LIFO) queue. Since the rest of the implementation is the same as for BFS, we directly run the DFS.
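The close relation between the two algorithms is easiest to see in code: only the queue discipline changes. A minimal sketch using `collections.deque`, written as a graph search with a visited set (which sidesteps the non-termination issue of tree search); the `successors` function is a stand-in for expanding motion primitives:

```python
from collections import deque

def graph_search(start, is_goal, successors, lifo=False):
    """BFS when lifo=False (FIFO queue), DFS when lifo=True (LIFO queue)."""
    frontier = deque([[start]])  # queue of paths, each ending in the node to expand
    visited = {start}
    while frontier:
        path = frontier.pop() if lifo else frontier.popleft()
        if is_goal(path[-1]):
            return path
        for succ in successors(path[-1]):
            if succ not in visited:
                visited.add(succ)
                frontier.append(path + [succ])
    return None  # no solution found

# Tiny diamond-shaped graph: 1 -> {2, 3}, both -> 4
edges = {1: [2, 3], 2: [4], 3: [4], 4: []}
print(graph_search(1, lambda n: n == 4, edges.get))              # [1, 2, 4]
print(graph_search(1, lambda n: n == 4, edges.get, lifo=True))   # [1, 3, 4]
```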
In this scenario, we were not able to find a solution using DFS, since DFS would append motion primitives for an infinitely long time (infinite state space). This shows that DFS is not complete, i.e., it is not guaranteed to find a solution if one exists.
To overcome this problem we introduce a depth limit, resulting in Depth-Limited Search (DLS). This search algorithm is introduced in the next section.
## Depth-Limited Search (DLS)
Before we run the algorithm, you can have a look at the implementation. We use the recursive implementation as introduced in the lecture.
Now let's run the algorithm and see what changes with the introduced limit. Here, we set the limit to 7, as we know from BFS that there exists a solution consisting of 7 motion primitives.
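The recursive scheme can be sketched as follows; note the distinction between *failure* (no solution exists below the limit) and *cutoff* (the limit was reached, so a deeper solution may still exist). The `successors` function is a hypothetical stand-in:

```python
CUTOFF = "cutoff"  # sentinel distinguishing cutoff from plain failure (None)

def depth_limited_search(state, is_goal, successors, limit):
    """Recursive Depth-Limited Search returning a path, None, or CUTOFF."""
    if is_goal(state):
        return [state]
    if limit == 0:
        return CUTOFF  # depth budget exhausted before reaching the goal
    cutoff_occurred = False
    for succ in successors(state):
        result = depth_limited_search(succ, is_goal, successors, limit - 1)
        if result == CUTOFF:
            cutoff_occurred = True
        elif result is not None:
            return [state] + result
    return CUTOFF if cutoff_occurred else None

# Counting up from 0: the goal 3 is reached with limit 3, but not with limit 2
succ = lambda n: [n + 1]
print(depth_limited_search(0, lambda n: n == 3, succ, 3))  # [0, 1, 2, 3]
print(depth_limited_search(0, lambda n: n == 3, succ, 2))  # cutoff
```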
As you can see, depth-limited search finds a solution.
## Uniform-Cost Search
So far, we have looked at algorithms that do not consider costs during the search. In the following, we look at Uniform-Cost Search, which is optimal for any step costs, as it expands the node with the lowest path cost g(n). In this example, the cost is the time to reach the goal; thus, our cost g(n) is the time it took to reach the current final state.
Uniform-Cost Search is based on Best-First Search, which we will also use for Greedy-Best-First Search and A\* Search. These algorithms differ only in their evaluation function: in Uniform-Cost Search, the evaluation function is f(n) = g(n).
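A self-contained sketch of Best-First Search with f(n) = g(n), i.e., Uniform-Cost Search, on a small hypothetical graph with travel times as step costs:

```python
import heapq

def uniform_cost_search(start, is_goal, successors):
    """Best-First Search with evaluation f(n) = g(n), the path cost so far.
    `successors(state)` yields (next_state, step_cost) pairs."""
    frontier = [(0.0, start, [start])]  # (g, state, path)
    best_g = {start: 0.0}
    while frontier:
        g, state, path = heapq.heappop(frontier)
        if is_goal(state):
            return g, path
        for succ, cost in successors(state):
            new_g = g + cost
            if new_g < best_g.get(succ, float("inf")):
                best_g[succ] = new_g
                heapq.heappush(frontier, (new_g, succ, path + [succ]))
    return None  # no solution found

# Travel times [s] as step costs: the detour A -> C -> B is cheaper than A -> B
graph = {"A": [("B", 5.0), ("C", 1.0)], "C": [("B", 1.0)], "B": []}
print(uniform_cost_search("A", lambda s: s == "B", graph.get))
# (2.0, ['A', 'C', 'B'])
```

Swapping the priority key `new_g` for `h(succ)` or `new_g + h(succ)` turns the same loop into GBFS or A*, respectively.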
Before we run the search, you can have a look at the implementation of the algorithm and the evaluation function. Again, we removed all the visualization parts, so it is easier to understand the code.
```python
def search_alg(self):
    '''
    Implementation of Best-First Search (tree search) using a priority queue.
    '''
    ...
```