
Slide1

Autonomous Navigation for Flying Robots
Lecture 2.3: 2D Robot Example
Jürgen Sturm
Technische Universität München

Slide2

2D Robot
The robot is located somewhere in space.


Slide3

2D Robot
The robot is located somewhere in space.
Robot pose:
Position
Orientation (yaw angle/heading)


Slide4

Robot Pose
The robot is located somewhere in space.
Robot pose:
Position
Orientation (yaw angle/heading)
The robot pose is represented as a transformation matrix:
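As a sketch of the standard form: with position (x, y) and yaw angle θ, the 2D pose can be written as the homogeneous transformation matrix

\[
T =
\begin{pmatrix}
\cos\theta & -\sin\theta & x \\
\sin\theta & \cos\theta  & y \\
0          & 0           & 1
\end{pmatrix}
\]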


Slide5

Robot Pose
Example: the robot is located at a particular position, and its pose is the corresponding transformation matrix.
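As a purely illustrative (hypothetical) instance, with numbers chosen only for demonstration: a robot at position (2, 1) with heading θ = 90° has the pose

\[
T =
\begin{pmatrix}
0 & -1 & 2 \\
1 & 0 & 1 \\
0 & 0 & 1
\end{pmatrix}
\]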


Slide6

Coordinate Transformations
The robot is located somewhere in space.
Robot pose:
Position
Orientation (yaw angle/heading)
What is the pose after moving 1m forward?
How do we need to move to reach a certain position?


Slide7

Coordinate Transformations
The robot moves forward by 1m. What is its position afterwards?
Point located 1m in front of the robot in local coordinates:
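Assuming the usual convention that the robot's forward direction is its local x-axis, this point is simply

\[
p_{\text{local}} = \begin{pmatrix} 1 \\ 0 \end{pmatrix},
\qquad
\tilde{p}_{\text{local}} = \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix}
\text{ in homogeneous coordinates.}
\]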


Slide8

Coordinate Transformations
The robot moves forward by 1m. What is its position afterwards?
Point located 1m in front of the robot in global coordinates:
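A sketch of the computation, using the pose matrix T and the local point from the previous slides: applying the pose transform to the local point gives the global coordinates,

\[
\tilde{p}_{\text{global}} = T \, \tilde{p}_{\text{local}}
= \begin{pmatrix} x + \cos\theta \\ y + \sin\theta \\ 1 \end{pmatrix}
\]

i.e., the robot's new position is displaced by (cos θ, sin θ) from (x, y).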


Slide9

Coordinate Transformations
We transformed local to global coordinates. Sometimes we need to do the inverse.
How can we transform global coordinates into local coordinates?


Slide10

Coordinate Transformations
We transformed local to global coordinates. Sometimes we need to do the inverse.
How can we transform global coordinates into local coordinates?
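A sketch of the inverse mapping: with the pose split into a rotation R and a translation t, global coordinates are mapped back into the robot's local frame by the inverse transform,

\[
\tilde{p}_{\text{local}} = T^{-1} \, \tilde{p}_{\text{global}},
\qquad
T^{-1} =
\begin{pmatrix}
R^{\top} & -R^{\top} t \\
0 & 1
\end{pmatrix}
\]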


Slide11

Coordinate Transformations
Now consider a different motion: the robot moves 0.2m forward, 0.1m sideward, and turns by 10deg.
Euclidean transformation:
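As a sketch, assuming the motion is expressed in the robot's local frame (forward along the local x-axis, sideward along the local y-axis), the corresponding Euclidean transformation is

\[
T_{\text{motion}} =
\begin{pmatrix}
\cos 10^\circ & -\sin 10^\circ & 0.2 \\
\sin 10^\circ & \cos 10^\circ  & 0.1 \\
0 & 0 & 1
\end{pmatrix}
\]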


Slide12

Coordinate System Transformations
Now consider a different motion: the robot moves 0.2m forward, 0.1m sideward, and turns by 10deg.
After this motion, the robot pose becomes:
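Under the common convention that the incremental motion is given in the robot's local frame (as in the previous sketch), it is right-multiplied onto the old pose:

\[
T_{\text{new}} = T_{\text{old}} \cdot T_{\text{motion}}
\]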


Slide13

Coordinate System Transformations
Note: The order matters!
Compare:
1. Move 1m forward, then turn 90deg left
2. Turn 90deg left, then move 1m forward
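A minimal numeric check of this, assuming numpy and the homogeneous pose matrices sketched earlier (the helper pose_matrix is illustrative, not part of the lecture material):

    import numpy as np

    def pose_matrix(x, y, theta):
        # Homogeneous transformation matrix for a 2D pose (x, y, theta).
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, x],
                         [s,  c, y],
                         [0,  0, 1]])

    move_1m = pose_matrix(1.0, 0.0, 0.0)        # move 1m forward
    turn_90 = pose_matrix(0.0, 0.0, np.pi / 2)  # turn 90deg left

    print(move_1m @ turn_90)  # move first, then turn: robot ends at (1, 0)
    print(turn_90 @ move_1m)  # turn first, then move: robot ends at (0, 1)

Both orderings leave the robot facing the same direction, but at different positions: (1, 0) in the first case and (0, 1) in the second.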


Slide14

Robot Odometry
How can we estimate the robot motion?
Control-based models predict the motion from the issued control commands.
Odometry-based models are used when systems are equipped with distance sensors (e.g., wheel encoders).
Velocity-based models have to be applied when no wheel encoders are available.


Slide15

Dead Reckoning
Integration of odometry is also called dead reckoning.
It is a mathematical procedure to determine the present location of a vehicle.
This is achieved by calculating the current pose of the vehicle based on the estimated/measured velocities and the elapsed time.


Slide16

Motion Models
Estimating the robot pose based on the issued controls (or IMU readings) and the previous location.
Motion model:
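In the notation commonly used for this, the motion model g maps the previous pose and the current control (or IMU reading) to the new pose estimate:

\[
x_t = g(x_{t-1}, u_t)
\]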


Slide17

Exercise
Given: IMU readings from a real flight of an Ardrone quadrotor
Horizontal speed in the local frame
Yaw angular speed
Wanted: Position and orientation in the global frame
Integrate these values to get the robot pose.
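A minimal sketch of such an integration, assuming the readings come as arrays of local-frame horizontal velocities and yaw rates sampled at a fixed interval (the function and parameter names are illustrative, not the exercise's reference solution):

    import numpy as np

    def integrate_pose(vx, vy, wz, dt):
        # Dead-reckon a 2D pose from local-frame velocities (simple Euler integration).
        # vx, vy: horizontal speeds in the robot's local frame [m/s]
        # wz:     yaw rates [rad/s]
        # dt:     sample interval [s]
        x, y, yaw = 0.0, 0.0, 0.0
        trajectory = []
        for vx_i, vy_i, wz_i in zip(vx, vy, wz):
            # rotate the local velocity into the global frame, then integrate
            x += (np.cos(yaw) * vx_i - np.sin(yaw) * vy_i) * dt
            y += (np.sin(yaw) * vx_i + np.cos(yaw) * vy_i) * dt
            yaw += wz_i * dt
            trajectory.append((x, y, yaw))
        return trajectory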


Slide18

Lessons Learned
2D pose
Conversion between local and global coordinates
Concatenation of (robot) motions
Robot odometry
