SLAM
Tim Bailey
Australian Centre for Field Robotics

Overview
- The SLAM problem
- History
- Models
- Standard EKF formulation
- Data association
- Feature initialisation

What is SLAM?
SLAM asks the following question: is it possible for an autonomous vehicle to start at an unknown location in an unknown environment and then to incrementally build a map of this environment while simultaneously using this map to compute vehicle location?
SLAM has many indoor, outdoor, in-air and underwater applications, for both manned and autonomous vehicles. Examples:
- Explore and return to the starting point (Newman)
- Learn trained paths to different goal locations
- Traverse a region with complete coverage (e.g., mine fields, lawns, reef monitoring)

Components of SLAM
- Localisation: determine pose given an a priori map
- Mapping: generate a map when the pose is accurately known from an auxiliary source
- SLAM:
  - define some arbitrary coordinate origin
  - generate a map from on-board sensors
  - compute pose from this map
  - errors in the map and in the pose estimate are dependent

Basic SLAM Operation
Example: SLAM in Victoria Park
Dependent Errors
Correlated Estimates

SLAM Convergence
- An observation acts like a displacement applied to a spring system
- The effect is greatest in a close neighbourhood
- The effect on other landmarks diminishes with distance
- Propagation depends on local stiffness (correlation) properties
- With each new observation the springs become increasingly (and monotonically) stiffer; in the limit a rigid map of landmarks is obtained: a perfect relative map of the environment
- The location accuracy of the robot is bounded by:
  - the current quality of the map
  - the relative sensor measurement accuracy

Spring Analogy
Spring Analogy Video

Monotonic Convergence
With each new observation, the determinant decreases over the
map and for any submatrix of the map.

History of SLAM
- It all started about 20 years ago, at ICRA 1986 in San Francisco
- Probabilistic methods were new to robotics and AI
- Several researchers were looking at applying estimation-theoretic methods to mapping and localisation problems
- They saw that:
  - consistent probabilistic mapping was a fundamental problem
  - major conceptual and computational issues needed to be addressed
- Key papers were written on geometric uncertainty (Smith and Cheeseman; Durrant-Whyte), showing that estimates exhibit a high degree of correlation between geometric features (i.e., landmark locations in a map)

History of SLAM
- Landmark paper by Smith, Self and Cheeseman: landmark estimates are correlated via the vehicle pose estimate
- Important implication: a consistent full solution requires a joint state composed of the vehicle pose and every landmark position
  - many landmarks means a huge state vector
  - the estimate must be updated following each landmark observation
- At the time, estimation meant Kalman filters: computation and storage costs scale quadratically with the number of landmarks

History of SLAM
- Researchers initially thought SLAM would not converge
  - they assumed estimated map errors would exhibit random-walk behaviour, giving unbounded map uncertainty
- Leonard (1991): simultaneous mapping and localisation, "which came first, the chicken or the egg?"
- Researchers tried to minimise correlations between landmarks
  - approximations were applied to minimise or eliminate correlations, or the correlations were simply assumed to be zero
  - the full filter was reduced to a series of decoupled landmark-to-vehicle filters
- Theoretical work on SLAM came to a temporary halt; work focused on either mapping or localisation as separate problems

History of SLAM
- Conceptual breakthrough: SLAM converges (Csorba, Dissanayake)
  - SLAM must be correctly formulated as a joint state; correlations are essential
  - correlations grow, they do not shrink
  - stronger correlations mean a better relative map estimate
- The term SLAM was introduced in 1995; groups started working again on SLAM (aka CML): ACFR, Zaragoza, MIT
- Key series of papers through 2000 showing convergence properties
- Initial work on computational efficiency and on loop closure

Growth in SLAM Interest
- First SLAM session held at ISRR 1999, which also included the probabilistic mapping methods of Sebastian Thrun and Dieter Fox
- First ICRA SLAM workshop: 2000
- First SLAM summer school: 2002
- Massive increase in SLAM papers since 2002

Models
- Models are central to creating a representation of the world
- We must have a mapping between sensed data (e.g., laser, cameras, odometry) and the states of interest (e.g., vehicle pose, stationary landmarks)
- Two essential model types: vehicle motion, and sensing
of external objects.

An Example System
(Block diagram: radar, laser, steering angle, wheel speed, gyro (INS), compass and GPS feed the vehicle model and estimator; predicted map states and vehicle pose are compared with object detections, with data association adding landmarks and additional landmark properties.)

States, Controls, Observations
- Joint state with momentary pose
- Joint state with pose history
- Control inputs
- Observations

Vehicle Motion Model
- Ackerman-steered vehicles: bicycle model
- Discrete-time model

SLAM Motion Model
- Joint state: vehicle pose and landmark locations
- Landmarks are assumed stationary

Observation Model
- Range-bearing measurement
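The bicycle motion model and range-bearing observation model above can be sketched as follows (a minimal illustration, not the slides' exact equations; the wheelbase and timestep values and the noise-free form are assumptions):

```python
import numpy as np

def motion_model(pose, v, gamma, wheelbase=2.8, dt=0.1):
    """Discrete-time bicycle model for an Ackerman-steered vehicle.

    pose = [x, y, phi]; v is wheel speed, gamma is steering angle.
    Wheelbase and timestep are illustrative values.
    """
    x, y, phi = pose
    return np.array([
        x + v * dt * np.cos(phi + gamma),
        y + v * dt * np.sin(phi + gamma),
        phi + v * dt * np.sin(gamma) / wheelbase,
    ])

def observation_model(pose, landmark):
    """Range-bearing measurement of a 2-D landmark from the vehicle pose."""
    x, y, phi = pose
    dx, dy = landmark[0] - x, landmark[1] - y
    return np.array([np.hypot(dx, dy),            # range
                     np.arctan2(dy, dx) - phi])   # bearing

# One prediction step followed by an observation of a landmark at (5, 5)
pose = motion_model(np.array([0.0, 0.0, 0.0]), v=2.0, gamma=0.1)
z = observation_model(pose, np.array([5.0, 5.0]))
```

In the full filter, both functions are wrapped with additive noise terms, and their Jacobians are used for the EKF covariance propagation described later.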
SLAM Graphical Model
Models in Graphical Model

Perfect World: Deterministic
- Exact pose from the motion model
- Global localisation by triangulation
- Even with range-only or bearing-only sensors, the vehicle can localise given enough measurements
- Solve simultaneous equations: N equations for N unknowns

Real World: Uncertain
- All measurements have errors
- In SLAM, measurement errors induce dependencies in the landmark and vehicle pose estimates
- Everything is correlated

Bayesian Estimation
- The standard theory for dealing with uncertain information in a consistent manner

Brief Overview of Probability Theory
- A probability density function (PDF) over an N-dimensional state space is denoted p(x)
- A PDF is non-negative and integrates to one

Brief Overview of Probability Theory
- State vector x = (x1, x2, x3)
- Joint PDF: p(x1, x2, x3)
- Conditional PDF of x1 given x2 and x3: p(x1 | x2, x3)
- Conditional independence: if x1 is independent of x2 given x3, then p(x1 | x2, x3) = p(x1 | x3)

Two Essential Rules for Manipulating Probabilities
- Sum rule: p(x1) = ∫ p(x1, x2) dx2
- Product rule: p(x1, x2) = p(x1 | x2) p(x2)

Implications of the Product Rule
- Conditionals
- Independence
- Markov models
- Bayes theorem: p(x | z) = p(z | x) p(x) / p(z)

Bayesian Estimation over a Dynamic State Space
- SLAM has a dynamic state space
- Grow the state space via augmentation
- Reduce the state space via marginalisation
- Fuse new information via Bayes theorem

Augmentation: Introduce New States into the Joint Vector
- Given a prior PDF p(x1) and a functional relationship x2 = f(x1, q), where q is an independent noise vector with known PDF, we
have a conditional PDF p(x2 | x1), and form the joint PDF as p(x1, x2) = p(x2 | x1) p(x1)

Marginalisation: Remove Old States
- As per the sum rule: p(x1) = ∫ p(x1, x2) dx2
- The marginal says: what is the PDF of x1 when we don't care what value x2 takes; i.e., p(x1) regardless of x2
- Important distinction: x1 is still dependent on x2, but p(x1) is not a function of x2

Bayesian Update: Inverse Probability
- Bayes theorem: p(x | z) = p(z | x) p(x) / p(z)
- p(z | x) is defined by the observation model: a conditional probability for fixed x, a likelihood function for fixed z

Bayes Update
- Update: p(x | z) ∝ p(z | x) p(x)
- The denominator term is often seen as just a normalising constant, but it is important for saying how likely a model or hypothesis is:
  - used in FastSLAM for determining particle weights
  - used in multi-hypothesis data association

Applying Bayes to SLAM: Available Information
- States (hidden or inferred values): vehicle poses, and the map, typically composed of discrete parts called landmarks or features
- Controls: velocity, steering angle
- Observations: range-bearing measurements

Augmentation: Adding New Poses and Landmarks
- Add a new pose
- The conditional probability of the new pose given the previous one is a Markov model

Augmentation
- Use the product rule to create the joint PDF
- The same method applies to adding new landmark states

Marginalisation: Removing Past Poses and Obsolete Landmarks
- Augmenting with the new pose and marginalising out the old pose gives the classical SLAM prediction step

Fusion: Incorporating Observation Information
- Conditional PDF according to the observation model
- Bayes update: the posterior is proportional to the product of the likelihood and the prior
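Taken together, the augment-marginalise-fuse cycle above is the recursive Bayes filter for SLAM. As a summary (standard notation, not transcribed from the slides), with joint state (x_k, m), observation history Z_k and control history U_k:

```latex
% Prediction: augment with the new pose x_k, then marginalise out x_{k-1}
p(x_k, m \mid Z_{k-1}, U_k) =
  \int p(x_k \mid x_{k-1}, u_k)\, p(x_{k-1}, m \mid Z_{k-1}, U_{k-1})\, \mathrm{d}x_{k-1}

% Fusion: Bayes update with the new observation z_k
p(x_k, m \mid Z_k, U_k) \propto
  p(z_k \mid x_k, m)\, p(x_k, m \mid Z_{k-1}, U_k)
```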
Implementing Probabilistic SLAM
- The problem is that Bayesian operations are intractable in general
- The general equations are good for analytical derivations, not good for implementation
- We need approximations:
  - linearised Gaussian systems (EKF, UKF, EIF, SAM)
  - Monte Carlo sampling methods (Rao-Blackwellised particle filters)

EKF SLAM
- The complicated Bayesian equations for augmentation, marginalisation and fusion have simple and efficient closed-form solutions for linear Gaussian systems
- For non-linear systems, just linearise:
  - EKF, EIF: Jacobians
  - UKF: deterministic samples

Kalman Implementation
- So can we just plug the process and observation models into the standard EKF equations and turn the crank?
- Several additional issues:
  - the structure of the SLAM problem permits a more efficient implementation than the naive EKF
  - data association
  - feature initialisation

Structure of SLAM
- Key property of stochastic SLAM: it is largely a parameter-estimation problem
  - since the map is stationary, there is no process model and no process noise for the landmarks
- For Gaussian SLAM, the uncertainty in each landmark reduces monotonically after landmark initialisation: the map converges
- We examine the computational consequences of
this structure in the next session.

Data Association
- Before the update stage, we need to determine whether the feature we are observing is an old feature or a new feature
- If there is a match with only one known feature, the update stage is run with this feature's information
- Innovation: ν(k) = z(k) − h(x̂(k|k−1))
- Innovation covariance: S(k) = Hx P(k|k−1) Hxᵀ + R, where Hx is the Jacobian of h
- A match is accepted if the normalised innovation falls inside the validation gate: ν(k)ᵀ S(k)⁻¹ ν(k) ≤ γ, with γ a chi-square threshold (e.g., at 0.95 confidence)

Validation Gating

New Features
- If there is no match, then a potential new feature has been detected
- We do not want to incorporate a spurious observation as a new feature:
  - it will not be observed again and will consume computational time and memory
  - it will add clutter, increasing the risk of future mis-associations
- The features are assumed to be static; we do not want to accept dynamic objects (cars, people, etc.) as features

Acceptance of New Features: Approach 1
- Put the feature in a list of potential features
- Incorporate the feature once it has been observed a number of times
- Advantages:
  - simple to implement
  - appropriate for a high-frequency external sensor
- Disadvantages:
  - loss of information
  - potentially a problem for a sensor with a small field of view: a feature may be seen only very few times

Acceptance of New Features: Approach 2
- The state vector is extended with past vehicle poses, and the cross-correlation between current and previous vehicle states is maintained
- With this approach, improved data association is possible by combining data from various vantage points
  - J.J. Leonard and R.J. Rikoski, "Incorporation of delayed decision making into stochastic mapping"
  - Stefan Williams, PhD thesis, 2001, University of Sydney
- Advantages:
  - no loss of information
  - well suited to low-frequency external sensors (ratio between vehicle velocity and feature-information rate)
  - absolutely necessary for some sensor modalities (e.g., range-only, bearing-only)
- Disadvantages:
  - the cost of augmenting the state with past poses
  - the implementation is more complicated

Incorporation of New Features
- We have the vehicle states and the previous map, with covariance

  P = | P_vv  P_vm |
      | P_mv  P_mm |

- When a new feature is observed, its covariance and cross-covariance terms need to be evaluated:

  P = | P_vv  P_vm  ? |
      | P_mv  P_mm  ? |
      | ?     ?     ? |

Incorporation of New Features: Approach 1
- Initialise the new feature with a very large covariance A:

  P(k|k−1) = | P_vv  P_vm  0 |
             | P_mv  P_mm  0 |
             | 0     0     A |

- Then apply a standard EKF update using the observation of the new feature:

  S(k) = Hx P(k|k−1) Hxᵀ + R
  W(k) = P(k|k−1) Hxᵀ S(k)⁻¹
  P(k|k) = P(k|k−1) − W(k) S(k) W(k)ᵀ

- The update fills in every block of the covariance matrix, including the new feature's cross-covariances to the vehicle and to all existing landmarks
- Easy to understand and implement
- Very large values of A may introduce numerical problems

Analytical Approach
- We can also evaluate analytical expressions for the new covariance and cross-covariance terms directly

Now We Know
- Structure of the problem
- Models
- Kalman filter equations
- Data association
- Feature incorporation
- Problem solved?

Efficient and Robust SLAM
Tim Bailey
Australian Centre for Field Robotics

Overview
- Efficient SLAM
  - simplifications in the prediction and update stages
  - sparse information-form SLAM
  - partitioned SLAM
  - submaps
  - FastSLAM
- Robust data association
  - batch gating
  - FastSLAM
- Environment representation
  - generalised features
  - dense maps
- SLAM website: openslam.org

EKF-SLAM Computation Costs
- Storage requirements are O(N²) for the covariance matrix
- The prediction step is O(N³) if implemented naively, since forming F P Fᵀ
is a cubic operation
- The update step is O(N²)
- The augmentation step (i.e., adding new landmarks) is O(N²) if implemented naively
- These costs limit the scale of the map that can be represented and maintained in real time
- We can reduce the cost of prediction and augmentation to O(N); the prediction operation is, in fact, a special case of the augmentation operation

EKF Augmentation
- Add a new pose (adding new landmarks works the same way)
- Compute the mean vector directly from the non-linear model
- Compute the covariance by linearisation

Covariance Augmentation
- We need Jacobians of the vehicle motion model with respect to all uncertain variables
- Presume, without loss of generality, that all motion uncertainty is contained in the control variables u_k and has covariance U_k
- To simplify notation, let F and G denote the Jacobians of the motion model with respect to the pose and the controls

Covariance Augmentation
- Covariance before augmentation and after augmentation

EKF Marginalisation
- Marginalisation of a covariance-form Gaussian is entirely trivial: just extract the relevant parts of the mean and covariance

EKF Fusion
- Presume we have a non-linear observation model with additive noise r that has covariance R
- Standard EKF update step: an O(N²) operation

General Form of EKF Augment
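The augment and marginalise operations can be sketched directly on the covariance matrix (a minimal numpy illustration; the block layout, with the old pose leading the state vector, and the symbols F, G, U are assumptions consistent with the notation above):

```python
import numpy as np

def augment_pose(P, F, G, U):
    """Augment covariance P with a new pose x_k = f(x_{k-1}, u_k).

    F: Jacobian of f w.r.t. the old pose (leading block of the state).
    G: Jacobian of f w.r.t. the controls; U: control noise covariance.
    Only one block row/column is computed, so the cost is O(N), not O(N^2).
    """
    n, d = P.shape[0], F.shape[1]
    cross = F @ P[:d, :]                      # cross-covariance of the new pose
    Paug = np.zeros((n + F.shape[0], n + F.shape[0]))
    Paug[:n, :n] = P
    Paug[n:, :n] = cross
    Paug[:n, n:] = cross.T
    Paug[n:, n:] = F @ P[:d, :d] @ F.T + G @ U @ G.T
    return Paug

def marginalise_out(P, idx):
    """Remove states: in covariance form, just delete their rows and columns."""
    keep = [i for i in range(P.shape[0]) if i not in set(idx)]
    return P[np.ix_(keep, keep)]

# Classical prediction step: augment with the new pose, marginalise the old one
P = np.diag([1.0, 2.0, 3.0, 4.0, 5.0])        # 3-D pose plus one 2-D landmark
Pa = augment_pose(P, F=1.1 * np.eye(3), G=np.eye(3), U=0.01 * np.eye(3))
Pp = marginalise_out(Pa, [0, 1, 2])           # drop the old pose
```

The new-pose block of Pp equals F P_vv Fᵀ + G U Gᵀ, i.e. the classical EKF-SLAM prediction covariance, illustrating why prediction is a special case of augmentation.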
Computation
- Optimal EKF-SLAM has an O(N²) update step: quadratic cost in computation and storage
- This limits the scale of the map that can be represented and maintained in real time
- Various solutions to this cost:
  - sparse information-form SLAM
  - submap methods
  - FastSLAM

General Information-Form Augment
Sparse Information-Form SLAM

Problems with Information Form
- Need to recover the mean for model linearisation
- Need to recover the mean and covariance for data association
- Potentially very expensive: O(N³)
- There exist fast ways to recover the moment-form information of selected marginals:
  - Takahashi sparse inverse
  - column-wise covariance recovery (and partial column recovery)

Partitioned Updates
- For traditional EKF-SLAM, a quadratic update step is required for each observation
- Updates are a high-frequency operation
- It is possible to partition