Full-Stack Development of Robot Navigation
Hi everyone,
Well,
Developing a system that can behave intelligently in a previously unknown environment is challenging, because the decisions it has to make are mostly unpredictable beforehand. Saying they are totally unpredictable would make the problem unsolvable, so every environment must offer some degree of predictability for the system to make a confident judgment about its next move. In a controlled environment like an assembly line, the next movement is measured and predictable. For a car on the road, the next set of actions can be predicted using road signs, the movement of other vehicles, and the car's position and destination, with tolerable perturbations handled along the way. But for a rover on Mars, all the system has is an estimated model of what the environment could be, built from previous observations, so it has to predict its actions mostly from what it senses.
For that sort of automated system, the only source of information about the surrounding environment is the sensors it carries. But the system cannot rely on its sensors completely either: sensors never capture the total picture, only some aspects of it, and every man-made sensor contains electrical and mechanical noise regardless of its grade or price. That means the system has to build a reliable enough model of its surroundings from incomplete and noisy sensor data. If sensors were perfect (they cannot be built) and gave the total picture (which cannot be achieved), the system would be straightforward and would produce output exactly according to its input.
Since that is not possible, the system needs to account for the incompleteness of the data, the noise that can overwhelm it, and the imperfection of the system as a whole. Accounting for all those aspects, we are trying to develop an autonomous four-wheel (vehicle-like) navigation robot using a 2D lidar as the base sensor. Since a 2D lidar can only measure distances in two dimensions (x, y) at a fixed Z-axis height, navigation will only be accurate on flat surfaces with tolerable perturbations. Using four wheels is a little more complex, since the extra driving dynamics need to be considered in the robot's dynamic model; I am using those extra dynamics to study the possibility of implementing this on a car.
When going through the literature, I did not find much straightforward material; there are pieces of the discussion here and there, most of them from a theoretical standpoint. But there are a few I really liked and should mention as references for anyone who really wants to learn the basics:
- Probabilistic Robotics by Sebastian Thrun
- SLAM lecture series by Professor Claus Brenner
We will discuss the whole system using a block diagram, then go through the parts one by one with theoretical detail and implementation notes. This may vary, because the implementation is not finished yet, but this is the overall plan so far.
Basically, the system consists of two parts: the robot and the server. The robot collects all the data, processes it, and transmits it to the server, and moves around according to the control data coming back from the server. The server, on the other hand, collects all the incoming sensor data (lidar data and encoder data), processes it, generates control data, and feeds it back to the robot. The two use a Wi-Fi link as their communication medium. Apart from these main parts, there is one more component connecting to the server for monitoring and manual commands.
Basic Block Diagram
The robot contains five separate components working together:
- controller - drives the 4 motors attached to the robot according to the data coming from the control server
- control server - listens to the incoming data from the communicator, processes it, and converts it into PWM signals for the controller
- sensor - a sensor network containing the lidar and encoders, feeding data to the localization data server
- localization data server - listens to the incoming data from the sensors and processes it for transmission
- communicator - maintains the data link between the robot and the server; it listens to both the localization data server and the control server, pushing data back and forth
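To make the control server's job concrete, here is a minimal sketch of turning a velocity command into PWM duty cycles for a four-wheel, skid-steer-style base. The constants (`MAX_SPEED`, `PWM_MAX`) and the differential mixing law are illustrative assumptions, not the project's actual implementation.

```python
MAX_SPEED = 0.5   # m/s, assumed top speed of the robot
PWM_MAX = 255     # 8-bit PWM range, assumed


def velocity_to_pwm(linear, angular, track_width=0.3):
    """Convert a (linear, angular) velocity command into left/right PWM values."""
    # Differential mixing: each side moves at the linear speed plus/minus
    # the rotational contribution (angular rate times half the track width).
    left = linear - angular * track_width / 2.0
    right = linear + angular * track_width / 2.0

    def to_pwm(v):
        # Scale to the PWM range and clamp; sign encodes direction.
        return max(-PWM_MAX, min(PWM_MAX, int(v / MAX_SPEED * PWM_MAX)))

    return to_pwm(left), to_pwm(right)
```

The same left/right value would be applied to both wheels on a side, which is the usual simplification for a skid-steer drive.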
The localization server works as an API: from the lidar and encoder data coming from the robot, it localizes the robot relative to its surroundings while mapping those surroundings relative to the robot's position.
For localization, the API uses a Kalman filter inside a particle filter implementation. It treats each obstacle it passes as a landmark on the map and positions the landmark according to its probability curve.
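The "Kalman filter inside a particle filter" structure is the FastSLAM idea: each particle carries a pose hypothesis plus one small Kalman filter per landmark. The sketch below is a deliberately simplified version, assuming isotropic landmark covariance (a single variance instead of a full 2x2 matrix) and an assumed measurement variance; a real implementation would use full covariance matrices and a proper motion model.

```python
import copy
import math
import random

MEAS_VAR = 0.05  # assumed lidar measurement variance (m^2)


class Particle:
    def __init__(self, x=0.0, y=0.0, theta=0.0):
        self.x, self.y, self.theta = x, y, theta
        self.weight = 1.0
        self.landmarks = {}  # id -> (mean_x, mean_y, variance)

    def move(self, dist, dtheta, noise=0.02):
        # Motion update: apply odometry with additive Gaussian noise.
        self.theta += dtheta + random.gauss(0, noise)
        self.x += (dist + random.gauss(0, noise)) * math.cos(self.theta)
        self.y += (dist + random.gauss(0, noise)) * math.sin(self.theta)

    def observe(self, lm_id, rng, bearing):
        # Convert a (range, bearing) lidar hit into a world-frame point.
        zx = self.x + rng * math.cos(self.theta + bearing)
        zy = self.y + rng * math.sin(self.theta + bearing)
        if lm_id not in self.landmarks:
            # New landmark: initialise its Kalman mean at the measurement.
            self.landmarks[lm_id] = (zx, zy, MEAS_VAR)
            return
        mx, my, s = self.landmarks[lm_id]
        ix, iy = zx - mx, zy - my        # innovation
        q = s + MEAS_VAR                 # innovation variance
        # Re-weight the particle by how well the measurement matched.
        self.weight *= math.exp(-(ix * ix + iy * iy) / (2 * q))
        # Scalar Kalman update (direct position measurement, H = I).
        k = s / q                        # Kalman gain
        self.landmarks[lm_id] = (mx + k * ix, my + k * iy, (1 - k) * s)


def resample(particles):
    # Draw a new particle set proportional to the weights, then reset them.
    total = sum(p.weight for p in particles)
    chosen = random.choices(particles,
                            weights=[p.weight / total for p in particles],
                            k=len(particles))
    fresh = [copy.deepcopy(p) for p in chosen]
    for p in fresh:
        p.weight = 1.0
    return fresh
```

Each cycle would be: `move` every particle with the odometry, `observe` every matched lidar landmark, then `resample` once the weights degenerate.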
The API also pushes all the results and performance data to the database at the end of each calculation cycle.
The database implementation is not finished, since I am not yet done with the control server, but so far it keeps all the localization and performance data and maintains a cursor on the latest unused data for the control server, keeping the rest of the system up to date with the localization cycle.
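The "cursor on the latest unused data" can be modelled with a consumed flag, as in this sketch. The table and column names are assumptions for illustration, not the project's actual schema, and SQLite stands in for whatever store the project ends up using.

```python
import sqlite3

# In-memory database standing in for the real store (assumption).
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE localization (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    x REAL, y REAL, theta REAL,
    consumed INTEGER DEFAULT 0)""")


def push_pose(x, y, theta):
    # Localization server appends one result per calculation cycle.
    db.execute("INSERT INTO localization (x, y, theta) VALUES (?, ?, ?)",
               (x, y, theta))
    db.commit()


def pop_unused():
    # The cursor on unused data: fetch unread rows in order, then mark
    # them consumed so the control server never replays them.
    rows = db.execute("SELECT id, x, y, theta FROM localization "
                      "WHERE consumed = 0 ORDER BY id").fetchall()
    db.execute("UPDATE localization SET consumed = 1 WHERE consumed = 0")
    db.commit()
    return rows
```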
The viewer is an API for demonstrating the localization and path planning on the generated map, plus a manual control panel for debugging. It pages the database for live data, and for old data as needed.
The path planner is a combination of the A* algorithm and a greedy algorithm, to make the system both reliable and efficient. It renders a live map from the database, the algorithm works on it in a graphical (grid) manner and generates a set of coordinates for the robot to travel, and these are then converted back to real-world coordinates and pushed to the database.
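One common way to combine A* with a greedy strategy is weighted A*: a weight `w` on the heuristic slides between plain A* (`w = 1`, optimal paths) and pure greedy best-first search (large `w`, faster but possibly longer paths). This sketch assumes that blend and a simple occupancy grid (0 = free, 1 = obstacle); the post's actual combination may differ.

```python
import heapq


def plan(grid, start, goal, w=1.0):
    """Weighted A* on a 4-connected occupancy grid; returns a cell path or None."""
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan distance heuristic
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    frontier = [(w * h(start), 0, start)]   # (priority, cost, cell)
    came, g = {start: None}, {start: 0}
    while frontier:
        _, _, node = heapq.heappop(frontier)
        if node == goal:
            path = []                        # walk parents back to start
            while node is not None:
                path.append(node)
                node = came[node]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (node[0] + dr, node[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0):
                ng = g[node] + 1
                if ng < g.get(nxt, float("inf")):
                    g[nxt], came[nxt] = ng, node
                    heapq.heappush(frontier, (ng + w * h(nxt), ng, nxt))
    return None  # goal unreachable
```

The grid coordinates this returns would then be scaled back to real-world coordinates before being pushed to the database.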
The control server pulls the path data from the database and, according to the current position, generates the immediate action and pushes it onto the data link.
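A minimal sketch of that immediate-action step: given the current pose and the next waypoint on the path, produce a (linear, angular) velocity command. The proportional gains and the turn-in-place threshold are illustrative assumptions; the real control server may use a different control law.

```python
import math


def next_command(pose, waypoint, k_lin=0.5, k_ang=1.5):
    """Proportional steering toward the next waypoint: returns (linear, angular)."""
    x, y, theta = pose
    dx, dy = waypoint[0] - x, waypoint[1] - y
    dist = math.hypot(dx, dy)
    # Heading error, wrapped to [-pi, pi].
    err = math.atan2(dy, dx) - theta
    err = math.atan2(math.sin(err), math.cos(err))
    # Only drive forward when roughly facing the waypoint; otherwise turn in place.
    linear = k_lin * dist if abs(err) < math.pi / 4 else 0.0
    return linear, k_ang * err
```

This pairs naturally with a waypoint queue: when `dist` drops below some tolerance, the control server advances to the next coordinate from the path data.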