Code: | M.EEC035 | Acronym: | PMAP |
Keywords | |
---|---|
Classification | Keyword |
OFFICIAL | Automation and Control |
Active? | Yes |
Responsible unit: | Department of Electrical and Computer Engineering |
Course/CS Responsible: | Master in Electrical and Computer Engineering |
Acronym | No. of Students | Study Plan | Curricular Years | Credits UCN | Credits ECTS | Contact hours | Total Time |
---|---|---|---|---|---|---|---|
M.EEC | 23 | Syllabus | 2 | - | 6 | 39 | 162 |
Perception and Mapping addresses the current challenges of extracting 2D and 3D information and of reconstructing and interpreting scenes from active and passive sensors (or sets of sensors). Applications of the course contents include improving the perceptual capacity of autonomous agents (land, air, and sea) so that they can interact properly with the surrounding environment.
Students who successfully complete this UC should:
- understand and be able to explain the concepts of image and point-cloud processing, and the fundamental algorithms for sensor fusion applied to localization and mapping;
- know the existing methods for visual data analysis and be able to apply them in practical situations;
- acquire skills that allow them to use tools such as ROS, OpenCV, and PCL, which implement some of the analyzed algorithms, and to implement novel algorithms described in the literature;
- be able to analyze and understand selected scientific articles in the fields of computer vision and mobile robotics.
It is also intended that students be able to design advanced 2D and 3D perception systems.
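As a flavor of the 2D side of these outcomes, the sketch below projects a 3D point into an image with a pinhole camera model. The intrinsic parameters (focal lengths, principal point, image size) are illustrative assumptions, not values from the course:

```python
import numpy as np

# Hypothetical intrinsics for a 640x480 camera: focal lengths in pixels
# and principal point at the image centre (all values are illustrative).
fx, fy = 500.0, 500.0
cx, cy = 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

def project(points_cam):
    """Project Nx3 points (camera frame, Z forward) to pixel coordinates."""
    uvw = (K @ points_cam.T).T          # homogeneous image coordinates
    return uvw[:, :2] / uvw[:, 2:3]     # perspective division

# A point 2 m in front of the camera and 0.5 m to its right:
pts = np.array([[0.5, 0.0, 2.0]])
print(project(pts))  # -> [[445. 240.]]
```

The same K matrix is what camera-calibration tools (e.g. OpenCV's `calibrateCamera`) estimate from images of a known pattern.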
The teaching and learning methods aim to cover the contents of the syllabus and to reach the targeted goals and competencies. The diversity of the proposed methodologies is intended to strengthen the established skills, addressing different levels of analysis and fostering the integration of knowledge. Overall, the proposed methods and strategies develop students' knowledge, understanding, and skills in processing 2D and 3D information.
Generic skills such as teamwork and organization will be developed in the group project.
Likewise, the ability to develop computer vision solutions according to existing needs, and to select and apply the most appropriate technological tools to acquire, process, and evaluate 2D/3D information, will be developed in the weekly exercises and the group project.
The specific skills in perception techniques will be developed throughout the semester in theoretical-practical classes.
- Introduction to Perception and Mapping
- Sensors (cameras, LiDAR, sonar, GPS, inertial)
- Image processing and analysis
  - Acquisition of digital intensity and color images
  - Geometric and radiometric camera models
  - Feature extraction and outlier removal
  - Multiple-view geometry
- Processing and analysis of point clouds
  - Acquisition of point clouds (active and passive)
  - Filtering, feature extraction, and registration
- 2D and 3D perception systems
  - Calibration of multi-sensor systems
  - Sensor fusion (Kalman filters, Bayesian methods, etc.)
  - Scene representation
  - SLAM (Simultaneous Localization and Mapping)
- Case study
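Among the syllabus topics above, sensor fusion with a Kalman filter (KF) can be illustrated with a minimal 1D sketch. The sensor names and noise values below are illustrative assumptions, not course material:

```python
import numpy as np

def kalman_update(x, P, z, R):
    """One KF measurement update: state mean x, variance P,
    measurement z, measurement-noise variance R."""
    K = P / (P + R)          # Kalman gain: trust measurement vs. prior
    x = x + K * (z - x)      # corrected estimate
    P = (1.0 - K) * P        # reduced uncertainty
    return x, P

x, P = 0.0, 1e6              # vague prior on a static 1-D position
rng = np.random.default_rng(0)
true_pos = 10.0
for _ in range(50):
    z_a = true_pos + rng.normal(0.0, 2.0)   # noisier sensor (e.g. "GPS")
    z_b = true_pos + rng.normal(0.0, 0.5)   # better sensor (e.g. odometry)
    x, P = kalman_update(x, P, z_a, R=4.0)
    x, P = kalman_update(x, P, z_b, R=0.25)

print(x, P)  # estimate converges near 10.0 with small variance
```

Fusing both sensors weights each measurement by its noise variance, so the estimate is tighter than either sensor alone; the full KF adds a prediction step with a motion model, as covered under SLAM.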
Theoretical-Practical Classes: exposition and discussion of the contents of the program, and the resolution of exercises.
Practical assignments: development of programming projects related to perception techniques/algorithms.
Two projects/assignments will be developed during the semester; these projects must be developed both during classes and at home.
There is also a final project, in which each group will submit a short report in article format and present the project orally.
The final exam is worth 25% of the final classification.
Designation | Weight (%) |
---|---|
Exam | 25.00 |
Laboratory work | 75.00 |
Total: | 100.00 |
Designation | Time (hours) |
---|---|
Autonomous study | 40.00 |
Written work | 40.00 |
Laboratory work | 43.00 |
Total: | 123.00 |
Students who did not complete the practical component during the academic period will have to carry out equivalent work.