PUT Mobile Robots Laboratory
Joined 17 Dec 2015
Robotic Christmas Demo - Mobile Robots Lab 2024
Opus Project:
opus-nemo.put.poznan.pl/
Sonata Project:
sonata.put.poznan.pl/
Robot 4.0 project:
robot40.put.poznan.pl
Our laboratory:
lrm.put.poznan.pl/
Music from pixabay.com:
pixabay.com/pl/service/license-summary/
Institute of Robotics and Machine Intelligence (IRIM)
Poznan University of Technology (PUT)
2024
Views: 1,340
Videos
Universal Wearable Haptic Glove for Force Measurement During Object Manipulation
50 views • 9 months ago
Most robotic hands are equipped with relatively simple force sensors to detect contact with manipulated objects. Very often these sensors are attached only to the fingertips and are not sufficient to measure the interaction forces for all object shapes and grasp types. The goal of this research was to develop a sensory glove that can be used on existing robot hands to enhance the sen...
MirrorNet: Hallucinating 2.5D Depth Images for Efficient 3D Scene Reconstruction
97 views • 1 year ago
Robots face challenges in perceiving new scenes, particularly when registering objects from a single perspective, resulting in incomplete shape information about objects. Partial object models negatively influence the performance of grasping methods. To address this, robots can scan the scene from various perspectives or employ methods to directly fill in unknown regions. This research reexamin...
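A rough illustration of the depth-hallucination idea (this is a minimal sketch, not the MirrorNet architecture): an encoder-decoder that takes the registered 2.5D depth image of the visible surface and regresses a depth image of the unobserved back surface. The layer sizes, loss, and training signal are assumptions.
```python
# Minimal sketch (not MirrorNet): encoder-decoder depth "hallucination".
import torch
import torch.nn as nn

class DepthCompletionNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: compress the single-channel front-surface depth image.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: upsample back to the input resolution.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1),
        )

    def forward(self, front_depth):
        # front_depth: (B, 1, H, W) depth of the visible surface, in metres
        return self.decoder(self.encoder(front_depth))

# Training could regress the prediction against back-surface depth rendered
# from complete object models, e.g.:
# loss = nn.functional.l1_loss(net(front_depth), back_depth_gt)
```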
Direct Object Reconstruction on RGB-D Images in Cluttered Environment
43 views • 1 year ago
Robots have limited perception capabilities when observing the scene from a single viewpoint. Some objects on the scene might be partially occluded and their 3D shape is not fully available to the robot. Existing methods obtain object models through a series of observations using RGB-D sensors or the robot is trained to operate in the presence of occlusions. In this paper, we directly address o...
2021 Autonomous cars at WARiE
60 views • 1 year ago
creef.put.poznan.pl/artykul/samochody-autonomiczne-na-warie
Franka Emika Panda FR3 with a custom-made 3D-printed gripper.
201 views • 1 year ago
Franka Emika Panda FR3 with a custom-made 3D-printed gripper in the new Robotic Manipulation Laboratory (related to the NCN ProRoc project).
The Mobot robot using Edge Insights for Autonomous Mobile Robots (EI for AMR) from Intel on ROS2
373 views • 2 years ago
Our Mobot robot using Edge Insights for Autonomous Mobile Robots (EI for AMR) from Intel on ROS2: www.intel.com/content/www/us/en/developer/topic-technology/edge-5g/edge-solutions/autonomous-mobile-robots/overview.html
CNN-based Joint State Estimation During Robotic Interaction with Articulated Objects
54 views • 2 years ago
In this paper, we investigate the problem of state estimation of rotational articulated objects during robotic interaction. We estimate the position of a joint axis and the current rotation of an object from a pair of RGB-D images registered by the depth camera mounted on the robot. However, the camera mounted on the robot has limited view due to occlusions of the robot's arm. Moreover, some co...
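A minimal sketch of this kind of regression network, not the paper's model: the two registered RGB-D frames are stacked into an 8-channel input, and the network regresses a point on the joint axis, the unit axis direction, and the rotation angle. The layer sizes and output parameterization are assumptions.
```python
# Minimal sketch (not the paper's network): joint state regression from
# a pair of RGB-D frames registered before and after the interaction.
import torch
import torch.nn as nn

class JointStateNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(8, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Outputs: axis point (3), axis direction (3), rotation angle (1).
        self.head = nn.Linear(128, 7)

    def forward(self, rgbd_pair):
        # rgbd_pair: (B, 8, H, W) = two registered RGB-D frames stacked
        x = self.features(rgbd_pair).flatten(1)
        out = self.head(x)
        point, direction, angle = out[:, :3], out[:, 3:6], out[:, 6:]
        # Normalize so the direction is a unit vector along the joint axis.
        direction = nn.functional.normalize(direction, dim=1)
        return point, direction, angle
```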
On the descriptive power of LiDAR intensity images for segment-based loop closing in 3-D SLAM
57 views • 2 years ago
A video from the paper presented at IEEE/RSJ IROS 2021 (Prague, on-line)
Informed Guided Rapidly-exploring Random Trees*-Connect for Path Planning of Walking Robots
74 views • 2 years ago
In this paper, we deal with the problem of full-body path planning for walking robots. The state of a walking robot is defined in a multi-dimensional space. Path planning requires defining the path of the feet and of the robot's body. Moreover, the planner should check multiple constraints like static stability, self-collisions, collisions with the terrain, and the legs' workspace. As a result, checki...
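For illustration only, a heavily simplified, generic RRT-Connect loop (not the paper's informed/guided planner): every candidate state is accepted only if a user-supplied validity check passes, which in the walking-robot case would bundle the static-stability, self-collision, terrain-collision, and leg-workspace tests. The sampler, state representation, and step size are assumptions, and path extraction is omitted.
```python
# Minimal generic RRT-Connect sketch with a pluggable constraint check.
import math

def rrt_connect(start, goal, sample, is_valid, step=0.1, iters=5000):
    """Grow two trees (from start and goal) and report whether they met.
    `sample()` returns a random state, `is_valid(q)` checks all constraints.
    Path reconstruction from the parent maps is omitted for brevity."""
    tree_a, tree_b = {tuple(start): None}, {tuple(goal): None}

    def nearest(tree, q):
        return min(tree, key=lambda n: math.dist(n, q))

    def extend(tree, q_target):
        q_near = nearest(tree, q_target)
        d = math.dist(q_near, q_target)
        t = min(1.0, step / d) if d > 0 else 1.0
        q_new = tuple(a + t * (b - a) for a, b in zip(q_near, q_target))
        if is_valid(q_new):          # stability/collision/workspace checks
            tree[q_new] = q_near
            return q_new
        return None

    for _ in range(iters):
        q_new = extend(tree_a, tuple(sample()))
        if q_new is not None:
            q_conn = extend(tree_b, q_new)
            if q_conn is not None and math.dist(q_conn, q_new) < 1e-9:
                return True          # the two trees are connected
        tree_a, tree_b = tree_b, tree_a   # alternate which tree grows first
    return False
```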
What's on the Other Side? A Single-View 3D Scene Reconstruction
67 views • 2 years ago
Robots have limited perception capabilities when observing a new scene. When the objects on the scene are registered from a single perspective, only partial information about the shape of the objects is registered. Incomplete models of objects influence the performance of grasping methods. In this case, the robot should scan the scene from other perspectives to collect information about the obj...
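The multi-view alternative mentioned above can be sketched in a few lines: depth images registered from several known camera poses are back-projected and merged into a single world-frame point cloud, so surfaces hidden from one viewpoint are covered by another. The pinhole intrinsics and the 4x4 camera-to-world poses are assumed inputs; this is not the paper's single-view method.
```python
# Minimal sketch: merge depth images from several known viewpoints.
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy, cam_to_world):
    """Back-project an (H, W) depth image into world-frame 3-D points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth, np.ones_like(depth)], axis=-1).reshape(-1, 4)
    pts = pts[depth.reshape(-1) > 0]          # drop invalid (zero) depths
    return (cam_to_world @ pts.T).T[:, :3]    # transform to the world frame

def merge_views(depth_maps, poses, fx, fy, cx, cy):
    """Concatenate the clouds from all viewpoints into one scene cloud."""
    return np.concatenate([depth_to_points(d, fx, fy, cx, cy, T)
                           for d, T in zip(depth_maps, poses)], axis=0)
```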
Learning from Experience for Rapid Generation of Local Car Maneuvers
116 views • 3 years ago
Supplemental video for the paper "Learning from Experience for Rapid Generation of Local Car Maneuvers" submitted to Engineering Applications of Artificial Intelligence. Abstract: Being able to rapidly respond to the changing scenes and traffic situations by generating feasible local paths is of pivotal importance for car autonomy. We propose to train a deep neural network (DNN) to plan feasibl...
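A minimal sketch of the general idea with assumed inputs and sizes (not the paper's DNN): a network maps a local occupancy grid and the relative goal pose to a fixed number of path waypoints, so a candidate maneuver comes from a single forward pass rather than an iterative search.
```python
# Minimal sketch (assumed architecture): occupancy grid + goal -> waypoints.
import torch
import torch.nn as nn

class ManeuverNet(nn.Module):
    def __init__(self, n_waypoints=16):
        super().__init__()
        self.n_waypoints = n_waypoints
        self.grid_encoder = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(32 + 3, 128), nn.ReLU(),
            nn.Linear(128, n_waypoints * 3),   # (x, y, yaw) per waypoint
        )

    def forward(self, occupancy_grid, goal_pose):
        # occupancy_grid: (B, 1, H, W), goal_pose: (B, 3) as (x, y, yaw)
        feat = self.grid_encoder(occupancy_grid)
        path = self.head(torch.cat([feat, goal_pose], dim=1))
        return path.view(-1, self.n_waypoints, 3)
```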
Planar Features for Accurate Laser-Based 3-D SLAM in Urban Environments
81 views • 3 years ago
• SLAM for autonomous vehicles: day and night, under different weather conditions. • 3-D laser scanner is the sensor of choice for automotive SLAM. • Application: ADAS for city buses that requires accurate localization in mixed outdoor/indoor environments. • Plane-LOAM includes a new map representation and improved data association. • Plane-LOAM estimates the full 6 d.o.f. sensor pose and does not...
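To illustrate the planar-feature idea (this is not the Plane-LOAM code), the sketch below fits a plane to a cluster of scan points by SVD and evaluates the point-to-plane residuals that a 6-d.o.f. pose optimizer would minimize over all matched plane segments.
```python
# Minimal sketch: plane fitting and point-to-plane residuals.
import numpy as np

def fit_plane(points):
    """Least-squares plane through an (N, 3) cluster: returns (n, d) with a
    unit normal n and offset d such that n . p + d is approximately 0."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                     # direction of smallest variance
    return normal, -normal @ centroid

def point_to_plane_residuals(points, normal, d):
    """Signed point-to-plane distances; stacked over all matched plane
    segments these form the error minimized for the 6-d.o.f. sensor pose."""
    return points @ normal + d
```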
Practical Aspects of Detection and Grasping Objects by a Mobile Manipulating Robot
190 views • 3 years ago
Practical Aspects of Detection and Grasping Objects by a Mobile Manipulating Robot
Convolutional Neural Network-based Local Obstacle Avoidance for a Mobile Robot
324 views • 4 years ago
Convolutional Neural Network-based Local Obstacle Avoidance for a Mobile Robot
Autonomous vehicles: bridging the gap between uncertain perception and provably safe actions
86 views • 4 years ago
Autonomous vehicles: bridging the gap between uncertain perception and provably safe actions
Rapid path planning for autonomous car - navigation in the traffic jam
105 views • 4 years ago
Rapid path planning for autonomous car - navigation in the traffic jam
Rapid path planning for autonomous car - collision avoidance
161 views • 4 years ago
Rapid path planning for autonomous car - collision avoidance
Robot 4.0 project: autonomous driving between workspaces
133 views • 4 years ago
Robot 4.0 project: autonomous driving between workspaces
Robot 4.0 - localization and building Octomap
811 views • 4 years ago
Robot 4.0 - localization and building Octomap
Robot 4.0 - localization and 3D perception
115 views • 4 years ago
Robot 4.0 - localization and 3D perception
CNN-based Foothold Selection for Mechanically Adaptive Soft Foot
85 views • 4 years ago
CNN-based Foothold Selection for Mechanically Adaptive Soft Foot
Building a 3D thermal map in real time (PL)
118 views • 4 years ago
Building a 3D thermal map in real time (PL)
Spatiotemporal Calibration of Camera and 3D Laser Scanner
108 views • 4 years ago
Spatiotemporal Calibration of Camera and 3D Laser Scanner
Cross-modal transfer learning for segmentation of non-stationary objects using lidar intensity data
48 views • 4 years ago
Cross-modal transfer learning for segmentation of non-stationary objects using lidar intensity data
A Fast and Practical Method of Indoor Localization for Resource-Constrained Devices [...]
69 views • 4 years ago
A Fast and Practical Method of Indoor Localization for Resource-Constrained Devices [...]
Great video!
Too cute
can you publish your source??
good job, how about computation efficiency
Very sad that Galgo didn't receive any gift :(
So, what's in those boxes? :)