Ermano Arruda
ermano dot arruda at gmail dot com

I completed my PhD in Computer Science at the University of Birmingham, supervised by Prof. Jeremy Wyatt and Dr. Marek Kopicki. I worked in the Intelligent Robotics Lab, where my research focused on applying machine learning to robotic manipulation.

I have worked on active vision for robotic grasping in the PaCMan project.

Before my postgraduate studies, I received a Bachelor's degree in Computer Science from the Center for Informatics, Federal University of Pernambuco. I was also part of VoxarLabs, where I worked on computer vision, augmented reality, and virtual reality research and applications.

Google Scholar  /  GitHub  /  LinkedIn

Research

I am interested in how closing the loop between perception and action can improve performance in robotic manipulation tasks. My research focuses on robot perception, machine learning, and control.


Uncertainty averse pushing with model predictive path integral control
Ermano Arruda*, Michael J Mathew*, Marek Kopicki, Michael Mistry, Morteza Azad, Jeremy Wyatt
(* equal contribution)
International Conference on Humanoid Robots (Humanoids), 2017
video / bibtex

This work introduces a planner that uses an uncertain, learned forward (dynamical) model to plan robust push manipulation. The forward model of the system is learned by poking the object in random directions and is then used by a model predictive path integral controller to push the box to a required goal location. By using path integral control, the proposed approach avoids regions of high predictive uncertainty in the forward model. Pushing tasks are thus completed robustly with respect to the estimated uncertainty in the forward model, and without the need for differentiable cost functions.
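To give a rough flavour of the idea, here is a minimal MPPI-style sketch with an added predictive-uncertainty penalty. The forward-model interface, action dimensionality, and cost weights are all illustrative assumptions, not the paper's actual setup.

```python
# Minimal MPPI sketch for uncertainty-averse pushing (illustrative only).
import numpy as np

def mppi_push(x0, goal, forward_model, horizon=10, n_samples=100,
              lam=1.0, sigma=0.1, beta=5.0):
    """Sample perturbed action sequences, roll them out through the learnt
    model, and return the exponentially weighted average first action.

    forward_model(x, u) -> (x_next_mean, x_next_std): a learnt, uncertain
    one-step predictor (hypothetical interface)."""
    nu = 2  # planar push action: direction in x and y (assumed)
    U = np.zeros((horizon, nu))                    # nominal action sequence
    eps = sigma * np.random.randn(n_samples, horizon, nu)
    costs = np.zeros(n_samples)

    for k in range(n_samples):
        x = x0.copy()
        for t in range(horizon):
            u = U[t] + eps[k, t]
            x, std = forward_model(x, u)
            # Penalise distance to goal plus predictive uncertainty, so
            # rollouts through uncertain regions are down-weighted.
            costs[k] += np.sum((x - goal) ** 2) + beta * np.sum(std ** 2)

    # Path integral update: soft-min weighting of the sampled perturbations.
    w = np.exp(-(costs - costs.min()) / lam)
    w /= w.sum()
    U += np.einsum('k,ktu->tu', w, eps)
    return U[0]  # execute the first action, then replan (MPC style)
```

In practice such a controller runs in receding-horizon fashion: only the first action is executed, the state is re-observed, and the optimisation is repeated.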


Active vision for dexterous grasping of novel objects
Ermano Arruda, Jeremy Wyatt, Marek Kopicki
International Conference on Intelligent Robots and Systems (IROS), 2016
video / bibtex

We tackled the problem of improving robot grasp performance using active vision. We sought to increase grasp success via two view-selection heuristics: one that lets the robot explore good-quality grasp contact points, and another that lets it inspect its workspace to ensure candidate grasp trajectories do not collide with unseen parts of the object to be grasped. Our results showed that this approach yielded a higher grasp success rate than a random view-selection strategy, while using fewer camera views for grasp planning.
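To make the flavour of such heuristics concrete, here is a hypothetical greedy next-best-view sketch that scores candidate views by how much currently unobserved space near predicted grasp contacts they would reveal. The interface (`visible_fn`, voxel lists) and the weighting are illustrative assumptions, not the paper's exact heuristics.

```python
# Hypothetical greedy next-best-view selection sketch (not the paper's code).
import numpy as np

def next_best_view(candidate_views, unseen_voxels, contact_points,
                   visible_fn, radius=0.05):
    """candidate_views: list of camera poses.
    unseen_voxels: (N, 3) centres of voxels not yet observed.
    contact_points: (M, 3) predicted grasp contact locations.
    visible_fn(view, points) -> boolean visibility mask over points
    (assumed to be provided by the perception pipeline)."""
    # Weight unseen voxels by proximity to candidate grasp contacts, so
    # views that resolve uncertainty near the grasp are preferred.
    d = np.linalg.norm(unseen_voxels[:, None, :] -
                       contact_points[None, :, :], axis=-1)
    weights = (d.min(axis=1) < radius).astype(float) + 0.1

    best_view, best_score = None, -np.inf
    for view in candidate_views:
        score = weights[visible_fn(view, unseen_voxels)].sum()
        if score > best_score:
            best_view, best_score = view, score
    return best_view
```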

Workshops

Generative grasp synthesis from demonstration using parametric mixtures
Ermano Arruda, Claudio Zito, Mohan Sridharan, Marek Kopicki, Jeremy L. Wyatt
Task-Informed Grasping (TIG-II): From Perception to Physical Interaction, Robotics: Science and Systems (RSS), 2019.
video / bibtex


We present a parametric formulation for learning generative models for grasp synthesis from demonstration. We cast new light on this family of approaches, proposing a parametric formulation that is computationally faster than related work and achieves a higher grasp success rate in simulated experiments. The proposed implementation can also incorporate arbitrary constraints for grasp ranking, including task-specific constraints. We report results and briefly discuss the merits of the proposed method.
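A minimal sketch of the parametric-mixture idea, assuming grasps are encoded as 6-D pose vectors and using scikit-learn's Gaussian mixture as a stand-in for the paper's model: fit a mixture to demonstrated grasp parameters, sample candidates generatively, and rank them by likelihood subject to feasibility constraints.

```python
# Sketch: generative grasp synthesis via a parametric mixture (illustrative).
import numpy as np
from sklearn.mixture import GaussianMixture

# Demonstrated grasps, e.g. rows of (x, y, z, roll, pitch, yaw).
demos = np.random.randn(50, 6)  # placeholder for real demonstration data

gmm = GaussianMixture(n_components=3, covariance_type='full').fit(demos)

# Generative grasp synthesis: sample candidates from the learnt mixture.
candidates, _ = gmm.sample(200)

# Rank by model likelihood, masking out candidates that violate
# task-specific constraints (hypothetical feasibility check).
feasible = np.ones(len(candidates), dtype=bool)  # replace with real checks
scores = gmm.score_samples(candidates)
best = candidates[feasible][np.argmax(scores[feasible])]
```

Note that treating rotations as Euler angles in a Gaussian mixture is a simplification made here for brevity; a real system would need a parameterisation that respects the geometry of orientations.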

Course Projects

A study on SLAM techniques with applications on robot perception
Ermano Arruda, Veronica Teichrieb, Joao Paulo Lima, 2015
video


In my final-year project I implemented a Graph SLAM (Simultaneous Localisation and Mapping) system for mobile robots. The system was quantitatively evaluated on the TUM RGB-D SLAM benchmark dataset, and was able to successfully map an entire flat, using FAB-MAP for loop-closure detection.
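The core of Graph SLAM is pose-graph optimisation: poses are variables, odometry and loop-closure measurements are constraints, and a nonlinear least-squares solver finds the trajectory that best satisfies all of them. Below is a toy 1-D pose-graph example (with made-up numbers) to illustrate the structure; a real system like the one in the project works analogously over full 6-DoF poses.

```python
# Toy 1-D pose-graph optimisation sketch (illustrative numbers only).
import numpy as np
from scipy.optimize import least_squares

# Edges: (i, j, measured displacement from pose i to pose j, weight).
edges = [(0, 1, 1.0, 1.0), (1, 2, 1.1, 1.0), (2, 3, 0.9, 1.0),
         (3, 0, -2.9, 2.0)]  # last edge is a loop closure back to pose 0

def residuals(x):
    # Fix the first pose at 0 to anchor the graph (gauge freedom).
    poses = np.concatenate(([0.0], x))
    return [w * (poses[j] - poses[i] - z) for i, j, z, w in edges]

x0 = np.zeros(3)  # initial guess for poses 1..3
sol = least_squares(residuals, x0)
print(np.concatenate(([0.0], sol.x)))  # optimised trajectory
```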

Awards

  • First place at the ISMAR Off-site Tracking Competition, Fukuoka, Japan, 2015. Developed a monocular visual odometry system with additional sparse bundle adjustment for camera trajectory optimisation (a sketch of the general approach follows below).
  • video1 / video2 / code
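For context, a generic two-view monocular visual odometry step looks roughly like the sketch below, using OpenCV feature matching and essential-matrix decomposition. This is an illustrative reconstruction, not the competition system's actual code; sparse bundle adjustment would then refine the accumulated trajectory.

```python
# Hypothetical two-view monocular visual odometry step (illustrative).
import cv2
import numpy as np

def vo_step(img1, img2, K):
    """Estimate relative camera motion between two frames.
    K: 3x3 camera intrinsics matrix."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(img1, None)
    k2, d2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    p1 = np.float32([k1[m.queryIdx].pt for m in matches])
    p2 = np.float32([k2[m.trainIdx].pt for m in matches])
    # Essential matrix with RANSAC, then recover relative rotation and
    # (unit-scale) translation; monocular VO leaves scale ambiguous.
    E, mask = cv2.findEssentialMat(p1, p2, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, p1, p2, K, mask=mask)
    return R, t
```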


  • Winning team CESAR-VoxarLabs at LARC/CBR (Latin American and Brazilian Robotics Competition), RoboCup@Home, 2014. Worked on the object tracking and detection system.
  • meet i-zak

Teaching

06-13520: Intelligent Robotics
Teaching Assistant (TA)

06-28912: Graphics
Teaching Assistant (TA)

06-27821: Software Workshop 1
Teaching Assistant (TA)

