Today FROG is the centre of attention at the Royal Alcázar in Seville!
All cameras and microphones were trained on FROG…
The event started with short presentations by Vicente C. Guzmán Fluja, Rector of the Universidad Pablo de Olavide in Seville; FROG’s own Luis Merino Cabañas, also from UPO; and Vanessa Evers, coordinator of this EU project, representing the University of Amsterdam.
FROG then took everyone on a short tour of the Alcázar and demonstrated his capabilities. Journalists from all media were in the party, both local and national!
Back in the on-site lab, Luis Merino and Fernando Caballero were interviewed for TV and radio.
This afternoon there will be a workshop for a small group of official tour guides from the Royal Alcázar and one from the Lisbon Zoo.
Maja Pantic leads the Intelligent Behaviour Understanding Group (iBUG) at Imperial College London.
Ioannis Marras is a researcher from the Intelligent Behaviour Understanding Group (iBUG) at Imperial College, London. His research interests include image/video processing, computer vision and pattern recognition. During his work, he has developed multiple computer vision techniques for 2D/3D face tracking and 2D/3D face recognition in the wild.
Jie Shen is a researcher from the Intelligent Behaviour Understanding Group (iBUG) at Imperial College, London. His research interests include software engineering, human-robot interaction, and affect-sensitive human-computer interfaces. During his work, he has developed the HCI^2 Framework (http://ibug.doc.ic.ac.uk/resources/hci2-framework/), which is a software integration framework for human-computer interaction systems, currently being used in multiple research projects.
Jie Shen defended his PhD thesis, titled ‘Realising Affect-Sensitive Multimodal Human-Computer Interface: Hardware and Software Infrastructure’, in May 2014.
Unfortunately, the rain in Spain doesn’t always stay on the plain. Today we had plenty of it at the Royal Alcázar and here we have a FROG that doesn’t like getting too wet.
In spite of all of the visitors the staff keep the halls and patios amazingly clean. There is a lot of loose yellow sand in the gardens and, of course, this gets washed and walked everywhere.
Fernando Caballero Benitez and Luis Merino have been working together since 2003. They work on robot localisation and navigation, respectively. A robot first needs to know where it is (localisation) in order to carry out its mission (navigation). Fernando’s expertise is a bit more towards aerial robots, Luis’ towards ground robots.
PhD students who have these two as their supervisors are very lucky. It may be hard work, but that comes with solid training in the science that underpins what they do, and with sincere appreciation for what the PhDs achieve. It’s a pleasure to see this close supervision in action – even if you don’t understand Spanish.
Fernando likes a good barbecue and he designed our polos.
One day Fernando will get around to making a serious, English language website and you will then find the link here. Till then we’ll have to make do with his publications page.
You will find some of UPO’s output for FROG and other research projects here: Luis, datasets and publications.
Luis and Fernando lead and supervise UPO’s team of young researchers…
Ignacio Pérez Hurtado de Mendoza is a computer science postdoc interested in the application of machine learning techniques to social robotics. In the FROG project he supports the development of software modules related to the navigation of the robot and assists with the deployment of experiments.
Noé Pérez Higueras is in charge of the safe and robust navigation of the FROG robot in (indoor and outdoor) environments with people. Noé is trying to add social capabilities to navigation algorithms so that the robot respects human social conventions and guarantees the comfort of surrounding persons.
Noé’s PhD is directly related to his work in the FROG project. He is studying different robot navigation algorithms and extending them with social skills. To do that, he uses machine learning techniques to learn from real people how they navigate around each other in crowded scenarios.
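Noé’s actual learned models aren’t described here, but one common way to encode such social conventions (purely illustrative, not necessarily the technique used in FROG) is to add a proxemics-style penalty around each detected person to the planner’s costmap, so that paths passing too close to visitors become expensive:

```python
# Illustrative sketch of a "social" costmap layer: a Gaussian personal-space
# penalty around each detected person. All parameters are made-up examples.
import numpy as np

def social_costmap(shape, people, resolution=0.05, sigma=0.8, peak=100.0):
    """shape: (rows, cols) of the grid; people: list of (x, y) positions in metres."""
    rows, cols = shape
    ys, xs = np.mgrid[0:rows, 0:cols]
    xs = xs * resolution  # convert grid cells to metres
    ys = ys * resolution
    cost = np.zeros(shape)
    for px, py in people:
        d2 = (xs - px) ** 2 + (ys - py) ** 2
        # Penalty peaks at the person's position and decays with distance.
        cost = np.maximum(cost, peak * np.exp(-d2 / (2.0 * sigma ** 2)))
    return cost

# Example: a 10 m x 10 m map at 5 cm resolution with two visitors in it.
costs = social_costmap((200, 200), [(3.0, 4.5), (6.2, 2.0)])
```

A planner that sums a layer like this with its usual obstacle costs will naturally prefer routes that keep a comfortable distance from people.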
In 2014 Noé spent 3 weeks in the Netherlands, mapping the Gallery at the University of Twente for the opening ceremony with the Dutch king.
Noé fitted in well with the UT PhDs and will continue to work with some of them in the TERESA project. When he makes a website we will link to it here.
In the FROG project, Javier Pérez-Lara is working on improving localisation algorithms, based not only on laser readings but also on appearance matching, to recover from convergence to a wrong location or even from situations where the robot is completely lost.
Rafael Ramón Vigo’s skills across electronics, software and hardware are all put to good use in the FROG project. Rafael helps with the general set-up of the robotic platform and assists with the deployment of experiments.
Rafael’s PhD thesis work is on inferring from data, and its statistics, how humans navigate among each other, with the idea of transferring those insights to the robot’s navigation stack. The approach is based on machine learning algorithms.
Rafael also grows delicious mangoes. Recently his family planted nearly 2000 young trees that will come into production in 4 or 5 years’ time.
How about this for an emblem? It looks great on the FROG polos.
The emblem printed on the polos is actually a reversed version of the docking target. It is a set of ArUco markers that just happen to say FROG – though it probably took Fernando Caballero quite some time to find them. UPO uses this ArUco marker to align the FROG to its charger during the docking procedure.
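As a sketch of the idea (not the actual UPO docking code), detecting such a marker and estimating where the charger target sits relative to the camera can be done with OpenCV’s aruco module. The calibration values, marker size and dictionary below are invented, and the pre-4.7 aruco API is assumed:

```python
# Minimal sketch: detect ArUco markers and estimate their pose relative to
# the camera, which tells the robot how it is offset from the docking target.
# Uses OpenCV's aruco contrib module (pre-4.7 API); all numbers are examples.
import cv2
import numpy as np

camera_matrix = np.array([[600.0, 0.0, 320.0],   # hypothetical calibration
                          [0.0, 600.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)
MARKER_SIZE_M = 0.10                             # assumed marker side length

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
params = cv2.aruco.DetectorParameters_create()

def docking_offsets(frame):
    """Return (marker_id, translation_xyz) for every marker seen in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary, parameters=params)
    if ids is None:
        return []
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_SIZE_M, camera_matrix, dist_coeffs)
    return list(zip(ids.flatten(), tvecs.reshape(-1, 3)))
```

From the translation vectors a docking controller can work out how far, and in which direction, the robot must move to line up with the charger.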
He’s always there to help:
He assembled baby-FROG and is a wizard with the game pad during experiments…
He works all hours…
And all too often he has to leave his wife and kids in Lisbon while he attends meetings… Thanks from the FROG consortium to Carlos’ family!
IDMind’s main output for FROG is a beautiful big green robot and the documentation to go with it.
Yesterday, while the FROG was powered down for hardware enhancements, Randy Klaassen and Jan Kolkmeier from the University of Twente added some features to their FROG Wizard-of-Oz web interface.
They designed and implemented this web interface not only to control the output of specific AR content but also to trigger chosen steps from the State Machine.
This interface is not used for the FROG tour when the robot is running autonomously, but it is a splendid tool for experiments and testing, as it can trigger specific states or abilities. It can also be used to keep the FROG in action in the case of really bad weather, as it can run just the indoor parts of the robot’s mission.
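The Twente code itself isn’t shown here, but the idea of such a wizard interface can be sketched with a tiny web server: the operator’s browser posts the name of a state, and the server forwards it to the robot. The endpoint, the state names and the publish() helper below are all invented for illustration:

```python
# Hypothetical sketch of a Wizard-of-Oz endpoint: an operator clicks a button
# in a web page, which POSTs a state name; the server relays it to the robot.
from flask import Flask, request, jsonify

app = Flask(__name__)

# Example state names; the real FROG state machine has its own.
ALLOWED_STATES = {"greet_visitors", "show_ar_content", "go_to_next_poi", "dock"}

def publish(topic, message):
    # Placeholder: a real system would publish to the robot's middleware
    # (for example a ROS topic or a message queue).
    print(f"[wizard] {topic}: {message}")

@app.route("/trigger", methods=["POST"])
def trigger_state():
    data = request.get_json(silent=True) or {}
    state = data.get("state", "")
    if state not in ALLOWED_STATES:
        return jsonify(error="unknown state"), 400
    publish("/frog/state_machine/trigger", state)
    return jsonify(ok=True, state=state)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

The wizard’s web page then only needs a row of buttons, one per state, each sending a small POST request to this endpoint.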
Earlier they had made a quick round of the Alcázar and applied some quick fixes to their code for “Ripples through Europe” and “The toilets are NOT over there!”