BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Date iCal//NONSGML kigkonsult.se iCalcreator 2.20.2//
METHOD:PUBLISH
X-WR-CALNAME;VALUE=TEXT:Eventi DIAG
BEGIN:VTIMEZONE
TZID:Europe/Paris
BEGIN:STANDARD
DTSTART:20131027T030000
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
END:STANDARD
BEGIN:DAYLIGHT
DTSTART:20140330T020000
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
END:DAYLIGHT
END:VTIMEZONE
BEGIN:VEVENT
UID:calendar.6745.field_data.0@www.open.diag.uniroma1.it
DTSTAMP:20260404T210814Z
CREATED:20131009T113900Z
DESCRIPTION:Humans use their hands in most of their everyday life activitie
 s. Thus\, the development of technical systems that track the 3D position\
 , orientation and full articulation of human hands from markerless visual 
 observations can be of fundamental importance in supporting a number of di
 verse applications. In this talk\, we provide an overview of our work on
  hand tracking. First\, we describe methods for vision-based detection and
  tracking of hands and fingers in 2D\, with emphasis on occlusions handlin
 g and illumination invariance. We also demonstrate hand posture recognitio
 n techniques and their use in HCI and HRI. Then\, we focus on a recently p
 roposed framework for exploiting markerless visual observations to track t
 he 3D position\, orientation and full articulation of a human hand that mo
 ves in isolation in front of an RGBD camera. We treat this as an optimizat
 ion problem that is effectively solved using a variant of Particle Swarm O
 ptimization (PSO). Next\, we show how the core of the tracking framework h
 as been employed to provide state-of-the-art solutions for problems of eve
 n higher dimensionality and complexity\, e.g.\, for tracking two strongly 
 interacting hands or for tracking the state of a complex scene where a han
 d interacts with several objects. Finally\, we demonstrate how the results
  of hand tracking have been used to recognize human actions and infer huma
 n intentions in the context of tabletop object manipulation scenarios.\n\n
 Antonis Argyros is an Associate Professor at the Computer Science Departm
 ent\, University of Crete and a researcher at the Institute of Computer Sc
 ience (ICS)\, Foundation for Research and Technology-Hellas (FORTH) in He
 raklion\, Crete\, Greece. He received a B.Sc. degree in Computer Science (
 1989) and a M.Sc. degree in Computer Science (1992)\, both from the Comput
 er Science Department\, University of Crete. In July 1996\, he completed h
 is PhD on visual motion analysis at the same Department. He has been a pos
 tdoctoral fellow at the Computational Vision and Active Perception Laborat
 ory (CVAP) at the Royal Institute of Technology in Stockholm\, Sweden. Sin
 ce 1999\, as a member of the Computational Vision and Robotics Laboratory 
 (CVRL) of FORTH-ICS\, he has been involved in many RTD projects in compute
 r vision\, image analysis and robotics. He is an area editor for the Compu
 ter Vision and Image Understanding Journal (CVIU)\, member of the Editoria
 l Board of the IET Image Processing Journal and one of the general chairs 
 of the 11th European Conference on Computer Vision (ECCV'2010\, Heraklion\
 , Crete). He is also a faculty member of the Brain and Mind interdisciplin
 ary graduate program and a member of the Strategy Task Group of the Europe
 an Consortium for Informatics and Mathematics (ERCIM). The research intere
 sts of Argyros fall in the areas of computer vision with emphasis on track
 ing\, human gesture and posture recognition\, 3D reconstruction and omnidi
 rectional vision. He is also interested in applications of computational v
 ision in the fields of robotics and smart environments.
DTSTART;TZID=Europe/Paris:20131111T150000
DTEND;TZID=Europe/Paris:20131111T150000
LAST-MODIFIED:20131110T154419Z
LOCATION:Aula Magna
SUMMARY:Tracking the motion of human hands - Antonis Argyros
URL;VALUE=URI:http://www.open.diag.uniroma1.it/node/6745
END:VEVENT
END:VCALENDAR
