The half-day workshop will focus on promoting open-access grasping and manipulation datasets and on identifying critical needs for new datasets and methodologies for utilizing the data. The scope of the datasets includes human motion datasets, instrumental activities of daily living (IADL) datasets, other activity datasets, object geometry and motion datasets, haptic interaction datasets, as well as any other datasets on human and robot grasping and manipulation. Such datasets are not only crucial for evaluating and comparing the performance of novel methods, but also extremely valuable for offline robot learning and training. There is a worldwide trend toward providing and using high-quality open-access grasping and manipulation datasets, and many datasets have recently been collected and shared by different groups for different research purposes. The workshop will bring together researchers from different domains who share an interest in grasping and manipulation datasets. Its objectives are to promote the use of open-access datasets, coordinate efforts and resources, avoid pitfalls in collecting high-quality datasets, clear up confusion in selecting suitable datasets, and identify new demands for datasets.

The workshop will consist of two oral sessions and an additional summary and panel discussion session. Half of the presentations in the oral sessions are invited and the other half are openly solicited and reviewed, to attract broader participation and facilitate vibrant discussion. Presentations will be concise, limited to 10 to 15 minutes, and speakers will be encouraged to include interactive demos.

In the summary and panel session, one of the organizers will present an overview of the datasets not covered by the speakers, followed by a panel discussion of open questions raised during the talks as well as broader questions such as "What is missing in the existing datasets?" and "Can we merge the existing datasets, and how?"

Tentative Agenda

8.00 – 8.10 Welcome by the organizers and introduction to the workshop
   
Part I. Grasping and Manipulation Datasets: The Objects’ Point of View
8.10 – 8.25 K. Goldberg (UC Berkeley): Dexterity Network (Dex-Net): A Cloud-Based Network of 3D Objects for Robust Grasp Planning
8.25 – 8.40 D. Fox (U. of Washington): Experiences with an RGB-D object dataset
8.40 – 8.55 Y. Bekiroglu (U. of Birmingham): Assessing grasp stability and object shape modeling based on visual and tactile data
8.55 – 9.10 A. Dollar (Yale): The YCB object benchmark for manipulation research
   
Part II. Grasping and Manipulation Datasets: The Humans’/Robots’ Point of View
9.10 – 9.25 M. Bianchi (U. of Pisa and Istituto Italiano di Tecnologia): An open-access repository to share data and tools for the study of human and robotic hands: the HandCorpus initiative
9.25 – 9.40 J. Bohg (Max Planck Institute): Leveraging Big Data for Grasp Planning
9.40 – 9.55 T. Asfour (Karlsruher Institut für Technologie): The KIT Whole-Body Human Motion Database
9.55 – 10.20 Poster Spotlight
   
10.20 – 10.40 Coffee Break
10.40 – 11.00 Poster Session
   
11.00 – 11.15 H. Marino / M. Gabiccini (University of Pisa): Datasets (and tools) from disconnected markers to organized behaviors: a path towards autonomous manipulation
11.15 – 11.30 Y. Sun (U. of South Florida): Interactive motion and wrench in instrument manipulation
   
11.30 – 12.30 Panel Discussion (Panelists: A. Rodriguez, S. Levine, A. Gupta, and all invited speakers)

Confirmed Speakers

Aaron Dollar, Yale U.
Title: The YCB object benchmark for manipulation research
Abstract:
I will discuss some of our joint efforts at Yale, CMU, and Berkeley towards developing a physical benchmark of objects and software tools for autonomous robotic manipulation.

Dieter Fox, University of Washington
Title: Experiences With an RGB-D Object Dataset
Abstract:
In this talk, I will present our effort in developing the first dataset for RGB-D based object recognition. I will also discuss lessons learned from this work and how this might apply to grasping and manipulation datasets.

Yasemin Bekiroglu, University of Birmingham
Title: Assessing grasp stability and object shape modeling based on visual and tactile data
Abstract:
I will talk about probabilistic approaches that use real sensory data, e.g., visual and tactile, to learn models (discriminative and generative) for assessing grasp success and for understanding object shape, which is important for grasp planning. I will also introduce a low-cost pipeline and database for reproducible manipulation research. Our approach combines the inexpensive generation of detailed 3D object models from monocular camera images with a state-of-the-art object tracking algorithm.
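
As a rough illustration of the discriminative side of such models (a sketch, not code from the talk), grasp stability can be framed as binary classification over tactile features; the feature layout and labels below are entirely hypothetical:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical data: one row of flattened tactile readings per grasp attempt,
# labeled 1 if the grasp survived a lift-and-shake test, else 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))               # 200 grasps x 64 tactile cells (made up)
y = (X[:, :8].mean(axis=1) > 0).astype(int)  # stand-in stability labels

# Discriminative stability model: estimates P(stable | tactile features).
clf = LogisticRegression(max_iter=1000)
print(cross_val_score(clf, X, y, cv=5).mean())  # held-out accuracy estimate
```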

Matteo Bianchi, IIT
Title: An open-access repository to share data and tools for the study of human and robotic hands: the HandCorpus initiative
Abstract:
The HandCorpus is an open-access repository for sharing data, tools, and analyses about human and robotic hands. The HandCorpus website is a user-friendly portal for researchers interested in sharing datasets and exchanging ideas about the most versatile end-effector known: the human hand. Over the last few years the HandCorpus community has grown, and it now (September 2015) comprises five European Commission (EC) projects and more than 20 research groups located across Europe and the United States of America. Finally, the HandCorpus website is cross-platform, cross-browser, and fully accessible from all kinds of mobile devices.

Jeannette Bohg, Max Planck Institute
Title: Leveraging Big Data for Grasp Planning
Abstract:
We have publicly released a new large-scale database of grasps applied to a large set of objects from numerous categories. The grasps are generated in simulation and annotated with the standard epsilon metric and a new physics-based metric. We use a descriptive and efficient representation of the local object shape at which each grasp is applied, and every grasp is annotated with the proposed metrics and this representation.
Given this data, we present a two-fold analysis:
(i) We use crowdsourcing to analyze how well the two metrics correlate with grasp success as predicted by humans. The results confirm that the proposed physics metric is a more consistent predictor of grasp success than the epsilon metric. They also support the hypothesis that human labels are not required for good ground-truth grasp data; instead, the physics metric can be computed on simulation data.
(ii) We apply big-data learning techniques (Convolutional Neural Networks and Random Forests) to show how they can leverage the large-scale database for improved prediction of grasp success.
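
For readers unfamiliar with the epsilon metric mentioned above, here is a minimal sketch (not from the talk or the released database) of the classic Ferrari-Canny computation: the radius of the largest origin-centered ball inside the convex hull of the contact wrenches. It assumes the 6-D contact wrenches have already been computed, e.g., from discretized friction cones:

```python
import numpy as np
from scipy.spatial import ConvexHull

def epsilon_metric(wrenches):
    """Ferrari-Canny epsilon quality: radius of the largest ball centered at
    the origin that fits inside the convex hull of the contact wrenches.

    wrenches: (n, 6) array of 6-D wrenches (force, torque) at the contacts.
    Returns 0.0 if the grasp is not in force closure (origin outside hull).
    """
    hull = ConvexHull(wrenches)   # needs n >= 7 non-degenerate points in 6-D
    # Each row of hull.equations is [normal, offset] with a unit normal, and
    # normal . x + offset <= 0 holds for points inside the hull.
    offsets = hull.equations[:, -1]
    if np.any(offsets > 0):       # origin violates a facet: no force closure
        return 0.0
    return float(np.min(-offsets))  # distance to the nearest hull facet
```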

Yu Sun, University of South Florida
Title: Interactive motion and wrench in instrument manipulation
Abstract:


Tamim Asfour, Karlsruher Institut für Technologie
Title: The KIT Whole-Body Human Motion Database
Abstract:
We present a publicly released large-scale whole-body human motion database consisting of motion data of the observed human subjects as well as the objects with which they interact. We describe the procedures for the systematic recording of human motion data with associated complementary data such as video recordings and additional sensor measurements (force, IMU, …), as well as environmental elements and objects. The availability of accurate object trajectories together with the associated object mesh models makes the data especially useful for the analysis of manipulation, locomotion, and loco-manipulation tasks. We present procedures and techniques for motion capture, annotation, and organization in large-scale databases, as well as for the normalization of human motion to a unified representation based on a reference model of the human body. In addition, we provide methods and software tools for efficient search in the database as well as for the transfer of subject-specific motions to robots with different embodiments, and we discuss several applications of the database in our current research on whole-body grasping and loco-manipulation.
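
As a toy illustration of the normalization idea described above (purely illustrative, not the database's actual tooling), subject-specific motion can be crudely rescaled to a common reference body by the ratio of statures before any finer kinematic retargeting; the reference height is an assumed value:

```python
import numpy as np

REF_HEIGHT_M = 1.75  # assumed stature of the unified reference body model

def normalize_motion(marker_traj, subject_height_m, ref_height_m=REF_HEIGHT_M):
    """Crude first step of motion normalization: scale Cartesian marker
    trajectories by the stature ratio so that subjects of different sizes
    map onto a common reference model. Full retargeting would additionally
    solve for the reference model's joint angles.

    marker_traj: (frames, markers, 3) array of positions in meters.
    """
    return np.asarray(marker_traj) * (ref_height_m / subject_height_m)
```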

Marco Gabiccini/H. Marino, University of Pisa
Title: Datasets (and tools) from disconnected markers to organized behaviors: a path towards autonomous manipulation
Abstract:
In this talk, I will discuss the pros and cons of generating large grasping datasets in experimental settings or in simulated environments, possibly including the use of optimal control methods to replace the human in the loop.


Ken Goldberg, UC Berkeley
Title: Dexterity Network (Dex-Net): A Cloud-Based Network of 3D Objects for Robust Grasp Planning
Abstract:
Dexterity Network 1.0 (Dex-Net) is a data-driven approach to robust robot grasping and manipulation based on a new dataset that currently includes over 10,000 unique 3D object models and 2.5 million parallel-jaw grasps. Dex-Net includes a Multi-Armed Bandit algorithm with correlated rewards that uses prior grasps to estimate the probability of force closure under sampled uncertainty in object and gripper pose and friction. Dex-Net 1.0 uses Multi-View Convolutional Neural Networks (MV-CNNs), a new deep learning method for 3D object classification, as a similarity metric between objects. Dex-Net 1.0 runs on the Google Cloud Platform with up to 1,500 virtual cores in parallel, reducing runtime by three orders of magnitude. Experiments suggest that using prior data can significantly improve the quality and reduce the complexity of robust grasp planning.
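
To make the Multi-Armed Bandit idea above concrete, here is a minimal sketch of the simpler, uncorrelated Beta-Bernoulli Thompson-sampling variant (Dex-Net's correlated-rewards algorithm is more involved); `evaluate_grasp` is a hypothetical stand-in for a sampled force-closure check under pose and friction uncertainty:

```python
import numpy as np

def thompson_best_grasp(evaluate_grasp, n_grasps, budget, seed=0):
    """Beta-Bernoulli Thompson sampling over a fixed set of grasp candidates.

    evaluate_grasp(i) -> bool: hypothetical noisy trial of candidate i, e.g.,
    force closure under sampled object/gripper pose and friction.
    Returns the index of the candidate with the best posterior mean.
    """
    rng = np.random.default_rng(seed)
    wins = np.ones(n_grasps)    # Beta(1, 1) uniform priors over success rates
    losses = np.ones(n_grasps)
    for _ in range(budget):
        i = int(np.argmax(rng.beta(wins, losses)))  # draw from each posterior, pick max
        if evaluate_grasp(i):
            wins[i] += 1
        else:
            losses[i] += 1
    return int(np.argmax(wins / (wins + losses)))
```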

Accepted Posters


*Poster size is A0 in portrait or smaller due to the size of the poster boards.

Size	Width x Height (mm)	Width x Height (in)
A0	841 x 1189		33.1 x 46.8


Poster # 1: Autonomous grasping data collection and tactile signal variability in real-world grasping
Qian Wan, Robert D. Howe

Poster # 2: Physically-Consistent Hand Manipulation Dataset
Vikash Kumar, Emo Todorov

Poster # 3: Performance Evaluation of 4DoF Gripper Pose Estimation Method by using APC items
Yukiyasu Domae, Ryosuke Kawanishi

Poster # 4: Using the YCB Object and Model Set to benchmark the iCub grasping capabilities
Lorenzo Jamone, Alexandre Bernardino, and Jose Santos-Victor

Poster # 5: More than a Million Ways to Be Pushed -- A High-Fidelity Experimental Data Set of Planar Pushing
Peter K.T. Yu, Alberto Rodriguez, Maria Bauza Villalonga 

Poster # 6: Automotive General Assembly Part Datasets And their Environment
Jane Shi

Poster # 7: BiGS: BioTac Grasp Stability Dataset
Yevgen Chebotar, Karol Hausman, Zhe Su, Artem Molchanov, Oliver Kroemer, Gaurav Sukhatme, and Stefan Schaal

Poster # 8: Recording hand-surface usage in grasp demonstrations
Ravin de Souza, Jose Santos-Victor, and Aude Billard

Poster # 9: A dataset of thin-walled deformable objects for manipulation planning
Nicolas Alt, Jingyi Xu and Eckehard Steinbach

Poster # 10: CapriDB - Capture, Print, Innovate: A Low-Cost Pipeline and Database for Reproducible Manipulation Research
Florian T. Pokorny*, Yasemin Bekiroglu*, Karl Pauwels, Judith Bütepage, Clara Scherer, and Danica Kragic

Poster # 11: Datasets for tactile perception and manipulation
Benjamin Ward-Cherrier, Luke Cramphorn, and Nathan F. Lepora

Poster # 12: G3DB: A Database of Successful and Failed Grasps with RGB-D Images, Point Clouds, Mesh Models and Gripper Parameters
Ashley Kleinhans, Benjamin Rosman, Michael Michalik, Bryan Tripp, and Renaud Detry