NAIST Robotics Laboratory

List of Research Achievements

2017


Publications

Journal Papers

  • L. El Hafi, M. Ding, J. Takamatsu and T. Ogasawara: “STARE: Real-Time, Wearable, Simultaneous Gaze Tracking and Object Recognition from Eye Images”, SMPTE Motion Imaging Journal, vol. 126, no. 6, pp. 37-46, 2017.

  • A. Asker, S. F. M. Assal, M. Ding, J. Takamatsu, T. Ogasawara and A. M. Mohamed: “Modeling of natural sit-to-stand movement based on minimum jerk criterion for natural-like assistance and rehabilitation”, Advanced Robotics, vol. 31, no. 17, pp. 901-917, 2017.

  • F. von Drigalski, L. El Hafi, P. M. Uriguen Eljuri, G. A. Garcia Ricardez, J. Takamatsu and T. Ogasawara: “Vibration-Reducing End Effector for Automation of Drilling Tasks in Aircraft Manufacturing”, IEEE Robotics and Automation Letters, vol. 2, no. 4, pp. 1-6, 2017.

  • 野田哲男、長野陽、永谷達也、堂前幸康、長野鉄明、田中健一、小笠原司:“機械学習の枠組みに基づく能動型探索アルゴリズムのサーボパラメータ調整問題への適用性の検討”、計測自動制御学会論文集、Vol. 53, No. 3, pp. 1-12, 2017.

International Conferences

  • Y. Takahashi, N. Shirakura, K. Toyoshima, T. Amako, R. Isobe, J. Takamatsu and K. Yasumoto: “DeepRemote: A Smart Remote Controller for Intuitive Control through Home Appliances Recognition by Deep Learning”, Proceedings of the Tenth International Conference on Mobile Computing and Ubiquitous Networking (ICMU2017), Toyama, Japan, 2017.

  • G. A. G. Ricardez, Y. Osaki, M. Ding, J. Takamatsu and T. Ogasawara: “Estimating the Operation of Unknown Appliances for Service Robots using CNN and Ontology”, Proceedings of the Second International Conference on Robotic Computing (IRC 2018), pp. 181-182, CA, USA, 2018.

  • W. Yamazaki, M. Ding, J. Takamatsu and T. Ogasawara: “Hand Pose Estimation and Motion Recognition Using Egocentric RGB-D Video”, Proceedings of the 12th International Workshop on Robust Computer Vision (IWRCV 2018), p. 7, Ikoma, Japan, 2018.

  • G. A. G. Ricardez, F. von Drigalski, L. El Hafi, S. Okada, P.-C. Yang, W. Yamazaki, V. Hoerig, A. Delmotte, A. Yuguchi, M. Gall, C. Shiogama, K. Toyoshima, P. M. U. Eljuri, R. E. Zapata, M. Ding, J. Takamatsu, and T. Ogasawara: “Fusion of Learned and Manual Features for Robotic Picking in Warehouse Automation”, Proceedings of the 12th International Workshop on Robust Computer Vision (IWRCV 2018), p. 6, Ikoma, Japan, 2018.

  • Y. Wu, J. Qiu, J. Takamatsu and T. Ogasawara: “Temporal-Enhanced Convolutional Neural Networks for Person Re-identification”, Proceedings of the 12th International Workshop on Robust Computer Vision (IWRCV 2018), p. 2, Ikoma, Japan, 2018.

  • F. von Drigalski, M. Gall, S.-G. Cho, M. Ding, J. Takamatsu, T. Ogasawara and T. Asfour: “Textile Identification Using Fingertip Motion and 3D Force Sensors in an Open-Source Gripper”, Proceedings of the 2017 IEEE International Conference on Robotics and Biomimetics (ROBIO 2017), pp. 424-429, Macau SAR, China, 2017.

  • W. Yamazaki, M. Ding, J. Takamatsu and T. Ogasawara: “Hand Pose Estimation and Motion Recognition Using Egocentric RGB-D Video”, Proceedings of the 2017 IEEE International Conference on Robotics and Biomimetics (ROBIO 2017), pp. 147-152, Macau SAR, China, 2017.

  • A. Yuguchi, G. A. G. Ricardez, M. Ding, J. Takamatsu and T. Ogasawara: “Gaze Calibration for Human-Android Eye Contact Using a Single Camera”, Proceedings of the 2017 IEEE International Conference on Robotics and Biomimetics (ROBIO 2017), pp. 883-888, Macau SAR, China, 2017.

  • J. Takamatsu, Y. Osaki, G. A. G. Ricardez, M. Ding, and T. Ogasawara: “Conversion between Semantic and Metric Information for Operation of Home Appliances by Service Robots”, Proceedings of 2017 Japan-Korea Workshop on Robotics and Information Technology for Better Quality of Life, p. 24, 2017.

  • M. Ding, T. Suzuki and T. Ogasawara: “Estimation of Driver’s Posture using Pressure Distribution Sensors in Driving Simulator and On-Road Experiment”, Proceedings of the 2017 IEEE International Conference on Cyborg and Bionic Systems, pp. 215-220, Beijing, China, 2017.

  • S.-G. Cho, M. Yoshikawa, M. Ding, J. Takamatsu and T. Ogasawara: “Hand Motion Recognition Using a Distance Sensor Array”, Proceedings of the IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), pp. 1459-1464, 2017.

  • M. Ikawa, E. Ueda, A. Yuguchi, G. A. Garcia Ricardez, M. Ding, J. Takamatsu and T. Ogasawara: “Quantification of Elegant Motions for Receptionist Android Robot”, Proceedings of the 19th International Conference on Human-Computer Interaction (HCII 2017), LNCS vol. 10286, pp. 435-446, 2017.

  • R. Matsura, S. Tsuichihara, G. A. Garcia Ricardez, M. Ding, J. Takamatsu and T. Ogasawara: “Reviewing Eating Habits Using Video Summarization of Eating Scenes”, Proceedings of the 19th ARAHE Biennial International Congress, MPF01, 2017.

  • G. A. Garcia Ricardez, L. El Hafi, F. von Drigalski, R. Elizalde Zapata, C. Shiogama, K. Toyoshima, P. M. Uriguen Eljuri, M. Gall, A. Yuguchi, A. Delmotte, V. G. Hoerig, W. Yamazaki, S. Okada, Y. Kato, R. Futakuchi, K. Inoue, K. Asai, Y. Okazaki, M. Yamamoto, M. Ding, J. Takamatsu, and T. Ogasawara: “Climbing on Giant’s Shoulders: Newcomer’s Road into the Amazon Robotics Challenge 2017”, Proceedings of the 2017 IEEE/RAS Warehouse Picking Automation Workshop (WPAW 2017), Singapore, Singapore, May 2017.

  • F. von Drigalski, D. Yoshioka, W. Yamazaki, S.-G. Cho, M. Gall, P. M. Uriguen Eljuri, V. Hoerig, J. Beltran, M. Ding, J. Takamatsu and T. Ogasawara: “NAIST Openhand M2S: A Versatile Two-Finger Gripper Adapted for Pulling and Tucking Textile”, Proceedings of the 2017 First IEEE International Conference on Robotic Computing (IRC), pp. 117-122, 2017.

  • L. El Hafi, M. Ding, J. Takamatsu and T. Ogasawara: “Gaze Tracking and Object Recognition from Eye Images”, Proceedings of the 2017 First IEEE International Conference on Robotic Computing (IRC), pp. 310-315, Taichung, Taiwan, April 2017.

  • S. Kosugi, H. Takemura, T. Yonezawa, K. Nomura, Y. Tanaka, A. Ikeda and T. Ogasawara: “Three-dimensional Tarsal Bones Geometry in Four Foot Positions, Combined Ankle Dorsiflexion/plantarflexion and Subtalar Eversion/inversion, by The Motion Assistance Device with The Parallel Link Structure”, Proceedings of the ORS 2017 Annual Meeting, Orthopaedic Research Society, San Diego, March 19-22, 2017.

Workshops / Domestic Conferences

  • T. Kiyokawa, K. Tomochika, M. Ding, J. Takamatsu and T. Ogasawara: “Automatic Annotation of Training Data using Visual Markers for Object Detection in Automated Factories”, 第23回ロボティクスシンポジア, 5D4, 2018.

  • G. A. Garcia Ricardez, F. von Drigalski, L. El Hafi, M. Ding, J. Takamatsu and T. Ogasawara: “Lessons from the Airbus Shopfloor Challenge 2016 and the Amazon Robotics Challenge 2017”, Proceedings of the 18th SICE System Integration Division Annual Conference (SI2017), pp. 572-575, Sendai, Japan, 2017.

  • G. A. Garcia Ricardez, F. von Drigalski, L. El Hafi, S. Okada, P.-C. Yang, W. Yamazaki, V. Hoerig, A. Delmotte, A. Yuguchi, M. Gall, C. Shiogama, K. Toyoshima, P. M. Uriguen Eljuri, R. Elizalde Zapata, M. Ding, J. Takamatsu and T. Ogasawara: “Warehouse Picking Automation System with Learning- and Feature-based Object Recognition and Grasping Point Estimation”, Proceedings of the 18th SICE System Integration Division Annual Conference (SI2017), pp. 2249-2253, Sendai, Japan, 2017.

  • 高松淳:“生活支援ロボットを実現するための家政学的見地からみた技術的特異点”, 第8回横幹連合コンファレンス, pp. C-1-4, 2017.

  • 山崎 亘, 丁 明, 高松 淳, 小笠原 司:“一人称視点RGB-D映像を用いた手の形状および動作の推定”, 第35回日本ロボット学会学術講演会 (RSJ2017), 1J3-06, 2017.

  • 吉岡 大輝, 丁 明, 吉武 康栄, 高松 淳, 小笠原 司:“ロボットアームを用いた筋硬度計測システムの開発”, 第35回日本ロボット学会学術講演会 (RSJ2017), 1D3-03(1)-(4), 2017.

  • 湯口 彰重, 丁 明, 高松 淳, 小笠原 司:“アンドロイドと人間の視線に対する知覚の比較に基づく眼球運動のキャリブレーションの効果”, 第35回日本ロボット学会学術講演会 (RSJ2017), 2L2-04(1)-(4), 2017.

  • 趙崇貴, 清川拓哉, 友近圭汰, 吉川雅博, 小笠原司:“前腕形状計測に基づく手の動作認識によるロボットアームの操作”, ヒューマンインタフェース学会ヒューマンインタフェースシンポジウム2017, 7D2-2, 2017.

  • 松浦 亮太, 築地原 里樹, Garcia Ricardez Gustavo Alfonso, 丁 明, 高松 淳, 小笠原 司: “画像処理技術を使った食事作法の振り返り支援”, 日本家政学会第69回大会, P-068, 2017.

  • 趙崇貴, 吉川雅博, 丁明, 高松淳, 小笠原司: “距離センサアレイを用いた前腕形状計測に基づく手の関節角度の推定”, ロボティクス・メカトロニクス講演会2017 (ROBOMECH2017), 2P2-M07, 2017.

  • 松浦 亮太, 築地原 里樹, Garcia Ricardez Gustavo Alfonso, 丁 明, 高松 淳, 小笠原 司: “映像要約による食生活振り返り支援システム”, ロボティクス・メカトロニクス講演会2017 (ROBOMECH2017), 2P2-F10, 2017.

  • F. von Drigalski, 吉岡 大輝, Marcus Gall, Pedro Miguel Uriguen Eljuri, 山崎 亘, 趙 崇貴, Viktor Hoerig, Jessica G. Beltran Ullauri, 丁 明, 高松 淳, 小笠原 司: “A robotic system for automated bed-making using a gripper specialized for textile manipulation”, ロボティクス・メカトロニクス講演会2017 (ROBOMECH2017), 2P2-G10, 2017.

  • 築地原 里樹, 袴田 有哉, Garcia Ricardez Gustavo Alfonso, 高松 淳, 小笠原司: “胴体姿勢推定を用いた高速なヒューマノイドロボットの全身動作生成”, ロボティクス・メカトロニクス講演会2017 (ROBOMECH2017), 1P2-N07, 2017.

Technical Report

Reviews and Commentaries

  • M. Ikawa: “HCII2017参加報告書”, ヒューマンインタフェース学会誌, 第19巻, p. 248, 2017.

Books, Edited Volumes, and Translations

Awards

  • Y. Takahashi, N. Shirakura, K. Toyoshima, T. Amako, R. Isobe, J. Takamatsu and K. Yasumoto: “Best Student Paper Award” at the Tenth International Conference on Mobile Computing and Ubiquitous Networking (ICMU2017), Toyama, Japan, 2017.

  • Akishige Yuguchi, Gustavo Alfonso Garcia Ricardez, Ming Ding, Jun Takamatsu, and Tsukasa Ogasawara: “Finalist of Best Paper in Biomimetics Award” at the 2017 IEEE Int. Conf. on Robotics and Biomimetics (ROBIO 2017), Macau SAR, China, December 5-8, 2017.

Theses

Doctoral Theses

  • L. El Hafi: “STARE: Real-Time, Wearable, Simultaneous Gaze Tracking and Object Recognition from Eye Images”, PhD thesis, Nara Institute of Science and Technology (NAIST), 8916-5 Takayama, Ikoma, Nara 630-0192, Japan, 2017.
    Keywords: Eye model, Corneal image, Gaze tracking, Object recognition, Wearable device, Real time

  • F. von Drigalski: “Textile recognition and manipulation using tactile sensing based on active perception”, PhD thesis, Nara Institute of Science and Technology (NAIST), 8916-5 Takayama, Ikoma, Nara 630-0192, Japan, 2018.
    Keywords: Textile Manipulation, Tactile Recognition, Tactile Sensing, Robot Gripper, Service Robots

Master's Theses

  • C. Shiogama: “Development of shoes selection system to reduce running injuries risk based on lower limb motion estimation technique using wearable sensors”, Master’s thesis, Nara Institute of Science and Technology (NAIST), 8916-5 Takayama, Ikoma, Nara 630-0192, Japan, 2018.
    Keywords: Running, Wearable Sensors, Running Shoes, Running Injuries, Digital Human Technology, Motion Feature Extraction

  • D. Yoshioka: “Scooping Motion Generation by a Robot Arm Considering Container Shape”, Master’s thesis, Nara Institute of Science and Technology (NAIST), 8916-5 Takayama, Ikoma, Nara 630-0192, Japan, 2018.
    Keywords: Scooping motion, Robot arm, Meal assist robot, Cooking robot, Motion generation

  • K. Tomochika: “Automatic Annotation of Training Data using Visual Markers for Object Detection in Automated Factories”, Master’s thesis, Nara Institute of Science and Technology (NAIST), 8916-5 Takayama, Ikoma, Nara 630-0192, Japan, 2018.
    Keywords: Automatic annotation, Object detection, Pose estimation, Deep learning, Visual marker

  • K. Toyoshima: “Design and Evaluation of End Effector for Touch Care Robots”, Master’s thesis, Nara Institute of Science and Technology (NAIST), 8916-5 Takayama, Ikoma, Nara 630-0192, Japan, 2018.
    Keywords: touch-care robot, end effector, tactile, comfort, human-likeness

  • N. Shirakura: “Pushing Motion for an Unliftable Object Using a Multi-Copter”, Master’s thesis, Nara Institute of Science and Technology (NAIST), 8916-5 Takayama, Ikoma, Nara 630-0192, Japan, 2018.
    Keywords: Multi-copter, Pushing motion, Manipulation, Aerial manipulation, Physical interaction

  • P. M. Uriguen Eljuri: “Rearranging Tasks in Daily-life Environments with a Humanoid Robot”, Master’s thesis, Nara Institute of Science and Technology (NAIST), 8916-5 Takayama, Ikoma, Nara 630-0192, Japan, 2018.
    Keywords: Humanoid robot, rearranging task, semantic information, task planning, stack items

  • R. Kawakami: “Refinement of the Disparity Map by Using Semantic Segmentation”, Master’s thesis, Nara Institute of Science and Technology (NAIST), 8916-5 Takayama, Ikoma, Nara 630-0192, Japan, 2018.
    Keywords: Stereo vision, Disparity map, Semantic map, Convolutional neural network, KITTI dataset

  • T. Amako: “Reconstruction of 3D Hand Shape by Stereo Vision of RGB Camera”, Master’s thesis, Nara Institute of Science and Technology (NAIST), 8916-5 Takayama, Ikoma, Nara 630-0192, Japan, 2018.
    Keywords: Hand Shape, 3D Reconstruction, RGB Camera, Stereo Vision, Nelder-Mead Simplex Method

  • T. Kiyokawa: “Tactile-based Pouring Motion inspired by Human Skill”, Master’s thesis, Nara Institute of Science and Technology (NAIST), 8916-5 Takayama, Ikoma, Nara 630-0192, Japan, 2018.
    Keywords: Tactile-based manipulation, Pouring motion, Pouring skill, Mass estimation

  • T. Inoue: “Gazed Object Estimation for Obtaining Detailed Time-usage Data using a Mobile Robot”, Master’s thesis, Nara Institute of Science and Technology (NAIST), 8916-5 Takayama, Ikoma, Nara 630-0192, Japan, 2018.
    Keywords: Mobile robot, Lifelogging, Gazed target estimation, Changing viewpoint, Home economics

  • Y. Murase: “Human-Safe and Robot-Efficient Local Path Planner for Mobile Robot in Dynamic Environments”, Master’s thesis, Nara Institute of Science and Technology (NAIST), 8916-5 Takayama, Ikoma, Nara 630-0192, Japan, 2018.
    Keywords: collision avoidance, local path planning, dynamic navigation, cost function, automobile robots