Ph.D. dissertation

Designing user interface based on gaze estimation and gesture recognition

한상윤, 2020
Thesis details
Paper impact by topic for 'Designing user interface based on gaze estimation and gesture recognition'
Paper impact summary
Topics
  • Applied Physics
  • Gaze Estimation
  • Hand Gesture Recognition
  • Pupil Segmentation
  • User Interface
  • Neural Networks
  • Total papers on the same topic: 4,838
  • Total citations of this paper: 0
  • Average paper impact within the topic: 0.0%

References of 'Designing user interface based on gaze estimation and gesture recognition'

  • Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography
    vol. 24, no. 6, pp. 381–395 [1981]
  • A. A. Argyros and M. I. Lourakis, "Vision-based interpretation of hand gestures for remote control of a computer mouse," in European Conference on Computer Vision, Springer
    pp. 40–51 [2006]
  • You only look once: Unified, real-time object detection
    pp. 779–788 [2016]
  • Where is the driver looking: Analysis of head, eye and iris for robust gaze zone estimation
    pp. 988–994 [2014]
  • Vision-based hand gesture recognition system for a dynamic and complicated environment
    pp. 2891–2895 [2015]
  • Unwrapping the eye for visible-spectrum gaze tracking on wearable devices
    pp. 369–376 [2013]
  • Understanding the difficulty of training deep feedforward neural networks
    pp. 249–256 [2010]
  • U-Net: Convolutional Networks for Biomedical Image Segmentation
    pp. 234–241 [2015]
  • Towards detection of bus driver fatigue based on robust visual analysis of eye state
    vol. 18, no. 3, pp. 545–557 [2016]
  • The impact of advertising location and user task on the emergence of banner ad blindness: An eye-tracking study
    vol. 30, no. 3, pp. 206–219 [2014]
  • The form of the human pupil
    vol. 35, no. 14, pp. 2021–2036 [1995]
  • Study of polynomial mapping functions in video-oculography eye trackers
    vol. 19, no. 2, p. 10 [2012]
  • Starburst: A hybrid algorithm for video-based eye tracking combining feature-based and model-based approaches
    [2005]
  • Star shape prior in fully convolutional networks for skin lesion segmentation
    pp. 737–745 [2018]
  • SegNet: A deep convolutional encoder-decoder architecture for image segmentation
    vol. 39, no. 12, pp. 2481–2495 [2017]
  • Robust real-time face detection
    vol. 57, no. 2, pp. 137–154 [2004]
  • Real-time eye gaze tracking for gaming design and consumer electronics systems
    vol. 58, no. 2, pp. 347–355 [2012]
  • R-FCN: Object detection via region-based fully convolutional networks
    pp. 379–387 [2016]
  • PupilNet: Convolutional neural networks for robust pupil detection
    [2016]
  • Pupil center detection for infrared irradiation eye image using CNN
    pp. 100–105 [2017]
  • Pupil center detection based on the UNet for the user interaction in VR and AR environments
    pp. 958–959 [2019]
  • Pupil detection in the wild: An evaluation of the state of the art in mobile head-mounted eye tracking
    [2016]
  • Pupil and glint detection using wearable camera sensor and near-infrared LED array
    vol. 15, no. 12, pp. 30126–30141 [2015]
  • Pupil: An open source platform for pervasive eye tracking and mobile gaze-based interaction
    pp. 1151–1160 [2014]
  • Pedestrian tracking in surveillance video based on modified CNN
    vol. 77, no. 18, pp. 24041–24058 [2018]
  • OSCANN: Technical characterization of a novel gaze tracking analyzer
    vol. 18, no. 2, p. 522 [2018]
  • On generalizing driver gaze zone estimation using convolutional neural networks
    pp. 849–854 [2017]
  • Normalized cut loss for weakly-supervised CNN segmentation
    pp. 1818–1827 [2018]
  • Non-intrusive gaze tracking using artificial neural networks
    pp. 753–760 [1994]
  • MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
    [2017]
  • Learning deconvolution network for semantic segmentation
    pp. 1520–1528 [2015]
  • Iris center corneal reflection method for gaze tracking using visible light
    vol. 58, no. 2, pp. 411–419 [2010]
  • Introduction to PyTorch, in Deep Learning with Python
    pp. 195–208 [2017]
  • Internet advertising: Is anybody watching?
    vol. 17, no. 4, pp. 8–23 [2003]
  • Interactive graph cuts for optimal boundary & region segmentation of objects in N-D images
    vol. 1, pp. 105–112 [2001]
  • Human-machine interaction (HMI): A survey
    [2011]
  • Human-computer interaction using eye-gaze input
    vol. 19, no. 6, pp. 1527–1534 [1989]
  • Human skin detection using RGB, HSV and YCbCr color models
    [2017]
  • Hand gesture recognition with depth data, in Proceedings of the 4th ACM/IEEE International Workshop on Analysis and Retrieval of Tracked Events and Motion in Imagery Stream
    pp. 9–16 [2013]
  • Hand gesture recognition using Kinect
    pp. 196–199 [2012]
  • Gesture recognition using Microsoft Kinect®
    pp. 100–103 [2011]
  • Gesture recognition in ego-centric videos using dense trajectories and hand segmentation
    pp. 688–693 [2014]
  • Gaze-directed adaptive rendering for interacting with virtual space
    pp. 103–110 [1996]
  • Gaze estimation using 3-D eyeball model under HMD circumstance
    pp. 1–4 [2017]
  • Gaze estimation from low resolution images, in Pacific-Rim Symposium on Image and Video Technology
    pp. 178–188 [2006]
  • Fully Convolutional Networks for Semantic Segmentation
    pp. 3431–3440 [2015]
  • Faster R-CNN: Towards real-time object detection with region proposal networks
    pp. 91–99 [2015]
  • Fast R-CNN
    pp. 1440–1448 [2015]
  • Eyeglasses based electrooculography human-wheelchair interface
    pp. 4746–4751 [2009]
  • EyeContact: Scleral coil eye tracking for virtual reality
    pp. 184–191 [2016]
  • Eye tracking and eye-based human–computer interaction
    pp. 39–65 [2014]
  • Eye movement analysis for activity recognition using electrooculography
    vol. 33, no. 4, pp. 741–753 [2010]
  • ExCuSe: Robust pupil detection in real-world scenarios
    pp. 39–51 [2015]
  • Evaluation of the Tobii EyeX eye tracking controller and MATLAB toolkit for research
    vol. 49, no. 3, pp. 923–946 [2017]
  • ETracker: A mobile gaze-tracking system with near-eye display based on a combined gaze-tracking algorithm
    vol. 18, no. 5, p. 1626 [2018]
  • EOG-based eye movement detection and gaze estimation for an asynchronous virtual keyboard
    vol. 47, pp. 159–167 [2019]
  • Emulation of physician tasks in eye-tracked virtual reality for remote diagnosis of neurodegenerative disease
    vol. 23, no. 4, pp. 1302–1311 [2017]
  • ElSe: Ellipse selection for robust pupil detection in real-world environments
    pp. 123–130 [2016]
  • Efficient piecewise training of deep structured models for semantic segmentation
    pp. 3194–3203 [2016]
  • Efficient inference in fully connected CRFs with Gaussian edge potentials
    pp. 109–117 [2011]
  • Efficient CNN implementation for eye-gaze estimation on low-power/low-quality consumer imaging systems
    [2018]
  • Driver gaze detection based on deep residual networks using the combined single image of dual near-infrared cameras
    vol. 7, pp. 93448–93461 [2019]
  • Driver fatigue detection based on real-time eye gaze pattern analysis
    pp. 683–694 [2017]
  • Design and implementation of an augmented reality system using gaze interaction
    pp. 1–8 [2011]
  • DeepVOG: Open-source pupil segmentation and gaze estimation in neuroscience using deep learning
    [2019]
  • DeepLab: Semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected CRFs
    vol. 40, no. 4, pp. 834–848 [2017]
  • DeepEye: Deep convolutional network for pupil detection in real environments
    vol. 26, no. 1, pp. 85–95 [2019]
  • Deep neural networks segment neuronal membranes in electron microscopy images
    pp. 2843–2851 [2012]
  • Deep learning-based gaze detection system for automobile drivers using a NIR camera sensor
    vol. 18, no. 2, p. 456 [2018]
  • Deep convolutional neural networks for efficient pose estimation in gesture videos
    pp. 538–552 [2014]
  • Conditional random fields as recurrent neural networks
    pp. 1529–1537 [2015]
  • Compensation method of natural head movement for gaze tracking system using an ultrasonic sensor for distance measurement
    vol. 16, no. 1, p. 110 [2016]
  • Combining head pose and eye location information for gaze estimation
    vol. 21, no. 2, pp. 802–815 [2011]
  • CNN-based pupil center detection for wearable gaze estimation system
    [2017]
  • Children with autism observe social interactions in an idiosyncratic manner
    [2019]
  • Appearance-based gaze estimation under slight head motion
    vol. 76, no. 2, pp. 2203–2222 [2017]
  • Appearance-based gaze estimation in the wild
    pp. 4511–4520 [2015]
  • Appearance-based eye gaze estimation
    pp. 191–195
  • Analysis of the accuracy and robustness of the Leap Motion Controller
    vol. 13, no. 5, pp. 6380–6393 [2013]
  • Adaptive linear regression for appearance-based gaze estimation
    vol. 36, no. 10, pp. 2033–2046 [2014]
  • Accurate pupil features extraction based on new projection function
    vol. 29, no. 4, pp. 663–680 [2012]
  • A wearable hand gesture recognition device based on acoustic measurements at wrist
    pp. 4443–4446 [2017]
  • A threshold selection method from gray-level histograms
    vol. 9, no. 1, pp. 62–66 [1979]
  • A review and analysis of eye-gaze estimation systems, algorithms and performance evaluation methods in consumer platforms
    vol. 5, pp. 16495–16519 [2017]
  • A real-time gaze position estimation method based on a 3-D eye model
    vol. 37, no. 1, pp. 199–212 [2007]
  • A novel approach to real-time nonintrusive gaze finding
  • A novel approach to 3-D gaze tracking using stereo cameras
    vol. 34, no. 1, pp. 234–245 [2004]
  • A differential approach for gaze estimation with calibration
    vol. 2, no. 3 [2018]
  • 2D gaze estimation based on pupil-glint vector using an artificial neural network
    vol. 6, no. 6, p. 174 [2016]