
dc.rights.licenseAtribución-NoComercial-SinDerivadas 4.0 Internacional
dc.contributor.advisorGómez-Mendoza, Juan Bernardo
dc.contributor.advisorRiaño-Rojas, Juan Carlos
dc.contributor.authorCortés-Osorio, Jimy Alexander
dc.date.accessioned2020-05-08T16:30:42Z
dc.date.available2020-05-08T16:30:42Z
dc.date.issued2020-04
dc.identifier.urihttps://repositorio.unal.edu.co/handle/unal/77495
dc.description.abstractThis thesis introduces a new approach for estimating kinematic quantities, namely the angle, the relative speed, and the acceleration, from an actual single motion blur image using the Discrete Cosine Transform (DCT). Motion blur is a common phenomenon present in images. It is produced by the relative movement between the camera and the objects in the scene during camera sensor exposure to light. It usually happens to image recording systems mounted in vehicles, hand-held cameras, drones, satellites, and mobile robots. Our software-based technique focuses on cases where the camera moves at a constant linear velocity while the background remains unchanged. Synthetic and actual images were used to carry out the experiments. The Mean Absolute Error (MAE) of the DCT Radon method for direction estimation was 4.66 degrees. Additionally, the Mean Relative Error for speed estimation of the DCT Pseudo Cepstrum was 5.15%. Our alternative DCT frequency analysis proposals were more accurate than all competitors evaluated for velocity measurement. Also, we proposed an alternative approach to estimate relative acceleration from an actual uniformly accelerated motion blur image, using homomorphic mapping to extract the characteristic Point Spread Function of a degraded image to train a machine learning regression model. Ensembles of Trees, Gaussian Processes (GPR), Linear, Support Vector Machine (SVM), and Tree Regression models and 19 variants were evaluated to predict the acceleration. The best RMSE result was 0.2547 m/s² using GPR (Matern 5/2) with a prediction speed of 530 observations per second. Finally, the proposed methods are valid alternatives for the estimation of the velocity and the acceleration from a single linear motion blur image. (Text taken from the source)
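The velocity pipeline summarized above rests on a classical signal-processing fact: a uniform linear-motion PSF imprints periodic nulls on the blurred image's spectrum, and a cepstrum converts that periodicity into a sharp negative peak at the blur length. The following is a minimal 1-D sketch of that principle with hypothetical parameters; the thesis itself works on 2-D images with a DCT pseudo-cepstrum and a Radon step for the blur angle, whereas this demo uses a plain FFT cepstrum to stay short:

```python
import numpy as np

# 1-D sketch of blur-length estimation by cepstral analysis.
N, L = 256, 8                               # signal length and true blur length (px)
rng = np.random.default_rng(0)
signal = np.cumsum(rng.standard_normal(N))  # smooth stand-in for a scene row

psf = np.zeros(N)
psf[:L] = 1.0 / L                           # uniform linear-motion PSF of length L
# Circular convolution via the FFT, so the PSF's spectral nulls are exact.
blurred = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(psf)))

# The sinc-like PSF spectrum puts nulls every N/L bins in the blurred spectrum;
# the cepstrum turns that comb into a strong negative peak at lag L.
log_mag = np.log(np.abs(np.fft.fft(blurred)) + 1e-12)
cepstrum = np.real(np.fft.ifft(log_mag))

search = np.arange(4, 14)                   # plausible blur lengths for this demo
est = search[np.argmin(cepstrum[search])]   # estimated blur length (here, L = 8)
```

Blur length together with the known exposure time is what turns this spectral measurement into a relative speed; the 2-D case adds the angle estimation step first, then collapses the spectrum along that direction.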
dc.description.abstractThis thesis presents a new approach for estimating kinematic quantities, namely the angle of the relative velocity and the acceleration, from an actual single motion blur image using the Discrete Cosine Transform (DCT). Motion blur is a common phenomenon present in images. It is produced by the relative movement between the camera and the objects in the scene during camera sensor exposure to light. It usually occurs in image recording systems mounted on vehicles, hand-held cameras, drones, satellites, and mobile robots. The present software-based technique focuses on cases where the camera moves at a constant linear velocity while the background remains unchanged. Synthetic and real images were used for the speed estimation experiments. The Mean Absolute Error (MAE) of the DCT Radon method for direction estimation was 4.66 degrees. Additionally, the Mean Relative Error for speed estimation of the DCT Pseudo Cepstrum was 5.15%. The alternative DCT frequency analysis proposals were more accurate than all competitors evaluated for velocity measurement. Furthermore, an alternative approach was proposed to estimate the relative acceleration from an actual uniformly accelerated motion blur image, using homomorphic mapping to extract the characteristic Point Spread Function of the degraded image and then train a machine learning regression model. A total of 125 uniformly accelerated motion blur images were taken in an environment with controlled light and distance, at 5 different accelerations in a range between 0.64 m/s² and 2.4 m/s². Ensembles of Trees, Gaussian Processes (GPR), Linear Regression, Support Vector Machines (SVM), and 19 regression variants were evaluated to predict the acceleration.
The best RMSE result was 0.2553 m/s² using GPR regression with a prediction speed of 530 observations per second. Finally, it is concluded that the proposed methods are valid alternatives for estimating the velocity and the acceleration from a single image with invariant linear motion blur.
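The acceleration estimator described in the abstract trains a Gaussian Process regressor on features extracted from the PSF of each blurred image. Below is a minimal numpy-only sketch of GP regression under fabricated data: a single synthetic scalar feature stands in for the PSF descriptors, and a squared-exponential (RBF) kernel is used instead of the Matern 5/2 kernel the thesis reports as best:

```python
import numpy as np

# Hypothetical training set: accelerations spanning the 0.64-2.4 m/s^2 range
# mentioned in the abstract, with a fabricated scalar "PSF feature" per image.
accel = np.linspace(0.64, 2.4, 30)
feature = np.sqrt(accel)                 # stand-in for a real PSF descriptor

def rbf_kernel(a, b, length_scale=0.3):
    """Squared-exponential kernel between two 1-D feature vectors."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

# Fit: solve (K + sigma_n^2 I) alpha = y, the standard GP regression equations;
# the small diagonal term is numerical jitter / observation noise.
K = rbf_kernel(feature, feature) + 1e-6 * np.eye(feature.size)
alpha = np.linalg.solve(K, accel)

# Predict acceleration for two unseen blurred images from their features.
new_feature = np.sqrt(np.array([1.0, 2.0]))
pred = rbf_kernel(new_feature, feature) @ alpha   # close to 1.0 and 2.0 m/s^2
```

In practice the thesis compares this family of regressors (GPR, SVM, trees, ensembles) in a standard toolbox rather than hand-rolling the solver; the sketch only shows the posterior-mean computation that all GPR variants share.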
dc.description.sponsorshipUniversidad Tecnológica de Pereira
dc.format.extent133
dc.format.mimetypeapplication/pdf
dc.language.isoeng
dc.rightsDerechos reservados - Universidad Nacional de Colombia
dc.rights.urihttp://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject.ddc620 - Ingeniería y operaciones afines
dc.titleA contribution to the estimation of kinematic quantities from linear motion blurred images
dc.title.alternativeUna contribución a la estimación de cantidades cinemáticas a partir de imágenes desenfocadas por movimiento lineal
dc.typeTrabajo de grado - Doctorado
dc.rights.spaAcceso abierto
dc.description.projectContribución a la estimación de las cantidades cinemáticas a partir de imágenes desenfocadas por movimiento lineal uniforme
dc.type.driverinfo:eu-repo/semantics/other
dc.type.versioninfo:eu-repo/semantics/acceptedVersion
dc.publisher.programManizales - Ingeniería y Arquitectura - Doctorado en Ingeniería - Automática
dc.contributor.researchgroupComputación Aplicada Suave y Dura (SHAC)
dc.description.degreelevelDoctorado
dc.publisher.departmentDepartamento de Ingeniería Eléctrica y Electrónica
dc.publisher.branchUniversidad Nacional de Colombia - Sede Manizales
dc.relation.references[Gunturk(2012)] B. K. Gunturk and X. Li, Image Restoration: Fundamentals and Advances. CRC Press, 2012.
dc.relation.references[Dobes et al.(2010)Dobes, Machala, and Fürst] M. Dobes, L. Machala, and T. Fürst, "Blurred image restoration: A fast method of finding the motion length and angle", Digital Signal Processing, vol. 20, no. 6, pp. 1677-1686, 2010.
dc.relation.references[Zhou and Zhang(2018)] L. Zhou and Z. Zhang, Moving objects segmentation and extraction based on motion blur features, Computers & Electrical Engineering, vol. 68, pp. 490-498, 2018. [Online]. Available: https://doi.org/10.1016/j.compeleceng.2018.05.003
dc.relation.references[Luh et al.(1980)Luh, Walker, and Paul] J. Luh, M. Walker, and R. Paul, Resolved-acceleration control of mechanical manipulators, IEEE Transactions on Automatic Control, vol. 25, no. 3, pp. 468-474, 1980.
dc.relation.references[Hoberock(1977)] L. L. Hoberock, A survey of longitudinal acceleration comfort studies in ground transportation vehicles, Journal of Dynamic Systems, Measurement, and Control, vol. 99, no. 2, pp. 76-84, 1977.
dc.relation.references[Lepetič et al.(2003)Lepetič, Klančar, Škrjanc, Matko, and Potočnik] M. Lepetič, G. Klančar, I. Škrjanc, D. Matko, and B. Potočnik, Time optimal path planning considering acceleration limits, Robotics and Autonomous Systems, vol. 45, no. 3-4, pp. 199-210, 2003. [Online]. Available: https://doi.org/10.1016/j.robot.2003.09.007
dc.relation.references[Sironi and Spitkovsky(2011)] L. Sironi and A. Spitkovsky, Acceleration of particles at the termination shock of a relativistic striped wind, The Astrophysical Journal, vol. 741, no. 1, p. 39, 2011.
dc.relation.references[Ohgi(2002)] Y. Ohgi, Microcomputer-based acceleration sensor device for sports biomechanics, Memory, vol. 32, p. 128Mbit, 2002.
dc.relation.references[Xu et al.(2013)Xu, Liu, and Li] J. Xu, F. Liu, and D. Li, Investigation of velocity and acceleration fields on limestone specimen surface under uniaxial compression loads using video images, Beijing, China, 2013, pp. 655-660.
dc.relation.references[Sawicki et al.(2003)Sawicki, Wu, Baaklini, and Gyekenyesi] J. T. Sawicki, X. Wu, G. Y. Baaklini, and A. L. Gyekenyesi, Vibration-based crack diagnosis in rotating shafts during acceleration through resonance, in Nondestructive Evaluation and Health Monitoring of Aerospace Materials and Composites II, vol. 5046. International Society for Optics and Photonics, 2003, pp. 1-11. [Online]. Available: https://doi.org/10.1117/12.484297
dc.relation.references[Hozumi et al.(2000)Hozumi, Yoshida, Akasaka, Asami, Kanzaki, Ueda, Yamamuro, Takagi, and Yoshikawa] T. Hozumi, K. Yoshida, T. Akasaka, Y. Asami, Y. Kanzaki, Y. Ueda, A. Yamamuro, T. Takagi, and J. Yoshikawa, Value of acceleration flow and the prestenotic to stenotic coronary flow velocity ratio by transthoracic color Doppler echocardiography in noninvasive diagnosis of restenosis after percutaneous transluminal coronary angioplasty, Journal of the American College of Cardiology, vol. 35, no. 1, pp. 164-168, 2000.
dc.relation.references[Stanisavljevic et al.(2000)Stanisavljevic, Kalafatic, and Ribaric] V. Stanisavljevic, Z. Kalafatic, and S. Ribaric, Optical flow estimation over extended image sequence, in Electrotechnical Conference, 2000. MELECON 2000. 10th Mediterranean, vol. 2. IEEE, 2000, pp. 546-549.
dc.relation.references[Barron et al.(1994)Barron, Fleet, and Beauchemin] J. L. Barron, D. J. Fleet, and S. S. Beauchemin, Performance of optical flow techniques, International Journal of Computer Vision, vol. 12, no. 1, pp. 43-77, 1994.
dc.relation.references[Bab-Hadiashar and Suter(1998)] A. Bab-Hadiashar and D. Suter, Robust optic flow computation, International Journal of Computer Vision, vol. 29, no. 1, pp. 59-77, 1998.
dc.relation.references[Ishiyama et al.(2004)Ishiyama, Okatani, and Deguchi] H. Ishiyama, T. Okatani, and K. Deguchi, High-speed and high-precision optical flow detection for real-time motion segmentation, in SICE 2004 Annual Conference, vol. 2. IEEE, 2004, pp. 1202-1205.
dc.relation.references[Pinto et al.(2014)Pinto, Moreira, Correia, and Costa] A. M. Pinto, A. P. Moreira, M. V. Correia, and P. G. Costa, A flow-based motion perception technique for an autonomous robot system, Journal of Intelligent & Robotic Systems, vol. 75, no. 3-4, pp. 475-492, 2014.
dc.relation.references[Su et al.(2011)Su, Lu, and Tan] B. Su, S. Lu, and C. L. Tan, Blurred image region detection and classification, in Proceedings of the 19th ACM international conference on Multimedia. ACM, 2011, pp. 1397-1400.
dc.relation.references[Wu et al.(2012)Wu, Guan, Su, and Zhang] J. Wu, Y. Guan, M. Su, and H. Zhang, A real-time method for detecting sharp images in visual navigation, in Robotics and Biomimetics (ROBIO), 2012 IEEE International Conference on. IEEE, 2012, pp. 884-889.
dc.relation.references[Wu et al.(2011)Wu, Ling, Yu, Li, Mei, and Cheng] Y. Wu, H. Ling, J. Yu, F. Li, X. Mei, and E. Cheng, Blurred target tracking by blur-driven tracker, in Computer Vision (ICCV), 2011 IEEE International Conference on. IEEE, 2011, pp. 1100-1107.
dc.relation.references[Dai and Wu(2008)] S. Dai and Y. Wu, Motion from blur, in Computer Vision and Pattern Recognition, 2008. CVPR 2008. IEEE Conference on. IEEE, 2008, pp. 1-8.
dc.relation.references[Sorel et al.(2009)Sorel, Sroubek, and Flusser] M. Sorel, F. Sroubek, and J. Flusser, Recent advances in space-variant deblurring and image stabilization, Springer Science + Business, pp. 259-272, 2009.
dc.relation.references[McCloskey et al.(2011)McCloskey, Muldoon, and Venkatesha] S. McCloskey, K. Muldoon, and S. Venkatesha, Motion invariance and custom blur from lens motion, in Proc. IEEE Int. Conf. Computational Photography (ICCP), Apr. 2011, pp. 1-8.
dc.relation.references[Yitzhaky and Stern(2003)] Y. Yitzhaky and A. Stern, Restoration of interlaced images degraded by variable velocity motion, Optical Engineering, vol. 42, no. 12, pp. 3557-3565, 2003.
dc.relation.references[Yitzhaky et al.(1998)Yitzhaky, Mor, Lantzman, and Kopeika] Y. Yitzhaky, I. Mor, A. Lantzman, and N. Kopeika, Direct method for restoration of motion-blurred images, Journal of the Optical Society of America, vol. 15, no. 6, pp. 1512-1519, 1998.
dc.relation.references[Pérez Huerta and Rodriguez Zurita(2005)] J. Pérez Huerta and G. Rodriguez Zurita, Image restoration of blurring due to rectilinear motion: constant velocity and constant acceleration, Revista mexicana de física, vol. 51, no. 4, pp. 398-406, 2005.
dc.relation.references[Loce and Wolberg(1995)] R. P. Loce and G. Wolberg, Characterization of vibration-induced image defects in input scanners, in Document Recognition, 1995, pp. 350-357.
dc.relation.references[Fliegel(2004)] K. Fliegel, Modeling and measurement of image sensor characteristics, Radioengineering, vol. 13, no. 4, pp. 27-34, 2004.
dc.relation.references[Lin and Li(2004a)] H. Lin and K. Li, Motion blur removal and its application to vehicle speed detection, Electrical Engineering, no. 2, pp. 3407-3410, 2004.
dc.relation.references[Bovik(2009)]A. Bovik, The essential guide to image processing. Academic Press, 2009.
dc.relation.references[Rajagopalan and Chellappa(2014)]A. Rajagopalan and R. Chellappa, Motion Deblurring: Algorithms and Systems. Cambridge University Press, 2014.
dc.relation.references[Sorel and Flusser(2008)] M. Sorel and J. Flusser, Space-Variant Restoration of Images Degraded by Camera Motion Blur, IEEE Transactions on Image Processing, vol. 17, no. 2, pp. 105-116, 2008.
dc.relation.references[Chan and Shen(2005a)]T. F. Chan and J. Shen, Image processing and analysis: variational, PDE, wavelet, and stochastic methods. SIAM, 2005.
dc.relation.references[Bovik(2010)]A. C. Bovik, Handbook of image and video processing. Academic press, 2010.
dc.relation.references[Som(1971)] S. Som, Analysis of the effect of linear smear on photographic images, JOSA, vol. 61, no. 7, pp. 859-864, 1971.
dc.relation.references[Lin(2005)] H.-Y. Lin, Vehicle speed detection and identification from a single motion blurred image, in Application of Computer Vision, 2005. WACV/MOTIONS'05 Volume 1. Seventh IEEE Workshops on, vol. 1. IEEE, 2005, pp. 461-467.
dc.relation.references[Lin and Li(2005)] H.-Y. Lin and K.-J. Li, Vehicle speed estimation from single still images based on motion blur analysis, in MVA, 2005, pp. 128-131.
dc.relation.references[Schuon and Diepold(2009)] S. Schuon and K. Diepold, Comparison of motion de-blur algorithms and real world deployment, Acta Astronautica, vol. 64, no. 11, pp. 1050-1065, 2009.
dc.relation.references[Deb(2005)] Motion de-blurring - the nature of blur, 2005. [Online]. Available: http://ai.stanford.edu/~schuon/deblur.htm.
dc.relation.references[Celestino and Horikawa(2008)] M. Celestino and O. Horikawa, Velocity measurement based on image blur, Computer graphics and image processing, vol. 3, pp. 633-642, 2008.
dc.relation.references[Mohammadi et al.(2010)Mohammadi, Akbari, et al.] J. Mohammadi, R. Akbari et al., Vehicle speed estimation based on the image motion blur using radon transform, in Signal Processing Systems (ICSPS), 2010 2nd International Conference on, vol. 1. IEEE, 2010, pp. V1-243.
dc.relation.references[Mohammadi and Taherkhani(2013)] J. Mohammadi and A. Taherkhani, Object Speed Estimation in Frequency Domain of Single Taken Image, 2013, vol. 3.
dc.relation.references[Olivas et al.(2012)Olivas, Šorel, and Ford] S. J. Olivas, M. Šorel, and J. E. Ford, Platform motion blur image restoration system, Applied optics, vol. 51, no. 34, pp. 8246-8256, 2012.
dc.relation.references[Nemeth and Zarandy(2016)]M. Nemeth and A. Zarandy, Intraframe scene capturing and speed measurement based on superimposed image: New sensor concept for vehicle speed measurement, Journal of Sensors, vol. 2016, 2016.
dc.relation.references[Lee et al.(2016)Lee, Kim, and Kim] M. Lee, K.-S. Kim, and S. Kim, Measuring vehicle velocity in real time using modulated motion blur of camera image data, IEEE Transactions on Vehicular Technology, vol. 66, no. 5, pp. 3659-3673, 2016.
dc.relation.references[Lee(2017)]M. Lee, A study on measuring vehicle velocity in real time using modulated motion blur of camera image data, Ph.D. dissertation, Korea Advanced Institute of Science and Technology, 2017.
dc.relation.references[Lee et al.(2017)Lee, Kim, Cho, and Kim] M. Lee, K.-S. Kim, J. Cho, and S. Kim, Development of a vehicle body velocity sensor using modulated motion blur, in 2017 IEEE International Conference on Advanced Intelligent Mechatronics (AIM). IEEE, 2017, pp. 406-411.
dc.relation.references[Jing et al.(2018)Jing, Xiao, Yang, Wang, and Yu] J. Jing, F. Xiao, L. Yang, S. Wang, and B. Yu, Measurements of velocity field and diameter distribution of particles in multiphase flow based on trajectory imaging, Flow Measurement and Instrumentation, vol. 59, pp. 103-113, 2018.
dc.relation.references[Matsuo and Yakoh(2018)] K. Matsuo and T. Yakoh, Position and velocity measurement method from a single image using modulated illumination, in 2018 IEEE 15th International Workshop on Advanced Motion Control (AMC). IEEE, 2018, pp. 353-359.
dc.relation.references[Dwicahya et al.(2018a)Dwicahya, Ramadijanti, and Basuki] J. A. Dwicahya, N. Ramadijanti, and A. Basuki, Moving object velocity detection based on motion blur on photos using gray level, in 2018 International Electronics Symposium on Knowledge Creation and Intelligent Computing (IES-KCIC). IEEE, 2018, pp. 192-198.
dc.relation.references[Zhou et al.(2019)Zhou, Chen, Zhang, Ye, and Tao] H. Zhou, M. Chen, L. Zhang, N. Ye, and C. Tao, Measuring shape and motion of a high-speed object with designed features from motion blurred images, Measurement, vol. 145, pp. 559-567, 2019.
dc.relation.references[Liu et al.(2008)Liu, Li, and Jia] R. Liu, Z. Li, and J. Jia, Image partial blur detection and classification, in Computer Vision and Pattern Recognition, 2008. CVPR 2008. IEEE Conference on. IEEE, 2008, pp. 1-8.
dc.relation.references[Zhang and Hirakawa(2013a)] Y. Zhang and K. Hirakawa, Blur processing using double discrete wavelet transform, in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2013, pp. 1091-1098.
dc.relation.references[Zhang and Hirakawa(2013b)] Y. Zhang and K. Hirakawa, Blur processing using double discrete wavelet transform, in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2013, pp. 1091-1098.
dc.relation.references[Guo and Wang(2013)] Y. Guo and P. Wang, Identification of motion blur parameters based on 2d-dwt and cepstrum, Journal of Computational Information Systems, vol. 9, no. 16, pp. 6325-6332, 2013.
dc.relation.references[Phansalkar(2010a)] N. Phansalkar, Determination of linear motion point spread function using hough transform for image restoration, IEEE, pp. 1-4, 2010.
dc.relation.references[Grou-Szabo and Shibata(2009a)] R. Grou-Szabo and T. Shibata, Blind motion-blur parameter estimation using edge detectors, in 3rd. International Conference on Signal Processing and Communication Systems, 2009.
dc.relation.references[Aizenberg et al.(2000)Aizenberg, Aizenberg, Butakov, and Farberov] I. Aizenberg, N. Aizenberg, C. Butakov, and E. Farberov, Image recognition on the neural network based on multi-valued neurons, in Proc. 15th Int. Conf. Pattern Recognition. ICPR-2000, vol. 2, 2000, pp. 989-992.
dc.relation.references[Li et al.(2007)Li, Mersereau, and Simske] D. Li, R. M. Mersereau, and S. Simske, Blind image deconvolution through support vector regression, IEEE transactions on neural networks, vol. 18, no. 3, pp. 931-935, 2007.
dc.relation.references[Chen et al.(2010)Chen, Yang, Wu, and Zhao] X. Chen, J. Yang, Q. Wu, and J. Zhao, Motion blur detection based on lowest directional high-frequency energy, in Image Processing (ICIP), 2010 17th IEEE International Conference on. IEEE, 2010, pp. 2533-2536.
dc.relation.references[Tiwari et al.(2014)Tiwari, Singh, and Shukla] S. Tiwari, A. K. Singh, and V. Shukla, Certain investigations on motion blur detection and estimation, in Proceedings of international conference on signal, image and video processing, IIT Patna, 2014, pp. 108-114.
dc.relation.references[Lokhande et al.(2006)Lokhande, Arya, and Gupta] R. Lokhande, K. Arya, and P. Gupta, Identification of parameters and restoration of motion blurred images, in Proceedings of the 2006 ACM symposium on Applied computing. ACM, 2006, pp. 301-305.
dc.relation.references[Su et al.(2012)Su, Lu, and Lim] B. Su, S. Lu, and T. C. Lim, Restoration of motion blurred document images, in Proceedings of the 27th Annual ACM Symposium on Applied Computing. ACM, 2012, pp. 767-770.
dc.relation.references[Moghaddam and Jamzad(2007)] M. E. Moghaddam and M. Jamzad, Linear motion blur parameter estimation in noisy images using fuzzy sets and power spectrum, EURASIP Journal on Advances in Signal Processing, vol. 2007, no. 1, pp. 1-8, 2007.
dc.relation.references[Jia and Wen(2013)] S. Jia and J. Wen, Motion blurred image restoration, in Image and Signal Processing (CISP), 2013 6th International Congress on, vol. 1. IEEE, 2013, pp. 384-389.
dc.relation.references[Krahmer et al.(2006)Krahmer, Lin, McAdoo, Ott, Wang, Widemann, and Wohlberg] F. Krahmer, Y. Lin, B. McAdoo, K. Ott, J. Wang, D. Widemann, and B. Wohlberg, Blind image deconvolution: Motion blur estimation, Institute for Mathematics and its Applications, University of Minnesota, USA, Tech. Rep., 2006.
dc.relation.references[Pazhoumand-Dar et al.(2010)Pazhoumand-Dar, Abolhassani, and Saeedi] H. Pazhoumand-Dar, A. M. T. Abolhassani, and E. Saeedi, Object speed estimation by using fuzzy set, World Academy of Science, Engineering and Technology, International Journal of Computer, Electrical, Automation, Control and Information Engineering, vol. 4, no. 4, pp. 688-691, 2010.
dc.relation.references[Ji and Liu(2008)] H. Ji and C. Liu, Motion blur identification from image gradients, in Computer Vision and Pattern Recognition, 2008. CVPR 2008. IEEE Conference on. IEEE, 2008, pp. 1-8.
dc.relation.references[Rekleitis(1996)] I. M. Rekleitis, Steerable filters and cepstral analysis for optical flow calculation from a single blurred image, in Vision Interface, vol. 1, 1996, pp. 159-166.
dc.relation.references[Tiwari and Shukla(2013)] S. Tiwari and Shukla, Review of motion blur estimation techniques, Journal of Image and Graphics, vol. 1, no. 4, pp. 176-184, 2013.
dc.relation.references[Cannon(1976)] M. Cannon, Blind deconvolution of spatially invariant image blurs with phase, Acoustics, Speech and Signal Processing, IEEE Transactions on, vol. 24, no. 1, pp. 58-63, 1976.
dc.relation.references[Kundur and Hatzinakos(1996)] D. Kundur and D. Hatzinakos, Blind image deconvolution, IEEE Signal Processing Magazine, vol. 13, no. 3, pp. 43-64, May 1996.
dc.relation.references[Dash(2012)]R. Dash, Parameters estimation for image restoration, Ph.D. dissertation, Department of Computer Science and Engineering, National Institute of Technology Rourkela, Rourkela 769008, India, 2012.
dc.relation.references[Yang et al.(2011)Yang, Liu, Liu, and Liao] S. Yang, H. Liu, B. Liu, and X. Liao, Blurring length estimation using ringing artifacts in a deblurred image, in Image Analysis and Signal Processing (IASP), 2011 International Conference on. IEEE, 2011, pp. 84-88.
dc.relation.references[Gunturk and Li(2012)]B. Gunturk and X. Li, Image Restoration: Fundamentals and Advances. CRC Press, 2012.
dc.relation.references[Šorel et al.(2009)Šorel, Šroubek, and Flusser] M. Šorel, F. Šroubek, and J. Flusser, Unexploded Ordnance Detection and Mitigation, ser. NATO Sci. Peace Secur. Ser. B Phys. Biophys. Springer Netherlands, 2009, ch. Recent advances in space-variant deblurring and image stabilization, pp. 259-272.
dc.relation.references[Pretto et al.(2009)Pretto, Menegatti, Bennewitz, Burgard, and Pagello] A. Pretto, E. Menegatti, M. Bennewitz, W. Burgard, and E. Pagello, A visual odometry framework robust to motion blur, in Int. Conference on Robotics and Automation ICRA'09, May 2009, pp. 2250-2257.
dc.relation.references[Chan and Shen(2005b)]T. Chan and J. Shen, Image processing and analysis, ser. Other titles in applied mathematics. Society for Industrial and Applied Mathematics SIAM, 2005, vol. 94.
dc.relation.references[Potmesil and Chakravarty(1983)] M. Potmesil and I. Chakravarty, Modeling motion blur in computer-generated images, in ACM SIGGRAPH'83, vol. 17, no. 3, 1983, pp. 389-399.
dc.relation.references[Kawamura et al.(2002)Kawamura, Kondo, Konishi, and Ishigaki] S. Kawamura, K. Kondo, Y. Konishi, and H. Ishigaki, Estimation of motion using motion blur for tracking vision system, IEEE, pp. 371-376, 2002.
dc.relation.references[Xu and Zhao(2010)] T.-F. Xu and P. Zhao, Image motion-blur-based object's speed measurement using an interlaced scan image, Measurement Science and Technology, vol. 21, no. 7, p. 075502, 2010.
dc.relation.references[Rezvankhah et al.(2012)Rezvankhah, Bagherzadeh, Moradi, and Member]S. Rezvankhah, A. A. Bagherzadeh, H. Moradi, and B. N. A. Member, A Real-time Velocity Estimation using Motion Blur in Air Hockey, IEEE, 2012.
dc.relation.references[Zhang and Hirakawa(2015)] Y. Zhang and K. Hirakawa, Fast spatially varying object motion blur estimation, in Image Processing (ICIP), 2015 IEEE International Conference on. IEEE, 2015, pp. 646-650.
dc.relation.references[Brusius et al.(2011)Brusius, Schwanecke, and Barth]F. Brusius, U. Schwanecke, and P. Barth, Blind image deconvolution of linear motion blur, International Conference on Computer Vision, Imaging and Computer, vol. 274, 2011.
dc.relation.references[Lu(2006)] E. Lu, JuweiPoon, Restoration of motion blurred images, IEEE International, pp. 1193-1196, 2006.
dc.relation.references[GUO et al.(2013)GUO, WANG, and LIU] Y. Guo, P. Wang, and M. Liu, Identification of motion blur parameters based on 2d-dwt and cepstrum, Journal of Computational Information Systems, vol. 9, no. 16, pp. 6325-6332, 2013.
dc.relation.references[Yitzhaky and Kopeika(1997)] Y. Yitzhaky and N. S. Kopeika, Identification of blur parameters from motion blurred images, Graphical models and image processing, vol. 59, no. 5, pp. 310-320, 1997.
dc.relation.references[Grou-Szabo and Shibata(2009b)] R. Grou-Szabo and T. Shibata, Blind motion-blur parameter estimation using edge detectors, in Signal Processing and Communication Systems, 2009. ICSPCS 2009. 3rd International Conference on. IEEE, 2009, pp. 1-6.
dc.relation.references[Moghaddam and Jamzad(2004)] M. E. Moghaddam and M. Jamzad, Finding point spread function of motion blur using radon transform and modeling the motion length, in Signal Processing and Information Technology, 2004. Proceedings of the Fourth IEEE International Symposium on. IEEE, 2004, pp. 314-317.
dc.relation.references[Qi et al.(2005)Qi, Zhang, and Tan] X. Y. Qi, L. Zhang, and C. L. Tan, Motion deblurring for optical character recognition, in Document Analysis and Recognition, 2005. Proceedings. Eighth International Conference on. IEEE, 2005, pp. 389-393.
dc.relation.references[Phansalkar(2010b)] N. Phansalkar, Determination of linear motion Point Spread Function using Hough transform for image restoration, in Proceedings of the IEEE International Conference on Computational Intelligence and Computing Research, 2010.
dc.relation.references[Yoshida et al.(1993)Yoshida, Horiike, and Fujita] Y. Yoshida, K. Horiike, and K. Fujita, Parameter estimation of uniform image blur using DCT, IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences, vol. 76, no. 7, pp. 1154-1157, 1993.
dc.relation.references[Sakano et al.(2006)Sakano, Suetake, and Uchino] M. Sakano, N. Suetake, and E. Uchino, Robust identification of motion blur parameters by using angles of gradient vectors, in Intelligent Signal Processing and Communications, 2006. ISPACS'06. International Symposium on. IEEE, 2006, pp. 522-525.
dc.relation.references[Levin et al.(2011)Levin, Weiss, Durand, and Freeman] A. Levin, Y. Weiss, F. Durand, and W. T. Freeman, Understanding blind deconvolution algorithms, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, no. 12, pp. 2354-2367, 2011.
dc.relation.references[Perrone and Favaro(2016)] D. Perrone and P. Favaro, A clearer picture of total variation blind deconvolution, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 38, no. 6, pp. 1041-1055, 2016.
dc.relation.references[Lam and Goodman(2000)] E. Y. Lam and J. W. Goodman, Iterative statistical approach to blind image deconvolution, J. Opt. Soc. Am. A, vol. 17, no. 7, pp. 1177-1184, Jul 2000. [Online]. Available: http://josaa.osa.org/abstract.cfm?URI=josaa-17-7-1177
dc.relation.references[Burger et al.(2009)Burger and Burge] W. Burger and M. J. Burge, Principles of digital image processing. Springer, 2009.
dc.relation.references[Benesty et al.(2007)Benesty, Sondhi, and Huang] J. Benesty, M. M. Sondhi, and Y. Huang, Springer handbook of speech processing. New York: Springer, 2007.
dc.relation.references[Whittaker and Shives(1983)] G. A. Whittaker and T. R. Shives, Technology Advances in Engineering and Their Impact on Detection, Diagnosis and Prognosis Methods. Cambridge, UK: CUP Archive, 1983, vol. 36.
dc.relation.references[Randall(2013)] R. B. Randall, A history of cepstrum analysis and its application to mechanical problems, in International Conference at Institute of Technology of Chartres, France, 2013, pp. 11-16.
dc.relation.references[Bogert(1963)] B. P. Bogert, The quefrency alanysis of time series for echoes; cepstrum, pseudo-autocovariance, cross-cepstrum and saphe cracking, Time series analysis, vol. 15, pp. 209-243, 1963.
dc.relation.references[Hassanein and Rudko(1984)] H. Hassanein and M. Rudko, On the use of discrete cosine transform in cepstral analysis, IEEE Transactions on acoustics, speech, and signal processing, vol. 32, no. 4, pp. 922-925, 1984.
dc.relation.references[Sung et al.(2002)Sung, Kim, Kim, Kwak, Yoo, and Yoo] M.-M. Sung, H.-J. Kim, E.-K. Kim, J.-Y. Kwak, J.-K. Yoo, and H.-S. Yoo, Clinical evaluation of jpeg2000 compression for digital mammography, IEEE Transactions on Nuclear Science, vol. 49, no. 3, pp. 827-832, 2002.
dc.relation.references[sip(2017)] Sipi image database, 2017. [Online]. Available: http://sipi.usc.edu/database/
dc.relation.references[bt6(2017)] Bt.601 : Studio encoding parameters of digital television for standard 4:3 and wide screen 16:9 aspect ratios, 2017. [Online]. Available: http://www.itu.int/rec/r-rec-bt.601
dc.relation.references[Shah et al.(2014)Shah, Dalal, Deshpande, and Patnaik] M. J. Shah, U. D. Dalal, A. M. Deshpande, and S. Patnaik, Hough transform and cepstrum based estimation of spatial-invariant and variant motion blur parameters, in Advances in Electronics, Computers and Communications (ICAECC), 2014 International Conference on. IEEE, 2014, pp. 1-6.
dc.relation.references[Deshpande and Patnaik(2012)] A. M. Deshpande and S. Patnaik, Radon transform based uniform and non-uniform motion blur parameter estimation, in Communication, Information & Computing Technology (ICCICT), 2012 International Conference on. IEEE, 2012, pp. 1-6.
dc.relation.references[Richardson(1972)] W. H. Richardson, Bayesian-based iterative method of image restoration, JOSA, vol. 62, no. 1, pp. 55-59, 1972.
dc.relation.references[Lucy(1974)] L. B. Lucy, An iterative technique for the rectification of observed distributions, The astronomical journal, vol. 79, p. 745, 1974.
dc.relation.references[Cortés-Osorio et al.(2018)Cortés-Osorio, López-Robayo, and Hernández-Betancourt] J. A. Cortés-Osorio, C. D. López-Robayo, and N. Hernández-Betancourt, Evaluación y comparación de técnicas para la reconstrucción de la función de dispersión de punto de imágenes degradadas por difuminación lineal uniforme, TecnoLógicas, vol. 21, no. 42, pp. 211-229, 2018.
dc.relation.references[Pelegri et al.(2002)Pelegri, Alberola, and Llario] J. Pelegri, J. Alberola, and V. Llario, Vehicle detection and car speed monitoring system using gmr magnetic sensors, in IECON 02 [Industrial Electronics Society, IEEE 2002 28th Annual Conference of the], vol. 2. IEEE, 2002, pp. 1693-1695.
dc.relation.references[Li et al.(2011)Li, Dong, Jia, Xu, and Qin] H. Li, H. Dong, L. Jia, D. Xu, and Y. Qin, Some practical vehicle speed estimation methods by a single traffic magnetic sensor, in Intelligent Transportation Systems (ITSC), 2011 14th International IEEE Conference on. IEEE, 2011, pp. 1566-1573.
dc.relation.references[Odat et al.(2017)Odat, Shamma, and Claudel]E. Odat, J. S. Shamma, and C. Claudel, Vehicle classi cation and speed estimation using combined passive infrared/ultrasonic sensors, IEEE Transactions on Intelligent Transportation Systems, 2017.
dc.relation.references[Cheung et al.(2005)Cheung, Ergen, and Varaiya]S. Y. Cheung, S. C. Ergen, and P. Varaiya, Tra c surveillance with wireless magnetic sensors, in Proceedings of the 12th ITS world congress, 2005.
dc.relation.references[Luvizon et al.(2017)Luvizon, Nassu, and Minetto]D. C. Luvizon, B. T. Nassu, and R. Minetto, A video-based system for vehicle speed measurement in urban roadways, IEEE Transactions on Intelligent Transportation Systems, vol. 18, no. 6, pp. 1393 1404, 2017.
dc.relation.references[Wang(2016)]J.-x. Wang, Research of vehicle speed detection algorithm in video surveillance, in Audio, Language and Image Processing (ICALIP), 2016 International Conference on. IEEE, 2016, pp. 349 352.
dc.relation.references[Kruger et al.(1995)Kruger, Enkelmann, and Rossle]W. Kruger, W. Enkelmann, and S. Rossle, Real-time estimation and tracking of optical ow vectors for obstacle detection, in Intelligent Vehicles' 95 Symposium., Proceedings of the. IEEE, 1995, pp. 304 309.
dc.relation.references[Litzenberger et al.(2006)Litzenberger, Kohn, Belbachir, Donath, Gritsch, Garn, Posch, and Schraml] M. Litzenberger, B. Kohn, A. Belbachir, N. Donath, G. Gritsch, H. Garn, C. Posch, and S. Schraml, Estimation of vehicle speed based on asynchronous data from a silicon retina optical sensor, in Intelligent Transportation Systems Conference, 2006. ITSC'06. IEEE. IEEE, 2006, pp. 653 658.
dc.relation.references[Arashloo and Ahmadyfard(2007)]S. R. Arashloo and A. Ahmadyfard, Fine estimation of blur parmeters for image restoration, in Digital Signal Processing, 2007 15th International Conference on. IEEE, 2007, pp. 427 430.
dc.relation.references[Gal et al.(2014)Gal, Kiryati, and Sochen]R. Gal, N. Kiryati, and N. Sochen, Progress in the restoration of image sequences degraded by atmospheric turbulence, Pattern Recognition Letters, vol. 48, pp. 8 14, 2014.
dc.relation.references[Joshi et al.(2010)Joshi, Kang, Zitnick, and Szeliski]N. Joshi, S. B. Kang, C. L. Zitnick, and R. Szeliski, Image deblurring using inertial measurement sensors, in ACM Transactions on Graphics (TOG), vol. 29, no. 4. ACM, 2010, p. 30.
dc.relation.references[Li et al.(2012)Li, Zhang, Fu, and Meng]T. Li, D. W. Zhang, Y. Fu, and M. Q.-H. Meng, Motion blur removal for humanoid robots, in Automation and Logistics (ICAL), 2012 IEEE International Conference on. IEEE, 2012, pp. 378–381.
dc.relation.references[Rizo et al.(2003)Rizo, Coronado, Campo, Forero, Otalora, Devy, and Parra]J. Rizo, J. Coronado, C. Campo, A. Forero, C. Otalora, M. Devy, and C. Parra, Ursula: robotic demining system, in Proceedings of the 11th International Conference on Advanced Robotics, 2003, pp. 538–43.
dc.relation.references[Rajasekharan and Kambhampati(2003)]S. Rajasekharan and C. Kambhampati, The current opinion on the use of robots for landmine detection, in Robotics and Automation, 2003. Proceedings. ICRA'03. IEEE International Conference on, vol. 3. IEEE, 2003, pp. 4252–4257.
dc.relation.references[Nagatani et al.(2013)]K. Nagatani, S. Kiribayashi, Y. Okada, K. Otake, K. Yoshida, S. Tadokoro, T. Nishimura, T. Yoshida, E. Koyanagi, M. Fukushima et al., Emergency response to the nuclear accident at the Fukushima Daiichi nuclear power plants using mobile rescue robots, Journal of Field Robotics, vol. 30, no. 1, pp. 44–63, 2013.
dc.relation.references[Yamamoto(1992)]S. Yamamoto, Development of inspection robot for nuclear power plant, in Robotics and Automation, 1992. Proceedings., 1992 IEEE International Conference on. IEEE, 1992, pp. 1559–1566.
dc.relation.references[Murphy et al.(2008)Murphy, Tadokoro, Nardi, Jacoff, Fiorini, Choset, and Erkmen]R. R. Murphy, S. Tadokoro, D. Nardi, A. Jacoff, P. Fiorini, H. Choset, and A. M. Erkmen, Search and rescue robotics, in Springer Handbook of Robotics. Berlin Heidelberg: Springer, 2008, pp. 1151–1173.
dc.relation.references[Casper and Murphy(2003)]J. Casper and R. R. Murphy, Human-robot interactions during the robot-assisted urban search and rescue response at the world trade center, IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), vol. 33, no. 3, pp. 367–385, 2003.
dc.relation.references[Lin and Li(2004b)]H.-Y. Lin and K.-J. Li, Motion blur removal and its application to vehicle speed detection, in Image Processing, 2004. ICIP'04. 2004 International Conference on, vol. 5, no. 2. IEEE, 2004, pp. 3407–3410.
dc.relation.references[Song et al.(2009)Song, Peng, Lu, Yang, and Yan]D. Song, L. Peng, G. Lu, S. Yang, and Y. Yan, Velocity measurement of pneumatically conveyed particles through digital imaging, Sensors and Actuators A: Physical, vol. 149, no. 2, pp. 180–188, 2009.
dc.relation.references[Zhang(2000)]Z. Zhang, A flexible new technique for camera calibration, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
dc.relation.references[Strobl et al.(2006)Strobl, Sepp, Fuchs, Paredes, and Arbter]K. Strobl, W. Sepp, S. Fuchs, C. Paredes, and K. Arbter, Camera calibration toolbox for Matlab, 2006. [Online]. Available: http://www.vision.caltech.edu/bouguetj/calib_doc/index.html
dc.relation.references[Ric(2017)] Ricoh lens FL-CC0814A-2M, November 2017, (Accessed on 06/11/2017). [Online]. Available: https://www.baslerweb.com/en/products/vision-components/lenses/ricoh-lens-fl-cc0814a-2m-f1-4-f8mm-2-3/
dc.relation.references[Basler(2017)]Basler, acA2000-165um - Basler ace, November 2017, (Accessed on 06/11/2017). [Online]. Available: https://www.baslerweb.com/en/products/cameras/area-scan-cameras/ace/aca2000-165um/
dc.relation.references[Zanobini et al.(2016)Zanobini, Sereni, Catelani, and Ciani]A. Zanobini, B. Sereni, M. Catelani, and L. Ciani, Repeatability and reproducibility techniques for the analysis of measurement systems, Measurement, vol. 86, pp. 125–132, 2016.
dc.relation.references[Cortes-Osorio et al.(2018)Cortes-Osorio, Gomez-Mendoza, and Riano-Rojas]J. A. Cortes-Osorio, J. B. Gomez-Mendoza, and J. C. Riano-Rojas, Velocity estimation from a single linear motion blurred image using discrete cosine transform, IEEE Transactions on Instrumentation and Measurement, pp. 1–13, 2018.
dc.relation.references[Shirmohammadi and Ferrero(2014)]S. Shirmohammadi and A. Ferrero, Camera as the instrument: the rising trend of vision based measurement, IEEE Instrumentation Measurement Magazine, vol. 17, no. 3, pp. 41–47, Jun. 2014.
dc.relation.references[Beauchemin et al.(2012)Beauchemin, Bauer, Kowsari, and Cho]S. S. Beauchemin, M. A. Bauer, T. Kowsari, and J. Cho, Portable and scalable vision-based vehicular instrumentation for the analysis of driver intentionality, IEEE Transactions on Instrumentation and Measurement, vol. 61, no. 2, pp. 391–401, Feb. 2012.
dc.relation.references[Motta et al.(2001)Motta, de Carvalho, and McMaster]J. M. S. Motta, G. C. de Carvalho, and R. McMaster, Robot calibration using a 3D vision-based measurement system with a single camera, Robotics and Computer-Integrated Manufacturing, vol. 17, no. 6, pp. 487–497, Dec. 2001. [Online]. Available: https://doi.org/10.1016/s0736-5845(01)00024-2
dc.relation.references[Wahbeh et al.(2003)Wahbeh, Caffrey, and Masri]A. M. Wahbeh, J. P. Caffrey, and S. F. Masri, A vision-based approach for the direct measurement of displacements in vibrating systems, Smart Materials and Structures, vol. 12, no. 5, p. 785, 2003. [Online]. Available: http://stacks.iop.org/0964-1726/12/i=5/a=016
dc.relation.references[Karimirad et al.(2014)Karimirad, Chauhan, and Shirinzadeh]F. Karimirad, S. Chauhan, and B. Shirinzadeh, Vision-based force measurement using neural networks for biological cell microinjection, Journal of Biomechanics, vol. 47, no. 5, pp. 1157–1163, Mar. 2014. [Online]. Available: https://doi.org/10.1016/j.jbiomech.2013.12.007
dc.relation.references[Park et al.(2010)Park, Lee, Jung, and Myung]J.-W. Park, J.-J. Lee, H.-J. Jung, and H. Myung, Vision-based displacement measurement method for high-rise building structures using partitioning approach, NDT & E International, vol. 43, no. 7, pp. 642–647, Oct. 2010. [Online]. Available: https://doi.org/10.1016/j.ndteint.2010.06.009
dc.relation.references[Vijayachitra and Krishnaswamy(2005)]S. Vijayachitra and K. Krishnaswamy, Industrial Instrumentation. New Age International, 2005.
dc.relation.references[Ovaska and Valiviita(1998)]S. J. Ovaska and S. Valiviita, Angular acceleration measurement: A review, in Instrumentation and Measurement Technology Conference, 1998. IMTC/98. Conference Proceedings. IEEE, vol. 2. IEEE, 1998, pp. 875–880.
dc.relation.references[Cannon(1974)]T. M. Cannon, Digital image deblurring by nonlinear homomorphic filtering, Utah University, Salt Lake City School of Computing, Tech. Rep., 1974.
dc.relation.references[Chen et al.(1996)Chen, Nandhakumar, and Martin]W.-G. Chen, N. Nandhakumar, and W. N. Martin, Image motion estimation from motion smear-a new computational model, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, no. 4, pp. 412–425, 1996.
dc.relation.references[Yitzhaky et al.(1999)Yitzhaky, Milberg, Yohaev, and Kopeika]Y. Yitzhaky, R. Milberg, S. Yohaev, and N. S. Kopeika, Comparison of direct blind deconvolution methods for motion-blurred images, Applied Optics, vol. 38, no. 20, pp. 4325–4332, 1999. [Online]. Available: https://doi.org/10.1364/AO.38.004325
dc.relation.references[Benameur et al.(2012)Benameur, Mignotte, and Lavoie]S. Benameur, M. Mignotte, and F. Lavoie, An homomorphic filtering and expectation maximization approach for the point spread function estimation in ultrasound imaging, in Image Processing: Algorithms and Systems X; and Parallel Processing for Imaging Applications II, vol. 8295. International Society for Optics and Photonics, 2012, p. 82950T. [Online]. Available: https://doi.org/10.1117/12.903785
dc.relation.references[Mattausch and Goksel(2016)]O. Mattausch and O. Goksel, Image-based PSF estimation for ultrasound training simulation, in International Workshop on Simulation and Synthesis in Medical Imaging. Springer, 2016, pp. 23–33.
dc.relation.references[Janwale and Lomte(2017)]A. P. Janwale and S. S. Lomte, Enhancement of cotton leaves images using various filtering techniques, in Data Management, Analytics and Innovation (ICDMAI), 2017 International Conference on. IEEE, 2017, pp. 303–305.
dc.relation.references[Raskar et al.(2006)Raskar, Agrawal, and Tumblin]R. Raskar, A. Agrawal, and J. Tumblin, Coded exposure photography: motion deblurring using fluttered shutter, ACM Transactions on Graphics (TOG), vol. 25, no. 3, pp. 795–804, 2006.
dc.relation.references[Li et al.(2008)Li, Du, Zhang, and Wang]M. Li, H. Du, Q. Zhang, and J. Wang, Improved particle image velocimetry through cell segmentation and competitive survival, IEEE Transactions on Instrumentation and Measurement, vol. 57, no. 6, pp. 1221–1229, Jun. 2008.
dc.relation.references[Sederman et al.(2004)Sederman, Mantle, Buckley, and Gladden]A. J. Sederman, M. D. Mantle, C. Buckley, and L. F. Gladden, MRI technique for measurement of velocity vectors, acceleration, and autocorrelation functions in turbulent flow, Journal of Magnetic Resonance, vol. 166, no. 2, pp. 182–189, 2004.
dc.relation.references[McCloskey et al.(2012)McCloskey, Ding, and Yu]S. McCloskey, Y. Ding, and J. Yu, Design and estimation of coded exposure point spread functions, IEEE transactions on pattern analysis and machine intelligence, vol. 34, no. 10, p. 2071, 2012.
dc.relation.references[Agrawal et al.(2009)Agrawal, Xu, and Raskar]A. Agrawal, Y. Xu, and R. Raskar, Invertible motion blur in video, in ACM Transactions on Graphics (TOG), vol. 28, no. 3. ACM, 2009, p. 95.
dc.relation.references[Leifer et al.(2011)Leifer, Weems, Kienle, and Sims]J. Leifer, B. Weems, S. C. Kienle, and A. M. Sims, Three-dimensional acceleration measurement using videogrammetry tracking data, Experimental Mechanics, vol. 51, no. 2, pp. 199–217, 2011.
dc.relation.references[Liu and Katz(2006)]X. Liu and J. Katz, Instantaneous pressure and material acceleration measurements using a four-exposure piv system, Experiments in Fluids, vol. 41, no. 2, p. 227, 2006.
dc.relation.references[Chu et al.(2018)Chu, Wolfe, and Wang]P. Chu, B. T. Wolfe, and Z. Wang, Measurement of incandescent microparticle acceleration using stereoscopic imaging, Review of Scientific Instruments, vol. 89, no. 10, 2018. [Online]. Available: http://dx.doi.org/10.1063/1.5034311
dc.relation.references[Chen et al.(2016)Chen, Li, Zhao, Huang, and Guo]G. Chen, L. Li, C. Zhao, R. Huang, and F. Guo, Acceleration characteristics of a rock slide using the particle image velocimetry technique, Journal of Sensors, vol. 2016, 2016. [Online]. Available: http://dx.doi.org/10.1155/2016/2650871
dc.relation.references[Dong et al.(2010)Dong, Song, Wang, Zeng, and Wu]J. Dong, Y. Song, H. Wang, J. Zeng, and Z. Wu, Predicting flow velocity affected by seaweed resistance using SVM regression, in Computer Application and System Modeling (ICCASM), 2010 International Conference on, vol. 2. IEEE, 2010, pp. V2-273.
dc.relation.references[Genç and Dağ(2016)]O. Genç and A. Dağ, A machine learning-based approach to predict the velocity profiles in small streams, Water Resources Management, vol. 30, no. 1, pp. 43–61, 2016.
dc.relation.references[Izquierdo-Verdiguier et al.(2014)Izquierdo-Verdiguier, Gomez-Chova, Bruzzone, and Camps-Valls] E. Izquierdo-Verdiguier, L. Gomez-Chova, L. Bruzzone, and G. Camps-Valls, Semisupervised kernel feature extraction for remote sensing image analysis, IEEE Transactions on Geoscience and Remote Sensing, vol. 52, no. 9, pp. 5567–5578, Sep. 2014.
dc.relation.references[Bouwmans et al.(2018)Bouwmans, Javed, Zhang, Lin, and Otazo]T. Bouwmans, S. Javed, H. Zhang, Z. Lin, and R. Otazo, On the applications of robust pca in image and video processing, Proceedings of the IEEE, vol. 106, no. 8, pp. 1427–1457, Aug. 2018.
dc.relation.references[GmbH(2018)]P. S. GmbH, Cobra4 sensor unit 3D acceleration, 2018. [Online]. Available: https://repository.curriculab.net/files/bedanl.pdf/12650.00/1265000e.pdf
dc.relation.references[Komiya et al.(2011)Komiya, Kurihara, and Ando]K. Komiya, T. Kurihara, and S. Ando, 3D particle image velocimetry using correlation image sensor, in Proc. SICE Annual Conf. 2011, Sep. 2011, pp. 2774–2778.
dc.relation.references[Šorel and Flusser(2008)]M. Šorel and J. Flusser, Space-variant restoration of images degraded by camera motion blur, Image Processing, IEEE Transactions on, vol. 17, no. 2, pp. 105–116, 2008.
dc.relation.references[Sengar and Mukhopadhyay(2017)]S. S. Sengar and S. Mukhopadhyay, Detection of moving objects based on enhancement of optical flow, Optik-International Journal for Light and Electron Optics, vol. 145, pp. 130–141, 2017. [Online]. Available: https://doi.org/10.1016/j.ijleo.2017.07.040
dc.relation.references[Klyuvak et al.(2018)Klyuvak, Kliuva, and Skrynkovskyy]A. Klyuvak, O. Kliuva, and R. Skrynkovskyy, Partial motion blur removal, in 2018 IEEE Second International Conference on Data Stream Mining & Processing (DSMP). IEEE, 2018, pp. 483–487.
dc.relation.references[Zhang et al.(2018)Zhang, Zhu, Sun, Wang, and Zhang]A. Zhang, Y. Zhu, J. Sun, M. Wang, and Y. Zhang, Parametric model for image blur kernel estimation, in 2018 International Conference on Orange Technologies (ICOT). IEEE, 2018, pp. 1–5.
dc.relation.references[Moreno et al.(2013)Moreno, Valcárcel, and Hurtado]R. J. Moreno, F. A. E. Valcárcel, and D. A. Hurtado, Control de movimiento de un robot humanoide por medio de visión de máquina y réplica de movimientos humanos, INGE CUC, vol. 9, no. 2, pp. 44–51, 2013.
dc.relation.references[Contreras Parada et al.(2014)Contreras Parada, Peña Cortés, and Riaño Jaimes]P. A. Contreras Parada, C. A. Peña Cortés, and C. I. Riaño Jaimes, Módulo robótico para la clasificación de lulos (Solanum quitoense) implementando visión artificial, 2014.
dc.relation.references[Gallo Sánchez et al.(2016)Gallo Sánchez, Guerrero Ramírez, Vásquez Salcedo, and Alonso Castro] L. F. Gallo Sánchez, M. A. Guerrero Ramírez, J. D. Vásquez Salcedo, and M. Á. Alonso Castro, Diseño de un prototipo electromecánico para la emulación de los movimientos de un brazo humano, 2016.
dc.relation.references[Londoño et al.(2017)Londoño, Cortes, and Fernández]Y. Londoño, J. A. Cortes, and M. E. Fernández, Diseño, construcción e implementación de sistema de adquisición y análisis de datos para la enseñanza del movimiento rectilíneo en el laboratorio, Momento, no. 55, pp. 57–73, 2017.
dc.relation.references[Slider(2017)]R. M. Slider, 2017. [Online]. Available: https://www.revolvecamera.com/products/ram-motorized-dolly-slider-bundle.
dc.relation.references[EVO.(2017)]R. R. S. EVO., Learn the system, rhino, 2017. [Online]. Available: https://rhinocameragear.com/pages/new-to-rhino-learn-the-system
dc.relation.references[Ding et al.(2010)Ding, McCloskey, and Yu]Y. Ding, S. McCloskey, and J. Yu, Analysis of motion blur with a flutter shutter camera for non-linear motion, in European Conference on Computer Vision. Springer, 2010, pp. 15–30.
dc.relation.references[Dwicahya et al.(2018b)Dwicahya, Ramadijanti, and Basuki]J. A. Dwicahya, N. Ramadijanti, and A. Basuki, Moving object velocity detection based on motion blur on photos using gray level, in 2018 International Electronics Symposium on Knowledge Creation and Intelligent Computing (IES-KCIC). IEEE, 2018, pp. 192–198.
dc.relation.references[Beggs(1983)]J. S. Beggs, Kinematics. CRC Press, 1983.
dc.relation.references[Schmid et al.(2000)Schmid, Lazos, et al.]W. A. Schmid, R. Lazos et al., Guía para estimar la incertidumbre de la medición, Centro nacional de Metrología (Abril 2004), 2000.
dc.relation.references[Cortes-Osorio et al.(2020)Cortes-Osorio, Muñoz Acosta, and López Robayo]J. A. Cortes-Osorio, D. A. Muñoz Acosta, and C. D. López Robayo, Diseño y construcción de un riel electromecánico para el estudio de la cinemática de imágenes con difuminación lineal uniforme, INGE CUC, vol. 16, no. 1, pp. 1–11, Jan. 2020.
dc.rights.accessrightsinfo:eu-repo/semantics/openAccess
dc.subject.proposalAceleración
dc.subject.proposalAcceleration
dc.subject.proposalKinematic quantities
dc.subject.proposalCantidades cinemáticas
dc.subject.proposalDCT
dc.subject.proposalDCT
dc.subject.proposalVelocity
dc.subject.proposalDesenfoque por movimiento
dc.subject.proposalVision-based measurement
dc.subject.proposalVelocidad
dc.subject.proposalMedida basada en visión
dc.subject.proposalMotion blur
dc.type.coarhttp://purl.org/coar/resource_type/c_1843
dc.type.coarversionhttp://purl.org/coar/version/c_ab4af688f83e57aa
dc.type.contentText
oaire.accessrightshttp://purl.org/coar/access_right/c_abf2

