Efficient Non-Parametric Neural Density Estimation and Its Application to Outlier and Anomaly Detection

dc.contributor.advisorGonzález, Fabio A.
dc.contributor.authorGallego-Mejia, Joseph A.
dc.contributor.googlescholarhttps://scholar.google.cl/citations?user=DS0IfX4AAAAJ&hl=es&oi=aospa
dc.contributor.orcidGallego-Mejia, Joseph A. [0000-0001-8971-4998]spa
dc.contributor.researchgatehttps://www.researchgate.net/profile/Joseph-Gallego-Mejiaspa
dc.contributor.researchgroupMindlabspa
dc.date.accessioned2023-10-06T20:45:17Z
dc.date.available2023-10-06T20:45:17Z
dc.date.issued2023-10-05
dc.descriptionilustraciones, diagramasspa
dc.description.abstractThe main goal of this thesis is to propose efficient non-parametric density estimation methods that can be integrated with deep learning architectures, for instance, convolutional neural networks and transformers. A recent approach to non-parametric density estimation is neural density estimation. One advantage of these methods is that they can be integrated with deep learning architectures and trained using gradient descent. Most of these methods are based on neural-network implementations of normalizing flows, which transform a simple original distribution into a more complex one. The approach of this thesis is based on a different idea that combines random Fourier features with density matrices to estimate the underlying distribution function. The method can be seen as an approximation of the popular kernel density estimation method, but without its inherent computational cost. Density estimation methods can be applied to different problems in statistics and machine learning. They can be used for tasks such as anomaly detection, generative modeling, semi-supervised learning, compression, and text-to-speech, among others. This thesis explores the application of the method in anomaly and outlier detection tasks such as medical anomaly detection, fraud detection, video surveillance, time series anomaly detection, and industrial damage detection, among others. (Texto tomado de la fuente)eng
dc.description.abstractEl objetivo principal de esta tesis es proponer métodos eficientes de estimación de densidad no paramétrica que puedan integrarse con arquitecturas de aprendizaje profundo, por ejemplo, redes neuronales convolucionales y transformadores. Una aproximación reciente a la estimación no paramétrica de la densidad es la estimación de la densidad usando redes neuronales. Una de las ventajas de estos métodos es que pueden integrarse con arquitecturas de aprendizaje profundo y entrenarse mediante gradiente descendente. La mayoría de estos métodos se basan en implementaciones de redes neuronales de flujos de normalización que transforman una distribución original más simple en una más compleja. El enfoque de esta tesis se basa en una idea diferente que combina características aleatorias de Fourier con matrices de densidad para estimar la función de distribución subyacente. El método puede considerarse una aproximación al popular método kernel density estimation, pero sin el coste computacional inherente. Los métodos de estimación de la densidad pueden aplicarse a diferentes problemas en estadística y aprendizaje automático. Pueden ser utilizados para resolver tareas como la detección de anomalías, modelos generativos, aprendizaje semi-supervisado, compresión, texto a habla, entre otros. El presente trabajo se centra principalmente en la aplicación del método en tareas de detección de anomalías y valores atípicos como la detección de anomalías médicas, la detección de fraudes, la videovigilancia, la detección de anomalías en series temporales, la detección de daños industriales, entre otras.spa
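The abstract's core idea (combining random Fourier features with a density matrix to approximate kernel density estimation) can be sketched in a few lines of NumPy. This is an illustrative sketch under assumed choices (Gaussian kernel, normalized feature vectors, toy parameters); the function names and constants are hypothetical and do not reflect the thesis's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_map(X, W, b):
    """Random Fourier feature map approximating a Gaussian (RBF) kernel,
    with each feature vector normalized to unit length."""
    D = W.shape[1]
    Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)
    return Z / np.linalg.norm(Z, axis=1, keepdims=True)

def fit_density_matrix(X_train, W, b):
    """Summarize the training set as the average outer product of its
    feature vectors: rho = mean_i phi(x_i) phi(x_i)^T."""
    Phi = rff_map(X_train, W, b)
    return Phi.T @ Phi / Phi.shape[0]

def density_score(X, rho, W, b):
    """Unnormalized density estimate f(x) proportional to phi(x)^T rho phi(x)."""
    Phi = rff_map(X, W, b)
    return np.einsum('id,de,ie->i', Phi, rho, Phi)

# Toy data: points clustered near the origin, plus one obvious outlier query.
X_train = rng.normal(0.0, 1.0, size=(500, 2))
d, D = 2, 256                                  # input dim, number of random features
gamma = 0.5                                    # bandwidth of the implied Gaussian kernel
W = rng.normal(0.0, np.sqrt(2 * gamma), size=(d, D))
b = rng.uniform(0, 2 * np.pi, size=D)

rho = fit_density_matrix(X_train, W, b)
scores = density_score(np.array([[0.0, 0.0], [8.0, 8.0]]), rho, W, b)
# The inlier at the origin should score much higher than the far-away point.
print(scores[0] > scores[1])
```

After the one-off fit, scoring a new point costs O(D^2) independently of the training-set size, which illustrates the efficiency claim over classical kernel density estimation, whose cost per query grows linearly with the number of training points.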
dc.description.degreelevelDoctoradospa
dc.description.degreenameDoctor en Ingenieríaspa
dc.description.researchareaMachine Learningspa
dc.format.extentxv, 113 páginasspa
dc.format.mimetypeapplication/pdfspa
dc.identifier.instnameUniversidad Nacional de Colombiaspa
dc.identifier.reponameRepositorio Institucional Universidad Nacional de Colombiaspa
dc.identifier.repourlhttps://repositorio.unal.edu.co/spa
dc.identifier.urihttps://repositorio.unal.edu.co/handle/unal/84781
dc.language.isoengspa
dc.publisherUniversidad Nacional de Colombiaspa
dc.publisher.branchUniversidad Nacional de Colombia - Sede Bogotáspa
dc.publisher.facultyFacultad de Ingenieríaspa
dc.publisher.placeBogotá, Colombiaspa
dc.publisher.programBogotá - Ingeniería - Doctorado en Ingeniería - Sistemas y Computaciónspa
dc.relation.referencesKdd cup dataset, 1999. http://kdd.ics.uci.edu/databases/kddcup99/kddcup99.html.spa
dc.relation.referencesJerone T. A. Andrews, Edward J Morton, and Lewis D Griffin. Detecting anomalous data using auto-encoders. International Journal of Machine Learning and Computing, 2016.spa
dc.relation.referencesCharu C Aggarwal. Outlier Analysis, second edition, 2016.spa
dc.relation.referencesMohiuddin Ahmed, Abdun Naser Mahmood, and Md Rafiqul Islam. A survey of anomaly detection techniques in financial domain. Future Generation Computer Systems, 55:278–288, 2016spa
dc.relation.referencesJinwon An and Sungzoon Cho. Variational autoencoder based anomaly detection using reconstruction probability. Special Lecture on IE, 2(1):1–18, 2015spa
dc.relation.referencesTessa K. Anderson. Kernel density estimation and K-means clustering to profile road accident hotspots. Accident Analysis and Prevention, 41(3):359–364, may 2009. ISSN 00014575. doi: 10.1016/j.aap.2008.12.014.spa
dc.relation.referencesJerone T A Andrews, Thomas Tanay, Edward J Morton, and Lewis D Griffin. Transfer representation-learning for anomaly detection, 2016. URL http://www.vlfeat.org/matconvnet.spa
dc.relation.referencesFabrizio Angiulli and Fabio Fassetti. Detecting distance-based outliers in streams of data. pages 811–820, 2007. ISBN 9781595938039. doi: 10.1145/1321440.1321552.spa
dc.relation.referencesHaim Avron, Vikas Sindhwani, Jiyan Yang, and Michael W. Mahoney. Quasi-monte carlo feature maps for shift-invariant kernels. Journal of Machine Learning Research, 17(120):1–38, 2016. URL http://jmlr.org/papers/v17/14-538.html.spa
dc.relation.referencesFrancis R Bach and Michael I Jordan. Predictive low-rank decomposition for kernel methods. In ICML 2005 - Proceedings of the 22nd International Conference on Machine Learning, pages 33–40, 2005. ISBN 1595931805. doi: 10.1145/1102351.1102356spa
dc.relation.referencesArturs Backurs, Piotr Indyk, and Tal Wagner. Space and time efficient kernel density estimation in high dimensions. In Advances in Neural Information Processing Systems, volume 32, 2019spa
dc.relation.referencesSivaraman Balakrishnan, Srivatsan Narayanan, Alessandro Rinaldo, Aarti Singh, and Larry Wasserman. Cluster trees on manifolds. arXiv preprint arXiv:1307.6515, 2013.spa
dc.relation.referencesDavid M Bashtannyk and Rob J Hyndman. Bandwidth selection for kernel conditional density estimation, 2001. URL www.elsevier.com/locate/csda.spa
dc.relation.referencesYoshua Bengio and Samy Bengio. Modeling high-dimensional discrete data with multilayer neural networks. Advances in Neural Information Processing Systems, 12, 1999.spa
dc.relation.referencesSiddharth Bhatia, Arjit Jain, Pan Li, Ritesh Kumar, and Bryan Hooi. Mstream: Fast anomaly detection in multi-aspect streams. pages 3371–3382. Association for Computing Machinery, Inc, 4 2021. ISBN 9781450383127. doi: 10.1145/3442381.3450023spa
dc.relation.referencesSiddharth Bhatia, Arjit Jain, Shivin Srivastava, Kenji Kawaguchi, and Bryan Hooi. Memstream: Memory-based streaming anomaly detection. WWW 2022 - Proceedings of the ACM Web Conference 2022, pages 610–621, 2022. doi: 10.1145/3485447.3512221.spa
dc.relation.referencesPeter J Bickel and Kjell A Doksum. Mathematical statistics: basic ideas and selected topics, volumes I-II package. CRC Press, 2015spa
dc.relation.referencesChristopher M Bishop and Nasser M Nasrabadi. Pattern recognition and machine learning, volume 4. Springer, 2006spa
dc.relation.referencesGiuseppe Borruso. Network Density Estimation: A GIS Approach for Analysing Point Patterns in a Network Space. Transactions in GIS, 12(3):377–402, 2008. ISSN 1361-1682spa
dc.relation.referencesMarkus M Breunig, Hans-Peter Kriegel, Raymond T Ng, and Jörg Sander. Lof: identifying density-based local outliers. In Proceedings of the 2000 ACM SIGMOD international conference on Management of data, pages 93–104, 2000.spa
dc.relation.referencesBrian Bullins, Cyril Zhang, and Yi Zhang. Not-so-random features. 10 2017. URL http://arxiv.org/abs/1710.10230spa
dc.relation.referencesOscar Bustos-Brinez, Joseph Gallego-Mejia, and Fabio A González. AD-DMKDE: Anomaly detection through density matrices and fourier features. arXiv preprint arXiv:2210.14796, 2022.spa
dc.relation.referencesRaghavendra Chalapathy and Sanjay Chawla. Deep learning for anomaly detection: A survey. 1 2019. URL http://arxiv.org/abs/1901.03407.spa
dc.relation.referencesRaghavendra Chalapathy, Aditya Krishna Menon, and Sanjay Chawla. Anomaly detection using one-class neural networks. arXiv preprint arXiv:1802.06360, 2018.spa
dc.relation.referencesVarun Chandola, Arindam Banerjee, and Vipin Kumar. Anomaly detection: A survey. ACM Computing Surveys, 41(3), 2009spa
dc.relation.referencesMoses Charikar and Paris Siminelakis. Hashing-based-estimators for kernel density in high dimensions. In 2017 IEEE 58th Annual Symposium on Foundations of Computer Science (FOCS), pages 1032–1043. IEEE, 2017.spa
dc.relation.referencesSotirios P. Chatzis, Dimitrios Korkinof, and Yiannis Demiris. A quantum-statistical approach toward robot learning by demonstration. IEEE Transactions on Robotics, 28:1371–1381, 2012. ISSN 15523098. doi: 10.1109/TRO.2012.2203055.spa
dc.relation.referencesFrédéric Chazal, Brittany Fasy, Fabrizio Lecci, Bertrand Michel, Alessandro Rinaldo, and Larry Wasserman. Robust topological inference: Distance to a measure and kernel distance. The Journal of Machine Learning Research, 18(1):5845–5884, 2017spa
dc.relation.referencesGal Chechik, Varun Sharma, Uri Shalit, and Samy Bengio. Large Scale Online Learning of Image Similarity Through Ranking. Technical report, 2010.spa
dc.relation.referencesYen Chi Chen. A tutorial on kernel density estimation and recent advances. Biostatistics and Epidemiology, 1:161–187, 1 2017. ISSN 24709379. doi: 10.1080/24709360.2017.1396742spa
dc.relation.referencesYen-Chi Chen, Christopher R Genovese, Larry Wasserman, et al. A comprehensive approach to mode clustering. Electronic Journal of Statistics, 10(1):210–241, 2016spa
dc.relation.referencesPeitao Cheng, Yuanying Qiu, Xiumei Wang, and Ke Zhao. A New Single Image Super-Resolution Method Based on the Infinite Mixture Model. IEEE Access, 5:2228–2240, 2017. ISSN 21693536. doi: 10.1109/ACCESS.2017.2664103.spa
dc.relation.referencesVictor Chernozhukov, Denis Chetverikov, Kengo Kato, et al. Gaussian approximation of suprema of empirical processes. Annals of Statistics, 42(4):1564–1597, 2014spa
dc.relation.referencesKrzysztof Choromanski and Vikas Sindhwani. Recycling randomness with structure for sublinear time kernel expansions. 5 2016. URL http://arxiv.org/abs/1605.09049spa
dc.relation.referencesJohn Shawe-Taylor and Nello Cristianini. Kernel Methods for Pattern Analysis. Cambridge University Press, 2004.spa
dc.relation.referencesTri Dao, Christopher De Sa, and Christopher Ré. Gaussian quadrature for kernel features. Advances in neural information processing systems, 30:6109, 2017.spa
dc.relation.referencesPhilip I Davies and Nicholas J Higham. Numerically stable generation of correlation matrices and their factors. BIT Numerical Mathematics, 40(4):640–651, 2000.spa
dc.relation.referencesArthur P Dempster, Nan M Laird, and Donald B Rubin. Maximum likelihood from incomplete data via the em algorithm. Journal of the royal statistical society: series B (methodological), 39(1):1–22, 1977.spa
dc.relation.referencesB Denkena, M-A Dittrich, H Noske, and M Witt. Statistical approaches for semisupervised anomaly detection in machining. Production Engineering, 14(3):385–393, 2020spa
dc.relation.referencesLuc Devroye and László Györfi. Nonparametric Density Estimation: The L1 View. Wiley, 1985.spa
dc.relation.referencesAsimenia Dimokranitou. Adversarial autoencoders for anomalous event detection in images. PhD thesis, Purdue University, 2017spa
dc.relation.referencesZhiguo Ding and Minrui Fei. An anomaly detection approach based on isolation forest algorithm for streaming data using sliding window. volume 3, pages 12–17. IFAC Secretariat, 2013. ISBN 9783902823458. doi: 10.3182/20130902-3-CN-3020.00044.spa
dc.relation.referencesLaurent Dinh, David Krueger, and Yoshua Bengio. Nice: Non-linear independent components estimation. arXiv preprint arXiv:1410.8516, 2014.spa
dc.relation.referencesLaurent Dinh, Jascha Sohl-Dickstein, and Samy Bengio. Density estimation using real nvp. 5 2016. URL http://arxiv.org/abs/1605.08803.spa
dc.relation.referencesJoni A Downs. Time-geographic density estimation for moving point objects, 2010.spa
dc.relation.referencesConor Durkan, Artur Bekasov, Iain Murray, and George Papamakarios. Neural spline flows. In Advances in Neural Information Processing Systems, volume 32, 2019spa
dc.relation.referencesVincent Dutordoir, Hugh Salimbeni, Marc Peter Deisenroth, and James Hensman. Gaussian process conditional density estimation. volume 2018-December, pages 2385–2395, 2018.spa
dc.relation.referencesBradley Efron. Bootstrap methods: another look at the jackknife. In Breakthroughs in statistics, pages 569–593. Springer, 1992.spa
dc.relation.referencesGilberto Fernandes, Joel JPC Rodrigues, Luiz Fernando Carvalho, Jalal F Al-Muhtadi, and Mario Lemes Proença. A comprehensive survey on network anomaly detection. Telecommunication Systems, 70(3):447–489, 2019.spa
dc.relation.referencesChris Fraley and Adrian E. Raftery. Model-based clustering, discriminant analysis, and density estimation. Journal of the American Statistical Association, 97:611–631, 2002. ISSN 01621459. doi: 10.1198/016214502760047131.spa
dc.relation.referencesBrendan J Frey. Graphical models for machine learning and digital communication. MIT press, 1998spa
dc.relation.referencesJoseph A. Gallego and Fabio A. González. Quantum adaptive fourier features for neural density estimation, 2022. URL https://arxiv.org/abs/2208.00564.spa
dc.relation.referencesJoseph A Gallego, Fabio A González, and Olfa Nasraoui. Robust kernels for robust location estimation. Neurocomputing, 429:174–186, 2021spa
dc.relation.referencesJoseph A Gallego, Juan F Osorio, and Fabio A González. Fast kernel density estimation with density matrices and random fourier features. In Advances in Artificial Intelligence–IBERAMIA 2022: 17th Ibero-American Conference on AI, Cartagena de Indias, Colombia, November 23–25, 2022, Proceedings, pages 160–172. Springer, 2023spa
dc.relation.referencesJoseph Gallego-Mejia, Oscar Bustos-Brinez, and Fabio A González. LEAN-DMKDE: Quantum latent density estimation for anomaly detection. arXiv preprint arXiv:2211.08525, 2022spa
dc.relation.referencesJoseph A. Gallego-Mejia and Fabio A Gonzalez. Demande dataset, April 2023. URL https://doi.org/10.5281/zenodo.7822851spa
dc.relation.referencesJoseph A Gallego-Mejia, Oscar A Bustos-Brinez, and Fabio A González. InQMAD: Incremental quantum measurement anomaly detection. In 2022 IEEE International Conference on Data Mining Workshops (ICDMW), pages 787–796. IEEE, 2022spa
dc.relation.referencesWeihao Gao, Sreeram Kannan, Sewoong Oh, and Pramod Viswanath. Estimating mutual information for discrete-continuous mixtures. Advances in neural information processing systems, 30, 2017spa
dc.relation.referencesIoannis Gatopoulos, Maarten Stol, and Jakub M. Tomczak. Super-resolution Variational Auto-Encoders. jun 2020. URL http://arxiv.org/abs/2006.05218.spa
dc.relation.referencesChristopher R Genovese, Marco Perone-Pacifico, Isabella Verdinelli, Larry Wasserman, et al. Nonparametric ridge estimation. Annals of Statistics, 42(4):1511–1545, 2014spa
dc.relation.referencesMathieu Germain, Karol Gregor, Iain Murray, and Hugo Larochelle. Made: Masked autoencoder for distribution estimation, 2015.spa
dc.relation.referencesKaan Gokcesu and Suleyman S. Kozat. Online density estimation of nonstationary sources using exponential family of distributions. IEEE Transactions on Neural Networks and Learning Systems, 29:4473–4478, 9 2018. ISSN 21622388. doi: 10.1109/TNNLS.2017.2740003.spa
dc.relation.referencesFabio A. González, Vladimir Vargas-Calderón, and Herbert Vinck-Posada. Supervised Learning with Quantum Measurements. 2020. URL http://arxiv.org/abs/2004.01227spa
dc.relation.referencesFabio A González, Vladimir Vargas-Calderón, and Herbert Vinck-Posada. Classification with quantum measurements. Journal of the Physical Society of Japan, 90(4):044002, 2021.spa
dc.relation.referencesFabio A. González, Alejandro Gallego, Santiago Toledo-Cortés, and Vladimir Vargas-Calderón. Learning with density matrices and random features, 2021.spa
dc.relation.referencesArtur Gramacki. Nonparametric kernel density estimation and its computational aspects. Springer, 2018.spa
dc.relation.referencesAlexander G Gray and Andrew W Moore. Nonparametric density estimation: Toward computational tractability. In Proceedings of the 2003 SIAM International Conference on Data Mining, pages 203–211. SIAM, 2003spa
dc.relation.referencesClaudio Guarnaccia, Michele Grimaldi, Gabriella Graziuso, and Simona Mancini. Crowdsourcing noise maps analysis by means of kernel density estimation. pages 1691–1697, 2021. doi: 10.48465/fa.2020.0505. URL https://hal.archives-ouvertes.fr/hal-03233732.spa
dc.relation.referencesSudipto Guha, Nina Mishra, Gourav Roy, and Okke Schrijvers. Robust random cut forest based anomaly detection on streams, 2016.spa
dc.relation.referencesSahand Hariri, Matias Carrasco Kind, and Robert J. Brunner. Extended isolation forest. 11 2018. doi: 10.1109/TKDE.2019.2947676. URL http://arxiv.org/abs/1811.02141.spa
dc.relation.referencesTrevor Hastie, Robert Tibshirani, Jerome H Friedman, and Jerome H Friedman. The elements of statistical learning: data mining, inference, and prediction, volume 2. Springer, 2009spa
dc.relation.referencesMarti A. Hearst, Susan T Dumais, Edgar Osuna, John Platt, and Bernhard Scholkopf. Support vector machines. IEEE Intelligent Systems and their applications, 13(4):18–28, 1998.spa
dc.relation.referencesGeoffrey E Hinton, Simon Osindero, and Yee-Whye Teh. A fast learning algorithm for deep belief nets. Neural computation, 18(7):1527–1554, 2006.spa
dc.relation.referencesIman Sharafaldin, Arash Habibi Lashkari, and Ali A. Ghorbani. Toward generating a new intrusion detection dataset and intrusion traffic characterization, 2018.spa
dc.relation.referencesFélix Iglesias and Tanja Zseby. Analysis of network traffic features for anomaly detection. Machine Learning, 101:59–84, 10 2015. ISSN 15730565. doi: 10.1007/s10994-014-5473-9.spa
dc.relation.referencesPiotr Indyk and Rajeev Motwani. Approximate nearest neighbors: Towards removing the curse of dimensionality, 1998.spa
dc.relation.referencesMarko V Jankovic. Probabilistic Approach to Neural Networks Computation Based on Quantum Probability Model Probabilistic Principal Subspace Analysis Example. 2010. URL http://arxiv.org/abs/1001.4301.spa
dc.relation.referencesMarko V. Jankovic. Quantum Low Entropy based Associative Reasoning – QLEAR Learning, 2017. ISSN 23318422spa
dc.relation.referencesMarko V. Jankovic and Masashi Sugiyama. Probabilistic principal component analysis based on joystick probability selector. In Proceedings of the International Joint Conference on Neural Networks, pages 1414–1421. IEEE, 2009. ISBN 9781424435531. doi: 10.1109/IJCNN.2009.5178696.spa
dc.relation.referencesMarko V. Janković, Tomislav Gajić, and Branimir D. Reljin. Applications of probabilistic model based on main quantum mechanics concepts. In 12th Symposium on Neural Network Applications in Electrical Engineering, NEUREL 2014 - Proceedings, pages 33–36, 2014. ISBN 9781479958887. doi: 10.1109/NEUREL.2014.7011453.spa
dc.relation.referencesJ.H.M. Janssens, F. Huszar, E.O. Postma, and H.J. van den Herik. Stochastic outlier selection, 2012.spa
dc.relation.referencesPing Ji, Na Zhao, Shijie Hao, and Jianguo Jiang. Automatic image annotation by semisupervised manifold kernel density estimation. Information Sciences, 281:648–660, 10 2014. ISSN 00200255. doi: 10.1016/j.ins.2013.09.016spa
dc.relation.referencesJoagg. Joaggi/demande: v1.0, March 2023. URL https://doi.org/10.5281/zenodo.7709634.spa
dc.relation.referencesMichael I Jordan and Tom M Mitchell. Machine learning: Trends, perspectives, and prospects. Science, 349(6245):255–260, 2015.spa
dc.relation.referencesVilen Jumutc and Johan A.K. Suykens. Multi-class supervised novelty detection. IEEE Transactions on Pattern Analysis and Machine Intelligence, 36:2510–2523, 12 2014. ISSN 01628828. doi: 10.1109/TPAMI.2014.2327984.spa
dc.relation.referencesFiruz Kamalov. Kernel density estimation based sampling for imbalanced class distribution. Information Sciences, 512:1192–1201, feb 2020. ISSN 00200255. doi: 10.1016/j.ins.2019.10.017.spa
dc.relation.referencesSangwook Kim, Yonghwa Choi, and Minho Lee. Deep learning with support vector data description. Neurocomputing, 165:111–117, 10 2015. ISSN 18728286. doi: 10.1016/j.neucom.2014.09.086spa
dc.relation.referencesDiederik P Kingma and Max Welling. Auto-encoding variational bayes, 2014.spa
dc.relation.referencesDiederik P Kingma, Tim Salimans, Rafal Jozefowicz, Xi Chen, Ilya Sutskever, and Max Welling. Improved variational inference with inverse autoregressive flow. In Advances in Neural Information Processing Systems, pages 4743–4751, 2016.spa
dc.relation.referencesB. Ravi Kiran, Dilip Mathew Thomas, and Ranjith Parakkal. An overview of deep learning based methods for unsupervised and semi-supervised anomaly detection in videos, 2018. ISSN 2313433X.spa
dc.relation.referencesMatej Kristan, Aleš Leonardis, and Danijel Skočaj. Multivariate online kernel density estimation with gaussian kernels. Pattern Recognition, 44:2630–2642, 10 2011. ISSN 00313203. doi: 10.1016/j.patcog.2011.03.019. URL https://linkinghub.elsevier.com/retrieve/pii/S0031320311001233.spa
dc.relation.referencesDonghwoon Kwon, Hyunjoo Kim, Jinoh Kim, Sang C Suh, Ikkyun Kim, and Kuinam J Kim. A survey of deep learning-based network anomaly detection. Cluster Computing, 22(1):949–961, 2019.spa
dc.relation.referencesHugo Larochelle and Iain Murray. The neural autoregressive distribution estimator. In Proceedings of the fourteenth international conference on artificial intelligence and statistics, pages 29–37. JMLR Workshop and Conference Proceedings, 2011.spa
dc.relation.referencesLongin Jan Latecki, Aleksandar Lazarevic, and Dragoljub Pokrajac. Outlier detection with kernel density functions. In International Workshop on Machine Learning and Data Mining in Pattern Recognition, pages 61–75. Springer, 2007.spa
dc.relation.referencesQuoc Le, Tamas Sarlos, and Alex Smola. Fastfood - approximating kernel expansions in loglinear time. In 30th International Conference on Machine Learning (ICML), 2013. URL http://jmlr.org/proceedings/papers/v28/le13.html.spa
dc.relation.referencesYann LeCun, Bernhard Boser, John S Denker, Donnie Henderson, Richard E Howard, Wayne Hubbard, and Lawrence D Jackel. Backpropagation applied to handwritten zip code recognition. Neural computation, 1(4):541–551, 1989spa
dc.relation.referencesYann LeCun, Yoshua Bengio, and Geoffrey Hinton. Deep learning. nature, 521(7553): 436–444, 2015.spa
dc.relation.referencesJeisung Lee and Mignon Park. An adaptive background subtraction method based on kernel density estimation. Sensors (Switzerland), 12:12279–12300, 9 2012. ISSN 14248220. doi: 10.3390/s120912279spa
dc.relation.referencesJonathan Li and Andrew Barron. Mixture density estimation. Advances in neural information processing systems, 12, 1999spa
dc.relation.referencesKun-Lun Li, Hou-Kuan Huang, Sheng-Feng Tian, and Wei Xu. Improving one-class svm for anomaly detection. In Proceedings of the 2003 International Conference on Machine Learning and Cybernetics (IEEE Cat. No. 03EX693), volume 5, pages 3077–3081. IEEE, 2003.spa
dc.relation.referencesYanjun Li, Kai Zhang, Jun Wang, and Sanjiv Kumar. Learning adaptive random features. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 33, pages 4229–4236, 2019spa
dc.relation.referencesZheng Li, Yue Zhao, Nicola Botta, Cezar Ionescu, and Xiyang Hu. Copod: Copula-based outlier detection. pages 1118–1123, 11 2020.spa
dc.relation.referencesZhu Li, Jean-Francois Ton, Dino Oglic, and Dino Sejdinovic. Towards a unified analysis of random fourier features. In International Conference on Machine Learning, pages 3905–3914. PMLR, 2019.spa
dc.relation.referencesZilong Lin, Yong Shi, and Zhi Xue. Idsgan: Generative adversarial networks for attack generation against intrusion detection. 9 2018. URL http://arxiv.org/abs/1809.02077spa
dc.relation.referencesFang Liu, Yanwei Yu, Peng Song, Yangyang Fan, and Xiangrong Tong. Scalable KDE-based top-N local outlier detection over large-scale data streams. Knowledge-Based Systems, 204:106186, 2020. ISSN 0950-7051. doi: https://doi.org/10.1016/j.knosys.2020.106186. URL https://www.sciencedirect.com/science/article/pii/S0950705120304159spa
dc.relation.referencesFanghui Liu, Xiaolin Huang, Yudong Chen, and Johan A. K. Suykens. Random Features for Kernel Approximation: A Survey on Algorithms, Theory, and Beyond. apr 2020.spa
dc.relation.referencesFanghui Liu, Xiaolin Huang, Yudong Chen, Jie Yang, and Johan Suykens. Random fourier features via fast surrogate leverage weighted sampling. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 34, pages 4844–4851, 2020.spa
dc.relation.referencesFei Tony Liu, Kai Ming Ting, and Zhi-Hua Zhou. Isolation forest. In 2008 eighth ieee international conference on data mining, pages 413–422. IEEE, 2008spa
dc.relation.referencesQiao Liu, Jiaze Xu, Rui Jiang, and Wing Hung Wong. Density estimation using deep generative neural networks. Proceedings of the National Academy of Sciences, 118(15), 2021.spa
dc.relation.referencesPeng Lv, Yanwei Yu, Yangyang Fan, Xianfeng Tang, and Xiangrong Tong. Layer-constrained variational autoencoding kernel density estimation model for anomaly detection. Knowledge-Based Systems, 196, 5 2020. ISSN 09507051. doi: 10.1016/j.knosys.2020.105753.spa
dc.relation.referencesYueming Lyu. Spherical structured feature maps for kernel approximation, 2017spa
dc.relation.referencesLarry M Manevitz and Malik Yousef. One-class svms for document classification. Journal of machine Learning research, 2(Dec):139–154, 2001.spa
dc.relation.referencesEmaad Manzoor, Hemank Lamba, and Leman Akoglu. xStream: Outlier detection in feature-evolving data streams. pages 1963–1972. Association for Computing Machinery, 7 2018. ISBN 9781450355520. doi: 10.1145/3219819.3220107.spa
dc.relation.referencesWilliam B March, Bo Xiao, and George Biros. Askit: Approximate skeletonization kernel-independent treecode in high dimensions. SIAM Journal on Scientific Computing, 37(2):A1089–A1110, 2015.spa
dc.relation.referencesLuis Martí, Nayat Sanchez-Pi, José Manuel Molina, and Ana Cristina Bicharra Garcia. Anomaly detection based on sensor data in petroleum industry applications. Sensors (Switzerland), 15:2774–2797, 1 2015. ISSN 14248220. doi: 10.3390/s150202774.spa
dc.relation.referencesBarbara J McNeil and James A Hanley. Statistical approaches to the analysis of receiver operating characteristic (roc) curves. Medical decision making, 4(2):137–150, 1984.spa
dc.relation.referencesSeonwoo Min, Byunghan Lee, and Sungroh Yoon. Deep learning in bioinformatics. Briefings in Bioinformatics, 18(5):851–869, 2017. ISSN 14774054. doi: 10.1093/bib/bbw068spa
dc.relation.referencesTom Minka et al. Divergence measures and message passing. Technical report, Citeseer, 2005spa
dc.relation.referencesYisroel Mirsky, Tomer Doitshman, Yuval Elovici, and Asaf Shabtai. Kitsune: An ensemble of autoencoders for online network intrusion detection, 2 2018. ISSN 23318422.spa
dc.relation.referencesNour Moustafa and Jill Slay. Unsw-nb15: a comprehensive data set for network intrusion detection systems (unsw-nb15 network data set), 2015.spa
dc.relation.referencesMarina Munkhoeva, Yermek Kapushev, Evgeny Burnaev, and Ivan Oseledets. Quadrature-based features for kernel approximation. 2 2018. URL http://arxiv.org/abs/1802.03832.spa
dc.relation.referencesKevin P Murphy. Machine learning: a probabilistic perspective. MIT press, 2012.spa
dc.relation.referencesGyoung S. Na, Donghyun Kim, and Hwanjo Yu. Dilof: Effective and memory efficient local outlier detection in data streams. pages 1993–2002. Association for Computing Machinery, 7 2018. ISBN 9781450355520. doi: 10.1145/3219819.3220022spa
dc.relation.referencesBenjamin Nachman and David Shih. Anomaly detection with density estimation. Physical Review D, 101(7):075042, 2020.spa
dc.relation.referencesElizbar A Nadaraya. Some new estimates for distribution functions. Theory of Probability & Its Applications, 9(3):497–500, 1964.spa
dc.relation.referencesTomoki Nakaya and Keiji Yano. Visualising crime clusters in a space-time cube: An exploratory data-analysis approach using space-time kernel density estimation and scan statistics. Transactions in GIS, 14:223–239, 6 2010. ISSN 13611682. doi: 10.1111/j.1467-9671.2010.01194.xspa
dc.relation.referencesHD Nguyen, Kim Phuc Tran, S´ebastien Thomassey, and Moez Hamad. Forecasting and anomaly detection approaches using lstm and lstm autoencoder techniques with the applications in supply chain management. International Journal of Information Management, 57:102282, 2021spa
dc.relation.referencesHien D. Nguyen, Dianhui Wang, and Geoffrey J. McLachlan. Randomized mixture models for probability density approximation and estimation. Information Sciences, 467:135–148, 10 2018. ISSN 00200255. doi: 10.1016/j.ins.2018.07.056.spa
dc.relation.referencesGuansong Pang, Charu Aggarwal, Chunhua Shen, and Nicu Sebe. Editorial deep learning for anomaly detection. IEEE Transactions on Neural Networks and Learning Systems, 33:2282–2286, 6 2022. ISSN 2162-237X. doi: 10.1109/TNNLS.2022.3162123. URL https://ieeexplore.ieee.org/document/9786561/spa
dc.relation.referencesGeorge Papamakarios, Theo Pavlakou, and Iain Murray. Masked autoregressive flow for density estimation. 5 2017. URL http://arxiv.org/abs/1705.07057.spa
dc.relation.referencesEmanuel Parzen. On estimation of a probability density function and mode. The annals of mathematical statistics, 33(3):1065–1076, 1962.spa
dc.relation.referencesF. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, M. Blondel, P. Prettenhofer, R. Weiss, V. Dubourg, J. Vanderplas, A. Passos, D. Cournapeau, M. Brucher, M. Perrot, and E. Duchesnay. Scikit-learn: Machine learning in Python. Journal of Machine Learning Research, 12:2825–2830, 2011.spa
dc.relation.referencesKainan Peng, Wei Ping, Zhao Song, and Kexin Zhao. Non-autoregressive neural text-to-speech. 5 2019. URL http://arxiv.org/abs/1905.08459spa
dc.relation.referencesTomáš Pevný. Loda: Lightweight on-line detector of anomalies. Machine Learning, 102:275–304, 2 2016. ISSN 15730565. doi: 10.1007/s10994-015-5521-0spa
dc.relation.referencesMarco A.F. Pimentel, David A. Clifton, Lei Clifton, and Lionel Tarassenko. A review of novelty detection. Signal Processing, 99:215–249, 6 2014. ISSN 01651684. doi: 10.1016/j.sigpro.2013.12.026spa
dc.relation.referencesAdrian Alan Pol, Victor Berger, Cecile Germain, Gianluca Cerminara, and Maurizio Pierini. Anomaly detection with conditional variational autoencoders. In 2019 18th IEEE International Conference On Machine Learning And Applications (ICMLA), pages 1651–1657, 2019. doi: 10.1109/ICMLA.2019.00270.spa
dc.relation.referencesRimpal Popat and Jayesh Chaudhary. A survey on credit card fraud detection using machine learning. 2018spa
dc.relation.referencesAli Rahimi and Benjamin Recht. Random features for large-scale kernel machines. In Proceedings of the 20th International Conference on Neural Information Processing Systems, NIPS’07, pages 1177–1184, Red Hook, NY, USA, 2007. Curran Associates Inc. ISBN 9781605603520.spa
dc.relation.referencesSridhar Ramaswamy, Rajeev Rastogi, and Kyuseok Shim. Efficient algorithms for mining outliers from large data sets. pages 427–438. Association for Computing Machinery, 2000. ISBN 1581132174.spa
dc.relation.referencesDaniel Ramotsoela, Adnan Abu-Mahfouz, and Gerhard Hancke. A survey of anomaly detection in industrial wireless sensor networks with critical water system infrastructure as a case study. Sensors (Switzerland), 18, 8 2018. ISSN 14248220. doi: 10.3390/s18082491spa
dc.relation.referencesCarl Edward Rasmussen. Gaussian processes in machine learning. In Summer school on machine learning, pages 63–71. Springer, 2003spa
dc.relation.referencesShebuti Rayana. ODDS library, 2016. URL http://odds.cs.stonybrook.eduspa
dc.relation.referencesS. Reed, Y. Chen, T. Paine, A. van den Oord, S. M.A. Eslami, D. Rezende, O. Vinyals, and N. de Freitas. Few-shot autoregressive density estimation: Towards learning to learn distributions, 2017.spa
dc.relation.referencesDanilo Jimenez Rezende and Shakir Mohamed. Variational inference with normalizing flows. volume 2, pages 1530–1538, 2015. ISBN 9781510810587.spa
dc.relation.referencesBaqar Rizvi, Ammar Belatreche, Ahmed Bouridane, and Ian Watson. Detection of stock price manipulation using kernel based principal component analysis and multivariate density estimation. IEEE Access, 8:135989–136003, 2020.spa
dc.relation.referencesVijay K Rohatgi and AK Md Ehsanes Saleh. An introduction to probability and statistics. John Wiley & Sons, 2015.spa
dc.relation.referencesMurray Rosenblatt. Remarks on some nonparametric estimates of a density function. Ann. Math. Statist., 27(3):832–837, 09 1956. doi: 10.1214/aoms/1177728190. URL https://doi.org/10.1214/aoms/1177728190spa
dc.relation.referencesPeter J Rousseeuw and Katrien Van Driessen. A fast algorithm for the minimum covariance determinant estimator. Technometrics, 41(3):212–223, 1999.spa
dc.relation.referencesLukas Ruff, Robert Vandermeulen, Nico Goernitz, Lucas Deecke, Shoaib Ahmed Siddiqui, Alexander Binder, Emmanuel Müller, and Marius Kloft. Deep one-class classification. volume 80. PMLR, 2018spa
dc.relation.referencesLukas Ruff, Jacob R. Kauffmann, Robert A. Vandermeulen, Grégoire Montavon, Wojciech Samek, Marius Kloft, Thomas G. Dietterich, and Klaus-Robert Müller. A unifying review of deep and shallow anomaly detection. Proceedings of the IEEE, 109:756–795, 5 2021. ISSN 15582256. doi: 10.1109/JPROC.2021.3052449.spa
dc.relation.referencesRupert G. Miller Jr. Simultaneous statistical inference. Springer Series in Statistics, 2012.spa
dc.relation.referencesSaket Sathe and Charu C Aggarwal. Subspace outlier detection in linear time with randomized hashing. In 2016 IEEE 16th International Conference on Data Mining (ICDM), pages 459–468. IEEE, 2016.spa
dc.relation.referencesIssei Sato, Kenichi Kurihara, Shu Tanaka, Hiroshi Nakagawa, and Seiji Miyashita. Quantum annealing for variational Bayes inference. In Proceedings of the 25th Conference on Uncertainty in Artificial Intelligence, UAI 2009, pages 479–486, 2009spa
dc.relation.referencesB. Schölkopf and A. J. Smola. Learning with kernels: Support vector machines, regularization, optimization, and beyond. MIT Press, 2002.spa
dc.relation.referencesBernhard Schölkopf, Alexander Smola, and Klaus-Robert Müller. Kernel principal component analysis. In International conference on artificial neural networks, pages 583–588. Springer, 1997.spa
dc.relation.referencesBernhard Schölkopf, Alex J. Smola, Robert C. Williamson, and Peter L. Bartlett. New support vector algorithms. Neural Computation, 12(5):1207–1245, 2000.spa
dc.relation.referencesBernhard Schölkopf, John C Platt, John Shawe-Taylor, Alex J Smola, and Robert C Williamson. Estimating the support of a high-dimensional distribution. Neural computation, 2001spa
dc.relation.referencesDavid W Scott. Multivariate density estimation and visualization. In Handbook of computational statistics, pages 549–569. Springer, 2012spa
dc.relation.referencesYounghyun Jo, Sejong Yang, and Seon Joo Kim. SRFlow-DA: Super-Resolution Using Normalizing Flow with Deep Convolutional Block. Technical report, 2021. URL https://github.com/yhjo09/SRFlow-DA.spa
dc.relation.referencesRazieh Sheikhpour, Mehdi Agha Sarram, and Robab Sheikhpour. Particle swarm optimization for bandwidth determination and feature selection of kernel density estimation based classifiers in diagnosis of breast cancer. Applied Soft Computing Journal, 40:113–131, 3 2016. ISSN 15684946. doi: 10.1016/j.asoc.2015.10.005spa
dc.relation.referencesBoxi Shen, Xiang Xu, Jun Li, Antonio Plaza, and Qunying Huang. Unfolding spatial-temporal patterns of taxi trip based on an improved network kernel density estimation. ISPRS International Journal of Geo-Information, 9:683, 11 2020. ISSN 2220-9964. doi: 10.3390/ijgi9110683.spa
dc.relation.referencesWeiwei Shen, Zhihui Yang, and Jun Wang. Random features for shift-invariant kernels with moment matching. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 31, 2017.spa
dc.relation.referencesXun Shi. Selection of bandwidth type and adjustment side in kernel density estimation over inhomogeneous backgrounds. International Journal of Geographical Information Science, 24:643–660, 5 2010. ISSN 13658816. doi: 10.1080/13658810902950625.spa
dc.relation.referencesAlistair Shilton, Sutharshan Rajasegarar, and Marimuthu Palaniswami. Combined multiclass classification and anomaly detection for large-scale wireless sensor networks. In 2013 IEEE eighth international conference on intelligent sensors, sensor networks and information processing, pages 491–496. IEEE, 2013spa
dc.relation.referencesParis Siminelakis, Kexin Rong, Peter Bailis, Moses Charikar, and Philip Levis. Rehashing kernel evaluation in high dimensions. volume 2019-June, pages 10153–10173, 2019. ISBN 9781510886988spa
dc.relation.referencesAman Sinha and John Duchi. Learning kernels with random features. In Advances in Neural Information Processing Systems, pages 1306–1314, 2016.spa
dc.relation.referencesDaniel B. Smith. Kde-for-scipy (python script). https://github.com/Daniel-B-Smith/KDE-for-SciPy/blob/master/kde.py, 2021. Accessed on March 6, 2023.spa
dc.relation.referencesPaul Smolensky. Information processing in dynamical systems: Foundations of harmony theory. Technical report, Colorado Univ at Boulder Dept of Computer Science, 1986.spa
dc.relation.referencesAngela A Sodemann, Matthew P Ross, and Brett J Borghetti. A review of anomaly detection in automated surveillance. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 42(6):1257–1272, 2012spa
dc.relation.referencesR. F. Streater. Classical and quantum probability. Journal of Mathematical Physics, 41:3556–3603, 2000. ISSN 00222488. doi: 10.1063/1.533322.spa
dc.relation.referencesSwee Chuan Tan, Kai Ming Ting, and Tony Fei Liu. Fast anomaly detection for streaming data. In Twenty-second international joint conference on artificial intelligence, 2011.spa
dc.relation.referencesXiaofeng Tang and Aiqiang Xu. Multi-class classification using kernel density estimation on K-nearest neighbours. Electronics Letters, 52(8):600–602, apr 2016. ISSN 00135194. doi: 10.1049/el.2015.4437.spa
dc.relation.referencesMahbod Tavallaee, Ebrahim Bagheri, Wei Lu, and Ali A Ghorbani. A detailed analysis of the kdd cup 99 data set, 2009. http://kdd.ics.uci.edu/databases/kddcup99/kddcup99.htmlspa
dc.relation.referencesP. Tiwari and M. Melucci. Towards a quantum-inspired binary classifier. IEEE Access, 7:42354–42372, 2019. doi: 10.1109/ACCESS.2019.2904624.spa
dc.relation.referencesMaximilian E. Tschuchnig and Michael Gadermayr. Anomaly detection in medical imaging - a mini review. In Peter Haber, Thomas J. Lampoltshammer, Helmut Leopold, and Manfred Mayr, editors, Data Science – Analytics and Applications, pages 33–38, Wiesbaden, 2022. Springer Fachmedien Wiesbaden. ISBN 978-3-658-36295-9.spa
dc.relation.referencesBerwin A Turlach. Bandwidth selection in kernel density estimation: A review. In CORE and Institut de Statistique. Citeseer, 1993spa
dc.relation.referencesAndrea Vedaldi and Andrew Zisserman. Efficient additive kernels via explicit feature maps. IEEE Transactions on Pattern Analysis and Machine Intelligence, 34(3):480– 492, 2012. ISSN 01628828. doi: 10.1109/TPAMI.2011.153.spa
dc.relation.referencesPascal Vincent, Hugo Larochelle, Yoshua Bengio, and Pierre-Antoine Manzagol. Extracting and composing robust features with denoising autoencoders, 2009.spa
dc.relation.referencesJohn Von Neumann. Wahrscheinlichkeitstheoretischer aufbau der quantenmechanik. Nachrichten von der Gesellschaft der Wissenschaften zu Göttingen, Mathematisch-Physikalische Klasse, 1927:245–272, 1927.spa
dc.relation.referencesLin Wang, Fuqiang Zhou, Zuoxin Li, Wangxia Zuo, and Haishu Tan. Abnormal Event Detection in Videos Using Hybrid Spatio-Temporal Autoencoder. In Proceedings - International Conference on Image Processing, ICIP, pages 2276–2280, 2018. ISBN 9781479970612. doi: 10.1109/ICIP.2018.8451070.spa
dc.relation.referencesXuanzhao Wang, Zhengping Che, Bo Jiang, Ning Xiao, Ke Yang, Jian Tang, Jieping Ye, Jingyu Wang, and Qi Qi. Robust unsupervised video anomaly detection by multipath frame prediction. IEEE Transactions on Neural Networks and Learning Systems, 33(6):2301–2312, 2022. doi: 10.1109/TNNLS.2021.3083152.spa
dc.relation.referencesYong Wang, Xinbin Luo, Lu Ding, Shan Fu, and Xian Wei. Detection based visual tracking with convolutional neural network. Knowledge-Based Systems, 175:62–71, 2019.spa
dc.relation.referencesChristopher K. I. Williams. Using the Nyström method to speed up kernel machines. 2001spa
dc.relation.referencesMiao Xie, Song Han, Biming Tian, and Sazia Parvin. Anomaly detection in wireless sensor networks: A survey. Journal of Network and computer Applications, 34(4): 1302–1325, 2011spa
dc.relation.referencesTianbao Yang, Yu Feng Li, Mehrdad Mahdavi, Rong Jin, and Zhi Hua Zhou. Nyström method vs random Fourier features: A theoretical and empirical comparison. In Advances in Neural Information Processing Systems, volume 1, pages 476–484, 2012. ISBN 9781627480031spa
dc.relation.referencesFelix Xinnan X Yu, Ananda Theertha Suresh, Krzysztof M Choromanski, Daniel N Holtmann-Rice, and Sanjiv Kumar. Orthogonal random features. In D. Lee, M. Sugiyama, U. Luxburg, I. Guyon, and R. Garnett, editors, Advances in Neural Information Processing Systems, volume 29, pages 1975–1983. Curran Associates, Inc., 2016. URL https://proceedings.neurips.cc/paper/2016/file/53adaf494dc89ef7196d73636eb2451b-Paper.pdf.spa
dc.relation.referencesHoussam Zenati, Manon Romain, Chuan-Sheng Foo, Bruno Lecouat, and Vijay Chandrasekhar. Adversarially learned anomaly detection. In 2018 IEEE International conference on data mining (ICDM), pages 727–736. IEEE, 2018.spa
dc.relation.referencesLiangwei Zhang, Jing Lin, and Ramin Karim. Adaptive kernel density-based anomaly detection for nonlinear systems. Knowledge-Based Systems, 139:50–63, 2018. ISSN 09507051. doi: 10.1016/j.knosys.2017.10.009.spa
dc.relation.referencesYue Zhao, Zain Nasrullah, and Zheng Li. Pyod: A python toolbox for scalable outlier detection. Journal of Machine Learning Research, 20, 2019. ISSN 15337928.spa
dc.relation.referencesXun Zhou, Sicong Cheng, Meng Zhu, Chengkun Guo, Sida Zhou, Peng Xu, Zhenghua Xue, and Weishi Zhang. A state of the art survey of data mining-based fraud detection and credit scoring. volume 189. EDP Sciences, 8 2018. doi: 10.1051/matecconf/201818903002.spa
dc.relation.referencesJoseph A. Gallego-Mejia. Efficient non-parametric neural density estimation and its application to outlier and anomaly detection. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 37, pages 16117–16118, 6 2023.spa
dc.rights.accessrightsinfo:eu-repo/semantics/openAccessspa
dc.rights.licenseAtribución-NoComercial-SinDerivadas 4.0 Internacionalspa
dc.rights.urihttp://creativecommons.org/licenses/by-nc-nd/4.0/spa
dc.subject.ddc000 - Ciencias de la computación, información y obras generalesspa
dc.subject.decsAprendizaje profundospa
dc.subject.decsDeep Learningeng
dc.subject.decsRedes Neurales de la Computaciónspa
dc.subject.decsNeural Networks, Computereng
dc.subject.proposalKernel density estimationeng
dc.subject.proposalKernel methodseng
dc.subject.proposalDeep Learningeng
dc.subject.proposalRandom Fourier Featureseng
dc.subject.proposalMachine Learningeng
dc.subject.proposalDeep Kerneleng
dc.subject.proposalLarge-scale learningeng
dc.subject.proposalKernel Density Estimation Approximationeng
dc.subject.proposalDensity Matrixeng
dc.subject.proposalNeural Density Estimationeng
dc.subject.proposalEstimación de la densidad del núcleospa
dc.subject.proposalMétodos del núcleospa
dc.subject.proposalAprendizaje profundospa
dc.subject.proposalAprendizaje a gran escalaspa
dc.subject.proposalAproximaciones de la estimación de la densidad del núcleospa
dc.subject.proposalMatriz de densidadspa
dc.subject.proposalEstimación de la densidad neuronalspa
dc.titleEfficient Non-Parametric Neural Density Estimation and Its Application to Outlier and Anomaly Detectioneng
dc.title.translatedEstimación neuronal no paramétrica eficiente de la densidad y su aplicación a la detección de valores atípicos y anomalíasspa
dc.typeTrabajo de grado - Doctoradospa
dc.type.coarhttp://purl.org/coar/resource_type/c_db06spa
dc.type.coarversionhttp://purl.org/coar/version/c_ab4af688f83e57aaspa
dc.type.contentTextspa
dc.type.driverinfo:eu-repo/semantics/doctoralThesisspa
dc.type.redcolhttp://purl.org/redcol/resource_type/TDspa
dc.type.versioninfo:eu-repo/semantics/acceptedVersionspa
dcterms.audience.professionaldevelopmentInvestigadoresspa
oaire.accessrightshttp://purl.org/coar/access_right/c_abf2spa

Files

Original bundle:
1022369610-2023.pdf, 10.74 MB, Adobe Portable Document Format. Description: Doctoral thesis in Engineering - Systems and Computing.

License bundle:
license.txt, 5.74 KB. Format: Item-specific license agreed upon to submission.