
Journal articles on the topic "Dimensionality reduction"

Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic "Dimensionality reduction".

Next to each source in the list of references there is an "Add to bibliography" button. Press this button, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication in PDF format and read its abstract online whenever it is available in the metadata.

Explore journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

Cheng, Long, Chenyu You, and Yani Guan. "Random Projections for Non-linear Dimensionality Reduction." International Journal of Machine Learning and Computing 6, no. 4 (2016): 220–25. http://dx.doi.org/10.18178/ijmlc.2016.6.4.601.

2

Marchette, David J., and Wendy L. Poston. "Local dimensionality reduction." Computational Statistics 14, no. 4 (1999): 469–89. http://dx.doi.org/10.1007/s001800050026.

3

Sun, Yu-Yin, Michael Ng, and Zhi-Hua Zhou. "Multi-Instance Dimensionality Reduction." Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (2010): 587–92. http://dx.doi.org/10.1609/aaai.v24i1.7700.

Abstract
Multi-instance learning deals with problems that treat bags of instances as training examples. In single-instance learning problems, dimensionality reduction is an essential step for high-dimensional data analysis and has been studied for years. The curse of dimensionality also exists in multiinstance learning tasks, yet this difficult task has not been studied before. Direct application of existing single-instance dimensionality reduction objectives to multi-instance learning tasks may not work well since it ignores the characteristic of multi-instance learning that the labels of bags are kno
4

Koren, Y., and L. Carmel. "Robust linear dimensionality reduction." IEEE Transactions on Visualization and Computer Graphics 10, no. 4 (2004): 459–70. http://dx.doi.org/10.1109/tvcg.2004.17.

5

Lotlikar, R., and R. Kothari. "Fractional-step dimensionality reduction." IEEE Transactions on Pattern Analysis and Machine Intelligence 22, no. 6 (2000): 623–27. http://dx.doi.org/10.1109/34.862200.

6

Gottlieb, Lee-Ad, Aryeh Kontorovich, and Robert Krauthgamer. "Adaptive metric dimensionality reduction." Theoretical Computer Science 620 (March 2016): 105–18. http://dx.doi.org/10.1016/j.tcs.2015.10.040.

7

Pang, Rich, Benjamin J. Lansdell, and Adrienne L. Fairhall. "Dimensionality reduction in neuroscience." Current Biology 26, no. 14 (2016): R656–R660. http://dx.doi.org/10.1016/j.cub.2016.05.029.

8

Lovaglio, Pietro Giorgio, and Giorgio Vittadini. "Multilevel dimensionality-reduction methods." Statistical Methods & Applications 22, no. 2 (2012): 183–207. http://dx.doi.org/10.1007/s10260-012-0215-2.

9

Carter, Kevin, Raviv Raich, William Finn, and Alfred Hero, III. "Information-Geometric Dimensionality Reduction." IEEE Signal Processing Magazine 28, no. 2 (2011): 89–99. http://dx.doi.org/10.1109/msp.2010.939536.

10

Gonen, Mehmet. "Bayesian Supervised Dimensionality Reduction." IEEE Transactions on Cybernetics 43, no. 6 (2013): 2179–89. http://dx.doi.org/10.1109/tcyb.2013.2245321.

11

Zhang, Zhao, Tommy W. S. Chow, and Ning Ye. "Semisupervised Multimodal Dimensionality Reduction." Computational Intelligence 29, no. 1 (2012): 70–110. http://dx.doi.org/10.1111/j.1467-8640.2012.00429.x.

12

Liu, Huan, and Rudy Setiono. "Dimensionality reduction via discretization." Knowledge-Based Systems 9, no. 1 (1996): 67–72. http://dx.doi.org/10.1016/0950-7051(95)01030-0.

13

Yang, Li. "Distance-preserving dimensionality reduction." Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery 1, no. 5 (2011): 369–80. http://dx.doi.org/10.1002/widm.39.

14

Li, Hongda, Jian Cui, Xinle Zhang, Yongqi Han, and Liying Cao. "Dimensionality Reduction and Classification of Hyperspectral Remote Sensing Image Feature Extraction." Remote Sensing 14, no. 18 (2022): 4579. http://dx.doi.org/10.3390/rs14184579.

Abstract
Terrain classification is an important research direction in the field of remote sensing. Hyperspectral remote sensing image data contain a large amount of rich ground object information. However, such data have the characteristics of high spatial dimensions of features, strong data correlation, high data redundancy, and long operation time, which lead to difficulty in image data classification. A data dimensionality reduction algorithm can transform the data into low-dimensional data with strong features and then classify the dimensionally reduced data. However, most classification methods ca
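The reduce-then-classify workflow described in the abstract above can be sketched in a few lines. The paper's hyperspectral data are not available here, so scikit-learn's digits dataset is only a stand-in, and PCA followed by an SVM is one generic choice rather than the authors' method.

```python
# Hedged sketch of a "reduce, then classify" pipeline (not the paper's algorithm).
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)          # 64-dimensional pixel features as a placeholder
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Project onto a handful of principal components, then classify the reduced data.
model = Pipeline([
    ("reduce", PCA(n_components=10, random_state=0)),
    ("classify", SVC(kernel="rbf", gamma="scale")),
])
model.fit(X_tr, y_tr)
print("accuracy on reduced features:", model.score(X_te, y_te))
```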
15

Ahmad, Noor, and Ali Bou Nassif. "Dimensionality Reduction: Challenges and Solutions." ITM Web of Conferences 43 (2022): 01017. http://dx.doi.org/10.1051/itmconf/20224301017.

Abstract
The use of dimensionality reduction techniques is a keystone for analyzing and interpreting high dimensional data. These techniques gather several data features of interest, such as dynamical structure, input-output relationships, the correlation between data sets, covariance, etc. Dimensionality reduction entails mapping a set of high dimensional data features onto low dimensional data. Motivated by the lack of learning models’ performance due to the high dimensionality data, this study encounters five distinct dimensionality reduction methods. Besides, a comparison between reduced dimensiona
16

Shen, Zilin. "Comparison and Evaluation of Classical Dimensionality Reduction Methods." Highlights in Science, Engineering and Technology 70 (November 15, 2023): 411–18. http://dx.doi.org/10.54097/hset.v70i.13890.

Abstract
As one of the tasks of unsupervised learning, data dimensionality reduction is faced with the problem of a lack of evaluation methods. Based on this, three classical dimensionality reduction methods such as PCA, t-SNE and UMAP were selected as the research object in this paper. This article selected 5 three-classification datasets and used the three methods mentioned above to perform dimensionality reduction. This paper plotted 3D scatter graphs after dimensionality reduction to analyze the differentiation effect of the data on different categories of the target variable. Then the data after d
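As a rough illustration of the comparison workflow in the abstract above, the sketch below reduces a small labelled dataset to three dimensions with PCA and t-SNE and draws 3D scatter plots. UMAP would additionally require the third-party umap-learn package, and the iris dataset is only a stand-in for the paper's five three-class datasets.

```python
# Minimal sketch: reduce to 3D with two classical methods and inspect class separation.
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

X, y = load_iris(return_X_y=True)
embeddings = {
    "PCA": PCA(n_components=3).fit_transform(X),
    "t-SNE": TSNE(n_components=3, perplexity=30, random_state=0).fit_transform(X),
}

fig = plt.figure(figsize=(10, 4))
for i, (name, Z) in enumerate(embeddings.items(), start=1):
    ax = fig.add_subplot(1, 2, i, projection="3d")
    ax.scatter(Z[:, 0], Z[:, 1], Z[:, 2], c=y, s=15)   # colour points by class label
    ax.set_title(name)
plt.show()
```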
17

Tiwari, Anamika. "Classification of Emotion from Facial Image Using Dimensionality Reduction Technique." International Journal of Scientific Engineering and Research 4, no. 3 (2016): 11–13. https://doi.org/10.70729/29031601.

18

K, Bhargavi. "Data Dimensionality Reduction Techniques: Review." International Journal of Engineering Technology and Management Sciences 4, no. 4 (2020): 62–65. http://dx.doi.org/10.46647/ijetms.2020.v04i04.010.

Abstract
Data science is the study of data. It involves developing methods of recording, storing, and analyzing data to effectively extract useful information. The goal of data science is to gain insights and knowledge from any type of data — both structured and unstructured. Data science is related to computer science, but is a separate field. Computer science involves creating programs and algorithms to record and process data, while data science covers any type of data analysis, which may or may not use computers. Data science is more closely related to the mathematics field of Statistics, which inc
19

Kay, Steven. "Dimensionality Reduction for Signal Detection." IEEE Signal Processing Letters 29 (2022): 145–48. http://dx.doi.org/10.1109/lsp.2021.3129453.

20

Nelson, Jelani. "Dimensionality Reduction in Euclidean Space." Notices of the American Mathematical Society 67, no. 10 (2020): 1. http://dx.doi.org/10.1090/noti2166.

21

Zhang, Tianhao, Dacheng Tao, Xuelong Li, and Jie Yang. "Patch Alignment for Dimensionality Reduction." IEEE Transactions on Knowledge and Data Engineering 21, no. 9 (2009): 1299–313. http://dx.doi.org/10.1109/tkde.2008.212.

22

Wang, Shujian, Deyan Xie, Fang Chen, and Quanxue Gao. "Dimensionality reduction by LPP‐L21." IET Computer Vision 12, no. 5 (2018): 659–65. http://dx.doi.org/10.1049/iet-cvi.2017.0302.

23

Raymer, M. L., W. F. Punch, E. D. Goodman, L. A. Kuhn, and A. K. Jain. "Dimensionality reduction using genetic algorithms." IEEE Transactions on Evolutionary Computation 4, no. 2 (2000): 164–71. http://dx.doi.org/10.1109/4235.850656.

24

Saund, E. "Dimensionality-reduction using connectionist networks." IEEE Transactions on Pattern Analysis and Machine Intelligence 11, no. 3 (1989): 304–14. http://dx.doi.org/10.1109/34.21799.

25

Vats, Deepak, and Avinash Sharma. "Dimensionality Reduction Techniques: Comparative Analysis." Journal of Computational and Theoretical Nanoscience 17, no. 6 (2020): 2684–88. http://dx.doi.org/10.1166/jctn.2020.8967.

Abstract
It has been spotted an exponential growth in terms of dimension in real world data. Some example of higher dimensional data may includes speech signal, sensor data, medical data, criminal data and data related to recommendation process for different field like news, movies (Netflix) and e-commerce. To empowering learning accuracy in the area of machine learning and enhancing mining performance one need to remove redundant feature and feature not relevant for mining and learning task from this high dimension dataset. There exist many supervised and unsupervised methodologies in literature to pe
26

Harrow, Aram W., Ashley Montanaro, and Anthony J. Short. "Limitations on quantum dimensionality reduction." International Journal of Quantum Information 13, no. 04 (2015): 1440001. http://dx.doi.org/10.1142/s0219749914400012.

Abstract
The Johnson–Lindenstrauss Lemma is a classic result which implies that any set of n real vectors can be compressed to O( log n) dimensions while only distorting pairwise Euclidean distances by a constant factor. Here we consider potential extensions of this result to the compression of quantum states. We show that, by contrast with the classical case, there does not exist any distribution over quantum channels that significantly reduces the dimension of quantum states while preserving the 2-norm distance with high probability. We discuss two tasks for which the 2-norm distance is indeed the co
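The classical Johnson–Lindenstrauss statement quoted in the abstract above can be checked numerically with a Gaussian random projection. The sketch below uses scikit-learn's random-projection utilities and says nothing about the quantum setting the paper actually studies; the sample sizes and eps are arbitrary illustrative choices.

```python
# Check that a random projection to O(log n / eps^2) dimensions roughly preserves
# pairwise Euclidean distances (the classical JL baseline only).
import numpy as np
from sklearn.metrics import pairwise_distances
from sklearn.random_projection import GaussianRandomProjection, johnson_lindenstrauss_min_dim

rng = np.random.default_rng(0)
n, d, eps = 500, 10_000, 0.25
X = rng.normal(size=(n, d))

k = johnson_lindenstrauss_min_dim(n_samples=n, eps=eps)   # target dimension bound
Z = GaussianRandomProjection(n_components=k, random_state=0).fit_transform(X)

ratios = pairwise_distances(Z) / (pairwise_distances(X) + 1e-12)
mask = ~np.eye(n, dtype=bool)                              # ignore zero self-distances
print(f"projected to {k} dims; distance ratios in "
      f"[{ratios[mask].min():.3f}, {ratios[mask].max():.3f}] (target 1±{eps})")
```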
27

Tasoulis, Sotiris, Nicos G. Pavlidis, and Teemu Roos. "Nonlinear dimensionality reduction for clustering." Pattern Recognition 107 (November 2020): 107508. http://dx.doi.org/10.1016/j.patcog.2020.107508.

28

Gao, Junbin, Qinfeng Shi, and Tibério S. Caetano. "Dimensionality reduction via compressive sensing." Pattern Recognition Letters 33, no. 9 (2012): 1163–70. http://dx.doi.org/10.1016/j.patrec.2012.02.007.

29

Wang, Yasi, Hongxun Yao, and Sicheng Zhao. "Auto-encoder based dimensionality reduction." Neurocomputing 184 (April 2016): 232–42. http://dx.doi.org/10.1016/j.neucom.2015.08.104.

30

Hou, Chenping, Changshui Zhang, Yi Wu, and Yuanyuan Jiao. "Stable local dimensionality reduction approaches." Pattern Recognition 42, no. 9 (2009): 2054–66. http://dx.doi.org/10.1016/j.patcog.2008.12.009.

31

Qiao, Lishan, Limei Zhang, and Songcan Chen. "Dimensionality reduction with adaptive graph." Frontiers of Computer Science 7, no. 5 (2013): 745–53. http://dx.doi.org/10.1007/s11704-013-2234-z.

32

Lai, Zhihui, Yong Xu, Jian Yang, Linlin Shen, and David Zhang. "Rotational Invariant Dimensionality Reduction Algorithms." IEEE Transactions on Cybernetics 47, no. 11 (2017): 3733–46. http://dx.doi.org/10.1109/tcyb.2016.2578642.

33

Nagabhushan, P., K. Chidananda Gowda, and Edwin Diday. "Dimensionality reduction of symbolic data." Pattern Recognition Letters 16, no. 2 (1995): 219–23. http://dx.doi.org/10.1016/0167-8655(94)00085-h.

34

Havelka, Jan, Anna Kučerová, and Jan Sýkora. "Dimensionality reduction in thermal tomography." Computers & Mathematics with Applications 78, no. 9 (2019): 3077–89. http://dx.doi.org/10.1016/j.camwa.2019.04.019.

35

Santos, João Filipe, Maria Manuela Portela, and Inmaculada Pulido-Calvo. "Dimensionality reduction in drought modelling." Hydrological Processes 27, no. 10 (2012): 1399–410. http://dx.doi.org/10.1002/hyp.9300.

36

Xie, Fuding, Yutao Fan, and Ming Zhou. "Dimensionality Reduction by Weighted Connections between Neighborhoods." Abstract and Applied Analysis 2014 (2014): 1–5. http://dx.doi.org/10.1155/2014/928136.

Abstract
Dimensionality reduction is the transformation of high-dimensional data into a meaningful representation of reduced dimensionality. This paper introduces a dimensionality reduction technique by weighted connections between neighborhoods to improve K-Isomap method, attempting to preserve perfectly the relationships between neighborhoods in the process of dimensionality reduction. The validity of the proposal is tested by three typical examples which are widely employed in the algorithms based on manifold. The experimental results show that the local topology nature of dataset is preserved well w
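The weighted-connection variant of K-Isomap proposed in the abstract above is not reproduced here. As a baseline only, the sketch below runs standard Isomap on the swiss roll, one of the "typical examples" commonly used to test manifold-learning methods.

```python
# Baseline Isomap embedding of the swiss roll (unweighted neighbourhood graph).
import matplotlib.pyplot as plt
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, color = make_swiss_roll(n_samples=1500, noise=0.05, random_state=0)
Z = Isomap(n_neighbors=10, n_components=2).fit_transform(X)

plt.scatter(Z[:, 0], Z[:, 1], c=color, s=8)   # colour encodes position along the roll
plt.title("Isomap embedding of the swiss roll (unweighted baseline)")
plt.show()
```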
37

Marjan, Md. Abu, Md. Rashedul Islam, Md. Palash Uddin, Masud Ibn Afjal, and Md. Al Mamun. "PCA-based dimensionality reduction for face recognition." TELKOMNIKA (Telecommunication, Computing, Electronics and Control) 19, no. 5 (2021): 1622–29. https://doi.org/10.12928/telkomnika.v19i5.19566.

Abstract
In this paper, we conduct a comprehensive study on dimensionality reduction (DR) techniques and discuss the mostly used statistical DR technique called principal component analysis (PCA) in detail with a view to addressing the classical face recognition problem. Therefore, we, more devotedly, propose a solution to either a typical face or individual face recognition based on the principal components, which are constructed using PCA on the face images. We simulate the proposed solution with several training and test sets of manually captured face images and also with the popular Olivetti Resear
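A minimal sketch of a PCA ("eigenfaces") recognition pipeline on the Olivetti faces dataset mentioned in the abstract above. The number of components and the 1-nearest-neighbour classifier are illustrative choices, not the authors' exact protocol.

```python
# Hedged sketch: PCA-based face recognition on the Olivetti faces.
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import Pipeline

faces = fetch_olivetti_faces()                     # downloads a few MB on first use
X_tr, X_te, y_tr, y_te = train_test_split(
    faces.data, faces.target, test_size=0.25, stratify=faces.target, random_state=0)

model = Pipeline([
    ("eigenfaces", PCA(n_components=60, whiten=True, random_state=0)),  # principal components of face images
    ("classify", KNeighborsClassifier(n_neighbors=1)),
])
model.fit(X_tr, y_tr)
print("recognition accuracy:", model.score(X_te, y_te))
```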
38

Zhao, Xiaowei, Feiping Nie, Sen Wang, Jun Guo, Pengfei Xu, and Xiaojiang Chen. "Unsupervised 2D Dimensionality Reduction with Adaptive Structure Learning." Neural Computation 29, no. 5 (2017): 1352–74. http://dx.doi.org/10.1162/neco_a_00950.

Abstract
In recent years, unsupervised two-dimensional (2D) dimensionality reduction methods for unlabeled large-scale data have made progress. However, performance of these degrades when the learning of similarity matrix is at the beginning of the dimensionality reduction process. A similarity matrix is used to reveal the underlying geometry structure of data in unsupervised dimensionality reduction methods. Because of noise data, it is difficult to learn the optimal similarity matrix. In this letter, we propose a new dimensionality reduction model for 2D image matrices: unsupervised 2D dimensionality
39

Yan, Chun-Man, and Yu-Yao Zhang. "Face Recognition Based on SRC Combined with Sparse Embedding Dimensionality Reduction." 電腦學刊 33, no. 2 (2022): 083–93. http://dx.doi.org/10.53106/199115992022043302007.

Abstract
Sparse representation-based classification (SRC) method has achieved good recognition results and shown strong robustness for face recognition, especially when the face image is affected by illumination variations, expression changes and occlusion. SRC method simply uses the training set as a dictionary to encode test samples. However, the high-dimensional training face data usually contain a large amount of redundant information, which will increase the complexity of this method. Therefore, the image dimensionality reduction procedure is separately performed by most of the existing m
40

Vijayarani, S., C. Sivamathi, and S. Maria Sylviaa. "Bio Inspired Algorithms for Dimensionality Reduction and Outlier Detection in Medical Datasets." International Journal of Advanced Networking and Applications 14, no. 01 (2022): 5277–86. http://dx.doi.org/10.35444/ijana.2022.14107.

Abstract
Dimensionality Reduction is one of the useful techniques used in number of applications in order to reduce the number of features to improve the productivity and efficiency of the task. Clustering is one of the influential tasks in data mining. Dimensionality reductions are used in data mining, Image processing, Networking, Mobile computing, etc. The elementary intention of this work is to apply dimensionality reduction algorithms and then cluster the datasets to detect outliers. A bio-inspired ACO (Ant Colony optimization) algorithm has been proposed to reduce dimensionality. Also another bio
41

Remesh, Reshma, and Pattabiraman V. "A Survey on the Cures for the Curse of Dimensionality in Big Data." Asian Journal of Pharmaceutical and Clinical Research 10, no. 13 (2017): 355. http://dx.doi.org/10.22159/ajpcr.2017.v10s1.19755.

Abstract
Dimensionality reduction techniques are used to reduce the complexity for analysis of high dimensional data sets. The raw input data set may have large dimensions and it might consume time and lead to wrong predictions if unnecessary data attributes are been considered for analysis. So using dimensionality reduction techniques one can reduce the dimensions of input data towards accurate prediction with less cost. In this paper the different machine learning approaches used for dimensionality reductions such as PCA, SVD, LDA, Kernel Principal Component Analysis and Artificial Neural Network hav
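To make the distinction between three of the reduction families named in the abstract above concrete, the sketch below contrasts unsupervised linear PCA, unsupervised non-linear kernel PCA, and supervised LDA. The wine dataset is only a stand-in; the survey's own experiments are not reproduced.

```python
# Contrast of unsupervised vs. supervised dimensionality reduction on a toy dataset.
from sklearn.datasets import load_wine
from sklearn.decomposition import PCA, KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X = StandardScaler().fit_transform(X)

Z_pca = PCA(n_components=2).fit_transform(X)                        # unsupervised, linear
Z_kpca = KernelPCA(n_components=2, kernel="rbf").fit_transform(X)   # unsupervised, non-linear
Z_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)  # supervised, uses labels

for name, Z in [("PCA", Z_pca), ("Kernel PCA", Z_kpca), ("LDA", Z_lda)]:
    print(name, "embedding shape:", Z.shape)
```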
42

Gao, Yunlong, Sizhe Luo, Jinyan Pan, Zhihao Wang, and Peng Gao. "Kernel alignment unsupervised discriminative dimensionality reduction." Neurocomputing 453 (September 2021): 181–94. http://dx.doi.org/10.1016/j.neucom.2021.03.127.

43

Crespo, Luis G., Brendon K. Colbert, Sean P. Kenny, and Daniel P. Giesy. "Dimensionality Reduction of Sliced-Normal Distributions." IFAC-PapersOnLine 53, no. 2 (2020): 7412–17. http://dx.doi.org/10.1016/j.ifacol.2020.12.1275.

44

Petty, G. W. "Dimensionality reduction in Bayesian estimation algorithms." Atmospheric Measurement Techniques 6, no. 9 (2013): 2267–76. http://dx.doi.org/10.5194/amt-6-2267-2013.

Abstract
Abstract. An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database versus the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may util
45

Petty, G. W. "Dimensionality reduction in Bayesian estimation algorithms." Atmospheric Measurement Techniques Discussions 6, no. 2 (2013): 2327–52. http://dx.doi.org/10.5194/amtd-6-2327-2013.

Abstract
Abstract. An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database versus the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may util
46

Mahadev, Preeti, and P. Nagabhushan. "Incremental Dimensionality Reduction in Hyperspectral Data." International Journal of Computer Applications 163, no. 7 (2017): 21–34. http://dx.doi.org/10.5120/ijca2017913575.

47

Jindal, Priyanka, and Dharmender Kumar. "A Review on Dimensionality Reduction Techniques." International Journal of Computer Applications 173, no. 2 (2017): 42–46. http://dx.doi.org/10.5120/ijca2017915260.

48

Merola, Giovanni M., and Bovas Abraham. "Dimensionality reduction approach to multivariate prediction." Canadian Journal of Statistics 29, no. 2 (2001): 191–200. http://dx.doi.org/10.2307/3316072.

49

Zhao, Zhikai, Jiansheng Qian, and Jian Cheng. "Marginal Discriminant Projection for Dimensionality Reduction." International Journal of Digital Content Technology and its Applications 6, no. 15 (2012): 1–11. http://dx.doi.org/10.4156/jdcta.vol6.issue15.1.
