Academic literature on the topic 'Elastic-net regularization'


Journal articles on the topic "Elastic-net regularization"

1

Faiyaz, Chowdhury Abrar, Pabel Shahrear, Rakibul Alam Shamim, Thilo Strauss, and Taufiquar Khan. "Comparison of Different Radial Basis Function Networks for the Electrical Impedance Tomography (EIT) Inverse Problem." Algorithms 16, no. 10 (2023): 461. http://dx.doi.org/10.3390/a16100461.

Abstract:
This paper aims to determine whether regularization improves image reconstruction in electrical impedance tomography (EIT) using a radial basis network. The primary purpose is to investigate the effect of regularization to estimate the network parameters of the radial basis function network to solve the inverse problem in EIT. Our approach to studying the efficacy of the radial basis network with regularization is to compare the performance among several different regularizations, mainly Tikhonov, Lasso, and Elastic Net regularization. We vary the network parameters, including the fixed and variable widths for the Gaussian used for the network. We also perform a robustness study for comparison of the different regularizations used. Our results include (1) determining the optimal number of radial basis functions in the network to avoid overfitting; (2) comparison of fixed versus variable Gaussian width with or without regularization; (3) comparison of image reconstruction with or without regularization, in particular, no regularization, Tikhonov, Lasso, and Elastic Net; (4) comparison of both mean square and mean absolute error and the corresponding variance; and (5) comparison of robustness, in particular, the performance of the different methods concerning noise level. We conclude that by looking at the R2 score, one can determine the optimal number of radial basis functions. The fixed-width radial basis function network with regularization results in improved performance. The fixed-width Gaussian with Tikhonov regularization performs very well. The regularization helps reconstruct the images outside of the training data set. The regularization may cause the quality of the reconstruction to deteriorate; however, the stability is much improved. In terms of robustness, the RBF with Lasso and Elastic Net seem very robust compared to Tikhonov.
2

Zhao, Yu-long, and Yun-long Feng. "Learning performance of elastic-net regularization." Mathematical and Computer Modelling 57, no. 5-6 (2013): 1395–407. http://dx.doi.org/10.1016/j.mcm.2012.11.028.

3

De Mol, Christine, Ernesto De Vito, and Lorenzo Rosasco. "Elastic-net regularization in learning theory." Journal of Complexity 25, no. 2 (2009): 201–30. http://dx.doi.org/10.1016/j.jco.2009.01.002.

4

Chen, Delei, Cheng Wang, Xiongming Lai, Huizhen Zhang, Haibo Li, and Jianwei Chen. "Elastic-net regularization based multi-point vibration response prediction in situation of unknown uncorrelated multiple sources load." International Journal of Applied Electromagnetics and Mechanics 64, no. 1-4 (2020): 649–57. http://dx.doi.org/10.3233/jae-209375.

Abstract:
In order to reduce the influence of the ill-posed inverse problem on response prediction in the situation of unknown uncorrelated multiple-source loads, a response prediction method based on elastic-net regularization in the frequency domain was proposed. This method utilized the linear relationship between the known responses and the unknown responses, instead of the transfer function, to predict the response. Moreover, the elastic-net regularization model has two regularization parameters combining l1 and l2 regularization to reduce the influence of the ill-posed inverse. Experimental results on data from acoustic and vibration sources on cylindrical shells showed that elastic-net regularization could obtain more accurate response predictions than the transfer-function method and ordinary least squares, predict vibration response effectively, and satisfy industrial requirements.
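For readers skimming these entries, the elastic-net penalty they refer to combines the l1 (lasso) and l2 (ridge) terms in a single objective. The following minimal sketch, which is not taken from any of the cited papers (data, hyperparameters, and names are invented for illustration), shows a standard fit with scikit-learn's ElasticNet:

```python
# Illustrative elastic-net fit: the penalty mixes an l1 term (sparsity)
# and an l2 term (stability), controlled by alpha and l1_ratio.
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
true_coef = np.zeros(20)
true_coef[:3] = [2.0, -1.5, 1.0]              # sparse ground truth
y = X @ true_coef + 0.1 * rng.normal(size=100)

# alpha scales the total penalty strength; l1_ratio mixes l1 vs l2
# (l1_ratio=1.0 recovers the lasso; smaller values add ridge-like shrinkage).
model = ElasticNet(alpha=0.1, l1_ratio=0.5)
model.fit(X, y)

# Count the surviving coefficients: the l1 component typically zeroes
# out most of the irrelevant ones while keeping the true signal.
n_nonzero = int(np.sum(np.abs(model.coef_) > 1e-8))
print(n_nonzero)
```

The mixed penalty is what distinguishes elastic net from the pure lasso and ridge methods compared in several of the abstracts above.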
5

Guo, Lihua. "Extreme Learning Machine with Elastic Net Regularization." Intelligent Automation & Soft Computing 26, no. 3 (2020): 421–27. http://dx.doi.org/10.32604/iasc.2020.013918.

6

Dölker, E. M., R. Schmidt, S. Gorges, et al. "Elastic Net Regularization in Lorentz force evaluation." NDT & E International 99 (October 2018): 141–54. http://dx.doi.org/10.1016/j.ndteint.2018.07.002.

7

Kvyetnyy, R. N., and S. I. Borodkin. "Improved model of ELASTIC NET regularization for financial time series." Optoelectronic Information-Power Technologies 49, no. 1 (2025): 29–35. https://doi.org/10.31649/1681-7893-2025-49-1-29-35.

Abstract:
This paper proposes a modification of Elastic Net regression for short-term forecasting of financial time series by introducing Gaussian weight decay. The new approach is designed to smooth the abrupt “jumps” between the last historical observation and the first forecast—an issue typical of standard regularization. To assess its effectiveness, we formally derive the Elastic Net model with four weighting schemes (no decay, linear, exponential, and Gaussian) and conduct empirical experiments on the S&P 500, Dow Jones Industrial Average, and Nasdaq Composite indices over the period 2020–2025. The results demonstrate that Gaussian decay minimizes the transition gap and achieves the lowest RMSE and Deviation for the S&P 500 and Nasdaq Composite, whereas exponential decay proves optimal for the Dow Jones Industrial Average.
8

Li, Hong, Na Chen, and Luoqing Li. "Elastic-Net Regularization for Low-Rank Matrix Recovery." International Journal of Wavelets, Multiresolution and Information Processing 10, no. 05 (2012): 1250050. http://dx.doi.org/10.1142/s0219691312500506.

Abstract:
This paper considers the problem of recovering a low-rank matrix from a small number of measurements consisting of linear combinations of the matrix entries. We extend the elastic-net regularization in compressive sensing to a more general setting, the matrix recovery setting, and consider the elastic-net regularization scheme for matrix recovery. To investigate the statistical properties of this scheme, and in particular its convergence properties, we set up a suitable mathematical framework. We characterize some properties of the estimator and construct a natural iterative procedure to compute it. The convergence analysis shows that the sequence of iterates converges, which underlies successful applications of the matrix elastic-net regularization algorithm. In addition, error bounds of the proposed algorithm for low-rank and even full-rank matrices are presented in this paper.
9

Kereta, Zeljko, and Valeriya Naumova. "On an unsupervised method for parameter selection for the elastic net." Mathematics in Engineering 4, no. 6 (2022): 1–36. http://dx.doi.org/10.3934/mine.2022053.

Abstract:
Despite recent advances in regularization theory, the issue of parameter selection still remains a challenge for most applications. In a recent work, the framework of statistical learning was used to approximate the optimal Tikhonov regularization parameter from noisy data. In this work, we improve their results and extend the analysis to the elastic net regularization. Furthermore, we design a data-driven, automated algorithm for the computation of an approximate regularization parameter. Our analysis combines statistical learning theory with insights from regularization theory. We compare our approach with state-of-the-art parameter selection criteria and show that it has superior accuracy.
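The parameter-selection problem treated in the entry above is, in everyday practice, usually handled by cross-validation rather than an unsupervised criterion like the paper's. As a hedged sketch of that baseline approach (synthetic data, illustrative grid values, not the paper's method), scikit-learn's ElasticNetCV selects both regularization parameters automatically:

```python
# Cross-validated selection of the elastic-net parameters on synthetic data.
import numpy as np
from sklearn.linear_model import ElasticNetCV

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 10))
y = X[:, 0] - 2.0 * X[:, 1] + 0.2 * rng.normal(size=120)

# For each candidate l1_ratio, a grid of alpha values is searched;
# the (alpha, l1_ratio) pair with the best cross-validated error is kept.
cv_model = ElasticNetCV(l1_ratio=[0.2, 0.5, 0.8], cv=5)
cv_model.fit(X, y)

print(cv_model.alpha_, cv_model.l1_ratio_)   # the selected parameters
```

Cross-validation needs held-out data and repeated fits, which is exactly the cost that data-driven criteria such as the one proposed above aim to reduce.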
10

Nakkiran, Arunadevi, and Vidyaa Thulasiraman. "Elastic net feature selected multivariate discriminant mapreduce classification." Indonesian Journal of Electrical Engineering and Computer Science 26, no. 1 (2022): 587. http://dx.doi.org/10.11591/ijeecs.v26.i1.pp587-596.

Abstract:
Analyzing big stream data for valuable information is a significant task. Several conventional methods are designed to analyze big stream data, but scheduling accuracy and time complexity remain significant issues. To resolve this, an elastic-net kernelized multivariate discriminant MapReduce classification (EKMDMC) is introduced, with the novelty of elastic-net regularization-based feature selection and a kernelized multivariate Fisher discriminant MapReduce classifier. Initially, the EKMDMC technique performs feature selection to improve prediction accuracy using the elastic-net regularization method, which selects relevant features such as central processing unit (CPU) time, memory, bandwidth, and energy based on a regression function. After selecting relevant features, the kernelized multivariate Fisher discriminant MapReduce classifier is used to schedule tasks to optimize the processing unit. A kernel function is used to find the similarity between stream-data tasks and the means of the available classes. Experimental evaluation of the proposed EKMDMC technique shows better performance in terms of resource-aware predictive scheduling efficiency, false-positive rate, scheduling time, and memory consumption.

Dissertations / Theses on the topic "Elastic-net regularization"

1

Johnsen, Sofia, and Sarah Felldin. "Improving Knowledge of Truck Fuel Consumption Using Data Analysis." Thesis, Linköpings universitet, Reglerteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-130047.

Abstract:
The large potential of big data and how it has brought value into various industries have been established in research. Since big data has such large potential if handled and analyzed in the right way, revealing information to support decision making in an organization, this thesis is conducted as a case study at an automotive manufacturer with access to large amounts of customer usage data of their vehicles. The reason for performing an analysis of this kind of data is based on the cornerstones of Total Quality Management with the end objective of increasing customer satisfaction of the concerned products or services. The case study includes a data analysis exploring how and if patterns about what affects fuel consumption can be revealed from aggregated customer usage data of trucks linked to truck applications. Based on the case study, conclusions are drawn about how a company can use this type of analysis as well as how to handle the data in order to turn it into business value. The data analysis reveals properties describing truck usage using Factor Analysis and Principal Component Analysis. Especially one property is concluded to be important as it appears in the result of both techniques. Based on these properties the trucks are clustered using k-means and Hierarchical Clustering which shows groups of trucks where the importance of the properties varies. Due to the homogeneity and complexity of the chosen data, the clusters of trucks cannot be linked to truck applications. This would require data that is more easily interpretable. Finally, the importance for fuel consumption in the clusters is explored using model estimation. A comparison of Principal Component Regression (PCR) and the two regularization techniques Lasso and Elastic Net is made. PCR results in poor models difficult to evaluate. The two regularization techniques however outperform PCR, both giving a higher and very similar explained variance. 
The three techniques do not show obvious similarities in the models, and no conclusions can therefore be drawn about what is important for fuel consumption. During the data analysis, many problems with the data are discovered, which are linked to managerial and technical issues of big data. As a result, some of the parameters of interest for the analysis cannot be used, which is likely to contribute to the inability to get unanimous results in the model estimations. It is also concluded that the data was not originally intended for this type of analysis of large populations, but rather for testing and engineering purposes. Nevertheless, this type of data still contains valuable information and can be used if managed in the right way. From the case study it can be concluded that in order to use the data for more advanced analysis, a big-data plan is needed at a strategic level in the organization. The plan summarizes the suggested solution for the managerial issues of big data for the organization. It describes how to handle the data, how the analytic models revealing the information should be designed, and the tools and organizational capabilities needed to support the people using the information.
2

Gigli, Pierfrancesco. "Tactical asset allocation and machine learning: empirical findings on weights portfolio optimization with elastic net regularization." Master's thesis, 2020. http://hdl.handle.net/10362/108603.

Abstract:
This paper studies how a machine learning algorithm can generate a tactical allocation that outperforms the returns of a pre-defined benchmark. We use three distinct and diverse data sets to implement the model, which tries to forecast the next month's price of a selected equity index. The algorithm used to accomplish this task is Elastic Net. Once the predictions are generated from an out-of-sample subset, we elaborate a tactical portfolio allocation aiming to maximize the return of different combinations of a classical allocation between bonds and equity, and a risk-parity strategy. Finally, we evaluate those returns by comparing them to the benchmark.

Book chapters on the topic "Elastic-net regularization"

1

Xin-feng, Zhu, Wang Jian-dong, and Li Bin. "Multivariate Curve Resolution with Elastic Net Regularization." In Lecture Notes in Electrical Engineering. Springer Netherlands, 2011. http://dx.doi.org/10.1007/978-94-007-1839-5_160.

2

Lange, Carsten. "Ridge, Lasso, and Elastic-Net — Regularization Explained." In Practical Machine Learning with R. Chapman and Hall/CRC, 2024. http://dx.doi.org/10.1201/9781003367147-7.

3

Zhang, Jingwei, and Farzan Farnia. "Sparse Domain Transfer via Elastic Net Regularization." In Lecture Notes in Computer Science. Springer Nature Singapore, 2024. https://doi.org/10.1007/978-981-96-0917-8_13.

4

Gadwal, Madhuri, and Atul Negi. "DLMLP with Elastic-Net Regularization for Hyperspectral Image Classification." In Lecture Notes in Computer Science. Springer Nature Switzerland, 2025. https://doi.org/10.1007/978-3-031-81821-9_3.

5

Cleophas, Ton J., and Aeilko H. Zwinderman. "Optimal Scaling: Regularization Including Ridge, Lasso, and Elastic Net Regression." In Machine Learning in Medicine. Springer Netherlands, 2012. http://dx.doi.org/10.1007/978-94-007-5824-7_4.

6

Li, Tao, and Jinwen Ma. "Bayesian Probit Model with $L^{\alpha}$ and Elastic Net Regularization." In Intelligent Computing Theories and Application. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-95930-6_29.

7

Montesinos López, Osval Antonio, Abelardo Montesinos López, and Jose Crossa. "Fundamentals of Artificial Neural Networks and Deep Learning." In Multivariate Statistical Machine Learning Methods for Genomic Prediction. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-89010-0_10.

Abstract:
In this chapter, we go through the fundamentals of artificial neural networks and deep learning methods. We describe the inspiration for artificial neural networks and how the methods of deep learning are built. We define the activation function and its role in capturing nonlinear patterns in the input data. We explain the universal approximation theorem for understanding the power and limitation of these methods and describe the main topologies of artificial neural networks that play an important role in the successful implementation of these methods. We also describe loss functions (and their penalized versions) and give details about in which circumstances each of them should be used or preferred. In addition to the Ridge, Lasso, and Elastic Net regularization methods, we provide details of the dropout and the early stopping methods. Finally, we provide the backpropagation method and illustrate it with two simple artificial neural networks.
8

Raghava, M., Arun Agarwal, and C. Raghavendra Rao. "A Scalable Spatial Anisotropic Interpolation Approach for Object Removal from Images Using Elastic Net Regularization." In Lecture Notes in Computer Science. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-49397-8_11.

9

Hara, Kenji, and Kohei Inoue. "Multi-Metric Near-Optimal Image Denoising." In Denoising - New Insights [Working Title]. IntechOpen, 2022. http://dx.doi.org/10.5772/intechopen.106710.

Abstract:
It is necessary to optimize the parameters for each image input to achieve the maximum denoising performance because the performance of denoising algorithms depends largely on the selection of the associated parameters. The commonly used objective image quality measures in quantitatively evaluating a denoised image are PSNR, SSIM, and MS-SSIM, which assume that the original image exists and is fully available as a reference. However, we do not have access to such reference images in many practical applications. Most existing methods for no-reference denoising parameter optimization either use the estimated noise distribution or a unique no-reference image quality evaluation measure. In the chapter, for BM3D, which is a state-of-the-art denoising algorithm, we introduce a natural image statistics (NIS) based on the generalized Gaussian distribution (GGD) and the elastic net regularization (EN) regression method and propose its use to perform the BM3D parameter optimization for PSNR, SSIM, and MS-SSIM, respectively, which are the popular image quality evaluation measures, without reference image and knowledge of the noise distribution. Experimental results with several images demonstrate the effectiveness of the proposed approach.

Conference papers on the topic "Elastic-net regularization"

1

Chandra Sekar, P., Ammar H. Shnain, M. Jamuna Rani, G. Durgadevi, and R. Chethana. "A Support Vector Machine with Elastic Net Regularization and Radial Basis Function based Spectrum Sensing for Cognitive Radio Networks." In 2024 First International Conference on Software, Systems and Information Technology (SSITCON). IEEE, 2024. https://doi.org/10.1109/ssitcon62437.2024.10796356.

2

Causin, P., G. Naldi, and R. M. Weishaeupl. "Elastic Net Regularization in Diffuse Optical Tomography Applications." In 2019 IEEE 16th International Symposium on Biomedical Imaging (ISBI). IEEE, 2019. http://dx.doi.org/10.1109/isbi.2019.8759476.

3

Heredia-Juesas, Juan, Luis Tirado, and Jose A. Martinez-Lorenzo. "Fast Source Reconstruction via ADMM with Elastic Net Regularization." In 2018 IEEE International Symposium on Antennas and Propagation & USNC/URSI National Radio Science Meeting. IEEE, 2018. http://dx.doi.org/10.1109/apusncursinrsm.2018.8608521.

4

Kim, Eunwoo, Minsik Lee, and Songhwai Oh. "Elastic-net regularization of singular values for robust subspace learning." In 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2015. http://dx.doi.org/10.1109/cvpr.2015.7298693.

5

Wang, Duo, and Toshihisa Tanaka. "Sparse kernel principal component analysis based on elastic net regularization." In 2016 International Joint Conference on Neural Networks (IJCNN). IEEE, 2016. http://dx.doi.org/10.1109/ijcnn.2016.7727676.

6

Qiu, Bing-Jiang, Wen-Sheng Chen, Bo Chen, and Bin-Bin Pan. "A new parallel MRI image reconstruction model with elastic net regularization." In 2016 International Joint Conference on Neural Networks (IJCNN). IEEE, 2016. http://dx.doi.org/10.1109/ijcnn.2016.7727638.

7

Angel Gutierrez-Estevez, Miguel, Renato L. G. Cavalcante, and Slawomir Stanczak. "Nonparametric Radio Maps Reconstruction Via Elastic Net Regularization with Multi-Kernels." In 2018 IEEE 19th International Workshop on Signal Processing Advances in Wireless Communications (SPAWC). IEEE, 2018. http://dx.doi.org/10.1109/spawc.2018.8445843.

8

Kelly, J. W., A. D. Degenhart, D. P. Siewiorek, A. Smailagic, and Wei Wang. "Sparse linear regression with elastic net regularization for brain-computer interfaces." In 2012 34th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). IEEE, 2012. http://dx.doi.org/10.1109/embc.2012.6346911.

9

Li, Wei, Lan Chang, and Qian Du. "Representation-based classification for hyperspectral imagery: An elastic net regularization approach." In 2015 7th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS). IEEE, 2015. http://dx.doi.org/10.1109/whispers.2015.8075436.

10

Srivatsaan, S., Adhith Sankar, and M. Karthikeyan. "Impact of elastic net and LASSO regularization techniques on the NHANES dataset." In 4TH INTERNATIONAL CONFERENCE ON INTERNET OF THINGS 2023: ICIoT2023. AIP Publishing, 2024. http://dx.doi.org/10.1063/5.0217034.


Reports on the topic "Elastic-net regularization"

1

Cerulli, Giovanni. Machine Learning and AI for Research in Python. Instats Inc., 2023. http://dx.doi.org/10.61700/b7qz5fpva9dar469.

Abstract:
This seminar is an introduction to Machine Learning and Artificial Intelligence methods for the social, economic, and health sciences using Python. After introducing the subject, the seminar will cover the following methods: (i) model selection and regularization (Lasso, Ridge, Elastic-net); (ii) discriminant analysis and nearest-neighbor classification; and (iii) artificial neural networks. The course will offer various instructional examples using real datasets in Python. An Instats certificate of completion is provided at the end of the seminar, and 2 ECTS equivalent points are offered.
2

Cerulli, Giovanni. Machine Learning and AI for Research in Python. Instats Inc., 2023. http://dx.doi.org/10.61700/x92cmhdrxsvu7469.

Abstract:
This seminar is an introduction to Machine Learning and Artificial Intelligence methods for the social, economic, and health sciences using Python. After introducing the subject, the seminar will cover the following methods: (i) model selection and regularization (Lasso, Ridge, Elastic-net); (ii) discriminant analysis and nearest-neighbor classification; and (iii) artificial neural networks. The course will offer various instructional examples using real datasets in Python. An Instats certificate of completion is provided at the end of the seminar, and 2 ECTS equivalent points are offered.
3

Cerulli, Giovanni. Machine Learning and AI for Researchers in R. Instats Inc., 2023. http://dx.doi.org/10.61700/n8rzaz6kghskt469.

Abstract:
This seminar is an introduction to Machine Learning and Artificial Intelligence methods for the social, economic, and health sciences using R. After introducing the subject, the seminar will cover the following methods: (i) model selection and regularization (Lasso, Ridge, Elastic-net, and subset-selection models); (ii) discriminant analysis and nearest-neighbor classification; and (iii) artificial neural networks. The course will offer various instructional examples using real datasets in R. An Instats certificate of completion is provided at the end of the seminar, and 2 ECTS equivalent points are offered.
4

Cerulli, Giovanni. Machine Learning and AI for Researchers in R. Instats Inc., 2023. http://dx.doi.org/10.61700/atz7nxsz9afbm469.

Abstract:
This seminar is an introduction to Machine Learning and Artificial Intelligence methods for the social, economic, and health sciences using R. After introducing the subject, the seminar will cover the following methods: (i) model selection and regularization (Lasso, Ridge, Elastic-net, and subset-selection models); (ii) discriminant analysis and nearest-neighbor classification; and (iii) artificial neural networks. The course will offer various instructional examples using real datasets in R. An Instats certificate of completion is provided at the end of the seminar, and 2 ECTS equivalent points are offered.
5

Cerulli, Giovanni. Machine Learning and AI for Researchers in R. Instats Inc., 2023. http://dx.doi.org/10.61700/w5nn12uvjosgd469.

Abstract:
This seminar is an introduction to Machine Learning and Artificial Intelligence methods for the social, economic, and health sciences using R. After introducing the subject, the seminar will cover the following methods: (i) model selection and regularization (Lasso, Ridge, Elastic-net, and subset-selection models); (ii) discriminant analysis and nearest-neighbor classification; and (iii) artificial neural networks. The course will offer various instructional examples using real datasets in R. An Instats certificate of completion is provided at the end of the seminar, and 2 ECTS equivalent points are offered.
