Journal articles on the topic 'Constructive Cost Model and Software Development Phase'

Consult the top 50 journal articles for your research on the topic 'Constructive Cost Model and Software Development Phase.'

1

Attarzadeh, Iman, and Siew Hock Ow. "Proposing an Effective Artificial Neural Network Architecture to Improve the Precision of Software Cost Estimation Model." International Journal of Software Engineering and Knowledge Engineering 24, no. 06 (2014): 935–53. http://dx.doi.org/10.1142/s0218194014500338.

Abstract:
Software companies have to manage different software projects under varying time, cost, and manpower requirements, which makes software project management a very complex task. Producing accurate software estimates in the early phase of development has been a crucial objective, and a great challenge, in software project management over the last decades. Since software development attributes are vague and uncertain at that early phase, software estimates tend to carry a certain degree of estimation error. A software development cost estimation model that incorporates soft computing techniques provides a way to handle this vagueness and uncertainty. In this paper, an adaptive artificial neural network (ANN) architecture for the Constructive Cost Model (COCOMO) is proposed in order to produce accurate software estimates. The ANN is used to calibrate the weights of the software attributes from past project data. Software project data from the COCOMO I and NASA'93 data sets were used to evaluate the proposed model. The results show an 8.36% improvement in estimation accuracy for ANN-COCOMO II compared with the original COCOMO II.
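As a rough illustration of this kind of approach (not the authors' architecture), the sketch below trains a small neural network on COCOMO-style features in log space, so the network effectively learns a data-driven calibration of the multiplicative COCOMO form. The data set, network size, and constants are assumptions for the example; scikit-learn is assumed to be available.

```python
# Minimal sketch of an ANN-calibrated COCOMO-style estimator (not the
# authors' architecture). Data is synthetic, standing in for historical
# project records such as COCOMO'81 / NASA'93.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 100
ksloc = rng.uniform(2, 400, n)                 # project size in KSLOC
eaf = rng.uniform(0.5, 1.5, n)                 # product of effort multipliers
# Synthetic "true" effort: a COCOMO-like log-linear law plus noise.
effort = 2.94 * eaf * ksloc ** 1.10 * rng.lognormal(0.0, 0.15, n)

# Work in log space so the network learns a calibration of the
# multiplicative COCOMO form rather than raw person-months.
X = np.column_stack([np.log(ksloc), np.log(eaf)])
y = np.log(effort)

ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
ann.fit(X, y)

new_project = np.array([[np.log(50.0), np.log(1.2)]])   # 50 KSLOC, EAF 1.2
print("predicted effort (person-months):",
      float(np.exp(ann.predict(new_project))[0]))
```

Working in log space keeps the learned mapping close to COCOMO's effort = a * EAF * KSLOC^b law, which is what makes the network a calibration rather than a black-box replacement.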
2

Ba’abbad, Ibrahim Mohammad, and M. Rizwan Jameel Qureshi. "Quality Extended Use Case Point (QUCP): An Improved Cost Estimation Method." International Journal of Computer Science and Mobile Computing 10, no. 6 (2021): 1–9. http://dx.doi.org/10.47760/ijcsmc.2021.v10i06.001.

Abstract:
The quality of a product is one of the major concerns of the manufacturing process in all industries. The software industry structures a project into several phases to ensure the production of high-quality software. A software development company estimates the time, effort, and cost of a project during the planning phase, and accurate estimates are important for reducing the risk of project failure. Several cost estimation methods are practised in software development companies, such as Function Points (FP), Use Case Points (UCP), Constructive Cost Model I and II, and Story Points (SP). The UCP cost estimation method is taken up in this research to improve the accuracy of its estimation. UCP estimation depends on the use case diagram of the proposed system, which describes its main functional requirements; non-functional requirements are only partially considered through the technical and environmental factors. The UCP method thus lacks a way to account for the importance of quality attributes in the estimating process. This paper proposes an extended version of the existing UCP method, named Quality Extended Use Case Point (QUCP), in which quality attributes are included to improve the accuracy of cost estimation. A questionnaire is used to validate the proposed QUCP method. Data analysis shows that seventy-five percent of the participants agree that the proposed method will not only help to improve the accuracy of cost estimation but also enable a software development company to deliver high-quality products.
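For readers unfamiliar with the baseline method being extended, the sketch below computes standard Use Case Points with Karner's customary weights and the commonly cited 20 person-hours per UCP. All counts and factor values are illustrative assumptions, and the paper's QUCP quality-attribute extension is not reproduced here.

```python
# Standard Use Case Points calculation (the baseline QUCP extends),
# using Karner's customary weights. The counts below are invented.
actor_weights = {"simple": 1, "average": 2, "complex": 3}
use_case_weights = {"simple": 5, "average": 10, "complex": 15}

actors = {"simple": 2, "average": 2, "complex": 1}       # counts per class
use_cases = {"simple": 4, "average": 6, "complex": 3}

uaw = sum(actor_weights[k] * n for k, n in actors.items())         # actor weight
uucw = sum(use_case_weights[k] * n for k, n in use_cases.items())  # use case weight

# TCF and ECF are normally derived from 13 technical and 8 environmental
# factor ratings (0-5); here the two adjustment factors are assumed directly.
tcf, ecf = 1.02, 0.91

ucp = (uaw + uucw) * tcf * ecf
effort_hours = ucp * 20    # 20 person-hours per UCP is a common default
print(f"UCP = {ucp:.1f}, estimated effort = {effort_hours:.0f} person-hours")
```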
3

Hasoon, Safwan, and Fatima Younis. "Constructing Expert System to Automatic Translation for Software development." Al-Kitab Journal for Pure Sciences 2, no. 2 (2018): 231–47. http://dx.doi.org/10.32441/kjps.02.02.p16.

Abstract:
The development of computer fields, especially software engineering, has created the need for an intelligent tool that automatically translates from the design phase to the coding phase, producing source code from an algorithm model represented in pseudocode and executing it. The tool is built as an expert system, which reduces the cost, time, and errors that may occur during the translation process, and comprises a knowledge base, an inference engine, and a user interface. The knowledge base consists of the facts and rules for the automatic transition. The results are compared with a set of neural networks: a back-propagation neural network, a cascade-forward network, and a radial basis function network. The results showed the superiority of the expert system in the speed of the automatic transition process, as well as the ease of adding, deleting, or modifying rules or pseudocode data, compared with the aforementioned neural networks.
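A toy illustration of the knowledge-base-plus-inference-engine structure the abstract describes (the rules and pseudocode syntax here are invented for illustration, not the authors' system):

```python
# Toy rule-based pseudocode-to-Python translator, illustrating the
# knowledge-base + inference-engine structure described in the abstract.
import re

# Knowledge base: (pattern, template) rules for the automatic transition.
RULES = [
    (re.compile(r"^SET (\w+) TO (.+)$"), r"\1 = \2"),
    (re.compile(r"^IF (.+) THEN$"), r"if \1:"),
    (re.compile(r"^WHILE (.+) DO$"), r"while \1:"),
    (re.compile(r"^PRINT (.+)$"), r"print(\1)"),
    (re.compile(r"^END$"), ""),  # block ends become dedents
]

def translate(pseudo: str) -> str:
    """Inference engine: apply the first matching rule to each line."""
    out, indent = [], 0
    for line in pseudo.strip().splitlines():
        line = line.strip()
        for pattern, template in RULES:
            if pattern.match(line):
                code = pattern.sub(template, line)
                if code == "":            # END closes a block
                    indent -= 1
                else:
                    out.append("    " * indent + code)
                    if code.endswith(":"):
                        indent += 1
                break
    return "\n".join(out)

print(translate("""
SET total TO 0
WHILE total < 10 DO
PRINT total
SET total TO total + 1
END
"""))
```

Adding, deleting, or modifying a rule only touches the RULES table, which is the maintainability advantage the abstract attributes to the expert-system approach.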
4

Khosakitchalert, Chavanont, Nobuyoshi Yabuki, and Tomohiro Fukuda. "Development of BIM-based quantity takeoff for light-gauge steel wall framing systems." Journal of Information Technology in Construction 25 (December 18, 2020): 522–44. http://dx.doi.org/10.36680/j.itcon.2020.030.

Abstract:
Quantity takeoff based on building information modeling (BIM) is more reliable, accurate, and rapid than the traditional quantity takeoff approach. However, the quality of BIM models affects the quality of BIM-based quantity takeoff. Our research focuses on drywalls, which consist of wall framings and wall panels. If BIM models from the design phases do not contain wall framing models, contractors or sub-contractors cannot perform quantity takeoff for purchasing materials. Developing wall framing models under a tight schedule in the construction phase is time-consuming, cost-intensive, and error-prone. The increased geometries in a BIM model also slow down the software performance. Therefore, in this research, an automatic method is proposed for calculating quantities of wall framings from drywalls in a BIM model. Building elements that overlap with the drywalls are subtracted from the drywall surfaces before calculation. The quantities of wall framings are then embedded into the properties of drywall in the BIM model and hence they can be extracted directly from the BIM model. A prototype system is developed and the proposed method is validated in an actual construction project. The results of the case study showed that the prototype system took 282 s to deliver accurate quantities of wall framings with deviations of 0.11 to 0.30% when compared to a baseline, and the file size of the BIM model after applying the proposed method was increased very slightly from 47.0 MB to 47.1 MB. This research contributes to developing an approach for quantity takeoff of wall framings that are not present in a BIM model. Accurate quantities of wall framings can be obtained while the time and cost of developing wall framings for quantity takeoff can be saved. The proposed method does not increase the geometries in the BIM model; therefore, the file size of the model does not increase greatly, which stabilizes the software performance.
5

Challa, Ratna Kumari, and Kanusu Srinivasa Rao. "An Effective Optimization of Time and Cost Estimation for Prefabrication Construction Management Using Artificial Neural Networks." Revue d'Intelligence Artificielle 36, no. 1 (2022): 115–23. http://dx.doi.org/10.18280/ria.360113.

Abstract:
The success of every construction business relies on projects being performed in the agreed period and at the negotiated rate. The construction business includes prefabrication firms, logistics companies, on-site design industries, and so on. The prefabrication method involves assembling structural parts at a manufacturing plant and transporting them to the building site as finished or semi-assembled components. Artificial neural networks (ANNs) are used for optimization because of their capacity to handle qualitative and quantitative difficulties in the building industry. An ANN processes data through input, hidden, and output layers, with the outcome depending on the learned weights of the hidden layer; different modeling strategies are used to optimize the layers. ANNs cover a wide variety of issues in construction management, for instance cost analysis, decision making, and prediction of the mark-up percentage and the production rate in the construction industry. The main advantage of the prefabricated methodology is that the procedure is easily carried out; another real benefit of the prefabrication process is its integrated versatility. The present study underlines that total project period and cost are the key considerations in the current job procurement phase of prefabricated construction. The results of the proposed model are based on ANN algorithms, which mainly achieve suitable weight values for time and cost estimates.
6

Huseynov, Mehdi, Natig Hamidov, and Jabrayil Eyvazov. "Construction of Numerical PVT-Models for the Bulla-Daniz Gas-Condensate Field Based on Laboratory Experiments on Reservoir Fluid Samples." European Journal of Applied Science, Engineering and Technology 2, no. 1 (2024): 26–33. https://doi.org/10.59324/ejaset.2024.2(1).04.

Abstract:
PVT analysis is important for field-wide optimization and development. This is because we must understand the fluid's overall behavior from the reservoir to the production and processing facilities, and ultimately to the refinery. Modern computer software that uses equation of state (EOS) models to simulate experiments and illustrate fluid phase characteristics has contributed to its growth as a distinct field of study. To find the operating parameters that will maximize the surface liquid content and prolong the production plateau duration at the lowest feasible cost, PVT simulations are run. These simulations employ laboratory-derived data to fine-tune the EOS models, with the outcomes being integrated into reservoir simulation and research. The quality of the data is crucial to getting a good match between EOS and laboratory data, and for retrograde gas condensates, this can be particularly difficult because of their complex phase behavior. When utilized in reservoir simulations, an inadequate match leads to computational mistakes and unrepresentative findings, endangering the reservoir management decisions that depend on it.
7

Rianat Abbas, Sunday Jacob Nwanyim, Joy Awoleye Adesina, Augustine Udoka Obu, Adetomiwa Adesokan, and Jeremiah Folorunso. "Secure by design - enhancing software products with AI-Driven security measures." Computer Science & IT Research Journal 6, no. 3 (2025): 184–200. https://doi.org/10.51594/csitrj.v6i3.1880.

Abstract:
As cyber threats continue to evolve in scale and complexity, traditional reactive security measures no longer suffice. This study explores the integration of AI-driven security within the Secure by Design framework as a forward-looking approach to building inherently secure digital products across industries. Rather than treating security as an afterthought, Secure by Design embeds protective mechanisms, such as encryption, predictive analytics, and real-time threat detection, throughout the product development lifecycle. This research employs a quantitative design, surveying 203 professionals from sectors including finance, software development, agriculture, and construction. It investigates the adoption, effectiveness, and challenges of AI-powered security measures, using machine learning algorithms to analyze key security features. The findings reveal that encryption, predictive security, and automated response systems are the most impactful components in strengthening product security. The model achieved strong performance with an accuracy of 79%, though challenges such as false positives and integration complexity persist. Despite growing awareness, many organizations still address security reactively, with only 14.8% incorporating it during the design phase. Barriers such as limited awareness, cost, and complexity continue to slow adoption. However, 74.9% of respondents express openness to deeper AI integration in future product developments, highlighting optimism about its potential. This study reinforces the need for a proactive shift in security practices, where AI not only supports real-time threat detection but also future-proofs products in an increasingly hostile cyber landscape. By embedding AI into the design phase, organizations can reduce attack surfaces, comply with regulatory demands, and build stakeholder trust. Future research should explore industry-specific implementations, autonomous AI systems in low-tech environments, and the scalability of cross-sector security frameworks. Keywords: Secure by Design, AI-Driven Security, Encryption, Predictive Threat Detection, Machine Learning, Product Development.
8

Nevendra, Meetesh, and Pradeep Singh. "Cross-Project Defect Prediction with Metrics Selection and Balancing Approach." Applied Computer Systems 27, no. 2 (2022): 137–48. http://dx.doi.org/10.2478/acss-2022-0015.

Abstract:
In software development, defects influence quality and cost in an undesirable way. Software defect prediction (SDP) is a technique that improves software quality and testing efficiency through early identification of defects (bugs/faults/errors), and several approaches have been suggested for defect prediction (DP). DP methods mainly utilise historical project data to construct prediction models. SDP performs well within a project as long as an adequate amount of data is available to train the models. However, if the data are inadequate or limited for the same project, researchers mainly use Cross-Project Defect Prediction (CPDP), a possible alternative that refers to anticipating defects using prediction models built on historical data from other projects. CPDP is challenging due to data distribution and domain difference problems. The proposed framework is an effective two-stage approach for CPDP, i.e., model generation and prediction. In the model generation phase, a combination of pre-processing steps, including feature selection and class reweighting, is used to improve the initial data quality. Then a fine-tuned, efficient bagging- and boosting-based hybrid ensemble model is developed, which avoids model over-fitting/under-fitting and helps enhance prediction performance. In the prediction phase, the generated model classifies the historical data from other projects as defective or clean. The framework is evaluated using 25 software projects obtained from public repositories. The result analysis shows that the proposed model achieves a 0.71±0.03 F1-score, which significantly improves on state-of-the-art approaches by 23% to 60%.
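A compact sketch of such a two-stage pipeline (metric selection, class reweighting, then a bagging-over-boosting ensemble) follows. The synthetic data, the choice of k=10 selected metrics, the weight of 4.0 for defective samples, and the gradient-boosted base learner are all illustrative assumptions, not the paper's configuration; scikit-learn >= 1.2 is assumed for the estimator parameter name.

```python
# Sketch of a two-stage cross-project defect prediction pipeline in the
# spirit of the abstract: metric selection and class rebalancing, then
# a bagging-over-boosting hybrid ensemble.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, GradientBoostingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import f1_score
from sklearn.pipeline import make_pipeline

# Stand-ins for source-project (training) and target-project (test) data.
X_src, y_src = make_classification(n_samples=600, n_features=20,
                                   weights=[0.8, 0.2], random_state=1)
X_tgt, y_tgt = make_classification(n_samples=200, n_features=20,
                                   weights=[0.8, 0.2], random_state=2)

model = make_pipeline(
    SelectKBest(f_classif, k=10),                      # metric selection
    BaggingClassifier(                                 # bagging over boosting
        estimator=GradientBoostingClassifier(random_state=0),
        n_estimators=10, random_state=0),
)
# Class reweighting: give minority (defective) samples more weight.
weights = [4.0 if label == 1 else 1.0 for label in y_src]
model.fit(X_src, y_src, baggingclassifier__sample_weight=weights)
print("cross-project F1:", round(f1_score(y_tgt, model.predict(X_tgt)), 3))
```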
9

Jausovec, Marko, and Metka Sitar. "Comparative Evaluation Model Framework for Cost-Optimal Evaluation of Prefabricated Lightweight System Envelopes in the Early Design Phase." Sustainability 11, no. 18 (2019): 5106. http://dx.doi.org/10.3390/su11185106.

Abstract:
This paper proposes an extended comparative evaluation model framework (ECEMF) with two objectives: (1) a specific economic evaluation method for the cost optimisation of prefabricated lightweight system envelopes to achieve a greater building value, and (2) a comparative evaluation model framework usable by different stakeholder profiles when deciding on the optimal envelope type in the early design phase. Based on the proposed framework, an analysis was conducted for a case study building, a small single-family house located in Slovenia. The methodology is based on the life cycle cost (LCC), including construction, operation, maintenance, and refurbishment costs but excluding dismantling, disposal, and reuse, over a 50-year building lifetime, and combines Building Information Modelling (BIM) with Value for Money (VfM) assessment. To exploit an automated evaluation process in the computing environment, several tools were used, including Archicad for BIM in combination with Legep software for LCC. On one hand, the model confirms the assumption that the optimal value parameters of a building do not depend only on the typical costs related to high-performance buildings. On the other hand, from the stakeholders' view, the model enables the optimal envelope type to be chosen in the early design phase. In this view, the model could function as an important decision tool, with a direct economic impact on the value.
10

Kanwal, Iqra, and Ali Afzal Malik. "Regression-based predictive modelling of software size of fintech projects using technical specifications." Mehran University Research Journal of Engineering and Technology 44, no. 2 (2025): 164–73. https://doi.org/10.22581/muet1982.3289.

Abstract:
This research aims to develop a predictive model to estimate the lines of code (LOC) of software projects using technical requirements specifications. It addresses the recurring issue of inaccurate effort and cost estimation in software development that often results in budget overruns and delays. This study includes a detailed analysis of a dataset comprising past real-life software projects. It focuses on extracting relevant predictors from projects' requirements written in technical and easily comprehensible natural language. To assess feasibility, a pilot study is conducted at the beginning. Then, Simple Linear Regression (SLR) is employed to determine the relative predictive strength of eight potential predictors identified earlier. The number of API calls is found to be the strongest independent predictor (R2 = 0.670) of LOC. The subsequent phase entails constructing a software size prediction model using Forward Stepwise Multiple Linear Regression (FSMLR). The adjusted R2 value of the final model indicates that two factors – the number of API calls and the number of GUI fields – account for more than 80% of the variation in code size (measured using LOC). Model validation is performed using k-fold cross-validation. Validation results are also promising. The average MMRE of all folds is 0.203 indicating that, on average, the model's predictions are off by approximately 20% relative to the actual values. The average PRED (25) is 0.708 implying that nearly 71% of predicted size values are within 25% of the actual size values. This model can help project managers in making better decisions regarding project management, budgeting, and scheduling.
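The evaluation idea in this abstract is easy to make concrete: fit a two-predictor linear model for LOC and score it with MMRE and PRED(25). In the sketch below the data is synthetic and the coefficients are invented; only the two predictor names (API calls, GUI fields) and the metric definitions follow the abstract.

```python
# Sketch of the paper's evaluation idea: a two-predictor linear model
# for LOC, scored with MMRE and PRED(25). Data and coefficients are
# synthetic stand-ins, not the paper's dataset.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 60
api_calls = rng.integers(20, 200, n)
gui_fields = rng.integers(0, 120, n)
loc = 150 * api_calls + 40 * gui_fields + rng.normal(0, 800, n)

X = np.column_stack([api_calls, gui_fields])
pred = LinearRegression().fit(X, loc).predict(X)

mre = np.abs(loc - pred) / np.abs(loc)   # magnitude of relative error
print(f"MMRE = {mre.mean():.3f}, PRED(25) = {(mre <= 0.25).mean():.3f}")
```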
11

Malviya, Ashwani Kumar, Mehdi Zarehparast Malekzadeh, Francisco Enrique Santarremigia, Gemma Dolores Molero, Ignacio Villalba-Sanchis, and Victor Yepes. "A Formulation Model for Computations to Estimate the Lifecycle Cost of NiZn Batteries." Sustainability 16, no. 5 (2024): 1965. http://dx.doi.org/10.3390/su16051965.

Abstract:
The increasing demand for electricity and the electrification of various sectors require more efficient and sustainable energy storage solutions. This paper focuses on the novel rechargeable nickel–zinc battery (RNZB) technology, which has the potential to replace the conventional nickel–cadmium battery (NiCd) in terms of safety, performance, environmental impact, and cost. The paper aims to provide a comprehensive and systematic analysis of RNZBs by modeling their lifecycle cost (LCC) from cradle to grave. It applies this LCC model to estimate costs along the RNZB's lifecycle in two forms: per kilogram of battery mass and per kilowatt hour of energy released. The model is shown to be reliable by comparing its results with costs provided by recognized software used for LCC analysis. A comparison of LCCs for three widely used battery technologies that can be market competitors of NiZn (lead–acid, Li-ion LFP, and NMC) is also provided. The study concludes that the NiZn battery is the cheapest throughout its entire lifecycle, with NiZn Formulation 1 being the cheapest option, and that the cost per unit of energy released is also the lowest for NiZn batteries. A current limitation is the availability of data for nickel–zinc batteries, which are still in the research and development phase, while other battery types are already widely used in energy storage. This paper recommends taking into account the location factor of infrastructures, the cost of machinery and storage, the number of suppliers of raw materials, the amount of materials transported in each shipment, and the value of materials recovered after the battery recycling process to further reduce costs throughout the battery's lifecycle. This LCC model can also be used for other energy storage technologies and can serve as an objective function for optimization in further developments.
12

Balasubramanyam, N. "Design and Development of a Numerical Model and Study on Kinematic Analysis of a Circular Diamond Saw Blade for Ceramics." International Journal for Research in Applied Science and Engineering Technology 9, no. 12 (2021): 686–91. http://dx.doi.org/10.22214/ijraset.2021.39353.

Abstract:
Diamond tools are currently being used by an increasing number of architects, miners, and construction engineers because they are faster and easier to use than older, more traditional instruments like sledgehammers and pneumatic and hydraulic jacks. Bridge and highway surfaces are cut with diamond asphalt and concrete cutting machines to provide quick, clean, and easy section removal and replacement, and the overall cost is reduced since diamond tools take less time and manpower. The experiment is carried out to validate the performance of diamond saw blades by taking into consideration characteristics such as normal force, tangential force, cutting speed, cut depth, and peripheral velocity. In the present work, as an initial design phase, diamond tool blades with different segment counts (8, 12, 16, and 20) are designed using SolidWorks software and then imported into ANSYS software for analysis, and the quantities required for the analysis are computed for ceramic tiles as well as other stone materials. A new cutting force model is presented and a numerical model for chip thickness is derived; the equivalent chip thickness to grit spacing ratio is obtained from the new force model, and a new radial hole-like slot profile is introduced. Segmented diamond saw blades with a diameter of 400 mm and various segment counts (8, 12, 16, and 20) are successfully designed in SolidWorks. A comparative study between the existing circular radial slot and the cone-like slot is carried out to determine deformation, stress distribution, vibration, and temperature distribution.
13

Chen, Denghong, Tianwei Cao, Ke Yang, Ran Chen, Chao Li, and Ruxiang Qin. "Study on the Optimization of Proportion of Fly Ash-Based Solid Waste Filling Material with Low Cost and High Reliability." Sustainability 14, no. 14 (2022): 8530. http://dx.doi.org/10.3390/su14148530.

Abstract:
In order to solve the problem of the high cost of coal-based solid waste bulk stacking and paste filling in the large-scale coal electrification base in East NingXia, in this study fly ash is used to replace broken coal gangue as the mixed filling material, since crushing large coal gangue with a jaw crusher is expensive and its energy consumption is relatively high. Paste filler using fly ash as aggregate is studied through micro and macro test analyses. Using response surface methodology design software, 29 groups of mix proportion schemes are designed to obtain the best mix proportion, and radar results for slump, slump flow, and compressive strength are obtained by the normalization method. According to the radar chart results of the three normalized indexes, the optimal ratio parameters are as follows: the fly ash in solid phase is 79%, the mass of fly ash to the mass of cement (FA/C) is 6:1, the solid mass concentration is 78%, and the fly ash to gasification slag ratio is 1:1; the results show that σ3d = 2.20 MPa, slump = 205 mm, and flow = 199 mm. Taking the solid mass concentration, FA/C, the fly ash content in solid phase, and the coal gangue-to-gasification slag ratio as independent variables, the influence of single-factor and multi-factor interactions of the independent variables is analyzed based on the response surface model. It is found that the solid mass concentration and FA/C have a very significant effect on early strength. Replacing the coal gangue base with a fly ash base can effectively reduce crushing cost and energy consumption and provide a low-cost and highly reliable technical reserve for large-scale filling.
14

Ghanbari, Afshin, Hossein Shirazi, and Mohammad Amin Adibi. "Designing a Model for the Impact of Viral Marketing in Social Networks Using Adaptive Neuro-Fuzzy Inference Systems." Management Strategies and Engineering Sciences 6, no. 5 (2024): 1–8. https://doi.org/10.61838/msesj.6.5.1.

Abstract:
The aim of this study is to design a model for the impact of viral marketing in social networks using the Adaptive Neuro-Fuzzy Inference System (ANFIS). This research utilizes the Adaptive Neuro-Fuzzy Inference System due to its ability to implement human knowledge through concepts like timestamps and fuzzy rules, its nonlinear nature, adaptability, and superior accuracy compared to other methods in conditions with limited data. These features are among the most significant advantages of ANFIS systems. The MATLAB software was employed in the ANFIS framework to modify inputs and outputs. This research followed the steps of input fuzzification, fuzzy rule base development, fuzzy inference engine construction, aggregation phase, and defuzzification when employing the Adaptive Neuro-Fuzzy Inference System. Value-based marketing relies on the principle that individuals who have used a product or service and had a positive experience share this experience with others, encouraging them to use the product or service as well. Viral marketing is, in a way, a form of partnership where an individual shares their experience with another person who needs the product or service. Due to its broad reach, low cost, high speed, and simplicity, companies can implement controlled viral marketing campaigns through principled and regulatory-compliant marketing efforts, thereby contributing to the growth of the company. This is because, within a short period, many people become familiar with the company and its brand name.
15

Jia, Jing, Jieya Gao, Weixin Wang, Ling Ma, Junda Li, and Zijing Zhang. "An Automatic Generation Method of Finite Element Model Based on BIM and Ontology." Buildings 12, no. 11 (2022): 1949. http://dx.doi.org/10.3390/buildings12111949.

Abstract:
With the development of BIM (building information modeling) technology applications across the whole life cycle, data conversion and information transfer between the BIM model and the finite element model have become the main factors limiting the efficiency and quality of mechanical analysis in the structural design phase. The combined application of BIM and ontology technology has promoted automation in compliance checking, cost management, green building evaluation, and many other fields. Based on OpenBIM, this study combines IFC (Industry Foundation Classes) and an ontology system and proposes an automatic generation method for converting BIM to the finite element model. Firstly, the elements contained in the finite element model are generalized, and the set of information to be extracted or inferred from BIM for generating the finite element model is obtained accordingly. Secondly, a technical route for information extraction is constructed to satisfy the acquisition of this information set, covering three main aspects: IFC-based material information, spatial information, and other basic information; an ontology-based finite element cell selection method; and APDL statement generation based on JAVA, C#, etc. Finally, a complete technical route and a software architecture designed for converting BIM to the finite element model are derived. To assess the feasibility of the method, a simple structure is tested in this paper, and the result indicates that an automatic decision-making reasoning mechanism for constructing element types and meshing methods can be realized with ontology and IFC. This study contributes to the body of knowledge by providing an efficient method for automatic generation of the BIM structure model and a reference for future applications using BIM in structural analysis.
16

Zou, Yiquan, Zhaocheng Sun, Han Pan, Wenlei Tu, and Daode Dong. "Parametric Automated Design and Virtual Simulation of Building Machine Using BIM." Buildings 13, no. 12 (2023): 3011. http://dx.doi.org/10.3390/buildings13123011.

Abstract:
With the continuous development of construction technology for the main structure of high-rise buildings, traditional construction techniques have been widely used in high-rise building projects. However, these techniques suffer from problems such as low safety, low intelligence, and poor integrity. To address these challenges, the Third Engineering Bureau of China Construction independently developed a new type of construction technology: a high-rise building integrated work construction platform (referred to as a building machine). The building machine has gradually been applied to high-rise building projects due to its intelligence, integration, safety, and other advantages. However, with the increase in the number of high-rise building projects, the traditional way of designing building machine layout schemes is inefficient, and the resulting designs are complicated to change. To solve these problems, this paper proposes a parametric design, layout, and virtual simulation method for the building machine. The method uses the open-source software Blender and its visual programming tool Geometry Nodes (GN) to model the building machine parametrically and quickly, and simulates the building machine in a virtual scene through the Unity3D platform. The results show that, compared with the traditional design mode, the proposed method can quickly complete the scheme design in the design phase of the building machine and display it through a 3D model, which gives a better visualization effect, improves design efficiency, and reduces design cost; the Unity simulation platform can also provide construction personnel with pre-shift education and simulated operation of the building machine. This method provides a theoretical basis and guidance for the digital construction of the building machine.
17

Kadirvel, Kanchana, Raju Kannadasan, Mohammed H. Alsharif, and Zong Woo Geem. "Design and Modeling of Modified Interleaved Phase-Shifted Semi-Bridgeless Boost Converter for EV Battery Charging Applications." Sustainability 15, no. 3 (2023): 2712. http://dx.doi.org/10.3390/su15032712.

Abstract:
Electric vehicles (EVs) are set to become one of the domestic transportation systems that are highly preferred over conventional vehicles. Due to the huge demand for and cost of fuel, many people are switching over to EVs, and companies such as Tesla, BMW, Audi, and Mercedes have started marketing them. These EVs need charging stations to charge the batteries. The challenges for EV batteries require the implementation of features such as fast charging, long-run utilization, reduced heat emission, light weight, and small size. However, fast charging using conventional converters generates an imbalance in current injection due to the passive component selection. In this study, a converter using an interleaved network that provides balanced current injection is proposed: an improved interleaved phase-shifted semi-bridgeless boost converter (IIPSSBBC) designed for EV battery charging applications. The suggested approach is mathematically modeled using MATLAB/Simulink (2021) software. The results show that the battery charging current reaches about 16.5 A, which is more than in conventional systems. Moreover, the charging time of the proposed converter is about 6 hours for a 50 Ah battery with a discharge load capacity of 5000 W, which is less than with the conventional method. In a nutshell, compared with conventional converters, the IIPSSBBC performs better; notably, the charging speed and current injection are roughly doubled. Further, a prototype hardware model is developed to assess the performance of the proposed converter.
18

Gorkovchuk, J., and D. Gorkovchuk. "FEATURES OF HERITAGE BIM MODELING BASED ON LASER SCANNING DATA." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLVI-5/W1-2022 (February 3, 2022): 123–28. http://dx.doi.org/10.5194/isprs-archives-xlvi-5-w1-2022-123-2022.

Abstract:
BIM modeling technologies are gradually becoming mandatory and necessary in the life cycle of a building or structure. The main difference between BIM and other types of design is the collection and comprehensive processing of all architectural, technological, economic, operational, and other information about the building in a single informational environment; moreover, all elements of the model are interconnected and interdependent, which ensures maximum proximity of the model to the real situation. The advantages of BIM technology for cultural heritage sites are operational guidance and quality control of restoration and construction works, minimization of the probability of errors in projects, and cost reduction and optimization in the operation phase. The technological scheme of high-precision information modeling of cultural heritage objects includes data acquisition, modelling of structural elements, accumulation of attributive information in a specialized software environment, quality control, and visualization. The data acquisition stage is based on analysis of existing data and documentation and on executive surveys; for this purpose, terrestrial laser scanning is a perfect surveying tool due to its speed, accuracy, completeness of data, and level of detail. The difficulty of determining the technical characteristics of the elements and their physical properties can be partially resolved by the redundant information provided by laser scanning, while the complexity of modeling deformations and deviations restricts the level of development and detail (LOD) of the BIM model and motivates the use of mixed LOD (LOD 200-500) for a cultural heritage object. The factors that determine the LOD of BIM modeling are the current state of the object, surveying accessibility, availability of documentation, etc. Considering such modeling features enables the creation of an informational model that determines the functionality of the object, in keeping with its cultural, historical, and architectural value.
19

Sangeetha, M., and P. Sengottuvelan. "Software Refactoring Cost Estimation Using Particle Swarm Optimization." International Journal of Research Science & Management 4, no. 6 (2017): 43–49. https://doi.org/10.5281/zenodo.583651.

Abstract:
Nowadays, a major part of the software industry's development cost is devoted to software maintenance. The major challenge for this industry is to produce quality software that is designed on time and built with proper cost estimates. Refactoring is used to increase the ability of software to adopt new requirements and to ease maintenance. In this paper, we propose a cost estimation model based on multi-objective particle swarm optimization (MPSO) to tune the parameters of the well-known Constructive Cost Model (COCOMO). This cost estimation model is integrated with the Quality Function Deployment (QFD) methodology to support decision making in software design and development processes and to improve quality. The approach helps developers to efficiently plan the overall software development life cycle.
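To make the tuning idea concrete, here is a minimal single-objective PSO sketch that fits the a and b parameters of the COCOMO effort equation effort = a * KLOC^b to historical data by minimizing MMRE. The synthetic data, swarm settings, and parameter bounds are assumptions for illustration; the paper's multi-objective formulation and QFD integration are not reproduced.

```python
# Minimal PSO sketch for tuning COCOMO's (a, b) against project history.
import numpy as np

rng = np.random.default_rng(0)
kloc = rng.uniform(5, 300, 40)
effort = 3.0 * kloc ** 1.12 * rng.lognormal(0, 0.1, 40)  # synthetic history

def mmre(params):
    a, b = params
    pred = a * kloc ** b
    return np.mean(np.abs(effort - pred) / effort)

# Plain global-best PSO over (a, b), with illustrative bounds.
n_particles, iters = 30, 200
lo, hi = [0.5, 0.8], [6.0, 1.5]
pos = rng.uniform(lo, hi, (n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([mmre(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, 2))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([mmre(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("tuned a, b:", gbest.round(3), "MMRE:", round(mmre(gbest), 3))
```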
20

Rout, Manas Prasad. "Advanced Machine Learning Software Cost Prediction Model using AdaBoost and COCOMO Cost Parameters." Journal of Information Systems Engineering and Management 10, no. 49s (2025): 1266–72. https://doi.org/10.52783/jisem.v10i49s.10128.

Abstract:
Playing a pivotal role in software development, the Constructive Cost Model (COCOMO) offers a systematic and structured approach to cost estimation. It stands as a widely utilized model, aiding project managers in estimating the required effort, time, and cost for software development projects. COCOMO takes into consideration diverse factors, including the project's size, complexity, and the experience of the development team. The utilization of COCOMO empowers software development teams to make informed decisions related to resource allocation, project scheduling, and budgeting. Its application extends to managing expectations, enhancing project planning, and mitigating the risk of cost overruns. The integration of machine learning assumes a critical role in advancing cost estimation within the realm of software development, specifically through COCOMO. Through the utilization of machine learning algorithms, COCOMO gains the capability to analyze and interpret extensive datasets, taking into account numerous complex factors that influence project costs. This work aims to propose a model for software cost estimation using an advanced machine learning technique, i.e. Adaptive boosting, which has improved accuracy, reduced overfitting, effectiveness with imbalanced data, and good generalization capabilities. The proposed work may contribute to the success of software projects, providing a reliable and comprehensive framework for cost estimation.
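A minimal sketch of the kind of model the abstract describes: an AdaBoost regressor trained on COCOMO-style cost parameters and scored with MMRE and PRED(25). The feature set and synthetic data are illustrative assumptions, not the paper's dataset or tuning; scikit-learn is assumed.

```python
# Sketch of AdaBoost-based effort estimation over COCOMO-style features.
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 200
kloc = rng.uniform(2, 400, n)
eaf = rng.uniform(0.5, 1.6, n)           # product of COCOMO effort multipliers
team_exp = rng.integers(1, 6, n)         # ordinal cost driver (1..5), invented
effort = 2.8 * eaf * kloc ** 1.1 * (1.1 - 0.02 * team_exp)

X = np.column_stack([kloc, eaf, team_exp])
X_tr, X_te, y_tr, y_te = train_test_split(X, effort, random_state=0)

model = AdaBoostRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
mre = np.abs(model.predict(X_te) - y_te) / y_te
print(f"MMRE = {mre.mean():.3f}, PRED(25) = {(mre <= 0.25).mean():.3f}")
```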
21

Khin Htay, Mie Mie Aung, Yin Yin Cho, and Moe Moe Thein. "Software Engineering Cost Estimation using COCOMO II Model." International Journal of Trend in Scientific Research and Development 3, no. 5 (2019): 2326–29. https://doi.org/10.5281/zenodo.3591420.

Abstract:
Estimating software cost and the price to the customer plays a vital role in software engineering. Accurate software development cost estimation is important for effective project management activities such as budgeting, project planning, and control. Before software development begins, the cost and duration of a project should be agreed between developers and customers. In this paper, the effort required to develop the system, the schedule needed to complete it, and the required average staffing are estimated using COCOMO (Constructive Cost Model), one of the algorithmic models. A mathematical formula predicts effort based on an estimate of project size in thousands of source lines of code (KSLOC). First, the KSLOC is calculated for the existing project or other projects. Then, the required effort, duration, and average staffing are calculated for each of the three project types: organic, semi-detached, and embedded. All these estimates are implemented with the Java programming language and Microsoft Access.
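The calculation the abstract walks through is the classic basic-COCOMO one; a direct sketch follows, using Boehm's standard coefficients for the three project types (the 32 KSLOC example size is an arbitrary assumption).

```python
# Basic COCOMO: effort, schedule, and average staffing for the three
# project types, using Boehm's classic coefficients.
COEFFS = {
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(ksloc: float, mode: str):
    a, b, c, d = COEFFS[mode]
    effort = a * ksloc ** b          # person-months
    duration = c * effort ** d       # months
    staff = effort / duration        # average headcount
    return effort, duration, staff

for mode in COEFFS:
    e, t, s = basic_cocomo(32.0, mode)   # e.g. a 32 KSLOC project
    print(f"{mode:13s} effort={e:6.1f} PM, time={t:4.1f} months, staff={s:4.1f}")
```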
22

Mishra, Debasisha, and Biswajit Mahanty. "A study of software development project cost, schedule and quality by outsourcing to low cost destination." Journal of Enterprise Information Management 29, no. 3 (2016): 454–78. http://dx.doi.org/10.1108/jeim-08-2014-0080.

Abstract:
Purpose – The purpose of this paper is to find good values of onsite-offshore team strength and of the number of hours of communication between business users and the onsite team and between the onsite and offshore teams, so as to reduce project cost and improve schedule in a global software development (GSD) environment for software development projects. Design/methodology/approach – This study employs a system dynamics simulation approach to study software project characteristics in both co-located and distributed development environments. The authors consulted 14 experts from the Indian software outsourcing industry during model construction and validation. Findings – The study results show that there is a drop in overall team productivity in an outsourcing environment when considering the offshore options, but project cost can be reduced by employing the offshore team for coding and testing work only, with minimal training for imparting business knowledge. The results also show that there is a potential to save project cost by being flexible in the project schedule. Research limitations/implications – The implication of the study is that the project management team should be careful not to keep a high percentage of manpower at the offshore location in a distributed software environment. A large offshore team can increase project cost and schedule due to higher training overhead, lower productivity, and higher error proneness. In GSD, the management effort should be to keep requirement analysis and design work at the onsite location and to involve the offshore team in coding and testing work. Practical implications – The software project manager can use the model results to divide the software team between onsite and offshore locations during the various phases of software development in a distributed environment. Originality/value – The study is novel, as there is little prior attempt at finding the team distribution between onsite and offshore locations in a GSD environment.
23

Turhan, Cihan, Ali Serdar Atalay, and Gulden Gokcen Akkurt. "An Integrated Decision-Making Framework for Mitigating the Impact of Urban Heat Islands on Energy Consumption and Thermal Comfort of Residential Buildings." Sustainability 15, no. 12 (2023): 9674. http://dx.doi.org/10.3390/su15129674.

Abstract:
Urban heat island (UHI) is a zone that is significantly warmer than its surrounding rural zones as a result of human activities and rapid, dense urbanization. Excessive air temperature due to the UHI phenomenon affects the energy performance of buildings and human health and contributes to global warming. Since most building energy is consumed by residential buildings, developing a framework to mitigate the impact of the UHI on residential building energy performance is vital. This study develops an integrated framework that combines hybrid micro-climate and building energy performance simulations with multi-criteria decision-making techniques. As a case study, an urban area is analyzed under the Urban GreenUP project funded by the European Union's Horizon 2020 Programme. Four different strategies to mitigate the UHI effect, comprising the current situation, replacing low-albedo materials with high-albedo ones, nature-based solutions (NBSs), and changing building façade materials, are investigated with a micro-climatic simulation tool. The output of the strategies, namely potential air temperature, is then used in dynamic building energy simulation software to obtain energy consumption and thermal comfort data for the residential buildings in the case area. Finally, a multi-criteria decision-making model using real-life criteria, such as total energy consumption, thermal comfort, capital cost, lifetime, and installation flexibility, is used to select a strategy for decreasing the UHI effect on the residential energy performance of buildings. The results showed that applying NBSs, such as green roofs and replacing existing trees with high leaf-area-density ones, ranks highest among all mitigation strategies. The output of this study may help urban planners, architects, and engineers in decision-making processes during the design phase of urban planning.
24

Samya, Sumithra Alagar, Vijayalakshmi Nagarajan, Ahilan Appathurai, and Salinda Eveline Suniram. "Software Cost Effort and Time Estimation Using Dragonfly Whale Lion Optimized Deep Neural Network." Revue Roumaine des Sciences Techniques — Série Électrotechnique et Énergétique 69, no. 4 (2024): 431–36. http://dx.doi.org/10.59277/rrst-ee.2024.69.4.11.

Abstract:
Effective software development depends on exact estimation of effort, time, cost, and customer satisfaction. Software project management requires an accurate evaluation of software development's effort, time, and cost, which are often underestimated or overestimated. So far, no methodology has been able to estimate the cost of software development accurately and reliably. To overcome this issue, this paper proposes a constructive rapid application development model based software cost, effort, and time estimation approach (CORADMO-based CETA) for accurate software cost estimation. The data requirements, cost drivers, constraints, and priorities are given as input to a fuzzy inference system (FIS), which computes the effort, time, and cost for the nominal plan, the shortest-schedule plan, and the least-cost plan. To reduce effort, time, and cost, the output is optimized by dragonfly whale lion optimization (DWLO), which provides the best estimated effort, time, and cost for software development. The proposed CORADMO-based CETA model is tested on the NASA 93 dataset using MATLAB. The performance of the CORADMO-based CETA method is measured in terms of Pred(25%), magnitude of relative error, and mean magnitude of relative error, attaining values of 80.72%, 87.94%, and 98.13%, respectively. Finally, the CORADMO-based CETA model justifies the suitability of dragonfly whale lion optimization combined with the proposed fuzzy logic.
25

Yadav, Chandra Shekhar, and Raghuraj Singh. "Prediction Model for Object Oriented Software Development Effort Estimation Using One Hidden Layer Feed Forward Neural Network with Genetic Algorithm." Advances in Software Engineering 2014 (June 3, 2014): 1–6. http://dx.doi.org/10.1155/2014/284531.

Abstract:
The budget computation for software development is affected by the prediction of software development effort and schedule. Software development effort and schedule can be predicted precisely on the basis of past software project data sets. In this paper, a model for object-oriented software development effort estimation using a one-hidden-layer feed-forward neural network (OHFNN) has been developed. The model has been further optimized with the help of a genetic algorithm, taking the weight vector obtained from the OHFNN as the initial population for the genetic algorithm. Convergence has been obtained by minimizing the sum of squared errors over the input vectors, and the optimal weight vector has been determined to predict the software development effort. The model has been empirically validated on the PROMISE software engineering repository dataset. Performance of the model is more accurate than that of the well-established Constructive Cost Model (COCOMO).
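The optimization loop described above (a one-hidden-layer network whose weight vector seeds a genetic algorithm minimizing the sum of squared errors) can be sketched as follows. Network size, GA operators and settings, and the synthetic data are illustrative assumptions rather than the paper's configuration.

```python
# Sketch of the abstract's loop: an OHFNN weight vector refined by a
# simple elitist genetic algorithm minimizing the sum of squared errors.
import numpy as np

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, (50, 4))            # normalized project metrics
y = X @ np.array([0.4, 0.3, 0.2, 0.1]) + 0.05 * rng.normal(size=50)

H = 5                                     # hidden units
n_w = 4 * H + H                           # input->hidden plus hidden->output

def forward(w, X):
    W1 = w[:4 * H].reshape(4, H)
    w2 = w[4 * H:]
    return np.tanh(X @ W1) @ w2

def sse(w):
    return float(np.sum((forward(w, X) - y) ** 2))

# Seed the population around a base weight vector, standing in for the
# weights of a pre-trained OHFNN as described in the abstract.
w0 = rng.normal(0, 0.5, n_w)
pop = w0 + rng.normal(0, 0.2, (60, n_w))

for _ in range(300):                      # elitist GA with blend crossover
    fit = np.array([sse(ind) for ind in pop])
    elite = pop[np.argsort(fit)[:20]]
    children = []
    for _ in range(40):
        p1, p2 = elite[rng.integers(0, 20, 2)]
        children.append(0.5 * (p1 + p2) + rng.normal(0, 0.05, n_w))
    pop = np.vstack([elite, children])

best = min(pop, key=sse)
print("best SSE:", round(sse(best), 4))
```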
26

Najm, Assia, Abdelali Zakrani, and Abdelaziz Marzak. "Cluster-based fuzzy regression trees for software cost prediction." Indonesian Journal of Electrical Engineering and Computer Science 27, no. 2 (2022): 1138–50. https://doi.org/10.11591/ijeecs.v27.i2.pp1138-1150.

Abstract:
The current paper proposes a novel type of decision tree that has never been used for software development cost prediction (SDCP) purposes: the cluster-based fuzzy regression tree (CFRT). This model uses fuzzy k-means (FKM), which deals with data uncertainty and imprecision. Tree expansion is based on a variability measure, choosing the node with the highest value of granulation diversity. The paper outlines an experimental study comparing CFRT with four SDCP methods, notably linear regression, multi-layer perceptron, K-nearest-neighbors, and classification and regression trees (CART), employing eight datasets and leave-one-out cross-validation (LOOCV). The results show that CFRT is among the best, ranked first in three datasets according to four accuracy measures. Also, according to the Pred(25%) values, the proposed CFRT model outperformed all twelve compared techniques in four datasets, Albrecht, Constructive Cost Model (COCOMO), Desharnais, and The International Software Benchmarking Standards Group (ISBSG), using LOOCV and the 30-fold cross-validation technique.
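The evaluation protocol named here, leave-one-out cross-validation scored with Pred(25%), is easy to sketch. In the example below a plain CART regressor and linear regression on synthetic data stand in for CFRT and its datasets; scikit-learn is assumed.

```python
# Sketch of LOOCV evaluation scored with Pred(25%), comparing a CART
# regressor with linear regression on synthetic stand-in data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(5)
X = rng.uniform(1, 100, (40, 3))          # project features (synthetic)
y = 50 + 5 * X[:, 0] + 2 * X[:, 1] + rng.normal(0, 20, 40)

def pred25(model):
    """Fraction of LOOCV predictions within 25% of the actual value."""
    mres = []
    for train, test in LeaveOneOut().split(X):
        p = model.fit(X[train], y[train]).predict(X[test])[0]
        mres.append(abs(p - y[test][0]) / abs(y[test][0]))
    return float(np.mean(np.array(mres) <= 0.25))

print("CART   Pred(25%):", round(pred25(DecisionTreeRegressor(random_state=0)), 3))
print("LinReg Pred(25%):", round(pred25(LinearRegression()), 3))
```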
27

Denard, Samuel, Atila Ertas, Susan Mengel, and Stephen Ekwaro-Osire. "Development Cycle Modeling: Resource Estimation." Applied Sciences 10, no. 14 (2020): 5013. http://dx.doi.org/10.3390/app10145013.

Abstract:
This paper presents results produced by a domain-independent system development model that enables objective and quantitative calculation of certain development cycle characteristics. The presentation recounts the model’s motivation and includes an outline of the model’s structure. The outline shows that the model is constructive. As such, it provides an explanatory mechanism for the results that it produces, not just a representation of qualitative observations or measured data. The model is a Statistical Agent-based Model of Development and Evaluation (SAbMDE); and it appears to be novel with respect to previous design theory and methodology work. This paper focuses on one development cycle characteristic: resource utilization. The model’s resource estimation capability is compared to Boehm’s long-used software development estimation techniques. His Cone of Uncertainty (COU) captures project estimation accuracy empirically at project start but intuitively over a project’s duration. SAbMDE calculates estimation accuracy at start up and over project duration; and SAbMDE duplicates the COU’s empirical values. Additionally, SAbMDE produces results very similar to the Constructive Cost Model (COCOMO) effort estimation for a wide range of input values.
28

Chhabra, Sonia, and Harvir Singh. "Optimizing Design of Fuzzy Model for Software Cost Estimation Using Particle Swarm Optimization Algorithm." International Journal of Computational Intelligence and Applications 19, no. 01 (2020): 2050005. http://dx.doi.org/10.1142/s1469026820500054.

Abstract:
Estimation of software cost and effort is of prime importance in the software development process, and accurate, reliable estimation plays a vital role in the successful completion of a project. Various techniques have been used to estimate software cost; the Constructive Cost Model (COCOMO) is among the most prominent algorithmic models used for cost estimation, and its different versions consider different types of parameters affecting overall cost. The parameters involved in estimation using COCOMO possess vagueness, which introduces some degree of uncertainty into the algorithmic modelling. The concept of fuzzy logic can deal with the uncertainty involved in Intermediate COCOMO cost driver measurements via a Fuzzy Inference System (FIS). In the proposed research, an FIS is designed for each cost driver to calculate the corresponding effort multiplier. The research further applies evolutionary optimization techniques, using the Particle Swarm Optimization algorithm to optimize the fuzzy logic-based COCOMO. The magnitude of relative error and its mean, calculated using the COCOMO NASA2 and COCOMONASA datasets, are used as evaluation metrics to validate the proposed model. The model outperforms other optimization techniques such as the Genetic Algorithm.
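To illustrate what a per-cost-driver FIS does, the sketch below fuzzifies a single rating into an effort multiplier using triangular memberships and weighted-average defuzzification. The membership placement and multiplier values (patterned on Intermediate COCOMO's required-reliability column) are illustrative, and the paper's PSO-based tuning is not reproduced.

```python
# Fuzzifying one cost-driver rating into an effort multiplier with
# triangular memberships and weighted-average defuzzification.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def effort_multiplier(rating: float) -> float:
    """Map a rating on [0, 4] (very low .. very high) to a multiplier,
    e.g. for required reliability (RELY)."""
    levels = [            # (membership params, multiplier consequent)
        ((-1, 0, 1), 0.75),   # very low
        (( 0, 1, 2), 0.88),   # low
        (( 1, 2, 3), 1.00),   # nominal
        (( 2, 3, 4), 1.15),   # high
        (( 3, 4, 5), 1.40),   # very high
    ]
    w = np.array([tri(rating, *p) for p, _ in levels])
    m = np.array([mult for _, mult in levels])
    return float((w * m).sum() / w.sum())

# A rating between "nominal" (2) and "high" (3) blends the two multipliers
# instead of jumping between table entries, which is the point of the FIS.
print(effort_multiplier(2.4))   # ~1.06 rather than a hard jump to 1.15
```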
29

Zakaria, Noor Azura, Amelia Ritahani Ismail, Nadzurah Zainal Abidin, Nur Hidayah Mohd Khalid, and Afrujaan Yakath Ali. "Optimized COCOMO parameters using hybrid particle swarm optimization." International Journal of Advances in Intelligent Informatics 7, no. 2 (2021): 177. http://dx.doi.org/10.26555/ijain.v7i2.583.

Abstract:
Software effort and cost estimation are crucial parts of software project development: they determine the budget, time, and resources needed to develop a software project, and the success of software project development depends largely on their accuracy. Poor estimation affects the result and worsens project management. Various software effort estimation models have been introduced to resolve this problem. The COnstructive COst MOdel (COCOMO) is a well-established software project estimation model; however, it lacks accuracy in effort and cost estimation, especially for current projects. Inaccurate and uncertain effort estimates make it difficult to develop software efficiently and effectively, directly affecting schedule and cost. In this paper, Particle Swarm Optimization (PSO) is proposed as a metaheuristic optimization method hybridized with three traditional state-of-the-art techniques, Support Vector Machine (SVM), Linear Regression (LR), and Random Forest (RF), for optimizing the parameters of COCOMO models. The proposed approach is applied to the NASA software project dataset downloaded from the PROMISE repository. The proposed approach is compared with the three traditional algorithms, whose results confirm low accuracy before hybridization with PSO. Overall, the results show that PSO-SVM on the NASA software project dataset improves effort estimation accuracy and outperforms the other models.
30

Mishra, Gaurav, P. K. Kapur, and A. K. Shrivastava. "Multi Release Cost Model — A New Perspective." International Journal of Reliability, Quality and Safety Engineering 24, no. 06 (2017): 1740007. http://dx.doi.org/10.1142/s0218539317400071.

Abstract:
Software testing plays a crucial role in the software development process for achieving high-quality software. Increased demand for new features, market competition, and time and resource limitations have shortened the software lifecycle. To stay in the market, software companies have to release new versions of software with new functionalities. Many ideas and cost models have been formulated in the past decades to decide when to release software, for both single and multiple versions. In the existing literature, the total debugging cost for the first and subsequent versions includes the cost of debugging in the operational phase along with the testing cost; it is assumed that the software is supported until the operational phase is over. It is also considered that some of the remaining faults of the previous release are reported and removed during the testing period of the new versions, which implies a repetition of the cost of debugging the remaining faults of the previous release and results in an increased cost of the software for the next release. In the present paper, we provide a new perspective on release cost: to overcome the above problem, we describe a generalized cost model in which the remaining faults of the current version are removed during the operational phase of the current version and partly in the testing period of the next version. Application of the proposed cost model is shown on real fault-counting data for multiple versions of a software product.
APA, Harvard, Vancouver, ISO, and other styles
31

Najm, Assia, Abdelali Zakrani, and Abdelaziz Marzak. "Cluster-based fuzzy regression trees for software cost prediction." Indonesian Journal of Electrical Engineering and Computer Science 27, no. 2 (2022): 1138. http://dx.doi.org/10.11591/ijeecs.v27.i2.pp1138-1150.

Full text
Abstract:
The current paper proposes a novel type of decision tree, not previously used for software development cost prediction (SDCP): the cluster-based fuzzy regression tree (CFRT). This model uses fuzzy k-means (FKM), which deals with data uncertainty and imprecision, and expands the tree based on a variability measure by choosing the node with the highest granulation diversity. The paper outlines an experimental study comparing CFRT with four SDCP methods, notably linear regression, multi-layer perceptron, K-nearest neighbors, and classification and regression trees (CART), using eight datasets and leave-one-out cross-validation (LOOCV). The results show that CFRT is among the best, ranking first on three datasets according to four accuracy measures. Also, according to the Pred(25%) values, the proposed CFRT model outperformed all twelve compared techniques on four datasets: Albrecht, constructive cost model (COCOMO), Desharnais, and the International Software Benchmarking Standards Group (ISBSG), using LOOCV and 30-fold cross-validation.
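A much-reduced stand-in for the CFRT idea (fuzzy clustering feeding local regression models) can be sketched as follows; the fuzzy c-means loop and the membership-weighted prediction are standard, while the dataset and cluster count are invented.

```python
# Rough stand-in for the CFRT idea, not the authors' algorithm:
# fuzzy c-means in plain numpy, then one weighted linear model per
# cluster, with membership-weighted prediction at query time.
import numpy as np

rng = np.random.default_rng(1)
size = rng.uniform(5, 100, 60)                    # hypothetical project sizes
effort = 3.0 * size ** 1.05 + rng.normal(0, 20, 60)
X = size.reshape(-1, 1)

c, m, iters = 3, 2.0, 100                         # clusters, fuzzifier
U = rng.random((c, len(X)))
U /= U.sum(axis=0)                                # random initial memberships
for _ in range(iters):
    Um = U ** m
    centers = (Um @ X) / Um.sum(axis=1, keepdims=True)   # shape (c, 1)
    d = np.abs(X.T - centers) + 1e-9              # distances, shape (c, n)
    U = 1.0 / ((d[:, None, :] / d[None, :, :]) ** (2 / (m - 1))).sum(axis=1)

# one membership-weighted least-squares line per cluster
models = [np.polyfit(size, effort, 1, w=U[i]) for i in range(c)]

def predict(s):
    # blend the local models by fuzzy membership of the query point
    dist = np.abs(s - centers.ravel()) + 1e-9
    u = 1.0 / ((dist[:, None] / dist[None, :]) ** (2 / (m - 1))).sum(axis=1)
    return sum(u[i] * np.polyval(models[i], s) for i in range(c))

print(f"predicted effort for size 40: {predict(40.0):.1f}")
```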
APA, Harvard, Vancouver, ISO, and other styles
32

Mishra, Debasisha, and Biswajit Mahanty. "The effect of onsite-offshore work division on project cost, schedule, and quality for re-engineering projects in Indian outsourcing software industry." Strategic Outsourcing: An International Journal 7, no. 3 (2014): 198–225. http://dx.doi.org/10.1108/so-06-2014-0010.

Full text
Abstract:
Purpose – The aim of this paper is to find good values for onsite-offshore team strength and for the number of hours of communication between business users and the onsite team and between the onsite and offshore teams, in order to reduce cost and improve schedule for re-engineering projects in a global software development environment. Design/methodology/approach – The system dynamics technique is used for simulation model construction and policy run experimentation. Experts from the Indian software outsourcing industry were consulted for model construction, validation and analysis of policy run results in both co-located and distributed software development environments. Findings – The study results show a drop in overall team productivity in the outsourcing environment when considering the offshore options, but the project cost can be reduced by employing the offshore team for coding and testing work only, with minimal training for imparting business knowledge. The results also show a potential to save project cost by being flexible in the project schedule. Research limitations/implications – The study found that there could be substantial cost savings for re-engineering projects, with a loss of project schedule, when an appropriate onsite-offshore combination is used; the quality and productivity drops, however, were rather small for such combinations. The cost savings are high when re-engineering work is sent entirely to the offshore location after completion of requirement analysis work at the onsite location and after training the offshore team in business knowledge. The findings show a potential for large cost savings by being flexible in the project schedule for re-engineering projects. Practical implications – The software project manager can use the model results to divide the software team between onsite and offshore locations during the various phases of software development in a distributed environment. Originality/value – The study is novel, as there has been little attempt at finding the team distribution between onsite and offshore locations in a global software development environment.
APA, Harvard, Vancouver, ISO, and other styles
33

Derle, Sonal M., and Ranjit M. Gawande. "SOFTWARE EFFORT ESTIMATION USING MACHINE LEARNING." Journal of the Maharaja Sayajirao University of Baroda 59, no. 1 (I) (2025): 415–27. https://doi.org/10.5281/zenodo.15250495.

Full text
Abstract:
Software cost and effort estimation is a critical task in software engineering. Research in this field continually evolves with new techniques, requiring periodic comparative analyses. Accurate software cost estimation is vital for project success, as it helps identify the challenges and risks involved in development. The wide variety of machine learning (ML) and non-ML techniques has led to comparisons and the integration of these methods, making it essential to determine the most effective estimation techniques to enhance the project development process. The review follows a three-stage approach: planning (tollgate approach), conducting (Likert-type scale), and reporting results from five well-known digital libraries. Among the 52 selected articles, the artificial neural network (ANN) model and the Constructive Cost Model (COCOMO) have been identified as the most favored techniques. The mean magnitude of relative error (MMRE) is the preferred accuracy metric, with software engineering and project management being the most relevant fields. The PROMISE repository has emerged as the most widely accessed database. Keywords: software cost estimation, systematic literature review, tollgate approach, Likert scale, quality assessment, software dependability.
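The two accuracy notions singled out by this review are easy to state concretely. The snippet below computes MMRE and the related Pred(25%) criterion on made-up effort figures.

```python
# MMRE (mean magnitude of relative error) and Pred(25), the fraction of
# estimates within 25% of the actual effort, on hypothetical numbers.
import numpy as np

actual = np.array([120.0, 300.0, 85.0, 410.0])   # made-up person-months
pred   = np.array([100.0, 330.0, 95.0, 520.0])

mre = np.abs(actual - pred) / actual
print("MMRE     :", mre.mean())
print("Pred(25%):", (mre <= 0.25).mean())
```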
APA, Harvard, Vancouver, ISO, and other styles
34

Chatzipetrou, Panagiota. "Software Cost Estimation." International Journal of Service Science, Management, Engineering, and Technology 10, no. 3 (2019): 14–31. http://dx.doi.org/10.4018/ijssmet.2019070102.

Full text
Abstract:
Software cost estimation (SCE) is a critical phase in software development projects. A common problem in building software cost models is that the available datasets contain projects with a large amount of missing categorical data. There are several techniques for handling missing data in the context of SCE. The purpose of this article is to show a state-of-the-art statistical and visualization approach to evaluating and comparing the effect of missing data on the accuracy of cost estimation models. Five missing data techniques were used: multinomial logistic regression, listwise deletion, mean imputation, expectation maximization and regression imputation; they were compared with respect to their effect on the prediction accuracy of a least squares regression cost model. The evaluation is based on various expressions of the prediction error, and the comparisons are conducted using statistical tests, resampling techniques and visualization tools such as regression error characteristic curves.
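A minimal version of the kind of comparison described, under assumed data rather than the article's datasets, might look like this: inject missing values at random, then contrast listwise deletion with mean imputation through the residual error of a least squares model.

```python
# Hedged sketch of comparing two missing-data strategies; the data
# generating process and missingness rate are invented.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.impute import SimpleImputer

rng = np.random.default_rng(2)
X = rng.uniform(1, 100, (200, 2))
y = 4.0 * X[:, 0] + 2.0 * X[:, 1] + rng.normal(0, 10, 200)
X_miss = X.copy()
X_miss[rng.random(X.shape) < 0.2] = np.nan       # ~20% missing at random

def mean_abs_residual(Xtr, ytr):
    model = LinearRegression().fit(Xtr, ytr)
    return np.abs(model.predict(Xtr) - ytr).mean()

keep = ~np.isnan(X_miss).any(axis=1)             # listwise deletion
print("listwise deletion:", mean_abs_residual(X_miss[keep], y[keep]))

X_mean = SimpleImputer(strategy="mean").fit_transform(X_miss)
print("mean imputation  :", mean_abs_residual(X_mean, y))
```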
APA, Harvard, Vancouver, ISO, and other styles
35

Siau, Keng, and Yuhong Tian. "Open Source Software Development Process Model." Journal of Global Information Management 21, no. 4 (2013): 103–20. http://dx.doi.org/10.4018/jgim.2013100106.

Full text
Abstract:
The global open source movement has provided software users with more choices, lower software acquisition cost, more flexible software customization, and possibly higher quality software product. Although the development of open source software is dynamic and it encourages innovations, the process can be chaotic and involve members around the globe. An Open Source Software Development (OSSD) process model to enhance the survivability of OSSD projects is needed. This research uses the grounded theory approach to derive a Phase-Role-Skill-Responsibility (PRSR) OSSD process model. The three OSSD process phases -- Launch Stage, Before the First Release, and Between Releases -- address the characteristics of the OSSD process as well as factors that influence the OSSD process. In the PRSR model, different roles/actors are required to have different skills and responsibilities corresponding to each of the three OSSD process phases. This qualitative research contributes to the software development literature as well as open source practice.
APA, Harvard, Vancouver, ISO, and other styles
36

Kumar, Prakash. "Comparative Study of Different Project Size Estimation Technique for the Development of Software." International Journal of Computer Science and Mobile Computing 6, no. 12 (2017): 157–63. http://dx.doi.org/10.47760/ijcsmc.2021.v06i12.001.

Full text
Abstract:
The software development life cycle (SDLC) comprises several phases, one of which is planning; in this phase, estimation techniques are used to estimate the size, cost, and effort of the software. The main objective of the software engineering discipline is to develop software in a systematic and disciplined manner according to user requirements, delivered on time and within budget. Planning determines how much time, cost, and effort the development will require, which depends on the nature of the software. The objective of this paper is to identify the advantages and shortcomings of different size estimation techniques by comparing the traditional approaches to size estimation.
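For concreteness, one of the classical size measures alluded to here, function points, reduces to simple weighted-count arithmetic. The component counts below are hypothetical; the weights are the standard IFPUG average weights.

```python
# Unadjusted function points (UFP) from component counts, scaled by the
# value adjustment factor (VAF). Counts are illustrative placeholders.
weights = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}  # IFPUG averages
counts  = {"EI": 6, "EO": 4, "EQ": 3, "ILF": 5, "EIF": 2}   # hypothetical

ufp = sum(counts[k] * weights[k] for k in weights)
tdi = 35                                  # sum of the 14 GSC ratings, 0..70
vaf = 0.65 + 0.01 * tdi                   # value adjustment factor
print("UFP:", ufp, " adjusted FP:", ufp * vaf)
```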
APA, Harvard, Vancouver, ISO, and other styles
37

Strelko, Oleksandr, Manish Raj Aryal, Abigail Zack, et al. "Early Challenges in the Implementation of Automated CranialRebuild Freeware for Generation of Patient-Specific Cranial Implant Using Additive Manufacturing: A Pilot Project in Review." Biomimetics 9, no. 7 (2024): 430. http://dx.doi.org/10.3390/biomimetics9070430.

Full text
Abstract:
Traumatic Brain Injury (TBI) is a significant global health concern, particularly in low- and middle-income countries (LMICs) where access to medical resources is limited. Decompressive craniectomy (DHC) is a common procedure to alleviate elevated intracranial pressure (ICP) following TBI, but the cost of subsequent cranioplasty can be prohibitive, especially in resource-constrained settings. We describe challenges encountered during the beta-testing phase of CranialRebuild 1.0, an automated software program tasked with creating patient-specific cranial implants (PSCIs) from CT images. Two pilot clinical teams in the Philippines and Ukraine tested the software, providing feedback on its functionality and challenges encountered. The constructive feedback from the Philippine and Ukrainian teams highlighted challenges related to CT scan parameters, DICOM file arrays, software limitations, and the need for further software improvements. CranialRebuild 1.0 shows promise in addressing the need for affordable PSCIs in LMICs. Challenges and improvement suggestions identified throughout the beta-testing phase will shape the development of CranialRebuild 2.0, with the aim of enhancing its functionality and usability. Further research is needed to validate the software’s efficacy in a clinical setting and assess its cost-effectiveness.
APA, Harvard, Vancouver, ISO, and other styles
38

Yang, Tae-Jin. "Comparative Study on the Attributes Analysis of Software Development Cost Model Based on Exponential-type Lifetime Distribution." International Journal of Emerging Technology and Advanced Engineering 11, no. 10 (2021): 166–76. http://dx.doi.org/10.46338/ijetae1021_20.

Full text
Abstract:
In this paper, development cost attributes were newly analyzed by applying exponential-type lifetime distributions (Burr-Hatke exponential, basic exponential, exponential-exponential, and inverse exponential), widely used in software lifetime testing and quality evaluation, to the software development cost model. To verify the analyzed cost attributes, the future reliability was also analyzed and an optimal cost model was presented. For this study, an analysis algorithm using software failure time data was proposed: maximum likelihood estimation (MLE) was applied to obtain the parameter values, and the resulting nonlinear equation was solved using the bisection method. Simulations show that if the cost of removing a fault found during the test phase increases, the development cost increases but the release time does not change; if the cost of correcting faults discovered by operators increases, both the development cost and the release time increase. Therefore, as many faults as possible should be removed at the testing stage. In conclusion: first, the exponential-exponential distribution model showed the best performance among the proposed models because it had the lowest software development cost and the highest future reliability; second, the software development cost attributes of the exponential-type lifetime distributions were newly analyzed; third, these results can help software developers identify the most economical development cost.
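The estimation machinery described (MLE with a bisection solve of the nonlinear likelihood equation) can be sketched for the simplest exponential case, the Goel-Okumoto model with mean value function m(t) = a(1 - e^(-bt)); the failure times below are invented, not the paper's data.

```python
# Hedged sketch: MLE for the Goel-Okumoto SRGM via bisection.
# b solves n/b - n*T*exp(-bT)/(1-exp(-bT)) - sum(t_i) = 0,
# then a = n / (1 - exp(-bT)).  Failure times are made up.
import math

times = [5.0, 9.0, 16.0, 26.0, 40.0, 58.0, 81.0, 111.0]  # cumulative hours
n, T = len(times), times[-1]

def g(b):  # likelihood equation in b; its root is the MLE
    e = math.exp(-b * T)
    return n / b - n * T * e / (1 - e) - sum(times)

lo, hi = 1e-6, 1.0            # bracket chosen so g(lo) > 0 > g(hi)
for _ in range(100):          # bisection
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if g(mid) > 0 else (lo, mid)

b = (lo + hi) / 2
a = n / (1 - math.exp(-b * T))
print(f"a = {a:.2f} expected faults, b = {b:.5f} per hour")
```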
APA, Harvard, Vancouver, ISO, and other styles
39

Singh, Ompal, P. K. Kapur, A. K. Shrivastava, and Gaurav Mishra. "A Multi Release Cost Model in Distributed Environment." International Journal of Reliability, Quality and Safety Engineering 24, no. 01 (2017): 1750001. http://dx.doi.org/10.1142/s0218539317500012.

Full text
Abstract:
The software industry has come a long way since its origin, with technical advancements taking place faster than ever. This has further increased the pressure on software developers, who try to keep pace with these rapid developments through strategies that increase the pace of their work without significantly affecting software quality and reliability. One such strategy is distributed development, in which the software is composed of two different types of components, namely newly developed and re-used components. Software developed in a Distributed Development Environment (DDE) is characterized by enhanced availability and reliability. At the same time, ever-growing market competition and increasing customer needs oblige developers to add enhanced functionality to the software from time to time, leading to multiple releases. The added functionality further increases the complexity of the software, and the testing team may not be able to remove faults perfectly (imperfect debugging) or may add more bugs due to a lack of knowledge about the software in the initial phase (error generation). In this paper, we incorporate this real-life phenomenon into a multi-upgradation model for software fault removal in a distributed environment, and we describe a cost model to obtain the optimal release time for multiple upgrades of software in such an environment. To validate the analytical results of the proposed framework, a numerical illustration is provided.
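The error-generation mechanism mentioned above has a common textbook form, reproduced here as a hedged illustration rather than the authors' exact equations: the fault content grows in proportion to the removals, a(t) = a + alpha m(t), which for detection rate b gives

```latex
\frac{dm}{dt} = b\left[a + \alpha\, m(t) - m(t)\right]
\quad\Longrightarrow\quad
m(t) = \frac{a}{1-\alpha}\left(1 - e^{-b(1-\alpha)t}\right)
```

so that imperfect debugging inflates the eventual fault exposure from a to a/(1 - alpha).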
APA, Harvard, Vancouver, ISO, and other styles
40

Okamura, Hiroyuki, and Tadashi Dohi. "Optimizing Testing-Resource Allocation Using Architecture-Based Software Reliability Model." Journal of Optimization 2018 (September 27, 2018): 1–7. http://dx.doi.org/10.1155/2018/6948656.

Full text
Abstract:
In the management of software testing, testing-resource allocation is one of the most important problems due to the tradeoff between development cost and the reliability of the released software. This paper presents a model-based approach to designing the testing-resource allocation. In particular, we employ an architecture-based software reliability model with an operational profile to estimate the quantitative software reliability in the operational phase, and we formulate multiobjective optimization problems with respect to cost, testing effort, and software reliability. In a numerical experiment, we investigate how the presented optimization problem differs from the existing testing-resource allocation model.
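A single-objective slice of this allocation problem is easy to sketch: fix the effort budget and minimize the expected residual faults of modules that follow exponential SRGMs. The module parameters here are hypothetical, and the paper's full formulation is multiobjective.

```python
# Hedged sketch: allocate a fixed testing-effort budget W across modules
# to minimize expected residual faults sum_i a_i * exp(-b_i * x_i).
import numpy as np
from scipy.optimize import minimize

a = np.array([60.0, 40.0, 25.0])     # hypothetical initial faults per module
b = np.array([0.05, 0.08, 0.02])     # hypothetical detection rates
W = 100.0                            # total effort budget

residual = lambda x: np.sum(a * np.exp(-b * x))
res = minimize(residual, x0=np.full(3, W / 3), method="SLSQP",
               bounds=[(0, W)] * 3,
               constraints={"type": "eq", "fun": lambda x: x.sum() - W})
print("allocation:", res.x.round(1), " residual faults:", round(res.fun, 1))
```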
APA, Harvard, Vancouver, ISO, and other styles
41

Langsari, Kholed, Riyanarto Sarno, and Sholiq. "Optimizing Time and Effort Parameters of COCOMO II Using Fuzzy Multi-objective Particle Swarm Optimization." TELKOMNIKA Telecommunication, Computing, Electronics and Control 16, no. 5 (2018): 2199–207. https://doi.org/10.12928/TELKOMNIKA.v16i5.9698.

Full text
Abstract:
Estimating the effort, cost, and schedule of software projects is a frequent challenge in software development, and a bad estimate results in bad project management. Various estimation models have been defined for this purpose; the Constructive Cost Model II (COCOMO II) is one of the most famous models for estimating effort, cost, and schedule. To produce its estimates, COCOMO II takes as inputs the Effort Multipliers (EM), Scale Factors (SF), and Source Lines of Code (SLOC). Evidently, this model still lacks accuracy in both the estimated effort and the development time. In this paper, we introduce the Gaussian Membership Function (GMF) of fuzzy logic together with Multi-Objective Particle Swarm Optimization (MOPSO) to calibrate and optimize the parameters of COCOMO II and thereby achieve a better level of accuracy. The NASA93 dataset is used to evaluate the proposed method. The experimental results show error reductions of 11.89% and 8.08% compared with the original COCOMO II, better than previous studies.
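The underlying COCOMO II arithmetic being calibrated, with one cost driver passed through a Gaussian membership function instead of a crisp rating table, can be sketched as follows; the scale-factor ratings and the in-between driver rating are assumed values.

```python
# Hedged, much-reduced sketch of the fuzzification idea: blend two
# neighbouring effort-multiplier ratings by Gaussian membership before
# plugging into the COCOMO II effort equation PM = A * Size^E * prod(EM).
import math

A, B = 2.94, 0.91                     # standard COCOMO II.2000 constants
size_ksloc = 40.0
scale_factors = [3.72, 3.04, 4.24, 3.29, 4.68]   # hypothetical SF ratings

def gaussian_mf(x, c, sigma):
    return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))

# crisp table values for one driver (e.g. RELY: nominal 1.00, high 1.10),
# fuzzily blended for a project rated between the two levels
em_nominal, em_high = 1.00, 1.10
w_nom  = gaussian_mf(0.6, 0.0, 0.5)   # membership in "nominal"
w_high = gaussian_mf(0.6, 1.0, 0.5)   # membership in "high"
em_fuzzy = (w_nom * em_nominal + w_high * em_high) / (w_nom + w_high)

E = B + 0.01 * sum(scale_factors)
effort_pm = A * size_ksloc ** E * em_fuzzy
print(f"E = {E:.3f}, fuzzified EM = {em_fuzzy:.3f}, effort = {effort_pm:.1f} PM")
```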
APA, Harvard, Vancouver, ISO, and other styles
42

Saini, Jatinderkumar R., and Vikas S. Chomal. "On Effort Distribution in Software Project Development for Academic Domain." International Journal of Engineering and Advanced Technology (IJEAT) 9, no. 3 (2020): 1755–61. https://doi.org/10.35940/ijeat.C4706.029320.

Full text
Abstract:
Effort distribution in software engineering is a well-known term for apportioning cost and effort estimates to each phase or activity in software development. Effort distribution is taken into consideration in almost all IT companies while developing software, but it is mostly overlooked in academic software projects developed by students of computer science courses. This paper presents the results of an experiment on the phase effort distribution data of 84 academic software projects of postgraduate final-year computer science students. The phase effort distributions provided by the students were collected, analyzed, and compared with the COCOMO II model, which prescribes the effort distribution for software development. Finally, the paper discusses and provides recommendations on the use and importance of effort distribution in the development of academic software projects.
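The core bookkeeping here is just splitting a total effort estimate across lifecycle phases. A back-of-envelope sketch with illustrative percentages rather than COCOMO II's published phase tables:

```python
# Splitting a hypothetical total effort across phases; the shares below
# are illustrative placeholders, not COCOMO II's phase distribution.
total_pm = 24.0                       # assumed total effort, person-months
phase_share = {"requirements": 0.07, "design": 0.17,
               "coding": 0.50, "testing": 0.26}

for phase, share in phase_share.items():
    print(f"{phase:<12} {share * total_pm:5.1f} person-months")
```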
APA, Harvard, Vancouver, ISO, and other styles
43

Malik, Naveen, Sandip Kumar Goyal, and Vinisha Malik. "Software Quality Assesment using COCOMO-II Metrics with ABC and NN." International Journal of Engineering and Advanced Technology (IJEAT) 9, no. 2 (2020): 2982–88. https://doi.org/10.35940/ijeat.C5633.029320.

Full text
Abstract:
Time, cost, and quality predictions are key aspects of any software development system. Losses that result from wrong estimates can cause irreparable damage: a badly estimated project usually yields a poor-quality output because the effort is put in the wrong direction. In the present study, the authors propose ABC-COCOMO-II as a new model and attempt to improve the accuracy of quality assessment through effort estimation. The proposed model combines the strengths of COCOMO-II (Constructive Cost Model) with the Artificial Bee Colony (ABC) algorithm and a Neural Network (NN): the ABC algorithm is used to select the best solution, while the NN is used for classification to improve the quality estimation using COCOMO-II. The results are compared and evaluated against pre-existing effort estimation models. The simulation results show that the proposed combination outperforms them in quality estimation, with only a 5-10% variation from the actual effort, which further improves quality: more than 90% of projects produced high-quality output under the proposed algorithmic architecture.
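The ABC half of the proposed hybrid (the NN classification stage is omitted here) can be sketched as a search over the basic-COCOMO parameter space for the lowest MMRE; all project data below are invented.

```python
# Hedged sketch of Artificial Bee Colony search for COCOMO parameters
# (a, b) minimizing MMRE on made-up data; not the authors' full model.
import numpy as np

rng = np.random.default_rng(3)
kloc   = np.array([12.0, 30.0, 55.0, 90.0])
actual = np.array([45.0, 135.0, 280.0, 500.0])      # hypothetical efforts

def mmre(s):
    pred = s[0] * kloc ** s[1]
    return np.mean(np.abs(actual - pred) / actual)

lo, hi = np.array([0.5, 0.5]), np.array([10.0, 1.5])
n_src, limit, iters = 10, 20, 300
src = rng.uniform(lo, hi, (n_src, 2))                # food sources
fit = np.array([mmre(s) for s in src])
trials = np.zeros(n_src)

def try_neighbour(i):
    # perturb one dimension toward/away from a random other source
    k, d = rng.integers(n_src), rng.integers(2)
    cand = src[i].copy()
    cand[d] += rng.uniform(-1, 1) * (src[i, d] - src[k, d])
    cand = np.clip(cand, lo, hi)
    f = mmre(cand)
    if f < fit[i]:
        src[i], fit[i], trials[i] = cand, f, 0       # greedy selection
    else:
        trials[i] += 1

for _ in range(iters):
    for i in range(n_src):                           # employed bee phase
        try_neighbour(i)
    p = 1.0 / (1.0 + fit)
    p /= p.sum()
    for i in rng.choice(n_src, n_src, p=p):          # onlooker bee phase
        try_neighbour(i)
    for i in np.where(trials > limit)[0]:            # scout bee phase
        src[i] = rng.uniform(lo, hi)
        fit[i], trials[i] = mmre(src[i]), 0

best = src[fit.argmin()]
print(f"a={best[0]:.3f}, b={best[1]:.3f}, MMRE={fit.min():.3f}")
```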
APA, Harvard, Vancouver, ISO, and other styles
44

Arumugam, Chamundeswari, and Chitra Babu. "DEVELOPMENTAL SIZE ESTIMATION FOR OBJECT-ORIENTED SOFTWARE BASED ON ANALYSIS MODEL." International Journal of Software Engineering and Knowledge Engineering 23, no. 03 (2013): 289–308. http://dx.doi.org/10.1142/s0218194013500083.

Full text
Abstract:
Software size estimation at the early analysis phase of the software development lifecycle is crucial for predicting the associated effort and cost. In the object-oriented lifecycle, the analysis phase captures the functionality of the software to be developed, and the Unified Modeling Language expresses this functionality through the use case model. This paper proposes a new method, named use case model function point, to estimate the size of object-oriented software at the analysis phase itself. While this approach is based on the use case model, it also adapts the function point analysis technique to it. Features such as actors, use cases, relationships, external references, flows, and messages are extracted from the use case model. Eleven rules have been derived as guidelines to identify the use case model components. The function point analysis components are mapped appropriately to use case model components, and complexity weights are assigned to calculate the use case model function point. The proposed size estimation approach has been evaluated on object-oriented software developed in our software engineering laboratory to assess its ability to predict developmental size, and the results are empirically analysed using statistical correlation to substantiate the proposed method.
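The flavour of counting the paper describes, mapping use case model elements to weighted components, resembles the classic Use Case Points scheme, used below only as a stand-in for the authors' eleven mapping rules; all counts and adjustment factors are assumed.

```python
# Karner-style Use Case Points arithmetic on invented counts; shown as
# an analogue of the paper's mapping, not its exact method.
actor_counts   = {"simple": 2, "average": 1, "complex": 3}   # hypothetical
usecase_counts = {"simple": 4, "average": 5, "complex": 2}   # hypothetical
actor_w   = {"simple": 1, "average": 2, "complex": 3}        # Karner weights
usecase_w = {"simple": 5, "average": 10, "complex": 15}      # Karner weights

uaw  = sum(actor_counts[k] * actor_w[k] for k in actor_w)
uucw = sum(usecase_counts[k] * usecase_w[k] for k in usecase_w)
tcf, ecf = 0.95, 1.02            # technical/environmental factors, assumed
ucp = (uaw + uucw) * tcf * ecf
print("UAW:", uaw, " UUCW:", uucw, " UCP:", round(ucp, 1))
```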
APA, Harvard, Vancouver, ISO, and other styles
45

Verma, Vibha, Sameer Anand, and Anu Gupta Aggarwal. "Software warranty cost optimization under imperfect debugging." International Journal of Quality & Reliability Management 37, no. 9/10 (2019): 1233–57. http://dx.doi.org/10.1108/ijqrm-03-2019-0088.

Full text
Abstract:
Purpose – The purpose of this paper is to identify and quantify the key components of the overall cost of software development when warranty coverage is given by a developer. The authors also study the impact of imperfect debugging on the optimal release time, warranty policy and development cost, which signifies that it is important for developers to control the parameters that cause a sharp increase in cost. Design/methodology/approach – An optimization problem is formulated to minimize the software development cost by considering an imperfect fault removal process, fault generation at a constant rate and an environmental factor to differentiate the operational phase from the testing phase. Another optimization problem under perfect debugging conditions, i.e. without error generation, is constructed for comparison. These optimization models are solved in MATLAB, and their solutions provide insight into the degree of impact of imperfect debugging on the optimal policies with respect to software release time and warranty time. Findings – A real-life fault data set of a radar system is used to study the impact of various cost factors on release and warranty policy via sensitivity analysis. If firms provide a warranty for a longer period of time, they may bear losses due to increased debugging cost from the larger number of failures occurring during the warranted time; but if the warranty is not provided for sufficient time, it may not act as a sufficient hedge against field failures. Originality/value – Every firm is fighting to remain competitive and expand market share by offering the latest technology-based products and using innovative marketing strategies. Warranty is one such strategic tool to promote the product among the masses and develop a sense of quality in the user's mind. In this paper, the failures encountered during development and after software release are considered to model the failure process.
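A stripped-down version of the release/warranty trade-off framed above, assuming perfect debugging and an exponential mean value function (the paper's model additionally includes error generation and an environmental factor), can be posed as a two-variable minimization:

```python
# Hedged sketch: choose release time T and warranty length W to minimize
# a simple cost functional. All parameters and cost weights are assumed.
import numpy as np
from scipy.optimize import minimize

a, b = 150.0, 0.03                       # assumed fault content / detection rate
c_test, c_warr, c_time = 1.0, 5.0, 0.4   # per-fault and per-day cost weights

m = lambda t: a * (1 - np.exp(-b * t))   # expected faults removed by time t

def total_cost(x):
    T, W = x
    return (c_test * m(T)                # faults fixed while testing
            + c_warr * (m(T + W) - m(T)) # dearer fixes under warranty
            + c_time * T                 # cost of continued testing
            - 0.8 * W)                   # assumed market value of warranty

res = minimize(total_cost, x0=[60.0, 30.0], bounds=[(1, 400), (1, 200)])
print("release day %.1f, warranty %.1f days, cost %.1f" % (*res.x, res.fun))
```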
APA, Harvard, Vancouver, ISO, and other styles
46

Akumba, Beatrice O., Samera U. Otor, Iorshase Agaji, and Barnabas T. Akumba. "A Predictive Risk Model for Software Projects’ Requirement Gathering Phase." International Journal of Innovative Science and Research Technology 5, no. 6 (2020): 231–36. http://dx.doi.org/10.38124/ijisrt20jun066.

Full text
Abstract:
The initial stage of the software development lifecycle is the requirement gathering and analysis phase. Predicting risk at this phase is crucial because cost and effort can be saved while improving the quality and efficiency of the software to be developed. Datasets for software requirements risk prediction are adopted in this paper to predict the risk levels across software projects and to ascertain the attributes that contribute to the recognized risk. A supervised machine learning technique, the Naïve Bayes classifier, was used to predict the risk across the projects, and the performance metrics of the risk attributes were evaluated. The model predicted four (4) projects as Catastrophic, eleven (11) as High, eighteen (18) as Moderate, thirty-three (33) as Low, and seven (7) as Insignificant. The overall confusion matrix statistics showed an accuracy of 98% with a 95% confidence interval (CI) and a Kappa of 97%.
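A hedged miniature of the classifier setup described, with invented requirement-phase features in place of the adopted risk dataset:

```python
# Gaussian Naive Bayes over synthetic requirement-phase risk features,
# evaluated with accuracy and a confusion matrix. Feature names, binning
# thresholds, and data are all invented for illustration.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, confusion_matrix

rng = np.random.default_rng(4)
n = 300
X = np.column_stack([rng.integers(1, 6, n),   # requirement volatility (1-5)
                     rng.integers(1, 6, n),   # ambiguity rating (1-5)
                     rng.integers(1, 6, n)])  # stakeholder conflict (1-5)
# bin a noisy score into four ordinal risk levels (illustrative labels)
y = np.digitize(X.sum(axis=1) + rng.normal(0, 1, n), [6, 9, 12])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GaussianNB().fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("accuracy:", accuracy_score(y_te, pred))
print(confusion_matrix(y_te, pred))
```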
APA, Harvard, Vancouver, ISO, and other styles
47

Ivanov, Ivan, and Mario Angelov. "DESIGN AND IMPLEMENTATION OF SOFTWARE-DEFINED PI/4-DQPSK MODEM WITH RECEIVE ANTENNA DIVERSITY." ENVIRONMENT. TECHNOLOGIES. RESOURCES. Proceedings of the International Scientific and Practical Conference 3 (June 22, 2024): 101–4. http://dx.doi.org/10.17770/etr2024vol3.8108.

Full text
Abstract:
Software-defined radio (SDR) is a leading concept nowadays for the development of multifunctional radio systems. This article addresses the design and implementation on an SDR platform of a digital modulator/demodulator (modem) with pi/4 differential quadrature phase shift keying modulation (pi/4-DQPSK) and antenna diversity on the receiver side. A model of the system was created in the GNU Radio framework. Experimental bit error rate (BER) results in the presence of additive white Gaussian noise (AWGN) were obtained through simulation and compared with a no-diversity system; the superiority of the diversity scheme, based on the BER criterion, is confirmed by the numerical results. An experimental, model-based RF DQPSK modem was implemented with a Universal Software Radio Peripheral (USRP) front end. This step of the development process confirms the advantages of the SDR concept and verifies the model implementation through the ability to exchange digital information from transmitter to receiver in an indoor environment and the capability of the constructive elements to maintain coherence. For a future implementation of a fully functional radio communication system, the need for additional synchronization blocks is identified.
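The modulation itself is compact enough to sketch at baseband: pi/4-DQPSK maps each dibit to a differential phase step of ±π/4 or ±3π/4, so the information rides on phase transitions rather than absolute phase. The Gray mapping below is one common convention; the GNU Radio flowgraph is not reproduced.

```python
# Baseband pi/4-DQPSK modulator/demodulator sketch (one common dibit
# mapping; noiseless channel). Not the article's GNU Radio blocks.
import numpy as np

DPHI = {(0, 0): np.pi / 4, (0, 1): 3 * np.pi / 4,
        (1, 1): -3 * np.pi / 4, (1, 0): -np.pi / 4}

def pi4dqpsk_mod(bits):
    phase, out = 0.0, []
    for b0, b1 in zip(bits[0::2], bits[1::2]):
        phase += DPHI[(b0, b1)]          # information in the phase step
        out.append(np.exp(1j * phase))
    return np.array(out)

def pi4dqpsk_demod(symbols):
    phase = np.angle(np.concatenate(([1.0], symbols)))  # reference phase 0
    dphi = np.diff(np.unwrap(phase))
    bits = []
    for d in dphi:                       # nearest differential phase wins
        key = min(DPHI, key=lambda k: abs(np.angle(np.exp(1j * (d - DPHI[k])))))
        bits.extend(key)
    return bits

tx = [0, 0, 1, 0, 1, 1, 0, 1]
assert pi4dqpsk_demod(pi4dqpsk_mod(tx)) == tx
```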
APA, Harvard, Vancouver, ISO, and other styles
48

Wu, Zhiqiao, Jiafu Tang, C. K. Kwong, and C. Y. Chan. "AN OPTIMIZATION MODEL FOR REUSE SCENARIO SELECTION CONSIDERING RELIABILITY AND COST IN SOFTWARE PRODUCT LINE DEVELOPMENT." International Journal of Information Technology & Decision Making 10, no. 05 (2011): 811–41. http://dx.doi.org/10.1142/s0219622011004580.

Full text
Abstract:
In this paper, a model that assists developers in systematically evaluating and comparing alternative reuse scenarios in software product line (SPL) development is proposed. The model identifies basic activities (abstracted as operations) and precisely relates cost and reliability to each basic operation. A typical reuse mode is described from the perspectives of application and domain engineering; according to this scheme, six reuse modes are identified, and alternative industry reuse scenarios can be derived from them. A bi-objective 0-1 integer programming model is developed to help decision makers select reuse scenarios when developing an SPL, so as to minimize cost and maximize reliability while satisfying system requirements to a certain degree. This model is called cost and reliability optimization under constraint satisfaction (CROS). To solve the model efficiently, a three-phase algorithm for finding all efficient solutions is developed, where the first two phases obtain an efficient solution and the last phase generates a nonsupported efficient solution. Two practical methods are presented to facilitate decision making when selecting from the entire range of efficient solutions in light of the decision-maker's preference through man-computer interaction. An application of the CROS model to a mail server system development is presented as a case study.
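A toy version of the scenario-selection problem helps fix ideas: give each component a set of reuse options with (cost, reliability) pairs, enumerate the 0-1 choices, and keep the Pareto-efficient scenarios. All numbers are invented, and exhaustive enumeration stands in for the paper's three-phase algorithm.

```python
# Brute-force Pareto enumeration for a tiny reuse-selection instance;
# the paper's CROS model and algorithm are far more general.
from itertools import product

# per component: candidate reuse options as (cost, reliability), invented
components = [
    [(10, 0.95), (25, 0.99)],                 # reuse as-is vs. adapt
    [(8, 0.90), (20, 0.97), (35, 0.995)],
    [(15, 0.93), (30, 0.98)],
]

scenarios = []
for choice in product(*(range(len(c)) for c in components)):
    cost = sum(components[i][j][0] for i, j in enumerate(choice))
    rel = 1.0
    for i, j in enumerate(choice):
        rel *= components[i][j][1]            # series-system reliability
    scenarios.append((choice, cost, rel))

# keep scenarios not weakly dominated by any other (cheaper and more reliable)
pareto = [s for s in scenarios
          if not any(o[1] <= s[1] and o[2] >= s[2] and o != s
                     for o in scenarios)]
for choice, cost, rel in sorted(pareto, key=lambda s: s[1]):
    print(choice, f"cost={cost}", f"reliability={rel:.4f}")
```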
APA, Harvard, Vancouver, ISO, and other styles
49

Khan, Mohammad, M. A. Khanam, and M. H. Khan. "Requirement Based Testability Estimation Model of Object Oriented Software." Oriental journal of computer science and technology 10, no. 04 (2017): 793–801. http://dx.doi.org/10.13005/ojcst/10.04.14.

Full text
Abstract:
Measuring testability before actual development starts plays a crucial role for developers, designers, and end users alike. Early measurement of testability, especially at the requirement stage, assists the developer in the subsequent development process and helps ensure the delivery of high-quality requirements, which can reduce the overall cost and improve the quality of the development process. In view of this, the paper identifies testability estimation factors, namely understandability and modifiability, and establishes the correlation among testability, understandability, and modifiability. Further, a model is developed to quantify software testability in the requirement phase, named the Requirement Testability Model of Object Oriented Software (RTMOOS). Furthermore, the correlation of testability with these factors has been tested and justified with the help of statistical measures.
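The asserted correlation structure suggests the obvious two-factor linear model, sketched below with fabricated coefficients rather than the paper's fitted values:

```python
# Least-squares fit of testability on two requirement-phase factors;
# all data and the generating coefficients are fabricated placeholders.
import numpy as np

rng = np.random.default_rng(5)
understandability = rng.uniform(0, 1, 40)
modifiability = rng.uniform(0, 1, 40)
testability = (0.5 * understandability + 0.4 * modifiability
               + rng.normal(0, 0.05, 40))

A = np.column_stack([np.ones(40), understandability, modifiability])
coef, *_ = np.linalg.lstsq(A, testability, rcond=None)
print("intercept, w_understandability, w_modifiability:", coef.round(3))
```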
APA, Harvard, Vancouver, ISO, and other styles
50

Sahoo, D. K., B. S. Mohanty, and G. B. Bhaskar. "Multi phase-field simulation study on microstructural grain development during inductive-based friction surfacing of aluminium." Digest Journal of Nanomaterials and Biostructures 17, no. 1 (2022): 263–84. http://dx.doi.org/10.15251/djnb.2022.171.263.

Full text
Abstract:
Computational simulation is considered an effective tool for the analysis and a good understanding of the complex microstructural development that occurs during friction surfacing. In this article, ABAQUS software was used to create a 3D finite element model of friction-surfaced layering of AA 6063 aluminium on EN8 carbon steel, and the heat and strain rates collected throughout the operation were utilised for a computational investigation of microstructural recrystallization and grain development. A multi-phase field model (MFM) and a constructive material model (CMM) were used to predict grain development during the process. A decrease in the incubation period before recrystallization, from 0.839 s to 0.578 s, was seen after the substrate preheating temperature was raised from 100°C to 300°C. The reliability of the model obtained from the computational study was validated against electron backscattered diffraction (EBSD) images of grain size development and distribution. A comparison between the computational and experimental images shows a maximum error of less than 10%. The grain structure developed during recrystallization strongly influences the coating strength, which was found to be inversely proportional to the coating's average grain diameter.
APA, Harvard, Vancouver, ISO, and other styles