Academic literature on the topic 'Algorithmic service encounters'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Algorithmic service encounters.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Algorithmic service encounters"

1

Hoang, Khang Tran. "Cognitive-Affective Responses to Algorithmic Service Encounters: A Multi-Theoretical Analysis of AI-Mediated Customer Experience, Brand Relationship Quality, and Satisfaction Dynamics in Vietnam's Digital Marketplace." Journal of Economics, Finance and Management Studies 8, no. 5 (2025): 2274–84. https://doi.org/10.5281/zenodo.15501654.

Full text
Abstract:
This research investigates the complex interplay between cognitive-affective responses and algorithmic service encounters in Vietnam's rapidly evolving digital marketplace. Through a multi-theoretical lens integrating service-dominant logic, cognitive appraisal theory, technology acceptance model, and relationship marketing frameworks, this study examines how artificial intelligence (AI) mediated customer experiences influence brand relationship quality and satisfaction. Employing a robust mixed-method approach combining structural equation modeling with partial least squares (PLS-SEM) and fuzzy-set Qualitative Comparative Analysis (fsQCA), data from 387 Vietnamese digital consumers were analyzed. Results reveal that algorithmic service personalization significantly enhances cognitive-affective customer experiences, which subsequently strengthen brand relationship quality and satisfaction. Technology readiness moderates these relationships, with higher levels amplifying positive effects of AI-mediated experiences. The fsQCA findings identify multiple configurations of conditions leading to high satisfaction, demonstrating equifinality in satisfaction formation. This research contributes to the literature by developing an integrated theoretical framework explicating the psychological mechanisms through which algorithmic service encounters shape customer outcomes, offering nuanced insights into the digital transformation of service experiences in emerging markets, and identifying optimal configurations for enhancing customer satisfaction in technology-mediated environments.
APA, Harvard, Vancouver, ISO, and other styles
2

Zhang, Zhiping, and Changda Wang. "Service Function Chain Migration: A Survey." Computers 14, no. 6 (2025): 203. https://doi.org/10.3390/computers14060203.

Full text
Abstract:
As a core technology emerging from the convergence of Network Function Virtualization (NFV) and Software-Defined Networking (SDN), Service Function Chaining (SFC) enables the dynamic orchestration of Virtual Network Functions (VNFs) to support diverse service requirements. However, in dynamic network environments, SFC faces significant challenges, such as resource fluctuations, user mobility, and fault recovery. To ensure service continuity and optimize resource utilization, an efficient migration mechanism is essential. This paper presents a comprehensive review of SFC migration research, analyzing it across key dimensions including migration motivations, strategy design, optimization goals, and core challenges. Existing approaches have demonstrated promising results in both passive and active migration strategies, leveraging techniques such as reinforcement learning for dynamic scheduling and digital twins for resource prediction. Nonetheless, critical issues remain—particularly regarding service interruption control, state consistency, algorithmic complexity, and security and privacy concerns. Traditional optimization algorithms often fall short in large-scale, heterogeneous networks due to limited computational efficiency and scalability. While machine learning enhances adaptability, it encounters limitations in data dependency and real-time performance. Future research should focus on deeply integrating intelligent algorithms with cross-domain collaboration technologies, developing lightweight security mechanisms, and advancing energy-efficient solutions. Moreover, coordinated innovation in both theory and practice is crucial to addressing emerging scenarios like 6G and edge computing, ultimately paving the way for a highly reliable and intelligent network service ecosystem.
APA, Harvard, Vancouver, ISO, and other styles
3

Tikaningsih, Ades. "Optimizing Waste Collection Routes in Purwokerto using the Dijkstra Algorithm." Publication of the International Journal and Academic Research 1, no. 2 (2025): 78–85. https://doi.org/10.63222/pijar.v1i2.21.

Full text
Abstract:
Waste is a complex problem that has the potential to cause environmental degradation if not handled properly. Waste management in Banyumas Regency, particularly in the Purwokerto area, encounters several fundamental limitations, including the limited capacity of the transportation fleet and the lack of an integrated route distribution system. Based on the documentation of the Banyumas Regency Environmental Service, of the total daily waste generation of 600 tons, only 45% is transported to the Final Disposal Site (TPA), highlighting the urgent need for optimization of the waste management system. This study applies Dijkstra's algorithm, whose greedy principle weights the distances between points and selects the minimum value, to develop a simulation for determining the shortest route for transporting waste from the Temporary Shelters (TPS) to the TPA in Purwokerto. The results of the computational analysis indicate that the optimal route from the Environmental Service office to the TPA/PDU Tanjung is 10.553 kilometers long, involving eight stages of algorithmic iteration. This finding confirms the efficiency of the route compared to other alternatives, supporting the acceleration of waste reduction and handling targets in alignment with Banyumas Regency's strategic policy directives.
APA, Harvard, Vancouver, ISO, and other styles
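The greedy shortest-path procedure this abstract describes is standard Dijkstra; a minimal Python sketch, using hypothetical road segments and distances rather than the study's Purwokerto data, might look like:

```python
import heapq

def dijkstra(graph, start, goal):
    """Greedy shortest path: repeatedly settle the unvisited node with
    the smallest tentative distance, relaxing its outgoing edges."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    visited = set()
    while pq:
        d, node = heapq.heappop(pq)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            break
        for nbr, w in graph.get(node, {}).items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    # Reconstruct the route by walking predecessors back from the goal.
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return dist[goal], path[::-1]

# Hypothetical road segments (km) between a depot, two shelters, and the TPA.
roads = {
    "Depot": {"TPS-1": 4.2, "TPS-2": 5.1},
    "TPS-1": {"TPS-2": 1.8, "TPA": 7.0},
    "TPS-2": {"TPA": 5.0},
}
length, route = dijkstra(roads, "Depot", "TPA")
```

Each heap pop is one "stage of algorithmic iteration" in the abstract's terms; the node names and weights above are illustrative only.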
4

Ku, Chu-Chang, Chien-Chou Chen, Simon Dixon, Hsien Ho Lin, and Peter J. Dodd. "Patient pathways of tuberculosis care-seeking and treatment: an individual-level analysis of National Health Insurance data in Taiwan." BMJ Global Health 5, no. 6 (2020): e002187. http://dx.doi.org/10.1136/bmjgh-2019-002187.

Full text
Abstract:
Introduction: Patients with tuberculosis (TB) often experience difficulties in accessing diagnosis and treatment. Patient pathway analysis identifies mismatches between TB patient care-seeking patterns and service coverage, but to date, studies have only employed cross-sectional aggregate data. Methods: We developed an algorithmic approach to analyse and interpret patient-level routine data on healthcare use and to construct patients’ pathways from initial care-seeking to treatment outcome. We applied this to patients with TB in a simple random sample of one million patients’ records in the Taiwan National Health Insurance database. We analysed heterogeneity in pathway patterns, delays, service coverage and patient flows between different health system levels. Results: We constructed 7255 pathways for 6258 patients. Patients most commonly initially sought care at the primary clinic level, where the capacity for diagnosing TB patients was 12%, before eventually initiating treatment at higher levels. Patient pathways are extremely heterogeneous prior to diagnosis, with the 10% most complex pathways accounting for 48% of all clinical encounters, and 55% of those pathways yet to initiate treatment after a year. Extended consideration of alternative diagnoses was more common for patients aged 65 years or older and for patients with chronic lung disease. Conclusion: Our study demonstrates that longitudinal analysis of routine individual-level healthcare data can be used to generate a detailed picture of TB care-seeking pathways. This allows an understanding of several temporal aspects of care pathways, including lead times to care and the variability in patient pathways.
APA, Harvard, Vancouver, ISO, and other styles
5

Wang, Yuhong, and Yiqin Sheng. "Two-stage optimization of instant distribution of fresh products based on improved NSGA-III algorithm." International Journal of Industrial Engineering Computations 16, no. 3 (2025): 535–56. https://doi.org/10.5267/j.ijiec.2025.5.002.

Full text
Abstract:
As an important part of the fresh-produce business, fresh food instant delivery encounters numerous challenges. Issues such as high losses, complex cold chains, and time sensitivity drive up costs. Additionally, the livelihood of end-delivery personnel is under pressure and the talent market is saturated. Platform algorithms focus on the interests of the platform and its customers while relatively overlooking those of delivery personnel, which degrades overall operation quality, significantly reducing delivery efficiency and service quality and further eroding user stickiness. Therefore, optimizing fresh food delivery routes while considering the interests of multiple parties to improve efficiency and service quality is a crucial research issue in the field of fresh food instant delivery. This paper designs a three-objective static model for fresh food instant delivery aiming at minimizing the total cost, maximizing customer satisfaction, and maximizing rider satisfaction. Considering the dynamic changes of orders during actual operation and incorporating newly added orders, a multi-objective dynamic model with the goals of minimizing the total cost, minimizing average customer dissatisfaction, and maximizing the income fairness of riders is further established. Based on the constructed models and by incorporating the SPBO strategy, the NSGA-III algorithm is improved to make it more adaptable to the multi-objective optimization requirements of the fresh food instant delivery scenario. This study selects five operational points within a specific region of a fresh food self-operated platform and the order data from a particular day as research cases to obtain the parameters required for the model and conduct case analysis. Based on the platform's business priorities and development needs, appropriate Pareto solutions are selected. Additionally, the feasibility and effectiveness of the improved algorithm are verified through algorithmic comparison. The research aims to provide valuable references and insightful implications for the management decisions of fresh food self-operated platforms, as well as to continuously optimize the management and service of the instant delivery process.
APA, Harvard, Vancouver, ISO, and other styles
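The "appropriate Pareto solutions" the abstract mentions are drawn from non-dominated fronts; a minimal sketch of the dominance test that underlies NSGA-style ranking, with hypothetical objective triples (all expressed as minimization, e.g. cost, negated customer satisfaction, negated rider satisfaction), might look like:

```python
def dominates(a, b):
    """True if a Pareto-dominates b: no worse in every objective and
    strictly better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only non-dominated solutions; NSGA-style algorithms rank a
    population by repeatedly peeling off such fronts."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t != s)]

# Hypothetical (cost, -customer satisfaction, -rider satisfaction) triples.
candidates = [(10, -0.9, -0.8), (12, -0.9, -0.8), (9, -0.7, -0.9), (11, -0.95, -0.7)]
front = pareto_front(candidates)
```

Here (12, -0.9, -0.8) is dominated by (10, -0.9, -0.8) and drops out; the remaining three solutions trade the objectives off against each other, which is the set a platform would then choose from by business priority.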
6

Zhu, Huiyi. "Risks and Countermeasures of Personal Information Processing in Minors’ Network Socialization." Law and Economy 4, no. 2 (2025): 58–67. https://doi.org/10.56397/le.2025.02.08.

Full text
Abstract:
Minors experience network cognition, network imitation, and network interaction in the algorithmic society; the invisible hand of the algorithm is inseparable from every network action. When minors encounter algorithmic recommendation services on social media, their personal information can be abused by the algorithm. The existing algorithm-based regulations for the protection of personal information have problems such as the rigid age division of the notification-and-consent mechanism and one-time notification and consent. In this regard, it is necessary to re-establish the risk assessment system for minors’ personal information, improve the operability of consent notification and emphasize dynamic notification, balance the rights and interests of minors’ personal information against freedom of expression on the Internet, and build an algorithmic society in which minors can communicate and trust.
APA, Harvard, Vancouver, ISO, and other styles
7

Gruber, Jonathan, and Eszter Hargittai. "The importance of algorithm skills for informed Internet use." Big Data & Society 10, no. 1 (2023): 205395172311681. http://dx.doi.org/10.1177/20539517231168100.

Full text
Abstract:
Using the Internet means encountering algorithmic processes that influence what information a user sees or hears. Existing research has shown that people's algorithm skills vary considerably, that they develop individual theories to explain these processes, and that their online behavior can reflect these understandings. Yet, there is little research on how algorithm skills enable people to use algorithms to their own benefit and to avoid harms they may elicit. To fill this gap in the literature, we explore the extent to which people understand how the online systems and services they use may be influenced by personal data that algorithms know about them, and whether users change their behavior based on this understanding. Analyzing 83 in-depth interviews from five countries about people's experiences with researching and searching for products and services online, we show how being aware of personal data collection helps people understand algorithmic processes. However, this does not necessarily enable users to influence algorithmic output, because currently, options that help users control the level of customization they encounter online are limited. Besides the empirical contributions, we discuss research design implications based on the diversity of the sample and our findings for studying algorithm skills.
APA, Harvard, Vancouver, ISO, and other styles
8

Ramesh, G. Raja. "Utilizing Hybrid Cloud Computing with Machine Learning and Deep Learning to Enhance Privacy, Security, and Empower Patients." Journal of Information Systems Engineering and Management 10, no. 3 (2025): 1226–36. https://doi.org/10.52783/jisem.v10i3.7194.

Full text
Abstract:
This study investigates the effectiveness of Hybrid Cloud solutions in meeting the challenges encountered within the healthcare sector. Hybrid Cloud technology provides adaptable, on-demand services that empower hospitals and clinics to sidestep costly infrastructure upgrades and streamline maintenance costs. The scalability of cloud platforms addresses the fluctuating demands of the health and wellness industry, supported by fail-safes like disaster recovery and redundancy to ensure continuous service availability. At the heart of this infrastructure lies the Hybrid Health Cloud (HHC), serving as a central data repository for efficient information access and sharing. Nevertheless, obstacles emerge due to the time-consuming decryption and memory-intensive re-encryption processes inherent in HHC schemes. To counter these challenges, a novel approach integrates machine learning, deep learning, and Hybrid Cloud technologies, aiming to enhance system efficiency. Leveraging SHA-based algorithmic perspectives such as categorization, grouping, deep semantic networks, and quantum semantic networks, this study strives to improve both prediction accuracy and data protection.
APA, Harvard, Vancouver, ISO, and other styles
9

Meer, Elana A., Maguire Herriman, Doreen Lam, et al. "Design, Implementation, and Validation of an Automated, Algorithmic COVID-19 Triage Tool." Applied Clinical Informatics 12, no. 5 (2021): 1021–28. http://dx.doi.org/10.1055/s-0041-1736627.

Full text
Abstract:
Objective: We describe the design, implementation, and validation of an online, publicly available tool to algorithmically triage patients experiencing severe acute respiratory syndrome coronavirus (SARS-CoV-2)-like symptoms. Methods: We conducted a chart review of patients who completed the triage tool and subsequently contacted our institution's phone triage hotline to assess tool- and clinician-assigned triage codes, patient demographics, SARS-CoV-2 (COVID-19) test data, and health care utilization in the 30 days post-encounter. We calculated the percentage of concordance between tool- and clinician-assigned triage categories, down-triage (clinician assigning a less severe category than the triage tool), and up-triage (clinician assigning a more severe category than the triage tool) instances. Results: From May 4, 2020 through January 31, 2021, the triage tool was completed 30,321 times by 20,930 unique patients. Of those 30,321 triage tool completions, 51.7% were assessed by the triage tool to be asymptomatic, 15.6% low severity, 21.7% moderate severity, and 11.0% high severity. The concordance rate, where the triage tool and clinician assigned the same clinical severity, was 29.2%. The down-triage rate was 70.1%. Only six patients were up-triaged by the clinician. 72.1% received a COVID-19 test administered by our health care system within 14 days of their encounter, with a positivity rate of 14.7%. Conclusion: The design, pilot, and validation analysis in this study show that this COVID-19 triage tool can safely triage patients when compared with clinician triage personnel. This work may signal opportunities for automated triage of patients for conditions beyond COVID-19 to improve patient experience by enabling self-service, on-demand, 24/7 triage access.
APA, Harvard, Vancouver, ISO, and other styles
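The concordance, down-triage, and up-triage rates reported in this abstract reduce to simple comparisons over an ordered severity scale; a small illustrative sketch, using hypothetical category pairs rather than the study's data, could be:

```python
def triage_agreement(pairs, order=("asymptomatic", "low", "moderate", "high")):
    """Given (tool, clinician) category pairs, return the fraction that
    agree (concordance), where the clinician chose a less severe level
    (down-triage), and where the clinician chose a more severe level
    (up-triage)."""
    rank = {c: i for i, c in enumerate(order)}
    n = len(pairs)
    conc = sum(1 for tool, clin in pairs if tool == clin) / n
    down = sum(1 for tool, clin in pairs if rank[clin] < rank[tool]) / n
    up = sum(1 for tool, clin in pairs if rank[clin] > rank[tool]) / n
    return conc, down, up

# Hypothetical (tool, clinician) assignments for four encounters.
pairs = [("high", "high"), ("high", "moderate"), ("moderate", "low"), ("low", "low")]
conc, down, up = triage_agreement(pairs)
```

The severity ordering above is an assumption drawn from the categories the abstract names; the three fractions always sum to one.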
10

Kharitonova, Yu, N. S. Malik, and T. Yang. "The Legal Issue of Deterrence of Algorithmic Control of Digital Platforms: The Experience of China, the European Union, Russia and India." BRICS Law Journal 10, no. 1 (2023): 147–70. http://dx.doi.org/10.21684/2412-2343-2023-10-1-147-170.

Full text
Abstract:
The authorities in a number of states are concerned about the need for public disclosure of the recommendation algorithms used in online services. The introduction of regulations aimed at software developers is frequently proposed as a potential solution to this problem of algorithm transparency. These requirements, which must be fulfilled by the developers of software products, can take the form of administrative regulations or standards. However, despite these efforts, in the absence of direct legislative regulation, users continue to encounter the possibility that a social network feed or a search result may present content in an unequal or opaque way, because the logic behind these recommendations is not clear and is concealed by IT giants. The main provisions of the legislative initiatives include: the obligation of digital platforms to publish the mechanisms of their recommendation services, the responsibility to inform the user about the processing of personal data, and the possibility for the user to refuse such processing. States have recognized the problem and are approaching it from different positions. Each region chooses what to prioritize in terms of the law. For China and Europe, all areas of platforms are important, whereas Russia's interest lies in news platforms and video hosting, and for India, social media is the most important platform category. In all of these countries, however, the requirements for disclosure of the recommendation engine are expanding to a certain extent. The amount of information that is publicly available, as well as the order in which it is disclosed, varies. This study demonstrates the commonalities and differences in the approaches taken by various countries.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Books on the topic "Algorithmic service encounters"

1

Mehta, Vaishali, Dolly Sharma, Monika Mangla, Anita Gehlot, Rajesh Singh, and Sergio Márquez Sánchez, eds. Challenges and Opportunities for Deep Learning Applications in Industry 4.0. BENTHAM SCIENCE PUBLISHERS, 2022. http://dx.doi.org/10.2174/97898150360601220101.

Full text
Abstract:
The competence of deep learning for the automation and manufacturing sector has received astonishing attention in recent times. The manufacturing industry has recently experienced a revolutionary advancement despite several issues. One of the limitations for technical progress is the bottleneck encountered due to the enormous increase in data volume for processing, comprising various formats, semantics, qualities and features. Deep learning enables detection of meaningful features that are difficult to perform using traditional methods. The book takes the reader on a technological voyage of the industry 4.0 space. Chapters highlight recent applications of deep learning and the associated challenges and opportunities it presents for automating industrial processes and smart applications. Chapters introduce the reader to a broad range of topics in deep learning and machine learning. Several deep learning techniques used by industrial professionals are covered, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical project methodology. Readers will find information on the value of deep learning in applications such as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames. The book also discusses prospective research directions that focus on the theory and practical applications of deep learning in industrial automation. Therefore, the book aims to serve as a comprehensive reference guide for industrial consultants interested in industry 4.0, and as a handbook for beginners in data science and advanced computer science courses.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Algorithmic service encounters"

1

Högberg, Karin. "Technostress Among Hotel Employees - A Longitudinal Study of Social Media as Digital Service Encounters." In Information and Communication Technologies in Tourism 2021. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-65785-7_6.

Full text
Abstract:
The increasing implementation of digital technologies in organizations, such as social media platforms, is fundamentally transforming the nature of service encounters [1, 2], not least in the hospitality industry. This creates new ways of working for hotel employees, causing disruption in service routines and work tasks. Few qualitative studies focus on the hospitality industry and technostress. The present study focuses on technostress among employees in an international hotel chain. Data have been collected in eight European countries over a period of seven years. The Person-Technology fit model is used to identify and analyze stressors and strains deriving from social media use. The results indicate that technostressors such as work overload, work-life conflict, and changing algorithms create negative strain. The study makes a theoretical contribution to technostress research in Information Systems as well as in the hospitality research field by uncovering negative stressors and strains created over time.
APA, Harvard, Vancouver, ISO, and other styles
2

Liao, Xinhai, Jiaqi Wu, Zilin Jia, and Jiaxing Xian. "Precise Services in Smart Cultural Tourism for Tourists." In Frontiers in Artificial Intelligence and Applications. IOS Press, 2024. https://doi.org/10.3233/faia241443.

Full text
Abstract:
In response to the specific problems encountered in addressing the needs of tourists in the cultural and tourism industry, this work examines how Big Data enhances precision services in smart tourism through a tourist-centric framework that improves personalization and accuracy using advanced algorithms. A service model centered on tourists and utilizing information technology to achieve precise smart cultural and tourism services is proposed. Based on collaborative filtering recommendation systems combined with the transformer model, tourist needs and behavior are predicted in order to provide personalized and precise service recommendations to tourists. Recommendations are also made to promote the transformation of the cultural and tourism industry towards smart cultural tourism based on this model.
APA, Harvard, Vancouver, ISO, and other styles
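The collaborative-filtering core of the recommendation model this abstract describes (leaving aside the transformer component) can be sketched as a similarity-weighted vote over other users' ratings; the tourist names, attractions, and ratings below are hypothetical:

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse rating dicts, with the dot
    product taken over the items both users rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    num = sum(u[i] * v[i] for i in common)
    den = (math.sqrt(sum(x * x for x in u.values()))
           * math.sqrt(sum(x * x for x in v.values())))
    return num / den

def recommend(ratings, user, k=1):
    """User-based collaborative filtering: score items the user has not
    rated by the similarity-weighted ratings of other users."""
    scores = {}
    for other, their in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], their)
        for item, r in their.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Hypothetical tourist ratings of attractions.
ratings = {
    "tourist_a": {"museum": 5, "old_town": 4},
    "tourist_b": {"museum": 5, "old_town": 4, "night_market": 5},
    "tourist_c": {"beach": 5},
}
picks = recommend(ratings, "tourist_a", k=1)
```

Because tourist_a's tastes match tourist_b's and share nothing with tourist_c's, the top recommendation comes from tourist_b's remaining item; a production system would combine such scores with the sequence model the chapter proposes.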
3

Shahra, Essa Qasem, Tarek Rahil Sheltami, and Elhadi M. Shakshuki. "A Comparative Study of Range-Free and Range-Based Localization Protocols for Wireless Sensor Network." In Sensor Technology. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-2454-1.ch071.

Full text
Abstract:
Wireless Sensor Networks (WSNs) are deployed in many fields, including military operations, mechanical applications, human services, smart homes, etc. However, deploying a WSN encounters many challenges. One of these is localizing the node position, especially for mobile targets in critical situations. In this paper, the authors compare two range-free localization algorithms and one range-based algorithm, namely the Received Signal Strength (RSS), Centroid, and Distance Vector Hop (DV-Hop) protocols, using the Cooja simulator. RSS localization requires determining the RSS values from the anchor nodes around the mobile node in order to calculate the distance between the unknown mobile node and the first three anchor nodes in its range. Centroid localization requires only three anchors to compute the location of the mobile sensor, without the need for distance measurement. Lastly, the DV-Hop algorithm uses the routing tables of each anchor in the network topology to compute the average distance per hop. The results show that range-based algorithms are more accurate than range-free ones.
APA, Harvard, Vancouver, ISO, and other styles
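Of the three protocols compared, Centroid is the simplest to illustrate: the mobile node estimates its position as the average of the coordinates of the anchors it can hear, with no distance measurement at all. A minimal sketch with hypothetical anchor positions:

```python
def centroid_locate(anchors_in_range):
    """Range-free Centroid localization: average the (x, y) coordinates
    of the anchor nodes whose beacons the mobile node receives."""
    if len(anchors_in_range) < 3:
        raise ValueError("centroid localization needs at least three anchors")
    n = len(anchors_in_range)
    xs = [x for x, _ in anchors_in_range]
    ys = [y for _, y in anchors_in_range]
    return sum(xs) / n, sum(ys) / n

# Hypothetical coordinates of the three anchors heard by the mobile node.
estimate = centroid_locate([(0.0, 0.0), (6.0, 0.0), (3.0, 6.0)])
```

The estimate is simply the centroid of the heard anchors, which is why the method trades accuracy for simplicity compared with range-based RSS localization.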
4

Liang, Wenkai, Yuanlong Yu, and Hangjiang Guo. "A Location Privacy Protection Scheme Combining Locality-Sensitive Hashing and Landmark Positioning." In Advances in Transdisciplinary Engineering. IOS Press, 2025. https://doi.org/10.3233/atde250297.

Full text
Abstract:
Location data plays a crucial role in location-based services (LBS) as it enables service providers to analyze users’ daily activities and infer their behavioral patterns. Effectively protecting user privacy during the use of these services is a significant research topic in the field of cybersecurity. Currently, researchers have made substantial progress by generating cloaking points based on users’ actual locations to safeguard their location data. However, this method still encounters challenges related to inaccurate positioning and the potential to deduce actual locations. To address these issues, this paper integrates locality-sensitive hashing with landmark positions, which protects users’ real locations while utilizing landmarks as a basis for positioning. This approach ensures the protection of location privacy while maintaining the algorithm’s practicality. Extensive experiments have demonstrated that this method achieves superior location positioning accuracy across various application scenarios, outperforming existing techniques.
APA, Harvard, Vancouver, ISO, and other styles
5

Kitanov, Stojan, Borislav Popovski, and Toni Janevski. "Quality Evaluation of Cloud and Fog Computing Services in 5G Networks." In Research Anthology on Architectures, Frameworks, and Integration Strategies for Distributed and Cloud Computing. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-5339-8.ch086.

Full text
Abstract:
Because of the increased computing and intelligent networking demands of 5G networks, cloud computing alone encounters too many limitations, such as requirements for reduced latency, high mobility, high scalability, and real-time execution. A new paradigm called fog computing has emerged to resolve these issues. Fog computing distributes computing, data processing, and networking services to the edge of the network, closer to end users. Fog applied in 5G significantly improves network performance in terms of spectral and energy efficiency, enables direct device-to-device wireless communications, and supports the growing trend of network function virtualization and the separation of network control intelligence from radio network hardware. This chapter evaluates the quality of cloud and fog computing services in a 5G network and proposes five algorithms for an optimal selection of 5G RAN according to the service requirements. The results demonstrate that fog computing is a suitable technology solution for 5G networks.
APA, Harvard, Vancouver, ISO, and other styles
6

Kitanov, Stojan, Borislav Popovski, and Toni Janevski. "Quality Evaluation of Cloud and Fog Computing Services in 5G Networks." In Research Anthology on Developing and Optimizing 5G Networks and the Impact on Society. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-7708-0.ch012.

Full text
Abstract:
Because of the increased computing and intelligent networking demands of 5G networks, cloud computing alone encounters too many limitations, such as requirements for reduced latency, high mobility, high scalability, and real-time execution. A new paradigm called fog computing has emerged to resolve these issues. Fog computing distributes computing, data processing, and networking services to the edge of the network, closer to end users. Fog applied in 5G significantly improves network performance in terms of spectral and energy efficiency, enables direct device-to-device wireless communications, and supports the growing trend of network function virtualization and the separation of network control intelligence from radio network hardware. This chapter evaluates the quality of cloud and fog computing services in a 5G network and proposes five algorithms for an optimal selection of 5G RAN according to the service requirements. The results demonstrate that fog computing is a suitable technology solution for 5G networks.
APA, Harvard, Vancouver, ISO, and other styles
7

Kitanov, Stojan, Borislav Popovski, and Toni Janevski. "Quality Evaluation of Cloud and Fog Computing Services in 5G Networks." In Enabling Technologies and Architectures for Next-Generation Networking Capabilities. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-6023-4.ch001.

Full text
Abstract:
Because of the increased computing and intelligent networking demands of 5G networks, cloud computing alone encounters too many limitations, such as requirements for reduced latency, high mobility, high scalability, and real-time execution. A new paradigm called fog computing has emerged to resolve these issues. Fog computing distributes computing, data processing, and networking services to the edge of the network, closer to end users. Fog applied in 5G significantly improves network performance in terms of spectral and energy efficiency, enables direct device-to-device wireless communications, and supports the growing trend of network function virtualization and the separation of network control intelligence from radio network hardware. This chapter evaluates the quality of cloud and fog computing services in a 5G network and proposes five algorithms for an optimal selection of 5G RAN according to the service requirements. The results demonstrate that fog computing is a suitable technology solution for 5G networks.
APA, Harvard, Vancouver, ISO, and other styles
8

Thandayuthapani, S., P. Thirumoorthi, P. Elantheraiyan, Leena Jenefa, and M. Selvakumar. "An Exploration of Consumer Engagement Strategies Through the Lens of Artificial Intelligence in Marketing Personalization." In Advances in Marketing, Customer Relationship Management, and E-Services. IGI Global, 2024. http://dx.doi.org/10.4018/979-8-3693-7122-0.ch008.

Full text
Abstract:
The study uses consumer involvement theory to examine AI's potential to change marketing through personalization. Using client data and past encounters, AI can adapt messaging and boost engagement. Gamification motivates consumers, and AI provides individualized gamified marketing for greater brand interactions. Predictive algorithms employ user data to customize adverts, product recommendations, and demographic content. NLP systems assess sentiment from social media and reviews, allowing companies to make messaging more engaging. Conversational AI improves consumer connections with real-time product recommendations and support. Personalization increases satisfaction and loyalty, but data privacy requires transparency and user control. AI-driven tailored marketing must be diverse to avoid bias.
APA, Harvard, Vancouver, ISO, and other styles
9

Jindal, Priyanshi, and Harshit Gouri. "AI-Personalization Paradox." In Advances in Marketing, Customer Relationship Management, and E-Services. IGI Global, 2024. http://dx.doi.org/10.4018/979-8-3693-1918-5.ch004.

Full text
Abstract:
The integration of AI into various aspects of our lives has significantly reshaped how we access information, products, and services. AI-driven personalization, a key feature of many platforms, aims to enhance user experiences by tailoring content to individual preferences. Advancement of AI personalization has given rise to the filter bubble and echo-chamber phenomena. Filter bubbles expose consumers to content that reinforces their existing beliefs and preferences, creating a paradox. This chapter explores the multifaceted implications of AI-personalization paradox on digital consumer behavior in the context of the filter bubble era investigating how AI algorithms shape the content users encounter, impact of algorithms on information diversity, and consequences for consumer decision-making. The chapter concludes speculating on future of personalization, emphasizing the need to balance customization with information diversity, encouraging critical thinking about AI ethics among consumers.
APA, Harvard, Vancouver, ISO, and other styles
10

Singha, Surjit. "Social Media's Influence on Consumer Decision-Making." In Advances in Marketing, Customer Relationship Management, and E-Services. IGI Global, 2024. http://dx.doi.org/10.4018/979-8-3693-3811-7.ch004.

Full text
Abstract:
Within the ever-evolving realm of social media marketing, enterprises encounter various obstacles and prospects. Effective strategies prioritize using narratives, fostering community, demonstrating transparency, and embracing authenticity. A combination of quantitative metrics and qualitative insights is necessary for calculating ROI. In anticipation of the future, social media platforms will feature personalized AI-driven experiences, immersive experiences, and an emphasis on audio-centric content. Amidst obstacles such as algorithm modifications and privacy apprehensions, prospects emerge through video hegemony and the incorporation of social commerce. Business enterprises must possess agility, strategic foresight, and a steadfast dedication to authenticity and ethics to navigate the complex interplay between innovation and fundamental values.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Algorithmic service encounters"

1

Shaffer, Joshua, Joseph B. Kopena, and William C. Regli. "Web Service Interfaces for Design Repositories." In ASME 2005 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2005. http://dx.doi.org/10.1115/detc2005-85386.

Full text
Abstract:
Reuse of design knowledge is an important goal in engineering design, and has received much attention. A substantial set of algorithms, methodology, and developed systems exist which support various aspects of this goal. However, the majority of these systems are built around a particular user interface, often some form of Web-based repository portal. The work described here presents search and other core functionality as web services rather than a monolithic repository system. These services may then be employed by a variety of applications, integrating them into interfaces familiar to the designer, extending functionality, streamlining their use, and enabling them to be employed throughout the design process. This paper demonstrates this approach by wrapping previously developed repository search algorithms as web services, and then using these within a plug-in for an existing commercial CAD environment. Based on issues encountered in developing this demonstration, this paper also discusses the challenges and potential approaches toward a more general, widespread application of web services in engineering design.
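The architectural idea of exposing repository search as a service rather than a portal feature can be sketched as a minimal WSGI endpoint (the paper predates this style and used its own service interfaces; the query format and toy corpus below are assumptions for illustration):

```python
import json
from urllib.parse import parse_qs

# Toy stand-in for a design repository's search index.
REPO = {"bracket": ["part-017", "part-233"], "gear": ["part-102"]}

def search(term):
    # Stand-in for the previously developed repository search algorithms.
    return REPO.get(term, [])

def app(environ, start_response):
    # Thin web-service wrapper: any HTTP client (a CAD plug-in, a portal,
    # a script) can now reuse the same core search logic.
    term = parse_qs(environ.get("QUERY_STRING", "")).get("q", [""])[0]
    body = json.dumps({"query": term, "results": search(term)}).encode()
    start_response("200 OK", [("Content-Type", "application/json")])
    return [body]

# Served with the stdlib for local testing:
#   from wsgiref.simple_server import make_server
#   make_server("", 8000, app).serve_forever()
```

Decoupling the interface from the search logic is exactly what lets the same functionality appear inside a commercial CAD environment as a plug-in.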
APA, Harvard, Vancouver, ISO, and other styles
2

Hussein, Dina, Taha Belkhouja, Ganapati Bhat, and Janardhan Rao Doppa. "Energy-Efficient Missing Data Imputation in Wearable Health Applications: A Classifier-aware Statistical Approach." In Thirty-Third International Joint Conference on Artificial Intelligence {IJCAI-24}. International Joint Conferences on Artificial Intelligence Organization, 2024. http://dx.doi.org/10.24963/ijcai.2024/807.

Full text
Abstract:
Wearable devices are being increasingly used in high-impact health applications including vital sign monitoring, rehabilitation, and movement disorders. Wearable health monitoring can aid in the United Nations social development goal of healthy lives by enabling early warning, risk reduction, and management of health risks. Health tasks on wearable devices employ multiple sensors to collect relevant parameters of the user’s health and make decisions using machine learning (ML) algorithms. The ML algorithms assume that data from all sensors are available for the health monitoring tasks. However, the applications may encounter missing or incomplete data due to user error, energy limitations, or sensor malfunction. Missing data results in significant loss of accuracy and quality of service. This paper presents a novel Classifier-Aware iMputation (CAM) approach to impute missing data such that classifier accuracy for health tasks is not affected. Specifically, CAM employs unsupervised clustering followed by a principled search algorithm to uncover imputation patterns that maintain high accuracy. Evaluations on seven diverse health tasks show that CAM achieves accuracy within 5% of the baseline with no missing data when one sensor is missing. CAM also achieves significantly higher accuracy compared to generative approaches with negligible energy overhead, making it suitable for a wide range of wearable applications.
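The clustering-then-imputation idea behind CAM can be illustrated with a deliberately simplified stand-in: average complete training samples into centroids, then fill a sample's missing sensor value from the centroid nearest on its observed sensors (the real CAM uses proper unsupervised clustering and a principled, classifier-aware search; everything below is a toy approximation):

```python
def centroids(samples, k):
    # Trivial "clustering": split the training samples into k chunks and
    # average each chunk into a centroid (stands in for real clustering).
    chunk = max(1, len(samples) // k)
    cents = []
    for i in range(0, len(samples), chunk):
        group = samples[i:i + chunk]
        cents.append([sum(col) / len(group) for col in zip(*group)])
    return cents

def impute(sample, cents):
    # Compare only on the sensors that are present (None marks missing).
    present = [i for i, v in enumerate(sample) if v is not None]
    best = min(cents, key=lambda c: sum((sample[i] - c[i]) ** 2 for i in present))
    return [v if v is not None else best[i] for i, v in enumerate(sample)]

train = [[1.0, 2.0, 3.0], [1.2, 2.1, 2.9], [8.0, 9.0, 10.0], [8.2, 8.8, 10.1]]
cents = centroids(train, 2)
print(impute([8.1, None, 10.0], cents))  # missing sensor filled from the near cluster
```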
APA, Harvard, Vancouver, ISO, and other styles
3

Rocher, Antonin, Jean-Charles Maré, and Roland Becquet. "Model-based Failure Anticipation and Predictive Maintenance: a Landing Gear Application." In Vertical Flight Society 77th Annual Forum & Technology Display. The Vertical Flight Society, 2021. http://dx.doi.org/10.4050/f-0077-2021-16800.

Full text
Abstract:
In order to reduce maintenance for operators and improve the availability of their aircraft, new failure anticipation methods are being implemented at Airbus Helicopters. This paper focuses on the rapid development and entry into service of a monitoring and alerting algorithm able to provide a diagnosis for the most common faults encountered in service on a landing gear system. This development is based on a hybrid approach combining analysis of historical data with multiphysics modelling and simulation. Reliable health indicators are built to greatly reduce the amount of data to be continuously analyzed from in-service measurements. Monitoring their values assesses the system health and diagnoses a potential ongoing degradation mode. The implementation of such methods aims at paving the way for condition-based maintenance of aircraft on-board systems, given the promising results and operational gains already exhibited by the in-service implementation.
APA, Harvard, Vancouver, ISO, and other styles
4

Drouet, Céline, Nicolas Cellier, Jérémie Raymond, and Denis Martigny. "Sea State Estimation Based on Ship Motions Measurements and Data Fusion." In ASME 2013 32nd International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/omae2013-10657.

Full text
Abstract:
In-service monitoring can help increase the safety of ships, especially regarding fatigue assessment. For this purpose, it is compulsory to know the environmental conditions encountered: wind, but also the full directional wave spectrum. During the EU TULCS project, a full-scale measurement campaign was conducted onboard the CMA-CGM 13,200 TEU container ship Rigoletto, which was instrumented to measure deformation of the ship as well as the sea state encountered during its trip. This paper focuses on the sea state estimation. Three systems were installed to estimate the sea state encountered by the Rigoletto: an X-band radar from Ocean Waves with the WAMOS® system and two altimetric wave radars from RADAC®. Nevertheless, the measured significant wave height can be disturbed by several external elements such as bow waves, spray, and sea-surface ripples. Furthermore, ship motions are also measured and can provide another estimate of the significant wave height using a specific algorithm developed by DCNS Research for the TULCS project. As all these estimates are inherently different, it is necessary to fuse the data to provide a single “best estimate” of the significant wave height. This paper presents the data fusion process developed for TULCS and shows some first validation results.
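A minimal illustration of the fusion step: combine several noisy estimates of significant wave height by inverse-variance weighting, so that the more reliable sensors dominate the "best estimate" (the actual TULCS fusion process is more elaborate, and the sensor variances below are invented):

```python
def fuse(estimates):
    """estimates: list of (value_m, variance) pairs -> fused value in metres."""
    weights = [1.0 / var for _, var in estimates]          # trust = 1/variance
    total = sum(weights)
    return sum(w * v for w, (v, _) in zip(weights, estimates)) / total

# Fictitious readings: X-band radar, two altimetric radars, motion-based estimate
obs = [(3.2, 0.40), (2.9, 0.10), (3.0, 0.10), (3.1, 0.25)]
print(round(fuse(obs), 3))  # a single Hs estimate dominated by the low-variance sensors
```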
APA, Harvard, Vancouver, ISO, and other styles
5

Cuanang, Jonas, Constantine Tarawneh, Martin Amaro, Jennifer Lima, and Heinrich Foltz. "Optimization of Railroad Bearing Health Monitoring System for Wireless Utilization." In 2020 Joint Rail Conference. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/jrc2020-8060.

Full text
Abstract:
In the railroad industry, systematic health inspections of freight railcar bearings are required. Bearings are subjected to high loads and run at high speeds, so over time a bearing may develop a defect that can potentially cause a derailment if left in service. Current bearing condition monitoring systems include Hot-Box Detectors (HBDs) and Trackside Acoustic Detection Systems (TADS™). The commonly used HBDs employ non-contact infrared sensors to detect abnormal temperatures of bearings as they pass over the detector. Bearing temperatures about 94°C above ambient conditions trigger an alarm indicating that the bearing must be removed from field service and inspected for defects. However, HBDs can be inconsistent, failing to detect 138 severely defective bearings between 2010 and 2019; and from 2001 to 2007, Amsted Rail concluded that about 40% of presumably defective bearings flagged by HBDs did not have any significant defects upon teardown and inspection. TADS™ use microphones to detect high-risk bearings by listening to their acoustic sound vibrations, but coverage is limited: there are fewer than 30 active systems in the U.S. and Canada, and derailments may occur before bearings encounter any of them. Researchers from the University Transportation Center for Railway Safety (UTCRS) have developed an advanced algorithm that can accurately and reliably monitor the condition of bearings via temperature and vibration measurements. This algorithm uses vibration measurements collected from accelerometers on the bearing adapters to determine whether there is a defect, where the defect is within the bearing, and the approximate size of the defect. Laboratory testing is performed on the single-bearing and four-bearing test rigs housed at the University of Texas Rio Grande Valley (UTRGV).
The algorithm uses a four-second sample window of the recorded vibration data and can reliably identify the defective component inside the bearing with up to a 100% confidence level. However, about 20,000 data points are used for this analysis, which requires substantial computational power and can limit the battery life of a wireless onboard condition monitoring system. Reducing the vibration sample window while preserving an accurate analysis should therefore yield a more power-efficient algorithm. A wireless onboard condition monitoring module that collects one second of vibration data (5,200 samples) was manufactured and tested to compare its efficacy against a wired setup that uses a four-second sample window. This study investigates the root-mean-square values of the bearing vibration and its power spectral density plots to create an optimized and accurate algorithm for wireless utilization.
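The time-domain screening described above can be sketched with the root-mean-square of an acceleration window (the window length, units, and alarm threshold here are illustrative, not UTCRS values):

```python
import math

def rms(window):
    # Root-mean-square of an acceleration window (in g).
    return math.sqrt(sum(x * x for x in window) / len(window))

def is_suspect(window, threshold_g=1.5):
    # A healthy bearing's vibration RMS stays low; a rising RMS is one
    # trigger for the finer-grained spectral (PSD) analysis.
    return rms(window) > threshold_g

healthy = [0.1, -0.2, 0.15, -0.05] * 100
defective = [2.0, -1.8, 2.2, -1.9] * 100
print(is_suspect(healthy), is_suspect(defective))  # False True
```

Because RMS over a short window is cheap to compute, shrinking the sample window is exactly where the power savings for a wireless module come from.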
APA, Harvard, Vancouver, ISO, and other styles
6

Ding, Wei, and Zhaoyi Li. "Analysis of Intelligent Design of Service Robot Based on Intelligent Transformation." In 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022). AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1002317.

Full text
Abstract:
Over the past 30 years, from the end of the 20th century to the present, service robotics technology has made great progress, and many important results have been achieved in the broad interdisciplinary fields of robot mechanical structure, information transmission and interaction, material science, automation control, sensor technology, and more. Every breakthrough in key technologies has enabled service robots to develop rapidly in the direction that people expect. With the in-depth development of Internet technology, the comprehensive popularization of the Internet of Things led by 5G, and continuous breakthroughs in artificial intelligence, the development of service robots has entered an unprecedented period of technological dividends and will surely see considerable growth, becoming an important driving force for human civilization and economic development. The rapid development of advanced technologies such as artificial intelligence, big data, and 5G communications has directly promoted the rapid development of the robotics industry. Under the impetus of new technologies, service robots closely related to humans are developing rapidly and becoming more and more intelligent. In this context, the design principles, interaction design, service mode design, and other related fields of service robots are in urgent need of intelligent transformation, and the concept of intelligent design with artificial intelligence at its core has begun to receive attention from the academic community. Academic research on intelligent design is currently mainly at the review stage, and discussion is focused chiefly on graphic design. This article hopes to broaden the research field of intelligent design by studying the intelligent design of service robots.
At the same time, it provides new ideas and new paradigms for the innovative design of service robots to improve their user experience and service quality. This research studies the design principles, design goals, interaction design, service modes, and design processes of existing service robots from the perspective of intelligent design. It mainly uses literature analysis and desktop survey methods to sort out related theories and design methods, and combines specific practical case analysis to offer an outlook on the intelligent design of service robots and support their intelligent transformation. In the era of intelligent design, the design principles of service robots are also changing and iterating. First, service robots must adapt to their service scenarios; different scenarios place different requirements on a product's function and form. Second, the interaction design of service robots should be based on user experience, with technology serving as a tool to enhance it. The last point concerns the appearance design principles of service robots: taking the LeoBots Scrub Singapore sweeping robot as an example, this article proposes that the appearance design of service robots should be developed around safety, emotion, and bionics. Intelligent design is guided by traditional design thinking and methods, and conducts big data analysis and intelligent research on the essence, process, and thinking of industrial design through related design methodology, as the basis for intelligent design to simulate artificial design.
The core technical means of intelligent design is artificial intelligence, which builds on big data analysis combined with technologies such as machine learning, artificial neural networks, genetic algorithms, and deep learning to achieve intelligent development across the whole design process. The change in service robot design principles and the addition of intelligent design have changed the design process of service robots. Based on the practical case of Haier U-BOT robots, this article explores the service robot design process under the trend of intelligent transformation in order to provide new ideas for the intelligent design of service robots. The intelligent transformation of service robots promotes the development of intelligent design, and intelligent design drives the intelligent transformation of service robots.
APA, Harvard, Vancouver, ISO, and other styles
7

Bo, Cao, Song Yu, Xu Feng, and Peng Fukang. "Development of Time Series Drilling Datasets for Stuck Pipe Prediction Using Volve Field Data." In ASME 2024 43rd International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2024. http://dx.doi.org/10.1115/omae2024-128386.

Full text
Abstract:
Numerous machine learning algorithms are applied in the oil and gas industry. However, the data used in these studies are difficult to obtain due to various limitations, and the lack of benchmark datasets makes it challenging to compare performance across different algorithms. The Volve field dataset was made public by Equinor, providing raw data for the development of drilling and completion datasets. In this paper, we utilize the time-based drilling data from the Volve drilling platform and transform it into time series datasets for stuck pipe prediction. Specifically, we introduce our concepts and principles for data development, the rules for selecting time intervals and attributes, the challenges encountered during the data development process, and the methods for overcoming them. We discuss the applicability of these methods, the issues they bring, and their impact on data quality. Furthermore, we provide a well development case that includes complex data. The results indicate that our research shows promise in providing time series reference datasets for the application of machine learning algorithms to stuck pipe prediction. We aim to provide a reference methodology for the development of raw data, reducing barriers to data utilization, and hope to offer data availability, possibly even serving as a reference benchmark dataset for further applications of machine learning algorithms in the oil and gas sector. Our datasets are publicly available on GitHub: https://github.com/promiseeee/Time-series-stuck-pipe-prediction.
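The windowing at the heart of such a dataset can be sketched as follows: cut the time-based log into fixed-length windows and label each by whether a stuck event occurs within a prediction horizon after it (the attribute, window width, and labelling rule are assumptions, not the paper's actual scheme):

```python
def make_windows(series, labels, width, horizon):
    """series: list of feature rows; labels: per-step stuck flags.
    Each window of `width` steps is labelled 1 if a stuck event occurs
    within `horizon` steps after the window ends."""
    out = []
    for start in range(len(series) - width - horizon + 1):
        end = start + width
        window = series[start:end]
        label = int(any(labels[end:end + horizon]))
        out.append((window, label))
    return out

torque = [[1.0], [1.1], [1.0], [1.3], [2.5], [2.6]]  # fictitious single feature
stuck = [0, 0, 0, 0, 1, 1]
ds = make_windows(torque, stuck, width=2, horizon=2)
print([label for _, label in ds])  # windows close enough to the event get label 1
```

Choices like `width` and `horizon` correspond to the time-interval selection rules the paper discusses, and directly shape data quality.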
APA, Harvard, Vancouver, ISO, and other styles
8

Hanif, Amer, Elton Frost, Fei Le, Marina Nikitenko, Mikhail Blinov, and Nikolay Velker. "A Fast ANN Trained Solver Enables Real-Time Radial Inversion of Dielectric Dispersion Data & Accurate Estimate of Reserves in Challenging Environments." In SPE Middle East Oil & Gas Show and Conference. SPE, 2021. http://dx.doi.org/10.2118/204904-ms.

Full text
Abstract:
Dielectric dispersion measurements are increasingly used by petrophysicists to reduce uncertainty in hydrocarbon saturation analysis, and subsequent reserves estimation, especially in challenging environments. Some of these challenges relate to variable or unknown formation water salinity and/or changing rock texture, a common attribute of the carbonate reservoirs found in the Middle East. A new multi-frequency, multi-spacing dielectric logging service utilizes a sensor array scheme which provides wave attenuation and phase difference measurements at multiple depths of investigation up to 8 inches inside the formation. The improved depth of investigation yields a better measurement of true formation properties, but also a higher likelihood of measuring radial heterogeneity due to spatially variable shallow mud-filtrate invasion. Meaningful petrophysical interpretation requires an accurate electromagnetic (EM) inversion which accommodates this heterogeneity while converting raw tool measurements to true formation dielectric properties. Forward modeling solvers are typically beset with slow processing speed, precluding the use of complex, albeit representative, formation petrophysical models. An artificial neural network (ANN) has been trained to significantly speed up the forward solver, thus enabling implementation and real-time execution of a complex multi-layer radial inversion algorithm. The paper describes, in detail, the development, training, and validation of both the ANN network and the inversion algorithm. The presented ANN-based inversion has shown the ability to accurately resolve the mud-filtrate invasion profile as well as the true formation properties of individual layers.
Examples are presented which demonstrate that comprehensive, multi-frequency, multi-array EM data sets are inverted efficiently for dissimilar dielectric properties of both invaded and non-invaded formation layers around the wellbore. The results are further utilized for accurate hydrocarbon quantification otherwise not achieved by conventional resistivity-based saturation techniques. This paper presents the development of a new EM inversion algorithm and an artificial neural network (ANN) trained to significantly speed up its solution, leading to a fast turnaround for accurate petrophysical analysis, reserves estimates, and completion decisions.
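The surrogate-accelerated inversion pattern the paper exploits can be illustrated in miniature: replace a slow forward solver with a cheap approximation built once, then search for the formation parameter whose predicted response best matches the observation (the "solver", its tabulated surrogate standing in for the trained ANN, and all numbers below are toy stand-ins for the paper's EM model):

```python
import math

def slow_forward(eps):
    # Pretend EM forward solver (imagine this is expensive to evaluate).
    return 10.0 * math.log10(eps) + 2.0

# Build the surrogate once: tabulate the forward model over a parameter grid
# and answer later queries by nearest-neighbour lookup (ANN stand-in).
grid = [1.0 + 0.1 * i for i in range(400)]
table = [(x, slow_forward(x)) for x in grid]

def surrogate(eps):
    return min(table, key=lambda p: abs(p[0] - eps))[1]

def invert(measured):
    # Grid search using only the cheap surrogate, never the slow solver.
    return min(grid, key=lambda x: abs(surrogate(x) - measured))

obs = slow_forward(12.3)            # synthetic "measurement"
print(round(invert(obs), 1))        # recovers the parameter that produced it
```

The real gain is the same in spirit: once the surrogate is trained, the multi-layer radial inversion becomes fast enough to run in real time.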
APA, Harvard, Vancouver, ISO, and other styles
9

"Comprehending Complex Clastics: Consistent Real-Time Geological Borehole Imaging Independent of Drilling Fluid and Telemetry Limitations." In 2022 SPWLA 63rd Annual Symposium. Society of Petrophysicists and Well Log Analysts, 2022. http://dx.doi.org/10.30632/spwla-2022-0096.

Full text
Abstract:
Real-time geological interpretation while drilling can be achieved with high-resolution borehole images; however, the use of different drilling fluids, telemetry-related limitations, and non-optimal depth control on rigs often leave geoscientists with limited and poor-quality data, leading to inconsistencies over a field’s life cycle. This study from offshore Norway presents applications of new measurements and algorithms to address such challenges, providing consistent borehole imaging for geological interpretation while drilling a complex subsurface. New multi-physics high-resolution LWD (logging while drilling) technology was deployed for real-time imaging in boreholes drilled with nonconductive fluids, addressing technology gaps that earlier allowed such services only in conductive aqueous fluids and providing much-needed independence to drill various well trajectories in any mud configuration without limiting high-resolution imaging for geological, petrophysical, and geomechanical interpretation. Correspondingly, real-time data transmission challenges were addressed with improved mud-pulse telemetry and wired drill-pipe. Furthermore, new application algorithms were developed to compensate for inadequate depth control impacting the integrity of high-resolution data. We present results from field development operations in the Utsira High region of the Norwegian North Sea, including examples of pilot and lateral sections drilled with conductive and nonconductive fluids. Conventional evaluation of the encountered heterogeneous mix of alluvial fans, plains, and aeolian dune facies is difficult, even more so in horizontal drains where standard logs are often featureless across problematic conglomerates. Real-time dips picked on high-definition images helped with geosteering as well.
Examples of geological features from different wells are presented with unique resistivity images from the new LWD borehole imager for nonconductive fluid, compared with image data acquired in conductive mud for consistent interpretation. Structural elements of subseismic faults and fractures were interpreted with consistency to provide geologists with confident feature picks for updating their reservoir models.
APA, Harvard, Vancouver, ISO, and other styles
10

Vishnumolakala, Narendra, Dean Michael Murphy, Thu Nguyen, Enrique Zarate Losoya, Vivekvardhan Reddy Kesireddy, and Eduardo Gildin. "Predicting Dysfunction Vibration Events while Drilling Using LSTM Recurrent Neural Networks." In SPE/IATMI Asia Pacific Oil & Gas Conference and Exhibition. SPE, 2021. http://dx.doi.org/10.2118/205571-ms.

Full text
Abstract:
The objective of this study is to build a robust recurrent neural network system using Long Short-Term Memory (LSTM) to predict future vibrations during drilling operations. This provides a reliable solution to the complex problem of modeling the several forms of vibrations encountered downhole. Such an accurate prediction system can be readily integrated into advisory/warning systems, giving drillers the potential to save time, improve safety, and increase efficiency in drilling operations. High-frequency downhole drilling data from onshore fields, obtained from a major O&G service provider, was used to train and validate the models. First, multiple classification algorithms such as logistic regression, KNN, decision trees, and random forests were utilized to identify the presence and severity of stick-slip, whirl, and other drill-string vibrations. An LSTM-RNN, suited to sequential data, was then used instead of a traditional RNN to resolve the vanishing gradient problem. The LSTM-RNN architecture was built to predict vibrations (a) 10 seconds and (b) 30 seconds into the future. Results of the traditional classification models confirmed the hypothesis that dysfunctions can be successfully identified from real-time downhole drilling data: 98% accuracy was obtained in identifying torsional vibrations during drilling. A total of 101 parameters, including measured and derived variables, are available in the dataset; modeling was performed with 14 features. The RNN model was trained on data from multiple wells that encountered vibrations during drilling. The models were able to predict vibrations 10 seconds into the future with an MSE of 0.02 and 30 seconds into the future with reasonable accuracy and an MSE of 0.10. Avoiding excessive vibrations will result in fewer trips by increasing longevity and reducing malfunctions of downhole electronics, the drill-string, and the BHA.
Reduced NPT means drilling complex wells efficiently in less time, which directly translates to lower costs for the company. In addition to significant cost benefits, automated technology that predicts anomalies and reacts in real time translates to improved safety, because it requires fewer operators at risk on the rig floor. The work opens up avenues for a sophisticated advisory/warning system and effective ‘look-ahead’ drilling processes in the future.
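The gating mechanism that lets LSTMs outperform plain RNNs on such look-ahead tasks can be shown with a single scalar LSTM cell step (the fixed toy weights below are chosen to demonstrate state retention over many steps, and are unrelated to any trained vibration model):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, w):
    """One time step for a scalar-state LSTM cell.
    w maps gate name -> (w_x, w_h, bias)."""
    gate = lambda g: sigmoid(w[g][0] * x + w[g][1] * h + w[g][2])
    f = gate("forget")                  # how much old cell state to keep
    i = gate("input")                   # how much new candidate to write
    o = gate("output")                  # how much cell state to expose
    cand = math.tanh(w["cand"][0] * x + w["cand"][1] * h + w["cand"][2])
    c_new = f * c + i * cand            # additive update: gradients survive
    h_new = o * math.tanh(c_new)
    return h_new, c_new

w = {"forget": (0.0, 0.0, 4.0),        # forget gate ~1: state is preserved
     "input": (0.0, 0.0, -4.0),        # input gate ~0: little is overwritten
     "output": (0.0, 0.0, 4.0),
     "cand": (1.0, 0.0, 0.0)}
h, c = 0.0, 1.0
for t in range(50):                     # cell state survives 50 noisy steps
    h, c = lstm_step(0.1, h, c, w)
print(round(c, 2))                      # still well above zero
```

In a plain RNN the repeated multiplication by the recurrent weight shrinks (or blows up) this signal exponentially; the near-unity forget gate is what makes 10- or 30-second-ahead prediction tractable.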
APA, Harvard, Vancouver, ISO, and other styles