
Journal articles on the topic 'Google cloud projects'



Consult the top 50 journal articles for your research on the topic 'Google cloud projects.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Zulpukarova, D. "Google Applications in Student’s Self-learning." Bulletin of Science and Practice 5, no. 12 (2019): 420–30. http://dx.doi.org/10.33619/2414-2948/49/52.

Abstract:
Problem and rationale. Despite the spread of information technology across many areas of modern society, the possibilities of cloud technology are still little used in university education. The article substantiates the effectiveness of using Google applications to organize students' self-learning. Methodology. In researching the problem, we used analysis of theoretical and methodological literature, study of best practices in teaching mathematics and computer science with information technology, analysis of the advantages and disadvantages of using cloud technology in training, delivery of computer science courses using Google applications, and management of the independent learning of first-year students of Osh State University using cloud technology. Results. The capabilities of the specialized services Google Docs, Google Sheets, Google Slides and Google Forms for creating documents, presentations, spreadsheets, online questionnaires and online tests have been defined. Using Google services in training helps students develop skills in working with cloud technologies; students demonstrate three levels of proficiency with cloud technologies: high, medium and low. Conclusions. By working with cloud technologies, students can build ICT and self-organization skills. Using Google applications, it is possible to monitor self-learning and student projects, both while assignments are being completed and when final results are achieved, which improves the teacher's workflow and saves time.
2

Barreiro Megino, Fernando, Mikhail Borodin, Kaushik De, et al. "Accelerating science: The usage of commercial clouds in ATLAS Distributed Computing." EPJ Web of Conferences 295 (2024): 07002. http://dx.doi.org/10.1051/epjconf/202429507002.

Abstract:
The ATLAS experiment at CERN is one of the largest scientific machines built to date and will have ever-growing computing needs as the Large Hadron Collider collects an increasingly large volume of data over the next 20 years. ATLAS is conducting R&D projects on Amazon Web Services and Google Cloud as complementary resources for distributed computing, focusing on some of the key features of commercial clouds: lightweight operation, elasticity and availability of multiple chip architectures. The proof-of-concept phases have concluded with the cloud-native, vendor-agnostic integration with the experiment's data and workload management frameworks. Google Cloud has been used to evaluate elastic batch computing, ramping up ephemeral clusters of up to O(100k) cores to process tasks requiring quick turnaround. Amazon Web Services has been exploited for the successful physics validation of the Athena simulation software on ARM processors. We have also set up an interactive facility for physics analysis allowing end-users to spin up private, on-demand clusters for parallel computing with up to 4,000 cores, or run GPU-enabled notebooks and jobs for machine learning applications. The success of the proof-of-concept phases has led to the extension of the Google Cloud project, where ATLAS will study the total cost of ownership of a production cloud site during 15 months with 10k cores on average, fully integrated with distributed grid computing resources, and continue the R&D projects.
3

Wilson, Deirdre. "A Collaborative Story Writing Project Using Google Docs and Face-to-Face Collaboration." Canadian Journal of Learning and Technology 49, no. 3 (2025): 1–21. https://doi.org/10.21432/cjlt28174.

Abstract:
The Google Docs application is part of Google Workspace for Education, a suite of cloud-based productivity and collaboration tools that are now ubiquitous in middle and high school classrooms. While there is an expanding body of research documenting the benefits of using Google Docs to support collaborative writing projects, there exist few qualitative studies detailing how cloud-based tools are integrated into courses that meet face-to-face on an ongoing basis. This case study explores how an experienced high school English teacher facilitated a collaborative writing project in which students used Google Docs to co-write a story. The students were instructed to work on their stories asynchronously from home and synchronously during face-to-face classes. Data sources included field notes from class observations, reflections written by the teacher, semi-structured interviews with the teacher, focus group interviews with the students, and the shared Google Docs. This article describes affordances and constraints associated with the pedagogical supports provided during the collaborative writing process and offers recommendations for teachers who intend to use Google Docs to facilitate collaborative writing projects.
4

Atri, Preyaa. "Enabling AI Workflows: A Python Library for Seamless Data Transfer between Elasticsearch and Google Cloud Storage." Journal of Artificial Intelligence, Machine Learning and Data Science 1, no. 1 (2022): 489–91. https://doi.org/10.51219/JAIMLD/preyaa-atri/132.

Abstract:
This paper introduces a Python library designed to accelerate AI workflows by facilitating seamless data transfer between Elasticsearch, a powerful search engine for unstructured data, and Google Cloud Storage (GCS), a scalable cloud storage platform. By automating the migration of large datasets from Elasticsearch to GCS, the library empowers AI researchers and practitioners to efficiently leverage cloud-based resources for model training, preprocessing, and analysis. This research delves into the library's features, dependencies, usage patterns, and its potential to enhance data management efficiency in AI-driven projects. Additionally, the paper discusses the library's limitations and proposes future enhancements to further streamline AI development pipelines.
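As an illustration of the kind of Elasticsearch-to-GCS transfer the paper describes, the following sketch uses the official elasticsearch and google-cloud-storage Python clients; it is not the cited library itself, and the host, index, and bucket names are placeholders.

```python
# Simplified, in-memory sketch of an Elasticsearch -> Google Cloud Storage export.
# Not the cited library; host, index, and bucket names are placeholders.
import json

from elasticsearch import Elasticsearch, helpers
from google.cloud import storage

def export_index_to_gcs(es_host: str, index: str, bucket_name: str, blob_name: str) -> int:
    """Scan an Elasticsearch index and upload its documents to GCS as JSON Lines."""
    es = Elasticsearch(es_host)
    client = storage.Client()  # relies on Application Default Credentials
    bucket = client.bucket(bucket_name)

    # helpers.scan streams documents from Elasticsearch; this sketch still
    # accumulates them in memory, so very large indices would need chunked uploads.
    lines = []
    for doc in helpers.scan(es, index=index, query={"query": {"match_all": {}}}):
        lines.append(json.dumps(doc["_source"]))

    bucket.blob(blob_name).upload_from_string("\n".join(lines), content_type="application/json")
    return len(lines)

# Example call (hypothetical host, index and bucket names):
# n = export_index_to_gcs("http://localhost:9200", "training-data", "my-ai-bucket", "exports/training.jsonl")
```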
5

Khan, Nawsher, A. Noraziah, Elrasheed I. Ismail, Mustafa Mat Deris, and Tutut Herawan. "Cloud Computing." International Journal of E-Entrepreneurship and Innovation 3, no. 2 (2012): 51–59. http://dx.doi.org/10.4018/jeei.2012040104.

Abstract:
Cloud computing is fundamentally altering expectations for how and when computing, storage, and networking resources should be allocated, managed, and consumed, and it allows users to utilize services globally. Thanks to its powerful computing and storage, high availability and security, easy accessibility and adaptability, reliable scalability and interoperability, and cost and time effectiveness, cloud computing is in high demand in today's fast-growing business world. A client, organization, or business adopting the emerging cloud environment can choose a suitable infrastructure, platform, software, and network resource for any line of business, each with its own exclusive features and advantages. The authors first develop a comprehensive classification for describing cloud computing architecture. This classification helps in a survey of several existing cloud computing services developed globally by various projects, such as Amazon, Google, Microsoft, Sun and Force.com, and using the survey's results the authors identify similarities and differences in the architectural approaches to cloud computing.
6

Osadcha, Kateryna P., and Viacheslav V. Osadchyi. "The use of cloud computing technology in professional training of future programmers." CTE Workshop Proceedings 8 (March 19, 2021): 155–64. http://dx.doi.org/10.55056/cte.229.

Abstract:
The article provides a brief analysis of the current state of the study of cloud technologies by future software engineers at foreign and Ukrainian universities. The authors' experience of applying cloud technologies in the training of future software engineers in Ukraine is presented. The application of cloud business automation systems, online services for monitoring the implementation of software projects, and Google services for collaboration, planning and productivity while studying professional disciplines and carrying out diploma projects is described. Based on a survey conducted on Stack Overflow, the state of application of cloud technologies by software engineers around the world has been analyzed. The cloud technologies that are not studied at the analyzed universities of Ukraine, as well as those that are not popular with software developers worldwide but are studied by future software engineers at Ukrainian universities, are outlined. Conclusions are made on the modernization of training programs for future software engineers. Topics for the study of cloud technologies by future software engineers within professional disciplines are proposed.
7

Bhargav, Bachina. "Deploying Python APIs on GCP GKE with HELM: A Comprehensive Guide." European Journal of Advances in Engineering and Technology 8, no. 11 (2021): 28–38. https://doi.org/10.5281/zenodo.10901249.

Abstract:
Kubernetes has rapidly become a cornerstone technology in modern IT infrastructure, with widespread adoption across various industries. However, deploying applications on Kubernetes entails managing numerous objects such as deployments, configmaps, and secrets, each defined in manifest files. While this approach works for initial deployments, it becomes cumbersome for repeated deployments. Enter Helm, the Kubernetes package manager designed to simplify application deployment, enhance security, and provide configurability. Helm streamlines the deployment process by enabling users to package and manage Kubernetes applications efficiently. In this paper, we explore deploying Python APIs on Google Cloud Platform's Google Kubernetes Engine (GKE) using Helm, covering essential prerequisites, Dockerization, container image pushing, Helm chart creation, GKE cluster configuration, Helm chart installation, accessing the deployed API, cluster cleanup, and concluding insights. Through this exploration, we aim to provide a comprehensive guide for deploying applications on Kubernetes with Helm, empowering users to leverage the full potential of Kubernetes in their projects.
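For context, a minimal Python API of the kind the guide deploys might look like the sketch below; the routes and port are illustrative assumptions, and the Dockerfile, image push, and Helm chart the paper covers are separate artifacts not shown here.

```python
# Minimal Flask API suitable for containerization and deployment on GKE.
# Routes and port are illustrative assumptions, not taken from the paper.
from flask import Flask, jsonify

app = Flask(__name__)

@app.get("/healthz")
def healthz():
    # Lightweight target for Kubernetes liveness/readiness probes.
    return jsonify(status="ok")

@app.get("/api/v1/hello")
def hello():
    return jsonify(message="Hello from GKE")

if __name__ == "__main__":
    # Inside a container this would typically run behind gunicorn; the built-in
    # development server is used here only for local testing.
    app.run(host="0.0.0.0", port=8080)
```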
8

Venica, Liptia, Muhammad Ayyas, Muchamad Amin Maezun Ta'sin Billah, Rista Dewi Opsantini, and Syaifi Al-Mahfudzi. "Introduction to Cloud Computing Fundamental for Teachers and Students of SMK Negeri 1 Karangdadap Kabupaten Pekalongan." REKA ELKOMIKA: Jurnal Pengabdian kepada Masyarakat 4, no. 3 (2023): 232–42. http://dx.doi.org/10.26760/rekaelkomika.v4i3.232-242.

Abstract:
The use of cloud computing technology is currently increasingly widespread. The purpose of this community service is to introduce vocational high school students and teachers to cloud computing and implement it to solve computational problems. Participants are introduced to cloud architecture, the advantages of using cloud, various cloud service providers, and different categories of products that the cloud service providers deliver. During the training, participants are provided with the case studies on the use of three products available on Google Cloud Platform (GCP), i.e., Compute Engine, Cloud Storage, and Cloud Vision API. As a result, participants are able to create virtual machines using Compute Engine, publish static web pages using Cloud Storage, and create an application to detect the name and location of a landmark from an image using Cloud Vision API. After this activity, participants are expected to be able to explore other cloud products available and use them to create their own projects.
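The landmark-detection case study can be sketched with the official google-cloud-vision client as follows; the image path is a placeholder and the snippet assumes Application Default Credentials are configured.

```python
# Hedged sketch of the Cloud Vision landmark-detection exercise described in
# the abstract. The image path is a placeholder.
from google.cloud import vision

def detect_landmark(path: str) -> None:
    client = vision.ImageAnnotatorClient()  # uses Application Default Credentials
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    response = client.landmark_detection(image=image)
    for landmark in response.landmark_annotations:
        # Each annotation carries a description and one or more lat/long locations.
        for location in landmark.locations:
            coords = location.lat_lng
            print(f"{landmark.description}: {coords.latitude}, {coords.longitude}")

# detect_landmark("monument.jpg")
```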
9

Kratzke, Nane. "Volunteer Down: How COVID-19 Created the Largest Idling Supercomputer on Earth." Future Internet 12, no. 6 (2020): 98. http://dx.doi.org/10.3390/fi12060098.

Abstract:
From close to scratch, the COVID-19 pandemic created the largest volunteer supercomputer on earth. Sadly, processing resources assigned to the corresponding Folding@home project cannot be shared with other volunteer computing projects efficiently. Consequently, the largest supercomputer had significant idle times. This perspective paper investigates how the resource sharing of future volunteer computing projects could be improved. Notably, efficient resource sharing has been optimized throughout the last ten years in cloud computing. Therefore, this perspective paper reviews the current state of volunteer and cloud computing to analyze what both domains could learn from each other. It turns out that the disclosed resource sharing shortcomings of volunteer computing could be addressed by technologies that have been invented, optimized, and adapted for entirely different purposes by cloud-native companies like Uber, Airbnb, Google, or Facebook. Promising technologies might be containers, serverless architectures, image registries, distributed service registries, and all have one thing in common: They already exist and are all tried and tested in large web-scale deployments.
10

Karmana, Putu Dian, I. Putu Bagus Ambara Yasa, Gusti Komang Tri Wismana, and I Putu Buda Suyasa. "Rancangan Pengarsipan Laporan Digital pada Instansi Dinas Kebudayaan Provinsi Bali Berbasis Google Site." JIS SIWIRABUDA 2, no. 1 (2024): 11–16. https://doi.org/10.58878/jissiwirabuda.v2i1.282.

Abstract:
The Design of Digital Report Archiving at the Cultural Office of Bali Province based on Google Site aims to enhance efficiency and accessibility in document management. Leveraging the Google Site platform, this design integrates cloud-based data storage with collaborative features, facilitating the archiving and retrieval processes. Digital report documents, encompassing cultural activities and related projects, can be swiftly uploaded, organized, and accessed through a user-friendly interface. Data security is ensured through well-managed access rights. With the implementation of this design, the Cultural Office of Bali Province anticipates optimizing operational effectiveness, reducing paper usage, and fostering collaboration among team members in the management of digital reports.
11

DUMBIRI, Rita, Sandra Ihuoma BABATOPE, and Princes Clementina OWAIRU. "Awareness and Utilisation of Cloud Computing Technologies for Library Services: A Survey of Librarians in Colleges of Education Libraries in Nigeria." Niger Delta Journal of Library and Information Science 5, no. 2 (2024): 108–24. https://doi.org/10.5281/zenodo.14787308.

Abstract:
Purpose of the Study: The study's purpose is to investigate the cloud computing technologies used by librarians and report on the concerns related to implementing cloud computing technologies in selected colleges of education libraries in Nigeria. Methodology: Data was collected from 116 para-professional and professional librarians working in 11 selected Federal Colleges of Education in Nigeria. Findings: The results of the study show that YouTube, Google Drive, OneDrive, Gmail, and Google Scholar are the cloud computing technologies most commonly used by librarians in colleges of education libraries in Nigeria. The study also found that librarians use cloud computing technologies to store and share files, share videos related to library orientations and other video content, and collaborate with other librarians on research projects. The study further found that adopting cloud computing technologies has advantages such as storage capacity, access to files from anywhere, better performance, data security, automatic software updates, and lower maintenance costs. The findings will inform librarians and information professionals about the role played by cloud computing in developing information services and how to use it. Recommendations/Conclusion: The study concludes by providing a better understanding and application of cloud computing to advance the provision of library services to users of college libraries and other libraries worldwide. Study Type: Research paper
12

Kramarenko, T. G. "Training of teacher to school students collaborate with educational resources of cloud based projects." CTE Workshop Proceedings 4 (March 21, 2017): 206–10. http://dx.doi.org/10.55056/cte.351.

Abstract:
The aim of the research is to find contemporary approaches to organizing future teachers' collaborative work with educational resources. The objectives of the study are to analyze the possibilities of establishing joint network work among students and to improve the course "Information and communication mathematics learning tools" for blended learning. The object of the research is the process of methodical training of future teachers of mathematics and computer science. The subject of the research is the use of cloud-based projects in the methodical training of future teachers. The paper analyzes the problems of network cooperation in various cloud-based services, including GeoGebra, LearningApps, Prezi and Google. The results of the study are planned to be generalized into recommendations on organizing network cooperation among future teachers.
13

Huang, Po-Jung, Jui-Huan Chang, Hou-Hsien Lin, et al. "DeepVariant-on-Spark: Small-Scale Genome Analysis Using a Cloud-Based Computing Framework." Computational and Mathematical Methods in Medicine 2020 (September 1, 2020): 1–7. http://dx.doi.org/10.1155/2020/7231205.

Abstract:
Although sequencing a human genome has become affordable, identifying genetic variants from whole-genome sequence data is still a hurdle for researchers without adequate computing equipment or bioinformatics support. GATK is a gold standard method for the identification of genetic variants and has been widely used in genome projects and population genetic studies for many years. That was the case until the Google Brain team developed a new method, DeepVariant, which utilizes deep neural networks to construct an image classification model to identify genetic variants. However, the superior accuracy of DeepVariant comes at the cost of computational intensity, largely constraining its applications. Accordingly, we present DeepVariant-on-Spark to optimize resource allocation, enable multi-GPU support, and accelerate the processing of the DeepVariant pipeline. To make DeepVariant-on-Spark more accessible to everyone, we have deployed DeepVariant-on-Spark to the Google Cloud Platform (GCP). Users can deploy DeepVariant-on-Spark on the GCP following our instructions within 20 minutes and start to analyze at least ten whole-genome sequencing datasets using free credits provided by the GCP. DeepVariant-on-Spark is freely available for small-scale genome analysis using a cloud-based computing framework, which is suitable for pilot testing or preliminary study, while reserving the flexibility and scalability for large-scale sequencing projects.
14

Rodrigues Ramos, Juliano, and Tamiris Fernanda Malacrida. "PROCESSOS, MÉTODOS E PRÁTICAS DE ENGENHARIA DE SOFTWARE EM PROJETOS SOFTWARE LIVRE: UM ESTUDO DE CASO OWNCLOUDE NEXTCLOUD." COLLOQUIUM EXACTARUM 10, no. 2 (2018): 53–59. http://dx.doi.org/10.5747/ce.2018.v10.n2.e238.

Abstract:
Software engineering practices aim to improve software quality, which includes Open Source Software (OSS). This work aims to identify software engineering practices in the OSS projects OwnCloud and Nextcloud. We perform a scientific literature review using digital databases such as Google Scholar and the CAPES Periodicals Portal, and also search the gray literature. We conclude that good software engineering practices are applied in both cloud storage projects, OwnCloud and Nextcloud. However, we found that the major focus is still on coding, and therefore other development processes, such as documentation, effective control of change management and quality management processes, should receive more emphasis.
15

Pečeliūnaitė, Angelė. "Debesų kompiuterija: darbas, bendradarbiavimas ir komunikacija. Ar debesis tenkina studentų ir mokslininkų poreikius?" Informacijos mokslai 55 (January 1, 2011): 117–30. http://dx.doi.org/10.15388/im.2011.0.3165.

Abstract:
Information and communication technologies are evolving and improving rapidly. Cloud computing is another innovation that began to spread quickly at the beginning of the 21st century. Cloud computing is the possibility of using hardware and software over the Internet and paying only for how much of the service was actually used. Cloud computing services were initially oriented toward business. The aim of this article is to review cloud technologies and the opportunities they offer the academic community for studies, research, collaboration and scholarly communication. The services offered by three corporations are analysed, with attention to the variety of services, the possibilities for preparing projects, conferences, etc., the security of the technology, mobility and the price of the services. The conclusions of the study emphasise that cloud computing is attractive for research and studies. The cloud can be used for studies, research work, collaboration and scholarly communication; it is only important to choose a cloud that meets one's needs. Keywords: cloud technologies, cloud computing, public and private cloud, virtual private network, SaaS, PaaS, IaaS, virtualization processes, data centres, Microsoft Office 365, Microsoft Web Apps, Zoho cloud, Google Apps for studies, scholarly communication.
Cloud Computing: The Job, Co-operation and Communication. Does the Cloud Meet the Needs of Students and Scholars? Angelė Pečeliūnaitė. Summary: Information and communication technologies are rapidly evolving and progressing. Cloud computing is one of the innovations that began spreading rapidly from the beginning of the 21st century. The article summarizes the cloud computing paradigm, gives an introduction to cloud computing platforms and the services offered by cloud providers, and highlights the attractive features of this technology. The collaboration and communication methods in the cloud are discussed. Cloud computing services were primarily focused on business. The aim of the article is an overview of cloud technologies and the opportunities they offer the academic community in studies, research, collaboration and scientific communication. Delic divides research into three significant periods: empirical, theoretical, and experimental/simulation (Delic et al., 2010, p. 3). This article falls within the third phase: an experimental study comparing the services of three selected cloud providers. We analyze the cloud services of Zoho Web Apps, Microsoft Web Apps and Google Apps for Education according to five selected categories (information gathered from web sites and from company experts): the variety of services offered for working with documents and projects in education, scientific communication and collaboration, data protection, mobility and price. The results show that cloud services are attractive to the educational community. The largest variety of services and performance improvements are offered by the Microsoft and Zoho clouds on the SaaS and PaaS platforms; Zoho is the leader among these cloud services. The most attractive environment for scientific communication and collaboration (including mobility) is the Microsoft cloud. All service providers take care of data protection, with an SLA guarantee of 99.9%. Zoho uses 256-bit SSL web encryption, and Microsoft 128-bit SSL/TLS. Google Apps for Education is in many ways behind the above-mentioned corporations, but Google's cloud services are offered to meet students' needs and are provided free of charge. The investigation was conducted on December 1–15, 2010.
16

Omojola, Sesan, and Kenechi Okeke. "Cloud-Based Solutions for Scalable Non-profit Project Management Systems." Advances in Research 26, no. 2 (2025): 418–27. https://doi.org/10.9734/air/2025/v26i21309.

Abstract:
Objective: This study explores the feasibility of applying cloud computing to develop scalable, low-cost project management systems for non-profit organizations. It examines the advantages, limitations, and probable uses of cloud-based technology in optimizing project coordination, resource allocation, and impact assessment. Study Design: A comprehensive review of literature on cloud-based non-profit project management solutions, with specific focus on research published between 2019 and 2024. Methodology: The research is based on a systematic literature review using Google Scholar as well as the Scopus, IEEE Xplore, and ScienceDirect databases. The selected articles cover cloud computing within non-profit management, cost-effectiveness, security concerns, and digital transformation for enhancing operational effectiveness. Results: The review identifies 14 studies that illustrate the application of cloud computing for managing non-profit projects. Results highlight that cloud systems improve collaboration, data security, cost-effectiveness, and accessibility. Conversely, risks to data privacy, dependency on internet access, and a lack of technical understanding are concerns for most non-profit organizations. Conclusions: Cloud computing has emerged as a transformative solution for non-profits to accomplish efficient and scalable project management. Although the advantages of cloud systems are well recognized, future research needs to focus on creating measures to mitigate security concerns and enhance digital competence among non-profit staff to unlock the complete potential of these technologies.
17

Pflanzner, Tamas, Hamza Baniata, and Attila Kertesz. "Latency Analysis of Blockchain-Based SSI Applications." Future Internet 14, no. 10 (2022): 282. http://dx.doi.org/10.3390/fi14100282.

Abstract:
Several revolutionary applications have been built on the distributed ledgers of blockchain (BC) technology. Besides cryptocurrencies, we can find many other application fields in smart systems exploiting smart contracts and Self Sovereign Identity (SSI) management. The Hyperledger Indy platform is a suitable open-source solution for realizing permissioned BC systems for SSI projects. SSI applications usually require short response times from the underlying BC network, which may vary highly depending on the application type, the used BC software, and the actual BC deployment parameters. To support the developers and users of SSI applications, we present a detailed latency analysis of a private permissioned BC system built with Indy and Aries. To streamline our experiments, we developed a Python application using containerized Indy and Aries components from official Hyperledger repositories. We deployed our experimental application on multiple virtual machines in the public Google Cloud Platform and on our local, private cloud using a Docker platform with Kubernetes. We evaluated and compared their performance with the metrics of reading and writing response latency. We found that the local Indy ledger reads 30–50% faster, and writes 65–85% faster than the Indy ledger running on the Google Cloud Platform.
18

Elidrisy, Ahmed. "Leveraging Cloud Services & Digital Transformation for Sustainability: Insights from Cases of Qatar." Journal of Innovative Research 2, no. 1 (2024): 20–28. http://dx.doi.org/10.54536/jir.v2i1.2398.

Abstract:
The current research aimed to present an in-depth analysis of how the Qatar Free Zone (QFZ), Qatar Smart Program (TASMU), Qatar National Broadband Network (Qnbn), and Msheireb Downtown Doha are leveraging cloud services and digital transformation for sustainability under Qatar National Vision 2030 (QNV2030). The researcher used a qualitative case study approach, gathering data through semi-structured interviews with 21 sustainability experts, government officials, and IT professionals. The findings highlighted that Qatar's QFZ and Qnbn projects meet the principles of SDG 8: Decent Work and Economic Growth. Results showed that Qnbn offers multiple benefits for the economic development of multiple sectors. All four cases align with two SDGs, SDG 9: Industry, Innovation, and Infrastructure and SDG 11: Sustainable Cities and Communities, to meet the requirements of QNV2030 and sustainability for building a Smart Qatar. Projects like Msheireb Downtown Doha and QFZ are also founded on four key pillars (social, human, economic and environmental development) that meet QNV2030 objectives. In addition, cloud services offer climate-related innovations facilitating environmental monitoring, resource optimisation, and related capabilities through the partnership between QFZ and Google Cloud, aligned with SDG 13: Climate Action.
19

Dilip Prakash Valanarasu and Er. Shubham Jain. "Cloud Migration of ATG E-commerce to GCP: Process, challenges, and outcomes of migrating an ATG-based e-commerce application to Google Cloud." International Journal for Research Publication and Seminar 16, no. 2 (2025): 63–69. https://doi.org/10.36676/jrps.v16.i1.50.

Abstract:
Cloud migration projects are increasingly vital for modernizing legacy applications and enhancing business agility. This paper examines the comprehensive process of migrating an ATG-based e-commerce application to Google Cloud Platform (GCP). The migration strategy encompassed thorough planning, analysis, and implementation stages that addressed key aspects of scalability, performance optimization, and security enhancement. Initially, a detailed assessment of the existing ATG framework identified potential bottlenecks and compatibility issues with cloud environments. Subsequently, a robust migration roadmap was developed to mitigate risks and ensure a seamless transition, emphasizing data integrity and minimal service disruption. The process involved refactoring monolithic components into microservices, integrating containerization technologies, and leveraging GCP's advanced orchestration, monitoring, and automation tools. Throughout the migration journey, several challenges emerged, including legacy code dependencies, integration complexities, and the necessity to comply with stringent industry standards. Addressing these obstacles required iterative testing, close stakeholder collaboration, and agile project management practices. Ultimately, the outcomes demonstrated improved operational efficiency, enhanced system resilience, and a significant reduction in infrastructure costs. Furthermore, the transition enabled the e-commerce platform to capitalize on cloud-native features such as dynamic scaling.
20

Barisits, Martin, Robert Barnsley, Fernando Harald Barreiro Megino, et al. "Extending Rucio with modern cloud storage support." EPJ Web of Conferences 295 (2024): 01030. http://dx.doi.org/10.1051/epjconf/202429501030.

Abstract:
Rucio is a software framework designed to facilitate scientific collaborations in efficiently organising, managing, and accessing extensive volumes of data through customizable policies. The framework enables data distribution across globally distributed locations and heterogeneous data centres, integrating various storage and network technologies into a unified federated entity. Rucio offers advanced features like distributed data recovery and adaptive replication, and it exhibits high scalability, modularity, and extensibility. Originally developed to meet the requirements of the high-energy physics experiment ATLAS, Rucio has been continuously expanded to support LHC experiments and diverse scientific communities. Recent R&D projects within these communities have evaluated the integration of both private and commercially-provided cloud storage systems, leading to the development of additional functionalities for seamless integration within Rucio. Furthermore, the underlying systems, FTS and GFAL/Davix, have been extended to cater to specific use cases. This contribution focuses on the technical aspects of this work, particularly the challenges encountered in building a generic interface for self-hosted cloud storage, such as MinIO or CEPH S3 Gateway, and established providers like Google Cloud Storage and Amazon Simple Storage Service. Additionally, the integration of decentralised clouds like SEAL is explored. Key aspects, including authentication and authorisation, direct and remote access, throughput and cost estimation, are highlighted, along with shared experiences in daily operations.
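The "generic object-store interface" idea can be illustrated with a small, hedged sketch: the same S3-style client code can target a self-hosted MinIO or CEPH S3 gateway and Google Cloud Storage's S3-compatible interoperability endpoint, changing only the endpoint URL and credentials. The endpoint and bucket names are placeholders, and Rucio's own storage layer (FTS, GFAL/Davix) is not reproduced here.

```python
# One S3-style client, two backends: a self-hosted gateway and GCS via its
# S3-compatible XML API with HMAC keys. Endpoints and bucket are placeholders.
import boto3

def make_client(endpoint_url: str, access_key: str, secret_key: str):
    return boto3.client(
        "s3",
        endpoint_url=endpoint_url,
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
    )

# Self-hosted gateway (e.g. MinIO or a CEPH S3 gateway):
# minio = make_client("https://minio.example.org:9000", "ACCESS", "SECRET")

# Google Cloud Storage via its interoperability endpoint and HMAC keys:
# gcs = make_client("https://storage.googleapis.com", "HMAC_ACCESS_ID", "HMAC_SECRET")

# Identical calls work against either backend:
# for obj in minio.list_objects_v2(Bucket="atlas-data").get("Contents", []):
#     print(obj["Key"], obj["Size"])
```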
21

Carthen, Chase D., Araam Zaremehrjardi, Vinh Dac Le, et al. "A Novel Spatial Data Pipeline for Streaming Smart City Data." International Journal of Software Innovation 12, no. 1 (2024): 1–15. http://dx.doi.org/10.4018/ijsi.359180.

Abstract:
Point cloud data from light detection and ranging (LiDAR) is often used for its spatial qualities, particularly in smart city projects involving vehicles and pedestrians. In this paper, the authors introduce a streaming and an on-demand pipeline for capturing LiDAR data from Velodyne Ultra Pucks placed along nine northern Nevada intersections known as the Living Lab, part of a smart city project for the City of Reno. This pipeline is an iteration of a previously proposed pipeline with several feature enhancements. A streaming point cloud service was implemented to stream Point Cloud Data (PCD), LASzip (LAZ), and Google Draco. In addition, two web services were built for packet capture (PCAP) and ROS 2 bag files that enable acquisition of LiDAR data in these formats. A metadata service tracks edge device states and a GraphQL service interfaces with multiple services across the Living Lab. Draco provided the best processing time and had more options affecting the quality of the point cloud. To evaluate the pipeline, a discussion and an analysis of the point cloud formats are provided.
22

Sonata, Fifin, Hendra Jaya, Muhammad Syahril, Muhammad Dahria, and Rudi Gunawan. "IMPLEMENTASI PROGRAM MSIB BATCH 7 PADA KEGIATAN STUDI INDEPENDEN DI BANGKIT ACADEMY 2024 BY GOOGLE, GOTO, TRAVELOKA YAYASAN DICODING INDONESIA." JCES | FKIP UMMat 8, no. 1 (2025): 98. https://doi.org/10.31764/jces.v8i1.28457.

Abstract:
Yayasan Dicoding Indonesia (YDI) is a startup company that aims to develop the developer ecosystem in Indonesia. YDI has an e-learning platform at Dicoding.com. One of YDI's learning and training programs is Bangkit Academy 2024, organized by Google, GoTo, and Traveloka. The purpose of this program is to provide opportunities for Indonesian students to develop skills in the technology field. The MSIB program offers two career paths, Cloud Computing and Mobile Development (Android), both of which cover the Android Learning Path, Cloud Computing Learning Path and Machine Learning Learning Path. Program participants have the opportunity to learn from experts in their fields and work on real-world projects under the guidance of mentors from well-known technology companies. In addition, the program offers participants the opportunity to receive certification from Google. The MSIB program requires a comprehensive curriculum that includes theoretical learning, real-world projects, and mentoring. As a result of the program, participants gain an in-depth understanding of their chosen field, strong practical skills, and certification that supports their expertise. In conclusion, the MSIB Bangkit Academy 2024 program provides great benefits for participants in developing skills and knowledge in the technology field. By collaborating with technology companies and experienced mentors, participants are able and ready to pursue a career in the technology industry with better job opportunities.
23

Melnikov, Oleksii. "Organization and storage of large datasets for ai training." Smart technologies: Industrial and Civil Engineering 3, no. 16 (2025): 29–39. https://doi.org/10.32347/st.2025.3.1203.

Abstract:
The article examines current approaches to organizing and storing large datasets used for training artificial intelligence (AI) models, particularly for radio signal recognition tasks. It emphasizes the need for reliable and efficient data storage solutions due to the increasing scale and complexity of data involved in AI applications. The study analyzes various data storage formats (HDF5, WAV, IQ raw data), provides an overview of cloud-based solutions (AWS, Google Cloud, Azure), local storage systems (SAN, NAS, JBOD), and distributed file systems. It discusses best practices in data versioning and cataloging for enhanced accessibility, performance, and reproducibility of AI training processes. Recommendations are provided for selecting optimal storage methods based on the specific requirements of AI projects and the characteristics of the processed data.
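One storage pattern discussed in the article, packaging IQ samples in HDF5 and pushing the file to cloud object storage, can be sketched as follows with h5py and the google-cloud-storage client; the dataset, attribute, bucket, and path names are illustrative assumptions.

```python
# Illustrative sketch: write a batch of complex IQ samples and labels to HDF5,
# then upload the file to Google Cloud Storage. Names and metadata are assumptions.
import h5py
import numpy as np
from google.cloud import storage

def save_and_upload(iq_samples: np.ndarray, labels: np.ndarray,
                    bucket_name: str, blob_path: str,
                    local_path: str = "batch.h5") -> None:
    # Store IQ data and labels together with gzip compression so a training
    # job can later read them back selectively.
    with h5py.File(local_path, "w") as f:
        f.create_dataset("iq", data=iq_samples, compression="gzip")
        f.create_dataset("labels", data=labels, compression="gzip")
        f.attrs["sample_rate_hz"] = 2_000_000  # example metadata attribute

    client = storage.Client()  # Application Default Credentials
    client.bucket(bucket_name).blob(blob_path).upload_from_filename(local_path)

# Example with synthetic data (hypothetical bucket and object names):
# iq = (np.random.randn(64, 4096) + 1j * np.random.randn(64, 4096)).astype(np.complex64)
# save_and_upload(iq, np.zeros(64, dtype=np.int8), "rf-datasets", "train/batch_000.h5")
```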
24

Seidl, Fabian, Lauren Hagen, Jacob Wilson, et al. "Abstract 1067: The ISB Cancer Gateway in the Cloud (ISB-CGC): access, explore and analyze large-scale cancer data through the Google Cloud." Cancer Research 85, no. 8_Supplement_1 (2025): 1067. https://doi.org/10.1158/1538-7445.am2025-1067.

Abstract:
Abstract Rapid growth of cancer data in recent decades has made data discovery and wrangling difficult for the average cancer research lab. Our mission at the ISB Cancer Gateway in the Cloud (ISB-CGC), part of the NCI’s Cancer Research Data Commons ecosystem, is to democratize access to large cancer datasets. Funded by the NCI, we have performed ETL processes on data from GDC and PDC projects such as TCGA, TARGET, and CPTAC. We generated hundreds of publicly available BigQuery tables containing data such as mutations, gene expression, and protein abundance, which enable data analysis in the cloud via SQL. BigQuery analyses are inexpensive and rapid even when scaled to petabyte sized inputs, for example in one analysis 6.6 billion correlations were computed in 2.5 hours with a total cost of about one dollar. These data can also be accessed affordably from Google Cloud VMs where researchers can develop analysis pipelines in Python, R, and workflow languages. We will also highlight recently improved accessibility to the Mitelman Database, added and expanded HTAN single cell data and notebook selections, and updates to our BigQuery Table Search tool. Citation Format: Fabian Seidl, Lauren Hagen, Jacob Wilson, Boris Aguilar, Deena Bleich, Lauren Wolfe, Poojitha Gundluru, Suzanne Paquette, Elaine Lee, Danna Huffman, William Longabaugh, David Pot. The ISB Cancer Gateway in the Cloud (ISB-CGC): access, explore and analyze large-scale cancer data through the Google Cloud [abstract]. In: Proceedings of the American Association for Cancer Research Annual Meeting 2025; Part 1 (Regular Abstracts); 2025 Apr 25-30; Chicago, IL. Philadelphia (PA): AACR; Cancer Res 2025;85(8_Suppl_1):Abstract nr 1067.
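The SQL-in-the-cloud workflow described above can be sketched with the official google-cloud-bigquery client; the table and column names below are placeholders rather than the actual ISB-CGC schema, which should be looked up via the project's BigQuery Table Search tool.

```python
# Hedged sketch of querying cancer data tables in BigQuery from Python.
# The table and column names are placeholders, not the real ISB-CGC schema.
from google.cloud import bigquery

client = bigquery.Client()  # queries are billed to your own Google Cloud project

query = """
    SELECT gene_symbol, AVG(expression_value) AS mean_expression
    FROM `my-project.my_dataset.gene_expression`   -- placeholder table ID
    GROUP BY gene_symbol
    ORDER BY mean_expression DESC
    LIMIT 10
"""

for row in client.query(query).result():
    print(row.gene_symbol, row.mean_expression)
```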
25

Trad, Antoine. "Enterprise Transformation Projects-Cloud Transformation Concept Holistic Security Integration (CTC-HSI)." WSEAS TRANSACTIONS ON COMPUTERS 21 (December 30, 2022): 343–64. http://dx.doi.org/10.37394/23205.2022.21.41.

Abstract:
This chapter presents the fundamentals of the Cloud Transformation Concept (CTC), a basic component of the author's transformation framework for complex transformation projects, where a holistic security concept is a top priority. The implementation of CTC's Holistic Security Integration (CTC-HSI) is supported by the author's Applied Holistic Mathematical Model (AHMM) for CTC (AHMM4CTC) and his various research works on Holistic Security Integration (HSI), Business Process Management (BPM), Artificial Intelligence (AI), AI Services (AIS), Compute Services (CS), mathematical models, and cross-functional transformation projects. The AHMM4CTC is based on cross-functional research using an authentic and proprietary mixed research method supported by the author's own version of an AI learning model, a search tree, combined with an internal heuristic algorithm. In this chapter, the focus is on CTC-HSI's integration concepts, requirements, services, data management, and corresponding transformation security strategies. The proposed AHMM4CTC-based CTC-HSI is a concept for secured environments that uses real-life cases from a transformation project requiring a scalable and secured Cloud Platform (CP) infrastructure and services layer, supported by the alignment of CP services, standards, the enterprise architecture paradigm, and security development strategies. In a CP-based transformation project, the author recommends integrating a Private CP (PCP), which can use commercial CPs like the Google CP (GCP). The GCP was chosen as a sample CP, but there is a need to define a standardized security architecture and concept/procedures so that the organization builds its own CTC-HSI-based PCP and avoids lock-in to commercial products.
26

Seidl, Fabian, Lauren Hagen, Jacob Wilson, et al. "Abstract 3547: The ISB Cancer Gateway in the Cloud (ISB-CGC): Access, explore and analyze large-scale cancer data through the Google Cloud." Cancer Research 84, no. 6_Supplement (2024): 3547. http://dx.doi.org/10.1158/1538-7445.am2024-3547.

Abstract:
Abstract Rapid growth of cancer data in recent decades has made data discovery and wrangling difficult for the average cancer research lab. Our mission at the ISB Cancer Gateway in the Cloud (ISB-CGC), part of the NCI’s Cancer Research Data Commons ecosystem, is to democratize access to large cancer datasets. Funded by the NCI, we have performed ETL processes on data from GDC and PDC projects such as TCGA, TARGET, and CPTAC. We generated hundreds of BigQuery tables containing data such as mutations, gene expression, and protein abundance, which enable data analysis in the cloud via SQL. BigQuery analyses are inexpensive and rapid even when scaled to petabyte sized inputs, for example we ran 6.6 billion correlations in 2.5 hours with a total cost of about one dollar. These data can also be accessed affordably from Google Cloud VMs where researchers can develop analysis pipelines in Python, R, and workflow languages such as CWL. We present two recent collaborations: In one BigQuery was used to develop machine learning algorithms that calculated genetic risk scores from TCGA glioblastoma and ovarian cancer copy number variation. In another example researchers combined SQL queries of our BQ tables with data from the ISPY2 Trial initiative and generated an R shiny app that can dynamically create data visualizations for genes of interest in different TCGA cohorts. Citation Format: Fabian Seidl, Lauren Hagen, Jacob Wilson, Boris Aguilar, Deena Bleich, Lauren Wolfe, Poojitha Gundluru, Prema Venkatesan, Mi Tian, Suzanne Paquette, Elaine Lee, Danna Huffman, David Pot, William Longabaugh. The ISB Cancer Gateway in the Cloud (ISB-CGC): Access, explore and analyze large-scale cancer data through the Google Cloud [abstract]. In: Proceedings of the American Association for Cancer Research Annual Meeting 2024; Part 1 (Regular Abstracts); 2024 Apr 5-10; San Diego, CA. Philadelphia (PA): AACR; Cancer Res 2024;84(6_Suppl):Abstract nr 3547.
27

Hsu, Jim, Paul Christensen, and Scott Long. "RecutClub 2.0: Deployment of a Trainee-Led, Didactic-Centered Whole Slide Imaging Platform." American Journal of Clinical Pathology 158, Supplement_1 (2022): S14—S15. http://dx.doi.org/10.1093/ajcp/aqac126.025.

Abstract:
Abstract Objectives The Houston Methodist Pathology training program encourages trainee participation in quality improvement (QI) projects for improving the didactics experience. One of these initiatives is RecutClub, a whole slide imaging (WSI) web-based platform for presenting surgical pathology unknown conference cases. With the evolving nature of pathology didactics in an ever-changing world where remote work plays an increasingly important role, several refinements were needed to ensure that the platform could continue to provide an unparalleled WSI experience for incoming trainees. Additionally, the burden of updating site content needed to be distributed from the site administrators to the end-users. Methods RecutClub is operated through Google Cloud via a Google App Engine instance. It connects to Google Cloud Databases to store persistent tabular data and Google Cloud Storage for pyramidal JPEG files used to display whole slide images in a web browser. In the next-generation platform (RecutClub 2.0), the largest change was to migrate case uploading from a developer-controlled solution to a decentralized, trainee-driven web portal. To ensure a superior user experience, the entire site as well as the upload form was migrated to Bootstrap 4. An AJAX-native, asynchronous uploader was built to interface the existing Google Cloud platform to a new upload portal that allows authorized trainees to build cases from scratch. Additionally, a tagging solution was implemented to allow for server-side persistent storage of trainee-selected cases, such as "interesting" cases or cases for further review. This tagging complements the existing region of interest (ROI) and case search functionalities. Results The deployment of RecutClub 2.0 involved multiple rounds of resident-led testing and feedback. Several regular testers offered ongoing user feedback and suggestions for visual layout and usability. The response from the general trainee population after deployment was overwhelmingly positive, with many trainees commenting on the "improved" visual aesthetics as well as the convenience of the upload functionality. Administrator involvement in case-upload time decreased from 2 hours/month to 0 hours/month due to enabling trainee-driven uploading. Conclusions The deployment of RecutClub 2.0 fulfilled the dual primary objectives of enabling straightforward trainee-led content updates, as well as decreasing administrator involvement in case-upload time. These and other changes ensure that RecutClub can continue to provide an unparalleled WSI experience for pathology trainees at Houston Methodist, while also providing fresh content in a landscape with several competing WSI platforms.
28

Cascalheira, João, Nuno Bicho, and Célia Gonçalves. "A Google-Based Freeware Solution for Archaeological Field Survey and Onsite Artifact Analysis." Advances in Archaeological Practice 5, no. 4 (2017): 328–39. http://dx.doi.org/10.1017/aap.2017.21.

Abstract:
This paper introduces a new freeware digital system, based on Google/Android platforms, designed to be a fully integrated and customizable solution to record, manage, and share archaeological survey data. The core of the system is two custom smartphone/tablet applications, through which surveyors are able to retrieve geographical coordinates and relevant attribute data from archaeological locations, but also to perform onsite analysis of artifacts, including taking accurate measurements with digital calipers directly connected to the mobile devices. The system saves all data recovered in the devices' internal memory, as well as in a cloud-based spatial database (Google Fusion Tables), where data can be automatically shared and examined using a rather intuitive set of visualization tools to instantly make maps or produce exploratory charts. Using the example of a recent field survey project for Stone Age sites in Mozambique, we provide a detailed discussion of the creation and use of all hardware and software components of our solution that will allow other researchers to reproduce the methodology and customize the system to meet the needs of their own projects.
29

Gopi, K. Naga, K. Thirupathi Rao, K. Chaitanya, and R. Venkatesh. "Weighted Rank Query Search over Cloud Data Storage." International Journal of Advances in Applied Sciences 5, no. 2 (2016): 85. http://dx.doi.org/10.11591/ijaas.v5.i2.pp85-93.

Abstract:
This paper describes and addresses the problem of multi-keyword ranked search over encrypted cloud data (MRSE) while preserving strict system-wise privacy in the cloud computing model. Data owners are motivated to outsource their complex data management systems from local sites to the commercial public cloud for greater flexibility and economic savings. However, to protect data privacy, sensitive data must be encrypted before outsourcing, which obsoletes traditional data utilization based on plaintext keyword search. To better support users in their long-term information quests on the Web, search engines such as Google keep track of their queries and clicks while they search online. In this paper, we study the problem of organizing a user's historical search queries into groups in a dynamic and automated fashion. Automatically identifying query groups is helpful for a number of different search engine components and applications, such as query suggestions, result ranking, query alterations, sessionization, and collaborative search. In our approach, we go beyond methods that rely on textual similarity or time thresholds, and we propose a more robust technique that leverages search query logs.
30

Царев, Ю. В., and Е. И. Рыжкова. "DETERMINATION OF FORMALDEHYDE CONTENT IN THE AIR OF YAROSLAVL ACCORDING TO SENTINEL-5P SATELLITE DATA." Южно-Сибирский научный вестник, no. 6(52) (December 31, 2023): 31–35. http://dx.doi.org/10.25699/sssb.2023.52.6.050.

Abstract:
Assessing the concentration of formaldehyde in the atmospheric air of populated areas is an important part of research into the impact of enterprises on residential territories. Formaldehyde data from the Tropomi instrument of the Sentinel-5P satellite, provided by the European Space Agency as part of the Copernicus program, are grouped into datasets that require processing. To extract formaldehyde data from these datasets for subsequent analysis and decision-making, Google provides the Google Earth Engine (GEE) cloud service, which allows them to be processed in the Java programming language. The article presents a cloud application for assessing the formaldehyde content in atmospheric air; the assessment results are demonstrated for the city of Yaroslavl. The study area can be changed by adjusting the program parameters. A distinctive feature of the cloud solution is the possibility of collaborative work on projects related to environmental monitoring. The result of the development is a cloud application in the Java language that flexibly extracts from the dataset formaldehyde concentrations for a specific date, or averaged over a week, a month, etc., and allows the size of the assessed territory to be defined. The application can load the required set of input data and extract formaldehyde observations from 2018 to the present. Using the cloud service and the corresponding application makes it possible to obtain the necessary data on the state of formaldehyde air pollution. The data obtained can be used for territorial planning and for environmental and geographical surveys, including assessment of the environmental situation.
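A minimal illustration of the kind of data extraction this abstract describes, using the Google Earth Engine Python API rather than the authors' own application; the dataset and band names are the standard Sentinel-5P HCHO entries in the GEE catalog, while the date range and the bounding box around Yaroslavl are illustrative assumptions.

```python
# Minimal sketch (not the authors' application): monthly mean tropospheric HCHO
# over a study area, using the Google Earth Engine Python API.
import ee

ee.Initialize()  # assumes an authenticated Earth Engine account

# Assumed study area: a rectangle roughly covering Yaroslavl (illustrative coordinates).
region = ee.Geometry.Rectangle([39.7, 57.5, 40.1, 57.8])

# Sentinel-5P TROPOMI offline HCHO product in the GEE data catalog.
hcho = (ee.ImageCollection('COPERNICUS/S5P/OFFL/L3_HCHO')
        .select('tropospheric_HCHO_column_number_density')
        .filterDate('2023-06-01', '2023-07-01')
        .filterBounds(region))

# Average the monthly observations and reduce them over the study area.
stats = hcho.mean().reduceRegion(
    reducer=ee.Reducer.mean(),
    geometry=region,
    scale=1113,  # roughly the native grid of the L3 product
    maxPixels=1e9)

print('Mean HCHO column (mol/m^2):', stats.getInfo())
```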
APA, Harvard, Vancouver, ISO, and other styles
31

Sobchenko, Tetiana, Natalia Smolianiuk, Violetta Panchenko, Tetiana Tverdokhlib, and Svitlana Dotsenko. "Teaching «Fundamentals of Health» with the application of cloud technologies." Revista Amazonia Investiga 11, no. 49 (2022): 36–46. http://dx.doi.org/10.34069/ai/2022.49.01.4.

Full text
Abstract:
The aim of the article is to analyze the results of teaching the subject «Fundamentals of Health» in grades 5-9 in the conditions of distance learning with the application of cloud technologies. The following methods were used for the study: videoconferences, explanations, interviews, surveys, tests for formative and summative assessment, performing practical tasks, works, projects, compiling instructions to tasks, creating educational presentations and videos. The article substantiates the possibility of teaching the subject «Fundamentals of Health» in grades 5-9 using cloud technology and educational platforms in the conditions of distance learning, such as: Google Classroom, Zoom, Meet, Edpuzzle. The results of 5-9-grade students’ academic performance were analyzed and it was found that grades 5, 6, and 9 showed a fairly high level of academic achievement, while for the students of 7th and 8th grades this indicator was quite low. The obtained results are explained by the difference in motivation, experience of blended learning and students’ age psychological peculiarities, parents’ inclusion in the educational process, various types of practical tasks. The prospects for further use of distance education are highlighted in the article. They include application of tried and tested cloud technologies, systematic and comprehensive professional development of teachers; teacher-parent cooperation, preparing guidelines for students’ parents; adapting types of tasks and their content to the conditions of distance and blended learning; launching teachers’ own YouTube channels.
APA, Harvard, Vancouver, ISO, and other styles
32

Panforova, Iryna, and Viktor Shutko. "Development of a parametric estimation method of the operations duration of the IT project of migration of the information system to the cloud platform." Management Information System and Devices, no. 180 (May 22, 2024): 27–36. http://dx.doi.org/10.30837/0135-1710.2024.180.027.

Full text
Abstract:
The process of migration of information systems (IS) to the cloud, which is an urgent problem for many modern enterprises, is considered. The subject of the research is methods for estimating the duration of operations in the implementation of an IT project of migrating information systems to the cloud. Several main estimation methods are considered: Expert Judgment, Analogous Estimation and the Program Evaluation and Review Technique (PERT). Among these, the PERT method was chosen as the one that best adapts to temporal uncertainty. The advantages and disadvantages of the original PERT method are considered. It is found that focusing on the most probable scenario, with minimal consideration of the influence of other scenarios, may not reflect all possible cases when estimating the duration of IT project operations. Attention is paid to existing modifications, the purpose of their development and their adaptability to estimating the duration of operations for IT projects of migrating information systems to the cloud. The existing modification of the method (Modified PERT) is analysed with respect to its ability to solve the problem of focusing on the most likely scenario. Based on the results of the preliminary study, a parametric estimation method based on PERT is proposed to solve the identified problem. The accuracy of estimating the duration of operations of an IT project of migrating an IS to Google Cloud is compared for the original PERT method and the developed parametric estimation method using the example of three tasks. It is shown experimentally that, for situations where the context of the operation is not based on the most probable scenario, the developed method provides a more accurate estimate than the original one. The change in the probability distribution curve of the operation duration is illustrated for the original PERT method and for the parametric estimation method based on PERT.
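The parametric method proposed in the paper is not reproduced here; as a reference point, the following sketch shows the classic three-point PERT estimate it builds on, applied to a hypothetical migration task.

```python
# Classic three-point PERT estimate (the baseline the paper modifies; the
# authors' parametric method itself is not reproduced here).

def pert_estimate(optimistic: float, most_likely: float, pessimistic: float):
    """Return expected duration and standard deviation under the conventional
    PERT/beta assumptions: E = (O + 4M + P) / 6, sigma = (P - O) / 6."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    sigma = (pessimistic - optimistic) / 6
    return expected, sigma

# Hypothetical migration task: copying a database to the cloud platform (days).
e, s = pert_estimate(optimistic=3, most_likely=5, pessimistic=12)
print(f"Expected duration: {e:.2f} days, std. dev.: {s:.2f} days")
```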
APA, Harvard, Vancouver, ISO, and other styles
33

Budaev, Evgeny S., Dmitry A. Sobolev, and Boris S. Budaev. "DEVELOPMENT OF A MOBILE APPLICATION TO ENSURE THE MOVEMENT OF THE LOW-MOBILITY POPULATION." SOFT MEASUREMENTS AND COMPUTING 11/2, no. 72 (2023): 59–70. http://dx.doi.org/10.36871/2618-9976.2023.11-2.006.

Full text
Abstract:
Currently, society is launching many projects aimed at improving the lives of people with disabilities, and the tendency to transform attitudes towards citizens with disabilities in the direction of loyalty is one of the main sociocultural determinants affecting the relationship of a disabled person with the surrounding social environment. The IT sector, not wanting to lag behind this noble movement of assistance and support for the disabled, generates a huge number of both software and technical innovations aimed at improving the quality of life of people with disabilities. The article describes an IT solution in the form of a mobile application that allows a route to be set from the point of departure to the destination while bypassing difficult places. The development was carried out using the Java programming language, the Android Studio development environment, the Google Maps Platform for integrating maps into the project, and the Firebase cloud database.
APA, Harvard, Vancouver, ISO, and other styles
34

Mention, Anne-Laure, João José Pinto Ferreira, and Marko Torkkeli. "Moonshot innovations: Wishful Thinking or Business-As-Usual?" Journal of Innovation Management 7, no. 1 (2019): 1–6. http://dx.doi.org/10.24840/2183-0606_007.001_0001.

Full text
Abstract:
‘Our mind-set will be to avoid the moonshot’ said Boeing CEO James McNerney at a Wall Street analysts meeting in Seattle nearly 5 years ago (see Gates, 2014). The ambitious, exploratory and risky endeavour dubbed a moonshot project, the Boeing 787 Dreamliner, had sunk billions of dollars in an industry where end-users demanded more comfort and convenience for less cost. According to McNerney, moonshots do not work in a price-sensitive environment. It is argued that they also tend to take the focus away from more immediate value capture opportunities, as seen through Google’s loss on its core Cloud Platform to Amazon Web Services (AWS). Google’s parent company Alphabet, which oversees Google X (a semi-secret moonshot project lab), more recently reported that it had incurred a US$1.3 billion operating loss on moonshot projects with a sizeable increase in compensation of employees and executives working on these projects (Alphabet, 2018). Notably, none of the Google X lab spin-outs (e.g. Loon – a balloon-based internet project, Waymo – self-driving car project, Wing – drone delivery project) have been identified as commercially viable. Despite the uncertainties and failures, the focus on moonshot innovations continues to proliferate in academia (Kaur, Kaur and Singh, 2016; Strong and Lynch, 2018) and practice (Martinez, 2018). Yourdon (1997) even wrote an interesting book on perseverance and tenacity to keep going even after failed projects. Proponents of moonshot thinking have claimed that it can help solve society’s biggest challenges (e.g. cure cancer, see Kovarik, 2018), with some suggesting encouraging such thinking by paying failure bonuses (Figueroa, 2018). Yet others remain sceptical, positing that moonshot is ‘awesome and pointless’ (Haigh, 2019, p.4). A proverbial question, thus, emerges: are moonshot innovations simply wishful thinking or can they be part of business-as-usual? In part, the answer may be two-fold – 1) understanding the value of moonshot thinking, and 2) understanding moonshot challenges. (...)
APA, Harvard, Vancouver, ISO, and other styles
35

Nastiti, Kartika Rizqi, Ahmad Fathan Hidayatullah, and Ahmad Rafie Pratama. "Discovering Computer Science Research Topic Trends using Latent Dirichlet Allocation." Jurnal Online Informatika 6, no. 1 (2021): 17. http://dx.doi.org/10.15575/join.v6i1.636.

Full text
Abstract:
Before conducting a research project, researchers must find the trends and state of the art in their research field. However, that is not necessarily an easy job for researchers, partly due to the lack of specific tools to filter the required information by time range. This study aims to provide a solution to that problem by performing a topic modeling approach on data scraped from Google Scholar between 2010 and 2019. We utilized Latent Dirichlet Allocation (LDA) combined with Term Frequency-Inverse Document Frequency (TF-IDF) to build topic models and employed the coherence score method to determine how many different topics there are for each year’s data. We also provided a visualization of the topic interpretation and word distribution for each topic, as well as its relevance, using word clouds and PyLDAvis. In the future, we expect to add more features to show the relevance and interconnections between each topic to make it even easier for researchers to use this tool in their research projects.
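A minimal sketch of the pipeline the abstract outlines (TF-IDF weighting, LDA, coherence-based choice of topic count) using gensim; the toy corpus and parameter values are illustrative, not those of the study.

```python
# Minimal sketch of the LDA + TF-IDF + coherence workflow described in the abstract,
# using gensim; the toy corpus and parameter values are illustrative only.
from gensim.corpora import Dictionary
from gensim.models import TfidfModel, LdaModel, CoherenceModel

docs = [
    ["cloud", "computing", "resource", "scheduling", "cloud"],
    ["cloud", "storage", "data", "security"],
    ["deep", "learning", "image", "classification"],
    ["machine", "learning", "model", "training"],
    ["topic", "modeling", "text", "mining"],
]

dictionary = Dictionary(docs)
bow = [dictionary.doc2bow(d) for d in docs]
tfidf_corpus = TfidfModel(bow)[bow]           # TF-IDF weighted corpus

best = None
for k in range(2, 4):                          # try several topic counts
    lda = LdaModel(tfidf_corpus, num_topics=k, id2word=dictionary,
                   random_state=0, passes=10)
    score = CoherenceModel(model=lda, texts=docs, dictionary=dictionary,
                           coherence="c_v").get_coherence()
    if best is None or score > best[0]:
        best = (score, k, lda)

print(f"Best coherence {best[0]:.3f} with {best[1]} topics")
```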
APA, Harvard, Vancouver, ISO, and other styles
36

Niakan Kalhori, Sharareh R., Kambiz Bahaadinbeigy, Kolsoum Deldar, Marsa Gholamzadeh, Sadrieh Hajesmaeel-Gohari, and Seyed Mohammad Ayyoubzadeh. "Digital Health Solutions to Control the COVID-19 Pandemic in Countries With High Disease Prevalence: Literature Review." Journal of Medical Internet Research 23, no. 3 (2021): e19473. http://dx.doi.org/10.2196/19473.

Full text
Abstract:
Background COVID-19, the disease caused by the novel coronavirus SARS-CoV-2, has become a global pandemic, affecting most countries worldwide. Digital health information technologies can be applied in three aspects, namely digital patients, digital devices, and digital clinics, and could be useful in fighting the COVID-19 pandemic. Objective Recent reviews have examined the role of digital health in controlling COVID-19 to identify the potential of digital health interventions to fight the disease. However, this study aims to review and analyze the digital technology that is being applied to control the COVID-19 pandemic in the 10 countries with the highest prevalence of the disease. Methods For this review, the Google Scholar, PubMed, Web of Science, and Scopus databases were searched in August 2020 to retrieve publications from December 2019 to March 15, 2020. Furthermore, the Google search engine was used to identify additional applications of digital health for COVID-19 pandemic control. Results We included 32 papers in this review that reported 37 digital health applications for COVID-19 control. The most common digital health projects to address COVID-19 were telemedicine visits (11/37, 30%). Digital learning packages for informing people about the disease, geographic information systems and quick response code applications for real-time case tracking, and cloud- or mobile-based systems for self-care and patient tracking were in the second rank of digital tool applications (all 7/37, 19%). The projects were deployed in various European countries and in the United States, Australia, and China. Conclusions Considering the potential of available information technologies worldwide in the 21st century, particularly in developed countries, it appears that more digital health products with a higher level of intelligence capability remain to be applied for the management of pandemics and health-related crises.
APA, Harvard, Vancouver, ISO, and other styles
37

Solovei, Olha, Tetiana Honcharenko, and Anatolii Fesan. "Technologies to manage big data of urban building projects." Management of Development of Complex Systems, no. 60 (November 29, 2024): 121–28. https://doi.org/10.32347/2412-9933.2024.60.121-128.

Full text
Abstract:
The transformation of the construction industry according to the Construction 4.0 concept is possible only if technology for managing the big data of construction projects is available, where big data management includes the tasks of data collection, processing, updating, backup and storage. Today, the information technologies of urban construction projects form a set of integrated software systems, and construction project data remain stored in various data repositories, which makes it difficult, and sometimes impossible, to use them for project implementation. The choice of big data management technologies depends, among other things, on the types of big data that are characteristic of the project. The purpose of this work is to define a list of technologies for managing various types of data of urban construction projects to enable their use in the automation of urban construction projects. To achieve this goal, the work analysed the types and formats of data in the information systems of urban construction projects, namely: business process management systems; stakeholder interaction systems; labour protection and risk management systems in construction; operation management systems; systems for designing and creating models of spatial objects; augmented and virtual reality (VR/AR) systems; and systems for engineering analysis. Based on the analysis of data types and formats, it was determined that the data belong to the structured, semi-structured and unstructured categories. A study of existing work on managing big data of particular types made it possible to compile a list of technologies for managing the data of urban construction projects, namely: 1) for data collection – Apache Kafka, Apache HBase, Apache Spark, Apache Hadoop, Stream Analytics, Scrapy, Twitter API, Facebook Graph API; 2) for data processing – "intelligent text analysis" and "computer vision" technologies, machine and deep learning; 3) for data storage – AWS S3, AWS RDS SQL, Azure Data Lake, HDFS, Redis, CosmosDB, MongoDB, Azure Blob Storage; 4) for backup – AWS Backup, Google Cloud Storage, Microsoft Azure Backup, MongoDB Backup, Cassandra Backup; 5) for updating – Apache Kafka/Flink/Spark Streaming, SQL. Further research will analyse the effectiveness of the methods of these technologies for solving the data management tasks of construction projects, depending on the nature of the input data.
APA, Harvard, Vancouver, ISO, and other styles
38

Thomas, Arun, Sheethal Shaji, Shery Shaju, Silji Simon C, and Vishnu Narayan V. "Question Bank Maker." International Journal of Trend in Scientific Research and Development 3, no. 3 (2019): 1592–95. https://doi.org/10.31142/ijtsrd23208.

Full text
Abstract:
The project is a question bank maker. It helps students prepare for exams by learning the important topics. Referring to previous question papers is part of exam preparation, but doing so only after covering all the topics is often too late. If all the questions are sorted and arranged module-wise, it becomes easier to go through previous questions. The question bank maker sorts questions module-wise, year-wise, exam-wise and mark-wise, so students can easily review previous questions as part of their exam preparation. In the future, the question bank maker can be fully automated to collect question papers after exams, identify the chapter or module of each question by itself with machine learning, and generate files automatically. Question bank maker is more than a simple question creator: question papers can be uploaded in PDF format and the questions sorted module-wise, mark-wise and exam-wise. Dr. Arun Thomas | Sheethal Shaji | Shery Shaju | Silji Simon C | Vishnu Narayan V, "Question Bank Maker", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-3, Issue-3, April 2019, URL: https://www.ijtsrd.com/papers/ijtsrd23208.pdf
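A minimal sketch of the sorting idea only; the record fields and sample questions below are hypothetical, since the project's actual data schema is not given in the abstract.

```python
# Minimal sketch of module-wise / mark-wise / exam-wise sorting of question records;
# the field names and sample data are hypothetical placeholders.
from itertools import groupby

questions = [
    {"module": 2, "marks": 10, "exam": "May 2018", "text": "Explain normalization."},
    {"module": 1, "marks": 2,  "exam": "Dec 2018", "text": "Define a primary key."},
    {"module": 2, "marks": 2,  "exam": "May 2019", "text": "What is a foreign key?"},
]

# Sort module-wise, then mark-wise, then exam-wise, and group for display.
questions.sort(key=lambda q: (q["module"], q["marks"], q["exam"]))
for module, items in groupby(questions, key=lambda q: q["module"]):
    print(f"Module {module}")
    for q in items:
        print(f"  [{q['marks']} marks, {q['exam']}] {q['text']}")
```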
APA, Harvard, Vancouver, ISO, and other styles
39

Karishma, Hakim, Jangewad Supriya, Maindarkar Priyanka, and T.G Thite. "Review of Software and Hardware IoT Platforms." Journal of Information Technology and Sciences 5, no. 2 (2019): 17–23. https://doi.org/10.5281/zenodo.3192914.

Full text
Abstract:
The Internet of Things (IoT) is a system of interconnected computing devices, mechanical and digital machines, objects, animals or people that are provided with unique identifiers and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction. There are various consumer, commercial and industrial applications of IoT. In this study we present a short survey of the most popular IoT software and hardware platforms available on the market. These platforms are used to create modern IoT projects in startups and scientific projects, as well as other useful devices for smart homes, smart cities, agriculture, etc.
APA, Harvard, Vancouver, ISO, and other styles
40

Vaishali, Nagpure. "Capacity Planning and Resource Utilization in Large-Scale IT Projects - Data-Driven Approach: A Survey." INTERNATIONAL JOURNAL OF INNOVATIVE RESEARCH AND CREATIVE TECHNOLOGY 9, no. 5 (2023): 1–11. https://doi.org/10.5281/zenodo.14281940.

Full text
Abstract:
Capacity planning and resource utilization are essential aspects of managing large-scale IT systems, particularly in cloud environments, e-commerce platforms, ride-sharing services, and industrial manufacturing systems. These processes are crucial for ensuring that IT resources are used efficiently, costs are minimized, and system performance is maintained under varying demand conditions. Traditional methods of resource allocation often fall short in addressing the dynamic and complex nature of modern IT environments, necessitating more adaptive, data-driven approaches. This survey explores the application of machine learning (ML), optimization techniques, and orchestration tools in the context of large-scale IT projects. It provides a comprehensive analysis of how data-driven methods, including predictive analytics and real-time monitoring, are used to forecast demand, optimize resource allocation, and enhance system efficiency. Use cases are examined from diverse domains, such as predicting server load in e-commerce platforms, optimizing driver allocation in ride-sharing services, minimizing energy consumption in manufacturing, and scaling resources in cloud environments. Key technologies such as TensorFlow for predictive modeling, Google OR-Tools for optimization, and Kubernetes for container orchestration are discussed. The survey includes real-world examples and detailed workflows, illustrating how machine learning models can be deployed for demand forecasting, resource allocation, and autoscaling in production environments. Furthermore, it presents advanced visualizations to demonstrate the insights gained from data, such as heatmaps for resource allocation mismatches and time-series plots for server load predictions. In addition to the theoretical underpinnings, this survey provides practical guidance for deploying these techniques using platforms like AWS, Kubernetes, and Prometheus. It also covers optimization techniques such as linear programming and dynamic programming, showcasing how these methods are applied to solve real-world resource management problems. The paper concludes by emphasizing the importance of continuous monitoring, evaluation, and feedback loops to refine capacity planning strategies over time. Finally, future directions are explored, focusing on emerging trends like edge computing, federated learning, and sustainability in IT resource management. This survey serves as a comprehensive guide for researchers and practitioners looking to enhance the scalability, efficiency, and sustainability of large-scale IT projects.
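As a small illustration of the optimization techniques the survey mentions, the following sketch formulates a toy resource-allocation problem as a linear program with Google OR-Tools; all numbers are assumptions, not figures from the survey.

```python
# Hypothetical linear-programming sketch with Google OR-Tools: allocate CPU cores
# between two services to maximize handled requests under a capacity limit.
from ortools.linear_solver import pywraplp

solver = pywraplp.Solver.CreateSolver("GLOP")

# Decision variables: cores given to service A and service B.
a = solver.NumVar(0, solver.infinity(), "cores_a")
b = solver.NumVar(0, solver.infinity(), "cores_b")

solver.Add(a + b <= 64)          # total cores available
solver.Add(a >= 8)               # minimum reservation for service A
solver.Add(b >= 4)               # minimum reservation for service B

# Assumed throughput per core (requests/s): 120 for A, 90 for B.
solver.Maximize(120 * a + 90 * b)

if solver.Solve() == pywraplp.Solver.OPTIMAL:
    print(f"cores_a={a.solution_value():.1f}, cores_b={b.solution_value():.1f}, "
          f"throughput={solver.Objective().Value():.0f} req/s")
```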
APA, Harvard, Vancouver, ISO, and other styles
41

Agapiou, Athos. "Multi-Temporal Change Detection Analysis of Vertical Sprawl over Limassol City Centre and Amathus Archaeological Site in Cyprus during 2015–2020 Using the Sentinel-1 Sensor and the Google Earth Engine Platform." Sensors 21, no. 5 (2021): 1884. http://dx.doi.org/10.3390/s21051884.

Full text
Abstract:
Urban sprawl can negatively impact the archaeological record of an area. In order to study the urbanisation process and its patterns, satellite images have been used in the past to identify land-use changes and detect individual buildings and constructions. However, this approach involves the acquisition of high-resolution satellite images, the cost of which increases with the size of the area under study as well as with the time interval of the analysis. In this paper, we implement a quick, automatic and low-cost exploration of large areas for this purpose, aiming to provide a medium-resolution overview of landscape changes. This study focuses on using radar Sentinel-1 images to monitor and detect multi-temporal changes during the period 2015–2020 in Limassol, Cyprus. In addition, the big data cloud platform Google Earth Engine was used to process the data. Three different change detection methods were implemented in this platform, as follows: (a) vertical transmit, vertical receive (VV) and vertical transmit, horizontal receive (VH) polarisation pseudo-colour composites; (b) the Rapid and Easy Change Detection in Radar Time-Series by Variation Coefficient (REACTIV) Google Earth Engine algorithm; and (c) a multi-temporal Wishart-based change detection algorithm. The overall findings are presented for the wider area of the city of Limassol, with special focus on the archaeological site of “Amathus” and the Limassol city centre. For validation purposes, satellite images from the multi-temporal archive of the Google Earth platform were used. The methods mentioned above were able to capture the urbanisation process that was initiated in the city during this period due to recent large construction projects.
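A rough sketch of the first listed method (VV/VH pseudo-colour composites) in the Google Earth Engine Python API; the Sentinel-1 GRD catalog entry, band names and metadata filters are standard, while the Limassol bounding box, date ranges and band-to-channel mapping are illustrative assumptions rather than the paper's exact workflow.

```python
# Minimal sketch (GEE Python API): VV/VH pseudo-colour change composite over Limassol;
# coordinates, dates and channel mapping are illustrative, not the paper's workflow.
import ee

ee.Initialize()

limassol = ee.Geometry.Rectangle([32.95, 34.66, 33.10, 34.75])

def s1_median(start, end):
    """Median Sentinel-1 GRD backscatter (IW mode, dual-pol) for a date range."""
    return (ee.ImageCollection('COPERNICUS/S1_GRD')
            .filterBounds(limassol)
            .filterDate(start, end)
            .filter(ee.Filter.eq('instrumentMode', 'IW'))
            .filter(ee.Filter.listContains('transmitterReceiverPolarisation', 'VV'))
            .filter(ee.Filter.listContains('transmitterReceiverPolarisation', 'VH'))
            .select(['VV', 'VH'])
            .median())

before = s1_median('2015-01-01', '2015-12-31')
after = s1_median('2020-01-01', '2020-12-31')

# RGB pseudo-colour composite: R = VV after, G = VV before, B = VH after.
composite = ee.Image.cat([after.select('VV'), before.select('VV'), after.select('VH')])
print(composite.bandNames().getInfo())
```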
APA, Harvard, Vancouver, ISO, and other styles
42

Anaya, Jesús A., Víctor H. Gutiérrez-Vélez, Ana M. Pacheco-Pascagaza, Sebastián Palomino-Ángel, Natasha Han, and Heiko Balzter. "Drivers of Forest Loss in a Megadiverse Hotspot on the Pacific Coast of Colombia." Remote Sensing 12, no. 8 (2020): 1235. http://dx.doi.org/10.3390/rs12081235.

Full text
Abstract:
Tropical forests are disappearing at unprecedented rates, but the drivers behind this transformation are not always clear. This limits the decision-making processes and the effectiveness of forest management policies. In this paper, we address the extent and drivers of deforestation of the Choco biodiversity hotspot, which has not received much scientific attention despite its high levels of plant diversity and endemism. The climate is characterized by persistent cloud cover, which is a challenge for land cover mapping from optical satellite imagery. By using Google Earth Engine to select pixels with minimal cloud content and applying a random forest classifier to Landsat and Sentinel data, we produced a wall-to-wall land cover map, enabling a diagnosis of the status and drivers of forest loss in the region. Analyses of these new maps, together with information on illicit crops and alluvial mining, uncovered the pressure on intact forests. According to Global Forest Change (GFC) data, 2324 km2 were deforested in this area from 2001 to 2018, reaching a maximum in 2016 and 2017. We found that 68% of the area is covered by broadleaf forests (67,473 km2) and 15% by shrublands (14,483 km2), the latter with enormous potential to promote restoration projects. This paper provides a new insight into the conservation of this exceptional forest with a discussion of the drivers of forest loss, where illicit crops and alluvial mining were found to be responsible for 60% of forest loss.
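The classification step described here could look roughly as follows in the Google Earth Engine Python API; the Landsat collection, band list and, in particular, the training-point asset path are hypothetical placeholders, not the authors' data.

```python
# Rough sketch of a random-forest land-cover classification in GEE (Python API);
# the composite, band list and training asset are hypothetical placeholders.
import ee

ee.Initialize()

bands = ['SR_B2', 'SR_B3', 'SR_B4', 'SR_B5', 'SR_B6', 'SR_B7']  # Landsat 8 C2 L2 reflectance
composite = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
             .filterDate('2018-01-01', '2018-12-31')
             .median()
             .select(bands))

# Hypothetical labelled points with a 'landcover' property (e.g. 0 = forest, 1 = non-forest).
training_points = ee.FeatureCollection('users/example/choco_training_points')

training = composite.sampleRegions(collection=training_points,
                                   properties=['landcover'], scale=30)

classifier = ee.Classifier.smileRandomForest(100).train(
    features=training, classProperty='landcover', inputProperties=bands)

classified = composite.classify(classifier)
```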
APA, Harvard, Vancouver, ISO, and other styles
43

Onyshchenko, Nataliіa. "Training of Future Teachers by Means of Innovative Technologies in the Process of Studying Pedagogical Disciplines." Education and Pedagogical Sciences, no. 1 (176) (2021): 72–80. http://dx.doi.org/10.12958/2227-2747-2021-1(176)-72-80.

Full text
Abstract:
The article reveals the features of training future teachers by means of innovative technologies in the process of studying pedagogical disciplines. The article describes innovative pedagogical technologies that are being intensively introduced in modern higher education institutions (distance learning technologies (most relevant in the context of the COVID-19 pandemic), a credit-modular system for organizing training, game, information and computer, interactive, multimedia, dialogue and communication, project technologies, technology of problem learning, etc.). It has been established that among the pedagogical technologies used in the study of pedagogical disciplines, multimedia technologies are very popular. It is emphasized that the training methodology, case method and portfolio method help the formation of communicative skills, the creation of a benevolent atmosphere of communication in practical and seminar lessons. It has been determined that the design technology provides for a systematic and consistent modelling of solutions to problem situations that require research efforts from the participants in the educational process aimed at research and design of the best ways to create projects, their defence and analysis of the results. It has been noted that Internet projects and cloud technologies (Google Apps, Office 365 services, file storage, electronic literature, web applications and distance learning support systems Moodle and Blackboard) are relevant in the study of pedagogical disciplines, which are aimed at increasing the level of students’ speech skills and help to learn to receive a significant amount of information.
APA, Harvard, Vancouver, ISO, and other styles
44

Morgunov, G. I., and A. V. Orlovsky. "QGIS WEB CLIENT 2 – NEW TECHNOLOGY FOR WEB GEODATA PUBLISHING." ECOLOGY ECONOMY INFORMATICS. GEOINFORMATION TECHNOLOGIES AND SPACE MONITORING 2, no. 6 (2021): 28–31. http://dx.doi.org/10.23885/2500-123x-2021-2-6-28-31.

Full text
Abstract:
This paper focuses on Quantum GIS (QGIS), an open-source cross-platform application that allows geospatial data in various formats to be visualised. The paper outlines the main advantages of the QGIS web application (second version), QGIS Web Client 2: free distribution; free access to the open-source code, which allows a script or program module to be written or modified; permission to modify the source code; the ability to install the program on various operating systems (Windows, Mac OS, Ubuntu, etc.); a large library of free modules for working with geodata; the ability to publish maps on the web using Mapserver and other analogues; the ability to load satellite imagery from different sources (Yandex, Google, Bing Aerial, etc.); and the ability to post data and publish projects on the Internet using the QGIS Cloud plugin. The technical requirements for the design of the QWC2 web application are presented, and instructions for installing and configuring QWC2 on the available infrastructure are given. The testing and functionality of the QWC2 application are demonstrated with practical examples (area/length/coordinate measurements; thumbnail/redline functionality; generation of permalinks; PDF printing; the ability to export maps as different image formats; WMS/WFS import; map comparison).
APA, Harvard, Vancouver, ISO, and other styles
45

Царев, Ю. В., and И. Р. Сылкин. "USING A CLOUD APPLICATION TO ASSESS THE POWER OF INDUSTRIAL EMISSIONS INTO THE ATMOSPHERE ACCORDING TO SENTINEL-5P SATELLITE DATA." Южно-Сибирский научный вестник, no. 3(55) (June 30, 2024): 135–39. http://dx.doi.org/10.25699/sssb.2024.55.3.017.

Full text
Abstract:
Assessing the power of emissions into the atmosphere is an integral part of assessing the impact of an enterprise on the environment. The proposed study uses nitrogen dioxide concentrations measured by the Tropomi instrument of the Sentinel-5P satellite and provided by the European Space Agency as part of the Copernicus program. An application based on the Google Earth Engine (GEE) cloud service was developed, using the Java programming language, to extract nitrogen dioxide concentrations from the geospatial dataset for subsequent analysis and determination of the enterprise's emission intensity. To estimate the emission power, the publication uses the methodology developed by the A.I. Voeikov Main Geophysical Observatory, in the versions in force before 2018 (OND-86) and after 2018 (Order No. 273). The cloud application uses atmospheric nitrogen oxide concentrations to estimate the emission power of industrial facilities. The calculation results are demonstrated for the study area of the Kostroma State District Power Plant in the town of Volgorechensk; the study area is determined by the location of the plant and by meteorological parameters. A distinctive feature of the cloud solution is the possibility of collaborative work on projects related to environmental monitoring. The resulting cloud application can flexibly extract from the dataset averaged nitrogen oxide concentrations for specified time intervals, define the size of the assessed territory and calculate the power of the emission sources. Using the cloud service and the corresponding application makes it possible to obtain the necessary data on the state of air pollution with nitrogen oxides, which can be used for territorial planning and for environmental and geographical surveys, including assessing the emission power of industrial facilities.
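Analogous to the formaldehyde application above, a minimal Google Earth Engine Python sketch for extracting time-averaged NO2 columns over an assumed study area near Volgorechensk; the OND-86 / Order No. 273 emission-power calculation itself is not reproduced.

```python
# Minimal sketch (GEE Python API): time-averaged tropospheric NO2 around an assumed
# Volgorechensk study area; the emission-power methodology is not reproduced here.
import ee

ee.Initialize()

# Approximate, illustrative bounding box near the Kostroma power plant.
region = ee.Geometry.Rectangle([41.0, 57.35, 41.35, 57.55])

no2 = (ee.ImageCollection('COPERNICUS/S5P/OFFL/L3_NO2')
       .select('tropospheric_NO2_column_number_density')
       .filterDate('2023-01-01', '2023-02-01')
       .filterBounds(region)
       .mean())

value = no2.reduceRegion(reducer=ee.Reducer.mean(), geometry=region,
                         scale=1113, maxPixels=1e9)
print('Mean NO2 column (mol/m^2):', value.getInfo())
```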
APA, Harvard, Vancouver, ISO, and other styles
46

Edwards, Thomas, Christopher B. Jones, Sarah E. Perkins, and Padraig Corcoran. "Passive citizen science: The role of social media in wildlife observations." PLOS ONE 16, no. 8 (2021): e0255416. http://dx.doi.org/10.1371/journal.pone.0255416.

Full text
Abstract:
Citizen science plays an important role in observing the natural environment. While conventional citizen science consists of organized campaigns to observe a particular phenomenon or species there are also many ad hoc observations of the environment in social media. These data constitute a valuable resource for ‘passive citizen science’—the use of social media that are unconnected to any particular citizen science program, but represent an untapped dataset of ecological value. We explore the value of passive citizen science, by evaluating species distributions using the photo sharing site Flickr. The data are evaluated relative to those submitted to the National Biodiversity Network (NBN) Atlas, the largest collection of species distribution data in the UK. Our study focuses on the 1500 best represented species on NBN, and common invasive species within UK, and compares the spatial and temporal distribution with NBN data. We also introduce an innovative image verification technique that uses the Google Cloud Vision API in combination with species taxonomic data to determine the likelihood that a mention of a species on Flickr represents a given species. The spatial and temporal analyses for our case studies suggest that the Flickr dataset best reflects the NBN dataset when considering a purely spatial distribution with no time constraints. The best represented species on Flickr in comparison to NBN are diurnal garden birds as around 70% of the Flickr posts for them are valid observations relative to the NBN. Passive citizen science could offer a rich source of observation data for certain taxonomic groups, and/or as a repository for dedicated projects. Our novel method of validating Flickr records is suited to verifying more extensive collections, including less well-known species, and when used in combination with citizen science projects could offer a platform for accurate identification of species and their location.
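A rough sketch of the verification idea (label detection on a photo, then matching labels against terms associated with the claimed species) using the google-cloud-vision client; the flat term set is a simplistic stand-in for the paper's taxonomy-driven matching.

```python
# Rough sketch of the verification idea: run Cloud Vision label detection on a photo
# and check whether any label matches terms associated with the claimed species.
# The term set is a simplistic stand-in for the paper's taxonomy-based matching.
from google.cloud import vision

def likely_species(image_path: str, species_terms: set) -> float:
    client = vision.ImageAnnotatorClient()          # needs GCP credentials configured
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    scores = [label.score for label in response.label_annotations
              if label.description.lower() in species_terms]
    return max(scores, default=0.0)                 # confidence that the photo matches

# Hypothetical usage: terms covering common and taxonomic names of a robin.
print(likely_species("flickr_photo.jpg",
                     {"european robin", "robin", "erithacus rubecula", "bird"}))
```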
APA, Harvard, Vancouver, ISO, and other styles
47

Pei, Jie, Li Wang, Xiaoyue Wang, et al. "Time Series of Landsat Imagery Shows Vegetation Recovery in Two Fragile Karst Watersheds in Southwest China from 1988 to 2016." Remote Sensing 11, no. 17 (2019): 2044. http://dx.doi.org/10.3390/rs11172044.

Full text
Abstract:
Since the implementation of China’s afforestation and conservation projects during recent decades, an increasing number of studies have reported greening trends in the karst regions of southwest China using coarse-resolution satellite imagery, but small-scale changes in the heterogeneous landscapes remain largely unknown. Focusing on two typical karst regions in the Nandong and Xiaojiang watersheds in Yunnan province, we processed 2,497 Landsat scenes from 1988 to 2016 using the Google Earth Engine cloud platform and analyzed vegetation trends and associated drivers. We found that both watersheds experienced significant increasing trends in annual fractional vegetation cover, at a rate of 0.0027 year⁻¹ and 0.0020 year⁻¹, respectively. Notably, the greening trends have been intensifying during the conservation period (2001–2016) even under unfavorable climate conditions. Human-induced ecological engineering was the primary factor for the increased greenness. Moreover, vegetation change responded differently to variations in topographic gradients and lithological types. Relatively more vegetation recovery was found in regions with moderate slopes and elevation, and on pure limestone, limestone and dolomite interbedded layers as well as impure carbonate rocks than on non-karst rocks. Partial correlation analysis of vegetation trends and temperature and precipitation trends suggested that climate change played a minor role in vegetation recovery. Our findings contribute to an improved understanding of the mechanisms behind vegetation changes in karst areas and may provide scientific support for local afforestation and conservation policies.
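Per-pixel trend rates of the kind quoted here (e.g. 0.0027 year⁻¹) are typically obtained by regressing an annual vegetation index against time; a minimal Google Earth Engine sketch using ee.Reducer.linearFit is shown below, with annual NDVI as a stand-in for the paper's fractional vegetation cover and with the Landsat collection and period chosen for simplicity.

```python
# Minimal sketch (GEE Python API): per-pixel linear trend of an annual vegetation
# index, here NDVI as a stand-in for the paper's fractional vegetation cover.
# The collection, band names and period are assumptions, not the paper's setup.
import ee

ee.Initialize()

def annual_ndvi(year: int) -> ee.Image:
    # Median Landsat 5 surface-reflectance composite for one year.
    img = (ee.ImageCollection('LANDSAT/LT05/C02/T1_L2')
           .filterDate(f'{year}-01-01', f'{year}-12-31')
           .median())
    ndvi = img.normalizedDifference(['SR_B4', 'SR_B3']).rename('ndvi')  # NIR, red
    return ee.Image.constant(year).float().rename('year').addBands(ndvi)

stack = ee.ImageCollection([annual_ndvi(y) for y in range(1988, 2012)])

# Per-pixel slope of NDVI against year; the 'scale' band is the trend per year.
trend = stack.select(['year', 'ndvi']).reduce(ee.Reducer.linearFit())
slope = trend.select('scale')
```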
APA, Harvard, Vancouver, ISO, and other styles
48

Sugiarto, Indar, Doddy Prayogo, Henry Palit, et al. "Custom Built of Smart Computing Platform for Supporting Optimization Methods and Artificial Intelligence Research." Proceedings of the Pakistan Academy of Sciences: A. Physical and Computational Sciences 58, S (2021): 59–64. http://dx.doi.org/10.53560/ppasa(58-sp1)733.

Full text
Abstract:
This paper describes a prototype of a computing platform dedicated to artificial intelligence explorations. The platform, dubbed PakCarik, is essentially a high-throughput computing platform with GPU (graphics processing unit) acceleration. PakCarik is an Indonesian acronym for Platform Komputasi Cerdas Ramah Industri Kreatif, which can be translated as “Creative Industry friendly Intelligence Computing Platform”. This platform aims to provide a complete development and production environment for AI-based projects, especially those that rely on machine learning and multiobjective optimization paradigms. The method for constructing PakCarik was based on a computer hardware assembly technique that uses commercial off-the-shelf hardware, and the platform was tested on several AI-related application scenarios. The testing methods in this experiment include High-Performance Linpack (HPL) benchmarking, message passing interface (MPI) benchmarking, and TensorFlow (TF) benchmarking. From the experiment, the authors observe that PakCarik's performance is quite similar to that of commonly used cloud computing services such as Google Compute Engine and Amazon EC2, even though it falls a bit behind a dedicated AI platform such as the Nvidia DGX-1 used in the benchmarking experiment. Its maximum computing performance was measured at 326 Gflops. The authors conclude that PakCarik is ready to be deployed in real-world applications and can be made even more powerful by adding more GPU cards.
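Not the authors' benchmark suite, but a minimal example of the kind of TensorFlow micro-benchmark used to gauge sustained Gflop/s on such a platform: time a large matrix multiplication and divide the operation count by the elapsed time.

```python
# Minimal TensorFlow micro-benchmark sketch (not the authors' benchmark suite):
# time a large matrix multiplication and report achieved Gflop/s.
import time
import tensorflow as tf

n = 4096
a = tf.random.uniform((n, n))
b = tf.random.uniform((n, n))

tf.matmul(a, b)                      # warm-up (builds kernels, moves data to GPU if any)

start = time.perf_counter()
c = tf.matmul(a, b)
_ = c.numpy()                        # force execution to finish before stopping the timer
elapsed = time.perf_counter() - start

flops = 2 * n**3                     # multiply-add count for an n x n matmul
print(f"{flops / elapsed / 1e9:.1f} Gflop/s")
```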
APA, Harvard, Vancouver, ISO, and other styles
49

Shkitsa, L. Ye, V. А. Kornuta, and О. B. Kornutа. "Single information space for innovation activity of industry-specific higher education institution." Scientific Bulletin of Ivano-Frankivsk National Technical University of Oil and Gas, no. 2(47) (December 26, 2019): 57–64. http://dx.doi.org/10.31471/1993-9965-2019-2(47)-57-64.

Full text
Abstract:
The work is aimed at developing and improving the system of information provision and at activating the innovation and project activity of universities preparing specialists for the oil and gas industry. The authors have proposed a model of the information system based on electronic document flow, implemented at this stage for organising the activity process. The workflow is based on the use of cloud-based Google services for schools with a backup repository on a local server. Access requires a corporate e-mail account and an electronic signature. Preparations have been made for implementing the network model of scientific development. A specialised technical institution of higher education (HEI) has been chosen as the institutional basis for the functioning of the scientific and educational information space. An analysis of the HEI's existing business processes has been carried out using the approach of creating classifiers and time-based budgeting based on map-tables. A conceptual model of the combination of initial work and scientific research has been developed. The use of the Trello project management system for university teaching has been proposed because Trello is an extremely simple tool that is easy to embed in the workflow without a long adaptation period for the staff. This system supports flexible management of the project team and integration with document-handling services. It has been established that, despite its shortcomings, at the first stage of project management implementation the use of such a cloud service would make it possible to overcome the psychological inertia of participants working on projects without heavy expenses. The proposed approaches to engineering education will allow the potential of student youth to be used for developing innovations by scientific-educational project teams and will minimise the costs of ensuring the functioning of a single information space of innovation and project activity in the oil and gas industry.
APA, Harvard, Vancouver, ISO, and other styles
50

Redkina, N. S. "Global trends of libraries development: optimism vs pessimism (foreign literature review) Part 1." Bibliosphere, no. 4 (December 30, 2018): 87–94. http://dx.doi.org/10.20913/1815-3186-2018-4-87-94.

Full text
Abstract:
The dynamic development of the external technological environment, on the one hand, challenges libraries, calling their future existence into question; on the other, it helps libraries to work more productively, increases competitiveness and efficiency, expands the range of social projects, and develops new ways and forms of work with users taking into account their preferences in information and services. The review is based on over 500 articles searched in the world's largest databases (Google Scholar, Web of Science, Scopus, etc.), which discuss trends and future development of libraries. The documents were then classified according to sections and types of libraries, as well as advanced technologies. Examples of information technologies were collected and reviewed, as well as articles related to the implementation of information technologies when creating new services, with the emphasis on those that may affect libraries in the future. The latest information technologies that can be applied to the next generation library have been studied. The material is structured in blocks and presented in two parts. The 1st one presents such sections as: 1) challenges of the external environment and the future of libraries, 2) modern information technologies in libraries development (mobile technologies and applications, cloud computing, big data, internet of things, virtual and augmented reality, technical innovations, etc.), 4) Library 4.0 concept - new directions for libraries development. The 2nd part of the review article (Bibliosphere, 2019, 1) will touch the following issues: 1) user preferences and new library services (software for information literacy development, research data management, web archiving, etc.), 2) libraries as centers of intellectual leisure, communication platforms, places for learning, co-working, renting equipment, creativity, work, scientific experiments and leisure, 3) smart buildings and smart libraries, 4) future optimism. Based on the content analysis of publications, it is concluded that libraries should not only accumulate resources and provide access to them, but renew existing approaches to the forms and content of their activities, as well as their goals, missions and prospects for development, using various hardware and software, cloud computing technologies, mobile technologies and apps, social networks, etc.
APA, Harvard, Vancouver, ISO, and other styles