
Journal articles on the topic 'AWS S3'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'AWS S3.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

D G, Meena, Sai Meghana, Vudugundla Teja, and Mamatha C. M. "S3 Event Notification using AWS Lambda." International Journal of Research Publication and Reviews 6, no. 5 (2025): 6568–70. https://doi.org/10.55248/gengpi.6.0525.1785.

2

García-Ruiz, Sonia, Regina Hertfelder Reynolds, Melissa Grant-Peters, et al. "aws-s3-integrity-check: an open-source bash tool to verify the integrity of a dataset stored on Amazon S3." Gigabyte 2023 (August 22, 2023): 1–15. http://dx.doi.org/10.46471/gigabyte.87.

Abstract:
Amazon Simple Storage Service (Amazon S3) is a widely used platform for storing large biomedical datasets. Unintended data alterations can occur during data writing and transmission, altering the original content and generating unexpected results. However, no open-source and easy-to-use tool exists to verify end-to-end data integrity. Here, we present aws-s3-integrity-check, a user-friendly, lightweight, and reliable bash tool to verify the integrity of a dataset stored in an Amazon S3 bucket. Using this tool, we only needed ∼114 min to verify the integrity of 1,045 records ranging between 5 bytes and 10 gigabytes and occupying ∼935 gigabytes of the Amazon S3 cloud. Our aws-s3-integrity-check tool also provides file-by-file on-screen and log-file-based information about the status of each integrity check. To our knowledge, this tool is the only open-source one that allows verifying the integrity of a dataset uploaded to the Amazon S3 Storage quickly, reliably, and efficiently. The tool is freely available for download and use at https://github.com/SoniaRuiz/aws-s3-integrity-check and https://hub.docker.com/r/soniaruiz/aws-s3-integrity-check.
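The end-to-end check this tool performs rests on comparing a locally computed checksum against the ETag S3 reports. A minimal sketch of that computation (not the tool's actual code) is below; it assumes S3's published ETag convention, where a multipart upload's ETag is the MD5 of the concatenated per-part MD5 digests suffixed with the part count, and it assumes an 8 MiB part size, which the paper does not state:

```python
import hashlib

def s3_etag(path: str, part_size: int = 8 * 1024 * 1024) -> str:
    """Compute the ETag Amazon S3 would report for a file.

    Single-part uploads get a plain MD5 of the content; multipart
    uploads get the MD5 of the concatenated per-part MD5 digests,
    suffixed with "-<number of parts>".
    """
    md5s = []
    with open(path, "rb") as f:
        while chunk := f.read(part_size):
            md5s.append(hashlib.md5(chunk))
    if not md5s:
        return hashlib.md5().hexdigest()           # empty object
    if len(md5s) == 1:
        return md5s[0].hexdigest()                 # single-part upload
    combined = hashlib.md5(b"".join(m.digest() for m in md5s))
    return f"{combined.hexdigest()}-{len(md5s)}"   # multipart upload
```

Comparing this value against the ETag returned by a HEAD request on the object is then a string comparison; note that ETags of SSE-KMS-encrypted objects are not content MD5s, so this check only applies to plainly stored or SSE-S3 objects.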
3

Mamuta, Maryna, Igor Kravchenko, and Oleksandr Mamuta. "Amazon S3 Static Website Hosting." Herald of Khmelnytskyi National University. Technical sciences 317, no. 1 (2023): 128–34. http://dx.doi.org/10.31891/2307-5732-2023-317-1-128-134.

Abstract:
Nowadays, especially in a period of energy crisis, reliable data storage and uninterrupted access to websites are very important. Considering that the worldwide mainstream is to use cloud services for educational, scientific, and business purposes, it is important to explore the website-hosting features of the major cloud service providers. One of the leading cloud vendors is Amazon with its Web Services (AWS). AWS offers hosting solutions such as Lightsail, Amplify Console, Simple Storage Service (S3), and Elastic Compute Cloud, covering everything from simple static websites to complicated dynamic ones. The article deals with a method of hosting static websites, for which Simple Storage Service was used. The method has many benefits: the service is simple, cheap, scales well, manages everything, and integrates well with other AWS services. It is best suited to websites that do not contain server-side scripting, such as PHP or ASP.NET, and is ideal for websites that change infrequently: personal, promo, and startup websites, and websites of small businesses and organizations. The article focuses on how to set the permissions, properties, and policies of an AWS S3 bucket for hosting a website. However, using an S3 bucket alone provides only an HTTP connection; therefore, to deploy a secure connection and speed up delivery of the static content, it was proposed to use the AWS CloudFront service. CloudFront is a content delivery network service that uses the latest version of the Transport Layer Security protocol and keeps files in its cache for 24 hours. The article shows how to configure a CloudFront distribution to serve HTTPS requests for an Amazon S3 bucket. Configuring the S3 bucket's REST API endpoint as the origin was used as the most secure option. As a result, latency was reduced, security was improved due to traffic encryption, and the content of the S3 bucket was kept private.
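The bucket permissions and website properties the article configures can be sketched as the two payloads S3 expects. The snippet below only builds those payloads (so it runs without AWS credentials); the bucket name is a hypothetical placeholder, and in a real deployment a public-read policy like this is only needed for the plain website endpoint, not for the private-bucket-behind-CloudFront setup the article ultimately recommends:

```python
import json

BUCKET = "example-static-site"  # hypothetical bucket name

# Public-read policy, as it would be passed (as JSON) to
# put_bucket_policy for website-endpoint hosting.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "PublicReadGetObject",
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": f"arn:aws:s3:::{BUCKET}/*",
    }],
}

# Website configuration, in the shape accepted by put_bucket_website.
website = {
    "IndexDocument": {"Suffix": "index.html"},
    "ErrorDocument": {"Key": "error.html"},
}

policy_json = json.dumps(policy)
```

With these applied, the site is served from the bucket's website endpoint over HTTP; the HTTPS and caching layer comes from putting CloudFront in front, as the article describes.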
4

Ashraf, Syed Ziaurrahman. "Building a Data Lake on AWS: From Data Migration to AI-Driven Insights." INTERANTIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 08, no. 10 (2024): 1–5. http://dx.doi.org/10.55041/ijsrem19620.

Abstract:
As organizations generate and process increasing amounts of data, building data lakes on cloud platforms like AWS has become crucial to managing large datasets efficiently. This paper outlines the key steps in constructing a scalable data lake on AWS, starting from data migration to leveraging AI for insights. It explores how AWS services like S3, Glue, and SageMaker work together to facilitate data storage, transformation, and machine learning. In addition, it highlights the importance of orchestrating data pipelines with automation tools like AWS Lambda and Apache Airflow to ensure smooth, scalable, and efficient workflows. This paper explores the end-to-end process of migrating data to AWS, constructing scalable data lakes, and leveraging AI capabilities to drive actionable insights. Through practical examples, diagrams, and pseudocode, this paper provides a comprehensive guide to implementing data lakes with AWS services such as S3, Glue, and SageMaker, highlighting key considerations around data migration, storage, processing, and analytics. The role of automation tools like AWS Lambda and Airflow in orchestrating these pipelines is also discussed. Keywords: AWS, Data Lake, AI-driven Insights, Data Migration, Amazon S3, AWS Glue, Amazon SageMaker, Cloud Analytics, Data Pipeline, ETL, Machine Learning
5

Rinki, and Karandeep Singh. "SECURING DATA STORAGE ON AMAZON AWS S3." GLOBAL JOURNAL OF ENGINEERING SCIENCE AND RESEARCHES 5, no. 9 (2018): 23–29. https://doi.org/10.5281/zenodo.1408017.

Abstract:
Cloud computing is a phenomenal technology that has enabled many other technologies, such as the Internet of Things and Artificial Intelligence, because the cloud has solved problems like storage over the internet and connecting sensors to applications that can be controlled from anywhere. This paper gives an introduction to cloud computing, its types, and its benefits. Companies like Amazon and Microsoft provide leading cloud services with a plethora of functions through products such as AWS and Azure. One of the biggest problems related to the cloud is security, as data is stored on third-party infrastructure. In recent times, information has become organizations' biggest asset, and everyone wants to store their data securely. This paper presents a method to secure data using AES 256-bit encryption on AWS via the Boto3 library and Python code. It can be used to provide encryption on the server side and make storage much more secure than before.
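Server-side AES-256 encryption of the kind the paper describes is requested per-object through the `ServerSideEncryption` parameter of boto3's `put_object`. Since this is a bibliography rather than the paper's code, the sketch below only constructs the request parameters (so it runs without boto3 or AWS credentials); in real use they would be splatted into `s3.put_object(**params)`:

```python
def sse_put_params(bucket: str, key: str, body: bytes) -> dict:
    """Build parameters for an SSE-S3 upload, as they would be
    passed to boto3's s3.put_object(**params)."""
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        # "AES256" selects SSE-S3: S3-managed AES-256 keys.
        # (SSE with KMS-managed keys would use "aws:kms" instead.)
        "ServerSideEncryption": "AES256",
    }
```

S3 then encrypts the object at rest with keys it manages and records the encryption choice in the object's metadata.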
6

Krishna, Mohan Pitchikala. "Strategies for Migrating S3 Objects Across Regions." European Journal of Advances in Engineering and Technology 10, no. 12 (2023): 47–50. https://doi.org/10.5281/zenodo.13320037.

Abstract:
In today's world, organizations use cloud storage services like Amazon Simple Storage Service (S3) to manage their data. As businesses grow internationally, they often need to move their data between different AWS regions. This process must be efficient and secure to avoid service interruptions and to comply with data privacy laws. In this paper, we discuss four main methods for migrating S3 data: S3 Replication, S3 Batch Operations, AWS DataSync, and custom scripts. We highlight the benefits and limitations of each method, considering factors such as data size, cost, and security. Additionally, we provide a decision matrix to help select the best method for specific needs and include references to detailed guides for implementing each approach. The main goal of this paper is to serve as a comprehensive guide for understanding the different migration methods, making informed decisions, and successfully implementing the chosen approach with best practices.
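A decision matrix of the kind the paper provides can be sketched as a small selector. The rules below are an illustrative simplification, not the paper's actual matrix (which also weighs data size, cost, and security); they encode only the coarse distinctions among the four methods the paper compares:

```python
def pick_migration_method(ongoing: bool, existing_objects: bool,
                          needs_transforms: bool) -> str:
    """Toy decision sketch for choosing an S3 cross-region
    migration method. Thresholds and priorities are illustrative."""
    if needs_transforms:
        return "Custom scripts"        # full per-object control
    if ongoing and not existing_objects:
        return "S3 Replication"        # continuous copy of new objects
    if existing_objects and not ongoing:
        return "S3 Batch Operations"   # one-shot copy from an inventory
    return "AWS DataSync"              # managed bulk transfer
```

In practice S3 Replication can also be combined with Batch Operations (via replication of existing objects) to cover both the backlog and the ongoing stream.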
7

Goutham, Bilakanti. "Automated File Processing Between NAS and AWS Cloud." International Journal of Leading Research Publication 2, no. 2 (2021): 1–13. https://doi.org/10.5281/zenodo.15196715.

Abstract:
The convergence of automated file transfer and processing between the AWS Cloud and on-premises NAS has revolutionized data management in healthcare and enterprise operations. The solution leverages AWS DataSync and the AWS Transfer Family to facilitate secure and elastic file transfer to AWS services such as Amazon S3, EFS, and FSx. Real-time and batch processing with AWS Lambda, Step Functions, and S3 event triggers maximizes data availability with minimal human involvement. Data verification and reconciliation processes, such as checksum verification and logging, strengthen data integrity. Two-way synchronization keeps the on-premises NAS and AWS continuously up to date for enhanced efficiency, compliance, and workflow automation. Through optimized data ingestion, processing, and validation, the framework securely supports high-performance, cloud-based data workflows for enterprise and healthcare infrastructure.
8

Latha, D. "Blockchain Solutions for Secure Healthcare Data Management: A Comprehensive Survey." INTERANTIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 08, no. 03 (2024): 1–5. http://dx.doi.org/10.55041/ijsrem29730.

Abstract:
The paper introduces an innovative approach to healthcare data security, addressing the critical concerns surrounding privacy and data integrity in the modern healthcare landscape. By amalgamating blockchain technology, AWS S3 services, and attribute-based access control (ABAC), the proposed solution offers a robust framework for safeguarding sensitive medical information. Through the implementation of ABAC, the solution enables dynamic and precise access control, ensuring that only authorized entities can access and manipulate healthcare data. Additionally, the utilization of a blockchain-based XML ledger on AWS S3 services enhances data security by providing an immutable and tamper-proof storage mechanism. This comprehensive solution not only addresses the privacy challenges inherent in the digitization of healthcare but also facilitates secure data sharing among authorized parties. By leveraging the combined strengths of ABAC, blockchain technology, and AWS S3 services, the proposed framework delivers a scalable and efficient infrastructure for managing healthcare data securely. It represents a significant advancement in healthcare information security, promising to safeguard patient privacy and maintain the accuracy and reliability of medical data in an increasingly interconnected digital ecosystem. Key Words: AWS S3 Service, Blockchain technology, medical information, XML ledger, Attribute-based access control (ABAC), Tamper-proof
9

Sai Sindhu, Reddyvari. "Serverless Bookstore Application Using AWS." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 04 (2025): 1–9. https://doi.org/10.55041/ijsrem46406.

Abstract:
Recently, cloud computing has become a main standard for effective management of Internet services. Among cloud technologies, serverless computing stands out in the evolution of cloud programming and reflects the reach and growth of the cloud. Large Internet companies such as Amazon, Netflix, and LinkedIn now use the cloud to develop, test, deploy, scale, operate, and update complex, multi-stage applications. Despite this flexibility and scalability, these companies face challenges in managing infrastructure costs due to increasing server load and space requirements. This is where serverless computing, introduced by AWS Lambda, offers a compelling solution. In this article, we explain how AWS Lambda works with other native AWS services through the development of a serverless chat application designed to scale seamlessly without the need for additional servers. The study also explores how AWS services such as S3, DynamoDB, and CloudWatch integrate with serverless technologies such as Lambda to improve the functionality and performance of applications in the cloud. Keywords: AWS Lambda, Amazon Amplify, Amazon DynamoDB, Amazon S3, AWS, Cloud Computing, Cloud Storage, Amazon AppSync, IAM.
10

G.S, Hari Priya, Janani P.R, and J. Nagavardhan Reddy. "Exploring AWS S3: In-Depth Analysis of Bucket Management." Journal of Sensor and Cloud Computing 1, no. 2 (2024): 1–5. http://dx.doi.org/10.46610/joscc.2024.v01i02.001.

Abstract:
Amazon Simple Storage Service (S3) is a cornerstone of modern data management strategies within Amazon Web Services (AWS). It is pivotal in providing scalable, durable, and accessible object storage infrastructure. A key advantage of S3 is its scalability, enabling seamless storage-capacity expansion without upfront investments or provisioning delays. This flexibility empowers organizations to accommodate dynamic data growth while maintaining operational agility, which is crucial in today's rapidly evolving digital landscape. S3's exceptional durability, boasting 99.999999999% reliability, ensures steadfast data integrity and resilience against potential failures, mitigating concerns about data loss and bolstering confidence in managing mission-critical information effectively. Accessibility is another crucial aspect, with robust access controls and APIs facilitating secure data retrieval, storage, and management from any location. This accessibility fosters collaboration and innovation by enabling seamless data interaction and utilization across geographical boundaries. S3's integration capabilities with other AWS services enhance operational efficiency and optimize data-analytics workflows. It is a foundational element in data analytics, facilitating data ingestion, storage, and processing to derive actionable insights from diverse datasets. Furthermore, S3's versatility extends beyond storage, supporting applications such as content delivery, backup and archiving, disaster recovery, and machine-learning model training. This versatility drives innovation across industries and underscores S3's role in driving digital transformation and fostering a culture rooted in data-driven decision-making.
In conclusion, Amazon S3's unparalleled scalability, durability, and accessibility make it indispensable for organizations navigating the data-centric landscape of modern business environments, underlining the imperative of harnessing its full potential for organizational success.
11

Anushree, Misra, Maheshwari Saloni, Macker Sukriti, and Doda Ruchika. "SERVERLESS APPLICATION: SECURE S3 FILE UPLOAD." International Journal For Technological Research In Engineering 8, no. 5 (2021): 18. https://doi.org/10.5281/zenodo.5084683.

Abstract:
Cloud computing services are growing in the IT market. These services cost-effectively provide on-demand IT resources via the Internet, letting individuals pay as they go rather than buying an entire service. Amazon Web Services (AWS) offers such cloud computing services. The authors used the AWS platform to create a serverless application to read and write a file. Alongside the advantages offered by cloud computing services, such as cost-effectiveness and reliability, there is a significant concern that needs to be addressed: data security. The authors achieved the objective of information and data security by making the object private and the S3 bucket public, thus making the cloud hybrid and the application secure. Users can read and write their files securely and efficiently within a specific period via this application.
12

Karunamurthy, A. "SECURE AND SCALABLE WORDPRES DEPLOYMENT ON AWS WITH RDS." International Scientific Journal of Engineering and Management 04, no. 06 (2025): 1–9. https://doi.org/10.55041/isjem04077.

Abstract:
In today's digital landscape, ensuring high availability, scalability, and security for web applications is crucial. This project focuses on deploying a secure and scalable WordPress website on Amazon Web Services (AWS) using industry best practices. By leveraging AWS services such as EC2, Auto Scaling, RDS, S3, VPC, IAM, and security groups, this deployment achieves high performance, reliability, and security. The architecture includes Amazon EC2 instances running WordPress in an Auto Scaling group, ensuring seamless horizontal scalability. Amazon RDS is used for the MySQL database, providing managed, high-performance, and fault-tolerant data storage. Amazon S3 is integrated for media storage, reducing server load and improving content delivery. A Virtual Private Cloud (VPC) is configured to establish a secure and isolated network environment. IAM roles and policies enforce strict access control, while security groups protect against unauthorized access. To enhance security, HTTPS is enabled using an SSL certificate, and AWS WAF (Web Application Firewall) is employed to mitigate threats such as SQL injection and DDoS attacks. Automated backups and monitoring solutions like CloudWatch and AWS Backup ensure data integrity and real-time performance tracking. This project demonstrates a robust WordPress hosting solution that dynamically scales to handle traffic spikes while maintaining strong security standards, serving as an ideal model for businesses seeking a cloud-based, resilient WordPress deployment on AWS. Keywords: WordPress hosting, Amazon Web Services (AWS), EC2, Auto Scaling, RDS, S3, VPC, IAM, security groups, HTTPS, SSL certificate, AWS WAF, DDoS protection, SQL injection mitigation, CloudWatch, AWS Backup, high availability, scalability, cloud security, performance monitoring, managed database, media offloading, resilient architecture, and secure cloud deployment.
13

Zhukov, S. I., and K. A. Zubrilin. "Using Amazon Kinesis Service for Data Transfer from IoT Devices to Cloud Infrastructure." Programmnaya Ingeneria 14, no. 4 (2023): 187–94. http://dx.doi.org/10.17587/prin.14.187-194.

Abstract:
IoT devices connected to the Amazon Web Services (AWS) cloud use the IoT interface and the MQTT protocol for communication. In addition to these basic communication tools, AWS has developed specialized services for transmitting telemetry data from smart devices. Firehose allows you to upload data directly to the AWS S3 data warehouse (a cloud-specific file system). The Kinesis service is even more powerful: it enables the user to build back-end data logic that is applied automatically when data arrives. Kinesis also allows running analytics on the telemetry data using artificial intelligence techniques to discover hidden trends and consistent patterns. This article describes how to set up an IoT device using the AWS C++ SDK so that the telemetry data it generates is processed by Kinesis services. The firmware of the device connects to the AWS provider using the SDK APIs and sends telemetry data as Kinesis service data packets; a Kinesis lambda function is created on the server side in the cloud to store the received data in S3 storage and send the data to the DynamoDB database after processing. Using Kinesis services allows the user to utilize all the powerful tools provided by the AWS framework for processing and analyzing their data (Amazon constantly develops and enhances this set of tools).
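On the server side, a Lambda function triggered by Kinesis receives records whose payloads are base64-encoded in the event. A minimal sketch of the decoding step such a function performs (not the article's code; the JSON telemetry shape is a hypothetical example) is:

```python
import base64
import json

def handler(event, context=None):
    """Decode telemetry records from a Kinesis trigger event.

    Kinesis delivers each record's payload base64-encoded under
    event["Records"][i]["kinesis"]["data"]. A real function would
    go on to write the readings to S3 / DynamoDB as the article
    describes; here we just return them.
    """
    readings = []
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"])
        readings.append(json.loads(payload))
    return readings
```

Batch size, retry behavior, and checkpointing are controlled by the event source mapping between the stream and the function, not by the handler itself.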
14

Bello, Rotimi-Williams, Pius A. Owolawi, Etienne A. Van Wyk, and Chunling Tu. "Architecture for detecting advertisement types." International Journal of Innovative Research and Scientific Studies 8, no. 1 (2025): 1725–32. https://doi.org/10.53894/ijirss.v8i1.4772.

Abstract:
Businesses often encounter difficulties in accurately categorizing advertisements based on their content, which leads to inefficiencies in targeted marketing and data analysis. An automated system is needed to detect the type of advertisement (e.g., food, beverage, clothing, etc.) from the submitted text. This paper aims to develop an automated system leveraging Amazon Web Services (AWS) to categorize advertisement types based on text input. We employed Convolutional Neural Networks (CNNs) in developing the automated system due to the significant performance of CNNs leveraging AWS. The solution aims to boost marketing efficiency and strengthen data analysis capabilities. To achieve this, we utilize Amazon Simple Storage Service (S3), AWS Lambda, Amazon Comprehend, Amazon DynamoDB, Amazon CloudWatch, Amazon CloudFront, and AWS Web Application Firewall (AWS WAF). Moreover, we follow some procedural steps in executing the task by uploading the advertisement text to an S3 bucket, which triggers a Lambda function that forwards the text to Amazon Comprehend for analysis, and the results are stored in DynamoDB, from where the results notification is sent to the user. Magazine image datasets were employed as test datasets for this approach. This work enables automatic advertisement categorization, enhanced marketing effectiveness, better data analysis and reporting functionality, an affordable solution using AWS services, and instant feedback for users. The AWS-based architecture provides a dependable solution for the automatic identification of advertisement types. By leveraging various AWS services, the system ensures efficiency, precision, and scalability, ultimately enhancing marketing strategies and data management.
15

Koppichetti, Ravi Kiran. "DATA MIGRATION FROM ON-PREMISES STORAGE TO AWS AND SNOWFLAKE USING AWS S3 AND AWS GLUE: TOOLS, TECHNIQUES, AND BEST PRACTICES." DATA MIGRATION FROM ON-PREMISES STORAGE TO AWS AND SNOWFLAKE USING AWS S3 AND AWS GLUE: TOOLS, TECHNIQUES, AND BEST PRACTICES 6, no. 11 (2021): 262–67. https://doi.org/10.5281/zenodo.14787944.

16

Sanne, Sri Harsha Vardhan. "Overcoming Network Bottlenecks and Latency Issues in Distributed AWS Architectures." Journal of Scientific and Engineering Research 7, no. 9 (2020): 223–29. https://doi.org/10.5281/zenodo.12787485.

Abstract:
Distributed AWS architectures offer unparalleled scalability, flexibility, and resilience, making them pivotal in modern cloud computing. However, they also present significant challenges, particularly network bottlenecks and latency issues. These problems can degrade system performance, escalate operational costs, and negatively impact user experience. This paper explores comprehensive strategies to overcome these challenges, ensuring the efficient and reliable operation of AWS architectures. Key strategies include effective load balancing, network segmentation, data-transfer optimization using AWS Direct Connect and S3 Transfer Acceleration, and mitigating latency through edge computing with AWS IoT Greengrass and AWS Snowball Edge. Additionally, leveraging content delivery networks (CDNs) like Amazon CloudFront and implementing caching mechanisms with Amazon ElastiCache are crucial. Continuous monitoring and troubleshooting with Amazon CloudWatch and AWS X-Ray further enhance network performance. Case studies of an e-commerce platform and a media streaming service illustrate the practical application and benefits of these strategies. By implementing these measures, organizations can ensure robust, responsive, and high-performing AWS architectures that deliver exceptional user experiences.
17

Saxena, Mohit, Benjamin Sowell, Daiyan Alamgir, et al. "The Story of AWS Glue." Proceedings of the VLDB Endowment 16, no. 12 (2023): 3557–69. http://dx.doi.org/10.14778/3611540.3611547.

Abstract:
AWS Glue is Amazon's serverless data integration cloud service that makes it simple and cost-effective to extract, clean, enrich, load, and organize data. Originally launched in August 2017, AWS Glue began as an extract-transform-load (ETL) service designed to relieve developers and data engineers of the undifferentiated heavy lifting needed to load databases and data warehouses and to build data lakes on Amazon S3. Since then, it has evolved to serve a larger audience, including ETL specialists and data scientists, and includes a broader suite of data integration capabilities. Today, hundreds of thousands of customers use AWS Glue every month. In this paper, we describe the use cases and challenges cloud customers face in preparing data for analytics and the tenets we chose to drive Glue's design. We chose early on to focus on ease of use, scale, and extensibility. At its core, Glue offers serverless Apache Spark and Python engines backed by a purpose-built resource manager for fast startup and auto-scaling. In Spark, it offers a new data structure, DynamicFrames, for manipulating messy schema-free semi-structured data such as event logs, a variety of transformations and tooling to simplify data preparation, and a new shuffle plugin to offload to cloud storage. It also includes a Hive-Metastore-compatible Data Catalog with Glue crawlers to build and manage metadata, e.g., for data lakes on Amazon S3. Finally, Glue Studio is its visual interface for authoring Spark- and Python-based ETL jobs. We describe the innovations that differentiate AWS Glue and drive its popularity, and how it has evolved over the years.
18

Joodi, Mohanad Azeez, Muna Hadi Saleh, and Dheya Jassim Khadhim. "Proposed Face Detection Classification Model Based on Amazon Web Services Cloud (AWS)." Journal of Engineering 29, no. 4 (2023): 176–206. http://dx.doi.org/10.31026/j.eng.2023.04.12.

Abstract:
One of the most important features of the Amazon Web Services (AWS) cloud is that a program can be run and accessed from any location. You can access and monitor the results of the program from anywhere, save many images, and benefit from faster computation. This work proposes a face detection classification model based on the AWS cloud that aims to classify faces into two classes, a non-permission class and a permission class, by training on a real dataset collected from our cameras. The proposed Convolutional Neural Network (CNN) cloud-based system was used to share computational resources for Artificial Neural Networks (ANN) to reduce redundant computation. The test system uses Internet of Things (IoT) services through our camera system to capture images and upload them to the Amazon Simple Storage Service (AWS S3) cloud. Two detectors were then run, the Haar cascade and multitask cascaded convolutional neural networks (MTCNN), on the Amazon Elastic Compute Cloud (AWS EC2), after which the output results of the two detectors were compared using accuracy and execution time. The classified non-permission images are then uploaded to the AWS S3 cloud. The validation accuracy of the offline-augmentation face detection classification model reached 98.81%, and the loss and mean square error decreased to 0.0176 and 0.0064, respectively. The execution time of the whole AWS cloud system for one image reached three and seven seconds when using the Haar cascade and MTCNN detectors, respectively.
19

Khot, Purva. "Cloud Migration with Google Cloud Storage and AWS (S3 Service)." International Journal for Research in Applied Science and Engineering Technology 8, no. 10 (2020): 555–58. http://dx.doi.org/10.22214/ijraset.2020.31953.

20

Gayakwad, Milind. "Real-Time Clickstream Analytics with Apache." Journal of Electrical Systems 20, no. 2 (2024): 1600–1608. http://dx.doi.org/10.52783/jes.1466.

Abstract:
This research work provides an overview of setting up an Apache-based real-time clickstream data lifecycle for user-behaviour analysis and marketing-strategy improvement. It uses tools like Apache Kafka, Apache Spark, Amazon S3, the AWS Glue Data Catalog, the Hive Metastore, and Tableau to meet the challenges of data collection, purification, and storage. The design offers rapid data processing and analysis using Spark, high-throughput and fault-tolerant data import with Kafka, and scalable storage in Amazon S3. Data retrieval, querying, and transformation are made easier by the AWS Glue Data Catalog and Hive Metastore, while Tableau offers interactive visualisations. Data management capabilities are improved by optional integration with a data warehouse and data lake. The scalable architecture accommodates increased data quantities and user traffic, and a mathematical model derives useful insights from clickstream data.
21

Satheesh Kumar, Ashwin, Anfah K., Hariharan T., Rosna Parveen, Sizan Mahmud, and Sonal Sharma. "SECURE FILE STORAGE ON CLOUD USING HYBRID CRYPTOGRAPHY." International Journal of Advanced Research 11, no. 04 (2023): 01–05. http://dx.doi.org/10.21474/ijar01/16613.

Abstract:
Our project aims to provide secure file storage for users on the cloud (specifically on AWS S3) by utilizing a hybrid cryptography approach. This involves encrypting files using both AES and RSA algorithms, with the user receiving the encryption key through email. To further enhance security, the key will be hidden behind an image or within a PDF document using steganography techniques. Additionally, the files stored in the S3 bucket will also be encrypted, but instead of encrypting the contents of the file, we will encrypt the file format itself. The file format can be decrypted using the GHE decryption key and software, which will be made available to end-users. Overall, our project aims to provide a robust and secure solution for cloud-based file storage.
22

Kardile, Gloriya, and Vaibhavi Channe. "AWS S3 Classes: A Comparative Analysis and Optimization Strategies for Cost Efficiency and Performance Optimization for Organizations." International Journal of Advance and Applied Research S6, no. 22 (2025): 287–95. https://doi.org/10.5281/zenodo.15501607.

Abstract:
Cloud storage services, particularly Amazon Simple Storage Service (S3), have become fundamental components for organizations seeking scalable and cost-effective solutions for data storage. With Amazon S3 offering multiple storage classes tailored to different performance, durability, and cost requirements, selecting the appropriate storage class has become a critical decision for organizations aiming to optimize both cost efficiency and performance. This paper presents a comprehensive comparative analysis of the storage classes available in Amazon S3, including Standard, Standard-IA (Infrequent Access), One Zone-IA, Intelligent-Tiering, Glacier, and Glacier Deep Archive. Through an in-depth examination of their features, performance characteristics, and pricing models, this analysis aims to provide organizations with insights into selecting the most suitable storage class based on their specific use cases and requirements. Additionally, this paper proposes optimization strategies for maximizing cost efficiency and performance within Amazon S3, including data lifecycle management policies, tiering strategies, and performance optimization techniques. By leveraging these strategies, organizations can effectively manage their data storage costs while ensuring optimal performance and reliability. Overall, this paper serves as a valuable resource for organizations navigating the complexities of cloud storage in Amazon S3, offering actionable insights and best practices for achieving cost-efficient and high-performance data storage solutions.
23

Ritesh, Kumar. "Serverless Computing with AWS Lambda: Best Practices for Scalable Enterprise Applications." International Journal of Innovative Research in Engineering & Multidisciplinary Physical Sciences 7, no. 2 (2019): 1–13. https://doi.org/10.5281/zenodo.14900633.

Full text
Abstract:
Serverless computing has transformed cloud application development by enabling organizations to build scalable, event-driven applications without managing infrastructure. AWS Lambda, a leading Function-as-a-Service (FaaS) platform, allows developers to run code in response to events while AWS handles provisioning, scaling, and execution. This paper presents best practices for event-driven applications using AWS Lambda, emphasizing performance optimization, security, cost efficiency, and observability. Key areas include cold start mitigation, concurrency management, IAM security, API Gateway integration, and compliance automation. A real-world use case demonstrates how AWS Lambda can be used for real-time compliance enforcement by leveraging AWS services such as S3, DynamoDB, SNS, and Step Functions. Intended for enterprise architects, DevOps engineers, and developers, this paper provides actionable insights to achieve higher performance, stronger security, and lower operational costs in serverless architectures.
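The event-driven pattern described above can be sketched as a minimal Lambda handler for S3 `ObjectCreated` events; the bucket and key names in the sample event are hypothetical, and a real function would process each object rather than just collect its location.

```python
def handler(event, context):
    """Minimal AWS Lambda handler for S3 ObjectCreated events.

    Collects the (bucket, key) pairs it was invoked for; real
    processing of each object would happen inside the loop.
    """
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        processed.append({"bucket": bucket, "key": key})
    return {"statusCode": 200, "processed": processed}


# Shape of the event AWS delivers for an s3:ObjectCreated:* trigger,
# abbreviated to the fields the handler actually reads.
SAMPLE_EVENT = {
    "Records": [
        {
            "eventName": "ObjectCreated:Put",
            "s3": {
                "bucket": {"name": "example-bucket"},
                "object": {"key": "uploads/report.pdf"},
            },
        }
    ]
}

if __name__ == "__main__":
    print(handler(SAMPLE_EVENT, None))
```

Keeping the handler a pure function of the event, as here, also makes cold-start cost and unit testing easier to manage, two of the concerns such papers raise.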
APA, Harvard, Vancouver, ISO, and other styles
24

Paidy, Pavan. "Hardening AWS Infrastructure after Capital One: IAM, S3, and Network Security." JOURNAL OF RECENT TRENDS IN COMPUTER SCIENCE AND ENGINEERING 7, no. 2 (2019): 126–41. https://doi.org/10.70589/jrtcse.2019.2.10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Bhatt, Sachin. "Optimizing SAP Migration Strategies to AWS: Best Practices and Lessons Learned." Integrated Journal for Research in Arts and Humanities 1, no. 1 (2021): 74–82. http://dx.doi.org/10.55544/ijrah.1.1.11.

Full text
Abstract:
This paper examines SAP migration to AWS in detail, outlining suitable tactics, the challenges involved, and overall server performance. It covers AWS services useful for SAP applications, such as EC2 and S3, data migration approaches, SAP performance tuning, and overall cost control. It also addresses key aspects such as planning and assessment, choosing the right migration method, and compliance, security, and regulatory issues. Trends in SAP and AWS integration, including artificial intelligence, serverless architecture, and SAP in hybrid environments, are also covered. The paper thus offers strategic guidance for organisations seeking to migrate their SAP environments to AWS seamlessly and with maximum efficiency.
APA, Harvard, Vancouver, ISO, and other styles
26

Yu, Ellen, Aparna Bhaskaran, Shang-Lin Chen, Zachary E. Ross, Egill Hauksson, and Robert W. Clayton. "Southern California Earthquake Data Now Available in the AWS Cloud." Seismological Research Letters 92, no. 5 (2021): 3238–47. http://dx.doi.org/10.1785/0220210039.

Full text
Abstract:
Abstract The Southern California Earthquake Data Center is hosting its earthquake catalog and seismic waveform archive in the Amazon Web Services (AWS) Open Dataset Program (s3://scedc-pds; us-west-2 region). The cloud dataset’s high data availability and scalability facilitate research that uses large volumes of data and computationally intensive processing. We describe the data archive and our rationale for the formats and data organization. We provide two simple examples to show how storing the data in AWS Simple Storage Service can benefit the analysis of large datasets. We share usage statistics of our data during the first year in the AWS Open Dataset Program. We also discuss the challenges and opportunities of a cloud-hosted archive.
APA, Harvard, Vancouver, ISO, and other styles
27

Kona, Sree Sandhya. "Optimizing Data Ingestion in the Cloud: Leveraging AWS Technologies like AWS S3, EMR, and Glue for Cost Efficiency and Operational Scalability." Journal of Artificial Intelligence & Cloud Computing 3, no. 2 (2024): 1–5. http://dx.doi.org/10.47363/jaicc/2024(3)308.

Full text
Abstract:
In the rapidly evolving landscape of big data, the architecture of data ingestion processes significantly impacts the operational costs and efficiency of cloud environments. This paper delves into the architectural patterns and strategies that facilitate cost-effective data ingestion, with a particular focus on leveraging Amazon Web Services (AWS) technologies such as S3, EMR, and Glue. As organizations increasingly migrate to cloud-based solutions to manage voluminous data, optimizing the cost of data ingestion becomes crucial to maintain competitiveness and operational efficiency.
APA, Harvard, Vancouver, ISO, and other styles
28

R, DHANALAKSMI. "AWS Powered Sentiment Analysis." INTERNATIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 09, no. 02 (2025): 1–9. https://doi.org/10.55041/ijsrem41727.

Full text
Abstract:
The paper presents the results of a study on sentiment analysis: extracting meaningful insights from vast amounts of unstructured text has become critical for businesses and organizations. Sentiment analysis, a Natural Language Processing (NLP) technique, is used to determine whether a piece of text conveys positive, negative, or neutral sentiment. By leveraging Amazon Web Services (AWS) such as Amazon Comprehend for language processing, AWS Lambda for serverless execution, and Amazon S3 for data storage, the system can process large datasets efficiently. This feedback can guide the development of new products or improvements to existing ones, aligning them more closely with customer preferences. The dataset includes a sample of 10,000 tweets, pre-processed to remove noise and irrelevant content. The results indicate that while deep learning models, particularly Long Short-Term Memory (LSTM) networks, outperform traditional methods in terms of accuracy, challenges remain in dealing with sarcasm and ambiguous language. The findings suggest that sentiment analysis can be a valuable tool for understanding public sentiment in real time, with potential applications in marketing, political analysis, and customer feedback systems. However, improvements in model handling of contextual nuances are needed for more precise sentiment classification. Keywords: Amazon Web Services (AWS), Sentiment Analysis, Hidden emotions
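A hedged sketch of the Comprehend-based step such a pipeline relies on: `detect_sentiment` is the real boto3 call, while the sample text and the offline score vector below are hypothetical.

```python
def dominant_sentiment(scores):
    """Pick the label with the highest confidence, mirroring the
    Sentiment field Comprehend returns alongside SentimentScore."""
    return max(scores, key=scores.get).upper()


# Hypothetical score vector of the shape Comprehend returns.
EXAMPLE_SCORES = {"Positive": 0.93, "Negative": 0.02, "Neutral": 0.04, "Mixed": 0.01}


if __name__ == "__main__":
    # Calling the real service requires boto3 and AWS credentials.
    import boto3

    comprehend = boto3.client("comprehend")
    resp = comprehend.detect_sentiment(
        Text="The support agent resolved my issue quickly!",  # hypothetical tweet
        LanguageCode="en",
    )
    print(resp["Sentiment"], resp["SentimentScore"])
```

In a full pipeline, a Lambda function would run this per tweet and write the labels back to S3 for aggregation.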
APA, Harvard, Vancouver, ISO, and other styles
29

B, Shadaksharappa. "High Availability and Fault Tolerance in AWS." International Journal of Innovative Research in Information Security 09, no. 03 (2023): 66–72. http://dx.doi.org/10.26562/ijiris.2023.v0903.03.

Full text
Abstract:
With the increasing adoption of cloud computing, ensuring high availability and fault tolerance has become paramount for organizations. Amazon Web Services (AWS) offers a robust infrastructure for hosting applications, but it requires careful architectural design and implementation to achieve desired levels of availability and fault tolerance. This research paper explores two innovative concepts, namely Cloud Fractal and Decentralized Replication and Orchestration, and their application in achieving high availability and fault tolerance in AWS. We present a comprehensive analysis of these concepts and provide practical guidelines for their implementation in real-world scenarios. Our findings demonstrate the effectiveness of Cloud Fractal and Decentralized Replication and Orchestration in enhancing the reliability and resilience of AWS deployments. Keywords: Fault Isolation, High Availability, Load Balancing, Auto-scaling, Disaster Recovery, Data Replication, Backup and Restore, Fault Tolerance, Cloud Computing, Amazon Elastic Compute Cloud (EC2), Amazon Relational Database Service (RDS), Amazon Simple Storage Service (S3), AWS Elastic Load Balancer (ELB), Complexity, Operational Management, Cloud Fractal, AWS Lambda, Amazon SQS, Decentralized Replication and Orchestration (DOI).
APA, Harvard, Vancouver, ISO, and other styles
30

J, Ganavi. "Automated Deployment of Data Lake." International Journal for Research in Applied Science and Engineering Technology 9, no. 9 (2021): 326–30. http://dx.doi.org/10.22214/ijraset.2021.37946.

Full text
Abstract:
Abstract: A Data Lake is a central location that can store all your structured and unstructured data, no matter the source or format. Automated deployment for data lake solution is an automated reference implementation that deploys a highly available, cost-effective data lake architecture on the AWS Cloud along with a user-friendly console for searching and requesting datasets. The solution automatically configures the core AWS services necessary to easily tag, search, share, transform, analyse, and govern specific subsets of data across a company or with other external users. The solution deploys a console that users can access to search and browse available datasets for their business needs. Keywords: Data Lake, Cloud Computing, AWS, EC2, S3, Athena, Glue, CloudFormation.
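The Athena piece of such a data-lake stack can be sketched as follows; the database, table, and results-bucket names are hypothetical assumptions, not details from the paper.

```python
def athena_query_params(database, query, output_bucket):
    """Build the keyword arguments for Athena's StartQueryExecution API."""
    return {
        "QueryString": query,
        "QueryExecutionContext": {"Database": database},
        "ResultConfiguration": {
            "OutputLocation": f"s3://{output_bucket}/athena-results/"
        },
    }


if __name__ == "__main__":
    # Requires boto3 and AWS credentials; names below are hypothetical.
    import boto3

    athena = boto3.client("athena")
    params = athena_query_params(
        "datalake_db",                    # hypothetical Glue catalog database
        "SELECT COUNT(*) FROM requests",  # hypothetical table
        "example-query-results",          # hypothetical results bucket
    )
    execution = athena.start_query_execution(**params)
    print(execution["QueryExecutionId"])
```

Athena reads the data in place from S3 via the Glue catalog, which is why such solutions need no dedicated query servers.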
APA, Harvard, Vancouver, ISO, and other styles
31

Korotych, Kyrylo. "Choosing an AWS Data Storage Service for a Mobile E-Commerce Application." Grail of Science, no. 42 (August 10, 2024): 336–41. http://dx.doi.org/10.36074/grail-of-science.02.08.2024.047.

Full text
Abstract:
One of the important components in developing a mobile application is the architectural component responsible for data storage. The aim of the study is to select one of the Amazon services that performs this function. The study used empirical methods, including source analysis and the definition of selection criteria. Based on the data obtained, Amazon S3 was determined to be the best option.
APA, Harvard, Vancouver, ISO, and other styles
32

Sai, Krishna Chirumamilla. "AWS Lambda Deep Dive: Writing Event-Driven, Scalable, and Cost-Effective Code without Managing Servers." International Journal of Innovative Research in Engineering & Multidisciplinary Physical Sciences 10, no. 5 (2022): 1–18. https://doi.org/10.5281/zenodo.15107569.

Full text
Abstract:
Serverless computing is one of the most disruptive models of application development and deployment. AWS Lambda is one of the strongest platforms for developers building applications without managing server hosting. This article explains the event-driven features, scalability, and cost-effectiveness of AWS Lambda. We look at how Lambda functions behave when invoked in response to API Gateway events, S3 events, or a DynamoDB stream. Following a clearly described approach, we present an example of a serverless application and evaluate it in terms of performance and cost. The results establish the benefits of using AWS Lambda for new-generation application development despite drawbacks such as the cold start issue and monitoring. This paper serves as a theoretical and practical guide to AWS Lambda, focused on helping developers and organizations capitalize on the opportunity this solution offers in constructing scalable apps.
APA, Harvard, Vancouver, ISO, and other styles
33

Krasko, Bohdan, and Petro Hrytsiuk. "Features of the Amazon EC2 Architecture for Scaling Computing Resources." Modeling, Control and Information Technologies, no. 7 (December 7, 2024): 142–43. https://doi.org/10.31713/mcit.2024.040.

Full text
Abstract:
The architecture features of Amazon EC2 for scaling computing resources reflect key capabilities and strategies that enable dynamic resource management to ensure the performance and efficiency of cloud services. Amazon Elastic Compute Cloud allows the creation of scalable virtual servers, which can be configured according to user needs through features such as auto-scaling, support for different instance types, integration with other AWS services like S3, RDS, and Lambda, as well as the use of containerization and elastic load balancing. As a result, these capabilities minimize infrastructure costs, ensure disaster recovery, and automatically scale applications based on demand.
APA, Harvard, Vancouver, ISO, and other styles
34

AMIRTHARAJ, SELVAMOHAN, and NACHIAPPAN RATHINA PRABHA. "CLOUD SERVICE AND SCADA-BASED WEB APPLICATION FOR MONITORING RENEWABLE ENERGY SYSTEMS." REVUE ROUMAINE DES SCIENCES TECHNIQUES — SÉRIE ÉLECTROTECHNIQUE ET ÉNERGÉTIQUE 70, no. 1 (2025): 57–62. https://doi.org/10.59277/rrst-ee.2025.1.10.

Full text
Abstract:
A renewable energy monitoring web application to track, monitor, and analyze the performance of renewable energy systems such as wind is presented in this paper. The main goal of this application is to provide users with real-time information on the energy production and consumption of their renewable energy systems. The application is meant to provide meaningful and easy-to-understand data visualization to users. This requires the application to have appropriate graphs, charts, and tables to help users quickly analyze the data. The application uses supervisory control and data acquisition (SCADA) for data collection and system monitoring. Amazon Simple Storage Service (S3), offered by Amazon Web Services (AWS), is used for data storage, as the application must handle large amounts of data and users. With the increase in the number of users and data points, AWS S3 helps to scale up without any performance degradation. The proposed renewable energy management system aims to develop a renewable energy monitoring and analytics web application with a user-friendly dashboard.
APA, Harvard, Vancouver, ISO, and other styles
35

Vijaya Kumar Katta. "Leveraging AWS cloud native services for scalable application architectures." World Journal of Advanced Research and Reviews 26, no. 2 (2025): 2108–20. https://doi.org/10.30574/wjarr.2025.26.2.1853.

Full text
Abstract:
AWS cloud-native services enable organizations to build scalable and resilient applications in today's transformed application development landscape. AWS has pioneered technologies that have become cornerstones of modern application architecture, offering comprehensive tools for implementing sophisticated solutions. The document examines serverless computing paradigms through AWS Lambda and API Gateway, highlighting their evolution, features, and best practices for implementation. It delves into container orchestration with Amazon ECS and EKS, comparing their capabilities and introducing Fargate as a serverless container execution option. Purpose-built database services including DynamoDB, Aurora Serverless, and ElastiCache are discussed alongside storage solutions like S3, EFS, and FSx, with emphasis on appropriate data access patterns and optimization techniques. Infrastructure automation through CloudFormation and CDK is explored, alongside continuous integration and deployment pipelines that form the foundation of modern software development practices. The examination of observability and monitoring tools essential for operating cloud-native systems effectively provides a comprehensive guide to leveraging AWS services for scalable application architectures.
APA, Harvard, Vancouver, ISO, and other styles
36

Trudy-Ann Campbell, Samson Eromonsei, and Olusegun Afolabi. "Automated API framework tools for evaluating cloud resources (IAM, S3, KMS) for compliance with ISO 27001 case study AWS." Global Journal of Engineering and Technology Advances 20, no. 1 (2024): 131–49. http://dx.doi.org/10.30574/gjeta.2024.20.1.0126.

Full text
Abstract:
Cloud computing's advancements have provided scalability and adaptability but have also given rise to data security concerns. ISO 27001 is vital for cloud information security, yet compliance in dynamic settings poses challenges. Automated API framework tools automate ISO 27001 compliance checks for IAM, S3, and KMS services in AWS, boosting efficiency and minimizing errors. This study investigates the effectiveness of these frameworks, focusing on AWS environments. It explores advantages, difficulties, and practical considerations of automation in cloud compliance. Insights aim to enhance understanding of how automation reinforces security and regulatory adherence. Previous studies highlight the need for adaptable monitoring solutions in cloud setups. Recent research demonstrates the potential of programming languages like Python to streamline compliance processes effectively. This study contributes by examining the efficiency of automated compliance frameworks in AWS, offering perspectives on their practical application in cloud settings.
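One of the automated checks such a framework might run can be sketched as follows, under the simplifying assumption that a bucket passes only when every S3 public-access-block flag is enabled; this is a stand-in for a single control, not a full ISO 27001 control set.

```python
REQUIRED_FLAGS = (
    "BlockPublicAcls",
    "IgnorePublicAcls",
    "BlockPublicPolicy",
    "RestrictPublicBuckets",
)


def bucket_is_compliant(public_access_config):
    """Example control: every public-access-block flag must be True."""
    return all(public_access_config.get(flag) is True for flag in REQUIRED_FLAGS)


if __name__ == "__main__":
    # Requires boto3 and AWS credentials. Buckets with no
    # public-access-block configuration raise an error; we count them as FAIL.
    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")
    for bucket in s3.list_buckets()["Buckets"]:
        try:
            cfg = s3.get_public_access_block(Bucket=bucket["Name"])
            ok = bucket_is_compliant(cfg["PublicAccessBlockConfiguration"])
        except ClientError:
            ok = False
        print(bucket["Name"], "PASS" if ok else "FAIL")
```

Equivalent checks for IAM and KMS would iterate policies and key rotation settings in the same loop-and-evaluate style.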
APA, Harvard, Vancouver, ISO, and other styles
37

Trudy-Ann, Campbell, Eromonsei Samson, and Afolabi Olusegun. "Automated API framework tools for evaluating cloud resources (IAM, S3, KMS) for compliance with ISO 27001 case study AWS." Global Journal of Engineering and Technology Advances 20, no. 1 (2024): 131–49. https://doi.org/10.5281/zenodo.13694364.

Full text
Abstract:
Cloud computing's advancements have provided scalability and adaptability but have also given rise to data security concerns. ISO 27001 is vital for cloud information security, yet compliance in dynamic settings poses challenges. Automated API framework tools automate ISO 27001 compliance checks for IAM, S3, and KMS services in AWS, boosting efficiency and minimizing errors. This study investigates the effectiveness of these frameworks, focusing on AWS environments. It explores advantages, difficulties, and practical considerations of automation in cloud compliance. Insights aim to enhance understanding of how automation reinforces security and regulatory adherence. Previous studies highlight the need for adaptable monitoring solutions in cloud setups. Recent research demonstrates the potential of programming languages like Python to streamline compliance processes effectively. This study contributes by examining the efficiency of automated compliance frameworks in AWS, offering perspectives on their practical application in cloud settings.
APA, Harvard, Vancouver, ISO, and other styles
38

Gupta, Himanshu. "Cost-Effective Large Data Batch Processing for Call Center Transcripts Using AWS Lambda Functions." International Journal for Research in Applied Science and Engineering Technology 12, no. 9 (2024): 102–4. http://dx.doi.org/10.22214/ijraset.2024.64137.

Full text
Abstract:
As enterprises increasingly rely on cloud services for scalable data processing, optimizing cost and efficiency in handling large datasets has become a priority. This paper explores the use of AWS Lambda for large-scale batch processing of call center transcripts, where data is stored in partitioned S3 buckets. We design a fault-tolerant and cost-effective architecture that leverages Lambda functions to process these datasets during off-peak hours, taking advantage of AWS’s pay-as-you-go pricing model. Our approach includes a retry logic for handling failures, ensuring the robustness of the system. The processed data, comprising AI-generated call transcripts, is saved back to S3. Through extensive experimentation, we demonstrate the efficiency of our method in terms of both cost and performance, making it a viable solution for large-scale data processing tasks in cloud environments.
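The retry logic such a batch architecture depends on can be sketched with exponential backoff; the schedule and cap below are illustrative assumptions, not the paper's parameters.

```python
import time


def backoff_delays(max_retries, base=1.0, cap=30.0):
    """Exponential backoff schedule: base, 2*base, 4*base, ... capped at `cap`."""
    return [min(base * 2**attempt, cap) for attempt in range(max_retries)]


def process_with_retry(process, item, max_retries=4, base=1.0):
    """Run `process(item)`, retrying transient failures with backoff.

    Re-raises the last error once retries are exhausted, so a failed
    transcript batch can be routed to a dead-letter location.
    """
    delays = backoff_delays(max_retries, base)
    for attempt, delay in enumerate(delays):
        try:
            return process(item)
        except Exception:
            if attempt == len(delays) - 1:
                raise
            time.sleep(delay)
```

Inside a Lambda function, `process` would read one partitioned S3 object, transform it, and write the result back; the backoff keeps transient S3 throttling from failing the whole batch.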
APA, Harvard, Vancouver, ISO, and other styles
39

G., Satyanarayana, and Bhuvana J. Dr. "Speaker Identification of Customer and Agent using AWS." International Journal of Trend in Scientific Research and Development 4, no. 3 (2020): 997–99. https://doi.org/10.5281/zenodo.3892828.

Full text
Abstract:
As is well known, sentiment analysis plays an important role these days because many startups are built on user-driven content. Detecting the voice alone does not cover the real-time scenario, so analyzing the sentiment of the agent and the customer separately is an important research area in natural language processing. Natural language processing has a wide range of applications, such as voice recognition, machine translation, product review, aspect-oriented product analysis, sentiment analysis, and text classification. This process improves the business by analyzing the emotions of a conversation with respect to the customer's voice and the agent's voice separately. In this project, the authors perform speaker identification and analyze the sentiment of the customer and agent separately using Amazon Comprehend, a natural language processing (NLP) service that uses machine learning to extract the content of the voice. Using speaker identification, the authors can extract unstructured data such as images and voice separately, making it easy to analyze business performance. The system thus identifies the emotions of the conversation and reports whether it is Positive, Negative, Neutral, or Mixed. To do this, the authors use several AWS services because of advantages such as easy resource scaling compared with running the process on their own hardware, for example with a support vector machine (SVM).
The AWS services used include S3, an object data store; Transcribe, which converts audio to raw text; AWS Glue, an ETL service that extracts, transforms, and loads the data from S3; AWS Comprehend, an NLP service used for finding the sentiment of the audio; Lambda, a serverless service where the code runs; AWS Athena, an analysis tool that runs complex queries in less time; and finally QuickSight, a business intelligence tool for visualizing customer and agent data. Published in International Journal of Trend in Scientific Research and Development (IJTSRD), ISSN: 2456-6470, Volume 4, Issue 3, April 2020. URL: https://www.ijtsrd.com/papers/ijtsrd30753.pdf
APA, Harvard, Vancouver, ISO, and other styles
40

Vegesna, Rohith Varma. "Designing Software for Real-Time Pump Dispenser Data Streaming Using AWS Kinesis." International Journal of Multidisciplinary Research and Growth Evaluation 1, no. 1 (2020): 103–5. https://doi.org/10.54660/.ijmrge.2020.1.1.103-105.

Full text
Abstract:
Real-time data streaming is a transformative technology for fuel stations, enabling enhanced monitoring, fraud detection, and operational efficiency. This paper explores the architecture, implementation, and optimization of a robust software system using AWS Kinesis for real-time streaming of fuel dispenser and Automatic Tank Gauge (ATG) data. Multiple Kinesis streams are used to distribute data effectively, with streams dedicated to dispenser transactions, ATG data, and ATG alerts. The system ensures scalability, low latency, and data redundancy by leveraging AWS S3 for backup. Through detailed analysis and a pilot implementation, this paper highlights the significant improvements in efficiency and data reliability achieved through this approach, along with recommendations for future enhancements.
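The per-stream routing described above can be sketched as a Kinesis producer; the stream name and transaction fields are hypothetical, while `put_record` with `StreamName`, `Data`, and `PartitionKey` is the real boto3 call.

```python
import json


def build_dispenser_record(stream, transaction):
    """Build kwargs for Kinesis PutRecord. Partitioning by pump ID
    keeps each dispenser's events ordered within a single shard."""
    return {
        "StreamName": stream,
        "Data": json.dumps(transaction).encode("utf-8"),
        "PartitionKey": str(transaction["pump_id"]),
    }


if __name__ == "__main__":
    # Requires boto3 and AWS credentials; names below are hypothetical.
    import boto3

    kinesis = boto3.client("kinesis")
    record = build_dispenser_record(
        "dispenser-transactions",  # hypothetical stream name
        {"pump_id": 7, "litres": 42.5, "grade": "diesel"},
    )
    resp = kinesis.put_record(**record)
    print(resp["SequenceNumber"])
```

ATG data and ATG alerts would use the same producer against their own dedicated streams, with a consumer archiving every record to S3 for redundancy.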
APA, Harvard, Vancouver, ISO, and other styles
41

Abbas, Amel H., Mustafa Mohammed Khudhair, Mustafa A. Aziz, Nadia Hussien, and Yasmin Makki Mohialden. "Novel Framework for Enhancing E-Governance Systems Using Cloud Computing and Data Management Techniques." International Journal Papier Advance and Scientific Review 5, no. 2 (2024): 7–20. http://dx.doi.org/10.47667/ijpasr.v5i2.322.

Full text
Abstract:
This study proposes a novel way for electronic government organizations to create and deploy computer systems. The recommended solution uses Amazon Web Services (AWS) S3 and Python's Pandas module to handle and analyze citizen data in a secure, scalable cloud. Data management, security, and flexibility are major issues for e-government applications. The study shows that cloud computing can increase the reliability, security, and efficiency of digital governance, offering new alternatives to conventional paradigms. What makes this research novel is its use of cloud computing approaches to overcome traditional network restrictions and create a more flexible and efficient digital framework. According to the study, AWS S3 and Pandas can handle enormous datasets, improve data security, and streamline public service delivery. This strategy promotes service delivery and citizen interaction while improving the technological competence of e-governance. The paper argues that cloud computing's scalability, efficiency, and security make it revolutionary in public administration. It gives governments an organizational structure for adopting cloud-based applications, showing how technological improvements can make government more efficient and effective, to the benefit of residents and the public, and it proposes a new cloud-based computing architecture that enables governments to build more durable and adaptive electronic systems.
APA, Harvard, Vancouver, ISO, and other styles
42

Behlitsov, S. "Automation of Software Migration to a Cloud Architecture Using the Terraform Infrastructure-as-Code Tool in the AWS Environment." COMPUTER-INTEGRATED TECHNOLOGIES: EDUCATION, SCIENCE, PRODUCTION, no. 56 (September 28, 2024): 99–106. http://dx.doi.org/10.36910/6775-2524-0560-2024-56-12.

Full text
Abstract:
This article examines a comprehensive approach to automating infrastructure provisioning using Terraform and AWS. The primary goal of the study was to streamline the deployment and management of cloud resources through Infrastructure-as-Code (IaC) practices. The article begins by explaining the fundamental principles of Infrastructure as Code, emphasizing its advantages in terms of consistency, repeatability, and collaboration. Central to this methodology is Terraform, an open-source tool designed for declarative configuration management. Terraform's ability to define infrastructure components in simple, human-readable configuration files provides a powerful mechanism for provisioning and managing cloud resources across different providers. A key point of the implementation is the use of AWS as the cloud provider. AWS offers a wide range of services, from compute instances (EC2) to managed databases (RDS) and serverless computing (Lambda). The article examines specific use cases in which Terraform automates the provisioning of AWS resources, such as EC2 instances for scalable web applications, S3 buckets for object storage, and VPC configurations for network isolation. An important aspect discussed in the article is the management of Terraform state, which tracks the current state of deployed resources and facilitates collaboration among team members. Strategies for handling state files, including remote backend configurations using Amazon S3 with locking, are explored to ensure consistency and avoid conflicts during concurrent deployments. In addition, the article covers Terraform modules, which encapsulate reusable configurations for different infrastructure components, promoting code reuse and maintainability. With modular configuration, teams can standardize best practices and enforce policies across environments, from development to production.
Integrating Terraform with CI/CD pipelines automates the testing and deployment of infrastructure changes, promoting agility and reducing manual intervention. This automation not only accelerates the delivery of infrastructure updates but also improves overall system reliability through automated testing and rollback mechanisms.
APA, Harvard, Vancouver, ISO, and other styles
43

Kewate, Neha. "A Review on AWS - Cloud Computing Technology." International Journal for Research in Applied Science and Engineering Technology 10, no. 1 (2022): 258–63. http://dx.doi.org/10.22214/ijraset.2022.39802.

Full text
Abstract:
Abstract: Cloud computing can be simply defined as maintaining data centers and data servers while accessing technology services such as computing power, storage, and databases through a provider like AWS (Amazon Web Services). It is an established model already popular among almost all enterprises, built on the concept of on-demand services, where cloud resources are used and scaled as demand requires. AWS cloud computing is a cost-effective model. The major concerns in this model are security and storage in the cloud, which is one of the main reasons many enterprises choose AWS. This paper reviews security research in the field of cloud security and the storage services of the AWS cloud platform, and then presents the working of AWS (Amazon Web Services) cloud computing. AWS is a widely trusted cloud provider that offers both strong cloud security and extensive cloud storage services. The main aim of this paper is to make cloud storage and security a core operation rather than an add-on. As service providers and related companies multiply, the AWS cloud platform plays a vital role in service industries through its web services, so choosing a cloud service provider wisely is a basic need of the industry; we therefore examine how AWS fulfills these specific needs. Keywords: Trusted Computing, AWS, Information-Centric Security, Cloud Storage, S3, EC2, Cloud Computing
APA, Harvard, Vancouver, ISO, and other styles
44

Han, Yan. "Cloud Computing: Case Studies and Total Cost of Ownership." Information Technology and Libraries 30, no. 4 (2011): 198. http://dx.doi.org/10.6017/ital.v30i4.1871.

Full text
Abstract:
This paper consists of four major sections: The first section is a literature review of cloud computing and a cost model. The next section focuses on detailed overviews of cloud computing and its levels of services: SaaS, PaaS, and IaaS. Major cloud computing providers are introduced, including Amazon Web Services (AWS), Microsoft Azure, and Google App Engine. Finally, case studies of implementing web applications on IaaS and PaaS using AWS, Linode and Google AppEngine are demonstrated. Justifications of running on an IaaS provider (AWS) and running on a PaaS provider (Google AppEngine) are described. The last section discusses costs and technology analysis comparing cloud computing with local managed storage and servers. The total cost of ownership (TCO) of an AWS small instance is significantly lower, but the TCO of a typical 10 TB space in Amazon S3 is significantly higher. Since Amazon offers lower storage pricing for huge amounts of data, the TCO might be lower. Readers should do their own analysis on the TCOs.
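The storage-cost comparison lends itself to simple arithmetic. The sketch below uses an assumed Standard-tier list price of $0.023 per GB-month, which is not a figure from the paper (whose 2011 prices differ), to show the shape of such a TCO calculation.

```python
def s3_monthly_cost(tb, price_per_gb=0.023):
    """Rough S3 storage bill: terabytes * 1024 GB/TB * price per GB-month.

    The default $0.023/GB-month is an assumed Standard-tier list price,
    not a quoted figure; tiered and archival pricing would lower it.
    """
    return tb * 1024 * price_per_gb


def three_year_tco(monthly, upfront=0.0):
    """Simple 3-year total cost of ownership: upfront plus 36 monthly bills."""
    return upfront + 36 * monthly
```

At these assumed rates, 10 TB costs about $235.52 per month, or roughly $8,479 over three years, before comparing against hardware, power, and administration costs of local storage.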
APA, Harvard, Vancouver, ISO, and other styles
45

Ganachari, Girish. "Cost - Effective Big Data Storage and Processing: Analysing Tradeoffs between S3, RDS, and DYNAMODB in AWS." International Journal of Science and Research (IJSR) 12, no. 11 (2023): 2230–33. http://dx.doi.org/10.21275/sr24827095532.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Lukita, Dita, and Fandy Setyo Utomo. "Sistem Informasi Pengolahan Data Nilai Siswa Menggunakan AWS Berbasis WEB." Journal of Informatics and Interactive Technology 1, no. 3 (2024): 206–14. https://doi.org/10.63547/jiite.v1i3.24.

Full text
Abstract:
The Web-Based Student Grade Data Processing Information System is designed to provide grade reports and student information online using cloud computing technology from Amazon Web Services (AWS). The system addresses limited data access, ineffective grade processing, and the long time needed to obtain evaluation results. System development follows the Waterfall model of the Software Development Life Cycle (SDLC), covering requirements analysis, design with UML, implementation using PHP and CodeIgniter, and black-box testing. AWS services such as Route 53, S3, Lambda, and EC2 are used to ensure efficiency and high availability. The system interface includes a home page, a login page, and admin and student menus, designed for easy user navigation. The implementation results show that the system is effective in improving cost efficiency and the speed of student grade data processing. With an organized development structure, the system is expected to support better decision-making in schools.
APA, Harvard, Vancouver, ISO, and other styles
47

Varun, Garg. "Real-Time Fault Tolerance Mechanisms in Communication Platforms Using AWS Services." International Journal of Innovative Research in Engineering & Multidisciplinary Physical Sciences 7, no. 6 (2019): 1–5. https://doi.org/10.5281/zenodo.14508369.

Full text
Abstract:
In real-time communication systems, which must maintain minimal latency and high availability to preserve a seamless user experience, fault tolerance is essentially indispensable. On such platforms, system failures, such as network interruptions or service unavailability, can greatly affect user satisfaction and disrupt communication. This paper explores fault tolerance methods enabled on real-time communication platforms with AWS capabilities. Through retry policies, multi-region deployments, circuit breakers, and event-driven architectures, AWS services provide robust means of handling failures. Key services, including AWS Lambda, DynamoDB, Amazon S3, and CloudWatch, help achieve fault isolation, failover recovery, and system resilience. Focusing on cost-performance trade-offs, the study of these approaches shows how effectively they reduce downtime, maintain performance, and enable scalability. This work illustrates how cloud-native technologies can improve fault tolerance in real-time systems, providing a paradigm for developing robust communication platforms.
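The retry policy the abstract mentions is commonly implemented as exponential backoff with jitter. The sketch below is a generic, library-free illustration of that pattern; in a real AWS deployment the SDK (e.g. boto3) ships its own configurable retry modes, so this helper is an assumption for demonstration only.

```python
import random
import time

def with_retries(fn, max_attempts=4, base_delay=0.05):
    """Call fn, retrying on failure with exponential backoff and jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: surface the last failure
            # Full jitter: sleep a random fraction of the capped backoff.
            time.sleep(random.uniform(0, base_delay * 2 ** attempt))

# Simulated flaky remote call: fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(with_retries(flaky))  # prints "ok" after two retried failures
```

The jitter spreads retries out in time so that many clients recovering from the same outage do not hammer the service in lockstep.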
APA, Harvard, Vancouver, ISO, and other styles
48

Tiwari, Rajeev, Shuchi Upadhyay, Gunjan Lal, and Varun Tanwar. "Project Workflow Management: A Cloud based Solution-Scrum Console." International Journal of Engineering & Technology 7, no. 4 (2018): 2457. http://dx.doi.org/10.14419/ijet.v7i4.15799.

Full text
Abstract:
Today, there is a data workload that needs to be managed efficiently. There are many ways to manage and schedule processes, which can impact the performance and quality of the product, and highly available, scalable web hosting can be a complex and expensive proposition. Traditional web architectures don't offer such reliability. In this work, a Scrum Console is designed for managing a process; it is hosted on Amazon Web Services (AWS) [2], which provides a reliable, scalable, highly available, and high-performance web application infrastructure. The Scrum Console platform facilitates collaboration among team members to manage projects together. It has been developed using JSP, Hibernate & Oracle 12c Enterprise Edition Database, and is deployed as a web application on AWS Elastic Beanstalk, which automates the deployment, management, and monitoring of the application while relying on underlying AWS resources such as EC2, S3, RDS, CloudWatch, autoscaling, etc.
APA, Harvard, Vancouver, ISO, and other styles
49

Goel, Abhay, Abhishek Sharma, and Deepak Gupta. "Immigration Control and Management System using Blockchain." Innovative Computing and Communication: An International Journal 1, no. 3 (2020): 19–24. https://doi.org/10.5281/zenodo.4743748.

Full text
Abstract:
In this paper, we propose a system using blockchain technology to create decentralized, secure, and scalable migration records of individuals. We utilize the Ethereum blockchain with Proof of Work as the consensus algorithm. We attempt to mitigate illegal immigration by keeping an immutable and unique record of each individual's migration status and personal information so that their authenticity can be checked. The AWS S3 service is used to store each individual's official documents securely while still maintaining transparency. The proposed system will not only help detect unlawful immigration but will also make it possible to check whether an individual has successfully reached the intended destination. The storage system also keeps every individual's original documents in a cloud-based storage solution.
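A common pattern for combining off-chain S3 storage with on-chain immutability, which a system like the one above could plausibly use (the paper does not spell out its exact scheme), is to store only a cryptographic fingerprint of each document on the blockchain. The function and sample bytes below are hypothetical.

```python
import hashlib

def document_fingerprint(data: bytes) -> str:
    """Return a SHA-256 hex digest of a document.

    The raw document lives in S3; only this fingerprint is recorded
    on-chain, so anyone holding the file can later verify that the
    stored copy was never altered.
    """
    return hashlib.sha256(data).hexdigest()

passport_scan = b"example document bytes"   # hypothetical document content
on_chain_hash = document_fingerprint(passport_scan)

# Later, after downloading the document from S3 again:
assert document_fingerprint(b"example document bytes") == on_chain_hash
print("verified:", on_chain_hash[:16], "...")
```

Because SHA-256 is collision-resistant, a match between the recomputed digest and the on-chain value is strong evidence the S3 copy is the original, without exposing the document itself on a public ledger.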
APA, Harvard, Vancouver, ISO, and other styles
50

Adnan, Mohammed, Roshan Zameer S., Chowdhry Umar, and Ashok Kumar Dr. "Machine Learning-Based System for Weather Prediction and Air Quality Index Estimation." International Journal of Engineering and Management Research 14, no. 2 (2024): 134–42. https://doi.org/10.5281/zenodo.11084950.

Full text
Abstract:
This paper presents a study on "Machine Learning for Weather Prediction and Air Quality Index Estimation," aimed at enhancing weather forecasting and air quality monitoring. Integrating historical weather data with real-time atmospheric measurements from the OpenWeather API, the study utilizes the Random Forest Machine Learning algorithm to construct predictive models. Backend operations are managed by a Django application on AWS EC2, supported by Nginx as a reverse proxy. The frontend, a ReactJS-based web app hosted on AWS S3 and distributed via CloudFront, offers an intuitive interface. Additionally, a dedicated mobile app extends the system's reach, delivering real-time updates on weather conditions and air quality. This comprehensive approach empowers users with precise insights for informed decision-making and environmental awareness.
APA, Harvard, Vancouver, ISO, and other styles