To see the other types of publications on this topic, follow the link: Algoritmo di shor.

Journal articles on the topic 'Algoritmo di shor'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Algoritmo di shor.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

H, Tubagus Rizal Abdul, Dini Destiani Siti Fatimah, and Rinda Cahyana. "Pengembangan Aplikasi Short Message Service Gateway Dengan Fitur Autoreply Short Message Service Untuk Promosi di Air’s Leather." Jurnal Algoritma 10, no. 2 (September 1, 2013): 187–97. http://dx.doi.org/10.33364/algoritma/v.10-2.187.

2

Lamorahan, Christine, Benny Pinontoan, and Nelson Nainggolan. "Data Compression Using Shannon-Fano Algorithm." d'CARTESIAN 2, no. 2 (October 1, 2013): 10. http://dx.doi.org/10.35799/dc.2.2.2013.3207.

Abstract:
Communication systems in information and communication technology are known as data transfer systems. The information received sometimes loses its authenticity because the size of the data to be transferred exceeds the capacity of the medium used. This problem can be reduced by applying compression to shrink the data to a smaller size. This study considers compression of text data using the Shannon-Fano algorithm and shows how effective the algorithm is when compared with the Huffman algorithm. Compressing text data with the Shannon-Fano algorithm produces data that are smaller than the original, and the comparison shows that the Shannon-Fano algorithm is as effective as the Huffman algorithm when every character in the string is repeated, or when the statement is short and only one character in it is repeated; the Shannon-Fano algorithm is more effective than the Huffman algorithm when the statement is long and the text contains a larger and more varied set of characters. Keywords: Data compression, Huffman algorithm, Shannon-Fano algorithm
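As a quick illustration of the splitting rule the abstract compares against Huffman coding, here is a minimal Python sketch of Shannon-Fano coding. It is not code from the paper; the sample string and variable names are invented, and ties in the frequency split are broken arbitrarily.

```python
from collections import Counter

def shannon_fano(symbol_freqs):
    """Assign binary codes by recursively splitting the frequency-sorted
    symbol list into two halves of (nearly) equal total weight, prefixing
    '0' to one half and '1' to the other."""
    if len(symbol_freqs) == 1:
        return {symbol_freqs[0][0]: "0"}  # degenerate single-symbol case

    def split(items):
        total = sum(f for _, f in items)
        running, idx = 0, 0
        for i, (_, f) in enumerate(items[:-1]):
            running += f
            idx = i
            if running >= total / 2:
                break
        return items[:idx + 1], items[idx + 1:]

    def assign(items, prefix, codes):
        if len(items) == 1:
            codes[items[0][0]] = prefix
            return
        left, right = split(items)
        assign(left, prefix + "0", codes)
        assign(right, prefix + "1", codes)

    codes = {}
    assign(sorted(symbol_freqs, key=lambda kv: kv[1], reverse=True), "", codes)
    return codes

text = "compression of repeated characters"          # invented sample text
codes = shannon_fano(list(Counter(text).items()))
encoded = "".join(codes[c] for c in text)
print(f"{len(text) * 8} bits raw -> {len(encoded)} bits encoded")
```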
3

Mulyani, Isnawati, Eri Satria, and Asep Deddy Supriatna. "Pengembangan Short Message Service (SMS) Gateway Layanan Informasi Akademik di SMK YPPT Garut." Jurnal Algoritma 9, no. 2 (September 1, 2012): 389–97. http://dx.doi.org/10.33364/algoritma/v.9-2.389.

4

Margolang, Yustika, Fauriatun Helmiah, and Mardalius Mardalius. "Analisa Algoritma Apriori dengan Association Rule Untuk Rekomendasi Promosi Produk Elektronik Di Toko UD Surya Kisaran." J-Com (Journal of Computer) 1, no. 2 (July 31, 2021): 89–94. http://dx.doi.org/10.33330/j-com.v2i1.1190.

Abstract:
Data mining is a term used to describe the process of examining itemsets in order to find results for each item. The analysis used to determine which electronic products to promote is association-rule mining with the Apriori algorithm; to increase its sales results, Toko UD Surya Elektronik must therefore adopt additional strategies to improve its sales system. One of these is determining which goods should be promoted to consumers. The sales data the shop already owns can be processed with data mining to reveal customer buying patterns, so that large volumes of data are not wasted and can instead benefit the company. In this study, the data are processed with the Apriori algorithm, a data mining method that aims to find association patterns based on consumers' purchasing patterns, so that the items frequently purchased together can be identified. Keywords: Data Mining, Apriori Algorithm, Product Promotion
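To make the support/confidence mechanics behind the abstract concrete, here is a small, self-contained Python sketch of Apriori-style rule mining on a toy basket list. The transactions, item names, and thresholds are invented for illustration and are not data from the study.

```python
from itertools import combinations

# Hypothetical transactions; item names are invented.
transactions = [
    {"tv", "antenna"}, {"tv", "speaker"}, {"tv", "antenna", "speaker"},
    {"fan"}, {"tv", "antenna"},
]
min_support, min_confidence = 0.4, 0.6
n = len(transactions)

def support(itemset):
    # fraction of transactions containing every item of the itemset
    return sum(itemset <= t for t in transactions) / n

# Level 1: frequent single items
items = {i for t in transactions for i in t}
frequent = [frozenset([i]) for i in items if support(frozenset([i])) >= min_support]

# Level 2: candidate pairs built only from frequent items (Apriori pruning)
pairs = {a | b for a, b in combinations(frequent, 2)}
frequent_pairs = [p for p in pairs if support(p) >= min_support]

# Rules A -> B with confidence = support(A ∪ B) / support(A)
for pair in frequent_pairs:
    for a in pair:
        antecedent, consequent = frozenset([a]), pair - {a}
        conf = support(pair) / support(antecedent)
        if conf >= min_confidence:
            print(f"{set(antecedent)} -> {set(consequent)} "
                  f"(support={support(pair):.2f}, confidence={conf:.2f})")
```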
5

Park, Chan Yik. "Damage Index Comparison for a Composite Stiffened Panel Using Lamb Wave." Advanced Materials Research 26-28 (October 2007): 1265–68. http://dx.doi.org/10.4028/www.scientific.net/amr.26-28.1265.

Abstract:
Various damage index (DI) algorithms for detecting changes such as a loosened bolt or delamination development in a composite structure were examined using ultrasonic Lamb waves generated by embedded piezoelectric active sensors. The DI is a single value that is a function of the response signal's attenuation due to any damage or change in the structure. DI algorithms such as active damage interrogation (ADI), time-domain root mean square (RMS), short-time Fourier transform (STFT) and time reversal (TR) were discussed. For experimental validation, a composite stiffened panel was used, and loosened-bolt damage and low-velocity-impact damage were tested. Surface-mounted PZTs (lead zirconate titanate) were used to pitch and catch the Lamb waves, and appropriate ultrasonic guided Lamb waves were selected for the actuators according to the DI algorithms. Each combination of DI algorithm and drive signal showed different characteristics in detecting the damage.
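As a rough illustration of how a single-value damage index can be derived from response-signal attenuation, the following Python sketch computes a time-domain RMS-based DI on synthetic signals. It is an assumption-laden example, not the paper's ADI, STFT, or time-reversal formulations; the signals are invented.

```python
import numpy as np

def rms(x):
    return np.sqrt(np.mean(np.square(x)))

def damage_index(baseline_signal, current_signal):
    """Relative drop in RMS energy of the sensed Lamb-wave response:
    0 for an unchanged structure, approaching 1 as attenuation grows."""
    return 1.0 - rms(current_signal) / rms(baseline_signal)

# Synthetic example: the 'damaged' response is an attenuated copy of the baseline.
t = np.linspace(0, 1e-3, 2000)
baseline = np.sin(2 * np.pi * 100e3 * t) * np.exp(-3e3 * t)
damaged = 0.7 * baseline
print(f"DI = {damage_index(baseline, damaged):.2f}")  # ~0.30
```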
6

Pratama, Loviga Denny, and Wahyu Setyaningrum. "Algoritma berhitung Blijâh pada masyarakat Madura di Kabupaten Probolinggo: Alternatif pendekatan pembelajaran operasi bilangan." Pythagoras: Jurnal Pendidikan Matematika 13, no. 2 (December 14, 2018): 203–13. http://dx.doi.org/10.21831/pg.v13i2.15931.

Abstract:
The aim of this study was to describe the arithmetic algorithms used in buying-and-selling transactions by Blijâh (traditional vegetable sellers) in the Madurese community of Probolinggo Regency, Indonesia. The study is a qualitative case study. Data were gathered through observation and interviews with eight Blijâh living in Paiton district, Probolinggo Regency, Indonesia, and analysed using the Miles & Huberman model, with stages consisting of data reduction, data display, and conclusion drawing. The data obtained describe several arithmetic algorithms covering addition, subtraction, multiplication, and division; this article focuses on the addition and subtraction algorithms. The results show that (1) the addition and subtraction algorithms used by Blijâh differ from those taught at school; (2) in performing addition and subtraction, Blijâh use a different method, rounding the first number up to the next multiple of ten, whereas the methods taught at school are the long- and short-column (stacked) procedures. The results are expected to provide inspiration and an alternative approach for teaching arithmetic operations in class and for developing numerical and mental-calculation skills. Keywords: arithmetic algorithm, Blijâh, number operations, learning approach
7

Samosir, Hernita, Muhammad Amin, and Indra Ramadona Harahap. "Penerapan Data Mining untuk Klasifikasi Produk Merk Bata Menggunakan Algoritma K-Means." JUTSI (Jurnal Teknologi dan Sistem Informasi) 2, no. 1 (June 19, 2021): 161–66. http://dx.doi.org/10.33330/jutsi.v2i1.1163.

Abstract:
Toko Bata Tanjungbalai is a shop engaged in product sales and processes purchase, sales, and transaction data every day. Transaction data record the sales results, which help store management decide which strategies to pursue to increase sales. Consumers transact at the store for their own reasons, notably the completeness and the wide range of models available at Toko Bata Tanjungbalai, and because the store offers comfort, friendliness, and cleanliness. Many types of products are sold at Toko Bata Tanjungbalai, but the store cannot classify which products sell well and which do not. As a result, it frequently runs out of stock of fast-selling products and accumulates slow-selling products in the warehouse. Based on these problems, data mining is needed to classify which products are in demand and which are not. Data mining with the k-means method supports this research, combined with the PHP programming language and a MySQL database. Keywords: Data Mining; Product Classification; K-Means Algorithm.
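For readers unfamiliar with the clustering step, this is a minimal Python sketch of Lloyd's k-means on invented product features (units sold, remaining stock). It only illustrates the general algorithm; it is not the study's PHP/MySQL implementation, and the feature set is an assumption.

```python
import numpy as np

def kmeans(points, k, iters=100, seed=0):
    """Plain Lloyd's algorithm: assign each point to the nearest centroid,
    then move each centroid to the mean of its assigned points."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(points[:, None] - centroids, axis=2), axis=1)
        new_centroids = np.array([
            points[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids

# Hypothetical features per product: [units sold per month, stock remaining]
products = np.array([[120, 5], [95, 8], [15, 60], [10, 75], [110, 4], [8, 90]], dtype=float)
labels, centroids = kmeans(products, k=2)
print(labels)      # e.g. one cluster of fast-selling items, one of slow-selling items
print(centroids)
```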
8

Nurdiansyah, Rudi. "Optimasi Penjadwalan Flow Shop Menggunakan Algoritma Hybrid Differential Evolution." Rekayasa Energi Manufaktur 1, no. 2 (April 11, 2017): 43. http://dx.doi.org/10.21070/r.e.m.v1i2.404.

Abstract:
Production scheduling is an integral part of a manufacturing system. This article addresses the flow-shop scheduling problem with total flow time as the objective function. In scheduling, minimising total flow time leads to stable resource consumption, rapid job turnover, and minimal work-in-process inventory. The flow-shop scheduling problem is a combinatorial optimisation problem that is NP-hard. Metaheuristic algorithms are now widely used to solve combinatorial optimisation problems, including flow-shop scheduling, and the Differential Evolution algorithm is one with good performance. To improve solution quality, the Differential Evolution algorithm is extended with a local search procedure, yielding the Hybrid Differential Evolution algorithm. Its performance is evaluated on flow-shop scheduling instances from the OR-Library and compared with other algorithms. The results show that Hybrid Differential Evolution performs better than the other algorithms.
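The objective function mentioned above can be made concrete with a short Python sketch that evaluates the total flow time of a permutation on a toy flow-shop instance; a Differential Evolution or local-search loop would then compare candidate permutations by this value. The processing-time matrix is invented, and the DE operators themselves are not shown.

```python
import numpy as np

def total_flow_time(processing_times, permutation):
    """Completion-time recursion for a permutation flow shop:
    C[j][m] = max(C[j-1][m], C[j][m-1]) + p[job][m].
    The objective is the sum of each job's completion time on the last machine."""
    n_machines = processing_times.shape[1]
    completion = np.zeros((len(permutation), n_machines))
    for j, job in enumerate(permutation):
        for m in range(n_machines):
            prev_job = completion[j - 1][m] if j > 0 else 0.0
            prev_machine = completion[j][m - 1] if m > 0 else 0.0
            completion[j][m] = max(prev_job, prev_machine) + processing_times[job][m]
    return completion[:, -1].sum()

# Hypothetical 4-job, 3-machine instance (rows = jobs, columns = machines).
p = np.array([[3, 2, 4], [2, 5, 1], [4, 1, 3], [1, 3, 2]])
print(total_flow_time(p, [0, 1, 2, 3]))
print(total_flow_time(p, [3, 1, 0, 2]))  # a search heuristic would compare permutations like this
```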
9

Hoeller, Frank. "LEO: Liquid Exploration Online." International Journal of Robotic Computing 2, no. 1 (April 1, 2020): 58–80. http://dx.doi.org/10.35708/rc1869-126259.

Abstract:
This article introduces a novel approach to the online complete-coverage path planning (CCPP) problem that is specifically tailored to the needs of skid-steer tracked robots. In contrast to most of the current state-of-the-art algorithms for this task, the proposed algorithm reduces the number of turning maneuvers, which are responsible for a large part of the robot's energy consumption. Nevertheless, the approach still keeps the total distance traveled at a competitive level. The algorithm operates on a grid-based environment representation and uses a 3x3 prioritization matrix for local navigation decisions. This matrix prioritizes cardinal directions leading to a preference for straight motions. In case no progress can be achieved based on a local decision, global path planning is used to choose a path to the closest known unvisited cell, thereby guaranteeing completeness of the approach. In an extensive evaluation using simulation experiments, we show that the new algorithm indeed generates competitively short paths with largely reduced turning costs, compared to other state-of-the-art CCPP algorithms. We also illustrate its performance on a real robot.
10

Arfan, Adhib, and Lussiana ETP. "Perbandingan Algoritma Long Short-Term Memory dengan SVR Pada Prediksi Harga Saham di Indonesia." PETIR 13, no. 1 (March 21, 2020): 33–43. http://dx.doi.org/10.33322/petir.v13i1.858.

Abstract:
Many investors still hesitate over the risk of investing, because stock price indices fluctuate over short periods. Many methods have been developed to forecast future stock prices, but they still have limitations, among them handling long-term dependence. The aim of this research is to produce a stock price forecasting model that is more effective and gives accurate results. The stages carried out consist of data collection, data preprocessing, data splitting, LSTM design, LSTM training, and testing. Based on the test results, the LSTM is able to predict stock prices for 2017-2019 with good performance and a relatively small error rate, and in tests against the Support Vector Regression (SVR) method the LSTM achieves a better loss value than the SVR algorithm. The data range used by the LSTM affects training time: the larger the data range, the longer the training time. The data range used by SVR affects the loss value: the larger the data range, the larger the resulting loss. It can therefore be concluded that the LSTM is able to handle long-term dependence and predict stock prices accurately.
11

Rosiana, Mila, Andy Hidayat Jatmika, and Ariyan Zubaidi. "Pencarian Rute yang Handal Berbasis Energi Menggunakan Algoritma EA-SHORT pada Protokol Routing ZRP di Jaringan MANET." Jurnal Teknologi Informasi, Komputer, dan Aplikasinya (JTIKA ) 3, no. 1 (April 7, 2021): 31–42. http://dx.doi.org/10.29303/jtika.v3i1.112.

Abstract:
Mobile Ad-Hoc Network (MANET) is a wireless network formed by a collection of nodes without fixed routers. Each node in the network acts as a router responsible for discovering and maintaining routes between nodes. In this research, the energy-aware concept using the EA-SHORT algorithm is applied within the Zone Routing Protocol (ZRP) framework. EA-SHORT tries to distribute the network load across all nodes by exploiting variations in remaining energy: nodes with sufficient energy are selected to participate in a route, while nodes with low energy are avoided. The performance of ZRP is compared with EA-SHORT ZRP, i.e. ZRP modified with EA-SHORT, measured against the specified parameters. The simulation results show that with 50 nodes throughput increases by 12.374%, and with 100 nodes by 44.597%. Average end-to-end delay with EA-SHORT ZRP decreases by 20.063% for 50 nodes and by 8.375% for 100 nodes. The packet delivery ratio of EA-SHORT ZRP increases by 0.545% for 50 nodes and by 21.301% for 100 nodes.
12

HABIBI, MUHAMMAD NIZAR, DIMAS NUR PRAKOSO, NOVIE AYUB WINDARKO, and ANANG TJAHJONO. "Perbaikan MPPT Incremental Conductance menggunakan ANN pada Berbayang Sebagian dengan Hubungan Paralel." ELKOMIKA: Jurnal Teknik Energi Elektrik, Teknik Telekomunikasi, & Teknik Elektronika 8, no. 3 (August 27, 2020): 546. http://dx.doi.org/10.26760/elkomika.v8i3.546.

Abstract:
The Incremental Conductance (IC) algorithm can be implemented in Maximum Power Point Tracking (MPPT) systems to extract maximum power from solar panels. However, the MPPT IC algorithm cannot work under partial-shading conditions, because these produce more than one power maximum. An Artificial Neural Network (ANN) can identify the characteristic curve under partial shading and determine the true maximum power point. The inputs to the ANN are the short-circuit current and open-circuit voltage of the solar panel, and its output is the duty-cycle value used as the initial tracking position for the MPPT IC. The training data are obtained by manually varying the duty cycle of the MPPT system under various irradiation conditions. The test results show that the proposed algorithm can increase the harvested energy by 5.79%-13.32% compared with ANN-Perturb and Observe and ANN-Incremental Resistance over a duration of 0.6 seconds. Keywords: Maximum Power Point Tracking, Incremental Conductance, Artificial Neural Network, Partial Shading, Parallel Connection
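A bare-bones Python sketch of the Incremental Conductance update rule that the abstract starts from is shown below. The step size, sign convention (a boost-type converter is assumed), and variable names are illustrative assumptions; the ANN that supplies the initial duty cycle is not included.

```python
def incremental_conductance_step(v, i, v_prev, i_prev, duty, step=0.005):
    """One iteration of the classic IC rule: at the maximum power point
    dI/dV equals -I/V; away from it the duty cycle is nudged back toward
    the MPP. Sign convention assumes a boost-type converter, where raising
    the duty cycle lowers the panel operating voltage."""
    dv, di = v - v_prev, i - i_prev
    if dv == 0:
        if di > 0:
            duty -= step          # current rose at constant voltage: raise panel voltage
        elif di < 0:
            duty += step
    else:
        if di / dv > -i / v:      # operating point left of the MPP
            duty -= step          # raise panel voltage
        elif di / dv < -i / v:    # operating point right of the MPP
            duty += step          # lower panel voltage
    return min(max(duty, 0.0), 1.0)  # clamp to a valid duty cycle

# Example: one update from measured samples (values are invented).
duty = incremental_conductance_step(v=17.8, i=5.1, v_prev=18.0, i_prev=5.0, duty=0.42)
print(duty)
```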
13

Harahap, Lukman Adlin, Ridzuan Ikaram Fajri, Mohammad Fadly Syahputra, Romi Fadillah Rahmat, and Erna Budhiarti Nababan. "Identifikasi Penyakit Daun Tanaman Kelapa Sawit dengan Teknologi Image Processing Menggunakan Aplikasi Support Vector Machine." Talenta Conference Series: Agricultural and Natural Resources (ANR) 1, no. 1 (October 16, 2018): 53–59. http://dx.doi.org/10.32734/anr.v1i1.96.

Abstract:
The management of oil palm plantations often encounters obstacles, among them plant pest organisms and especially leaf diseases. An approach to recognising diseases on oil palm leaves was therefore developed to help oil palm farmers determine the type of leaf disease and obtain better yields. Edge detection captures sudden (large) changes in grey-level intensity over a short distance; the Sobel operator is used for identifying patterns such as facial patterns, particularly within edge-detection algorithms. The Support Vector Machine (SVM) is used as the classification method. In this study, the authors apply edge detection, combining the Sobel operator technique to remove noise with the Support Vector Machine method to classify disease data on oil palm leaves.
14

Wibisono, Mohamad Bayu. "PENERAPAN ALGORITMA K-NEAREST NEIGHBOUR UNTUK MENGIDENTIFIKASI JENIS BIJI KOPI ARABIKA DAN ROBUSTA." Informatik : Jurnal Ilmu Komputer 15, no. 2 (January 13, 2020): 91. http://dx.doi.org/10.52958/iftk.v15i2.1426.

Abstract:
The growing demand from coffee lovers affects the need for coffee in society. It is sometimes difficult for a first-time buyer to distinguish robusta beans from arabica beans, and many consumers are unsure about the type of coffee sold by a coffee shop. To address this problem, research is needed to design an application that can help people distinguish robusta coffee from arabica coffee. This research uses Canny edge detection with K-Nearest Neighbour (K-NN) classification. The image processing uses a dataset of 200 coffee bean images. From this research the authors obtained a best accuracy of 75%.
15

Hairani, B. Nurwahyu, Andy Hidayat Jatmika, and Fitri Bimantoro. "Penerapan Algoritma EA-SHORT pada Protokol Routing AOMDV untuk Menemukan Rute yang Handal Berbasis Energi di Jaringan MANET." Jurnal Teknologi Informasi, Komputer, dan Aplikasinya (JTIKA ) 3, no. 1 (April 7, 2021): 13–23. http://dx.doi.org/10.29303/jtika.v3i1.110.

Abstract:
Mobile Ad Hoc Network (MANET) is a stand-alone wireless network, consisting of several nodes that can move in all directions freely. The routing protocol used as the object of this study is Ad Hoc On-Demand Multipath Distance Vector (AOMDV). This research carries out an energy calculation process at a node that can be used as a new energy-based route by utilizing topology changes. Energy efficiency in the routing protocol can be done using the Energy-Aware SHORT (Self Healing and Optimizing Routing Techniques) method. The main purpose of the Energy-Aware SHORT (EA-SHORT) algorithm is to save energy on MANET, by extending the life of nodes and networks by routing packets through nodes that have enough remaining power and avoiding nodes that have low power. AOMDV performance will be compared with AOMDV which has been modified with EA-SHORT measured from the specified parameter values. Analysis results show that the application of the EA-SHORT algorithm in the efficiency of route search succeeded in improving performance. The results of throughput on EA-AOMDV increased by 13.904% for an area of ​​500x500 m2 and 13.905% at 1000x1000 m2. The packet delivery ratio increased by 0.91% and 2.273%. Average end-to-end delay decreased by 20.482% and 18.734%.
16

Syahara, Zahra, Rika Nur Adiha, and Agus Perdana Windarto. "Implementasi Data Mining Algoritma Apriori Pada Sistem Persediaan Bahan Bangunan Di Karang Sari." Brahmana : Jurnal Penerapan Kecerdasan Buatan 2, no. 2 (June 30, 2021): 107–15. http://dx.doi.org/10.30645/brahmana.v2i2.72.

Abstract:
Construction in Indonesia is developing rapidly, and building materials are therefore needed for every construction project. Every building materials store has a transaction system and an inventory system, whether efficient or less so. This research was conducted to help the owner or manager of a building shop determine the combination patterns of stock and purchases of building goods more easily. The Apriori algorithm is used in this study because it is well suited to relating itemset combination patterns, and therefore to determining purchase combination patterns that support a good inventory system. The process carried out yielded a confidence value of 75%; the research was also supported by an application that implements the Apriori algorithm, namely Tanagra.
17

Kusuma Susanto, Evan, and Yosi Kristian. "Pemanfaatan Asynchronous Advantage Actor-Critic Dalam Pembuatan AI Game Bot Pada Game Arcade." Journal of Intelligent System and Computation 1, no. 2 (December 5, 2019): 74–84. http://dx.doi.org/10.52985/insyst.v1i2.82.

Abstract:
Asynchronous Advantage Actor-Critic (A3C) is a deep reinforcement learning algorithm developed by Google DeepMind. The algorithm can be used to build an artificial intelligence architecture that masters various different games through trial and error, learning from the game's screen output and the score obtained from its actions, without human intervention. An A3C network consists of a Convolutional Neural Network (CNN) at the front, a Long Short-Term Memory (LSTM) network in the middle, and an actor-critic network at the back. The CNN summarises the screen image by extracting its important features, the LSTM remembers previous game states, and the actor-critic network determines the best action to take in a given situation. The experimental results show that the method is quite effective and can beat novice players in the five games used as test cases.
18

Wulandari, Yunia Puspita, Andy Hidayat Jatmika, and Fitri Bimantoro. "Meningkatkan Efisiensi Rute Pada Protokol Routing AOMDV Menggunakan Metode PA-SHORT di Jaringan MANET." Jurnal Teknologi Informasi, Komputer, dan Aplikasinya (JTIKA ) 1, no. 1 (March 26, 2019): 77–85. http://dx.doi.org/10.29303/jtika.v1i1.11.

Abstract:
Mobile Ad-Hoc Network (MANET) is a development of the ad-hoc network in which the nodes have dynamic mobility. There are several types of routing protocols in MANET, one of which is AOMDV. Route discovery in the AOMDV routing protocol is done by calculating distance based on the number of hops; as the number of hops increases, it may cause considerable delay and a decrease in throughput. This study compares the performance of the AOMDV routing protocol with the Path Aware-AOMDV (PA-AOMDV) routing protocol, which is obtained by modifying AOMDV with the Path Aware SHORT algorithm. The Path Aware SHORT algorithm is a method for reducing the number of hops: SHORT improves routing optimisation by monitoring routes and optimising routes that have better paths. The performance of both protocols is evaluated on four parameters: throughput, average end-to-end delay, packet delivery ratio, and routing overhead. Results show that throughput increased by 61.84% for 50 nodes and 45.2% for 100 nodes, average end-to-end delay decreased by 0.066% for 50 nodes and 0.12% for 100 nodes, packet delivery ratio increased by 60.87% for 50 nodes and 82.02% for 100 nodes, and routing overhead decreased by 67.07% for 50 nodes and 45.36% for 100 nodes.
19

Oktaviani, Anisa, and Hustinawati. "PREDIKSI RATA-RATA ZAT BERBAHAYA DI DKI JAKARTA BERDASARKAN INDEKS STANDAR PENCEMAR UDARA MENGGUNAKAN METODE LONG SHORT-TERM MEMORY." Jurnal Ilmiah Informatika Komputer 26, no. 1 (2021): 41–55. http://dx.doi.org/10.35760/ik.2021.v26i1.3702.

Abstract:
Indonesia ranked 6th among the 98 most polluted countries in the world in 2019. In that year, the average AQI (Air Quality Index) was 141 and the average PM2.5 concentration was 51.71 μg/m3, five times the World Health Organization (WHO) recommendation. Jakarta is one of the cities contributing to this air pollution. Based on ISPU (Air Pollution Standard Index) data from the air quality monitoring stations (SPKU) of the DKI Jakarta Environmental Agency, Jakarta's air quality in 2019 was very unhealthy. An artificial intelligence model is therefore needed to predict the average level of hazardous substances in the air of DKI Jakarta. One algorithm that can be applied to build a prediction model from time-series data is Long Short-Term Memory (LSTM). The aim of this research is to build a model for predicting the average ISPU in DKI Jakarta using the LSTM method, which is useful for stakeholders in the environmental field, particularly regarding air pollution. The resulting model achieved a MAPE of 12.28%; based on this evaluation, the LSTM model used to predict the average ISPU in DKI Jakarta falls into the accurate category.
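Since the study reports its result as a MAPE of 12.28%, here is a small Python sketch of that evaluation metric applied to invented ISPU values; the numbers are placeholders, not data from the paper.

```python
import numpy as np

def mape(actual, predicted):
    """Mean absolute percentage error, the accuracy metric reported above."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return np.mean(np.abs((actual - predicted) / actual)) * 100

# Hypothetical daily ISPU averages vs. model output, for illustration only.
actual = [80, 95, 110, 120, 105]
predicted = [85, 90, 118, 112, 100]
print(f"MAPE = {mape(actual, predicted):.2f}%")
```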
20

Hermanto, Dedi Tri, Arief Setyanto, and Emha Taufiq Luthfi. "Algoritma LSTM-CNN untuk Binary Klasifikasi dengan Word2vec pada Media Online." Creative Information Technology Journal 8, no. 1 (March 31, 2021): 64. http://dx.doi.org/10.24076/citec.2021v8i1.264.

Abstract:
Online media produce many kinds of news: economics, politics, health, sports, and science. Among these, economics is an interesting topic to discuss, since the economy directly affects citizens, companies, and even traditional markets, depending on the economic conditions of a country. The sentiment contained in the news can influence people's views on an issue or a government policy. Economics is an interesting research topic because it has a direct impact on Indonesian society, yet few studies have applied deep learning methods, namely Long Short-Term Memory and CNN, to sentiment analysis of finance articles in Indonesia. This study aims to classify Indonesian news headlines into positive and negative sentiment using the LSTM, LSTM-CNN, and CNN-LSTM methods. The dataset consists of Indonesian article titles taken from the Detik Finance website. The test results show that the LSTM, LSTM-CNN, and CNN-LSTM methods achieve accuracies of 62%, 65%, and 74%, respectively. Keywords: LSTM, sentiment analysis, CNN
21

Widyawati, Widyawati. "PENERAPAN ALGORITMA ANT COLONY OPTIMIZATION (ACO) PADA JOB SHOP SCHEDULING PROBLEM (JSSP) DI PT. SIEMENS INDONESIA (CILEGON FACTORY)." Jurnal Sistem Informasi dan Informatika (Simika) 1, no. 01 (September 6, 2018): 35–51. http://dx.doi.org/10.47080/simika.v1i01.37.

Abstract:
The fabrication process is often disrupted by non-deterministic jobs, which creates a problem for the Pre-Fabrication department schedule because the manufacture of raw material for non-deterministic jobs must often be given priority. The problem is compounded by the existing system, which is not yet developed enough to produce an optimised rescheduling of the master line (measured by total makespan). Ant Colony Optimization (ACO), in its Ant System (AS) variant, is proposed to solve the Job Shop Scheduling Problem (JSSP) with the objective of proposing the schedule with the shortest makespan. The algorithm was tested by scheduling 5 projects (consisting of 10 parts) as the initial jobs and another 2 projects (consisting of 4 parts) as the non-deterministic jobs. For the initial jobs the makespan was 287 days, and after the arrival of the non-deterministic jobs the makespan was 362 days, compared with the actual manufacturing time of approximately 511 days for the 7 projects (14 parts).
22

Adrianto, Sukri, Nur Khasanah, and Deasy Wahyuni. "Implementasi Data Mining pada Penjualan Kartu Perdana Internet di Purnama Ponsel Menggunakan Metode Algoritma Apriori." JISKA (Jurnal Informatika Sunan Kalijaga) 5, no. 2 (September 10, 2020): 81. http://dx.doi.org/10.14421/jiska.2020.52-03.

Abstract:
Purnama Ponsel is a mobile phone business engaged in selling internet starter packs. Internet starter packs are SIM cards with a limit or quota on internet usage; with such a pack the user can save credit because all activities use the quota, except telephone (voice calls) and Short Message Service (SMS). Several companies compete by issuing internet starter packs that reduce access costs. This competition can be analysed from the internet starter packs that are sold together, which also informs the stock needed for the following month. To make these calculations easier for the shop owner, this research uses data mining with the Apriori algorithm. Using the Apriori algorithm, implemented with the Tanagra application, can help the owner of Purnama Ponsel with these calculations, and is expected to reduce the number of cards that expire and to predict how many cards need to be stocked for the following month.
23

Widodo, Erfan Djati, Denny Irawan, and Rini Puji Astutik. "ANALISIS KOORDINASI PROTEKSI RELAY ARUS LEBIH PADA SISTEM KELISTRIKAN PT. PETROKIMIA GRESIK PABRIK AMUREA 2 BERBASIS ALGORITMA GENETIKA." E-Link: Jurnal Teknik Elektro dan Informatika 16, no. 1 (July 21, 2021): 1. http://dx.doi.org/10.30587/e-link.v16i1.2693.

Abstract:
An electric power system is never free of disturbances, whether short-circuit faults, overloads, or other disturbances. One of the protective devices used to isolate faults, particularly at the PT. Petrokimia Gresik Amurea 2 plant, is the overcurrent relay. For the overcurrent relay protection coordination system, a manual calculation method is compared with a genetic algorithm method in order to optimise the system. Simulation of the coordination using the manual calculation method still shows overlapping between the curves of relay 11 (tap = 1.2; td = 3.2) and relay 4 (tap = 1.16; td = 5.58). Using the genetic algorithm method, no overlapping occurs between the curves of relay 11 (tap = 0.9; td = 3.2) and relay 4 (tap = 1.5; td = 5.7). Based on these results, the overcurrent relay protection coordination system is more optimal with the genetic algorithm than with manual calculation.
24

Rachman, Rizal. "Penjadwalan Produksi Garment Menggunakan Algoritma Heuristic Pour." Jurnal Informatika 5, no. 1 (April 19, 2018): 81–89. http://dx.doi.org/10.31311/ji.v5i1.2743.

Abstract:
Scheduling is the allocation of limited resources to a number of jobs. The scheduling problem arises when resources are limited. The company currently uses a manual scheduling system under which some products are still missed, causing delays in the production process. This rule is often unfavourable for orders that need short processing times, because an order at the back of the queue must wait a long time before being processed, which lengthens the completion time of all orders; the available resources therefore need to be managed efficiently. The scheduling calculation uses the Heuristic Pour algorithm. The research stages consist of data collection, standard time calculation, total processing time calculation per job, scheduling with the company's existing method, and scheduling with the Heuristic Pour method. Scheduling with Heuristic Pour yields savings compared with the company's current method, so it can be used as an alternative method for scheduling the production process at the garment company. Keywords: Production Scheduling, Algorithms, Heuristic Pour.
25

Kurniasari, Iin, Kusrini Kusrini, and Hanif Al Fatta. "Analysis of Public Opinion Sentiment on Instagram regarding Covid-19 with SVM." JTECS : Jurnal Sistem Telekomunikasi Elektronika Sistem Kontrol Power Sistem dan Komputer 1, no. 1 (January 23, 2021): 67. http://dx.doi.org/10.32503/jtecs.v1i1.1416.

Abstract:
Today's technological developments push society to be technologically responsive, especially in the era of the Covid-19 pandemic, which prioritises social distancing. Social media are used as a tool for conveying public opinion to a wide audience. In this research, the authors study public opinion on the social media platform Instagram using a Support Vector Machine. After testing accuracy and precision, the SVM turned out not to be suitable as an algorithm for capturing word order: sentences whose word order is reversed, even though their meanings differ, are treated as having the same meaning by the SVM, as also shown by the low accuracy of 59%. Further work is therefore needed with other algorithms, for example HRRN (Highest Response Ratio Next) or LSTM (Long Short-Term Memory), which take order into account and process items with the highest response ratio. SVM feature extraction was applied with count vector, TF-IDF word level, TF-IDF n-gram level, and TF-IDF char level approaches; in this scenario the highest accuracies were obtained with the count vector and TF-IDF n-gram level feature extraction.
26

Azhar, Naziha, Putra Pandu Adikara, and Sigit Adinugroho. "Analisis Sentimen Ulasan Kedai Kopi Menggunakan Metode Naive Bayes dengan Seleksi Fitur Algoritme Genetika." Jurnal Teknologi Informasi dan Ilmu Komputer 8, no. 3 (June 15, 2021): 609. http://dx.doi.org/10.25126/jtiik.2021834436.

Abstract:
In this era, coffee shops are not only known as places to gather and drink coffee; they have also become comfortable places to study and work. However, not all coffee shops offer the quality customers expect. Coffee shop reviews can help owners find out how customers respond to their products and services. These reviews need to be classified as positive or negative, so sentiment analysis is needed. The study has several steps: pre-processing of the reviews, feature extraction using Bag of Words and Lexicon Based Features, and classification of the reviews using the Naïve Bayes method with a Genetic Algorithm for feature selection. The data consist of 300 reviews, with 210 used for training and 90 for testing. The evaluation of the Naïve Bayes classification with Genetic Algorithm feature selection gives an accuracy of 0.944, precision of 0.945, recall of 0.944, and f-measure of 0.945, using the best Genetic Algorithm parameters: number of generations = 50, population size = 18, crossover rate = 1, and mutation rate = 0.
27

Irsyad, Muhammad, and Teguh Oktiarso. "Penjadwalan Produksi Dengan Algoritma Dannenbring dan Branch and Bound pada Produksi Atap Galvalum Di PT NS Bluescope Lysaght Indonesia." Journal of Integrated System 3, no. 2 (December 28, 2020): 148–60. http://dx.doi.org/10.28932/jis.v3i2.2773.

Abstract:
PT NS Bluescope Lysaght Indonesia faces the problem of delays in completing orders to meet consumer demand. Optimal scheduling of the production sequence is therefore needed to reduce the risk of late completion of ordered products. This research aims to obtain the sequence with the smallest makespan using the Branch and Bound and Dannenbring methods, in order to determine which of the two methods is better and can be applied at PT NS Bluescope Lysaght Indonesia. Processing starts from data that have passed uniformity and adequacy tests, together with performance ratings used to obtain the standard times, which are then scheduled using the Branch and Bound and Dannenbring methods. The scheduling results show that the Branch and Bound method yields a smaller makespan than the Dannenbring method, namely 639,580 seconds with the job sequence 1-4-3-2. This method reduces the makespan by 80,420 seconds, or 11.169%, from the initial condition. The method can therefore be applied by PT NS Bluescope Lysaght Indonesia to schedule production of the Krip-Lok, Trimdek Optima, Spandek, and Smartdek galvalume types, and it can also be applied to schedule production of other product types to reduce production time. Keywords: branch and bound; Dannenbring; flow shop; production scheduling
28

Grandy, T. H., S. A. Greenfield, and I. M. Devonshire. "An evaluation of in vivo voltage-sensitive dyes: pharmacological side effects and signal-to-noise ratios after effective removal of brain-pulsation artifacts." Journal of Neurophysiology 108, no. 11 (December 1, 2012): 2931–45. http://dx.doi.org/10.1152/jn.00512.2011.

Abstract:
In the current study, we investigated pharmacological side effects and signal-to-noise ratios (SNRs) of two commonly used voltage-sensitive dyes (VSDs): the blue dye RH-1691 (1 mg/ml) and the red dye di-4-ANEPPS (0.1 mg/ml), applied in vivo to the rat barrel cortex. Blue dyes are often favored over red dyes in in vivo studies due to their apparent superior SNR, partly because their fluorescence spectrum is farther away from the hemoglobin absorption spectrum, making them less prone to heartbeat-associated brain-pulsation artifacts (BPA). We implemented a previously reported template-based BPA removal algorithm and evaluated its applicability to di-4-ANEPPS before comparing characteristics of the two dyes. Somatosensory-evoked potentials (SEPs) were also recorded. Whereas SEPs recorded before and after application of di-4-ANEPPS failed to exhibit demonstrable differences, RH-1691 caused a significant and prolonged increase in SEP amplitude for several hours. In contrast, neither dye influenced the spontaneous cortical activity as assessed by the spectral content of the EEG. Both dyes turned out to be strikingly similar with respect to changes in fractional fluorescence as a function of SEP response amplitude, as well as regarding shot noise characteristics after removal of the BPA. Thus there is strong evidence that the increased SNR for RH-1691 is a consequence of an artificially increased signal. When applying an appropriate BPA removal algorithm, di-4-ANEPPS has proven to be suitable for single-trial in vivo VSD imaging (VSDI) and produces no detectable neurophysiological changes in the system under investigation. Taken together, our data argue for a careful re-evaluation of pharmacological side effects of RH-1691 and support the applicability of di-4-ANEPPS for stable single-trial in vivo VSDI recordings.
29

Song, Mi, Yanfei Zhong, and Ailong Ma. "Change Detection Based on Multi-Feature Clustering Using Differential Evolution for Landsat Imagery." Remote Sensing 10, no. 10 (October 21, 2018): 1664. http://dx.doi.org/10.3390/rs10101664.

Abstract:
Change detection (CD) of natural land cover is important for environmental protection and to maintain an ecological balance. The Landsat series of satellites provide continuous observation of the Earth’s surface and is sensitive to reflection of water, soil and vegetation. It offers fine spatial resolutions (15–80 m) and short revisit times (16–18 days). Therefore, Landsat imagery is suitable for monitoring natural land cover changes. Clustering-based CD methods using evolutionary algorithms (EAs) can be applied to Landsat images to obtain optimal changed and unchanged clustering centers (clusters) with minimum clustering index. However, they directly analyze difference image (DI), which finds itself subject to interference by Gaussian noise and local brightness distortion in Landsat data, resulting in false alarms in detection results. In order to reduce image interferences and improve CD accuracy, we proposed an unsupervised CD method based on multi-feature clustering using the differential evolution algorithm (M-DECD) for Landsat Imagery. First, according to characteristics of Landsat data, a multi-feature space is constructed with three elements: Wiener de-noising, detail enhancement, and structural similarity. Then, a CD method based on differential evolution (DE) algorithm and fuzzy clustering is proposed to obtain global optimal clusters in the multi-feature space, and generate a binary change map (CM). In addition, the control parameters of the DE algorithm are adjusted to improve the robustness of M-DECD. The experimental results obtained with four Landsat datasets confirm the effectiveness of M-DECD. Compared with the results of conventional methods and the current state-of-the-art methods based on evolutionary clustering, the detection accuracies of the M-DECD on the Mexico dataset and the Sardinia dataset are very close to the best results. The accuracies of the M-DECD in the Alaska dataset and the large Canada dataset increased by about 3.3% and 11.9%, respectively. This indicates that multiple features are suitable for Landsat images and the DE algorithm is effective in searching for an optimal CD result.
30

Alistarh, Dan. "Distributed Computing Column 81." ACM SIGACT News 52, no. 1 (March 16, 2021): 70. http://dx.doi.org/10.1145/3457588.3457599.

Abstract:
Overview. In this edition of the column, we have an exciting contribution from Shir Cohen, Idit Keidar, and Oded Naor (Technion), who provide an in-depth perspective on communication-efficient Byzantine Agreement. With the huge popularity of blockchains, the classic area of Byzantine Agreement has seen a surge of interest, and, given the large-scale and widely-distributed deployments of such mechanisms, communication efficiency is a chief concern. This column provides a gentle introduction to the area, first covering the early work in the 80s. This provides the necessary context for the recent exciting work on communication reduction, from the King & Saia algorithm to Algorand. One very useful feature of this column's contribution is the fact that it provides a unified view of these results, along with the mathematical background to understand and differentiate the underlying results. Many thanks to Shir, Idit and Oded for their contribution!
31

Mao, Jian, Ru-Shan Wu, and Jing-Huai Gao. "Directional illumination analysis using the local exponential frame." GEOPHYSICS 75, no. 4 (July 2010): S163—S174. http://dx.doi.org/10.1190/1.3454361.

Abstract:
We have developed an efficient method of directional illumination analysis in the local angle domain using local exponential frame beamlets. The space-domain wavefields with different shot-receiver geometries are decomposed into the local angle domain by using the local exponential beamlets, which form a tight frame with the redundancy ratio two and are implemented by a linear combination of local cosine and local sine transforms. Because of the fast algorithms of the local cosine/sine transforms, this method is much more efficient than the previously used decomposition methods in directional illumination analysis, such as the local slant-stacking method and the Gabor-Daubechies frame method. The results of directional illumination (DI) maps and the acquisition dip responses (ADR) for the 2D SEG/EAGE salt model and the 45-shot 3D SEG/EAGE model demonstrated the validity and feasibility of our method. Compared with the illumination results using local slant-stacking decomposition, the new method produces illumination maps of similar quality, but it does so a few times faster. Furthermore, because of its high computational efficiency and saving in memory usage, the new method makes the 3D directional illumination analysis readily applicable in the industry.
APA, Harvard, Vancouver, ISO, and other styles
32

Niccolai, Alessandro, and Alfredo Nespoli. "Sun Position Identification in Sky Images for Nowcasting Application." Forecasting 2, no. 4 (November 16, 2020): 488–504. http://dx.doi.org/10.3390/forecast2040026.

Full text
Abstract:
Very-short-term photovoltaic power forecasting, namely nowcasting, is gaining increasing attention to address grid stability issues and to optimize microgrid energy management systems in the presence of a large penetration of renewable energy sources. In order to identify local phenomena such as sharp ramps in photovoltaic production, whole-sky images can be used effectively. The first step in the implementation of new and effective nowcasting algorithms is the identification of Sun positions. In this paper, three different techniques (solar angle-based, image processing-based, and neural network-based techniques) are proposed, described, and compared. These techniques are tested on real images obtained with a camera installed at SolarTechLab at Politecnico di Milano, Milan, Italy. Finally, the three techniques are compared by introducing some performance parameters aimed at evaluating their reliability, accuracy, and computational effort. The neural network-based technique obtains the best performance: in fact, this method is able to accurately identify the Sun position and to estimate it when the Sun is covered by clouds.
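The solar angle-based technique mentioned above can be illustrated with the standard geometric relations for solar elevation and azimuth given date, time, and site coordinates. The sketch below uses a simplified textbook approximation (Cooper-style declination and basic hour-angle relations, no equation of time), not the calibrated procedure used at SolarTechLab; the coordinates in the example are placeholders.

```python
import math
from datetime import datetime, timezone

def solar_position(when_utc, lat_deg, lon_deg):
    """Approximate solar elevation and azimuth (degrees) for a UTC datetime.
    Ignores the equation of time, so expect errors of a degree or two."""
    n = when_utc.timetuple().tm_yday                       # day of year
    decl = 23.45 * math.sin(math.radians(360.0 * (284 + n) / 365.0))
    solar_hour = when_utc.hour + when_utc.minute / 60.0 + lon_deg / 15.0
    hour_angle = 15.0 * (solar_hour - 12.0)                # degrees from solar noon
    lat, d, h = map(math.radians, (lat_deg, decl, hour_angle))
    elevation = math.asin(math.sin(lat) * math.sin(d)
                          + math.cos(lat) * math.cos(d) * math.cos(h))
    azimuth = math.atan2(math.sin(h),
                         math.cos(h) * math.sin(lat) - math.tan(d) * math.cos(lat))
    return math.degrees(elevation), (math.degrees(azimuth) + 180.0) % 360.0

# Example with placeholder coordinates near Milan, Italy
elev, azim = solar_position(datetime(2020, 6, 21, 10, 0, tzinfo=timezone.utc), 45.5, 9.2)
print(f"elevation {elev:.1f} deg, azimuth {azim:.1f} deg")
```

Projecting such an angle pair onto the fisheye camera model then gives the expected Sun pixel in the sky image, which is the quantity the three techniques estimate.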
APA, Harvard, Vancouver, ISO, and other styles
33

Tian, Qiaoping, and Honglei Wang. "Predicting Remaining Useful Life of Rolling Bearings Based on Reliable Degradation Indicator and Temporal Convolution Network with the Quantile Regression." Applied Sciences 11, no. 11 (May 23, 2021): 4773. http://dx.doi.org/10.3390/app11114773.

Full text
Abstract:
High-precision, information-rich predictions of bearing remaining useful life (RUL) can effectively describe the uncertainty of the bearing health state and operating state. To address efficient feature extraction and RUL prediction during the operational degradation of rolling bearings, a new feature vector based on joint time-frequency domain features is obtained, through data reduction and key-feature mining analysis, to describe the bearing degradation process more comprehensively. In order to keep the effective information without increasing the scale of the neural network, a joint feature compression calculation method based on a redefined degradation indicator (DI) is proposed to determine the input data set. By combining a temporal convolution network with the quantile regression (TCNQR) algorithm, probability density forecasting at any time is achieved based on kernel density estimation (KDE) of the conditional distribution of the predicted values. The experimental results show that the proposed method can obtain point prediction results with smaller errors. Compared with the existing quantile regression of the long short-term memory network (LSTMQR), the proposed method can construct more accurate prediction intervals and probability density curves, which can effectively quantify the uncertainty of the bearing running state.
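The quantile-regression part of TCNQR trains the network with the pinball (quantile) loss, so each output learns a different conditional quantile of the RUL, and a KDE over those quantiles yields a density curve. The snippet below is a minimal, framework-agnostic sketch of that loss and of the KDE step; the temporal convolution network itself and the paper's settings are not reproduced, and the numbers are invented.

```python
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Quantile (pinball) loss for quantile level q in (0, 1)."""
    err = y_true - y_pred
    return np.mean(np.maximum(q * err, (q - 1) * err))

def kde_from_quantiles(quantile_preds, grid, bandwidth=1.0):
    """Gaussian KDE over a set of predicted quantiles of one RUL value,
    giving a rough conditional density curve (illustrative only)."""
    q = np.asarray(quantile_preds, dtype=float)[:, None]   # (n_quantiles, 1)
    g = np.asarray(grid, dtype=float)[None, :]             # (1, n_grid)
    k = np.exp(-0.5 * ((g - q) / bandwidth) ** 2) / (bandwidth * np.sqrt(2 * np.pi))
    return k.mean(axis=0)

# Example: median loss (q = 0.5) and a density built from four quantile predictions
y_true = np.array([100.0, 80.0, 60.0])
y_pred = np.array([95.0, 85.0, 55.0])
print(pinball_loss(y_true, y_pred, 0.5))
density = kde_from_quantiles([52.0, 58.0, 61.0, 66.0], np.linspace(40, 80, 81), bandwidth=3.0)
```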
APA, Harvard, Vancouver, ISO, and other styles
34

Ilinskiy, Aleksandr V., Alexey V. Fedorov, Ksenia A. Stepanova, Igor U. Kinzhagulov, and Igor O. Krasnov. "Study of the dynamic hardness of structural metal materials." Industrial laboratory. Diagnostics of materials 86, no. 1 (January 30, 2020): 57–61. http://dx.doi.org/10.26896/1028-6861-2020-86-1-57-61.

Full text
Abstract:
The mechanical properties of structural metallic materials are the most important indicators of their quality. Different methods (i.e., the methods of Shore, Brinell, Rockwell, Leeb, Vickers, the method of instrumental indentation, and others) are currently used for determination of the hardness — one of the most important mechanical characteristics of structural metal materials. Among them is the method of dynamic indentation, first developed at the Institute of Applied Physics of the National Academy of Sciences of Belarus. With the goal of further developing the method of dynamic indentation, we propose procedures aimed at increasing the accuracy of assessing the hardness of structural metallic materials: parameters of the contact interaction of the indenter with the sample material (Brinell hardness values) were measured using a dynamic indentation (DI) device; the values of surface and volumetric dynamic hardness were calculated taking into account the characteristics obtained using a DI device; and a comparative analysis of hardness estimates obtained by the different approaches was carried out. As a result of the comparative analysis of the methods, as well as their experimental testing, it was shown that an increase in the accuracy of hardness assessment can be achieved by using the «energy» approach based on assessing the ratio of the total work to the volume of the recovered indentation upon dynamic indentation of structural metal materials. The use of the «energy» approach yielded a sample standard deviation of the volumetric dynamic hardness values that was significantly lower than the sample standard deviation of the surface dynamic hardness values and of the data of the dynamic indentation device, which directly increases the accuracy of hardness estimation during dynamic indentation of structural metal materials. Proceeding from the «energy» approach, a new algorithm for processing the initial signal is proposed for determining the dynamic hardness with a dynamic indentation device.
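The «energy» approach described above amounts to dividing the total indentation work by the volume of the recovered indentation. As a hedged numerical illustration only (the instrument's actual signal processing is not shown and the numbers below are invented, not measured data), the sketch integrates a force-depth curve to get the work and divides by an imprint volume.

```python
import numpy as np

def volumetric_dynamic_hardness(force_n, depth_m, imprint_volume_m3):
    """Energy-based hardness estimate: total indentation work divided by the
    volume of the recovered indentation (J / m^3 = Pa)."""
    work = np.trapz(force_n, depth_m)        # area under the force-depth curve
    return work / imprint_volume_m3

# Illustrative, roughly linear loading up to ~200 N over 50 micrometres
depth = np.linspace(0.0, 50e-6, 100)
force = 4.0e6 * depth
H_v = volumetric_dynamic_hardness(force, depth, imprint_volume_m3=2.0e-12)
print(f"volumetric dynamic hardness ~ {H_v / 1e9:.2f} GPa")
```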
APA, Harvard, Vancouver, ISO, and other styles
35

Edo, Sri Imelda, and Damianus Dao Samo. "LINTASAN PEMBELAJARAN PECAHAN MENGGUNAKAN MATEMATIKA REALISTIK KONTEKS PERMAINAN TRADISIONAL SIKI DOKA." Mosharafa: Jurnal Pendidikan Matematika 6, no. 3 (August 25, 2018): 311–22. http://dx.doi.org/10.31980/mosharafa.v6i3.320.

Full text
Abstract:
Fractions are one of the more difficult topics in school mathematics. The difficulty is experienced not only by pupils but also by university students and teachers, and it stems largely from a weak grasp of the fraction concept and from unengaging teaching methods: in elementary school, fractions are typically introduced through lectures and routine algorithms, with a short explanation followed by worked examples from the textbook and closing exercises (Edo, I. S., 2016). This method has been handed down from teacher to teacher, and as a result students become bored and lose interest in learning. Elly Risman (2008) states that there are three effective ways to teach children, namely playing, singing, and storytelling, while Realistic Mathematics Education (RME) is a learning approach built on the philosophy that mathematics is a human activity. This study therefore aims to design a learning trajectory for simple fractions using the RME approach with the traditional game Siki Doka (taplak) as the context. The research method used is Design Research, conducted with third-grade students at SDN Angkasa Kupang and SDK Kristen Tunas Bangsa Kupang. The results show that students were very enthusiastic and enjoyed all of the learning activities because they learned while playing, drawing, coloring, cutting, and arranging colorful origami paper. Students not only came to understand the concept of simple fractions, to compare simple fractions, and to solve problems involving simple fractions, but were also already engaged in activities leading to the concepts of fraction addition and multiples of fractions. Keywords: fraction learning, fraction concepts, comparing fractions, fraction learning with the RME approach, fraction learning using a traditional game.
APA, Harvard, Vancouver, ISO, and other styles
36

Bomben, Riccardo, Francesca Maria Rossi, Tiziana D'Agaro, Tamara Bittolo, Filippo Vit, Antonella Zucchetto, Erika Tissino, et al. "Clinical Impact of Clonal and Subclonal TP53 Mutations and Deletions in Chronic Lymphocytic Leukemia: An Italian Multicenter Experience." Blood 134, Supplement_1 (November 13, 2019): 480. http://dx.doi.org/10.1182/blood-2019-124647.

Full text
Abstract:
Background. TP53 mutations (TP53mut) along with 17p13 deletion (del17p) are strong predictors of poor survival and refractoriness to chemo-immunotherapies (CIT) in chronic lymphocytic leukemia (CLL), and their analyses should always be performed before treatment. Studies based upon ultra-deep-NGS have shown that TP53mut can be present at very low levels in CLL cell populations, although their detrimental clinical impact in this setting is still a matter of debate (Rossi, Blood 2014), and ERIC recommendations discourage reporting TP53 mutations if subclonal (Malcikova, Leukemia 2018). Aim. To investigate the presence and clinical relevance of clonal/subclonal TP53 aberrations in a large CLL cohort. Methods. The study includes 1,058 out of 1,613 CLL patients (509 treated with standard CIT) diagnosed between 1991 and 2018, and consecutively referred to a single institution for del17p analyses by FISH (167-kb 17p13 orange probe, MetaSystems), and TP53mut by ultra-deep NGS (MiSeq Illumina; median coverage >2,000X with an amplicon-based strategy covering exons 2-11 using 40ng DNA/test) in CD19-purified (>85% pure) CLL samples, collected before treatment (as per ERIC recommendations). For TP53mut analyses, FASTQ files were aligned to the Hg19 reference with the Burrows-Wheeler Aligner-MEM algorithm, and allele variants called by FreeBayes (Garrison & Marth, arXiv 2012) with non-stringent parameters. To calculate random/systematic errors we generated a specific database with all the variant allele frequencies (VAF) observed in a subset of TP53 wild type (wt) subjects (n=362). TP53mut were accepted if: i) validated by Fisher exact test after Bonferroni correction (p<0.01); ii) with a VAF at least 2.75 standard deviations from the mean of the transformed distributions. The minimal allelic fraction for TP53mut calling was 0.4%. TP53mut cases with less than 2% VAF were tested twice. Outcome was overall survival (OS) from the date of exam. Results. A total of 248 TP53mut (Fig.A) were found in 154 patients (13.5%, Fig.B) with a median mutations/patient of 1.65 (range 1-11). According to the 12.5% VAF cutoff for TP53mut (Nadeu, Blood 2016), 87 cases were clonal (at least one clone > 12.5%) and 67 subclonal (all clones <12.5%; Fig.AB). Clonal and subclonal TP53mut have similar molecular characteristics (Fig.C), supporting the idea of comparable pathogenic effects. In fact, cases bearing clonal and subclonal TP53mut experienced comparable OS, shorter than that of TP53-wt cases (Fig.D). Accordingly, ROC analysis identified a cutoff of VAF >0.4% for the clinical impact of TP53mut (Fig.D), and the c-index of combined clonal/subclonal TP53mut (0.645) was significantly higher than the c-index of clonal TP53mut alone (0.602; P<0.001). Del17p was found in 178 patients (16.8%; Fig.E). In keeping with a ROC analysis (Fig.F, inset), the 64 cases with del17p in >10% of nuclei had significantly shorter OS than cases with del17p in <10% or without del17p (Fig.F). By combining del17p and TP53mut according to ROC cutoffs, 891 cases (84.2%) presented no TP53 aberrations, 13 cases (1.2%) were del17p only, 103 cases (9.7%) were TP53mut only, and 51 cases (4.8%) were del17p/TP53mut. Compared to TP53-wt cases, similarly shorter OS was observed for TP53mut and del17p/TP53mut cases (Fig.G). The same results were obtained in the context of treated patients when OS was computed from the date of treatment (Fig.H).
The "yes/no effect" of TP53 mutations on CLL outcome was verified by dividing our cohort into a training cohort of 630 CLL cases and a validation cohort of 428 CLL patients. These two cohorts presented similar OS and the same proportion of TP53 mutated cases and 17p deletion. ROC analysis on the training cohort, based on TP53 mutation variables, identified >0.5% as the best cutoff. In keeping with this cutoff, TP53 mutated patients experienced a significantly shorter OS than wt patients. Again, cases with clonal and subclonal mutations experienced the same OS (Fig.I). Importantly, the cutoff found in the training cohort was able to reproduce the very same results also in the validation cohort, both in terms of mutation per se and in terms of clonal and subclonal TP53 mutations (Fig.J). Conclusion. i) By applying ERIC recommendations and a rigorous pipeline of analysis, TP53mut impacted on OS also with VAF <1%; ii) del17p was associated with short OS only when detectable in >10% of nuclei. These cutoffs may be employed for the clinical management of CLL patients. Figure Disclosures Di Raimondo: Takeda: Consultancy; Amgen: Consultancy, Honoraria, Research Funding. Rossi: Gilead: Honoraria, Membership on an entity's Board of Directors or advisory committees, Research Funding; Abbvie: Honoraria, Other: Scientific advisory board; Janssen: Honoraria, Other: Scientific advisory board; Roche: Honoraria, Other: Scientific advisory board; Astra Zeneca: Honoraria, Other: Scientific advisory board.
APA, Harvard, Vancouver, ISO, and other styles
37

Yudanar, Arvian Furqon, Sri Hariyati Fitriasih, and Muhammad Hasbi. "Rekomendasi Barang Di Toko Elektrik Menggunakan Algoritma Apriori." Jurnal Teknologi Informasi dan Komunikasi (TIKomSiN) 8, no. 2 (October 20, 2020). http://dx.doi.org/10.30646/tikomsin.v8i2.499.

Full text
Abstract:
Every company or organization that wants to survive needs to determine the right business strategies. Sales of the company's products generate a great deal of data, so it is unfortunate if that data is never analyzed again. The shop offers a wide variety of products, and sometimes the brand influences people to buy a product; to know which products sell best, the relationships between one product and another must be known, and one way to find them is with data mining algorithms. With the apriori algorithm, and with the help of this program, the products that appear together in transactions can be identified. The purpose of the research is to determine product recommendations so that purchases of stock are efficient. The apriori algorithm belongs to the association rules family in data mining. The association analysis phase that has attracted the attention of many researchers aiming to produce efficient algorithms is the analysis of high-frequency patterns (frequent pattern mining). Whether or not an association is important can be identified by two benchmarks, namely support and confidence: support (support value) is the percentage of transactions in the database containing a given combination of items, while confidence (certainty value) measures how strong the relationship is between the items in an association rule. The apriori algorithm can be helpful for the development of marketing strategies. From the validity testing, the data are considered efficient if the minimum support is above 10% and the minimum confidence above 50%; the calculation uses two different minimum support and minimum confidence settings to find the best result. The underlying problem is how to increase sales and discover buyers' interest in the products, and the results are used to decide the layout of products in the shop window as an effort to increase sales in the store. Keywords: Data Mining, Goods Recommendations, Apriori Algorithm
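Support and confidence as defined above can be computed directly from a list of transactions: support(A→B) is the fraction of transactions containing both A and B, and confidence(A→B) is that count divided by the number of transactions containing A. A tiny, hedged sketch with invented example baskets follows; the paper's actual data and RapidMiner workflow are not reproduced.

```python
from itertools import combinations

transactions = [                       # invented example baskets
    {"baby shirt", "socks", "hat"},
    {"baby shirt", "socks"},
    {"socks", "hat"},
    {"baby shirt", "hat"},
]

def support(itemset):
    """Fraction of transactions containing every item of the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """P(consequent | antecedent) estimated from the transactions."""
    return support(antecedent | consequent) / support(antecedent)

# Enumerate simple one-item -> one-item rules above chosen thresholds
items = set().union(*transactions)
for a, b in combinations(sorted(items), 2):
    for x, y in ((a, b), (b, a)):
        s, c = support({x, y}), confidence({x}, {y})
        if s >= 0.15 and c >= 0.30:
            print(f"{x} -> {y}: support={s:.2f}, confidence={c:.2f}")
```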
APA, Harvard, Vancouver, ISO, and other styles
38

Muharni, Yusraini, Ade Irman Saeful M, and Tania Ero Rubyanti. "PENJADWALAN FLOW SHOP MESIN PARALEL MENGGUNAKAN METODE LONGEST PROCESSING TIME DAN CROSS ENTROPY-GENETIC ALGORITHM PADA PEMBUATAN PRODUK STEEL BRIDGE B-60." Jurnal Ilmiah Teknik Industri 7, no. 3 (January 16, 2020). http://dx.doi.org/10.24912/jitiuntar.v7i3.6338.

Full text
Abstract:
The production system of a fabrication company often runs into problems, one of which arises when the company is unable to deliver orders to customers by the agreed date; this can force the company to bear penalty costs. The problem can be addressed if production scheduling is done well: the point of production scheduling is to manage the available resources precisely so that they are used effectively and efficiently. One way to achieve an optimal schedule is to minimize the makespan. The Steel Bridge B-60 product is one type of truss bridge produced by a structural steel fabrication company located in Cilegon. The company runs a make-to-order production system with a flow shop production flow and parallel machines. This study applies a heuristic approach, the LPT (Longest Processing Time) method, and a metaheuristic approach that combines the Cross Entropy method with a Genetic Algorithm, known as CEGA (Cross Entropy Genetic Algorithm). The metaheuristic algorithm was implemented in the Matlab programming language. A comparison of the two methods shows that both proposed methods, LPT and CEGA, outperform the existing method, with an efficiency gain of 7.94%.
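The LPT heuristic mentioned in the abstract assigns jobs in descending order of processing time to whichever parallel machine becomes free first. A minimal sketch of that rule is shown below; the job data are invented placeholders, and the CEGA metaheuristic is not reproduced here.

```python
import heapq

def lpt_schedule(processing_times, n_machines):
    """Longest Processing Time rule for identical parallel machines.
    Returns the per-machine job lists and the resulting makespan."""
    # Sort jobs by processing time, longest first
    jobs = sorted(enumerate(processing_times), key=lambda j: j[1], reverse=True)
    heap = [(0.0, m) for m in range(n_machines)]    # (current load, machine id)
    heapq.heapify(heap)
    assignment = {m: [] for m in range(n_machines)}
    for job_id, p in jobs:
        load, m = heapq.heappop(heap)               # machine that frees up earliest
        assignment[m].append(job_id)
        heapq.heappush(heap, (load + p, m))
    makespan = max(load for load, _ in heap)
    return assignment, makespan

# Placeholder processing times (hours), not data from the paper
assignment, makespan = lpt_schedule([7, 3, 5, 8, 2, 6, 4], n_machines=2)
print(assignment, makespan)
```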
APA, Harvard, Vancouver, ISO, and other styles
39

Gunawan, Chichi Rizka, Ahmad Ihsan, and Munawir Munawir. "Optimasi Penyelesaian Permainan Rubik’s Cube Menggunakan Algoritma IDA* dan Brute Force." Jurnal Infomedia 3, no. 1 (August 8, 2018). http://dx.doi.org/10.30811/jim.v3i1.627.

Full text
Abstract:
Abstract — The Rubik's cube is a challenging puzzle that is popular among young people, and playing it is fun. Besides exercising logical skill, it demands hard work to solve; for some people this kind of puzzle is genuinely difficult, because one has to think over and over in order to match the colours on every side. The Rubik's cube is a 3 x 3 x 3 cube game: the player attempts to solve it by rotating the six differently coloured faces until each of the six sides shows a single colour. Various algorithms can be used to solve the Rubik's cube, and the resulting solutions can be quite short. This paper presents an optimization of Rubik's cube solving using the IDA* algorithm and the Brute Force algorithm. Keywords — IDA* algorithm, Brute Force algorithm, Rubik's Cube
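IDA* (iterative-deepening A*) repeatedly runs a depth-first search with an increasing bound on f = g + h, which keeps memory usage tiny while still being guided by a heuristic. Below is a generic, hedged skeleton of IDA* over an abstract state space; a real Rubik's cube solver would plug in cube-specific move generation and an admissible heuristic (e.g., pattern databases), which are not shown here.

```python
def ida_star(start, is_goal, successors, heuristic):
    """Generic IDA*: successors(state) yields (move, next_state, cost);
    heuristic(state) must never overestimate the remaining cost."""
    bound = heuristic(start)
    path = [start]

    def search(g, bound):
        state = path[-1]
        f = g + heuristic(state)
        if f > bound:
            return f                      # smallest f that exceeded the bound
        if is_goal(state):
            return True
        minimum = float("inf")
        for move, nxt, cost in successors(state):
            if nxt in path:               # avoid trivial cycles
                continue
            path.append(nxt)
            result = search(g + cost, bound)
            if result is True:
                return True
            minimum = min(minimum, result)
            path.pop()
        return minimum

    while True:
        result = search(0, bound)
        if result is True:
            return path                   # states from start to goal
        if result == float("inf"):
            return None                   # no solution below any bound
        bound = result                    # deepen to the next f-bound
```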
APA, Harvard, Vancouver, ISO, and other styles
40

Othman, Zalinda, Khairanum Subari, and Norhashimah Morad. "Job Shop Scheduling with Alternative Machines using Genetic Algorithms." Jurnal Teknologi, February 25, 2012. http://dx.doi.org/10.11113/jt.v41.711.

Full text
Abstract:
Recently, the integration of manufacturing functions has gained interest from a number of researchers, particularly in production planning and scheduling. These two functions play important roles in production, especially in ensuring that the manufacturing resources needed to accomplish production tasks are available. This paper explores the use of Genetic Algorithms (GA) in solving the problem associated with integrated production scheduling. The integrated problem considers alternative routings for the operations of each job during the creation of schedules. With alternative routing there is a choice of machines on which to perform the operations, and these machines may take different amounts of time to process the same operation. By considering alternative routings, the space of possible solutions for the scheduling problem becomes very large. As a robust approach, GA is used to find the most promising solution. The optimization of this problem involves several objectives, namely minimizing makespan, minimizing processing cost, and minimizing the number of rejects; it also takes into account the constraints on operation sequences. The proposed solution was compared with previous approaches, and the numerical simulations showed promising results. Key words: Genetic algorithms, job shop scheduling, alternative machines
APA, Harvard, Vancouver, ISO, and other styles
41

Tusuccess, Harry, and Rosnani Ginting. "Penggunaan Algoritma Simulated Annealing dalam Penyelesaian Keterlambatan Order di PT. HTS." Talenta Conference Series: Energy and Engineering (EE) 2, no. 2 (May 31, 2019). http://dx.doi.org/10.32734/ee.v2i2.444.

Full text
Abstract:
PT. HTS is a company in the plastics manufacturing industry with three types of products: joly, clear plastic cups, and printed plastic cups. Production at this plant follows a make-to-order system, a First Come First Served rule, and a flow shop production flow. The production process often experiences job build-ups and late orders. The research was carried out on the clear plastic cup production unit across six work centers on the production floor of PT. HTS, for the July 2013 production of product types AAA, BBB, CCC, DDD, EEE, and FFF. The scheduling criterion used is job sequencing evaluated by the makespan function. Because the number of machines is limited, many jobs are delayed, so a well-scheduled set of machines is expected to reduce job delays. Machine scheduling is performed with the Simulated Annealing method, sequencing the jobs so that no operation is brought forward at the cost of delaying other operations. Scheduling under the actual conditions yields an overall product completion time of 24,480.71 minutes = 408.01 hours = 17 days, whereas scheduling with the Simulated Annealing algorithm yields an overall completion time of 20,091.71 minutes = 334.99 hours = 13.95 days ≈ 14 days. The Simulated Annealing schedule is chosen because it has the minimum overall product completion time and a makespan reduction of 4,381 minutes, with the job sequence CCC, AAA, DDD, FFF, BBB, EEE.
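Simulated annealing, as used above, perturbs a job sequence (for example by swapping two jobs), always accepts improvements in makespan, and accepts worse sequences with a probability that shrinks as the temperature cools. A minimal, generic sketch for a permutation flow shop follows; the processing-time matrix is an invented placeholder, not PT. HTS data.

```python
import math, random

def flowshop_makespan(seq, p):
    """Makespan of a permutation flow shop; p[j][m] = time of job j on machine m."""
    finish = [0.0] * len(p[0])
    for j in seq:
        for m in range(len(finish)):
            prev = finish[m - 1] if m else 0.0
            finish[m] = max(finish[m], prev) + p[j][m]
    return finish[-1]

def simulated_annealing(p, t0=100.0, cooling=0.95, iters=2000, seed=1):
    rng = random.Random(seed)
    seq = list(range(len(p)))
    best = cur = flowshop_makespan(seq, p)
    best_seq, t = seq[:], t0
    for _ in range(iters):
        i, j = rng.sample(range(len(seq)), 2)
        cand = seq[:]
        cand[i], cand[j] = cand[j], cand[i]           # swap two jobs
        c = flowshop_makespan(cand, p)
        if c < cur or rng.random() < math.exp((cur - c) / t):
            seq, cur = cand, c
            if c < best:
                best, best_seq = c, cand[:]
        t *= cooling                                   # geometric cooling schedule
    return best_seq, best

# Placeholder processing times: 5 jobs x 3 machines (minutes)
p = [[4, 3, 2], [2, 5, 1], [3, 2, 4], [5, 1, 3], [1, 4, 2]]
print(simulated_annealing(p))
```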
APA, Harvard, Vancouver, ISO, and other styles
42

Suryani, Yani. "IMPLEMENTASI ALGORITMA VIGENERE CHIPER UNTUK KEAMANAN TANGGAL KADALUARSA (Studi Kasus : Toko Kemuning Depok – Prov. Jawa Barat)." Buffer Informatika 5, no. 1 (April 1, 2019). http://dx.doi.org/10.25134/buffer.v5i1.1958.

Full text
Abstract:
Food is one of the important things in people's lives: the better its quality, the better the human resources it can help produce. However, widespread counterfeiting, including falsification of expiration dates, forces consumers to be observant and careful in choosing products. Cakes and sponge cakes are among the foods favoured by the public, and the Kemuning Shop produces many such cakes, both for distribution and for sale in its own store. Therefore, to increase public trust in the quality of the products made by the Kemuning Shop, this study builds an application for managing product data in which consumers can check the expiration date on Android by scanning a QR code printed on the cake packaging. What is generated into the QR code is the product id, a unique string encrypted using the Vigenère cipher algorithm so that it cannot easily be read by others. The admin can manage the product data, so that when the shop produces cakes the data are stored in a database and the QR code can be printed. This research is expected to increase consumer confidence and help maintain the product quality of the Kemuning Shop. Keywords: cake, encryption, Vigenère cipher algorithm
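The Vigenère cipher shifts each letter of the plaintext by an amount given by the corresponding letter of a repeating key. A short, hedged sketch of encryption and decryption over the uppercase alphabet is shown below; the application's actual key handling and QR-code generation are not described in the abstract and are not reproduced, and the sample plaintext is invented.

```python
def vigenere(text, key, decrypt=False):
    """Vigenère cipher over A-Z; non-letters are passed through unchanged."""
    out, k = [], 0
    key = key.upper()
    for ch in text.upper():
        if ch.isalpha():
            shift = ord(key[k % len(key)]) - ord("A")
            if decrypt:
                shift = -shift
            out.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
            k += 1
        else:
            out.append(ch)
    return "".join(out)

cipher = vigenere("PRODUK-001 EXP 2019-12-31", "KEMUNING")   # hypothetical product id
print(cipher)
print(vigenere(cipher, "KEMUNING", decrypt=True))
```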
APA, Harvard, Vancouver, ISO, and other styles
43

Nurjayadi, Ramsy, and Titin Kristiana. "Penerapan Association Rule Menggunakan Algoritma Apriori Untuk Analisa Penjualan Aufa Baby Shop." IJCIT (Indonesian Journal on Computer and Information Technology) 4, no. 2 (November 15, 2019). http://dx.doi.org/10.31294/ijcit.v4i2.6448.

Full text
Abstract:
Baby supplies are essential for newborns and infants aged 0 to 24 months; they are basically always needed and are among every baby's basic needs. Every day the sales transaction data at Aufa Baby Shop keeps growing: hundreds of transactions are recorded in a single day, yet the data are used only for reporting and not to shape the sales strategy. These data should be used to see which types of goods are bought together by consumers. This study analyzes the data using the apriori algorithm, which reveals which baby clothing products are bought together and sold the most by looking at the support and confidence values. Data processing uses manual calculations and the RapidMiner 8.1 software to analyze the Aufa Baby Shop dataset. The study uses a support of 15% and a confidence of 30%, and produces 8 association rules.
APA, Harvard, Vancouver, ISO, and other styles
44

Kholidasari, Inna. "PENJADWALAN MESIN DENGAN MENGGUNAKAN ALGORITMA JADWAL NON-DELAY DI PT. SNN SEKSI WORKSHOP." Industrika: Jurnal Ilmiah Teknik Industri 5, no. 1 (April 24, 2021). http://dx.doi.org/10.37090/indstrk.v5i1.358.

Full text
Abstract:
Production scheduling is the most important part of carrying out the production process on a production floor. Scheduling is done before production begins to ensure that the production process runs smoothly; if it is not done properly, the production process runs into obstacles and the company incurs losses. This study aims to determine the production machine schedule for a company that manufactures spare parts for automotive products. The company runs a job shop production process and uses the First In First Out rule to complete its work. Because of the large number of products that have to be produced, two or more products often have to be worked on at the same time on the same machine. This condition forces some products to wait for the associated machine to finish operating and causes long product turnaround times. The problem is addressed by constructing a production machine schedule with the Non-Delay method. By applying this method, the makespan (total completion time) can be minimized.
APA, Harvard, Vancouver, ISO, and other styles
45

Fahriah, Wildatunnisa, and Tulus Febrianto. "Aplikasi Enkripsi dan Dekripsi Short Message Service di Android Menggunakan Metode Blowfish." JISA(Jurnal Informatika dan Sains) 2, no. 1 (June 27, 2019). http://dx.doi.org/10.31326/jisa.v2i1.512.

Full text
Abstract:
Short Message Service (SMS) is a very popular communication technology that lets people exchange messages with one another. However, because SMS can be wiretapped, users no longer have guaranteed privacy. This work therefore builds an Android mobile application that turns SMS messages into ciphertext so that the information in an SMS cannot be read by others. When sending an SMS, the application encrypts the message into ciphertext with a key entered by the sender; when receiving an SMS, the application decrypts the ciphertext back into the original message using the same key as the sender. The application was built as an Android-based application for sending important messages to others without fear of them being read by third parties. The method used to encrypt and decrypt the messages is a block cipher algorithm, namely the Blowfish method, implemented in the Java programming language on the Android mobile platform.
APA, Harvard, Vancouver, ISO, and other styles
46

Dewi, Sinta Maulina, Agus Perdana Windarto, and Dedy Hartama. "PENERAPAN DATAMINING DENGAN METODE KLASIFIKASI UNTUK STRATEGI PENJUALAN PRODUK DI UD.SELAMAT SELULAR." KOMIK (Konferensi Nasional Teknologi Informasi dan Komputer) 3, no. 1 (December 2, 2019). http://dx.doi.org/10.30865/komik.v3i1.1669.

Full text
Abstract:
In the current era of globalization, developments in various fields of business, from the culinary field to others, are accelerating. One of the most sought-after business areas is the field of phone counters, i.e., phone-credit sales. UD. Selamat Selular was founded in 2010 with only a small shop and no employees; today it has more than 20 employees, and the business keeps developing amid ever-increasing competition. A sales strategy is therefore needed so that it is not outdone by other trading businesses. In this research, past data are analyzed in order to find the right sales strategy using Naïve Bayes. Data were collected through questionnaires and interviews, with 160 questionnaire respondents. From the results of this study it can be concluded that the model built with the Naïve Bayes algorithm produces a score of 0.650 and is therefore rated as Excellent Classification. Keywords: Datamining, Naïve Bayes, Sales Strategy.
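Naïve Bayes classifies by multiplying, for each class, the prior probability with the conditional probabilities of the observed feature values, assuming the features are independent. The sketch below is a tiny, generic categorical Naïve Bayes with Laplace smoothing on invented questionnaire-style data; it is not the study's model or dataset.

```python
from collections import Counter, defaultdict

def train_nb(rows, labels, alpha=1.0):
    """Categorical Naive Bayes with Laplace smoothing.
    rows: list of feature tuples, labels: list of class labels."""
    prior = Counter(labels)
    cond = defaultdict(Counter)            # (feature index, class) -> value counts
    for row, y in zip(rows, labels):
        for i, v in enumerate(row):
            cond[(i, y)][v] += 1
    values = [set(r[i] for r in rows) for i in range(len(rows[0]))]

    def predict(row):
        best, best_p = None, -1.0
        for y, ny in prior.items():
            p = ny / len(labels)           # class prior
            for i, v in enumerate(row):    # smoothed conditional likelihoods
                p *= (cond[(i, y)][v] + alpha) / (ny + alpha * len(values[i]))
            if p > best_p:
                best, best_p = y, p
        return best
    return predict

# Invented example: (promo interest, price sensitivity) -> buys?
rows = [("high", "low"), ("high", "high"), ("low", "high"), ("low", "low")]
labels = ["yes", "yes", "no", "no"]
predict = train_nb(rows, labels)
print(predict(("high", "high")))
```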
APA, Harvard, Vancouver, ISO, and other styles
47

Annisa Aulia Sambas, Dida Diah Damayanti, and Budi Santosa. "Optimasi Penjadwalan Flow Shop Dengan Mesin Parallel untuk Minimasi Makespan dan Keterlambatan Menggunakan Algoritma CDS di Industri Manufaktur Pesawat Terbang." Talenta Conference Series: Energy and Engineering (EE) 2, no. 3 (December 19, 2019). http://dx.doi.org/10.32734/ee.v2i3.695.

Full text
Abstract:
PT. XYZ is a company that produces commercial aircraft components. The production process starts from cutting raw material and ends with aircraft components in the form of pins, bearings, washers, bushings, caps, springs, retainers, and covers. The products are made in small quantities but with great variety. The production floor uses a group technology layout and applies a flow shop manufacturing process with parallel machines. The Campbell, Dudek and Smith (CDS) method is used to find a new job sequence that minimizes the makespan and tardiness. The results are compared with the existing schedule, which uses the MOPNR method. With MOPNR, the resulting makespan is 18,895.5 minutes; with the CDS method, the minimum makespan obtained is 15,478 minutes, a reduction of 18% from the actual makespan.
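CDS turns an m-machine permutation flow shop into m−1 two-machine problems: for each k it sums the first k and last k machine times of every job, sequences the jobs with Johnson's rule, and keeps the sequence with the smallest makespan. A minimal sketch is below; the processing-time matrix is a made-up example, not the paper's data, and parallel machines are not modelled.

```python
def johnson(times2):
    """Johnson's rule for a two-machine flow shop; times2[j] = (t1, t2)."""
    front, back = [], []
    for j, (t1, t2) in sorted(enumerate(times2), key=lambda x: min(x[1])):
        if t1 <= t2:
            front.append(j)       # schedule as early as possible
        else:
            back.insert(0, j)     # schedule as late as possible
    return front + back

def flowshop_makespan(seq, p):
    """Completion time of the last job on the last machine."""
    finish = [0.0] * len(p[0])
    for j in seq:
        for m in range(len(finish)):
            prev = finish[m - 1] if m else 0.0
            finish[m] = max(finish[m], prev) + p[j][m]
    return finish[-1]

def cds(p):
    """Campbell-Dudek-Smith heuristic for an m-machine permutation flow shop."""
    n_machines = len(p[0])
    best_seq, best_mk = None, float("inf")
    for k in range(1, n_machines):
        pseudo = [(sum(job[:k]), sum(job[-k:])) for job in p]   # k-th 2-machine surrogate
        seq = johnson(pseudo)
        mk = flowshop_makespan(seq, p)
        if mk < best_mk:
            best_seq, best_mk = seq, mk
    return best_seq, best_mk

# Made-up processing times: 4 jobs x 3 machines (minutes)
p = [[5, 2, 4], [3, 6, 2], [6, 3, 5], [2, 4, 3]]
print(cds(p))
```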
APA, Harvard, Vancouver, ISO, and other styles
48

Ericko Wasita Rimbawan. "Penjadwalan Mesin dengan Menggunakan Algoritma NEH pada PT. XYZ." Talenta Conference Series: Energy and Engineering (EE) 2, no. 3 (December 19, 2019). http://dx.doi.org/10.32734/ee.v2i3.742.

Full text
Abstract:
PT. XYZ is one of the steel companies in North Sumatra. The plant produces steel according to customer orders (make to order). The company's production system is a flow shop, in which every product goes through the same sequence of operations. The scheduling system applied so far serves orders in the order in which they arrive (FCFS), and this scheduling method is still not optimal. The sub-optimal scheduling is marked by several conditions, one of which is low resource utilization, shown by a low processed capacity compared with the available capacity of the machines; one example of low resource utilization is the rolling mill machine. The job sequence obtained with the LPT approach is Job 3 - Job 2 - Job 1 - Job 4 - Job 5, whereas the makespan resulting from the scheduling applied by the company (FCFS) is 3,328.02 hours with the job sequence Job 1 - Job 2 - Job 3 - Job 4 - Job 5. The idle time obtained from scheduling with the NEH algorithm using the LPT approach is 1,789.91 hours (74.6 days), while the idle time of the company's (FCFS) scheduling method is 3,306.21 hours (137.7 days). The Efficiency Index (EI) of the NEH scheduling method with the LPT approach is 1.09 and the Relative Error (RE) is 9.87%. Based on these results, it can be concluded that scheduling with the Nawaz, Enscore, and Ham (NEH) algorithm using the LPT approach produces a smaller makespan and idle time than the scheduling method currently applied by the company (FCFS).
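The NEH heuristic orders jobs by decreasing total processing time and then builds the sequence incrementally, inserting each new job at the position that minimizes the partial makespan. A hedged, generic sketch is shown below; the processing times are placeholders, and the LPT-based variant and the company's job data are not reproduced.

```python
def makespan(seq, p):
    """Permutation flow shop makespan; p[j][m] = time of job j on machine m."""
    finish = [0.0] * len(p[0])
    for j in seq:
        for m in range(len(finish)):
            prev = finish[m - 1] if m else 0.0
            finish[m] = max(finish[m], prev) + p[j][m]
    return finish[-1]

def neh(p):
    """Nawaz-Enscore-Ham constructive heuristic."""
    order = sorted(range(len(p)), key=lambda j: sum(p[j]), reverse=True)
    seq = [order[0]]
    for j in order[1:]:
        # try inserting job j at every position and keep the best partial sequence
        candidates = [seq[:i] + [j] + seq[i:] for i in range(len(seq) + 1)]
        seq = min(candidates, key=lambda s: makespan(s, p))
    return seq, makespan(seq, p)

# Placeholder processing times: 5 jobs x 3 machines (hours)
p = [[6, 4, 5], [2, 3, 7], [5, 6, 2], [4, 5, 6], [3, 2, 4]]
print(neh(p))
```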
APA, Harvard, Vancouver, ISO, and other styles
49

Ginting, Rosnani, and Benedictus Vito Bayu. "Penjadwalan Mesin Pada PT.XYZ Dengan Menggunakan Algoritma Genetik." Talenta Conference Series: Energy and Engineering (EE) 2, no. 2 (May 31, 2019). http://dx.doi.org/10.32734/ee.v2i2.447.

Full text
Abstract:
Competition between companies keeps increasing along with advances in science and technology, and consumer demand is also growing with developments in the industrial sector. PT. XYZ is a company that produces machinery for palm oil mills (PKS) and also produces machine spare parts for other companies. The company carries out production based on incoming orders (make to order), with a job shop process. The problem faced by PT. XYZ is tardiness: five products experienced delays. This study therefore performs machine scheduling to overcome the tardiness problem. The scheduling method used is a genetic algorithm. In this genetic algorithm, an initial population is generated using the SPT method to obtain a job sequence, and then selection, crossover, and mutation are applied to obtain the most optimal job sequence. Based on the data processing performed, three generations yielded four best chromosomes, namely BECAD, BACED, BEACD, and BCAED, with the same fitness value of 0.02144. The work order chosen in this case is BCAED, the sequence in which all products are processed. This job sequence has a makespan of 46.637 hours and is the best of each generation, with the best fitness value of 0.02144.
APA, Harvard, Vancouver, ISO, and other styles
50

Picciani, Massimiliano, Manuel Athènes, and Mihai-Cosmin Marinica. "Calculation of migration rates of vacancies and divacancies in α-Iron using transition path sampling biased with a Lyapunov indicator." MRS Proceedings 1383 (2012). http://dx.doi.org/10.1557/opl.2012.180.

Full text
Abstract:
Predicting the microstructural evolution of radiation damage in materials requires handling the physics of infrequent events, in which several time scales are involved. The reaction rates characterizing these events are the main ingredient for simulating the kinetics of materials under irradiation over large time scales and high irradiation doses. We propose here an efficient, finite-temperature method to compute reaction rate constants of thermally activated processes. The method consists of two steps. Firstly, rare reactive trajectories in phase space are sampled using a transition path sampling (TPS) algorithm supplemented with a local Lyapunov bias favoring diverging trajectories. This enables the system to visit transition regions separating stable configurations more often, and thus enhances the probability of observing transitions between stable states during relatively short simulations. Secondly, reaction constants are estimated from the unbiased fraction of reactive trajectories, yielded by an appropriate statistical data analysis tool, the multistate Bennett acceptance ratio (MBAR) package. We apply our method to the calculation of reaction rates for vacancy and di-vacancy migration in an α-iron crystal, using an Embedded Atom Model potential, for temperatures ranging from 300 K to 800 K.
APA, Harvard, Vancouver, ISO, and other styles