To view the other types of publications on this topic, follow this link: IBM Z.

Dissertations on the topic "IBM Z"

Consult the top 15 dissertations for research on the topic "IBM Z".

Next to every work in the bibliography there is an "Add to bibliography" option. Use it, and the bibliographic entry for the selected work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication in PDF format and read its online annotation, provided the relevant parameters are included in the metadata.

Browse dissertations from a wide variety of disciplines and compile a correctly formatted bibliography.

1

Kryszczuk, Darius. "Návrh prezentace metadat reportů z IBM Cognos BI." Master's thesis, Vysoká škola ekonomická v Praze, 2014. http://www.nusl.cz/ntk/nusl-198465.

Annotation:
A dramatic increase in the amount of data involved in running a business has been a typical phenomenon of the last decade, and it affects business planning and decision making. The area that deals with this problem is called Business Intelligence (BI). The difficulty lies in how BI applications are implemented: projects are often carried out in isolation, so they deliver only partial benefits. In such cases BI cannot fulfil its expected role and instead retains its drawbacks: complicated searching for analytical outputs, significant deterioration of their output data, rising costs of administering BI systems, and others. A solution is to integrate the partial BI systems and transfer their BI metadata into a single information system serving as an "entry gate" for all employees. From the perspective of the whole company, managing metadata brings further benefits, such as assessing the quality and usage of reports, prolonging their life cycle, collaboratively building a knowledge base, consolidating business terminology, more efficient browsing of the system, and more comprehensible analytical outputs. The aim of this diploma thesis is to design a metadata presentation and implement it in a preselected information system, while realizing the benefits that come from applying business metadata.
2

Faruzel, Petr. "Empirické porovnání komerčních systémů dobývání znalostí z databází." Master's thesis, Vysoká škola ekonomická v Praze, 2009. http://www.nusl.cz/ntk/nusl-76606.

Annotation:
The thesis "An Empirical Comparison of Commercial Data Mining Tools" deals with data mining tools from the world's leading providers of statistical software. Its aim is to compare the commercial packages IBM SPSS Modeler and SAS Enterprise Miner in terms of their specifications and utility against a chosen set of evaluation criteria. This goal is pursued through a detailed analysis of selected features of the two packages as well as through their application to real data. The comparison rests on 29 component criteria reflecting user requirements for functionality, usability and flexibility. The pivotal part of the comparison applies both tools to data on meningoencephalitis, evaluating their performance on small and large data sets: the quality of the developed models and the time needed to build them are reported for six comparable classification techniques. Small data favours IBM SPSS Modeler: although it produces slightly less accurate models, it builds them much faster. Increasing the amount of data shifts the balance, and SAS Enterprise Miner achieves better results on large data, with considerably more accurate models built in slightly less time. The functionality of the two tools is comparable, whereas their usability and flexibility differ: IBM SPSS Modeler offers distinctly better usability and learnability, while SAS Enterprise Miner gives its users a somewhat more flexible tool.
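To make that protocol concrete, here is a minimal sketch of an accuracy-versus-development-time comparison of classifiers, using open-source scikit-learn models as stand-ins for the commercial tools; the synthetic dataset and the three model choices are illustrative assumptions, not the thesis's actual setup.

```python
# Compare model quality and model development time across classifiers,
# in the spirit of the thesis's benchmark (stand-in models and data).
import time

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=5000, n_features=30, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "logistic regression": LogisticRegression(max_iter=1000),
    "neural network": MLPClassifier(max_iter=500, random_state=0),
}

for name, model in models.items():
    start = time.perf_counter()              # model development time
    model.fit(X_train, y_train)
    elapsed = time.perf_counter() - start
    accuracy = model.score(X_test, y_test)   # model quality
    print(f"{name}: accuracy={accuracy:.3f}, training time={elapsed:.2f}s")
```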
3

Šverák, Martin. "Analýza nestrukturovaného obsahu z veřejně dostupných sociálních médií za pomocí nástroje Watson společnosti IBM." Master's thesis, Vysoká škola ekonomická v Praze, 2014. http://www.nusl.cz/ntk/nusl-192396.

Annotation:
This graduate thesis deals with the analysis of unstructured data from public social media, specifically data from the social media channels of Vodafone Czech Republic a.s. The thesis has two parts. The first provides the theoretical background for the second: it describes social media, structured and unstructured data, and the tools used to analyse unstructured data. In the second part, the IBM Watson tool is applied to publicly available data. A methodology is designed to guide the analysis process and is then used to build a pilot application that verifies how Watson handles unstructured data. The results of the analysis are presented in the conclusion. The main contributions of the thesis are the development of the pilot Watson application and the verification of its functionality. The pilot application is not equivalent to a complete analysis of everything Watson can do, but it serves as a demonstration of Watson's capabilities.
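As an illustration of what one such analysis step looks like today, here is a hedged sketch that sends a single post to IBM Watson Natural Language Understanding over its REST API. The region, instance ID, API version and credentials below are placeholders, and the thesis itself worked with an earlier generation of the Watson tooling, so treat the endpoint details as assumptions.

```python
# Sentiment and keyword analysis of one social-media post via Watson NLU
# (placeholder endpoint and credentials; illustrative only).
import requests

WATSON_URL = (
    "https://api.eu-de.natural-language-understanding.watson.cloud.ibm.com"
    "/instances/YOUR_INSTANCE_ID/v1/analyze"  # hypothetical instance
)
API_KEY = "YOUR_API_KEY"  # placeholder credential

response = requests.post(
    WATSON_URL,
    params={"version": "2022-04-07"},      # assumed API version date
    auth=("apikey", API_KEY),
    json={
        "text": "The new tariff from Vodafone finally fixed my data issues!",
        "features": {"sentiment": {}, "keywords": {"limit": 3}},
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())  # sentiment score and extracted keywords
```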
4

Gustafsson, Alex, and Carl Stensson. "The Performance of Post-Quantum Key Encapsulation Mechanisms : A Study on Consumer, Cloud and Mainframe Hardware." Thesis, Blekinge Tekniska Högskola, Institutionen för datavetenskap, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-21849.

Annotation:
Background. People use the Internet for communication, work, online banking and more. Public-key cryptography enables this use to be secure by providing confidentiality and trust online. Though these algorithms may be secure from attacks from classical computers, future quantum computers may break them using Shor’s algorithm. Post-quantum algorithms are therefore being developed to mitigate this issue. The National Institute of Standards and Technology (NIST) has started a standardization process for these algorithms. Objectives. In this work, we analyze what specialized features applicable for post-quantum algorithms are available in the mainframe architecture IBM Z. Furthermore, we study the performance of these algorithms on various hardware in order to understand what techniques may increase their performance. Methods. We apply a literature study to identify the performance characteristics of post-quantum algorithms as well as what features of IBM Z may accommodate and accelerate these. We further apply an experimental study to analyze the practical performance of the two prominent finalists NTRU and Classic McEliece on consumer, cloud and mainframe hardware. Results. IBM Z was found to be able to accelerate several key symmetric primitives such as SHA-3 and AES via the Central Processor Assist for Cryptographic Functions (CPACF). Though the available Hardware Security Modules (HSMs) did not support any of the studied algorithms, they were found to be able to accelerate them via a Field-Programmable Gate Array (FPGA). Based on our experimental study, we found that computers with support for the Advanced Vector Extensions (AVX) were able to significantly accelerate the execution of post-quantum algorithms. Lastly, we identified that vector extensions, Application-Specific Integrated Circuits (ASICs) and FPGAs are key techniques for accelerating these algorithms. Conclusions. When considering the readiness of hardware for the transition to post-quantum algorithms, we find that the proposed algorithms do not perform nearly as well as classical algorithms. Though the algorithms are likely to improve until the post-quantum transition occurs, improved hardware support via faster vector instructions, increased cache sizes and the addition of polynomial instructions may significantly help reduce the impact of the transition.
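For a concrete sense of the experimental study, a minimal benchmark sketch using the liboqs-python bindings (the `oqs` package) follows. Whether the exact NTRU and Classic McEliece parameter sets named here are enabled depends on how the local liboqs library was built, so treat the names as assumptions and check `oqs.get_enabled_kem_mechanisms()` first.

```python
# Time one keygen/encapsulation/decapsulation round for two post-quantum KEMs.
import time

import oqs

for name in ("NTRU-HPS-2048-509", "Classic-McEliece-348864"):  # assumed enabled
    with oqs.KeyEncapsulation(name) as kem:
        start = time.perf_counter()
        public_key = kem.generate_keypair()
        ciphertext, shared_secret = kem.encap_secret(public_key)
        recovered = kem.decap_secret(ciphertext)
        elapsed = time.perf_counter() - start
        assert recovered == shared_secret  # both sides agree on the secret
        print(f"{name}: keygen+encaps+decaps took {elapsed * 1000:.2f} ms")
```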
5

Maraczek, Patrik. "Návrh a realizace systému zpracování dat z environmentálních čidel v prostředí IoT." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2020. http://www.nusl.cz/ntk/nusl-417775.

Annotation:
The master's thesis deals with the design and realization of a measuring station that processes data from environmental sensors in an IoT environment. It includes research on sensors, cloud services for IoT, microcontrollers, and environmental data available online. The thesis gives a detailed procedure for realizing the designed system, including a description of the code for the STM32W55 microcontroller and the STM32 B-L475E-IOT01A2 Discovery kit, the configuration of the IBM Watson IoT Platform cloud service, and the steps for programming the Node-RED application responsible for the logic of the whole system. Source code that can be used for a simple implementation of the designed system is attached to the thesis.
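To illustrate the sensor-to-cloud hop in such a system, here is a hedged sketch that publishes one JSON sensor event to the IBM Watson IoT Platform over MQTT with paho-mqtt. The organisation ID, device type/ID and token are placeholders; the client-ID and topic conventions follow the platform's documentation, not the thesis's attached code.

```python
# Publish one environmental reading to IBM Watson IoT Platform over MQTT
# (placeholder credentials; platform conventions as documented by IBM).
import json

import paho.mqtt.client as mqtt

ORG, DEVICE_TYPE, DEVICE_ID = "yourorg", "envsensor", "station01"  # placeholders
TOKEN = "YOUR_DEVICE_TOKEN"  # placeholder credential

# paho-mqtt 1.x style; 2.x additionally expects a CallbackAPIVersion argument.
client = mqtt.Client(client_id=f"d:{ORG}:{DEVICE_TYPE}:{DEVICE_ID}")
client.username_pw_set("use-token-auth", TOKEN)
client.connect(f"{ORG}.messaging.internetofthings.ibmcloud.com", 1883)

event = {"d": {"temperature": 21.4, "humidity": 48.0}}  # sample reading
client.publish("iot-2/evt/status/fmt/json", json.dumps(event), qos=1)
client.disconnect()
```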
6

Uwakweh, Ozioma I. F. "Cybersecurity in the Retail Industry: Third Party Implications." University of Cincinnati / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1595848539891614.

7

David, Östling. "IBMs stordators framtid i en molnbaserad IT värld : Kommer IBMs stordator att överleva ytterligare ett paradigmskifte eller har den spelat ut sin roll?" Thesis, Karlstads universitet, Handelshögskolan, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-43408.

Annotation:
IT analysts have pronounced the IBM mainframe dead many times. As early as 1991, the renowned critic Stewart Alsop, then editor-in-chief of InfoWorld, wrote that the last mainframe would be taken out of service on March 15, 1996. The statement proved wrong, which Alsop himself admitted in InfoWorld just days before his prediction was due to come true. As we now enter another paradigm shift, with many services moving to the cloud, this thesis asks whether the IBM mainframe has a future in a cloud-based IT world, and if so, what that future looks like. The purpose is to examine, through literature studies and interviews, whether the IBM mainframe can survive another sweeping technological revolution or whether it has outlived its role. The study concludes that the IBM mainframe currently holds a strong position, above all in the banking and financial sector, i.e. in industries with particularly high demands on availability, scalability and security. The mainframe is likely to play an important role in future cloud initiatives as well. IBM already offers cloud solutions that include mainframes, and the interviews conducted at IBM showed that the company sees a bright future for its mainframes. According to the interviewees, IBM is not merely keeping up with the development of cloud services but helping to lead it, and it is above all through open standards such as Linux and Unix that the mainframe has its brightest future. The fact that IBM invests billions every year in the development of its mainframes also speaks clearly: IBM appears fully convinced that the mainframe has an important role to play in the cloud-based IT world now emerging.
8

Petrželka, Jiří. "Překlad z češtiny do angličtiny." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2010. http://www.nusl.cz/ntk/nusl-237125.

Annotation:
This thesis describes the principles of statistical machine translation and demonstrates how to assemble the Moses statistical machine translation system. In the preparatory phase, freely available bilingual Czech-English corpora are surveyed. An empirical analysis of the running time of multi-threaded word-alignment tools shows that MGIZA++ can achieve up to a five-fold speed-up and PGIZA++ up to an eight-fold speed-up compared to GIZA++. Three approaches to the morphological pre-processing of the Czech training data are tested with simple unfactored models. While plain lemmatization can lower the BLEU score, more sophisticated approaches generally raise it. The positive effects of morphological pre-processing fade as the corpus size grows. The relationship between other corpus characteristics (size, genre, additional data) and the resulting BLEU score is measured empirically. The final system is trained on the CzEng 0.9 corpus and evaluated on the test set from the WMT 2010 workshop.
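Since the evaluation revolves around BLEU, a small self-contained illustration of scoring hypotheses against references with the sacrebleu package follows; the sentences are invented examples, not data from CzEng 0.9 or WMT 2010.

```python
# Corpus-level BLEU over a toy set of translation hypotheses and references.
import sacrebleu

hypotheses = ["the cat sat on the mat", "machine translation is useful"]
# One reference stream, parallel to the hypotheses.
references = [["the cat sat on the mat", "machine translation is very useful"]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")
```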
9

Griffel, Frank. "Logische und ontologische Probleme in al-Ġazālīs Fayṣal al-tafriqaẗ bayna l-islām wa-z-zandaqa : eine Analyse der ersten sechs Kapitel /." [S.l. : s.n], 1995. http://catalogue.bnf.fr/ark:/12148/cb41085275x.

10

Benkhmas, Omar. "Procédés et fonctions de l'ironie à travers la Risalat at-tawabi' wa z-zawabi' "Épître des subordonnés et des génies" d'Ibn Shuhayd al-Andalusi." Paris 4, 1995. http://www.theses.fr/1994PA040273.

Annotation:
The aim of this thesis is to bring out the devices of irony in an epistle from the Umayyad period. It opens with a general introduction presenting the historical setting and the technical approach, as well as the aims and motives of the composition. In the first chapter we gather, through as thorough a reading as possible, the greatest number of general and personal indications that can be drawn concerning the biography of Ibn Shuhayd and his work. The second chapter develops the method that allowed us to identify the devices of irony, specifying how they operate and their effect on their targets; we also characterize the author as an ironist and assess the reach of this irony within Andalusian literature. The final chapter is an essay in which the author metamorphoses his adversaries in order to dishonour them and belittle their cultural standing. The whole seeks to bring back to life an author wounded by the fall of the Amiride dynasty and by the contempt of his adversaries, a writer who is in many respects a precursor in matters of literature, criticism and rhetoric.
11

Laaroussi, Abdellaoui Zakia. "Sībawayhi et la tradition grammaticale arabe : deux systèmes d'analyse à l'oeuvre sur l'annexion." Paris 3, 2008. http://www.theses.fr/2008PA030165.

Annotation:
The thesis analyses the notion of annexation in Sībawayhi's Kitāb and in the Arabic grammatical tradition through selected texts. It brings out two different explanatory systems, confronts them and assesses their relevance. As far as Sībawayhi is concerned, chapter 100 of the Kitāb proves to be of capital importance for the notion of annexation, especially when annexation is viewed, as this thesis does, through the fundamental explanatory principle of syntactic freedom (tamakkun) developed elsewhere in the Kitāb. The thesis shows the reluctance of the Arab grammarians towards Sībawayhi's categorization of annexation (based, in our view, on the notion of syntactic freedom) and his theory of government (ʿamal), according to which a noun can govern another noun by putting it in the genitive. The Arab grammarians opted for an alternative explanatory system based on a theory of government in which the noun has no governing power. Through the treatment of the genitive in annexation, the work thus reveals two lines of reasoning about the theory of government, neither of which is entirely satisfactory. The thesis further explains how the Arab grammarians' adoption of an alternative system of government, which they presented as consistent with Sībawayhi's approach, in fact constitutes a radical break with it.
12

Slanda, Arkadiusz Marcin. "IBM mainframe : a study in business strategy." Thesis, 2009. http://hdl.handle.net/2152/ETD-UT-2009-12-464.

Annotation:
On April 7, 2009, IBM celebrated the mainframe's 45th anniversary. From its roots in punch-card tabulators, the machine has come a long way to become many customers' preferred e-business solution. Throughout its lifetime, IBM's strategy has adapted the machine to a changing market. In the late 1960s, the introduction of the System/360 provided customers with compatibility and scalability across various computer lines. The system's popularity suffered during the client/server era of the 1990s, but it quickly recovered as the z Series server line was developed to support e-business solutions. IBM's strategy made the mainframe successful, but continued improvements remain necessary to ensure its future success.
13

Kreusch, Jonas. "Benutzerverwaltung in z/OS." 2015. https://ul.qucosa.de/id/qucosa%3A17157.

Annotation:
This thesis explains the concepts of user administration and resource protection under z/OS with RACF and illustrates them through the interplay with TSO. It also compares the concepts of RACF with those of UNIX resource protection; to this end, the UNIX System Services, the UNIX-conformant interface of z/OS, are analysed as well. With the help of a REXX script built on RACF commands, more complex user-administration tasks are implemented.
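To hint at what such a script does, here is a hedged sketch, in Python rather than the thesis's REXX, that generates the RACF TSO commands for creating a batch of users and connecting them to a group. ADDUSER/CONNECT and their operands are standard RACF commands; the user IDs, group name and initial password are invented examples.

```python
# Generate RACF commands for batch user administration (illustrative values).
users = [
    ("USER0001", "Example Student One"),
    ("USER0002", "Example Student Two"),
]

def racf_commands(userid: str, name: str, group: str = "STUDENTS") -> list[str]:
    """Return the RACF commands a user-administration script would issue."""
    return [
        f"ADDUSER {userid} NAME('{name}') DFLTGRP({group}) PASSWORD(INIT1234)",
        f"CONNECT {userid} GROUP({group}) AUTHORITY(USE)",
    ]

for userid, name in users:
    for command in racf_commands(userid, name):
        print(command)  # in REXX these would be passed to the RACF interface
```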
14

Šrom, Jakub. "Převod vybraných algoritmů data-mining z jazyka Java do binární (.exe) formy." Master's thesis, 2015. http://www.nusl.cz/ntk/nusl-191298.

Annotation:
Many successful data mining systems (e.g. WEKA, RapidMiner) currently implement their algorithms in Java, which allows them to run on different operating systems. The disadvantage of interpreted code is slower computation and limited memory usage. This thesis focuses on converting several selected Java implementations of such algorithms into binary (.exe) form by porting the source code to C++ under MS Windows 7 64-bit. The aim is to speed up the calculations and improve memory management, while the binary form must give results identical to the original. Besides the conversion itself, the thesis also compares, on selected test data, the time and memory requirements of the original interpreted Java implementation (running on a 64-bit Java Runtime Environment, JRE) with those of the resulting x64 binaries.
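A minimal sketch of the kind of runtime comparison the thesis reports follows: running the original Java artifact and the converted native binary on the same input and timing both. The jar/exe names and arguments are placeholders, not the actual artifacts of the thesis.

```python
# Time the interpreted Java version against the converted native binary
# on identical input (placeholder commands; illustrative only).
import subprocess
import time

RUNS = [
    ("Java (JRE 64-bit)", ["java", "-jar", "miner.jar", "data.arff"]),
    ("native x64 (.exe)", ["miner.exe", "data.arff"]),
]

for label, command in RUNS:
    start = time.perf_counter()
    result = subprocess.run(command, capture_output=True, text=True)
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.2f}s, exit code {result.returncode}")
```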
15

Sokołowska, Dorota. "Ewolucja Wszechświata w modelu z dwoma dubletami pól Higgsa." Doctoral thesis, 2013. http://depotuw.ceon.pl/handle/item/167.

Annotation:
In this thesis we study the evolution of the Universe in the Inert Doublet Model (IDM), a $Z_2$-symmetric version of the two-Higgs-doublet model (2HDM) with a scalar dark matter (DM) candidate. The model contains one "standard" scalar (Higgs) doublet $\Phi_S$, which is responsible for electroweak symmetry breaking and for the masses of the gauge bosons and fermions and contains a Higgs boson as in the Standard Model, and one "inert" scalar doublet $\Phi_D$, which acquires no vacuum expectation value and does not couple to fermions. All components of the second scalar doublet are realized as massive scalar $D$-particles; by construction they carry a conserved multiplicative quantum number, so the lightest of them is stable and can be considered a candidate for the DM particle. Other types of vacua, with different properties, can also be realized in the $Z_2$-symmetric 2HDM; in particular, they contain no dark matter. If the current state of the Universe is described by the IDM, the Universe may have passed during its thermal evolution through various intermediate phases different from the inert one. In the thesis we perform a full analysis of the properties of the IDM: we analyse its vacuum structure as well as the possible history of the Universe as it cools down (in the $T^2$ approximation). The analysis is carried out in three phase spaces: the quartic parameters $(\lambda_4,\lambda_5)$, the mass parameters $(\mu_1,\mu_2)$, and the DM-Higgs and quartic DM couplings $(\lambda_{345},\lambda_2)$. Each provides complementary information for describing the IDM and the evolution of the Universe. We find that three types of sequences of phase transitions can lead to today's inert phase: in one, two or three steps. The analysis is performed for three ranges of dark matter mass (low, medium and high), using collider data and the results of astrophysical experiments to constrain the possible scenarios. The three mass ranges exhibit different behaviour, both in the possible types of evolution (visible in the $(\lambda_{345},\lambda_2)$ plane) and in the relic density values. We also find relations between the mass of the dark matter particle and the possible type of evolution of the Universe; for example, for a very heavy DM particle only one phase transition is possible. We further trace the possibility of coexisting local minima. This can occur in three scenarios for DM in the medium mass range and is not possible for light or heavy particles. The coexistence may be temporary, with the local inert-like minimum disappearing shortly after the Universe enters the inert phase, but a local inert-like minimum can also persist at $T=0$. Finally, we demonstrate the importance of the usually neglected self-coupling $\lambda_2$. We argue that the astrophysical data should be used in two steps to limit the values of the self-couplings: first, the relic density calculation for a fixed DM mass $M_{D_H}$ constrains the allowed values of $\lambda_{345}$ (a calculation that does not depend directly on $\lambda_2$); second, since $\lambda_{345}$ and $\lambda_2$ are related through the vacuum stability (positivity) constraints and the conditions for the existence of particular types of minima, the obtained values of $\lambda_{345}$ also constrain $\lambda_2$. Fixing $\lambda_2$ to an arbitrary value, as is done in the IDM analyses in the literature, may therefore wrongly exclude regions allowed by the astrophysical (WMAP) data.
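For orientation, since the abstract refers to $\lambda_2$, $(\lambda_4,\lambda_5)$ and $\lambda_{345}$ without defining them: in the normalization standard in the IDM literature (quoted from that literature, not from the thesis itself), the scalar potential of the $Z_2$-symmetric 2HDM reads

```latex
V = -\tfrac{1}{2}\!\left[ m_{11}^{2}\,\Phi_S^\dagger\Phi_S + m_{22}^{2}\,\Phi_D^\dagger\Phi_D \right]
  + \tfrac{\lambda_1}{2}\,(\Phi_S^\dagger\Phi_S)^2
  + \tfrac{\lambda_2}{2}\,(\Phi_D^\dagger\Phi_D)^2
  + \lambda_3\,(\Phi_S^\dagger\Phi_S)(\Phi_D^\dagger\Phi_D)
  + \lambda_4\,(\Phi_S^\dagger\Phi_D)(\Phi_D^\dagger\Phi_S)
  + \tfrac{\lambda_5}{2}\left[(\Phi_S^\dagger\Phi_D)^2 + \mathrm{h.c.}\right],
\qquad
\lambda_{345} \equiv \lambda_3 + \lambda_4 + \lambda_5 .
```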