Academic literature on the topic 'Hard times tokens'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Hard times tokens.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Hard times tokens"

1

Ni, Wangze, Pengze Chen, Lei Chen, Peng Cheng, Chen Jason Zhang, and Xuemin Lin. "Utility-Aware Payment Channel Network Rebalance." Proceedings of the VLDB Endowment 17, no. 2 (October 2023): 184–96. http://dx.doi.org/10.14778/3626292.3626301.

Full text
Abstract:
The payment channel network (PCN) is a promising solution to increase the throughput of blockchains. However, unidirectional transactions can deplete a user's deposits in a payment channel (PC), reducing the success ratio of transactions (SRoT). To address this depletion issue, rebalance protocols are used to shift tokens from well-deposited PCs to under-deposited PCs. To improve SRoT, it is beneficial to increase the balance of a PC with a lower balance and a higher weight (i.e., more transaction executions rely on the PC). In this paper, we define the utility of a transaction and the utility-aware rebalance (UAR) problem. The utility of a transaction is proportional to the weight of the PC and the amount of the transaction, and inversely proportional to the balance of the receiver. To maximize the effect of improving SRoT, UAR aims to find a set of transactions with maximized utilities, satisfying the budget and conservation constraints. The budget constraint limits the number of tokens shifted in a PC. The conservation constraint requires that the number of tokens each user sends equals the number of tokens received. We prove that UAR is NP-hard and cannot be approximately solved with a constant ratio. Thus, we propose two heuristic algorithms, namely Circuit Greedy and UAR_DC. Extensive experiments show that our approaches outperform the existing approach by at least 3.16 times in terms of utilities.
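As a rough illustration of the utility notion described in this abstract, the sketch below scores candidate rebalance transactions with an assumed formula (utility = channel weight * amount / receiver balance) and greedily selects them under a per-channel budget. The field names and the formula are illustrative assumptions, the conservation constraint (each user sends as many tokens as it receives) is omitted for brevity, and this is not the paper's Circuit Greedy or UAR_DC algorithm.

```python
# Toy sketch of utility-aware rebalance selection (illustrative only).
# Assumed utility: proportional to channel weight and amount, inversely
# proportional to the receiver-side balance, per the abstract's description.

def utility(weight: float, amount: float, receiver_balance: float) -> float:
    return weight * amount / max(receiver_balance, 1e-9)

def greedy_rebalance(candidates, budget_per_channel):
    """candidates: list of dicts with channel, weight, amount, receiver_balance."""
    spent = {}   # tokens already shifted per channel
    chosen = []
    ranked = sorted(candidates,
                    key=lambda t: utility(t["weight"], t["amount"], t["receiver_balance"]),
                    reverse=True)
    for t in ranked:
        used = spent.get(t["channel"], 0.0)
        if used + t["amount"] <= budget_per_channel.get(t["channel"], 0.0):
            chosen.append(t)
            spent[t["channel"]] = used + t["amount"]
    return chosen

if __name__ == "__main__":
    txs = [
        {"channel": "A-B", "weight": 3.0, "amount": 5.0, "receiver_balance": 2.0},
        {"channel": "A-B", "weight": 1.0, "amount": 4.0, "receiver_balance": 8.0},
        {"channel": "B-C", "weight": 2.0, "amount": 3.0, "receiver_balance": 1.0},
    ]
    print(greedy_rebalance(txs, {"A-B": 6.0, "B-C": 3.0}))
```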
2

Hu, Teng, Siqi Yang, Yanping Wang, Gongliang Li, Yulong Wang, Gang Wang, and Mingyong Yin. "N-Accesses: A Blockchain-Based Access Control Framework for Secure IoT Data Management." Sensors 23, no. 20 (October 18, 2023): 8535. http://dx.doi.org/10.3390/s23208535.

Full text
Abstract:
With the rapid advancement of network communication and big data technologies, the Internet of Things (IoT) has permeated every facet of our lives. Meanwhile, the interconnected IoT devices have generated a substantial volume of data, which possess both economic and strategic value. However, owing to the inherently open nature of IoT environments and the limited capabilities and the distributed deployment of IoT devices, traditional access control methods fall short in addressing the challenges of secure IoT data management. On the one hand, the single point of failure issue is inevitable for the centralized access control schemes. On the other hand, most decentralized access control schemes still face problems such as token underutilization, the insecure distribution of user permissions, and inefficiency. This paper introduces a blockchain-based access control framework to address these challenges. Specifically, the proposed framework enables data owners to host their data and achieves user-defined lightweight data management. Additionally, through the strategic amalgamation of smart contracts and hash-chains, our access control scheme can limit the number of times (i.e., n-times access) a user can access the IoT data before the deadline. This also means that users can utilize their tokens multiple times (predefined by the data owner) within the deadline, thereby improving token utilization while ensuring strict access control. Furthermore, by leveraging the intrinsic characteristics of blockchain, our framework allows data owners to gain capabilities for auditing the access records of their data and verifying them. To empirically validate the effectiveness of our proposed framework and approach, we conducted extensive simulations, and the experimental results demonstrated the feasibility and efficiency of our solution.
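To make the n-times-access idea concrete, here is a generic hash-chain sketch, not the paper's smart-contract implementation: the verifier stores only the chain tail and a counter, the user reveals one preimage per access, and each valid reveal moves the commitment one step back along the chain. The use of SHA-256 and the function names are assumptions.

```python
# Generic hash-chain sketch of "n-times access" (illustrative; not the
# paper's actual smart-contract scheme). SHA-256 is an assumed choice.
import hashlib, os

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def issue_token(n: int):
    """Owner side: build a hash chain seed -> w1 -> ... -> wn and publish wn."""
    seed = os.urandom(32)
    chain = [seed]
    for _ in range(n):
        chain.append(h(chain[-1]))
    # The user keeps the full chain; the verifier stores only the tail and a counter.
    return chain, {"commitment": chain[-1], "remaining": n}

def access(chain, state):
    """User reveals the next preimage; the verifier checks it and updates its state."""
    if state["remaining"] == 0:
        return False
    preimage = chain[state["remaining"] - 1]      # next element to reveal
    if h(preimage) != state["commitment"]:
        return False
    state["commitment"] = preimage                # walk the chain backwards
    state["remaining"] -= 1
    return True

if __name__ == "__main__":
    chain, state = issue_token(n=3)
    print([access(chain, state) for _ in range(4)])  # [True, True, True, False]
```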
3

Mor, Alon, Yonatan Belinkov, and Benny Kimelfeld. "Accelerating the Global Aggregation of Local Explanations." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 17 (March 24, 2024): 18807–14. http://dx.doi.org/10.1609/aaai.v38i17.29845.

Full text
Abstract:
Local explanation methods highlight the input tokens that have a considerable impact on the outcome of classifying the document at hand. For example, the Anchor algorithm applies a statistical analysis of the sensitivity of the classifier to changes in the token. Aggregating local explanations over a dataset provides a global explanation of the model. Such aggregation aims to detect words with the most impact, giving valuable insights about the model, like what it has learned in training and which adversarial examples expose its weaknesses. However, standard aggregation methods bear a high computational cost: a naive implementation applies a costly algorithm to each token of each document, and hence, it is infeasible for a simple user running in the scope of a short analysis session. We devise techniques for accelerating the global aggregation of the Anchor algorithm. Specifically, our goal is to compute a set of top-k words with the highest global impact according to different aggregation functions. Some of our techniques are lossless and some are lossy. We show that for a very mild loss of quality, we are able to accelerate the computation by up to 30 times, reducing the computation from hours to minutes. We also devise and study a probabilistic model that accounts for noise in the Anchor algorithm and diminishes the bias toward words that are frequent yet low in impact.
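The global aggregation that this abstract sets out to accelerate can be illustrated with a naive baseline: sum each token's local importance scores across documents and take the top-k. The sketch below shows only that baseline with assumed data structures; it does not implement the Anchor algorithm or the paper's lossless and lossy acceleration techniques.

```python
# Naive global aggregation of local explanation scores (illustrative baseline;
# the paper's contribution is accelerating this step, which is not shown here).
from collections import defaultdict
import heapq

def global_top_k(local_explanations, k=10, aggregate=sum):
    """local_explanations: iterable of {token: importance} dicts, one per document."""
    per_token = defaultdict(list)
    for doc_scores in local_explanations:
        for token, score in doc_scores.items():
            per_token[token].append(score)
    aggregated = {tok: aggregate(scores) for tok, scores in per_token.items()}
    return heapq.nlargest(k, aggregated.items(), key=lambda kv: kv[1])

if __name__ == "__main__":
    docs = [{"cheap": 0.75, "token": 0.25}, {"cheap": 0.5, "scarce": 0.5}]
    print(global_top_k(docs, k=2))  # [('cheap', 1.25), ('scarce', 0.5)]
```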
4

Wang, Jun, Sijing Zhang, Carsten Maple, and Zhengxu Zhao. "Guaranteeing hard real-time traffic with legitimately short deadlines with the timed token protocol." Computer Standards & Interfaces 31, no. 3 (March 2009): 557–65. http://dx.doi.org/10.1016/j.csi.2008.03.019.

Full text
5

Zhang, Sijing, Alan Burns, Jing Chen, and E. Stewart Lee. "Hard Real-Time Communication with the Timed Token Protocol: Current State and Challenging Problems." Real-Time Systems 27, no. 3 (September 2004): 271–95. http://dx.doi.org/10.1023/b:time.0000029051.60313.06.

Full text
6

Yongkie, Yongkie, and Hari Sutra Disemadi. "Non-Fungible Tokens as Jurisdictionless Innovation: Legal Vacuum, Loopholes, Potentials and Solutions." Widya Yuridika 6, no. 1 (December 2, 2022): 157. http://dx.doi.org/10.31328/wy.v6i1.4035.

Full text
Abstract:
Non-Fungible Tokens (NFTs) are one of the technological innovations that provide convenience for everyone, especially in the context of business and economic opportunities. Features such as anonymity, decentralization, and their online scope make the technology a double-edged sword. On the one hand, it makes things easier; on the other hand, it has the potential to become a platform for criminal acts such as money laundering, personal data violations, and copyright plagiarism. Special regulations should therefore be considered, but with a progressive nature and paradigm, so that NFT innovation does not die simply because the law fails to keep pace with the times. Progressive law is a solution and an answer to these phenomena: the legal paradigm and its enforcement must accord with the moral system, the times, and the values that live in society in order to achieve substantive justice. Practical breakthroughs, such as utilizing the latest technology, can be developed to assist law enforcement in cyberspace.
7

Du, Quan, Kai Feng, Chen Xu, Tong Xiao, and Jingbo Zhu. "Non-autoregressive neural machine translation with auxiliary representation fusion." Journal of Intelligent & Fuzzy Systems 41, no. 6 (December 16, 2021): 7229–39. http://dx.doi.org/10.3233/jifs-211105.

Full text
Abstract:
Recently, many efforts have been devoted to speeding up neural machine translation models. Among them, the non-autoregressive translation (NAT) model is promising because it removes the sequential dependence on the previously generated tokens and parallelizes the generation process of the entire sequence. On the other hand, the autoregressive translation (AT) model in general achieves a higher translation accuracy than the NAT counterpart. Therefore, a natural idea is to fuse the AT and NAT models to seek a trade-off between inference speed and translation quality. This paper proposes an ARF-NAT model (NAT with auxiliary representation fusion) to introduce the merit of a shallow AT model to an NAT model. Three functions are designed to fuse the auxiliary representation into the decoder of the NAT model. Experimental results show that ARF-NAT outperforms the NAT baseline by 5.26 BLEU scores on the WMT’14 German-English task with a significant speedup (7.58 times) over several strong AT baselines.
8

SONG, Weijie. "Environmentality, Sustainability, and Chinese Storytelling." Cultura 20, no. 1 (January 1, 2023): 55–66. http://dx.doi.org/10.3726/cul012023.0005.

Full text
Abstract:
Environmentality teases out the multilayered human-environment contacts and connections in terms of human agency and governmentality, ecological objects and their (in)dependence, power/knowledge and environmental (in)justice. “Sustainable Development Goals” recognize that ending poverty and other deprivations must go hand-in-hand with strategies that improve health and education, reduce inequality, and spur economic growth, all while tackling climate change and working to preserve our environment. This paper outlines the scopes, scales, and methods of Chinese storytelling and multimedia exhibitions on deforestation and afforestation, pollution and purification, and wastelands and eco-systems in industrial, de-industrial, and post-industrial times. The author considers short stories, novels, reportages, nonfiction writings, and visual artworks, with specific focus on trees, forests, and plant writing. By reading Kong Jiesheng’s “Forest Primeval,” Ah Cheng’s “King of Trees,” Xu Gang’s Loggers, Wake Up!, Yan Lianke’s Garden No. 711: The Ultimate Last Memo of Beijing, and Chen Yingsong’s The Forest Is Silent, the author aims to bring to light the awakening and formation of Chinese ecological consciousness, the token of marred humanity and ecocritical reflection, the manifestation of biophilia-biophobia experiences, as well as the structural transformation of private feelings and public emotions in modern and contemporary China.
9

Włodarczyk, Michał. "Digital Disruption in Art: A Comprehensive Analysis of AI and NFT Market Dynamics." Annales Universitatis Mariae Curie-Skłodowska, sectio H – Oeconomia 58, no. 2 (July 5, 2024): 171–93. http://dx.doi.org/10.17951/h.2024.58.2.171-193.

Full text
Abstract:
Theoretical background: The dynamic development of generative artificial intelligence such as ChatGPT has transformed the perception of creative work. In the graphic realm, AI systems like Midjourney, DALL-E, or Adobe Firefly allow the creation of high-quality graphics without the need for artistic skills or hiring a talented designer. Concurrently, the emergence of cryptocurrencies and the associated non-fungible tokens (NFTs) has resulted in radical changes in the creative sector, especially in the art market. Purpose of the article: The aim of this article is to examine the development pace and impact of AI-generated art and NFTs on the global art market, focusing on market trends, dependency on energy prices, segmentation, and how these changes influence artwork pricing and artists’ livelihoods. Research methods: The author has assessed the popularity and development of the NFT market, as well as the main reasons for its collapse in 2022. The author identified possible scenarios for the development of the art market, pointed out the primary potential threats, and highlighted the key determinants of future digital asset valuation. Main findings: Despite initial euphoria, buyers depreciate digital goods, especially those generated by artificial intelligence. They are considered inherently inferior and less valuable. Overproduction of works combined with the availability of AI solutions means that the traditional supply and demand mechanism lowers the price of assets and thereby forces more and more graphic designers, photographers, and painters to abandon their professions. Competition from artificial intelligence is subject to the same supply and demand mechanisms, which reduces the cost of access to quality graphics for entrepreneurs and individuals. Simultaneously, as the market becomes saturated with synthetic goods, there will be a delineation at the segment level, similar to what has happened with artisanal beers, hand-assembled cars, and furniture, or handmade ceramics. An unwritten, "made by humans" certificate will result in a 300–500% higher price for similar goods made by humans compared to works of artificial intelligence. This situation will resemble the division of the clothing or furniture market into mass-produced goods and designer items. Amid all this, the increasingly prominent role of NFTs will be evident, which will also appreciate in value, but due to the ecological taxation resulting from energy consumption that is 100,000 times higher than that of a regular bank payment.
10

Abdullahi, Nasir Umar. "Contemporary Northern Nigerian Literature and the Poverty Discourse: A Critique of Aliyu Kamal’s Hausa Boy." European Journal of English Language and Literature Studies 11, no. 5 (May 15, 2023): 76–90. http://dx.doi.org/10.37745/ejells.2013/vol11n57690.

Full text
Abstract:
Literature plays numerous roles in society: cultural, political, religious, economic, social, and scientific (therapeutic). From the classical epochs to the 21st century, writers have written, and continue to write, plays, poetry, novels, and short stories to educate, enlighten, persuade, warn, and entertain their communities, and sometimes the world at large. However, owing to the incessant changes of the times, writers have had to explore emerging themes such as migration, regional disputes, AIDS, tribalism, terrorism, ethnic and religious violence, gender politics, institutionalized corruption, and poverty. The aim of this paper is to explore the theme of poverty, one of the contemporary thematic preoccupations of African literature in the 21st century, as portrayed in one of Aliyu Kamal's latest novellas, Hausa Boy. Set in the northern part of Nigeria, Kamal's prime concern is to demonstrate how some families in the country feel the deep and painful bite of abject poverty, which not only forces them to live from hand to mouth but also makes it difficult for them to send their children to school. The end result is that the children, particularly the young girls, become street hawkers, a trade which endangers their lives in the long run. The paper also shows that, of the handful of underprivileged children who have been to school, a significant number drop out owing to their parents' inability to pay their school fees, which further feeds the rising unemployment rate in the country. Finally, the paper reveals how poverty profoundly affects courtship between young men and women, as it deters the former from fulfilling their cultural obligation of giving a token of money to their girlfriends and fiancées on each visit as a sign of love, a cultural practice that causes young men to suffer in northern Nigeria's contemporary reality.

Dissertations / Theses on the topic "Hard times tokens"

1

Wang, Jun. "Hard synchronous real-time communication with the time-token MAC protocol." Thesis, University of Bedfordshire, 2009. http://hdl.handle.net/10547/241811.

Full text
Abstract:
The timely delivery of inter-task real-time messages over a communication network is the key to successfully developing distributed real-time computer systems. These systems are rapidly developed and increasingly used in many areas such as the automation industry. This work concentrates on the timed-token Medium Access Control (MAC) protocol, which is one of the most suitable candidates to support real-time communication due to its inherent timing property of bounded medium access time. The support of real-time communication with the timed-token MAC protocol has been studied using a rigorous mathematical analysis. Specifically, to guarantee the deadlines of synchronous messages (real-time messages defined in the timed-token MAC protocol), a novel and practical approach is developed for allocating synchronous bandwidth to a general message set with the minimum deadline (Dmin) larger than the Target Token Rotation Time (TTRT). Synchronous bandwidth is defined as the maximum time for which a node can transmit its synchronous messages every time it receives the token. It is a sensitive parameter in the control of synchronous message transmission and must be properly allocated to individual nodes to guarantee the deadlines of real-time messages. Other issues related to the schedulability test, including the required buffer size and the Worst Case Achievable Utilisation (WCAU) of the proposed approach, are then discussed. Simulations and numerical examples demonstrate that this novel approach performs better than any previously published local synchronous bandwidth allocation (SBA) scheme in terms of its ability to guarantee the real-time traffic. A proper selection of the TTRT, which can maximise the WCAU of the proposed SBA scheme, is addressed. The work presented in this thesis is compatible with any network standard where the timed-token MAC protocol is employed and can therefore be applied by engineers building real-time systems using these standards.
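To make the protocol and deadline constraints mentioned in this abstract concrete, the sketch below checks a candidate synchronous bandwidth allocation against two simplified, textbook-style feasibility conditions. The variable names, the overhead term tau, and the worst-case visit count floor(D_i / TTRT) - 1 are assumptions drawn from the general timed-token literature, not the thesis's proposed scheme.

```python
# Simplified feasibility check for a synchronous bandwidth allocation (SBA)
# under a timed-token protocol. Illustrative only; not the thesis's scheme.

def protocol_constraint_ok(H, ttrt, tau):
    """Sum of synchronous bandwidths plus per-rotation overhead must fit in TTRT."""
    return sum(H) + tau <= ttrt

def deadlines_ok(C, D, H, ttrt):
    """Worst case, node i sees roughly floor(D_i / TTRT) - 1 full token visits
    of length H_i before its deadline; those visits must cover C_i."""
    return all((int(D[i] // ttrt) - 1) * H[i] >= C[i] for i in range(len(C)))

if __name__ == "__main__":
    C = [2.0, 3.0]          # transmission time needed per message
    D = [20.0, 30.0]        # deadlines (all larger than TTRT here)
    ttrt, tau = 5.0, 0.5
    H = [1.0, 2.0]          # a candidate allocation
    print(protocol_constraint_ok(H, ttrt, tau), deadlines_ok(C, D, H, ttrt))
```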

Books on the topic "Hard times tokens"

1

Rulau, Russell. Hard times tokens. 3rd ed. [Iola, Wis.]: Krause Publications, 1987.

Find full text
2

Rulau, Russell. Hard Times tokens. 4th ed. Iola, WI: Krause Publications, 1992.

Find full text
3

Rulau, Russell. The standard catalog of Hard Times tokens. 9th ed. Iola, WI: Krause Publications, 2001.

Find full text
4

Rulau, Russell. U.S. merchant tokens, 1845-1860: A catalog of the unofficial coinage of America from the end of the Hard Times era to the eve of the Civil War : includes many advertising and business promotion pieces. 3rd ed. Iola, Wis: Krause Publications, 1990.

Find full text
5

Rulau, Russell. U.S. merchant tokens, 1845-1860: A catalog of the unofficial coinage of America from the end of the Hard Times era to the eve of the Civil War : includes many advertising and business promotion pieces. 2nd ed. Iola, Wis: Krause Publications, 1985.

Find full text
6

Hard Times Tokens. Creative Media Partners, LLC, 2021.

Find full text
7

Hard Times tokens. 6th ed. Iola, WI: Krause Publications, 1996.

Find full text
8

Rulau, Russell. Hard Times Tokens/1832-1844. 4th ed. Krause Pubns Inc, 1992.

Find full text
9

Butler, Jan. Clash of the Timbres. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780199985227.003.0013.

Full text
Abstract:
This chapter explores the conflict between live- and studio-centered aesthetics in the context of the San Francisco and Los Angeles rock scenes in the late 1960s. Bands such as Jefferson Airplane, who had spent their careers playing live concerts, were faced with new challenges of working in a more technical setting. By the same token, the technological capabilities of studio recording practice allowed for increased creative opportunities. Contributors to nascent rock magazines such as Crawdaddy often disagreed about the true meaning of “authentic” rock and roll, some advocating for a more polished and produced sound while others argued that rock should imitate live performance, even in a recording studio. Over time, artists and producers developed means to oscillate between live and recorded styles depending on their artistic intention.
10

Kachun, Mitch. Crispus Attucks from the Bicentennial to the Culture Wars, 1970s–1990s. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780199731619.003.0009.

Full text
Abstract:
The 1976 bicentennial brought greater mainstream attention to Attucks and black participation in the Revolution as well as increasing opportunities to disseminate interpretations of Attucks and other African American heroes in schools and through ever-expanding mass media exposure over the subsequent decades. Attucks was becoming a standard figure in most popular American history textbooks and was featured even more visibly in mainstream culture outside the classroom. Of all the competing versions of Attucks circulating at that time, it was the taken-for-granted Revolutionary token that seemed most prominent in the nation’s collective memory; for many, he was a bland symbol of a romanticized American Revolution and an unthreatening black patriotism. By the end of the twentieth century, Attucks had, to a large degree, become a black American hero of the Revolution, though one who was still marginalized within the nation’s story.

Book chapters on the topic "Hard times tokens"

1

Prakash, Aditya. "Checking History-Determinism is NP-hard for Parity Automata." In Lecture Notes in Computer Science, 212–33. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-57228-9_11.

Full text
Abstract:
We show that the problem of checking if a given nondeterministic parity automaton simulates another given nondeterministic parity automaton is NP-hard. We then adapt the techniques used for this result to show that the problem of checking history-determinism for a given parity automaton is NP-hard. This is an improvement from Kuperberg and Skrzypczak’s previous lower bound of solving parity games from 2015. We also show that deciding if Eve wins the one-token game or the two-token game of a given parity automaton is NP-hard. Finally, we show that the problem of deciding if the language of a nondeterministic parity automaton is contained in the language of a history-deterministic parity automaton can be solved in quasi-polynomial time.
2

Ezeife, Christie I., and Timothy E. Ohanekwu. "The Use of Smart Tokens in Cleaning Integrated Warehouse Data." In Data Warehousing and Mining, 1355–75. IGI Global, 2008. http://dx.doi.org/10.4018/978-1-59904-951-9.ch077.

Full text
Abstract:
Identifying integrated records that represent the same real-world object in numerous ways is just one form of data disparity (dirt) to be resolved in a data warehouse. Data cleaning is a complex process, which uses multidisciplinary techniques to resolve conflicts in data drawn from different data sources. There is a need for initial cleaning at the time a data warehouse is built, and incremental cleaning whenever new records are brought into the data warehouse during refreshing. Existing work on data cleaning has used pre-specified record match thresholds and multiple scanning of records to determine matching records in integrated data. Little attention has been paid to incremental matching of records. Determining an optimal record match score threshold in a domain is hard. Also, direct long record string comparison is highly inefficient and intolerant to typing errors. Thus, this article proposes two algorithms, the first of which uses smart tokens defined from integrated records to match and identify duplicate records during initial warehouse cleaning. The second algorithm uses these tokens for fast, incremental cleaning during warehouse refreshing. Every attribute value forms either a special token like birth date or an ordinary token, which can be alphabetic, numeric, or alphanumeric. Rules are applied for forming tokens belonging to each of these four classes. These tokens are sorted and used for record match. The tokens also form very good warehouse identifiers for future faster incremental warehouse cleaning. This approach eliminates the need for match threshold and multiple passes at data. Experiments show that using tokens for record comparison produces a far better result than using the entire or greater part of a record.
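A much-simplified sketch of the token idea described in this abstract: each attribute value is reduced to a short token (numeric, alphabetic, and alphanumeric values handled by different rules), and two records are flagged as likely duplicates when their sorted token sets match. The specific rules and field names below are illustrative assumptions, not the authors' exact token-formation rules.

```python
# Simplified illustration of token-based duplicate detection (not the
# authors' exact rules): reduce each field to a compact token, then compare
# sorted token sets instead of full record strings.

def smart_token(value: str) -> str:
    v = "".join(value.lower().split())
    if v.isdigit():                      # numeric field: keep digits as-is
        return v
    if v.isalpha():                      # alphabetic field: sorted word initials
        return "".join(sorted(w[0] for w in value.lower().split()))
    return "".join(sorted(c for c in v if c.isalnum()))  # alphanumeric: sorted chars

def record_tokens(record: dict) -> tuple:
    return tuple(sorted(smart_token(str(v)) for v in record.values()))

def likely_duplicates(r1: dict, r2: dict) -> bool:
    return record_tokens(r1) == record_tokens(r2)

if __name__ == "__main__":
    a = {"name": "John A. Smith", "dob": "19750310", "zip": "N9B 3P4"}
    b = {"name": "Smith John A.", "dob": "19750310", "zip": "n9b3p4"}
    print(likely_duplicates(a, b))  # True: same tokens despite formatting differences
```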
3

Dipiero, W. S., and Peter Burian. "Introduction." In Euripides Ion, 3–19. New York, NY: Oxford University Press, 1996. http://dx.doi.org/10.1093/oso/9780195094510.003.0001.

Full text
Abstract:
The Ion is one of those plays of Euripides that refuses to stay put. Is it a savage attack on Apollo and traditional Greek religion? a celebration of Athens’ divine origins and imperial destiny? or a sophisticated and disenchanted comedy of ideas? It has been claimed as all these things and more. Although the various readings seem fundamentally incompatible, none can simply be dismissed as without textual foundation. But attempts to fix the play’s meaning by reference to a religious or political thesis or even to escapism dictated by the hard times in Athens around 410 B.C. are inevitably reductive. Even a cursory glance at the action is enough to suggest what disturbing riptides of thought and feeling run just below the shimmering surface of Euripidean melodrama. Kreousa, queen of Athens, and Xouthos, her foreign husband, arrive at Delphi to ask Apollo’s help in ending their childlessness. The god, however, had long ago raped Kreousa and left her with a son whom she bore in secret and abandoned. Unbeknownst to her, Apollo had the baby brought to Delphi and raised to become a temple servant. Now, when the boy is already entering young manhood, Apollo bestows him on Xouthos as the latter’s child. Kreousa, who does not know the truth about the child’s identity, reacts to her husband’s good fortune by trying to kill, as an interloper, the very son she has despaired of finding. The attempt providentially fails, but it is only after the boy in turn threatens Kreousa with death that the Pythia at last reveals the birth tokens that permit the mother to recognize and embrace her son. The child of Kreousa and Apollo will now shoulder his Athenian destiny, and Xouthos will be left content in the belief that the boy, whom he has named Ion, is really his own.
4

Franchino, Gianluca, Giorgio C. Buttazzo, and Tullio Facchinetti. "Token Passing Techniques for Hard Real-Time Communication." In Factory Automation. InTech, 2010. http://dx.doi.org/10.5772/9529.

Full text
5

Damodaran, A. "The Future of Arts Organizations." In Managing Arts in Times of Pandemics and Beyond, 209–42. Oxford University Press, 2022. http://dx.doi.org/10.1093/oso/9780192856449.003.0008.

Full text
Abstract:
The final chapter surveys the impact of COVID-19 on arts organizations. It is observed that the advent of COVID-19 in early 2020 had pushed many arts organizations to transition to digital streaming media to relay their live shows. The transition resulted in a phenomenal expansion in their customer base. To a large extent, the switch-over enabled these organizations to tide over the losses brought about by the cancellation of live shows during the lockdown phase. Indeed, many large museums had shifted to virtual visits during the lockdown phase of COVID-19. Many of them were able to amplify their virtual experience, thanks to digital imaging technologies. Though these museums reverted to the pre-pandemic routines during the latter half of 2020, they did not completely abandon their forays into the virtual world. Another development during the pandemic years was the sale of art works through blockchain platforms that minted non-fungible crypto tokens. Although it is conceded that large arts organizations are in a better position to take advantage of digital platforms and technologies, it is also argued that, with better business models, it is possible for small arts organizations to avail themselves of these technologies.
6

"Elijah The Tishbites Supplication." In Prophetic Writings Of Lady Eleanor Davies, edited by Esther S. Cope, 325–28. Oxford University PressNew York, NY, 1995. http://dx.doi.org/10.1093/oso/9780195078756.003.0033.

Full text
Abstract:
Where lastly to be short with the time like that little dark Cloud a Hand like, to the waiting Prophet no small welcom token, as gathered therefrom these of Palmistries Science, extending to the present jubilee or Number of Fifty; so points to a Blow when as much attention lends to their Note, as they of such took notice sent from him, those Baals Sons the Image of God both alike, as the Baboon or such like theirs carried with the current of the Cormorant Times.
7

Reichenbach, Hans. "The Tenses of Verbs." In Semantics, 526–33. New York, NY: Oxford University Press, 2004. http://dx.doi.org/10.1093/oso/9780195136975.003.0024.

Full text
Abstract:
A particularly important form of token-reflexive symbol is found in the tenses of verbs. The tenses determine time with reference to the time point of the act of speech—that is, of the token uttered. A closer analysis reveals that the time indication given by the tenses is of a rather complex structure. Let us call the time point of the token the point of speech. Then the three indications—”before the point of speech,” “simultaneous with the point of speech,” and “after the point of speech”—furnish only three tenses; since the number of verb tenses is obviously greater, we need a more complex interpretation. From a sentence like ‘Peter had gone’ we see that the time order expressed in the tense does not concern one event, but two events, whose positions are determined with respect to the point of speech.
8

Schulzinger, Robert D. "Fighting the War: 1965–1967." In A Time for War, 182–214. New York, NY: Oxford University Press, 1997. http://dx.doi.org/10.1093/oso/9780195071894.003.0008.

Full text
Abstract:
From the end of 1965 until the end of 1967 the war in Vietnam became more and more of an American affair. At the beginning of 1965 the United States stationed 23,000 troops in Vietnam. A year later the number was 184,000, rising to 385,000 at the end of the year and 535,000 by the beginning of 1968. Although the United States preferred conventional “big unit” confrontations with the NLF and the North Vietnamese, the enemy decided when to engage the Americans and ARVN forces, thereby limiting their own casualties until the time they expected the Americans would weary of the war. General William C. Westmoreland, the U.S. commander, tried unsuccessfully to counter these tactics with an attrition strategy of his own. He sent giant B-52 bombers and smaller fighter bombers over South Vietnam to terrorize the Vietcong. After the bombers had prepared the battlefield, helicopter-borne American units descended on the countryside on search-and-destroy missions to root out and kill enemy soldiers. Americans would fly out from their bases in the morning, pursue the Vietcong or North Vietnamese in fire fights, and return to bases in the evening. The tokens of progress in the war became the “body count” of soldiers killed, rather than territory captured or decapitation of the enemy’s command and control structure. Westmoreland adopted the procedure because it seemed to provide the quantifiable data that McNamara insisted upon.
9

"Antidotes and Counterspells." In Curse Tablets and Binding Spells from the Ancient World, edited by John G. Gager, 218–42. Oxford University PressNew York, NY, 1992. http://dx.doi.org/10.1093/oso/9780195062267.003.0008.

Full text
Abstract:
In his account of the Jewish uprising against the foreign dynasty of the Greek Seleucids, the pious editor of 2 Maccabees relates the following episode (12:34-39): Judas, surnamed Maccabeus (“the hammerer”), lost a number of his men in battle; on the following day, when Judas went out to recover their bodies, he discovered that every fallen soldier had been wearing an amulet (“sacred tokens of the idols of Jamnia”), which, the editor notes in a sanctimonious aside, “the Law forbids the Jews to wear.” Whether Judas thought to check for similar amulets among the survivors, the editor does not trouble to say, for the message is clear: the dead had fallen because of their forbidden compromise with heathen beliefs and practices. But the great likelihood is that the survivors, too, had fortified themselves against “anything harmful” by putting on their engraved stones or their inscribed sheets of metal and papyrus. To be sure, 2 Maccabees does not offer the sort of hard demographic data preferred by modern social scientists, but the fact remains that in this randomly chosen sample of ancient Jews, every one wore an amulet, as did virtually every sensible person of the time.
10

Park, Robert L. "The Virtual Astronaut." In Voodoo Science, 68–91. Oxford: Oxford University Press, 2000. http://dx.doi.org/10.1093/oso/9780198507451.003.0004.

Full text
Abstract:
I reluctantly took my place at the witness table before the House Subcommittee of Space and Aeronautics on April 9, 1997, to testify about the International Space Station. I would be about as popular at this hearing as a skunk that wandered into a garden party. Seated between the head of NASA’s human spaceflight program on my left and a former astronaut on my right, I had been invited as the token critic. I had testified on the space station before congressional committees many times in the past, but there had always been some hope that Congress might be persuaded to cancel the project. But in 1997, although the space station was years behind schedule and several times over budget, its support in Congress and particularly in the Space Subcommittee was stronger than ever. With the much-delayed launch of the first module now

Conference papers on the topic "Hard times tokens"

1

Zhai, Mingliang, Yulin Li, Xiameng Qin, Chen Yi, Qunyi Xie, Chengquan Zhang, Kun Yao, Yuwei Wu, and Yunde Jia. "Fast-StrucTexT: An Efficient Hourglass Transformer with Modality-guided Dynamic Token Merge for Document Understanding." In Thirty-Second International Joint Conference on Artificial Intelligence {IJCAI-23}. California: International Joint Conferences on Artificial Intelligence Organization, 2023. http://dx.doi.org/10.24963/ijcai.2023/585.

Full text
Abstract:
Transformers achieve promising performance in document understanding because of their high effectiveness, but they still suffer from quadratic computational complexity in the sequence length. General efficient transformers are challenging to adapt directly to document modeling. They are unable to handle the layout representation in documents (e.g., word, line, and paragraph) at different granularity levels, and they struggle to achieve a good trade-off between efficiency and performance. To tackle these concerns, we propose Fast-StrucTexT, an efficient multi-modal framework based on the StrucTexT algorithm with an hourglass transformer architecture, for visual document understanding. Specifically, we design a modality-guided dynamic token merging block to make the model learn multi-granularity representations and prune redundant tokens. Additionally, we present a multi-modal interaction module called Symmetry Cross-Attention (SCA) to consider multi-modal fusion and efficiently guide the token mergence. The SCA allows one modality input as query to calculate cross attention with another modality in a dual phase. Extensive experiments on FUNSD, SROIE, and CORD datasets demonstrate that our model achieves state-of-the-art performance and almost 1.9x faster inference time than the state-of-the-art methods.
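The dynamic token merging mentioned in this abstract can be illustrated generically: the sketch below repeatedly averages the most similar adjacent pair of token embeddings until a target length is reached. This is a generic token-reduction sketch with assumed shapes, not the paper's modality-guided merging block or its Symmetry Cross-Attention module.

```python
# Generic token-merging sketch (in the spirit of dynamic token reduction;
# NOT the paper's modality-guided block). Merges the most similar adjacent
# token pair by averaging until the target length is reached.
import numpy as np

def merge_tokens(tokens: np.ndarray, target_len: int) -> np.ndarray:
    """tokens: (n, d) array of token embeddings."""
    toks = tokens.astype(float)
    while len(toks) > target_len:
        a, b = toks[:-1], toks[1:]
        sims = (a * b).sum(1) / (np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1) + 1e-9)
        i = int(np.argmax(sims))                     # most redundant adjacent pair
        merged = (toks[i] + toks[i + 1]) / 2.0
        toks = np.vstack([toks[:i], merged[None, :], toks[i + 2:]])
    return toks

if __name__ == "__main__":
    x = np.random.default_rng(0).normal(size=(16, 8))
    print(merge_tokens(x, target_len=10).shape)  # (10, 8)
```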
2

Franchino, Gianluca, Giorgio C. Buttazzo, and Tullio Facchinetti. "Properties of BuST and timed-token protocols in managing hard real-time traffic." In Factory Automation (ETFA 2008). IEEE, 2008. http://dx.doi.org/10.1109/etfa.2008.4638555.

Full text
3

Franchino, Gianluca, Giorgio C. Buttazzo, and Tullio Facchinetti. "BuST: Budget Sharing Token protocol for hard real-time communication." In 2007 IEEE Conference on Emerging Technologies & Factory Automation (EFTA 2007). IEEE, 2007. http://dx.doi.org/10.1109/efta.2007.4416928.

Full text
4

Winnicka-Jasłowska, Dorota. "Function, Form and Ergonomics of Design Solutions for Entrance Zones to Public Utility Buildings. In Situ Analyses." In Applied Human Factors and Ergonomics Conference. AHFE International, 2018. http://dx.doi.org/10.54941/ahfe100108.

Full text
Abstract:
Entrance zones of modern public utility buildings have always had three major functions in the contact between man and the building. First and foremost, they connect the external world with the interiors of the building. Secondly, they provide functional comfort and safety by means of architectural and material solutions, and, last but not least, they are tokens of an institution's prestige and its visiting card. The order of these three priority functions has been subject to change over the centuries. It is enough to analyze different architectural styles to notice the predominance of one function over the other two. Likewise, the functionality and ease of entrance have also been understood in different manners. Nevertheless, from the perspective of the 21st-century citizen, it seems that the user has not always been considered with the same importance. Old buildings of the past should not be assessed on the same terms as modern ones. At first, cultural, anthropological, and human body dimension factors had the greatest influence on the architectural solutions of entrance zones, only later followed by the architectural styles prevailing at given times.
5

Ramachandran, Vilayanur S. "Perception of shape from shading." In OSA Annual Meeting. Washington, D.C.: Optica Publishing Group, 1989. http://dx.doi.org/10.1364/oam.1989.mu3.

Full text
Abstract:
Since the time of Leonardo Da Vinci there has been very little research on how human observers perceive shape from shading. Our experiments suggest that shading is a primitive visual dimension that is extracted relatively early in visual processing. More recently, our research has focused on the question of how information about shading interacts with other visual functions such as motion, occlusion, stereopsis, symmetry, and perceptual grouping. We find that (1) In extracting shape from shading the visual system incorporates the single light source constraint; i.e., it assumes that there is only one light source illuminating most of the image. This is especially true for different parts of a single object. (The rule can sometimes be overridden for multiple objects.) (2) Tokens defined by shading can be used for perceptual grouping and segregation. (3) Occlusion boundaries (e.g., illusory contours) strongly influence the extraction of shape from shading. Chromatic borders, on the other hand, are completely ineffective. (4) Shading can provide an input to motion perception.
6

Liu, Zicheng, Li Wang, Siyuan Li, Zedong Wang, Haitao Lin, and Stan Z. Li. "LongVQ: Long Sequence Modeling with Vector Quantization on Structured Memory." In Thirty-Third International Joint Conference on Artificial Intelligence {IJCAI-24}. California: International Joint Conferences on Artificial Intelligence Organization, 2024. http://dx.doi.org/10.24963/ijcai.2024/510.

Full text
Abstract:
Transformer models have been successful in various sequence processing tasks, but the self-attention mechanism's computational cost limits its practicality for long sequences. Although there are existing attention variants that improve computational efficiency, they have a limited ability to abstract global information effectively based on their hand-crafted mixing strategies. On the other hand, state-space models (SSMs) are tailored for long sequences but cannot capture complicated local information. Therefore, the combination of them as a unified token mixer is a trend in recent long-sequence models. However, the linearized attention degrades performance significantly even when equipped with SSMs. To address the issue, we propose a new method called LongVQ. LongVQ uses the vector quantization (VQ) technique to compress the global abstraction as a length-fixed codebook, enabling the linear-time computation of the attention matrix. This technique effectively maintains dynamic global and local patterns, which helps to complement the lack of long-range dependency issues. Our experiments on the Long Range Arena benchmark, autoregressive language modeling, and image and speech classification demonstrate the effectiveness of LongVQ. Our model achieves significant improvements over other sequence models, including variants of Transformers, Convolutions, and recent State Space Models.
7

Radescu, Radu, and Sever Pasca. "ENHANCING THE SECURITY LEVEL OF THE NEW VERSION OF THE EASY-LEARNING ONLINE PLATFORM." In eLSE 2017. Carol I National Defence University Publishing House, 2017. http://dx.doi.org/10.12753/2066-026x-17-108.

Full text
Abstract:
The Easy-Learning platform is an online education system developed as an original product of the Department of Applied Electronics and Information Engineering at the University Politehnica of Bucharest. The platform has undergone many changes over the years, growing from a simple project into a complex and efficient virtual learning environment. At this time, the platform has achieved a high degree of maturity, built on the Symfony framework, which simplifies many repetitive tasks, enables the automatic generation of entities, and integrates with other technologies currently in use. To design and implement version 2.0 of the Easy-Learning platform, the following technologies were used: PHP5, JavaScript, HTML5, CSS3, the MariaDB database management system (compared to MySQL in previous versions), RESTful Web services, Android, Apache (as Web server), and methodologies for securing the communication between an application server and a client application. Due to technology and security issues that arise in older versions of each framework, it was decided to rewrite the PHP code of the platform in order to use the Symfony 2 framework. Access to the administrator, tutor, and student interfaces is secured with a user name and password, so that unauthenticated users are not allowed in. On a server running MariaDB, multiple users may be defined. For security reasons, the root user should be used only for administrative purposes. Each user of the system must have an individual account, which corresponds to a user name and password. These should not be identical to usernames and passwords used outside the MariaDB system (for example, user names and passwords for UNIX and NT). Like MySQL, MariaDB has a complex system of privileges. A privilege is the right to perform a particular action on an object and is associated with a particular user; the concept is very similar to file permissions. When users are created in MariaDB, they are assigned a set of privileges that specify the actions they can perform in the system. JavaScript scripts are limited by severe restrictions imposed by web browsers: for security reasons, JavaScript cannot read, write, create, or delete files on the hard disk. In terms of security, PHP provides developers with a flexible and efficient set of safety measures. The open-source development of PHP led to its rapid adaptation to the needs of the Web and to efficient, secure code; PHP 5.5 (2013) and 5.6 (2014) are stable versions that resolve security issues. Twig is a templating system that supports PHP; among the advantages of its use is its security function. Twig has a sandbox mode used to evaluate code and can be used as a templating language where users are allowed to execute design actions. Among the recent improvements made to the Easy-Learning platform are the rewriting of the PHP code so that the Symfony 2 framework structure can be used, and the securing of authentication forms and private sections using an SSL certificate. Symfony 2 allows passwords to be hashed using different algorithms, such as MD5, SHA1, SHA512, and bcrypt. The bcrypt algorithm is a password key derivation function based on the Blowfish cipher. Besides defining a salt to protect against dictionary-based attacks using precomputed values, it is an adaptive function: over time, the number of iterations can be increased to make password cracking more difficult. The bcrypt algorithm used for passwords in Symfony 2 is presented.
Because the platform works with personal data, the data must be sent to the server through a secure protocol to prevent attacks such as Man-in-the-Middle or data theft. To demonstrate the implementation of this requirement, SSL (Secure Sockets Layer) certificates generated on the development server were used. This means that, when accessing the Easy-Learning platform, the browser will display a warning message indicating that the SSL certificate was not issued by a certificate authority. Porting the new version of the platform to the production server introduced a valid certificate. The sections for which it was considered opportune to introduce the HTTPS protocol are the login page and the admin, tutor, and student interfaces. Communication via HTTPS relies on a (public key, private key) pair: the data entered in a form is encrypted using the public key and sent to the server, and the server, using the private key, can decrypt and extract the contents of the data sent. The procedures for securing a specific page in Symfony 2 and the entire admin interface are presented. To test platform vulnerabilities, Acunetix Web Vulnerability Scanner 8 was used; the vulnerabilities tested with this tool include SQL Injection, XSS, Trojan Script, Week_Password_Basic_Auth, CRLF Injection, PHP Code Injection, and CSRF. The results obtained by running this tool are presented. Recent contributions to the Easy-Learning platform include the creation of a RESTful web service that can be used by external applications to access public and private information, based on a token generated for each student by its authentication service. In the future, it is intended to add an external caching system such as Varnish.
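Since the abstract discusses bcrypt's salt and adaptive cost factor for password storage, here is a minimal illustration using Python's bcrypt package. The platform itself uses PHP and Symfony 2; this sketch only demonstrates the salt and the tunable work factor, not the platform's code.

```python
# Minimal bcrypt illustration (Python's `bcrypt` package). The Easy-Learning
# platform uses PHP/Symfony 2; this only demonstrates the salt and the
# adaptive cost factor discussed in the abstract.
import bcrypt

password = b"correct horse battery staple"

# gensalt() embeds a random salt and a cost factor; raising `rounds`
# (the work factor) makes hashing, and thus brute force, slower.
hashed = bcrypt.hashpw(password, bcrypt.gensalt(rounds=12))

print(hashed.decode())                            # e.g. $2b$12$<salt><hash>
print(bcrypt.checkpw(password, hashed))           # True
print(bcrypt.checkpw(b"wrong password", hashed))  # False
```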
8

Schooley, Ben, Akanksha Singh, Sarah Floyd, Stephan Pill, and John Brooks. "Direct Weighting Interactive Design of Patient Preferences for Shared Decision Making in Orthopaedic Practice." In 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022). AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1002105.

Full text
Abstract:
Patients need the ability to accurately and efficiently communicate their preferences across outcome domains to their healthcare providers.1-7 No existing system provides an efficient and timely approach to collect and communicate patient preferences across outcome domains to support shared decision making (SDM) in orthopaedic practice.2-4,8-19 The overarching goal of this research is to design, build, and test an app that collects baseline patient preferences and health status across orthopaedic outcomes and reports this information to the provider for use in patient care. A core component of the app is a Direct-Weighting (DW) preference assessment approach, originated from our prior research, and applied in a touchscreen based interactive design. It is envisioned that patients will use the app after scheduling a first visit to a surgeon for a new orthopaedic condition. Direct weighting (DW) approaches calculate patient-specific preference weights across outcomes by asking patients to disperse portions of a hypothetical “whole” across outcomes in a manner that reflects a patient’s preferences.20 DW has low respondent burden but it requires respondents to make “implicit” comparisons which may be difficult to conceptualize.20 However, the DW approach has become generally accepted in the quality-of-life literature and it has been shown that patients dividing up pieces of a “pie” across quality-of-life domains yields valid representations of patient preferences across the domains.20-22 However, the DW approach has not been validated with specific clinical scenarios using a clinically focused set of outcomes or by using a mobile software app. Drawing on prior research, we iteratively design and develop the app with input from prior DW research, informaticians, and clinicians. We use a qualitative approach to pilot test the app with 20 first-time visit patients presenting with joint pain and/or function deficiency. Participants were interviewed about their outcome preferences for care, used the app to prioritize outcome preferences, answered interview questions about their experience using the app, and completed a mHealth App Usability Questionnaire (MAUQ). Interview questions focused on the utility and usability of the mobile app for communicating with their provider, and capability of the app to capture their outcome preferences. Results validated five core preference domains, with most users dividing their 100-point allocation across 1-3 domains. The tool received moderate to high usability scores. Patients with older age and lower literacy found the DW approach more difficult in terms of allocating 100 points across 5 domains. Suggestions for DW interface interaction improvement included instantiation of a token/points oriented DW preference scoring methodology rather than a 1-10 sliding scale approach for improved preference weighting cognition and SDM with a provider. As more patient reported outcome (PRO) apps hit the marketplace across a broad spectrum of health conditions, these results provide evidence for a DW approach and interactive design for patients to communicate their treatment preferences to their providers.
References:
1. Baumhauer JF, Bozic KJ. Value-based Healthcare: Patient-reported Outcomes in Clinical Decision Making. Clin Orthop Relat Res. 2016;474(6):1375-1378.
2. Slim K, Bazin JE. From informed consent to shared decision-making in surgery. J Visc Surg. 2019;156(3):181-184.
3. Damman OC, Jani A, de Jong BA, et al. The use of PROMs and shared decision-making in medical encounters with patients: An opportunity to deliver value-based health care to patients. J Eval Clin Pract. 2020;26(2):524-540.
4. Sorensen NL, Hammeken LH, Thomsen JL, Ehlers LH. Implementing patient-reported outcomes in clinical decision-making within knee and hip osteoarthritis: an explorative review. BMC Musculoskelet Disord. 2019;20(1):230.
5. Kamal RN, Lindsay SE, Eppler SL. Patients Should Define Value in Health Care: A Conceptual Framework. J Hand Surg Am. 2018;43(11):1030-1034.
6. Charles C, Gafni A, Whelan T. Decision-making in the physician-patient encounter: revisiting the shared treatment decision-making model. Social Science & Medicine. 1999;49(5):651-661.
7. Niburski K, Guadagno E, Mohtashami S, Poenaru D. Shared decision making in surgery: A scoping review of the literature. Health Expect. 2020.
8. Selten EM, Geenen R, van der Laan WH, et al. Hierarchical structure and importance of patients' reasons for treatment choices in knee and hip osteoarthritis: a concept mapping study. Rheumatology (Oxford). 2017;56(2):271-278.
9. Kannan S, Seo J, Riggs KR, Geller G, Boss EF, Berger ZD. Surgeons' Views on Shared Decision-Making. J Patient Cent Res Rev. 2020;7(1):8-18.
10. Briffa N. The employment of Patient-Reported Outcome Measures to communicate the likely benefits of surgery. Patient Relat Outcome Meas. 2018;9:263-266.
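As a small illustration of the direct-weighting idea described in the abstract above, the snippet below turns a patient's 100-point allocation across outcome domains into normalized preference weights. The domain names and the validation rule are illustrative assumptions, not the study's actual instrument.

```python
# Direct-weighting sketch: a patient spreads 100 points across outcome
# domains; weights are the normalized shares. Domain names are assumed
# for illustration and are not the study's actual instrument.

DOMAINS = ["pain relief", "function", "recovery time", "complication risk", "cosmesis"]

def preference_weights(points: dict) -> dict:
    total = sum(points.get(d, 0) for d in DOMAINS)
    if total != 100:
        raise ValueError(f"allocation must sum to 100, got {total}")
    return {d: points.get(d, 0) / 100 for d in DOMAINS}

if __name__ == "__main__":
    allocation = {"pain relief": 50, "function": 30, "recovery time": 20}
    print(preference_weights(allocation))
    # {'pain relief': 0.5, 'function': 0.3, 'recovery time': 0.2,
    #  'complication risk': 0.0, 'cosmesis': 0.0}
```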