Academic literature on the topic 'Winogrand'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Winogrand.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Winogrand"

1

Constantine, Simon. "From the Museum to the Street: Garry Winogrand’s Public Relations and the Actuality of Protest." Arts 8, no. 2 (2019): 59. http://dx.doi.org/10.3390/arts8020059.

Abstract:
Focusing on Garry Winogrand’s Public Relations (1977), this article explores the problematic encounter between street photography and protest during the Vietnam War era. In doing so, it considers the extent to which Winogrand’s engagement with protest altered the formalist discourse that had surrounded his practice and the ‘genre’ of street photography more broadly since the 1950s. It is suggested that, although Winogrand never abandoned his debt to this framework, the logic of protest also intensified its internal contradictions, prompting a new attitude towards the crowd, art institution, street and mass media. By exploring this shift, this article seeks to demonstrate that, while the various leftist critiques of Winogrand’s practice remain valid, Public Relations had certain affinities with the progressive artistic and political movements of the period.
2

Parry Janis, Eugenia. "Winogrand; Excesses of a Modern Mannerist." History of Photography 13, no. 3 (1989): 257–58. http://dx.doi.org/10.1080/03087298.1989.10441202.

3

Anderson, Kathy J. "WINOGRAND: FIGMENTS FROM THE REAL WORLD. John Szarkowski." Art Documentation: Journal of the Art Libraries Society of North America 7, no. 4 (1988): 171–72. http://dx.doi.org/10.1086/adx.7.4.27947982.

4

Sakaguchi, Keisuke, Ronan Le Bras, Chandra Bhagavatula, and Yejin Choi. "WinoGrande: An Adversarial Winograd Schema Challenge at Scale." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 05 (2020): 8732–40. http://dx.doi.org/10.1609/aaai.v34i05.6399.

Abstract:
The Winograd Schema Challenge (WSC) (Levesque, Davis, and Morgenstern 2011), a benchmark for commonsense reasoning, is a set of 273 expert-crafted pronoun resolution problems originally designed to be unsolvable for statistical models that rely on selectional preferences or word associations. However, recent advances in neural language models have already reached around 90% accuracy on variants of WSC. This raises an important question: whether these models have truly acquired robust commonsense capabilities, or whether they rely on spurious biases in the datasets that lead to an overestimation of the true capabilities of machine commonsense. To investigate this question, we introduce WinoGrande, a large-scale dataset of 44k problems, inspired by the original WSC design, but adjusted to improve both the scale and the hardness of the dataset. The key steps of the dataset construction consist of (1) a carefully designed crowdsourcing procedure, followed by (2) systematic bias reduction using a novel AfLite algorithm that generalizes human-detectable word associations to machine-detectable embedding associations. The best state-of-the-art methods on WinoGrande achieve 59.4–79.1%, which is ∼15–35% (absolute) below human performance of 94.0%, depending on the amount of training data allowed (2%–100%, respectively). Furthermore, we establish new state-of-the-art results on five related benchmarks — WSC (→ 90.1%), DPR (→ 93.1%), COPA (→ 90.6%), KnowRef (→ 85.6%), and Winogender (→ 97.1%). These results have dual implications: on one hand, they demonstrate the effectiveness of WinoGrande when used as a resource for transfer learning. On the other hand, they raise a concern that we are likely to be overestimating the true capabilities of machine commonsense across all these benchmarks. We emphasize the importance of algorithmic bias reduction in existing and future benchmarks to mitigate such overestimation.
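The fill-in-the-blank format described in the abstract can be sketched in a few lines of Python. The twin "trophy/suitcase" sentences below are the classic Winograd schema from the WSC literature, used here purely to illustrate the style of the task; they are not items from the WinoGrande dataset, and `fill_blank` is a hypothetical helper:

```python
# Two "twin" problems in the WinoGrande fill-in-the-blank style.
# Flipping a single trigger word (large -> small) flips the answer,
# which defeats models that rely on simple word associations.
# (Illustrative sketch; not actual WinoGrande items.)
TWINS = [
    {
        "sentence": "The trophy didn't fit in the suitcase because the _ was too large.",
        "options": ("trophy", "suitcase"),
        "answer": "trophy",
    },
    {
        "sentence": "The trophy didn't fit in the suitcase because the _ was too small.",
        "options": ("trophy", "suitcase"),
        "answer": "suitcase",
    },
]

def fill_blank(problem: dict, option: str) -> str:
    """Substitute one of the two candidate antecedents into the blank."""
    return problem["sentence"].replace("_", option, 1)

for p in TWINS:
    print(fill_blank(p, p["answer"]))
```

A solver must pick the option that makes the filled sentence plausible; because the twins share almost all surface words, statistics alone give little signal.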
5

Baer, Ulrich. "Seeing the Future in an Image from the Past: Hannah Arendt, Garry Winogrand, and Photographing the World." Yearbook of Comparative Literature 55, no. 1 (2009): 226–63. http://dx.doi.org/10.1353/cgl.2011.0000.

6

Sakaguchi, Keisuke, Ronan Le Bras, Chandra Bhagavatula, and Yejin Choi. "WinoGrande." Communications of the ACM 64, no. 9 (2021): 99–106. http://dx.doi.org/10.1145/3474381.

Abstract:
Commonsense reasoning remains a major challenge in AI, and yet recent progress on benchmarks may seem to suggest otherwise. In particular, recent neural language models have reported above 90% accuracy on the Winograd Schema Challenge (WSC), a commonsense benchmark originally designed to be unsolvable for statistical models that rely simply on word associations. This raises an important question: whether these models have truly acquired robust commonsense capabilities, or whether they rely on spurious biases in the dataset that lead to an overestimation of the true capabilities of machine commonsense. To investigate this question, we introduce WinoGrande, a large-scale dataset of 44k problems, inspired by the original WSC, but adjusted to improve both the scale and the hardness of the dataset. The key steps of the dataset construction consist of (1) large-scale crowdsourcing, followed by (2) systematic bias reduction using a novel AfLite algorithm that generalizes human-detectable word associations to machine-detectable embedding associations. Our experiments demonstrate that state-of-the-art models achieve considerably lower accuracy (59.4%–79.1%) on WinoGrande compared to humans (94%), confirming that the high performance on the original WSC was inflated by spurious biases in the dataset. Furthermore, we report new state-of-the-art results on five related benchmarks with emphasis on their dual implications. On the one hand, they demonstrate the effectiveness of WinoGrande when used as a resource for transfer learning. On the other hand, the high performance on all these benchmarks suggests the extent to which spurious biases are prevalent in all such datasets, which motivates further research on algorithmic bias reduction.
7

Yang, Tao, Zhezhi He, Tengchuan Kou, et al. "BISWSRBS: A Winograd-based CNN Accelerator with a Fine-grained Regular Sparsity Pattern and Mixed Precision Quantization." ACM Transactions on Reconfigurable Technology and Systems 14, no. 4 (2021): 1–28. http://dx.doi.org/10.1145/3467476.

Abstract:
Field-Programmable Gate Arrays (FPGAs) are a high-performance computing platform for Convolutional Neural Network (CNN) inference. The Winograd algorithm, weight pruning, and quantization are widely adopted to reduce the storage and arithmetic overhead of CNNs on FPGAs. Recent studies strive to prune the weights in the Winograd domain; however, this results in irregular sparse patterns, leading to low parallelism and reduced resource utilization. Moreover, few works discuss a suitable quantization scheme for Winograd. In this article, we propose a regular sparse pruning pattern for Winograd-based CNNs, namely the Sub-row-balanced Sparsity (SRBS) pattern, to overcome the challenge of the irregular sparse pattern. We then develop a two-step hardware co-optimization approach to improve model accuracy using the SRBS pattern. Based on the pruned model, we implement mixed-precision quantization to further reduce the computational complexity of bit operations. Finally, we design an FPGA accelerator that takes advantage of the SRBS pattern to eliminate low-parallelism computation and irregular memory accesses, and of the mixed-precision quantization to obtain a layer-wise bit width. Experimental results on VGG16/VGG-nagadomi with CIFAR-10 and ResNet-18/34/50 with ImageNet show up to 11.8×/8.67× and 8.17×/8.31×/10.6× speedup, and 12.74×/9.19× and 8.75×/8.81×/11.1× energy-efficiency improvement, respectively, compared with the state-of-the-art dense Winograd accelerator [20], with negligible loss of model accuracy. We also show that our design has a 4.11× speedup compared with the state-of-the-art sparse Winograd accelerator [19] on VGG16.
8

Castro, Roberto L., Diego Andrade, and Basilio B. Fraguela. "OpenCNN: A Winograd Minimal Filtering Algorithm Implementation in CUDA." Mathematics 9, no. 17 (2021): 2033. http://dx.doi.org/10.3390/math9172033.

Abstract:
Improving the performance of the convolution operation has become a key target for High Performance Computing (HPC) developers due to its prevalence in deep learning, applied mainly to video processing. The improvement is being pushed by algorithmic and implementation innovations. Algorithmically, the convolution can be solved as it is mathematically formulated, but other methods allow it to be transformed into a Fast Fourier Transform (FFT) or a GEneral Matrix Multiplication (GEMM). In this latter group, the Winograd algorithm is a state-of-the-art variant that is especially suitable for smaller convolutions. In this paper, we present openCNN, an optimized CUDA C++ implementation of the Winograd convolution algorithm. Our approach achieves speedups of up to 1.76× on a Turing RTX 2080 Ti and up to 1.85× on an Ampere RTX 3090 with respect to the Winograd convolution in cuDNN 8.2.0. OpenCNN is released as open-source software.
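The minimal-filtering idea underlying such implementations can be shown at its smallest size, F(2,3): two outputs of a 3-tap filter computed from a 4-element input tile with 4 multiplications instead of 6. The following is a plain-Python sketch of the standard transform, not code from openCNN; `winograd_f23` and `direct_f23` are illustrative names:

```python
def winograd_f23(d, g):
    """Winograd minimal filtering F(2,3): two outputs of a 3-tap
    correlation over a 4-element input tile, using 4 multiplications
    instead of the 6 a direct computation needs."""
    d0, d1, d2, d3 = d
    g0, g1, g2 = g
    m1 = (d0 - d2) * g0
    m2 = (d1 + d2) * (g0 + g1 + g2) / 2
    m3 = (d2 - d1) * (g0 - g1 + g2) / 2
    m4 = (d1 - d3) * g2
    return [m1 + m2 + m3, m2 - m3 - m4]

def direct_f23(d, g):
    """Reference: direct sliding correlation (6 multiplications)."""
    return [sum(d[i + k] * g[k] for k in range(3)) for i in range(2)]

print(winograd_f23([1.0, 2.0, 3.0, 4.0], [0.5, 1.0, -1.0]))  # [-0.5, 0.0]
```

In CNN libraries the same scheme is lifted to 2-D tiles (e.g. F(2×2, 3×3)), where the filter transform is precomputed once and amortized over many input tiles.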
9

Ubiquity staff. "Talking with Terry Winograd." Ubiquity 2002, July (2002): 1. http://dx.doi.org/10.1145/763764.763765.

10

Ubiquity staff. "Talking with Terry Winograd." Ubiquity 2002, July (2002): 1. http://dx.doi.org/10.1145/763766.763765.


Dissertations / Theses on the topic "Winogrand"

1

Tablada, Claudio Javier. "Aproximações para a DCT Baseadas nos Algoritmos de Feig-Winograd e Chen." Universidade Federal de Pernambuco, 2014. https://repositorio.ufpe.br/handle/123456789/12303.

Abstract:
In recent years, the signal processing and analysis community has produced theoretical and practical contributions aimed at proposing approximations for the discrete cosine transform (DCT). The DCT is important as the central mathematical tool employed in several image and video compression standards, such as JPEG, MPEG-1, MPEG-2, H.261, H.263, H.264, and the recent HEVC. Approximations for the DCT are usually multiplication-free and can be implemented in hardware at low computational cost. This dissertation reviews the literature on DCT approximations and the main results obtained in this field. As original contributions, it proposes: (i) a class of DCT approximations based on the parametrization of the Feig-Winograd factorization, and (ii) two approximations based on the Chen factorization. For the class of approximations based on the Feig-Winograd factorization, a multi-objective optimization problem was considered in order to select transforms that are optimal with respect to objective quality measures such as energy error, mean squared error, coding gain, and transform efficiency. The approximations introduced in this work are evaluated in the context of image compression and compared with approximations described in the literature, using the peak signal-to-noise ratio and the structural similarity index as figures of merit. The results show that the proposed approximations are good transforms for use in image compression applications that require low implementation cost.
2

Bruand, Aline Vanessa. "„The Epic American... – Garry Winogrands Blick auf die USA"" [Garry Winogrand's view of the USA]. Supervisor: M. Papenbrock. Karlsruhe: KIT-Bibliothek, 2020. http://d-nb.info/1221186892/34.

3

Qureshi, Fahad, Mario Garrido, and Oscar Gustafsson. "Unified architecture for 2, 3, 4, 5, and 7-point DFTs based on Winograd Fourier transform algorithm." Linköpings universitet, Elektroniksystem, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-74760.

Abstract:
A unified hardware architecture that can be reconfigured to calculate 2, 3, 4, 5, or 7-point DFTs is presented. The architecture is based on the Winograd Fourier transform algorithm and the complexity is equal to a 7-point DFT in terms of adders/subtractors and multipliers plus only seven multiplexers introduced to enable reconfigurability. The processing element finds potential use in memory-based FFTs, where non-power-of-two sizes are required such as in DMB-T.
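The flavour of the Winograd Fourier transform algorithm can be seen in the smallest non-trivial case, a 3-point DFT computed with only two multiplications by real constants. This is a minimal sketch of the textbook factorization, not the paper's unified 2/3/4/5/7-point hardware architecture:

```python
import cmath
import math

def winograd_dft3(x):
    """3-point DFT via the Winograd factorization: two multiplications
    by real constants (c and s), plus additions."""
    x0, x1, x2 = x
    t = x1 + x2
    u = x1 - x2
    c = -0.5                 # cos(2*pi/3), real part of the twiddle factor
    s = -math.sqrt(3) / 2    # -sin(2*pi/3), imaginary part
    a = x0 + c * t
    b = 1j * s * u
    return [x0 + t, a + b, a - b]

def direct_dft(x):
    """Reference O(N^2) DFT for checking the result."""
    n = len(x)
    return [sum(x[m] * cmath.exp(-2j * cmath.pi * k * m / n) for m in range(n))
            for k in range(n)]
```

Because multipliers dominate hardware cost, collapsing the 3-point DFT to two constant multiplications is exactly what makes a WFTA-based processing element attractive for non-power-of-two FFT sizes.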
4

Stothers, Andrew James. "On the complexity of matrix multiplication." Thesis, University of Edinburgh, 2010. http://hdl.handle.net/1842/4734.

Abstract:
The evaluation of the product of two matrices can be very computationally expensive. The multiplication of two n × n matrices, using the "default" algorithm, can take O(n³) field operations in the underlying field k. It is therefore desirable to find algorithms that reduce the "cost" of multiplying two matrices together. If the multiplication of two n × n matrices can be obtained in O(n^α) operations, the least upper bound for α is called the exponent of matrix multiplication and is denoted by ω. A bound ω < 3 was found in 1968 by Strassen with his algorithm. He found that the multiplication of two 2 × 2 matrices could be obtained with 7 multiplications in the underlying field k, as opposed to the 8 required previously. Using recursion, this shows that ω ≤ log₂ 7 < 2.8074, which is better than the previous value of 3. In chapter 1, we look at various techniques that have been found for reducing ω. These include Pan's Trilinear Aggregation, Bini's Border Rank and Schönhage's Asymptotic Sum Inequality. In chapter 2, we look in detail at the current best estimate of ω, found by Coppersmith and Winograd. We also propose a different method of evaluating the "value" of trilinear forms. Chapters 3 and 4 build on the work of Coppersmith and Winograd and examine how cubing and raising to the fourth power of Coppersmith and Winograd's "complicated" algorithm affect the value of ω, if at all. Finally, in chapter 5, we look at the group-theoretic context proposed by Cohn and Umans, and see how we can derive some of Coppersmith and Winograd's values using this method, as well as showing how working in this context can perhaps be more conducive to showing ω = 2.
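Strassen's 2 × 2 base case described in the abstract can be written out directly. A minimal sketch (function names are illustrative, and the recursion over submatrix blocks that yields ω ≤ log₂ 7 is omitted):

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices with 7 scalar multiplications
    (Strassen's base case) instead of the 8 a direct product uses."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

def direct_2x2(A, B):
    """Reference product with 8 multiplications."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```

Applied recursively, with each entry replaced by an (n/2) × (n/2) block, the 7-multiplication scheme gives the O(n^log₂ 7) running time mentioned above.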
5

Suter, Frédéric. "Parallélisme mixte et prédiction de performances sur réseaux hétérogènes de machines parallèles." Lyon, École normale supérieure (sciences), 2002. http://www.theses.fr/2002ENSL0233.

Abstract:
With the spread of the Internet, users of numerical computing can now access the most powerful machines available around the world from their own workstation. At large scale, this kind of remote access is called "metacomputing". The work carried out in this thesis first concerned the parallelization of the SCILAB software, following, among other approaches, one based on computational servers. During these developments, the shortcomings of such environments were exposed, notably the bottleneck created by a centralized agent. To address this problem, and thus to provide a scalable environment, we followed a hierarchical approach in developing the DIET (Distributed Interactive Engineering Toolbox) software. One of the crucial points in environments of this type is the ability to estimate the execution time of a routine on a given machine and the cost of transferring data from a client or a server to the server chosen for the computation. The FAST (Fast Agent's System Timer) library, which we extended to handle parallel routines, provides this kind of information. From an algorithmic point of view, we conducted both a theoretical and an experimental study of mixed parallelism, i.e., the simultaneous exploitation of task and data parallelism. After applying this paradigm to the fast Strassen and Winograd matrix multiplication algorithms, we proposed a mixed-parallelism scheduling algorithm for the case where data cannot be duplicated. This algorithm simultaneously places and schedules the tasks of a graph, based on the cost models provided by our extension of FAST and on a set of possible distributions.
6

顏景池. "A multiple-functional-unit data-path chip for a winograd FFT processor." Thesis, 1992. http://ndltd.ncl.edu.tw/handle/49088508935897731344.

7

"An intelligent co-reference resolver for Winograd schema sentences containing resolved semantic entities." Master's thesis, 2013. http://hdl.handle.net/2286/R.I.20798.

Abstract:
There has been a lot of research in the field of artificial intelligence about thinking machines. Alan Turing proposed a test to observe a machine's intelligent behaviour with respect to natural language conversation. The Winograd Schema Challenge has been suggested as an alternative to the Turing test. It requires inferencing capabilities, reasoning abilities, and background knowledge to get the answer right. It involves a coreference resolution task in which a machine is given a sentence describing a situation that involves two entities, one pronoun, and some additional information, and the machine has to come up with the right resolution of the pronoun to one of the entities. The complexity of the task is increased by the fact that Winograd sentences are not constrained to one domain or a specific sentence structure, and they contain many human proper names. This makes it difficult to associate entities with a particular word in the sentence in order to derive the answer. I have developed a pronoun resolver system for confined-domain Winograd sentences, together with a classifier that accepts or rejects input sentences based on particular criteria. Once a sentence is accepted, parsers are run on it to obtain a detailed analysis. I have also developed four answering modules which use world knowledge and inferencing mechanisms to try to resolve the pronoun: the ConceptNet knowledge base, search-engine pattern counts, narrative event chains, and sentiment analysis. A dedicated aggregation mechanism combines the answers from these modules into a final answer, and caching of the association relations obtained by the different modules boosts performance. The system is run on the standard 'nyu dataset' of Winograd sentences and questions, which the classifier restricts to 90 sentences. Evaluated on this 90-sentence dataset, the system achieves a nearly 4.5% improvement over the state-of-the-art system in the restricted domain.
8

"Solving Winograd Schema Challenge : Using Semantic Parsing, Automatic Knowledge Acquisition and Logical Reasoning." Master's thesis, 2014. http://hdl.handle.net/2286/R.I.27387.

Abstract:
The Turing test has been a benchmark for measuring human-level intelligence in computers since it was proposed by Alan Turing in 1950. However, over the last 60 years, applications such as ELIZA, PARRY, Cleverbot, and Eugene Goostman have claimed to pass the test. These applications either rely on tricks to fool humans in a textual chat-based test, or there has been disagreement within the AI community about whether they passed it at all. This has led to the school of thought that it might not be the ideal test for predicting human-level intelligence in machines. Consequently, the Winograd Schema Challenge has been suggested as an alternative to the Turing test. As opposed to judging intelligent behaviour through chat servers, as was done in the Turing test, the Winograd Schema Challenge is a question-answering test. It consists of sentence and question pairs such that the answer to the question depends on the resolution of a definite pronoun or adjective in the sentence. The answers are fairly intuitive for humans, but they are difficult for machines because they require some sort of background or commonsense knowledge about the sentence. In this thesis, I propose a novel technique to solve the Winograd Schema Challenge. The technique has three basic modules at its disposal: a semantic parser that parses the English text (both sentences and questions) into a formal representation, an automatic background knowledge extractor that extracts the background knowledge pertaining to a given Winograd sentence, and an Answer Set Programming reasoning engine that reasons over the given Winograd sentence and the corresponding background knowledge. The applicability of the technique is illustrated by solving a subset of the Winograd Schema Challenge pertaining to a certain type of background knowledge, on which a notable accuracy is achieved.
9

"Towards Understanding Natural Language: Semantic Parsing, Commonsense Knowledge Acquisition, Reasoning Framework and Applications." Doctoral diss., 2019. http://hdl.handle.net/2286/R.I.54850.

Abstract:
Reasoning with commonsense knowledge is an integral component of human behavior. It is due to this capability that people know that a weak person may not be able to lift someone. It has been a long-standing goal of the artificial intelligence community to simulate such commonsense reasoning abilities in machines. Over the years, many advances have been made and various challenges have been proposed to test these abilities. The Winograd Schema Challenge (WSC) is one such natural language understanding (NLU) task, which was also proposed as an alternative to the Turing test. It is made up of textual question-answering problems which require the resolution of a pronoun to its correct antecedent. In this thesis, two approaches to developing NLU systems that solve the Winograd Schema Challenge are demonstrated. To this end, a semantic parser is presented, various kinds of commonsense knowledge are identified, techniques to extract commonsense knowledge are developed, and two commonsense reasoning algorithms are presented. The usefulness of the developed tools and techniques is shown by applying them to solve the challenge.

Books on the topic "Winogrand"

1

Stack, Trudy Wilner, and University of Arizona Center for Creative Photography, Garry Winogrand Archive, eds. Winogrand: 1964. Arena Editions, 2002.

2

Winogrand, Garry (1928–1984), San Francisco Museum of Modern Art, and National Gallery of Art (U.S.), eds. Garry Winogrand. San Francisco Museum of Modern Art, 2013.

3

Bone, Sebastian. Garry Winogrand - the social and formal. Derbyshire College of Higher Education, 1990.

4

Szarkowski, John. Winogrand: Figments from the real world. Museum of Modern Art, 1996.

5

Szarkowski, John, and Museum of Modern Art (New York, N.Y.), eds. Winogrand: Figments from the real world. Museum of Modern Art, 1988.

6

Software design and usability: Talks with Bonnie Nardi, Jakob Nielsen, David Smith, Austin Henderson & Jed Harris, Terry Winograd, Stephanie Rosenbaum. Copenhagen Business School, 2000.

7

Szarkowski, John. Winogrand. Museum of Modern Art, 1991.

8

Stack, Trudy Wilner. Winogrand 1964. Arena Editions, 2002.

9

Whiteread, Rachel. Garry Winogrand: Public Relations. The Museum of Modern Art, New York, 2004.

10

Szarkowski, John. Garry Winogrand: The Animals. The Museum of Modern Art, New York, 2004.


Book chapters on the topic "Winogrand"

1

Wei, Hui, Enjie Liu, Youbing Zhao, and Hongqing Yu. "Efficient Non-fused Winograd on GPUs." In Advances in Computer Graphics. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-61864-3_35.

2

Wang, Zelong, Qiang Lan, Hongjun He, and Chunyuan Zhang. "Winograd Algorithm for 3D Convolution Neural Networks." In Artificial Neural Networks and Machine Learning – ICANN 2017. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68612-7_69.

3

Fähndrich, Johannes, Sabine Weber, and Hannes Kanthak. "A Marker Passing Approach to Winograd Schemas." In Semantic Technology. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-04284-4_12.

4

Liu, Zhi-Gang, and Matthew Mattina. "Efficient Residue Number System Based Winograd Convolution." In Computer Vision – ECCV 2020. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58529-7_4.

5

Barabasz, Barbara, and David Gregg. "Winograd Convolution for DNNs: Beyond Linear Polynomials." In Lecture Notes in Computer Science. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-35166-3_22.

6

Dumas, Jean-Guillaume, Pascal Lafourcade, Julio Lopez Fenner, et al. "Secure Multiparty Matrix Multiplication Based on Strassen-Winograd Algorithm." In Advances in Information and Computer Security. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-26834-3_5.

7

Rathkanthiwar, Shubhangi, Sandeep Kakde, Rajesh Thakare, Rahul Kamdi, and Shailesh Kamble. "High Performance DFT Architectures Using Winograd Fast Fourier Transform Algorithm." In Advances in Intelligent Systems and Computing. Springer India, 2016. http://dx.doi.org/10.1007/978-81-322-2755-7_58.

8

Isaak, Nicos, and Loizos Michael. "WinoFlexi: A Crowdsourcing Platform for the Development of Winograd Schemas." In AI 2019: Advances in Artificial Intelligence. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-35288-2_24.

9

Le Gall, François, and Florent Urrutia. "Improved Rectangular Matrix Multiplication using Powers of the Coppersmith-Winograd Tensor." In Proceedings of the Twenty-Ninth Annual ACM-SIAM Symposium on Discrete Algorithms. Society for Industrial and Applied Mathematics, 2018. http://dx.doi.org/10.1137/1.9781611975031.67.

10

Isaak, Nicos, and Loizos Michael. "Blending NLP and Machine Learning for the Development of Winograd Schemas." In Lecture Notes in Computer Science. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-71158-0_9.


Conference papers on the topic "Winogrand"

1

Cozman, Fábio, and Hugo Munhoz. "The Winograd Schemas from Hell." In Encontro Nacional de Inteligência Artificial e Computacional. Sociedade Brasileira de Computação - SBC, 2020. http://dx.doi.org/10.5753/eniac.2020.12157.

Abstract:
The Winograd Challenge has been advocated as a test of computer understanding with respect to commonsense reasoning. The challenge is based on Winograd Schemas: sentences that contain coreferential ambiguities. Most Winograd Schemas are relatively easy for human subjects, and today the best computer systems for the Winograd Challenge can perform close to human level. In this paper, we examine the assumptions behind the Winograd Challenge and investigate how far we can push the difficulty level of Winograd Schemas, proposing various strategies to build truly challenging schemas.
2

Melo, Gabriela, Vinicius Imaizumi, and Fábio Cozman. "Winograd Schemas in Portuguese." In Encontro Nacional de Inteligência Artificial e Computacional. Sociedade Brasileira de Computação - SBC, 2019. http://dx.doi.org/10.5753/eniac.2019.9334.

Abstract:
The Winograd Schema Challenge has become a common benchmark for question answering and natural language processing. The original set of Winograd Schemas was created in English; in order to stimulate the development of Natural Language Processing in Portuguese, we have developed a set of Winograd Schemas in Portuguese. We have also adapted solutions proposed for the English-based version of the challenge so as to have an initial baseline for its Portuguese-based version; to do so, we created a language model for Portuguese based on a set of Wikipedia documents.
3

Williams, Virginia Vassilevska. "Multiplying Matrices Faster than Coppersmith-Winograd." In Proceedings of the 44th Symposium on Theory of Computing (STOC). ACM Press, 2012. http://dx.doi.org/10.1145/2213977.2214056.

4

Kim, Minsik, Cheonjun Park, Sungjun Kim, Taeyoung Hong, and Won Woo Ro. "Efficient Dilated-Winograd Convolutional Neural Networks." In 2019 IEEE International Conference on Image Processing (ICIP). IEEE, 2019. http://dx.doi.org/10.1109/icip.2019.8803277.

5

Yan, Da, Wei Wang, and Xiaowen Chu. "Optimizing batched winograd convolution on GPUs." In PPoPP '20: 25th ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming. ACM, 2020. http://dx.doi.org/10.1145/3332466.3374520.

6

Zhang, Hongming, and Yangqiu Song. "A Distributed Solution for Winograd Schema Challenge." In ICMLC 2018: 2018 10th International Conference on Machine Learning and Computing. ACM, 2018. http://dx.doi.org/10.1145/3195106.3195127.

7

Ciucanu, Radu, Matthieu Giraud, Pascal Lafourcade, and Lihua Ye. "Secure Strassen-Winograd Matrix Multiplication with MapReduce." In 16th International Conference on Security and Cryptography. SCITEPRESS - Science and Technology Publications, 2019. http://dx.doi.org/10.5220/0007916302200227.

8

Wu, Ruofan, Feng Zhang, Zhen Zheng, Xiaoyong Du, and Xipeng Shen. "Exploring deep reuse in winograd CNN inference." In PPoPP '21: 26th ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming. ACM, 2021. http://dx.doi.org/10.1145/3437801.3441588.

9

Liu, Haokun, William Huang, Dhara Mungra, and Samuel R. Bowman. "Precise Task Formalization Matters in Winograd Schema Evaluations." In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Association for Computational Linguistics, 2020. http://dx.doi.org/10.18653/v1/2020.emnlp-main.664.

10

Amsili, Pascal, and Olga Seminck. "A Google-Proof Collection of French Winograd Schemas." In Proceedings of the 2nd Workshop on Coreference Resolution Beyond OntoNotes (CORBON 2017). Association for Computational Linguistics, 2017. http://dx.doi.org/10.18653/v1/w17-1504.

