Name: HINRICH SCHÜTZE
Papers: 253
Collaborators: 286
Citations: 2113
PageRank: 362.21
Referers: 4289
Referees: 2863
References: 2227
Title | Citations | PageRank | Year
Towards a Broad Coverage Named Entity Resource: A Data-Efficient Approach for Many Diverse Languages. | 0 | 0.34 | 2022
Position Information in Transformers: An Overview. | 0 | 0.34 | 2022
Graph Neural Networks for Multiparallel Word Alignment | 0 | 0.34 | 2022
Modeling Ideological Salience and Framing in Polarized Online Groups with Graph Neural Networks and Structured Sparsity. | 0 | 0.34 | 2022
CaMEL: Case Marker Extraction without Labels. | 0 | 0.34 | 2022
Generating Datasets with Pretrained Language Models. | 0 | 0.34 | 2021
Semantic Text Segment Classification of Structured Technical Content. | 0 | 0.34 | 2021
Semi-Automated Labeling of Requirement Datasets for Relation Extraction. | 0 | 0.34 | 2021
Static Embeddings as Efficient Knowledge Bases? | 0 | 0.34 | 2021
Continuous Entailment Patterns for Lexical Inference in Context. | 0 | 0.34 | 2021
Self-Diagnosis and Self-Debiasing: A Proposal for Reducing Corpus-Based Bias in NLP | 0 | 0.34 | 2021
It’s Not Just Size That Matters: Small Language Models Are Also Few-Shot Learners | 0 | 0.34 | 2021
Explainable and Discourse Topic-aware Neural Language Understanding | 0 | 0.34 | 2020
TopicBERT for Energy Efficient Document Classification | 0 | 0.34 | 2020
Monolingual and Multilingual Reduction of Gender Bias in Contextualized Representations. | 0 | 0.34 | 2020
Rare Words: A Major Problem For Contextualized Embeddings And How To Fix It By Attentive Mimicking | 0 | 0.34 | 2020
SimAlign - High Quality Word Alignments without Parallel Training Data using Static and Contextualized Embeddings. | 0 | 0.34 | 2020
Fine-Grained Argument Unit Recognition And Classification | 1 | 0.35 | 2020
Trendnert: A Benchmark For Trend And Downtrend Detection In A Scientific Domain | 0 | 0.34 | 2020
Are Pretrained Language Models Symbolic Reasoners over Knowledge? | 0 | 0.34 | 2020
Towards Summarization for Social Media - Results of the TL;DR Challenge. | 0 | 0.34 | 2019
Unsupervised Text Generation from Structured Data. | 0 | 0.34 | 2019
Texttovec: Deep Contextualized Neural Autoregressive Topic Models of Language with Distributed Compositional Prior. | 0 | 0.34 | 2019
Robust Argument Unit Recognition and Classification. | 0 | 0.34 | 2019
Multi-View Domain Adapted Sentence Embeddings for Low-Resource Unsupervised Duplicate Question Detection | 0 | 0.34 | 2019
SMAPH: A Piggyback Approach for Entity-Linking in Web Queries. | 0 | 0.34 | 2019
SherLIiC: A Typed Event-Focused Lexical Inference Benchmark for Evaluating Natural Language Inference. | 0 | 0.34 | 2019
Rare Words: A Major Problem for Contextualized Embeddings And How to Fix it by Attentive Mimicking. | 0 | 0.34 | 2019
Evaluating neural network explanation methods using hybrid documents and morphological prediction. | 0 | 0.34 | 2018
Neural Transductive Learning and Beyond: Morphological Generation in the Minimal-Resource Setting. | 0 | 0.34 | 2018
textTOvec: Deep Contextualized Neural Autoregressive Models of Language with Distributed Compositional Prior. | 1 | 0.35 | 2018
News Article Teaser Tweets and How to Generate Them. | 0 | 0.34 | 2018
A Stronger Baseline for Multilingual Word Embeddings. | 1 | 0.35 | 2018
Document Informed Neural Autoregressive Topic Models. | 0 | 0.34 | 2018
Attentive Convolution: Equipping CNNs with RNN-style Attention Mechanisms. | 2 | 0.37 | 2018
Interpretable Textual Neuron Representations for NLP. | 0 | 0.34 | 2018
Neural Semi-Markov Conditional Random Fields for Robust Character-Based Part-of-Speech Tagging. | 0 | 0.34 | 2018
End-Task Oriented Textual Entailment via Deep Explorations of Inter-Sentence Interactions. | 4 | 0.38 | 2018
Joint Bootstrapping Machines for High Confidence Relation Extraction. | 0 | 0.34 | 2018
Embedding Learning Through Multilingual Concept Induction | 0 | 0.34 | 2018
CIS at TAC Cold Start 2015: Neural Networks and Coreference Resolution for Slot Filling. | 0 | 0.34 | 2018
Joint Semantic Synthesis and Morphological Analysis of the Derived Word. | 3 | 0.42 | 2018
Two Methods for Domain Adaptation of Bilingual Tasks: Delightfully Simple and Broadly Applicable. | 0 | 0.34 | 2018
Deep Temporal-Recurrent-Replicated-Softmax for Topical Trends over Time. | 1 | 0.35 | 2018
Task-Specific Attentive Pooling of Phrase Alignments Contributes to Sentence Matching. | 0 | 0.34 | 2017
Noise Mitigation for Neural Entity Typing and Relation Extraction. | 0 | 0.34 | 2017
Comparative Study of CNN and RNN for Natural Language Processing. | 36 | 1.04 | 2017
From Characters to Understanding Natural Language (C2NLU): Robust End-to-End Deep Learning for NLP (Dagstuhl Seminar 17042). | 0 | 0.34 | 2017
AutoExtend: Combining Word Embeddings with Semantic Resources. | 2 | 0.47 | 2017