Name: VIVEK SRIKUMAR
Affiliation: Department of Computer Science, University of Illinois at Urbana-Champaign
Papers: 70
Collaborators: 157
Citations: 511
PageRank: 38.58
Referrers: 1738
Referees: 1116
References: 584
Title | Citations | PageRank | Year
Is My Model Using The Right Evidence? Systematic Probes for Examining Evidence-Based Tabular Reasoning. | 0 | 0.34 | 2022
PYLON: A PyTorch Framework for Learning with Constraints. | 0 | 0.34 | 2022
Supertagging the Long Tail with Tree-Structured Decoding of Complex Categories | 0 | 0.34 | 2021
OSCaR - Orthogonal Subspace Correction and Rectification of Biases in Word Embeddings. | 0 | 0.34 | 2021
Database Workload Characterization with Query Plan Encoders | 0 | 0.34 | 2021
A Visual Tour of Bias Mitigation Techniques for Word Representations | 0 | 0.34 | 2021
Putting Words in BERT's Mouth - Navigating Contextualized Vector Spaces with Pseudowords. | 0 | 0.34 | 2021
An Interactive Visual Demo of Bias Mitigation Techniques for Word Representations From a Geometric Perspective. | 0 | 0.34 | 2021
Pylon: A PyTorch Framework for Learning with Constraints. | 0 | 0.34 | 2021
Evaluating Relaxations of Logic for Neural Networks - A Comprehensive Study. | 0 | 0.34 | 2021
Incorporating External Knowledge to Enhance Tabular Reasoning | 0 | 0.34 | 2021
Bert & Family Eat Word Salad: Experiments With Text Understanding | 0 | 0.34 | 2021
DirectProbe: Studying Representations without Classifiers | 0 | 0.34 | 2021
Learning Constraints for Structured Prediction Using Rectifier Networks | 0 | 0.34 | 2020
UNQOVERing Stereotyping Biases via Underspecified Questions | 0 | 0.34 | 2020
On Measuring And Mitigating Biased Inferences Of Word Embeddings | 0 | 0.34 | 2020
Secrets in Source Code: Reducing False Positives using Machine Learning | 0 | 0.34 | 2020
Structured Tuning for Semantic Role Labeling | 0 | 0.34 | 2020
Learning In Practice: Reasoning About Quantization. | 0 | 0.34 | 2019
A Logic-Driven Framework for Consistency of Neural Models | 0 | 0.34 | 2019
Beyond Context: A New Perspective for Word Embeddings. | 0 | 0.34 | 2019
Amazon at MRP 2019 - Parsing Meaning Representations with Lexical and Phrasal Anchoring. | 0 | 0.34 | 2019
On the Limits of Learning to Actively Learn Semantic Representations | 0 | 0.34 | 2019
NLIZE: A Perturbation-Driven Visual Interrogation Tool for Analyzing and Interpreting Natural Language Inference Models. | 6 | 0.44 | 2019
Comprehensive Supersense Disambiguation Of English Prepositions And Possessives | 0 | 0.34 | 2018
Visual Interrogation of Attention-Based Models for Natural Language Inference and Machine Comprehension. | 3 | 0.40 | 2018
CogCompNLP: Your Swiss Army Knife for NLP. | 1 | 0.35 | 2018
Newton: Gravitating Towards the Physical Limits of Crossbar Acceleration. | 0 | 0.34 | 2018
Visual Exploration of Semantic Relationships in Neural Word Embeddings. | 11 | 0.52 | 2018
Comprehensive Supersense Disambiguation of English Prepositions and Possessives. | 0 | 0.34 | 2018
Learning to Speed Up Structured Output Prediction. | 0 | 0.34 | 2018
Adposition Supersenses v2. | 0 | 0.34 | 2017
An Algebra For Feature Extraction | 0 | 0.34 | 2017
DeepLog: Anomaly Detection and Diagnosis from System Logs through Deep Learning. | 82 | 2.40 | 2017
Coping with Construals in Broad-Coverage Semantic Annotation of Adpositions. | 0 | 0.34 | 2017
Double Trouble: The Problem of Construal in Semantic Annotation of Adpositions. | 0 | 0.34 | 2017
A corpus of preposition supersenses in English web reviews. | 0 | 0.34 | 2016
Is Sentiment in Movies the Same as Sentiment in Psychotherapy? Comparisons Using a New Psychotherapy Sentiment Database. | 0 | 0.34 | 2016
ISAAC: A Convolutional Neural Network Accelerator with In-Situ Analog Arithmetic in Crossbars. | 126 | 3.70 | 2016
Continuous Kernel Learning. | 1 | 0.35 | 2016
A Corpus of Preposition Supersenses. | 5 | 0.45 | 2016
Exploiting Sentence Similarities for Better Alignments. | 2 | 0.36 | 2016
A Corpus of Preposition Supersenses. | 0 | 0.34 | 2016
EDISON: Feature Extraction for NLP, Simplified. | 0 | 0.34 | 2016
RhymeDesign: A Tool for Analyzing Sonic Devices in Poetry | 4 | 0.48 | 2015
Recursive Neural Networks for Coding Therapist and Patient Behavior in Motivational Interviewing | 1 | 0.41 | 2015
A Hierarchy with, of, and for Preposition Supersenses. | 0 | 0.34 | 2015
Expressiveness of Rectifier Networks | 2 | 0.50 | 2015
RhymeDesign: A Tool for Analyzing Sonic Devices in Poetry. | 0 | 0.34 | 2015
IllinoisSL: A JAVA Library for Structured Prediction. | 2 | 0.40 | 2015