Ajay Patel

Posts with the tag "nlp"

Learning Interpretable Embeddings via LLMs
Machine Learning · November 11, 2023
Our paper at EMNLP 2023, “Learning Interpretable Style Embeddings via Prompting LLMs,” used GPT-3 to generate a synthetic dataset for training interpretable text style embeddings, where each dimension of the embedding vector has a meaningful interpretation to a human.
ICLR 2023: Is GPT the Wrong Architecture?
Machine Learning · December 21, 2022
Our paper at ICLR 2023, “Bidirectional Language Models Are Also Few-shot Learners,” surprisingly discovers that older models like T5, which predate GPT-3, are promptable and can perform in-context learning.
Language Comprehension is NP‑Complete
Machine Learning · October 10, 2020

Making sense of language is an NP-complete problem. This may match your intuition, and you can prove it by reducing the well-known NP-complete problem of graph $k$-coloring to the anaphora agreement problem in linguistics.

Relationship between Cosine Similarity and Euclidean Distance
Machine Learning · May 27, 2020

For unit-length vectors, cosine similarity and Euclidean distance produce rankings in the same order. In fact, you can directly convert between the two measures.
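A minimal sketch of the conversion the post describes, assuming vectors normalized to unit length: since $\|a - b\|^2 = \|a\|^2 + \|b\|^2 - 2\,a \cdot b = 2 - 2\cos(a, b)$ for unit vectors, the Euclidean distance is $\sqrt{2 - 2s}$ where $s$ is the cosine similarity (the vector names here are illustrative, not from the post).

```python
import numpy as np

# For unit-length vectors a and b:
#   ||a - b||^2 = 2 - 2 * cos_sim(a, b)
# so Euclidean distance is a monotone function of cosine similarity,
# and the two measures rank neighbors identically.

rng = np.random.default_rng(0)
a = rng.normal(size=5)
b = rng.normal(size=5)
a /= np.linalg.norm(a)  # normalize to unit length
b /= np.linalg.norm(b)

cos_sim = float(a @ b)
euclid = float(np.linalg.norm(a - b))

# Direct conversion between the two measures:
assert np.isclose(euclid, np.sqrt(2 - 2 * cos_sim))
```

Because the mapping $d = \sqrt{2 - 2s}$ is strictly decreasing in $s$, sorting by distance ascending is the same as sorting by cosine similarity descending.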

Currently in Philadelphia, PA, USA 🇺🇸
Copyright © 2024 Ajay Patel. All rights reserved.