
textgrad


Automatic "Differentiation" via Text -- using large language models to backpropagate textual gradients.

Installation

To install this package, run:

Conda
conda install conda-forge::textgrad


Description


An autograd engine -- for textual gradients!

TextGrad is a powerful framework for building automatic "differentiation" via text. TextGrad implements backpropagation through text feedback provided by LLMs, strongly building on the gradient metaphor.

We provide a simple and intuitive API that allows you to define your own loss functions and optimize them using text feedback. This API is similar to the PyTorch API, making it simple to adapt to your use cases.
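A minimal sketch of that PyTorch-style workflow, based on the examples in the TextGrad README. The model name `gpt-4o`, the sample answer, and both prompt strings are illustrative, and running this requires an LLM API key:

```python
import textgrad as tg

# Choose the LLM that will produce textual "gradients" during backprop.
tg.set_backward_engine("gpt-4o", override=True)

# A Variable wraps a piece of text; requires_grad=True marks it as optimizable.
answer = tg.Variable(
    "To stay healthy, sleep 8 hours and drink 10 cups of coffee daily.",
    role_description="an answer to refine",
    requires_grad=True,
)

# A TextLoss is a natural-language objective, analogous to a torch loss function.
loss_fn = tg.TextLoss("Critique this answer for factual accuracy and safety.")

# Textual Gradient Descent (TGD) updates the variable using the LLM's feedback,
# mirroring torch.optim.SGD(parameters=...).
optimizer = tg.TGD(parameters=[answer])

loss = loss_fn(answer)   # forward pass: the LLM critiques the answer
loss.backward()          # backward pass: the critique becomes a textual gradient
optimizer.step()         # update: the answer text is rewritten using the gradient

print(answer.value)      # the revised answer
```

The analogy to `loss.backward(); optimizer.step()` in PyTorch is deliberate: the same three-line training-step idiom drives the text optimization loop.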

Analogy with Torch

PyPI: https://pypi.org/project/textgrad/

🔥 The conda-forge recipe was generated with Conda-Forger App.

About

Summary

Automatic "Differentiation" via Text -- using large language models to backpropagate textual gradients.

Last Updated

Jun 18, 2024 at 07:26

License

MIT

Total Downloads

1.5K

Supported Platforms

noarch