Automatic "Differentiation" via Text -- using large language models to backpropagate textual gradients.
An autograd engine -- for textual gradients!
TextGrad is a framework for automatic "differentiation" via text: it implements backpropagation through textual feedback provided by LLMs, building directly on the gradient metaphor.
We provide a simple and intuitive API that lets you define your own loss functions and optimize them using text feedback. The API mirrors PyTorch, making it easy to adapt to your use cases.
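To illustrate the PyTorch-style flow described above, here is a minimal, self-contained sketch of the textual-gradient idea. It does not use the real `textgrad` package or call an LLM: the "engine" below is a stubbed function, and the class and method names (`Variable`, `TextLoss`, `TGD`) are hypothetical stand-ins chosen only to echo the loss/backward/step pattern.

```python
# Minimal sketch of textual "backpropagation" (NOT the textgrad API).
# The LLM is stubbed out; all names here are illustrative only.

def stub_llm_feedback(text: str) -> str:
    """Stand-in for an LLM critique; a real engine would return
    natural-language feedback about `text`."""
    return "Be more concise." if len(text.split()) > 5 else "Looks good."

class Variable:
    """A piece of text that can receive a textual 'gradient'."""
    def __init__(self, value: str, requires_grad: bool = True):
        self.value = value
        self.requires_grad = requires_grad
        self.grad = None  # textual feedback, filled in by backward()

class TextLoss:
    """Evaluates a Variable and records feedback as its 'gradient'."""
    def __call__(self, var: Variable) -> "TextLoss":
        self._var = var
        return self

    def backward(self) -> None:
        # "Backpropagate": attach the critique to the variable.
        self._var.grad = stub_llm_feedback(self._var.value)

class TGD:
    """Textual 'gradient descent': apply the feedback to each parameter.
    A real optimizer would ask an LLM to rewrite the text instead."""
    def __init__(self, parameters):
        self.parameters = parameters

    def step(self) -> None:
        for p in self.parameters:
            if p.requires_grad and p.grad == "Be more concise.":
                p.value = " ".join(p.value.split()[:5])  # crude "update"

x = Variable("a very long and quite rambling draft answer")
loss = TextLoss()(x)
loss.backward()   # x.grad is now "Be more concise."
TGD([x]).step()   # x.value is trimmed to its first five words
print(x.grad)
print(x.value)
```

The point of the sketch is the shape of the loop, not the toy update rule: feedback plays the role of a gradient, and the optimizer's step consumes that feedback to revise the text.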