From 88f02990f209c53a1b0151c6d9062aabdef1a8e3 Mon Sep 17 00:00:00 2001
From: Mount-Blanc <102170589+Mount-Blanc@users.noreply.github.com>
Date: Mon, 23 May 2022 17:26:35 -0700
Subject: [PATCH] Remove what is Aesara section

---
 README.rst | 35 -----------------------------------
 1 file changed, 35 deletions(-)

diff --git a/README.rst b/README.rst
index a06e0379d0..c47a1af859 100644
--- a/README.rst
+++ b/README.rst
@@ -11,41 +11,6 @@
 efficiently evaluate mathematical expressions involving
 multi-dimensional arrays.
 
-What is |Project Name|
-======================
-
-Aesara is a fork of Theano. Theano was commonly referred to as a "deep learning" (DL) library, but Aesara is not a DL library.
-
-Designations like "deep learning library" reflect the priorities and goals of a library; specifically, that the library serves the purposes of DL and its computational needs. Aesara is not explicitly intended for constructing and evaluating DL models, but that doesn't mean it can't serve that purpose well.
-
-As far as designations are concerned, instead of describing the project's priorities and goals in terms of an area of study or application (e.g. DL, machine learning, statistical modeling), we prefer to focus on the functionality Aesara is expected to provide, and that is primarily symbolic tensor computation.
-
-The designation "tensor library" is more apt, but, unlike most other tensor libraries (e.g. TensorFlow, PyTorch), Aesara focuses on what one might call the symbolic functionality.
-
-As a library, Aesara focuses on, and advocates the extension of, its core offerings, which are as follows:
-
-* a framework for flexible graph-based representations of computations,
-  * E.g. the construction of custom ``Type``, ``Variable``, and ``Op`` classes, and lower-level graph elements
-
-* implementations of basic tensor objects and operations,
-  * E.g. ``Type``, ``Variable``, and ``Op`` implementations that mirror "tensor"-based NumPy and SciPy offerings, and their gradients
-
-* graph analysis and rewriting,
-  * E.g. the general manipulation of graphs for the purposes of "optimization", automation, etc.
-
-* and code transpilation.
-  * E.g. the conversion of graphs into performant code via other target languages/representations
-
-Most tensor libraries perform these operations to some extent, but many do not expose them for use at any level other than internal library development. Furthermore, when they do, many cross a large language barrier that unnecessarily hampers rapid development (e.g. moving from Python to C++ and back).
-
-For most tensor libraries, a NumPy-like interface to compiled tensor computations is the primary, or only, offering. Aesara takes the opposite approach: it treats all of the aforementioned operations as part of its core offerings, but it also stitches them together so that the library can be used like other tensor libraries.
-
-There are concrete reasons for taking this approach, and one is the representation and construction of efficient domain-specific symbolic computations. This project grew out of work on PyMC, and PyMC is a library for domain-specific (i.e. probabilistic modeling) computations. Likewise, the other ``aesara-devs`` projects demonstrate the use of Aesara graphs as an intermediate representation (IR) for a domain-specific language/interface (e.g. `aeppl <https://github.com/aesara-devs/aeppl>`_ provides a graph representation for a PPL) and advanced automations based on that IR (e.g. `aemcmc <https://github.com/aesara-devs/aemcmc>`_ constructs custom samplers from the IR, and ``aeppl`` automatically derives log-probabilities for basic tensor operations represented in the IR).
-
-This topic is a little more advanced and doesn't really have parallels in other tensor libraries, but it's one of the things that Aesara uniquely facilitates.
-
-The PyMC/probabilistic-programming connection is similar to the DL connection that Theano had, but, unlike Theano, we don't want Aesara to be conflated with one of its domains of application, such as probabilistic modeling. Those primary domains of application will always have some influence on Aesara's development, but that is also why we avoid labels like "deep learning library" and focus on functionality instead: so that we don't unnecessarily compromise Aesara's general applicability or relative simplicity, or discourage useful input and collaboration from other domains.
-
 Features
 ========
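The "core offerings" list that this patch removes — graph-based representation, graph rewriting, and code transpilation — can be sketched in miniature. The following is a toy illustration only, not the Aesara API: ``Var``, ``Const``, ``Apply``, ``rewrite``, ``emit``, and ``compile_fn`` are all names invented for this example, loosely echoing Aesara's ``Variable``/``Op``/``Apply`` concepts.

```python
from dataclasses import dataclass

# Toy sketch (NOT the Aesara API) of the three ideas in the removed
# section: a graph representation, a rewrite pass, and transpilation.

@dataclass(frozen=True)
class Var:
    """A named leaf variable in the graph (cf. Aesara's Variable)."""
    name: str

@dataclass(frozen=True)
class Const:
    """A literal constant appearing in the graph."""
    value: float

@dataclass(frozen=True)
class Apply:
    """An operation applied to two input nodes (cf. Aesara's Apply/Op)."""
    op: str        # "add" or "mul"
    inputs: tuple

def rewrite(node):
    """Graph rewriting: simplify mul(x, 1.0) -> x, bottom-up."""
    if not isinstance(node, Apply):
        return node
    a, b = (rewrite(i) for i in node.inputs)
    if node.op == "mul" and b == Const(1.0):
        return a
    if node.op == "mul" and a == Const(1.0):
        return b
    return Apply(node.op, (a, b))

SYMBOL = {"add": "+", "mul": "*"}

def emit(node):
    """Transpilation: turn a graph into a Python expression string."""
    if isinstance(node, Var):
        return node.name
    if isinstance(node, Const):
        return repr(node.value)
    a, b = node.inputs
    return f"({emit(a)} {SYMBOL[node.op]} {emit(b)})"

def compile_fn(node, args):
    """Compile a graph into a callable (a stand-in for aesara.function)."""
    return eval(f"lambda {', '.join(v.name for v in args)}: {emit(node)}")

x, y = Var("x"), Var("y")
graph = Apply("add", (Apply("mul", (x, Const(1.0))), y))  # (x * 1.0) + y
simplified = rewrite(graph)                               # rewritten to x + y
fn = compile_fn(simplified, [x, y])                       # fn(2.0, 3.0) -> 5.0
```

In Aesara itself the analogous pieces are symbolic graphs built from ``aesara.tensor`` variables, the library's rewrite passes, and ``aesara.function``, which compiles a graph into an efficiently evaluable callable.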