Enhance transform and plot docs (#88)
* Add bibtex citation of google quantum advantage paper

* Enhance plot docstring

* Change citation

* Enhance plot docstring

* add another citation

* Resolve conflicts

* Resolve conflicts

* Resolve conflicts

* Apply @mofeing suggestions from code review

Co-authored-by: Sergio Sánchez Ramírez <[email protected]>

* Update TenetMakieExt.jl

* Update TenetMakieExt.jl

* Update transformations.md

---------

Co-authored-by: Sergio Sánchez Ramírez <[email protected]>
jofrevalles and mofeing authored Sep 15, 2023
1 parent ef96be2 commit c0c3c5c
Showing 3 changed files with 14 additions and 5 deletions.
10 changes: 10 additions & 0 deletions docs/refs.bib
@@ -95,4 +95,14 @@ @misc{cotengra
howpublished={https://github.com/jcmgray/cotengra},
url={https://github.com/jcmgray/cotengra},
}
@article{arute2019quantum,
title={Quantum supremacy using a programmable superconducting processor},
author={Arute, Frank and Arya, Kunal and Babbush, Ryan and Bacon, Dave and Bardin, Joseph C and Barends, Rami and Biswas, Rupak and Boixo, Sergio and Brandao, Fernando GSL and Buell, David A and others},
journal={Nature},
volume={574},
number={7779},
pages={505--510},
year={2019},
publisher={Nature Publishing Group}
}
4 changes: 2 additions & 2 deletions docs/src/transformations.md
@@ -4,7 +4,7 @@ In tensor network computations, it is good practice to apply various transformat

A crucial reason why these methods are indispensable lies in their ability to drastically reduce the problem size of both the contraction-path search and the contraction itself. This doesn't necessarily involve reducing the maximum rank of the Tensor Network, but more importantly, it reduces the size (or rank) of the tensors involved.

Our approach has been significantly inspired by the ideas presented in the [Quimb](https://quimb.readthedocs.io/) library, explained in [this paper](https://arxiv.org/pdf/2002.01935.pdf).
Our approach is based on [gray2021hyper](@cite), whose methods are also implemented in [quimb](https://quimb.readthedocs.io/).

In Tenet, we provide a set of predefined transformations which you can apply to your `TensorNetwork` using the `transform`/`transform!` functions.
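
A minimal usage sketch (hedged: the `rand(TensorNetwork, ...)` generator and the `RankSimplification` transformation name are assumptions for illustration, not taken from this diff):

```julia
using Tenet

# Assumption: build a random TensorNetwork with 10 tensors of regularity 3.
tn = rand(TensorNetwork, 10, 3)

# `transform` returns a transformed copy of the network;
# `RankSimplification` is an assumed transformation name.
simplified = transform(tn, RankSimplification())

# ...while `transform!` applies the transformation in place.
transform!(tn, RankSimplification())
```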

@@ -249,7 +249,7 @@ fig #hide

## Example: RQC simplification

Here we show how we can reduce the complexity of a tensor network by applying transformations to it. We take as an example the Sycamore circuit from [Google's quantum supremacy paper](https://www.nature.com/articles/s41586-019-1666-5)
Local transformations can dramatically reduce the complexity of tensor networks. Take as an example a Random Quantum Circuit (RQC) on the Sycamore chip from Google's quantum advantage experiment [arute2019quantum](@cite).
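
As a hedged sketch of what such a simplification might look like (the transformation names and the random-network stand-in are assumptions for illustration, not the actual Sycamore workflow):

```julia
using Tenet

# Stand-in for the RQC network: a random TensorNetwork with 100 tensors.
tn = rand(TensorNetwork, 100, 3)

# Apply local transformations one after another; the concrete names
# (`DiagonalReduction`, `RankSimplification`) are assumptions.
simplified = transform(tn, DiagonalReduction())
simplified = transform(simplified, RankSimplification())

# Compare the number of tensors before and after simplification.
println(length(tensors(tn)), " tensors -> ", length(tensors(simplified)), " tensors")
```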

```@setup plot
using Makie
5 changes: 2 additions & 3 deletions ext/TenetMakieExt.jl
@@ -16,9 +16,8 @@ Plot a [`TensorNetwork`](@ref) as a graph.
# Keyword Arguments
- `inds` Whether to show the index labels. Defaults to `false`.
- `layout` Algorithm used to map graph vertices to a (2D or 3D) coordinate system.
The algorithms implemented in the `NetworkLayout` package are recommended.
- `labels` If `true`, show the labels of the tensor indices. Defaults to `false`.
- The rest of `kwargs` are passed to `GraphMakie.graphplot`.
"""
function Makie.plot(tn::TensorNetwork; kwargs...)
f = Figure()
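
A hedged usage sketch for this plot recipe (it assumes the extension activates when Makie and GraphMakie are loaded; the random-network generator and the `Stress` layout are illustrative choices, not prescribed by this diff):

```julia
using Tenet
using CairoMakie, GraphMakie, NetworkLayout  # assumption: triggers TenetMakieExt

# Assumption: build a random TensorNetwork to visualize.
tn = rand(TensorNetwork, 10, 3)

# Keyword arguments follow the docstring above: `labels` toggles index labels,
# `layout` picks a NetworkLayout algorithm (here, a stress-based layout).
fig = plot(tn; labels=true, layout=Stress())
```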
