Fine-Grained Tensor Network Methods.
We develop a strategy for tensor network algorithms that allows one to deal very efficiently with lattices of high connectivity. The basic idea is to fine-grain the physical degrees of freedom, i.e., to decompose them into more fundamental units which, after a suitable coarse-graining, recover the original ones. Thanks to this procedure, the original lattice with high connectivity is transformed by an isometry into a simpler structure, which is easier to simulate via usual tensor network methods. In particular, this enables the use of standard schemes to contract infinite 2d tensor networks, such as Corner Transfer Matrix Renormalization schemes, which are more involved on complex lattice structures.
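To make the fine-graining step concrete, the following is a minimal sketch, assuming a generic setting not tied to any specific lattice of the paper: a physical index of dimension d is split into two finer indices of dimensions d1 and d2 (with d1*d2 >= d) by an isometry W, and coarse-graining with the conjugate isometry recovers the original site tensor exactly. The names W, A and the chosen dimensions are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

# Illustrative dimensions (assumptions): physical dimension d is fine-grained
# into two smaller units of dimensions d1 and d2 with d1 * d2 >= d.
d, d1, d2 = 3, 2, 2

# Build an isometry W of shape (d1, d2, d) with W^dagger W = identity on the
# original physical space, here from the first d columns of a random unitary.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(d1 * d2, d1 * d2)))
W = Q[:, :d].reshape(d1, d2, d)   # fine-graining isometry

# Isometry condition: coarse-graining right after fine-graining is the identity.
assert np.allclose(np.einsum('abi,abj->ij', W, W), np.eye(d))

# Fine-grain the physical leg of a site tensor A with four virtual legs.
chi = 4
A = rng.normal(size=(d, chi, chi, chi, chi))            # original site tensor
A_fine = np.einsum('abi,ilrud->ablrud', W, A)           # two finer physical legs

# Coarse-graining with the conjugate isometry recovers the original tensor.
A_back = np.einsum('abi,ablrud->ilrud', W, A_fine)
assert np.allclose(A_back, A)
```

In this sketch the exactness of the round trip is just the isometry condition; in an actual simulation the two finer units would be distributed over a simpler lattice geometry before standard contraction schemes are applied.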