Tensor Contraction (0): Introduction to TN

# Home: Tensors.net

# Overview: arxiv.org/pdf/1708.09213v4.pdf, arxiv.org/pdf/1402.0939v3.pdf

* What are tensor networks:

Tensor networks are useful constructs for efficiently representing and manipulating correlated data.

They decompose high-dimensional data into a product of many smaller tensors, each carrying a few indices.

One then works directly with the decomposed tensors rather than the full high-dimensional object.

Each component tensor contains only a relatively small number of parameters.

They provide a theoretical understanding of quantum wavefunctions and of quantum entanglement.

They form the basis of many powerful numerical simulation approaches (taming data of exponential size).

Applications include quantum gravity and holography, error-correcting codes, classical data compression, big data analysis and machine learning.

* Why Tensor Networks:

Quantum many-body systems: encode the coefficients of a state wavefunction.

Classical many-body systems: encode statistical ensembles of microstates (partition functions).

Big data analytics: multi-dimensional data in diverse branches such as signal processing, neuroscience, biometrics, and pattern recognition.

They give a greatly compressed ('super'-compressed) representation of a large structured dataset.

They potentially allow for a better characterization of the structure within a data set, particularly in terms of its correlations.

They offer a distributed representation of a data set, such that many manipulations can be performed in parallel.

They allow a unified framework for manipulating large data sets.

They are often well suited for operating with noisy or missing data, as the decompositions on which they are based are typically robust.

* Basic knowledge of Tensor Contraction:

    * Tensor Order (number of legs): the order of a tensor is how many indices (legs) it carries; a vector is order-1, a matrix is order-2.

    * Dimension of an Index (so-called rank): the number of values an index ranges over; see the NumPy illustration below.
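As a quick NumPy illustration of both terms (my own example, not from the tutorial):

import numpy as np

A = np.random.rand(2, 3, 4)   # an order-3 tensor: it has three legs
print(A.ndim)    # 3          -> the tensor order (number of indices)
print(A.shape)   # (2, 3, 4)  -> the dimension of each index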

   * Three ways to evaluate a contraction:

Shown with np.einsum:

import numpy as np

d = 2

A = np.random.rand(d,d)
B = np.random.rand(d,d)
C = np.random.rand(d,d)

# contract the chain A-B-C: the repeated labels j and k are summed over
D0 = np.einsum('ij,jk,kl -> il', A, B, C)

Shown with plain NumPy:

# Evaluate network via summation over internal indices
F0 = np.zeros((d,d))
for di in range(d):
    for dj in range(d):
        for dk in range(d):
            for dl in range(d):
                F0[di,dj] = F0[di,dj] + A[di,dk]*B[dk,dl]*C[dl,dj]

# Evaluate network via a sequence of binary contractions
F1 = (A @ B) @ C
# F1 is equivalent to D0

Shown with ncon:

from ncon import ncon

D0 = np.einsum('ij,jk,kl->il', A, B, C)
F2 = ncon([A,B,C], [[-1,1],[1,2],[2,-2]], [1,2])

# ncon is called as ncon(tensors, connects, con_order):
# tensors   : the list of tensors in the network
# connects  : the index labels of each tensor (negative labels such as -1,-2 mark open indices)
# con_order : the order in which the positive (contracted) labels are summed

print(D0 - F2)
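To make the labeling convention concrete, here is a minimal extra example of my own (not from the tutorial): the matrix-vector product y_i = sum_j M[i,j] v[j] written with ncon.

M = np.random.rand(d,d)
v = np.random.rand(d)

# label 1 is shared by M and v, so it is contracted; -1 is the open index of the result
y = ncon([M, v], [[-1,1],[1]])
print(y - M @ v)   # should be ~0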

* How to calculate the contraction cost:

(The rule, as given in the tensors.net tutorial: the cost of contracting a pair of tensors is the product of the dimensions of all indices involved, i.e. cost(A×B) = dim(A)·dim(B)/dim(A∩B), where dim(A∩B) is the total dimension of the shared indices.)

  * Decompose the whole network into a sequence of pairwise contractions, then sum the cost of each step:

(0) Naive summation over all four indices of the three-matrix network at once: O(d^4).

(1) As binary contractions: A×B ~ O(d^3), then (A×B)×C ~ O(d^3).

For a network whose indices have dimensions d and D, the total cost depends on the contraction order:

(2) (A×B)×C : D·d^4 + D·d^4

(3) (A×C)×B : d^3·D^2 + d^3·D^2

(4) (B×C)×A : d^3·D^2 + d^3·D^2

Do not forget that the intermediate tensor inherits all uncontracted indices!!
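As a minimal sketch of this cost rule (the function name and dict-based interface are my own, not from the tutorial):

import numpy as np

def pair_cost(dims_A, dims_B, shared):
    # cost(A x B) = dim(A) * dim(B) / dim(shared indices)
    dim_A = np.prod(list(dims_A.values()))
    dim_B = np.prod(list(dims_B.values()))
    dim_shared = np.prod([dims_A[k] for k in shared])
    return dim_A * dim_B // dim_shared

# two d x d matrices contracted over one shared index: cost d^3
d = 10
print(pair_cost({'i': d, 'j': d}, {'j': d, 'k': d}, ['j']))   # 1000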

Now we come to pb.1(a) :)

Now we come to pb.1(b) :)

Shown with plain NumPy:

import numpy as np

d = 20
A = np.random.rand(d,d,d)
B = np.random.rand(d,d,d)
C = np.random.rand(d,d,d)

##### Evaluate network via index summation
def tempfunct(A,B,C,d):
    D0 = np.zeros((d,d,d))
    for b1 in range(d):
        for a2 in range(d):
            for c3 in range(d):
                for a1 in range(d):
                    for a3 in range(d):
                        for c1 in range(d):
                            D0[b1,a2,c3] = D0[b1,a2,c3]+A[a1,a2,a3]*B[b1,a1,c1]*C[c1,a3,c3]
    
    return D0

D0 = tempfunct(A,B,C,d)
print(D0)

Shown with np.reshape & np.transpose:

##### Evaluate network using reshape and transpose
def tempfunct2(A,B,C,d):
    # B:(b1,a1,c1) -> (b1,c1,a1) -> matrix [(b1,c1),a1]; A -> matrix [a1,(a2,a3)]
    Xmid = (B.transpose(0,2,1).reshape(d**2,d) @ A.reshape(d,d**2)).reshape(d,d,d,d)
    # Xmid:(b1,c1,a2,a3) -> (b1,a2,c1,a3) -> matrix [(b1,a2),(c1,a3)]; contract with C:[(c1,a3),c3]
    D1 = (Xmid.transpose(0,2,1,3).reshape(d**2,d**2) @ C.reshape(d**2,d)).reshape(d,d,d)

    return D1

D1 = tempfunct2(A,B,C,d)
print(D1)

Shown with np.einsum:

##### Evaluate network using np.einsum, which is almost the most convenient way :)
def tempfunct3(B,A,C):
    D2 = np.einsum('ijk,jml,kln -> imn', B, A, C)
    # just keep the index labels consistent across all tensors!
    return D2

D2 = tempfunct3(B,A,C)
print(D2)

# That's how we start with NumPy and quickly move past it , , ,
print(D2-D1)
print(D2-D0)
print(D1-D0)
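Since these raw differences are just floating-point residuals, a tolerance-aware check (my own addition) is more telling:

print(np.allclose(D0, D1), np.allclose(D1, D2))   # True True when all three agree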

Shown with ncon:

from ncon import ncon

##### Evaluate using ncon
D3 = ncon([A,B,C], [[1,-2,2],[-1,1,3],[3,2,-3]])

print(D3)
print(D3-D2)
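To see the cost analysis from above in action, a rough timing sketch (my own; exact numbers depend on the machine): the explicit loops scale as O(d^6), while the pairwise contractions of tempfunct2 scale as O(d^5).

import time

t0 = time.time()
tempfunct(A,B,C,d)    # explicit index summation: O(d^6)
t1 = time.time()
tempfunct2(A,B,C,d)   # sequence of binary contractions: O(d^5)
t2 = time.time()
print('loops: %.3f s, reshape/transpose: %.3f s' % (t1-t0, t2-t1))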

* Basic knowledge of Tensor Decompositions:

(Saved for the next section, together with truncation/pruning.)
