Polyhedral model schedule trees

Preface

Mainstream deep-learning compilers based on the polyhedral model, such as Tensor Comprehensions (Tensor Comprehensions: Framework-Agnostic High-Performance Machine Learning Abstractions), rely on the schedule tree representation. This post gives a brief introduction.


From: http://impact.gforge.inria.fr/impact2017/papers/impact-17-a-general-compilation-algorithm-to-parallelize-and-optimize-counted-loops-with-dynamic-data-dependences_slides.pdf

 

Schedule tree node types

Filter node: selects the subset of statement instances that the subtree below it schedules.

Band node: applies a piecewise affine schedule to all statement instances that reach the node.

Sequence and set nodes: the children of a sequence node execute in the given order, while the children of a set node may execute in an arbitrary order.

Domain and context nodes are easy to understand: a domain node introduces the set of statement instances to be scheduled, and a context node introduces constraints on the symbolic constants (parameters). The slides linked above illustrate both.

Mark node: allows the user to mark specific subtrees of the schedule tree, e.g., to attach information consumed by later passes.

Extension node: introduces auxiliary computations that are not part of the original iteration domain, which is useful, e.g., for introducing statements that copy data to and from shared memory.
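To make these node semantics concrete, here is a small self-contained Python sketch (a toy dict-based model, not the isl API, where domains and schedules are polyhedral sets and affine maps rather than lists and lambdas) that interprets a schedule tree and yields statement instances in scheduled order; the node kinds `domain`, `band`, `filter`, and `sequence` mirror the descriptions above:

```python
# Toy schedule-tree interpreter (illustrative only; not the isl API).
# A statement instance is a (name, point) pair, e.g. ("S", (0,)).

def instances(node, active):
    """Yield the statement instances under `node`, restricted to `active`."""
    kind = node["kind"]
    if kind == "domain":
        # Domain node: introduces the full set of statement instances.
        yield from instances(node["child"], node["instances"])
    elif kind == "band":
        # Band node: schedules every active instance; instances sharing a
        # schedule value form one group, and groups run in increasing order.
        groups = {}
        for inst in active:
            groups.setdefault(node["schedule"](inst), []).append(inst)
        for val in sorted(groups):
            if "child" in node:
                yield from instances(node["child"], groups[val])
            else:
                yield from groups[val]
    elif kind == "filter":
        # Filter node: restricts the subtree to a subset of instances.
        yield from instances(node["child"],
                             [i for i in active if node["keep"](i)])
    elif kind == "sequence":
        # Sequence node: children execute one after the other.
        for child in node["children"]:
            yield from instances(child, active)

# Two statements S and T over i in {0, 1}; the band fuses them on i, and
# the sequence orders S before T within each iteration.
tree = {
    "kind": "domain",
    "instances": [("S", (0,)), ("S", (1,)), ("T", (0,)), ("T", (1,))],
    "child": {
        "kind": "band",
        "schedule": lambda inst: inst[1][0],  # schedule by loop index i
        "child": {
            "kind": "sequence",
            "children": [
                {"kind": "filter", "keep": lambda inst: inst[0] == "S",
                 "child": {"kind": "band", "schedule": lambda inst: 0}},
                {"kind": "filter", "keep": lambda inst: inst[0] == "T",
                 "child": {"kind": "band", "schedule": lambda inst: 0}},
            ],
        },
    },
}

print(list(instances(tree, [])))
# [('S', (0,)), ('T', (0,)), ('S', (1,)), ('T', (1,))]
```

Swapping the two filters inside the sequence would run T before S within each iteration; replacing the sequence with a set node would leave that order unspecified.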

 

Example (from Tensor Comprehensions):
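The original example figure is not reproduced here. As a stand-in, the fragment below sketches what such a schedule tree looks like in isl's YAML-like textual form; the statement names S and T, the parameter N, and the bounds are invented for illustration, so treat the exact syntax as approximate:

```yaml
# A domain of two statements, fused by one band, ordered by a sequence.
domain: "[N] -> { S[i] : 0 <= i < N; T[i] : 0 <= i < N }"
child:
  schedule: "[{ S[i] -> [(i)]; T[i] -> [(i)] }]"
  permutable: 1
  child:
    sequence:
    - filter: "{ S[i] }"
    - filter: "{ T[i] }"
```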

Schedule tree operations

See the Schedule Trees paper (listed in the references below) for the operations defined on this representation.
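As one concrete example of such an operation, loop tiling can be phrased as a rewrite on a band node: a band with schedule f and tile size T is replaced by an outer band computing f(x) // T whose child is an inner band computing f(x) % T. A minimal self-contained Python sketch, using a toy dict representation rather than the isl API:

```python
# Tiling as a schedule-tree rewrite (toy sketch, not isl's API).
# A band node is a dict with a "schedule" function and an optional "child".

def tile(band, size):
    """Split `band` into an outer (tile) band and an inner (point) band."""
    f = band["schedule"]
    inner = {"kind": "band", "schedule": lambda inst: f(inst) % size}
    outer = {"kind": "band", "schedule": lambda inst: f(inst) // size,
             "child": inner}
    if "child" in band:
        inner["child"] = band["child"]  # original subtree moves below
    return outer

# A 1-d band scheduling instances by their index, tiled with size 2:
band = {"kind": "band", "schedule": lambda i: i}
tiled = tile(band, 2)
keys = [(tiled["schedule"](i), tiled["child"]["schedule"](i))
        for i in range(6)]
print(keys)  # [(0, 0), (0, 1), (1, 0), (1, 1), (2, 0), (2, 1)]
```

Sorting instances lexicographically by these (tile, point) keys reproduces the tiled execution order.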

 

Note: the content of this post is mainly excerpted from or adapted from the references annotated in the text and listed below. Corrections and feedback are welcome.

S. Verdoolaege, S. Guelton, T. Grosser, and A. Cohen. Schedule Trees. In 4th Workshop on Polyhedral Compilation Techniques (IMPACT, associated with HiPEAC), Vienna, Austria, Jan. 2014. http://impact.gforge.inria.fr/impact2014/papers/impact2014-verdoolaege.pdf

Oleksandr Zinenko, Lorenzo Chelini, and Tobias Grosser. Declarative Transformations in the Polyhedral Model. Research Report RR-9243, Inria; ENS Paris; ETH Zurich; TU Delft; IBM Zürich. 2018. hal-01965599. https://hal.inria.fr/hal-01965599/document

Tensor Comprehensions: Framework-Agnostic High-Performance Machine Learning Abstractions

Polyhedral AST Generation Is More Than Scanning Polyhedra.

Polyhedral Parallel Code Generation for CUDA. https://dl.acm.org/doi/pdf/10.1145/2400682.2400713

Diesel: DSL for Linear Algebra and Neural Net Computations on GPUs
