Mundane improvements, big impact

I’m always happy when tooling catches up to the problems we have to solve. Sure, you can figure things out through trial and error (and to be honest, a lot of debugging is still very much that, and I don’t think that’s likely to change), but quality tools can help eliminate the guesswork and provide a shortcut of sorts.

Here’s an example.

A couple of months ago, I was helping one company boost its performance online. We addressed plenty of front-end optimizations, but one of the most troubling issues was their Time to First Byte (TTFB).

The cache hit ratio was very low (a topic for another day), and whenever the server had to provide a fresh version of the page, it was taking an unacceptably long time more often than not. This particular company was using Shopify, so it wasn’t a matter of tuning servers or CDN settings somewhere but figuring out what was taking so long for Shopify to render the Liquid templates necessary to serve the page.

Interestingly, the synthetic data was pretty much smoothing over the TTFB issues entirely. Synthetic tests, even properly tuned to match device and network characteristics, only occasionally surfaced the issue. The RUM data made it clear as day (and it was easily reproduced in individual testing).

Looking at the page itself, we noticed a block of JSON data on product pages. This JSON data was very large, providing detailed product information for all the products in the catalog. It added a lot of size to the HTML (anywhere from 50-100kb depending on the page), and we suspected it was a big part of the server delay as well. Just how much of the delay it accounted for, and whether or not it was the primary culprit, we weren’t sure. Trial and error is fine, and often the only way forward, but it’s always nice to have some definitive evidence to guide decisions.
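The article doesn’t show the actual markup, but a block like the one described is typically embedded in the page as an inline script tag serialized from Liquid. Here is a minimal sketch of the pattern, with the `id` and the exact filter chain assumed rather than taken from the site in question:

```liquid
{% comment %}
  Hypothetical reconstruction of the inline JSON block described above.
  Serializing the entire catalog's product data this way inflates both
  the HTML payload and the server-side render time.
{% endcomment %}
<script id="product-data" type="application/json">
  {{ collection.products | json }}
</script>
```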

I pinged someone I know at Shopify around this time, and they hooked me up with a beta version of a new profiler they built for analyzing Shopify Liquid rendering times. (The profiler has since been released as a Chrome extension, so it’s a bit easier to get up and running now.) The profiler takes a JSON object of data about how long Shopify spent rendering the Liquid template and then presents that data in the form of a flame graph.

Sure enough, running the profiler showed that the creation of this JSON object was a significant bottleneck—the primary bottleneck, in fact. The image below is from a profile where the template rendering took 3.8 seconds.

The Shopify profiler shows how much time Shopify spends on each part of a given template. Here, we see including the template that creates a JSON object takes 2.8s.

If you don’t yet speak flame graph, here’s what it’s telling you.

First, at the bottom, is all the work for the page itself. Everything above that represents tasks that had to complete as part of the work to render that page.

The part highlighted, which is showing 2.8 samples (seconds), is the time it took to handle a particular include in the theme. As you move further up the stack, you can see there’s an if statement (if template.name == "collection") that triggers the creation of our JSON object (collection.products | json).
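Pieced together from the identifiers visible in the flame graph, the hot path in the theme presumably looked something like this (the surrounding structure is an assumption; only the condition and the filter chain actually appear in the profile):

```liquid
{% comment %}
  Sketch of the logic the profiler surfaced. On collection pages, the
  include serializes every product in the collection to JSON, which is
  where nearly all of the 2.8s is spent.
{% endcomment %}
{% if template.name == "collection" %}
  {{ collection.products | json }}
{% endif %}
```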

The width of that final chunk of work to create the JSON object is nearly the same as the width for all work associated with that include, indicating that’s where the bulk of that time is coming from.

We could have put a cap on how many products would be returned in the JSON object, or maybe stripped out some of the data. Either would have made that process faster (and reduced the weight passed over the network as well). We didn’t have to go through the trouble, though. As it turns out, it was for an analytics service the company was no longer using. We removed the creation of the JSON object altogether and watched the TTFB decrease dramatically—to the tune of 50-60% at the median.
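For reference, the two mitigations can be sketched in Liquid. The use of the `slice` filter and the cap of 24 products are illustrative assumptions, not code from the actual theme:

```liquid
{% comment %}
  Option 1: cap how many products get serialized. Shopify Liquid's
  `slice` filter takes a 0-based start index and a length.
{% endcomment %}
{{ collection.products | slice: 0, 24 | json }}

{% comment %}
  Option 2 (what we actually did): remove the block entirely, since the
  analytics service that consumed it was no longer in use.
{% endcomment %}
```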

From a technical perspective, the optimization isn’t altogether that interesting—there’s nothing super exciting about commenting out a block of JSON. But, to me, the fact that the fix was so boring is precisely what makes it interesting.

The optimization was a huge savings just quietly waiting for someone to find it. If we only had synthetic testing to look at, we would have missed it entirely. As I noted earlier, synthetic tests only rarely showed the long TTFBs. It took RUM data both to surface the problem and to make it clear to us just how frequently the issue was occurring.

Even after the problem was discovered, identifying the fix required better visibility into the work happening under the hood. By digging through the completed HTML, we were able to come up with a reasonable guess as to what was causing the problem. But the availability of a tool with a bit more precision was able to tell us exactly what we were dealing with (and would no doubt have saved us some time if we had it from the beginning).

The combination of quality monitoring and quality tooling is one that tends to have powerful ripple effects. Browsing around Shopify sites, I see a lot of long TTFB issues, and I’m confident that profiling the templates for a few of these pages would surface plenty of areas for quick improvement. If Shopify were also able to find a way to better surface cache hit ratios and what exactly is triggering cache invalidation, I suspect a bunch of common patterns would emerge. These patterns would then lead both individual companies using the platform, and likely Shopify itself, to identify changes that would reap huge rewards for performance (and revenue).

It’s not always clever tricks and browser intricacies (though those are always fun) that lead to performance improvements. Often, it’s far more mundane tasks—cleaning up here, adjusting there—just waiting for the right combination of tooling and monitoring to make them apparent.

Translated from: https://timkadlec.com/remembers/2020-04-13-mundane-improvements-big-impact/
