How to Speed Up Your App's API Consumption


Introduction

In the process of creating a PHP application you may reach a point where keeping it isolated from remote resources or services becomes a barrier to its development. To move the project along, you may employ different API services to fetch remote data, connect with user accounts on other websites, or transform resources shared by your application.

The ProgrammableWeb website states that there are currently more than ten thousand APIs available all over the web, so you'd probably find a lot of services that can be used to extend your PHP app's functionality. But using APIs in an incorrect way can quickly lead to performance issues and lengthen the execution time of your script. If you're looking for a way to avoid that, consider implementing some of the solutions described in this article.

Make multiple requests at a time


When a typical PHP script is executed, the commands in the code run one after another. This seems completely logical, as you usually need the result of the previous operation (e.g. a database query or a variable manipulation) to move to the next step of the script. The same rule applies when you make an API call: you have to send a request and wait for a response from the remote host before you can do anything with the data you received. But if your app makes several API calls and you need the data from each source to move on, you don't have to execute each request separately. Remember that the server responsible for handling API calls is prepared to work with several queries at a time. What you need to do is create a script that executes the API calls in parallel rather than one after another. Fortunately, PHP offers a set of curl_multi functions designed to do exactly that.

Using the curl_multi functions is similar to making typical requests in PHP with the cURL library. The only difference is that you need to prepare a set of requests to execute (not just one) with the curl_init function and pass them to the curl_multi_add_handle function. Then, calling the curl_multi_exec function will execute the requests simultaneously, and curl_multi_getcontent will let you get the result of each API call.
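The described logic can be sketched roughly as follows; the URLs are placeholders, so substitute your own endpoints:

```php
<?php
// A minimal sketch of running several API calls in parallel with curl_multi.
// The example URLs are hypothetical -- replace them with real endpoints.
$urls = [
    'https://api.example.com/users/1',
    'https://api.example.com/users/2',
];

$multiHandle = curl_multi_init();
$handles = [];

foreach ($urls as $url) {
    $handle = curl_init($url);
    curl_setopt($handle, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_multi_add_handle($multiHandle, $handle);
    $handles[] = $handle;
}

// Run all the requests at once; loop until every transfer has finished.
do {
    $status = curl_multi_exec($multiHandle, $running);
    if ($running) {
        curl_multi_select($multiHandle); // wait for activity instead of busy-looping
    }
} while ($running && $status === CURLM_OK);

// Collect the response bodies and clean up the handles.
$responses = [];
foreach ($handles as $handle) {
    $responses[] = curl_multi_getcontent($handle);
    curl_multi_remove_handle($multiHandle, $handle);
    curl_close($handle);
}
curl_multi_close($multiHandle);
```

Note the curl_multi_select call: it lets the script sleep until one of the transfers has new data, instead of spinning through the loop at full speed.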

If you want to employ the curl_multi functions in your PHP application, there are some important points to remember. First of all, the curl_multi_exec function will take as long as the slowest API call in the set of requests passed to the curl_multi_add_handle function. Using curl_multi thus makes sense in cases where each of the API calls takes a similar amount of time. If one request in a curl_multi set is significantly slower than the others, your script won't be able to move on until that slowest request has finished.

It is also important to identify the number of parallel requests that can be executed at a time. Remember that if your site handles a lot of traffic and each user triggers simultaneous API calls to one remote server, the total number of requests being made at a time may quickly become high. Don't forget to check the limits stated in the API documentation and find out how the service responds when you hit them. The remote server may send a specific HTTP response code or an error message when you exceed the limits. Such cases should be properly handled by your application, or at least logged, so that you can diagnose the issue and lower the number of requests.
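As a minimal sketch of such handling, the snippet below inspects the HTTP status code of a cURL response and logs a rate-limit hit; 429 (Too Many Requests) is the usual code for this, but the exact code and message depend on the particular API:

```php
<?php
// Hypothetical helper: perform a GET request and treat a 429 response
// as a rate-limit hit. The URL passed in is assumed to be a real endpoint.
function fetchWithLimitCheck(string $url): ?string
{
    $handle = curl_init($url);
    curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($handle);
    $statusCode = curl_getinfo($handle, CURLINFO_HTTP_CODE);
    curl_close($handle);

    if ($statusCode === 429) {
        // We hit the provider's rate limit: log it so the issue can be diagnosed.
        error_log("Rate limit reached for {$url}; backing off.");
        return null;
    }

    return $body === false ? null : $body;
}
```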

Separate API calls from the app main flow

If you want to keep your web application responsive and avoid serving pages that load slowly, a high number of API calls made to a remote server can make this task a lot more difficult. If all of the requests are made within the main app flow, the end user won't see a rendered page until the PHP script receives the API responses and processes the data. Of course, there are plenty of API services hosted on fast servers that process requests quickly. Even so, your app may occasionally be slowed down by connection lags or by random factors affecting the connection process or the remote server itself.

If you want to protect the end user from such issues, you need to move the part of the application responsible for handling the requests out of the main flow into a standalone script. That way, the API calls are executed in a separate thread that doesn't interfere with the part of the code responsible for displaying the site.

To implement such a solution, you can write a separate PHP script and execute it using the exec() function, just as you would execute any command line application. Different PHP frameworks often offer modules that simplify writing command line scripts and allow you to integrate them easily with existing application models or components. Just check the Symfony2 or CakePHP console components to see some examples. Various PHP platforms, not only frameworks, may also offer tools that make writing command line scripts easier, like WP-CLI, a command line interface for WordPress.
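A bare-bones version of the exec() approach looks like this; fetch_data.php is a hypothetical worker script that performs the actual API calls, and the trailing ampersand detaches it so the main script returns immediately (this form assumes a Unix-like shell):

```php
<?php
// Launch a background worker instead of calling the API in the request flow.
// fetch_data.php is a hypothetical worker script living next to this file.
$userId = 42; // example argument handed to the worker

$command = sprintf(
    'php %s %s > /dev/null 2>&1 &', // discard output; "&" detaches the process
    escapeshellarg(__DIR__ . '/fetch_data.php'),
    escapeshellarg((string) $userId)
);

exec($command); // returns right away; the worker runs on its own
```

Using escapeshellarg on every argument keeps the command safe even if the values ever come from user input.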

If you're looking for a more powerful way of handling API calls in a separate process, consider setting up a job server like Gearman. A job server is a complete solution that performs all the actions necessary to separate specific tasks (jobs) into independent processes. Read Alireza Rahmani Khalili's Introduction to Gearman article to see how it works and how to implement it in PHP. If you work on the Zend Server platform, you can employ the Zend Job Queue component, which offers similar functionality. Its features and usage examples are described in the Scheduling with Zend Job Queue article written by Alex Stetsenko.

No matter which solution for separating API calls you choose, you have to think of a way for the different parts of your app to communicate with each other. First and foremost, you should put the data received from an API call in a place (e.g. a database table or a file) accessible to the whole app. You also have to share the execution status of the separate script. The main application has to know whether the externally executed API call is still in progress, completed a while ago, or failed. If you employ a job server solution, it will probably offer functionality to monitor the job status. But if you just want to stick with writing a simple PHP command line script, you will have to implement such logic yourself.
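One simple way to implement such logic is a small JSON status file that the worker writes and the main app reads; the path and status names below are illustrative, and a database row would work just as well:

```php
<?php
// Sketch: sharing a background job's status through a small JSON file.
// The worker calls setJobStatus(); the main app calls getJobStatus().
function setJobStatus(string $path, string $status): void
{
    $payload = json_encode([
        'status'     => $status, // e.g. 'running', 'done', 'failed'
        'updated_at' => time(),  // lets the reader detect stale jobs
    ]);
    // LOCK_EX prevents a half-written file being read mid-update.
    file_put_contents($path, $payload, LOCK_EX);
}

function getJobStatus(string $path): ?array
{
    if (!file_exists($path)) {
        return null; // the job has never been started
    }
    return json_decode(file_get_contents($path), true);
}
```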

Multiple HTTP requests or multiple threads?

So which solution is better: employing the curl_multi functions to execute several HTTP requests at a time, or separating the API calls from the main app flow? It depends on the context in which the remote server is being queried. You may find that the whole API-handling script takes a long time not only because of the requests being made. There may also be extensive code responsible for dealing with the received data, especially when it involves transforming files or making heavy database writes. In such cases, using the curl_multi functions probably won't be sufficient to speed up your app. Running a separate thread responsible for the whole operation, along with processing the data received from the remote host, may achieve better results in terms of your app's performance. On the other hand, if you need to execute a lot of simple API calls that don't involve heavy data processing on your side, sticking with the curl_multi functions will probably be enough to make your app faster.

And of course there is a third solution: mixing the two approaches described above. You can run a separate thread responsible for dealing with the API calls and then try to make it run faster by making multiple requests at a time. This may be more efficient than executing a separate script for each request. But it may also require a deeper analysis of how to design the flow of the script so that different script executions and different API calls executed at once don't interfere with each other and don't duplicate each other's work.

Build a smart cache engine

Another way to speed up an application that relies heavily on API usage is to build a smart caching engine. It can prevent your script from making calls that are unnecessary because the content located on a different server hasn't changed. Proper caching can also reduce the amount of data transferred between the servers in a single API call.

To write a cache engine that works properly and returns valid data, you need to identify the cases in which the response from a remote server doesn't change, so that there's no need to fetch it every time. This will probably differ depending on the specific API service, but the general idea is to find a set of parameters (passed in the request) that give the same response within a given time period. For example, if you fetch daily currency exchange rates from a remote service, you can be sure that the exchange rate for a given currency (which is the parameter) stays the same for the whole day. So the cache key for storing the data received from this particular API has to contain both the currency and the date. If your app has to fetch this specific exchange rate next time, you can refer to the data saved in the cache (e.g. in a database or a file) and avoid making an HTTP request.
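The exchange-rate scenario can be sketched with a simple file-based cache; the key combines the currency and today's date, so the real API is hit at most once per currency per day. The fetcher is passed in as a callable here purely for illustration:

```php
<?php
// Date-scoped cache sketch for daily exchange rates.
// $fetchRateFromApi is a hypothetical callable that performs the real API call.
function getExchangeRate(string $currency, callable $fetchRateFromApi): float
{
    // The key contains both parameters that determine the response:
    // the currency and the current date.
    $cacheKey  = sprintf('rate_%s_%s', $currency, date('Y-m-d'));
    $cacheFile = sys_get_temp_dir() . '/' . $cacheKey . '.cache';

    if (file_exists($cacheFile)) {
        // Cache hit: today's rate was already fetched, skip the HTTP request.
        return (float) file_get_contents($cacheFile);
    }

    $rate = $fetchRateFromApi($currency); // at most one real call per currency per day
    file_put_contents($cacheFile, (string) $rate);

    return $rate;
}
```

Tomorrow the date part of the key changes, the lookup misses, and a fresh rate is fetched automatically; no explicit invalidation step is needed.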

The scenario described above assumes that your application takes all the responsibility for examining the cases in which the data received from a remote service can be cached, so you need to implement proper caching logic yourself. But there are also cases in which an API service tracks the changes in the data it shares and returns additional fields containing metadata linked with a certain resource. The metadata may be composed of values like the last modification date, a revision number, or a hash computed based on the resource content.

Making use of such data can be a great way to improve the performance of your PHP application, especially when dealing with large amounts of data. Instead of fetching the whole resource each time you connect with the API, you just need to compare a timestamp or a hash with the value you received last time. If they are equal, it means you can use the data fetched before, as the remote content hasn't changed. This solution assumes that you do employ a caching engine in your application, but you don't need to worry about whether the data stored in the cache is valid. As you rely on the metadata returned by the API service, you only need to compare the metadata values given by the remote server.
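The freshness check itself boils down to a single comparison; a tiny sketch, assuming both hashes come from the service's metadata fields:

```php
<?php
// Sketch: decide whether a cached copy is still fresh by comparing the hash
// stored with it against the hash just returned by the API's metadata.
function isCacheFresh(?string $cachedHash, string $remoteHash): bool
{
    // Equal hashes mean the remote resource has not changed since we cached
    // it, so the stored copy can be reused without a full fetch.
    return $cachedHash !== null && hash_equals($cachedHash, $remoteHash);
}
```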

Using remote resource metadata may be especially beneficial when employing a file hosting service API. Working with remote folders and files usually means transferring a lot of data, which may lead to performance issues. To give you an example of how to avoid this, let me describe the solutions used in the Dropbox API. The Dropbox API service returns specific data that should be used to check whether the remote files have changed. First of all, the metadata method (which returns folder and file information like names, sizes or paths) contains a hash field representing the hash value of the returned resource. If you provide the hash value from a previous request as a parameter in a new one and the remote data hasn't changed between the requests, the API will just return an HTTP 304 (Not Modified) response. The Dropbox API also offers the delta method, which is created exclusively for reporting changes in specific folders or files. Using the hash values and the delta method is recommended in the API documentation, as it may give your application a significant performance boost.

Last but not least: master the API documentation

It may sound obvious, but in some cases reading the API documentation thoroughly may provide you with specific solutions for making API calls more efficiently. The Dropbox API usage described above is a very clear example. But there may be other ways to reduce the amount of data transferred in a response (e.g. selecting only a few specific fields to be returned by the API instead of receiving the whole dataset). You can also check whether the actions you execute in separate requests can be performed at once. For example, the translate method of the Google Translate API (used for fetching text translations in different languages) can return more than one translation in a single request. By passing a few text strings to process in one API call, you can avoid making multiple requests, which will probably save some app execution time.
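As an illustration of that batching idea, the sketch below builds one request URL carrying several strings. The endpoint and parameter names follow the shape of the Google Translate v2 REST API, which accepts a repeated q parameter, but check the current documentation before relying on this exact form; the API key is a placeholder:

```php
<?php
// Sketch: batch several strings into a single translation request instead
// of sending one request per string. Endpoint shape follows the Google
// Translate v2 REST API; verify against the current docs before use.
$texts = ['Hello', 'Good morning', 'Thank you'];

$params = [
    'key'    => 'YOUR_API_KEY', // placeholder credential
    'source' => 'en',
    'target' => 'de',
];

$query = http_build_query($params);
foreach ($texts as $text) {
    $query .= '&q=' . rawurlencode($text); // one repeated "q" per string
}

$url = 'https://translation.googleapis.com/language/translate/v2?' . $query;
// A single request to $url would now cover all three strings at once.
```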

Summary

As you can see, there are many ways to improve the performance of a PHP application that relies heavily on remote APIs. You can execute multiple requests at once, either by using the curl_multi functions or by running separate application threads. Another solution is to implement a caching engine that prevents you from making unnecessary API calls or lowers the amount of data transferred between servers. Finally, the methods offered by the API service may provide you with some out-of-the-box solutions to get a performance boost, like executing multiple actions in one request.

I hope this article gave you some insight into how to handle API requests efficiently. If you have any comments regarding the points presented in the article, or any other tips on how to speed up working with APIs, feel free to post them below. You can also contact me directly through Google Plus.

Translated from: https://www.sitepoint.com/speed-apps-api-consumption/
