Automated Test - Five Trends Shaping its Future

Kirtesh Mistry, Technical Marketing Engineer at National Instruments, investigates five key trends affecting the way automated test is developing.

When I ask test engineers and managers what led them to attend one of our conferences or summits, I often get the reply, "to keep up with what's new in test."

This question often results in further discussions in which managers come to realise that the latest technologies can help them optimise their processes, ensure that their teams of engineers are as productive as they can be and ultimately give their business a competitive edge. The technology changes that occur in automated testing are driven by a number of factors, including the increasing complexity of devices under test (DUTs) and the need to meet market demands.

One of the biggest challenges for test engineers and their managers is keeping up to date on technology trends.

To help them stay up to date, NI annually publishes the Automated Test Outlook, drawing on relationships with over 35,000 companies worldwide to provide a comprehensive view of the key technologies and methodologies impacting the test and measurement industry over the coming 1 to 3 years.

NI is well positioned to track technology trends in automated test through internal research and development activities and close relationships with suppliers, so the outlook can extend to technologies that are often far ahead of commercialisation.

The 2012 outlook identifies five key trends, each of which will be summarised in this article.

  1. Optimising Test Organisations

  2. Measurements and Simulation in the Design Flow

  3. PCI Express External Interfaces

  4. Proliferation of Mobile Devices

  5. Portable Measurement Algorithms

1. Optimising Test Organisations

"Test is a foundational activity in any development, manufacturing and maintenance endeavour. Not only must it be included when considering product quality, time to market and business objectives, but it must also be effective and affordable. At Lockheed Martin, we are investing in the people, process and technology aspects of automated test to ensure we meet our objectives." Tom Wissink, Director of Integration, Test and Evaluation, Lockheed Martin Corporate Engineering & Technology.

The business strategy for many organisations in the coming years will be to focus on optimisation.

Optimising test solutions

In many leading organisations, information technology has evolved over two decades from being a support function to a strategic asset. IT can now streamline critical line-of-business processes and help executives make real time decisions. The strategic importance of IT was confirmed by the Chief Information Officer (CIO) magazine 2010 State of the CIO Survey, which revealed that 70 percent of CIOs are now members of their companies' executive committees.

Similarly, an emerging trend for electronics and manufacturing companies is the elevation of the test engineering function from a cost centre to a strategic asset for competitive differentiation.

This shift was confirmed by a recent global NI survey of test engineering leaders, who said their top goal over the next 1 to 2 years is to reorganise their test organisation structures for increased efficiency. This strategic realignment reduces the cost of quality and improves a company's financials by getting better products to market faster.

Companies making this transformation must commit to a long-term strategy because, according to NI research, it generally takes 3 to 5 years to realise the full benefits. A company must have a disciplined and innovative investment strategy to transform the test organisation through four maturity levels: ad-hoc, reactive, proactive and optimised. Each level includes people, process and technology elements.

The right people are required to develop and maintain a cohesive test strategy. Process improvements are required to streamline test development and reuse throughout product development. And finally, tracking and incorporating the latest technologies is required to improve system performance while lowering cost.

An organisation steadily builds a foundation for strategic transformation by sticking to a sequential approach and identifying short-term initiatives that help the company improve its maturity level and that map to annual operating objectives. As the foundation is built, test productivity and asset utilisation increase, paying dividends on the original investment. This phased approach enables organisations to realise the benefits early on - after the completion of just one or two projects.

Typically, optimised organisations develop standardised test architectures with strong reuse of components from design to production, and provide systematic enterprise data management and analysis that results in company-level business impact.

2. Measurements and Simulation in the Design Flow

"Connectivity between our EDA tools and NI's test software allows engineers to develop a test bench simultaneously with product development, providing earlier test feedback into the design process and greatly shortening design cycles by making development and test parallel rather than serial" - Serge Leef, Vice President / General Manager of System Level Engineering Division, Mentor Graphics.

A key objective for many organisations is to shorten the product development cycle. This has long been the case in the automotive and aerospace industries, for which the end product is a highly complex "system of systems". The same trend is now seen in the semiconductor and consumer electronics industries, where shorter product life spans and increasing product complexity are fuelling the pressure to reduce product development time.

Measurement and simulation

One approach to reducing development time is concurrent design and test, which is often represented by the V-diagram product development model. The left side of the V-diagram represents "design" and the right side represents "test". The idea is to increase efficiency by validating and testing subsystems during the design phase, before development of the entire system is complete.

A key method to empower this practice is increasing the connectivity between electronic design automation (EDA) simulation software and test software.

During initial design and simulation, EDA software is used to model either the physical or electrical behaviours of a simulated product. During the validation and verification stage of product development, engineers use software to automate measurements on a real prototype. However, just as in the design and simulation phase, the validation and verification process requires measurement algorithms equivalent to those used by the EDA software tools.

With National Instruments' recent acquisition of AWR, a leading supplier of electronic design automation (EDA) software for designing RF and high-frequency components and systems, engineers can now benefit from tighter software and hardware integration between AWR's Microwave Office Design Suite and NI LabVIEW graphical system design software.

One benefit of the connectivity between design and test software environments is that it allows design engineers to use significantly richer measurement algorithms earlier in the design process. A second benefit is that it allows test engineers to develop working test code much sooner, which ultimately reduces time to market for complex products.

For example, the design of a cellular multimode RF power amplifier is traditionally modelled using RF EDA tools that allow the engineer to simulate RF characteristics such as efficiency, 1dB compression point and gain.

However, the end product must meet additional RF measurement criteria explicitly established for cellular standards such as GSM/EDGE, WCDMA and LTE.

Historically, "standard-specific" measurement data from metrics such as LTE error vector magnitude (EVM) and adjacent channel leakage ratio (ACLR) required instrumentation on a physical DUT, largely because of measurement complexity. Going forward, new connectivity between EDA and test automation software enables the use of these sophisticated measurement algorithms within the EDA environment on a simulated device. As a result, engineers will be able to identify system-related or complex product issues much earlier in the design cycle and therefore shorten design times.
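By way of illustration, the hedged Python sketch below computes a root-mean-square EVM figure from ideal and measured constellation symbols. The modulation, noise level and normalisation choice are assumptions made purely for this example; they are not taken from any NI, AWR or standards-body implementation.

```python
import numpy as np

def evm_percent(reference, measured):
    """RMS error vector magnitude, in percent, between ideal and measured
    complex constellation symbols (a common textbook definition; cellular
    standards differ in averaging and normalisation details)."""
    reference = np.asarray(reference, dtype=complex)
    measured = np.asarray(measured, dtype=complex)
    error_power = np.mean(np.abs(measured - reference) ** 2)
    reference_power = np.mean(np.abs(reference) ** 2)
    return 100.0 * np.sqrt(error_power / reference_power)

# Hypothetical example: QPSK symbols corrupted by additive noise
rng = np.random.default_rng(0)
ideal = (rng.choice([-1, 1], 1000) + 1j * rng.choice([-1, 1], 1000)) / np.sqrt(2)
noisy = ideal + 0.05 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))
print(f"EVM = {evm_percent(ideal, noisy):.2f} %")
```

The same routine could, in principle, be fed either simulated symbols from an EDA tool or demodulated symbols captured from a physical DUT, which is exactly the kind of algorithm sharing that the EDA/test connectivity described above makes practical.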

3. PCI Express External Interfaces

"Due to the combination of its excellent performance and pervasiveness, PCI Express is the default choice for system buses. With new fibre-optic and copper cable technologies, it is emerging as the leading choice for high-performance external interfaces." - Mark Wetzel, Distinguished Engineer for Process Architectures, National Instruments.

PCI Express external interfaces

PCs in various form factors, such as desktops, workstations, industrial and embedded systems, have been used to provide central control for instrumentation and to automate test procedures since the invention of GPIB in the 1960s. These days, they offer a variety of buses, such as USB, Ethernet, serial, GPIB, PCI and PCI Express, for interfacing to instrumentation hardware in automated test systems. Because PCs play such a critical role in an automated test system, the test and measurement industry must track the progression of the PC industry and exploit any new technologies for increasing capabilities and performance while lowering the cost of test.

Since PCI Express is a serial bus, it has a variety of inherent advantages over parallel buses such as PCI and VME. Technical challenges like timing skew, power consumption, electromagnetic interference and crosstalk across parallel buses become more and more difficult to circumvent when trying to increase data bandwidth. PCI has been the fundamental bus on the motherboard of many computers, and PCI Express, since its release in 2004, has seen continuous improvements in its data transfer capabilities. Furthermore, PCI Express uses the same software stack as PCI and provides full backward compatibility.

PCI and PCI Express offer better performance than other external interfaces such as GPIB because they are directly available from the CPU inside a PC. A more recent implementation of PCI Express as an external interface is Thunderbolt, a technology Intel pioneered under the code name Light Peak that has the potential to become extremely pervasive. Thunderbolt combines PCI Express and the DisplayPort video protocol into a serial interface bus that can be driven over either copper or fibre-optic cables. Since PCs will natively offer Thunderbolt ports, it promises to be a high-performance, low-cost and ubiquitous solution.

Devices under test are increasing in complexity and require a multitude of measurements, such as RF, audio, vibration and even video analysis. The volume of data produced drives the demand for greater bus bandwidth. Platforms utilising PCI/PXI Express are commonly used to address this need for high bandwidth, whether for rapid streaming to a Redundant Array of Inexpensive Disks (RAID) for post-processing or for shuttling data between test systems. Eliminating bottlenecks caused by insufficient bandwidth reduces test times. These benefits will drive the adoption of the external PCI Express port in its various form factors.
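To make the bandwidth argument concrete, the back-of-the-envelope calculation below compares the sustained data rate of a hypothetical multi-channel acquisition with the approximate usable throughput of a PCI Express Gen2 x8 link. The channel count, sample rate and per-lane throughput figure are illustrative assumptions, not specifications of any particular instrument.

```python
# Rough estimate of whether a streaming acquisition fits within a PCIe link
# (all figures below are illustrative assumptions).

channels = 8                  # simultaneously acquired channels
sample_rate_hz = 100e6        # samples per second per channel
bytes_per_sample = 2          # 16-bit samples

required_rate = channels * sample_rate_hz * bytes_per_sample      # bytes/s
print(f"Sustained acquisition rate: {required_rate / 1e9:.1f} GB/s")

# Approximate usable throughput of a PCI Express Gen2 x8 link,
# assuming roughly 500 MB/s per lane after encoding and protocol overhead.
pcie_lanes = 8
usable_bytes_per_lane = 500e6
available_rate = pcie_lanes * usable_bytes_per_lane
print(f"Approximate PCIe Gen2 x8 throughput: {available_rate / 1e9:.1f} GB/s")

print("Acquisition fits within bus bandwidth:", required_rate < available_rate)
```

Working the numbers this way early in system design helps decide whether data must be reduced at the instrument or can be streamed raw to RAID for post-processing.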

Automated test systems that leverage PCI Express, in its various implementations, are positioned to offer the highest performance and greatest flexibility, as well as low cost. For these reasons, PXI, as a modular instrumentation test platform, can easily become the default choice for many automated test and measurement applications.

4. The Proliferation of Mobile Devices

"Tablets and smartphones are becoming increasingly ubiquitous computing devices and we expect them to complement laptops and desktops when it comes to remote access to important data." - Jean-Claude Monney, Chief Technology Strategist for Microsoft US Discrete Industries.

Proliferation of mobile devices

As an avid reader of Radio-Electronics.com, you may already be familiar with the many articles that cover the development of mobile devices, from RF component design through to functional testing. To turn this around, it is also worth considering how the mobile device can be used by the test engineer as part of the test setup.

While tablets and smartphones cannot replace the ubiquitous PC or PC-based measurement platforms like PXI, they offer unique benefits when used as extensions to a test system. When the Nielsen Company surveyed consumers in 2011 to understand why they were using tablets instead of traditional PCs, the top reasons cited included user experience improvements such as superior portability, ease of use, faster start-up times and longer battery life.

For engineers, the PC is probably one of the most important engineering tools, so as mobile device adoption increases, engineers will likely transition to tablets and smartphones for many of the same reasons outlined by consumers in the Nielsen study.

The expected use cases for mobile devices within automated test include test system monitoring and control, and test data report viewing.

Perhaps, as an engineer, you need to remotely monitor and control your test rig so you can check on how things are going and look for alarm states while you travel between locations. You might be monitoring from across a room, across a building or across the world. The mobile device provides a secondary user interface to the test system: a tablet or smartphone can instantly display a wide variety of information related to a remote test system, or control its mode of operation.

Rather than interact with the test system directly, test engineers may want to consolidate test reports that characterise the results of previous tests and identify trends.

The explosion of mobile devices like tablets and smartphones provides compelling benefits to engineers, technicians and managers involved in automated test, who need remote access to test status information and results. Test organisations will need new expertise to unite the networking, web services and mobile app portions of the solution.
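As a rough sketch of the web-services portion of such a solution, the example below publishes a test station's status as JSON over HTTP using only the Python standard library, so a tablet or smartphone browser could poll it remotely. The station name, status fields and port are hypothetical and are not tied to any NI product or API.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical in-memory status of a test station; a real system would
# pull this from the test executive or a results database.
STATION_STATUS = {
    "station": "RF-Test-01",
    "state": "running",
    "units_tested": 128,
    "yield_percent": 97.6,
    "alarms": [],
}

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/status":
            body = json.dumps(STATION_STATUS).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    # A tablet or smartphone on the same network could poll
    # http://<station-address>:8080/status to view live status.
    HTTPServer(("", 8080), StatusHandler).serve_forever()
```

A production system would of course add authentication, HTTPS and a proper mobile front end, which is where the new networking and mobile-app expertise mentioned above comes in.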

5. Portable Measurement Algorithms

"With business needs demanding computing platforms beyond the venerable microprocessor, our familiar programming paradigms are struggling to keep pace. Providing tools that offer efficient design capture through a variety of models of computation, combined with ability to target multiple types of processing hardware, is a key goal for NI investment." - David Fuller, Vice President of Application and Embedded Software, National Instruments.

Portable measurement algorithms

Over the past 20 years, the concept of user-programmable, microprocessor-based measurement algorithms has become mainstream, allowing engineers to rapidly adapt to changing test requirements. This approach is called virtual instrumentation.

If the microprocessor initiated the virtual instrumentation revolution, then the field-programmable gate array (FPGA) is ushering in its next phase. FPGAs have been used in instruments for many years. For instance, today's high-bandwidth oscilloscopes collect so much data that it is impossible for users to quickly analyse all of it. Hardware-defined algorithms on these devices, often implemented on FPGAs, perform data analysis and reduction (averaging, waveform math and triggering), compute statistics (mean, standard deviation, maximum and minimum) and process the data for display, all to present the results to the user in a meaningful way. While these capabilities offer obvious value, there is lost potential in the closed nature of these FPGAs: in most cases, users cannot deploy their own custom measurement algorithms to this powerful processing hardware.
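To picture the kind of data reduction involved, the hedged NumPy sketch below applies block averaging and computes summary statistics on a captured waveform on the host processor, standing in for the sort of algorithm an instrument's FPGA would apply before presenting results; the waveform, block size and statistics chosen here are invented for the example.

```python
import numpy as np

def reduce_waveform(samples, block_size):
    """Block-average a waveform and compute summary statistics, mimicking
    the averaging/statistics stage an instrument FPGA might apply before
    data ever reaches the user."""
    usable = (len(samples) // block_size) * block_size
    blocks = samples[:usable].reshape(-1, block_size)
    averaged = blocks.mean(axis=1)            # data reduction by averaging
    stats = {
        "mean": float(samples.mean()),
        "std": float(samples.std()),
        "min": float(samples.min()),
        "max": float(samples.max()),
    }
    return averaged, stats

# Hypothetical captured waveform: a 10 kHz sine with additive noise
t = np.linspace(0.0, 1e-3, 100_000)
waveform = np.sin(2 * np.pi * 10e3 * t) + 0.1 * np.random.default_rng(1).standard_normal(t.size)

reduced, stats = reduce_waveform(waveform, block_size=100)
print(len(reduced), "averaged points;", stats)
```

On an instrument with an open, user-programmable FPGA, an algorithm like this, tailored to the DUT, could run in hardware at line rate rather than after the fact on the host.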

Open, user-programmable FPGAs on measurement hardware offer many advantages over processor-only systems. Because of their immense computational capabilities, FPGAs can deliver higher test throughput and greater test coverage, which reduces test time and capital expenditures. The low latency of FPGA measurements also provides the ability to implement tests that are not possible on a microprocessor alone. Their inherent parallelism offers true multisite test, even more so than with multicore processors. And finally, FPGAs can play a key role in real-time hardware sequencing and DUT control.

One example is ST-Ericsson, which manages communication protocols with FPGA-based RF instrumentation built on NI FlexRIO. NI FlexRIO, which contains a Xilinx Virtex-5 FPGA, requires an adapter module to access its digital I/O at high speeds. The team used the NI 6581 digital adapter module for FlexRIO, which can access I/O at up to 100 Mbit/s, and customised the socketed component-level IP (CLIP) node that interfaces with the NI 6581 to include a digital clock manager (DCM). This handles the external clock and provides the protocols and derived clocks required on the bench.

As the project advanced, they needed high-speed (1.4 Gbit/s) digital RF transfer, so they developed a custom adapter module using the NI FlexRIO Adapter Module Development Kit. This module replaces the NI 6581 and provides the differential RF channel (RX, TX and CLK) transfer speeds that were needed.

By reusing existing VHDL code, they were able to quickly implement a complex protocol without having to rewrite it in the LabVIEW FPGA single-cycle Timed Loop.

Hardware description language abstraction, high-level synthesis and support for different models of computation within development software will provide greater hardware abstraction and flexibility across execution targets, delivering higher performance, better cost effectiveness and shorter time to market.

The outlook

Developments in areas from business strategy to test architecture are shaping the evolution of automated testing. By adopting these strategies and technologies, businesses will be better positioned in years to come, with optimised test processes able to address the constraints and complexities of their DUTs and maintain a competitive edge over their rivals. Don't get left behind.

