Abstract
As the complexity of digital systems increases, existing simulation-based quantization approaches quickly become unaffordable due to exceedingly long simulation times. It is therefore necessary to develop optimized strategies aimed at significantly reducing the computation time required by the algorithms to find a valid solution (Clark et al., 2005; Hill, 2006). Interval-based computations are particularly well suited to reducing the number of simulations required to quantize a digital system, since they can evaluate a large number of numerical samples in a single interval-based simulation. This chapter presents a review of the most common interval-based computation techniques, as well as some experiments that show their application to the analysis and design of digital Linear Time Invariant (LTI) systems. One of the main features of these computations is that they can significantly reduce the number of simulations needed to characterize a digital system, at the expense of some additional complexity in the processing of each operation. On the other hand, one of the most important problems associated with these computations is interval oversizing (i.e., the computed bounds of the intervals are wider than required), so new descriptions and methods are continuously being proposed. Each description has its own features and drawbacks, making it suited to a different type of processing.
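The oversizing effect mentioned above can be illustrated with a minimal Python sketch (not taken from the chapter; intervals are assumed to be plain (lo, hi) tuples). It shows the classic dependency problem of naive interval arithmetic: subtracting an interval from itself yields bounds wider than the true range, which is exactly zero.

```python
# Hypothetical illustration of interval oversizing (dependency problem).
# Naive interval arithmetic treats both operands as independent, so the
# computed bounds can be wider than the true range of the expression.

def interval_sub(a, b):
    """Subtract interval b from interval a: [a_lo - b_hi, a_hi - b_lo]."""
    return (a[0] - b[1], a[1] - b[0])

x = (-1.0, 1.0)            # every sample x in [-1, 1] evaluated at once
print(interval_sub(x, x))  # -> (-2.0, 2.0), although x - x is exactly 0
```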
| Field | Value |
|---|---|
| International | Yes |
| Book edition | 1 |
| Publisher | INTECH Open Access Publisher |
| ISBN | 978-953-307-650-8 |
| Series | |
| Book title | Applications of Digital Signal Processing |
| From page | 279 |
| To page | 296 |