
Sparse and Redundant Representations: From Theory to Applications in Signal and Image Processing

This textbook introduces sparse and redundant representations with a focus on applications in signal and image processing. The theoretical and numerical foundations are tackled before the applications are discussed. Mathematical modeling for signal sources is discussed along with how to use the proper model for tasks such as denoising, restoration, separation, interpolation and extrapolation, compression, sampling, analysis and synthesis, detection, recognition, and more. The presentation is elegant and engaging. Sparse and Redundant Representations is intended for graduate students in applied mathematics and electrical engineering, as well as applied mathematicians, engineers, and researchers who are active in the fields of signal and image processing.

2010-11-13

Dimension Reduction: A Guided Tour

We give a tutorial overview of several geometric methods for dimension reduction. We divide the methods into projective methods and methods that model the manifold on which the data lies. For projective methods, we review projection pursuit, principal component analysis (PCA), kernel PCA, probabilistic PCA, canonical correlation analysis, oriented PCA, and several techniques for sufficient dimension reduction. For the manifold methods, we review multidimensional scaling (MDS), landmark MDS, Isomap, locally linear embedding, Laplacian eigenmaps and spectral clustering. The Nyström method, which links several of the manifold algorithms, is also reviewed. The goal is to provide a self-contained overview of key concepts underlying many of these algorithms, and to give pointers for further reading.
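Of the projective methods listed, PCA is the easiest to make concrete. A minimal sketch (my own illustration using NumPy, not code from the survey; the function name `pca` is hypothetical):

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                      # center each feature
    # Right singular vectors of the centered data are the principal axes
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                         # scores in the top-k subspace

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
Z = pca(X, 2)
print(Z.shape)  # (200, 2)
```

The SVD route avoids forming the covariance matrix explicitly, which is the usual numerically preferred formulation.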

2010-11-11

Geometric Partial Differential Equations and Image Analysis

This book provides an introduction to the use of geometric partial differential equations in image processing and computer vision. It brings a number of new concepts into the field, providing a very fundamental and formal approach to image processing. State-of-the-art practical results in a large number of real problems are achieved with the techniques described. Applications covered include image segmentation, shape analysis, image enhancement, and tracking. The volume provides information for people investigating new solutions to image processing problems as well as for people searching for existent advanced solutions.

2010-11-11

Moments and Moment Invariants in Pattern Recognition

Moments, as projections of an image's intensity onto a proper polynomial basis, can be applied to many different aspects of image processing, including invariant pattern recognition, image normalization, image registration, focus/defocus measurement, and watermarking. This book presents a survey of both recent and traditional image analysis and pattern recognition methods based on image moments, and offers new concepts of invariants to linear filtering and implicit invariants. In addition to the theory, attention is paid to efficient algorithms for moment computation in a discrete domain and to computational aspects of orthogonal moments. The authors also illustrate the theory through practical examples, demonstrating moment invariants in real applications across computer vision, remote sensing and medical imaging.

Key features:

• Presents a systematic review of the basic definitions and properties of moments, covering geometric moments and complex moments.
• Considers invariants to traditional transforms (translation, rotation, scaling, and affine transform) from a new point of view, which offers new possibilities for designing optimal sets of invariants.
• Reviews and extends a recent field of invariants with respect to convolution/blurring.
• Introduces implicit moment invariants as a tool for recognizing elastically deformed objects.
• Compares various classes of orthogonal moments (Legendre, Zernike, Fourier-Mellin, and Chebyshev, among others) and demonstrates their application to image reconstruction from moments.
• Offers comprehensive advice on the construction of various invariants, illustrated with practical examples.
• Includes an accompanying website providing efficient numerical algorithms for moment computation and for constructing invariants of various kinds, with about 250 slides suitable for a graduate university course.

Moments and Moment Invariants in Pattern Recognition is ideal for researchers and engineers involved in pattern recognition in medical imaging, remote sensing, and computer vision.
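For orientation, the geometric moments the book starts from are simple weighted sums over pixel coordinates: m_pq = Σ_x Σ_y x^p y^q I(x, y), and the centroid is (m10/m00, m01/m00). A toy sketch (my own illustration with NumPy; `raw_moment` is a hypothetical helper, not from the book):

```python
import numpy as np

def raw_moment(img, p, q):
    """Geometric moment m_pq = sum over pixels of x^p * y^q * I(y, x)."""
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    return float((x**p * y**q * img).sum())

img = np.zeros((5, 5))
img[2, 3] = 1.0                        # single bright pixel at (x=3, y=2)
m00 = raw_moment(img, 0, 0)            # total intensity ("mass")
cx = raw_moment(img, 1, 0) / m00       # centroid x
cy = raw_moment(img, 0, 1) / m00       # centroid y
print(cx, cy)  # 3.0 2.0
```

Central moments (taken about the centroid) are translation-invariant, which is the first step toward the invariants the book develops.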

2010-08-25

Nonlinear operators in image restoration

We first present a variational approach in which edges detected in the original image are preserved during restoration, and then compare the mathematical foundation of this method with some of the well-known methods recently proposed in the literature within the class of PDE-based algorithms (anisotropic diffusion, mean curvature motion, the min/max flow technique, ...). The performance of our approach is carefully examined and compared to the classical methods. Experimental results on synthetic and real images illustrate the capabilities of all the studied approaches.

2010-06-12

Machine Learning for Human Motion Analysis Theory and Practice

With the ubiquitous presence of video data and its increasing importance in a wide range of real-world applications, it is becoming increasingly necessary to automatically analyze and interpret object motions from large quantities of footage. Machine Learning for Human Motion Analysis: Theory and Practice highlights the development of robust and effective vision-based motion understanding systems. This advanced publication addresses a broad audience including practicing professionals working with specific vision applications such as surveillance, sport event analysis, healthcare, video conferencing, and motion video indexing and retrieval.

2010-06-08

Algebraic Geometry and Statistical Learning Theory

Sure to be influential, Watanabe's book lays the foundations for the use of algebraic geometry in statistical learning theory. Many models/machines are singular: mixture models, neural networks, HMMs, Bayesian networks, stochastic context-free grammars are major examples. The theory achieved here underpins accurate estimation techniques in the presence of singularities.

2010-06-07

Computer Vision (Shapiro 2000)

A textbook and reference for students and practitioners, presenting the necessary theory for work in fields where significant information must be extracted from images. Topics covered include databases and virtual and augmented reality, and the text includes more than 250 exercises and programming projects.

2010-04-23

Effective C# (Covers C# 4.0): 50 Specific Ways to Improve Your C#, 2nd Edition

C# has matured over the past decade: it's now a rich language with generics, functional programming concepts, and support for both static and dynamic typing. This palette of techniques provides great tools for many different idioms, but there are also many ways to make mistakes. In Effective C#, Second Edition, respected .NET expert Bill Wagner identifies fifty ways you can leverage the full power of the C# 4.0 language to express your designs concisely and clearly. Effective C#, Second Edition, follows a clear format that makes it indispensable to hundreds of thousands of developers: clear, practical explanations, expert tips, and plenty of realistic code examples. Drawing on his unsurpassed C# experience, Wagner addresses everything from types to resource management to dynamic typing to multicore support in the C# language and the .NET framework. Along the way, he shows how to avoid common pitfalls in the C# language and the .NET environment. You'll learn how to:

• Use both types of C# constants for efficiency and maintainability (see Item 2)
• Employ immutable data types to promote multicore processing (see Item 20)
• Minimize garbage collection, boxing, and unboxing (see Items 16 and 45)
• Take full advantage of interfaces and delegates (see Items 22 through 25)
• Make the most of the parallel framework (see Items 35 through 37)
• Use duck typing in C# (see Item 38)
• Spot the advantages of the dynamic and Expression types over reflection (see Items 42 and 43)
• Assess why query expressions are better than loops (see Item 8)
• Understand how generic covariance and contravariance affect your designs (see Item 29)
• See how optional parameters can minimize the number of method overloads (see Item 10)

You're already a successful C# programmer; this book will help you become an outstanding one.

2010-04-20

Continuous Primal-Dual Methods for Image Processing

In this article we study a continuous Primal-Dual method proposed by Appleton and Talbot and generalize it to other problems in image processing. We interpret it as an Arrow-Hurwicz method which leads to a better description of the system of PDEs obtained. We show existence and uniqueness of solutions and get a convergence result for the denoising problem. Our analysis also yields new a posteriori estimates.

2010-04-02

The Mathematics of Medical Imaging

A Beginner's Guide to the Mathematics of Medical Imaging presents the basic mathematics of computerized tomography – the CT scan – for an audience of undergraduates in mathematics and engineering. Assuming no prior background in advanced mathematical analysis, topics such as the Fourier transform, sampling, and discrete approximation algorithms are introduced from scratch and are developed within the context of medical imaging. A chapter on magnetic resonance imaging focuses on manipulation of the Bloch equation, the system of differential equations that is the foundation of this important technology. The text is self-contained with a range of practical exercises, topics for further study, and an ample bibliography, making it ideal for use in an undergraduate course in applied or engineering mathematics, or by practitioners in radiology who want to know more about the mathematical foundations of their field.

2010-04-02

Algorithms and Theory of Computation Handbook, Volume II: Special Topics and Techniques, Second Edition (CRC Press, 2009)

Algorithms and Theory of Computation Handbook, Second Edition: Special Topics and Techniques provides an up-to-date compendium of fundamental computer science topics and techniques. It also illustrates how the topics and techniques come together to deliver efficient solutions to important practical problems. Along with updating and revising many of the existing chapters, this second edition contains more than 15 new chapters. This edition now covers self-stabilizing and pricing algorithms as well as the theories of privacy and anonymity, databases, computational games, and communication networks. It also discusses computational topology, natural language processing, and grid computing and explores applications in intensity-modulated radiation therapy, voting, DNA research, systems biology, and financial derivatives. This best-selling handbook continues to help computer professionals and engineers find significant information on various algorithmic topics. The expert contributors clearly define the terminology, present basic results and techniques, and offer a number of current references to the in-depth literature. They also provide a glimpse of the major research issues concerning the relevant topics.

2010-03-30

Principles and Theory for Data Mining and Machine Learning

This book is a thorough introduction to the most important topics in data mining and machine learning. It begins with a detailed review of classical function estimation and proceeds with chapters on nonlinear regression, classification, and ensemble methods. The final chapters focus on clustering, dimension reduction, variable selection, and multiple comparisons. All these topics have undergone extraordinarily rapid development in recent years and this treatment offers a modern perspective emphasizing the most recent contributions. The presentation of foundational results is detailed and includes many accessible proofs not readily available outside original sources. While the orientation is conceptual and theoretical, the main points are regularly reinforced by computational comparisons. Intended primarily as a graduate-level textbook for statistics, computer science, and electrical engineering students, this book assumes only a strong foundation in undergraduate statistics and mathematics, and facility with using R packages. The text has a wide variety of problems, many of an exploratory nature. There are numerous computed examples, complete with code, so that further computations can be carried out readily. The book also serves as a handbook for researchers who want a conceptual overview of the central topics in data mining and machine learning.

Bertrand Clarke is a Professor of Statistics in the Department of Medicine, Department of Epidemiology and Public Health, and the Center for Computational Sciences at the University of Miami. He has been on the Editorial Board of the Journal of the American Statistical Association, the Journal of Statistical Planning and Inference, and Statistical Papers. He is co-winner, with Andrew Barron, of the 1990 Browder J. Thompson Prize from the Institute of Electrical and Electronic Engineers. Ernest Fokoue is an Assistant Professor of Statistics at Kettering University. He has also taught at Ohio State University and been a long-term visitor at the Statistical and Applied Mathematical Sciences Institute, where he was a Post-doctoral Research Fellow in the Data Mining and Machine Learning Program. In 2000, he was the winner of the Young Researcher Award from the International Association for Statistical Computing. Hao Helen Zhang is an Associate Professor of Statistics in the Department of Statistics at North Carolina State University. For 2003-2004, she was a Research Fellow at SAMSI and in 2007, she won a Faculty Early Career Development Award from the National Science Foundation. She is on the Editorial Board of the Journal of the American Statistical Association and Biometrics.

2010-03-10

Continuous-Time Markov Decision Processes

Continuous-time Markov decision processes (MDPs), also known as controlled Markov chains, are used for modeling decision-making problems that arise in operations research (for instance, inventory, manufacturing, and queueing systems), computer science, communications engineering, control of populations (such as fisheries and epidemics), and management science, among many other fields. This volume provides a unified, systematic, self-contained presentation of recent developments on the theory and applications of continuous-time MDPs. The MDPs in this volume include most of the cases that arise in applications, because they allow unbounded transition and reward/cost rates. Much of the material appears for the first time in book form.

2010-03-01

Windows Server 2008 R2 Unleashed (Sams, 2010)

Windows Server 2008 R2 Unleashed is the most comprehensive and realistic guide to planning, design, prototyping, implementation, migration, administration, and support. Based on the authors' unsurpassed experience working with Windows Server 2008 R2 since its earliest alpha releases, it offers indispensable guidance drawn from hundreds of production environments. Microsoft MVP Rand Morimoto and his colleagues systematically introduce Windows Server 2008 R2 to IT professionals, identifying R2's most crucial enhancements and walking through every step of installation and configuration. Next, they present comprehensive coverage of every area of Windows Server 2008 R2, including Active Directory, networking services, security, R2 migration from Windows Server 2003 and 2008, administration, fault tolerance, optimization and troubleshooting, core application services, and more. The authors thoroughly address major Windows Server 2008 R2 enhancements and present extensive coverage of R2 innovations ranging from Hyper-V virtualization to DirectAccess and the enhancements in Failover Clustering. Every chapter contains tips, tricks, and best practices learned from actual deployments: practical information for using Windows Server 2008 R2 to solve real business problems. Detailed information on how to:

• Plan and migrate from Windows Server 2003/2008 to Windows Server 2008 R2 and use R2's new server migration tools
• Manage Active Directory with Active Directory Administrative Center, Best Practice Analyzer, and PowerShell scripts
• Use R2's updated security tools and templates to lock down servers, clients, and networks
• Maximize availability with Windows Server 2008 R2 clustering, fault tolerance, and replication
• Streamline client management with new Group Policy ADMX settings and management tools
• Improve remote access using DirectAccess, Remote Desktop Services (formerly Terminal Services), and Virtual Desktop Infrastructure
• Implement Hyper-V virtualization, including the built-in Live Migration technology
• Leverage add-ons such as Windows SharePoint Services, Windows Media Services, and IIS 7.5

2010-02-25

Ant Colony Optimization and Swarm Intelligence (ANTS 2006)

This book constitutes the refereed proceedings of the 5th International Workshop on Ant Colony Optimization and Swarm Intelligence, ANTS 2006, held in Brussels, Belgium, in September 2006. The 27 revised full papers, 23 revised short papers, and 12 extended abstracts presented were carefully reviewed and selected from 115 submissions. The papers are devoted to theoretical and foundational aspects of ant algorithms, evolutionary optimization, ant colony optimization, and swarm intelligence, and deal with a broad variety of optimization applications in networking, operations research, multiagent systems, robot systems, etc.

2010-02-02

Windows Internals, 5th Edition (Microsoft Press, 2009)

Get the architectural perspectives and inside details you need to understand how Windows operates. See how the core components of the Windows operating system work behind the scenes, guided by a team of internationally renowned internals experts. Fully updated for Windows Server 2008 and Windows Vista, this classic guide delivers key architectural insights on system design, debugging, performance, and support, along with hands-on experiments to experience Windows internal behavior firsthand. Delve inside Windows architecture and internals:

• Understand how the core system and management mechanisms work, from the object manager to services to the registry
• Explore internal system data structures using tools like the kernel debugger
• Grasp the scheduler's priority and CPU placement algorithms
• Go inside the Windows security model to see how it authorizes access to data
• Understand how Windows manages physical and virtual memory
• Tour the Windows networking stack from top to bottom, including APIs, protocol drivers, and network adapter drivers
• Troubleshoot file-system access problems and system boot problems
• Learn how to analyze crashes

2010-01-30

Genetic Programming: On the Programming of Computers by Means of Natural Selection, by John R. Koza

Genetic programming may be more powerful than neural networks and other machine learning techniques, able to solve problems in a wider range of disciplines. In this ground-breaking book, John Koza shows how this remarkable paradigm works and provides substantial empirical evidence that solutions to a great variety of problems from many different fields can be found by genetically breeding populations of computer programs. Genetic Programming contains a great many worked examples and includes sample computer code that will allow readers to run their own programs. In getting computers to solve problems without being explicitly programmed, Koza stresses two points: that seemingly different problems from a variety of fields can be reformulated as problems of program induction, and that the recently developed genetic programming paradigm provides a way to search the space of possible computer programs for a highly fit individual computer program to solve the problems of program induction. Good programs are found by evolving them in a computer against a fitness measure instead of by sitting down and writing them. John R. Koza is Consulting Associate Professor in the Computer Science Department at Stanford University.

2009-12-29

Geometric Description of Images as Topographic Maps (Lecture Notes in Mathematics) By Vicent Caselles, Pascal Monasse

This volume discusses the basic geometric contents of an image and presents a tree data structure to handle those contents efficiently. The nodes of the tree are derived from connected components of level sets of the intensity, while the edges represent inclusion information. Grain filters, morphological operators simplifying these geometric contents, are analyzed and several applications to image comparison and registration, and to edge and corner detection, are presented. The mathematically inclined reader may be most interested in Chapters 2 to 6, which generalize the topological Morse description to continuous or semicontinuous functions, while mathematical morphologists may more closely consider grain filters in Chapter 3. Computer scientists will find algorithmic considerations in Chapters 6 and 7, the full justification of which may be found in Chapters 2 and 4 respectively. Lastly, all readers can learn more about the motivation for this work in the image processing applications presented in Chapter 8.

2009-12-28

Your Research Project: A Step-by-Step Guide for the First-Time Researcher

In this new edition of Your Research Project, Nicholas S.R. Walliman has made this bestselling book even better with the addition of a number of new features whilst retaining all the benefits of the original. New features include: more elaboration on the differing needs of masters and PhD students; a new overview of the entire research chronology from start to finish; student checklists throughout; a new chapter on research ethics; new sections on critical reading skills and compiling literature reviews; examples from a wide range of disciplines and a student glossary.

2009-12-24

Variable-length Codes for Data Compression

Most data compression methods that are based on variable-length codes employ the Huffman or Golomb codes. However, there are a large number of less-known codes that have useful properties, such as those containing certain bit patterns, or those that are robust, and these can be useful. This book brings this large set of codes to the attention of workers in the field and to students of computer science. David Salomon's clear style of writing and presentation, which has been familiar to readers for many years now, allows easy access to this topic. This comprehensive text offers readers a detailed, reader-friendly description of the variable-length codes used in the field of data compression. Readers are only required to have a general familiarity with computer methods and essentially an understanding of the representation of data in bits and files.

Topics and features:

• Discusses codes in depth, not the compression algorithms, which are readily available in many books
• Includes detailed illustrations, providing readers with a deeper and broader understanding of the topic
• Provides a supplementary author-maintained website, with errata and auxiliary material: www.davidsalomon.name/VLCadvertis/VLC.html
• Easily understood and used by computer science majors requiring only a minimum of mathematics
• Can easily be used as a main or auxiliary textbook for courses on algebraic codes or data compression and protection
• An ideal companion volume to David Salomon's fourth edition of Data Compression: The Complete Reference

Computer scientists, electrical engineers and students majoring in computer science or electrical engineering will find this volume a valuable resource, as will readers in various physical sciences and mathematics. David Salomon is a professor emeritus of Computer Science at California State University, Northridge. He has authored numerous articles and books, including Coding for Data and Computer Communications, Guide to Data Compression Methods, Data Privacy and Security, Computer Graphics and Geometric Modeling, Foundations of Computer Security, and Transformations and Projections in Computer Graphics.

2009-12-24

Fundamentals of Statistical Signal Processing: Estimation Theory (Kay, Prentice Hall)

A unified presentation of parameter estimation for those involved in the design and implementation of statistical signal processing algorithms. Covers important approaches to obtaining an optimal estimator and analyzing its performance, and includes numerous examples as well as applications to real-world problems. MARKETS: For practicing engineers and scientists who design and analyze signal processing systems, i.e., who extract information from noisy signals: radar engineers, sonar engineers, geophysicists, oceanographers, biomedical engineers, communications engineers, economists, statisticians, physicists, etc.
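A tiny simulation in the spirit of the book's running example (my own sketch, assuming NumPy): for a constant A observed in white Gaussian noise, the sample mean is the maximum-likelihood estimator, and its variance approaches the Cramér-Rao lower bound σ²/N:

```python
import numpy as np

rng = np.random.default_rng(1)
A, sigma2, N, trials = 3.0, 4.0, 50, 2000

# Each row is one experiment of N noisy observations of A;
# the MLE of A in white Gaussian noise is the sample mean.
est = rng.normal(A, np.sqrt(sigma2), size=(trials, N)).mean(axis=1)

crlb = sigma2 / N                 # Cramer-Rao lower bound on the variance
print(est.var(), crlb)            # empirical variance should be close to 0.08
```

The sample mean is efficient here (it attains the bound), which is why the empirical variance matches the CRLB rather than merely exceeding it.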

2009-12-16

Hebbian Learning and Negative Feedback Networks

The central idea of Hebbian Learning and Negative Feedback Networks is that artificial neural networks using negative feedback of activation can use simple Hebbian learning to self-organise so that they uncover interesting structures in data sets. Two variants are considered: the first uses a single stream of data to self-organise. By changing the learning rules for the network, it is shown how to perform Principal Component Analysis, Exploratory Projection Pursuit, Independent Component Analysis, Factor Analysis and a variety of topology-preserving mappings for such data sets. The second variant uses two input data streams on which it self-organises. In its basic form, these networks are shown to perform Canonical Correlation Analysis, the statistical technique which finds those filters onto which projections of the two data streams have greatest correlation. The book encompasses a wide range of real experiments and shows how the approaches it formulates can be applied to the analysis of real problems.
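The flavor of such self-organising rules can be sketched with Oja's rule, a Hebbian update with an implicit weight-decay term that converges to the first principal component. This is my own minimal illustration of the PCA case mentioned above, not code from the book:

```python
import numpy as np

rng = np.random.default_rng(0)
# 2-D data stretched along the x-axis, so the first PC is approximately [1, 0]
X = rng.normal(size=(5000, 2)) * np.array([2.0, 0.5])

w = rng.normal(size=2)
eta = 0.005
for x in X:
    y = w @ x                     # neuron output
    w += eta * y * (x - y * w)    # Hebbian term eta*y*x minus decay eta*y^2*w

w /= np.linalg.norm(w)
print(np.abs(w))                  # close to [1, 0]
```

The decay term keeps the weight vector bounded, so no explicit renormalisation is needed during learning; the final normalisation is cosmetic.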

2009-12-01

Pattern Recognition and Image Analysis: 4th Iberian Conference, IbPRIA 2009

This volume constitutes the refereed proceedings of the 4th Iberian Conference on Pattern Recognition and Image Analysis, IbPRIA 2009, held in Póvoa de Varzim, Portugal in June 2009. The 33 revised full papers and 29 revised poster papers presented together with 3 invited talks were carefully reviewed and selected from 106 submissions. The papers are organized in topical sections on computer vision, image analysis and processing, as well as pattern recognition.

2009-12-01

2D Object Detection and Recognition Models, Algorithms, and Networks

Two important subproblems of computer vision are the detection and recognition of 2D objects in gray-level images. This book discusses the construction and training of models, computational approaches to efficient implementation, and parallel implementations in biologically plausible neural network architectures. The approach is based on statistical modeling and estimation, with an emphasis on simplicity, transparency, and computational efficiency. The book describes a range of deformable template models, from coarse sparse models involving discrete, fast computations to more finely detailed models based on continuum formulations, involving intensive optimization. Each model is defined in terms of a subset of points on a reference grid (the template), a set of admissible instantiations of these points (deformations), and a statistical model for the data given a particular instantiation of the object present in the image. A recurring theme is a coarse to fine approach to the solution of vision problems. The book provides detailed descriptions of the algorithms used as well as the code, and the software and data sets are available on the Web.

2009-11-25

A Wavelet Tour of Signal Processing: the Sparse Way

The new edition of this classic book gives all the major concepts, techniques and applications of sparse representation, reflecting the key role the subject plays in today’s signal processing. The book clearly presents the standard representations with Fourier, wavelet and time-frequency transforms, and the construction of orthogonal bases with fast algorithms. The central concept of sparsity is explained and applied to signal compression, noise reduction, and inverse problems, while coverage is given to sparse representations in redundant dictionaries, super-resolution and compressive sensing applications.
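The central sparsity idea (most wavelet coefficients of a piecewise-smooth signal are near zero, so shrinking small coefficients removes mostly noise) can be sketched with a single-level Haar transform. This is my own minimal illustration with NumPy, not code from the book:

```python
import numpy as np

def haar_level(x):
    """One level of the orthonormal Haar transform (len(x) must be even)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def inv_haar_level(a, d):
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

rng = np.random.default_rng(0)
clean = np.repeat([0.0, 4.0, 0.0], 16)         # piecewise-constant signal
noisy = clean + rng.normal(0, 0.5, clean.size)

a, d = haar_level(noisy)
d = np.sign(d) * np.maximum(np.abs(d) - 1.0, 0)  # soft-threshold the details
denoised = inv_haar_level(a, d)

print(np.mean((denoised - clean) ** 2) < np.mean((noisy - clean) ** 2))
```

Because the clean signal is piecewise constant, its detail coefficients are essentially zero; thresholding therefore discards noise energy while leaving the signal almost untouched, which is the sparsity argument in miniature.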

2009-10-21

Horn & Johnson, Topics in Matrix Analysis (CUP 1991)

Building on the foundations of its predecessor volume, Matrix Analysis, this book treats in detail several topics with important applications and of special mathematical interest in matrix theory not included in the previous text. These topics include the field of values, stable matrices and inertia, singular values, matrix equations and Kronecker products, Hadamard products, and matrices and functions. The authors assume a background in elementary linear algebra and knowledge of rudimentary analytical concepts. The book should be welcomed by graduate students and researchers in a variety of mathematical fields both as an advanced text and as a modern reference work.
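One of the listed topics, Kronecker products, is easy to probe numerically via the classical identity vec(AXB) = (Bᵀ ⊗ A) vec(X), where vec stacks columns. A quick NumPy check (my own illustration, not from the book):

```python
import numpy as np

rng = np.random.default_rng(0)
A, X, B = (rng.normal(size=(3, 3)) for _ in range(3))

# vec(AXB) = (B^T kron A) vec(X), with column-major (Fortran-order) vec
lhs = (A @ X @ B).flatten(order="F")
rhs = np.kron(B.T, A) @ X.flatten(order="F")

print(np.allclose(lhs, rhs))  # True
```

The identity is what turns matrix equations such as AX + XB = C into ordinary linear systems, one of the applications the book treats in detail.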

2009-10-19

Horn, R. A. &amp; Johnson, C. R., Matrix Analysis (CUP 1990)

Linear algebra and matrix theory have long been fundamental tools in mathematical disciplines as well as fertile fields for research. In this book the authors present classical and recent results of matrix analysis that have proved to be important to applied mathematics. Facts about matrices, beyond those found in an elementary linear algebra course, are needed to understand virtually any area of mathematical science, but the necessary material has appeared only sporadically in the literature and in university curricula. As interest in applied mathematics has grown, the need for a text and reference offering a broad selection of topics in matrix theory has become apparent, and this book meets that need. This volume reflects two concurrent views of matrix analysis. First, it encompasses topics in linear algebra that have arisen out of the needs of mathematical analysis. Second, it is an approach to real and complex linear algebraic problems that does not hesitate to use notions from analysis. Both views are reflected in its choice and treatment of topics.

2009-10-19

Wavelet Theory and Its Application to Pattern Recognition (World Scientific)

This is not a purely mathematical text. It presents the basic principle of wavelet theory to electrical and electronic engineers, computer scientists, and students, as well as the ideas of how wavelets can be applied to pattern recognition. It also contains many research results from the authors' research team.

2009-09-30

Toeplitz and Circulant Matrices: A Review

The fundamental theorems on the asymptotic behavior of eigenvalues, inverses, and products of banded Toeplitz matrices and Toeplitz matrices with absolutely summable elements are derived in a tutorial manner. Mathematical elegance and generality are sacrificed for conceptual simplicity and insight in the hope of making these results available to engineers lacking either the background or endurance to attack the mathematical literature on the subject. By limiting the generality of the matrices considered, the essential ideas and results can be conveyed in a more intuitive manner without the mathematical machinery required for the most general cases. As an application, the results are applied to the study of the covariance matrices, and their factors, of linear models of discrete-time random processes.
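The circulant case is the cleanest instance of these eigenvalue results: a circulant matrix is diagonalized by the DFT, so its eigenvalues are exactly the DFT of its first column. A quick numerical check (my own sketch with NumPy, not from the review):

```python
import numpy as np

c = np.array([4.0, 1.0, 0.0, 1.0])            # first column of a circulant
# Build the circulant C: column j is c cyclically shifted down by j
C = np.column_stack([np.roll(c, j) for j in range(4)])

eig_fft = np.fft.fft(c)                       # DFT of the first column
eig_np = np.linalg.eigvals(C)

print(np.allclose(np.sort_complex(eig_fft), np.sort_complex(eig_np)))  # True
```

Toeplitz matrices are only asymptotically equivalent to circulants, which is precisely why the review's theorems are asymptotic rather than exact.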

2009-09-27

Dan Brown - The Lost Symbol

In this stunning follow-up to the global phenomenon The Da Vinci Code, Dan Brown demonstrates once again why he is the world’s most popular thriller writer. The Lost Symbol is a masterstroke of storytelling--a deadly race through a real-world labyrinth of codes, secrets, and unseen truths . . . all under the watchful eye of Brown’s most terrifying villain to date. Set within the hidden chambers, tunnels, and temples of Washington, D.C., The Lost Symbol accelerates through a startling landscape toward an unthinkable finale. As the story opens, Harvard symbologist Robert Langdon is summoned unexpectedly to deliver an evening lecture in the U.S. Capitol Building. Within minutes of his arrival, however, the night takes a bizarre turn. A disturbing object--artfully encoded with five symbols--is discovered in the Capitol Building. Langdon recognizes the object as an ancient invitation . . . one meant to usher its recipient into a long-lost world of esoteric wisdom. When Langdon’s beloved mentor, Peter Solomon--a prominent Mason and philanthropist--is brutally kidnapped, Langdon realizes his only hope of saving Peter is to accept this mystical invitation and follow wherever it leads him. Langdon is instantly plunged into a clandestine world of Masonic secrets, hidden history, and never-before-seen locations--all of which seem to be dragging him toward a single, inconceivable truth. As the world discovered in The Da Vinci Code and Angels & Demons, Dan Brown’s novels are brilliant tapestries of veiled histories, arcane symbols, and enigmatic codes. In this new novel, he again challenges readers with an intelligent, lightning-paced story that offers surprises at every turn. The Lost Symbol is exactly what Brown’s fans have been waiting for . . . his most thrilling novel yet.

2009-09-18

An Introduction to Statistical Signal Processing

This volume describes the essential tools and techniques of statistical signal processing. At every stage, theoretical ideas are linked to specific applications in communications and signal processing. The book begins with an overview of basic probability, random objects, expectation, and second-order moment theory, followed by a wide variety of examples of the most popular random process models and their basic uses and properties. Specific applications to the analysis of random signals and systems for communicating, estimating, detecting, modulating, and other processing of signals are interspersed throughout the text.

2009-09-13

New Directions in Statistical Signal Processing: From Systems to Brain (S. Haykin et al., eds.)

Signal processing and neural computation have separately and significantly influenced many disciplines, but the cross-fertilization of the two fields has begun only recently. Research now shows that each has much to teach the other, as we see highly sophisticated kinds of signal processing and elaborate hierarchical levels of neural computation performed side by side in the brain. In New Directions in Statistical Signal Processing, leading researchers from both signal processing and neural computation present new work that aims to promote interaction between the two disciplines. The book's 14 chapters, almost evenly divided between signal processing and neural computation, begin with the brain and move on to communication, signal processing, and learning systems. They examine such topics as how computational models help us understand the brain's information processing, how an intelligent machine could solve the "cocktail party problem" with "active audition" in a noisy environment, graphical and network structure modeling approaches, uncertainty in network communications, the geometric approach to blind signal processing, game-theoretic learning algorithms, and observable operator models (OOMs) as an alternative to hidden Markov models (HMMs).

2009-09-13

Fractal Image Compression: Theory and Application

This book presents the theory and application of new methods of image compression based on self-transformations of an image. These methods lead to a representation of an image as a fractal, an object with detail at all scales. Very practical and completely up-to-date, this book will serve as a useful reference for those working in image processing and encoding and as a great introduction for those unfamiliar with fractals. The book begins with an elementary introduction to the concept of fractal image compression and contains a rigorous description of all the relevant mathematics of the subject.
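The self-transformation idea can be demonstrated with the classic "chaos game": repeatedly applying randomly chosen contractive maps drives any starting point onto the maps' fractal attractor, here the Sierpinski triangle. This only illustrates iterated function systems in general, not the partitioned-IFS scheme actual fractal coders use to match domain blocks to range blocks:

```python
import random

# Each map pulls the current point halfway toward one fixed vertex;
# the attractor of this IFS is the Sierpinski triangle.
vertices = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]
rng = random.Random(0)
x, y = 0.25, 0.25
points = []
for _ in range(10000):
    vx, vy = rng.choice(vertices)
    x, y = (x + vx) / 2.0, (y + vy) / 2.0
    points.append((x, y))

# Contractivity keeps every iterate inside the bounding box of the vertices.
assert all(0.0 <= px <= 1.0 and 0.0 <= py <= 1.0 for px, py in points)
```

Compression enters when the process is run in reverse: instead of being given the maps, the encoder searches for contractive transformations whose attractor approximates the given image, and stores only the transformations.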

2009-09-09

Image Analysis, Random Fields and Dynamic Monte Carlo Methods

This second edition of G. Winkler's successful book on random field approaches to image analysis, related Markov Chain Monte Carlo methods, and statistical inference with emphasis on Bayesian image analysis concentrates more on general principles and models and less on details of concrete applications. Addressed to students and scientists from mathematics, statistics, physics, engineering, and computer science, it will serve as an introduction to the mathematical aspects rather than a survey. Basically no prior knowledge of mathematics or statistics is required. The second edition is in many parts completely rewritten and improved, and most figures are new. The topics of exact sampling and global optimization of likelihood functions have been added. This second edition comes with a CD-ROM by F. Friedrich, containing a host of (live) illustrations for each chapter. In an interactive environment, readers can perform their own experiments to consolidate the subject.
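The workhorse of this MCMC machinery can be sketched in a few lines: a single-site Gibbs sampler for the Ising model, the simplest Markov random field used in Bayesian image analysis. This is a generic illustration; the parameter values and grid size are my own choices, not the book's.

```python
import math
import random

# Ising model on an n x n grid: P(x) ∝ exp(beta * Σ x_i x_j over neighbor
# pairs), with spins x_i ∈ {-1, +1}.  A Gibbs sweep resamples each site
# from its exact conditional distribution given its neighbors.
def gibbs_sweep(x, beta, rng):
    n = len(x)
    for i in range(n):
        for j in range(n):
            s = sum(x[a][b]
                    for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                    if 0 <= a < n and 0 <= b < n)
            p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * s))  # P(x_ij = +1 | rest)
            x[i][j] = 1 if rng.random() < p_plus else -1

rng = random.Random(0)
n, beta = 8, 0.8
x = [[rng.choice([-1, 1]) for _ in range(n)] for _ in range(n)]
for _ in range(200):
    gibbs_sweep(x, beta, rng)

# Well above the critical coupling, neighboring sites mostly agree.
agree = sum(x[i][j] == x[i][j + 1] for i in range(n) for j in range(n - 1))
```

In Bayesian image analysis the same sampler is run on the posterior of a clean image given noisy data, with the Ising (or Potts) term acting as a smoothness prior.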

2009-09-09

Methods of Information Geometry

Information geometry provides the mathematical sciences with a new framework of analysis. It has emerged from the investigation of the natural differential geometric structure on manifolds of probability distributions, which consists of a Riemannian metric defined by the Fisher information and a one-parameter family of affine connections called the $\alpha$-connections. The duality between the $\alpha$-connection and the $(-\alpha)$-connection together with the metric play an essential role in this geometry. This kind of duality, having emerged from manifolds of probability distributions, is ubiquitous, appearing in a variety of problems which might have no explicit relation to probability theory. Through the duality, it is possible to analyze various fundamental problems in a unified perspective. The first half of this book is devoted to a comprehensive introduction to the mathematical foundation of information geometry, including preliminaries from differential geometry, the geometry of manifolds of probability distributions, and the general theory of dual affine connections. The second half of the text provides an overview of wide areas of applications, such as statistics, linear systems, information theory, quantum mechanics, convex analysis, neural networks, and affine differential geometry. The book will serve as a suitable text for a topics course for advanced undergraduates and graduate students.
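The Fisher metric is concrete enough to check by hand. For the one-parameter Bernoulli family Ber(p), the definition g(p) = E[(∂/∂p log f(x; p))²] reduces to the closed form 1/(p(1−p)); the sketch below (a standalone illustration, not code from the book) verifies the two agree:

```python
# Fisher information of Ber(p), computed from the definition:
# the score d/dp log f(x; p) is 1/p at x = 1 and -1/(1-p) at x = 0,
# so g(p) = p * (1/p)^2 + (1-p) * (1/(1-p))^2 = 1 / (p * (1-p)).
def fisher_bernoulli(p):
    score_1 = 1.0 / p            # score at x = 1
    score_0 = -1.0 / (1.0 - p)   # score at x = 0
    return p * score_1 ** 2 + (1.0 - p) * score_0 ** 2

for p in (0.1, 0.3, 0.5, 0.9):
    assert abs(fisher_bernoulli(p) - 1.0 / (p * (1.0 - p))) < 1e-12
```

Note how the metric blows up as p approaches 0 or 1: near the boundary, small changes in p are statistically easy to distinguish, and the geometry records that.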

2009-09-01

Algorithmic Cryptanalysis (CRC Press, 2009)

Illustrating the power of algorithms, Algorithmic Cryptanalysis describes algorithmic methods with cryptographically relevant examples. Focusing on both private- and public-key cryptographic algorithms, it presents each algorithm as a textual description, in pseudo-code, or as a C program. Divided into three parts, the book begins with a short introduction to cryptography and a background chapter on elementary number theory and algebra. It then moves on to algorithms, with each chapter in this section dedicated to a single topic and often illustrated with simple cryptographic applications. The final part addresses more sophisticated cryptographic applications, including LFSR-based stream ciphers and index calculus methods. Accounting for the impact of current computer architectures, this book explores the algorithmic and implementation aspects of cryptanalysis methods. It can serve as a handbook of algorithmic methods for cryptographers as well as a textbook for undergraduate and graduate courses on cryptanalysis and cryptography.
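To make the LFSR topic concrete, here is a toy 4-bit Fibonacci LFSR (the book's examples are in pseudo-code or C; this Python sketch is mine, not the book's). The feedback taps correspond to the primitive polynomial x⁴ + x³ + 1, so the register cycles through all 15 nonzero states before repeating; in a stream cipher the output bits would be XORed with the plaintext, and cryptanalysis exploits the register's linearity.

```python
# Fibonacci LFSR over GF(2): shift right, feed the XOR of the tapped
# bits (bits 0 and 1, i.e. x^4 + x^3 + 1) back in as the new high bit.
def lfsr_step(state, nbits=4, taps=(0, 1)):
    fb = 0
    for t in taps:
        fb ^= (state >> t) & 1
    return (state >> 1) | (fb << (nbits - 1))

state, keystream = 0b1001, []
for _ in range(15):
    keystream.append(state & 1)   # output bit (XORed with plaintext in a cipher)
    state = lfsr_step(state)

assert state == 0b1001            # maximal period: back to the seed after 15 steps
```

A maximal-length sequence is balanced up to one bit (here 8 ones and 7 zeros per period), yet a single LFSR is trivially breakable precisely because it is linear, which is why real designs combine several registers through a nonlinear function.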

2009-08-27

An Introduction to Mathematical Models in Ecology and Evolution: Time and Space, 2nd Edition (Wiley, 2009)

Students often find it difficult to grasp fundamental ecological and evolutionary concepts because of their inherently mathematical nature. Likewise, the application of ecological and evolutionary theory often requires a high degree of mathematical competence. This book is a first step to addressing these difficulties, providing a broad introduction to the key methods and underlying concepts of mathematical models in ecology and evolution. The book is intended to serve the needs of undergraduate and postgraduate ecology and evolution students who need to access the mathematical and statistical modelling literature essential to their subjects. The book assumes minimal mathematics and statistics knowledge whilst covering a wide variety of methods, many of which are at the forefront of ecological and evolutionary research. The book also highlights the applications of modelling to practical problems such as sustainable harvesting and biological control.
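One of the practical problems mentioned, sustainable harvesting, has a textbook one-line model: logistic growth with a constant harvest rate, dN/dt = rN(1 − N/K) − H, which admits a positive equilibrium only while H stays below the maximum sustainable yield rK/4. The simulation below is a generic illustration with made-up parameters, not the book's code:

```python
# Euler integration of dN/dt = r*N*(1 - N/K) - H.
# With r = 1 and K = 100 the maximum sustainable yield is r*K/4 = 25.
def simulate(H, r=1.0, K=100.0, N0=80.0, dt=0.01, steps=20000):
    N = N0
    for _ in range(steps):
        N += dt * (r * N * (1.0 - N / K) - H)
        if N <= 0.0:
            return 0.0            # population collapsed
    return N

assert simulate(H=20.0) > 40.0    # below MSY: settles at a positive equilibrium
assert simulate(H=30.0) == 0.0    # above MSY: growth can never offset the harvest
```

The qualitative lesson carries over to far richer models: pushing the harvest past the yield curve's peak does not just shrink the stock, it removes the stable equilibrium entirely.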

2009-08-27

Models for Probability and Statistical Inference

This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses modes of convergence of sequences of random variables, with special attention to convergence in distribution. The second half of the book addresses statistical inference, beginning with a discussion on point estimation and followed by coverage of consistency and confidence intervals. Further areas of exploration include: distributions defined in terms of the multivariate normal, chi-square, t, and F (central and non-central); the one- and two-sample Wilcoxon test, together with methods of estimation based on both; linear models with a linear space-projection approach; and logistic regression. Each section contains a set of problems ranging in difficulty from simple to more complex, and selected answers as well as proofs to almost all statements are provided. An abundance of figures, in addition to helpful simulations and graphs produced by the statistical package S-Plus®, is included to help build the intuition of readers.
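Since convergence in distribution gets special attention, its most classical instance makes a good numerical check: Binomial(n, λ/n) converges to Poisson(λ) as n → ∞, the "law of rare events". A standalone sketch (not from the book):

```python
import math

# Compare the two pmfs pointwise and watch the gap shrink as n grows.
def binom_pmf(k, n, p):
    return math.comb(n, k) * p ** k * (1.0 - p) ** (n - k)

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def max_gap(n, lam=3.0):
    """Largest pointwise gap between Binomial(n, lam/n) and Poisson(lam)."""
    return max(abs(binom_pmf(k, n, lam / n) - poisson_pmf(k, lam))
               for k in range(15))

assert max_gap(1000) < max_gap(10)   # the approximation sharpens with n
```

Le Cam's inequality makes the rate precise (total variation distance at most λ²/n), which is the kind of quantitative refinement a limit-theory chapter builds toward.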

2009-08-20

Paul Ekman - Unmasking the Face: A Guide to Recognizing Emotions from Facial Clues

Ekman is a leading authority on the study of the facial expressions and their relation to emotion, and this book is a methodical and thorough (for the layperson, at least) introduction to the field, with special focus on recognizing what he calls the six basic universally expressed emotions: happiness, fear, anger, surprise, disgust/contempt, and sadness. Ekman provides clear, well-detailed instructions for recognizing each of these emotions, and the book includes plenty of photos to illustrate the different facial contortions of each emotion. The only complaint I would have is that I had to do a lot of page flipping back and forth to read Ekman's remarks and subsequently refer to the relevant picture. Perhaps a future edition could be better formatted to make it easier for the reader to view the expression alongside Ekman's comments. In addition to writing about specific emotions, Ekman also covers some fascinating related topics such as recognizing facial deceit and discovering the patterns of one's own facial expressions, i.e., what you're telling the world with your own face. Ekman is an academic and his writing shows it; he's precise, methodical, thorough, and careful in the extent of his claims. Readers who are new to the subject of reading facial expressions but are seriously committed to learning about it will find this an invaluable book. (Ekman's later work, "Emotions Revealed," is also a great read and contains much of the same information as "Unmasking the Face," although I found the former to be lighter on technical information and practice faces, and more focused on the larger reflections Ekman has made looking back on his work over the last few decades. In other words, both books are great, but "Unmasking the Face" is a bit more technical and thorough, and therefore a better book to pick up for learning how to recognize facial expressions.)

2009-08-17
