Book Recommendation: Image Analysis, Classification, and Change Detection in Remote Sensing, with Algorithms for ENVI/IDL

Features

· A concise review of the required mathematical and statistical background

· In-depth coverage of nonlinear data analysis methods, including support vector machines

· A detailed treatment of multivariate change detection and its software implementation (see the sketch after this list)

· Source code for the exercises in every chapter

· The author's personal website, regularly updated with the latest ENVI extension programs
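For readers curious how the multivariate change detection mentioned above works, here is a minimal sketch of the Multivariate Alteration Detection (MAD) transformation treated in the book's Change Detection chapter. This is an illustration only, not the book's own code: the book's implementations are ENVI/IDL extensions, whereas the sketch below uses NumPy/SciPy, and the function name `mad_variates` and the random toy images are assumptions of this sketch.

```python
# A minimal sketch of Multivariate Alteration Detection (MAD) -- the
# CCA-based change-detection method treated in the book's Change
# Detection chapter. Illustration only: the book's own implementations
# are ENVI/IDL extensions; names and toy data here are assumptions.
import numpy as np
from scipy.linalg import eigh

def mad_variates(X, Y):
    """Return MAD variates for two co-registered images.

    X, Y : arrays of shape (bands, pixels), one per acquisition date.
    Rows of the result are ordered by increasing canonical correlation,
    so most of the genuine change concentrates in the first rows.
    """
    b = X.shape[0]
    S = np.cov(np.vstack([X, Y]))                # joint (2b x 2b) covariance
    Sxx, Sxy = S[:b, :b], S[:b, b:]
    Syx, Syy = S[b:, :b], S[b:, b:]
    # Generalized eigenproblem of canonical correlation analysis:
    # (Sxy Syy^-1 Syx) a = rho^2 Sxx a, eigenvalues ascending from eigh.
    rho2, A = eigh(Sxy @ np.linalg.solve(Syy, Syx), Sxx)
    # Paired projection vectors for Y via the standard CCA relation
    # b_i proportional to Syy^-1 Syx a_i, normalized so Var(b' Y) = 1.
    B = np.linalg.solve(Syy, Syx @ A)
    B /= np.sqrt(np.sum(B * (Syy @ B), axis=0))
    U = A.T @ (X - X.mean(axis=1, keepdims=True))  # canonical variates of X
    V = B.T @ (Y - Y.mean(axis=1, keepdims=True))  # canonical variates of Y
    return U - V                                   # the MAD variates

# Toy usage: a 4-band "scene" where each band gets a different amount
# of change, simulated as additive noise of increasing strength.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 10_000))
Y = X + rng.normal(size=(4, 10_000)) * np.array([[0.1], [0.3], [0.6], [1.0]])
M = mad_variates(X, Y)
print(M.shape, M.var(axis=1).round(2))   # variance per MAD variate
```

MAD pairs maximally correlated linear combinations of the two dates' bands and differences them; its invariance to linear transformations of the bands is also what makes it useful for the radiometric normalization covered later in the book.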


Table of Contents:

Images, Arrays, and Matrices
Multispectral Satellite Images
Algebra of Vectors and Matrices
Eigenvalues and Eigenvectors
Singular Value Decomposition
Vector Derivatives
Finding Minima and Maxima

Image Statistics
Random Variables
Random Vectors
Parameter Estimation
Hypothesis Testing and Sample Distribution Functions
Conditional Probabilities, Bayes’ Theorem, and Classification
Ordinary Linear Regression
Entropy and Information

Transformations
Discrete Fourier Transform
Discrete Wavelet Transform
Principal Components
Minimum Noise Fraction
Spatial Correlation

Filters, Kernels, and Fields
Convolution Theorem
Linear Filters
Wavelets and Filter Banks
Kernel Methods
Gibbs–Markov Random Fields

Image Enhancement and Correction
Lookup Tables and Histogram Functions
Filtering and Feature Extraction
Panchromatic Sharpening
Topographic Correction
Image–Image Registration

Supervised Classification: Part 1
Maximum a Posteriori Probability
Training Data and Separability
Maximum Likelihood Classification
Gaussian Kernel Classification
Neural Networks
Support Vector Machines

Supervised Classification: Part 2
Postprocessing
Evaluation and Comparison of Classification Accuracy
Adaptive Boosting
Hyperspectral Analysis

Unsupervised Classification
Simple Cost Functions
Algorithms That Minimize the Simple Cost Functions
Gaussian Mixture Clustering
Including Spatial Information
Benchmark
Kohonen Self-Organizing Map
Image Segmentation

Change Detection
Algebraic Methods
Postclassification Comparison
Principal Components Analysis
Multivariate Alteration Detection
Decision Thresholds and Unsupervised Classification of Changes
Radiometric Normalization

Appendix A: Mathematical Tools
Cholesky Decomposition
Vector and Inner Product Spaces
Least Squares Procedures

Appendix B: Efficient Neural Network Training Algorithms
Hessian Matrix
Scaled Conjugate Gradient Training
Kalman Filter Training
A Neural Network Classifier with Hybrid Training

Appendix C: ENVI Extensions in IDL
Installation
Extensions

Appendix D: Mathematical Notation

References

Index

Book details: http://www.crcpress.com/product/isbn/9781420087130

Also available on Amazon.



