Program

Program = Algorithm + Data + Structure + Model + Tools + OS

  1. OS

UNIX: 

Shell:

GPU: parallel computing with CUDA

Simulation Demo not sample:
Network: TCP/IP, HTTP/HTTPS, sockets, sessions, SSL

  1. Tools

NoSQL: distributed, real-time, massive concurrency and throughput

SQL: MySQL, Oracle

Mini SQL: Excel

Graph databases: Neo4j, OrientDB

Statistical programming languages: Python, Scala, R, or MATLAB/Octave

Python:

Third-party tools: OpenCV, OpenGL, OpenCL, OpenMP

Third-party ML frameworks: Caffe, MXNet, TensorFlow, Torch

  1. Model:

Math Model:

Design Model:

UML:

  1. Structure:

Basic: stack, heap, tree, map

Advanced: threads, JVM

  1. Data

BI: Power BI, FineBI, Tableau

SQL 

NoSQL

Graph databases: Neo4j, JanusGraph, GCN

  1. Algorithm:
    1. Classic Algorithms: Ranking

Relevance ranking models:

Boolean Model, Vector Space Model, Latent Semantic Analysis (LSA), BM25, LMIR

Importance ranking models:

PageRank, HITS, HillTop, TrustRank

Learning to Rank (LTR)

Information Retrieval (IR), Natural Language Processing (NLP), and Data Mining (DM)

  1) WTA (Winner Takes All): for a given query q, WTA(q) = 1 if the first document in the model's result list is relevant, otherwise 0.

  2) MRR (Mean Reciprocal Rank): for a given query q, if the position of the first relevant document is R(q), then MRR(q) = 1/R(q) (a small computation sketch follows this list).

  3) MAP (Mean Average Precision): for each truly relevant document d, take its position P(d) in the model's ranking, compute the precision over the documents up to that position, and average these precisions.

  4) NDCG (Normalized Discounted Cumulative Gain): a metric that jointly considers the model's ranking and the ideal ordering; the most commonly used ranking metric (see Wikipedia).

  5) RC (Rank Correlation)
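A minimal Python sketch of MRR and NDCG@k computed from relevance labels; the helper names and toy data are illustrative, not from the original post:

```python
import math

def mrr(ranked_relevance_lists):
    """Mean Reciprocal Rank: average of 1/position of the first relevant doc per query."""
    total = 0.0
    for rels in ranked_relevance_lists:
        rr = 0.0
        for i, rel in enumerate(rels, start=1):
            if rel:                      # first relevant document found
                rr = 1.0 / i
                break
        total += rr
    return total / len(ranked_relevance_lists)

def ndcg_at_k(rels, k):
    """NDCG@k for one query: DCG of the model ranking divided by DCG of the ideal ranking."""
    dcg = sum(rel / math.log2(i + 1) for i, rel in enumerate(rels[:k], start=1))
    ideal = sorted(rels, reverse=True)
    idcg = sum(rel / math.log2(i + 1) for i, rel in enumerate(ideal[:k], start=1))
    return dcg / idcg if idcg > 0 else 0.0

# toy example: two queries, 1 = relevant, 0 = not relevant
print(mrr([[0, 1, 0], [1, 0, 0]]))        # (1/2 + 1/1) / 2 = 0.75
print(ndcg_at_k([3, 2, 3, 0, 1, 2], k=6))  # graded relevance labels
```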

 

    1.  "Big 3"  Classification     Clustering  Regression:

NB: Naive Bayes (NB) 

| Classification Algorithm | Accuracy | F1-Score |
| --- | --- | --- |
| Logistic Regression | 84.60% | 0.6337 |
| Naive Bayes | 80.11% | 0.6005 |
| Stochastic Gradient Descent | 82.20% | 0.5780 |
| K-Nearest Neighbours | 83.56% | 0.5924 |
| Decision Tree | 84.23% | 0.6308 |
| Random Forest | 84.33% | 0.6275 |
| Support Vector Machine | 84.09% | 0.6145 |

Regression:

k-nearest neighbors

Linear Regression, with regularized variants LASSO, Ridge, and Elastic-Net (L0/L1/L2 penalties against overfitting)

L1-regularized Logistic Regression (L1 norm)

L2-regularized Logistic Regression (L2 norm)

 

      

Regression Tree:

Decision Tree Regressor, SVR, Bayesian regression

Logistic Regression:

Random forests (ensembles of classification trees)

Computer Vision (CV): Classification

Detection: signal detection

Recognition

    1.  Kernel Methods:

 SVM: produces rankings, clusters, or classifications

Logistic

Softmax 

 

            

    1. Ensemble: Voting, Averaging, Random Forest, Bagging, Blending, Boosting, Stacking

Bagging + Decision Tree = Random Forest (RF)

AdaBoost + Decision Tree = Boosting Tree

Gradient Boosting + Decision Tree = GBDT
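
A minimal scikit-learn sketch of the combinations listed above; the dataset, estimators, and hyperparameters are illustrative assumptions, not a prescribed setup:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (BaggingClassifier, RandomForestClassifier,
                              AdaBoostClassifier, GradientBoostingClassifier)
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

models = {
    # Bagging over decision trees; Random Forest additionally sub-samples features per split
    "Bagging(tree)": BaggingClassifier(DecisionTreeClassifier(), n_estimators=100, random_state=0),
    "RandomForest":  RandomForestClassifier(n_estimators=100, random_state=0),
    # AdaBoost over (shallow) trees = boosting tree; gradient boosting over trees = GBDT
    "AdaBoost":      AdaBoostClassifier(n_estimators=100, random_state=0),
    "GBDT":          GradientBoostingClassifier(n_estimators=100, random_state=0),
}
for name, model in models.items():
    print(name, cross_val_score(model, X, y, cv=5).mean())
```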

                                  

Voting:

Averaging:

Random Forest (RF): an extension of Bagging; with m = p (all features considered at each split) it reduces to Bagging

Bagging: bootstrap sampling; classification -> voting, regression -> averaging.

                          

 

Blending:

Boosting:

              Bootstrap:

AdaBoost: (target recognition, face detection)

(figure: https://i-blog.csdnimg.cn/blog_migrate/daeba89814984931929e85db4049233c.png)

        GBDT: also known as MART (Multiple Additive Regression Tree) or GBRT (Gradient Boosting Regression Tree)

 

Loss Function:

XGboost:

 

      

 

 

Stacking:

5-Fold Stacking 
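
A minimal sketch of 5-fold stacking using scikit-learn's StackingClassifier with cv=5; the base learners and meta-learner here are illustrative choices:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier, RandomForestClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# base learners produce out-of-fold predictions (5 folds); the meta-learner trains on them
stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
                ("svm", SVC(probability=True, random_state=0))],
    final_estimator=LogisticRegression(),
    cv=5,
)
stack.fit(X_train, y_train)
print("stacking accuracy:", stack.score(X_test, y_test))
```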

 

    1. Dimensionality reduction  

LDA: Linear Discriminant Analysis [Supervised]

Fisher Linear Discriminant (FLD)

 

PCA: Principal component analysis [ Unsupervised ]

SVD: Singular value decomposition
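
A minimal NumPy sketch of the standard PCA-via-SVD relationship (center the data, take the SVD, the right singular vectors are the principal directions); the toy data is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))           # toy data: 100 samples, 5 features

Xc = X - X.mean(axis=0)                 # center each feature (PCA requirement)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

components = Vt[:2]                     # top-2 principal directions
X_reduced = Xc @ components.T           # project onto them
explained_variance = (S ** 2) / (len(X) - 1)
print(X_reduced.shape, explained_variance[:2])
```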

 

FA: Factor Analysis

ICA: Independent Component Analysis

LPP: Locality Preserving Projections, an alternative to PCA

LLE: Locally Linear Embedding

t-SNE: t-distributed Stochastic Neighbor Embedding

 

LEP:

UV:

Missing Values Ratio

Low Variance Filter

High Correlation Filter

Random Forests

Backward Feature Elimination

Forward Feature Construction

 

 

 

    1. Expectation Maximization (EM):

(HMM, GMM, LDA, MLE) non-gradient optimization

EM &

------------------https://i-blog.csdnimg.cn/blog_migrate/7b45a1423d0b72dbf8965444e1b40787.png

EM & GMM:————>https://i-blog.csdnimg.cn/blog_migrate/d57427a72cd8e30484563636118d8860.png

-----------------https://i-blog.csdnimg.cn/blog_migrate/ee7f07222f4cd365d2fb0e99092bd0c0.png

EM & K-means

· k-means is a special case of Gaussian mixture clustering in which all mixture components share the same variance and each sample is assigned to exactly one component. The relationship between k-means and the EM algorithm is as follows:

· k-means alternates between two steps: fixing the cluster centers and assigning each sample to its nearest center --> these correspond to the E-step and M-step.

· The E-step analogue assigns each point to the nearest center to optimize the objective (hard assignment), which can be seen as an approximation of the EM E-step (soft assignment).

· The M-step analogue updates each cluster center, which amounts to maximizing the likelihood under the assumption that every cluster follows a unit-variance Gaussian.
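
A minimal scikit-learn sketch contrasting k-means (hard assignment) with a Gaussian mixture fitted by EM (soft responsibilities), matching the relationship described above; the toy data is illustrative:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# two toy Gaussian blobs
X = np.vstack([rng.normal(0, 1, size=(100, 2)), rng.normal(5, 1, size=(100, 2))])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)

print(km.labels_[:5])                      # hard assignment (E-step approximation)
print(gmm.predict_proba(X)[:5].round(3))   # soft responsibilities from the EM E-step
```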

    1. Nearest Neighbors:

K-means: (Clustering)

Affinity Propagation: (Clustering)

Hierarchical / Agglomerative: (Clustering)

DBSCAN: (Clustering)

KNN:

PageRank:


    1. Correlation:

Apriori: data mining (association rules)

 

Affinity Propagation 

    1. Neural networks (NN) -> 9
  1. Math:

Linear algebra

symmetric matrix   

Orthogonal matrix  

Probability and statistics

Numerical optimization                    

Multivariable Calculus             

Ordinary Least Squares Regression                           

Stepwise Regression       

Multivariate Adaptive Regression Splines      

  1. Tuning: performance index

Tuning of Bugs:

Tuning of Concurrency

Tuning of Online:

Tuning of ML: PCA

SGD, Adagrad, Adadelta, Adam, Adamax, Nadam

 

 

|          | Predicted 1         | Predicted 0         |
| -------- | ------------------- | ------------------- |
| Actual 1 | True Positive (TP)  | False Negative (FN) |
| Actual 0 | False Positive (FP) | True Negative (TN)  |

 

 

 

    1. Classification Performance index

Statistics: Precision (P), Recall (R), F1

Accuracy: (acc)

Error rate: (1 - acc )

Precision = number of correct items retrieved / total number of items retrieved

Recall = number of correct items retrieved / number of relevant items in the sample

F-measure: F1 = 2PR / (P + R), equivalently 2/F1 = 1/P + 1/R
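
A minimal sketch computing these statistics directly from confusion-matrix counts; the TP/FP/FN/TN values are illustrative:

```python
# illustrative confusion-matrix counts
TP, FP, FN, TN = 80, 10, 20, 90

precision = TP / (TP + FP)                  # correct retrieved / all retrieved
recall    = TP / (TP + FN)                  # correct retrieved / all actually positive
f1        = 2 * precision * recall / (precision + recall)   # harmonic mean: 2/F1 = 1/P + 1/R
accuracy  = (TP + TN) / (TP + FP + FN + TN)
error_rate = 1 - accuracy

print(f"P={precision:.3f} R={recall:.3f} F1={f1:.3f} acc={accuracy:.3f} err={error_rate:.3f}")
```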

 

GooSeeker, Specificity, ROC, AUC

ROC (Receiver Operating Characteristic)

AUC

PSI (Population Stability Index) = sum over buckets of (actual % - expected %) * ln(actual % / expected %)
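
A minimal NumPy sketch of the PSI formula above, comparing actual versus expected bucket proportions; the bucket counts are illustrative:

```python
import numpy as np

def psi(expected, actual):
    """Population Stability Index: sum of (actual% - expected%) * ln(actual% / expected%)."""
    expected = np.asarray(expected, dtype=float)
    actual = np.asarray(actual, dtype=float)
    e = expected / expected.sum()          # expected proportion per bucket
    a = actual / actual.sum()              # actual proportion per bucket
    return float(np.sum((a - e) * np.log(a / e)))

# illustrative bucket counts for a score split into 5 bins
print(psi(expected=[100, 200, 300, 250, 150], actual=[120, 180, 280, 260, 160]))
```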

 

    1. Gradient Descent (GD):

Stochastic Gradient Descent

Stochastic Average Gradient (SAG)

BGD (Batch Gradient Descent)

SGD (Stochastic Gradient Descent)

MBGD (Mini-Batch Gradient Descent)
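
A minimal NumPy sketch showing how BGD, SGD, and MBGD differ only in batch size, applied to least-squares linear regression; the data and learning rate are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=200)

def gradient(w, Xb, yb):
    """Gradient of the mean squared error 0.5 * ||Xb w - yb||^2 / n over a (mini-)batch."""
    return Xb.T @ (Xb @ w - yb) / len(yb)

def train(batch_size, lr=0.1, epochs=50):
    w = np.zeros(3)
    for _ in range(epochs):
        idx = rng.permutation(len(X))
        for start in range(0, len(X), batch_size):
            batch = idx[start:start + batch_size]
            w -= lr * gradient(w, X[batch], y[batch])
    return w

print("BGD :", train(batch_size=len(X)))   # full batch per update
print("SGD :", train(batch_size=1))        # one sample per update
print("MBGD:", train(batch_size=32))       # mini-batch per update
```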

    1. Entropy:

Conditional entropy

Information gain

Information gain ratio

Gini index
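
A minimal sketch computing entropy, Gini index, information gain, and gain ratio for a labeled set and a candidate split; the labels are illustrative:

```python
import numpy as np
from collections import Counter

def entropy(labels):
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def gini(labels):
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return float(1.0 - (p ** 2).sum())

def information_gain(parent, splits):
    """Entropy of the parent minus the size-weighted entropy of its child partitions."""
    n = len(parent)
    weighted = sum(len(s) / n * entropy(s) for s in splits)
    return entropy(parent) - weighted

def gain_ratio(parent, splits):
    """Information gain normalized by the split's intrinsic information (as in C4.5)."""
    n = len(parent)
    iv = -sum((len(s) / n) * np.log2(len(s) / n) for s in splits if len(s) > 0)
    return information_gain(parent, splits) / iv if iv > 0 else 0.0

parent = [1, 1, 1, 0, 0, 0, 0, 1]
splits = [[1, 1, 1, 0], [0, 0, 0, 1]]      # illustrative binary split
print(entropy(parent), gini(parent), information_gain(parent, splits), gain_ratio(parent, splits))
```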

 

    1. Regression Performance index

SSE, MSE, RMSE, MAE, R^2 (R-Squared)
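
A minimal sketch computing these regression metrics on illustrative predictions:

```python
import numpy as np

y_true = np.array([3.0, 2.5, 4.0, 5.5, 6.0])
y_pred = np.array([2.8, 2.7, 4.2, 5.0, 6.3])

residuals = y_true - y_pred
sse  = float((residuals ** 2).sum())                 # sum of squared errors
mse  = sse / len(y_true)                             # mean squared error
rmse = mse ** 0.5                                    # root mean squared error
mae  = float(np.abs(residuals).mean())               # mean absolute error
sst  = float(((y_true - y_true.mean()) ** 2).sum())  # total sum of squares
r2   = 1.0 - sse / sst                               # R-squared

print(f"SSE={sse:.3f} MSE={mse:.3f} RMSE={rmse:.3f} MAE={mae:.3f} R2={r2:.3f}")
```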

 

  1. ANN / DNN

AlexNet, GoogLeNet, Fast/Faster R-CNN, SSD, YOLO, SegNet

 

 

  1. DL / ML (deep learning / machine learning)

•       Supervised learning

•       Unsupervised learning

•       Reinforcement learning

•       Multi-task learning

•       Cross-validation

  1. Digital Signal:

Signal Processing: Short-Time Fourier Transform, Moving Median Filtering, Singular Value Decomposition

Text: NLP

NUM: observe data -> find features -> design algorithms -> validate algorithms -> clean (wash) data -> engineering -> check the online effect -> return to observing data

Image:

Video: Optical Flow Field, Edge Extraction, Feature Point Extraction, SVM, AdaBoost, Neural Network

  1. Terminal

Oral Textbook

  1. Open framework

 

Face Recognize:

 

Recommendation: DeepFM, Wide & Deep, DIN

Search

Ads

User portrait

Knowledge bases: DBpedia, Freebase, YAGO, OpenKG

?mid=&wid=51824&sid=&tid=8357&rid=LOADED&custom1=mp.csdn.net&custom2=%2Fpostedit%2F89442414&t=1559038603469?mid=&wid=51824&sid=&tid=8357&rid=BEFORE_OPTOUT_REQ&t=1559038603469?mid=&wid=51824&sid=&tid=8357&rid=FINISHED&custom1=mp.csdn.net&t=1559038603471?mid=&wid=51824&sid=&tid=8298&rid=OPTOUT_RESPONSE_OK&t=1559038603792?mid=cd1d2&wid=51824&sid=&tid=8298&rid=MNTZ_INJECT&t=1559038603797?mid=90f06&wid=51824&sid=&tid=8298&rid=MNTZ_INJECT&t=1559038603800

  • 0
    点赞
  • 0
    收藏
    觉得还不错? 一键收藏
  • 0
    评论
Program Files 是Windows操作系统中的一个默认安装目录,用于存放64位软件。而Program Files (x86) 则是用于存放32位软件。根据默认安装路径可以判断软件是32位还是64位。如果默认路径是C:\Program Files (x86),则是32位软件;如果默认路径是C:\Program Files,则是64位软件。这两个文件夹都是重要的程序文件夹,不建议随意删除其中的文件,否则可能导致程序无法正常运行。如果需要卸载应用程序,最好使用控制面板的“添加和卸载程序”功能进行卸载。\[1\]\[2\] 至于你提到的D盘中出现的Program Files文件夹,以及其中的空文件夹ModifiableWindowsApps,可能是你之前安装并卸载了一些与Xbox相关的软件,但是卸载过程中可能残留了一些文件夹。如果你想彻底删除这些文件夹,可能需要获得SYSTEM权限,并使用一些专业的文件删除工具。\[3\] #### 引用[.reference_title] - *1* [浅谈一下Program Files和Program Files(x86)](https://blog.csdn.net/weixin_48879289/article/details/122840694)[target="_blank" data-report-click={"spm":"1018.2226.3001.9630","extra":{"utm_source":"vip_chatgpt_common_search_pc_result","utm_medium":"distribute.pc_search_result.none-task-cask-2~all~insert_cask~default-1-null.142^v91^insertT0,239^v3^insert_chatgpt"}} ] [.reference_item] - *2* [win10里C盘的Program Files和 Program Files(x86)的区别](https://blog.csdn.net/polar_night_down/article/details/124188420)[target="_blank" data-report-click={"spm":"1018.2226.3001.9630","extra":{"utm_source":"vip_chatgpt_common_search_pc_result","utm_medium":"distribute.pc_search_result.none-task-cask-2~all~insert_cask~default-1-null.142^v91^insertT0,239^v3^insert_chatgpt"}} ] [.reference_item] - *3* [解决非系统盘出现Program Files文件夹以及Program Files下的ModifiableWindowsApps文件夹无法删除的问题。](https://blog.csdn.net/weixin_43959142/article/details/127579029)[target="_blank" data-report-click={"spm":"1018.2226.3001.9630","extra":{"utm_source":"vip_chatgpt_common_search_pc_result","utm_medium":"distribute.pc_search_result.none-task-cask-2~all~insert_cask~default-1-null.142^v91^insertT0,239^v3^insert_chatgpt"}} ] [.reference_item] [ .reference_list ]
评论
添加红包

请填写红包祝福语或标题

红包个数最小为10个

红包金额最低5元

当前余额3.43前往充值 >
需支付:10.00
成就一亿技术人!
领取后你会自动成为博主和红包主的粉丝 规则
hope_wisdom
发出的红包
实付
使用余额支付
点击重新获取
扫码支付
钱包余额 0

抵扣说明:

1.余额是钱包充值的虚拟货币,按照1:1的比例进行支付金额的抵扣。
2.余额无法直接购买下载,可以购买VIP、付费专栏及课程。

余额充值