土肥宅娘口三三
10 years of coding
  • Blog: 670,633
    Community: 6,758
    Q&A: 10
    677,401
    total visits
  • 109
    original posts
  • 987,579
    rank
  • 345
    followers
  • 3
    devoted fans
IP location: Zhejiang Province
  • Joined CSDN: 2014-08-10
Blog bio:

XuShuai

Achievements
  • 290 likes received
  • 170 comments received on content
  • 1,113 bookmarks received
  • code snippets shared 1,306 times
Writing history
  • 2018: 25 posts
  • 2017: 29 posts
  • 2016: 55 posts
Achievement badges
Columns
  • Complex Networks
    14 posts
  • Data Structures
    27 posts
  • Notes & Memos
    14 posts
  • Algorithm Study
    14 posts
  • Paper Notes
    5 posts
  • Machine Learning
    31 posts
  • mongodb
    2 posts
  • Java Basics
    1 post
  • python
    2 posts
  • keras
    1 post
  • deep learning
    13 posts
Interest areas
  • Big data
    flink
  • Artificial intelligence
    machine learning

flink parallelism question

Question posted 2024.09.13 ·
1 answer

XGBoost 3 - XGBoost Principles and Usage

XGBoost principles: Boosting, AdaBoost, Gradient Boosting, XGBoost. 1 - Boosting: combine weak learners into a strong classifier. Building a single high-performing strong learner is very hard, but building a mediocre weak learner is not. Weak learner: performs better than random guessing (a shallow CART tree is a good choice). $G(x) = \sum_{t=1}^{T} \alpha_t \phi_t(x)$...
Original
Posted 2018.07.03 ·
1,514 views ·
2 likes ·
2 comments ·
7 bookmarks
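The excerpt's combination rule, $G(x) = \sum_{t=1}^{T} \alpha_t \phi_t(x)$, can be sketched as a minimal AdaBoost loop over decision stumps. This is an illustrative reconstruction, not the post's code; `stump_predict` and `adaboost` are made-up names:

```python
import numpy as np

def stump_predict(X, feature, threshold, polarity):
    # weak learner phi_t: predicts +1/-1 from one feature vs. a threshold
    return polarity * np.where(X[:, feature] <= threshold, 1.0, -1.0)

def adaboost(X, y, T=10):
    """Fit T stumps; return a list of (alpha_t, stump parameters)."""
    m = X.shape[0]
    w = np.full(m, 1.0 / m)            # per-example weights
    ensemble = []
    for _ in range(T):
        best, best_err = None, np.inf
        # exhaustive search for the lowest weighted-error stump
        for f in range(X.shape[1]):
            for thr in np.unique(X[:, f]):
                for pol in (1.0, -1.0):
                    pred = stump_predict(X, f, thr, pol)
                    err = np.sum(w[pred != y])
                    if err < best_err:
                        best_err, best = err, (f, thr, pol)
        eps = max(best_err, 1e-10)
        alpha = 0.5 * np.log((1 - eps) / eps)   # alpha_t: vote of this learner
        pred = stump_predict(X, *best)
        w *= np.exp(-alpha * y * pred)          # up-weight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, best))
    return ensemble

def predict(ensemble, X):
    # G(x) = sign( sum_t alpha_t * phi_t(x) )
    score = sum(a * stump_predict(X, *p) for a, p in ensemble)
    return np.sign(score)
```

On a toy separable set such as `X = [[0],[1],[2],[3]]`, `y = [1,1,-1,-1]`, a single stump already fits, and the weighted vote reproduces the labels.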

XGBoost 2 - Machine Learning Fundamentals

2 - Machine learning fundamentals: supervised learning, classification and regression trees, random forests. 2.1 - Supervised learning: model, parameters, objective function (loss function plus regularization), optimization. 2.1.1 - Model: if y is discrete, the task is classification; if y is continuous, it is regression. Given x, how do we predict the label $\hat{y}$? For linear regression, the model is $\hat{y} = f(x) = \sum_j w_j x_j$...
Original
Posted 2018.07.03 ·
666 views ·
0 likes ·
0 comments ·
0 bookmarks
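The linear-regression model in the excerpt, $\hat{y} = \sum_j w_j x_j$, is just a dot product. A minimal sketch, with made-up weights and features:

```python
import numpy as np

w = np.array([0.5, -1.0, 2.0])   # learned weights w_j (illustrative values)
x = np.array([2.0, 1.0, 3.0])    # feature vector x_j

# y_hat = sum_j w_j * x_j
y_hat = w @ x
print(y_hat)   # 0.5*2 - 1.0*1 + 2.0*3 = 6.0
```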

XGBoost 1 - Basics and Simple Usage

XGBoost, short for extreme gradient boosting, is an optimized implementation of the gradient boosting machine: fast and effective. xgboost overview; xgboost features; xgboost basic usage guide; xgboost theoretical foundations: supervised learning, CART, boosting, gradient boosting, xgboost; xgboost in practice: feature engineering, par...
Original
Posted 2018.07.03 ·
3,763 views ·
4 likes ·
0 comments ·
31 bookmarks

machine learning blog index

This series collects partial study notes from Prof. Hsuan-Tien Lin's (NTU) Machine Learning Foundations and Machine Learning Techniques courses. Machine learning fundamentals: ML Notes - Nonlinear Transformation; ML Notes - Hazard of Overfitting; ML Notes - Regularization; ML Notes - Validation; ML Notes - Linear Regression; ML Notes - Logistic Regression; ML Notes - Classification with Linear Models; SV...
Original
Posted 2018.06.16 ·
1,070 views ·
4 likes ·
0 comments ·
3 bookmarks

deep learning博客索引

Course1 Week2-foundation of neural network Week3-one hidden layer neural network Week4-deep neural network Course2 Week1-setting up your ML application ...
Original
Posted 2018.06.15 ·
575 views ·
0 likes ·
0 comments ·
0 bookmarks

Course4-week4-face recognition and neural style transfer

1 - what is face recognition? This week shows a couple of important special applications of ConvNets; we will start with face recognition and then go on to neural style transfer. Verification:...
Original
Posted 2018.06.11 ·
612 views ·
0 likes ·
0 comments ·
0 bookmarks

Course4-week3-object detection

1 - object localization. In order to build up object detection, we first learn about object localization. Image classification: the algorithm looks at the picture and is responsible for saying ...
Original
Posted 2018.06.11 ·
723 views ·
0 likes ·
0 comments ·
1 bookmark

Course4-week2-case studies

case studies. 1 - why look at case studies? How to put together the basic building blocks, such as CONV, POOL, and FC layers, to form an effective convolutional neural network? Outline: classic ...
Original
Posted 2018.06.09 ·
504 views ·
0 likes ·
0 comments ·
0 bookmarks

Course4-week1-convolutional neural network

1 - Computer vision. Computer vision problems: image recognition, object detection, style transfer. One of the challenges of computer vision is that the input can get really big. Th...
Original
Posted 2018.06.09 ·
842 views ·
0 likes ·
0 comments ·
0 bookmarks

Course3 - machine learning strategy 2

1 - carrying out error analysis. If the learning algorithm is not yet at the performance of a human, then manually examining the mistakes the algorithm makes can give us insight into what to do...
Original
Posted 2018.06.08 ·
858 views ·
0 likes ·
0 comments ·
0 bookmarks

Course3 - machine learning strategy 1

introduction to ML strategy. 1 - why ML strategy? How to structure a machine learning project: that is what machine learning strategy is about. What is machine learning strategy? Let's say we are working ...
Original
Posted 2018.06.08 ·
585 views ·
0 likes ·
0 comments ·
0 bookmarks

Course2-week3-hyperparameterTuning - BatchNormalization - Framework

hyperparameter tuning. 1 - tuning process. How to systematically organize the hyperparameter tuning process? Hyperparameters: learning rate $\alpha$; $\beta$ in momentum (or keep the default 0.9); mini-b...
Original
Posted 2018.06.08 ·
551 views ·
0 likes ·
0 comments ·
0 bookmarks

Course2-week2-optimization algorithm

optimization algorithms. 1 - mini-batch gradient descent. Vectorization allows you to compute efficiently over m examples, but if m is large it can still be very slow. With the implementation of gradient des...
Original
Posted 2018.06.08 ·
596 views ·
0 likes ·
0 comments ·
0 bookmarks
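The mini-batch idea described above, splitting the m examples into smaller batches so that each gradient step does not touch the full set, can be sketched as follows. The shuffling and the batch size of 64 are conventional choices, not taken from the post:

```python
import numpy as np

def mini_batches(X, y, batch_size=64, seed=0):
    """Yield (X_batch, y_batch) pairs that cover all m examples once."""
    m = X.shape[0]
    rng = np.random.default_rng(seed)
    idx = rng.permutation(m)              # shuffle so batches are representative
    for start in range(0, m, batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]

# 150 examples with batch_size=64 -> batch sizes 64, 64, and a final 22
X = np.zeros((150, 3)); y = np.zeros(150)
sizes = [xb.shape[0] for xb, _ in mini_batches(X, y)]
print(sizes)   # [64, 64, 22]
```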

Course2-week1-setting up your ML application

setting up your ML application. 1 - train/dev/test set. This week we learn the practical aspects of how to make your neural network work well, ranging from hyperparameter tuning to ho...
Original
Posted 2018.06.08 ·
587 views ·
0 likes ·
0 comments ·
0 bookmarks

Course1-week4-deep neural network

4.1 - deep L-layer neural network. We have seen forward propagation and backward propagation in the context of a neural network with a single hidden layer, as well as logistic regression, and we le...
Original
Posted 2018.06.08 ·
399 views ·
0 likes ·
0 comments ·
0 bookmarks

Course1-week3-one hidden layer neural network

3.1 - neural networks overview. Some new notation has been introduced: we use a superscript square bracket to refer to a layer of the neural network; for instance, $w^{[1]}$ represents the pa...
Original
Posted 2018.06.08 ·
518 views ·
0 likes ·
0 comments ·
0 bookmarks

Course1-week2-foundation of neural network

Week 2: Basics of Neural Network Programming. 2.1 binary classification. $m$ training examples: $(x^{(1)}, y^{(1)}), \cdots, (x^{(m)}, y^{(m)})$; $X = [$...
Original
Posted 2018.06.08 ·
395 views ·
0 likes ·
0 comments ·
0 bookmarks
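The notation in the excerpt stacks the m training examples as columns of a matrix X, giving it shape (n_x, m) in the course's convention. A small sketch with made-up example vectors:

```python
import numpy as np

# three made-up examples x^(i), each with n_x = 2 features
x1 = np.array([1.0, 2.0])
x2 = np.array([3.0, 4.0])
x3 = np.array([5.0, 6.0])

# X = [x^(1) x^(2) x^(3)]: each example is one column, shape (n_x, m) = (2, 3)
X = np.column_stack([x1, x2, x3])
print(X.shape)   # (2, 3)
```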

Quick notes on learning Keras

Keras: a Python-based deep learning library. Keras is a high-level neural network API, written in pure Python and running on a TensorFlow, Theano, or CNTK backend. Keras supports Python 2.7-3.6. 1 - Some basic concepts. 1.1 - Symbolic computation. Keras's underlying library is Theano or TensorFlow; these libraries are also called Keras backends. Whether Thea...
Original
Posted 2018.05.06 ·
2,533 views ·
0 likes ·
1 comment ·
13 bookmarks

A quick note on lambda functions

A lambda is an anonymous function, with the syntax: lambda parameters: expression. Typical usage:
import numpy as np
sigmoid = lambda x: 1./(1. + np.exp(-x))
sigmoid(np.array([-10, 0, 10]))
array([ 4.53978687e-05, 5.00000000e-01, 9.9
Original
Posted 2018.01.27 ·
1,665 views ·
3 likes ·
0 comments ·
3 bookmarks