Spring Boot Graduation Project: Taxi Management System (qlk13, Java + Vue + MyBatis + Maven + MySQL + Spring)


Running the Project
Environment setup:
JDK 1.8 + Tomcat 8.5 + MySQL + HBuilderX (WebStorm also works) + Eclipse (IntelliJ IDEA, Eclipse, MyEclipse, and STS are all supported).
Project stack:
Spring Boot + MyBatis + Maven + Vue, etc.; B/S architecture with Maven dependency management.
Requirements
1. Runtime: Java JDK 1.8 is recommended, as it is the version the project was developed and run on; other versions should work in theory.
2. IDE: IDEA, Eclipse, or MyEclipse all work; IDEA is recommended.
3. Tomcat: versions 7.x, 8.x, and 9.x are all fine.
4. Hardware: Windows 7/8/10 with at least 1 GB of RAM, or macOS.
5. Maven project: yes — check whether the source directory contains a pom.xml; if it does, it is a Maven project (this project is built with Maven, per the stack above).
6. Database: MySQL 5.7/8.0 and similar versions all work.


Technology Stack
1. Backend: Spring Boot, MyBatis
2. Frontend: Vue + CSS + JavaScript + jQuery + EasyUI + Highcharts

Usage Instructions
1. Using Navicat or another tool, create a database with the matching name in MySQL and import the project's SQL file.
2. Import the project into IDEA/Eclipse/MyEclipse, adjust the configuration, and run it.
3. Administrator account: abo, password: abo
4. Development environment: Eclipse/IDEA; database: MySQL; implemented in Java.
5. Run SpringbootSchemaApplication.java to launch the home page.
6. Edit the database connection in src\main\resources\application.yml.
7. Maven version: apache-maven-3.3.9.
8. Admin backend URL: localhost:8080/<project-name>/admin
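Step 6 edits the datasource settings in application.yml. The sketch below uses standard Spring Boot property names only; the actual database name, username, and password depend on your local MySQL and the project's SQL file (taxi_db here is a hypothetical placeholder):

```yaml
server:
  port: 8080

spring:
  datasource:
    driver-class-name: com.mysql.cj.jdbc.Driver
    # Replace taxi_db with the database name you created in step 1
    url: jdbc:mysql://localhost:3306/taxi_db?useUnicode=true&characterEncoding=utf-8&serverTimezone=Asia/Shanghai
    username: root
    password: 123456
```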

Below is Lasso regression implemented with stochastic gradient descent (SGD) and mini-batch gradient descent (MBGD):

```python
import numpy as np
from sklearn.datasets import fetch_california_housing  # load_boston was removed in scikit-learn 1.2
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

class LassoRegression:
    def __init__(self, learning_rate=0.01, alpha=1, batch_size=1, max_iter=1000, tol=1e-4):
        self.learning_rate = learning_rate
        self.alpha = alpha
        self.batch_size = batch_size
        self.max_iter = max_iter
        self.tol = tol

    def soft_threshold(self, r, lambda_):
        # Proximal operator of the L1 norm.
        # NOTE: defined but not used by fit(); the L1 term is handled
        # via a subgradient folded into dw instead.
        if r < -lambda_:
            return r + lambda_
        elif r > lambda_:
            return r - lambda_
        else:
            return 0

    def fit(self, X, y):
        n_samples, n_features = X.shape
        self.weights = np.zeros(n_features)
        self.bias = 0
        self.cost_ = []
        for i in range(self.max_iter):
            # Sample a mini-batch (batch_size=1 gives plain SGD)
            indexes = np.random.randint(0, n_samples, self.batch_size)
            batch_X, batch_y = X[indexes], y[indexes]
            y_pred = np.dot(batch_X, self.weights) + self.bias
            cost = np.sum((batch_y - y_pred) ** 2) + self.alpha * np.sum(np.abs(self.weights))
            self.cost_.append(cost)
            # Gradient of the squared error plus an L1 subgradient:
            # +alpha for positive weights, -alpha otherwise
            dw = np.zeros(n_features)
            for j in range(n_features):
                if self.weights[j] > 0:
                    dw[j] = (np.dot(batch_X[:, j], (y_pred - batch_y)) + self.alpha) / self.batch_size
                else:
                    dw[j] = (np.dot(batch_X[:, j], (y_pred - batch_y)) - self.alpha) / self.batch_size
            db = np.sum(y_pred - batch_y) / self.batch_size
            # Update parameters
            self.weights -= self.learning_rate * dw
            self.bias -= self.learning_rate * db
            # Stop when the batch cost change falls below tol
            if i > 0 and np.abs(self.cost_[-1] - self.cost_[-2]) < self.tol:
                break

    def predict(self, X):
        return np.dot(X, self.weights) + self.bias

# Load a housing dataset (the original used load_boston, removed in scikit-learn 1.2)
X, y = fetch_california_housing(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Standardize features
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# Train Lasso regression with Stochastic Gradient Descent (batch_size=1)
model_sgd = LassoRegression(learning_rate=0.01, alpha=1, batch_size=1, max_iter=1000, tol=1e-4)
model_sgd.fit(X_train, y_train)

# Train Lasso regression with Mini-Batch Gradient Descent (batch_size=10)
model_mbgd = LassoRegression(learning_rate=0.01, alpha=1, batch_size=10, max_iter=1000, tol=1e-4)
model_mbgd.fit(X_train, y_train)
```

Convergence curves for the two optimizers:

Stochastic gradient descent:

![SGD](https://i.imgur.com/7qLk3rC.png)

Mini-batch gradient descent:

![MBGD](https://i.imgur.com/tm4x3Nt.png)
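The `soft_threshold` method defined in the class above is the proximal operator of the L1 penalty: a proximal (ISTA-style) variant would apply it to each weight after a plain gradient step instead of folding ±alpha into the gradient. A minimal vectorized sketch of that operator:

```python
import numpy as np

def soft_threshold(r, lam):
    # Proximal operator of the L1 norm: shrink r toward zero by lam,
    # setting anything with |r| <= lam exactly to zero.
    return np.sign(r) * np.maximum(np.abs(r) - lam, 0.0)
```

Because it maps small values exactly to zero, repeated application is what gives Lasso its sparse solutions, something the pure subgradient update in `fit` only approximates.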