Car Evaluation Dataset Test

An example of a multivariate classification problem using Neuroph

by Tijana Jovanovic, Faculty of Organisation Sciences, University of Belgrade

An experiment for the Intelligent Systems course

 

Introduction

In this example we will be testing Neuroph on the Car Evaluation Dataset, which can be found here. Several architectures will be tried out, and we will determine which ones represent a good solution to the problem and which ones do not.

First, here is some useful information about our Car Dataset:

Data Set Characteristics: Multivariate
Number of Instances: 1728
Attribute Characteristics: Categorical
Number of Attributes: 6
Associated Tasks: Classification

 

Introducing the problem

The Car Evaluation Database was derived from a simple hierarchical decision model. The model evaluates cars according to the following concept structure:

CAR: car acceptability
  PRICE: overall price
    buying: buying price
    maint: price of the maintenance
  COMFORT: comfort
    doors: number of doors
    persons: capacity in terms of persons to carry
    lug_boot: the size of the luggage boot
  safety: estimated safety of the car

Six input attributes: buying, maint, doors, persons, lug_boot, safety.

Attribute Information:

Class Values: unacc, acc, good, vgood

Attributes: 

buying: vhigh, high, med, low. 
maint: vhigh, high, med, low. 
doors: 2, 3, 4, 5more. 
persons: 2, 4, more. 
lug_boot: small, med, big.
safety: low, med, high. 

For this experiment to work, we had to transform our data set into binary format (0, 1). We replaced each attribute value with a suitable binary combination.

For example, the attribute buying has 4 possible values: vhigh, high, med, low. Since these values are strings, we had to transform each of them into a numeric format. In this case, each string value is replaced with a combination of 4 binary numbers. The final transformation looks like this:

Attributes: 

buying: 1,0,0,0 instead of vhigh, 0,1,0,0 instead of high, 0,0,1,0 instead of med, 0,0,0,1 instead of low.  
maint: 1,0,0,0 instead of vhigh, 0,1,0,0 instead of high, 0,0,1,0 instead of med, 0,0,0,1 instead of low.  
doors: 0,0,0,1 instead of 2, 0,0,1,0 instead of 3, 0,1,0,0 instead of 4, 1,0,0,0 instead of 5more.  
persons: 0,0,1 instead of 2, 0,1,0 instead of 4, 1,0,0 instead of more.  
lug_boot: 0,0,1 instead of small, 0,1,0 instead of med, 1,0,0 instead of big.
safety: 0,0,1 instead of low, 0,1,0 instead of med, 1,0,0 instead of high.  
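
To make this concrete, here is a minimal sketch of how such a one-hot transformation could be written in Java. The class and method names are our own, and the value orderings follow the encoding table above:

    import java.util.Arrays;
    import java.util.List;

    // Sketch of the one-hot transformation described above.
    public class CarAttributeEncoder {

        // The position of the value in the list decides which bit is set to 1.
        static double[] oneHot(List<String> values, String value) {
            double[] encoded = new double[values.size()];
            encoded[values.indexOf(value)] = 1;
            return encoded;
        }

        public static void main(String[] args) {
            // Ordering chosen so that "vhigh" maps to 1,0,0,0 as in the table
            List<String> buying = Arrays.asList("vhigh", "high", "med", "low");
            System.out.println(Arrays.toString(oneHot(buying, "med")));
            // prints [0.0, 0.0, 1.0, 0.0]
        }
    }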

Transformed Dataset


In this example we will be using 80% of data for training the network and 20% of data for testing it.
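
A minimal sketch of such a split using Neuroph's DataSet class, assuming the transformed data has already been loaded into a DataSet with 21 inputs and 4 outputs (exact API details may differ between Neuroph versions):

    import org.neuroph.core.data.DataSet;
    import org.neuroph.core.data.DataSetRow;

    // Splits a data set into 80% training and 20% test rows.
    public class DataSetSplitter {

        static DataSet[] split(DataSet full) {
            full.shuffle();                              // randomize row order
            int trainCount = (int) (full.size() * 0.8);  // 80% for training

            DataSet training = new DataSet(21, 4);
            DataSet test = new DataSet(21, 4);
            int i = 0;
            for (DataSetRow row : full.getRows()) {
                if (i++ < trainCount) training.addRow(row);
                else test.addRow(row);
            }
            return new DataSet[] { training, test };
        }
    }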

Before you start reading about our experiment, we suggest first getting more familiar with Neuroph Studio and the Multi Layer Perceptron. You can do that by clicking on the links below:

Neuroph Studio Getting Started

Multi Layer Perceptron

Training attempt 1

Here you can see the structure of our network, with its inputs, outputs and hidden neurons in the middle layer.

Network Type: Multi Layer Perceptron 
Training Algorithm: Backpropagation with Momentum 
Number of inputs: 21 
Number of outputs: 4 (unacc, acc, good, vgood)
Hidden neurons: 14

Training Parameters: 
Learning Rate: 0.2
Momentum: 0.7
Max. Error: 0.01
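
As a sketch, the network and training setup for this attempt could be expressed in Neuroph code roughly as follows (variable names are our own):

    import org.neuroph.core.data.DataSet;
    import org.neuroph.nnet.MultiLayerPerceptron;
    import org.neuroph.nnet.learning.MomentumBackpropagation;
    import org.neuroph.util.TransferFunctionType;

    public class TrainingAttempt1 {

        static MultiLayerPerceptron train(DataSet trainingSet) {
            // 21 input, 14 hidden and 4 output neurons, Sigmoid transfer function
            MultiLayerPerceptron network =
                    new MultiLayerPerceptron(TransferFunctionType.SIGMOID, 21, 14, 4);

            MomentumBackpropagation rule = new MomentumBackpropagation();
            rule.setLearningRate(0.2);
            rule.setMomentum(0.7);
            rule.setMaxError(0.01);
            network.setLearningRule(rule);

            network.learn(trainingSet);  // runs until Max. Error is reached
            return network;
        }
    }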

Training Results:
For this training, we used the Sigmoid transfer function.

As you can see, the neural network took 33 iterations to train. The Total Net Error of 0.0095 is acceptable.

The Total Net Error graph looks like this:

Practical Testing: 

The final part of testing this network is testing it with several input values. To do that, we will select 5 random instances from our data set. Those are:

 
Network Inputs and Real Outputs:

Instance | Buying          | Maint           | Doors           | Persons      | Lug boot      | Safety       | Unacc | Acc | Good | VGood
1        | 0,0,0,1 (vhigh) | 0,0,0,1 (vhigh) | 1,0,0,0 (2)     | 1,0,0 (2)    | 0,1,0 (med)   | 0,1,0 (med)  | 1     | 0   | 0    | 0
2        | 1,0,0,0 (low)   | 1,0,0,0 (low)   | 0,0,0,1 (5more) | 0,1,0 (4)    | 0,0,1 (big)   | 0,0,1 (high) | 0     | 0   | 0    | 1
3        | 1,0,0,0 (low)   | 1,0,0,0 (low)   | 0,0,0,1 (5more) | 0,1,0 (4)    | 1,0,0 (small) | 1,0,0 (low)  | 1     | 0   | 0    | 0
4        | 1,0,0,0 (low)   | 1,0,0,0 (low)   | 0,0,0,1 (5more) | 0,0,1 (more) | 1,0,0 (small) | 0,1,0 (med)  | 0     | 1   | 0    | 0
5        | 1,0,0,0 (low)   | 0,1,0,0 (med)   | 0,0,0,1 (5more) | 0,1,0 (4)    | 0,0,1 (big)   | 0,1,0 (med)  | 0     | 0   | 1    | 0

The outputs the neural network produced for these inputs are, respectively:

 
Instance | Unacc  | Acc    | Good   | VGood
1        | 1      | 0      | 0      | 0
2        | 0.0009 | 0.0002 | 0.0053 | 0.9931
3        | 1      | 0      | 0.0001 | 0
4        | 0.0033 | 0.9965 | 0.0025 | 0
5        | 0.0002 | 0.0006 | 0.9973 | 0.0016

The network guessed correctly in all five instances. After this test, we can conclude that this solution does not need to be rejected and can be expected to give good results in most cases.
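
For reference, an individual instance can be tested with a sketch like the following, using instance 1 from the table above as the 21-value input vector (the encodings are copied from the table; the class and method names are our own):

    import java.util.Arrays;
    import org.neuroph.nnet.MultiLayerPerceptron;

    public class PracticalTest {

        static void testInstance1(MultiLayerPerceptron network) {
            double[] instance1 = {
                0, 0, 0, 1,   // buying:   vhigh
                0, 0, 0, 1,   // maint:    vhigh
                1, 0, 0, 0,   // doors:    2
                1, 0, 0,      // persons:  2
                0, 1, 0,      // lug_boot: med
                0, 1, 0       // safety:   med
            };
            network.setInput(instance1);
            network.calculate();
            // Expected output close to [1, 0, 0, 0], i.e. "unacc"
            System.out.println(Arrays.toString(network.getOutput()));
        }
    }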

In our next experiment we will be using the same network, but some of the parameters will be different, and we will see how the result changes.

Training attempt 2

Network Type: Multi Layer Perceptron 
Training Algorithm: Backpropagation with Momentum 
Number of inputs: 21 
Number of outputs: 4 (unacc, acc, good, vgood)
Hidden neurons: 14

Training Parameters: 
Learning Rate: 0.3
Momentum: 0.6
Max. Error: 0.01

Training Results:
For this training, we used the Sigmoid transfer function.

As you can see, the neural network took 21 iterations to train. The Total Net Error of 0.0098 is acceptable.

The Total Net Error graph looks like this:

Practical Testing: 

The only thing left is to feed the random inputs listed above into the neural network. The results of the test are shown in the table below. The network guessed right in all five cases.

 
Outputs the neural network produced (inputs as in attempt 1):

Instance | Unacc | Acc | Good | VGood
1        | 1     | 0   | 0    | 0
2        | 0     | 0   | 0    | 0.9996
3        | 1     | 0   | 0    | 0
4        | 0     | 1   | 0    | 0
5        | 0     | 0   | 1    | 0

As we can see from this table, the network guessed every instance in this test correctly, so we can say that the second combination of parameters is even better than the first one: it reached an acceptable error in fewer iterations (21 versus 33).

In the next two attempts we will be making a new neural network. The main difference will be the number of hidden neurons in its structure, and other parameters will also be changed.

Training attempt 3

Network Type: Multi Layer Perceptron 
Training Algorithm: Backpropagation with Momentum 
Number of inputs: 21 
Number of outputs: 4 (unacc, acc, good, vgood)
Hidden neurons: 10

Training Parameters: 
Learning Rate: 0.3
Momentum: 0.6
Max. Error: 0.01

Training Results:
For this training, we used the Sigmoid transfer function.

As you can see, the neural network took 37 iterations to train. The Total Net Error of 0.00995 is acceptable.

The Total Net Error graph looks like this:

Practical Testing: 

The final part of testing this network is testing it with several input values. We use the same five instances, with the same expected outputs, as in the first training attempt.

The outputs the neural network produced for these inputs are, respectively:

 
Instance | Unacc  | Acc    | Good   | VGood
1        | 1      | 0.0001 | 0      | 0
2        | 0      | 0.001  | 0.0129 | 0.986
3        | 1      | 0      | 0      | 0
4        | 0.0033 | 0.9935 | 0.0045 | 0
5        | 0      | 0.0191 | 0.9568 | 0.0237

The network guessed correctly in all five instances. After this test, we can conclude that this solution does not need to be rejected and can be expected to give good results in most cases.

In our next experiment we will be using the same network, but some of the parameters will be different, and we will see how the result changes.

Training attempt 4

Network Type: Multi Layer Perceptron 
Training Algorithm: Backpropagation with Momentum 
Number of inputs: 21 
Number of outputs: 4 (unacc, acc, good, vgood)
Hidden neurons: 10

Training Parameters: 
Learning Rate: 0.5
Momentum: 0.7
Max. Error: 0.01

Training Results:
For this training, we used the Sigmoid transfer function.

As you can see, the neural network took 187 iterations to train. The Total Net Error of 0.0084 is acceptable.

The Total Net Error graph looks like this:

Practical Testing: 

The only thing left is to feed the random inputs listed above into the neural network. The results of the test are shown in the table below. The network guessed right in all five cases.

 
Outputs the neural network produced (inputs as in attempt 1):

Instance | Unacc | Acc    | Good   | VGood
1        | 1     | 0      | 0      | 0
2        | 0     | 0      | 0      | 1
3        | 1     | 0      | 0      | 0
4        | 0     | 1      | 0.0094 | 0
5        | 0     | 0.0736 | 0.9996 | 0

Training attempt 5

This time we will be making some more significant changes to the structure of our network. Now we will try to train a network with 5 neurons in its hidden layer.

Network Type: Multi Layer Perceptron 
Training Algorithm: Backpropagation with Momentum 
Number of inputs: 21 
Number of outputs: 4 (unacc, acc, good, vgood)
Hidden neurons: 5

Training Parameters: 
Learning Rate: 0.2
Momentum: 0.7
Max. Error: 0.01

Training Results: 

We stopped the training of the network at this point because it was obvious that, in this case, the network was not going to be trained successfully and would not be able to learn the data from the set.
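
To avoid a run like this going on indefinitely, Neuroph's iterative learning rules allow a maximum iteration count to be set, so training stops automatically. A minimal sketch, assuming the same setup as above (the iteration cap of 10000 is an illustrative value, not one used in the experiment):

    import org.neuroph.nnet.MultiLayerPerceptron;
    import org.neuroph.nnet.learning.MomentumBackpropagation;
    import org.neuroph.util.TransferFunctionType;

    public class CappedTraining {

        static MultiLayerPerceptron build() {
            MultiLayerPerceptron network =
                    new MultiLayerPerceptron(TransferFunctionType.SIGMOID, 21, 5, 4);

            MomentumBackpropagation rule = new MomentumBackpropagation();
            rule.setLearningRate(0.2);
            rule.setMomentum(0.7);
            rule.setMaxError(0.01);
            rule.setMaxIterations(10000);  // stop even if Max. Error is never reached
            network.setLearningRule(rule);
            return network;
        }
    }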


The Total Net Error graph looks like this:

So the conclusion of this experiment is that the choice of the number of hidden neurons is crucial to the effectiveness of a neural network.

One of the "rules of thumb" for determining a suitable number of hidden neurons is that it should be between the size of the input layer and the size of the output layer. The formula that we used looks like this: ((number of inputs + number of outputs) / 2) + 1. With 21 inputs and 4 outputs this gives ((21 + 4) / 2) + 1 = 13.5, which rounds up to the 14 hidden neurons used in the first two attempts, and that network showed great results. Then we made a network with fewer neurons in its hidden layer, and the results were not as good as before. So, in the next example, we are going to see how the network reacts to a greater number of hidden neurons.
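
For reference, the heuristic as a one-line calculation (a trivial sketch):

    public class HiddenNeuronHeuristic {
        public static void main(String[] args) {
            int inputs = 21, outputs = 4;
            // ((inputs + outputs) / 2) + 1 = 13.5, rounded up to 14
            long hidden = Math.round((inputs + outputs) / 2.0 + 1);
            System.out.println(hidden);  // prints 14
        }
    }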

Training attempt 6

Network Type: Multi Layer Perceptron 
Training Algorithm: Backpropagation with Momentum 
Number of inputs: 21 
Number of outputs: 4 (unacc, acc, good, vgood)
Hidden neurons: 17

Training Parameters: 
Learning Rate: 0.2
Momentum: 0.7
Max. Error: 0.01

Training Results:
For this training, we used the Sigmoid transfer function.

As you can see, the neural network took 23 iterations to train. The Total Net Error of 0.0099 is acceptable.

The Total Net Error graph looks like this:

Practical Testing: 

The final part of testing this network is testing it with several input values. We again use the same five instances, with the same expected outputs, as in the first training attempt.

The outputs the neural network produced for these inputs are, respectively:

 
Instance | Unacc  | Acc    | Good   | VGood
1        | 1      | 0.0001 | 0      | 0
2        | 0.0002 | 0.0001 | 0.0073 | 0.9946
3        | 0.9987 | 0.0012 | 0.0002 | 0
4        | 0.0031 | 0.9912 | 0.0236 | 0
5        | 0      | 0.0191 | 0.9568 | 0.0237

As you can see, this number of hidden neurons, with an appropriate combination of parameters, also gave good results and guessed all five instances.

Training attempt 7

Now we will see how the same network works with a different set of parameters.

Network Type: Multi Layer Perceptron 
Training Algorithm: Backpropagation with Momentum 
Number of inputs: 21 
Number of outputs: 4 (unacc, acc, good, vgood)
Hidden neurons: 17

Training Parameters: 
Learning Rate: 0.6
Momentum: 0.2
Max. Error: 0.02

Training Results:
For this training, we used the Sigmoid transfer function.

As you can see, the neural network took 19 iterations to train. The Total Net Error of 0.0189 is acceptable.

The Total Net Error graph looks like this:

Practical Testing: 

The only thing left is to feed the random inputs listed above into the neural network. The results of the test are shown in the table. The network guessed right in all five cases.
