Disharmony Trees (Binary Indexed Tree)

Time Limit:1000MS     Memory Limit:32768KB     64bit IO Format:%I64d & %I64u

Description

One day Sophia finds a very big square. There are n trees in the square, all of them tall, and Sophia is very interested in them.

She finds that two trees may be in disharmony, and that the Disharmony Value between two trees depends on two values called FAR and SHORT.

The FAR is defined as follows: rank all the trees by X coordinate in ascending order. The tree with the smallest X coordinate is ranked 1st, and trees with equal X coordinates share the same rank. For example, if there are 5 trees with X coordinates 3, 3, 1, 3, 4, their ranks are 2, 2, 1, 2, 5. The FAR of two trees with X-coordinate ranks D1 and D2 is defined as F = abs(D1 - D2).

The SHORT is defined similarly: rank all the trees by height in ascending order. The shortest tree is ranked 1st, and trees with equal heights share the same rank. For example, if there are 5 trees with heights 4, 1, 9, 7, 4, their ranks are 2, 1, 5, 4, 2. The SHORT of two trees with height ranks H1 and H2 is defined as S = min(H1, H2).

The Disharmony Value of two trees is defined as F*S. So by this definition, the larger two trees' FAR is, the bigger their Disharmony Value; the value also depends on the shorter of the two trees.

Now, given every tree's X coordinate and height, tell Sophia the sum of the Disharmony Values over every pair of trees.
 

Input

There are several test cases in the input.

For each test case, the first line contains one integer N (2 <= N <= 100,000), the number of trees.

Then follow N lines, each containing two integers X and H (0 < X, H <= 1,000,000,000), indicating that a tree is located at coordinate X and has height H.
 

Output

For each test case, output the sum of the Disharmony Values over every pair of trees. The answer fits in a signed 64-bit integer.
 

Sample Input

2
10 100
20 200
4
10 100
50 500
20 200
20 100
 

Sample Output

1
13
 
#include <cstdio>
#include <cstring>
#include <algorithm>
using namespace std;
typedef long long ll;

struct Tree {
    ll x, h;            // after compression: X-coordinate rank and height rank
};
Tree a[110000];

bool cmpX(const Tree &p, const Tree &q) { return p.x < q.x; }
bool cmpH(const Tree &p, const Tree &q) { return p.h < q.h; }

struct Node {
    ll sum, cnt;        // sum of inserted X ranks / count of inserted trees
};
Node bit[110000];       // Fenwick tree indexed by X rank

ll need, need1, n;      // need: total of all inserted X ranks; need1: how many inserted

ll lowbit(ll x) { return x & (-x); }

// Insert X rank x into the Fenwick tree.
void update(ll x) {
    ll y = x;
    while (x <= n) {
        bit[x].cnt++;
        bit[x].sum += y;
        x += lowbit(x);
    }
}

// Contribution of the current tree (X rank x, height rank y) against all
// already-inserted trees: y * sum of |xj - x|, split into ranks > x and
// ranks <= x using one prefix query.
ll fun(ll x, ll y) {
    ll nu = 0, sum = 0, z = x;
    while (x > 0) {            // prefix query: count and sum of ranks <= z
        nu += bit[x].cnt;
        sum += bit[x].sum;
        x -= lowbit(x);
    }
    return y * ((need - sum) - (need1 - nu) * z) + y * (nu * z - sum);
}

int main() {
    int i;
    ll now, noww, ans;
    while (~scanf("%I64d", &n)) {
        for (i = 0; i < n; i++)
            scanf("%I64d%I64d", &a[i].x, &a[i].h);

        // Compress X coordinates to ranks: a value's rank is 1 + the number
        // of strictly smaller values, so equal values share a rank.
        sort(a, a + n, cmpX);
        now = a[0].x, a[0].x = 1, noww = 2;
        for (i = 1; i < n; i++) {
            if (a[i].x == now) a[i].x = a[i - 1].x;
            else now = a[i].x, a[i].x = noww;
            noww++;
        }

        // Compress heights the same way; this also leaves the array sorted
        // by height rank.
        sort(a, a + n, cmpH);
        now = a[0].h, a[0].h = 1, noww = 2;
        for (i = 1; i < n; i++) {
            if (a[i].h == now) a[i].h = a[i - 1].h;
            else now = a[i].h, a[i].h = noww;
            noww++;
        }

        // Process trees from tallest to shortest, so the current tree's
        // height rank is always the S = min(H1, H2) of every new pair.
        need = 0, need1 = 0, ans = 0;
        memset(bit, 0, sizeof(bit));
        for (i = n - 1; i >= 0; i--) {
            ans += fun(a[i].x, a[i].h);
            update(a[i].x);
            need += a[i].x;
            need1++;
        }
        cout << ans << endl;
    }
    return 0;
}

 

 

Reposted from: https://www.cnblogs.com/ERKE/p/3842804.html
