[Benchmark] A set of numbers that made me question Java Stream

import java.util.stream.LongStream;
import java.util.stream.Stream;

public class Test {

    public static void main(String[] args) throws InterruptedException {

        int n = 100000000;
        long result = 0;

        long begin, end;

        // Plain for loop over primitive longs
        Thread.sleep(2000);
        begin = System.currentTimeMillis();
        for (long i = 1L; i <= n; i++) {
            result += i;
        }
        end = System.currentTimeMillis();
        System.out.println("for loop result = " + result + " time: " + (end - begin));

        // Primitive LongStream over a closed range, sequential
        Thread.sleep(2000);
        begin = System.currentTimeMillis();
        result = LongStream.rangeClosed(1, n)
                .sum();
        end = System.currentTimeMillis();
        System.out.println("LongStream sum result = " + result + " time: " + (end - begin));

        // Boxed Stream<Long> built with iterate, sequential
        Thread.sleep(2000);
        begin = System.currentTimeMillis();
        result = Stream.iterate(1L, i -> i + 1)
                .limit(n)
                .reduce(0L, Long::sum);
        end = System.currentTimeMillis();
        System.out.println("Stream iterate result = " + result + " time: " + (end - begin));

        // Boxed Stream<Long> built with iterate, parallel
        Thread.sleep(2000);
        begin = System.currentTimeMillis();
        result = Stream.iterate(1L, i -> i + 1)
                .parallel()
                .limit(n)
                .reduce(0L, Long::sum);
        end = System.currentTimeMillis();
        System.out.println("Stream iterate parallel result = " + result + " time: " + (end - begin));
    }
}

Execution results:

for loop result = 5000000050000000 time: 34
LongStream sum result = 5000000050000000 time: 92
Stream iterate result = 5000000050000000 time: 1153
Stream iterate parallel result = 5000000050000000 time: 25224

I don't know whether these numbers mean I wrote something wrong, or whether this simply isn't the right scenario for streams.
Do streams only show their advantage once the data volume is truly huge?
When I change the sum to run from 1 to 1 billion, it ends in an exception:

for loop result = 500000000500000000 time: 390
LongStream sum result = 500000000500000000 time: 450
Stream iterate result = 500000000500000000 time: 9366
Exception in thread "main" java.lang.OutOfMemoryError
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at java.util.concurrent.ForkJoinTask.getThrowableException(ForkJoinTask.java:598)
	at java.util.concurrent.ForkJoinTask.reportException(ForkJoinTask.java:677)
	at java.util.concurrent.ForkJoinTask.invoke(ForkJoinTask.java:735)
	at java.util.stream.SliceOps$1.opEvaluateParallelLazy(SliceOps.java:155)
	at java.util.stream.AbstractPipeline.sourceSpliterator(AbstractPipeline.java:432)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:233)
	at java.util.stream.ReferencePipeline.reduce(ReferencePipeline.java:474)
	at com.seventh.hospital.app.controller.Test.main(Test.java:65)
Caused by: java.lang.OutOfMemoryError: Java heap space
	at java.lang.Long.valueOf(Long.java:840)
	at com.seventh.hospital.app.controller.Test.lambda$main$1(Test.java:62)
	at com.seventh.hospital.app.controller.Test$$Lambda$4/1651191114.apply(Unknown Source)
	at java.util.stream.Stream$1.next(Stream.java:1033)
	at java.util.Spliterators$IteratorSpliterator.trySplit(Spliterators.java:1784)
	at java.util.stream.AbstractShortCircuitTask.compute(AbstractShortCircuitTask.java:114)
	at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:731)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
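
Reading the stack trace, my current guess (happy to be corrected): Stream.iterate builds a boxed, ordered Stream<Long> on top of an unsized IteratorSpliterator, so adding parallel() and limit(n) forces the framework to batch and buffer huge numbers of boxed Long values, which would explain both the slowdown and the heap-space OOM (Long.valueOf and IteratorSpliterator.trySplit both appear in the trace). Below is a minimal sketch of the variant I would expect to parallelize well, using the primitive LongStream.rangeClosed instead; the class name and the crude timing are just for illustration.

import java.util.stream.LongStream;

public class ParallelSumSketch {

    public static void main(String[] args) {
        long n = 1_000_000_000L;

        // rangeClosed yields a sized, splittable primitive stream:
        // no Long boxing and no limit(), so the work divides cleanly across cores.
        long begin = System.currentTimeMillis();
        long result = LongStream.rangeClosed(1, n).parallel().sum();
        long end = System.currentTimeMillis();

        System.out.println("LongStream parallel result = " + result + " time: " + (end - begin));
    }
}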

Additions and corrections are welcome.
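
P.S. The measurement itself may also be part of the problem: a single pass timed with System.currentTimeMillis gives the JIT no warm-up, so the numbers above are only rough. If I redo this, I would try a JMH harness roughly like the sketch below (my own assumption of how to set it up; it needs the jmh-core and jmh-generator-annprocess dependencies and is run through the JMH runner rather than a plain main method).

import java.util.concurrent.TimeUnit;
import java.util.stream.LongStream;
import java.util.stream.Stream;

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Fork;
import org.openjdk.jmh.annotations.Measurement;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.annotations.OutputTimeUnit;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.annotations.Warmup;

// Each @Benchmark method is warmed up and measured by JMH instead of being
// hand-timed once; returning the result keeps dead-code elimination away.
@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.MILLISECONDS)
@Warmup(iterations = 3)
@Measurement(iterations = 5)
@Fork(1)
@State(Scope.Benchmark)
public class SumBenchmark {

    long n = 100_000_000L;

    @Benchmark
    public long forLoop() {
        long result = 0;
        for (long i = 1; i <= n; i++) {
            result += i;
        }
        return result;
    }

    @Benchmark
    public long longStreamSum() {
        return LongStream.rangeClosed(1, n).sum();
    }

    @Benchmark
    public long longStreamParallelSum() {
        return LongStream.rangeClosed(1, n).parallel().sum();
    }

    @Benchmark
    public long boxedIterateSum() {
        return Stream.iterate(1L, i -> i + 1).limit(n).reduce(0L, Long::sum);
    }
}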
