Understanding map and reduce in Java 8/9 functional programming (lambda expressions). How do map() and reduce() increase performance?

This piece of functional-programming code performs the operation
2*3 + 4*3 + 6*3 + 8*3 + 10*3:

 int sum = IntStream.rangeClosed(1, 10)  /* closed range 1..10 */
            .filter(x -> x % 2 == 0)     /* keep only the even numbers */
            .map(x -> x * 3)             /* multiply each by 3 */
            .sum();                      /* terminal operation: the actual sum happens here */
 System.out.println(sum); /* prints 90 */
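Since the title also asks about reduce: sum() is itself a reduction, and the same pipeline can be written with an explicit reduce. A sketch:

```java
import java.util.stream.IntStream;

public class ReduceExample {
    public static void main(String[] args) {
        // sum() is shorthand for a reduction with identity 0 and Integer::sum
        int sum = IntStream.rangeClosed(1, 10)
                .filter(x -> x % 2 == 0)
                .map(x -> x * 3)
                .reduce(0, Integer::sum); /* explicit reduce; same result as sum() */
        System.out.println(sum); // prints 90
    }
}
```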

I understand what it is doing. I would like to know what is happening under the hood in terms of memory allocation. We could write the same operation in the traditional style, as below. The old style is very easy to understand, but the lambda-based code above is more expressive.

int sum = 0;
for (int i = 1; i <= 10; i++) {
    if (i % 2 == 0) {
        sum += i * 3;
    }
}
System.out.println(sum); /* prints 90 */

First, the lambda expressions are desugared into static methods inside your class file (use javap -p to see them).
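Conceptually, the desugared methods are plain static methods holding the lambda bodies. A sketch (the names below are illustrative; the real synthetic methods are private and named by the compiler, typically in the lambda$main$0 style):

```java
public class DesugaredSketch {
    // Roughly what x -> x % 2 == 0 desugars to (hypothetical name; the
    // real method is a private synthetic member)
    static boolean lambda$main$0(int x) {
        return x % 2 == 0;
    }

    // Roughly what x -> x * 3 desugars to (hypothetical name)
    static int lambda$main$1(int x) {
        return x * 3;
    }

    public static void main(String[] args) {
        // The stream machinery invokes these methods through the generated
        // functional-interface instances.
        System.out.println(lambda$main$0(4)); // true
        System.out.println(lambda$main$1(4)); // 12
    }
}
```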

For the filter's predicate (here an IntPredicate, since this is an IntStream) a .class will be generated at runtime (which you can see by setting the -Djdk.internal.lambda.dumpProxyClasses=/Your/Path parameter when you invoke your class).

The same thing goes for the mapper function (an IntUnaryOperator) used by the map operation.
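To make those interfaces visible, the same pipeline can be written with the lambdas assigned to explicit variables. Note that IntStream uses the primitive specializations IntPredicate and IntUnaryOperator rather than Predicate<Integer> and Function<Integer, Integer>, which avoids boxing:

```java
import java.util.function.IntPredicate;
import java.util.function.IntUnaryOperator;
import java.util.stream.IntStream;

public class ExplicitInterfaces {
    public static void main(String[] args) {
        IntPredicate isEven = x -> x % 2 == 0;  /* backs filter() */
        IntUnaryOperator triple = x -> x * 3;   /* backs map() */

        int sum = IntStream.rangeClosed(1, 10)
                .filter(isEven)
                .map(triple)
                .sum();
        System.out.println(sum); // prints 90
    }
}
```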

Since your lambdas are non-capturing (stateless), a single instance of the predicate and of the function is created and re-used across operations. If they were capturing (stateful) lambdas, a new instance would typically be created each time the lambda expression is evaluated.
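A sketch of the difference (the instance-caching behavior shown here is how HotSpot happens to work; the language specification does not guarantee it, and the helper names are illustrative):

```java
import java.util.function.IntPredicate;

public class LambdaCaching {
    // Non-capturing: the body uses only its parameter, so the JVM can hand
    // back one cached instance every time this expression is evaluated.
    static IntPredicate even() {
        return x -> x % 2 == 0;
    }

    // Capturing: the body closes over n, so each evaluation of the
    // expression typically produces a fresh instance holding that n.
    static IntPredicate divisibleBy(int n) {
        return x -> x % n == 0;
    }

    public static void main(String[] args) {
        System.out.println(even() == even());                 // typically true on HotSpot
        System.out.println(divisibleBy(3) == divisibleBy(3)); // typically false
        System.out.println(divisibleBy(3).test(9));           // true
    }
}
```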

And as for your question title: map and reduce do not by themselves increase performance (unless there are a great many elements and you can parallelize the process with a net benefit). Your simple loop will be faster, but not that much faster than the stream. You have also chosen a pretty simple example; suppose instead you had an example involving heavy grouping followed by a custom collection step: there the verbosity of the plain-loop approach, compared to the stream version, would be significant.
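To illustrate that last point with a made-up scenario: grouping words by their length is a single collector expression with streams, whereas a loop would need manual map bookkeeping (check for the key, create the list, append):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class GroupingExample {
    public static void main(String[] args) {
        List<String> words = Arrays.asList("map", "reduce", "filter", "sum");

        // Stream version: one expression does the whole grouping.
        Map<Integer, List<String>> byLength = words.stream()
                .collect(Collectors.groupingBy(String::length));

        System.out.println(byLength.get(3)); // [map, sum]
        System.out.println(byLength.get(6)); // [reduce, filter]
    }
}
```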