MeasureThat.net
eliya and sharon :) v2
(version: 1)
Comparing performance of:
reduce with push vs flatMap vs reduce with concat
Created: one year ago by Guest
Script Preparation code:
var arr = Array(10_000_000).fill({entities:[1,2]})
Tests:
reduce with push
arr.reduce((acc, { entities }) => { const newE = entities.map((x) => x); acc.push(...newE); return acc; }, []);
flatMap
arr.flatMap(({ entities }) => entities.map((x) => x));
reduce with concat
arr.reduce((acc, { entities }) => { const newE = entities.map((x) => x); acc.concat(newE); return acc; }, []);
Latest run results:
Run details: (Test run date: one year ago)
User agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36
Browser/OS: Chrome 129 on Windows
| Test name | Executions per second |
| --- | --- |
| reduce with push | 0.8 Ops/sec |
| flatMap | 0.9 Ops/sec |
| reduce with concat | 2.0 Ops/sec |
Autogenerated LLM Summary (model gpt-4o-mini, generated one year ago):
The benchmark compares three methods for transforming and flattening a large array of objects in JavaScript. The input is an array of 10 million objects, each holding an inner array under the property `entities`.

### Overview of Approaches

1. **Reduce with Push**
   - **Code**: `arr.reduce((acc, { entities }) => { const newE = entities.map((x) => x); acc.push(...newE); return acc; }, []);`
   - Uses `reduce` to iterate over the array, mapping `entities` to create `newE`, which is then spread into the accumulator `acc` via `push`.
   - **Pros**:
     - `push` with the spread operator is straightforward and keeps the code readable.
     - Builds the result array incrementally without replacing the accumulator on each step.
   - **Cons**:
     - Performance can degrade on large inputs, since the growing accumulator forces repeated internal reallocations, and each iteration still allocates an intermediate `newE` array.

2. **FlatMap**
   - **Code**: `arr.flatMap(({ entities }) => entities.map((x) => x));`
   - Combines the map and flatten operations into a single method call.
   - **Pros**:
     - Concise and declarative, which makes the intent clear.
     - In this run it slightly outperformed the `push`-based reduce (0.9 vs. 0.8 Ops/sec).
   - **Cons**:
     - Still allocates an intermediate array per element, and it is less flexible for more complex transformations than an explicit `reduce`.

3. **Reduce with Concat**
   - **Code**: `arr.reduce((acc, { entities }) => { const newE = entities.map((x) => x); acc.concat(newE); return acc; }, []);`
   - Similar to the first approach, but calls `concat` instead of `push`. Note, however, that `concat` does not mutate `acc`: it returns a new array, and that return value is discarded here, so the accumulator stays empty.
   - **Pros**:
     - When its return value is actually kept (`acc = acc.concat(newE)`), `concat` can read clearly when combining arrays.
   - **Cons**:
     - Every `concat` call creates a new array, adding allocations and copies; and as written, the test is buggy and produces `[]`.

### Benchmark Results

- **Reduce with Concat** posts the highest throughput at about **2.0 executions per second**, but this figure is misleading: because the `concat` result is discarded, that test never actually builds the flattened array.
- Among the correct implementations, **FlatMap** (~0.9 executions per second) narrowly beats **Reduce with Push** (~0.8 executions per second).

Even the fastest correct variant completes less than one pass per second over 10 million elements, so at this scale the dominant cost is the volume of intermediate allocations rather than the choice between `flatMap` and `reduce`.

### Alternative Considerations

- Traditional `for` or `forEach` loops can be more performant for specific use cases, especially when they avoid creating intermediate arrays.
- Another alternative is a simple iterative function that uses manual index manipulation to build the output array, which can further reduce allocation overhead in memory-intensive tasks.

These tests provide insight into handling large data sets in JavaScript and highlight the trade-offs between readability and performance, allowing developers to make informed decisions based on their specific context and requirements.
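The manual-loop alternative the summary alludes to might look like the following (a hypothetical sketch, not code from the benchmark; `flattenEntities` is an illustrative name):

```javascript
// Hypothetical manual-index flattener: two passes, zero intermediate arrays.
function flattenEntities(arr) {
  // First pass: compute the total output length so the result can be preallocated.
  let total = 0;
  for (let i = 0; i < arr.length; i++) total += arr[i].entities.length;

  const out = new Array(total);
  let k = 0;
  // Second pass: copy each entity directly into its final slot.
  for (let i = 0; i < arr.length; i++) {
    const e = arr[i].entities;
    for (let j = 0; j < e.length; j++) out[k++] = e[j];
  }
  return out;
}

console.log(flattenEntities([{ entities: [1, 2] }, { entities: [3] }])); // [1, 2, 3]
```

Preallocating `out` avoids both the incremental growth of the `push` approach and the per-element arrays of `flatMap`, at the cost of a second pass over the input.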
Related benchmarks:
for vs map to fill array
for vs map to fill array fixed
Flatmaps with native (large array)
reduce()+push() vs map()
reduce vs map + filter
flatMap vs reduce 99991234
test test test
reducer vs fromEntries
eliya and sharon :)