MeasureThat.net
reduce vs map/filter --- filter and reshape
(version: 8)
Comparing performance of:
reduce with inner mutation vs map filter vs reduce with spreading
Created:
3 years ago
by:
Registered User
Script Preparation code:
var array = Array(10_000).fill(0).map((_, i) => i % 3)
Tests:
reduce with inner mutation
[...array].reduce((acc, v) => { if (v !== 0) { acc.push({ number: v }); } return acc; }, []);
map filter
array.filter(v => v !== 0).map(v => ({ number: v }));
reduce with spreading
array.reduce((acc, v) => v === 0 ? acc : [...acc, { number: v }], []);
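The three test cases above should produce identical output; a minimal standalone sketch of all three (the timing harness is MeasureThat's own and is not reproduced here):

```javascript
// Shared setup, same as the benchmark's preparation code:
// 10,000 elements whose values cycle 0, 1, 2.
const array = Array(10_000).fill(0).map((_, i) => i % 3);

// 1. reduce with inner mutation: push onto a single accumulator array.
const a = [...array].reduce((acc, v) => {
  if (v !== 0) acc.push({ number: v });
  return acc;
}, []);

// 2. filter then map: two passes, one intermediate array.
const b = array.filter(v => v !== 0).map(v => ({ number: v }));

// 3. reduce with spreading: copies the whole accumulator on every
//    kept element, so the total work grows quadratically.
const c = array.reduce(
  (acc, v) => v === 0 ? acc : [...acc, { number: v }],
  []
);
// a, b, and c all hold the same filtered-and-reshaped contents.
```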
Latest run results: no previous run results. This benchmark does not have any results yet.
Autogenerated LLM Summary (model llama3.2:3b, generated one year ago):
Let's break down what this benchmark tests, the options compared, their trade-offs, and other considerations.

**Benchmark Overview**

The benchmark compares three approaches to filtering and reshaping an array of 10,000 elements whose values cycle through 0, 1, 2: `reduce` with inner mutation, `filter` followed by `map`, and `reduce` with spreading. Each approach drops the zeros and wraps every remaining value in an object of the shape `{ number: v }`.

**Options Compared**

1. **`reduce` with inner mutation**: pushes matching elements onto a single accumulator array inside the reducer.
   * Pros: a single pass over the input and a single allocated output array, so it is typically the fastest of the three.
   * Cons: mutating the accumulator is less idiomatic in a functional style, and the callback is harder to read than a `filter`/`map` chain.
2. **`filter` followed by `map`**: creates a filtered copy of the array, then maps it into objects.
   * Pros: the most readable and declarative option; no mutation.
   * Cons: two passes over the data and an intermediate array, so somewhat more allocation than the mutating `reduce`.
3. **`reduce` with spreading**: builds a brand-new accumulator array with the spread operator (`...`) for every kept element.
   * Pros: concise and fully immutable.
   * Cons: copying the accumulator on each iteration makes the whole operation O(n²); for 10,000 elements this is usually orders of magnitude slower than the other two approaches.

**Other Considerations**

* The script preparation code creates a 10,000-element array filled with the repeating values 0, 1, 2.
* There is no HTML preparation code and no external library: the benchmark exercises only native JavaScript features (arrays, `push`, `filter`, `map`, and the spread operator).

**Alternatives**

Other ways to express the same filter-and-reshape operation include:

* `Array.prototype.flatMap`, which can filter and map in a single pass by returning `[]` for dropped elements.
* A plain `for...of` loop with `push`, which avoids callback overhead entirely.
* Utility libraries such as Lodash or Ramda, which provide equivalent `filter`/`map`/`reduce` helpers.

These alternatives are not measured by this benchmark.
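The `flatMap` alternative mentioned in the summary can be sketched like this (it is not one of the benchmarked cases):

```javascript
// Same input shape as the benchmark's preparation code.
const array = Array(10_000).fill(0).map((_, i) => i % 3);

// flatMap filters and maps in one pass: return [] to drop an element,
// or a one-element array to keep its transformed value.
const result = array.flatMap(v => v === 0 ? [] : [{ number: v }]);
```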
Related benchmarks:
flatMap vs reduce small array
Filter and Map vs Reduce
Flat map + filter vs. Reduce
flatMap vs reduce flattern array