filter.map vs reduce 2
(version: 1)
Comparing performance of:
filter.map vs reduce 1 vs reduce 2 vs reduce 3
Created:
5 years ago
by:
Registered User
Script Preparation code:
arr = [
  { foo: true, bar: 'a' }, { foo: false, bar: 'b' }, { foo: false, bar: 'c' },
  { foo: true, bar: 'a' }, { foo: false, bar: 'b' }, { foo: false, bar: 'c' },
  { foo: true, bar: 'a' }, { foo: false, bar: 'b' }, { foo: false, bar: 'c' }
];
Tests:
filter.map
const result = arr.filter((item) => !item.foo).map((item) => item.bar);
reduce 1
const result = arr.reduce((acc, item) => { if (!item.foo) { acc.push(item.bar); } return acc; }, []);
reduce 2
const result = arr.reduce((acc, item) => acc.concat(!item.foo ? item.bar : []), []);
reduce 3
const result = arr.reduce((acc, item) => item.foo ? acc : [...acc, item.bar], []);
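All four variants should produce the same output. A minimal standalone sketch (using the benchmark's own dataset, with each test wrapped in a named function for convenience) that verifies this:

```javascript
// The benchmark's dataset: nine objects cycling through foo true/false/false.
const arr = [
  { foo: true, bar: 'a' }, { foo: false, bar: 'b' }, { foo: false, bar: 'c' },
  { foo: true, bar: 'a' }, { foo: false, bar: 'b' }, { foo: false, bar: 'c' },
  { foo: true, bar: 'a' }, { foo: false, bar: 'b' }, { foo: false, bar: 'c' },
];

// The four candidate implementations under test.
const filterMap = (a) => a.filter((item) => !item.foo).map((item) => item.bar);
const reduce1 = (a) =>
  a.reduce((acc, item) => { if (!item.foo) { acc.push(item.bar); } return acc; }, []);
const reduce2 = (a) =>
  a.reduce((acc, item) => acc.concat(!item.foo ? item.bar : []), []);
const reduce3 = (a) =>
  a.reduce((acc, item) => (item.foo ? acc : [...acc, item.bar]), []);

// Every variant yields the bar values of the falsy-foo items, in order:
// ['b', 'c', 'b', 'c', 'b', 'c'].
const expected = JSON.stringify(['b', 'c', 'b', 'c', 'b', 'c']);
for (const fn of [filterMap, reduce1, reduce2, reduce3]) {
  console.log(fn.name, JSON.stringify(fn(arr)) === expected);
}
```

Note that `reduce 2` relies on a quirk of `concat`: concatenating an empty array is a no-op, which is how filtered-out items are skipped.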
Autogenerated LLM Summary (model llama3.2:3b, generated one year ago):
This benchmark compares four approaches to filtering and mapping an array in JavaScript:

1. `filter.map`
2. `reduce 1`
3. `reduce 2`
4. `reduce 3`

**Overview**

Each test case extracts the `bar` property from every object whose `foo` property is falsy. The dataset is a small array of nine objects, each with two properties: `foo` (a boolean) and `bar` (a string).

**Filtering and mapping**

`filter.map` does the work in two passes: `filter` builds an intermediate array of the items where `foo` is `false`, then `map` projects each surviving item to its `bar` property. The three `reduce` variants combine filtering and mapping into a single pass and differ only in how they accumulate results:

1. `filter.map`: chains the built-in `filter` and `map` methods, allocating one intermediate array.
2. `reduce 1`: pushes each matching `bar` value onto a single accumulator array with `push`.
3. `reduce 2`: calls `acc.concat(...)` on every iteration, allocating a new array each time (concatenating an empty array when the item is filtered out).
4. `reduce 3`: spreads the accumulator into a new array (`[...acc, item.bar]`) for every matching item.

**Options Comparison**

1. **filter.map**
   * Pros: concise and readable; both methods are heavily optimized by JavaScript engines.
   * Cons: iterates the data twice and allocates an intermediate array.
2. **reduce 1 (push)**
   * Pros: single pass; mutating one accumulator with `push` avoids per-iteration allocations and is typically the fastest of the `reduce` variants.
   * Cons: slightly more verbose than the chained form.
3. **reduce 2 (concat)**
   * Pros: keeps each accumulator value immutable.
   * Cons: `concat` creates a new array on every iteration, so total allocation grows quadratically with input size.
4. **reduce 3 (spread)**
   * Pros: idiomatic immutable style using the spread syntax.
   * Cons: like `concat`, copying the accumulator for each matching item is O(n²) overall.

**Library Usage**

None of the test cases rely on external libraries; only the built-in `filter`, `map`, and `reduce` array methods are used.

**Special JS Features/Syntax**

The tests use arrow functions, the conditional (ternary) operator, and array spread syntax (`[...acc, item.bar]`), all standard since ES2015.

**Alternatives**

If you're looking for alternative approaches, consider the following:

* A utility library such as Lodash or Ramda, which provides a more extensive set of composable array utilities.
* A plain `for...of` loop with `push`, which is often the fastest option of all, at the cost of brevity.
* A different data structure, such as a `Set` or a `Map`, when deduplication or keyed lookup would reduce the work being done.

Keep in mind that performance differences are unlikely to matter for an array of nine elements; the quadratic behavior of `reduce 2` and `reduce 3` only becomes significant as the dataset grows.
Related benchmarks:
filter() then map() vs reduce() + concat()
filter() then map() vs reduce() + concat() vs reduce() + push() vs forEach()
Reduce vs map with empty filter
Reduce Push vs. flatMap vs 123