MeasureThat.net
reduce vs map & filter
(version: 0)
Comparing performance of:
map and filter vs reduce
Created: 2 years ago by Guest
Script Preparation code:
var arr = [];
for (var i = 0; i < 1234; i++) {
  const obj = {};
  obj[`${i}key`] = i;
  arr.push(obj);
}

function shuffle(array) {
  let currentIndex = array.length, randomIndex;
  // While there remain elements to shuffle.
  while (currentIndex > 0) {
    // Pick a remaining element.
    randomIndex = Math.floor(Math.random() * currentIndex);
    currentIndex--;
    // And swap it with the current element.
    [array[currentIndex], array[randomIndex]] = [
      array[randomIndex], array[currentIndex]
    ];
  }
  return array;
}
shuffle(arr);

var arr2 = [];
for (var i = 0; i < 12345; i++) {
  const obj = {};
  obj[`${i}key`] = i;
  arr2.push(obj);
}
Tests:
map and filter
arr2.map((item, key) => arr[key]).filter(item => item);
reduce
arr2.reduce((acc, item, key) => { if (arr[key]) { acc.push(item); } return acc; }, []);
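Note that the two test cases are not strictly equivalent: the map/filter version collects elements of `arr` (the shorter array), while the reduce version collects elements of `arr2`, though both end up with `arr.length` items when `arr2` is longer. A minimal sketch of both patterns, using small hypothetical array sizes instead of the benchmark's 1234/12345:

```javascript
// Build two arrays of single-key objects, mirroring the preparation code
// but with small, hypothetical sizes (3 and 5).
const arr = [];
for (let i = 0; i < 3; i++) arr.push({ [`${i}key`]: i });

const arr2 = [];
for (let i = 0; i < 5; i++) arr2.push({ [`${i}key`]: i });

// Pattern 1: map + filter. map() builds an intermediate array of
// arr2.length entries (undefined where arr has no element at that index);
// filter() then drops the falsy holes.
const viaMapFilter = arr2.map((item, key) => arr[key]).filter(item => item);

// Pattern 2: reduce. A single pass over arr2 that pushes an item only
// when arr has an element at the same index; no intermediate array.
const viaReduce = arr2.reduce((acc, item, key) => {
  if (arr[key]) { acc.push(item); }
  return acc;
}, []);

console.log(viaMapFilter.length); // 3 (elements of arr)
console.log(viaReduce.length);    // 3 (elements of arr2)
```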
Latest run results:
Run details:
(Test run date: 2 years ago)
User agent:
Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:121.0) Gecko/20100101 Firefox/121.0
Browser/OS:
Firefox 121 on Windows
Test name: Executions per second
map and filter: 8360.3 Ops/sec
reduce: 12799.1 Ops/sec
Autogenerated LLM Summary (model llama3.2:3b, generated one year ago):
**Benchmark Overview**

MeasureThat.net is a website that allows users to create and run JavaScript microbenchmarks. This benchmark compares two approaches to conditionally collecting items from an array: `map()` followed by `filter()`, versus a single `reduce()`.

**Test Cases**

1. **"map and filter"** (`arr2.map((item, key) => arr[key]).filter(item => item);`): uses `map()` to look up, for each index of `arr2`, the element at the same index in `arr` (yielding `undefined` wherever `arr` is shorter), then uses `filter()` to drop the falsy entries.
2. **"reduce"** (`arr2.reduce((acc, item, key) => {if(arr[key]){acc.push(item)}return acc;}, [])`): uses `reduce()` to accumulate items from `arr2` in a single pass, pushing an item only when an element exists at the same index in `arr`.

**Benchmark Preparation Code**

The preparation code generates two arrays:

* `arr`: 1234 objects, each with a single key derived from its index; the array is then shuffled.
* `arr2`: 12345 objects with the same structure.

**Options Compared**

1. **`map()` then `filter()`**: transforms `arr2` into an intermediate array, then filters out the entries that have no counterpart in `arr`.
2. **`reduce()` with a conditional**: accumulates matching items directly, checking `arr[key]` before pushing to the accumulator.

**Pros and Cons**

1. **`map()` then `filter()`**:
   * Pros: more concise and expressive code.
   * Cons: may incur additional overhead due to the intermediate array creation.
2. **`reduce()` with a conditional**:
   * Pros: can be more efficient for large datasets, as it avoids creating an intermediate array.
   * Cons: less concise and may require more mental effort to read.

**Library and Special JS Features**

No libraries or special JavaScript features are used. The use of `map()` and `reduce()` implies familiarity with functional programming concepts in JavaScript.

**Other Alternatives**

1. **A single `filter()`**: filtering `arr2` directly could be faster than `map()` followed by `filter()`, as it avoids creating the intermediate array.
2. **`forEach()` with a conditional**: similar in efficiency to the `reduce()` approach, but with the result array managed outside the callback.
3. **Other iteration methods** such as `some()` or `every()`, depending on whether the use case needs the matching items or only a boolean answer.

The performance differences between these alternatives may not be significant; the choice ultimately depends on personal preference, code readability, and maintainability considerations.
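The single-`filter()` alternative mentioned above can be sketched as follows, again with small hypothetical array sizes in place of the benchmark's 1234/12345:

```javascript
// Two arrays of single-key objects, mirroring the preparation code
// but with small, hypothetical sizes (3 and 5).
const arr = [];
for (let i = 0; i < 3; i++) arr.push({ [`${i}key`]: i });

const arr2 = [];
for (let i = 0; i < 5; i++) arr2.push({ [`${i}key`]: i });

// One pass, no intermediate array: keep the arr2 entries whose index
// also holds an element in arr.
const viaFilter = arr2.filter((item, key) => arr[key]);

console.log(viaFilter.length); // 3
```

Unlike the map/filter test case, this keeps elements of `arr2` (like the reduce version) rather than elements of `arr`, so it is a drop-in replacement for the reduce pattern, not the map/filter one.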
Related benchmarks:
Already sorted versus random
Array.Sort vs Math.Min-Max
set.has vs. array.includes vs obj[key] vs map.get 2
shuffle array [dsng-manscaped]