MeasureThat.net
Array merge (version: 0)

Comparing performance of: reduce vs loop vs concat and filter vs set

Created: 3 years ago by: Guest
Tests:
reduce
const firstArr = new Array(200).fill(undefined).map((val, i) => `item${i}`)
const secondArr = new Array(250).fill(undefined).map((val, i) => `item${i}`)
const result = secondArr.reduce(
  (acc, item) => {
    return acc.includes(item) ? acc : [...acc, item]
  },
  [...firstArr]
)
loop
const firstArr = new Array(200).fill(undefined).map((val, i) => `item${i}`)
const secondArr = new Array(250).fill(undefined).map((val, i) => `item${i}`)
const result = []
for (let i = 0; i < firstArr.length; i++) {
  if (result.indexOf(firstArr[i]) == -1) result.push(firstArr[i])
}
for (let i = 0; i < secondArr.length; i++) {
  if (result.indexOf(secondArr[i]) == -1) result.push(secondArr[i])
}
concat and filter
const firstArr = new Array(200).fill(undefined).map((val, i) => `item${i}`)
const secondArr = new Array(250).fill(undefined).map((val, i) => `item${i}`)
const concatArr = firstArr.concat(secondArr)
const result = concatArr.filter((item, idx) => concatArr.indexOf(item) === idx)
set
const firstArr = new Array(200).fill(undefined).map((val, i) => `item${i}`)
const secondArr = new Array(250).fill(undefined).map((val, i) => `item${i}`)
const result = [...new Set([...firstArr, ...secondArr])]
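All four test cases compute the same merged array: the inputs overlap on `item0`–`item199`, so every strategy should yield the same 250 unique strings. A quick standalone check — the `merge*` wrapper names are ours, not part of the benchmark:

```javascript
// Build the same inputs the benchmark uses: 200 and 250 overlapping strings.
const firstArr = new Array(200).fill(undefined).map((val, i) => `item${i}`);
const secondArr = new Array(250).fill(undefined).map((val, i) => `item${i}`);

// Quadratic: linear `includes()` scan plus a full copy of `acc` per insertion.
const mergeReduce = (a, b) =>
  b.reduce((acc, item) => (acc.includes(item) ? acc : [...acc, item]), [...a]);

// Quadratic: `indexOf()` is a linear scan, but no per-step copying.
const mergeLoop = (a, b) => {
  const result = [];
  for (let i = 0; i < a.length; i++) {
    if (result.indexOf(a[i]) === -1) result.push(a[i]);
  }
  for (let i = 0; i < b.length; i++) {
    if (result.indexOf(b[i]) === -1) result.push(b[i]);
  }
  return result;
};

// Quadratic: the filter predicate scans the concatenated array per element.
const mergeConcatFilter = (a, b) => {
  const concatArr = a.concat(b);
  return concatArr.filter((item, idx) => concatArr.indexOf(item) === idx);
};

// Linear on average: Set membership checks are hash-based.
const mergeSet = (a, b) => [...new Set([...a, ...b])];

const results = [mergeReduce, mergeLoop, mergeConcatFilter, mergeSet]
  .map((fn) => fn(firstArr, secondArr));

// Every strategy yields the identical 250-element array.
console.log(results.every((r) => r.length === 250 && r.join() === results[0].join()));
// true
```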
This benchmark does not have any results yet. Be the first one to run it!
Autogenerated LLM Summary (model llama3.2:3b, generated one year ago):
Let's break down the provided benchmark and explain what is being tested, the options compared, and their pros and cons.

**Benchmark Overview**

The benchmark measures how fast two JavaScript arrays can be merged while removing duplicates. The test cases provide four approaches:

1. `reduce`
2. `loop`
3. `concat` and `filter`
4. `set`

**Options Compared**

* **Reduce**: Uses `Array.prototype.reduce()` to fold the second array into a copy of the first, appending each item only if the accumulator does not already include it.
* **Loop**: Uses traditional `for` loops to iterate over both arrays and push unique elements into a result array, testing membership with `indexOf()`.
* **Concat and Filter**: Uses `Array.prototype.concat()` to merge the two arrays, then `Array.prototype.filter()` to keep only the first occurrence of each item (those where `indexOf(item) === idx`).
* **Set**: Uses the `Set` data structure, which discards duplicates automatically as elements are added, then spreads the set back into an array.

**Pros and Cons of Each Approach**

* **Reduce**:
  + Pros: Concise and declarative; the accumulator callback can be extended to do additional work per element.
  + Cons: `includes()` is a linear scan, and spreading the accumulator (`[...acc, item]`) copies the whole array on every insertion, so the approach is quadratic in time and allocation-heavy.
* **Loop**:
  + Pros: Easy to follow for developers familiar with traditional loops; no intermediate arrays beyond the result.
  + Cons: `indexOf()` is also a linear scan, so the overall cost is still quadratic, though it avoids the per-step copying of the `reduce` version.
* **Concat and Filter**:
  + Pros: Short and readable; `concat()` itself is cheap.
  + Cons: The `filter()` predicate calls `indexOf()` on the full concatenated array for every element, which is again quadratic, and the intermediate concatenated array costs extra memory.
* **Set**:
  + Pros: Average linear time (membership checks are hash-based), minimal code, duplicates removed automatically.
  + Cons: Deduplication uses SameValueZero equality, so objects are compared by reference rather than by value; spreading back into an array adds one extra copy.

**Library/Tool Used**

The benchmark uses only the built-in `Array` prototype methods (`reduce()`, `indexOf()`, `concat()`, `filter()`) and the built-in `Set` data structure; no external libraries are required.

**Special JS Feature/Syntax**

The test cases use spread syntax (`...`) and template literals, both standard since ES2015; nothing requires additional explanation.

**Alternative Approaches**

Other approaches to merging and deduplicating arrays include:

* Using a `Map`: similar to `Set`, but its key-value pairs make it possible to deduplicate objects by a chosen key.
* Using third-party libraries such as Lodash (`_.union`) or Ramda (`R.union`), which provide ready-made utilities for array operations.

Keep in mind that the choice of approach depends on the specific use case and requirements. The benchmark compares popular methods, but other approaches may be more suitable for certain scenarios.
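The `Map` alternative mentioned in the summary covers the case `Set` cannot: deduplicating objects by a field rather than by reference. A minimal sketch — the variable names and the `id` key are illustrative, not from the benchmark:

```javascript
// Two arrays of objects that overlap on id 2.
const a = [{ id: 1, name: 'one' }, { id: 2, name: 'two' }];
const b = [{ id: 2, name: 'TWO' }, { id: 3, name: 'three' }];

// Map keys enforce uniqueness by id while preserving insertion order;
// later entries overwrite earlier ones with the same key.
const merged = [...new Map([...a, ...b].map((item) => [item.id, item])).values()];

console.log(merged.map((item) => item.id)); // [ 1, 2, 3 ] — the id-2 entry from `b` wins
```

This is still linear on average, like the `Set` approach, since `Map` lookups are hash-based.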
Related benchmarks:
Concat VS Spread operator benchmark
Sorting test
spread/concat large
Merge 3 small arrays
Merging two arrays