reduce vs map and set (version 0)
get unique values
Comparing performance of: reduce vs map-set
Created: 3 years ago by Guest
Script Preparation code:
var testArray = [];
// Create test data with duplicate tokenIds
for (let i = 0; i < 10000; i++) {
  testArray.push({ tokenId: Math.floor(Math.random() * 1000) });
}
Tests:
reduce
console.time('reduce');
var tokenIdsReduce = testArray.reduce((acc, { tokenId }) => {
  if (!acc.includes(tokenId)) {
    acc.push(tokenId);
  }
  return acc;
}, []);
console.timeEnd('reduce');
console.log('Token IDs (reduce):', tokenIdsReduce);
map-set
console.time('map-set');
var tokenIdsMapSet = [...new Set(testArray.map(({ tokenId }) => tokenId))];
console.timeEnd('map-set');
console.log('Token IDs (map-set):', tokenIdsMapSet);
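Note that the map-set test allocates an intermediate array (the output of map) before the Set is built. As a minimal sketch, not one of the benchmark's test cases, a single-pass variant that feeds the Set directly and avoids that intermediate array would look like this (the names 'set-loop' and tokenIdsSetLoop are illustrative):

console.time('set-loop');
// Build the Set in one pass over testArray, no intermediate mapped array
var ids = new Set();
for (const { tokenId } of testArray) {
  ids.add(tokenId);
}
var tokenIdsSetLoop = [...ids];
console.timeEnd('set-loop');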
Latest run results: none. This benchmark has not been run yet, so neither test case (reduce, map-set) has a recorded result.
Autogenerated LLM Summary (model llama3.2:3b, generated one year ago):
**Benchmark Overview**

The benchmark compares two approaches for extracting unique values from an array of 10,000 objects with duplicate `tokenId` fields: the `reduce` method, and the `map` function combined with a `Set`.

**Options Compared**

1. **Reduce Method**: Accumulates unique token IDs into an array, calling `Array.prototype.includes` to check for duplicates before each push.
2. **Map-Set Approach**: Uses `map` to project each element to its `tokenId`, then constructs a `Set` from the resulting array. The `Set` eliminates duplicates automatically, and spread syntax converts it back to an array.

**Pros and Cons**

### Reduce Method

Pros:

* Simple, single-pass accumulator with no intermediate data structures beyond the result array

Cons:

* `includes` performs a linear scan of the accumulator on every iteration, so the overall cost grows quadratically in the worst case (here, each of the 10,000 iterations may scan up to 1,000 accumulated IDs)
* Scales poorly for large arrays or a high number of distinct values

### Map-Set Approach

Pros:

* `Set` membership checks are effectively constant time, so deduplication runs in roughly linear time
* Typically much faster for large inputs

Cons:

* Allocates an intermediate array (the output of `map`) and a `Set` before producing the final array
* Chains three operations (`map`, `Set` construction, spread), which is slightly denser to read

**Library Used**

No external libraries are used. `Set` is a built-in JavaScript object that eliminates duplicates automatically.

**Special JS Feature or Syntax**

Both tests use ES2015+ syntax: arrow functions, parameter destructuring (`({ tokenId }) => ...`), and, in the map-set case, spread syntax (`[...set]`).

**Other Alternatives**

Alternative approaches for getting unique values from an array include (see the sketch after this list):

1. **Array.prototype.filter** with an index check (keeping only the first occurrence of each value): concise, but quadratic like the reduce version, since `indexOf` rescans the array.
2. **Constructing a Set directly**: `Array.from(new Set(array))` deduplicates in one step (note there is no `Set.fromArray` method in JavaScript).
3. **A custom implementation** using a hash-based structure such as a plain object or a `Map` keyed by value.

In summary, the benchmark compares two common approaches for getting unique values from an array. The map-set approach is typically faster because `Set` lookups are constant time, while the reduce version pays a linear `includes` scan per element; the trade-off is the map-set approach's extra intermediate allocations.
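As a minimal sketch of the alternatives listed above, assuming the same `testArray` shape from the preparation code (the variable names `tokenIdsFilter`, `tokenIdsFromSet`, and `tokenIdsCustom` are illustrative, not part of the benchmark):

// Alternative 1: filter with indexOf, quadratic like the reduce version
var tokenIdsFilter = testArray
  .map(({ tokenId }) => tokenId)
  .filter((id, i, arr) => arr.indexOf(id) === i);

// Alternative 2: build the deduplicated array from a Set without spread syntax
var tokenIdsFromSet = Array.from(new Set(testArray.map(({ tokenId }) => tokenId)));

// Alternative 3: custom dedup with a Map keyed by value
var seen = new Map();
for (const { tokenId } of testArray) {
  if (!seen.has(tokenId)) {
    seen.set(tokenId, true);
  }
}
var tokenIdsCustom = [...seen.keys()];

All three produce the same unique IDs; only alternative 1 shares the quadratic scan cost of the reduce test, while alternatives 2 and 3 are roughly linear like the map-set test.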
Related benchmarks:
Object spread vs New map
Object spread vs New map entries
Map -> Array
test_spread_vs-map
Testering