MeasureThat.net
Duplicate-to-unique array: measuring the performance of different approaches.
(version: 0)
Comparing performance of:
Solution w/ Array methods vs Solution w/ Map vs Solution w/ Set
Created: 2 years ago by: Registered User
Script Preparation code:
function initArr(size) {
  const array = [];
  for (let i = 0; i < size; i++) {
    array.push(Math.floor(Math.random() * size));
  }
  return array;
}
Tests:
Solution w/ Array methods
const arr = initArr(10000);
const uniqueArray = [];
arr.forEach(item => {
  // includes() is a linear scan of uniqueArray, so this loop is O(n^2) overall
  if (!uniqueArray.includes(item)) {
    uniqueArray.push(item);
  }
});
Solution w/ Map
const arr = initArr(10000);
const map = new Map();
// Note: map() produces one element per input element; for duplicates the
// callback falls through without a return, so uniqueArray contains
// `undefined` at those positions rather than only the unique values.
const uniqueArray = arr.map(item => {
  if (!map.has(item)) {
    map.set(item, true);
    return item;
  }
});
Solution w/ Set
const arr = initArr(10000);
const uniqueArray = Array.from(new Set(arr));
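For running these tests outside MeasureThat.net, the preparation code and the Set-based test can be combined into a minimal standalone sketch (assuming Node.js; `dedupeWithSet` is a hypothetical helper name, not part of the benchmark):

```javascript
// Standalone sketch: the benchmark's preparation code plus the Set-based
// dedupe, wrapped in a helper so the output can be inspected directly.
function initArr(size) {
  const array = [];
  for (let i = 0; i < size; i++) {
    array.push(Math.floor(Math.random() * size));
  }
  return array;
}

// A Set keeps only the first occurrence of each value, in insertion order,
// so converting back to an array yields the deduplicated result.
function dedupeWithSet(arr) {
  return Array.from(new Set(arr));
}

const arr = initArr(10000);
const uniqueArray = dedupeWithSet(arr);
console.log(uniqueArray.length, "unique values out of", arr.length);
```

The same harness can wrap the other two tests to compare their outputs; only the Array-methods and Set versions produce an array of purely unique values.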
Rendered benchmark preparation results:
Latest run results:
Run details: (Test run date: 2 years ago)
User agent:
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.6.1 Safari/605.1.15
Browser/OS:
Safari 15 on Mac OS X 10.15.7
Test name                  | Executions per second
Solution w/ Array methods  | 18.5 Ops/sec
Solution w/ Map            | 220.2 Ops/sec
Solution w/ Set            | 239.7 Ops/sec
Autogenerated LLM Summary (model llama3.2:3b, generated one year ago):
Let's break down the provided benchmark and explain what's being tested.

**Benchmark Overview**

The benchmark compares three approaches to removing duplicates from an array:

1. **Solution w/ Array methods**: iterates with `forEach` and checks membership with `includes` on the `uniqueArray` output array.
2. **Solution w/ Map**: uses a `Map` to track which elements of `arr` have already been seen.
3. **Solution w/ Set**: builds a `Set` from the array and converts it back with `Array.from`.

**Approach Comparison**

Each approach has its pros and cons:

* **Array methods (Solution w/ Array methods)**:
  + Pros: simple, straightforward implementation; no additional data structure beyond the output array.
  + Cons: `includes` performs a linear search of `uniqueArray` on every iteration, making the algorithm O(n²) overall. This is why it is by far the slowest option here.
* **Map (Solution w/ Map)**:
  + Pros: hash-based lookups make `has` and `set` effectively O(1), so the whole pass is O(n).
  + Cons: allocating an extra `Map` object adds some overhead for small arrays. Note also that as written, the `arr.map(...)` callback returns no value for duplicates, so the resulting array contains `undefined` holes rather than only unique values.
* **Set (Solution w/ Set)**:
  + Pros: a data structure purpose-built for storing unique values; `Array.from(new Set(arr))` dedupes in O(n) with a single expression.
  + Cons: essentially none in modern environments; `Set` has been built into JavaScript since ES2015, so no import or polyfill is needed.

**Library and Syntax Considerations**

* **Map**: a built-in JavaScript object that stores key-value pairs with insertion order preserved. It's used here with the unique elements as keys and a boolean value marking presence.
* **Set**: also a built-in JavaScript object; it stores unique values, and constructing one from an array discards duplicates automatically.
**Test Case Explanation**

Each test case measures how many times per second one approach executes. The results indicate that:

* **Solution w/ Set** is the fastest (239.7 Ops/sec).
* **Solution w/ Map** is close behind (220.2 Ops/sec); both are more than an order of magnitude faster than **Solution w/ Array methods** (18.5 Ops/sec), whose linear `includes` check makes it O(n²).

**Other Alternatives**

If you need to remove duplicates from an array, other built-in techniques exist:

* **Using `filter()`**: `arr.filter((item, i) => arr.indexOf(item) === i)` also removes duplicates, but `indexOf` is a linear scan, so its performance is closer to the Array-methods approach than to the Set approach.
* **Using a custom algorithm**: sorting the array first and then skipping adjacent duplicates dedupes in O(n log n), which sits between the O(n²) and O(n) approaches above.

All of these use built-in language features and require no additional libraries, though relative performance varies by engine and input size.
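The `filter()`-based alternative mentioned in the summary can be sketched as follows (`dedupeWithFilter` is a hypothetical helper name for illustration):

```javascript
// Dedupe with filter() + indexOf(): keep an element only when the current
// index is the index of its first occurrence. indexOf() is a linear scan,
// so this is O(n^2) overall -- closer in cost to the Array-methods test
// than to the Set test, despite the more compact syntax.
function dedupeWithFilter(arr) {
  return arr.filter((item, index) => arr.indexOf(item) === index);
}

console.log(dedupeWithFilter([3, 1, 3, 2, 1])); // [ 3, 1, 2 ]
```

Like the other approaches, this preserves the order of first occurrence, so its output matches the Set-based result.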
Related benchmarks:
Already sorted versus random
For vs Min
For vs Min1
Array.Sort vs Math.Min-Max
Set.has v.s Array.includes