MeasureThat.net
Set vs Filter for unique
(version: 0)
Comparing performance of:
Set spread vs Array from set vs Filter
Created: 6 years ago by: Guest
Script Preparation code:
var array = Array.from({length: 40}, () => Math.floor(Math.random() * 140));
Tests:
Set spread
const f = [...new Set(array)];
Array from set
const s = new Set(array);
const l = Array.from(s);
Filter
const b = array.filter((i, index) => array.indexOf(i) === index);
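All three snippets should produce the same unique values in the same order, since a `Set` preserves first-insertion order just as the filter keeps first occurrences. A quick standalone sanity check (not part of the benchmark itself):

```javascript
// Sample data mirroring the preparation code above.
const array = Array.from({ length: 40 }, () => Math.floor(Math.random() * 140));

const bySpread = [...new Set(array)];
const byFrom = Array.from(new Set(array));
const byFilter = array.filter((i, index) => array.indexOf(i) === index);

// All three keep the first occurrence of each value, in original order.
console.log(
  JSON.stringify(bySpread) === JSON.stringify(byFrom) &&
  JSON.stringify(byFrom) === JSON.stringify(byFilter)
); // true
```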
Latest run results:
Run details:
(Test run date: 3 months ago)
User agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.6 Safari/605.1.15
Browser/OS: Safari 17 on Mac OS X 10.15.7
| Test name | Executions per second |
| --- | --- |
| Set spread | 1244635.6 Ops/sec |
| Array from set | 1040889.6 Ops/sec |
| Filter | 1799138.0 Ops/sec |
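The ops/sec figures above come from the site's own harness. As a rough illustration of how such a number can be measured, here is a minimal sketch (the `opsPerSec` helper is hypothetical, and real benchmark harnesses additionally handle warm-up, JIT effects, and timer resolution):

```javascript
// Rough ops/sec measurement: run fn repeatedly for a fixed window
// and scale the count to one second. Not a substitute for a real harness.
function opsPerSec(fn, durationMs = 500) {
  let ops = 0;
  const end = Date.now() + durationMs;
  while (Date.now() < end) {
    fn();
    ops++;
  }
  return (ops / durationMs) * 1000;
}

const array = Array.from({ length: 40 }, () => Math.floor(Math.random() * 140));
console.log('spread:', opsPerSec(() => [...new Set(array)]));
console.log('filter:', opsPerSec(() => array.filter((i, idx) => array.indexOf(i) === idx)));
```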
Autogenerated LLM Summary (model llama3.2:3b, generated one year ago):
Let's break down the provided benchmark and explain what is being tested.

**Benchmark Overview**

The benchmark compares three approaches to removing duplicates from an array: spreading a `Set` into an array literal, `Array.from(new Set(array))`, and filtering with `array.filter()`. The goal is to determine which approach is fastest.

**Options Compared**

1. **Set spread**: uses the spread operator (`...`) on a `Set` built from the array to produce a new array of unique values.
2. **`Array.from(new Set(array))`**: builds a `Set` from the array, then converts it back to an array with `Array.from()`.
3. **Filtering with `array.filter()`**: keeps only the first occurrence of each value by checking `array.indexOf(i) === index`.

**Pros and Cons of Each Approach**

1. **Set spread**
   * Pros: concise, easy to read, and O(n) on average, since `Set` insertion and lookup are amortized constant time.
   * Cons: allocates an intermediate `Set`, whose construction overhead can dominate on small arrays.
2. **`Array.from(new Set(array))`**
   * Pros: same O(n) average complexity as the spread version; `Array.from` also accepts an optional mapping function if further transformation is needed.
   * Cons: slightly more verbose, and likewise allocates an intermediate `Set`.
3. **Filtering with `array.filter()`**
   * Pros: no intermediate data structure; flexible and easy to understand.
   * Cons: `indexOf` scans the array for every element, making this approach O(n²), so it falls behind the `Set`-based approaches as the array grows.

Note that in this run the filter approach is nonetheless the fastest, which is plausible for a 40-element input: at that size the quadratic scan is cheap, while constructing a `Set` carries fixed overhead.

**Library Used**

No external library is used; the benchmark relies on the built-in `Set` data structure. A `Set` in JavaScript is a collection of unique values, backed by a hash-table-like implementation that provides fast insertion and lookup.

**Special JS Feature or Syntax**

The tests use the spread operator and arrow functions (`(i, index) => array.indexOf(i) === index`), both introduced in ECMAScript 2015 (ES6).
**Alternative Approaches**

Other ways to remove duplicates from an array include:

1. **Using `reduce()`**: accumulate values into a result array, appending each value only if it is not already present.
2. **Using `filter()` with `includes()`**: keep a value only if it does not appear earlier in the array. For example:

```javascript
// Keeps the first occurrence of each value; O(n^2) like the indexOf version.
const uniqueArray = array.filter((value, index) => !array.slice(0, index).includes(value));
```

These approaches are also quadratic, so they are unlikely to beat the `Set`-based methods on large inputs.
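The `reduce()` option mentioned above can be sketched as follows (`dedupe` is a hypothetical helper name, not part of the benchmark):

```javascript
// reduce()-based dedup: push a value only if the accumulator doesn't
// already contain it. includes() rescans the accumulator each time,
// so this is O(n^2) overall.
const dedupe = (arr) =>
  arr.reduce((unique, value) => {
    if (!unique.includes(value)) unique.push(value);
    return unique;
  }, []);

console.log(dedupe([3, 1, 3, 2, 1])); // [ 3, 1, 2 ]
```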
Related benchmarks:
Unique values of array
Filter vs Set (get unique elements)
Filter vs Set (unique elements)
Set from array vs array Filter unique
Set vs Good Filter for unique