MeasureThat.net
unique array 2
(version: 1)
Comparing performance of:
array set unique vs array filter unique
Created:
6 years ago
by:
Registered User
Script Preparation code:
existingIds = [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20]; newIds = [5,18,21,22,23,24,25,26,27]
Tests:
array set unique
const resultingArray = Array.from(new Set(existingIds.concat(newIds)))
array filter unique
const resultingArray = existingIds.concat(newIds).filter((id, index, array) => index === array.indexOf(id));
Autogenerated LLM Summary (model llama3.2:3b, generated one year ago):
**Benchmark Overview**

The benchmark measures the performance of two approaches to building a unique array from two arrays, `existingIds` and `newIds`:

1. Using the `Set` data structure: concatenate the two arrays, pass the result to `new Set()`, and convert it back to an array with `Array.from()`.
2. Filtering duplicates with `filter()`: concatenate the two arrays and use `filter()` to keep only the first occurrence of each element.

**Approach 1: Using Set**

A `Set` in JavaScript is an object that stores only unique values: adding a value that already exists in the set has no effect. In this benchmark, `existingIds` and `newIds` are joined with `concat()` and the combined array is passed to the `new Set()` constructor; the resulting set is then converted back to an array with `Array.from()`. This approach is concise and typically fast, because each insertion and membership check on a `Set` is effectively constant time.

**Pros:**
* Each value is stored once, and lookups and insertions are fast (roughly O(1) per element, O(n) overall).
* Insertion order is preserved, so the result matches first-occurrence order.

**Cons:**
* Requires ECMAScript 2015 (ES6) or later, where `Set` was introduced.
* Allocates an intermediate `Set` in addition to the final array, which may matter for very large datasets.

**Approach 2: Filtering duplicates using filter()**

The second approach removes duplicates with `filter()`, which executes a callback for each element of the array and includes the element in the new array only when the callback returns `true`. In this benchmark, the callback keeps an element only when its current index equals the index of that element's first occurrence (`index === array.indexOf(id)`), so every later duplicate is dropped.
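As a quick sanity check (a sketch, not part of the original benchmark), running both approaches on the benchmark's preparation data should yield the same 27-element array:

```javascript
// Prep data from the benchmark
const existingIds = [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20];
const newIds = [5,18,21,22,23,24,25,26,27];

// Approach 1: Set-based dedupe (insertion order is preserved)
const viaSet = Array.from(new Set(existingIds.concat(newIds)));

// Approach 2: filter keeps an element only at its first occurrence
const viaFilter = existingIds.concat(newIds)
  .filter((id, index, array) => index === array.indexOf(id));

console.log(viaSet.length);                      // 27 unique ids
console.log(viaSet.join() === viaFilter.join()); // true: both approaches agree
```

The duplicates (5 and 18) appear only once in the output, and both results list ids in first-occurrence order.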
**Pros:**
* Works on ES5 engines, since `filter()` and `indexOf()` were introduced in ECMAScript 5.
* Needs no auxiliary data structure beyond the output array.

**Cons:**
* Less efficient than a `Set`: `indexOf()` rescans the array for every element, making the overall cost O(n²), which degrades noticeably on large arrays.

**Library/Language Feature:** The benchmark relies on the built-in `Set` constructor, available since ECMAScript 2015 (ES6), and on `Array.prototype.filter`/`Array.prototype.indexOf`, available since ECMAScript 5.

**Special JS feature/Syntax:** No unusual syntax is used, though modern JavaScript engines may optimize `Set` operations internally for better performance.

**Other Alternatives:** If you want to build a unique array without `Set` or `filter()`, you could consider:
* Using `reduce()` to accumulate only values that have not been seen yet.
* Using a `Map` keyed by the values and converting its values back to an array.
* A custom set implementation backed by an object or array.

Keep in mind that these alternatives have different performance characteristics and trade-offs compared to the two benchmarked approaches.
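The alternative approaches mentioned above can be sketched as follows (a hedged illustration on the benchmark's prep data; neither variant is part of the benchmark itself):

```javascript
const existingIds = [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20];
const newIds = [5,18,21,22,23,24,25,26,27];
const combined = existingIds.concat(newIds);

// reduce(): accumulate each id only if it has not been seen yet
// (includes() rescans the accumulator, so this is also O(n²))
const viaReduce = combined.reduce(
  (acc, id) => acc.includes(id) ? acc : acc.concat(id),
  []
);

// Map: keys are unique; re-setting an existing key keeps its original
// position, so values() come out in first-occurrence order
const viaMap = Array.from(new Map(combined.map(id => [id, id])).values());

console.log(viaReduce.length, viaMap.length); // 27 27
```

The `Map` variant has `Set`-like O(n) behavior, while the `reduce()` variant shares the quadratic cost of the `filter()` approach.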
Related benchmarks:
unique elements in array using filter v2
unique elements in array using filter v2.3
unique elements in array using filter - large array
_.uniq vs Array.from(new Set())
JS fastest unique array Set vs uniq vs filter